Now that’s funny.
Still Drinking blogs Programming Sucks. If you have ever written a line of computer code, read it and empathize. If you’ve ever had to deal with a team of programmers, read it and a light bulb will explode in your head.
Read it all. For a taste, this extract titled All Code is Bad:
Every programmer occasionally, when nobody’s home, turns off the lights, pours a glass of scotch, puts on some light German electronica, and opens up a file on their computer. It’s a different file for every programmer. Sometimes they wrote it, sometimes they found it and knew they had to save it. They read over the lines, and weep at their beauty, then the tears turn bitter as they remember the rest of the files and the inevitable collapse of all that is good and true in the world.
This file is Good Code. It has sensible and consistent names for functions and variables. It’s concise. It doesn’t do anything obviously stupid. It has never had to live in the wild, or answer to a sales team. It does exactly one, mundane, specific thing, and it does it well. It was written by a single person, and never touched by another. It reads like poetry written by someone over thirty.
Every programmer starts out writing some perfect little snowflake like this. Then they’re told on Friday they need to have six hundred snowflakes written by Tuesday, so they cheat a bit here and there and maybe copy a few snowflakes and try to stick them together or they have to ask a coworker to work on one who melts it and then all the programmers’ snowflakes get dumped together in some inscrutable shape and somebody leans a Picasso on it because nobody wants to see the cat urine soaking into all your broken snowflakes melting in the light of day. Next week, everybody shovels more snow on it to keep the Picasso from falling over.
There’s a theory that you can cure this by following standards, except there are more "standards" than there are things computers can actually do, and these standards are all variously improved and maligned by the personal preferences of the people coding them, so no collection of code has ever made it into the real world without doing a few dozen identical things a few dozen not even remotely similar ways. The first few weeks of any job are just figuring out how a program works even if you’re familiar with every single language, framework, and standard that’s involved, because standards are unicorns.
And if you think this is good, you can buy his book for pennies on the dollar at Amazon, where it is rated 4½ stars.
Rachel Delacour at Java Developer’s Journal thinks 2014 will be The Year the Entire Cloud Becomes Your Data Warehouse:
Data sources proliferate at a dizzying speed. Almost every week, new streams come online that a modern company needs to monitor and analyze, from sensors and consumer actions to supplier feeds. If they don’t tap into this data, they literally leave money on the table.
That’s why the smartest enterprises will increasingly demand cloud services that are open ended, agile and flexible enough to accommodate as many new data streams as possible. Not a 10 or 20, but 50 or 60 data streams, no matter whether they are structured or unstructured, on-premise or from a server half a world away. What’s more, these modern services have to be designed in a way that allows users to easily connect and query them to get answers to their pressing business questions at the spur of a moment.
In 2014, cloud BI will offer companies or teams inside an organization an easy, quick and affordable shortcut through the maze of messy sources, bypassing old and proprietary infrastructure. Think of it as an invitation to just “pay as you know,” channeling and profiting from data streams we haven’t even heard of today.
While it is certainly true that data can be gathered from ever-increasing sources, and that analyzing them right off the cloud would be useful, to think the transformation will happen next year is über-optimistic. Organizations are only now wrestling with choosing what data to focus on and learning the tools to do so. To expect disparate cloud vendors to deliver the unified functionality of cross-repository BI analysis is unrealistic. Someday, maybe. 2014, no.
However, Darren Guarnaccia writing for CMS Wire agrees that BI from disparate sources is essential and predicts that 2014 will be The Year of Connected Customer Data:
[Marketers] want the ease-of-use and efficiency gained through connecting the customer data and putting it to work.
This has become a priority because for the first time marketers have lots of data at their fingertips. Advances in technology — drawn in part from the advertising industry — have shown the marketing industry how to collect masses of new data from customers. Marketers can now use similar strategies and want to look at that data as a whole. Connecting this data will enable marketers to gain a 360 review of each individual customer.
Finally, marketers can track the behavior of “Customer Joe” across channels; he will be the same Joe on the web, as he is in email marketing, on Facebook and when he calls in to the call center. The data that represents Joe has to be cumulative — and the view of Joe has to be the same no matter where Joe connects. This will be the year that marketers make sense of the data — and so make sense of “Joe” — by connecting that disparately collected customer data together. Because after last year’s escalation, marketers have realized that until data is connected, it is not meaningful.
So how does the modern organization get that true 360° view of customers? How does it target specific demographics with meaningful content that drives people to purchase products and services or (as in the case of charities like St. Jude) to donate and keep donating year after year?
Jorge Lopez posits that Hadoop will finally enter the mainstream with Four Big Data and Hadoop Trends for 2014:
In 2012 we heard, “What is Hadoop?” In 2013 the question evolved into, “What do I do with Hadoop?” Today’s question? “What’s the best way to do it with Hadoop?” …
The good news is, the universe of big data and Hadoop seem to be playing “Back to the Future” all over again. With commercial and open source vendors rushing to fill in the functional and usability gaps, expect lots of progress in 2014 to bring a new order where data scientists are not the only ones allowed into the party.
Hadoop has enjoyed a very happy childhood in Silicon Valley — loved and pampered by many high-tech companies. Its influence was mostly limited to high-tech companies with the skills and expertise to extract its benefits. But the tide is rapidly changing as more and more traditional businesses start to leverage Hadoop (in part as a result of trend number two above).
Will Hadoop be more widely embraced as functionality for specific industries expands? Almost certainly. Will it succeed, or go the way of CASE tools and Iridium? Only time will tell.
On Venus we’ve observed sulfuric acid rain, and the snow contains heavy metals. It may rain diamonds on Neptune and Uranus, as well as on Saturn and Jupiter. And Hubble has identified a planet with rain of liquid glass.
Sometimes the most fascinating posts are found in blog comments rather than the main article. I found this bit of wisdom by the post’s author (Tor developer Mike Perry), buried in the comments to his post PRISM vs. Tor (emphasis in original):
I truly believe that the use of weaponized exploits risks crashing the world economy. Software engineering is simply not prepared to deal with this threat.
With the number of dependencies present in large software projects, there is no way any amount of global surveillance, isolation, or firewalling could sufficiently isolate and protect the software development process of widely deployed software projects in order to prevent scenarios where malware sneaks in through a dependency into software that is critical to the function of the world economy.
Such malware could be quite simple: One day, a timer goes off, and any computer running the infected software turns into a brick.
This shit is a doomsday scenario on the order of nuclear conflagration, and anything short of global disarmament risks humanity or at least large sectors of the world economy losing access to computing for months or even years.
There is no M.A.D. scenario as a deterrent here either. Stockpiling more exploits does not make us safer. In fact, the more exploits exist, the higher the risk of the wrong one leaking — and it really only takes a chain of just a few of the right exploits for this to happen. There will also be no clear vector for retaliation. Moreover, how do you retaliate if you have no functioning computer systems or networks left?
If there’s anything we should be spending the NSA’s $10B+/yr budget on, it’s making sure key software development processes are secure against tampering, exploitation, and backdoors, not reading people’s fucking email.
End the madness before it’s too late.
For those who aren’t familiar with Tor, it is a volunteer project that protects users’ privacy online by encrypting traffic and randomly routing it through a series of relays. Originally developed for the US Navy, it is now used by a wide variety of people and is recommended by the EFF (Electronic Frontier Foundation).
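The layered-encryption idea behind that relay routing can be sketched in a few lines. This is a toy illustration only, not Tor’s actual protocol: the XOR “cipher,” the key names, and the three-hop path are all invented for the example. The point it shows is that the client wraps one encryption layer per relay, and each relay can peel off exactly one layer, so only the last hop ever sees the plaintext.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key (toy construction, not secure)
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR with a key-derived keystream; applying it twice cancels out
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def onion_wrap(message: bytes, relay_keys: list[bytes]) -> bytes:
    # Client encrypts for the exit relay first, then wraps each earlier hop around it
    for key in reversed(relay_keys):
        message = xor_layer(message, key)
    return message

def onion_unwrap(cell: bytes, relay_keys: list[bytes]) -> bytes:
    # Each relay, in path order, peels exactly one layer
    for key in relay_keys:
        cell = xor_layer(cell, key)
    return cell

# Hypothetical per-hop keys for a three-relay circuit
relay_keys = [b"guard-key", b"middle-key", b"exit-key"]
cell = onion_wrap(b"GET / HTTP/1.1", relay_keys)
assert cell != b"GET / HTTP/1.1"                      # guard relay sees only ciphertext
assert onion_unwrap(cell, relay_keys) == b"GET / HTTP/1.1"
```

In the real protocol each hop’s key comes from a Diffie-Hellman handshake and the cipher is AES in counter mode, but the nesting structure is the same: no single relay knows both who you are and what you requested.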
Some things never change, and one of those seems to be the utter disregard that cats have for that which is important to us, but not them. From The Atlantic:
Now, via medievalist Emir O. Filipovic, evidence that cats have been up to this same mischief for six centuries: inky pawprints, gracing a page of the 13th volume of “Lettere e commissioni di Levante,” which collated copies of letters and instructions that the Dubrovnik/Ragusan government sent to its merchants and envoys throughout southeastern Europe (Bosnia, Serbia, Croatia etc.), according to Filipovic — sort of a 15th-century Federal Register. The particular document that the cat got its paws on dates to March 11th, 1445.
Consider the hours it took the poor scribe to copy and bind the book. It’s the modern-day equivalent of kitty carelessly strolling across your keyboard and deleting your term paper.
Good thing kitties are so damn cute or we would have killed them off long ago.
I give you a fascinating pair of articles from Wired on the topic of the dramatic rise of autoimmune diseases in modern times.
The first extracts some verbiage from a book on the topic, but specifically addresses the island of Sardinia in the Mediterranean:
That is a great set-up for the second article:
A fast-growing body of research suggests that immune systems, produced by millions of years of evolution in a microbe-rich world, rely on certain exposures to calibrate themselves. Disrupt those exposures, as we have through modern medicine, food and lifestyle, and things go haywire.
Wired Science has a pretty awesome list of Botched Spacewalks, Crash Landings, and Smuggled Sandwiches: Spaceflight’s Most Badass Maneuvers. Here’s just one:
Astronaut Gordon Cooper, one of the original seven Mercury astronauts, probably wins as the most clear-headed and fast-thinking space pilot of all time. On the final manned Mercury mission in 1963, Cooper flew the Faith 7 spacecraft into orbit.
After nearly 20 successful trips around Earth, Faith 7 experienced a life-threatening malfunction. Carbon dioxide levels in the vehicle began to climb and the temperature jumped to over 100 degrees Fahrenheit. Cool as a cucumber, Cooper took manual control of the spaceship and estimated the angle he needed to approach for re-entry. By using star patterns, drawing lines on his window, and checking his wristwatch to keep time, Cooper calculated his orientation and fired his rockets at just the right time to land almost exactly by the ship waiting to pick him up.
Now that’s the stuff heroes are made of.