I am either a Luddite geek or a geeky Luddite, but I'm not sure which. While I recognize the tremendous value and potential technology has in our lives, I am wary of ceding too much power to a growing network of machines beyond our ability to control. I worry, too, that a "service disruption" of such technologies could render humans, individually or collectively, incapable of remedying the situation without dire consequences.
[Image: "When the file hits your WiFi like a big data pie, that's Cloud Computing."]
This past week has given us one small example of this: the Amazon web service outage (source). The self-described "networking event" at Amazon's Northern Virginia servers wreaked havoc far beyond the web shopping giant itself, disrupting Reddit, Quora, Foursquare, and a variety of other services run by web companies who rent Amazon's server space and have come to rely on it. Beyond the usual problems this incident - along with the (unrelated) disruption to Sony's PlayStation Network - has caused, a larger debate is being held (source) over the future of the "Cloud Computing" model that Google, Microsoft, and others have been pushing since 2007 (source).
Essentially, Cloud Computing deploys data from a centralized source (the "cloud") to all of the individual computers networked to it. This model is as old as the Internet itself, with one critical exception: the "cloud" is the source of the data, not merely an intermediary between two machines. In practical terms, this means that an individual computer - say, a laptop communicating via WiFi - doesn't store the data at all; rather, the user's music, documents, pictures, etc. are all stored in a centralized online database. This has obvious advantages, of course: multiple devices (computers, smart phones, tablets, even a refrigerator) can communicate with the server to retrieve and download information relevant to the user's needs and preferences. No more carrying around discs and drives, updating versions of files on multiple machines, or sending them back and forth across a network. It's easy! It's all up in the cloud, and Google's new Chrome OS has been especially optimistic about the weather forecast for this data distribution model (source): "no matter what happens to your Chrome Notebook, your data will be safe and sound" ... with Google.
But a storm is brewing in the future for Cloud Computing. Data centralization has all sorts of inherent problems - including the obvious requirement that three components work in tandem to accomplish any given task: the server, the network connection, and the end user's computer. If the network goes down or the server crashes, well, your laptop is baking in the sun without a cloud in the sky, waiting for the rain of data to pour down.
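That dependency chain can be sketched in a few lines of toy code. This is just an illustration of the argument above, not how any real cloud service is built; every name here (`CloudOutage`, `fetch_document`, `read_local`) is hypothetical.

```python
# Toy model of the cloud's failure chain: fetching a document succeeds
# only if the server, the network, AND your own machine are all up.

class CloudOutage(Exception):
    """Raised when any link in the chain is down."""
    pass

def fetch_document(doc_id, server_up, network_up, client_up):
    """Cloud model: three things must work at once."""
    if not client_up:
        raise CloudOutage("your own machine is down")
    if not network_up:
        raise CloudOutage("no connection to the cloud")
    if not server_up:
        raise CloudOutage("the data center is having a 'networking event'")
    return f"contents of {doc_id}"

def read_local(doc_id, client_up):
    """Local storage: only your own machine has to work."""
    if not client_up:
        raise CloudOutage("your own machine is down")
    return f"contents of {doc_id}"
```

The point of the sketch is simply that the cloud model multiplies single points of failure: `read_local` has one way to fail, `fetch_document` has three.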
And to me, this makes Cloud Computing a Pie in the Sky idea. Relying on companies with server farms far, far away to store your data seems inherently risky for the reasons listed above, but the potential ramifications are far more nefarious. Ongoing alarmism about the centralized tracking and storage of smart phone GPS coordinates is warranted (source); so is the threat to Fourth Amendment protections as government agencies demand user information from Google, Yahoo, Facebook, and other warehouses of private data - often without a warrant (source). As the Wikileaks controversy of 2010 showed us (source), companies like Amazon have very little interest in upholding civil liberties in the face of government intimidation and public controversy. And if users are willing to store not only contact information on private server space, but more of their documents, pictures, videos, and other content, who will hold such companies accountable for protecting the secrecy of this data? Is it ever really deleted? Who internally has access to it?
[Image: "Access denied. The Cloud tells us what to do now, human."]
To me, "Cloud Computing" sounds uncomfortably similar to the fictional Cyberdyne Systems' brainchild, "Skynet." It's as though the advocates of such a model are unaware that the "Terminator" films are cautionary tales of dystopia rather than ideals toward which to strive. Not that the Technological Singularity (source) is happening in the next six weeks or six months, but isn't the Cloud Computing model exactly what would be necessary for such a phenomenon to occur? And once The Cloud is self-aware, with access to all the information and wired electronic devices on the planet, would machines cease to be our servants and instead become our slaveholders? Would all devices operating on The Cloud - including the unmanned Predator drones (another ominously named machine) flying over much of Afghanistan and parts of Pakistan - no longer respond to human input once a singular machine consciousness arises? These are uncomfortable questions, made only more so by the elusive stamp of human error buried within the lines of code in every machine. Imagine what kind of "networking event" could alter the software with which once-benevolent machines were programmed.
Personally, I don't trust my data any further than I can throw it - and unplug it, if it gets out of hand. In case you are wondering: I haven't backed up this blog or my Facebook account... but all my video files are triple-backed-up for archival. For the projects I really want to keep, I even export to good, old-fashioned tape!