By Vanessa Quirk, originally published on ArchDaily
Your MacBook Air has come at a price. And I’m not talking about the $1,000 you shelled out to buy it. I’m talking about the cost of lightness. Because the dirty secret of the “Cloud” – that nebulous place where your data goes to live, freeing your devices from all that weight – is that it has a very physical counterpart.
Data Centers. Giant, whirring, power-guzzling behemoths of data storage – made of cables, servers, routers, tubes, coolers, and wires. As your devices get thinner, the data centers of the insatiably hungry cloud get thicker.
So why are you struggling to picture one in your mind? Why do we have no idea what they look like? What they do? Where they are? Because Data Centers have been hidden away and, although carefully planned, intentionally “undesigned.” The goal is to make the architecture so technologically efficient that the architecture becomes the machinery, and the machinery the architecture. In the words of author Andrew Blum, Data Centers are “anti-monuments” that “declare their own unimportance.”
But if architecture is the expression of our society’s values and beliefs, then what does this architectural obliteration mean? That we are willfully ignoring the process that creates the data we daily consume. As long as the internet works, who cares where it came from (or at what cost — and there is a considerable cost)?
So can design change our alienated relationship to our data? Should it? And if so, how?
The Data Center: An Introduction
Tukwila is less a building than a machine for computing. “You look at a typical building, [...] and the mechanical and electrical infrastructure is probably below 10 percent of the upfront costs. Whereas here it’s 82 percent of the costs.” Little thought is given to exterior appearances; even the word “architecture” in the context of a data center can be confusing: it could refer to the building, the network or the software running on the servers. 
This description of Microsoft’s Data Center in Tukwila, Washington, via its then-general manager Michael Manos, in conversation with New York Times reporter Tom Vanderbilt, sets up the Data Center as the hybrid creature it is: an architectural machine. There are many reasons for this amalgamation, but let’s begin with the most intrinsic to the Data Center: power.
Data Centers require a never-ending, unimaginably large flow of electricity. To put it in perspective: if the Data Centers in the “Cloud” made up a country, that country would be among the top 5 energy users in the world. By 2020, this consumption is expected to multiply fifty times over. 
Unfortunately, most companies in the Cloud are still getting that electricity from traditional “dirty” sources of power; instead of finding ways to decrease energy consumption as a whole (by investing in renewable energy sources, for example), many have focused on making their use of energy as efficient as possible – which has led Data Centers to become the infrastructure of the data itself. 
Another key concern is security. Data Centers often store highly sensitive information – and I don’t just mean the photos you share on Facebook. Think: every financial transaction on the stock exchange, every email sent by a government employee, resides, in some form or other, in a Data Center. Moreover, there’s also the motivation of keeping a Data Center’s cutting-edge design secret from the competition.
To both ends, many Data Centers resemble warehouses turned high-security military facilities (and in some cases they actually were). Restricted barriers, security cameras, biometric devices that scan your irises – when Andrew Blum, author of Tubes: A Journey to the Center of the Internet, visited Google’s Data Center in The Dalles, Oregon, he likened it to a prison, and couldn’t even get past the cafeteria.
The need for both security and energy-efficiency has led Cloud companies to place their Centers far away from urban society, close to plentiful power sources and existing telecommunications infrastructures, and, increasingly, in naturally cool environments (eliminating the need for coolers to offset the heat these centers produce). Tax incentives and proximity to end users are also key considerations.
But while cold, isolated locations make a lot of sense for Data Centers as they exist today, they weren’t always hidden away. In the 1960s, IBM mainframes were located in a privileged spot in corporate headquarters: the “glasshouse.” Kenneth Brill, founder of the data-center research and consulting group the Uptime Institute, explains: “It was located near the executive suite. Here you’d spent $15 to 30 million on this thing — the executives wanted to show it off.”
As this historical episode shows, Data Center design and site aren’t fixed – they depend on human priority. And while the current model is rather entrenched, that doesn’t mean it isn’t due for a change.
Data Center design is at a crossroads. There are some companies, such as HP, who are betting that modular data centers (shipping containers filled with servers) are the way of the future. Comparably low cost, fast to set up, energy-efficient, and easy to cool, these containers eliminate the need for architecture at all.
But Facebook has also spearheaded a new trend in Data Center design: transparency. For its latest center in Prineville, Oregon, Facebook made the plans public. As Blum blogged: “Believing in the efficiency and innovation of the building’s design—and the environmental benefits of extending that to other parts of the Internet’s infrastructure—Facebook has published all the plans, all the way from the custom-designed motherboards to the unusual swamp cooler-like system that keeps the building cool. Architecture always expresses the ideals of an organization. In Facebook’s case, this meshes with Mark Zuckerberg’s founding vision of making the world more open and connected.”
Facebook has also pledged to rely more on renewable energy sources (their next center will be in Sweden, the leading supplier of renewable energy). But the design doesn’t just embrace environmental innovation; as Blum points out, it crystallizes Facebook’s vision of openness by creating a space that is approachable. There are no scary unbroken facades here. Prineville boasts large windows, welcoming colors, natural materials, local memorabilia – a human environment for man and machine.
As more and more private companies get into the Data Center game (Amazon, eBay, and Walmart are already big players), a company’s Data Center design could become increasingly important. Perhaps Data Centers will eventually go back to their roots, becoming integrated into our cities, transparent and proud “glasshouses.”
Let’s hope that they do. Because without design – the design that bridges man to machine – the Data Center remains an impenetrable fortress, separating us from a technology that we use daily (and that daily damages the earth). Design can turn these “anti-monuments” back into glasshouses – architecture that gives us a peek into our own digital age.
Sources:
Vanderbilt, Tom. “Data Center Overload.” The New York Times. June 8, 2009.
Greenpeace. “Why Clouds Aren’t Green.” CLOG: Data Space. Ed. Kyle May. 2012.
Corbo, Stefano. “Data Centers and Physical Spaces: Tracking the Invisible.” CLOG: Data Space. Ed. Kyle May. 2012.
Graham, Stephen. “Data Archipelagos.” CLOG: Data Space. Ed. Kyle May. 2012.
Greenpeace. “How Green Is Your Cloud?” April 2012.