Saturday, September 6, 2014


Behind the scenes at ISG’s Wichita data center


Digital Editor - Wichita Business Journal

I got a chance to tour ISG Technology’s Wichita data center as the company plans to expand technology it has been piloting there to keep machines cool while using less energy.
The 900-square-foot facility, built in 2012, is one of three ISG data centers. A similar-sized underground facility is in Columbia, Mo., in an area mined out of Baltimore limestone, and a much larger center — more than 16,000 square feet — is in Topeka.
Curtis Mead, who leads sales for ISG’s data center services segment, says many ISG customers turn to one of the data centers for disaster recovery. They run their main operations in-house but use the data center for backup, in case of data loss.
A few do the opposite, calling upon the data center’s storage for their main operations and maintaining a local backup. That’s because, Mead says, the data center includes several safeguards a local system might not have, such as an uninterruptible power system and 24/7 monitoring.
Pricing for data center use varies based on factors like how many storage cabinets a business needs and how much data needs to run in and out of the facility each day.
However, Mead says ISG has found one change it believes will make the facility less expensive to operate. At the Wichita data center, the company has been testing a hot-aisle/cold-aisle system: in alternating aisles, heat coming off the backs of servers is captured and ducted directly out of the room to be cooled, so it doesn’t spread through the room, while in the other aisles cool air is forced up through the floor.
Mead says the limited test has worked well enough that ISG plans to expand the concept to other aisles in the Wichita facility.







 

First Marblehead Expands Relationship with Iron Mountain to Include Data Center Colocation

Sept. 4, 2014 - BOSTON - Security controls and compliance with industry-specific regulatory standards are top priorities for financial services firms when selecting a data center colocation provider. Iron Mountain® Incorporated (NYSE: IRM), the storage and information management company, announced it has signed a multi-year agreement to provide data center colocation services for The First Marblehead Corporation (NYSE: FMD), a leading provider of private student loan solutions for lenders, credit unions and schools.

“We are a long-standing customer of Iron Mountain’s off-site data management services, and have developed a trusted relationship with them,” stated Richard Shepherd, managing director of Infrastructure and Shared Services at First Marblehead. “As a highly regulated financial services firm, it’s imperative that our data center service provider adhere to the highest security and compliance standards. When we recognized the need for changing providers, we believed that Iron Mountain clearly hit the mark with our requirements.”

In addition to its security and compliance requirements, First Marblehead was also looking for a provider to flexibly support its evolving business needs. Iron Mountain’s secure Boston data center delivers a full portfolio of managed services – including carrier-neutral network connectivity, cloud backup and replication, certified technicians, helping hands services and more – for increased efficiencies. Colocating with Iron Mountain enables First Marblehead to leverage a single provider for its data center and managed services needs while benefiting from flexible and predictable contract options.

Iron Mountain’s superior commitment to customer service was another factor in First Marblehead’s decision to colocate in the Boston data center. “We’re pleased to welcome First Marblehead into our Tier III LEED Gold certified facility in Northborough,” said Mark Kidd, senior vice president and general manager, data centers, Iron Mountain. “We look forward to extending the same high standards of service they’ve come to expect from our data management business to meeting their data center needs. Partnering with Iron Mountain allows customers to demand more and get more from their data center provider.”

Beginning September 2014, First Marblehead will occupy a secure dedicated space at Iron Mountain’s Boston data center, located in Northborough, Massachusetts.

- See more at: http://www.ironmountain.com/Company/Company-News/News-Categories/Press-Releases/2014/September/4.aspx

This Underground Data Center In Sweden Looks Like A Real-Life Bond Villain Lair

The Pionen White Mountains bunker is located 100 feet below ground and shielded by 16-inch-thick metal doors, all within a few miles of Stockholm, Sweden.

The bunker was used by Sweden’s Civil Defense in the 1970s but was decommissioned and converted into a data center in 2008 by Internet service provider Bahnhof.

It took more than two years to blast out the 141,000 cubic feet of space Bahnhof needed to fit all of its backup generators and server racks into the caves.

One organization that also used this space as a data center was WikiLeaks. In 2010 the bunker housed the website’s servers because Sweden has one of the world’s strongest laws protecting confidential source-journalist relationships.

And while most data centers are stationed in warehouses, this one looks almost like a Bond villain lair, with its glass-walled conference room, 1970s memorabilia and, of course, its unique waterfall features.

Bahnhof Chairman Jon Karlung says the plants and waterfalls were inspired by the 1972 Bruce Dern sci-fi film "Silent Running."

The bunker even has greenhouses, simulated daylight, a huge saltwater fish tank, and two huge submarine engines that act as backup power generators.

The company says its employees feel happier working in this environment and are more productive. Who wouldn’t be more productive working from a Bond-esque hideaway?

And if you’re wondering how safe you’d be working here: the bunker was originally built to survive an atomic bomb.

Sounds like a workplace dream!


InfoBunker

The lure of underground data centers

Subterranean facilities provide security and energy savings — and they’re cool, too.

- Tech Page One, Aug. 26, 2014


Every evil genius deserves his own subterranean bunker with a supercomputer to plot world domination. The economy being the way it is these days, however, most can’t afford to build their own lair. Fortunately, there are plenty of underground facilities that you can share with other businesses and organizations. There is a coolness about these locations, and not just because the ground temperature is in the 50s year round.
“There is a certain James Bond allure to the underground data center,” said John Clune, president of Cavern Technologies. “Many of our customers utilize the location in their marketing to show how seriously they take data storage and protection.”
One of TelecityGroup’s five data centers in Helsinki, Finland, is housed under the Uspenski Cathedral. Credit: Shutterstock
Here are four companies in the U.S. and Europe that are running underground data centers.

Unorthodox location

London-based TelecityGroup operates five data centers in Helsinki, Finland. One of them is located in a former bomb shelter 100 feet below the 150-year-old Eastern Orthodox Uspenski Cathedral. Thanks to sea water and a heat exchanger, it uses 80 percent less energy for cooling than a typical data center. But not all the heat is sent to the sea: the water first circulates to a heat exchanger serving the city’s district heating system, with the servers providing enough heat for 500 homes. The data center won an Uptime Institute Green Enterprise IT award in 2010.

Data mining

 

Cavern Technologies built a 50-megawatt, 300,000-plus-square-foot data center in a former limestone mine near Lenexa, Kansas. Being 125 feet underground means that the site is secure from the tornadoes, ice storms and hail that can hit above-ground data centers in the region.
“From day one, we have a hardened F5 tornado-proof [261 to 318 miles per hour] structure,” said Clune. “Above ground, it costs up to $150 per square foot for a hardened shell. On a 100,000-square-foot building, that is a cost savings of $15 million that we can pass on to our customers.”

Secure storage

The Green Mountain Data Center is a Tier III+ facility located in a former NATO ammunition storage facility on an island in western Norway. Three-hundred-foot-long tunnels connect the data center rooms to the outside world. Although the site is physically remote, high-speed connections mean it is only 4.5 milliseconds from Aberdeen, Scotland, and 6.5 milliseconds from London.

One of the main advantages of the site is its energy efficiency. Located on the shoreline, it draws 46-degree-Fahrenheit water from a fjord, an arrangement that allows a 200-kilowatt pump to produce 26,000 kilowatts of cooling. The system is designed for high-density computing, up to 60 kilowatts per rack. Since it is deep underground, the cooling system never has to offset heating caused by the sun striking the walls and roof. In addition, Norway has abundant hydropower, so the data center operates without greenhouse gas emissions and the power costs about 40 percent less than it does in London.
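
Those two numbers imply a striking coefficient of performance (COP), the standard ratio of cooling delivered to power consumed. A minimal sketch in Python, using only the figures quoted above:

# Sanity-check the cooling efficiency implied by the quoted figures.
# COP (coefficient of performance) = cooling delivered / power consumed.
pump_power_kw = 200     # fjord-water pump
cooling_kw = 26_000     # cooling capacity it produces

cop = cooling_kw / pump_power_kw
print(f"COP ~ {cop:.0f}")                          # ~130
print(f"~{cooling_kw // 60} racks at 60 kW each")  # ~433 high-density racks

A COP around 130 is an order of magnitude beyond what conventional mechanical chiller plants typically achieve, which is the crux of the fjord-cooling advantage.
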
“We wanted to build the greenest data center in the world and being underground helped in so many ways,” said Jonathan Evans, Green Mountain’s international accounts director.

Cold War bunker

When InfoBunker went looking for an ultra-secure location for a high-availability data center, it wound up taking over a building that was already designed for high-tech applications: a former military communications center near Des Moines, Iowa, that was built to survive a direct nuclear hit.
“From a functionality standpoint the building is performing exactly the same services it did while under military control, only now geared towards the private sector,” said Jeff Daniels, InfoBunker’s executive vice president. “From a cost perspective it was also far less expensive than a greenfield data center project as we could make use of almost all the existing base infrastructure and our [capital expenditure] was limited to upgrading/modernizing systems and building the actual data floor into what was essentially white-box space.”
Although at just 25 feet deep it is much closer to the surface than some of the other underground data centers, the amount of steel and concrete used gives it strength. Daniels said that replacing the building today, with all its hardening and critical systems, would cost over $100 million. But it did require additional work to drill holes for pipes and conduits.
“The floor is two foot thickness of 6,000 PSI-rated concrete and has steel reinforcing bars the size of your wrist all through it,” said Daniels. “It eats core drill bits like popcorn.”
The facility is designed for 10 kilowatts per rack and uses outside-air cooling nine months of the year, with waste heat from the servers used to keep the office spaces comfortable. The building stays at about 55 degrees Fahrenheit year-round and can act as a buffer to absorb some of the heat if the HVAC goes down.

Zombie apocalypse

Data centers such as these are even being touted as a place where people could potentially survive a “zombie apocalypse.”
“Despite advertising ourselves as a 20-megaton nuclear-hardened data center we do not anticipate ever being nuked,” said Daniels. “InfoBunker was however nominated one of the top six ‘zombie-proof’ green data centers worldwide.”

Friday, March 21, 2014


Questions to Ask Yourself When Selecting a Data Center: Is the infrastructure built to meet or exceed Tier III standards?

According to a study by the Ponemon Institute, the cost of data center downtime across industries is approximately $7,900 per minute, a 41% increase from the $5,600-per-minute cost measured in 2010. The same study showed that 91% of data centers have experienced an unplanned outage in the past 24 months. Facility outages are not only financially devastating, but seriously harmful to an organization’s reputation.
Thankfully, the data center industry has adopted a standardized methodology to determine availability in a facility, which will help you determine what is right for your business so you can make an informed decision.  Developed by the Uptime Institute, this tiered system offers companies a way to measure both performance and return on investment (ROI).
To be considered Tier III, a facility must meet or exceed the following standards:
  • Multiple independent distribution paths serving the IT equipment
  • Concurrently maintainable site infrastructure with expected availability of 99.982%
  • 72-hour power outage protection
  • All IT equipment must be dual-powered and fully compatible with the topology of a site’s architecture
Another important element in Tier III compliance is N+1 redundancy on every main component, which provides greater protection and peace of mind for crucial IT operations by ensuring a redundant system is always available in case a component fails or must be taken offline for maintenance.
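
To make the 99.982% availability figure concrete, here is a minimal sketch in Python converting that target into allowable downtime per year and pricing it with the Ponemon per-minute average cited above; the dollar figure is a cross-industry average, so treat the result as illustrative:

MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600

def annual_downtime_minutes(availability_pct):
    # Minutes per year a facility may be down at a given availability.
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

downtime = annual_downtime_minutes(99.982)   # Tier III target
cost = downtime * 7_900                      # Ponemon average, $/minute
print(f"Tier III allows ~{downtime:.0f} minutes of downtime per year")
print(f"At $7,900/minute, that is up to ${cost:,.0f} of annual exposure")

That works out to roughly 95 minutes a year, or about $750,000 of exposure at the study’s average rate.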

Each LightEdge data center, including the new Kansas City facility currently being built at SubTropolis Technology Center, meets or exceeds the concurrent maintainability requirements of the Uptime Institute's Tier III standards. With our Tier III infrastructure, any one component can fail and the data center will remain operational.

LightEdge’s Kansas City data center is scheduled to open during the spring of this year. Check out our Facebook, Twitter, Google+, and LinkedIn pages for the most recent photos of our construction progress. Download the spec sheet to learn more about the facility.


Some things to consider when running a data center

As businesses go increasingly digital, the need for data centers to secure company information is more important than ever before. It is not only tech giants like Google and Facebook that need a place to house their information - businesses across the healthcare, government and industrial sectors are looking to data centers as a solution to their storage needs. But running a data center is not something that can be done impulsively. Whether your company has the funds and scale of operations to occupy its own center or ends up looking into existing facilities, here are some important considerations to keep in mind to maximize an enterprise data center operation.

Consider renewable energy solutions
 
In Hollywood movies, data centers are generally represented as massive, noise-intensive operations that actively drain energy out of whatever area they occupy. This public perception of such facilities is understandable given that data centers must rely on a constant supply of energy - after all, their functionality depends on remaining active at all times. But just because they harness energy sources does not mean data centers can't function in an environmentally minded, sustainable way.
Just ask Google, a company that has been dealing with data storage needs ever since it rented its first data storage facility - a closet-sized, 7-foot-by-4-foot operation with a mere 30 computers - in 1998, according to CNET. Google has come a long way since then, and so has its dedication to sustainable methods of data center operation. The tech giant now has a vast network of data centers spanning the globe.
What unites Google's facilities is a singular commitment to renewable energy. With renewable energy currently powering more than a third of Google's data storage facilities, the company is always looking for ways to expand the use of solar and wind power, according to its site. Because it is challenging to have a renewable power generator on location, the company did the next best thing: It reached out to renewable energy providers in the area - such as wind farms - and made deals to buy energy from them. Among Google's energy suppliers are wind farms in Sweden and Oklahoma. Through these sources, the company is not only able to maintain solid data room cooling practices, but also to benefit the local community.

Have good outside air cooling

When it comes to maintaining an optimal data room temperature, it's best to follow the lead of companies well-versed in data storage. Google and Microsoft are two such businesses, and they both share a commitment to harnessing natural resources to keep their data centers cool.
In Dublin, Microsoft has invested more than $800 million to date in order to build a data center that covers almost 600,000 square feet. The enormous size of the facility would seem to present a major cooling challenge, but the company has been able to surmount that by using fresh air cooling, Data Center Knowledge reported. By building the center in Ireland, where the temperature remains optimal for data room cooling, Microsoft is able to maximize the location as a natural cooling solution - a move that saves significant energy costs while keeping the company environmentally friendly as well. And its commitment to environmentally sound solutions does not end with cooling: the center also recycles 99 percent of the waste it produces.
Google has a similarly cooling-minded approach with its data facility in Finland, which it hopes will be almost completely powered by wind energy by 2015, according to Data Center Knowledge. The wind energy will come from a wind park located nearby. But the center is not waiting until then to implement good temperature practice. Instead of relying on chillers and other machine-based cooling techniques, Google relies on seawater from the nearby Gulf of Finland to cool the facility. Its efforts in Finland are part of a broader effort to expand the Google sphere of influence.
"The Google data center in Hamina offers Eastern Finland a tremendous opportunity to jump from the industrial to digital age," said Will Cardwell, a professor at a nearby university.

But just as important as what goes on inside a center is the environment around it, because every data center is shaped by the physical location it occupies. With that in mind, here are some more things to look into in order to maximize your data center potential.

Choose the location wisely

Considering that data centers are necessarily connected to the physical environment they inhabit, it is important to pinpoint the best location possible. Data centers are always going to require top-notch capabilities to maintain a good server room temperature, but the ease with which that happens can depend on the location of the center. As always, Google is at the top of the game with regard to location selection. Its Hamina, Finland center is strategically placed near the Gulf of Finland, enabling an easy and natural data room cooling solution.
But Google is not the only company maximizing natural environments for data center growth. Iron Mountain specializes in underground data center solutions, according to Data Center Knowledge. Formerly a storage company for physical records, Iron Mountain already had a 145-acre underground storage facility in a former limestone mine before it got into the data center business. This location turned out to be perfect for data center needs. Blocked from the sunlight and other external heat sources, the underground facility stays at about 52 degrees without any kind of additional cooling function. An underground lake provides further protection against ever needing to bring in a machine cooling system. The company's so-called "data bunker" gained so much popularity that Iron Mountain decided to expand its sphere of operations.

Give back to the community the center is in

Data centers often require a big fleet of staff to operate. Fortunately, they're usually built near communities from which workers can be hired. But as much as data centers plan to benefit from the community they inhabit, it is just as important to look for ways to give back. This kind of behavior encourages connectedness with the community and improves the reputation of the center - and therefore the company - in the public eye.
Google paid special attention to the local community as it developed its Hamina center. When it began mapping out the concept for the center, Google realized that construction would take about 18 months, so it turned to the locals for help. In the process, it provided steady employment for 800 workers in the engineering and construction sectors, according to Data Center Knowledge. Google's willingness to involve locals in the construction process helped forge a lasting bond between the tech giant and the city.
This bond did not go unnoticed.
"Google's investment decision is important for us and we welcome it warmly," Finnish president Jyrki Katainen said.
And for those who work at the center, life is good.
"No two days are the same as we change our roles around frequently to keep things fresh and new," said Julian Cooper, a hardware operations worker at the facility.

Be prepared to surmount environmental obstacles

In the event of a disaster like a hurricane or earthquake, it is vitally important for all enterprises - especially data centers - to make sure their stock is safe. Iron Mountain understands the principle of environmental preparedness quite well, which is why it offers underground data storage solutions. By storing data underground, Iron Mountain protects it against virtually any conceivable natural disaster. This nature-proof construction is especially important for companies like Marriott, which chose to house data at the Iron Mountain bunker because of the sense of complete security it afforded.
"We have always had a rigorous and constant focus on having disaster preparedness in place," said Marriott operational vice president Dan Blanchard. "Today we have a data center that provides Marriott with a tremendous capability for disaster recovery, and we have a great partner in Iron Mountain."
According to tech journalist David Geer, earthquakes pose a huge threat to data centers in many areas around the world, since they can be difficult to predict and potentially cause large-scale damage. If a company intends to build its facility in an area susceptible to earthquakes, it should apply the most stringent safeguards, including building a center that is capable of withstanding a quake one degree higher than the requirement for the zone it occupies.

CHARLES DOUGHTY
Iron Mountain
Charles Doughty is Vice President of Engineering, Iron Mountain, Inc.
We’re all familiar with Moore’s Law, stating that the number of transistors on integrated circuits doubles approximately every two years. Whether we measure transistor growth, magnetic disk capacity, the square law of price to speed versus computations per joule, or any other measurement, one fact persists: they’re all increasing and doing so exponentially. This growth is the cause of density issues plaguing today’s data centers. Simply put, more powerful computers generate more heat, which results in significant additional cooling costs each year.
Today, a 10,000-square-foot data center running about 150 watts per square foot costs roughly $10 million per megawatt to construct, depending on location, design and cost of energy. If the approximately 15 percent rate of data growth of the last decade continues over the next decade, that same data center would cost $37 million per megawatt. A full thirty percent of these costs are related to the mechanical challenges of cooling a data center. While the industry is experienced with the latest chilled water systems and high-density cooling, most organizations aren’t aware that Mother Nature can deliver the same results for a fraction of the cost.
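
That projection is simple compound growth, and a quick sketch shows how the "approximately 15 percent" rate brackets the $37 million figure:

# Project construction cost per megawatt under compound annual growth.
cost_today = 10.0              # $ millions per MW today
for rate in (0.14, 0.15):      # "approximately 15 percent" per year
    projected = cost_today * (1 + rate) ** 10
    print(f"{rate:.0%} growth for a decade -> ${projected:.0f}M per MW")
# 14% -> ~$37M, 15% -> ~$40M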

The Efficiency Question: Air vs. Water

Most traditional data centers rely on air to direct-cool the IT equipment. When we analyze heat transfer formulas, it turns out water is far more efficient at cooling a data center, and the difference is in the math, namely the denominator:
CFM_air = Q / (1.08 × ΔT)        GPM_water = Q / (500 × ΔT)

Here Q is the heat load in BTU per hour, ΔT is the design temperature difference in degrees Fahrenheit, and 1.08 and 500 are the standard sensible-heat constants for air (per cubic foot per minute) and water (per gallon per minute).
With the example above, the energy consumed by the 10,000-square-foot data center creates over 5 million BTUs per hour of heat rejection. Using the formulas above and assuming a standard delta T of 10 degrees, this data center would require more than 470,000 cubic feet per minute (CFM) of air to cool the facility, but only 1,000 gallons of water per minute. The system would need between 150-200 horsepower to convey that many cubic feet of air per minute, but only 50-90 horsepower to convey 1,000 gallons per minute. Comparing the required flows directly – cubic feet of air per minute against gallons of water per minute – water moves the same heat with roughly 462 times less flow, and on a true per-cubic-foot basis, water is about 3,400 times more efficient than air.
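
Plugging the article's numbers into those formulas reproduces every figure in the paragraph above; a short sketch in Python (the 3.412 factor converts watts to BTU per hour, and 7.48 gallons make up one cubic foot):

# Reproduce the air-vs-water comparison from the formulas above.
watts = 10_000 * 150                 # 10,000 sq ft at 150 W/sq ft
q = watts * 3.412                    # ~5.1 million BTU/hr to reject
dT = 10                              # design delta T, degrees F

cfm = q / (1.08 * dT)                # air flow required
gpm = q / (500 * dT)                 # water flow required

print(f"Air:   {cfm:,.0f} CFM")      # ~470,000 CFM
print(f"Water: {gpm:,.0f} GPM")      # ~1,000 GPM
print(f"CFM to GPM: {cfm / gpm:.0f}x")               # ~463 -> the "462x" figure
print(f"Per cubic foot: {cfm / (gpm / 7.48):.0f}x")  # ~3,460 -> the "~3,400x" figure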

Physics 101: The Thermodynamics of the Underground

However, for an underground data center, there’s more at work. In a subterranean environment, Mother Nature gives you a consistent ambient temperature of about 50 degrees, so you depend far less on mechanical cooling to begin with, and you can gain further efficiencies by using an underground water source or aquifer.
The ideal environment for a subterranean data center is made of aquifers, or stone that has open porosity like basalt, limestone and sandstone; aquicludes, such as dense shales and clays, will not work as effectively. In a limestone subterranean environment, heat rejection can increase by anywhere from 4 to 500 percent because of the natural heat-sink characteristics of the stone: the limestone absorbs heat, which further reduces the need for mechanical cooling. The most appealing implication is that the stone can manage the energy fluctuations and peaks inherent to any data center.
As the water system funnels 50-degree water from the aquifer to cool the data center, heat is rejected into the water, which is then returned about 10 degrees warmer. Mother Nature deals with that heat by obeying the second law of thermodynamics, which governs equilibrium and the transfer of energy. For the subterranean data center operator, this means working within the conductivity of the surrounding rock, so it is important to know the lithology and geology of the local strata, along with understanding the effects of a continuous natural water flow and the psychrometric properties of air.

The Cost of Efficiency

Of course, there are other data center cooling strategies being used aside from the subterranean lake designs including well systems, well point systems and buried pipe systems to name a few. Right now, well systems are being used in Eastern Pennsylvania to cool nuclear reactors producing hundreds of megawatts of energy with mine water. Well point systems are generally used in residential applications, but the concept doesn’t scale well without becoming prohibitively expensive. Buried pipe systems are used quite a bit and require digging a series of trenches backfilled with a relatively good conductive granular material, but beyond 20-30 kilowatts, this method does not scale well.
How much does each of these methods cost? An underground geothermal lake design will cost less than $500 per ton, while well-designed chilled water systems range from $2,000-4,000 a ton. The discrepancy in cost is created by the mechanics – in a geothermal lake, there are no mechanics: water is simply pumped at grade. Well and buried pipe systems can cost more than $5,000 a ton, and they do not scale very well.
By understanding Mother Nature and using her forces to our advantage, we can increase the capacity and further improve on the effectiveness of the geothermal lake design. By drilling a borehole from the surface into the cavern, air transfer mechanisms can easily be incorporated; anytime the air at the surface is at or below 50 degrees, that cool air will drop into the mine. Even without motive force or air handling units, a four-to-five-foot borehole can contribute about 30,000 cubic feet of air per minute! If an air handling unit is added, the 30,000 CFM of natural flow can easily become 100,000-200,000 CFM. What was a static geothermal system is now a dynamic geothermal cooling system with incredible capacity at minimal incurred cost.
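
Using the same air-side rule of thumb as in the efficiency section, the borehole's free airflow translates into meaningful cooling capacity. A rough sketch, assuming the same 10-degree delta T used earlier (an assumption, since the article doesn't state one for the borehole):

# Rough cooling value of borehole airflow: Q = 1.08 * CFM * dT (BTU/hr).
dT = 10                            # assumed delta T, degrees F
for cfm in (30_000, 100_000, 200_000):
    kw = 1.08 * cfm * dT / 3412    # convert BTU/hr to kW of heat rejection
    print(f"{cfm:>7,} CFM -> ~{kw:,.0f} kW of heat rejection")
# 30,000 CFM of passive flow is ~95 kW; with air handlers, ~316-633 kW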

Opportunities for the Future

When analyzing and predicting what data centers are going to look like in the future, a recurring theme starts to emerge: simplicity and lower cost. Because of the cost pressures facing IT departments and CFOs alike, underground data centers using hybrid water, air and rock cooling mechanisms are an increasingly attractive option.
There are even opportunities to turn these facilities into energy creators. For example, by adding power generating turbines atop boreholes, operators can harness the power of heat rising from the data centers below. Furthermore, by tapping into natural gas reserves, subterranean data centers could become a prime energy source, thus eliminating the need for generators and potentially achieving a power usage effectiveness measurement of less than one. The reality is that if you know Mother Nature well, you can work with her – she’s very consistent – and the more we learn, the more promising the future of data center design looks.
 http://www.datacenterknowledge.com/archives/2014/03/18/tomorrows-data-centers-mother-nature-cools-best/

Friday, March 14, 2014

Going underground: Will data centers become data bunkers?


Hong Kong is one of the most vertical cities in the world. Like many cities, when it ran out of space, it simply built up instead of out.
But Hong Kong has almost reached its upper-limit for going skyward. And as a financial capital of the world with an incredibly dense population, space is at a premium. That means that large dedicated computing areas, like data centers, have to compete for space with everything else, even though they are in great demand.




The solution for Hong Kong might be to stop building skyward and to start looking under its feet. Data Center Knowledge recently reported that Hong Kong may dig out rock caves under the city and build new data centers down there. Apparently putting a data center in a deep cave isn't such a bad idea, because the naturally cooler below-ground air could help maintain temperatures as long as the cave is properly ventilated.
The biggest problem with the underground concept is likely to be price. In the Hong Kong scenario, digging out a tunnel for a data center was estimated to cost up to $600,000 per meter. That makes for an expensive project whose cost all but the most profitable data centers would be hard pressed to ever recoup.
But there may be other advantages to building underground. Apparently other underground data center projects are in the works, or have even been completed in other places, using decommissioned military bunkers as their base of operations. Swedish ISP Bahnhof converted a bunker below central Stockholm into a state-of-the-art data center back in 2008.
The main advantage to using a military bunker, besides the fact that it's already been dug out, is that they were built to survive a nuclear war. Governments looking for the ultimate level of security may want to consider it.
Even with the structure already in place, it will still be expensive to store data in a nuclear-proof bunker. Credit card companies will probably take advantage of it. So you can rest easy knowing that in the event of a nuclear war, both cockroaches and your MasterCard bill will survive.

Cavern Technologies builds area’s largest data center underground


Pete Clune, CEO of Cavern Technologies, is pictured in one of the data suites in its underground data center at Meritex Lenexa Executive Park.

Cavern Technologies, located in the underground portion of Meritex Lenexa Executive Park, has begun a $10 million, 100,000-square-foot expansion that will make it the largest data center in the Kansas City region.
The company, which has developed 60,000 square feet of underground space since its founding in 2007, will leapfrog ahead of 1102 Grand LLC in Kansas City, now the region’s largest data center with 110,000 square feet.
Pete Clune, founder and CEO of Cavern Technologies, said several factors were driving the company’s growth.
One is the fact that the data-storage needs of Corporate America are roughly doubling every year and a half, Clune said. Cavern Technologies’ more than 90 existing tenants are expanding their data-storage capacity by 30 percent annually, he said.
Another growth driver is the cost and reliability of electric power provided by Kansas City Power & Light, Clune said. KCP&L charges 8 cents per kilowatt hour, about half of what utilities on the coasts charge.
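
To see what that rate difference means in dollars, here is a quick sketch for a hypothetical 1-megawatt continuous load (the load size is illustrative, not from the article):

# Annual energy cost of a hypothetical 1 MW continuous IT load.
HOURS_PER_YEAR = 8760
load_kw = 1_000
for label, rate in (("KCP&L", 0.08), ("coastal", 0.16)):
    print(f"{label}: ${load_kw * HOURS_PER_YEAR * rate:,.0f} per year")
# 8 cents/kWh -> ~$700,800; 16 cents/kWh -> ~$1,401,600

At coastal rates, the same load costs roughly $700,000 more per year, which is the arbitrage Clune is describing.
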
Cavern Technologies charges its tenants based on power consumed rather than square footage occupied. But as part of its unique data center colocation model, Clune said, it developed the concept of data suites, which allow clients to house servers in dedicated spaces rather than on racks in a huge shared space.
Tenants also like the advantages of having their data center space underground, Clune said. Being “a data center without walls,” he said, Cavern Technologies gives tenants the flexibility of moving and expanding their space quickly. It also protects data from natural disasters and offers a 65-degree ambient temperature that helps tenants minimize cooling costs. A remote energy monitoring system and suite-design recommendations from Cavern Technologies’ staff also keep power costs down, said Scott Herron, the company’s vice president of data center operations.
In addition to providing access to multiple KCP&L substations, Cavern offers high-capacity bandwidth from multiple carriers. The data infrastructure is so robust, Clune related, that a London-based company with space in the center reported that it could send data from London to Lenexa to Scotland faster than it can send it directly to Scotland.
“With Cavern’s focus on the infrastructure piece, including the space, power, cooling, security and bandwidth, our client’s IT department can focus on their unique mission-critical business operations,” Clune said.
The model has attracted some of the nation’s leading health care, financial services, legal and tech companies, said Clune, whose son John is president of the company.
They have guided the underground business to full occupancy in its present 60,000-square-foot space. In addition, the company has secured commitments for 25 percent of the 100,000 square feet now being built out.
The company will finish the year with nearly $6 million in revenue, Pete Clune said, and will be posting $15 million to $20 million by the time the additional 100,000 square feet is fully occupied.
JE Dunn Construction is the contractor for the expansion. Bell/Knott & Associates is providing the data center design, and Gibbens Drake Scott Inc. is the engineering firm.
Bill Seymour, a senior vice president with Meritex Enterprises, which owns the underground park, said he began working with Pete Clune 10 years ago, when he operated a managed services firm at Meritex.
“I never imagined the type of scale Cavern has now achieved,” Seymour said. “Pete and John have proven the concept, done what they said they would do and give the customers what they want.”

Helsinki's underground master plan


CNN's Richard Quest takes a look at the development of Helsinki's vast underground and eco-friendly programme.
http://www.cnn.com/video/data/2.0/video/world/2011/02/14/qmb.fc.helsinki.underground.cnn.html

Iron Mountain

SOLABS Chooses Iron Mountain to Host its Data Center



Life sciences eQMS software provider selects Iron Mountain for its secure and compliant data center services

Iron Mountain(R) Incorporated (NYSE: IRM), the storage and information management company, announced it has signed a multi-year colocation agreement with SOLABS, an Enterprise Quality Management software (eQMS) provider for life sciences companies. SOLABS will co-locate aspects of its data center operations within Iron Mountain's underground data center complex in western Pennsylvania.
SOLABS' quality management software helps life sciences organizations automate quality operations and comply with Food and Drug Administration (FDA) regulations. Until now, the SOLABS QM software solution has been sold to customers as licensed software and installed on the client's network. Now, in response to customer demand, the company is preparing to offer the software as a SaaS/cloud offering. In order to make this switch, the company sought a secure and compliant data center home for the new offering. SOLABS selected Iron Mountain based on its track record of protecting customer information in secure and compliant facilities.
"Our customers entrust us to protect their electronic records, and we take that extremely seriously," said Philippe Gaudreau, chief executive officer, SOLABS. "When we made the decision to co-locate our data center operations, we wanted our customers to have peace of mind knowing where their data is being stored. Iron Mountain is a well-known and trusted storage and information management company that already provides services to many of our life sciences clients. This seemed liked a natural fit for us."
"We appreciate SOLABS' business and believe our secure data center colocation facility is a perfect home for their company's new SaaS offering," said Mark Kidd, senior vice president and general manager, data centers, Iron Mountain. "Iron Mountain Data Centers is designed to help companies like SOLABS in highly regulated businesses. From employee training to the infrastructure of our buildings, we take a stringent approach to complying with industry-specific regulations such as PCI, FISMA and HIPAA. Our DNA in managing information assets from creation to destruction also differentiates us to these organizations. No one in today's data center market has our track record in security and facilitating compliance."
Gaudreau added: "We wanted a cloud platform that allows us to provide a 24/7 SaaS offering and options to our on-premise clients such as constant monitoring, remote test environments, offsite backup and disaster recovery programs. By partnering with Iron Mountain, we will move closer to our vision of 'feeling local' for every client, no matter where the SOLABS QM software resides."
After more than a decade of providing wholesale data center space to corporate and government organizations, Iron Mountain now offers retail colocation services for companies that do not require a dedicated data hall. Iron Mountain's wholesale offering provides dedicated, secure space for all or part of an organization's data center operations and offers a range of services, including engineering and design, development and construction, and ongoing facility operations and management. Iron Mountain's retail colocation solution provides customers a shared environment with highly reliable, scalable, and secure power and cooling. Iron Mountain Data Centers customers also have access to additional services such as migration, networking, tape handling, IT asset tracking, disposition and more.
Based in western Pennsylvania, Iron Mountain's underground data center is ideal for enterprises and government clients seeking an ultra-secure environment. The former limestone mine is 220 feet below ground, providing both efficient geothermal cooling and natural protection from extreme weather.
For more information, visit: www.ironmountain.com/datacenters
About SOLABS
For more than thirteen years, SOLABS has helped organizations in the Life Sciences industry improve their operational efficiency and maintain compliance with '21 CFR Part 11' by automating their Quality Operations. SOLABS QM 10.0, released in 2013, is the company's sixth major release, exclusively dedicated to serving the Life Sciences industry. The software allows companies to manage Quality Processes such as CAPA, Complaint, Change Control, Audit, Controlled Documents (such as SOPs, Work Instructions and Methods) as well as Employee Training Activities in one single user interface. SOLABS QM can be implemented as a 'stand-alone' system but is also compatible with and may be fully integrated into ERP, LIMS and other enterprise software. For more information on SOLABS, contact Ericka Moore (info@solabs.com or 1-877-322-1368 ext. 219).
About Iron Mountain
Iron Mountain Incorporated (NYSE: IRM) is a leading provider of storage and information management services. The company's real estate network of over 64 million square feet across more than 1,000 facilities in 36 countries allows it to serve customers with speed and accuracy. And its solutions for records management, data backup and recovery, document management, and secure shredding help organizations to lower storage costs, comply with regulations, recover from disaster, and better use their information for business advantage. Founded in 1951, Iron Mountain stores and protects billions of information assets, including business documents, backup tapes, electronic files and medical data. Visit www.ironmountain.com for more information.
http://www.cityutilities.net/business/springnet.htm

After more than a decade of operation, City Utilities is looking to sell off the climate-controlled data center it operates out of the Springfield Underground.
It's not because the facility, known as SpringNet Underground, hasn't been successful - the cavernous space in the former limestone quarry has proved popular with businesses that need a secure place to store computer servers and other equipment.
So popular, in fact, that "we're coming to the point where we're out of space," CU General Manager Scott Miller said.

SpringNet® is a division of CU, which offers telecommunication services.

 SpringNet Continues Driving Jobs and Revenue for Local Community
A year has passed since we covered SpringNet in Springfield, Missouri, and its remarkable impact on local businesses and economic development. We recently spoke with SpringNet Director, Todd Murren, and Network Architecture Manager, Todd Christell, to get an update on how the network is progressing.
Demand for SpringNet’s high-speed data services continues to grow steadily. Financial statements for City Utilities of Springfield show the network generated $16.4 million in operating revenue last year against costs of $13.2 million. Better yet, revenues have increased around 3% per year while cost increases are closer to 0.5%. The end result is close to $3 million in annual net income for SpringNet. And all of this comes from a network that serves only commercial and public sector clients, because Missouri state law restricts municipal network provision to only “Internet service,” meaning SpringNet cannot offer triple-play packages to compete with incumbent providers.
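
If those trends hold, the margin widens quickly; a minimal projection from the reported figures (a straight-line extrapolation for illustration, not SpringNet's own forecast):

# Project SpringNet's net income if revenue grows ~3%/yr and costs ~0.5%/yr.
revenue, cost = 16.4, 13.2           # $ millions, last year's reported figures
for year in range(6):
    print(f"year {year}: net ${revenue - cost:.1f}M")
    revenue *= 1.03                  # ~3% annual revenue growth
    cost *= 1.005                    # ~0.5% annual cost growth
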
One of the highlights of SpringNet’s economic development success has been the attraction and retention of travel giant Expedia. After a large national provider failed to deliver on negotiations with the company, SpringNet stepped in to make sure Expedia brought its call center to Springfield. That effort has paid off handsomely for SpringNet and the local community. Expedia now employs close to 900 in the area after announcing in July that it was hiring another 100 employees in Springfield.
Up next for SpringNet is an effort to leverage its fiber infrastructure to create even more jobs. Believing that future job growth will revolve around the advancements enabled by gigabit networks, SpringNet is working with the Mid-America Technology Alliance (MATA) to host a hackathon with partners in Kansas City to explore what is possible between gigabit cities.
As Murren and Christell tell it, someone in Springfield can now send data to Kansas City with a 5-millisecond delay. It’s like they are in the same building despite being hundreds of miles apart. This capability spells opportunity for new ways of doing business and delivering services. SpringNet wants to help the gigabit community develop these opportunities.
Deep underground in Butler County, the Iron Mountain National Data Center functions like its own little city. With more than 3,000 badged employees, the former mine has its own fire department, restaurant, transit systems, water treatment center, and medical center. (Justine Coyne/Pittsburgh Business Times).

But what may be most surprising about this unique facility, located outside of Pittsburgh in Boyers, are the treasures stored inside.
Formerly a limestone mine owned by United States Steel Corp. (NYSE: X) in the early 1900s, the mine today houses everything from original prints of classic motion pictures to data storage for more than 2,300 businesses and government agencies across the U.S.
The facility sits 220 feet underground, and Nick Salimbene, director of business development for Iron Mountain, said that is what makes it so unique: it is virtually impervious to natural disasters. Being housed underground also provides a stable environment that naturally maintains a temperature of 52 degrees, he said.
Of the 145 acres that are developed, Salimbene said about 70 percent is used for physical storage. But demand is shifting toward more data storage: he said demand for it has been increasing between 20 percent and 30 percent annually in recent years, and about 80 percent of what is coming into the center today is data.
For the privacy of its customers, Iron Mountain does not disclose who uses the Boyers facility by name, but the list includes many big names. One tenant Salimbene could discuss is Corbis Corp., which houses its collection of over 20 million photographs there.
"There's a little bit of everything here," Salimbene said. "But the most important thing for us, is that our customers feel secure having their items located here."
Obviously, with such valuable objects in its facility, security is very tight at Iron Mountain, with armed guards keeping watch 24 hours a day, seven days a week.
"We have companies trusting us with their most valuable assets," Salimbene said. "That's not something we take lightly."

Saturday, November 2, 2013


http://www.theguardian.com/world/interactive/2013/nov/01/snowden-nsa-files-surveillance-revelations-decoded#section/1

Surprise Visitors Are Unwelcome At The NSA's Unfinished Utah Spy Center (Especially When They Take Photos)

 http://nsa.gov1.info/dni/index.html

 
 IF YOU HAVE NOTHING TO HIDE...YOU HAVE NOTHING TO FEAR
NSA
 Government is not reason; it is not eloquent; it is force. Like fire, it is a dangerous servant and a fearful master.
George Washington

If the freedom of speech is taken away then dumb and silent we may be led, like sheep to the slaughter.
George Washington



Most people who visit Salt Lake City in the winter months are excited about taking advantage of the area’s storied slopes. While skiing was on my itinerary last week, I was more excited about an offbeat tourism opportunity in the area: I wanted to check out the construction site for “the country’s biggest spy center.”

An electrifying piece about domestic surveillance by national security writer James Bamford that appeared in Wired last year read like a travel brochure to me:
In the little town of Bluffdale, Big Love and Big Brother have become uneasy neighbors. Under construction by contractors with top-secret clearances, the blandly named Utah Data Center is being built for the National Security Agency. A project of immense secrecy, it is the final piece in a complex puzzle assembled over the past decade. Its purpose: to intercept, decipher, analyze, and store vast swaths of the world’s communications as they zap down from satellites and zip through the underground and undersea cables of international, foreign, and domestic networks. The heavily fortified $2 billion center should be up and running in September 2013. Flowing through its servers and routers and stored in near-bottomless databases will be all forms of communication, including the complete contents of private emails, cell phone calls, and Google searches, as well as all sorts of personal data trails—parking receipts, travel itineraries, bookstore purchases, and other digital “pocket litter.”
My outing to the facility last Thursday was an eventful one. I can confirm that the National Security Agency’s site is still under construction. It was surprisingly easy to drive up and circle its parking lot. But if you take photos while there, it is — much like Hotel California – very hard to leave.

When the University of Utah professor who invited me to Salt Lake City to talk to his students asked how I wanted to spend three hours of downtime Thursday afternoon, the super-secret spy center was at the top of my list. The professor, Randy Dryer, was dubious about the value of visiting the construction site, assuming there would be a huge fence that would prohibit us from getting close or seeing anything significant. That turned out not to be the case.
View from the highway. There's a similar bird's eye view available on Google Earth.
We drove about 30 minutes south of downtown Salt Lake City to an area described to me as “out in the desert.” As we got close, I could see from the highway four grey mortared buildings that will soon be holding massive amounts of the world’s data. They appeared half-finished. I snapped some photos with my iPad (which, yes, does make me feel like a ridiculous person).
Then we came to a paved turn-off on the right that led directly to the facility. Driving up the road, we came to a sign emblazoned with the seals of the National Security Agency and the Office of the Director of National Intelligence; it was topped with a digital banner that proudly declared in flashing lights, “Look This… Sign Works!!!!” Behind the sign was a building that looked like a gas pumping station, minus the pumps. We took a right into a parking lot, where I snapped photos of the majestic view of the mountains that NSA data workers will have, another building that looked almost like a visitor’s center (it is #1 — a $9.7 million Visitor Control Center — in this diagram from Wired), and closer views of the data center and the unimposing, barbed-wire-topped fence that surrounded it. That seemed to be the end of the tour. I expressed surprise to Randy Dryer that no one had come out to see why we were slowly driving through the lot.
Two minutes later, as we circled back to the flashing sign to take a few more photos, including one of a green sign with an arrow that read, “Rejection Lane,” a uniformed but baby-faced officer with NSA and “K9 unit” badges came out and walked up to the car.
Where we stopped for an hour to 'engage in a chat'
“Were you taking photos?” he asked. I said that I was. He responded, “You’re going to need to delete those.”
I explained that I was a journalist and that I preferred not to. He insisted, saying we were on restricted federal property and that taking photos there was illegal. Luckily for me, Randy Dryer is not just a university professor but a practicing and long-experienced media lawyer. He explained to the officer who we were, why we were there and that we hadn’t realized we were on restricted property. The officer, who carried a gun and a portable radio, began writing everything we said down in a little green notebook. When the officer insisted again that the photos be deleted, Dryer asked if we could talk to his supervisor.
At this point another uniformed officer pulled up behind us. He came up to the car and went through essentially the same question-asking routine while the first officer, who had taken our driver’s licenses, walked away from the car to call his supervisor. Officer #2, who seemed slightly older than the first and also carried a little green notebook to record what we had to say, told us he would like me to delete the photos, mentioning that things would be easier if we did and that we could be charged with a crime for trespassing and for taking the photos.
Honestly, I was starting to feel pretty nervous at this point but also painfully aware of the irony of the situation. They didn’t want me to capture information about a facility that will soon be harvesting and storing massive amounts of information about American citizens, potentially including many photos they’ve privately sent.
I also remembered that I’d recently turned the passcode off on my iPad so it wouldn’t lock up on me during a presentation to political science students about “privacy watchdogs;” I suddenly had a strong urge to turn it back on.
View from the parking lot
We sat there for about 30 minutes with the car window down and the cold Utah air making its way inside. As we waited for “the supervisor,” we began chatting with the NSA officers. They asked for more information about us, including whether we had guns in the car. (This wouldn’t be hugely surprising in the state of Utah, but we did not.) I confessed that the photos I had weren’t terribly revealing. “You can see the facility from the highway,” I argued. One of the officers grimaced at that and suggested that this had occurred to him too; he “didn’t think they built it in the best spot.”
“We didn’t see any signs on our way in,” said Dryer. “They must be tiny.”
“Yeah, that road recently opened,” said Officer #1. “I was just thinking the other day as I was driving in that those signs are too small.”
I said that I expected the construction to be farther along at this point, given that the center is due to be completed in seven months. They said this was “just the half I can see.” Gesturing at the rather puny-looking fence, I told them it didn’t seem very high-security.
“Oh it’s stronger than it looks,” replied Officer #2. “It would stop a tractor trailer.”
“Yeah, but too bad it’s not higher and not see-through,” added Officer #1.

Meanwhile, I saw a more casually dressed man make his way into the back entrance of the building that would hold the junk food, sodas, and cash register if this were indeed the gas station it appeared to be. Officer #1 went in to join him. We asked who the supervisor was. “Agent Federman,” we were told. (It sounded like “Federman;” I’m not sure about the spelling.)
The mountainous view for Data Center workers
We sat in the car some more, while they — I assume — ran background checks on us, Googled us, checked my Forbes credentials, poked around my Facebook page and called other supervisors, and perhaps a Public Information Officer to decide what to do about us. After maybe another 15 minutes, an aggressively chummy man with piercing blue eyes, wearing a sweater and slacks, came out to the car. He introduced himself as a special agent and asked us to explain why we were there, with an aside to Officer #1 that he wanted him to record everything. Dryer offered a lengthy explanation, including all of the classes I’d spoken to. Agent Federman responded with a direct question: “Did anyone send you to take those photos and do you plan to distribute them to enemies of the United States?”
I would have laughed at that had I not been so intimidated and nervous. I said no one sent me and that I didn’t intend to do that. He asked why I did take them. I said I was amused by the sign and wanted to document the trip, and that I’m a journalist and recording information is what I do. He asked whether I would distribute or publish them, and I said again that I was a journalist so that was a possibility. He asked if I had already sent them from my device elsewhere. While the thought had certainly crossed my mind, I had not emailed, Facebooked, or Instagrammed them (yet). He asked me to describe the photos I’d taken, which I did.
He asked me again if I would delete them, saying this would make things easier. Feeling like Bartleby the Scrivener by that point, I told him that I would prefer not to. He told me I could have called the Public Information Office, requested a tour and gotten official photographs; he suggested I delete my photos and do that instead. (It struck me at that moment as his version of “come back with a warrant.”) Dryer asked if we could go on a tour now. “No,” he responded. He went back inside the building.
I later contacted James Bamford, the author of the Wired article, to ask whether he had requested a tour of the facility. He had not, as the center was just a hole in the ground when he wrote the article, many months before it came out. “But, having written about NSA for years, I’ve had little success in getting ‘tours’ of NSA facilities,” he said by email.
Now Officer #1 began asking for more information, such as my home address, the name of my hotel in Salt Lake City, where we had been driving from and where we were driving to. (If I didn’t have a government intelligence file before, I certainly do now.) He also asked for our social security numbers. We declined to give them – though I suspect it wouldn’t be very hard for these types to get them if they wanted them.
We began chatting again. Officer #2 expressed some personal discomfort about having photos taken, saying that if a photo of him were taken and put on the Internet, someone might come after him just because of who he worked for. “I had enough of that in the Army,” he said.
Officer #1 said they had to protect against “just anyone coming up here.” After the Wired article came out, there were two “sovereign citizens” who drove up and wanted to know “exactly what was going on in there;” the guards turned them away. The sovereigns are considered a domestic terrorist movement by the FBI. Officer #1 mentioned that both Dryer and I had clean records.
Our encounter with the officers started around 3:30. At this point, it was nearing 4:30. I was wondering if there was going to be a showdown and whether they were going to seize my iPad. I started thinking about whether I might have anything sensitive on there that I needed to worry about.
Agent Federman came back out. This time he came around to my side of the car. “Can I see the photos?” he asked. I was hesitant, but it seemed like a reasonable request; plus, I was starting to fear that a federal citation was going to be my souvenir from this trip. So I scrolled through the photos I’d just taken on my iPad for him. He apparently didn’t see anything too objectionable. He asked me to go through again and count them. There were 13. He asked if I would delete two of the photos, which showed a K9 unit SUV, including its license plate. I didn’t want to out of principle, but after an hour of being detained in a cold car – or, as they described it, “engaging in a chat” – I really wanted to leave. I agreed to do so. That’s when they let us depart.
Fittingly, the gas station-looking building where we were questioned turned out to be a checkpoint where people working in the facility will presumably have to show their credentials to gain entry. Our interrogation took place in the lane with the green sign that read “Rejection Lane.” We were the first of the rejects.
The only warnings to stay away from the facility at this point

On the road on the way out, we noticed the signs we had missed: a speed limit sign, a small yellow one to the right side of the road that said “authorized personnel only” and another at the turn-off – on the opposite side of the road, in front of a big field – that said “no trespassing.” We stopped at each one, of course, so I could take pictures of them.
It was an intimidating hour. While I’ve interviewed federal agents for stories, I’ve never been interrogated by them before. We may have been treated as gently as we were because I’m a mainstream journalist with a prominent platform and because I was accompanied by a lawyer. I was grateful that I could hold up “professional journalist” as my own badge; it felt protective.
I suspect this would’ve been a much more difficult encounter for someone without journalism credentials. That’s despite the fact that people have legitimate questions about the lengths to which intelligence agencies are going in order to monitor our communications and electronic activity to look for threats. My trespass and capture of information about the center was easy for NSA officers to spot, but the extent of the electronic trespassing against American citizens that might occur inside that data center when it’s finally completed will be much harder for us to discern. And, as the Supreme Court recently ruled in turning back a challenge to U.S. government surveillance of communications with people abroad, if you can’t prove that an unconstitutional invasion of privacy is happening, you can’t stop it from happening.
Kashmir Hill, Forbes Staff
I'm a privacy pragmatist, writing about the intersection of law, technology, social media and our personal information. If you have story ideas or tips, e-mail me at khill@forbes.com. PGP key here. These days, I'm a senior online editor at Forbes. I was previously an editor at Above the Law, a legal blog, relying on the legal knowledge gained from two years working for corporate law firm Covington & Burling -- a Cliff's Notes version of law school. In the past, I've been found slaving away as an intern in midtown Manhattan at The Week Magazine, in Hong Kong at the International Herald Tribune, and in D.C. at the Washington Examiner. I also spent a few years traveling the world managing educational programs for international journalists for the National Press Foundation. I have few illusions about privacy -- feel free to follow me on Twitter: kashhill, subscribe to me on Facebook, Circle me on Google+, or use Google Maps to figure out where the Forbes San Francisco bureau is, and come a-knockin'.
Underground Secure Data Center Operations

Technology companies are building new data centers in old mines, caves and bunkers to host computer equipment below the Earth's surface.

Underground secure data center operations are on an upward trend.

Operations have launched in inactive gypsum mines, natural caves, abandoned coal and limestone mines set deep below the bedrock, and decommissioned nuclear bunkers, all of them deep underground and secure from disasters both natural and man-made.

These facilities have advantages over traditional data centers, such as increased security, lower cost, scalability and ideal environmental conditions. Their economic model works, despite the proliferation of data center providers, thanks largely to the natural qualities inherent in underground spaces.

With anywhere from 10,000 to over 1,000,000 square feet available, there is ample space to subdivide to accommodate clients' growth needs. In addition, underground data centers have an effectively unlimited supply of naturally cool, 50-degree air, providing the ideal temperature and humidity for computer equipment with minimal HVAC cost.
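To put that cooling advantage in perspective, here is a rough back-of-the-envelope sketch in Python. Every figure in it (the IT load, electricity price and PUE values) is an illustrative assumption, not data from any facility mentioned here; the point is only that a lower PUE from "free" cooling translates directly into energy savings.

# Rough sketch: annual facility energy cost at two PUE levels.
# All numbers are illustrative assumptions, not measurements.

IT_LOAD_KW = 500          # assumed IT equipment load
HOURS_PER_YEAR = 8760
PRICE_PER_KWH = 0.10      # assumed electricity price, $/kWh

# PUE = total facility power / IT power. 1.6 is a plausible value for a
# conventionally chilled site; 1.2 assumes mostly free underground cooling.
PUE_CONVENTIONAL = 1.6
PUE_UNDERGROUND = 1.2

def annual_cost(pue):
    """Total facility energy cost per year for a given PUE."""
    return IT_LOAD_KW * pue * HOURS_PER_YEAR * PRICE_PER_KWH

conventional = annual_cost(PUE_CONVENTIONAL)
underground = annual_cost(PUE_UNDERGROUND)
print(f"Conventional: ${conventional:,.0f}/yr")
print(f"Underground:  ${underground:,.0f}/yr")
print(f"Savings:      ${conventional - underground:,.0f}/yr")

Under these assumptions the underground site saves about $175,000 a year on energy alone.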

They are among the most secure data centers in the world and hard to match in terms of square footage, scalability and environmental control.

Yet while the physical and cost benefits of being underground make them attractive, operators have also had to invest heavily in high-speed connectivity and redundant power and fiber systems to ensure their operations are not just secure but also state-of-the-art.

These operations initially focused on providing disaster recovery solutions and backup co-location services.

Some clients lease space for their own servers, with the operator providing the secure facility, power and bandwidth. Operators offer redundant power sources and multiple high-speed Internet connections, typically OC-level circuits on a SONET ring linked to outside connectivity providers through redundant fiber cables.
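The operational core of that redundancy claim can be sketched in a few lines: a watchdog that checks each upstream path and flags the moment the site is down to a single working link. This is a minimal illustration, not any provider's actual tooling; the link names are invented and the addresses are placeholders from the RFC 5737 documentation range.

# Minimal redundancy-watchdog sketch. A facility with redundant fiber is
# healthy only while at least two independent upstream paths respond.
# Addresses below are documentation placeholders (RFC 5737), not real routers.
import socket

UPSTREAM_LINKS = {
    "fiber-ring-east": ("192.0.2.10", 179),
    "fiber-ring-west": ("192.0.2.20", 179),
}

def link_is_up(host, port, timeout=2.0):
    """Treat a link as up if a TCP connection to its router succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_redundancy():
    up = [name for name, addr in UPSTREAM_LINKS.items() if link_is_up(*addr)]
    if not up:
        print("OUTAGE: no upstream connectivity")
    elif len(up) == 1:
        print(f"DEGRADED: only {up[0]} responding; redundancy lost")
    else:
        print(f"OK: {len(up)} independent paths up")

if __name__ == "__main__":
    check_redundancy()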

Underground data center companies augment their core services with disaster recovery solutions, call centers, network operations centers (NOCs), wireless connectivity and more.

Strategic partnerships with international and national information technology companies enable them to offer technology solutions ranging from system design and implementation to the sale of software and equipment.

The natural qualities of underground data centers allow them to offer the best of both worlds: premier services and security at highly competitive rates.

Underground data centers began appearing in the 1990s but really came into their own after the September 11 attacks in 2001, when their founders realized that former mines and bunkers offered optimal conditions for a data center: superior environmental conditions for electronic equipment, nearly invulnerable security and locations near power grids.

Adam Couture, a Massachusetts-based analyst for Gartner Inc., said underground data centers could find a niche serving businesses that want to reduce their vulnerability to any future attacks. Some underground data centers' fact sheets claim their facilities would protect equipment even from a cruise missile explosion or a plane crash.

After the September 11 attacks, companies went back and re-evaluated their business-continuity plans. That doesn't mean everybody changed them, but everybody revisited them in the wake of what happened, and for some companies an underground data center may be the answer.

Comparison chart: Underground data centers

Five facilities compared
InfoBunker, LLC
Location: Des Moines, Iowa*
In business since: 2006
Security/access control: Biometric; keypad; pan, tilt and zoom cameras; door event and camera logging
Distance underground: 50 feet
Ceiling height in data center space: 16 feet
Original use: Military communications bunker
Total data center space: 34,000 sq. ft.
Total space in facility: 65,000 sq. ft.
Clients include: Insurance company, telephone company, teaching hospital, financial services, e-commerce, security monitoring/surveillance, veterinary, county government
Hosted primary or backup data centers: 2
Services offered: Leased data center space, disaster recovery space, wholesale bandwidth
Distance from nearest large city: Des Moines, about 45 miles*
Cooling system, including cooling towers: Underground
Generators and fuel tanks: Underground

The Bunker
Location: Dover, UK
In business since: 1999
Security/access control: CCTV, dogs, guards, fence
Distance underground: 100 feet
Ceiling height in data center space: 12 to 50 feet
Original use: Royal Air Force military bunker
Total data center space: 50,000 sq. ft.
Total space in facility: 60,000 sq. ft.
Clients include: Banking, mission-critical Web applications, online trading
Hosted primary or backup data centers: 50+
Services offered: Fully managed platforms, partly managed platforms, co-location
Distance from nearest large city: Canterbury, 10 miles; London, 60 miles
Cooling system, including cooling towers: Underground
Generators and fuel tanks: Above ground and below ground

Montgomery Westland
Location: Montgomery, Tex.
In business since: 2007
Security/access control: Gated, with access control card, biometrics and a 24x7 security guard
Distance underground: 60 feet
Ceiling height in data center space: 10 feet
Original use: Private bunker designed to survive a nuclear attack. The complex was built in 1982 by Louis Kung (nephew of Madame Chiang Kai-shek) as a residence and headquarters for his oil company, including a secret 40,000-square-foot nuclear fallout shelter. The office building uses bulletproof glass on the first floor and reception area, and 3-inch concrete walls with fold-down steel gun ports protect the bunker 60 feet below.
Total data center space: 28,000 sq. ft., plus 90,000 sq. ft. of office space in a hardened, above-ground building
Total space in facility: 28,000 sq. ft.
Clients include: NASA/T-Systems, Aker Solutions, Continental Airlines, Houston Chronicle, ExpressJet
Hosted primary or backup data centers: 13
Services offered: Disaster recovery/business continuity, co-location and managed services
Distance from nearest large city: Houston, 40 miles
Cooling system, including cooling towers: Above and below ground; all cooling towers above ground in a secure facility
Generators and fuel tanks: Two generators below ground, four above ground; all fuel tanks buried topside

Cavern Technologies
Location: Lenexa, Kan.
In business since: 2007
Security/access control: Security guard, biometric scan, smart card access and motion detection alarms
Distance underground: 125 feet
Ceiling height in data center space: 16 to 18 feet
Original use: Limestone mine originally developed by an asphalt company that used the materials in road pavement
Total data center space: 40,000 sq. ft.
Total space in facility: 3 million sq. ft.
Clients include: Healthcare, insurance, universities, technology, manufacturing, professional services
Hosted primary or backup data centers: 26
Services offered: Data center space leasing, design, construction and management
Distance from nearest large city: Kansas City, 15 miles
Cooling system, including cooling towers: Air-cooled systems underground; cooling towers outside
Generators and fuel tanks: Underground

Iron Mountain The Underground
Location: Butler County, Penn.*
In business since: Opened by National Storage in 1954; acquired by Iron Mountain in 1998
Security/access control: 24-hour armed guards, visitor escorts, magnetometer, x-ray scanner, closed-circuit television, badge access and other physical and electronic measures securing the mine's perimeter and vaults
Distance underground: 220 feet
Ceiling height in data center space: 15 feet (10 feet from raised floor to dropped ceiling)
Original use: Limestone mine
Total data center space: 60,000 sq. ft.
Total space in facility: 145 acres developed; 1,000 acres total
Clients include: Marriott International Inc., Iron Mountain, three U.S. government agencies
Hosted primary or backup data centers: 5
Services offered: Data center leasing, design, construction and maintenance services
Distance from nearest large city: Pittsburgh, 55 miles
Cooling system, including cooling towers: Chillers above ground to take advantage of "free cooling"; pumps underground
Generators and fuel tanks: Underground

*Declined to cite exact location/distance for security reasons.
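For readers who want to poke at these figures programmatically, here is the chart's headline data as a small Python structure, transcribed from the profiles above (The Bunker's hosted-data-center count is listed as 50+, recorded here as 50):

# The comparison chart's headline figures as a data structure,
# transcribed from the facility profiles above.
FACILITIES = [
    {"name": "InfoBunker, LLC",               "depth_ft": 50,  "dc_sqft": 34000, "hosted": 2},
    {"name": "The Bunker",                    "depth_ft": 100, "dc_sqft": 50000, "hosted": 50},  # listed as 50+
    {"name": "Montgomery Westland",           "depth_ft": 60,  "dc_sqft": 28000, "hosted": 13},
    {"name": "Cavern Technologies",           "depth_ft": 125, "dc_sqft": 40000, "hosted": 26},
    {"name": "Iron Mountain The Underground", "depth_ft": 220, "dc_sqft": 60000, "hosted": 5},
]

# Example query: facilities ordered deepest-first.
for f in sorted(FACILITIES, key=lambda row: row["depth_ft"], reverse=True):
    print(f"{f['name']}: {f['depth_ft']} ft down, {f['dc_sqft']:,} sq ft of data center space")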