Archive for August, 2011

Getting data centers ready for disaster

Tuesday, August 30th, 2011
Hurricanes can have a major impact on data center operations.

The East Coast of the United States has become a proving ground for data center disaster recovery and preparation, as an earthquake recently struck Virginia, sending tremors as far north as New England, and a hurricane is expected to sweep through the entire coastal region in the coming days.

According to a recent TechTarget report, many data center managers are moving quickly to analyze their current strategies and evaluate if any changes need to be made before the hurricane strikes.

James Duffy, vice president and enterprise architect for Advanced Health Media, told the news source his company is currently working to check virtual private networks, bandwidth capacities and support models to make sure the data center will be able to handle more traffic from remote locations if employees are unable to make it into the office and have to work from home.

William Moore, CTO at CareCore National, which has data centers directly in the path of the hurricane, told TechTarget the company has taken strides to secure the building’s physical structure to keep equipment on server racks safe from the elements.

“We touched base with our diesel vendors and topped up our generators. We walked the roofs to make sure there was no loose flashing and checked the latches on our air handling units,” Moore told the news source.

Matt Cunningham, CareCore senior vice president for IT, told TechTarget the company has gone as far as to load test all of its data centers to make sure each is capable of handling the company’s entire workload on its own. This is performed in case a data center fails entirely due to the weather and needs to be quickly migrated to a backup facility that can handle the load.

Disaster recovery strategies are a key part of data center management, as companies need to prepare their data centers to withstand natural disasters regardless of where they are built. For example, Stream Data Centers recently announced plans to build a new data center in Richardson, Texas. The purpose-built facility is designed to meet the highest standards for technology and security, and it will follow Miami-Dade County building codes to ensure it can withstand natural disasters. The codes mandate that the building withstand sustained winds of 146 mph.

Stream Data Centers moving to Texas

Tuesday, August 30th, 2011
Stream Data Centers recently announced plans to open a new data center in Texas.

Stream Data Centers recently announced plans to develop a new data center property filled with server racks and state-of-the-art infrastructure in Richardson, Texas, a suburb of Dallas.

The new Stream Private Data Center will be built from the ground up into a 72,500-square-foot facility. The company chose Richardson because the town offers a robust power infrastructure capable of supporting a state-of-the-art data center, and its power utilities have a reputation for being especially inexpensive.

To help ensure the facility can function properly, even in some of the most extreme circumstances, the company is building the data center to meet the standards set forth by the Miami-Dade County building code. As a result, it must meet provisions requiring the structure to withstand sustained winds of 146 mph. The company is also striving to meet LEED Gold standards for environmental efficiency and sustainability.

The data center will be able to draw almost 3.4 megawatts of power, and it will be designed with infrastructure that can be upgraded to a capacity of more than 6.5 megawatts. The electrical system will use a standard 2N electrical/N+1 configuration fed by a dual-power supply system to provide redundancy. These power systems will support three separate areas of raised floor space for open frame racks, each section approximately 10,000 square feet.
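The redundancy arithmetic behind figures like these can be sketched in a few lines. The megawatt numbers below come from the article; the formulas are standard industry rules of thumb, not details of Stream's actual design.

```python
# Illustrative sketch of redundant-capacity sizing (standard rules of
# thumb; not Stream Data Centers' actual engineering).

def usable_capacity_2n(installed_mw: float) -> float:
    """In a 2N design every component is fully duplicated, so only
    half of the installed electrical plant serves the IT load."""
    return installed_mw / 2

def usable_capacity_n_plus_1(unit_mw: float, units: int) -> float:
    """In an N+1 design one spare unit covers the failure of any
    single unit, so usable capacity is (units - 1) * unit size."""
    return unit_mw * (units - 1)

day_one_it_load = 3.4   # MW available to IT at opening (from the article)
upgraded_it_load = 6.5  # MW after infrastructure upgrades (from the article)

# A 2N electrical plant sized for the day-one load must install twice it:
installed_electrical = day_one_it_load * 2
print(installed_electrical)  # 6.8 MW installed for 3.4 MW of usable load
```

The same arithmetic explains why 2N is the costlier topology: every megawatt delivered to the racks requires two megawatts of installed plant, whereas N+1 only adds one spare unit.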

Robert Kennedy, co-managing director of Stream Data Centers, said the project is especially exciting because so few purpose-built data centers have been constructed from the ground up in the region.

“We are excited to break ground on our new Private Data Center in Richardson. This will be the first ground-up, purpose-built wholesale data center in north Texas,” said Kennedy.

Richardson is emerging as a popular data center destination. The Digital Realty Trust recently announced the completion of the first phase of a project to upgrade a large complex in the Dallas suburb. Data Center Knowledge reported that the enhancements come in response to expanding demand for data center services in the region. The first refurbished building in the complex has now been fully leased by clients in a variety of industries.

The business case for data center upgrades

Tuesday, August 30th, 2011
Technological investments can pay off in the data center.

Data centers are complex facilities where equipment on server racks, power systems and cooling infrastructure must work in concert with other technological platforms to operate effectively. As a result, an upgrade in one area may require corresponding enhancements elsewhere in the facility. According to a recent InfoWorld report, this rule is especially apparent when it comes to cooling systems.

Periodically, InfoWorld publishes a true story drawn from an IT professional's personal experience. A recent report tells the tale of a data center employee who saw firsthand what can happen when cooling systems lag behind the rest of a facility's technology.

The data center involved in the report was running largely contemporary server and power systems, but it relied on an archaic cooling setup. Despite attempts to convince the company to upgrade the cooling infrastructure, business managers refused for cost, political and logistical reasons. That refusal created major problems one hot summer day.

The employee arrived from his morning commute to find the data center in disarray, with all of the windows and doors open and IT employees scrambling to set up as many fans as possible. The cooling system had given out in the early morning, and temperatures inside the facility had risen above 100 degrees. At first, hardware operations were maintained, but the company quickly realized it had to shut all of the equipment down to prevent hardware failure and subsequent data loss. Some systems were easy to power down, and approximately 20 percent of the data center servers were properly shut down immediately. The rest had to be turned off manually, and the original SAN was set up with a power system whose 50 switches had to be hit in a specific order to shut down safely. The whole shutdown process took a few hours.
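An emergency like this is why many shops script their shutdown sequence in advance. The sketch below shows the general idea of tiered, dependency-ordered shutdown; the hostnames and tier layout are hypothetical, not details from the InfoWorld story.

```python
# Hedged sketch of a tiered emergency shutdown: take down the tiers that
# depend on storage before the storage itself, mirroring the SAN's
# requirement that its switches go down in a fixed order.
# All hostnames below are hypothetical examples.

import subprocess

SHUTDOWN_ORDER = [
    ["app-01", "app-02"],   # application servers first
    ["db-01"],              # then databases
    ["san-controller"],     # storage last, in its required sequence
]

def shutdown_tier(hosts, dry_run=True):
    """Shut down every host in one tier; dry_run just prints the plan."""
    for host in hosts:
        cmd = ["ssh", host, "sudo", "shutdown", "-h", "now"]
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            subprocess.run(cmd, check=True)

for tier in SHUTDOWN_ORDER:
    shutdown_tier(tier)  # each tier completes before the next begins
```

Even a simple ordered list like this turns a few panicked hours of manual switch-flipping into a rehearsable, repeatable procedure.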

Two days later, the air conditioning system was running again. However, the report said, it took another similar failure to get the company to upgrade the infrastructure, and even then the enhancement merely duplicated the existing system to add redundancy.

Businesses that operate data centers need to understand that the cost of downtime can often outweigh the expense of upgrading infrastructure. A recent Ponemon Institute whitepaper examined the total cost of downtime and found that the average data center loses approximately $5,600 per minute when systems are down.
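The Ponemon figure makes the business case easy to run as back-of-the-envelope arithmetic; the three-hour outage below is an illustrative duration, not one from either report.

```python
# Back-of-the-envelope downtime cost using the Ponemon Institute
# figure cited above.
COST_PER_MINUTE = 5_600  # USD per minute of downtime (Ponemon estimate)

def downtime_cost(minutes: int) -> int:
    """Estimated loss for an outage of the given length in minutes."""
    return minutes * COST_PER_MINUTE

# A multi-hour shutdown like the one in the InfoWorld story, at 3 hours:
print(downtime_cost(3 * 60))  # 1008000 -> roughly $1 million
```

Against a seven-figure loss for a single hot afternoon, a redundant cooling plant stops looking like an optional expense.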

East Coast earthquake has minimal impact on data centers

Tuesday, August 30th, 2011
An earthquake that struck most of the East Coast had almost no impact on data centers.

A recent earthquake with an epicenter in Virginia, whose effects were felt through the Carolinas and as far north as New England, had minimal impact on data center operations: equipment on server racks was unscathed, and many operators reported almost no disruption to services, PCWorld reported.

According to the news source, many data center operators reported their status through Twitter and a variety of other channels, such as company blogs. Overall, some companies experienced slight disruptions because facilities were evacuated, but hardware was not adversely affected. A few other organizations reported minor issues with web page response times and other loading delays, but these were slight and short-lived.

For example, PCWorld said that SeatGeek, a company that hosts its infrastructure on Amazon Web Services, experienced a major spike in page load times as a result of the earthquake. In a company blog post, SeatGeek described its experience during the earthquake.

“Over here at SeatGeek, we were excitedly discussing the tremor when Mike, our trusty sysadmin, realized that our Amazon AWS servers were all in Virginia, right near the epicenter,” the blog said.

The SeatGeek blog went on to say the company learned two valuable lessons from the experience. The first is that web servers typically respond poorly to earthquakes. The second is that real-time analytics are extremely beneficial. SeatGeek was able to track how the servers were affected by the earthquake as the company responded to the event and worked to resolve any issues that occurred.

Many other service providers explained how the earthquake impacted them. For the most part, data center operators said they really were not impacted by the earthquake, because the tremors were relatively slight in nature, and facilities are built in such a way that they would not be impacted significantly by a minor earthquake. However, the report said, some operators are using the event as an opportunity to emphasize the importance of disaster recovery, because situations could have been much worse.

Earthquakes can be devastating for data centers, especially when they disrupt power infrastructure. This issue struck many facilities during the disaster in Japan earlier this year. During the earthquake and tsunami, power infrastructure was knocked out of service to such an extent that even redundant power systems could not cope.

Data center, supercomputer networks becoming more similar

Friday, August 26th, 2011
Data center and supercomputer networks are quickly becoming more similar.

Data center and supercomputer infrastructures often rest on server racks in state-of-the-art facilities capable of providing foundational support for some of the most advanced technologies in the market. However, data centers traditionally have not had to focus too much on networking. Virtualization is changing this and making networks more important in the data center sector. As a result, data center and supercomputer networks are quickly becoming more similar, according to a recent EE Times report.

The news source compiled data from a variety of major industry research papers slated for presentation at the Hot Interconnects event, concluding that both data center and supercomputer network infrastructures are becoming more intelligent in design.

The report explained that the papers include detailed accounts of three major network deployments drawn from both supercomputer and data center settings. Though the three networks are built from different elements, their core structural and foundational designs are more alike than one might expect, according to the report.

Fabrizio Petrini, chairman of the event and a researcher at IBM, told the news source the growing similarity between supercomputer and data center networks comes because businesses are attempting to install smarter, more efficient infrastructure.

“The trend we see is for efficacy reasons networks are becoming smarter, and more of the networking stack is moving into the network chip – even parts of the message-passing interface library,” Petrini told the news source. “You can execute algorithms in the network in nanoseconds at line rate – this is very powerful and fast compared to sending operations to the compute node.”

According to the EE Times, the Hot Interconnects event will also see presentations that focus on how supercomputer networks will grow to a point where they can handle extremely large quantities of data and how new switching solutions are impacting the data center sector.

Smart networks are becoming critical in the data center sector because virtualization tends to put more data on less physical hardware. As a result, many industry experts agree that simply adding more bandwidth is not an adequate solution because the bandwidth will quickly be used up by data-heavy applications. Instead, network optimization and virtualization are typically presented as the top data center networking solutions because they improve routing and can make the infrastructure operate more intelligently.

Digital Realty Trust finding success in Texas

Friday, August 26th, 2011
Dallas and the surrounding region are proving to be a prime area for data center expansion.

The Digital Realty Trust has found another tenant for its redeveloped, state-of-the-art data center in Datacenter Park – Dallas, which it recently opened in Richardson, Texas. This means the facility is now fully rented.

According to a recent Data Center Knowledge report, the multi-tenant facility boasts approximately 105,000 square feet of space for open frame racks and other equipment. It also has approximately 13 megawatts of electricity to support IT systems.

The data center is designed to provide robust and diverse hosting services, and it now includes space for clients in financial services, consumer products, colocation and international network services, the report said.

Datacenter Park – Dallas features a number of facilities of varied sizes. The Digital Realty Trust is currently redeveloping a number of the buildings on the site as part of its plans to serve the rapidly growing data center market in the region. The site features fiber-optic network connectivity from multiple providers, 40 megawatts of power with enough capacity to expand to 125 megawatts and more than 750,000 square feet of total data center space.

The Digital Realty Trust is now moving to the second phase of its upgrades and new leasing movement at the park, creating more opportunities for expansion, according to the report.

Brent Behrman, senior vice president at Digital Realty Trust, told the news source that the data center park’s access to inexpensive power and the company’s efforts to upgrade the facility make the park one of the premier hosting locations in the region.

“Our redevelopment of the property is delivering first-class data center space to a market that has very strong demand from companies based in north Texas as well as from national and multinational corporations with operations in the Dallas-Fort Worth area. Regardless of size, our customers can expect the same high level of quality, reliability, security and attention at all of our facilities,” Behrman told Data Center Knowledge.

The Digital Realty Trust has been expanding aggressively. The company has announced plans to purchase new land near London to grow its footprint in the region, and Data Center Knowledge reported that it intends to expand even further beyond that initial purchase.

Nebraska officials considering new data center project

Friday, August 26th, 2011
The Lincoln Board of Education is set to consider the merits of a colocation arrangement.

Officials on the Lincoln Board of Education are considering an investment in a new data center project for the Nebraska-based school district. The project, if approved, would involve converting a boiler room into a data center and filling it with server racks, the Lincoln Journal Star reported.

The project is currently at a standstill, however, as the Lincoln Board of Education recently announced plans to delay a vote on the facility’s construction to consider a proposal from a local business.

The report explained that the Lincoln Chamber of Commerce and Lincoln Independent Business Association encouraged the Lincoln Board of Education to delay its vote on the project so the group could hear a proposal from a local colocation provider that may be able to offer the same functionality but at a lower price.

The project has some urgency due to a recent fire in district facilities, and the board is prepared for construction, having already set aside funds to refurbish the boiler room with data center upgrades in mind.

New data center projects abound throughout the United States. Internap recently announced the completion of a new data center in Boston, helping the company expand its footprint outside of its primary area of operation in Atlanta.

Data center managers considering new networking procedures

Wednesday, August 10th, 2011
New networking trends are emerging in the data center sector.

Carrier Ethernet has been the typical network transport for most data center operators in recent years. However, the rising volume of data traffic has outpaced carrier Ethernet, and data center managers are increasingly seeking new ways to connect equipment on server racks to the internet, Data Center Knowledge reports.

According to the news source, many data center operators are tapping into dark fiber networks and using them in conjunction with DWDM infrastructure to improve bandwidth capabilities. However, this arrangement creates difficulties in terms of accessing a variety of telecom networks.

As a result, the report recommends data center managers implement connection-oriented Ethernet, a solution that combines MPLS-TP with a variety of other Ethernet technologies to support increasing bandwidth rates and provide the telecom connectivity many data center operators require.

Connection-oriented Ethernet combines optical and packet-based transport technologies to deliver high-performance network capabilities, making it ideal for data center interconnection networks, the report said.

Improving data center interconnections is a rising trend in the sector. Recently, NYSE Technologies and DuPont Fabros Technology partnered to create a data center interconnection network between their facilities.

Data centers could heat homes

Tuesday, August 9th, 2011
Data center exhaust could end up heating homes.

Improving efficiency has been a key movement in the data center sector. To a great extent, strategies to make facilities more sustainable focus on keeping equipment in server racks cool. However, using exhaust to heat offices is becoming more prevalent.

According to a recent study from researchers at Microsoft and the University of Virginia, using exhaust heat to warm homes could be feasible.

The study, which was titled, “The data furnace: Heating up with cloud computing,” theorized that servers running cloud systems could be used to provide heating for a wide variety of purposes.

While the servers that enable the cloud do not get hot enough to generate steam usable as an energy source, the study found they produce the right amount of thermal energy to heat buildings.
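The underlying arithmetic is simple: nearly all of the electrical power a server draws leaves it as heat, which can be expressed in the BTU-per-hour units used to rate furnaces. The server wattage and cabinet size below are illustrative figures, not numbers from the study.

```python
# Rough "data furnace" arithmetic: essentially every watt a server
# draws is released as heat. Example figures are illustrative.
BTU_PER_HR_PER_WATT = 3.412  # standard watts-to-BTU/hr conversion

def heat_output_btu_per_hr(server_watts: float, servers: int) -> float:
    """Total heat output of a group of servers, in BTU per hour."""
    return server_watts * servers * BTU_PER_HR_PER_WATT

# A small cabinet of 20 servers at 300 W each:
print(round(heat_output_btu_per_hr(300, 20)))  # 20472
```

Roughly 20,000 BTU/hr sits in the range of a modest residential heating appliance, which is why the study sees small server deployments as plausible building heaters.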

Realistically, apartment and office buildings are the most viable places to deploy small data centers and use their exhaust for heat. However, the study also experimented with single-family homes and found the approach has potential.

While reusing exhaust heat may improve efficiency, it is not a panacea. A recent ZDNet report explains that data center managers need to take a holistic approach to improving efficiency rather than depend on a single, ultra-sustainable strategy.

SSDs fit eBay’s storage need

Tuesday, August 9th, 2011
eBay recently upgraded one of its data centers with SSDs and achieved astounding results.

eBay recently faced a major problem: its server racks were packed with equipment, and virtual machines were being slowed by poor I/O infrastructure, a bottleneck that required new hardware investments to resolve.

According to a recent Computerworld report, the major online retailer was eventually able to resolve its issues by investing in new solid state drives to replace hard disk drives, which are comparatively slow.

The report said eBay was able to move approximately 100 TB of data to SSDs in a year and is already seeing benefits, including a smaller equipment footprint on open frame racks and other storage gear, along with dramatically improved performance.

Michael Craft, eBay’s manager of QA systems administration, told the news source the company has now installed 12 SSD storage arrays that rival HDD storage capacities.

“This is a pure play in our virtualization stack, but we’re looking to expand that as it fits other people’s needs,” Craft told Computerworld.

SSDs have the potential to offer dramatic performance increases in the data center. According to a recent Tom’s Hardware study, SSD reliability may not be as good as some predict but may instead be fairly even with HDDs. However, SSD performance is definitely better, with low-end models outperforming HDDs by more than 85 percent.