
Volume 3 Number 7

September 2011

Data Center Consolidation


Getting the most processing power in the smallest, sustainable space possible.

Air handlers move air to cool the data center at the National Renewable Energy Lab. Learn more about this LEED Platinum facility on page 10.
Photo: Courtesy NREL/DOE; Photographer Dennis Schroeder


By Jeff Erlichman, Editor, On The FrontLines

OMB Boosts Responsibility of Federal CIOs

For years Federal CIOs have had to do more with less. On August 8, 2011, OMB Memorandum M-11-29 changed all that. OMB Director Jacob Lew wrote: "As the federal government implements the reform agenda, it is changing the role of Agency CIOs away from just policymaking and infrastructure maintenance, to encompass true portfolio management for all IT. This will enable CIOs to focus on delivering IT solutions that support the mission and business effectiveness of their agencies and overcome bureaucratic impediments to deliver enterprise-wide solutions. This memo is designed to clarify the primary area of responsibility for Agency CIOs throughout the government, as identified in the IT Reform Plan." Later that evening on the White House Blog, Steven Van Roekel, US Chief Information Officer, wrote about The Changing Roles of Federal Chief Information Officers. He wrote that federal CIO achievements have already fundamentally changed the way the federal government manages IT, and that the memorandum will help CIOs deliver on key areas to drive results and yield an even greater impact. He reiterated Lew's message that agency CIOs must be positioned with these responsibilities and authorities to improve the operating efficiency of their agencies. Added to their statutory responsibilities, CIOs now have four main areas in which they have the lead role: governance, commodity IT, program management and information security.

Track Data Center Consolidation Progress @ Data.gov


The Federal Data Center Consolidation Initiative targets a minimum of 800 data center closures by 2015. On July 20, 2011, the US CIO released the latest FDCCI compliance guidelines (see page 5). Also, to keep the effort on track, the Federal CIO Council has organized a government-wide Data Center Consolidation Task Force comprised of data center program managers, facilities managers and sustainability officers from 24 agencies. These leaders meet monthly to review the progress of each consolidation project and ensure government-wide alignment between agency efforts, where appropriate. As its work evolves, the Task Force will serve as "a community of practice for agency CIOs and data center program managers to share best practices and enhance consolidation effectiveness," according to the CIO Council. In all, 373 data centers are scheduled to be closed and consolidated by the close of 2012, including 114 in 2011 and 178 more by December 31, 2012. Click here to track FDCCI progress at Data.gov.
Source: Data.gov August 15, 2011

Counting On Consolidation
Neither Lew nor Van Roekel gave CIOs any guarantee of new IT investments to consolidate data centers or move applications to the cloud. Budget realities make this problematic. But what every CIO finally gained was more power to do more, even though they may have to do more with less. Lew wrote, "CIOs must drive the investment review process for IT investments and have responsibility over the entire IT portfolio for an Agency." For years now, many CIOs have wished for control over their agency's IT portfolio. Now they have it. The question is: Can they use their power to get their agency to work together to shed silos of excellence, eliminate duplicative systems, reduce the number of data centers and move applications to the cloud? They have their work cut out for them to reduce operating dollars, turn around or terminate troubled projects and deliver meaningful functionality even faster, all while enhancing the security of information systems. CIOs are counting on data center consolidation to do three things: reduce operating costs, increase operating efficiency and then have the savings fund needed new IT investments or migration to the cloud. Accountability now rests squarely with these agency CIOs. Now they have all the reasons in the world to just do it.

Inside On The FrontLines
4: FDCCI + SSPP = DOE's Approach To Data Center Efficiency
8: One Path To Consolidation: Use ESPCs
10: NREL's Living Laboratory
12: OTFL Interview: Grant Schneider, CIO, Defense Intelligence Agency
14: OTFL Viewpoint: Jim Flyzik on Will Budget Cuts Darken Clouds?
16: Which One Are You: Microwave, Oven or Crock Pot?
18: OTFL Industry Roundtable: Their Best Advice
22: Resources

Copyright 2011 Trezza Media Group, Public Sector Communications, LLC


By Jeff Erlichman, Editor, On The FrontLines

With a PUE of 1.13, the data center at the National Renewable Energy Lab is a living model of how to design and build a data center. Courtesy: NREL/DOE; Photographer Dennis Schroeder

FDCCI + SSPP = DOEs Approach To Data Center Energy Efficiency


DOE's Strategic Sustainability Performance Plan (SSPP), combined with the FDCCI, provides a clear path to reducing the overall energy and real estate footprint of government data centers.

Under the FDCCI, agencies are attacking consolidation through a combination of new data center construction and the retrofitting of existing facilities. Teams comprised of IT, facilities, procurement and business operations staff must work together to figure out how to offer the most processing power in the smallest space possible. Further, by FY15 all data centers (new, retrofits and those remaining) must meet the Council on Environmental Quality (CEQ) benchmark of a 1.4:1 PUE (Power Usage Effectiveness) ratio. PUE is the metric used to determine the energy efficiency of a data center; a perfect PUE ratio is 1.0 (1:1). No wonder agencies are wishing they had someone with the certified title of Data Center Energy Practitioner (DCEP) on their staff: a professional practitioner capable of doing assessments and advising facilities or IT on the best practices to implement in the data center.

Certified Data Center Energy Practitioner

Helping make the DCEP a reality is Jake Wooley, DOE's manager of IT Sustainability Programs. "We are truly promoting integration and unified management," Wooley told On The FrontLines in a recent interview. "We are talking about collaboration between the facilities organizations (who are primarily responsible for energy management) and IT organizations (who have not in the past been responsible for energy management). The goal is to get the two to work together, take a holistic view and meet the energy efficiency requirements," Wooley explained. "Our approach is to take a business look at what the costs are. Look at the different data centers and see what best practices, what types of retrofits need to be done to bring it up; and then let some smart business decisions determine which ones we are going to keep and which ones we are going to close." Wooley acknowledged that many facilities and IT personnel operate data centers without a true understanding of what might be the best practices for energy efficiency. "Through the Industrial Technologies Program (ITP) at DOE, a Certified Data Center Energy Practitioner training and certification program has been initiated," said Wooley.

Measure To Manage
Wooley said the certified DCEP is just one component of DOE's SSPP approach to reaching the 1.4 PUE data center efficiency benchmark. The others include doing a DC-Pro assessment (now required for every DOE data center); doing a Green IT best-practice self-assessment; including data center and IT energy conservation measure (ECM) projects in Site Sustainability Plans (SSP); and monitoring CPU utilization through sub-metering.

"Our recommendation is managers integrate both the sustainability and our data center consolidation initiatives, so that we are able to leverage both and achieve the goals that are set for both initiatives," urged Wooley.

Quantifying Consolidation Benefits


During the Federal Executive Forum on Data Center Consolidation, government leaders talked about the tangible benefits consolidation delivers.

Cindy Cassil, Director, Systems and Integration Office, Department of State (Watch Video)
Cheryl Rogers, Director, Office of IT Optimization, FAA (Watch Video)
Mike Mestrovich, Senior Technology Officer for Innovation, Defense Intelligence Agency (Watch Video)

Energy Assessments
The DOE national labs, working with the ITP, have developed what they call the Data Center Profile tool, or DC-Pro assessment tool. "The Data Center Profile tool allows you to do an energy assessment at data centers, and we now have a new version," said Wooley. "Part of DOE's best practices is we are requiring our sites to do an annual energy assessment for our data centers to identify what are some of the opportunities for implementing best practices. Each site is responsible for developing an annual site sustainability plan and identifying what they are going to do." The new tool is being rolled out now and allows DOE sites to do the assessment. "We can roll it up and monitor on the dashboard how well they are doing. That's going to come out here as soon as we issue our guidance for the sustainability plan, and we are looking to that as a best practice," said Wooley.


Much To Do
Wooley told OTFL there are a lot of benefits to solving sustainability and data center consolidation issues, not just saving taxpayer dollars. "It's also about improving the operational efficiency of our infrastructure and being able to leverage IT to do our job better. Mobile computing, cloud computing, things like that all play in this as far as what we are trying to do here." However, Wooley concedes it is going to take a major initiative and a major push "in order to make the US government much more on par with industry as far as how their IT infrastructure and how they do their business. And even though my focus is on the sustainability side, it's all contributing to achieving a major transformation in IT infrastructure and the government being able to do their job better and at a reduced cost." After all, "if you can't measure it, you can't manage it," Wooley said, echoing a familiar refrain.


Wooley wants government to adopt the best practices that make DOE's National Renewable Energy Lab data center in Golden, Colorado DOE's gold standard. More on page 10. Courtesy NREL/DOE; Photographer: Dennis Schroeder


The FDCCI Challenge: Is It Attainable?


Go to Data.gov and you can track how many data centers each agency currently has and how many each needs to trim by 2015. There is even a map showing where those on the chopping block are. If all goes well, by the end of 2012, 373 data centers (nearly 20% of the FY15 target) will have been consolidated and/or eliminated. But 800 by 2015 is still a daunting challenge. Is it really attainable? What about savings? Are they real? On October 7, we will get an idea of how much can be saved when agencies post their consolidation plans on their public websites. Data centers can comprise from 30% to 50% of the IT budget, so reducing the number of data centers by 20% should translate into billions of dollars in savings. Hopefully that money will be reinvested in other agency IT priorities, not just disappear from agency budgets.

Deadline Time For FDCCI Deliverables


On July 20, 2011, an OMB memo from the federal CIO laid out the latest FDCCI activities and deliverable deadlines.

FDCCI Drivers/Benefits
Promote the use of Green IT by reducing the overall energy and real estate footprint of government data centers
Reduce the cost of data center hardware, software and operations
Increase the overall IT security posture of the government
Increase the use of more efficient computing platforms and technologies

FDCCI Timeline
August 31, 2011: FDCCI Industry/Government Day. By August 31, 2011, GSA informs agencies about DCC solutions and offers a forum to share consolidation lessons learned and approaches.
September 30, 2011: DCC Plans & Progress Report. All missing elements of consolidation plans are submitted to GSA with a full quarter-by-quarter schedule of closures through FY15. CIOs must attest in writing to the completeness of plans.
October 1, 2011: Quarterly Reporting of DC Closures Begins. Starting October 1 and every quarter thereafter, agencies make a current list of planned and completed closings available to www.Data.gov, cross-posted on www.cio.gov.
October 7, 2011: Publish Agency Data Consolidation Plans. Agencies post their plans on their public websites and cross-post them with www.cio.gov.
March 1, 2012: Government-wide Best Practices Report Published. The DCC Task Force will report on the adoption of DC consolidation and optimization best practices.
June 9, 2012: Create Government-wide Marketplace for Data Center Availability. By this date, OMB, GSA and the FDCCI Task Force will create a marketplace to better utilize spare DC capacity and establish a mechanism for agencies to offer and acquire capacity and infrastructure.
FY13 Budget and Beyond: Accountability. Agencies are required to include DCC cost savings, account for changes in asset inventories and reflect the progress of consolidation, including funding needs, in the agency's FY13 budget submission. OMB will work with agencies to finalize figures.
Source: OMB Memo, Vivek Kundra, July 20, 2011



One Path To Consolidation: ESPCs


Energy Savings Performance Contracts (ESPCs) allow Federal agencies to accomplish energy savings projects without up-front capital costs.

ESPCs are IDIQ contracts managed by DOE's Federal Energy Management Program (FEMP). DOE is using an ESPC IDIQ to transform the IT infrastructure at DOE headquarters; its leaders then want to use what they have learned to help you. Jake Wooley, DOE's manager of IT Sustainability Programs, told OTFL that ESPCs are alternative financing contract vehicles already used extensively throughout the federal government. The DOE ESPC is comprised of 16 energy service companies qualified to assist customers in identifying energy conservation measures for their facilities. If there is a major need, the contract holders obtain third-party financing to make the needed capital investments. The investment is paid off over time, based solely on the energy and operations and maintenance savings, which have to be measured and verified. "What this does is allow management to make a major change to a facility and a capital investment in energy efficiency," Wooley explained. It is also something that is equally applicable to both new construction and retrofits. "If you are looking at shutting down a bunch of data centers and want to put them into a new facility, then it could be used for that also," said Wooley. "If you are looking at trying to retrofit and bring a data center up to a level of energy efficiency, it could be used for that as well."
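The repayment mechanics Wooley describes are easy to sketch. Below is a minimal simple-payback illustration in Python; the dollar figures are invented for the example, and real ESPC task orders also account for financing costs, escalation and the measurement and verification Wooley mentions.

```python
# Illustrative ESPC payback sketch. All figures are hypothetical examples,
# not DOE or FEMP data.

def espc_payback_years(capital_investment: float,
                       annual_energy_savings: float,
                       annual_om_savings: float) -> float:
    """Years of verified savings needed to repay the contractor's investment
    (simple payback; real ESPCs also price financing and escalation)."""
    annual_savings = annual_energy_savings + annual_om_savings
    if annual_savings <= 0:
        raise ValueError("An ESPC only works if there are verified savings.")
    return capital_investment / annual_savings

# Hypothetical retrofit: $5M invested, $450k/yr energy plus $150k/yr O&M savings.
years = espc_payback_years(5_000_000, 450_000, 150_000)
print(f"Simple payback: {years:.1f} years")  # ~8.3 years, within the 25-year maximum term
```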

Proof of Concept

Wooley says DOE is using an ESPC as a proof of concept to transform the IT infrastructure at DOE headquarters. "We are going to look at consolidating our IT infrastructure, retrofitting one of our data centers to be highly energy efficient, and consolidating our infrastructure to that facility. Then we will close the other data centers and server rooms that we have at our headquarters location. It's never been used for a transformation of the IT infrastructure. So this is a proof of concept at DOE headquarters," explained Wooley. While any government agency can use the DOE ESPCs for its own data center consolidation efforts, Wooley noted that they are not for small projects. "Normally they are for fairly significant sites, like chiller plants, doing lighting upgrades, putting solar panels in. They are fairly significant capital investments that are required to do that. The challenge that we are going to face with the ESPC, and again I call it the proof of concept, is that many of the facilities upgrades have a long, long life cycle," he said. However, while facilities have a long life cycle, IT does not. So technology needs to be aligned with operations and maintenance to get the energy savings needed to help pay off capital investments. That's why, Wooley said, "we are going through this proof of concept to verify, to see what the challenges are; we want to document the lessons learned, but we want to make it a repeatable process for other agencies to use. So the proof of concept, we are going to be the first test case to make it work."

Ready Next Year

Using the ESPC, Wooley says DOE has identified its energy services company, which is just beginning its preliminary assessment at no cost to the government. "They come in, they baseline infrastructure; we tell them what the scope of the effort is going to be. They'll go through that and it's probably going to take 8 to 10 weeks to do." The result is a report that identifies and recommends what kinds of projects would be good for the ESPC. "We sit down with them and go through them and identify which ones we want them to pursue. Then they do what they call investment grade audits," Wooley explained. "That's where they have to do an engineering study and actually develop the business case in order to go and secure the financing for the capital investment. Once that's completed, then we actually write the task order (TO)," added Wooley. Not until the projects are implemented and government is actually achieving the savings does the repayment process begin. The ESPC guarantees that the improvements will generate energy cost savings sufficient to pay for the project over the term of the contract. After the contract ends, all additional cost savings accrue to the agency. Contract terms up to 25 years are allowed, according to DOE. Because it takes time to do the proof of concept, Wooley said "our time line would be the first half of next year that we would be prepared to write a task order on it. But again, once we have that task order written, they have already done all the engineering work; it's a matter of them just implementing."

ESPC Quick Facts


Using ESPCs, as of May 2011, more than 570 projects worth $3.9 billion had been implemented at 25 Federal agencies in 49 states and D.C. These projects saved an estimated:
32.8 trillion Btu annually, equivalent to the energy consumed by 345,000 households or a city with a population of 893,000
$13.1 billion in energy costs (approximately $10.1 billion goes to fund energy efficiency projects and $3 billion is reduced Federal Government spending)
Source: DOE

NRELs Living Laboratory


When you post an 81% savings in your data center energy bills, data center managers and budget analysts alike take notice.
Courtesy NREL/DOE; Photographer Dennis Schroeder

PUE stands for Power Usage Effectiveness, a key metric for measuring data center energy efficiency. Figuring out PUE is simple: divide the total energy the facility consumes (cooling and power distribution plus the IT equipment itself) by the energy consumed by the IT equipment alone. A perfect PUE is 1.0. For example, at the new National Renewable Energy Lab data center in Golden, Colorado the PUE is 1.13. At the old NREL legacy data center the PUE was 3.3. In fact, most data centers' PUE is in the 3+ range. But that will have to change by FY15, when all federal data centers (new, retrofits and those remaining) must meet the Council on Environmental Quality (CEQ) benchmark of 1.4. Clearly, federal data center managers need to cash in on what NREL is doing if they are going to meet their PUE target.
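Because PUE is just the ratio of total facility energy to IT equipment energy, the arithmetic fits in a few lines. In the Python sketch below, only the 1.13, 3.3 and 1.4 figures come from the article; the kWh loads are invented purely to reproduce those ratios.

```python
# PUE = total facility energy / IT equipment energy.
# A perfect score is 1.0: every watt the facility draws reaches the IT gear.

def pue(it_energy_kwh: float, cooling_kwh: float, power_losses_kwh: float) -> float:
    """Power Usage Effectiveness for a given period (all inputs in kWh)."""
    total_facility = it_energy_kwh + cooling_kwh + power_losses_kwh
    return total_facility / it_energy_kwh

# Hypothetical monthly loads, chosen only to match the ratios cited in the article.
legacy = pue(it_energy_kwh=100_000, cooling_kwh=200_000, power_losses_kwh=30_000)
nrel   = pue(it_energy_kwh=100_000, cooling_kwh=10_000,  power_losses_kwh=3_000)

print(f"Legacy data center PUE: {legacy:.2f}")  # 3.30
print(f"NREL data center PUE:   {nrel:.2f}")    # 1.13
print("FY15 CEQ benchmark:     1.40")
```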

Spreading Innovation

Chuck Powers, NREL IT Strategist, is in charge of the data center. He told On The FrontLines in a recent interview that while the new data center has redefined world class, "we have a lot to live up to and we've been successful. Replication is a very important outcome of the ideas and concepts that we've put in, including the data center," Powers explained. "The good news is lots of organizations (including On The FrontLines) have toured our data center, and organizations have asked us to come in and evaluate and perform energy assessments for their data centers. So we are on track to meet that expectation." So far, Powers has already worked with his colleagues in Berkeley to write a best practices guide for building energy efficient data centers. At the same time, he is being very proactive spreading the word not only about the NREL data center, but about how all of the 17 DOE national labs are working on sustainable IT management. The result is a new report from DOE titled DOE Laboratories Leadership in Green IT. "I worked with all 17 labs and tried to capture their successes, stories or accomplishments around the whole IT sustainable management of desktops and laptops, their data center efforts and who is using renewable energy to power data centers," Powers said. "I tried to capture a story about how high performance computing really contributes to overall sustainability. Even though they consume lots and lots of watts, we are saving thousands of watts because of the outcomes. So even though these high performance computing resources are consuming lots and lots of power, the outcomes are probably some of the most impressive outcomes for environmental sustainability anywhere. So that was my point, and the lab runs thousands of these models annually. They've got thousands of these projects."

More than 19,000 linear feet of wood from trees killed by bark beetles was used to decorate the lobby of the new RSF. But this is no ordinary wall. As it climbs through the lobby space, the wall tilts in and out giving it a 3-D look all its own. Courtesy NREL/DOE; Photographer Heather Lammers

Prescription for Savings


The American Institute of Architects Committee on the Environment (COTE) honored the NREL data center with its 2011 AIA/COTE Top Ten Green Project Award. The AIA COTE Top Ten Projects program celebrates projects that exemplify sustainable design solutions arrived at through an integrated approach to architecture, natural systems and technology. So, how did NREL build a data center so sustainable and environmentally friendly? I had the opportunity to tour the Golden, Colorado facility recently. From the hot-aisle containment to virtualization technologies, the facility is a living laboratory for data center managers. Read my complete story on AOL Government. Jeff Erlichman


Grant Schneider

OTFL Interview

Deputy Director for Information Management and Chief Information Officer, Defense Intelligence Agency

The DIA CIO talks about data management, data center consolidation, the Quad Initiative and more in this OTFL interview.

"For me, what is exciting about this job is getting to work with all these different communities. I really find that as the DIA chief information officer I have maybe not a unique, but a very broad view of DIA and the entire Intelligence community." Grant Schneider spends a lot of his time interacting with peers within DIA and across the Intelligence community. "I am looking for ways I can leverage what's already been done, where someone has put intellectual capital in or solved a problem," he told OTFL editor Jeff Erlichman in a recent interview. "You asked earlier: what is the benefit to the customer? Providing tangible benefits to the customer is what keeps me excited." Here is more from that one-on-one.

On The FrontLines: What progress have you made in the last two years to restructure in terms of IT infrastructure and IT management? Can you describe DIA data center consolidation plans and current progress?

Grant Schneider, DIA: We've got a number of things that we are doing (in the areas of) data consolidation and data center consolidation, depending on how you define a data center. What I mean by that is: for a number of years we've been in the process of bringing as much data back into our central repositories as we can. For example, we operate today nearly 140 Defense Attache offices within embassies around the world. A number of years ago we would have email servers within each of those Attache offices and their email resided there. So we centralized those services and brought them back into central data centers so that we can better do disaster recovery, protect the data and be good stewards of the information while maintaining access from the edge to it. So over the last few years we've been in the process of bringing in as much data as we can for operational and security reasons.

We also have been in the process of reducing the number of data centers that we have. About five years ago DIA took over, if you will, (and) it was probably more a merger and acquisition with the J2 elements at each of the combatant commands; there, they each had an IT organization largely delivering IT on the TSSBI environment, and those assets got rolled into DIA. So we've been in the process of establishing what we call an Enterprise Service Delivery Center in St. Louis, where we've been centralizing services and elements from each of the combatant command locations to get all our data centralized there. Looking forward, we are in the process of moving to having two major data centers for the environment: the one in St. Louis that I mentioned, and today our second one, which is within our building in Washington DC. We are in the process of looking to transition from the DC one into a more purpose-built data center facility, as opposed to a building that's really made for people and has a computer room in the basement.

OTFL: Is this new data center construction, or are you retrofitting?

Grant Schneider, DIA: It looks like where we are headed is to partner with the Department of Energy and leverage Oak Ridge National Lab. We are going to end up in an existing facility that they have down there. We are looking at it as a partnership with the Department of Energy and with the Lab. Also, we think there are some mission areas where we can actually leverage each other moving forward. They are going to own the facility, run it, manage and operate the facility. We are going to retain the database, (and) running and managing of the systems and applications that will be there, almost all of which will be done remotely, and be able to continue to be the systems admins, if you will.

OTFL: How have your efforts to better manage data affected your data center consolidation and cloud initiatives?

Grant Schneider, DIA: Our need to better manage and protect our data is driving us to some centralization and consolidation of it, which is also driving us to need more data center space. But the other aspect of it for us (is) from a data management perspective, and this touches a little bit on the cloud environment. When I think of the cloud environment I think of three elements of the cloud: one being the data layer, and I'll come back to that; one being an applications layer, where the applications reside, whether they are widgetized or however they are residing in that layer; and the third one being an identity and access management layer, which determines who I am as a user, which applications I get to use, and which data sets get returned to me when I run applications against the data.

This is really a transition for us to get to where we leverage and utilize the data in what I would call its native location, or whatever the authoritative data source is. So instead of needing a brand new capability and making copies of the data and the search engines and the link analysis tools and marrying them up into one proper noun capability, we are looking at the ability to run any of our applications against any of our data, because our data is indexed in such a way that all the applications can see it and readily digest the information.

OTFL: What is your Quad Initiative? Is there a timeline to implement?

Grant Schneider, DIA: The Quad Initiative is an agreement. NSA, NRO, NGA and DIA have gotten together and their Directors have said we want to be able to share information more easily; we want it to be more seamless across our architectures and our infrastructure; and we'd like to save money and we want to enhance our security. So (there are) really three objectives in the Quad environment. They've tasked their CIOs to come back on how we can best do that, and we are going to be going back to the Directors this fall. But we are building out a phase one expansion of what we are calling a Common Operating Environment, which will be for users of the Quad. We are going to start with 200 users from across our four agencies, and they are going to be able to access this Common Operating Environment, and from there they will have access to some common services (email, collaboration, voice, video, that sort of thing) so that they can talk to each other seamlessly across other networks.

And from the Common Operating Environment they will be able to get access to data and information that they need from any of the agencies. So we will be taking some of our (for me, as I was just talking about, our data layer and our applications layer) and I will be pointing it into the Common Operating Environment. I'm not going to replicate everything into the Common Operating Environment. The data will largely stay within each of our four infrastructures, but from the Common Operating Environment users will be able to get into the data sets that they need. And it will be based on who they are as opposed to where they are. Historically, access to data or even discoverability of data is often based on what network you happen to be residing on, as opposed to who you are. By having a Common Operating Environment we will be able to get to where it's truly attribute-based: I'm Grant Schneider, I'm the DIA CIO, I have whatever the clearance is, and therefore I am able to see this amount of data and get access to a probable subset of that data.

OTFL: How do you feel about the possibility for CIO responsibilities or authorities to expand?

Grant Schneider, DIA: I think it's exciting. We are in a tumultuous financial environment right now for the nation, and we are in one that's exciting because we've really got to put our focus into making the right business decisions for the nation, and for me, for DIA as an agency and for the Intelligence community. I think the memo really highlights the fact that the CIOs do have a perspective that's somewhat unique in the environment, because we see the entire enterprise, whatever enterprise that is, agency level or national level, where we sit. So I think it's exciting that that's been recognized. One of the conversations we have in the agency is we often talk about our mission folks and our enablers or support element, and where does IT sit? We sort of come down to: it's in both. We are an enabler, and yet the mission can't get done without IT being there. So I think it's exciting to see the government look at the CIO role. It is going to have some impact, I think, from what's on paper, and the reality is I think most of the CIOs have been operating to some degree in that manner, depending on how their agency and department heads wanted them to, for years.
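Schneider's point that access should follow who you are rather than what network you are on is, in essence, attribute-based access control. The sketch below is a generic Python illustration of that idea; the attributes, clearance labels and policy rule are invented for the example and do not represent DIA's actual implementation.

```python
# Minimal attribute-based access illustration. Attributes and policy are hypothetical.
from dataclasses import dataclass

CLEARANCE_RANK = {"CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

@dataclass
class User:
    name: str
    agency: str        # e.g. "DIA", "NSA", "NGA", "NRO"
    clearance: str     # e.g. "TOP SECRET"

@dataclass
class DataSet:
    label: str
    min_clearance: str
    sharing_agencies: set   # agencies allowed to see this data set

def can_access(user: User, dataset: DataSet) -> bool:
    """Decision based on who the user is, not which network they came from."""
    cleared = CLEARANCE_RANK[user.clearance] >= CLEARANCE_RANK[dataset.min_clearance]
    shared = user.agency in dataset.sharing_agencies
    return cleared and shared

analyst = User("example analyst", agency="DIA", clearance="TOP SECRET")
report = DataSet("example report", min_clearance="SECRET",
                 sharing_agencies={"DIA", "NSA", "NGA", "NRO"})
print(can_access(analyst, report))  # True
```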


Will Budget Cuts Darken Clouds?


By Jim Flyzik The Flyzik Group

OTFL Viewpoint

Everybody is talking cloud computing and data center consolidation to increase efficiency and reduce cost. If hundreds of data centers are closed, it is intuitively obvious that a lot of money will be saved. However, getting there in light of the current spending cuts may not be so easy. It will take some upfront money to do the consolidations that achieve those savings down the road. But will the investment money be there to replace/virtualize/move equipment? The mood in Congress is clear: cut spending now. During recent discussions with several CIOs, we talked about the potential lack of funding available to make the data center moves. One CIO talked about the fact that his office only had about 20% of the needed funds to move to the consolidated location and lacked funds to virtualize and modernize before moving. Thus a worst case scenario arises: only some of the current data center equipment can be moved this year, resulting in incurring O&M costs at the new location while still needing to operate the existing location. Some argue to let the private sector own the equipment and offer it back to the government as a service. But this creates a second problem with the assumption of risk associated with Infrastructure as a Service (IaaS) at consolidated centers. What happens if the program for which the equipment was purchased is cut? Will the private sector be free to use the equipment for other agencies or commercial entities? We are talking about a culture change here that is significant. We're talking about Service Level Agreements and contractual terms and conditions unlike any in past government programs.

Culture and Security Challenges

Another challenge to agencies with diverse cultures will be the cost allocation model for the consolidated centers. During my days at Treasury, we had a sound business case that a consolidated nationwide network saved the department a lot of money. But not a day went by that we didn't haggle over how the 14 bureaus would share the costs. I heard 14 different ways to allocate costs, each bureau spinning it to justify why it should pay less and the other 13 bureaus more. This issue will undoubtedly be a big deal with any significant data center consolidation involving multiple component offices.

And then there are the security issues. On the technical side, there are actually some opportunities to view security holistically, make improvements and get a chance to get rid of all the band-aid security fixes of the past years. There are many promising products and solutions emerging for virtualized environments and cloud computing. The more difficult challenges will be agreeing on levels of security. When I was at Secret Service, we had significantly different security requirements than the other Treasury bureaus, as did the other bureaus with differing missions. Privacy concerns too were different. A place like the IRS had more stringent privacy concerns where sensitive taxpayer information was involved. A seemingly innocuous application such as email consolidation will raise issues about who manages the system, who has access to whose email, levels of security on the system, access to social sites and wireless synchronization, to name just a few.

The Reality Is...

I don't mean for this article to sound the alarm and darken the outlook for cloud computing and data center consolidation. Rather, it is about pointing out some of the realities and complexities that will be encountered, hoping they can be dealt with early before becoming show stoppers. It is about keeping momentum up for the investment dollars needed to reap the long-term savings. In discussions around town certain lessons learned are becoming evident. Before doing a consolidation, agencies need to have in place a strong governance process capable of resolving difficult issues. A means to bring decisions to closure is crucial to staying on schedule. Pre-defined cost allocation processes need to be agreed to early on in the program. And relentless leadership needs to be exhibited to keep funding from being swept up to cover other cuts. Data center consolidation and cloud computing are examples of areas where IT can prove its role in dramatically reducing costs and improving government services. It is imperative that the IT community rally to support these efforts. We don't want political bickering over budget cutting to jeopardize these promising opportunities.

For Data Center Consolidation, A Hybrid World Down The Road

I see a hybrid world down the road, a combination of commercial products entering the government workspace and some things government will have to continue to do on its own Watch Video


By Jeff Erlichman, Editor, On The FrontLines

Which Group Of IT Chefs Are You In? The Microwaves, The Ovens or The Crock Pots?


The Data Center Consolidation Cookbook will help IT chefs save money and improve service levels/attributes no matter where you are in your consolidation program.

They say the best medicine is laughter. So, I had to chuckle out loud during my recent interview with MeriTalk founder Steve O'Keeffe as he was describing the differences between the Microwaves, Ovens and Crock Pots. While I was chuckling as I was listening, and thinking about half-baked solutions, O'Keeffe and a prominent committee of data center consolidation leads are making serious progress to craft a how-to, best-practices guide for agency data center consolidation chefs. "We are always talking about doing more with less, but every year we do more with more," O'Keeffe noted. "We can't afford to do things the way we did them; we have to find new ways and we have to work to change that equation."

Data center consolidation is complex and difficult. What are the best practices from agencies that have achieved success? Is there a cookbook to share?

"What the DCC Cookbook will do is provide straightforward analyses for different types of agencies that have different needs," explained O'Keeffe. He said the groups are divided by the number of data centers they have to cut: more than 75; between 20 and 75; and less than 20. What he envisions is that these groups, which have similar issues, will come together to share best practices. However, the current reality is agencies are setting their consolidation alarm clocks to ring according to how fast and how radically they have to consolidate. Hence, the Microwaves are those who are moving ahead fast, the Ovens are those who are moving more slowly, and those whose plans are still simmering are the Crock Pots. The plan is to create different DCC Cookbooks for each group, O'Keeffe explained. No matter what group an agency is in, O'Keeffe said all agencies need to think first about how to break consolidation into manageable phases; then focus on low risk/high return projects; always do everything to ensure non-stop services; continue to learn cumulatively and build a best practices library; and finally quantify your success to win support for the next phase.

Fall 2011 Report

The committee will report its findings during Fall 2011 and includes DHS's Maggie Graves, Interior's Bernard Mazer, ATF's Walter Bigelow and Anil Karmel, Los Alamos National Lab solutions architect.

Cookbook Contents

In the DCC Cookbook, consolidation starts with taking stock of your ingredients, "an audit to establish a baseline," said O'Keeffe. Then comes planning. Engage business owners. Set up a Customer Advisory Committee that includes the CFO. Project needs and financial, IT personnel and business goals. Define SLAs and map cost/operational models. Consolidation is a time for modernization. This is the time to think about applications and moving to open source solutions. This is the time to consolidate platforms and negotiate software licenses. This is the time to de-duplicate and automate, map your timeline and prioritize costs. When it comes time to implement, use the spiral approach while engaging your Customer Advisory Committee. Use an approved project management methodology such as EVM or CPIC, and leverage it to manage customers, contractors, and your team. The final analysis is to measure: by evaluating budget, tracking against SLAs, closing the loop with the Customer Advisory Committee at every opportunity and developing takeaways and lessons learned that carefully document each phase. O'Keeffe said meetings with agencies are well underway and open panel sessions like the one held at Innovation Nation are keeping the project on track. Further, when complete, there will be a Resource page for each consolidation phase where chefs can reach out and get advice from experienced cooks such as Jake Wooley from Energy or Karen Petraska from NASA. "It will also have a tools pane, a living wiki if you will," O'Keeffe explained. Here chefs can find white papers, analysts' resources and information on why an agency picked a particular product, vendor or solution.

MeriTalk Data Center Exchange

The DCC Cookbook is an initiative of the Data Center Exchange. O'Keeffe calls it a vertical public/private partnership where data center leads for agencies can come together to talk challenges, successes, collaboration, best practices and share war stories.



The Exchange generates operational content, applications, and programs to help the Federal government realize its data center consolidation goals, according to its website. To that end, the Exchange is working on a Data Center Capacity Open Table, described as a portal to catalogue the data center inventory at Federal agencies. By centralizing this information, the portal will facilitate cross-agency provisioning of data center resources, helping to increase efficiency and reduce waste. Whether your agency is a Microwave, Oven or Crock Pot, O'Keeffe invites you to help stir the pot and participate in online and in-person events. So, grab your favorite cooking utensil and click here to join their mailing list to receive more information about the Data Center Exchange.

Consolidation Conundrum
It's quite a conundrum. The number of data centers may be going down, but the requirements for new data center space and capabilities are going up. So, how do we get smaller as we get bigger, and be sustainable too? And what about investment dollars for consolidation? "To get ROI, you need the I to get the R," MeriTalk's O'Keeffe stated. "We are not going to get this without funding; there is no magic formula for change." Once again that is a key takeaway from the August 2011 Consolidation Conundrum research study conducted by MeriTalk and sponsored by Juniper Networks. The study is one of a series of studies on data center consolidation that demonstrate the issues facing agency consolidation chefs. Other key takeaways include:
1. Data center consolidation is not as easy as it looks. The more consolidations agencies execute, the more they realize how complex they can be. In fact, Feds are skeptical: just 10% believe Feds will meet OMB's mandate of consolidating 800 or more data centers by 2015. Nearly a quarter (23%) believe the government will have more data centers in 2015 than now.
2. Can we shrink and grow at the same time? Fewer sites, but the need for more computing capacity. In fact, Feds estimate their computing needs will increase by 37% over the next 5 years, and their data centers will need to be scaled up by 34% to meet their growing needs.
3. Fewer data centers equal more complex data centers. Remaining data centers will house complex legacy systems, and reluctance to change compounds the challenge of increasing data center capacity. In fact: 48% are running 20 or more management software applications at their data center, and 62% don't believe it's reasonable for their organization to utilize managed services from other organizations, making cloud migrations more cloudy.
That's just part of the conundrum for chefs, said O'Keeffe. Their biggest challenge is figuring out what metrics to use to make smart consolidation decisions. For example, should the metric be server utilization or energy consumption and its resulting costs? The research says utilization is between 65 and 70 percent. It also says 12% of data center costs are for electricity, but many don't know how much they are paying. Which metric to use? Is there a better one? If we don't have the right metrics to make decisions, how do we know which data centers to close? Quite a conundrum. Click here to get your copy of the Consolidation Conundrum research.

By Jeff Erlichman, Editor, On The FrontLines

Their Best Advice


If consolidating data centers were just a matter of buying the latest technology (servers, storage, UPS and HVAC), then data center chefs could generate a few RFQs, buy what they need and install everything the way they always have. But the fact is data center consolidation is not about technology. It's really about a fundamental transformation in the way we view, plan, operate, consume and pay for technology. Everybody recognizes things must change for the smarter and for the better. The impact of transformation on the workforce will be immense, as the FDCCI and CIO Council acknowledge. Having a government and contractor workforce fluent in the skills to thrive in a cloud-first environment is essential. Exactly what skills, training and education are needed was one of the topics of a recent On The FrontLines Roundtable with seven of government's leading providers of data center consolidation products and services. Below is some of their best advice on what skills your workforce needs, governance, cloud migration and other issues facing data center chefs.

Leading private sector practitioners offer their best advice on how to save time, energy, angst and dollars when consolidating data center facilities and assets.

Manage Change Via Communication

"We see four fundamental pillars to any successful data center consolidation program: governance, project management, systems engineering and management of change," Dave Ryan from General Dynamics IT explained. "As with any large, complex project with numerous stakeholders, a key to success is management of change via communication. It is also critically important to ensure that key stakeholders are present and active in all phases of the program, including design reviews, risk planning, Certification & Accreditation (C&A) efforts and regular status meetings." Ryan acknowledges achieving mutual understanding and consensus can be time-consuming and difficult; however, it pays dividends in the long run by eliminating misunderstandings and ensuring critical prerequisites and milestones are met. It also will mean changes in how staff is deployed and their duties. This is especially true in the areas of information assurance (IA), cyber security and migration to the cloud. "People with software systems and security engineering backgrounds are required to make sure that cloud applications will operate properly in shared environments instead of the isolated, excess capacity environments of the past," he said. The impact on jobs will also vary depending on the type of cloud (e.g. public, private, hybrid) that is deployed, and the extent to which the cloud is used, according to Ryan, who noted organizations have traditionally purchased, deployed and managed their own IT infrastructure. When all or part of that infrastructure is consolidated to a public cloud or traditional infrastructure, they may need to refocus their skills and attention on management, acquisitions, logistics, engineering, service desk and cyber security staff.

Invest In Workforce Transition

Tracy Haugen of Deloitte Consulting LLP noted that agencies are realizing the importance of retaining specific resources required for an effective migration. "They are investing in workforce transition plans, even with a contractor workforce, so the migration to a consolidated service model can be achieved with minimal disruption to service," she said. While much of the workforce concern relates to job security, consolidation can often mean more opportunity for employees in the data center. With the concentration of IT professionals, more management positions may be needed and deeper technical skills can be rewarded.

The Data Center Consolidation Roundtable


Bill Clark, Vice President, Public Sector Technical Sales, CA Technologies
Chip Copper, Solutioneer, Brocade
Tracy Haugen, Principal, Deloitte Consulting LLP
Dmitry Kagansky, Chief Technologist, Quest Software Public Sector
Gary Newgaard, Director of Federal Solutions, Isilon (the newest division of EMC)
Dave Ryan, Vice President and Chief Technology Officer, Navy/Air Force Division, General Dynamics IT
Mark Weber, Senior Vice President and General Manager, NetApp U.S. Public Sector

"Therefore, training should be focused on deepening current IT skills and growing soft skills such as customer service, account management, and general management," Haugen said. If a significant percentage of the operations will be contracted out, vendor management becomes a critical skill for the federal workforce. Haugen counseled that it is important to set early expectations with the customer base that the organization may need time to move up the learning curve if the concept of operations is very different from the current state. "Adapting Service Level Agreements and/or other performance metrics during the transition period may help reduce strain on the workforce as they settle into new positions," she said. Employees may feel undervalued if they think the new leadership is not fully aware of their individual strengths. Increased management attention to staff in the early period can avoid the attrition typically associated with the startup period. The bottom line is that frequent and transparent communication is key to dealing with the IT workforce.

Meet Future Program Requirements


Gary Newgaard from Isilon talked about how, in his experience, the most successful large-scale initiatives have some common DNA. "The pitfalls of large-scale IT transformations are pretty well documented; most have to do with assumptions around cost, management and complexity turning out to be wrong," he said. "The most successful share some common DNA, namely they do not rely on technology and product futures which are not already well defined, or for which the foundations are not already present in the current technology. In other words, they're not relying on a plan which has a lot of really optimistic dependencies." Newgaard said what this means from a planning, selection, deployment and maintenance point of view is that the technologies and processes selected should already contain the foundation of the capabilities they need to deliver several years down the road. This way you can manage to an outcome which delivers success against near, mid and long term program criteria. "A proven architecture with well understood capabilities that a planner can reasonably and confidently project forward to meet future program requirements is so important to being successful in large-scale IT transformation," added Newgaard. One of the biggest opportunities of consolidation is centralizing data storage for decreased costs and increased ease of management, he said. Obviously, to have centralized data, the data itself must first be migrated from multiple data centers to the selected central location. Because government data centers frequently have 100s of TB or even multiple PB of data to migrate, this process alone can be a significant project, he noted, and it is critical to require big data expertise from a supplier as part of any solution.
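Newgaard's migration warning is easy to put into numbers. The back-of-the-envelope sketch below (Python) estimates bulk transfer time; the 500 TB volume, 10 Gbps link and 60% effective throughput are assumptions chosen only to show the shape of the calculation.

```python
# Rough bulk-migration time estimate. The link speed, efficiency factor and
# data volume below are illustrative assumptions, not figures from the article.

def transfer_days(data_tb: float, link_gbps: float, efficiency: float = 0.6) -> float:
    """Days to move data_tb terabytes over a link_gbps link at the given efficiency."""
    data_bits = data_tb * 1e12 * 8             # terabytes -> bits
    effective_bps = link_gbps * 1e9 * efficiency
    return data_bits / effective_bps / 86_400  # seconds -> days

print(f"{transfer_days(500, 10):.1f} days")    # ~7.7 days of continuous transfer
```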

Basic Hygiene: Take Measurements, Baseline and Have Plan B

Dmitry Kagansky from Quest Software observed that data center chefs need to start taking measurements, baselining where you are, so you can measure whether you are progressing in the direction you want. "Measuring and baselining are what we would consider basic hygiene," said Kagansky. "Have a Plan B if something goes wrong. You will have a lot more chances to correct any mistakes early on. If you get to the end, it is tough to go back and fix and patch after the fact." Kagansky noted that with consolidation there is a lot more for the same amount of staff to do. "When you are consolidating the data center, you are not really consolidating systems," he said. "You still are going to have the same number of systems; you are just physically changing where they are running or how they are running." Consolidation often adds to complexity, he said, pointing to how virtualization makes things more complex and puts a premium on identity management and assigning access rights. He also points to the ever-present issues when retraining and redeploying your workforce. "There often is a feeling that the training levels aren't there and that staff has not been adequately trained to handle these new environments," he said. "Someone has to be capable of monitoring and managing that. In addition, things need to be partitioned to make sure no one has more data access rights than they should." Kagansky said moving to a new building is usually an opportunity for things to go wrong. He worries that many are making the move haphazardly, doing what they need to do, and then there is a lot of rework and backtracking. He also stressed that metrics are still not clear and that in the past government didn't measure and establish metrics. Agencies are getting performance data for the new data centers, but for the old systems, no one has done baselining. Many were built before there were any mandates or any interest in baselining. When customers say applications are running kind of slow, they often don't have any comparison to what existed before the move. His advice: "The first thing is to set your baseline, before you make any sort of move; before you pick up a new database or app or put in new hardware or virtualize. Get a baseline, take a snapshot of what performance was like and what users are going to expect right now before you cut over."
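Kagansky's baseline does not require elaborate tooling. The sketch below is one minimal way to snapshot utilization before a cutover, using the third-party psutil package; the metrics sampled and the sampling interval are illustrative choices, not a prescribed standard.

```python
# Capture a simple pre-migration performance baseline so "it feels slower"
# complaints after the move can be checked against real numbers.
# psutil is a third-party package (pip install psutil).

import json
import time
from datetime import datetime, timezone

import psutil

def snapshot() -> dict:
    """One sample of basic utilization metrics."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def record_baseline(samples: int = 60, pause_s: int = 60,
                    path: str = "baseline.json") -> None:
    """Take periodic samples and write them to a JSON file for later comparison."""
    data = []
    for _ in range(samples):
        data.append(snapshot())
        time.sleep(pause_s)
    with open(path, "w") as fh:
        json.dump(data, fh, indent=2)

if __name__ == "__main__":
    record_baseline(samples=5, pause_s=10)  # short run for demonstration
```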

Automation Is Central To Consolidation

Bill Clark of CA Technologies described how virtualization technologies and the accompanying push towards automation have changed the skill sets and education needs of IT professionals. They are also silo breakers. "When doing automation you are typically breaking down silos, because I'm taking a process that used to require the Service Desk, and now I can automatically provision a server based on a request to a service catalog," he said, noting that change management, quality assurance, configuration management and systems administration processes are being automated. Automation is central to consolidation, according to Clark, and there is a lot of low-hanging fruit out there that can deliver immediate ROI. "There are still agencies that don't offer password reset. When you offer password reset from the Service Desk and it is automated, we have seen instances where Help Desk tickets are down 50 percent the first couple of months." There are a lot of IT processes that can be automated, said Clark, especially in the testing and development area. "Why take 40 days to slow down test and development, when you can provision automatically and deprovision very quickly?" he asked. "One of the things IT operations should do is look at how to facilitate that as quickly as possible. It allows you to use your capacity more efficiently, which means you don't have to purchase more hardware." Clark is also a big believer in portfolio management: align what you are doing in IT with what the agency mission is, and report in business terms what you are doing in IT. "That is easier said than done; it requires you to understand how your assets in IT are delivering services, what the relationship is there and how to preserve the service levels being delivered." With this information, Clark added, "If the government comes and says cut 10 percent, you can say this is what is going to happen if you do this downstream; these are the service levels that could potentially not be as good as they were before."

Get Out Of Your Comfort Zone

Chip Copper from Brocade made it clear: you can't keep looking for the solutions to new problems in the same old places; there is going to have to be a change in the mindset of how data centers are architected. Data center managers are going to have to go out of their comfort zone just a bit; "the person who has been manually configuring the network for 40 years is going to have to get used to the idea that maybe I don't have to do all this manually. Maybe the network can be smart enough to do this for me automatically."

Copper said that over 10 years of doing consolidation, the first move was to take storage devices out of local servers and put them into the network. People started seeing storage as a pooled resource rather than something associated with each server. This is significant because storage area networking began to change the mindset of how to view resources in the data center. "For years, you could walk into a data center and you'd be able to point to something and say, that's my email machine. Or that's my database server, or that's my web front end. And so there was very much a stovepipe construction that was just ingrained in the way data center managers were building things," he said.

Relating this to data center consolidation, Copper said we can take two paths. "The first thing is that we can take all of this stuff across all of these data centers and we can try and just jam it into a smaller number of physical facilities," he said. "If we are not careful that's exactly what we are going to do, and it's going to be a mess because we won't have effectively changed either the number of elements that we have to manage or the methodology that we are going to use for managing all of those different elements."

Instead, he said, we have to have the same sort of change in approach for consolidation on the networking and infrastructure side that we had for storage area networking a decade ago. "We have to change this mindset which says that the network is something which is going to require very constant manual intervention and is seen as a second resource," Copper explained. "As opposed to going to an environment where we are building end-to-end application environments, we can set up policies and the network is going to be able to automatically respond to what's going on through the rest of the infrastructure, just as the application servers and the hypervisor servers and the storage servers are going to have to do."

Run As If It Were Your Own Business

Mark Weber from NetApp called again for standardizing data center efficiency metrics and bringing them more in line with the private sector. "We are expecting people to monitor server and storage utilization and efficiencies. Those are some of the suggestions the IT community is making," said Weber, who reaffirmed that virtualization and storage are the hot skills for the future. "Universities have been calling us to teach about our OS and storage efficiency tools. They see virtualization and storage management as the hot jobs of the future."

He also sees that IT organizations that can set themselves up as service providers for their agencies will be rewarded. "For example, I see a lot of private clouds emerging within agencies, so those efficient IT groups are becoming private cloud providers for the rest of the agency," Weber said. "Their reward is more work and not getting shut down."

When asked what his best advice was to government data center chiefs, Weber was blunt. "My message is simple. I would act like it is my personal money and run my IT world in the most efficient way possible, because that is the right thing to do," Weber said. "I would treat every dollar I spend as not a federal dollar, not a corporate dollar, but like my own money. When people do that, they spend the right amount of money, not more, not less."

Weber said CIOs recognize that investing in efficiency tools to get a better return won't be cheap, and that they already know what to do. "I think if our CIOs of the federal government do that, they don't need all the guidance of how many data centers to consolidate and what tools to use. They will be great stewards of their own dollar; they will be efficient. They need macro-guidance because not everyone is going to behave the same way," continued Weber. "I like the macro guidance we are getting, but I still think no one is going to manage your house better than you."
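Weber's point about monitoring server and storage utilization can be reduced to a small worked example. The volume figures and the 20/85 percent thresholds below are illustrative assumptions, not NetApp guidance or an OMB metric; the sketch simply shows the kind of report an IT group acting as an internal service provider might produce.

```python
"""Toy storage-utilization report of the kind Weber describes.

Volume figures and the 20/85 percent thresholds are made-up illustrations,
not NetApp guidance or a federal standard.
"""

def utilization_report(volumes, low=20.0, high=85.0):
    """volumes: list of dicts with 'name', 'used_tb', and 'capacity_tb'."""
    report = []
    for vol in volumes:
        pct = 100.0 * vol["used_tb"] / vol["capacity_tb"]
        if pct < low:
            status = "under-utilized (consolidation candidate)"
        elif pct > high:
            status = "near capacity (expansion or tiering candidate)"
        else:
            status = "healthy"
        report.append((vol["name"], round(pct, 1), status))
    return report


if __name__ == "__main__":
    sample = [
        {"name": "vol_email", "used_tb": 4.2, "capacity_tb": 30.0},
        {"name": "vol_db", "used_tb": 26.5, "capacity_tb": 30.0},
    ]
    for name, pct, status in utilization_report(sample):
        print(f"{name}: {pct}% used, {status}")
```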

GET THE DATA CENTER DOWNLOAD

OMB is driving data center consolidation as the way to fund the cloud transition and more. We know we have to cut the number of data centers, but questions abound. What's the right way to go about consolidation? How much money will we save? How do we know what space is available at other agencies? What's really possible and on what timeline?

The Data Center Exchange (DCX) is the Federal Data Center garage. We test and research, we pull together best practices, we build apps. DCX is where government and industry come together to get the data center consolidation download.

Register today: www.meritalk.com/datacenterexchange

Resources
Websites


All links available at www.onthefrontlines.net/ITconsolidation

Federal Data Center Consolidation Initiative
American Council For Technology
Apps.gov (GSA)
CIO Council Cloud Computing
Data Center Consolidation DLT Solutions
FedRAMP Federal Risk and Authorization Management Program
Infrastructure 2.0 or Dynamic Infrastructure
ITIL Information Technology Infrastructure Library
National Business Center Data Center Services Department of Interior
Nebula Cloud Computing Platform
NIST
OpenStack The Open Source, Open Standards Cloud
Data Center Exchange
Data Center Cookbook MeriTalk

Case Studies, Memos, Research, White Papers & Special Reports

OMB CIO Authorities Memo August 8, 2011
OMB FDCCI Deadline Memo July 20, 2011
Deloitte White Paper
ACE Automated Commercial Environment Fact Sheet DHS
Consolidate with Confidence and Less Cost NetApp
Consolidation Solutions Slash Data Center Costs NetApp
Datacenter Consolidation Strategies for the Federal CIO DLT Solutions
Department of State Data Center Consolidation Presentation October 2010
FDCCI Update Memo October 1, 2010 Kundra/Spires
Effectively & Securely Using the Cloud Computing Paradigm (NIST)
Federal Cloud Computing Initiative (GSA Presentation)
Federal Data Center Consolidation Initiative FAQs
Federal Data Center Consolidation Initiative Final Baseline Inventory
Federal Data Center Consolidation Initiative Initial Data Center Consolidation Plan

Federal IT Consolidation Research Study June 2011 MeriTalk/NetApp
Government Cloud Computing (Dataline)
How to Achieve IT Optimization Whitepaper DLT Solutions
i360Gov Special Report: Redefining Make Do DLT Solutions
Ongoing Virtualization Activities at NRC
Privacy Recommendations for the Use of Cloud Computing by Federal Departments and Agencies
Transition to IPv6
Vivek Kundra - State of Cloud Computing May 2010
Vivek Kundra Testimony on Cloud Computing: Benefits and Risks of Moving Federal IT into the Cloud
Vivek Kundra, Federal CIO House of Representative Testimony, July 2010
Vivek Kundra, Federal CIO The Economic Gains of Cloud Computing

Videos
Future Visions: During the Federal Executive Forum on Data Center Consolidation, leaders talked about what data center consolidation will look like in the future.

Cindy Cassil, Director, Systems and Integration Office, Department of State (Watch Video)
Cheryl Rogers, Director, Office of IT Optimization, FAA (Watch Video)
Mike Mestrovich, Senior Technology Officer for Innovation, Defense Intelligence Agency
Dmitry Kagansky, Chief Technologist, Quest (Watch Video)


Agility. It can save your hide



Big challenges demand brilliant solutions

Complex technology issues present new opportunities. Turn to Deloitte. Whether it's cyber security, cloud, mobility, IT management, or integration, Deloitte stands ready to help. We have the people, insight, and experience to help your organization. See how. Visit www.deloitte.com/us/federaltechnology.

As used in this document, Deloitte means Deloitte LLP and its subsidiaries. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting.

Copyright 2011 Deloitte Development LLC. All rights reserved. Member of Deloitte Touche Tohmatsu Limited

Don't just talk about the cloud. Put it to work.

IT thought leaders and over 1 billion end users benefit from clouds built on a NetApp storage foundation. To make sure your storage architecture is designed to deliver all the rewards the cloud has to offer, visit NetApp.com/federal.

Valuable clouds are


Come visit us at www.netapp.com/federal
© 2011 NetApp. All rights reserved. Specifications are subject to change without notice. NetApp, the NetApp logo, and Go further, faster are trademarks or registered trademarks of NetApp, Inc. in the United States and/or other countries. All other brands or products are trademarks or registered trademarks of their respective holders and should be treated as such.
