September 2011
Air handlers move air to cool the data center at the National Renewable Energy Lab. Learn more about this LEED Platinum facility on page 10.
Photo: Courtesy NREL/DOE; Photographer Dennis Schroeder
For years Federal CIOs have had to do more with less. On August 8, 2011, OMB Memorandum M-11-29 changed all that. OMB Director Jacob Lew wrote: "As the federal government implements the reform agenda, it is changing the role of Agency CIOs away from just policymaking and infrastructure maintenance, to encompass true portfolio management for all IT. This will enable CIOs to focus on delivering IT solutions that support the mission and business effectiveness of their agencies and overcome bureaucratic impediments to deliver enterprise-wide solutions. This memo is designed to clarify the primary area of responsibility for Agency CIOs throughout the government, as identified in the IT Reform Plan." Later that evening on the White House Blog, Steven VanRoekel, US Chief Information Officer, wrote about The Changing Roles of Federal Chief Information Officers. He wrote that federal CIO achievements have already fundamentally changed the way the federal government manages IT, and that the memorandum will help CIOs deliver on key areas to drive results and yield an even greater impact. He reiterated Lew's message that agency CIOs "must be positioned with these responsibilities and authorities to improve the operating efficiency of their agencies." Added to their statutory responsibilities, CIOs now have four main areas in which they have the lead role: governance, commodity IT, program management and information security.
Counting On Consolidation
Neither Lew nor VanRoekel gave CIOs any guarantee of new IT investments to consolidate data centers or move applications to the cloud. Budget realities make this problematic. But what every CIO finally gained was more power to do more, even though they may have to do more with less. Lew wrote, "CIOs must drive the investment review process for IT investments and have responsibility over the entire IT portfolio for an Agency." For years now, many CIOs have wished for control over their agency's IT portfolio. Now they have it. The question is: Can they use their power to get their agencies to work together to shed silos of excellence, eliminate duplicative systems, reduce the number of data centers and move applications to the cloud? They have their work cut out for them: reduce operating dollars, turn around or terminate troubled projects, and deliver meaningful functionality even faster, all while enhancing the security of
information systems. CIOs are counting on data center consolidation to do three things: reduce operating costs, increase operating efficiency, and have the savings fund needed new IT investments or migration to the cloud. Accountability now rests squarely with these agency CIOs. Now they have all the reasons in the world to just do it.

Inside On The FrontLines
4 FDCCI + SSPP = DOE's Approach To Data Center Efficiency
8 One Path To Consolidation: Use ESPCs
10 NREL's Living Laboratory
12 OTFL Interview: Grant Schneider, CIO, Defense Intelligence Agency
14 OTFL Viewpoint: Jim Flyzik on Will Budget Cuts Darken Clouds?
16 Which One Are You: Microwave, Oven or Crock Pot?
18 OTFL Industry Roundtable: Their Best Advice
22 Resources
With a PUE of 1.13, the data center at the National Renewable Energy Lab is a living model of how to design and build a data center. Courtesy: NREL/DOE; Photographer Dennis Schroeder
Under the FDCCI, agencies are attacking consolidation through a combination of new data center construction and the retrofitting of existing facilities. Teams comprising IT, facilities, procurement and business operations must work together to figure out how to offer the most processing power in the smallest space possible. Further, by FY15 all data centers, new, retrofits and those remaining, must meet the Council on Environmental Quality (CEQ) benchmark metric of a 1.4:1 PUE (Power Usage Effectiveness) ratio. PUE is the metric used to determine the energy efficiency of a data center. A perfect PUE ratio is 1.0 (1:1). No wonder agencies are wishing they had someone with the certified title of Data Center Energy Practitioner (DCEP) on their staff, a professional capable of doing assessments and advising facilities or IT on which best practices to implement in the data center.
management). The goal is to get the two to work together, take a holistic view and meet the energy efficiency requirements, Wooley explained. "Our approach is to take a business look at what are the costs. Look at the different data centers and see what best practices, what types of retrofits need to be done to bring them up; and then let some smart business decisions determine which ones we are going to keep and which ones we are going to close." Wooley acknowledged that many facilities and IT personnel operate data centers without a true understanding of what might be the best practices for energy efficiency. Through the Industrial Technologies Program (ITP) at DOE, a certified Data Center Energy Practitioner training and certification program has been initiated, said Wooley.
Measure To Manage
Wooley said the certified DCEP is just one component of DOE's SSPP approach to reaching the data center 1.4 PUE efficiency benchmark. The others include doing a DC-Pro Assessment (now required for every DOE data center); doing a Green IT best practice self-assessment; including DC & IT ECM projects in Site Sustainability Plans (SSP); and monitoring CPU utilization through sub-metering.
"Our recommendation is managers integrate both the sustainability and our data center consolidation initiatives, so that we are able to leverage both and achieve the goals that are set for both initiatives," urged Wooley.
Energy Assessments
The DOE national labs, working with the ITP, have developed what they call the Data Center Profile, or DC-Pro, Assessment tool. "The Data Center Profile tool allows you to do an energy assessment at data centers, and we now have a new version," said Wooley. "Part of DOE's best practices is we are requiring our sites to do an annual energy assessment for our data centers to identify what are some of the opportunities for implementing best practices. Each site is responsible for developing an annual site sustainability plan and identifying what they are going to do." The new tool is being rolled out now and allows DOE sites to do the assessment. "We can roll it up and monitor on the dashboard how well they are doing. That's going to come out here as soon as we issue our guidance for the sustainability plan, and we are looking to that as a best practice," said Wooley.
Much To Do
Wooley told OTFL there are a lot of benefits to solving sustainability and data center consolidation issues, not just saving taxpayer dollars. "It's also about improving operational efficiency of our infrastructure and being able to leverage IT to do our job better. Mobile computing, cloud computing, things like that all play in this as far as what we are trying to do here." However, Wooley concedes it is going to take a major initiative and a major push to make the US government much more on par with industry in how it runs its IT infrastructure and does its business. "And even though my focus is on the sustainability side, it's all contributing to achieving a major transformation in IT infrastructure and the government being able to do their job better and at a reduced cost." After all, "if you can't measure it, you can't manage it," Wooley said, echoing a familiar refrain.
Wooley wants government to adopt the best practices that make DOE's National Renewable Energy Lab data center in Golden, Colorado, DOE's gold standard. More on page 10. Courtesy NREL/DOE; Photographer: Dennis Schroeder
Data Center Consolidation 5
Energy Savings Performance Contracts (ESPCs) allow Federal agencies to accomplish energy savings projects without up-front capital costs.
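The ESPC model hinges on simple payback arithmetic: the contractor's up-front capital investment is repaid from the energy savings the project generates. A minimal sketch, with hypothetical figures chosen only for illustration (nothing here comes from DOE's actual contracts):

```python
def simple_payback_years(capital_cost, annual_savings):
    """Years of guaranteed energy savings needed to repay the
    contractor's up-front investment (simple payback, no discounting)."""
    return capital_cost / annual_savings

# Hypothetical ESPC: a $10M efficiency retrofit repaid
# from $1.2M per year in guaranteed energy savings.
print(round(simple_payback_years(10_000_000, 1_200_000), 1))  # 8.3
```

This is also why Wooley's long-lifecycle concern matters: a payback measured in years fits chiller plants and lighting, but IT equipment may be refreshed before the savings have repaid the investment.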
Proof of Concept

Wooley says DOE is using an ESPC as a proof of concept to transform the IT infrastructure at DOE headquarters. "We are going to look at consolidating our IT infrastructure, retrofitting one of our data centers to be highly energy efficient, and consolidating our infrastructure to that facility. Then we will close the other data centers and server rooms that we have at our headquarters location. It's never been used for a transformation of the IT infrastructure. So this is a proof of concept at DOE headquarters," explained Wooley. While any government agency can use the DOE ESPCs for its own data center consolidation efforts, Wooley noted that they are not for small projects. "Normally they are for fairly significant sites like chiller plants, doing lighting upgrades, putting solar panels in. They are fairly significant capital investments that are required to do that. The challenge that we are going to face with the ESPC, and again I call it the proof of concept, is that many of the facilities upgrades have a long, long life cycle," he said. However, while facilities have a long lifecycle, IT does not. So technology needs to be aligned with operations and maintenance to get the energy savings needed to help pay off capital investments. That's why, Wooley said, "we are going through this proof of concept to verify, to see what the challenges are; we want to document the lessons learned, but we want to make it a repeatable process for other agencies to use. So the proof of concept, we are going to be the first test case to make it work."
PUE stands for Power Usage Effectiveness, a key metric for measuring data center energy efficiency. Figuring out PUE is simple: total facility power (cooling + power distribution + IT equipment) divided by IT equipment power = PUE. A perfect PUE is 1.0. For example, at the new National Renewable Energy Lab data center in Golden, Colorado, the PUE is 1.13. At the old NREL legacy data center, PUE was 3.3. In fact, most data centers' PUE is in the 3+ range. But that is going to cease in FY15, when all federal data centers, new, retrofits and those remaining, must meet the Council on Environmental Quality (CEQ) benchmark metric of 1.4. Clearly federal data center managers need to cash in on what NREL is doing if they are going to meet their PUE target.
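The arithmetic above can be sketched in a few lines. The kilowatt figures below are hypothetical, chosen only so the result lands near NREL's reported 1.13; they are not actual NREL measurements:

```python
def pue(cooling_kw, power_distribution_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power divided by
    the power delivered to IT equipment. 1.0 is perfect."""
    total_facility_kw = cooling_kw + power_distribution_kw + it_equipment_kw
    return total_facility_kw / it_equipment_kw

# A facility drawing 65 kW for cooling, 48 kW in distribution
# losses, and 870 kW of IT load:
print(round(pue(65, 48, 870), 2))  # 1.13
```

The same formula shows why a legacy facility scores so badly: when cooling and distribution overhead exceeds twice the IT load, PUE climbs past 3.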
Spreading Innovation

Chuck Powers, NREL IT Strategist, is in charge of the data center. He told On The FrontLines in a recent interview that while the new data center has redefined world class, "we have a lot to live up to and we've been successful. Replication is a very important outcome of the ideas and concepts that we've put in, including the data center," Powers explained. "The good news is lots of organizations have toured our data center (including On The FrontLines) and organizations have asked us to come in and evaluate and perform energy assessments for their data centers. So we are on track to meet that expectation." So far, Powers has already worked with his colleagues in Berkeley to write a best practices guide for building energy efficient data centers. At the same time, he is being very proactive spreading the word not only about the NREL data center, but about how all of the 17 DOE national labs are working on sustainable IT management. The result is a new report from DOE titled DOE Laboratories' Leadership in Green IT. "I worked with all 17 labs and tried to capture their successes, stories and accomplishments around the whole sustainable IT management of desktops and laptops, their data center efforts and who is using renewable energy to power data centers," Powers said. "I tried to capture a story about how high performance computing really contributes to overall sustainability. Even though they consume lots and lots of watts, we are saving thousands of watts because of the outcomes. So even though these high performance computing resources are consuming lots and lots of power, the outcomes are probably some of the most impressive outcomes for environmental sustainability anywhere. So that was my point, and the lab runs thousands of these models annually. They've got thousands of these projects."
More than 19,000 linear feet of wood from trees killed by bark beetles was used to decorate the lobby of the new RSF. But this is no ordinary wall. As it climbs through the lobby space, the wall tilts in and out giving it a 3-D look all its own. Courtesy NREL/DOE; Photographer Heather Lammers
Grant Schneider
OTFL Interview
Deputy Director for Information Management and Chief Information Officer Defense Intelligence Agency
The DIA CIO talks about data management, data center consolidation, the Quad Initiative and more in this OTFL interview.
"For me what is exciting about this job is getting to work with all these different communities. I really find that as the DIA chief information officer I have a maybe not unique, but very broad view of DIA and the entire Intelligence community." Grant Schneider spends a lot of his time interacting with peers within DIA and across the Intelligence community. "I am looking for ways I can leverage what's already been done, where someone has put in intellectual capital or solved a problem," he told OTFL editor Jeff Erlichman in a recent interview. "You asked earlier: what is the benefit to the customer? Providing tangible benefits to the customer is what keeps me excited." Here is more from that one-on-one.
On The FrontLines: What progress have you made in the last two years to restructure in terms of IT infrastructure and IT management? Can you describe DIA data center consolidation plans and current progress?

Grant Schneider, DIA: We've got a number of things that we are doing (in the areas of) data consolidation and data center consolidation, depending on how you define a data center. What I mean by that is: for a number of years we've been in the process of bringing as much data back into our central repositories as we can. For example, we operate today nearly 140 Defense Attaché offices within embassies around the world. A number of years ago we would have email servers within each of those Attaché offices and their email resided there. So we centralized those services and brought them back into central data centers so that we can better do disaster recovery, protect the data and be good stewards of the information while maintaining access to it from the edge. So over the last few years we've been in the process of bringing in as much data as we can for operational and security reasons.

We also have been in the process of reducing the number of data centers that we have. About five years ago DIA took over, if you will, (and) it was probably more a merger and acquisition with, the J2 elements at each of the combatant commands; there, they each had an IT organization largely delivering IT on the TSSBI environment, and those assets got rolled into DIA. So we've been in the process of establishing what we call an Enterprise Service Delivery Center in St. Louis, where we've been centralizing services and elements from each of the combatant command locations to get all our data centralized there.

Looking forward, we are in the process of moving to having two major data centers for the environment: the one in St. Louis that I mentioned, (and) today our second one is within our building in Washington DC. We are in the process of looking to transition from the DC one into a more purpose-built data center facility, as opposed to a building that's really made for people and has a computer room in the basement.
OTFL: How have your efforts to better manage data affected your data center consolidation and cloud initiatives?
Grant Schneider, DIA: Our need to better manage and protect our data is driving us to some centralization and consolidation of it, which is also driving us to need more data center space. But the other aspect of it for us (is) from a data management perspective, and this touches a little bit on the cloud environment. Because when I think of the cloud environment, I think of three elements of the cloud: one being the data layer, and I'll come back to that; one being an applications layer, where the applications reside, whether they are widgetized or however they are residing in that layer; and the third one being an identity and access management layer, which determines who I am as a user, which applications I get to use, and which data sets get returned to me when I run applications against the data.

This is really a transition for us to get to where we leverage and utilize the data in what I would call its native location, or whatever the authoritative data source is. So instead of needing a brand new capability and making copies of the data and the search engines and the link analysis tools and marrying them up into one proper noun capability, instead we are looking at the ability to run any of our applications against any of our data, because our data is indexed in such a way that all the applications can see it and readily digest the information.

"We want it to be more seamless across our architectures and our infrastructure; and we'd like to save money and we want to enhance our security. So (there are) really three objectives in the Quad environment."

...any of the agencies. So we will be taking some of our, for me, as I was just talking about our layer and our applications layer, and I will be pointing it into the Common Operating Environment. I'm not going to replicate everything into the Common Operating Environment. The data will largely stay within each of our four infrastructures, but from the Common Operating Environment users will be able to get into the data sets that they need. And it will be based on who they are as opposed to where they are. Historically, access to data or even discoverability of data is often based on what network you happen to be residing on, as opposed to who you are. And then by having a Common Operating Environment we will be able to get to where it's truly attribute-based: I'm Grant Schneider, I'm the DIA CIO, I have whatever the clearance is, and therefore I am able to see this amount of data and get access to a probable subset of that data.
OTFL: How do you feel about the possibility for CIO responsibilities or authorities to expand?
Grant Schneider, DIA: I think it's exciting. We are in a tumultuous financial environment right now for the nation, and we are in one that's exciting because we've really got to put our focus into making the right business decisions for the nation, and for me, for DIA as an agency and for the Intelligence community. I think the memo really highlights the fact that CIOs do have a perspective that's somewhat unique in the environment, because we see the entire enterprise, whatever enterprise that is, agency level or national level, where we sit. So I think it's exciting that it's been recognized. One of the conversations we have in the agency is we often talk about our mission folks and our enablers or support element, and where does IT sit? We sort of come down to: it's in both. We are an enabler, and yet the mission can't get done without IT being there. So I think it's exciting to see the government look at the CIO role. It is going to have some impact, I think, from what's on paper, and the reality is I think most of the CIOs have been operating to some degree in that manner, depending on how their agency and department heads wanted them to, for years.
OTFL Viewpoint
Everybody is talking cloud computing and data center consolidation to increase efficiency and reduce cost. If hundreds of data centers are closed, it is intuitively obvious that a lot of money will be saved. However, getting there in light of the current spending cuts may not be so easy. It will take some upfront money to do the consolidations to achieve those savings down the road. But will the investment money be there to replace/virtualize/move equipment? The mood in Congress is clear: cut spending now. During recent discussions with several CIOs, we talked about the potential lack of funding available to make the data center moves. One CIO talked about the fact that his office only had about 20% of the funds needed to move to the consolidated location and lacked funds to virtualize and modernize before moving. Thus a worst case scenario arises: only some of the current data center equipment can be moved this year, resulting in incurring O&M costs at the new location while still needing to operate the existing location. Some argue to let the private sector own the equipment and offer it back to the government as a service. But this creates a second problem: the assumption of risk associated with Infrastructure as a Service (IaaS) at consolidated centers. What happens if the program for which the equipment was purchased is cut? Will the private sector be free to use the equipment for other agencies or commercial entities? We are talking about a culture change here that is significant. We're talking about Service Level Agreements and contractual terms and conditions unlike any in past government programs.
security requirements than the other Treasury Bureaus, as did the other bureaus with differing missions. Privacy concerns, too, were different. A place like the IRS had more stringent privacy concerns where sensitive taxpayer information was involved. A seemingly innocuous application such as email consolidation will raise issues about who manages the system, who has access to whose email, levels of security on the system, access to social sites, and wireless synchronization, to name just a few.

The Reality Is...

I don't mean for this article to sound the alarm and darken the outlook for cloud computing and data center consolidation. Rather, it is about pointing out some of the realities and complexities that will be encountered, hoping they can be dealt with early before becoming show stoppers. It is about keeping momentum up for the investment dollars needed to reap the long-term savings. In discussions around town, certain lessons learned are becoming evident. Before doing a consolidation, agencies need to have in place a strong governance process capable of resolving difficult issues. A means to bring decisions to closure is crucial to staying on schedule. Pre-defined cost allocation processes need to be agreed to early on in the program. And relentless leadership needs to be exhibited to keep funding from being swept up to cover other cuts.

Data center consolidation and cloud computing are examples of areas where IT can prove its role in dramatically reducing costs and improving government services. It is imperative that the IT community rally to support these efforts. We don't want political bickering over budget cutting to jeopardize these promising opportunities.

Jim Flyzik
"I see a hybrid world down the road, a combination of commercial products entering the government workspace and some things government will have to continue to do on its own."
They say the best medicine is laughter. So I had to chuckle out loud during my recent interview with MeriTalk founder Steve O'Keeffe as he was describing the differences between the Microwaves, Ovens and Crock Pots. While I was chuckling as I was listening, and thinking about half-baked solutions, O'Keeffe and a prominent committee of data center consolidation leads are making serious progress to craft a how-to, best-practices guide for agency data center consolidation chefs. "We are always talking about doing more with less, but every year we do more with more," O'Keeffe noted. "We can't afford to do things the way we did them; we have to find new ways and we have to work to change that equation."

Data center consolidation is complex and difficult. What are the best practices from agencies that have achieved success? Is there a cookbook to share?

However, the current reality is agencies are setting their consolidation alarm clocks to ring according to how fast and how radically they have to consolidate. Hence, the Microwaves are those who are moving ahead fast, the Ovens are those who are moving more slowly, and those whose plans are still simmering are the Crock Pots. "The plan is to create different DCC Cookbooks for each group," O'Keeffe explained. No matter what group an agency is in, O'Keeffe said all agencies need to think first about how to break consolidation into manageable phases; then focus on low risk/high return projects; always do everything to ensure non-stop services; continue to learn cumulatively and build a best practices library; and finally quantify your success to win support for the next phase.

Cookbook Contents

In the DCC Cookbook, consolidation starts with taking stock of your ingredients, "an audit to establish a baseline," said O'Keeffe. Then comes planning. Engage business owners. Set up a Customer Advisory Committee that includes the CFO. Project needs and financial, IT, personnel and business goals. Define SLAs and map cost/operational models.

Consolidation is a time for modernization. This is the time to think about applications and moving to open source solutions. This is the time to consolidate platforms and negotiate software licenses. This is the time to de-duplicate and automate, map your timeline and prioritize costs.

When it comes time to implement, use the spiral approach while engaging your Customer Advisory Committee. Use an approved project management methodology such as EVM or CPIC, and leverage it to manage customers, contractors, and your team. The final analysis is to measure: evaluate the budget, track against SLAs, close the loop with the Customer Advisory Committee at every opportunity, and develop takeaways and lessons learned that carefully document each phase.

"What the DCC Cookbook will do is provide straightforward analyses for different types of agencies that have different needs," explained O'Keeffe. He said the groups are divided by the number of data centers they have to cut: more than 75; between 20 and 75; and less than 20. What he envisions is that these groups, who have similar issues, will come together to share best practices. O'Keeffe said meetings with agencies are well underway, and open panel sessions like the one held at Innovation Nation are keeping the project on track.

Further, when complete, there will be a Resource page for each consolidation phase where chefs can reach out and get advice from experienced cooks such as Jake Wooley from Energy or Karen Petraska from NASA. "It will also have a tools pane, a living wiki if you will," O'Keeffe explained. "Here chefs can find white papers, analysts' resources and information on why an agency picked a particular product, vendor or solution."

The committee will report its findings during Fall 2011 and includes DHS's Maggie Graves, Interior's Bernard Mazer, ATF's Walter Bigelow, and Anil Karmel, Los Alamos National Lab solutions architect.

MeriTalk Data Center Exchange

The DCC Cookbook is an initiative of the Data Center Exchange. O'Keeffe calls it a vertical public/private partnership where data center leads for agencies can come together to talk
Consolidation Conundrum
It's quite a conundrum. The number of data centers may be going down, but the requirements for new data center space and capabilities are going up. So, how do we get smaller as we get bigger, and be sustainable too? And what about investment dollars for consolidation? "To get ROI, you need the I to get the R," MeriTalk's O'Keeffe stated. "We are not going to get this without funding; there is no magic formula for change." Once again, that is a key takeaway from the August 2011 Consolidation Conundrum research study conducted by MeriTalk and sponsored by Juniper Networks. The study is one of a series of studies on data center consolidation that demonstrate the issues facing agency consolidation chefs. Other key takeaways include:

1. Data center consolidation is not as easy as it looks. The more consolidations agencies execute, the more they realize how complex they can be. In fact, Feds are skeptical: just 10% believe Feds will meet OMB's mandate of consolidating 800 or more data centers by 2015. Nearly a quarter (23%) believe the government will have more data centers in 2015 than now.

2. Can we shrink and grow at the same time? Fewer sites, but the need for more computing capacity. In fact, Feds estimate their computing needs will increase by 37% over the next 5 years, and their data centers will need to be scaled up by 34% to meet their growing needs.

3. Fewer data centers equal more complex data centers. Remaining data centers will house complex legacy systems, and reluctance to change compounds the challenge of increasing data center capacity. In fact, 48% are running 20 or more management software applications at their data center, and 62% don't believe it's reasonable for their organization to utilize managed services from other organizations, making cloud migrations more cloudy.

"That's just part of the conundrum for chefs," said O'Keeffe. Their biggest challenge is figuring out what metrics to use to make smart consolidation decisions.
For example, should the metric be server utilization or energy consumption and its resulting costs? The research says utilization is between 65-70 percent. It also says 12% of data center costs are for electricity, but many don't know how much they are paying. Which metric to use? Is there a better one? If we don't have the right metrics to make decisions, how do we know which data centers to close? Quite a conundrum!
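As a back-of-the-envelope check on the survey's growth figure: 37% growth over five years works out to a compound annual rate of roughly 6.5%. A quick sketch (the calculation is mine, not from the study itself):

```python
# MeriTalk's survey says federal computing needs will grow 37%
# over the next 5 years. Compounded annually, that implies:
five_year_growth = 0.37
annual_rate = (1 + five_year_growth) ** (1 / 5) - 1

print(f"{annual_rate:.1%}")  # 6.5%
```

In other words, agencies planning to close sites must still budget for mid-single-digit capacity growth at every surviving data center, every year.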
Leading private sector practitioners offer their best advice on how to save time, energy, angst and dollars when consolidating data center facilities and assets.
ensuring critical prerequisites and milestones are met. It also will mean changes in how staff is deployed and their duties. This is especially true in the area of information assurance (IA), cyber security and migration to the cloud. People with software systems and security engineering backgrounds are required to make sure that cloud applications will operate properly in shared environments instead of the isolated, excess capacity environments of the past, he said. The impact on jobs will also vary depending on the type of cloud (e.g. public, private, hybrid) that is deployed, and the extent to which the cloud is used according to Ryan, who noted organizations have traditionally purchased, deployed and managed their own IT infrastructure. When all or part of that infrastructure is consolidated to a public cloud or traditional infrastructure, they may need to refocus their skills and attention on management, acquisitions, logistics, engineering, service desk and cyber security staff.
Dave Ryan, Vice President and Chief Technology Officer, Navy/Air Force Division, General Dynamics IT
Mark Weber, Senior Vice President and General Manager, NetApp U.S. Public Sector
account management, and general management, Haugen said. If a significant percentage of the operations will be contracted out, vendor management becomes a critical skill for the federal workforce. Haugen counseled that it is important to set early expectations with the customer base that the organization may need time to move up the learning curve if the concept of operations is very different from the current state. "Adapting Service Level Agreements and/or other performance metrics during the transition period may help reduce strain on the workforce as they settle into new positions," she said. Employees may feel undervalued if they think the new leadership is not fully aware of their individual strengths. Increased management attention to staff in the early period can avoid the attrition typically associated with a startup period. The bottom line: frequent and transparent communication is key to dealing with the IT workforce.
you are, so you can measure whether you are progressing in the direction you want. "Measuring and baselining are what we would consider basic hygiene," said Kagansky. "Have a Plan B if something goes wrong. You will have a lot more chances to correct any mistakes early on. If you get to the end, it is tough to go back and fix and patch after the fact." Kagansky noted that with consolidation there is a lot more for the same amount of staff to do. "When you are consolidating the data center, you are not really consolidating systems," he said. "You still are going to have the same number of systems; you are just physically changing where they are running or how they are running." Consolidation often adds to complexity; he points to how virtualization makes things more complex and puts a premium on identity management and assigning access rights. He also points to the ever-present issues of retraining and redeploying the workforce. "There often is a feeling that the training levels aren't there and that staff has not been adequately trained to handle these new environments," he said. "Someone has to be capable of monitoring and managing that. In addition, things need to be partitioned to make sure no one has more data access rights than they should." Kagansky said moving to a new building is usually an opportunity for things to go wrong. He worries that many are making the move haphazardly, doing only what they need to do; then there is a lot of rework and backtracking. He also stressed that metrics are still not clear and that in the past the government didn't measure and establish metrics. Agencies are getting performance data on the new data centers, but for the old systems no one has done baselining. Many were built before there were any mandates or any interest in baselining. When customers say applications are running kind of slow, they often don't have any comparison to what performance was like before the move.
His advice: "The first thing is to set your baseline before you make any sort of move; before you pick up a new database or app or put in new hardware or virtualize. Get a baseline, take a snapshot of what performance was like and what users are going to expect right now, before you cut over."
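Kagansky's advice can be turned into a simple pre-cutover habit. The following is a minimal sketch, with illustrative metric names and a hypothetical 10 percent regression threshold; a real team would pull these numbers from its own monitoring or load-testing stack:

```python
# Minimal sketch of performance baselining before a migration: capture a
# labeled, timestamped snapshot of key metrics so post-move numbers have
# something to be compared against. Metric names are illustrative.
import time

def take_baseline(label, metrics):
    """Record a timestamped snapshot of performance metrics."""
    return {"label": label, "taken_at": time.time(), "metrics": dict(metrics)}

def compare(baseline, current, tolerance=0.10):
    """Flag metrics that regressed more than `tolerance` versus baseline."""
    regressions = {}
    for name, before_val in baseline["metrics"].items():
        after_val = current.get(name)
        if (after_val is not None and before_val > 0
                and (after_val - before_val) / before_val > tolerance):
            regressions[name] = (before_val, after_val)
    return regressions

# Hypothetical pre-cutover snapshot (e.g. from load tests or monitoring):
before = take_baseline("pre-migration", {"avg_response_ms": 120, "db_query_ms": 35})
# ... migrate, then measure the same metrics again ...
after = {"avg_response_ms": 180, "db_query_ms": 36}
print(compare(before, after))  # only avg_response_ms exceeds the threshold
```

With a snapshot like this on file, "applications are running kind of slow" becomes a measurable claim instead of an unresolvable complaint.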
noting there is a lot of low-hanging fruit out there that can deliver immediate ROI. "There are still agencies that don't offer password reset. When you offer password reset from the Service Desk and it is automated, we have seen instances where Help Desk tickets are down 50 percent the first couple of months." There are a lot of IT processes that can be automated, said Clark, especially in the testing and development area. "Why take 40 days to slow down test and development, when you can provision automatically and deprovision very quickly?" he asked. "One of the things IT operations should do is look at how to facilitate that as quickly as possible. It allows you to use your capacity more efficiently, which means you don't have to purchase more hardware." Clark is also a big believer in portfolio management to align what you are doing in IT with the agency mission, and to report in business terms what IT is doing. That is easier said than done; it requires you to understand how your IT assets are delivering services, what the relationships are, and how to preserve the service levels being delivered. With this information, Clark added, "If the government comes and says cut 10 percent, you can say this is what is going to happen downstream if you do this; these are the service levels that could potentially not be as good as they were before."
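Clark's point about automated provision and deprovision can be sketched in miniature. This is an illustrative model, not any agency's actual tooling: test environments are checked out of a fixed capacity pool and returned as soon as a run finishes, so idle capacity is reclaimed rather than purchased anew:

```python
# Illustrative sketch of automated provision/deprovision against a fixed
# capacity pool -- the efficiency argument Clark makes about test and
# development environments. Names and unit counts are hypothetical.

class CapacityPool:
    def __init__(self, total_units):
        self.total = total_units
        self.in_use = {}  # environment name -> capacity units held

    def provision(self, env, units):
        """Carve out capacity for an environment, failing fast if full."""
        if units > self.available():
            raise RuntimeError(f"insufficient capacity for {env}")
        self.in_use[env] = units
        return env

    def deprovision(self, env):
        """Tear down an environment and return its units to the pool."""
        self.in_use.pop(env, None)

    def available(self):
        return self.total - sum(self.in_use.values())

pool = CapacityPool(total_units=100)
pool.provision("test-env-1", 40)
pool.provision("dev-env-1", 30)
pool.deprovision("test-env-1")  # automated teardown after the test run
print(pool.available())  # 70
```

The design choice worth noting is the prompt deprovision step: without it, every 40-day manual cycle leaves capacity stranded, which is exactly the pressure that drives unnecessary hardware purchases.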
cause we won't effectively have changed either the number of elements that we have to manage or the methodology that we are going to use for managing all of those different elements. He said that instead we need the same sort of change in approach for consolidation on the networking and infrastructure side of the data center that we had for storage area networking a decade ago. "We have to change this mindset which says that the network is something which is going to require very constant manual intervention and is seen as a second resource," Copper explained, "as opposed to going to an environment where we are building end-to-end application environments, where we can set up policies and the network is going to be able to automatically respond to what's going on through the rest of the infrastructure, just as the application servers and the hypervisor servers and the storage servers are going to have to do."
Register today: www.meritalk.com/datacenterexchange
Resources
Websites
Federal Data Center Consolidation Initiative
American Council For Technology
Apps.gov (GSA)
CIO Council Cloud Computing
Data Center Consolidation
DLT Solutions
FedRAMP Federal Risk and Authorization Management Program
Infrastructure 2.0 or Dynamic Infrastructure
ITIL Information Technology Infrastructure Library
National Business Center Data Center Services Department of Interior
Nebula Cloud Computing Platform
NIST
OpenStack The Open Source, Open Standards Cloud
Data Center Exchange
Data Center Cookbook
MeriTalk Case Studies, Memos, Research, White Papers & Special Reports
OMB CIO Authorities Memo August 8, 2011
OMB FDCCI Deadline Memo July 20, 2011
Deloitte White Paper
ACE Automated Commercial Environment Fact Sheet DHS
Consolidate with Confidence and Less Cost NetApp
Consolidation Solutions Slash Data Center Costs NetApp
Datacenter Consolidation Strategies for the Federal CIO DLT Solutions
Department of State Data Center Consolidation Presentation October 2010
FDDCI Update Memo October 1, 2010 Kundra/Spires
Effectively & Securely Using the Cloud Computing Paradigm (NIST)
Federal Cloud Computing Initiative (GSA Presentation)
Federal Data Center Consolidation Initiative FAQs
Federal Data Center Consolidation Initiative Final Baseline Inventory
Federal Data Center Consolidation Initiative Initial Data Center Consolidation Plan
Federal IT Consolidation Research Study June 2011 MeriTalk/NetApp
Government Cloud Computing (Dataline)
How to Achieve IT Optimization Whitepaper DLT Solutions
i360Gov Special Report: Redefining Make Do DLT Solutions
Ongoing Virtualization Activities at NRC
Privacy Recommendations for the Use of Cloud Computing by Federal Departments and Agencies
Transition to IPv6
Vivek Kundra - State of Cloud Computing May 2010
Vivek Kundra Testimony on Cloud Computing: Benefits and Risks of Moving Federal IT into the Cloud
Vivek Kundra, Federal CIO House of Representatives Testimony, July 2010
Vivek Kundra, Federal CIO The Economic Gains of Cloud Computing
Videos
Future Visions: During the Federal Executive Forum on Data Center Consolidation, leaders talked about what data center consolidation will look like in the future.
Cindy Cassil, Director, Systems and Integration Office, Department of State (Watch Video)
Cheryl Rogers, Director, Office of IT Optimization, FAA (Watch Video)
Mike Mestrovich, Senior Technology Officer for Innovation, Defense Intelligence Agency
Complex technology issues present new opportunities. Turn to Deloitte. Whether it's cyber security, cloud, mobility, IT management, or integration, Deloitte stands ready to help. We have the people, insight, and experience to help your organization. See how. Visit www.deloitte.com/us/federaltechnology.
As used in this document, Deloitte means Deloitte LLP and its subsidiaries. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients under the rules and regulations of public accounting. Copyright 2011 Deloitte Development LLC. All rights reserved. Member of Deloitte Touche Tohmatsu Limited.
IT thought leaders and over 1 billion end users benefit from clouds built on a NetApp storage foundation. To make sure your storage architecture is designed to deliver all the rewards the cloud has to offer, visit NetApp.com/federal.