The federal Chief Information Officers Council is reorganizing to address the government’s technology trends and priorities.
The goal is to make it easier to give agency CIOs tools and guidance to meet the administration’s initiatives.
The council is moving from five committees to three main ones along with three communities of practice and two task forces.
Bernie Mazer, the Interior Department’s CIO, said the council is maturing to be a lot more active in the federal community.
The three new committees are:

- Portfolio management
- Security and identity management
- Innovation
Mazer said the innovation committee will look at open data.
“It’s really about how we come to grips with characterizing or moving things out of datasets to make it more open and more accessible to the public,” Mazer said at the Brocade Federal Forum event in Washington Tuesday. “Underneath portfolio management are a lot of the activities associated with PortfolioStat, things like shared services, things like the TechStats, things like the Federal Data Center Consolidation Initiative. One of the things we will look at is a more uniform, harmonious way of coming up with metrics — metrics that can be widely shared.”
A government source, who requested anonymity because the source didn’t get permission to talk to the press, said the CIO Council has not named a new vice chairman to replace former DHS CIO Richard Spires, who resigned in May.
Mazer said the council also will be doing more active outreach with other CXO councils. He said there is a lot of good work going on, so the CIO Council wants to do a better job of getting the word out and receiving input.
Over the past decade, the council has shifted its focus to address issues such as identity management and shared services, but this is its first wholesale restructuring in several years.
The council’s changes coincide with the Obama administration’s IT and management focus areas, including project and program governance, moving to the cloud, and cybersecurity.
Data center metrics under development
The new committees address the governance piece through the portfolio management committee. The security and identity management committee stays basically the same, while the innovation committee will cover everything from cloud to big data to open data to anything else that comes down the pike.
The data center task force will support the portfolio management committee in helping agencies meet several goals, including reducing and optimizing more than 7,000 federal data centers.
The Office of Management and Budget merged the Federal Data Center Consolidation Initiative into the PortfolioStat program in May.
Mazer said the FDCCI task force is coming up with metrics that will be part of the conversations through PortfolioStat.
“We are encouraging agencies to pursue an aggressive alternative analysis as to where to place the stuff in the data center and where it can be placed,” he said. “A lot of it is still in the aspects of having one agency talk to another agency, but I do know within the FDCCI when we see a build of a data center in one particular location, we are saying, ‘there is another data center in another agency that has all this space, so why are you doing that?’ It becomes a three-way type of engagement. It could be GSA because they are doing the renovation of the facility, then it’s the agency [customer] and then the agency that has the capacity.”
Mazer added several agencies are exploring buying data center space and services from others, or having a government-owned, but contractor-operated model.
The innovation committee focus is to help agencies with the move to the cloud, virtualization and mobility.
Navy to virtualize
The Department of the Navy is a good example of implementing many aspects of the innovation agenda. Terry Halvorsen, the Department of the Navy’s CIO, signed memos last week requiring all Navy servers to be virtualized by 2014 and laying out a more serious look at the virtual desktop initiative, or zero client.
Halvorsen said the zero client, or hosted virtual desktop (HVD), is slowed by the bid protest of the NGEN contract award. The Navy awarded the $3.5 billion enterprise network infrastructure contract to HP in June, but CSC protested the award to the Government Accountability Office.
“Why are we doing it? Two big reasons: cost and security,” he said. “If I can get to a higher volume of the HVD, a couple of things happen for us: There is nothing left on the machine. The pure cost of the device is less, even though I get a really good price on computers.”
Halvorsen said the cost of zero-client PCs is coming down, similar to what happened with color televisions, where the price got cheaper and cheaper.
“You can do more of your operations to include security at the network level, not every device needs to be licensed to do everything,” he said. “There are lots of smart reasons for us to do that. It also lets us get more mobile.”
Halvorsen said he recently bought a computer for his home and installed the Common Access Card (CAC) reader and HVD software in about 20 minutes. He then accessed his full desktop without any problems, and when he logged off, there was no data or residue left on the machine.
Halvorsen said he could imagine that same scenario for tens of thousands of people in the Navy, who could access data from anywhere, at any time, without a huge security risk.
As for the server virtualization, Halvorsen said the tenant commands likely will not have a lot of options for how they will transform their computer boxes.
“We are going to look at the best tool sets. We will do some competitive contracting and pretty much say, ‘Based on the data we have, Command X, your servers are ready to be virtualized and your environment can change.’ The other thing that will happen here is we need to get off the command-by-command basis for doing IT. This needs to be a more global Navy and Marine Corps strategy than it can continue to be a local strategy.”
The growing throughput problem
The move to cloud and virtualization is causing a new challenge for many agencies: bandwidth.
Several senior officials at the conference said all the new tools, such as video chat and video teleconferencing, are taxing their agency backbones. Mazer said one concept Interior is looking at is a bandwidth-on-demand model.
“There are a lot of different government agencies looking at bandwidth on demand, moving into a more utility type of function,” he said. “In the old days, we used to work on local caching of data and content as opposed to relying on the data and the pipes. One thing my team is looking at is about how to deal with that. We are in an active engagement with our telecom service providers of how we look at that to get into these types of things. We are looking at some type of managed services on there.”
Interior created a telecommunications enhancement board to figure out how to come up with that managed service approach. Mazer said bandwidth on demand would let Interior turn up or down the amount of bandwidth needed at any given time of day.
Mazer said Interior is taking an approach to the cloud that includes reducing data centers, rationalizing and reducing the number of applications it supports, and focusing more on the interdependency of all of these efforts to be successful.