Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Cloud bill gains support as FedRAMP sets JAB approval cap

With Rep. Will Hurd’s (R-Texas) field hearing in San Antonio, Texas, last week focused on the state of federal cloud computing, the challenges around contracting and budgeting for these services remain the biggest obstacles to wider adoption.

Most would agree the broad budgetary changes needed for agencies to alter the way they buy aren’t happening anytime soon. But another approach to funding cloud computing is starting to get attention on Capitol Hill.

Rich Beutel, a former House Oversight and Government Reform Committee senior staff member and one of the main forces behind the Federal IT Acquisition Reform Act (FITARA), has been circulating a cloud bill with lawmakers over the last six months. Beutel is modeling his cloud bill, from a funding perspective, after the continuous diagnostics and mitigation (CDM) program run by the Homeland Security Department.

Beutel has streamlined his bill to focus on three main areas:

  • The codification of the cloud security program called the Federal Risk and Authorization Management Program (FedRAMP), and making it mandatory for all cloud deployments.
  • The creation of revolving working capital funds and broader budget flexibilities for cloud transitions.
  • A requirement for agencies to accelerate their transitions off legacy hardware to new technologies by requiring them to complete operational assessments once a year. A recent Government Accountability Office report found agencies were not doing this and missing out on billions of dollars in savings.

Beutel said he had strong support from at least one member in the Senate and a growing interest from the House.

He wouldn’t go into more details about who or which committees, but it’s pretty safe to assume the logical choices are his old House Oversight committee and the Senate’s counterpart, Homeland Security and Governmental Affairs.

The concept of a working capital fund for cloud transitions is probably the change agencies need most. Having a pot of money to lean on when turning off old systems and moving them to the cloud would help address a host of systemic issues. Agencies tend not to have “extra” money lying around to make these changes, so a dedicated fund for technology upgrades has been shown to work well.

The FedRAMP section also brings up an interesting issue.

While FedRAMP has been successful for the most part, there is a growing frustration on both the vendor and government sides of the effort over the process that many see as too cumbersome.

Several government officials have said vendors are not submitting complete cloud packages, which is leading to the 12-to-18-month average it takes to get through the Joint Authorization Board (JAB). The FedRAMP program office has positioned the JAB, which is made up of the CIOs from DHS, the General Services Administration and the Defense Department, as the gold standard for cloud security authorizations.

At the same time, vendors say the process is expensive and agencies are relying too much on the JAB because they don’t want to pay for their own authorizations or don’t have the expertise to conduct the reviews.

Now, FedRAMP Director Matt Goodrich said the program management office (PMO) can only support, maintain and oversee 50 cloud service provider (CSP) approvals at any one time.

That limit of 50 cloud service providers shocked several industry experts who follow cloud closely.

Now, to be clear, Goodrich said he’s been noting the PMO’s limit for some time, based on its current staffing and funding, and the 50 CSPs aren’t a static group either.

“We plan to roll JAB authorities to operate (ATOs) to agency ATOs for those CSPs that are not being used governmentwide,” Goodrich said. “We tell them to maintain it and then bring in new governmentwide ATOs.”

He said the JAB is working with about 35 cloud providers so far and there are about another 20 vendors working through the readiness phase to get in front of the authorization board.

Goodrich said the PMO has suggested to OMB that the JAB needs more funding and resources. While that request works its way through the normal process, he said the PMO is piloting a continuous monitoring approach to make it easier to ensure cloud providers are meeting FedRAMP security standards.

“We will be starting next month, using a tool that takes in data from other tools that scan, and then compare the data to FedRAMP parameters,” Goodrich said. “We will be taking on agency ATOs and getting the same information agencies get and pilot it for three months. We want to find out how useful the reports will be and see if we can relieve some of the burden on the JAB and agencies to maintain cloud service provider ATOs.”

He said the pilot also will help determine if the tool, developed for FedRAMP, would be scalable to more CSPs.
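The pilot Goodrich describes boils down to ingesting findings from existing scanning tools and checking them against FedRAMP parameters. As a rough illustration only (the article gives no detail on the tool’s internals, so every name and threshold below is hypothetical), the comparison step might look like this:

```python
# Hypothetical sketch of the comparison step in FedRAMP's continuous
# monitoring pilot: take in findings produced by scanning tools and flag
# those that exceed the program's remediation parameters. All names and
# thresholds here are illustrative, not the actual tool or policy values.

FEDRAMP_PARAMETERS = {
    # severity: maximum allowed age in days before a finding is overdue
    "high": 30,
    "moderate": 90,
    "low": 180,
}

def flag_overdue_findings(scan_findings):
    """Return findings whose age exceeds the allowed remediation window."""
    overdue = []
    for finding in scan_findings:
        limit = FEDRAMP_PARAMETERS.get(finding["severity"])
        if limit is not None and finding["age_days"] > limit:
            overdue.append(finding)
    return overdue

findings = [
    {"id": "CVE-2015-0001", "severity": "high", "age_days": 45},
    {"id": "CVE-2015-0002", "severity": "low", "age_days": 10},
]
print(flag_overdue_findings(findings))
```

The real tool would pull data from each scanner’s own output format; the point is simply that continuous monitoring replaces periodic manual review with an automated check against agreed-upon parameters.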

In the meantime, Goodrich said the PMO is working with OMB to encourage more agencies to take on ATO approval efforts.

“The program wasn’t built for everything to go through JAB,” he said. “There has been plenty of discussion about whether the JAB should do all of the ATOs, and we can redesign the process to do that, but that requires more money and more people.”

Goodrich said the fastest way to get an ATO is through the agency process.

In fact, Goodrich said the PMO is close to hiring someone whose job will be solely dedicated to helping agencies become better and faster with the ATO process.

The steps Goodrich and FedRAMP are taking to address the challenges of the growing program will help, but it’s obvious that legislative help in the form of a fund or some way to dedicate funding toward cloud migrations is what’s needed to speed up the transition off legacy infrastructure.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


State of federal cloud remains optimistic with a chance of budget pessimism

Rep. Will Hurd (R-Texas), chairman of the Oversight and Government Reform Subcommittee on IT operations, asked an interesting question at a recent hearing in San Antonio, Texas: “What is the state of federal cloud computing?”

Not quite as enthralling as a President’s State of the Union (and, of course, with far fewer standing ovations at Hurd’s hearing), but it brings up a timely question about just how much progress agencies have made since the Office of Management and Budget’s 2011 “cloud-first” mandate. This February will mark five years since former federal Chief Information Officer Vivek Kundra issued that mandate.

Nearly every CIO is moving something to the cloud — email, public websites and other basic technology services. Others such as the Federal Communications Commission, the Homeland Security Department and even the Medicaid and CHIP Payment and Access Commission (MACPAC) have done a lot more than these basics, putting entire infrastructures in the cloud.

Mark Kneidinger, the Homeland Security Department’s director of Federal Network Resilience Division in the Office of Cybersecurity and Communications, offered an interesting, though a bit old, status update at the hearing.

Kneidinger, the only federal executive to testify at the field hearing, said a February 2015 data call among the CFO Act agencies revealed they implemented 32 infrastructure-as-a-service (IaaS), 24 platform-as-a-service (PaaS) and 77 software-as-a-service (SaaS) instances. DHS conducted this survey to better understand how agencies are applying the security feature known as the Trusted Internet Connection (TIC) program.

Kneidinger said DHS found “the majority of services were for email, customer relationship management, SharePoint, case management applications, collaboration tools, Web hosting and help desk capabilities, with few instances where agencies had migrated high-value applications.”

A recent survey of federal CIOs by Federal News Radio told a similar story. A majority of respondents said they’ve migrated email and/or a collaboration system to the cloud, while other potential functions such as financial management, records management and software test and development saw a much lower rate of cloud adoption.

When asked which types of systems are at the top of their priority list to move to the cloud, a higher percentage of CIOs pointed to agency-specific apps and mission-critical apps.

The one concerning response when it comes to cloud is that more than 80 percent of the CIOs said they wouldn’t move their priority apps to the cloud for at least seven months, and 31 percent of that group said it would be more than a year before they are off legacy systems.

CIOs also are more comfortable with the commercial cloud. Just over half said for the most part their apps reside in a commercial cloud, while another 28 percent said they use a hybrid cloud or government-only commercial cloud. This trend likely will continue as well as agencies move more applications to the cloud. CIOs ranked commercial and hybrid clouds as their top priority for future deployments.

So what does this all mean?

Agencies are making progress but, like many things, it’s taking time. The slow roll, as Hurd said at the hearing, is mainly attributed to the ongoing struggle of federal agencies to change their spending habits.

Federal CIO Tony Scott said recently agency spend on legacy systems is climbing back toward the 80 percent mark after several years of decline.

Scott has promised to come out with a strategy or an approach this fall to swing the pendulum the other way toward development, modernization and enhancement (DME) spending.

Federal News Radio’s survey found 26 percent of CIOs said they spend more than 76 percent of their budget on legacy systems, while a total of 56 percent are spending 51 percent or more. Additionally, 74 percent of all respondents said their agency struggles to get out of the cycle of spending on legacy systems.

“[It’s] very difficult to retire a legacy system or program,” one CIO wrote in the survey. “[It’s] also challenging to get new programs underway via the budget process.”

Another CIO said, “We simply do not have the resources to update legacy systems, nor the governance to force that change.”

To change their spending habits, agencies need to understand how much it’s costing them to run the entire workload, said Alan Boissy, product line manager at VMware vCloud Government Service, who testified at the hearing.

Boissy said piloting apps on a small scale will give agencies that experience of understanding what the entire package costs: not just the applications, but the data center, the power, the people and the other parts that show the cost savings and/or efficiencies the cloud brings.

But Kneidinger said it’s about more than changing spending habits; it’s also a matter of trust between cloud providers and agencies, built through contractual relationships that delineate each party’s roles and responsibilities.

Without a doubt, agencies are moving to the cloud. It has been one of the most readily accepted technology evolutions over the past 20 years, but the Office of Management and Budget’s plan or approach or strategy — or whatever they end up calling it — to move agencies off of O&M spending more quickly is the most important piece to the cloud puzzle.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


FBI joins growing list of agencies with IT executive turnover

The FBI is the latest agency to make major changes to its IT organization. First, Jerry Pender, the FBI’s chief information officer, left after more than three years to join the private sector.

Now sources say Dean Hall is retiring in early October.

Pender left the FBI Aug. 25 and now is the managing director and operating partner at Z Capital Group, an investment-management firm.

Sources say they didn’t know what Hall would do next.

An FBI spokesperson said Brian Truchon is the acting executive assistant director and CIO of the FBI’s Information and Technology Branch.

So this means the FBI is losing two long-serving IT executives at a time when cyber, mobile and other capabilities are more vital than ever.

Pender had been CIO since 2012, and Hall has been with the FBI since 2002, when he came over on a detail from the CIA.

He has been deputy CIO since 2007.

It’s not that Truchon doesn’t know the FBI. He started as a special agent in 1987 and in 1996 was promoted to supervisory special agent in the Criminal Investigative Division’s Safe Streets and Gang Unit. He became assistant director of the IT division in May 2014.

Along with the FBI, NASA and the Energy Department are among the agencies beginning significant overhauls of their CIO offices.

It seems the FBI is on better footing than maybe others. The FBI inspector general hasn’t included the bureau’s IT management on its list of top management challenges since 2011. Pender and previous CIOs have been implementing a five-year IT modernization plan, which needs updating this year.

In addition to the changes at the FBI, the National Archives and Records Administration is losing a long-time executive.

Paul Wester, NARA chief records officer, is moving to become director of the Agriculture Department’s National Agricultural Library starting Oct. 19.

In an email sent to records managers across government, Wester said he’s proud of the efforts and accomplishments of the records management community over the last several years, including on the directive from NARA and the Office of Management and Budget to better manage government records.

Larry Brewer, director of National Records Management Programs, will serve as acting chief records officer.

There also were two Defense Department personnel changes worth noting.

First, President Barack Obama nominated Navy Rear Adm. Elizabeth Train to the rank of vice admiral and for assignment as deputy chief of naval operations for Information Dominance, N2/N6, Office of the Chief of Naval Operations/director of Naval Intelligence.

Why is Train’s nomination noteworthy?

She would replace Vice Adm. Ted Branch, who has been in that role since July 2013 and hasn’t been allowed to see any classified information since December 2013, meaning he can’t do a major part of his job.

As my colleague Jared Serbu wrote in August 2014, the suspension of Branch’s access to classified material has nothing to do with his current role, but relates back to Branch’s unclear involvement with Glenn Defense Marine, the Singapore-based company at the center of the bribery and fraud scandal. Branch served as commander of Carrier Strike Group One for a little more than a year starting in October 2009.

Even though the case, which involved colorful characters such as Fat Leonard, has been slowly coming to a head over the last year, the Navy or Branch must have decided enough was enough and it was time to move on.

Train currently is the director of the National Maritime Intelligence Integration Office in the Office of Naval Intelligence.

Finally, Defense Secretary Ash Carter appointed Navy Rear Adm. Raquel Bono to the rank of vice admiral and as the new director of the Defense Health Agency.

Bono would replace Lt. Gen. Douglas Robb, who led DHA through a huge consolidation of military health services and the award of the new electronic health records system.

Bono comes to DHA after serving as director of the National Capital Region Medical Directorate and chief of the Medical Corps at Walter Reed National Military Medical Center in Bethesda, Maryland.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


IGs, IT executives to experience a FISMA détente

There always has been healthy tension between auditors and operators. After what seemed a thaw between inspectors general and IT executives over the last few years, a recent event highlighted the continued friction between the two parties in how agencies protect federal data and networks.

During last Thursday’s panel discussion sponsored by AFFIRM in Washington, several CISOs and agency chief information officers talked about the difficulty in moving to a risk-based framework.

Jim Quinn, the lead system engineer for the Department of Homeland Security’s continuous diagnostics and mitigation (CDM) program, said too often IGs rely on checklists to determine whether or not agencies complied with the policy and law requirements.

“They have a standard pro-forma checklist that says ‘Have you done A, B and C?’ with no acknowledgement of whether A, B and C are really things that are important to what you are trying to achieve or whether you have done other things to make those controls less relevant because you’ve put compensating things in that limits your risk on them,” he said. “I think that this is one of the challenges, even looking at things like Federal Information Security Management Act (FISMA) metrics is how do we allow the agencies and departments and the mission groups to really be able to say ‘You have to look at the risk I’m willing to take in the context of what I am doing.’”

Quinn, who spent a majority of his career in the private sector, said these types of risk-based decisions are made often in the commercial world.

“We are not allowing CIOs or risk executives within the government [to make those decisions],” he said. “We nominally say they can, but when push comes to shove and they are going through an audit, that financial audit is going to go through that standard list of all of this stuff and they are going to say, ‘We don’t care that you had this compensating control, you didn’t have the fire extinguishers every 10 feet and you failed.’”

David Bray, the Federal Communications Commission CIO, said he recently had a similar experience with auditors.

The FCC had a review of its cloud-based email system. Bray said the auditors said, “you have not thought about what you would do if the cloud-based email went down.”

Bray responded, “The whole reason why we went to the cloud is because it’s a global company. If they go down, we have other issues, but I was dinged. It is sort of like you are saying, they are teaching to the test, and it’s not really more the critical thinking that needs to be done.”

Quinn and Bray’s experiences are not uncommon across the government.

The inspector general community recognizes it needs to change and has been trying to transition to a new way of thinking.

In the 2015 FISMA reports, IGs used a new maturity model for information security and continuous monitoring to analyze how agencies are protecting their networks and data.

The IGs detail levels 1 through 5 across people, processes and technology.

So when the agency reviews begin coming out in late 2015 or early 2016, we will have a better sense of whether this frustration is real or just left over from previous audits.

To that end, sources say the Office of Management and Budget is finalizing the FISMA metrics for 2016. The annual FISMA guidance for the fiscal year also usually comes out in early October. Both documents likely will focus on several areas the cyber sprint highlighted, such as identity management and access control, patching critical vulnerabilities and reducing the number of privileged users, in addition to the continued move to information security continuous monitoring.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


New shared services organization to be model for all of DoD

The new Pentagon technology shared service went live July 20 and is moving toward full operating capability in the coming year. That push to full operational capability is creating a roadmap for others across DoD to follow.

Defense Department chief information officer Terry Halvorsen said he’s pleased with where the Joint IT Single Service Provider-Pentagon (JITSSPP) is today.

Barbara Hoffman, DoD’s deputy chief management officer, said that over the last 45-to-60 days they have been looking at contract consolidation and the best way to merge the two main components of the new organization: the Army IT Agency (ITA) and the Enterprise Information Technology Service Division (EITSD).

“We are still working some of our initial service consolidations for video teleconferencing, service desk and the computer network defense,” Hoffman said during a recent call with reporters. “That is all moving along nicely and we are now entering into the phase for when we go FOC, which is an undetermined time, but we do have to start thinking and prepping for that.”

We first reported the Pentagon’s decision to consolidate the technology infrastructure and services provided to the offices that work in the Pentagon building.

This effort could be the first of many consolidations.

Halvorsen said his office will pay close attention to the lessons learned with the joint IT services office.

“Once this model is in place and we’ve done this, the Pentagon isn’t the only place that exists where there is not a clear service provider,” he said. “For example, if you go to a Navy base, generally the Navy is the clear service provider. We are working how that will apply to all the people who are tenants to that base. But there are areas in the world where that isn’t clear and this needs to become the model so we have that right, joint-service model in those areas where there isn’t one provider to consolidate on so we get the same effect.”

Halvorsen said joint bases without a clear single service provider could be the next targets for this type of consolidation.

“I think it will be a model for overseas locations where we have some of this. There isn’t a single lead service that dominates the provider of enterprise services,” he said.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


Transportation finding the right balance as it reforms IT procurement

The Transportation Department is an outlier when it comes to implementing the Federal IT Acquisition Reform Act (FITARA). It’s not that Richard McKinney, the DoT chief information officer, and other senior executives at the agency don’t believe in the spirit and intent of the law. It’s just that DoT must deal with its 800-pound gorilla, the Federal Aviation Administration, and the fact that the FAA comes under a different set of rules than other parts of the department.

McKinney raised some eyebrows at the AFCEA Bethesda breakfast last week, saying the FAA’s lawyers decided the administration will not have to follow FITARA the same way as the rest of government.

McKinney clarified his statement after the event to say that, yes, the FAA is following FITARA for human resources, for planning and for accountability. The one area where the FAA differs is procurement: FAA CIO Tina Amereihn will approve all IT acquisitions instead of McKinney. FITARA requires agency CIOs to approve all IT spending, but includes the ability for the headquarters CIO to delegate some of that responsibility to the bureau level.

McKinney called the change a customization of FITARA based on current laws.

McKinney said that because of the way Congress created the laws governing FAA acquisitions, administration lawyers and DoT executives decided the best way forward was to have him delegate the IT spending approval authority to Amereihn.

“The two of us, we really think a lot alike,” he said. “She didn’t like the ‘us vs. them’ way of thinking either. The two of us said ‘let’s change that.’ It’s not going to happen overnight. She’s trying to create an enterprise shared service environment at FAA. I’m trying to do that with the rest of our models. The thing that Tina and I are trying to figure out is where we can do things together and to each other’s benefit. It’s a good working relationship.”

So DoT’s approach to “customize” FITARA led me to think about other agencies that could be pushing for similar approaches.

One government source familiar with FITARA plans says they are not aware of any other agencies asking for similar authorities.

The Energy Department’s national labs tried to put a rider in the fiscal 2016 spending bill to get out of implementing the law. But the White House came out against such a carve-out in its Statement of Administration Policy earlier this summer.

Tony Scott, the federal CIO, said in late August that agency baselines comparing their current state to the desired state laid out in the FITARA implementation guidance were good.

“There are none that are perfect. You can see every agency’s personality show up in the plan that they have submitted. Right now, we are reviewing those plans. We are providing some feedback to the agencies to get more information about why they said some of the things that they said—both positive and negative,” he said at the 930Gov conference during a live presentation of Ask the CIO. “We’re seeing opportunities where our guidance could be clarified more, where there was some confusion. And we’ve seen some great examples of agencies really taking this on and being very explicit about how they are changing their governance process and the role of the CIO at the agency. I’d say overall I’d give it probably a B-plus in terms of the work I’ve seen.”

OMB is expected to release the agency baselines before the end of calendar year 2015.

Scott said getting the baseline right, especially around how agencies are spending their IT budgets, is more important than ever. He said the shift in the cost curve toward more development, modernization and enhancement (DME) and less operations and maintenance (O&M) must happen sooner rather than later.

In fact, McKinney said he just kicked off such a campaign to swing the DME vs. O&M pendulum the other way.

“I’m putting the brakes on us buying any more hardware. We are not going to do that for a while and we are going to see how that works. We’re not not going to need hardware, but we are not going to own it. We are going to get out of that business and get to the point where my office is managing and brokering services,” he said. “It’s not going to be easy, but…if you take DoT’s operating and maintenance spend, it’s climbing and our modernization spend is going down. That’s just unsustainable. It’s headed to where, if we were not to reverse this—and we are—but if we were not to reverse this, we would end up having no money for modernization and all our money would be Band-Aiding what we had. And that’s unacceptable.”

He said he’s telling the business units they need to provide their IT spend plans, and if any money would extend current on-premises infrastructure, it will not be approved.

“The only thing I will approve is a path away from that,” McKinney said. “I’m at the very beginning of it. I can’t give you a report card right now. Everybody’s eyes are big and I think it’s energizing some folks.”

OMB is expected to release new data on agency O&M vs. DME spend in the coming months, along with a plan or approach to end support for legacy systems.

Congress expected FITARA to help with that by giving CIOs more insight into when and how the IT budget is spent.

But FITARA is a solid year or more from really having that intended impact, so in the meantime expect a more directed approach from OMB later this fall.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


Inside the Reporter’s Notebook – New cyber threat center to hit initial stride in October

Inside the Reporter’s Notebook is a biweekly dispatch of news and information you may have missed or that slipped through the cracks at conferences, hearings and other events. This is not a column or commentary — it’s news tidbits, strongly-sourced buzz, and other items of interest that have happened or are happening in the federal IT and acquisition communities.

As always, we encourage you to submit ideas, suggestions and, of course, news to Jason via email.

Be the first to know when a new Inside the Reporter’s Notebook is posted. Sign up today for our new Reporter’s Notebook email alert.


New cyber threat center to hit initial stride in October

Just in time for cybersecurity awareness month in October, the White House will launch the initial operating capability of the cyber threat intelligence integration center (CTIIC).

Michael Daniel, the White House cybersecurity coordinator, said Sept. 10 at the NIST cybersecurity event in Washington that some of the initial capabilities are moving into place.

“We are hopeful we can get all of that together and have it start producing some of its products in the first part of the fiscal year,” Daniel said.

As a quick refresher, the White House announced the creation of the CTIIC in February, modeling it after the approach used after the Sept. 11, 2001 attacks to better bring together terrorism-related information. The broader goal of the CTIIC is to look beyond the ones and zeros, and combine what the intelligence community knows about malware with what it also knows about the rest of the world, including the state and non-state actors who are using it.

Read more


A deeper dive into Energy’s cyber defenses

The Energy Department’s cybersecurity is awful — well, at least that’s what many people believe based on the recent USA Today story.

The news organization found hackers were successful 159 times in penetrating Energy’s network between 2010 and 2014, including 53 instances in which the attackers took control of the “root” servers.

USA Today found that the National Nuclear Security Administration (NNSA) experienced 19 successful attacks during the four-year period, according to the Freedom of Information Act records it obtained.

On the surface, it seems the Energy Department is just another federal agency that is, in the words of former White House cybersecurity official Melissa Hathaway, complacent, apathetic and/or negligent when it comes to securing its networks and data.

But when you take a closer look at the statistics, the picture isn’t all bad.

Read more


State has second thoughts about cyber playbook

The State Department’s idea of creating a series of cyber playbooks got “86’ed” rather quickly.

A State spokesperson confirmed the department cancelled the request for information about a week after releasing the notice on FedBizOpps.gov, and is no longer accepting any responses.

“We are continuously looking for ways to improve our cybersecurity. We do not have further information to share at this time,” the spokesperson said.

State didn’t just withdraw the RFI, it took the link down from FedBizOpps and removed any evidence it existed.

A government source familiar with the State Department’s cyber efforts said there were several reasons why the agency pulled the RFI.

“It wasn’t coordinated broadly enough across the department,” said the official, who requested anonymity because they didn’t obtain permission to talk about this topic. “There were certain things in there that were not appropriate for State to be looking at. And the timing was off with the Chinese state visit at the end of the month.”

In the RFI, State said it was looking for “specialized cybersecurity experts” who can write a how-to guide for responding to cyber attacks and coordinating offensive cyber activities.

The Federal Times first reported State’s decision to scratch its playbook plans.

The bigger question is why State backed away from what seems like a good idea.

The general concept is a good one — focusing on industry best practices to address cyber challenges.

I wonder whether, had State called it a cyber “strategy” instead of a “playbook,” the powers-that-be would have been less concerned and let the idea move forward.

The playbook concept is fairly new to the government, with the Office of Management and Budget introducing one for digital services. But when you do a search on cyber playbook on The Google, you’ll find Mitre Corp. has one, Cisco says the concept helps organize information on a specific subject, such as cyber, and Invotas wrote a blog in June asking what your cyber playbook looks like.

Unfortunately, the State Department spokesperson offered no insight into what concerns the department had, or whether it will come out with something different, maybe a strategy on GitHub, in the near future.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


FCC pulls IT upgrade from jaws of defeat

The government gets a bad rap for failing to meet deadlines and messing up major system overhauls.

The history of federal IT is littered with these examples.

But it’s rare we get insight into how one agency pulls a potential major blunder from the jaws of failure. But that’s what happened to the Federal Communications Commission earlier this week during a major systems upgrade.

The FCC kicked off its modernization effort Sept. 2, shifting more than 200 servers and transferring more than 400 applications associated with those servers to a commercial cloud.

But the effort took two more days than planned to get the systems back online, causing delays for FCC customers in filing reports and other documents.

Instead of the IT problems just becoming another example of government and contractor ineptitude, an email obtained by Federal News Radio sent to FCC employees and partners shows the effort to rescue the project.

“We could have always asked for more time up front, possibly padded our schedules; instead we chose to be ambitious in our timelines because that’s what a startup-mentality culture does,” wrote David Bray, the FCC chief information officer, in the email. “We aimed high, adjusted, pivoted, and succeeded in our outcomes. If those in public service take from what we did as a ‘Team’ this summer — and from it see that it is okay to take risks as long as you are fully committed to seeing them through and getting it done — then we will have helped ‘hack the bureaucracy’ for the better.”

Digging a bit deeper into the effort, Federal News Radio has learned the FCC and contractor team, led by IBM, worked 55 straight hours to resolve the problems.

Sources said the problems stemmed from contractors at the IBM data center having mixed up hundreds of server cables, creating an unworkable configuration.

Instead of blaming the contractor or blaming the government, FCC team members worked 24/7, and in some cases, sources say, slept in the server room.

By Sept. 10, systems were back online and available to FCC customers.

The FCC said in a release that the move to a commercial cloud service provider would help reduce the costs to maintain the systems, improve their resiliency and allow it to shift legacy applications to cloud solutions in the long term.

Bray told Federal News Radio in August that he plans to get the FCC out of running its own infrastructure. He said the end goal is to create a modern, secure infrastructure using equipment, applications and systems that focus on mission success.

Bray now is plotting phase 3 of this modernization effort, which will include rewriting IT systems to employ a reusable “service catalog” of modular components across the FCC.

The FCC’s rescue may well be the kind of thing that happens often in government, where smart, technical federal employees successfully complete or save an IT project. The problem is we don’t hear about it often enough; instead, all we hear is the politicals complaining about the state of federal IT and how far ahead the private sector is.

Well, that dialogue could change if the federal IT community did a better job of celebrating successes and stopped retreating when there is a failure.

So, send me your successes — on the record, on background, under cover of night in a dark parking garage in Rosslyn. But let’s change the federal IT dialogue together.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.
