Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly sourced buzz and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Early returns on GSA’s EIS contract: IT modernization is not in play

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The idea of iterative or agile development hasn’t quite come to the Enterprise Infrastructure Solutions (EIS) contract.

If the initial handful of fair opportunity solicitations are any indication of what’s to come, the pressure on agencies to transition to the new $50 billion telecommunications contract run by the General Services Administration by May 2020 is pushing them to take a “winner take all” approach.

And industry experts say that approach likely means the Trump administration’s goal of using EIS to jumpstart IT modernization efforts will be overlooked.

Diana Gowen, general manager and senior vice president for MetTel’s federal program, said based on what she’s seen from the three solicitations so far, agencies are mostly taking a “like-for-like approach and asking vendors to tell them how they would modernize in order to get going. It’s not unlike Networx. So despite OMB pushing and all that stuff, it looks like same old same old, which is unfortunate.”

Denny Groh, executive director for corporate relations for Accelera Solutions, an IT services firm, and a former GSA executive who managed the FTS 2000 and FTS 2001 long distance contracts before retiring in 2003, said while agencies may be motivated to transform through EIS, he estimates less than one-third will actually do anything different.

“A lot of agencies don’t necessarily have all the buy-in of their sub entities and that makes it tough to get a unified front to figure out what they have and what they will do,” he said. “Many will do like-for-like, or they are going to transform in some moderate way.”

Both Groh and Gowen pointed to the three fair opportunity solicitations that are out today as an initial small sample size of the direction EIS may be heading.

The departments of Labor and Justice are among the first out of the gate and both solicitations are taking the “winner take all” approach for voice, video and data.

Justice is taking an interesting approach, Gowen said. The agency issued one solicitation, but it has three sections: one for the FBI’s voice and data where the vendor needs a top secret facility clearance, one for just the Bureau of Prisons telecommunications needs and finally one for the rest of DoJ. Vendors can bid on all three or any combination of the three.

The Social Security Administration, which was the first agency to release its request for proposal, wasn’t much better, looking for a vendor to provide data and voice services, which accounts for a majority of the agency’s needs.

The fact that these agencies seemingly are not transforming or modernizing through EIS doesn’t bode well for GSA, whose situation is getting tougher by the month. The Networx contract ends in May 2020 and GSA has said many times it doesn’t want to extend the contract, so agencies have to get their transitions done.

Kay Ely, GSA’s assistant commissioner for the Office of Information Technology Category (ITC) in the Federal Acquisition Service, opened the door slightly to an extension, saying in July that if agencies can show they are transforming more than transitioning to EIS, then an extension may be possible.

“If an agency says all I can do between now and May 2020 is like-for-like, then extensions are off the table,” Ely said at an AFFIRM event on July 26. “But if they are doing a hybrid or partial transformation and there is only so much work that can be done, we know in the back of our minds an extension is still out there. But agencies need to put all their efforts into transformation.”

But Groh, Gowen and others say an extension is all but assured.

As of Aug. 8, GSA reported four of nine EIS vendors have completed the business support systems testing, and all nine contractors will not be done until early November at the soonest.

After the BSS functional testing, all vendors’ systems must go through the final security approval process.

“GSA is now beginning to be concerned about how to get everyone through that authority to operate (ATO) hurdle. They are thinking about how do they expedite this ATO process,” Gowen said. “One of the things that has been suggested is for those awardees who want to work on a sprint to nominate themselves to get to the FISMA moderate accreditation faster. We certainly would raise our hands.”

Gowen said given the timeline of vendors not getting ATOs until the spring, GSA likely will have little choice but to extend Networx.

“As we are working with agencies we suggest, as they craft proposals, that they factor in whether all of their modernization goals are achievable within the targeted transition timelines. Timeframes are going to be aggressive,” said a Verizon spokeswoman by email. “When agencies are making their decisions, they should consider past performance to determine if the vendor they are selecting is capable of executing on an aggressive transition. Agencies must balance the urgency of transition with the necessity of modernization.”

The Interior Department may be one of the IT modernization outliers under EIS.

Tim Quinn, Interior’s chief of enterprise infrastructure, said at the Network Modernization Forum in June, sponsored by ACT-IAC, that the agency started discussing what transformation would look like under EIS more than a year ago.

“Go big means citizen delivery,” Quinn said. “If we have better, more complete big data driven, high performance computing heavy model driven interoperability between what USGS does with ground water and surface water with what NOAA does with its weather models, we can get much better predictions of things like [the flooding] in Ellicott City, Maryland. So when I talk about going big with EIS, in order to do those things I have to be prepared to deliver 100,000 sensors over the next few years. So going big is putting your business first.”

Quinn said Interior is looking for technology that industry may not be ready for today but since EIS is a 15-year contract there are a lot of possibilities on the horizon.

“We have tremendous change capacity built into the contract. I can write and award a new fair opportunity a year later,” he said. “We continue to write fair opportunities under Networx. We made changes along the way. One we did was a broadband fair opportunity through Networx and brought in a technology we didn’t even think about when we started Networx. I think it’s been a good thing for both Interior and government.”

Quinn said that approach is the future of EIS as well where agencies can bring on new technologies as they are ready.

“We want to get off Networx as fast as humanly possible, but we also want to change as fast as possible,” he said. “I have customers who view me as a dinosaur. We are behind and we need to be innovative so we need to partner with everybody to help each other get there faster.”

Gary Hall, the director of strategy, planning and operations at Cisco, said transformation for many agencies doesn’t have to mean a full-scale revolution, but more of an evolution.

“They should transform themselves from a perspective of providing operations and maintenance of on-premise gear to brokering the services they need, however they need them, whether on-premise or in the cloud,” Hall said.

Read more of the Reporter’s Notebook


OMB putting a twist on applied research to solve federal management challenges

The Office of Management and Budget’s idea to create a public-private applied research center to focus on federal challenges is not necessarily new. Over the last half century, there are dozens of examples of partnerships between the government and the private sector around challenges such as transportation, parks and recreation, and high-speed Internet access.

But what is different about OMB’s plan to create the Government Effectiveness Advanced Research (GEAR) Center is the focus on internal federal challenges such as workforce modernization and the use of data.

“When I arrived here in government, I was actually surprised to find there was a lot of cross-cutting applied research that brought that academic rigor to the problems of management in the public sector. The challenges we have around procurement, HR and IT are all intersecting challenges that I believe interdisciplinary skillsets of the private sector and the academic world will help us solve,” said Margaret Weichert, the deputy director for management at OMB, during a webinar on the GEAR effort on Aug. 23. “At its heart, what we are hoping to achieve in the GEAR Center, is to get your innovative ideas from the marketplace of ideas that you all live in and how we might tackle those challenges.”


Much of that applied research Weichert was referring to is done by places like the Partnership for Public Service, the National Academy of Public Administration and the Performance Institute and industry, which pays for the reports and studies.

While there is a level of objectivity in those efforts, the vendors aren’t paying tens of thousands of dollars for a study that doesn’t benefit their business line. For example, if vendor X provides management consulting services, they don’t want a report that finds agencies have plenty of skills to manage programs and projects.

The question then comes back to whether this GEAR Center can be any different, and whether it can be sustainable.

Those are among the questions OMB is seeking feedback on, by Sept. 14, through an RFI issued earlier this month.

Weichert said the vision of the GEAR Center is one that has an “independence of thought” and isn’t hamstrung by federal procurement rules and requirements.


“We know there are a number of universities who have centers of academic research. We know across government there are experts at Defense Advanced Research Projects Agency, Defense Innovation Unit in the Defense Department, the National Science Foundation involved in working with the academic community and many good government organizations have many good ideas. My personal favorite, state and local governments have great ideas about how they are partnering in their communities to transform old-style jobs into the jobs of the 21st century,” she said. “We didn’t want to confine our thinking to our initial hypotheses. We wanted actually to go to the experts, the people who are experts in data science, experts in continuous learning and reskilling and ask you what would be the things you would focus on and how might we structure this? How do we fund test and learn activities?”

At the end of the day, Weichert said the GEAR Center is about innovation and the ability to test theories that may lead to the need to change policies or laws.

“Part of the goal of the center is to help unlock money for infrastructure, innovation through test and learn activities that would benefit not only the government, but the private sector providers of those solutions,” she said. “This would be a message to the private sector folks…think about all the things you find difficult when trying to pitch new ideas or new economic models to government, how a center like this might help us ingest that innovation and how we might use this to create an on-ramp to pay-for-performance type solutions or better alignment to pay-as-you-go models. Those are all the kinds of things I could see happen here.”

OMB to come up with initial funding

Mark Bussow, a program manager in OMB’s Office of Performance and Personnel Management (OPPM), said the GEAR Center is expected to launch in 2019.

He said a lot of the responses to the RFI will help influence what the center will eventually look like. Weichert added that OMB will have some seed funding of a couple million dollars to get the center going.

She said the GEAR Center likely will be similar to centers at academic institutions that take on corporate funding to work on projects that have a commercial benefit in the future and can create long-term sustainable funding models that way.

“We at the government can provide a seat at the table, clarity around the vision and the agenda. We’d anticipate in whatever form that this center takes shape that the governance model would include probably two seats on whatever governing body there would be for this center for the government. One would probably be the DDM position and then we would probably have more of an institutional director at GSA also have a seat. Whether that’s the administrator or someone else is yet to be determined,” Weichert said. “The goal in having those two seats at the table is to help create that on-ramp back into the government. Ultimately the vision would be that the funding model would be self-sustaining outside of government.”

Weichert said the real measure of success is more than creating a test-and-learn environment and a self-sustaining funding model. It is taking the applied research and transforming the government.

“I’ve been asked to identify how many federal workers I can retrain in the next two years toward the jobs of the 21st century, particularly around IT and cybersecurity jobs. I haven’t settled on the final number that I can make a commitment to, but if the GEAR Center were an effective place I could turn to and say, could I train 200,000 workers in the next 18 months in the skills they would need to migrate from paper-based, process-oriented jobs to a cybersecurity job or a data center job, what might that look like? What tools could I access on demand to help make that transition?” she said. “If this center would enable us to provide examples of that, that we could actually deploy, that would be an incredibly successful start.”

Weichert said the biggest difference between the GEAR Center approach and previous ideas is the lens through which the initiative is looking for solutions. She said GEAR is taking an operator lens versus a legal or policy lens.

“Both are critical to any question that has to do with government, but starting with the question of how do we act differently and how do we implement differently is a question that in government normally only gets asked around mission,” she said. “DoD asks this question all the time in the field. FEMA asks this question all the time in the field. But it hasn’t been a priority to invest in these types of questions around disciplines that are just not that talked about in Washington. All the things I’m passionate about on the management agenda, finance, accounting, procurement, IT, information security, personnel policy, digital customer experience are not common priorities.”

She said the GEAR Center would help find those communities outside of government to help agencies better understand how technology can improve services to citizens.

“Government is in the services business; at least 50 percent of our services are delivered electronically, but we don’t invest that way,” she said. “That is the biggest difference to shift from the inside out to an outside in and look at what industry is doing around these same questions.”



Does innovation exist in federal procurement? OFPP is on the lookout

Innovation has become one of those words that has lost its meaning, particularly in the federal market.

Think about what is considered “innovative” these days. Reverse industry days? Vendors and agencies are supposed to talk.

Other transaction authority? The Defense Department, NASA and other agencies have had access to OTAs for 25-plus years.

Cloud computing? Ever heard of managed services or application service providers (ASPs)? Those were the cloud before “the cloud.”

So does that mean innovation in the federal sector is unattainable? Is it just, in the words of Mark Forman, the former administrator of e-government and IT at the Office of Management and Budget during the George W. Bush administration, “putting lipstick on the pig?”

“There is more discussion around innovation both inside and outside of the government, but there are not as many solid use cases and stories that break things down in a practical sense that will help people,” said one federal procurement expert, who requested anonymity because they didn’t get permission to talk to the press. “The biggest challenge that I see with innovation in general is people are throwing out the word and do not know how to apply it, how to problem solve. They are just saying, ‘We are doing something innovative,’ but it may not be innovative.”

This question of what innovation is and how to measure it is at the center of a new effort by the Office of Federal Procurement Policy (OFPP). It has asked the industry association ACT-IAC’s Institute for Innovation for help in identifying and detailing federal acquisition innovation across government.

Tim Cooke, CEO of ASIGovernment and a member of the institute, said a team of 40-to-50 volunteers are starting to work on a project to figure out what’s working in government and develop a series of use cases for others to follow.

“I’ve heard from lawmakers that they feel like they’ve got to find a way to reduce the burden and get some of the red tape out of the acquisition process for agencies,” Cooke said. “A lot of the concerns are real because there is a perception that there is a lot of red tape and a lot of barriers to entry to the federal market.”

Cooke said OFPP asked the ACT-IAC working group to focus on four specific areas:

  • Problem focused – What is the problem such as getting email to the cloud, and how to get late adopters to look at what early adopters have done? This area is focused on the President’s Management Agenda, specifically around the IT modernization goal.
  • Process focused – This is focused on the work of places like the Homeland Security Department’s Procurement Innovation Lab. Who is looking at the acquisition process and innovating, testing and expanding it? “They’ve gone to FAR Part 1 and set the stage by doing what makes sense for government. They are using business judgement and making the rules do what they need them to do,” Cooke said.
  • Build a catalog of learning by innovative organizations – What have agencies learned over the last 3-to-5 years about how to use “new” authorities such as OTAs, commercial solutions opening (CSO) or challenges? GSA launched a pilot program around CSOs in early August. “GSA’s CSO procedure offers fast-track vendor selection timelines, simplified contract terms, and a preference for allowing the vendor to retain core intellectual property, when appropriate. CSO is designed to attract start-up companies and those new to the federal market and should benefit both government and taxpayers with reduced costs and improved performance,” writes Chris Hamm, the director of FEDSIM, in a blog post.
  • Identify innovative organizations that don’t get a lot of attention – Cooke said there are more than 3,000 buying organizations across the government, and while places like GSA’s FEDSIM or the National Institutes of Health are well known, what are the other organizations that have successes they can share with the rest of government and industry?

Cooke said ACT-IAC hopes to deliver initial findings at the Executive Leadership Conference (ELC) in October.

Survey says agencies desire innovation

And it looks like the sharing of information can’t come fast enough. The ninth edition of the biennial survey of federal acquisition leaders by the Professional Services Council and Grant Thornton found that the use of innovative practices is a major priority over the next 2-to-3 years.

The survey found 82 percent of the respondents expect the use of innovative practices to increase by 2021, while more than half rated the use of and access to innovation a 2 on a scale of 1-to-4.

“This was an area I was a little disappointed in. It was clear that the term innovation is being talked about a lot in government, but it’s not well understood. I don’t think senior leaders are defining the issue well and I don’t think the acquisition community understands what is expected of them in terms of innovation,” said Alan Chvotkin, senior vice president and counsel at PSC. “The acquisition respondents we talked to really didn’t believe they were doing a good job in either using innovative acquisition techniques or gaining access to innovative capabilities in industry.”

Chvotkin said there clearly are pockets of innovation; the labs and the rise in the use of OTAs are two examples of where change is happening.

But he said agencies should refine where they really need innovation: in the acquisition process, in the technology and services they buy, or both.

The federal procurement expert said the results of the survey were not surprising.

The expert said too often agencies are satisfied to continue with the status quo because being innovative requires more work and more risk.

“The larger piece to all of this is you can’t turn the Titanic easily, and the government is not set up whether around acquisition or finance or human resources or IT to ‘innovate’ quickly,” the expert said. “The resources just aren’t there. If you had a project and wanted to pitch your executive with options, folks may want to do it, but to actually do it right, you will have to devote resources that aren’t there. Other offices will not give up their resources for something that is potentially experimental. They are more willing to go the traditional way of doing the work.”

But experts say hope is far from lost. DHS’ Procurement Innovation Lab (PIL), GSA’s Federal Acquisition Service, the Department of Health and Human Services’ Buyer’s Club and several other examples prove that the combination of leadership and desire can create the right environment for change.

Now whether it’s innovation or just bringing existing tools to the front and center, well that’s a different discussion.



GSA to close down reverse auction platform after 5 years


Look back over the last, say, 25 years and see if you can remember the last time an agency publicly admitted defeat with a program or project and decided to move on. Don’t think about a contract that failed or even an act of Congress ending a specific program, but focus on a service offering that just couldn’t make it.

It’s hard to find one. Right?

This is why the General Services Administration’s decision to shut down its reverse auction tool on Dec. 31 is significant.

GSA is pulling the plug after just over five years.

“GSA has made the difficult decision to decommission our Reverse Auctions (RA) platform. Unfortunately, the platform did not prove to be financially viable,” Erv Koehler, the acting deputy commissioner of the Federal Acquisition Service, wrote in a note to GSA agency customers and industry partners on Aug. 7. “Our focus now is to ensure the platform is shut down in an orderly way. As such, system operations will conclude at the end of Fiscal Year 2018, which means the RA platform will not be available for either the creation or management of auctions after Sept. 30, 2018. Auctions with end dates after Oct. 1, will be allowed to conclude as scheduled. GSA will maintain RA system access for users through Dec. 31, allowing for the retrieval of auction-related documents.”

What’s even more surprising is the fact that Koehler wrote the note to announce the decision. Koehler, who when not acting deputy commissioner is the regional FAS commissioner in Atlanta, spearheaded the reverse auction platform, so he both launched it and shut it down.

A GSA spokeswoman added: “Decommissioning GSA’s Reverse Auctions will allow GSA to refocus resources and personnel to support other critical growth areas in the Federal Acquisition Service. Our customer service directors are working with current RA vendors and customers to find the GSA tools, platforms and programs that best meet their specific acquisition needs.”

That last sentence from the spokeswoman really is at the heart of why the reverse auction platform ultimately failed. Experts say GSA’s tool never did enough to meet customers’ specific acquisition needs.

Reid Jackson, the CEO of Compusearch, which runs the FedBid reverse auction platform after buying the company in 2017, said GSA’s decision caught him by surprise but customer agencies clearly wanted more from the platform.

“We ran into competition from GSA’s account managers at agencies who had competitively awarded contracts with FedBid and it seemed to us GSA was actively competing,” Jackson said in an interview. “There are different models for reverse auctions. GSA was running more of a self-service portal where agencies posted things like they do on eBuy. FedBid is more of a full-service portal where we help buyers recruit sellers and drive competition to the marketplace. This statement by GSA is perhaps evidence that what buyers are looking for is more full service offering to drive competition, save time and actually reduce procurement action lead time. We think buyers want a more comprehensive service offering than a self-service portal.”

GSA’s platform didn’t pick up steam

Comparative data between FedBid and GSA is hard to come by. A July Government Accountability Office report on reverse auctions found that in 2017 agencies conducted about 19,000 reverse auctions valued at about $1.5 billion, but it didn’t break down how many went through GSA, how many went through FedBid and how many went through some of the other lesser-known tools such as the Army’s CHESS IT e-mart auctions.

GSA says on its reverse auction website that it has awarded $249 million through the platform across 31 agencies.

Tim DiNapoli, GAO’s director of Contracting and National Security Acquisitions, said only a very small percentage of the reverse auctions in the federal marketplace went through GSA’s platform.

“It is worth mentioning that GSA started the initiative some five years ago. It seems to me that despite their efforts, the use of the platform never seemed to pick up enough steam or visibility to make it, in GSA’s eyes, a viable proposition relative to the costs of providing the services or diverting the staff resources from other priorities,” he said.

To some, it was unclear why GSA even got into the reverse auction game in the first place.

One former federal procurement official, who requested anonymity because their current company still does work with the government, said the market spoke years ago about which platform they liked better.

“The GSA tool didn’t have all of the functionality, features and ancillary services that some industry providers offered. The economics of it weren’t any more compelling than using existing tools,” said the former official. “I really hope that the lesson learned here is when GSA or another service-providing agency sees an opportunity to improve a technology, service or tool already provided by industry, they try to partner with industry to benefit all partners.”

The former official said GSA didn’t work with FedBid or Compusearch, which had its own reverse auction tool before it bought FedBid, to create an integrated offering, and instead tried to compete with the vendors.

“It was clear from the initial look and feel of GSA’s tool that it was modeled after existing reverse auction platforms and used the same characteristics and terminology,” the former official said. “Despite the time GSA took to improve the tool and an aggressive marketing of it, the vast majority of government buyers chose the industry option over GSA’s option. I think this also points to some overriding concerns around GSA’s services in general. I think that with all of the industry and government focus on trying to bring innovative technology to government buyers, this strange imbalance exists where GSA too rarely thinks creatively about how to partner with industry. The reverse auction tool is an example of a missed opportunity to improve service delivery.”

Reverse auctions settled down

Compusearch’s Jackson said the reverse auction market seems to have settled into a good rhythm and understanding. He said FedBid has seen a 10 percent growth in 2018 over 2017, and for the first time in the last five or so years lawmakers didn’t include any new provisions in the Defense authorization bill targeting reverse auctions. The last memo from the Office of Federal Procurement Policy around reverse auctions came in 2015, and agencies, including GSA, are successfully using reverse auctions with little or no fanfare.

“It is one of many available tools and as such there are places where it makes a lot of sense when buying commercial items,” he said. “Now people are understanding the right sorts of acquisitions that lend themselves to it and which don’t. I think during the first 5-to-10 years of reverse auctions people may not have understood that as well and may have applied reverse auctions where it wasn’t the right tool. I think we’ve moved beyond that.”

Jackson said most agencies use reverse auctions for commercial auctions below the simplified acquisition threshold (SAT) of $250,000, and use traditional solicitations for more complex goods and services.

The former federal procurement official said they give GSA a lot of credit for deciding to shut down the reverse auction tool because too often agencies continue to “throw good money after bad for fear of embarrassment of admitting failure.”



VA, DISA bring in new acquisition executives

It feels like you can’t go a week without another federal IT executive or two or three leaving for the private sector or the Florida sun.

And while the path to the private sector and retirement remain well traveled, the good news is reinforcements are arriving.

At the Veterans Affairs Department, Secretary Robert Wilkie appointed Karen Brazell as the new principal executive director for the Office of Acquisition, Logistics and Construction on Aug. 6.


Brazell replaced Greg Giddens who retired in November after 37 years in government.

In her new role, Brazell oversees acquisition, contract administration and supply-chain processes for VA as well as serving as the agency’s chief acquisition officer.

One of her big focus areas likely will be VA’s continued effort to improve its construction project management. In January 2016, Giddens launched an initiative to improve the project and program management of projects around five core principles.

In January, VA said the Rocky Mountain Regional Medical Center in Aurora, Colorado, was 98 percent complete, and the department expected patients to begin using the new medical center in August.

In addition to construction challenges, Brazell will have a full plate as the agency moves to the cloud in a big way, tries to bring in new technologies more quickly through its Lighthouse initiative, and continues to struggle with IT logistics initiatives.

Before coming to VA, Brazell worked in several different government organizations and served in the Army for four years in the 1980s.

She comes to VA after serving as the chief of staff for the White House Military Office, where she oversaw strategic planning, engagement planning, communication product development, staff coordination and integration, special projects, policy development, and resource management. She also was deputy director of Acquisition and Resource Integration for the Naval Facilities Command.

Additionally, Brazell spent 17 years as a contractor in the Defense sector before joining the DoD civilian service in 2006.

The Defense Information Systems Agency also filled a key acquisition position by naming Carlen Capenos as its newest Office of Small Business Programs (OSBP) director.

Capenos started in her new role on Aug. 6 after spending the last 22 years working for the Department of Defense in contracting and with small businesses. She has been with DISA since 2015.

She replaces Sharon Jones, who retired in April after 40 years of federal service.

Capenos moves into her new role with a goal of continuing to expand DISA’s contracting success with small businesses. The agency reports that in fiscal 2018 it awarded $1.7 billion in prime contracts to small businesses. These 6,522 contract actions represented 28.2 percent of all contracts awarded by DISA.

Capenos joined DISA in 2015 as chief of the acquisition resources and special projects branch. Prior to that, she worked with the U.S. Army Corps of Engineers in a number of roles, including as the deputy for small business programs and as chief of the Secure Environment Contracting Branch.

Labor, USDA put out help wanted signs

While VA and DISA added new executives, the Agriculture and the Labor departments are looking at resumes to fill key positions.

Tony Cossa, who served in significant technology roles at USDA including as director of cloud strategy and acting chief technology officer, left government after more than a decade to join Oracle. Cossa is a senior product strategist for the software giant.

Cossa spent the last four months working as a senior advisor on Agriculture’s technology modernization effort under the White House’s Center of Excellence (CoE) initiative.

He also worked at the General Services Administration for four years and at the Homeland Security Department.

Over at Labor, Mika Cross jumped to the private sector after spending nearly three years working as the director of strategic communications, digital and public engagement for the Veterans’ Employment and Training Service.

Cross is now vice president of employer engagement and strategic initiatives at Flexjobs, a service that helps workers find flexible/telecommuting jobs from multiple sectors.

In addition to her time at Labor, Cross set up a well-respected telework program at USDA, worked at the Office of Personnel Management as an HR consultant and served at the Consumer Financial Protection Bureau as director of work/life and flexible workplace strategy.

Read more of the Reporter’s Notebook


How agencies can stop playing ‘Russian Roulette’ with their email security

The number of agencies playing “Russian Roulette” with their email remains amazingly high.

With less than two months before the Homeland Security Department’s Oct. 16 deadline, the number of agency domains still not meeting the requirements under Binding Operational Directive 18-01 is more than 200.

The main focus of the BOD from last October is for agencies to move to full use of the Domain-based Message Authentication, Reporting and Conformance (DMARC) protocol, an email-validation system designed to detect and prevent email spoofing. They also must implement Hypertext Transfer Protocol Secure (HTTPS) and HTTP Strict Transport Security (HSTS), and disable weaker cryptography standards.
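For readers unfamiliar with the mechanics, a DMARC policy is simply a DNS TXT record published under a domain, and HSTS is an HTTP response header. A rough sketch of what compliance looks like follows; the domain and reporting address are made-up placeholders, not anything from the directive itself:

```text
; Hypothetical DNS TXT record at _dmarc.example.gov
; "p=reject" tells receiving mail servers to discard mail that
; fails authentication, and "rua" says where to send aggregate reports
_dmarc.example.gov.  IN  TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.gov"

# Hypothetical HSTS response header sent over HTTPS, instructing
# browsers to refuse plain-HTTP connections to the site for one year
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```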

“If you don’t know where an email comes from, that is creating a risk from the number one communications platform,” said Alexander Garcia-Tobar, CEO of Valimail, a cybersecurity company focused on implementing DMARC and other email safeguarding standards, at the recent cyber summit sponsored by 1105 Government Information Group in Washington. “The risk is still growing as email is completely unsecure. Criminals, state actors and others are taking advantage of the fact that email isn’t authenticated. You wouldn’t accept a credit card without swiping it, but it seems to be okay in accepting email on its face value.”

The Office of Management and Budget’s website tracking agency progress against BOD 18-01 and other related requirements shows some surprising agencies that have made little to no progress in complying. The Federal Election Commission is at 28 percent complete. The Consumer Financial Protection Bureau is at 33 percent complete. And the Treasury Department is at 55 percent complete.

Source: OMB website pulse.cio.gov

These are just three agencies that deal with the public and hackers could easily spoof their email accounts to trick citizens into revealing personal information. Add to that: OMB found in the 2017 Federal Information Security Management Act report to Congress that the number of attacks via email or phishing doubled in 2017 to more than 7,300.

Patrick Peterson, the founder and executive chairman of Agari, another cyber company protecting emails, said 81 percent of the civilian agencies have adopted phase one of DMARC, meaning they can authenticate their email address to other users.

He said 52 percent of the agencies have implemented the second part of DMARC, which focuses on protecting, rejecting and enforcing the domain name security protocols.

“Over the next two-to-three months to get to 100 percent across government will not be easy,” Peterson said. “In order to get to phase 2, agencies have to track down all third party senders, so that means all sub-agencies that use subdomain to send email. That does take work. But hopefully by the October deadlines agencies will be much closer to 75-to-80 percent. That would be a pretty good one-year turnaround.”
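The two phases Peterson describes map directly onto the policy tag in a domain’s published DMARC record. As a rough, hypothetical sketch (none of this code or these records come from the article), a record can be classified like so:

```python
def dmarc_phase(record: str) -> str:
    """Classify a DMARC TXT record into the two deployment phases.

    Phase 1 publishes a record for monitoring only (p=none);
    phase 2 enforces by quarantining or rejecting unauthenticated mail.
    """
    # Split "v=DMARC1; p=reject; rua=..." into tag/value pairs
    tags = dict(
        part.strip().split("=", 1)
        for part in record.split(";")
        if "=" in part
    )
    if tags.get("v") != "DMARC1":
        return "no valid DMARC record"
    policy = tags.get("p", "none")
    if policy in ("quarantine", "reject"):
        return "phase 2: enforcement"
    return "phase 1: monitoring only"


# Hypothetical records for illustration:
print(dmarc_phase("v=DMARC1; p=none; rua=mailto:reports@example.gov"))
# → phase 1: monitoring only
print(dmarc_phase("v=DMARC1; p=reject; rua=mailto:reports@example.gov"))
# → phase 2: enforcement
```

This is why Peterson’s phase-two numbers lag phase one: publishing the record is easy, but moving the policy tag to enforcement first requires identifying every legitimate sender, including third parties and subdomains.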

HHS case study using DMARC

Peterson pointed to a case study Agari likes to highlight as to why DMARC matters so much. Agari worked with the Department of Health and Human Services to protect the HealthCare.gov website.

After implementing DMARC in 2016, HHS saw no phishing campaigns against the popular health care website.

“Their chief information security officer sent us a note saying there was no phishing going on and he thought there was something wrong with the system. We double-checked it, and found everything was fine,” Peterson said. “The emails went to reject and didn’t get delivered. The bad guys had gone off to attack other agencies because emailing citizens with fake notices wasn’t working well.”

An example such as this one should be enough to convince every agency and private sector organization to move quickly to DMARC.

Peterson said there are 217 domains subject to the directive that are not yet compliant with phase 1, but a majority of the consumer-facing ones, including IRS.gov, HealthCare.gov and others, are in good shape for phase 1, if not also for phase 2.

But that’s not the case, most surprisingly, in the intelligence community, including the CIA, the Office of the Director of National Intelligence and the Terrorist Screening Center, which OMB’s website shows are 0 percent complete. Now to be clear, the IC doesn’t have to comply with the BOD because national security systems are exempt, but it’s nonetheless surprising.

John Sherman, the assistant director of National Intelligence and Intelligence Community chief information officer, said in an email to Federal News Radio that the IC has a range of activities to implement cybersecurity best practices.

“Cybersecurity is a key priority of mine, and IC CIO is currently in the process of coordinating with the Intelligence Community a cybersecurity implementation plan that will identify the foundational tasks needed to improve our safeguarding posture and drive some really important conversations on risk,” he said.

DoD to implement email security by Dec. 31

At the same time, lawmakers want DoD to implement DMARC. In the 2019 Defense Authorization bill, Congress included a provision requiring the Pentagon to implement the email security protocol.

Additionally, lawmakers also are requiring DoD to implement future BODs by having the DoD CIO “notify the congressional defense committees within 180 days of the issuance by the Secretary of Homeland Security after the date of the enactment of this act of any Binding Operational Directive for cybersecurity whether the Department of Defense will comply with the directive or how the Department of Defense plans to meet or exceed the security objectives of the directive.”

At the same time, DoD CIO Dana Deasy told Sen. Ron Wyden (D-Ore.) in July that the Joint Force Headquarters DoD-Information Networks (JFHQ-DoDIN) will issue a tasking order by mid-August to implement the BOD’s requirements, with a completion date for most requirements of Dec. 31.

But even if you take out the IC and Defense community, the number of domains that still have a long way to go with less than 60 days left is disconcerting for many reasons.

Rob Holmes, vice president of email security at Proofpoint, said the biggest challenges for agencies include identifying legitimate senders, finding internal owners of email programs/mail flows and working with authorized third parties to align their sending practices with the constraints of the DMARC standard.

“While there are no technical reasons why certain agencies may not be able to deploy DMARC, there are technical reasons why it may be more difficult and risky for some agencies to deploy DMARC,” Holmes said. “For example, if an agency has a particularly large and/or complex email ecosystem that uses a number of different email service providers across different locations with different change control processes. Some agencies might feel that the BOD 18-01 was sprung on them and therefore might not have the necessary DMARC deployment funds and resources in addition to an already established budgeting cycle.”

Marcus Christian, a cybersecurity and data privacy attorney with Mayer Brown and a former executive assistant U.S. attorney for the Southern District of Florida, said DMARC implementation is a good-news story for agencies. He said it is an example of federal employees getting ahead of the private sector, countering the perception that government can’t lead when it comes to technology.

“There is no reason why all of these domains couldn’t be secured by DMARC,” Agari’s Peterson said. “Even those that aren’t used all that often are actually easier to apply DMARC to. We don’t see any rhyme or reason why agencies can’t meet the Oct. 16 deadline. It’s just a matter of agencies having their act in gear.”

Read more of the Reporter’s Notebook


GSA-OPM set March 2019 timeline to complete initial merger

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

September will mark 13 years since the General Services Administration merged the Federal Technology Service and the Federal Supply Service to make the Federal Acquisition Service.

It’s a good time to remember what GSA went through in 2005 because once again, the agency is going down a similar path of trying to bring employees, their cultures and the services together. GSA is implementing the Trump administration’s proposal to bring the HR Solutions office at the Office of Personnel Management into FAS.

Back in 2005, GSA hired a contractor to help with the merger, chose leaders of the organizations to develop and implement a plan, received congressional approval and then completed the blending of the two organizations.

Today, GSA is following a comparable playbook. The agency first named Mary Davie, the deputy commissioner of FAS, to lead the planning effort. Now, GSA is looking for vendor help. It released a request for information last week seeking feedback across five broad areas.

“This is a very ‘GSA’ way of going about things to hire contractors and get recommendations,” said one former OPM official, who also is familiar with GSA and who requested anonymity because they did not get permission to speak to the press. “Part of the reason for needing help is OPM doesn’t have the acumen to support this merger.”

The RFI also gives the first timeline for the merger. GSA said it expects the vendor to complete its work, which includes the rebadging of 462 employees in HRS and other actions, by March 30, 2019, to prepare for day one of operations, which is scheduled for later in the spring. By 2020, GSA said it expects to complete the full merger.

“Contractor support for this effort is envisioned as an extension of the GSA/OPM task force, to support the transition and transformation activities,” the RFI states. “The scope of this initial effort includes: Supporting the planning and execution of the initial transition of the HRS organization and personnel into GSA; optimizing services and costs, improving alignment with GSA offices and functions and identifying opportunities to reduce duplication and overlap; and identifying further opportunities to transform the delivery of services that today comprise HRS and other, related services currently offered today by OPM and GSA.”

Responses to the RFI are due Aug. 17 and GSA said it expects to release a final solicitation before Sept. 30.

OPM-GSA merger makes sense

The eventual contractor will have its work cut out, as former OPM and GSA officials say that, like in any merger, there are culture and process obstacles that will need to be addressed.

“I think that contractors are very good [at] being objective listeners and facilitators of communications, but internally people have to want to listen,” the source said. “A contractor is not going to change what a federal employee has been living with for years and years. That will have to be driven internally by leadership. They will have to believe in it.”

A government source with knowledge of OPM said moving some parts of HRS into GSA makes sense for several reasons. The source, who requested anonymity because their current agency didn’t give them permission to talk to the press, said OPM already transferred the contracting capabilities of HRS to GSA to run the $11.5 billion Human Capital and Training Solutions (HCATs) multiple-award contract. In fact, as a bit of an aside, it’s interesting that GSA and OPM also announced last week they were reducing the fee under HCATs to 0.75 percent from 2 percent as a way to make the contract more competitive and less costly than the schedules program.

Without a doubt, HCATs has not lived up to expectations over the last two years. So dropping the fee is an attempt to make the contract more attractive over the last two months of fiscal 2018 and into 2019.

The source said it also makes sense for GSA to take over the shared services that HRS provides, including USAJobs.gov, USAstaffing.gov and other related systems.

“Probably the hardest overall issue will be figuring out the cost recovery model. In the past, the training and management assistance (TMA) covered a lot of the costs for the rest of the HRS organization. When TMA integrated with GSA [through HCATs] that model was not sustainable in setting fees,” the government source said. “My guess is the other HRS programs probably have questionable financials now. GSA will likely consider each of these HRS components on a standalone basis and that will likely lead to a hard look at staffing and making investment decisions based on profitability versus an OPM culture that prioritizes HR programs, even unprofitable ones, and carrying people, even unproductive ones.”

At the same time, however, it is less clear where HRS’s traditional training efforts, such as the Federal Executive Institute, or its fee-based consulting services for human resources strategy and development would fit inside GSA.

“One of the things they are trying to do is make a case for the value proposition for why they would do this so it will help to have a third party do that work,” the source said. “What’s interesting to me is there have been opportunities for OPM and GSA to partner before. But it was not until HCATs that they made it work. OPM really takes an OPM-centric view and in the past didn’t see themselves as a shared services provider, but rather as doing their mission. In a sense, they weren’t excited for shared services and may have missed some opportunities early on. Now that the merger is actually happening, GSA and OPM will have to deal with the culture change that comes with any big change like this one.”

OPM heading toward a culture clash?

The first source agreed that OPM is a unique place compared to GSA.

“GSA has a much more results-oriented approach to delivering programs than OPM,” the source said. “Overall, especially in parts of HRS that use contractors, this will be a much better way to do business. I think especially in intermediate and senior level staff the move to GSA will be a bit of a shock and they will have challenges adapting to GSA’s culture.”

One way GSA is trying to address these culture challenges is by bringing in David Vargas, the former OPM director of the HR Line of Business and acting chief information officer and deputy CIO.

One industry expert, who also requested anonymity because their company does business with both GSA and OPM, said the merger is concerning for several reasons.

First, by adding HRS from OPM, GSA becomes that much bigger an agency and will play a larger role in other agencies’ day-to-day operations, raising the question of what the impact of those changes will be across government.

“The government needs to focus on decentralizing and distributing key OPM authorities out to the agencies,” the source said. “The track record of federal HR shared services is poor and has increased costs and added layers of inefficiency instead of the other way around – how does this help? What’s the business case for it? What does this do to free agencies to access better technologies to improve talent acquisition and talent management business processes and outputs inside the federal hiring agency?”

Read more of the Reporter’s Notebook


Oracle, DoD face-off over cloud contract rests on single award rationale

If nothing else, Oracle Corp. has kept the federal community entertained over the last few years. First, the software giant submitted a stingingly candid commentary on the Obama administration’s IT modernization efforts, detailing three false narratives perpetuated over the last decade.

Now, Oracle offered a biting rejection of the Defense Department’s Joint Enterprise Defense Infrastructure (JEDI) cloud contract just 11 days after the Pentagon released the final solicitation.

Oracle’s protest of DoD’s decision to make JEDI a single-source, single-award contract is almost as aggressive a takedown as the company’s IT modernization comments.

“This anti-competitive RFP violates law and regulation, and creates significant risk that DoD will award a 10-year, $10 billion contract to a company that will not offer the best value for all the potential JEDI Cloud user’s current and future cloud service needs,” Oracle’s lawyers write.  “The DoD determination and findings oddly intimates that DoD will receive proposals for firm fixed prices to meet DoD’s future, unarticulated tactical cloud computing needs (classified and unclassified) for the next 10 years and today can determine the single best value cloud computing technological leader over the next 10 years when some — if not most — of the impactful technology has yet to be developed.”

While the Government Accountability Office reviews and analyzes the JEDI case under a November deadline, we asked three federal acquisition lawyers for their take on the case. None are associated with Oracle or the case, and all are offering their opinions based on Oracle’s public filings.

Federal News Radio talked to Steve Schooner, Nash & Cibinic professor of Government Procurement Law and co-director of the Government Procurement Law Program at the George Washington University. FNR also spoke with Antonio Franco, a partner with PilieroMazza PLLC, and Charles Tiefer, a professor of Government Contracting at the University of Baltimore Law School.

Of Oracle’s three arguments about why the RFP is flawed, which is strongest?

Franco: Oracle’s strongest argument is that the Department of Defense has not established that a single award is appropriate for such a large, long term contract. With multiple award contracts offering more options to accommodate technological innovations, competition among multiple contractors seems more appropriate under the FAR. Putting aside questions about the rationale for the large single award, Oracle also raises legitimate questions about compliance with the FAR.

Tiefer: Oracle’s strongest argument is that Defense’s restriction of JEDI to a single awardee violates the policy of both Congress, by the Federal Acquisition Streamlining Act (FASA) law, and the FAR regulation, to compete task and delivery orders.  Congress, by the FASA law, and the FAR regulation do not intend that there be only one competition at the beginning, and then monopoly award of delivery orders, in a 10-year contract for billions of dollars of services.

The JEDI contract concerns services that are very rapidly changing and evolving, and Defense just doesn’t justify skipping competitions over task orders for the next decade. It violates common sense to set up what is basically a monopoly contract for such services.

Schooner: If GAO determines that the contracting officer’s underlying determination and findings was defective, that’s a much easier case. Maybe my favorite/best technical/micro-level argument is the lack-of-finite-scope criticism. The RFP does not purport to identify the “specific tasks to be performed” in contract year one, much less over the 10-year period of performance. In other words, there is no baseline for an apples-to-apples comparison.

Oracle writes, “The RFP neither establishes the prices that DoD will use across the contract term nor identifies the specific tasks to be performed. Both the pricing and the cloud service offerings are dynamic. The same is true of the commercial marketplace of third party software DoD seeks to access.” That’s all part and parcel of the “RFP does not provide a reasonable basis to assess the relative price to DoD of making a single award …” Oracle writes.

Did Oracle make a good enough case to show that they were actually prejudiced enough by the RFP?

Tiefer: Oracle made a convincing case that they are actually prejudiced. They show that the only way DoD could find a loophole to make a single award was to find the unbelievable. Namely, Defense would have to credibly set out now the details for competing on a firm fixed price for all future task orders. But, Oracle cannot be obliged to set genuine firm fixed price bids for the highly diverse future task orders, impossible to anticipate this far in advance, for which the details are, of course, quite unavailable.

Franco: Oracle has made a credible claim of prejudice, as the GAO does resolve doubts about prejudice in favor of the protester when an agency allegedly violates procurement regulations. Oracle has raised legitimate issues for the GAO’s review. If the GAO ultimately finds that no procurement regulations have been violated, the GAO can decide the case on the merits. Finding that Oracle has not been prejudiced by the alleged procurement violations, in order to avoid a decision on the merits, would be unfortunate, as Oracle raises important questions about a huge contract.

Schooner: Neither DoD nor commercial technological marketspace leaders can accurately predict where the still nascent cloud computing industry will be or who will lead it five years from now, much less 10. With quantum computing, blockchain, artificial intelligence and machine learning, internet of things and other technologies actively disrupting a disruptive technology, the only constant is change. DoD knows this, which leads to this obvious “duh” moment. Firm fixed price contracts looking ahead 10 years for evolving technologies is professional guesswork.

Relying on, or expecting to hold a contractor to, those FFP prices is unrealistic, a fool’s errand, inconsistent with experience, delusional, irresponsible — fill in the blank with something that rhymes with arbitrary and capricious. The DoD determination and findings oddly intimates that DoD will receive proposals for FFPs to meet DoD’s future, unarticulated tactical cloud computing needs — classified and unclassified — for the next 10 years and today can determine the single best value cloud computing technological leader over the next 10 years when some – if not most – of the impactful technology has yet to be developed.

What legal questions remain outstanding about this entire procurement?

Schooner: Challenging the nature of a stated requirement or the basic acquisition strategy – in a vacuum – is a tough row to hoe. Agencies are entitled to a fair amount of discretion in most acquisition planning disciplines. The bar should be relatively low for DoD to demonstrate there was a rational basis for this strategy. The Competition in Contracting Act of 1984 and the Federal Acquisition Regulation suggest a strong open-market, i.e. full and open competition, bias. Nonetheless, closing a market, even a large market, for a long period of time is not in and of itself objectionable. DoD frequently closes massive markets through industry consolidation.

The Navy relies on a single shipyard for nuclear aircraft carriers and two shipyards for nuclear submarines. The Air Force selected a single air frame for the next generation of in-flight refueling. The Army did the same with its new sidearm, or modular handgun system. The reality is that sustaining competitive markets typically involves purchasing excess capacity or subsidizing potential competitors, longstanding, common practices that have fallen out of favor over the last few decades.

Franco: This procurement likely raises a number of other legal questions which I have not been able to evaluate. The one question that does come to mind was how the DoD is going to reconcile its supposed need to support a “common environment” for artificial intelligence and machine-learning requirements when the department has recognized it will always have multiple cloud environments. The DoD contemplates cloud users to include all of the DoD, including the Coast Guard, Intelligence Community, countries with which the United States has defense arrangements, and federal government contractors. How does DoD reconcile the conflicting goal of having multiple cloud environments with the common environment the procurement seeks to develop?  If the agency decides that it needs to go with one common environment, it means that the huge cloud market may be the domain of one contractor for the next 10 years. It is questionable whether that serves the government’s best interest and offers the best value.

Tiefer: The key remaining legal question is how much backing the top leadership of DoD can give to its nominal findings that the RFP can justify a single award for 10 years based on firm fixed prices. Another legal question for the JEDI contract, not covered in this protest, is whether DoD is setting up this RFP so that it is tilted in favor of Amazon because of Amazon’s unique credential of a past CIA contract. The super-high level of classified treatment for the CIA is unnecessary for 99 percent of what DoD handles, and for Amazon to have a high preference just for that limited aspect is for the tail to wag the dog.

Oracle’s long term view

GAO has until November to make a decision. But Schooner said this initial protest by Oracle may be more about setting the groundwork for future complaints.

“Also, even if Oracle doesn’t prevail, the protest may build a record that could potentially constrain DoD or tie DoD’s hands once the competition actually begins,” Schooner said. “In other words, Oracle may simply be seeking to obtain additional information and create a record upon which to build a subsequent protest, once Oracle is excluded from the competition or DoD actually chooses its JEDI partner.”

One thing is clear in all of this: it will be a lot of fun watching the saga unfold, and hopefully all parties involved will learn something about listening.

Read more of the Reporter’s Notebook


5 lessons CIOs can take from confusion around FCC’s alleged cyber attack

After reading the Federal Communications Commission’s inspector general report that the commission’s then-chief information officer overstated — maybe even lied — in 2017 about suffering from a distributed denial of service (DDOS) attack when HBO’s John Oliver did a segment on the dangers of ending net neutrality, I had a simple question: Who cares?

Federal Communications Commission Chairman Ajit Pai

It’s not that anyone should condone the hyperbole or the outright attempt to misinform the public by a federal employee. But I just couldn’t get too excited about the report, especially given the fact that David Bray, the FCC CIO at the time, has been gone for more than a year and the net neutrality debate is over for now. On top of that, Bray’s reputation as an honest and caring federal employee leaves me wondering if this was more politics than problem.

So instead of going over all the unpleasant details of the IG’s 106-page report and trying to use our 20/20 hindsight to place blame on Bray and his staff for jumping to conclusions or sticking to a narrative that obviously had gone off course, I’ve asked former federal CIOs to offer some lessons learned to current federal executives about dealing with a similar situation that is likely to happen again.

Lesson No. 1: Stay calm

Sounds obvious, but panicking or jumping to conclusions based on an emotional reaction or political pressure is a real threat. Jonathan Alboum, the former CIO at the Agriculture Department and now chief technology officer at Veritas, said CIOs need space to assess the situation.

“It’s too easy to jump to conclusions, which is a sign that you don’t have a great understanding of your IT environment,” he said. “The better you understand what and where your data is and how your systems function, the faster you can get to the root cause.”

Tony Scott, the former federal CIO, said IT executives need to be careful in what they say without a thorough understanding of the facts because Congress is listening. Be cognizant that stating something as fact could have long-term repercussions.

Lesson No. 2: Have a plan

Several former CIOs highlighted the need to know what to do when something bad happens. It’s more than knowing who takes the system offline or the best number to reach the Homeland Security Department; you need to know all of the things that happen a few days, a few weeks and a few months after a cyber event.

“Ensure that effective incident response policies and standard operating procedures exist,” said Simon Szykman, the former Commerce Department CIO and now chief technology officer at Attain. “A strong policy/process framework will help ensure that incident response actions and reporting match the actual circumstances surrounding an incident, making it less likely that follow-on activities or communications get off track.”

The plan also should dip into the system architecture environment, said Shawn Kingsberry, the former CIO at the Recovery Accountability and Transparency Board and now vice president for digital government and citizen services for Unisys Global Public Sector.

“It is also clear that a plan should be in place to protect expected demand volumes, and a contingency plan should be also available, should the volumes greatly exceed anticipated demand,” he said. “This should become a part of the DNA of the organization and how they execute.”

Alboum said while outages are not common and can happen for a variety of reasons, having a strong resiliency program will help agencies turn their focus from what happened to the speed with which the organization can respond, correct the issue and resume successful operations.

Lesson No. 3: Obtain multiple points of view

The one big mistake Bray and the FCC made was failing to alert the Homeland Security Department's U.S. Computer Emergency Readiness Team (US-CERT) about the suspected DDoS attack. Federal law and regulations require agencies to contact US-CERT when a major incident happens, and the taking down of the commission's comment system would seem to fit the bill. The former CIOs said having that third party review audit logs and data will help put you on more solid ground.

“You need to get multiple points of view or multiple analyses of the problem,” Scott said. “You shouldn’t rely on a single source. Triangulate what actually happened because it could easily be a different technical issue than you first thought.”

Szykman added that having an independent, objective analysis, whether through an informal request to a chief information security officer at another agency or through formal channels to US-CERT, puts the CIO in a better place when discussing what happened with agency executives, Office of Management and Budget officials or lawmakers.

Lesson No. 4: Inspire trust

This is the one thing Bray always had going for him: the federal IT community and FCC leadership had confidence in what he did and what he said. So when he said the attack appeared to be a DDoS-type attack, FCC commissioners and lawmakers had every reason to believe him.

Alboum said trust comes from being correct as much as it does from simply saying “I don’t know right now.”

“At the same time, the leader must be able to articulate the things they are doing to get to the bottom of the situation,” he said. “However, if you don’t have visibility into your data or understand how your systems are working, then it’s much harder to articulate how you are going to resolve a crisis.”

Scott said sometimes you can inspire trust by admitting you were wrong with your first conclusion.

“CIOs have learned over and over again that what things first appear to be often change when the full facts become available,” he said. “Most of the time, your first inclination is probably wrong.”

Lesson No. 5: Data rules the day

This was another shortfall of Bray and his staff. FCC IT staff told auditors they didn’t have enough information from event logs to make any thorough determination of what happened. That led auditors and other experts to question why they were so sure the comment system suffered a DDoS attack versus just a significant spike in usage after the John Oliver show.

Every former CIO said having the right data and being able to present it to agency leadership and lawmakers is how you keep the conversation moving forward.

“Government organizations today need to be prepared and make strategic choices around data collection, data storage, utilization and the location of that data – and how it can be safeguarded against system outages or cyber-attacks. Recovery from a system or data center outage is exacerbated by the complexity that has been built into our IT environments. Getting to the bottom of a problem in our complex, hybrid, multi-cloud world is not easy. There are too many moving pieces,” Alboum said. “To understand the true cause of a system or data center outage or cyber-attack takes time. It also takes understanding of how all the pieces fit together. If you don’t understand where your data is located, you will be at a loss to answer questions from your leadership or respond to the public in the fastest and best way possible.”

Kingsberry added that if agencies build systems using digital standards, they will have consistent development and security operations services to draw on when investigating an incident.

“Logs from applications, network devices and security tools can provide data to confirm or invalidate assumptions regarding the nature/source of an incident,” Szykman said. “Don’t draw firm conclusions prior to completing a sound analysis, and be open to revising hypotheses if necessary.”
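Szykman’s point about using logs to confirm or invalidate assumptions can be illustrated with a small sketch. This is a hypothetical, simplified example — the log records, field names and thresholds are invented for illustration, not anything the FCC actually used. The intuition: an organic surge, like the traffic after the John Oliver segment, tends to come from many distinct clients each making a few requests, while attack-like traffic is often concentrated in a small number of very busy sources.

```python
from collections import Counter

# Hypothetical, simplified access-log records: (minute, client_ip).
# A real analysis would parse timestamps and client IPs out of web
# server or network-device logs.
records = [
    (0, "10.0.0.1"), (0, "10.0.0.2"), (0, "10.0.0.3"),   # baseline minute
    (1, "10.0.0.9"), (1, "10.0.0.9"), (1, "10.0.0.9"),
    (1, "10.0.0.9"), (1, "10.0.0.9"), (1, "10.0.0.9"),   # spike: one busy source
]

def profile_minute(records, minute):
    """Summarize one minute of traffic: request volume, distinct clients,
    and the share of requests coming from the single busiest client."""
    ips = [ip for m, ip in records if m == minute]
    counts = Counter(ips)
    return {
        "requests": len(ips),
        "unique_ips": len(counts),
        "top_share": max(counts.values()) / len(ips),
    }

baseline = profile_minute(records, 0)
spike = profile_minute(records, 1)

# Heuristic only: heavy concentration in few sources during a spike is
# more consistent with attack-like traffic than with an organic surge.
suspicious = spike["top_share"] > 0.5 and spike["unique_ips"] < baseline["unique_ips"]
print(suspicious)
```

A sketch like this does not prove anything on its own; it simply shows how log data can support or undercut a hypothesis before a CIO states a conclusion publicly.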

And finally, Scott said having the right data also lets you fill the gaps in communication with auditors, Congress, agency leaders and cyber staff.

In the end, communicating with all the stakeholders from a position of knowledge will ensure you can avoid getting caught up in a political firestorm.

Read more of the Reporter’s Notebook


DHS moving quickly to get National Risk Management Center off the ground

The Homeland Security Department isn’t waiting around to get its new National Risk Management Center up and running. DHS named Bob Kolasky to serve as the center’s first director. Kolasky currently is the assistant secretary for infrastructure protection in the National Protection and Programs Directorate (NPPD).

“Bob is uniquely qualified to lead this significant undertaking and I am confident he is ready for the challenge,” wrote Chris Krebs, the DHS undersecretary of NPPD, in an email to staff obtained by Federal News Radio. “Bob will stand up a planning team and begin his transition to lead the center.”

DHS Secretary Kirstjen Nielsen announced the new National Risk Management Center on July 31 at DHS’ Cybersecurity Summit. Nielsen said the new center would help break down some of the communication barriers that exist between the government and industry sectors when it comes to sharing cybersecurity threats.

“Our goal is to simplify the process, to provide a single point of focus for the single point of access to the full range of government activities to defend against cyber threats,” Nielsen said. “I occasionally still hear of companies and state and local [governments] who call 911 when they believe they’ve been under a cyber attack. The best thing to do would be to call this center — this will provide that focal point.”

Krebs said Steve Harris will serve as acting principal deputy assistant secretary alongside Scott Breor who will continue to serve as acting deputy assistant secretary.

“Steve, along with the rest of the IP leadership team, will continue the work Bob has undertaken, including enhancing our physical security capability and continuing regionalization efforts,” Krebs wrote.

Kolasky has been with DHS since 2007 serving in a variety of roles, including the assistant director in the Office of Risk Management and for the last six years in OIP. He also spent time as an analyst for the Government Accountability Office and worked in industry.

In addition to naming Kolasky and making other related personnel moves, Krebs offered more insight into how the new center will work with existing DHS programs, including the National Cybersecurity and Communications Integration Center (NCCIC) and the National Infrastructure Coordinating Center (NICC).

“[W]e identified a clear need for tighter collaboration across industry and government, not just in cybersecurity efforts, but in generally understanding and addressing existing and emerging risks. So as we continue to integrate the watch and warning functions of the NCCIC and NICC, we must also enhance efforts to understand holistic risk conditions across our nation’s infrastructure, whether cyber or physical — what’s essential, what’s a potential single point of failure, and what functions and services underpin our very society, government, and economy,” he wrote. “The NCCIC will continue to be our eyes and ears for cyber and the NICC for physical threats. The National Risk Management Center will be the engine for how we understand and the platform by which we’ll collectively defend our infrastructure.”

Krebs said pulling the eyes, ears and body together is part of how DHS will operationalize risk management.

“That higher order understanding of risk, criticality, and how to increase resilience has been at the heart of Office of Cyber and Infrastructure Analysis’ (OCIA) mission since its inception,” he wrote. “The establishment of the center represents the elevation of that mission and the operationalization of the secretary’s authorities to lead and coordinate national critical infrastructure protection efforts alongside our government and industry partners.”

From FERC to FDIC

Along with the changes at DHS, there are several other important people on the move in the federal technology and acquisition communities.

Mittal Desai moved to the Federal Deposit Insurance Corporation (FDIC) from the Federal Energy Regulatory Commission (FERC) to be the deputy chief information security officer.

Desai had been at FERC for 11 years, including the last four as its CISO.

The FDIC has replaced a good portion of its CIO executives over the last year, with Howard Whyte rising to be FDIC’s chief information officer in October. Whyte hired Zach Brown from the Consumer Financial Protection Bureau in April to be the permanent CISO, and now has brought on Desai.

The Marine Corps named Brig. Gen. Lorna Mahlock as its new CIO in July. Mahlock is the first African-American woman to achieve the rank of brigadier general in the Marine Corps. She received the promotion in April.

Before becoming CIO, Mahlock served as the deputy director for plans, policy and operations and commanding officer of the Marine Air Control Group 18 in Okinawa, Japan.

The Marine Corps had been without a permanent CIO since Brig. Gen. Dennis Crall left in February for a new position in the Office of the Secretary of Defense. Ken Bible, deputy director of C4 and deputy CIO, had been acting CIO.

New executive at NIST and former VA CIO lands

Additionally, Andy Blumenthal started a new position as program manager in the Office of the Associate Director for Management Resources (ADMR) at the National Institute of Standards and Technology.

Blumenthal joined the government in 2000 and served in a variety of senior IT roles, including chief enterprise architect for the Secret Service and the State Department’s CIO for Global Information Services. Since 2017, he had worked at the Department of Health and Human Services as the deputy chief operating officer in the Office of the Assistant Secretary for Preparedness and Response.

Finally, former Veterans Affairs CIO Scott Blackburn returned to his former company, McKinsey & Company, after leaving the government in early July.

Blackburn, who previously spent nine years with McKinsey, came back to the consulting firm in the public sector office focusing on health care, technology and large-scale transformation.

He spent more than three years at VA, including eight months as interim CIO.


