Reporter’s Notebook

Air Force joins growing list of agencies paving a new cyber-approval path

The Air Force is the latest in an ever-growing number of agencies so frustrated with the arduous and burdensome authority to operate (ATO) process that they have developed alternatives.

Similar to the National Geospatial-Intelligence Agency and the General Services Administration’s 18F organization, the Air Force figured out a way to speed up the process to get systems approved to run on its network, while keeping the necessary rigor and adding a new twist—continuous monitoring.

Air Force undersecretary and chief information officer Matt Donovan signed a memo March 22 detailing the new process that comes under the Defense Department’s risk management framework.

Donovan said the new process, called the “Fast Track ATO,” gives authorizing officials the discretion to make decisions based on several factors: the cybersecurity baseline, an assessment or penetration test, and a continuous monitoring strategy for the system.


“A fundamental tenet of this Fast-Track ATO process is the authorizing official will make these decisions by working closely with information systems owners and warfighters to find the appropriate balance between rapid deployment and appropriate level of risk assessment,” writes Air Force deputy CIO Bill Marion in accompanying guidance to the new policy. “Use cases for Fast-Track include applications developed for deployment to secured cloud infrastructure, and authorizing officials may consider other applicability as well; systems that have not ‘baked security in’ to the system design and are not prepared to endure a strong penetration test, are not good candidates for Fast-Track.”

Frank Konieczny, the Air Force’s chief technology officer, said the penetration testing assessment is the key piece of the entire faster process because it gives system owners some relief from having to individually address every security control in the risk management framework.

“The penetration testing will actually answer some of those controls right away, and, in fact, in better cases because it’s not compliance anymore but how you operationally put information out there,” he said at the RSA Security conference in Washington, D.C. “As we roll this out, what do we mean by penetration test? We are trying to explain that now by getting back to the operational side. What do we really need to support the system going forward and doing it faster than just by doing paperwork?”

Konieczny said the Air Operations Center tested out the Fast Track ATO, completing one in about a week for an application that lives on a highly structured platform using a DevOps approach.

“They are doing a lot of testing automatically. They are filling out most of the controls automatically. What they do after that is the penetration test, if it passes, then it’s ready to go,” he said. “The penetration testing is really an operational viewpoint. That will eventually take over some of the compliance issues.”
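The gate Konieczny describes — automated control checks first, then a passing penetration test as the final go/no-go, backed by a monitoring plan — can be sketched as a simple decision function. This is an illustrative sketch, not the Air Force’s actual tooling; the control names and fields are hypothetical.

```python
# Illustrative sketch of a Fast Track-style ATO gate: automated control
# checks run first, and a passing penetration test plus a continuous
# monitoring plan are the final go/no-go. Control names are hypothetical.

def fast_track_decision(automated_controls: dict, pen_test_passed: bool,
                        has_monitoring_plan: bool) -> str:
    """Return an authorization decision for a candidate system."""
    failed = [name for name, ok in automated_controls.items() if not ok]
    if failed:
        return "rejected: automated controls failed: " + ", ".join(failed)
    if not pen_test_passed:
        return "rejected: penetration test failed"
    if not has_monitoring_plan:
        return "rejected: no continuous monitoring strategy"
    return "authorized"

controls = {"encryption-at-rest": True, "access-logging": True}
print(fast_track_decision(controls, pen_test_passed=True,
                          has_monitoring_plan=True))  # prints: authorized
```

The point of the sketch is the ordering: compliance checks are automated and cheap, so the expensive human judgment is spent on the penetration test result.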

Fast-Track tested at Kessel Run

The service also tested out the Fast Track ATO at its Kessel Run organization, which is the Air Force’s new agile software development office.

The Air Force’s requirement for continuous monitoring is the other key piece of Fast Track. Konieczny said it could mean different things to different organizations, ranging from redoing the code every week with another penetration test to using automation to test the system and track any changes to the code.
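One lightweight form of the automated change-tracking described here is simply fingerprinting the deployed code and flagging any drift from the last approved baseline. A minimal, stdlib-only sketch — the paths and workflow are hypothetical, not any agency's actual tooling:

```python
# Minimal sketch of continuous-monitoring change detection: hash every
# source file and compare against the last known-good baseline, so any
# code change can trigger a re-test. Paths and workflow are hypothetical.
import hashlib
from pathlib import Path

def fingerprint(root: str) -> dict:
    """Map each file under root to the SHA-256 of its contents."""
    return {str(p): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(Path(root).rglob("*")) if p.is_file()}

def changed_files(baseline: dict, current: dict) -> list:
    """Files that were added, removed, or modified since the baseline."""
    return sorted(k for k in baseline.keys() | current.keys()
                  if baseline.get(k) != current.get(k))
```

A monitoring job could run `changed_files` on a schedule and kick off another penetration test whenever the list is non-empty.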

“Each authorizing official has the authority to do whatever they really want to do and take that risk or determine how much risk they want to take. They can determine the depth of the penetration test. The deeper the penetration test the better the results will be, and the best way to go into operational. I assume that more critical applications will actually receive a very deep penetration test as well as the continuous monitoring they want to field as well.”

The Air Force is joining the ranks of agencies finding a better, faster approach to the ATO process out of frustration with how long it takes to get new capabilities to warfighters.

The military services and DoD agencies too often view the risk management framework as a compliance exercise, which produces paperwork but not sufficient evidence that any one system is secure.

“The RMF process was taking too long based on the workload everyone was having and we wanted to go back to something that was more operational relevant,” Konieczny said. “The focus now is looking at real risk and operational risk. We looked at compliance risk before and everything was focused on compliance, which was good. But I can be a very devious programmer and I can get through the compliance issues without any problems, but I can still have an operational hole in my system. This is a way to fix that operational hole.”

OMB ATO streamlining strategy expected soon?

The Office of Management and Budget and others have recognized over the years that the ATO process was broken. Back in 2017, OMB said it was running a pilot program to consider other approaches to shorten the ATO life cycle and may potentially look at a “phased ATO.”

It’s unclear what happened to those pilots around a phased approach to an ATO, as OMB never publicly discussed those results or findings.

The attempt to fix the ATO process has been an ongoing project for OMB.

If you go back to the 2013 annual FISMA guidance, OMB told agencies they had four years to get to continuous monitoring of systems, which would change the ATO process from an infrequent event to one that happens every time there is a change to the system.

Improving the ATO process, specifically for cloud services, is one of the objectives under the President’s Management Agenda’s IT modernization cross-agency priority goal.

“OMB and GSA are also developing a process to better incorporate agile methodologies into the ATO process, providing a more flexible approach for federal agencies and cloud service providers,” the December 2018 update says.

Additionally, OMB, DHS and GSA say they have issued “a draft strategic plan for streamlining ATO processes, to include vision for future of FedRAMP and rollout of activities,” and sometime in early 2019, they expect to issue a final strategic plan.

OMB hasn’t offered any update on its progress to revamp the ATO process, but back in October, Margie Graves, the deputy federal CIO, offered this insight: “If we can get to the point where we are doing continuous authorization through automated controls and automated use of data, then suddenly all the authority to operate (ATO) paperwork and approach becomes totally different. There is more veracity and more accurate because it’s based on data in the environment. That’s where we are going.”

The sooner OMB can provide guidance on shortening the time it takes to achieve an ATO, the more consistent an approach agencies can take instead of the one-off processes that are quickly proliferating.


RPA more than a passing fad, just look at the data

If you want to measure the impact of specific technologies on federal agencies, one good way is with acquisition data.

And the General Services Administration’s Alliant 2 governmentwide acquisition contract (GWAC) is as good a barometer as any, given that GSA developed the procurement vehicle to make sure agencies have access to the latest, greatest technology available.

Bill Zielinski, the acting assistant commissioner for the Office of Information Technology Category (ITC) in GSA’s Federal Acquisition Service, wrote in a March 29 blog post that agencies issued 978 “unique, leading-edge technology projects valued at or above $1[million] per project” through Alliant 2 in fiscal 2018. Of the 978 projects, cybersecurity (128 projects), big data (119 projects) and virtual networking (114 projects) were most popular.

While the popularity of these topics is far from surprising, what stood out from GSA’s data is how many agencies sought autonomic computing — otherwise known as robotic process automation (RPA) — and the often related artificial intelligence contracts.

Zielinski said Alliant 2 saw 72 task orders for autonomic computing and 61 for AI. He didn’t say how much agencies spent through these task orders or whether they were for one month or one year.

But the fact that RPA and AI made the top 10 goes to show both the popularity and widespread acceptance of these technologies.

Gartner estimated in November that global spending on robotic process automation software would reach $680 million in 2018, an increase of 57% year over year. The research firm says RPA software spending is on pace to total $2.4 billion in 2022.

Deloitte’s Center for Government Insights said in 2017 that RPA could save agencies as much as $41.1 billion over the next seven years.

Numbers, however, don’t tell the entire story. Another way to measure the impact of technology on agencies is through the anecdotes executives tell in how they are using these technologies to improve and reduce the cost of back-office functions.

GSA, NASA and more piloting RPA

And you can’t shake a plate on the rubber chicken circuit without RPA coming up during a panel discussion.

From GSA’s own chief financial officer’s office using RPA to reduce more than 13,000 hours of unnecessary or duplicative work to NASA’s well-known use of robotics to reduce the manual processing of paperwork around grants, nearly every agency is jumping on the RPA bandwagon. And over the past two decades, it’s hard to remember a technology that caught on so quickly and has had such an impact as robotics.

“For RPA, it’s how can we save money and make the processes better,” said Marisa Schmader, the assistant commissioner for fiscal accounting in the Office of Financial Innovation and Transformation in the Bureau of Fiscal Service in the Treasury Department, at the recent Association of Government Accountants financial systems summit. “We’ve implemented things that you may not see or experience. Things that are repetitive like resetting passwords or sending reminders about passwords. It’s the things that have a lot of rigor around them that we are transitioning to a bot. Customers will have no way to tell, but it’s saving us money.”

Think about what Schmader said for a second — save money and make the processes better. Those simple concepts have been the promise of technology since Wang and Texas Instruments first put computers on federal employees’ desks.

Over the years there have been a lot of promises, but few technologies have delivered real results as quickly as RPA.

GSA CFO Gerrard Badorrek said that through the use of bots, he believes the agency can eliminate well over 50,000 annualized hours of unnecessary work.

Schmader said the fiscal service is applying bots to a financial statement reporting tool, removing the need to process more than 300 reports in Excel.
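The kind of reporting bot Schmader describes boils down to mechanically consolidating many small reports into one. A toy illustration of that pattern, assuming CSV exports rather than Treasury's actual Excel workflow; the file layout and column names are invented:

```python
# Toy illustration of a reporting bot: merge many per-account CSV reports
# into one summary file, the sort of repetitive consolidation that
# previously meant touching hundreds of spreadsheets by hand.
# File layout and column names are hypothetical.
import csv
from pathlib import Path

def consolidate(report_dir: str, out_file: str) -> int:
    """Sum the 'amount' column per account across every CSV report.

    Writes a summary CSV and returns the number of distinct accounts.
    """
    totals = {}
    for report in sorted(Path(report_dir).glob("*.csv")):
        with report.open(newline="") as f:
            for row in csv.DictReader(f):
                acct = row["account"]
                totals[acct] = totals.get(acct, 0.0) + float(row["amount"])
    with open(out_file, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["account", "total"])
        for acct in sorted(totals):
            writer.writerow([acct, totals[acct]])
    return len(totals)
```

The logic is deliberately dumb — read, sum, write — which is exactly why this class of work suits a bot: it is high-volume, rule-based and has no judgment calls.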

Others have taken fast notice of the early adopters at GSA and Treasury.

HUD could save 60,000 hours

Bill Apgar, the branch chief of the Interior Business Center’s financial management directorate at the Department of the Interior, said his office started an RPA pilot in early March with Deloitte.

“We identified process areas where we can use bots, including client invoicing and trial balance and reporting. These are high reporting and low complexity work,” he said at the AGA summit. “We are looking to expand in other accounting opportunities.”

Over at the Department of Housing and Urban Development, Irv Dennis, the agency’s CFO, said he has identified 50,000-to-60,000 hours that can be converted to robotics.

“We have a lot of manual processes and they are ideal candidates for RPA. It’s not expensive, they are easy to use and easy to implement,” Dennis said at the AGA summit. “We had processes at HUD, like grant accruals, which require 2,200 man hours to do that over six months. We put a RPA around it and brought that 2,200 hours down to 65 hours in just three weeks.”

Dennis emphasized that the move to RPA is not about getting rid of jobs or employees, but moving people to high quality jobs, such as data analytics.

“Once you understand capability of RPA, it can be helpful on the compliance side too,” he said.

And this is the other reason why RPA is gaining so much attention. Initial fears over “robotics taking our jobs” went away quickly once agencies started to fully understand how bots work.

That understanding, for many agencies, remains in the nascent stage. To that end, GSA launched an RPA community of practice on April 18.

“With the advancements in emerging technology, it’s important for the federal government to capitalize on technological solutions in order to obtain the benefits of cost-effectively automating manual, repetitive and rule-based operations. Many agencies are currently piloting RPA or already have bots in production, but so much more can be learned, accomplished, and shared with the collective efforts of industry and government,” Ed Burrows, GSA’s RPA program manager in the CFO’s office, wrote in a blog post. “By creating a RPA CoP, the federal government can reduce duplication and streamline efforts to implement RPA across government to help advance agency missions today, and into the future. The GSA Office of the CFO will leverage the existing Technology Transformation Service CoP management capabilities and expertise to lead the RPA CoP. The CoP will mobilize federal RPA leaders to share information, define technical options and outline best practices for implementation in order to accelerate operational achievements and the benefits of RPA.”

GSA’s CFO office and TTS will co-lead the community of practice.

Federal CIO Suzette Kent also said last year she expects to issue guidance to help agencies manage RPA software and to lay out potential security and identity requirements for the bots.

“There are limitless opportunities for us as shared service provider to use bots to gain more effectiveness and efficiencies,” Schmader said.


Is the CIA’s new cloud procurement a signal to DoD to update JEDI?

The CIA created quite a stir in the federal IT community as word spread over the last week that it’s ready to upgrade its commercial cloud offering called Commercial Cloud Services (C2S).

As the industry day documents spread like wildfire across industry and the media, the question we have to ask is whether the CIA, and the intelligence community more generally, is trying to give the Defense Department some top cover for its controversial, protest-entangled Joint Enterprise Defense Initiative (JEDI) cloud procurement.

When you review the CIA’s market research survey as well as its industry day presentation, everything about it seems to be saying “Hey DoD, we have seen the light and multi-cloud, multi-vendor is the only way to go.”

The intel agency said in its market research survey that it “will acquire foundational cloud services, as defined in the scope section below, from multiple vendors.”

In industry day documents, the CIA said that the Commercial Cloud Enterprise’s (C2E) program objective is to “acquire cloud computing services directly from commercial cloud service providers…”

The CIA said it plans to award one or more indefinite delivery, indefinite quantity type contracts.

Industry experts said the message couldn’t be any clearer to DoD and its plans for JEDI.

Trey Hodgkins, president and CEO of Hodgkins Consulting, said the CIA’s C2E puts the conversation around DoD’s JEDI on a different trajectory.

“It puts out there that the IC has identified new needs so the prudent person would go back and ask the question, ‘if they need hybrid, on premise and commercial cloud, does that change the thinking at DoD?’” Hodgkins said. “I don’t think there is any visibility into DoD’s thought process, but you’d have to think they are asking the same question at the department.”

DoD currently is conducting an internal review of JEDI after a bid protest from Oracle highlighted a potential conflict of interest. Additionally, DoD and JEDI are facing a potential FBI investigation.

Sam Gordy, the general manager of IBM federal, said the CIA strategy with C2E should not only inform DoD, but influence the Pentagon’s plans going forward.

“These [C2E and JEDI] are diametrically opposed approaches. Clearly the CIA has five-to-six years of experience in a single cloud environment and they are making a strategic decision to wholeheartedly move into multi cloud world. It’s a critical next step for the evolution of IT support for the IC,” Gordy said in an interview with Federal News Network. “DoD should take advantage of those five-to-six years of experience in the IC and the national security community to inform what they are doing going forward.”

Gordy said the CIA is taking the approach that the private sector has moved to over the last few years. He added that unlike JEDI, the CIA is making it clear why the multi-cloud approach is necessary because they are saying in the industry day documents and the market survey what they want to use the cloud for today and in the future.

Under phase 1, the CIA said it wants vendors to provide infrastructure-, platform- and software-as-a-service capabilities as well as support services.

Source: CIA industry day presentation from March 22, 2019.

“Knowing they have an enterprisewide cloud contract already and that they are using that capability, this tells me they need hybrid, on-premise and commercial solutions and this creates a mechanism to do that,” Hodgkins said. “I didn’t see anything shocking or that caught me off guard. The CIA has clearly spelled out to the industrial base what they need, and one of them is to deliver some or all of the three types of cloud, and when they put their data into those clouds, it must be portable so they can move it to another cloud or somewhere else. Those are the two elements that are different than what they have now, and ones that you haven’t seen called out in previous acquisitions, at least not at this level.”

CIA needs cloud diversity, data portability

John Weiler, the executive director of the IT Acquisition Advisory Council and an outspoken critic of JEDI, said the CIA’s approach for C2E is a recognition that the current C2S contract isn’t working like they expected.

“If it had worked they would’ve just re-signed with Amazon Web Services,” Weiler said. “One cloud can’t solve all your problems. When you look at workloads on Oracle or legacy Microsoft platforms, it makes no sense to move them to Amazon or Google or IBM. Those clouds are not designed for those environments. These strategies, to be effective, have to acknowledge that certain legacy platforms can move to a specific cloud and not just to any cloud.”

Industry experts said there is a growing desire inside the intelligence community for something more than C2S.

One industry source, who requested anonymity in order to talk about the inner workings of the IC, said there have been varying degrees of unhappiness with the Amazon contract, including at least two IC agencies rejecting the C2S cloud and building their own.

Another industry source said in many ways C2S was a long-term pilot, and now the CIA and others in the IC recognize they weren’t happy with the price they were getting for cloud services, that interoperability was more difficult than first imagined, especially between C2S and existing data centers, and that they were limited in their ability to add new features in a timely manner.

“They’ve had time to see what works and what doesn’t, and they’ve realized cloud providers are becoming specialized. It’s easier to move workloads from on-premise to the cloud with the same vendor. They realized migrations can be expensive,” the source said. “The CIA realized that cloud diversity and price competition help bring down costs. The industry and the CIA weren’t in a position to do that six years ago, but now they are, which is good.”

The first industry source added the IC had real concerns about vendor lock-in and how hard it was to move data between cloud infrastructures.

“I’ve heard a lot that people didn’t expect going into Amazon to have the level of lock-in that they have. Once they migrated data to Amazon, it became much more difficult to lift and shift to, say, a Microsoft cloud, because the systems were configured in a way that was only good for the Amazon cloud,” the source said.

Implementation of cloud services is key

A third industry source was even more blunt about the C2S contract:

“AWS has relentlessly leveraged C2S since its inception, proclaiming to federal agencies that there was only one cloud service provider good enough for the CIA, so they needn’t look further. But like a handsy, insecure boyfriend, it seems like AWS held the CIA a little too close, proudly boasting about their exclusive relationship while competing suitors flexed their innovation muscles,” the source said. “Not surprisingly, since the relationship first began, the CIA has noticed it has options and doesn’t need to commit. So while it’s understandable AWS wants to put a ring on it, the agency would clearly rather stay friends and play the field.”

An AWS spokesman said they are excited about C2E and the CIA’s intent to build on the existing C2S efforts.

“As a customer obsessed organization, we’re focused on driving innovation that supports the mission and spurs solutions that allow for missions to be performed better, faster, and in a more secure manner,” the spokesman said.

Weiler said no matter the strategy that the CIA or DoD chooses, the key is the implementation. He said nearly every agency needs to address legacy systems and the consistent challenge of cloud migration.

IBM’s Gordy said C2S shouldn’t be considered a failure by any means as it greatly helped inform the CIA’s current strategy.

“This does sync up with a recompete on C2S, but I don’t think C2E is in anyway a replacement for C2S,” he said. “The CIA will probably continue to have the need for a broad business application cloud which is what C2S is being used for today. And then they will need to have a mission oriented cloud, which is the reason they are going to C2E, which seems to be for the optimization of those mission workloads.”


Increasing threats against mobile devices force HHS, others to rethink protections

The first time the intelligence community issued a public warning to government and industry executives traveling overseas came before the 2008 Summer Olympics in Beijing.

Joel Brenner, then the head of U.S. counterintelligence in the Office of the Director of National Intelligence and a former National Security Agency inspector general, said taking your phone, laptop or other device to China was dangerous and could end with lost data and the real possibility of having your home network compromised.

“We suggested they take stripped down devices, if you are taking a device at all,” Brenner said in a recent interview with Federal News Network. “That advice was widely adopted by many companies as well as the government. I think it’s good, but tough advice to follow.”

Now, 11 years after that initial warning, the Department of Health and Human Services is taking it a step further. While most agencies prohibit executives from taking devices to countries like China or Russia, HHS is not letting officials take any device with government information overseas, no matter the country.

HHS Chief Information Security Officer Janet Vogel issued a memo in December addressing the increased level of risk and the need to safeguard government furnished equipment (GFE) while on foreign travel.

“Two key components of the memo are that while abroad, HHS employees must use loaner GFEs containing no sensitive information. Employees are also required to connect to secure, password-protected Wi-Fi, as well as a virtual private network (VPN) when accessing HHS resources with their loaner GFE,” Vogel told FNN in an email. “Increasing the strictness of our GFE procedure for travel was necessary to minimize the risk of increasing and new security threats. HHS has a global presence and often has representatives deployed around the world for reasons such as health conferences, responses to pandemics, etc. This approach to GFE use helps to ensure that the assets and data that travel around the globe are appropriately protected. By requiring HHS employees to use loaner GFE that do not contain sensitive information, the damage resulting from a cybersecurity incident would be lessened. Additionally, requiring secure Wi-Fi combined with a VPN, makes exploitation of GFE more difficult. Limiting the amount of exploitable information on a device, as well as decreasing the chance for such an exploitation, is an effective method of risk reduction for HHS.”

HHS detailed six basic rules to follow:

  1. Only loaner GFE encrypted devices are allowed on foreign travel.
  2. Devices received from foreign nationals/governments (i.e., conferences, gifts, etc.), and devices purchased while on travel, may not be used to conduct HHS business.
  3. Secure remote access via Virtual Private Networks (VPN) is required.
  4. No sensitive data (e.g. personally identifiable information [PII], protected health information [PHI], HHS intellectual property, etc.) are permitted on loaner GFE, unless the devices are encrypted.
  5. All GFE devices used while on foreign travel must remain powered off during travel to and from foreign countries, segregated from HHS networks/systems, and submitted to the IT Helpdesk immediately upon return for evaluation and sanitization.
  6. All devices must be sanitized upon return and before re-use.
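Rules like these lend themselves to an automated pre-travel check before a loaner device is issued. A sketch of how an issuance system might encode a simplified reading of them; the device fields are invented for illustration, and a real system would pull them from asset-management and MDM records:

```python
# Sketch of an automated pre-travel device check encoding a simplified
# reading of rules like the HHS list above. Device fields are invented
# for illustration, not an actual HHS data model.

def travel_violations(device: dict) -> list:
    """Return the policy rules a device fails before foreign travel."""
    problems = []
    if not device.get("is_loaner"):
        problems.append("only loaner GFE may travel")
    if not device.get("encrypted"):
        problems.append("device must be encrypted")
    if not device.get("vpn_configured"):
        problems.append("VPN required for remote access")
    if device.get("has_sensitive_data"):
        problems.append("no sensitive data (PII/PHI) on loaner GFE")
    return problems

laptop = {"is_loaner": True, "encrypted": True,
          "vpn_configured": True, "has_sensitive_data": False}
print(travel_violations(laptop))  # prints: []
```

An empty list means the device clears the checklist; anything else blocks issuance until the help desk fixes it.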

This means whether an HHS executive goes to China or Germany or Canada, the device and information on it are considered at-risk.

HHS is ahead of the curve

One federal cyber executive, who requested anonymity in order to speak about their agency’s security requirements, said the HHS policy is one of the strictest in government.

“HHS is ahead of the curve and that’s a good thing because it is dealing with it in a prioritized manner,” the official said. “People who are traveling at all agencies are not low level and they have a lot of other important things to be worrying about so by giving them a new device, it makes it easier for them not to have to worry as much about the security, especially with cost of technology continuing to come down.”

The federal cyber executive added that in some ways HHS is solving a people problem with technology instead of the other way around.

“People are lazy. It’s as simple as that, and if it gets complicated people don’t want to deal with it. This is why a technology-first approach makes sense,” the executive said.

Brenner, who now teaches at the Massachusetts Institute of Technology and runs his own consulting and law practice, said it’s more than laziness; it’s a lack of understanding, especially among executives.

“They don’t want to deal with the aggravation and having to take special steps before they go and when they get back,” he said.

Agencies are beginning to recognize the need to better secure mobile devices. Symantec reported in 2018 that new mobile malware types jumped 54 percent from 2016 to 2017.

Vincent Sirtipan, a portfolio manager in the physical and cybersecurity division in the Office of Mission and Capability Support in the Department of Homeland Security’s Science and Technology Directorate, said agencies have focused for a long time on mobile device management (MDM) software to protect their devices. But that is only one piece of the bigger puzzle.

“It has to be a MDM and other technology that enable security whether that’s identity management or mobile application vetting or a mobile threat defense solution,” he said. “When you are talking about mobile phones, we are still maturing as an enterprise as is the entire market. What controls and capabilities do we need on a mobile phone to secure it? We recognize it poses a broader threat landscape and a broader attack surface.”

NIST updating mobile security standards

Sirtipan said DHS recently completed its fifth review under the government cybersecurity architecture review (GovCAR) initiative, which looked only at agencies’ mobile infrastructures.

“The review team identified that if an agency employs multiple mobile security technologies, including application vetting and identity and access management, it has a greater security posture against mobile attacks,” he said. “They looked at the attackers’ process and desire to move laterally based on mobile attacks. They are able to identify, if agencies employ certain tools, what their security posture looks like, and when agencies employ a combination of mobile security tools they are able to mitigate adversary actions and limit their ability to attack us.”

Jon Johnson, the former director of the enterprise mobility program at the General Services Administration and now a director at Redhorse Corp., said agencies have had standards from the National Institute of Standards and Technology for several years for their mobile devices. He said NIST Special Publication 800-124, work by DHS S&T and others have increased awareness, and now it’s just a matter of agencies understanding their risk postures.

Sirtipan said NIST and others in government are updating SP 800-124, and the draft revision should be out for public comment in the next few months.

“We are looking at things like leveraging the National Information Assurance Partnership (NIAP) protection profiles, and talking about picking a device that has been trusted and secured,” he said. “We have rechartered and renamed the federal mobility services category management team and mobile security tiger team to be one federal mobility group. It includes 45 agencies and departments to help move us all toward a better security posture.”

Sirtipan said while adding more technology and standards are helpful, it comes back to the user.

And that takes us full circle to HHS.

Vogel, the HHS CISO, said since cyber threats cross all borders, more needs to be done.

“Cybersecurity threats exist outside of the United States, and United States citizens, especially government employees, are often targeted while traveling abroad. Employees are not allowed to connect to HHS systems or networks using unsecured networks — from internet cafes, coffee shops, etc. — regardless of whether they are in the United States or abroad,” she said. “That said, the United States has strong cybersecurity protections, while safeguarding in other countries may not be as robust. Requiring employees to connect to secure, password-protected networks and use a VPN help strengthen our cybersecurity posture and combat potential threats.”


Bid protest win continues to show fragility of multiple-award contracts

Right now, 81 small businesses are wondering: Why?

Why their ticket to a potential $15 billion lottery has been lost.

Why after waiting a year to begin marketing and promoting task orders through the Alliant 2 small business contract they may have to be even more patient and wait potentially another 12 months?

And why another multiple award small business contract is mired in a bid protest?

These, and probably a host of eye rolls, sighs of frustration and shakes of the head, came fast and furious last week when the General Services Administration announced it was rescinding all 81 awards made in February 2018 under the Alliant 2 Small Business governmentwide acquisition contract (GWAC).

And it left one small business thinking, “We told you so.”

GSA withdrew the awards after the Court of Federal Claims ruled in favor of Citizant in its protest of being excluded from Alliant 2 SB awards.

The judge found GSA erred in evaluating proposals, specifically around having a qualified cost accounting system and price reasonableness.

“The court presumes that Citizant was prejudiced because the record reflects multiple instances of the contracting officer evaluating proposals in an arbitrary, capricious, or irrational manner,” the court states. “Simply stated, the court finds that Citizant has shown that it had a substantial chance of receiving a contract if the contracting officer did not make the aforementioned errors.”

The judge told GSA to re-evaluate all bidders to address the errors Citizant pointed out.

GSA made the initial Alliant 2 awards in 2017 for the unrestricted track and February 2018 for the small business track.

Procurement experts say while GSA doesn’t necessarily have to start over, the re-evaluation could take six months, and then the procurement would take another six months to get through the expected protests.

“The problem here is multi layered. It goes back to the issue of GSA’s self-scoring system and this whole idea of trying to make it easier for agencies to go through the proposal process and take the next step in the procurement,” said Tony Franco, a partner with the law firm PilieroMazza, which specializes in small business procurements. “The reason why GSA has to go back and fix this is because it looks like the agency messed up on the front end with regard to that first step of the evaluation process, self scoring. It resulted in a number of contractors thrown into the equation that maybe should’ve been disqualified earlier.”

Another federal procurement attorney familiar with the case, who requested anonymity because of the sensitive nature of the proceedings, said the judge expected GSA to hold everyone to the same requirements, and during the discovery part of the case, it became clear the contracting officer didn’t do that.

“I can’t imagine GSA will re-evaluate all 500-plus proposals,” the attorney said. “I think GSA will redo the self-scoring checklist, and they may just throw out those companies that shouldn’t have been qualified in the first place. And that could cause more protests. This is the song that doesn’t have an end. That’s the problem with large procurements, they are so important and valuable to vendors that they are willing to protest.”

A spokesman for Citizant declined to comment on the judge’s decision.

Alliant’s faced more than 40 protests

Alliant 2 SB remains under protest even with the Citizant decision.

Three more cases from RX Joint Venture LLC, TISTA Science and Technology Corp. and Metrica Team Venture are before the appeals court.

So far over the last three years, the Alliant GWAC process has faced more than 40 protests.

“Whenever agencies are trying to create these multiple award contracts with so many different companies, it will be very hard for them to treat everyone consistently the way they are supposed to,” Franco said. “With complicated proposals and solicitations, and multiple offerors, procurement shops with limited resources struggle, and it will inevitably lead to protests like this where you can always find some flaw in procurement.”

Franco said as GSA and other agencies continue to develop these large multiple award contracts, agencies will create problems that these types of contracts were trying to avoid in the first place.

“This makes me question whether agencies should be using these MACs with so many offerors. Wouldn’t it make more sense to issue separate solicitations or go through the schedules?” he said. “Why create these complicated procurements that at the end of the day are designed to make the source selection process easier downstream when on the front end you may spend years figuring out who are the right contractors? There is so much potential for fallibility when you have humans involved and issues fall through cracks.”

Foreshadowing problems for other MACs?

The Alliant 2 small business experience is the perfect precursor to what is likely to happen to several procurements that are just getting off the ground.

GSA and the Air Force’s 2GIT multiple award contract with a ceiling of $5.5 billion is just getting started and could face a pre-award protest right off the bat. Industry sources say vendors are concerned about violations of the Small Business Act of 2010.

Then there is GSA’s Center of Excellence Discovery blanket purchase agreement, which has entered the second phase of the acquisition process. Last week, the Federal Acquisition Service posted seven challenge questions for each of the areas with a due date of April 1.

In no more than 1,500 words, FAS wants vendors to outline their approach to determine where things stand now, the path forward for implementation and how they will ensure modernization efforts continue beyond implementation.

Both of these procurements, as well as the others coming over to GSA schedules as blanket purchase agreements, including those MACs from the FBI and the Homeland Security Department, have the strong potential to face protests from unsuccessful bidders. And like the Alliant 2 small business GWAC, it’s fair to ask whether all the time and resources that go into these contracts are worth it. Maybe it’s time to think of another way, like having Congress modernize the GSA schedules so the need to create BPAs on top of the schedules or standalone GWACs goes away. This would be a huge step toward getting agencies and vendors alike off this protest merry-go-round.


After 2 years, OMB still lacks permanent controller and that’s a problem


The government hasn’t had a permanent chief financial officer, or in government talk, a controller, for more than two years.

Dave Mader, now at Deloitte, was the last person to hold the title of Office of Management and Budget controller and he left in January 2017.

The Trump administration is now considering Fred Nutt, its nominee who had been waiting for Senate confirmation since September 2017, for another position in the government, according to a senior administration official.

This means the administration will be three years old before it has a permanent CFO.

A Senate Homeland Security and Government Affairs Committee staff member said Nutt never received a vote in the full Senate because there were concerns about his qualifications. The aide offered no further details about the concerns. The committee held Nutt’s confirmation hearing in May 2018.

And like I said about the administrator of the Office of Federal Procurement Policy last summer, and the federal chief information officer before that: Where art thou, OMB controller?

“Not having a controller has a very negative effect when it lasts so long,” said Mark Reger, who worked in the federal financial management community for more than 40 years. “The controller is the financial community’s window into Congress. If Congress wants to know what’s going in the financial management community, the controller is intimately involved in issues, develops a relationship and fosters better communications and understanding, especially in these times when we want to do things differently and we have great opportunities to streamline, gain efficiency and combine operations. Right now, we don’t have that advocate. We just miss that advocate.”

The controller’s role is even more significant when you consider that nearly every initiative under the President’s Management Agenda has a financial component to it.

These include the obvious ones, like getting payments right or federal IT spending transparency. But there are also ones like moving from low-value to high-value work, where so much of that work is being done in the financial management community through robotics process automation (RPA) at places like the General Services Administration and the Department of Housing and Urban Development, or sharing quality services, where key milestones are around things like payroll modernization and other back-office functions.

“The PMA is one of the best ones in terms of what it covers and I’ve been impressed with Margaret Weichert’s leadership. But I would like to see more about financial management in the PMA. It’s heavy on IT and other areas,” said one federal financial management official, who requested anonymity because they didn’t get permission to speak to the press. “What the financial management community has done is the Treasury’s vision for financial management and that has filled one gap. But it’s not ideal to have that vacancy open for so long.”

Treasury released that document in August seeking to improve federal financial management across four areas.

Another federal financial management official said the CFO community is tightly coordinated and understands the path it must follow.

Is the OMB controller even necessary anymore?

So that leads us to the question: Is a Senate-confirmed controller even needed anymore?

Similar to what I heard from the procurement community, federal financial management experts say there are definitely things having a Senate-confirmed controller helps with, but CFOs and deputy CFOs know what they have to do to get the job done and move things forward.

Experts praised deputy OMB controller Tim Soltis for both his leadership and his accessibility to the community.

“Tim has been an invaluable asset to not only OMB but also to the entire government financial management community. In the absence of a controller, Tim has stepped in to fill both the role of controller and deputy controller of OMB,” said Ann Ebberts, CEO of the Association of Government Accountants. “There is a lot going on and he’s bound to be pulled in multiple directions to fulfill the needs of both roles. He does a great job at balancing the external and internal demands of the job, but I do remember in prior administrations both controller and deputy controller being extremely busy working with other CFOs across the government to push new initiatives in financial management.”

Reger said while Nutt, serving as a senior advisor to the OMB director, can participate in meetings and can influence priorities and plans, he can’t draw people together, be the public face of the financial management community or lead the CFO community the way a permanent controller could.

“CIOs are making great progress, but other administration initiatives containing cost, about being more efficient, federal financial people have lot of information that can help with that,” Reger said. “I’m not sure how well those things being married because there is no controller to help direct, coordinate and foster those changes. I don’t see it as a position that particularly effects the day-to-day operations of the government, but it’s a position that effects directional things, enforces and encourages change, and puts the initiatives in the face of people. The financial community acts well when asked to do something, but they need a leader.”

Move out of the compliance mindset

The first federal financial management executive said having that leadership in the community would be especially critical right now.

“There is a big opportunity for us to get away from being so heavy on compliance by using innovative technologies. We need to define what high-value work is for the financial management community and that’s where the controller position is pivotal,” the executive said. “Now more than ever, a federal CFO is needed. We need that leadership to take the community to the next level. Right now we are a little too heavy on compliance and not doing enough to emphasize the value that a strong controller could bring.”

The need for a leader becomes even more critical now that the Government Accountability Office is performing a review of the CFO Act of 1990.

“As part of our study, we are seeking information from the CFOs and deputy CFOs of federal agencies to assist us in obtaining information on progress in federal financial management as we approach the 30th year anniversary of the CFO Act and in identifying challenges and leading practices,” states the GAO letter to the financial community, which Federal News Network obtained.

And to take this discussion one step further, the lack of permanent CFOs across the agencies is dramatic.

Of the 24 CFO Act agencies, at least six agencies don’t have permanent CFOs.

AGA’s Ebberts said the lack of permanent CFOs across the government is as big of a problem as not having a permanent controller.

So the question comes back to whether the Trump administration will comprehend the importance of the OMB controller and get someone in place before 2020, or whether it will continue to underestimate the impact and critical role this and other management positions play in making the government run well.


What does ‘best-in-class’ really mean for federal contracts?

Forgive me for a minute while I play a little semantics. What does the term “best-in-class contracts” really mean?

If you are NASA and your SEWP contract is “best-in-class” does that mean the rest of your contracts are, say, “worst-in-class” or “meh-in-class?”

I bring this up now because the new category management memo from the Office of Management and Budget strongly encourages agencies to use “best-in-class” contracts to help meet the initiative’s goals.

Among the things OMB told agencies to do is: “Annually establish plans to reduce unaligned spend and increase the use of BIC solutions for common goods and services, consistent with small business and other statutory socioeconomic responsibilities.”

Additionally through the President’s Management Agenda, OMB is setting specific spending goals for each agency to use those vehicles like NASA SEWP or the General Services Administration’s enterprise infrastructure solutions (EIS) contract.

“The BIC goal is a reflection of the many benefits that have been realized from increasing the visibility and use of model contract solutions – including billions in cost avoidance aided by reduced contract duplication for identical products at wide price variations, increased use of common specifications, and greater reliance on government and industry best practices,” the memo states.

There currently are 38 BICs, including 25 that GSA runs, ranging from IT services to IT products to leasing cars to booking hotel rooms to body armor to hearing aids.

Several are duplicates, including four different “best-in-class” contracts for IT products and seven for IT services as well as seven BICs that OMB decided were “mandatory.”

OMB put out a definition of “best-in-class” when the term first came up a few years ago. The best-in-class criteria are:

  • Rigorous requirements definitions and planning processes
  • Appropriate pricing strategies
  • Data-driven strategies to change buying and consumption behavior (i.e., demand management)
  • Category and performance management strategies
  • Independently validated reviews

I — and real procurement experts — contend that the term “best-in-class” in-and-of itself is problematic for many reasons.

“What are you buying when you use a BIC?” Roger Waldron, president of the Coalition for Government Procurement and host of the Federal News Network podcast Off the Shelf, asked. “The use of BIC is confusing for contracting officers. The Federal Acquisition Regulations have priorities already, and this is another way to articulate priorities? The language around BIC, I think, sends the wrong message. A single agency contract might provide better outcomes than a governmentwide vehicle or vice versa. So much of this really depends on the agency’s mission and what they are trying to accomplish through the procurement action.”

And even OMB admits the definition could be improved.

“Initial designations of BIC contracts have been based largely on demonstrated use of strong contract management strategies. Designations will become more outcome-based as prices paid, performance and other information about agency vehicles within a given category becomes more readily available,” OMB writes in a footnote of the memo.

Same mistakes as strategic sourcing?

Larry Allen, president of Allen Federal Business Partners, took the question about “best-in-class” one step further, asking if the use of BIC is an attempt to reduce the supplier base, similar to what started as strategic sourcing and eventually turned into category management.

“If you follow logic of the memo, it will result in a reduced supplier base in the federal market. What you are really talking about is reducing channels for acquisition and reducing the lanes where contractors have to supply services and products,” he said. “Just like there are winners and losers when bidding on acquisition vehicles there will be winners and losers among contractors because of the BIC designation. The administration has to be okay with that.”

In the memo, OMB was clear about giving agencies flexibility to consider BIC contracts first and then others as needed to meet small business or other mission-related goals.

But experts warn that aggregation of contracts around this term “best-in-class” could lead to unintended consequences like what happened with strategic sourcing where the office supplies industrial base dropped by 24 percent over six years.

“If the goals of category management is to buy smarter through the use of preexisting vehicles, then those are all good goals and things industry shares and are worth pursuing. The question still gets to process and how the government should go about buying smarter,” Waldron said. “How does that translate into requirements development? I’m not sure it does. Perhaps BIC should be a concept around requirements development where OMB is identifying organizations and processes that deliver sound requirements development. I think getting to that fundamental level is where BIC should evolve to at some point.”

Without a doubt category management is evolving. The thinking behind this concept initially was governmentwide strategic sourcing, which failed under the office supplies effort and had limited other successes around wireless and desktop/laptops bulk purchases.

Over the last few years, OMB brought in the concepts of spend under management (SUM), addressing inefficient buying methods and finally workforce development, all areas where the memo sets new goals for agencies.

Memo replaces the 2016 circular

The new category management policy was more than a year in the making. The Trump administration is putting its mark on the initiative that started in 2014 with a series of pilots.

At one point, the Obama administration wanted to make category management a circular so it would be more institutionalized than just a memo. But that draft circular never got any legs in the Trump administration.

This memo, for all intents and purposes, replaces that circular and carries with it many of the same goals.

“Teams of experts in each category of spending help agencies increase their use of common contract solutions and practices and bring decentralized spending into alignment with organized agency- and government-level spending strategies by sharing market intelligence, government and industry best practices, prices paid data, and other information to facilitate informed buying decisions,” the policy states. “This memorandum is designed to build on these activities in order to help the government buy as a coordinated enterprise and avoid the waste associated with duplicative contract actions.”

OMB is hosting a question and answer session this Thursday on the new memo and category management with industry.

Jack Coley, president and CEO of Coley and Associates, which consults with small businesses on government contracting, said while he doesn’t have too much concern about the term “best-in-class,” the potential impact on small businesses is similar to that of strategic sourcing.

“Under the Office Supplies 3 vehicle, GSA was able to claim and did meet all small business and socio-economic goals, but tens of millions of dollars went to only a limited number of small businesses. We saw a number of small businesses going out of business who were doing well before getting shutout of strategic sourcing,” Coley said in an interview with Federal News Network. “My concern is that any effort by the government that starts to consolidate spending like category management using BIC contracts, what they are going to do to ensure they aren’t just funneling contracts to fewer and fewer small businesses. That will limit the number of small businesses available, which will negatively impact bringing new and innovative solutions to the government.”

Coley said OMB needs to improve how it defines common goods and services, as well as what the threshold is for a common good or service.

“It’s like the definition of a commodity, are we talking about something like printing paper or pens, if so, buying them on lowest price is fine. But when you get to services and solutions, those need to be defined clearly as what’s commodity,” he said. “I know they have category definitions for many of these services and products, but not everything fits into the category nice and clean, and agencies need some flexibility to create their own contracts.”

Hard to commoditize services

Give credit to OMB for recognizing the one-size-fits-all approach of the previous administration doesn’t work.

In the memo, OMB told agencies that they still must meet their statutory small business goals and that the agency’s Office of Small and Disadvantaged Business Utilization (OSDBU) should use the small business dashboard and other information to help the agency achieve the best balance of BIC, governmentwide, agencywide and local contracts.

Additionally, agencies must develop a vendor management plan that includes pre-award and post-award strategies as well as a communication plan.

The bigger question is how OMB will hold agencies accountable. The latest data on category management shows agencies achieved several goals last year, including exceeding governmentwide goals around spend under management, using BICs and cumulative cost avoidance through more efficient buying.

Another big question is the prices paid portal and the inherent problems that come with creating such a database and promoting widespread use.

Coley said data leakage is especially concerning around vendors’ proprietary information.

Allen circled back to Coley’s concerns around the commoditization of services, noting that no two purchases are exactly alike.

“This memo and this decision making process does raise questions about whether or not we are regulating by memo. Much of what’s going on here was articulated in a circular at the back end of the last administration, and we continue to have questions that we raised back then,” Waldron said. “At the core of one of those questions is whether this type of micro management of agency procurement is consistent with the Office of Federal Procurement Policy Act and administrative authority. Asking to give comments before issuing memos is a fair thing to ask for, especially on something that directly impacts contractors and frames their opportunities to bid and win work. OMB should’ve put the memo out for comment first.”


OMB, GSA set table for next round of payroll services consolidation

At first glance, the Technology Modernization Fund Board’s $20.7 million loan to the General Services Administration made perfect sense. GSA’s proposal to modernize its federal payroll system checked off many of the boxes the board was looking for — updating legacy IT and processes, improving a shared service and addressing a high-value program that others could learn and benefit from.

But when you dig a little deeper into the board’s decision to lend GSA money that by law it has to pay back, it seems as though something bigger and possibly more disruptive is at play.

Industry and former government officials said all signs point to the Office of Management and Budget consolidating several existing federal payroll providers either into GSA or through the use of quality service management offices through GSA.

Experts said the fact is GSA can only pay the loan back through a limited number of ways:

  • Fewer people
  • Greater IT efficiencies
  • More customers

And many also said it’s only through the third option that GSA can attain the savings necessary to pay back almost $21 million over the next five years.

“I think it will be major consolidation of payroll providers in civilian market because I’m not sure how you go any other way,” a former federal official with knowledge of payroll and shared services, who requested anonymity because they didn’t get permission to speak to the press, said. “I could see either a movement of agencies onto a new platform or GSA is going to contract with companies A and B and set up these platforms and agencies are going to move to it and then charge agencies a fee. The fee usually is per head for all processing, and that would create a revenue stream to include a payback percentage for the TMF loan. They clearly cannot take it from appropriated funds or the revolving funds at GSA. So to pay back the loan it’s either through more customers or fewer people, so it very much looks like consolidation of payroll providers.”

GSA did not respond to repeated requests for details on how it plans to pay back the TMF loan or what its plans are for NewPay.

A senior administration official said in an email to Federal News Network that GSA submitted its proposal last summer to the TMF Board to help with the implementation of NewPay.

“As was evidenced during the lapse, the current complexity of the payroll environment showcased a critical need for modernization. We hope to apply the success of the NewPay program across the government and to other payroll providers,” the official said. “For many years agencies have been directed to develop strategic plans regarding specific common administrative functions. As agencies continuously evaluate performance, security and status of their current solutions for these functions, they will consider more modern solutions as part of their individual strategic planning processes. In the example of payroll, accelerating the availability of modern payroll solutions through NewPay will provide more timely solution alternatives for consideration.”

Margaret Weichert, the deputy director for management at OMB and acting Office of Personnel Management director, said in an interview with Federal News Network it’s too early to know exactly how many payroll providers will exist in the end, but it will be fewer than the four today.

“There are agencies today who provide services that don’t want to be in that business and want to focus on that mission. So GSA, their core mission is to support others in government,” Weichert said. “NewPay is an absolute priority. The proof will be in the pudding in terms of how quickly and how effectively we can roll out the new program before I can answer the final question about how many.”

Draft RFI for payroll modernization

Another industry source highlighted a potential fourth way GSA could pay the money back, which is through appropriations.

A little known provision in the Modernizing Government Technology (MGT) Act lets agencies restructure their appropriations requests.

The law states, “An agency may reduce out-year budget requests in existing IT accounts and restructure the agency’s request to instead include an appropriations request in the IT WCF that will then be used to repay the TMF.”

Of course that approach may not be too popular with folks on Capitol Hill, given Reps. Will Hurd’s (R-Texas) and Gerry Connolly’s (D-Va.) push to save money and use it for IT modernization efforts.

This theory of consolidation is underscored in the recent draft request for information GSA issued to the two teams of payroll modernization providers under the $2.5 billion blanket purchase agreement it awarded in September.

The draft RFI to the two teams, which Federal News Network obtained, states a potential forthcoming task order would be limited to “GSA and a component of a second shared service provider, to be identified. The scope of this acquisition is to migrate federal employees from the agencies…to modern, secure, cost-effective SaaS solutions.”

Later on in the RFI, the consolidation plan becomes clearer with GSA asking for services ranging from its 21,000 customers up to hundreds of thousands of customers.

“The contractor shall provide approximately 300,000 Integrated Payroll and WSLM software-as-a-service subscription for a to-be identified SSP and its customers,” the RFI states.

GSA only has about 21,000 customers under its payroll and time-and-attendance services, meaning the other 279,000 customers must come from one of two places: The National Finance Center at the Agriculture Department or the Interior Business Center. NFC currently serves more than 600,000 federal employees, while IBC serves about 150 large and small agencies.

NFC bracing for change

An industry source familiar with the NFC said the shared services provider has been bracing for big changes over the last six months.

“The NFC has been under a hiring freeze of sorts. It may not be official, but they haven’t been able to fill open positions and there is a feeling that the NFC as they know it is going away,” the industry source said. “If you just changed the NFC’s reporting structure, but it still continues to be NFC, I don’t think it’s a big deal. But it could take five to 10 years to move into NewPay based on the current complexities of the system that is run on mainframes and serves more than 100 different types of payroll functions.”

This source and others said the most likely first step of consolidation is through the quality service management organization approach outlined in the President’s Management Agenda, which also would give GSA additional revenue to pay back the TMF loan.

“GSA could charge NFC or IBC for those NewPay services and act as a third party of sort,” the source said.

Source: President’s Management Agenda December 2018 update.

Another industry source, who also requested anonymity because their current company is involved in federal shared services, said a good way to describe the QSM approach is IBC or NFC would be subcontractors to GSA, which is acting as the prime contractor with the NewPay vendors.

“I don’t know if the end game is consolidation, but they have to get started someplace and this hub-and-spoke model where IBC will still be part of Interior or NFC will still be part of Agriculture, but they will work closely with GSA,” the third industry source said. “Historically, no one has owned oversight over the federal shared service providers. The agency CFOs didn’t want them. They were orphans in some sense. This new model at GSA gives them a home where they can work together to improve.”

To that end, Hill and government sources confirmed the Office of Management and Budget is expected to issue a new shared services memo in the coming month or so that details this QSM strategy.

A Hill source said the memo also coincides with the plans IBC, NFC, the Defense Finance and Accounting Service and GSA had to submit in 2018 for how they would improve their shared service offerings through NewPay.

More than $20M requested for modernization

Add to that the 2020 budget request that went to Capitol Hill earlier this month, in which IBC and NFC asked for money to modernize their payroll services.

USDA asked Congress for an increase of $7.5 million in 2020 “to begin to transition USDA employee payroll accounts from the current legacy system to the NewPay system, which will be more user friendly, improve data security, and save future costs.”

Interior, meanwhile, requested $12.5 million “to support implementation planning and transition activities for the governmentwide payroll and Work Schedule and Leave Management modernization initiative entitled NewPay.”

This takes us back to the TMF loan. Whether GSA requested any extra funding for NewPay in 2020 is unclear.

“The request includes $5.2 million to support agency reform priorities including the OPM transition and the NewPay initiative, offset by a decrease in the working capital fund contribution,” GSA’s budget justification states. “OPM and GSA will be one of the first agencies to join under the blanket purchase agreement in FY 2019, which will start the configuration phase of NewPay, followed by migrations scheduled for FY 2020.”

But how much of that $5.2 million is going toward NewPay isn’t known, and even if a majority of it goes to NewPay, it will not provide enough efficiencies to pay back the $20.9 million loan.

Now add the fact that GSA states in the budget document that each of the payroll shared service providers will migrate “at least one agency (approximately 10,000 employees) to the new platform, and adoption of a Talent Management suite by four agencies.”

It becomes clearer that GSA will pay back the TMF loan through the management of the QSM organizations.

The irony in all of this is that GSA got out of the financial management line of business in 2015, moving its financial management solution to USDA, and it wanted to stop providing human resources services in 2013. Now, through the QSMs, GSA looks to be getting deeper into providing these services and managing the future of shared services at both the tactical and operational levels.

“GSA will leverage the new shared service model that focuses on data standardization, modernization and security into service offerings. This will benefit the American people by transitioning government back office operations to modern technology, reducing costs and risk, and leveraging commercial best practice,” GSA states in its budget justification.

Experts said there are definite concerns about consolidating to one major payroll provider for civilian agencies as DFAS and the State Department — the other two providers — will continue to serve their specific customer bases.

But at the same time, GSA has to get started because no one argues that federal payroll systems are working well.

“To move 2 million accounts and move it in a way that no one can screw it up will take five or six years. So the TMF giving them the ability to get started now as opposed to waiting for the 2020 or 2021 budget is good, but I don’t know how far $21 million will go,” one industry source said. “I know people are concerned about the change, but when I look at the physical placement of the providers, it makes sense they are and continue to be distributed in different parts of the country. This is for a number of reasons including for continuity of operations and for workforce considerations because you can over-saturate an area in terms of finding workers. So the QSM is an ideal model in many ways.”


Agencies likely to miss March 31 deadline to release RFPs under new telecom contract


The news that CenturyLink passed the last major hurdle to begin offering services under the $50 billion Enterprise Infrastructure Solutions (EIS) contract isn’t necessarily going to set off a tidal wave of solicitations from agencies.

CenturyLink is the first vendor to receive its authority to operate for its business systems, meaning it can accept and process task orders or service orders, provision or deliver services and bill for services. Other incumbent vendors such as Verizon and AT&T aren’t far behind, and the six other “new” companies should be completed in the coming month or two.

Until those vendors receive their ATOs, EIS will remain on a slow roll, as agencies are hesitant, and maybe even a little fearful of bid protests, about making awards until they feel there is adequate competition.

“We can award before the ATOs are done, but we can’t execute the order until the ATOs are done. I think most organizations will wait until most or all the ATOs are done to award,” said one agency EIS transition executive, who requested anonymity because they didn’t get permission to speak to the press. “If you don’t wait for all nine vendors, there is a chance you will go into a protest situation. I think there is some anxiety for how vendors will handle that. Some say only the incumbents have advantage in dealing with ATO, but I think that thinking is full of crap. All the vendors have known what the requirements are to get an ATO for years. But at the same time if I’m going to award, then I want to execute and not wait until there is an ATO because I’m still paying higher rates while I’m waiting for the approval.”

While this is only one expert’s opinion, others in and out of government believe most agencies will wait until the vendors have their ATOs, thus putting the General Services Administration’s March 31 deadline for agencies to release their EIS solicitations further in doubt.

As you may remember, GSA changed the transition timeline for agencies with the caveat of March 31 and Sept. 30 deadlines for releasing solicitations and making awards, respectively. GSA is extending the current Networx contract to May 2023.

“Agencies need to release their solicitations to industry and make timely task order awards so they can make the transition within the four-year window to modernize their IT infrastructure,” writes Bill Zielinski, the acting assistant commissioner in GSA’s Office of Information Technology Category in the Federal Acquisition Service, in a March 14 blog post. “To be clear, GSA intends to extend expiring telecommunications contracts so agencies have enough time to complete the transition and modernize, not to extend the time for the solicitation and task order award process. Agencies need to keep their foot on the gas to ensure they have time to transition their telecom services from their existing contracts and providers to EIS. For example, agencies should issue their solicitations to industry by March 31, 2019. If you don’t issue your solicitations to industry by this date, GSA may cease providing one of our transition support tools for solicitation development — the Transition Ordering Assistance program.”

Source: GSA blog post from March 14.

Additionally, Zielinski said since the 35-day government shutdown didn’t impact most agencies, GSA is keeping to its schedule, but will work with agencies on a case-by-case basis if they need additional help or time.

Bob Woods, a former GSA telecommunications official and now president of Topside Consulting, said GSA should instill, and agencies should feel, a sense of urgency about the entire EIS effort.

“Sometimes you have to create a crisis and that hasn’t happened yet,” Woods said at the recent Independent Telecommunications Pioneer Association (ITPA) lunch in Vienna, Virginia. “GSA has the tools to help agencies, but they can’t issue an edict. They tend to go through the Office of Management and Budget and/or the President’s Management Council (PMC). When the secretary or deputy secretary comes back and asks, ‘why are we behind?’ that brings the pressure and gets the attention on the transition.”

Multiple government sources involved in the transition say OMB hasn’t sent any memos, guidance or instructions related to EIS transition, and it hasn’t come up too often during CIO Council meetings.

“The last time for Networx, the communications were more frequent and more structured from OMB,” said another government official involved in EIS transition. “GSA, I think, is getting pushed really hard by OMB to meet the transition dates. GSA said it would work with agencies who are behind.”

So far, as of Jan. 31—the last time GSA updated the transition chart—large agencies had released 25 of an expected 104 solicitations. The Defense Department accounts for 58 of those 104 total RFPs.

Source: GSA EIS website.

The medium-sized agencies are further behind, with only five of 40 solicitations out to vendors.

“There is a tsunami of solicitations coming because you are taking Networx, Washington Interagency Telecommunications Services (WITS), the regional contracts and saying we will combine them all into EIS, and that creates the tsunami in-and-of itself. Then GSA is saying do it fast,” the second federal executive said. “Then there are the changes in terms of how GSA is able to handle ordering and billing of the services, which is not nearly as flexible from a workload standpoint as it was previously under Networx.”

The federal executive added that their agency is having to hire more people, and that processing orders and dealing with billing will be more costly for the agency.

“I can’t say enough about how well it worked before and how efficient it was,” the executive said. “We are not sure how many more people we will need. We hope we can keep it to 10 people, but some of the estimates say it may be more than that. We previously had about three people doing the ordering and maybe two doing the billing, and we had a defined and automated workflow integrated with GSA’s.”

The first executive said they expect their transition to EIS to take two years in part because they expect the provisioning of services to take longer due to the complexity of the EIS contract’s approach using task orders.

Experts also say that whenever the tsunami of solicitations comes, vendors and GSA alike likely will be overwhelmed.

“Some of industry struggled on Networx to put together quality packages. It was a little embarrassing. We knew we couldn’t call them out and tell them the bids were horrible, but they weren’t good,” said the second executive. “We want everyone bidding, but we may not get that as vendors will have to make hard choices on which opportunities they will cherry pick.”


Transportation, State, CIA experiencing change in CIO roles

Quietly, two agencies are making moves in their chief information officer shops.

The Transportation Department didn’t wait long to fill its vacant CIO role, hiring Ryan Cote without much fanfare. Cote, who started Feb. 4, came to DOT from Gartner where he was an executive partner.


He replaces Vicki Hildebrand, who left in December after just over a year on the job.

While Transportation filled its role quickly, the State Department hasn’t had a permanent CIO for more than 15 months, and now its acting CIO, Karen Mummaw, is retiring in April.

Sources confirm to Federal News Network that Mummaw announced her plans to leave in February.

State hasn’t had a permanent CIO since Frontis Wiggins retired in December 2017, and may not until the Senate confirms State’s undersecretary of management nominee, Brian Bulatao, who has been stuck in the nomination process since July.

Cote comes to Transportation after spending four years in the Marines and then the rest of his career in the private sector. He worked as the CIO and senior vice president of IT at iForce, a staffing and recruiting company, and for IBM as a senior practice consultant.

As the DOT CIO, Cote likely is picking up where Hildebrand left off in reshaping how the agency uses technology through the nine BHAGs—big, hairy, audacious goals—that focused on everything from cybersecurity to shrinking the IT footprint to implementing intelligent software.

DOT has a $3.7 billion IT budget, with 78 percent of all projects on schedule and 68 percent on budget, according to the federal IT dashboard.

Among his biggest challenges will be to continue the partnership with the modal organizations, particularly the Federal Aviation Administration.

Over at State, Mummaw caps a 31-year career, spending her first 10 years in the Foreign Service working in technology and telecommunications roles at embassies around the world, and the next 21 years at both headquarters and overseas as an IT executive.

During her tenure, Mummaw helped lead State’s continued transformation to the cloud. State has two main goals as part of its modernization plan: centralizing back-office or commodity IT, and consuming IT-as-a-service.

State still faces several challenges with its $2.2 billion IT budget, of which Mummaw’s office controls only about $725 million. The federal IT dashboard says 84 percent of State’s projects are on schedule, but only 52 percent are on budget. Additionally, State continues to recover from a recent breach of its unclassified email system.

CIA, NIST put out help wanted signs

The National Institute of Standards and Technology and the CIA also are looking for new CIOs. NIST posted a job opening on USAJobs.gov in late February. Resumes are due March 27.

The CIA announced its CIO, John Edwards, received a promotion to be the deputy chief operating officer. On March 19, the agency named Juliane Gallina, a former CIA officer currently at IBM, as its new CIO.

She will start April 1.

Gallina served as a naval officer with a specialization in cryptology and information warfare, retiring from the Navy Reserve as a commander in 2013. She graduated with honors from the U.S. Naval Academy in 1992, earned a master’s degree in space systems from the Naval Postgraduate School in 1998, and earned a master’s degree in electrical engineering from George Washington University in 2006.

NextGov first reported the CIA’s hiring of Gallina.

Edwards has been the CIA CIO since March 2016. He previously served 14 years as a communications and technical operations officer within the Directorate of Science and Technology (DS&T) and five years as chief of staff to the CIA’s executive director.

During his tenure, Edwards led the CIA’s move into the commercial cloud hosted by Amazon Web Services, making it part of the broader intelligence community IT modernization effort.

Additionally, he implemented what he has called a “franchise” model for IT where CIA offices must adhere to a strict set of standards and security requirements, but are able to operate their own IT infrastructures.

Among his long-term priorities, Edwards focused on mobility, interoperability, data management and ensuring capabilities at the edge.

Additionally, the Government Accountability Office is looking for a chief data scientist, the Agriculture Department’s Agriculture Research Service is looking for an assistant CIO to run its technology efforts, and Washington Headquarters Services in the Defense Department is seeking a new CIO.

Finally, Somer Smith is the new permanent chief of staff for Federal CIO Suzette Kent. She had been acting chief of staff since August.

Smith had been a performance analyst for the Office of Management and Budget since August 2017.

This also means OMB is hiring a new supervisory policy analyst.

“The position performs duties related to IT reform efforts, consistent with the Information Technology Oversight and Reform (ITOR) fund. Additionally, the incumbent will collaborate with agencies and policy teams in terms of the CIO Act, cyber policy initiatives, Evidence Based Policy and relevant executive orders,” the job listing states.

