Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

HUD will use second $5M transfer to move IT modernization initiative to production

The Technology Modernization Fund still has more than $35 million to “loan” to agencies, and the board is inching closer to making a third round of awards.

Maria Roat, the Small Business Administration’s chief information officer and a member of the board, said about 12 projects are in the draft phase, where the agencies are working with the program management office to finalize their proposals.

“Several others also have come in for phase two where they are pitching their proposals to the board,” she said after the CFO-CIO Summit sponsored by the Association of Government Accountants and the Association for Federal Information Resources Management (AFFIRM).

Roat said during the panel discussion that the board has received 50 proposals worth more than $500 million over the past year-plus, reviewed 37 and funded seven so far.

She said the board receives agency project details on Fridays and meets on Mondays to deliberate.

One of those projects that received funding was the mainframe modernization effort at the Department of Housing and Urban Development.

HUD’s loan now at $10 million

HUD deputy CIO Kevin Cooke said the agency received its second tranche of funding from the board in the last few weeks, another $5 million to go with the same amount it received last fall.

“All of the architecture work is done. We’ve built a prototype or pilot already to make sure there was no issues with the hundreds of thousands of lines of code to make sure it would work seamlessly. That was a big part of the proof of concept,” Cooke said in an interview after the panel. “The money will allow that jump into the actual projects, not just the proof of concept.”

Cooke said the first $5 million of the $20 million loan went toward making sure the mainframe modernization plan was sound.

“In order to get this done, there is some reverse engineering of the applications that are there to make sure as we change the platform the programs do not lose any functionality,” he said. “That was a big agreement that we had with them that during this period of time they would be up and running the whole time with their current systems, and this replatform would not cause them to lose any functionality. From that standpoint, it’s a good process, but it’s a slow process. It’s not like we are starting from scratch and you get to decide all the new interfaces and all the new APIs and ways of doing this. We are working closely with them. This allows us to continue on our trajectory on the project.”

HUD is modernizing seven mainframe systems, all of different sizes. Cooke expects the agency to complete about 20 percent of the project by the end of 2019.

“You get smarter as you go along so it gets faster as you go,” he said.

And HUD needs to get smarter and faster because that loan is coming due. HUD has to start paying back the money to the TMF in mid-2020.

Energy project delays are over

Energy won $15 million in June to accelerate its move to email in the cloud and has received $2.2 million so far.

Bryan Long, Energy’s deputy CIO, said the project is a little behind schedule because of a protest of its CIO Business Operations Support Services (CBOSS) contract, which it plans to use to move some 65 disparate email systems from laboratories and offices to the cloud.

CBOSS is a $2 billion single award blanket purchase agreement for a host of IT services. The Government Accountability Office rejected ActioNet’s protest and backed Energy’s decision to award the contract to Accenture Federal Services.

Accenture’s team includes Unisys, General Dynamics Information Technology (GDIT) and Red River.

“We are behind where we had hoped to be at this point, but we do have the project awarded now under our new IT project and it will be moving out,” he said. “There is no doubt this certainly will accelerate our shift to cloud email for the remaining on-premise email systems.”

While Roat declined to name which agencies are in the final stages, it’s clear agencies are interested in the TMF loans.

Officials from two of the current awardees, HUD and Energy, said they have other project proposals they want to send or resend to the board.

Cooke said the agency submitted three total proposals to the board last year and may send one of those back for a second review.

“One of [the] ones we looked at was looking at our enterprise data management. There were so many different programs involved in that and one of the things we didn’t do is show the direct impact, not just more efficiencies, but it will be easier. When we talk about the ease around data analytics, business intelligence and reporting, those were outcomes that meant something to the department, but I didn’t think we did a good enough job of showing to the outside how much better that would be in terms of being able to look at our data more holistically across all of the 19 different programs that we had.”

Cooke said the project also would’ve taken too long to show a return on investment.

Long said Energy submitted four proposals to the board last year, including the idea to move to a desktop-as-a-service and an application rationalization effort. He said Energy has learned valuable lessons from its experience with the board.

“As you are looking at projects, you need to factor in what are the savings, where are they coming from and how quick are you going to accumulate them, and what’s that overall return on investment look like,” he said. “Will it take you five years or 10 years to recoup that money that you used to do the project. Those are some of the key things you have to evaluate.”

Audit of TMF project underway

While Cooke and Long both praised the benefits of the TMF because the loan is helping each agency move more quickly on their respective projects, both said they expect more oversight and attention to their programs.

Cooke said HUD’s inspector general recently began an audit of the project.

“You want to make sure you have enough people and the right people to support these projects. It’s very high profile. There are a lot of external eyes on it, and we chose it because of how important it is to the agency so everybody has got to be focused on it,” Cooke said. “That’s something we would have to consider when we look at what we have in front of us right now that we are doing, what spare capacity would we have” for an additional TMF-funded project?

One of the most talked about and highest profile pieces to the TMF is how agencies will pay back the loan.

Long said Energy runs a fee-for-service model for its enterprise cloud email, so that’s how it will pay back the loan.

Cooke said HUD expects the costs of moving off the mainframe and into a modern architecture will pay for itself quickly.


FEMA, GSA gain new IT executives

It took nearly a year for the Federal Emergency Management Agency to find a new chief information officer.

A year after Adrian Gardner joined a long list of CIOs who were reassigned, FEMA’s search ended with a Defense Department veteran.

The agency quietly named Lytwaive Hutchinson as its new CIO in early May. Hutchinson came to FEMA after serving as the vice director of the Joint Service Provider in the Defense Information Systems Agency (DISA).

This is Hutchinson’s first civilian agency assignment. She spent 21 years as an active duty member of the Army and then joined Washington Headquarters Services in DoD in 2002 after retiring.

Patsy Garnett, who had been acting FEMA CIO since Gardner left, reassumes her previous position as the agency’s deputy CIO. Gardner retired in April after 30 years in government.

As the vice director of the JSP, Hutchinson developed, maintained and facilitated the implementation of the organization’s IT infrastructure across JSP’s customers. She oversaw an annual IT budget of more than $500 million, managed enterprisewide IT programs and initiatives, and served as the executive authority on the utilization of IT capabilities, resources, and systems.

As FEMA’s CIO, Hutchinson inherits major challenges to upgrade the agency’s IT infrastructure. In October, the Homeland Security Department’s inspector general told the House Homeland Security Subcommittee on Emergency Preparedness, Response and Communications that since 2005 auditors have found FEMA’s outdated IT systems and infrastructure did not enable it to effectively carry out disaster response and recovery efforts. The IG found significant and longstanding deficiencies that continue to hamper emergency support operations, particularly around the agency’s ability to manage and track disaster funding and share information with external partners.

The IG also told lawmakers the CIO’s office lacks budget authority, formalized governance and oversight over IT investments and is missing an overall IT strategic plan.

FEMA is attempting to address some longstanding IT challenges. In its fiscal 2020 budget request, FEMA asked for funding to reduce its IT complexity, including $18.3 million to upgrade its network, $42.1 million to modernize its grants system and $8.1 million to address its aging financial management system.

It also asked for $9.1 million to continue “a multiyear effort to enable the agency to work smarter through data analytics and ultimately deliver better outcomes for survivors and communities. [The] Enterprise Data and Analytics Modernization Initiative will enable FEMA to streamline the work necessary to stay ahead of emergencies and deliver swift, effective assistance in times of greatest need,” DHS writes in its 2020 budget document.

Hill technology veteran comes to GSA

Along with FEMA, the General Services Administration is bringing in some new IT expertise.

Reynold Schweickhardt joined the agency as a senior technology advisor to “provide technology and cyber management perspective for GSA related technology initiatives,” according to his LinkedIn page.

This is Schweickhardt’s first experience in the executive branch after spending the last 24-plus years working for the legislative branch. He was the House of Representatives’ Director of Technology Policy for eight years before joining GSA. He also served as CIO and chief technology officer for the Government Printing Office.

While FEMA gained a technology executive, U.S. Citizenship and Immigration Services lost one.

Eric Jeanmaire, the division chief for Identity, Records and National Security Delivery at USCIS, left government to become the CEO of Finality, a security engineering firm.

He spent nearly 10 years in government, including the last seven at USCIS working to modernize the E-Verify program.

Job openings at FDIC, Army Futures Command

Here are a couple of other interesting job openings in government:

The Federal Deposit Insurance Corporation (FDIC) is looking for a chief innovation officer for its Tech Lab (FDiTech). “The CINO will facilitate the transformation process by partnering with other FDIC divisions/offices, the Chief Information Officer organization, and the Office of the Chief Information Security Officer to strategically address challenges through the adoption of innovative technologies. Creates an environment that fosters and enables technological innovation and transformation within the FDIC, by building partnerships and serving as a change agent,” FDIC writes in the job posting.

The agency is accepting applications through May 29.

The Army’s Futures Command is looking for a director of futures integration within the Futures and Concepts Center.

The director will apply analysis to threats “through the lens of the unifying concept and recommends the Army Futures Command (AFC) ‘top-down’ requirements for inclusion in the Army Modernization Strategy and Annual Mission Guidance,” according to the job posting on USAJobs.gov.

The director also will direct the execution of plans and programs to ensure the Future Force Modernization Enterprise maintains consistency across warfighting functions, including synchronization of doctrine, organization, training, materiel, leader development, personnel, facilities and policy development actions, and will guide the requirements efforts of the nine FCC capabilities development and integration directorates (CDIDs).

Hurry up and apply: applications are due by May 17.


New details from Oracle point to former Navy official as third executive caught up in JEDI controversy

Oracle fired another salvo at the Defense Department’s $10 billion cloud procurement. The soap opera that is the Joint Enterprise Defense Initiative (JEDI) took another dramatic turn last week with a new court filing by Oracle, which alleges former Defense Department employees have been “caught in a web of lies, ethics violations and misconduct” in the development of the JEDI solicitation.

The Court of Federal Claims filing — 128 pages that reads much like a paperback novel — reiterated and added more details about the potential role of two DoD officials who the software giant had already claimed had direct influence in the development of the JEDI solicitation while having job offers in hand from Amazon Web Services.

One of the officials, Deap Ubhi, has been at the center of this controversy for most of the year-long battle. Federal News Network now has confirmed through multiple sources that the third person in the latest redacted filing, whom DoD found may have violated acquisition policy and laws, is Victor Gavin, the former deputy assistant secretary of the Navy for command, control, communications, computer systems and intelligence. Gavin is currently head of federal technology vision and business development for Amazon Web Services. The second person who figures into this controversy is Anthony DeMartino, who served as a consultant for AWS through January 2017 and became deputy chief of staff for the Office of the Secretary of Defense.

DoD’s internal investigation found JEDI didn’t suffer any prejudice from the participation of the three officials who are alleged to have had conflicting connections with AWS, although it did refer potential ethical violations to the department’s inspector general.

But Oracle says in the filing that the contracting officer didn’t interview Ubhi or anyone from AWS or anyone on the JEDI solicitation team, and that there were additional inconsistencies about Ubhi’s claim that he firewalled himself from the JEDI solicitation.

The complaint also details Gavin’s role in JEDI and what Oracle says are clear conflicts of interest and mistakes. Oracle alleges Gavin, whose name is redacted throughout the complaint for unknown reasons, “began employment discussions in the late summer of 2017 and continued the discussions throughout JEDI. Like Ubhi, [Gavin] continued participating on JEDI even after accepting an employment offer from AWS. For instance, in [Gavin’s] final JEDI meeting, held three days after [Gavin] accepted an offer to serve as a principal in AWS’ [federal technology] division, [Gavin] participated in and received access to the DoD source selection sensitive draft acquisition strategy.”

Oracle says DoD has determined both Ubhi and Gavin violated Federal Acquisition Regulation section 3.101-1 [improper business practices and personal conflicts of interest] and possibly 18 U.S.C. § 208 and its implementing regulations about taking actions that directly benefit their financial interests.

A broad fishing expedition?

An AWS spokesman declined to comment on Oracle’s filing and declined to make Gavin or Ubhi available to answer questions, citing ongoing litigation.

AWS, however, has over the last few months pointed to three separate investigations—one by the Government Accountability Office and two by DoD—that found no conflicts of interest that would’ve affected the JEDI procurement.

“Rather, Oracle seeks to engage in a broad fishing expedition primarily to find support for its claim that the solicitation at issue is tainted by alleged conflicts of interest involving two former Department of Defense employees and defendant-intervenor, Amazon Web Services, Inc.,” the government wrote in its January filing in response to Oracle’s initial filing.

AWS also has called Oracle’s amended complaint “wildly misleading and a desperate attempt to smear” the company by distorting the facts.

DoD said in its January response that Oracle is creating “unnecessary delays, burdensome information requirements, and excessive documentation” in order to conduct a detailed review of Ubhi’s actions.

At the same time, Oracle’s revised complaint seems to bring new details to light about both Ubhi’s and Gavin’s roles in JEDI.

“Indeed, the record also makes clear that AWS failed to take the necessary steps to firewall Ubhi and [Gavin] fully and adequately when they joined AWS and the contract officer’s suggestion to the contrary contradicts existing law,” Oracle writes. “As previously discussed, the contracting officer knows that the affidavit submitted by [Gavin] was inaccurate. For example, [Gavin] averred in his original affidavit that he ‘had no access to the DoD’s acquisition plan, source selection procedures, or any other information that could provide a competitor an unfair advantage.’ But the contracting officer knew this statement was inaccurate given that she attended the JEDI Cloud meeting with [Gavin], during which the participants discussed the source-selection-sensitive draft acquisition plan. Significantly, the contracting officer determined this much without conducting any search of the JEDI records related to [Gavin].”

Oral arguments slated for July

While DoD, Oracle and AWS joust over the facts in court over the next few months—the judge told the parties to expect a ruling by early-to-mid July—the software giant’s latest filing has to call into question whether JEDI is even viable anymore. The court announced on May 9 that oral arguments over the protest would take place July 10 in Washington, D.C.

“Like their previous pleadings, Oracle’s supplemental complaint eloquently paints a damning picture of deeply flawed process. My guess is that, for casual followers who have been quick to dismiss Oracle’s prior filings as sour grapes, reading this document would be a real eye-opener for them,” said Steve Schooner, a Nash and Cibinic Professor of Government Procurement Law at The George Washington University in Washington, D.C. in an email to Federal News Network. “This is not business as usual, nor should it be the way DoD conducts its business generally. And it surely shouldn’t be the way DoD awards its largest, most important, highest profile contracts.”

Many longtime federal procurement experts said if the details in the Oracle complaint are true, or even mostly true, the JEDI procurement is starting to rise to the same level as the Air Force’s tanker procurement, in which Darlene Druyun, the principal deputy undersecretary for acquisition, went to jail for inflating the price of the contract to favor her future employer, Boeing, and for passing information about the competing contractors.

Schooner said he would hope that DoD would do all it can to ensure it resolves even the optics of a conflict of interest around JEDI.

“DoD would be particularly inclined to do the right thing, send a strong message to the community and take immediate, bold, clear, and definitive action to ensure that a contract decision of this size and institutional significance was not tainted,” he said.

Oracle’s court filing also confirms something Federal News Network reported in March: that the FBI is investigating the JEDI procurement.

While Oracle offers no more details about the FBI’s involvement, the fact that its lawyers discussed it twice in the complaint reinforces the seriousness of the concerns about JEDI.

Oracle said in its filing that the problems are not just with Gavin, Ubhi or DeMartino, but also with AWS’s actions.

“The contracting officer likewise treats AWS as somehow blame-free despite its heavy hand in the misconduct,” Oracle states in its amended complaint. “For instance, AWS necessarily knew both that AWS had entered employment discussions with Ubhi and that Ubhi was serving as the JEDI lead product manager. Yet, AWS did not advise DoD of the employment discussions or even require Ubhi to provide an ethics letter to support Ubhi’s simultaneous participation in employment discussions with AWS while serving as the JEDI lead product manager. Instead, AWS purportedly relied on Ubhi’s statement that he had no restrictions on his conduct notwithstanding that AWS necessarily knew that to be false.”

Oracle also contends that AWS knew it had offered Ubhi a job and that he didn’t recuse himself from JEDI.

Schooner said Oracle’s complaint should be a wake-up call for the Pentagon.

“In addition to the pathologies evident in the original acquisition strategy, the current conflicts narrative, as painstakingly laid out in the supplemental complaint, offers DoD a much needed — even if not initially welcome — lifeline to reassess and reevaluate their original approach to the procurement, and start again with a clean slate,” he said. “Frankly, DoD would do well to grab the rope, escape to safety and start from scratch on this procurement. Sadly, I fear that the level of investment to date may be too high to permit DoD’s leadership to come [to] the right conclusion at this point.”

Oracle is asking the Court of Federal Claims to either find that AWS is ineligible for award or require DoD to further investigate and resolve the conflict of interest claims.


Exclusive

Oracle sends 8 letters to lawmakers asking for stronger oversight of DoD’s JEDI Program

Oracle Corp. wrote about 1,000 pages in its bid for the Defense Department’s $10 billion cloud program known as the Joint Enterprise Defense Initiative (JEDI).

The company says DoD eliminated it without getting much past the first eight pages, which detailed Oracle’s qualifications under the first evaluation gate.

“The first gate required that ‘the addition of DoD unclassified usage will not represent a majority of all unclassified usage’ on the vendor’s overall usage. DoD decided to base this determination on data averaged across two selected months (January and February 2018), seven months before the JEDI request for proposal (“RFP”) submission deadline and now almost 18 months ago. Significantly, Oracle was eliminated for falling 0.79 percent short of the mandated threshold based on DoD’s approach, but would have cleared the same threshold had DoD averaged data just 30 days later (February and March 2018) or in any subsequent period,” Ken Glueck, Oracle’s executive vice president writes in a letter to House and Senate armed services and appropriations committees, which Federal News Network obtained.
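
To see how sensitive that two-month averaging window can be, here is a deliberately simplified, hypothetical calculation. Every usage figure below is invented for illustration; the actual numbers in the bids are not public.

```python
# Hypothetical illustration of JEDI's gate 1 test: projected DoD
# unclassified usage must not represent a majority (50% or more) of a
# vendor's total unclassified usage, with commercial usage averaged
# over a fixed two-month window. All figures here are invented.

projected_dod_usage = 100.0  # hypothetical DoD demand, arbitrary units

# Hypothetical commercial usage by month, growing over time
commercial_usage = {"2018-01": 95.0, "2018-02": 99.0, "2018-03": 108.0}

def dod_share(months):
    """DoD's share of total usage for a given averaging window."""
    avg = sum(commercial_usage[m] for m in months) / len(months)
    return projected_dod_usage / (projected_dod_usage + avg)

print(f"{dod_share(['2018-01', '2018-02']):.4f}")  # 0.5076 -> fails the gate
print(f"{dod_share(['2018-02', '2018-03']):.4f}")  # 0.4914 -> passes the gate
```

A vendor whose commercial business is growing can fail under one window and clear the same 50 percent line a month later, which is exactly the sensitivity Oracle’s letter complains about.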

DoD informed Oracle and IBM in early April that the two firms didn’t make it out of the initial review of bids, leaving Amazon Web Services and Microsoft as the remaining two competitors for JEDI.

The letter is the latest way Oracle is turning up the heat on the Pentagon and its plans for a single award cloud contract. The software giant also has a protest pending before the Court of Federal Claims.

Oracle asked lawmakers to exercise their oversight authority over DoD and JEDI and push for a fairer competition.

“DoD’s presumed intent behind the gating criteria was to ensure it received bids from large, enterprise-grade cloud service providers (CSPs) with robust cloud service offerings capable of meeting the unique needs of the DoD,” Glueck writes. “Certainly, IBM and Oracle — two of the world’s largest technology companies — meet these general criteria on its face. By eliminating IBM and Oracle, DoD eliminated two of the most enterprise and security-focused CSPs from competition, leaving only two companies to compete for up to a 10-year, single-vendor award.”

Piquing Congressional interest

An industry source familiar with the JEDI competition said lawmakers have been interested in the procurement since it kicked off more than a year ago. The source said Oracle is trying to ramp up lawmakers’ interest with the letter.

“If the goal is to get to the best answer for DoD, then what’s wrong with reading four proposals? There weren’t 200 before the agency, there were four,” said the source, who requested anonymity in order to talk about the controversial procurement. “I think most people knew from the outset that AWS and Microsoft could make it through what many say is an arbitrary gating process. But what I think Oracle is telling Congress is that it’s okay to use gates, but not ones that clearly exclude qualified competitors for this 10 year contract.”

Sources say IBM made it through gate 1, but DoD decided it didn’t meet the requirements under gate 2, high availability and failover.

In all, DoD outlined seven gate criteria under JEDI. Others include commerciality, independence, automation and data.

The Pentagon structured the procurement so that the gates operate in a waterfall approach: vendors have to qualify under gate 1 before they are evaluated under gate 2, and so on.

This means that once DoD disqualified Oracle at gate 1, it likely never reviewed the rest of the company’s proposal.
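
A minimal sketch of that waterfall logic, with hypothetical gate names and pass/fail checks (DoD’s actual criteria are far more detailed), shows why an elimination at gate 1 leaves the rest of a proposal unread:

```python
# Waterfall gate evaluation: gates run in order and evaluation stops at
# the first failure, so later gates, and the proposal volumes behind
# them, are never reviewed. Gate names and checks are hypothetical.

GATES = ["usage threshold", "high availability/failover", "commerciality",
         "independence", "automation", "data", "security"]

def evaluate(proposal, checks):
    """Apply gate checks in order; return (passed_all, gates_reviewed)."""
    reviewed = []
    for gate in GATES:
        reviewed.append(gate)
        if not checks[gate](proposal):
            return False, reviewed  # eliminated; remaining gates unread
    return True, reviewed

# A bidder that narrowly misses the first gate, as Oracle says it did
checks = {gate: (lambda p: True) for gate in GATES}
checks["usage threshold"] = lambda p: p["dod_share"] < 0.50

print(evaluate({"dod_share": 0.5079}, checks))  # (False, ['usage threshold'])
```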

Industry, including Oracle, Microsoft and Google (which ultimately didn’t bid), has been concerned about the gate evaluations in JEDI since the draft request for proposals came out.

The source said during the comment period on the draft RFP, during and after the JEDI industry day, and in Oracle’s bid protest, contractors brought up uneasiness about the gates.

The source said DoD declined to change the gating requirements, worried that doing so would open the department up to dozens or even hundreds of bids.

“To be sure, competition in the commercial cloud services market is robust and advances in cloud computing technology are accelerating. JEDI’s arbitrary — and dated — gating criteria deprives DoD the opportunity to secure the most innovative technology at the best price,” Glueck writes. “Next generation cloud vendors are innovating rapidly around performance, security, artificial intelligence, agility and many other attributes of modern cloud. Ironically, by focusing on a measurement and ‘gate’ now almost 18 months old, DoD virtually assures itself to only evaluate legacy cloud alternatives, depriving the warfighter of the newest generation of cloud technology.”

Oracle’s letter comes as lawmakers asked DoD officials about the JEDI program last week. Pentagon executives offered few new comments or insights about the program.

House committee questions JEDI plan

Rep. Steve Womack (R-Ark.) pressed acting DoD Secretary Patrick Shanahan during a May 1 House Appropriations defense subcommittee hearing.

Here is the exchange between Womack and Shanahan:

Womack: “Why does your department continue with what I believe to be an ill-conceived strategy on a single vendor JEDI cloud program? There’s been a down select to two organizations, that in my strong opinion, continues a strong pattern of limiting competition on a program that is potentially extremely expensive. We don’t know just how expensive it’s going to be, and I have strong concerns about how the approach by the Department of Defense in this arena seems to be geared toward producing a desired outcome with a specific vendor. So I’ll leave it there, but I’m concerned about JEDI.”

Shanahan: “Well, digital modernization is probably one of the most important undertakings the department has. So for us to be successful in cyber, we have to be able to protect ourselves. So the cloud is one element of infrastructure that we’re modernizing. The fundamental premise on our approach to the JEDI implementation was to have competition. And this is an important underpinning of that competition, preclude vendor lock-in. So we [don’t] want to be locked into one supplier, just like in the situation we have with Electronic Health Record, have flexibility if things aren’t working out.

“Across the department, there is a proliferation in terms of implementing clouds. Everyone was moving to the cloud. The JEDI competition is about creating a pathway so that we can move as a department on a small scale.

“This isn’t wholesale. It sometimes gets advertised as this is winner-take-all. This is winner-take-all for a very small subset of the amount of cloud infrastructure we’re going to have to build out over time. Besides creating competition, we’re creating the standard processes so that the department can migrate, so that we don’t have each and every department trying to figure out how to move to a cloud.”

Womack also asked Shanahan if DoD is talking to the intelligence community about its decision to move to a multi-cloud environment. The acting Defense secretary said they were paying attention and “we’re taking what they learned from their experience and translating it into what we’re doing.”

The congressman was less than satisfied.

“I’ll leave the subject as that it is clear that multi-vendor cloud environments are widely used by large organizations for a simple reason: they increase competition, they improve security and capability and they provide cost savings, and in an environment like we’re in right now, I would assume that that would be a key issue for our Department of Defense,” Womack said.

While Womack’s interest in JEDI isn’t new—he wrote a letter to the DoD inspector general in October asking for an investigation—Oracle hopes to spur more interest and more questions before the expected award of JEDI by mid-July and the Court of Federal Claims decision that should come before the award.


This is the vendor who sees GSA’s 2GIT contract as the hill ‘I’m dying on’

Rick Vogel isn’t sitting still this time. Back in 2010, when the General Services Administration kicked off the strategic sourcing initiative for office supplies and tried to shut the door on hundreds of small businesses under Schedule 75, Vogel sat by, watched and waited.

We know now that the Obama administration’s office supplies strategic sourcing effort not only failed, but put dozens, if not hundreds, of small businesses out of business. Vogel, who is the federal government sales manager for Coast to Coast Computer Products in Simi Valley, California, said the remaining contractors under Schedule 75 are facing margins so low they can hardly stay afloat.

Vogel knows this because he is one of those surviving contractors.

Now nearly nine years later, Vogel said he understands with bright clarity what GSA is trying to do with its 2nd Generation IT (2GIT) multiple award contract.

“The 2GIT solicitation is a violation of the Small Business Jobs Act and it will decimate the industry,” Vogel said in an interview with Federal News Network. “GSA got away with it under Schedule 75. I look at this as I’m fighting for my livelihood here. I’m dying on this hill.”

Coast to Coast filed a pre-award bid protest with the Government Accountability Office alleging GSA violated several provisions of the 2010 small business law with the 2GIT solicitation.

Among the provisions in the law that Coast to Coast alleges GSA violated are:

  • No analysis of 2GIT’s potential impact on small businesses.
  • No evidence of having evaluated alternative contracting approaches to a multiple award contract with a limited number of awards.
  • 2GIT will negatively impact more than 700 small businesses under the five special item numbers that are part of the new contract.
  • 2GIT violates the bundling statute because it’s not a direct follow-on to the Air Force’s NetCents contract.
  • The evaluation factors are unduly restrictive, including the requirement for the small firms to be certified under the ISO 9001 standard.
  • The self-evaluation factors are unduly restrictive, giving too much power to resellers Promark and the Immix Group.

“They are making it impossible for so many small businesses to compete for a spot on 2GIT,” Vogel said.

Additionally, Vogel said GSA now is in violation of the Competition in Contracting Act (CICA) for not placing a “stay” on the procurement given that GAO accepted Coast to Coast’s protest. He said GSA sent a letter to Schedule 70 contractors saying that, despite the protest, bids are still due on May 6.

“I can’t think of any urgent or compelling reason for this procurement to continue while there is an active protest,” he said.

A GSA spokesman said the agency doesn’t comment on active litigation.

GAO has until July 31 to make a decision on the protest.

Similar pattern as office supplies?

Now before you dismiss Vogel’s complaints as “just another whiny small business,” let’s take a step back and understand why he is so fired up over 2GIT.

Vogel has spent the last 20 years building a federal practice for Coast to Coast. Over the last decade, he had to shift from providing office supplies to IT products. Coast to Coast made a strategic decision not to bid on the strategic sourcing office supplies procurement because it had just won a big deal with the Army and didn’t want to overextend itself.

It seemed like a smart business decision from a responsible company until the Army mandated the use of the governmentwide office supplies contract despite having just awarded its own contract for similar products.

All of a sudden Coast to Coast was all but locked out of the market as more and more agencies required their contracting officers to use the strategic sourcing contract.

“I’ve had to reinvent my life once already and I don’t want to do it again,” Vogel said. “We went from selling office supplies to IT products under Schedule 70. If GSA closes down Schedule 70 like they did with Schedule 75, I’m getting out of the government marketplace and will work only in the commercial market. There will not be any compelling reason for any company to get into [the] marketplace if 2GIT gets into place.”

Vogel said he sees the same pattern with 2GIT. He said GSA will award nine spots on the IT products contract and unofficially mandate its use, thus potentially cutting thousands of small businesses out of the market.

“GSA said it during industry days and verbally told me that they have an agreement with the Air Force to demand its use as NetCents sunsets, and they have other agencies who are interested and are asking others. They’ve also said they plan to expand it to state and local governments,” he said.

To be clear, GSA’s industry day slides do not talk about mandating 2GIT, and, generally speaking, agency officials have strongly pushed back against mandating any contract vehicle, saying the value of the approach should be enough for agencies to want to use it.

But the office supplies precedent is instructive: neither GSA nor the Office of Management and Budget mandated use of that strategic sourcing vehicle either, yet several agencies, from the Army to Census to DHS, required contracting officers to justify why they wouldn’t use it.

Self-scoring factors questioned

Vogel said there are other signs that small businesses are in trouble, beyond the real or imagined forthcoming mandate. He said the self-evaluation scoring system makes it nearly impossible for most small businesses to compete.

The protest states, “If a small business concern does not have a teaming agreement with both Promark and Immix Group, it will be impossible for them to score competitively against the limited subset of contractors who are receiving support from these two large business concerns.”

Vogel said the way GSA created the self-scoring system puts small businesses at a disadvantage: to earn enough points to score in the top nine, a company has to have a teaming agreement with one of the big resellers to meet the breadth and depth of the products sought under 2GIT.

“You can’t offer the full basket of products required under 2GIT without an agreement with the Immix Group or Promark Technology because they have exclusive letters of supply with enough companies that would make it hard to have enough points to be in the top nine,” he said. “So that means Immix or Promark control 1,500 points in the self assessment scores and without an agreement, you are basically out of the competition.”

Vogel said GSA gives 200 points as a small business preference, but that’s not enough to balance out the 1,500 points given to companies that have agreements with the resellers.
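
Using only the two point totals quoted above, the arithmetic behind Vogel’s argument looks like this; the baseline score is a hypothetical placeholder for every other element of the self-assessment:

```python
# Back-of-the-envelope sketch of the scoring gap Vogel describes. Only
# the 1,500- and 200-point figures come from his account; the baseline
# is an invented stand-in for all other self-scoring elements.

RESELLER_LINKED_POINTS = 1500    # points Vogel says hinge on an Immix/Promark agreement
SMALL_BUSINESS_PREFERENCE = 200  # GSA's small business preference

baseline = 3000  # hypothetical: points earned from everything else

teamed_with_reseller = baseline + RESELLER_LINKED_POINTS
small_without_agreement = baseline + SMALL_BUSINESS_PREFERENCE

print(teamed_with_reseller - small_without_agreement)  # 1300
```

Whatever the real totals turn out to be, a 200-point preference cannot close a 1,500-point gap, which is the heart of the self-scoring claim in the protest.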

If your next question is why Coast to Coast doesn’t just get an agreement with Immix or Promark, Vogel said both companies are limiting the number of agreements they will make.

Additionally, Vogel said another example of GSA following the same playbook as the office supplies effort is the lack of analysis of 2GIT’s impact on small firms. As a reminder, he pointed to the 2014 Small Business Administration decision that found GSA didn’t conduct an adequate assessment of the office supplies program’s impact on small firms, as required by the Small Business Jobs Act.

Vogel has been working with SBA’s Office of Advocacy, which has asked GSA’s Procurement Center Representative (PCR) to look at 2GIT’s impact on small businesses.

Will Congress pay attention to 2GIT?

The irony in all of this is that GSA Administrator Emily Murphy, when she was a staff member for the House Small Business Committee, expressed serious concerns over how the office supplies program impacted small firms. Vogel said he’s prepared to bring this issue to her attention.

When Murphy was with the House Small Business Committee, lawmakers added a provision to the 2015 defense authorization bill requiring GAO to report on the office supplies program’s impact on small businesses. The provision didn’t make it into the final bill, but GAO has done several reports on strategic sourcing that found GSA and the Office of Management and Budget have not done enough to incorporate small business needs into strategic sourcing or its follow-on initiative, category management.

“We want them to scrap the procurement and do what they promised for the Air Force by rewriting it so it’s a follow-on to the NetCents scope. NetCents was limited to the Air Force and to those agencies with an agreement with the Air Force. It was limited in scope for products,” Vogel said. “There is so much impact and reach for 2GIT if it is implemented this way. I don’t know if GSA either doesn’t see it or doesn’t care. I don’t know if their end game is to narrow down the number of contractors on Schedule 70, but that is what it seems to be.”

And after watching the decimation from strategic sourcing under Schedule 75, Vogel couldn’t sit still any longer.


Air Force joins growing list of agencies paving a new cyber-approval path

The Air Force is joining an ever-growing number of agencies that, frustrated with the arduous and burdensome authority to operate (ATO) process, have developed alternatives.

Similar to the National Geospatial-Intelligence Agency and the General Services Administration’s 18F organization, the Air Force figured out a way to speed up the process to get systems approved to run on its network, while keeping the necessary rigor and adding a new twist—continuous monitoring.

Air Force undersecretary and chief information officer Matt Donovan signed a memo March 22 detailing the new process that comes under the Defense Department’s risk management framework.

Called the “Fast Track ATO,” Donovan said the new process gives authorizing officials the discretion to make decisions based on several factors: the cybersecurity baseline, an assessment or penetration test and ensuring there is a continuous monitoring strategy for the system.

“A fundamental tenet of this Fast-Track ATO process is the authorizing official will make these decisions by working closely with information systems owners and warfighters to find the appropriate balance between rapid deployment and appropriate level of risk assessment,” writes Air Force deputy CIO Bill Marion in accompanying guidance to the new policy. “Use cases for Fast-Track include applications developed for deployment to secured cloud infrastructure, and authorizing officials may consider other applicability as well; system[s] that have not ‘baked security in’ to the system design and are not prepared to endure a strong penetration test are not good candidates for Fast-Track.”

Frank Konieczny, the Air Force’s chief technology officer, said the penetration testing assessment is the key piece to the entire faster process because it’s giving some relief to system owners from the need to comply with every security control in the risk management framework.

“The penetration testing will actually answer some of those controls right away, and, in fact, in better cases because it’s not compliance anymore but how you operationally put information out there,” he said at the RSA Security conference in Washington, D.C. “As we roll this out, what do we mean by penetration test? We are trying to explain that now by getting back to the operational side. What do we really need to support the system going forward and doing it faster than just by doing paperwork?”

Konieczny said the Air Operations Center tested out the Fast Track ATO, completing one in about a week for an application that lives on a highly structured platform that uses a DevOps approach.

“They are doing a lot of testing automatically. They are filling out most of the controls automatically. What they do after that is the penetration test, if it passes, then it’s ready to go,” he said. “The penetration testing is really an operational viewpoint. That will eventually take over some of the compliance issues.”

Fast-Track tested at Kessel Run

The service also tested out the Fast Track ATO at its Kessel Run organization, which is the Air Force’s new agile software development office.

The Air Force’s requirement for continuous monitoring is the final piece of Fast Track. Konieczny said it could mean different things to different organizations, ranging from redoing the code every week with another penetration test to using automation to test the system and track any changes to the code.

“Each authorizing official has the authority to do whatever they really want to do and take that risk or determine how much risk they want to take. They can determine the depth of the penetration test. The deeper the penetration test the better the results will be, and the best way to go into operational. I assume that more critical applications will actually receive a very deep penetration test as well as the continuous monitoring they want to field as well.”
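
As an illustration of the automated, change-tracking flavor of continuous monitoring Konieczny describes, here is a minimal sketch; the file-hash approach and paths are assumptions for illustration, not the Air Force’s actual tooling:

```python
# Illustrative sketch of continuous monitoring via change tracking: hash
# the authorized code baseline and flag any drift for reassessment.

import hashlib
import pathlib

def fingerprint(src_dir):
    """Hash every source file so any change to the baseline is detectable."""
    digest = hashlib.sha256()
    for path in sorted(pathlib.Path(src_dir).rglob("*.py")):
        digest.update(path.read_bytes())
    return digest.hexdigest()

AUTHORIZED_BASELINE = fingerprint("app")  # recorded when the ATO is granted

def monitor(src_dir):
    """Compare the deployed code against its authorized baseline."""
    if fingerprint(src_dir) != AUTHORIZED_BASELINE:
        # A real pipeline might rerun automated control checks here, or
        # queue another penetration test, per the authorizing official.
        print("Baseline drift detected: trigger reassessment")
    else:
        print("System matches its authorized baseline")

monitor("app")
```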

The reason the Air Force is joining the ranks of agencies finding a better, faster approach to the ATO process is frustration over how long it takes to get new capabilities to warfighters.

The military services and DoD agencies too often view the risk management framework as a compliance exercise, meaning a completed checklist offers no sufficient evidence that any one system is actually secure.

“The RMF process was taking too long based on the workload everyone was having and we wanted to go back to something that was more operational relevant,” Konieczny said. “The focus now is looking at real risk and operational risk. We looked at compliance risk before and everything was focused on compliance, which was good. But I can be a very devious programmer and I can get through the compliance issues without any problems, but I can still have an operational hole in my system. This is a way to fix that operational hole.”

OMB ATO streamlining strategy expected soon?

The Office of Management and Budget and others have recognized over the years that the ATO process was broken. Back in 2017, OMB said it was running a pilot program to consider other approaches to shorten the ATO life cycle and may potentially look at a “phased ATO.”

It’s unclear what happened to those pilots around a phased approach to an ATO, as OMB never publicly discussed the results or findings.

The attempt to fix the ATO process has been an ongoing project for OMB.

If you go back to the 2013 annual FISMA guidance, OMB told agencies they had four years to get to continuous monitoring of systems, which would change the ATO process from an infrequent event to one that happens every time there is a change to the system.

Improving the ATO process, specifically for cloud services, is one of the objectives under the President’s Management Agenda’s IT modernization cross-agency priority goal.

“OMB and GSA are also developing a process to better incorporate agile methodologies into the ATO process, providing a more flexible approach for federal agencies and cloud service providers,” the December 2018 update says.

Additionally, OMB, DHS and GSA say they have issued “a draft strategic plan for streamlining ATO processes, to include vision for future of FedRAMP and rollout of activities,” and sometime in early 2019, they expect to issue a final strategic plan.

OMB hasn’t offered any update on its progress to revamp the ATO process, but back in October, Margie Graves, the deputy federal CIO, offered this insight: “If we can get to the point where we are doing continuous authorization through automated controls and automated use of data, then suddenly all the authority to operate (ATO) paperwork and approach becomes totally different. There is more veracity and more accurate because it’s based on data in the environment. That’s where we are going.”

The sooner OMB can provide guidance around shortening the time it takes to achieve an ATO, the more consistent an approach agencies can take, instead of the one-off processes that are quickly emerging.


RPA more than a passing fad, just look at the data

If you want to measure the impact of specific technologies on federal agencies, one good way is with acquisition data.

And the General Services Administration’s Alliant 2 governmentwide acquisition contract (GWAC) is as good a barometer as any, given that the agency developed the procurement vehicle to make sure agencies have access to the latest, greatest technology available.

Bill Zielinski, the acting assistant commissioner for the Office of Information Technology Category (ITC) in GSA’s Federal Acquisition Service, wrote in a March 29 blog post that agencies issued 978 “unique, leading-edge technology projects valued at or above $1[million] per project” through Alliant 2 in fiscal 2018. Of the 978 projects, cybersecurity (128 projects), big data (119 projects) and virtual networking (114 projects) were most popular.

While the popularity of these topics is far from surprising, what stood out from GSA’s data is how many agencies sought autonomic computing — otherwise known as robotic process automation (RPA) — and the often related artificial intelligence contracts.

Zielinski said Alliant 2 saw 72 task orders for autonomic computing and 61 for AI. He didn’t say how much agencies spent through these task orders or whether they were for one month or one year.

But the fact that RPA and AI made the top 10 goes to show both the popularity and widespread acceptance of these technologies.

Gartner estimated in November that global spending on robotic process automation software would reach $680 million in 2018, an increase of 57% year over year. The research firm says RPA software spending is on pace to total $2.4 billion in 2022.

Deloitte’s Center for Government Insights said in 2017 that RPA could save agencies as much as $41.1 billion over the next seven years.

Numbers, however, don’t tell the entire story. Another way to measure the impact of technology on agencies is through the anecdotes executives tell in how they are using these technologies to improve and reduce the cost of back-office functions.

GSA, NASA and more piloting RPA

And you can’t shake a plate on the rubber chicken circuit without RPA coming up during a panel discussion.

From GSA’s own chief financial officer’s office using RPA to reduce more than 13,000 hours of unnecessary or duplicative work to NASA’s well-known use of robotics to reduce the manual processing of paperwork around grants, nearly every agency is jumping on the RPA bandwagon. And over the past two decades, it’s hard to remember a technology that caught on so quickly and has had such an impact as robotics.

“For RPA, it’s how can we save money and make the processes better,” said Marisa Schmader, the assistant commissioner for fiscal accounting in the Office of Financial Innovation and Transformation in the Bureau of Fiscal Service in the Treasury Department, at the recent Association of Government Accountants financial systems summit. “We’ve implemented things that you may not see or experience. Things that are repetitive like resetting passwords or sending reminders about passwords. It’s the things that have a lot of rigor around them that we are transitioning to a bot. Customers will have no way to tell, but it’s saving us money.”

Think about what Schmader said for a second — save money and make the processes better. Those simple concepts have been the promise of technology since Wang and Texas Instruments first put computers on federal employees’ desks.

Over the years there have been a lot of promises, but few technologies have delivered real results as quickly as RPA.

GSA’s CFO Gerrard Badorrek said through the use of bots, he believes the agency can eliminate well over 50,000 annualized hours of unnecessary work.

Schmader said the fiscal service is applying bots to a financial statement reporting tool, removing the need to process 300-plus reports in Excel.
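
For a flavor of what such a bot actually does, here is a minimal sketch of that kind of consolidation chore; the folder name, file layout and validation rule are hypothetical, not the fiscal service’s actual process:

```python
# Illustrative RPA-style task: consolidate a folder of Excel reports
# into a single output file, the kind of repetitive, rules-based work
# described above. Paths and layout are hypothetical.

import pathlib
import pandas as pd

REPORT_DIR = pathlib.Path("monthly_reports")

def consolidate_reports(report_dir):
    """Read every Excel report and stack them into one table."""
    frames = [pd.read_excel(path) for path in sorted(report_dir.glob("*.xlsx"))]
    combined = pd.concat(frames, ignore_index=True)
    # A simple validation rule the bot enforces on every run
    if combined.isnull().all(axis=1).any():
        raise ValueError("blank rows found; flag for human review")
    return combined

consolidate_reports(REPORT_DIR).to_excel("consolidated_statement.xlsx", index=False)
```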

Others have taken fast notice of the early adopters at GSA and Treasury.

HUD could save 60,000 hours

Bill Apgar, the branch chief of the Interior Business Center’s financial management directorate at the Department of the Interior, said his office started a RPA pilot in early March with Deloitte.

“We identified process areas where we can use bots, including client invoicing and trial balance and reporting. These are high reporting and low complexity work,” he said at the AGA summit. “We are looking to expand in other accounting opportunities.”

Over at the Department of Housing and Urban Development, Irv Dennis, the agency’s CFO, said he has identified 50,000 to 60,000 hours that can be converted to robotics.

“We have a lot of manual processes and they are ideal candidates for RPA. It’s not expensive, they are easy to use and easy to implement,” Dennis said at the AGA summit. “We had processes at HUD, like grant accruals, which require 2,200 man hours to do that over six months. We put a RPA around it and brought that 2,200 hours down to 65 hours in just three weeks.”

Dennis emphasized that the move to RPA is not about getting rid of jobs or employees, but moving people to high quality jobs, such as data analytics.

“Once you understand capability of RPA, it can be helpful on the compliance side too,” he said.

And this is the other reason why RPA is gaining so much attention. Initial fears over “robotics taking our jobs” went away quickly once agencies started to fully understand how bots work.

That understanding, for many agencies, remains in the nascent stage. To that end, GSA launched an RPA community of practice on April 18.

“With the advancements in emerging technology, it’s important for the federal government to capitalize on technological solutions in order to obtain the benefits of cost-effectively automating manual, repetitive and rule-based operations. Many agencies are currently piloting RPA or already have bots in production, but so much more can be learned, accomplished, and shared with the collective efforts of industry and government,” Ed Burrows, GSA’s RPA program manager in the CFO’s office, wrote in a blog post. “By creating a RPA CoP, the federal government can reduce duplication and streamline efforts to implement RPA across government to help advance agency missions today, and into the future. The GSA Office of the CFO will leverage the existing Technology Transformation Service CoP management capabilities and expertise to lead the RPA CoP. The CoP will mobilize federal RPA leaders to share information, define technical options and outline best practices for implementation in order to accelerate operational achievements and the benefits of RPA.”

GSA’s CFO office and TTS will co-lead the community of practice.

Federal CIO Suzette Kent also said last year she expects to issue guidance to help agencies manage RPA software and to lay out potential security and identity requirements for the bots.

“There are limitless opportunities for us as shared service provider to use bots to gain more effectiveness and efficiencies,” Schmader said.


Is the CIA’s new cloud procurement a signal to DoD to update JEDI?

The CIA created quite a stir in the federal IT community as word spread over the last week that it’s ready to upgrade its commercial cloud offering called Commercial Cloud Services (C2S).

As the industry day documents spread like wildfire across industry and the media, the question we have to ask is: Is the CIA, and the intelligence community more generally, trying to give the Defense Department some top cover for its controversial and protest-entangled Joint Enterprise Defense Initiative (JEDI) cloud procurement?

When you review the CIA’s market research survey as well as its industry day presentation, everything about it seems to be saying “Hey DoD, we have seen the light and multi-cloud, multi-vendor is the only way to go.”

The intel agency said in its market research survey that it “will acquire foundational cloud services, as defined in the scope section below, from multiple vendors.”

In industry day documents, the CIA said that the Commercial Cloud Enterprise’s (C2E) program objective is to “acquire cloud computing services directly from commercial cloud service providers…”

The CIA said it plans to award one or more indefinite delivery, indefinite quantity type contracts.

Industry experts said the message couldn’t be any clearer to DoD and its plans for JEDI.

Trey Hodgkins, president and CEO of Hodgkins Consulting, said the CIA’s C2E puts the conversation around DoD’s JEDI on a different trajectory.

“C2E puts the conversation on a different trajectory. It puts out there that the IC has identified new needs so the prudent person would go back and ask the question, ‘if they need hybrid, on premise and commercial cloud, does that change the thinking at DoD?’” Hodgkins said. “I don’t think there is any visibility into DoD’s thought process, but you’d have to think they are asking the same question at the department.”

DoD currently is conducting an internal review of JEDI after a bid protest from Oracle highlighted a potential conflict of interest. Additionally, DoD and JEDI are facing a potential FBI investigation.

Sam Gordy, the general manager of IBM federal, said the CIA strategy with C2E should not only inform DoD, but influence the Pentagon’s plans going forward.

“These [C2E and JEDI] are diametrically opposed approaches. Clearly the CIA has five-to-six years of experience in a single cloud environment and they are making a strategic decision to wholeheartedly move into multi cloud world. It’s a critical next step for the evolution of IT support for the IC,” Gordy said in an interview with Federal News Network. “DoD should take advantage of those five-to-six years of experience in the IC and the national security community to inform what they are doing going forward.”

Gordy said the CIA is taking the approach that the private sector has moved to over the last few years. He added that unlike JEDI, the CIA is making it clear why the multi-cloud approach is necessary because they are saying in the industry day documents and the market survey what they want to use the cloud for today and in the future.

Under phase 1, the CIA said it wants vendors to provide infrastructure-, platform- and software-as-a-service capabilities as well as support services.

Source: CIA industry day presentation from March 22, 2019.

“Knowing they have an enterprisewide cloud contract already and that they are using that capability, this tells me they need hybrid, on-premise and commercial solution and this creates a mechanism to do that,” Hodgkins said. “I didn’t see anything shocking or that caught me off guard. The CIA has clearly spelled out to the industrial base what they need, and one of them is to deliver some or all of the three types of cloud, and when they put their data into those clouds, it must be portable so they can move it to another cloud or somewhere else. Those are the two elements that are different than what they have now, and ones that you haven’t seen it called out in previous acquisitions, at least not at this level.”

CIA needs cloud diversity, data portability

John Weiler, the executive director of the IT Acquisition Advisory Council and an outspoken critic of JEDI, said the CIA’s approach for C2E is a recognition that the current C2S contract isn’t working as the agency expected.

“If it had worked, they would’ve just re-signed with Amazon Web Services,” Weiler said. “One cloud can’t solve all your problems. When you look at workloads on Oracle or legacy Microsoft platforms, it makes no sense to move them to Amazon or Google or IBM. Those clouds are not designed for those environments. For these strategies to be effective, they have to acknowledge that certain legacy platforms can move to a specific cloud and not just to any cloud.”

Industry experts said there is a growing desire inside the intelligence community for something more than C2S.

One industry source, who requested anonymity in order to talk about the inner workings of the IC, said there have been varying degrees of unhappiness with the Amazon contract, including at least two IC agencies rejecting the C2S cloud and building their own.

Another industry source said that in many ways C2S was a long-term pilot, and the CIA and others in the IC now recognize that they weren’t happy with the price they were getting for cloud services, that interoperability was more difficult than first imagined, especially between C2S and existing data centers, and that they were limited in their ability to add new features in a timely manner.

“They’ve had time to see what works and what doesn’t, and they’ve realized cloud providers are becoming specialized. It’s easier to move workloads from on-premise to the cloud with the same vendor. They realized migrations can be expensive,” the source said. “The CIA realized that cloud diversity and price competition help bring down costs. The industry and the CIA weren’t in a position to do that six years ago, but now they are, which is good.”

The first industry source added the IC had real concerns about vendor lock-in and how hard it was to move data between cloud infrastructures.

“I’ve heard a lot that people going into Amazon didn’t expect to have the level of lock-in that they have. Once they migrated data to Amazon, it became much more difficult to lift and shift to, say, a Microsoft cloud, because the systems were configured in a way that was only good for the Amazon cloud,” the source said.
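The CIA’s documents don’t say how vendors should deliver that portability, and nothing below comes from them, but the pattern the sources are describing is well known in industry: keep application code behind a provider-neutral storage interface so data isn’t wired to a single cloud’s SDK. Here is a minimal, hypothetical sketch in Python (the bucket handling is illustrative; boto3 and google-cloud-storage are the standard AWS and Google client libraries):

    # Provider-neutral object storage: callers depend on ObjectStore, not on
    # any one vendor's SDK, so objects can be copied between clouds without
    # rewriting application code. Purely illustrative; not from the CIA docs.
    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class S3Store(ObjectStore):
        def __init__(self, bucket: str):
            import boto3  # AWS SDK for Python
            self._s3 = boto3.client("s3")
            self._bucket = bucket

        def put(self, key: str, data: bytes) -> None:
            self._s3.put_object(Bucket=self._bucket, Key=key, Body=data)

        def get(self, key: str) -> bytes:
            return self._s3.get_object(Bucket=self._bucket, Key=key)["Body"].read()

    class GCSStore(ObjectStore):
        def __init__(self, bucket: str):
            from google.cloud import storage  # Google Cloud client library
            self._bucket = storage.Client().bucket(bucket)

        def put(self, key: str, data: bytes) -> None:
            self._bucket.blob(key).upload_from_string(data)

        def get(self, key: str) -> bytes:
            return self._bucket.blob(key).download_as_bytes()

    def migrate(src: ObjectStore, dst: ObjectStore, keys: list) -> None:
        # The lift and shift the sources describe: copy objects through the
        # neutral interface instead of through vendor-specific tooling.
        for key in keys:
            dst.put(key, src.get(key))

Systems built this way still run on one cloud at a time, but the data path isn’t “configured in a way that was only good for the Amazon cloud,” which is exactly the trap the source describes.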

Implementation of cloud services is key

A third industry source was even more blunt about the C2S contract:

“AWS has relentlessly leveraged C2S since its inception, proclaiming to federal agencies that there was only one cloud service provider good enough for the CIA, so they needn’t look further. But like a handsy, insecure boyfriend, it seems like AWS held the CIA a little too close, proudly boasting about their exclusive relationship while competing suitors flexed their innovation muscles,” the source said. “Not surprisingly, since the relationship first began, the CIA has noticed it has options and doesn’t need to commit. So while it’s understandable AWS wants to put a ring on it, the agency would clearly rather stay friends and play the field.”

An AWS spokesman said the company is excited about C2E and the CIA’s intent to build on the existing C2S efforts.

“As a customer-obsessed organization, we’re focused on driving innovation that supports the mission and spurs solutions that allow for missions to be performed better, faster, and in a more secure manner,” the spokesman said.

Weiler said no matter the strategy that the CIA or DoD chooses, the key is the implementation. He said nearly every agency needs to address legacy systems and the consistent challenge of cloud migration.

IBM’s Gordy said C2S shouldn’t be considered a failure by any means as it greatly helped inform the CIA’s current strategy.

“This does sync up with a recompete on C2S, but I don’t think C2E is in any way a replacement for C2S,” he said. “The CIA will probably continue to have the need for a broad business application cloud, which is what C2S is being used for today. And then they will need to have a mission-oriented cloud, which is the reason they are going to C2E, which seems to be for the optimization of those mission workloads.”


Increasing threats against mobile devices force HHS, others to rethink protections

The first time the intelligence community issued a public warning to government and industry executives traveling overseas came before the 2008 Summer Olympics in Beijing.

Joel Brenner, then the head of U.S. counterintelligence in the Office of the Director of National Intelligence and a former National Security Agency inspector general, said taking your phone, laptop or other device to China was dangerous and would likely end with lost data and the real possibility of having your home network compromised.

“We suggested they take stripped down devices, if you are taking a device at all,” Brenner said in a recent interview with Federal News Network. “That advice was widely adopted by many companies as well as the government. I think it’s good, but tough advice to follow.”

Now, 11 years after that initial warning, the Department of Health and Human Services is taking it a step further. While most agencies prohibit executives from taking devices to countries like China or Russia, HHS is not letting officials take any device with government information overseas, no matter the country.

HHS Chief Information Security Officer Janet Vogel issued a memo in December addressing the increased level of risk and the need to safeguard government furnished equipment (GFE) while on foreign travel.

“Two key components of the memo are that while abroad, HHS employees must use loaner GFEs containing no sensitive information. Employees are also required to connect to secure, password-protected Wi-Fi, as well as a virtual private network (VPN) when accessing HHS resources with their loaner GFE,” Vogel told FNN in an email. “Increasing the strictness of our GFE procedure for travel was necessary to minimize the risk of increasing and new security threats. HHS has a global presence and often has representatives deployed around the world for reasons such as health conferences, responses to pandemics, etc. This approach to GFE use helps to ensure that the assets and data that travel around the globe are appropriately protected. By requiring HHS employees to use loaner GFE that do not contain sensitive information, the damage resulting from a cybersecurity incident would be lessened. Additionally, requiring secure Wi-Fi combined with a VPN, makes exploitation of GFE more difficult. Limiting the amount of exploitable information on a device, as well as decreasing the chance for such an exploitation, is an effective method of risk reduction for HHS.”

HHS detailed six basic rules to follow:

  1. Only encrypted loaner GFE devices are allowed on foreign travel.
  2. Devices received from foreign nationals/governments (i.e., conferences, gifts, etc.) and devices purchased while on travel may not be used to conduct HHS business.
  3. Secure remote access via a virtual private network (VPN) is required.
  4. No sensitive data (e.g., personally identifiable information [PII], protected health information [PHI], HHS intellectual property, etc.) are permitted on loaner GFE, unless the devices are encrypted.
  5. All GFE devices used while on foreign travel must remain powered off during travel to and from foreign countries, be segregated from HHS networks/systems, and be submitted to the IT help desk immediately upon return for evaluation and sanitization.
  6. All devices must be sanitized upon return and before re-use.

This means whether an HHS executive goes to China or Germany or Canada, the device and information on it are considered at-risk.
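The memo, as described, leaves enforcement of the VPN rule to procedure, and HHS has not published any tooling for it. But a pre-flight check of that kind is simple to script on the loaner itself. Here is a minimal sketch in Python, assuming a Linux loaner device where the VPN presents as a tun, wg or ppp network interface; those prefixes and the psutil library are illustrative assumptions, not anything from the HHS memo:

    # Illustrative pre-flight check for rule 3: refuse to proceed unless a
    # VPN tunnel interface is up. The interface-name prefixes are assumptions
    # for a typical Linux loaner, not requirements from the HHS memo.
    import sys

    import psutil  # third-party library: pip install psutil

    VPN_PREFIXES = ("tun", "wg", "ppp")  # common VPN tunnel interface names

    def vpn_is_up() -> bool:
        # net_if_stats() maps interface name -> stats, including an isup flag.
        return any(
            name.startswith(VPN_PREFIXES) and stats.isup
            for name, stats in psutil.net_if_stats().items()
        )

    if __name__ == "__main__":
        if not vpn_is_up():
            sys.exit("No active VPN tunnel found. Connect the VPN before "
                     "accessing HHS resources.")
        print("VPN tunnel detected; proceeding.")

A check like this doesn’t prove the traffic actually routes through the tunnel, so a production version would also inspect the routing table, but it shows how a rule like No. 3 can be made machine-enforceable rather than left to memory.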

HHS is ahead of the curve

One federal cyber executive, who requested anonymity in order to speak about their agency’s security requirements, said the HHS policy is one of the strictest in government.

“HHS is ahead of the curve, and that’s a good thing because it is dealing with it in a prioritized manner,” the official said. “People who are traveling, at any agency, are not low level, and they have a lot of other important things to be worrying about, so giving them a new device makes it easier for them not to have to worry as much about the security, especially with the cost of technology continuing to come down.”

The federal cyber executive added that in some ways HHS is solving a people problem with technology instead of the other way around.

“People are lazy. It’s as simple as that, and if it gets complicated people don’t want to deal with it. This is why a technology-first approach makes sense,” the executive said.

Brenner, who now teaches at the Massachusetts Institute of Technology and runs his own consulting and law practice, said it’s more than people being lazy; it’s a lack of understanding, especially among executives.

“They don’t want to deal with the aggravation and having to take special steps before they go and when they get back,” he said.

Agencies are beginning to recognize the need to better secure mobile devices. Symantec reported in 2018 that new mobile malware types jumped 54 percent from 2016 to 2017.

Vincent Sritapan, a portfolio manager in the physical and cybersecurity division of the Office of Mission and Capability Support in the Department of Homeland Security’s Science and Technology Directorate, said agencies have focused for a long time on mobile device management (MDM) software to protect their devices. But that is only one piece of the bigger puzzle.

“It has to be an MDM and other technology that enables security, whether that’s identity management or mobile application vetting or a mobile threat defense solution,” he said. “When you are talking about mobile phones, we are still maturing as an enterprise, as is the entire market. What controls and capabilities do we need on a mobile phone to secure it? We recognize it poses a broader threat landscape and a broader attack surface.”

NIST updating mobile security standards

Sritapan said DHS recently completed its fifth review under the government cybersecurity architecture review (GovCAR) initiative, this one looking only at agencies’ mobile infrastructures.

“The review team identified that if an agency employs multiple mobile security technologies, including application vetting and identity and access management, it has a greater security posture against mobile attacks,” he said. “They looked at the attackers’ process and desire to move laterally based on mobile attacks. They are able to identify, if agencies employ certain tools, what their security posture looks like, and when agencies employ a combination of more mobile security tools, they are able to mitigate adversary actions and limit their ability to attack us.”

Jon Johnson, the former director of the enterprise mobility program at the General Services Administration and now a director at Redhorse Corp., said agencies have had standards from the National Institute of Standards and Technology for several years for their mobile devices. He said NIST Special Publication 800-124, work by DHS S&T and others have increased awareness, and now it’s just a matter of agencies understanding their risk postures.

Sritapan said NIST and others in government are updating SP 800-124, and the draft revision should be out for public comment in the next few months.

“We are looking at things like leveraging the National Information Assurance Partnership (NIAP) protection profiles, and talking about picking a device that has been trusted and secured,” he said. “We have rechartered and renamed the federal mobility services category management team and mobile security tiger team to be one federal mobility group. It includes 45 agencies and departments to help move us all toward a better security posture.”

Sritapan said that while adding more technology and standards is helpful, it comes back to the user.

And that takes us full circle to HHS.

Vogel, the HHS CISO, said since cyber threats cross all borders, more needs to be done.

“Cybersecurity threats exist outside of the United States, and United States citizens, especially government employees, are often targeted while traveling abroad. Employees are not allowed to connect to HHS systems or networks using unsecured networks — from internet cafes, coffee shops, etc. — regardless of whether they are in the United States or abroad,” she said. “That said, the United States has strong cybersecurity protections, while safeguards in other countries may not be as robust. Requiring employees to connect to secure, password-protected networks and use a VPN helps strengthen our cybersecurity posture and combat potential threats.”


Bid protest win continues to show fragility of multiple-award contracts

Right now, 81 small businesses are wondering why.

Why has their ticket to a potential $15 billion lottery been lost?

Why, after waiting a year to begin marketing and promoting task orders through the Alliant 2 small business contract, might they have to be even more patient and wait potentially another 12 months?

And why is another multiple-award small business contract mired in a bid protest?

These, and probably a host of eye rolls, sighs of frustration and shakes of the head, came fast and furious last week when the General Services Administration announced it was rescinding all 81 awards made in February 2018 under the Alliant 2 Small Business governmentwide acquisition contract (GWAC).

And it left one small business thinking, “We told you so.”

GSA withdrew the awards after the Court of Federal Claims ruled in favor of Citizant in its protest of being excluded from Alliant 2 SB awards.

The judge found GSA erred in evaluating proposals, specifically around having a qualified cost accounting system and price reasonableness.

“The court presumes that Citizant was prejudiced because the record reflects multiple instances of the contracting officer evaluating proposals in an arbitrary, capricious, or irrational manner,” the court states. “Simply stated, the court finds that Citizant has shown that it had a substantial chance of receiving a contract if the contracting officer did not make the aforementioned errors.”

The judge told GSA to re-evaluate all bidders to address the errors Citizant pointed out.

GSA made the initial Alliant 2 awards in 2017 for the unrestricted track and in February 2018 for the small business track.

Procurement experts say that while GSA doesn’t necessarily have to start over, the re-evaluation could take six months, and then the procurement would need another six months to get through the expected protests.

“The problem here is multilayered. It goes back to the issue of GSA’s self-scoring system and this whole idea of trying to make it easier for agencies to go through the proposal process and take the next step in the procurement,” said Tony Franco, a partner with the law firm PilieroMazza, which specializes in small business procurements. “The reason why GSA has to go back and fix this is because it looks like the agency messed up on the front end with regard to that first step of the evaluation process, self-scoring. It resulted in a number of contractors being thrown into the equation that maybe should’ve been disqualified earlier.”
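For readers unfamiliar with the mechanism Franco describes: under self-scoring, offerors claim points against a published table and attach substantiating documents, and the contracting officer is supposed to verify every claim against the same standard. The categories and point values below are invented for illustration (the real Alliant 2 small business tables are far larger), but this sketch shows where uneven verification, the kind of error the court found, creeps in:

    # Hypothetical self-scoring evaluation. Categories and point values are
    # invented for illustration; the real Alliant 2 SB tables are different.
    SCORING_TABLE = {
        "relevant_experience_project": 500,      # points per verified project
        "approved_cost_accounting_system": 1000,
        "top_secret_facility_clearance": 750,
    }

    def score_proposal(claimed: dict, verified: dict) -> int:
        # Award points only for claims the evaluator could substantiate.
        # The court's concern was exactly this step: every offeror's claims
        # must be checked against the same standard.
        total = 0
        for category, points in SCORING_TABLE.items():
            total += points * min(claimed.get(category, 0),
                                  verified.get(category, 0))
        return total

    # An offeror claims two relevant projects, but only one checks out.
    print(score_proposal(
        claimed={"relevant_experience_project": 2,
                 "approved_cost_accounting_system": 1},
        verified={"relevant_experience_project": 1,
                  "approved_cost_accounting_system": 1},
    ))  # prints 1500

If one evaluator applies the substantiation check and another takes claims at face value, two identical proposals come out with different scores, which is the kind of inconsistency the Court of Federal Claims flagged.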

Another federal procurement attorney familiar with the case, who requested anonymity because of the sensitive nature of the proceedings, said the judge expected GSA to hold everyone to the same requirements, and during the discovery part of the case, it became clear the contracting officer didn’t do that.

“I can’t imagine GSA will re-evaluate all 500-plus proposals,” the attorney said. “I think GSA will redo the self-scoring checklist, and they may just throw out those companies that shouldn’t have been qualified in the first place. And that could cause more protests. This is the song that doesn’t have an end. That’s the problem with large procurements, they are so important and valuable to vendors that they are willing to protest.”

A spokesman for Citizant declined to comment on the judge’s decision.

Alliant has faced more than 40 protests

Alliant 2 SB remains under protest even with the Citizant decision.

Three more cases from RX Joint Venture LLC, TISTA Science and Technology Corp. and Metrica Team Venture are before the appeals court.

So far over the last three years, the Alliant GWAC process has faced more than 40 protests.

“Whenever agencies try to create these multiple-award contracts with so many different companies, it will be very hard for them to treat everyone consistently the way they are supposed to,” Franco said. “With complicated proposals and solicitations, and multiple offerors, procurement shops with limited resources struggle, and it will inevitably lead to protests like this, where you can always find some flaw in the procurement.”

Franco said as GSA and other agencies continue to develop these large multiple award contracts, agencies will create problems that these types of contracts were trying to avoid in the first place.

“This makes me question whether agencies should be using these MACs with so many offerors. Wouldn’t it make more sense to issue separate solicitations or go through the schedules?” he said. “Why create these complicated procurements that, at the end of the day, are designed to make the source selection process easier downstream, when on the front end you may spend years figuring out who the right contractors are? There is so much potential for fallibility when you have humans involved, and issues fall through the cracks.”

Foreshadowing problems for other MACs?

The Alliant 2 small business experience is the perfect precursor to what is likely to happen to several procurements that are just getting off the ground.

GSA and the Air Force’s 2GIT multiple-award contract, with a ceiling of $5.5 billion, is just getting started and could face a pre-award protest right off the bat. Industry sources say vendors are concerned about violations of the Small Business Jobs Act of 2010.

Then there is GSA’s Centers of Excellence Discovery blanket purchase agreement, which has entered the second phase of the acquisition process. Last week, the Federal Acquisition Service posted seven challenge questions for each of the areas with a due date of April 1.

In no more than 1,500 words, FAS wants vendors to outline their approach to determine where things stand now, the path forward for implementation and how they will ensure modernization efforts continue beyond implementation.

Both of these procurements, as well as the others coming over to the GSA schedules as blanket purchase agreements, including those MACs from the FBI and the Homeland Security Department, have the strong potential to face protests from unsuccessful bidders. And like the Alliant 2 small business GWAC, it’s fair to ask whether all the time and resources that go into these contracts are worth it. Maybe it’s time to think of another way, like having Congress modernize the GSA schedules so the need to create BPAs on top of the schedules, or standalone GWACs, goes away. That would be a huge step toward getting agencies and vendors alike off this protest merry-go-round.

