Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Could updated controls from NIST drive up cloud security costs?

Subscribe to Federal Drive’s daily audio interviews on iTunes or PodcastOne

Among the biggest complaints about the cloud security program known as the Federal Risk and Authorization Management Program (FedRAMP) have been the cost to vendors and the time it takes to get approved.

The FedRAMP program management office has tried to address both over the last few years, most recently introducing the Tailored program for low-impact, software-as-a-service offerings last month.

But now the program management office is concerned that many of those advances could be at risk with the updated security controls from the National Institute of Standards and Technology.

In its public comments about NIST Special Publication 800-53, Revision 5, FedRAMP said the move from Revision 4 to Revision 5 could cost millions of dollars across the cloud service providers, third-party certifiers and the federal Joint Authorization Board (JAB) to update the approved cloud services and related standards.

“We wanted to understand the financial impacts of this update for both government and vendors in order to understand what sort of return on investment implementing these changes would provide,” said Matt Goodrich, FedRAMP program manager, in an email to Federal News Radio. “Our cost estimates are based on our experience with the transition from Revision 3 to Revision 4, and were a high-level estimate based on documentation updates alone and not any costs associated with implementing and assessing new security requirements.”

NIST released Revision 5 in August and comments were due Sept. 12. NIST says it expects to issue a final draft in October and the final version of 800-53 Revision 5 by December.

But it’s more than just cost — a recent report by Coalfire found the average cloud service provider (CSP) spent between $350,000 and $865,000 to get FedRAMP certified — that worries the program office.

Goodrich detailed in the comments three areas of concern, and made two recommendations to NIST.

A NIST spokeswoman said the agency doesn’t comment on another agency’s comments about a draft publication.

One area is around the priority of the new controls versus updated controls.

“Are all new and updated controls considered to have the same positive impacts on security and should be treated equally? Based on FedRAMP’s review, this is not the case,” the comments stated.

The second area is around what NIST calls “new security concepts.” FedRAMP stated these concepts need to be defined more clearly, and that NIST should better explain why they are important to Revision 5.

A third concern is around the security objectives for each new or updated control. FedRAMP stated NIST should “clearly articulate the intent of each control so alternate implementations and mitigating controls can be properly analyzed.”

“Without understanding what the objectives are, how are agencies and vendors going to be sure they meet the intent of security controls if not meeting a security control explicitly as stated? Without the objectives, it is difficult to understand if alternate implementations or mitigating security controls are sufficient,” FedRAMP stated in its comments.

NIST also hasn’t discussed the test cases for the new controls, which vendor experts say are key to implementing any revision.

“They put out test cases for 800-53A under Rev 3, but didn’t update them for Rev 4. Since Rev 5 is a big change, I would expect NIST is looking at test cases. And they need to because I think moving to Rev 5 will need a long on-ramp,” said Maria Horton, CEO of EmeSec, a third-party assessment organization (3PAO), in an interview with Federal News Radio. “FedRAMP, 3PAOs, the PMO and the cloud service providers will have to evaluate the test cases to see how they are implemented. There will be some challenges because the changes to Rev 5 reflect the new digital economy.”

Horton said FedRAMP, 3PAOs and CSPs will need to architect, design and adopt the new controls for cloud services.

“I would recommend to NIST and FedRAMP give certified-CSPs and anyone FedRAMP-ready a year beyond when they settle on the test cases to allow for investment and adaptation to the digital economy, and privacy and security requirements,” she said.

Horton and other executives at 3PAOs say the security and privacy changes in Rev 5 are the most significant updates to the controls.

NIST says among the changes it’s proposing are to:

  • Make the security and privacy controls more outcome-based by changing the structure of the controls;
  • Fully integrate the privacy controls into the security control catalog, creating a consolidated and unified set of controls for information systems and organizations, while providing summary and mapping tables for privacy-related controls; and
  • Separate the control selection process from the actual controls, thus allowing the controls to be used by different communities of interest, including systems engineers, software developers, enterprise architects and mission/business owners.

Doug Barbin, a principal and cybersecurity leader for Schellman and Company, a 3PAO, said in an interview with Federal News Radio that while privacy was always a part of Rev 4 and previous revisions, Rev 5 brings in more of the generally accepted privacy requirements, policies and guidelines for information sharing.

“Every single area of the control set around data or information has privacy controls,” Barbin said. “For cloud providers, it gets interesting and starts going toward the delivery model such as infrastructure- or software-as-a-service, which may be data agnostic so some of those controls may get pushed down to the agency to implement.”

Barbin said for SaaS, both the cloud provider and the agency customer must consider what data is being collected and which users are potentially interacting with that information, especially if it’s personally identifiable information (PII).

“This will take time to roll into FedRAMP because you will have to update all the core templates for authorization,” he said. “We also will have to do control tailoring, and come up with the criteria for testing and analysis that the 3PAOs will perform.”

Abel Sussman, director of the cyber risk advisory group for Coalfire, said the integration of the privacy controls with the rest of the security controls is the biggest change.

But Sussman said he believes about 40 percent of the controls at the moderate level will need to be changed.

“I don’t think any of the new controls will require a major uplift,” he said. “It’s about documenting how things are already implemented with appropriate tweaks.”

Sussman added that for cloud service providers, the Rev 5 changes also mean ensuring the security changes are ingrained in the training, planning and reporting functions.

“As organizations develop good risk compliance programs, they will be able to meet many of these controls,” he said. “One of the things that came up is the wording from JAB. They want security controls that are outcome-based. It means there is more direct language of what is expected. For example, multi-factor authentication for privileged accounts in Rev 5 versus Rev 4 is more direct, and that may be confusing because CSPs may not be sure what outcome-based means. This is where test cases could help.”

There seems to be a lot of uncertainty around what Rev 5 will mean for CSPs, 3PAOs and agencies, and that’s why the test cases will be critical.

“Technology Transformation Services (TTS) has a spirit of transparency, and our FedRAMP team spends a lot of time engaging with our industry partners to better understand their concerns and thoughts on major programmatic updates. The comments note common questions we’ve heard from our stakeholders related to this update,” Goodrich said. “FedRAMP continues to have a thoughtful dialogue with NIST and OMB about the issues raised in our comments and how to best address the concerns noted.”



Number of small business prime contractors down by 25 percent since 2010

Federal agencies met the governmentwide small business goal for the fourth straight year in fiscal 2016. Last year also saw a $9 billion increase in the total value of prime contracts going to small firms.

What these numbers aren’t showing, and what should really worry the Small Business Administration and the broader contracting community, is that the number of small businesses winning prime contracts is dramatically down.

Deltek, the market research firm, analyzed the data and found a 25 percent decrease in the number of small business prime contractors since 2010.

“There are fewer small businesses that are engaging as a prime and that is the same pattern across large businesses,” said Kevin Plexico, the vice president of information solutions at Deltek, during the FedFocus 2018 event in Vienna, Virginia on Oct. 10. “The dollars are holding steady but the number of prime participants is declining.”

The government’s focus on total dollars going to small businesses may be shortsighted. The number of businesses in the marketplace is just as important as the total amount of contracts small firms are winning.

“The Deltek data is interesting, but I’m not surprised to see the analysis. It certainly is a concern across the federal government with respect to what’s happening in the market space and the impact of consolidation, strategic sourcing and category management, and this is somewhat a result of that,” said John Shoraka, managing director of PilieroMazza Advisory Services, LLC, a consulting firm focused on small businesses, and a former associate administrator of Government Contracting and Business Development at SBA during the Obama administration. “There’s fewer requirements out there. There is a lot of consolidation of requirements because of strategic sourcing and the category management and the drive to efficiency.”

Shoraka added that the pressure on the agency contracting officials to use more governmentwide vehicles and multiple award contracts also limits the number of opportunities for small businesses.

Steve Koprince, a managing partner of Koprince Law, which represents small businesses, echoed Shoraka, saying he wasn’t surprised by the shrinking marketplace or the impact of strategic sourcing and category management.

Concerns about the impact of category management on small firms aren’t new, but the data seems to confirm many of those worries.

The General Services Administration is holding each of the managing partners for category management accountable for meeting small business goals.

But the goals are focused on total dollars, and not total dollars and participation.

That’s why Congress got involved through the 2016 Defense authorization bill, requiring the SBA scorecard to measure the participation rate of small businesses in contracting.

Shoraka said this new requirement is a direct pushback against the consolidation happening across government and the fact that agencies still were meeting their goals.

At the same time, Koprince said that because of the upturn in the economy, commercial firms that had tried government contracting self-selected out because they no longer needed the public sector market.

Other drivers, Koprince said, are the changes SBA made in 2015 around the mentor-protégé program and the rules governing joint ventures and teaming.

“SBA loosened up the requirements for joint ventures and prime-sub teaming, and the upshot of those changes are coming at the same time as you see these contract efficiencies,” he said. “SBA made it easier for small businesses to team together as well as making joint ventures easier for small businesses to team with large businesses through the mentor-protégé program. So now, no matter how large a mentor is, they can team with a small business and pursue a set-aside. There is both a greater interest by small businesses and there’s a realization that if my competitors are doing this, so should I.”

SBA’s change to the joint venture rules only requires both companies to be below the size standard threshold individually instead of adding both companies’ revenues together and then determining if the joint venture is below the threshold.

Koprince said he’s working on joint venture or teaming agreements almost every day.

Shoraka added the changes to the joint venture and teaming rules are another reaction to the strategic sourcing, category management and overall move to larger, more complex requirements across the government.

“For existing government contractors, the landscape is changing,” Shoraka said. “When they used to go after a set-aside contract, their competitors used to be other small businesses. But now with the joint ventures, the competition is tougher because of changes in rules.”

Shoraka and Koprince also pointed out that mergers and acquisitions reduced the number of small businesses eligible for prime contracts.

Further data from Deltek seems to support this as well.

Large and small contractors alike aren’t seeing the opportunity to grow by winning new work, so buying competitors or buying into a sector, such as defense, cyber or homeland security, is the next best way to increase revenue.

Deltek said contract spending is estimated to be about $473 billion in 2017, down from $479 billion in 2016. But the company expects agencies to spend more in 2018, particularly the Defense Department.

Plexico said the data shows that 2018 will see the second highest value of potential opportunities in the last 13 years, with professional services accounting for about $150 billion, or half of all opportunities Deltek is tracking.

He said Defense opportunities represent 71 percent of the total contract value available, and across the government, 91 percent of all opportunities are considered follow-on contracts.

Shoraka said one way to stem the tide of small firms leaving the prime contracting arena is for large multiple award contracts to offer on-ramps and off-ramps more often. He said this way growing small firms don’t have to wait until there is a recompete on a contract like Alliant small business or OASIS small business to earn their hunting license.

Koprince said despite the 25 percent decrease of small business primes since 2010, there is plenty of competition in the market.

“In general, when I ask my clients, they say they may be seeing the same usual suspects, but that probably has always been the case in a specific niche,” he said. “There are plenty of acquisitions where there are 12 bids and tons of interest and it’s very competitive at a macro level.”

That’s good news for the government, but it’s something SBA and the Office of Federal Procurement Policy should pay closer attention to as the long tail of category management and strategic sourcing continues to grow.



DoD CIO sets baseline for mobile app security

About six years ago, as the Obama administration was launching the cloud security program known as the Federal Risk and Authorization Management Program (FedRAMP), Tom Suder, a mobile government expert and consultant, suggested the government might need a similar process for mobile applications.

Well, the Defense Department may have just taken a major step toward establishing a baseline security standard for mission-critical mobile apps.

John Zangardi, the acting DoD chief information officer, signed a memo Oct. 6 outlining a new process for securing mobile apps that sets a baseline standard, promotes reciprocity across the military and clarifies which apps need to go through this new approach.

“For the Department of Defense, mobility has been increasingly vital to fulfilling its mission from digital flight bags to logistical support,” said Suder, president of Apcerto, which provides a mobile application security platform. “This memo codifies security to an appropriately high level. I suspect civilian agencies would start to follow the DoD’s lead on this mandatory National Information Assurance Partnership (NIAP) certification policy.”

Zangardi instructed the services and DoD agencies to use the NIAP profile, “Requirements for Vetting Mobile Applications from the Protection Profile for Application Software.”

“The NIAP developed the baseline set of security requirements for organizations engaged in locally evaluating mobile applications,” the memo states. “These requirements are achievable, testable, and repeatable and provide a basis for technical evaluation and risk determination by Authorization Officials (AOs).”

And that’s the key here, achievable, testable and repeatable — just like FedRAMP.

Additionally, DoD is following the FedRAMP model by creating the pieces and parts to make this process work.

Zangardi said among the things the DoD CIO will take on are creating a mobile application portal and providing guidance and direction in the development of the DoD Mobile Application Evaluation templates.

The National Security Agency will continually update and evaluate the NIAP risk profile.

The Defense Information Systems Agency (DISA) will provide the heavy lift in this new process.

Zangardi tasked DISA with developing the template and creating the portal within 90 days. DISA also will update the applicable Security Technical Implementation Guides (STIG) to ensure alignment with the NIAP profile.

Each of the DoD services and agencies also will be responsible for evaluating apps, reviewing the mobile app portal and commercial apps stores prior to developing, buying or evaluating new software, and for user training of potential security threats.

The DoD memo follows closely the 2015 recommendations made by the Federal CIO Council’s mobile tiger team to use the NIAP profile as the governmentwide standard for vetting mobile apps. NIAP and the National Institute of Standards and Technology also have been working together to ensure the profile is closely aligned to Special Publication 800-163, Vetting the Security of Mobile Applications.

Chris Gorman, the chief operating officer of Monkton, a mobile application development firm, said the memo brings some better practicality into the mobile environment.

“If you are using Uber or ESPN, or anything that is not mission related and doesn’t have any sensitive content, then put the risk framework around the app at a reasonable level and it doesn’t require a lot of DoD resources or funding to secure,” Gorman said in an interview with Federal News Radio. “The apps that are for the mission or are line-of-business related, DoD is saying that is where they want to spend their time on. Whether it’s a commercial app like Adobe or Salesforce, or a government app, DoD is saying, let’s make sure those are secure because that is where the sensitive data that will persist at rest or transmitted to the government data center will live.”

Gorman said the memo also gives vendors a place to start from as they develop apps for DoD. He said previously there was no common starting place and that slowed down the development and acceptance of mobile apps across DoD.

“The memo goes a long way to give common guidance so no one is reinventing the wheel when it comes to using a risk management framework. The NIAP is the baseline, and if you don’t give a common baseline, then reciprocity doesn’t have a place to live,” he said. “Now all of DoD will be vetting to the same requirements, and now you will know what to do instead of waiting on the authorizing official to make a decision of what is secure enough.”

Gorman said the memo clearly states the authorizing official still makes the final determination of risk, but the fact that the portal will have the artifacts to start with helps a great deal.

There are a couple of issues the memo doesn’t address or go far enough in detailing.

Gorman said there is no mention of derived credentials in terms of validating and asserting the authenticity of the user who needs to access sensitive data via apps.

Additionally, he said creating a culture of trust will take time. The templates and portal are good starting points — similar to how FedRAMP increased its acceptance.

“I’m optimistic that the civilian agencies also will go down this path. There are just too many reasons that they should, rather than just why they shouldn’t,” Gorman said. “If you look at what the Homeland Security Department’s CTO’s Office and the Science and Technology Directorate have been doing with the Carwash program and other efforts, it got everyone thinking about how to get this initial capability out there and secure the apps.”



How one contractor belittled the White House’s IT modernization strategy


The White House is busily reviewing more than 90 comments on its draft IT modernization strategy.

The comments came from industry associations, specific companies and individuals, including federal employees. Most were pretty vanilla, offering basic support for the initiatives in the draft strategy, along with insights both general and specific to the organization’s or vendor’s area of expertise.

But none was more fascinating than the flames Oracle decided to throw at the entire IT modernization effort over the last nine years.

Kenneth Glueck, the senior vice president in the Office of the CEO for Oracle, wrote a 13-page takedown of many of the Obama administration’s key technology efforts.

“We respectfully suggest the government has not gone far enough in articulating a plan that will result in significant change and instead seems to be driving the government in the opposite direction,” Oracle stated. “Many of the report’s recommendations and current modernization efforts seem out of sync with the best technology practices deployed in a Fortune 50 company today.”

Oracle laid out three false narratives that it said are driving the modernization efforts in the wrong direction.

First, Oracle said the government thinks it should act more like a start-up.

Second, agencies believe they need in-house development expertise, such as the General Services Administration’s 18F and the U.S. Digital Service at the White House.

Third, the mandate to use open source is required so the software is available to the taxpayer.

“These false narratives have led to a series of actions that is unquestionably holding the [government] back from modernizing its IT, some of which are contained in the report, but all of which are being deployed across government, to the bewilderment of many in the private sector,” Oracle wrote.

Open source and customization

Oracle outlined nine broad problems with the current IT modernization efforts.

You can read them here, but let me highlight a few that stood out.

“The largest contributor to cost and complexity is customization, yet actions of the [government] and the report seem to embrace both government developed bespoke technology and customization,” Oracle wrote.

This is where Oracle goes after 18F and USDS for promoting the writing of code instead of seeking to “leverage and scale by engineering out labor costs, including process engineering.”

Oracle also claims the push for open source is coming from 18F and USDS.

“The actions of 18F and USDS plainly promote open source solutions and then propagate those mandates across government with the implicit endorsement of the White House. The [government’s] enthusiasm for open source software is wholly inconsistent with the use of open source software in the private sector,” Oracle stated.

Instead, the company said open source should be competed against proprietary software for what works best for the functions desired.

“There is no math that can justify open source from a cost perspective as the cost of support plus the opportunity cost of forgoing features, functions, automation and security overwhelm any presumed cost savings,” Oracle stated. “Developing custom software and then releasing that code under an open source license puts the government at unnecessary security risk as that code is not ‘maintained by a community,’ but is rather assessed and exploited by adversaries. Further, this practice puts the government — most likely in violation of the law — in direct competition with U.S. technology companies, who are now forced to compete against the unlimited resources of the U.S. taxpayer.”

But this is where Oracle’s argument begins to fall apart. The Office of Management and Budget issued a policy in August 2016 requiring agencies that develop any new, custom source code to make it available for other departments to access and use. But this wasn’t the first time OMB encouraged the use of open source.

It started during the administration of President George W. Bush. OMB referenced the use of open source in a 2004 memo reminding agencies how to license this type of software, and the Defense Department issued open source policies in 2003 and again in 2009.

The resulting Code.gov portal from the 2016 memo includes thousands of examples from 27 agencies, and if sharing open source code saves money, what’s the harm? There still is no mandate to use the code, but OMB wants agencies to look at the portal first before developing new code or buying it from vendors.

Concerns about Login.gov

Oracle goes even further to take on 18F and USDS, claiming that initiatives to bring on software engineers and other experts from the private sector “resulted in the predictable outcome of creating favoritism for those vendors’ solutions, and seems to replace presumed technical expertise with the more complex task of procuring, implementing, maintaining and securing systems over the long term.”

Another flaming arrow takes a shot at the Login.gov platform. Oracle called 18F’s attempt to build a single sign-on capability for federal services “misdirected security resources,” which will leave citizens without a modern approach to identity management.

While vendor frustration with 18F and USDS isn’t new, the case Oracle makes is a whole new level of angst.

The argument against 18F and USDS also is misplaced. There were a lot of problems with USDS and 18F, but the two organizations slowly are overcoming their initial challenges.

Oracle said agencies shouldn’t be coding or hiring technology experts. That logic is flawed as well.

Just look at the work former Social Security Administration CIO Rob Klopp did in turning around that agency. He relied on a stable of federal employees who learned advanced coding languages, and supplemented those skills with a host of contractor support.

It’s not just at SSA, the move toward digital services expertise relies heavily on contractor support at agencies such as the departments of Defense, Veterans Affairs and Homeland Security, as well as the Environmental Protection Agency.

Oracle does have it right to be concerned about Login.gov. The government has failed three other times to create a single-sign on capability, and it’s unclear if the current approach will find success.

Oracle also makes a dozen or so recommendations, including modernizing the cloud security standards process known as the Federal Risk and Authorization Management Program (FedRAMP), and focusing cyber efforts at both the data level and at the perimeter.

While the impact of Oracle’s comments is unclear, the company has benefitted from the IT modernization effort it so criticized.

According to USASpending.gov, from 2012 to 2016, Oracle, directly and through its resellers, won more than $4 billion from federal contracts, including tens of millions of dollars from those same agencies that have led the use of digital services, agile development and many of the things that Oracle seems to think don’t work in government.

We can admire Oracle for being an outspoken critic and probably saying many things that other vendors were too scared to say, but the question comes back to why now and why so publicly?

Along with Oracle’s comments, here are a few others that were interesting or out of the ordinary:

• Avue Technologies pushed back against the government’s “monopolies” in the shared services area. Avue, which provides human resources services in the cloud, wrote that payroll services, for example, are 20-30 years behind and insecure. “As with all monopolies, inertia ruled the day and the government never modernized the IT infrastructure, architecture, cybersecurity or systems used by the government’s SSCs. In addition, the inefficiencies baked into these systems required adherence to outdated business processes which drove costs up, not down, and constrained rational changes in policy that would move the government into a 21st century workforce and talent management framework,” Avue wrote. “After eliminating the private sector from competing, the government eliminated any mechanism of accountability for federal service suppliers and shared service centers to achieve the results that underlie the theory of shared service efficiencies. The dramatic cost increases and concurrent productivity declines are the result of this lack of accountability.”

• Google makes a big push for better cloud security, saying agencies need to get out of the perimeter-based cloud security mindset. “A perimeter-centric security mindset can translate into prescriptive controls and compliance requirements that prevent government from accessing the best of commercial cloud security. While timely, comprehensive patching stands out as one of the key security advantages of cloud services, recertification requirements can risk muting that advantage by serving as a de facto gate to deploying patches and new security features,” Google stated. Google pitched the need to recognize and put international standards into practice more often. “Given the constant advances in and evolutionary nature of the cloud security model, the federal government should consider ways to harmonize its standards with those defined by internationally recognized security standards organizations to enable agencies to benefit from commercial capabilities (including in security) at a faster pace. Where agencies find those standards to fall short, they should engage in a dialog with commercial providers to better understand whether their security model and practices meet the desired security outcomes. The adoption of these standards would reduce this disparity and increase the availability of commercial services to the federal government,” Google wrote.

• Salesforce made a strong push for the “as-a-service” model as part of the move to shared services. The company says everything from capital planning and investment control-as-a-service, to acquisition-as-a-service, to change-as-a-service would reduce red tape and duplicative approaches. “Collectively, this approach presents a repeatable approach to ensure that no other modernization project will have to live through overcoming previously solved problems on their own,” Salesforce wrote.

• Adobe is encouraging the White House to not just accelerate the continuous diagnostics and mitigation program, but move to phase four immediately. Under phase four, DHS would provide data protection tools, such as encryption and digital rights management. Additionally, Adobe brought up the lack of any mention of citizen services in the draft strategy. “On balance, the report’s recommendations include networks, security controls and improved contracting. But these are tactics on a road toward digital modernization strategy. In contrast, a strategic focus for improving government begins with tackling the citizen and government customer experience. Ensuring a concurrent focus—or equally prioritized emphasis—on modern digital experiences achieves an even greater outcome of reduced operating costs, increased performance, and better advocacy from the electorate, as well as the hardworking personnel who execute the business of government,” Adobe stated.


DHS, GPO get ‘new’ IT executives; VA losing long-time acquisition leader

Barry West came back to government after the election to provide some experience and insights to the Homeland Security Department’s then-new chief information officer.

West is staying a bit longer now.

DHS acting CIO Stephen Rice announced to staff on Oct. 4 that West will be continuing on as senior adviser under a limited-term Senior Executive Service appointment and is the new acting deputy CIO.

West’s limited-term appointment starts Oct. 15, Rice wrote in the email obtained by Federal News Radio.

A limited-term SES appointment is nonrenewable, can last up to three years and must be to an SES General position whose duties will expire because of the nature of the work, according to the Office of Personnel Management.

In his email, Rice didn’t say how long West’s appointment would last.

West initially came to DHS to support former CIO Richard Staropoli, who resigned in August after only four months on the job.

The decision to extend West’s time at DHS isn’t surprising. West is replacing Rice on an interim basis. Since coming to headquarters from the Transportation Security Administration in June, Rice had been acting deputy CIO.

This will be the sixth agency where West has served as an IT executive. Previously, he was CIO at the Federal Deposit Insurance Corporation, the Pension Benefit Guaranty Corporation, the Department of Commerce, FEMA and the National Weather Service.

Additionally, he was the president of the Mason Harriman Group, a management consulting company.

Along with DHS, the Government Publishing Office is staying with a familiar face for its CIO.

Tracee Boxley is now the permanent CIO after serving in an acting capacity since November. GPO Director Davita Vance-Cooks made the announcement in a release on Oct. 5.

“Tracee has provided great leadership and a steady hand to our IT department during the last 11 months and I am proud to name her our new CIO,” Vance-Cooks said. “Tracee’s IT background and knowledge of GPO will provide leadership to this critical position, as the agency continues to meet the ever-changing technology requirements of Congress, federal agencies and the public.”

Boxley has been with GPO since 2006 and was promoted to deputy CIO in 2012.

Before coming to GPO, Boxley was chief of the American Housing Survey Division at the Census Bureau, and deputy CIO and chief of the Technical Services Division at the Food and Nutrition Service (FNS).

Another federal IT executive changed jobs earlier this year, and it may not have made it on many people’s radar. In fact, another federal CIO just found out about it recently so I figured it’s a good time to catch up.

Jack Wilmer, the former vice director for the Defense Information Systems Agency’s development and business center, joined the Office of Science and Technology Policy (OSTP) on detail back in April.

Wilmer is working on cybersecurity and IT modernization efforts. Wilmer’s detail is for one year with potential to extend it another year.

He joined DISA in 2010 from the private sector where he worked on a variety of network and enterprise services.

The National Institute of Standards and Technology has a new director. The Senate confirmed Dr. Walter Copan on Oct. 5.

Copan, who also holds the title of undersecretary of Commerce for Standards and Technology, comes to NIST after serving in the academic, non-profit and private sectors during his career.

Most recently, Copan was the president and CEO of the IP Engineering Group Corporation, which provides services in intellectual property strategy, technology commercialization and innovation. Until June 2017, he was founding CEO and chairman of Impact Engineered Wood Corp., an advanced materials technology company.

Copan earned dual B.S./B.A. degrees in chemistry and music from Case Western Reserve University, and then went on to get his Ph.D. in physical chemistry from Case Western.

VA’s Giddens to retire

It’s not all promotions and job changes. Two agencies are looking for new executives.

Greg Giddens, the Veterans Affairs Department’s acting director of the Office of Enterprise Integration, is retiring at the end of November.

In an email to staff obtained by Federal News Radio, Giddens said he announced his decision a little earlier than normal to give VA time to begin filling his former role as principal executive director of the Office of Acquisition, Logistics and Construction. Giddens has been on detail to OEI since April, where he has helped lead VA’s modernization and reform efforts.

“Between now and the end of November, I will ‘run through the tape.’ I look forward to continuing working together with you as we continue to improve the veteran experience, improve the employee experience, and improve our stewardship of taxpayer dollars,” Giddens wrote.

Giddens told my colleague Nicole Ogrysko in September that he led the effort to survey thousands of VA employees over the last few months. One major theme centered on the role of VA headquarters and how it makes decisions that impact VA medical facilities, cemeteries and benefits offices in the field.

Giddens is leaving federal service after 37 years, having started while still an engineering student finishing his degree at Georgia Tech.

During his career, Giddens worked at a host of agencies including the departments of Defense, Transportation and Homeland Security.

“I could not think of any better jobs than working in OALC and the modernization office to be my last two points of federal service,” he wrote.

Giddens came to VA in 2010 and was named the principal executive director of the Office of Acquisition, Logistics and Construction in 2015.

During his tenure, Giddens aimed to improve and modernize VA’s acquisition efforts, which had come under intense scrutiny for construction failures.

Along with VA, the U.S. Mint is searching for a new chief information officer.

The Mint posted a job description on the USAJobs.gov website on Oct. 5.

Lauren Buschor has been the CIO at the Mint since 2014.

An email and phone call to the Mint asking details were not returned, and a LinkedIn message and an email to Buschor were not returned either.

DeAnna Wynn is the deputy CIO for the Mint and served as acting CIO from July to November 2013.


Don’t get caught in the ‘hype cycle’ of blockchain

The hottest new federal buzzword is blockchain. You can’t go to a conference without a session on, or someone talking about, this emerging technology.

But as one vendor, who cornered me recently after a session on blockchain, asked, “Is any agency actually doing anything around blockchain or are they just talking about it?”

That’s a fair question and one I didn’t have an answer for.

It’s clear from the Gartner hype cycle that blockchain is past the “peak of inflated expectations,” and heading into the “trough of disillusionment.”

But before we all jump on the bandwagon and begin holding “blockchain conferences” or create an “Office of Blockchain,” or head in the other direction and decide blockchain was the SEAT management of the 2000s or QR codes of the 2010s, let’s step back and understand some of the potential of blockchain.

First, let’s define it.

MIT Sloan assistant professor Christian Catalini, an expert in blockchain technologies and cryptocurrency, said in a recent MIT Sloan Management Review article that blockchain technology lets computers agree on the current and ongoing status of a distributed ledger.

“Such ledgers can contain different types of shared data, such as transaction records, attributes of transactions, credentials, or other pieces of information,” Catalini said in the article. “The ledger is often secured through a clever mix of cryptography and game theory, and does not require trusted nodes like traditional networks. This is what allows bitcoin to transfer value across the globe without resorting to traditional intermediaries such as banks.”

The reason why blockchain is popular is the growth and acceptance of bitcoin and other digital currencies that use the underlying technology, but also because companies and technologists see the security benefits.

Catalini said in the article “the ledger is distributed across many participants in the network — it doesn’t exist in one place. Instead, copies exist and are simultaneously updated with every fully participating node in the ecosystem. A block could represent transactions and data of many types — currency, digital rights, intellectual property, identity or property titles, to name a few.”
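Catalini’s description of a ledger secured by cryptography rather than by trusted intermediaries can be illustrated with a toy hash-linked chain. This is a simplified sketch in Python (no consensus protocol, no distribution across nodes), and every name in it is invented:

```python
import hashlib
import json
import time

def make_block(data, prev_hash):
    """Create a block whose hash covers its data and the previous block's hash."""
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    """Recompute every hash; tampering with any past record breaks the links."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a tiny ledger of transaction records.
chain = [make_block({"tx": "genesis"}, prev_hash="0" * 64)]
chain.append(make_block({"tx": "transfer A->B"}, chain[-1]["hash"]))
chain.append(make_block({"tx": "transfer B->C"}, chain[-1]["hash"]))
assert verify_chain(chain)

# Quietly altering an earlier record invalidates the whole chain.
chain[1]["data"]["tx"] = "transfer A->C"
assert not verify_chain(chain)
```

Changing any historical record changes its hash and breaks every later link, which is the property that lets participants check the ledger themselves instead of relying on a trusted intermediary.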

Agencies are starting to get excited about the potential uses for blockchain.

Justin Herman, who leads the General Services Administration’s Emerging Citizen Technology program office in the Technology Transformation Service, said at the recent 930Gov event in Washington that more and more agencies are seeking his office’s help to test out blockchain, and they are bringing big money to the table.

“Blockchain has up-ended everything. It has shifted in tenor from ‘you should check this out,’ to ‘you really have to check this out’ to ‘we need GSA to take a leadership role in this and build a governmentwide roadmap,’” Herman said. “Every week at least two agencies are coming to the table, and it’s only increasing, ready to invest in blockchain. Not just pilot programs at $50,000 a click, but talking about programs at $4 million a click for this. This is happening right now.”

He said there is so much interest in blockchain that it’s taking up a majority of his day in the emerging technologies office.

The goal with blockchain is to make sure your records are trusted and traceable, and that’s why there is a lot of excitement around the technology.

Herman said GSA has received more than 300 potential use cases for blockchain from federal agencies.

One of the agencies that wants to know more about this technology is the Treasury Department’s Bureau of the Fiscal Service, which has set up four pilots to test new or emerging technologies, including robotics and blockchain.

John Hill, the assistant commissioner in the Office of Financial Innovation and Transformation, said his office will kick off a blockchain pilot this year focused on physical asset inventory control.

“We are starting very slowly. What we are doing is building a prototype for tracking Blackberrys within a small organizational unit so you can constantly reconcile the master inventory with the actual, physical inventory at any point in time,” Hill said at the 2017 Shared Service Summit sponsored by the Association of Government Accountants and the Shared Services Coalition. “We hope this pilot will prove that the blockchain will produce real value, eliminate all these independent records and simplify the whole process and keep the actual inventory and the master inventory in sync in real time. If that works, and it works well, then I think it’s an easy intellectual or conceptual leap to the next one, which is if you can keep track of something physical like a Blackberry, then perhaps an intergovernmental difference could be managed or perhaps something of more value that is not physical, but in fact is virtual.”
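The reconciliation idea at the heart of Hill’s pilot, an append-only record that keeps the actual and master inventories comparable at any point in time, can be sketched without the blockchain machinery itself. The device IDs, event types and function name below are invented for illustration:

```python
master_inventory = {"BB-001", "BB-002", "BB-003"}  # the "master" list on the books

# Append-only ledger of custody events, newest last.
ledger = [
    {"device": "BB-001", "event": "issued"},
    {"device": "BB-002", "event": "issued"},
    {"device": "BB-003", "event": "issued"},
    {"device": "BB-002", "event": "returned"},
    {"device": "BB-002", "event": "retired"},
]

def current_inventory(events):
    """Replay the ledger: a device is in inventory unless its last event is 'retired'."""
    last_event = {}
    for e in events:
        last_event[e["device"]] = e["event"]
    return {dev for dev, ev in last_event.items() if ev != "retired"}

actual = current_inventory(ledger)
discrepancies = master_inventory - actual  # on the master list but not in the field
unrecorded = actual - master_inventory     # in the field but not on the master list

print(sorted(discrepancies))  # ['BB-002'], retired but never removed from the master list
```

Because every change is an appended event rather than an edit in place, the two inventories can be reconciled by replaying the log at any moment, which is the “in sync in real time” property Hill describes.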

Hill said he expects a couple of hundred people to take part in the pilot, which is expected to last three to nine months.

There still are a lot of questions about blockchain, including whether or not it can scale to more than 1 million users.

Herman said there are several other obstacles to getting blockchain into the mainstream federal environment.

He said one common issue is not understanding the use case for blockchain. As a rule of thumb, blockchain adds little value unless there is an inherent lack of trust in the process.

Herman said blockchain usually is part of a process, not an end unto itself. There are questions about interoperability and vendor lock-in, and there are the usual governance and funding issues that have to be addressed before implementing.

But Herman and others say the potential of blockchain is real, and GSA is trying to help the adoption go smoothly.

“The sober way of looking at [blockchain] is analyzing your problem and then articulating it in such a way that potentially a solution could be blockchain or smart automation,” Herman said. “We are working with about 200 federal managers on blockchain initiatives. Financial management, procurement, supply chain management, smart contracts, records, workforce data, government issued credentials, it’s not a matter of if, it’s happening right now. There are needs and opportunities at the table to help shape this.”


Exclusive

OPM names new CIO just as it begins initiative to modernize electronic personnel file


The Office of Personnel Management has a new chief information officer. Multiple sources confirmed that David Garcia, the former CIO for the state of Maryland, is starting at the agency today in the same technology executive role.

Garcia replaces Dave DeVries, who retired in September and is now the CIO for the state of Michigan. OPM made short work of finding a new CIO, bringing Garcia in as a political appointee.

Garcia, who left as the Maryland CIO in January after almost two years on the job, spent eight years in the federal market, including three as the chief of the telecommunications and network management divisions for the Army Center for Health Promotion and Preventive Medicine. He also worked for the Army as a signals intelligence analyst and as a consultant for Keane Federal Systems.


OPM’s decision to appoint Garcia comes as it’s collecting information from vendors on a new initiative to modernize the electronic personnel file.

The Sept. 15 request for information is one of the first attempts by an agency to test the theory behind the Modernizing Government Technology (MGT) Act.

OPM wants to use the savings from turning off the legacy systems to pay for future modernization efforts.

“In order to improve and enhance the federal government human capital experience and to make human capital data more effective and efficient for data-driven decision-making, OPM seeks to establish a complete, accurate and secure Employee Digital Record (EDR) that will contain all relevant employee data for the entire human capital lifecycle,” the RFI stated. “The existing environment ultimately limits the government’s ability to effectively understand the federal workforce landscape, inform strategic policy and decisions and to provide to agencies the tools and services that foster timely, data-driven decisions. Moreover, it is currently impossible to construct and exchange a single, machine-readable employee digital record throughout the federal employee’s career.”

The current environment is a hodgepodge that pulls data from 19 different systems and includes both structured and unstructured data. To construct the current electronic personnel file, OPM must address data inaccuracies, a lack of standards across the information and business processes, and an “immense amount” of manual interaction and intervention.
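To make the RFI’s goal concrete, here is a hedged sketch of assembling one machine-readable employee record from several source systems while flagging fields on which the systems disagree. The system names, fields and values are all invented; the RFI does not specify a schema:

```python
import json

# Fragments of one employee's data as held by three hypothetical systems.
sources = {
    "payroll":   {"employee_id": "E-1001", "grade": "GS-13", "name": "J. Doe"},
    "benefits":  {"employee_id": "E-1001", "retirement_plan": "FERS"},
    "personnel": {"employee_id": "E-1001", "name": "Jane Doe", "start_date": "2009-06-15"},
}

def build_record(fragments):
    """Merge source fragments into one record, noting conflicting values."""
    record, conflicts = {}, {}
    for system, fields in fragments.items():
        for field, value in fields.items():
            if field in record and record[field] != value:
                conflicts.setdefault(field, {})[system] = value
            else:
                record[field] = value
    return record, conflicts

record, conflicts = build_record(sources)
print(json.dumps(record, indent=2))
print("conflicts:", conflicts)  # here, two systems disagree on "name"
```

Even this toy merge surfaces the data-inaccuracy and standards problems the RFI describes: before a single record can exist, someone has to decide which system is authoritative for each conflicting field.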

OPM is asking vendors to consider all options to modernize the electronic file, including the use of share-in-savings, open source and plug-and-play technology.

OPM is holding an industry day in Washington, D.C. on Oct. 5, and responses to the RFI are due Oct. 12.

Chris Cairns, a founder of the 18F organization at the General Services Administration and now a partner with Skylight Digital, said he applauds OPM’s initiative because it’s addressing a problematic system with modern thinking.

“The big thing is to try to avoid a big-bang approach, in which nothing new is delivered and savings aren’t realized until the date the new system is turned on and the old systems are turned off,” Cairns said. “It’s important to architect a modernization path for these legacy systems that start to deliver modernized functionality/infrastructure incrementally and start to generate incremental savings.”

The fact that OPM is considering a share-in-savings approach is a huge step toward the vision of the MGT Act, which sets up working capital funds that let agencies repurpose savings from turning off legacy systems.

“This is the time to talk about share-in-savings,” said Frank McNally, the director of learning and content development for the Public Spend Forum, an organization working to share best practices around public-sector procurement. “It’s wise for them to hear from industry about how share-in-savings would work, since there are a lot of open questions, including under what authority to do it, how to handle up-front termination liabilities and other related things. What OPM is saying is they need to lower their initial investment and they want to see if vendors are willing to take that ride.”

The share-in-savings concept is interesting for several reasons. First off, with OPM’s IT budget request of $37 million for infrastructure improvements likely to get cut in half in fiscal 2018, funding a huge project that involves millions of federal employees would be difficult.

Second, the vendor would have a captive audience, meaning fluctuations in the user base would be minimal, as would the revenue stream agencies pay for the service.

Cairns said share-in-savings is allowed under the Federal Acquisition Regulation, and OPM, which already has a revolving fund, could take advantage of this approach.

“I think for the share-in-savings model to be a viable option here, several things need to happen. First, this is a completely new contracting approach that requires a multi-disciplinary approach. OPM should start small with a few pilot projects, working on small modernization components and learn from those,” he said. “Vendors probably won’t be willing to front 100 percent of the investment, so these initial projects will probably need to be a hybrid. For example, 75 percent of the effort funded by OPM, and the other 25 percent funded by the vendor, with hard caps on how much reward the vendor can reap. As OPM and the vendor community get better and better at these, they scale up the use of share-in-savings to cover a broader scope of modernization and shift more and more of the investment burden on vendors. You also can’t punish smaller companies who don’t have the financial capacity to invest, which is why the hybrid model could make sense.”
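Cairns’ hybrid model can be put into rough numbers. The sketch below assumes an illustrative $10 million effort, a 40 percent vendor share of realized savings and a hard cap of twice the vendor’s stake; only the 75/25 funding split comes from his description, and every dollar figure is invented:

```python
project_cost  = 10_000_000            # total modernization effort (assumed)
vendor_stake  = 0.25 * project_cost   # vendor fronts 25 percent
agency_funded = 0.75 * project_cost   # agency funds 75 percent

annual_savings = 3_000_000            # from retiring legacy systems (assumed)
vendor_share   = 0.40                 # vendor's cut of savings (assumed)
reward_cap     = 2 * vendor_stake     # hard cap on the vendor's total reward

def vendor_payout(years):
    """Cumulative payment to the vendor out of savings, subject to the cap."""
    return min(annual_savings * vendor_share * years, reward_cap)

for years in (1, 3, 5):
    print(f"{years} yrs: ${vendor_payout(years):,.0f}")
# The cap binds by year five: 5 x $1.2M = $6M exceeds the $5M ceiling.
```

The cap is what keeps the vendor’s upside bounded while both sides learn the model, which is why Cairns pairs it with the hybrid funding split.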

If OPM knows the baseline costs for the electronic personnel file today, then figuring out how much savings could come from a modernized system would open the door to this approach even more, McNally said.

Garcia should know this isn’t the first time OPM tried to modernize the electronic personnel file. The eOPF was part of the Enterprise HR Integration project under the George W. Bush administration’s e-government initiatives starting in 2003. Then in 2009, the HR Line of Business put out a report recommending the integration of these disparate systems and data. But over the last decade, progress has been limited, and the current mishmash of systems is growing older and more expensive to maintain and secure.

Additionally, Garcia should know that OPM has failed numerous times to modernize its retirement systems, which face similar challenges as the electronic personnel file, and will undoubtedly be another major priority.

Cairns said a major initiative such as this one requires a technology vision, strong leadership from the top, and maybe most importantly, the culture change management to bring the rest of the government into the modern world.

“From a delivery standpoint, I don’t think OPM can succeed without a highly technical staff in place who are government employees. I mean all-star designers, software engineers, data engineers, security engineers, etc. who are well-versed in the application of modern practices and technologies, and can help oversee and facilitate the integration of technical work products from multiple vendors,” he said.

OPM has a host of systems in desperate need of updating, so let’s hope the new CIO and the RFI are the first steps in this long-overdue journey.



Rep. Hurd has a message to CFOs, others: Keep your hands off of IT modernization funds

With the Modernizing Government Technology (MGT) Act seemingly on a clear path to passage in the Defense authorization bill, the federal community is now asking two simple questions — how will the Trump administration implement the law? And what if chief financial officers and deputy secretaries don’t play nicely and use the money from the IT savings for other priorities?

While there is no simple answer to either of these questions, both Matt Lira, the special assistant to the president for innovation and policy initiatives, who works with the Office of American Innovation, and Rep. Will Hurd (R-Texas), the author of the MGT Act, are trying to lay the groundwork.

Let’s start with the second question first: What if CFOs or deputy secretaries don’t let agency chief information officers use all or some of the money realized by moving off of old systems to new ones?

Hurd said the MGT Act addresses that concern in the legislation.

“The legislation says that money can only be used for additional modernization. That’s why we needed that legislation to protect that money,” Hurd said, after speaking at the George Washington University Center for Cyber and Homeland Security event in Washington, D.C. on Sept. 29. “[The CFOs and deputy secretaries] would not have that ability unless they came back to Congress and Congress approved it.”

Hurd added that while there are 100 different scenarios for how the MGT Act may not work, the goal is to get it in place and address the challenges as they come up.

“Yes, it’s a concern, but that’s the point of the legislation and that’s why we spent so much time with appropriators, that’s why the appropriators have agreed to this and that’s why the appropriators’ insight into this legislation was so important,” he said.

And Hurd said he plans to add MGT Act implementation to the Federal IT Acquisition Reform Act (FITARA) scorecard.

“The FITARA scorecard is evolving to a digital hygiene scorecard. We will start keeping track of the working capital fund for modernization and whether you are taking advantage of it or not. If so, then there is a culture of modernization,” he said. “Some agencies will be able to take advantage of it and others will not. That is why having a working capital fund at each agency and a central fund made the most sense. We should have 26 different experiments on how to modernize federal IT infrastructure.”

Hurd said it’s clear that some CIOs will be more equipped than others to take advantage of the working capital fund.

“The Office of Management and Budget and OAI have been really intimately involved in the MGT Act process. They have ideas on how they want to implement it,” he said. “My biggest fear is CIOs are not prepared as soon as this goes into law to take advantage of it. That is where many folks can be helpful to some of these federal CIOs, to be put in a position to take advantage of MGT Act.”

That leads us to the effort by OAI and OMB to prepare for MGT.

The administration collected 93 comments on its draft IT modernization plan.

“It will be an interagency process to review the comments, and after the final update is cleared, it goes to the president for his review and approval,” Lira said in an interview after speaking at the Fedstival event, sponsored by Government Executive, in Washington on Sept. 19. “Anything that is short term will start to happen very quickly. The final report should speak to that in more detail as well.”

Lira said the biggest challenge is the sequencing of all the initiatives to ensure that they all lead to bigger and more impactful improvements over time, instead of just one-off wins.

Lira said every agency CIO faces different challenges to modernizing their technology, but the one common theme that comes out in meetings is for OMB and OAI to remove policy blockers that are out of date.

“These kinds of things have a real detrimental impact on the ability to use modern systems if they have out-of-date regulations governing their systems,” he said. “I know Margie Graves, the acting federal CIO, and many others are really proactively thinking about how to be air support for IT modernization and drive that along.”

OMB got rid of the first 59 of those outdated IT regulations in June.

The final strategy also will address one item that raised eyebrows in the draft report. In the initial version, OAI described a cloud email and collaboration pilot by calling out four vendors as “suggested industry partners” — Google, Salesforce, Amazon and Microsoft.

Lira said he doesn’t “anticipate” the final version will call out specific vendors to take part in pilots.

Another potential change that could come in the final IT strategy is more of a focus on improving citizen services.

The draft strategy included no mention of citizen services.

But Lira said at the TechTrends conference, sponsored by the Professional Services Council, that OAI’s top priority was improving citizen services.

“The actual experiences with government are wildly inconsistent with their expectations from even mid-level private-sector companies,” he said. “Everyone says it has to be as good as the top-level apps, and that would be great, but they aren’t even as good as mid-level apps. That drives public frustration and introduces inefficiencies, and are insecure.”

Lira said OAI will continue to bring stakeholders together to make IT modernization happen. He said the office is trying to create relationships between CIOs, CFOs and others, and address specific issues.

“We hope to set some sort of strategic visions collectively with all of those people as well,” he said. “One of the great assets of the last few years, both from the U.S. Digital Service, 18F and from an industry perspective, there has been a lot of efforts by a lot of actors to do some modernization, and some have succeeded and some have failed, and some are in progress. But what they all have done, regardless of what category they fall into, they have learned what the barriers are very intimately, whether it’s a legislative barrier, a culture barrier or a communication barrier. There is a lot of detail that may not have existed before, so we can get all these people together and figure out what is statutory and what is not. We’ve spent a lot of time together figuring out what is statutory, what is OMB guidance and what is agency interpretation. I think this will be an ongoing project to get us there.”



GSA decision to debar contractor is overruled after judge calls it ‘arbitrary and capricious’

The former government contracting executive who was busted for hacking into a competitor’s non-federal network is back.

Ariel Friedler, the founder and former CEO of Symplicity Corp. — a government contractor that ran FedBizOpps.gov and other contracting platforms for much of the 2000s — won a significant suspension and debarment decision in the U.S. District Court for the District of Columbia.

The court ruled that the General Services Administration unfairly debarred Friedler in 2015 because GSA did not give him notice of all of the grounds for his debarment and an opportunity to respond to each of them prior to the agency’s final debarment determination.

The court ruled that GSA “relied on Friedler’s alleged post-conviction conduct in reaching the conclusion that he should be debarred but failed to notify him of these purported violations — a failure that is unquestionably improper under the applicable provisions of the Federal Acquisition Regulation. And because this court cannot reasonably find that [GSA] would have debarred Friedler on the basis of his criminal conviction alone, the court cannot conclude that the agency’s error in relying on the two additional grounds without providing notice was harmless.”

Dismas Locaria, a partner with Venable in Washington, D.C., wrote in a blog post that despite usually giving deference to the agency when it comes to suspension and debarment, the courts are willing to hold agency officials accountable for giving contractors the right to respond.

“This is not a huge decision, but it does set precedent,” Locaria said in an interview with Federal News Radio. “It’s a great decision for contractors because it means the courts are holding the government’s feet to the fire when it comes to suspension and debarment decisions.”

Generally speaking, courts don’t second-guess the government’s decision when there appear to be grounds for suspension and debarment.

But Locaria said GSA didn’t give Friedler 30 days to respond to the new findings of fact and therefore the grounds for debarment were not just.

An email sent via LinkedIn to Friedler seeking comment on the court’s decision was not returned. Friedler’s lawyer, Fred Levy, called me back, but did not respond after I returned his call.

A Symplicity spokeswoman said, “Ariel Friedler has not worked at Symplicity for several years. As this case is a personal matter for Mr. Friedler which does not involve Symplicity, we would suggest that you reach out to him directly.”

GSA initially suspended Friedler and proposed him for debarment in 2014, when he pleaded guilty to hacking into a competitor’s network. He had been under FBI investigation since 2009.

GSA initially said it would debar Friedler based on the hacking conviction.

But in GSA’s final debarment notice, the agency listed two other reasons: Friedler was back working at Symplicity and talking to employees, and he was developing new lines of business in the federal market. During the debarment negotiations, GSA and Friedler had agreed he would do neither.

“The Friedler court found that GSA ran afoul of these concepts by debarring Friedler based on two new factual grounds not included in the Notice of Proposed Debarment,” Locaria wrote in the blog post. “The D.D.C. noted that not only were the grounds themselves labeled as ‘new causes’ under the notice, but that even without that language, such conduct necessarily constituted a new cause because it had not yet occurred at the time the Notice of Proposed Debarment was issued.”

Additionally, Locaria said the judge “was unconvinced that Friedler’s criminal conviction alone provided a basis for Friedler’s debarment; had Friedler been afforded the opportunity to respond to the two other identified causes, it is possible no debarment would have resulted.”

Locaria said GSA could re-propose Friedler for debarment based on all three factors, or it could just let the case go away quietly.

He said there is no statute of limitations or double jeopardy for suspension and debarment cases.

Overall, suspension and debarment cases have leveled off after a huge push by Congress and the Obama administration over the last five years.

In the fiscal 2016 report to Congress, the Interagency Suspension and Debarment Committee said there were 718 suspensions, 1,855 proposed debarments and 1,676 debarments. That is 72 percent more suspensions than in 2009 (417), and more than double the number of proposed debarments (750) and actual debarments (669) from seven years ago.

Source: Interagency Suspension and Debarment Committee 2016 report to Congress.

The Army, the Department of Homeland Security and the Navy were most active with debarments, while the Department of Housing and Urban Development suspended the most contractors last year.

“I don’t see the focus on suspension and debarment letting up anytime soon. There is bipartisan support to go after government contractors who are seen as committing waste, fraud and abuse,” Locaria said. “I think agencies are at capacity more than anything and they can’t do more, which is why the numbers have leveled off. But right now, with people concerned about the hurricane response efforts, I could see in two or three years, there will be conversations about the federal money spent to rebuild those areas affected.”



How the House wants to make federal procurement less complex, more competitive

Subscribe to Federal Drive’s daily audio interviews on iTunes or PodcastOne

As the House and Senate go to conference to solve differences in the Defense authorization bill in the next few weeks, the acquisition community will be watching Section 801 of the lower chamber’s bill closely.

That provision is commonly known as the “Amazon” amendment, developed by Rep. Mac Thornberry (R-Texas), chairman of the Armed Services Committee, and would require the General Services Administration to set up more than one online marketplace featuring commercial companies. Agencies could then buy from these online marketplaces the same way average people buy from Amazon.com, BestBuy.com, Walmart.com or any other similar commercial provider.

When the Senate finished work on its version of the NDAA, the provision was not included, amid concerns from industry associations and some Senate staff members.

“The fact that the Senate didn’t put anything in their version of the NDAA means they have grave reservations, and it means they are also at complete opposite ends of the negotiation spectrum,” said one industry expert, who requested anonymity in order to talk about the controversial provision. “There have been no formal hearings or public discussions or vetting of this provision’s impact. The House folks are just doing it even though it’s a major change in the way agencies will buy in the future.”

Multiple emails to Senate Armed Services Committee Chairman John McCain (R-Ariz.) seeking comment on the provision were not returned.

Concerns over Section 801 are not new — with industry raising them as far back as June — but uneasiness rose higher when word spread that Amazon and Deloitte were planning a fundraiser for Thornberry on Sept. 27, where they were asking attendees for donations between $500 and $2,500.

An Amazon spokeswoman said that event has been postponed. The spokeswoman added Amazon routinely holds fundraisers and donates to lawmakers’ reelection campaigns, including hosting or co-hosting 25 events this year.

Industry observers who follow this type of political lobbying and electoral support say a fundraiser like this isn’t unusual and is part of building relationships.

But some said that, if nothing else, the timing was suspect, as the NDAA was heading to conference and Thornberry would be a conferee. Since the fundraiser has been postponed, however, it becomes more of a footnote to the larger discussion of Section 801.

Who was behind 801?

There are serious disagreements over who is responsible for the provision and why it may go governmentwide.

Industry sources say the strong rumor is Amazon played a huge role in working with House Armed Services Committee staff members to create the provision.

But a senior staff member of the House Armed Services Committee emphatically denied Amazon came up with the provision.

“The idea for Section 801 came from conversations the committee staff had with DoD. It did not originate in the private sector and it didn’t come from Amazon,” the staff member told Federal News Radio. “There may be people who will lose out in this new construct that offers real competition, so they are resorting to whatever tactics to end-run it. But the whole point of the provision is to inject actual real competition into the system instead of government-generated faux competition.”

The aide added the committee issued the draft NDAA six weeks before the markup to generate feedback, and spent the previous year exploring how such a provision would work.

A second HASC staff member said the idea for the provision came from DoD, saying general officers were frustrated with the current procurement process and the time it takes to obtain goods and services for the warfighters.

When pressed on who in DoD asked for the language, the staff member wouldn’t say, but did confirm it came from the acquisition community.

Industry sources, however, say there are some parts of the military that are not in favor of the provision. One potential signal of the Pentagon’s concerns is the somewhat expedited transition of DoD’s EMALL to FedMall, an online marketplace that aims to streamline the registration, ordering and research processes.

Additionally, the rumor gained momentum because HASC members visited Amazon in Seattle to understand its business model, but didn’t take field trips to any other companies. The staff member confirmed the visit to Seattle, saying there wasn’t time for other visits, but that the committee met with several other similar companies.

One industry source said the way the language is written, some companies, like Grainger, may not be eligible for a contract. Companies also are concerned about getting into a situation where the only platform for DoD to buy from is Amazon, and how Amazon would then use the data.

Implementation is the key

Jon Etherton, a former Hill staff member and now president of Etherton and Associates, said while the inclusion of commercial online marketplaces in federal acquisition would be a major change, companies and agencies shouldn’t get too stressed about it.

“Everyone likes the concept, so it’s just a question of how do you work through the details to make it compatible with government procurement and that sort of thing. It’s not an either-or situation,” Etherton said. “I think it will be an issue where we will work out a lot of details during implementation, but I believe that we will see something in the final bill at the end of the day.”

Etherton added GSA will not just “flip a switch” to get the marketplaces stood up.

Part of the issue for industry is that HASC hasn’t been clear about how the marketplaces would work.

The HASC staff member said the committee has improved the provision since they first introduced it.

The staff member pointed to the expansion of the provision governmentwide, the language requiring more than one marketplace, and clarifications so it could work for all types of business models.

Additionally, the committee made some explicit additions to ensure agencies met governmentwide requirements and goals around small businesses.

But that’s the rub in this discussion. HASC wants to remove complexity and ensure “real competition” in federal procurement, but the policies and regulations around small business goals, the Buy American Act, the Berry Amendment and the Truth in Negotiations Act (TINA) are part of why federal acquisition is complex.

The HASC staff member said the core set of statutes that agencies and vendors comply with today will not change under this new model. The provision would require GSA to provide information in the online portal on the kinds of products that satisfy those laws. The staff member added that, fundamentally, agencies can still buy American-made goods without the hassles of the current procurement model.

Industry experts question how GSA can make that happen without the costs of compliance driving up prices, like many say happens on the GSA Schedule and other federal procurement vehicles.

“If I go to Amazon or Walmart or Grainger, how can I validate the Trade Agreements Act or the Small Business Act? How can I make the determination about the size of the company or the source of product or if the products continue to be eligible for purchase by the government?” said another industry source. “How does GSA Schedule 70 fit in with this concept? If a vendor has to be compliant with the FAR on Schedule 70 and not on a commercial marketplace, that scares the government. And why would those rules apply if you can buy the same commercial products on GSA schedule or other government vehicles, if they don’t apply to the commercial marketplace?”

Why not improve GSA?

The other issue that many experts bring up is why not improve the GSA schedules or other vehicles instead of imposing new approaches that may or may not work.

The HASC staff member said the goal is to give agencies more options, particularly one that will drive competition and innovation.

The staff member said the online marketplace concept taps into the existing free market, where agencies could buy high-quality products at less expensive prices and have access to more goods. The staff member also said this approach improves transparency and accountability because, through online e-commerce portals, agencies will be able to do much more spend analysis and understand what they’re buying and how they are buying it, which will lead them to make smarter decisions.

Etherton said the government has to move more toward the commercial marketplace and this approach is a potential candidate.

“One of the things that I come back to is even the folks who are most concerned about different aspects of the provision say conceptually they are in agreement that the government needs to buy more commercially. So at the end of the day, that’s where we have to end up,” he said.


