Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly sourced buzz and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.


First Look

How one Russian group exposed the soft underbelly of federal cyber defenses

In early November, at least two agencies fell victim to a cyber attack from a group based in Russia.

The hacking group Killnet took responsibility on Twitter for taking down sites run by the Commerce Department and the Cybersecurity and Infrastructure Security Agency in the Department of Homeland Security.

While the distributed denial of service (DDoS) attack was more of a headache than anything else, it spurred the Office of Management and Budget to take a fresh look at agency protections against a type of attack that experts say is attempted often but rarely succeeds anymore, yet has re-emerged on the threat landscape, poking at federal IT’s soft underbelly.

The Federal Chief Information Security Officer’s Office and CISA issued a data call about 10 days after the DDoS attack, asking for details on how agencies are protecting against these threats and for agency chief information officers to validate the domains they are protecting.

Federal and industry cyber executives say the concern isn’t so much this one successful attack against Commerce and CISA, but understanding how well agencies are prepared against DDoS attacks that have been rising in Eastern Europe at an alarming rate and against soft targets like airports in the U.S.

“Our view is it did appear that the activity was starting to target the U.S. government, so if that’s the case, let’s be prudent and take this opportunity to fast track some efforts that were underway already to start doing automated discovery and confirmation of content delivery networks and DDoS mitigation protections across all federally owned websites,” an OMB official told Federal News Network. “We have a general view of the current state, and there’s pretty good coverage across federal government in this space. We know that anecdotally from prior data calls and from conversations with CISOs and CIOs at CIO and CISO council meetings. But this is new and different. CISA has created a tool and capability so let’s get, like, full veracity here and make sure agencies have a central viewpoint of all the websites that you own, operate and are responsible for.”

Impact on sites was minimal

The November attack that Commerce and CISA faced resulted in no operational impact, federal executives say.

Commerce’s main website came down for several hours, government sources say. A Commerce official said they were still doing a root cause analysis and didn’t offer any further details about the attack.

At CISA, the DDoS attack took down the front end of the Protected Critical Infrastructure Information (PCII) Program user website for a few hours, according to internal emails obtained by Federal News Network.

Other agencies, including the National Nuclear Security Administration and the Treasury Department, also faced DDoS attempts by Killnet to take down websites, but those attempts were unsuccessful.

A CISA official told Federal News Network that the DDoS attack had no operational impact and only impacted the external facing site.

“The application was still up. The PulseSecure server that sits in front of [the site] went down because it couldn’t handle the traffic load,” the official said.

Another CISA official described the attack as a “general resource exhaustion attack,” where the bad actor is sending too much traffic for the site to handle at once.
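
The mechanics are simple: every request consumes server resources, so enough junk requests starve out legitimate ones. The standard countermeasures, CDNs and rate limiting, shed excess traffic before it ever reaches the application. As a rough sketch of the rate-limiting idea only (an illustration, not any agency’s actual configuration), a token bucket absorbs normal bursts while capping the sustained request rate:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: allow short bursts, cap the sustained rate."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate          # tokens refilled per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, up to the bucket's capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # shed the request instead of letting it exhaust the server

# Allow bursts of up to 100 requests and 10 per second sustained, per client.
bucket = TokenBucket(rate=10, capacity=100)
if not bucket.allow():
    pass  # e.g., return HTTP 429 Too Many Requests
```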

The problem of growing technical debt

The attack highlights a bigger issue agencies continue to face: old technology that either isn’t patched because agencies don’t know it exists, known as shadow IT, or can’t be replaced because of budget limitations.

OMB said in 2016 that agency technical debt totaled more than $7 billion. OMB, agency CIOs, industry and some lawmakers continue to ring alarm bells about that technical debt because it raises the risk agencies face from cyber attacks.

CISA itself, the agency that runs its own systems, as opposed to the policy shop that puts out binding operational directives and other guidance, is no different.

Early indications from CISA internal emails obtained by Federal News Network showed that the attack found success because the agency was using legacy servers that were unpatched.

“CISA support has identified the Pulse server is currently running on old codes. Issue has been escalated to Akamai engineers to replace CISA old code on Pulse Secures,” according to a Nov. 4 internal email.

The first CISA official said later analysis determined the root cause was not old code running on the servers, and that the agency had all configurations up to date. The official didn’t explain what had happened, except to say the attack overwhelmed the external-facing site.

But one federal cyber official familiar with the incident questioned why CISA would send out an email stating what the problem was if the agency wasn’t sure or was guessing. The official said in the first hours of a cyber incident, either you know what the problem is and you tell your team, or you say you don’t know, because guessing could cause other problems.

“I don’t think a CISO would send that note unless it’s not true, as communications during an incident are key. Everyone is fine with not knowing what happened, but if you know you should state it. I think they knew exactly what was wrong and their network and security operations center confirmed it,” said the official who requested anonymity in order to talk about the cyber attack. “Agencies were told to get off PulseSecure more than a year ago because they had a known vulnerability. I think they were shocked it was even there because they should’ve gotten rid of it because they knew about it being a problem.”

The official and other experts called the Killnet attack basic and not much better than a “script kiddie.”

The federal official said Killnet was probably surprised the DDoS attack worked at all.

“DDoS are not complicated attacks. They are script driven. The cloud has enabled DDoS attacks with scale and speed that we haven’t seen before. It’s not highly technical. It’s low rent and this was an opportunity Killnet took advantage of,” the official said. “This will happen again. I’d expect Killnet to come back.”

To be fair, it was the weekend before the mid-term election, so many experts give CISA credit for raising awareness at the time.

The data indicates Killnet, or whoever, will be back, which is part of the reason for OMB’s data call. In fact, CISA sent out a governmentwide email on Nov. 4, the day of the successful attack against agencies, warning of a possible spike in DDoS activity.

“We wanted to highlight increased DDoS activity being reported against federal agencies. Several agencies have confirmed impact from this activity. If you are experiencing any DDoS activity, we ask that you please report it to CISA via standard reporting mechanisms and share any relevant information from that activity,” the email, which Federal News Network obtained, stated.

Earlier, in October, CISA also updated its DDoS guidance for agencies: Understanding and Responding to Distributed Denial-of-Service Attacks and Capacity Enhancement Guide (CEG): Additional DDoS Guidance for Federal Agencies.

Spike in DDoS attacks in Eastern Europe

OMB reported in the fiscal 2021 Federal Information Security Modernization Act (FISMA) report to Congress, released in September, that attrition attacks, the category that includes DDoS, accounted for about 1% of all attacks agencies suffered last year. The report states there were 440 known attrition-type attacks last year out of more than 32,000 total incidents.

In the meantime, DDoS activity has picked up massively in Eastern Europe since Russia’s invasion of Ukraine. Eastern Europe is typically the target of about 1%-to-2% of global DDoS attacks; since the invasion, it has been the target of up to 30% of global DDoS attacks, said Patrick Sullivan, the chief technology officer for security strategy at Akamai.

“This represents more than a 1,100% increase in DDoS attacks in Eastern Europe compared to trend lines prior to February 2022. We’ve also seen some of these attacks pack quite a punch, with records broken for the greatest number of packets per second of DDoS attacks in Europe,” he said. “There have been DDoS attacks targeting organizations in the U.S. that have been attributed to organizations that have publicly pledged loyalty to Russia in this conflict. U.S. state and local governments, airports and other industries have been targeted by Killnet. Overall, these are more isolated; we are not seeing the massive increase in activity directed at targets in the U.S. like what we have observed in Eastern Europe.”

And hacking groups will continue to look for soft spots, like an unpatched PulseSecure server.

John Pescatore, the director of emerging security trends at the SANS Institute, said agencies should also be concerned about bad actors using DDoS attacks as a distraction to get the security and network defenders focused on one thing and then attacking somewhere else with something more serious like ransomware.

He said DDoS and other basic attack approaches take advantage of old technology that agencies either don’t know exists or haven’t had the resources to upgrade.

“With the PulseSecure vulnerabilities over the past two years, agencies have been super slow to patch them, and the government was a big chunk of their business. The vulnerabilities are getting exploited in variety of ways,” Pescatore said. “Earlier this year, Akamai put out warning about a middle box reflection attack where something vulnerable is used for dual purposes. It became a man-in-the-middle attack doing TCP reflection attack but also a DDoS attack against the organization that owned that machine. We have no idea why the government has been so slow to patch the PulseSecure servers. The patches were coming out and it was not like they were zero days.”

A more scalable cyber defense

This brings us back, again, to OMB’s data call.

OMB wanted agencies to identify and validate all domains by Nov. 18, after which CISA would provide agencies with data from its content delivery network (CDN) reporting and mitigation database.

Agencies were then to “review the findings and provide feedback to CISA on how they will mitigate the risk of a sophisticated distributed denial of service (DDoS) attack against sites not showing a known CDN provider.”
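
What an automated check like that might look like: the sketch below (my illustration, not CISA’s actual tool) follows a domain’s DNS CNAME record and flags whether it resolves to a known CDN provider. The suffix list is a hypothetical sample, and the sketch relies on the dnspython package:

```python
import dns.resolver  # pip install dnspython

# Hypothetical sample of CNAME suffixes used by common CDN providers.
CDN_SUFFIXES = ("edgekey.net", "akamaiedge.net", "cloudfront.net", "fastly.net")

def fronted_by_known_cdn(domain: str) -> bool:
    """Return True if the domain's CNAME points at a known CDN provider."""
    try:
        answers = dns.resolver.resolve(domain, "CNAME")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN, dns.resolver.NoNameservers):
        return False
    return any(str(r.target).rstrip(".").endswith(CDN_SUFFIXES) for r in answers)

# Domains with no known CDN in front would be flagged for mitigation review.
print(fronted_by_known_cdn("www.example.gov"))
```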

A third CISA official said they are working on a more scalable approach to determine where protections may be lacking and how they could provide visibility to agencies so they can implement mitigations.

“We generally feel that the federal civilian executive branch enterprise is well protected against DDoS, which is why you’re referencing a couple of attacks over a few weeks when there are tens of thousands of web apps across government. But we want to make sure there are always stragglers, there is some low hanging fruit. We want to make sure that agency CIOs and CISOs are aware of where protections may be lacking so they can work urgently to put needed protections in place,” the official said about the data call.

The OMB official added CISA’s tool will help identify gaps where maybe agencies thought they had turned coverage on but for whatever reason it isn’t there today.

“It could be something they’ll be able to make a quick fix on or maybe they may have made an intentional decision at one point in time, 10 years ago or whenever, to not have the coverage and maybe in 2022, they should be making a different decision,” the official said. “The interesting thing about these exercises is you learn a lot as you get the data and as you have agencies discover what is going on. From an automated perspective, this is the first time we’re doing it federalwide, so it’s a big exercise. We’re really excited about it because it just feels like, why not take this moment to fast track that and put it on the front burner?”

Whether or not this was a case of CISA, Commerce and possibly other agencies getting caught with their proverbial shields down because of outdated technology or shadow IT, the fact is that old and outdated technology, combined with the ease and low cost of launching DDoS and other basic attacks, makes agencies more vulnerable. While the impact of DDoS and similar attacks may not be great, they cost federal executives a great deal of time, energy and heartburn. This serves as just another reminder that getting out from under technical debt should be the top priority for the administration and agency leaders.



DHS, VA, GSA, OMB hire new federal technology executives

Like the changing leaves, federal executives heading into retirement, the private sector or just new opportunities in government are an annual rite of fall.

The executives on the move range from NASA’s chief data officer to the Federal Communications Commission’s head of cybersecurity to the Marshals Service’s chief technology officer.

This is by no means a complete list of every executive in the federal technology sector who moved to a new role over the last few months. If I missed any big moves, feel free to let me know.

Let’s start with some folks in new roles in government.

The CTO at the Marshals Service, Christine Finnelle, is now the new director of enterprise architecture at the Department of Homeland Security.

Christine Finnelle left the Marshals Service for DHS in November.

DHS CTO David Larrimore confirmed Finnelle’s move on LinkedIn on Nov. 26.

She had been with the Marshals Service since 2016 and at the Justice Department since 2003. She became CTO for the Marshals Service in 2019 to help create the agency’s long-term technology roadmap.

Finnelle is the second high-ranking technology executive to leave the USMS this year. Karl Mathias, the USMS chief information officer, moved to the Department of Health and Human Services in March.

Over at the Department of Veterans Affairs, Kim Pugh announced she is the new director of software-as-a-service (SaaS) in the CIO’s office, coming over from the Veterans Health Administration.

Pugh has worked at VA for 18 years, including the last three years as VHA’s director of investment governance services.

The General Services Administration announced Dan Lopez is the new director of Login.gov. Lopez comes to GSA after serving the last three years as the director of software engineering for the Philadelphia city government.

“Dan holds an extensive background in engineering and leadership, and he hails from Philadelphia. Login.gov program is excited to enter this new chapter with Dan at the helm,” Login.gov tweeted on Nov. 28.

According to his LinkedIn, Lopez started in September.

Dan Lopez is the new director of GSA’s Login.gov program.

During his time in Philadelphia, Lopez said he was responsible for phila.gov, critical applications ranging from payments middleware (handling more than $1 billion a year) to the city’s campaign finance system, geographic information system applications like atlas.phila.gov and property.phila.gov, and much more.

GSA advertised for the position in February after it was vacant for the better part of two years. Amos Stone had been acting since August 2019.

The Office of the Federal CIO also filled a long-vacant role. Mitch Hericks became the permanent director of federal cybersecurity after serving as acting director since January.

“A year ago, I took a big leap in moving (back) from New York City to DC for a job working with our Federal CIO Clare Martorana and Federal CISO Chris DeRusha. It’s been a wild ride, but driving forward our implementation of the Executive Order on Improving the Nation’s Cybersecurity has been a once in a lifetime opportunity,” Hericks said on LinkedIn in early October. “I’m grateful to continue working with this team, driving security outcomes across the federal government as director for federal cybersecurity.”

Hericks replaces Steve McAndrews, who left in January to be the deputy CIO at the Energy Department’s National Nuclear Security Administration.

DoD, NASA, IRS, FCC looking to fill holes

While these agencies brought in new technology executives, several moved on, leaving agencies to fill key holes.

Two long-time federal technology experts retired.

Kevin Dulany, the director of cyber policy and partnerships in the Defense Department CIO’s office, retired after 38 years of federal service, including 20 years in the Marine Corps.

“I am truly appreciative for all of the support I have received over my career from leadership, peers, and especially the staff who supported me that really made things happen!” Dulany wrote on LinkedIn in September. “I am also blessed with meeting such great people from not only internally within the department, but throughout the federal space and industry … keep charging and PLEASE — when you are pushing a capability to our warfighters, cybersecurity has to be ‘baked in, not bolted on;’ and last but not least, please keep those military members who are ‘forward deployed in not-so-nice neighborhoods’ in your thoughts and prayers!”

Dulany also worked at the Defense Information Systems Agency and led the Defensewide Information Assurance Program for 11 years.

Over at NASA, Ron Thompson, the space agency’s CDO, also called it a career on Sept. 30. He retired after more than 30 years of federal service, including the last three at NASA and stints at the Agriculture Department, the Census Bureau and HHS.

“Since 1984, when I put on a uniform, I have been serving this great nation as a public servant. I had the opportunity to serve in seven agencies rising through the ranks to the senior executive service — I have been truly blessed. It has been my privilege to serve this wonderful nation and making lifelong friends, benefiting from amazing mentors’ leaders throughout the journey,” Thompson wrote on LinkedIn. “As I reflect back on my time, it is amazing how quickly the time passed and my hope is my contributions made the mission better than I found it and I helped others as I have been helped along the way.”

Thompson recently joined Quantum Space as its chief data officer and executive director.

The IRS and the FCC also saw key executives move on.

Ray Coleman, who had been the IRS’s executive program director of the cloud management office for the last year, became the CIO for Koniag Government Services, an Alaska Native-owned IT services firm.

Coleman spent the last 12 years in federal service, serving as CIO for the Defense Contract Management Agency and USDA’s Natural Resources Conservation Service.

Andrea Simpson, the FCC’s chief information security officer, is taking her talents to academia. She joined Howard University in Washington, D.C. in the same capacity.

Simpson had been the FCC CISO for two years, including the last six months as acting CIO, and joined federal service in 2013.


CISA’s signature federal cyber program warrants more than a passing anniversary nod

The continuous diagnostics and mitigation (CDM) program turned 10 years old last month. And what a long strange trip it has been.

As agencies move toward zero trust and continue to face an ever changing cyber threat, it’s clear CDM has hit its stride.

Now the Cybersecurity and Infrastructure Security Agency is positioning the program to bring a level of visibility and proactive response the original framers of CDM only dreamed of back in 2012.

“CDM was built on continuous monitoring that had been mandated under the Federal Information Security Management Act (FISMA) of 2002. Continuous monitoring was a thing. People talked about it. They did it. But they did it in lots of different ways across the civilian agencies. They did it with very little automation. There was certainly no central visibility. Agencies did it in different ways within their components or their elements,” said Betsy Kulick, the deputy program manager for CDM, at the recent FCW CDM Summit. “Most people relied on manual inventories at the end of the year and spreadsheets that offered a picture in time. There was not much beyond that in terms of accuracy, to say nothing of telling you how well protected at that particular endpoint or device. So there were people at the State Department, wise people at the Office of Management and Budget as well as in Congress that thought that automating it would be the smart way to move and that the state of the industry was such the tools existed that would allow us to do that. We were funded in 2012 to begin to try to standardize mainly continuous monitoring as the first effort in terms of device management, but ultimately, to go through the whole NIST (National Institute of Standards and Technology) Special Publication 800-53 controls to automate that, to the extent possible to provide a far more secure way of securing the federal civilian networks. It was an ambitious program, we knew we’d been working at this for 10 years.”

And 10 years later, the CDM program, warts and all, is widely considered a success.

The Department of Homeland Security launched the program in 2012, making awards to 17 companies with a total value of $6 billion.

The idea was borrowed from the State Department, which had set up a system of continuous monitoring and alerting for hardware and software vulnerabilities.

DHS updated the program in 2017 to its current approach, which focuses on using system integrators to help groups of agencies with similar needs or in similar places implement approved products to fill specific cyber gaps.

Since 2017, agencies have been receiving, at no charge, a series of tools and capabilities to get more visibility into their networks through asset, identity, data security and network protection management tools. CISA also provides dashboards at the agency level, as well as one that rolls data up into a governmentwide picture for CISA. It also helps small and micro agencies with a shared services platform.

Strong support from Congress

The decade of CDM has been far from smooth. Industry protested task orders. Agencies expressed frustration on several occasions about delays in getting key toolsets. DHS ran into bureaucratic, regulatory and legislative obstacles that needed to be cleared. And then there is the ever-present culture change aspect of trusting CISA to help, but not judge individual agency cybersecurity efforts.

But despite a dozen years of challenges, CDM has consistently found support from multiple administrations and from Congress.

Congress has been unusually supportive of CDM, and really of CISA more broadly, when it comes to federal cyber networks. Since 2012, DHS has received more than $2.36 billion specifically for CDM, including a sizable chunk of the $650 million CISA received from the American Rescue Plan Act. CISA hopes to receive another $4 billion through 2033 to continue to run and evolve the program.

Source: CISA 2021 report to Congress.

So what did all that money get?

CISA says the foundation for a better, more proactive cyber defense is falling into place.

Richard Grabowski, the deputy CDM program manager, said agencies are seeing real value from some of the work that CISA has led over the last year plus.

“Everything that we’ve been doing over the last 16 months and in the near term are going about building that collaborative defensive posture. So you see what we see, we can make very helpful recommendations that you can triage and take back at machine data speed,” Grabowski said. “We’ve made investments in the Elastic search tool, in technologies for end-point detection and response (EDR), helping you get in front of mitigation, coverage making sure that every and all shadow IT has some amount of spotlight on it, and then bringing into other asset classes like mobile.”

The CDM toolset has come in handy during every cyber threat and incident agencies have faced over the last five years. Whether it was the WannaCry ransomware attack, Log4j or any number of other threats, agencies and CISA could turn to the Elastic-based dashboard to get more complete data more quickly.

Dashboard expansion coming

Judy Baltensperger, the project manager for the CDM dashboard at CISA, said the dashboard has come in especially handy in addressing requirements in recent binding operational and emergency directives that CISA put out to agencies over the last two years, such as after the SolarWinds and Log4j cyber incidents.

“We were able to share with them what CDM data is actually available, and what kind of automated reporting can we feasibly do. I don’t think people realize how expansive the dashboard is,” she said. “We have about 89 dashboards deployed, 78 of them reporting data. We do have a large amount of coverage across the network now, and we were now at the point where that synergy came together.”

Baltensperger added the dashboard has impacted agencies’ ability to meet specific compliance requirements and address long-standing cyber hygiene challenges such as patching and asset management.

There are several new capabilities coming to agencies from CDM to improve this proactive and collaborative defense posture.

Baltensperger said one of them is cross-cluster search, which will give CISA an even deeper look at the health of agency networks. It came in handy during a recent cyber threat involving OpenSSL 3, considered a high-risk vulnerability.

“What that gives us here, this is a federal level is object level data visibility into the dashboards. So as of this moment, we have about 20 dashboards out of the 78 that we have this object level data visibility,” she said. “Starting several days ago, last Friday (Oct. 28), we were able to then with that object level data, deep dive down to what was being scanned. Within the ecosystem, we have more visibility than we have ever had in the past. That’s expanding with our implementation and enablement of cross cluster search. But that needs to improve. But that’s a significant improvement.”

Baltensperger added that CISA expects to expand this cross-cluster capability to more agencies in 2023 because it automates information collection, accelerating how quickly agencies know whether they have a vulnerability so they can remediate it and reduce their risks.
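
Cross-cluster search is a standard Elasticsearch feature: a central cluster queries remote clusters by prefixing the index pattern with a cluster alias. A minimal sketch of the pattern follows; the cluster aliases, index names and field names are hypothetical, not CISA’s actual deployment:

```python
from elasticsearch import Elasticsearch  # pip install elasticsearch

# Connect to the central (federal-level) dashboard cluster.
es = Elasticsearch("https://central-dashboard.example.gov:9200", api_key="...")

# "alias:index" reaches into a configured remote cluster, so one query can
# sweep object-level records across many agency dashboards at once.
resp = es.search(
    index="agency-a:cdm-software-*,agency-b:cdm-software-*",
    query={"match": {"software.name": "openssl"}},
    size=100,
)

for hit in resp["hits"]["hits"]:
    print(hit["_index"], hit["_source"].get("host"))
```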

Additionally, CISA will upgrade all agency dashboards to version 6 and add a new service under the dashboard to help agencies identify when products they use are at or approaching end of life, so they can replace them and reduce cyber risks.

More agencies moving to shared services

Finally, Baltensperger said another capability that is gaining momentum is around dashboard-as-a-service.

“If an agency does not have their own hosting environment and they would like to pass that on to us, our team can do that for a significant cost savings of about $80,000 to $100,000 per dashboard. What we can do is on our side is provide you access to that dashboard. So that means the product gets paid for, all the infrastructure gets paid for, the storage gets paid for because we are managing a similar type product and we’re able to repurpose our labor,” she said. “We’ve gotten much more efficient with the number of people that it takes to operate and maintain and upgrade that particular solution because we basically built a shared service off on the side, and we can offer it to all the agencies.”

Currently, five CFO Act agencies are using the dashboard-as-a-service and another seven or eight plan to join in 2023.

“What that means is their dashboard is moving out of their system boundary and we are hosting it on their behalf. Now the data still belongs to them. They’re still responsible for the data. But all of the burden of operating, maintaining patching, keeping up with the operating system patches, figuring out if you are susceptible to OpenSSL, all of that work is coming over to our team, and we’re already doing it for ourselves,” she said. “What we’re doing is just extending it to the agencies. But it means that we’re funding the infrastructure. And because we’re funding the infrastructure out in the cloud in a shared service manner, we’re able to realize cost savings.”

Shared services, cost savings and, most importantly, better cybersecurity: Those were the initial goals and vision for CDM. No one would claim this was an easy path, and CDM is far from perfect, but it’s clear agencies are better off because DHS, the State Department, OMB and a host of visionaries took a collective leap into the cyber unknown.

It’s not often the government celebrates program successes, especially cybersecurity initiatives. But CISA, OMB and every agency should take a moment, offer a smile or two and delight in what they have accomplished through CDM over the last decade.

And I hope CISA at least had some cake to mark the anniversary and all that is good about the continuous diagnostics and mitigation program.



50,000 companies on hold because of GSA’s UEI validation problems


Editor’s Note: Updated on Nov. 2 with comments from GSA about the UEI validation service backlog.

It’s now November, and the General Services Administration still hasn’t fixed the problems with the new Unique Entity Identifier (UEI) validation service that have persisted since the transition began in April.

And as many as 50,000 companies and grantees are still waiting to fix validation issues that are causing delays in receiving awards and getting paid.

The almost six-month-old problem continues to cause broad concern across industry and on Capitol Hill.

“GSA briefed our subcommittee staff and is working to recover from a transition to new federal contractor system, which experienced many challenges and shortcomings. GSA did not predict or foresee many of the critical responsibilities of a system designed to identify and certify our private sector partners so they can work with agencies to improve how government operates,” said Rep. Gerry Connolly (D-Va.), chairman of the Oversight and Reform Subcommittee on Government Operations. “While GSA addresses these problems, they must work to reduce the unacceptable backlog. I will continue to work with our private sector partners to ensure that companies get certified efficiently and do not risk financial collapse because GSA underestimated the complexity of the system needed to undergird government’s engagement with federal contractors.”

Connolly wrote to GSA in July asking for a briefing and an update on its progress in fixing the validation service.

A GSA spokesperson said the 50,000 backlog represents a “snapshot in time” of vendor problems that are currently under manual review.

The spokesperson said only a fraction of those 50,000 trouble tickets have been in the manual review process for more than two months.

“The numbers do reflect that we have a high volume of tickets coming in, and we remain focused on completing each review as quickly as possible to minimize burden on businesses and other entities, while maintaining the rigor and integrity of the entity validation process overall,” the spokesperson said.

But while the agency is making progress in reducing the backlog and fixing the problems, Connolly and industry representatives remain frustrated.

Stephanie Kostro, the executive vice president for policy for the Professional Services Council, an industry association, said member companies remain stuck in limbo. While GSA has resolved some of the companies’ issues, PSC has member companies who have been facing significant problems since the summer.

“They can’t submit bids. They can’t get paid from work they completed,” Kostro said in an interview with Federal News Network. “There should be a systemic solution, but all we’ve heard from GSA is that they are working on it. But there is no evidence that the end is near, or any light at the end of the tunnel.”

Grantees impacted on a larger scale

One industry executive, who requested anonymity because they didn’t get permission to speak to the media, said the continued delays in getting validated are frustrating as much for the time it takes as for the process itself. The executive said, for instance, GSA tells vendors to log in every five days for an update, even if there is no update. And if the vendor doesn’t log in after five days, they are kicked out of the line.

“That’s just insane. Shouldn’t GSA have an email notification or something if there is a change in your status?” the executive said.

Cynthia Smith, director of government affairs and advocacy at Humentum, a global nonprofit working with humanitarian and development organizations to improve how they operate and to make the sector more equitable, accountable and resilient, said her members have not seen much, if any, improvement since April.

“Humentum continues to hear from our members on a weekly or bi-weekly basis with questions and concerns about the delayed processing of their and their local partners’ UEI tickets. We continue to hear about protracted delays, concerns about the seemingly arbitrary closing of tickets prior to their resolution, and local partners’ lack of responsiveness from the GSA’s Federal Service Desk to their inquiries,” Smith said in an email to Federal News Network. “The continued impacts are delays in contracting, programming, and funding local partners to do the work they are best positioned to do – particularly for those foreign local partners who are not able to avail themselves of USAID’s temporary exception, because they are working with funding from State or another US federal granting/contracting agency.”

The continued problems aren’t just impacting vendors, but agencies too. The Defense Department issued a deviation to its acquisition regulations in September, allowing the services and defense agencies to do business with companies who aren’t fully registered in the governmentwide acquisition system.

Kostro said given the Biden administration’s desire to bring more small disadvantaged businesses as well as new entrants into the federal market, it would make sense for GSA to show a little more urgency and communication on how it is fixing this problem.

“I can’t think of a greater disincentive for a company to want to participate in the federal market if you can’t register or get paid,” she said. “I think that would have lit a fire under GSA but we have not seen any evidence of that yet.”

Smith added that with the validation process still hamstrung and GSA not offering a “clear and viable plan” to remedy the situation, agencies are “working at cross-purposes to its own stated objectives of engaging new and local partners around the world to advance our foreign assistance and national security priorities.”

GSA reducing need for manual reviews

The GSA spokesperson said to date, more than 373,000 entities have successfully completed the validation process. GSA launched the move from the old Dun & Bradstreet number to the UEI in April after years of planning. The validation piece of the transition became a problem almost immediately.

PSC’s Kostro said there are plenty of examples of simple problems that took weeks to fix, such as an extra space on one side of an “&” or a missing plus-four on a company’s ZIP code.
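
Failures like that are what you’d expect from exact-string matching. A toy illustration of the issue (my sketch, not GSA’s actual validation logic): normalizing whitespace and case before comparing makes those records match.

```python
def normalize(value: str) -> str:
    """Collapse repeated whitespace and ignore case before comparing."""
    return " ".join(value.split()).casefold()

# An extra space next to the "&" defeats an exact comparison...
print("Acme & Co" == "Acme &  Co")                        # False
# ...but survives a normalized one.
print(normalize("Acme & Co") == normalize("Acme &  Co"))  # True
```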

“Roughly 80% of these entities did not need a manual review and therefore proceeded without delay. For the roughly 20% of entities that require a manual review, GSA is surging support to the program and prioritizing accordingly,” the spokesperson said. “We are seeing positive results. Due to GSA’s improved workflow, communications and stakeholder outreach and education, the number of entities able to complete their validation with the first manual review increased by 30 percentage points. Once an entity is successfully validated, they are unlikely to face a similar problem again because their validated information is in the new database for future annual renewals.”

The spokesperson added GSA is continuing to take steps to resolve issues, improve response times and provide immediate relief to companies.

“GSA’s biggest challenge has been the volume of tickets submitted, which has far exceeded expectations,” the spokesperson said. “Right now we are focused on managing the unanticipated volume by surging support and prioritizing entities most at risk of financial impacts, while also making changes to help us better understand and support the needs of our users overall. Additionally, we implemented an automatic, 30-day extension for any existing SAM.gov entity registration with expiration dates between April 29, 2022, and April 28, 2023, and a 60-day extension for all registrations expiring in August and September.”

Other sources confirmed GSA’s efforts to fix the problem, including encouraging agencies and vendors to quickly escalate any problems, particularly around getting paid, and surging people, data and automation to accelerate the ticket resolution process.

Sources also say GSA is trying to address system and workflow processes to help vendors trying to validate their information.

While it sounds like GSA is doing as much as possible, including surging contracting officers and other senior leaders to the problem, it’s clear industry feels more transparency is needed. To GSA’s credit, it has held virtual listening and Q&A sessions as well as keeping its Interact site updated. But at the same time, its alert on SAM.gov hasn’t been changed since mid-September. That’s a long time without any word on the hub of federal procurement.



The government’s Section 508 transparency problem

Agencies have a transparency problem when it comes to Section 508. It’s not that agencies are ignoring the law Congress passed 24 years ago to ensure federal technology is accessible to people with disabilities. It’s the lack of discussion, data, evidence or even reporting of progress that is causing concern on Capitol Hill and among other experts.

The Biden administration’s clarion call for diversity, equity, inclusion and accessibility (DEI&A) is ringing hollow unless agencies do more to show and tell how they are meeting both the spirit and intent of Section 508.

“Given the current absence of public, governmentwide evaluations of federal technology accessibility, it is critical that the General Services Administration’s timely data and analysis be made available to Congress so that we may better evaluate compliance with and the effectiveness of existing accessibility laws and programs,” wrote a bipartisan group of senators in an Oct. 7 letter to GSA Administrator Robin Carnahan. “Accessible websites and technology are extremely important to these populations—and the federal employees who provide them services—yet there is mounting evidence the government is not meeting its obligations as required by Section 508.”

Sen. Bob Casey (D-Pa.), chairman of the Special Committee on Aging, is leading the charge to bring some sunlight onto agency 508 efforts.

Sen. Bob Casey (D-Pa.) is the chairman of the Special Committee on Aging.

Efforts that, for the most part, have been in the dark for the better part of a decade.

For example, the Justice Department hasn’t issued a governmentwide report on 508 compliance since 2012. The 1998 updates to the Rehabilitation Act of 1973 required DOJ to issue an annual report.

GSA, as another example, hasn’t made governmentwide 508 compliance summary data public since 2018. The letter from Casey, Sens. Tim Scott (R-S.C.), ranking member of the Aging Committee, Gary Peters (D-Mich.), chairman of the Homeland Security and Governmental Affairs Committee, Rob Portman (R-Ohio), ranking member of that committee, and Patty Murray (D-Wash.), chairwoman of the Health, Education, Labor and Pensions Committee, pointed out that despite the Office of Management and Budget’s 2013 strategic plan to improve federal technology accessibility requiring GSA to collect this data, nothing has been published for four years.

“One of things that became clear in the committee’s work looking at the VA was that a lot of the enforcement mechanisms, whether it was GSA or DOJ, were not being carried out in the way Congress had expected or anticipated,” said a committee source, who requested anonymity in order to talk to the press. “A lot of times these types of reports and functions wither on the vine with absence of attention from Congress. This is about generating some action on the executive branch side.”

The bipartisan letter to GSA requested data about agency 508 compliance by Nov. 14.

A GSA spokeswoman didn’t address the senators’ request directly, but pointed to the “Governmentwide Strategic Plan to Advance Diversity, Equity, Inclusion, and Accessibility in the Federal Workforce” that highlights GSA’s work with the Office of Management and Budget, the U.S. Access Board and the Federal CIO Council’s Accessibility Community of Practice to review existing accessibility guidance and best practice resources and make updates as necessary to help agencies build and sustain an accessible federal technology environment.

Federal Chief Information Officer Clare Martorana said in an email to Federal News Network that the administration recognizes that accessibility is critical to digital service delivery, customer experience and DEI initiatives.

“We are continuing to track agencies’ progress on accessibility to make sure they are prioritizing accessibility, remediating existing accessibility issues, and are on a path to deliver more accessible IT from the beginning,” she said.

Additionally, OMB says it’s working with agencies to prioritize IT accessibility in performance and budget conversations — especially when an agency is modernizing a website or digitizing a form that impacts service delivery.

“OMB is working with General Service Administration (GSA), U.S. Access Board, and the Department of Justice (DOJ) to improve public reporting and expand automated data collection so that the public can better see government’s progress on accessibility,” an OMB official said. “Currently, OMB requires agencies to report twice per year information about IT accessibility including web accessibility as well as the maturity of their Section 508 program. OMB analyzes these reports, tracks agency progress and uses these to engage in performance management conversations with agencies.”

More than just data that is lacking

Of course, for some agencies like the Department of Veterans Affairs, it’s more than just a lack of transparency. Casey wrote to VA in June and held a hearing in July highlighting consistent problems with VA and other agency websites not meeting accessibility standards.

It’s true some agencies are better than others, but 24 years into the requirements, and especially as technology has made it easier to ensure people with disabilities can access services, federal efforts remain tepid. A June 2021 report by the Information Technology and Innovation Foundation (ITIF) found 50 of the 72 federal websites tested (70%) passed the accessibility test for their homepage. But as ITIF went further into those sites, that percentage dropped to 52% for the top three visited pages of a specific site.

To be clear, not all the blame can be put on agency shoulders. Congress hasn’t really been paying attention either for most of the last decade.

Casey’s hearing in July was the first one since at least 2011. The senator’s request in August for the Government Accountability Office to review agency compliance with Section 508 would be its first major study since at least 2016 if not well before that, the committee source said.

The source confirmed that GAO has accepted the request to review 508 compliance.

“In the course of the committee’s oversight work, it became clear that with DOJ not doing their report, GSA not making the information public and the absence of Congressional oversight, there are some agencies who are doing well, but others who aren’t and this needs more attention,” said the committee source. “We are figuring out whether this nearly 25-year-old law may need some updates to refresh what executive branch agencies are doing.”

While websites and other agency efforts may be falling short in some cases, it’s clear agencies are not ignoring the law altogether. At last week’s Annual Interagency Accessibility Forum, sponsored by GSA, the accessibility initiatives from more than 20 civilian, defense and intelligence community agencies were on display.

“I hope that you’ve seen over the last two days as well as the last two years, that accessibility is not an afterthought for President [Joe] Biden, and all of us who work for this administration. We know that government only works if it works for everyone, and we believe that if it’s not accessible, it’s not equitable, both for the people, who are public servants, and for the Americans that we serve, which includes about 16 million who have disabilities,” said Katy Kale, GSA deputy administrator, at the event. “Our government and our democracy have a responsibility to ensure that people with disabilities, both visible and invisible, can participate in public life.”

Expanding governmentwide efforts

GSA continues to be at the center of ensuring agencies have the tools and knowledge to meet and exceed 508 standards through its Office of Governmentwide Policy and through its Federal Acquisition Service.

Andrew Nielsen, the director of governmentwide IT accessibility programs in GSA’s OGP, said at the event there are several ongoing initiatives to help agencies meet 508 requirements, including a new tool called the open accessibility conformance report (ACR).

“The benefit that we see from this open ACR is to develop and to define a data schema for a machine readable version of an accessibility conformance report. So rather than a report produced in a Word document that is then typically in a PDF, we are encouraging people to use the open ACR data schema, the definition for how that data should be relayed in a machine readable fashion,” Nielsen said at the event. “We can then benefit from the ability to more readily share accessibility conformance information right alongside other product information, specifications and descriptions. So when we’re reviewing other information and making purchasing decisions, we can include more readily accessibility conformance information as part of that.”
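
To make the idea concrete, here is a simplified, hypothetical machine-readable ACR record; the field names below are illustrative only and not the actual open ACR data schema:

```python
# Hypothetical, simplified ACR record; field names are illustrative only.
acr = {
    "product": {"name": "Example Web App", "version": "2.1"},
    "standard": "WCAG 2.0 / Section 508",
    "criteria": [
        {"id": "1.1.1", "conformance": "supports",
         "notes": "All images carry descriptive alt text."},
        {"id": "1.4.3", "conformance": "partially-supports",
         "notes": "Secondary navigation fails minimum contrast."},
    ],
}

# Because the report is structured data, buyers can filter it alongside
# other product specifications instead of reading through a PDF.
gaps = [c for c in acr["criteria"] if c["conformance"] != "supports"]
print(gaps)
```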

Additionally, GSA plans to create an ACR repository in the coming months to make it even easier for agencies to find and use this information to ensure the products they buy meet or exceed the 508 standards.

Nielsen said a second initiative is to update the accessibility requirements tool for procurements. He said this too will be easier to use and will be published in an open source repository so others can customize it or bring it behind a firewall to use on classified systems.

“We also have a tool, available for federal employees only, for the review of solicitations posted on SAM.gov. We are still developing that solicitation review tool. It will use artificial intelligence, machine learning and natural language processing to scrape the information posted in solicitations to identify and then flag those that don’t include any accessibility related requirements,” he said. “The intent there is to train up our machine learning tool to improve the logic and actually interface with or reuse the logic from the accessibility requirements tool. In the future state, it not only will flag solicitations for the owners, but also, using the logic from the accessibility requirements tool, give them recommendations for which language to include in the solicitation. Our hope is to improve our approach to the procurement of accessible products.”
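
A crude stand-in for what such a screen might do (my sketch; the real tool, as described, uses machine learning rather than a keyword list) is to flag any solicitation text that contains no accessibility language at all:

```python
import re

# Keyword screen standing in for the ML/NLP approach described above.
ACCESSIBILITY_TERMS = re.compile(
    r"\b(section\s*508|accessib\w+|wcag|vpat|conformance\s+report)\b",
    re.IGNORECASE,
)

def needs_flag(solicitation_text: str) -> bool:
    """Flag solicitations that contain no accessibility-related language."""
    return ACCESSIBILITY_TERMS.search(solicitation_text) is None

print(needs_flag("Deliverables shall conform to Section 508 standards."))  # False
print(needs_flag("Vendor shall build and host a public web portal."))      # True
```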

Improving testing consistency

In addition to those two acquisition-focused tools, GSA and the U.S. Access Board developed an information and communications technology (ICT) testing baseline for websites to reduce testing ambiguity and increase consistency of results.

Dan Pomeroy, the deputy associate administrator in the Office of Information Integrity and Access in GSA’s OGP, said at the accessibility forum that the baseline describes how to evaluate conformance to the 508 standards, which align with Web Content Accessibility Guidelines (WCAG) 2.0.

“It’s organized by categories to help users easily identify applicable requirements. It’s important to note that the baseline is not a test process in and of itself, but rather a tool that should be used to create an accessibility testing process,” he said. “While other baselines such as the ICT baseline for software are in the works, the ICT testing baseline for the web is live. It can be found under the testing section at section508.gov.”

Pomeroy said another related effort from the governmentwide IT accessibility team is the creation of an accessibility policy framework.

This guidance aims to assist agencies with assessing accessibility policies across functions like finance or procurement.

“The intent of the framework is to help agencies prioritize which policy documents they should review and update to improve accessibility information they contain with the overall goal of improving the digital accessibility of the products or services covered by the policy,” Pomeroy said. “The accessibility framework is currently in development and is expected to be released on Section508.gov later this fiscal year.”



GSA leadership, IG continue to butt heads over schedule price reasonableness

The long-standing debate over whether prices on the General Services Administration Schedule contract are “fair and reasonable” reached a new level of discord.

GSA’s inspector general makes what some may call a shocking series of recommendations about schedule prices in a new report. Auditors told GSA to cancel the six-year-old Transactional Data Reporting (TDR) program. The IG also told GSA to tell its agency customers to make their own price reasonableness determinations because they cannot trust that they are getting the lowest prices through the schedule.

And GSA’s Federal Acquisition Service responds with contempt for those suggestions, calling TDR valuable and schedule prices more than reasonable and compliant with the Federal Acquisition Streamlining Act and the Competition in Contracting Act (CICA).

The latest episode in this long-running squabble over schedule prices is an unfortunate and predictable escalation.

Industry experts called the IG’s report more than just another shot across the TDR bow, but an attempt to sink the ship.

“The OIG recommending that agencies should perform independent price determinations feels like a significant escalation in the battle between GSA OIG and GSA over the effectiveness of TDR,” said Leo Alvarez, a principal in the government contractor solutions practice of Baker Tilly, in an email to Federal News Network. “In essence, the OIG is saying they have lost confidence in the GSA schedules program to achieve reasonable prices. A proverbial ‘buyer beware’ sign is on the program.”

Alan Chvotkin, a partner with the law firm Nichols Liu, said the IG has never liked the TDR program since its inception.

“The GSA IG opposed the TDR pilot from the outset because both the commercial services practices (CSP) and the price reduction clause (PRC) would be/are inapplicable to firms participating in the TDR pilot,” Chvotkin said in an email. “The IG has also opposed almost every effort to adjust or eliminate the CSP and the PRC, such as recommended by the GSA MAS Advisory Panel. I served on that panel and supported the recommendation to eliminate the CSP and the PRC under certain circumstances. Yet both of these elements have been consistently identified by ‘commercial item’ providers as barriers to their willingness to participate in GSA’s schedules market.”

Part of the reason the IG has been against TDR and for keeping the PRC has to do with its ability to recoup alleged overcharges or mischarges.

No good way to check prices?

While the IG may like the PRC over TDR, this latest report finds that both approaches are deficient.

“When performing price analyses on TDR pilot contracts, FAS contracting personnel do not have access to TDR data that can be used for pricing decisions and as a result, they mainly compared proposed pricing to other MAS and government contracts,” the IG stated. “However, this approach does not provide customer agencies with assurance that FAS achieved pricing that reflects the offerors’ best pricing and will result in the lowest overall cost alternative to meet the government’s needs.”

The IG also heard from contracting officers; 7 out of 11 expressed concerns about TDR’s value.

“We sampled eight contracts under the TDR pilot with an estimated total value of $2.5 billion and found that TDR data was not analyzed for any of the sampled contracts,” the IG stated. “Accordingly, FAS contracting personnel followed the guidance as outlined in [2016] and relied on the pricing tools to evaluate the relative competitiveness of the proposed pricing.”

As for using the price reduction clause, the IG found “FAS contracting personnel frequently accepted commercial pricing information from offerors that was unsupported, outdated or that identified no comparable commercial sales. As a result, FAS cannot provide customer agencies with assurance that MAS contract pricing will result in the lowest overall cost alternative to meet the government’s needs.”

The IG’s analysis went one step further: It looked at 20 recent MAS contract and option awards and found contracting officers’ price analyses couldn’t provide customer agencies with assurance that orders placed against MAS contracts will result in the lowest overall cost alternative.

Trust in schedules returned

This was part of the reason the IG recommended GSA tell its agency customers to conduct their own price reasonableness determination.

Trust in GSA schedule prices hit an all-time low in 2014 when the Defense Department, which is the largest customer of the program, and NASA issued deviations to the Federal Acquisition Regulations telling contracting officers to do their own price reasonableness determinations.

But since then, GSA made changes that reestablished faith in the schedules prices. In 2018, the Naval Postgraduate School and the Coalition for Government Procurement looked at prices on GSA Advantage and found they were better than commercial offerings by more than 50%.

GSA cited the Naval Postgraduate School analysis in its response to the IG, and the IG responded by throwing the study’s conclusion back in the agency’s face: “While the study found that, in some cases, GSA Advantage! pricing was better than Amazon Business, the study did not recommend using GSA Advantage! due to minimum order requirements and instead found that Amazon Business was a viable option for purchases below the micro-purchase threshold, currently at $10,000.”

Baker Tilly’s Alvarez said what seems to be the issue at hand is the long-time battle between best value and lowest price.

The IG seems to believe GSA must always achieve the lowest price, while the federal acquisition community over the last 25 years has preached best value.

“In their view the program, under TDR, fails to fulfill the requirements of CICA as it does not result in the ‘lowest overall cost alternative’ to the government,” Alvarez said. “I think that highlights something more fundamental about how GSA OIG and GSA perceive the Schedules program. The concept of ‘best value’ has long been a bedrock principle of the program. In fact, at industry meetings where the TDR program was being proposed and ultimately rolled out, GSA officials stated on a number of occasions that the data would not be used to facilitate a race to the bottom on prices. Yet cost appears to be the overwhelming focus of the OIG’s audit report.”

He added that while focusing on low prices for products may be easy, the same isn’t true for services. And agencies are spending more on services every year, reaching more than $380 billion out of $637 billion in fiscal 2021.

“Pricing evaluations for services at the MAS contract level have always presented a challenge, and it remains unclear how GSA will effectively determine price reasonableness and best value for services contractors under TDR,” he said. “With services representing the majority of sales under the GSA MAS program, it will be interesting to see if GSA MAS contracting officers working with contractors under TDR continue to rely on the previously mentioned pricing tools, more frequently request additional supporting information, use other methods or simply struggle with drawing conclusions about price reasonableness.”

TDR expansion coming

It’s clear the IG’s criticisms aren’t having an effect on GSA’s plans to expand TDR.

Sonny Hashmi, the commissioner of the Federal Acquisition Service, said while he appreciates the IG’s input and suggestions, TDR is valuable and GSA plans to expand it.

“The future of how we buy in government is going to require real time data, and the price reduction clause, which served a particular purpose a decade ago, two decades ago, isn’t good enough. We have to rethink how we buy products and services in government. And programs like the TDR are the way forward,” Hashmi said in an interview with Federal News Network after the IT Modernization Summit sponsored by FCW. “Our current focus right now is to make sure that the quality and completeness of data continues to go up. We’ve made significant progress this year and we want to continue to make that progress. We’re continuing to integrate that data into the analysis tools that our contracting officers use every day when we’re doing fair price analyses.”

Hashmi said TDR has proven its value through initiatives like category management, through the pandemic and through hurricane responses.

While Hashmi wouldn’t offer details or a timeline about how GSA is expanding TDR, the IG report says the agency plans to move it out of the pilot phase and expand it across all schedule contracts starting Nov. 1.

“The goal though is to move to a regime over time that leverages transactional data rather than relying on clauses for price or product price assurances. We want to do that when we’re all comfortable it’s the right time to do one of the big pushes,” he said.

Alan Thomas, the former FAS commissioner at GSA and now chief operating officer at IntelliBridge, said the IG report actually gives the agency the opportunity to rethink this entire process.

“TDR upends that model and requires the IG to change while continuing to provide important oversight of contracting officers’ work. This isn’t easy, but I think the report offers a pathway to a new normal. Doing so will require leadership buy-in from FAS and the IG,” he said. “The expanded use of additional data and analytic techniques to get the best value for government buyers is an area for collaboration between FAS and the IG’s audit team. With the right leadership support, this latest report could be the catalyst for putting the best minds from FAS and the IG together on this topic.”

The question is whether the IG and FAS leadership can put aside years of acrimony to come together on this important topic. It’s clear that GSA isn’t canceling TDR or telling its customers that schedule prices are not fair and reasonable. And it’s clear the IG will continue to say schedule prices are problematic.

 


You don’t speak DoDAF? The Navy feels your pain with its new plain language design concept

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Among the first words out of Don Yeske’s mouth last Friday was “I hate architects.”

Yeske is the chief solutions architect for the Department of the Navy’s chief information officer’s office, so right away the standing-room-only luncheon crowd at the AFCEA NOVA event offered a hearty, if a little uncomfortable, laugh.

Don Yeske is the chief solutions architect for the Department of the Navy’s chief information officer’s office.

“We work in this obscure language that nobody speaks, called the Defense Department Architecture Framework (DoDAF). It’s great language, there’s nothing wrong with it. We have extra people come on board to a project or program and those people go and bother the people who are the engineers, the testers, the developers, the people turning wrenches, and they say, ‘Hey, I need information. So I can fill out this DoDAF view because we got to go back to the Joint Requirements Oversight Council (JROC) and get it approved, or we got to go back to the program officer to the milestone decision authority, the chief engineer has to approve these things in order for us to field this thing we’re working on,’” Yeske said. “I have to steal some of your time to go create this product, so that it can be checked by guess who? Another architect whose whole job is to check the homework of the first architect. They will argue, believe me; they will go back and forth with one another because if the second guy doesn’t reject the first guy’s work, once or twice, he’s not doing his job. In the meantime, what’s actually happening? Everybody else is out there, like building stuff, actually testing things and delivering things, hopefully, to the end user who actually needs them. And the architecture really served as a gating function. It really served as a thing that would slow you down, and that would prevent you from eventually delivering that capability. So I hate architects, because that’s what we do.”

And with more than a dozen reference architectures across DoD, ranging from zero trust to the Joint All-Domain Command and Control (JADC2) to cybersecurity to the Joint Information Enterprise (JIE), it’s no wonder architects are disliked, ignored and yawned at.

But Yeske is no self-hating architect. In fact, he’s more of a modern-day architect.

Doing development differently

He’s leading an effort within the Department of the Navy CIO to change not only architecture, but, more importantly, how the service delivers capabilities.

“The purpose of an architecture, taking it completely out of the DoD context, is if I’m building a house, it’s the instructions. It’s how you’re supposed to build the house that someone should be looking at. The engineers and the contractors should be looking at those plans and figuring out what to do. And by the way, occasionally saying, ‘Hey, your plans are wrong, change this or we did it differently.’ That’s how that’s supposed to work, but it’s totally not how it works in the Department of the Navy, or in DoD broadly,” Yeske said. “The grand idea that we have in the Department of the Navy is to do things differently. We’re going to push out information that encapsulates a lot of higher order architectures.”

To that end, the DoN CIO published on Sept. 6 version one of the Capstone Design Concept for Information Superiority. To borrow from the old Oldsmobile commercial, “This is not your father’s architecture.”

Yeske said it’s 14 pages long, it will be updated, and, don’t tell anyone, but it’s an architecture.

“We actually want people to pick up the document and read it because a lot of what it says is also said in the DoD zero trust reference architecture now in its second major version; is also said in the JADC2 reference architecture, now on its third major version; is also said in the cybersecurity reference architecture. There are a dozen or more major architectures that DoD and the Department of the Navy have produced that say all the same things. Nobody read them because nobody speaks DoDAF,” he said. “But we all speak English, at least passably well, and it is a highly technical language. So let’s try it. It’s a crazy different approach. Let’s see if it works.”

The Department of the Navy CIO’s office spent a year writing the Capstone Design Concept for Information Superiority. It has one overarching goal: “To securely move any information from anywhere to anywhere else.”

Under the main objective, the DoN outlined two primary outcomes that the design concept document is moving toward:

  • Operational resilience: Yeske said this is about how resilient the system or application is. Is it down all the time? Is it approachable? Is it usable? Can people depend on the thing you’re delivering, even under the worst possible circumstances? And if so, how do you know? “We’re going to ask everybody, what were your measures? And how are you doing? And how do you know how you’re doing on these lines?”
  • Customer experience: Yeske said this focuses on how easy it is for people to use your application. “That’s a crazy thought, right? The two things everybody’s going to measure and report on are, how easy is your thing to use? Tell me what your customer experience actually is? Tell me what feedback you got from the end users of your thing that told you it worked? Does it work? And how do you know?”

Very simple and straightforward questions that every developer, mission owner and architect should be asking and answering.
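Both outcomes also boil down to numbers a program can actually report. As a minimal sketch, with entirely hypothetical data shapes (the capstone document doesn’t prescribe any), the two measures could be as simple as:

```python
# A rough illustration of the two measures Yeske describes, with
# hypothetical inputs: uptime probes for operational resilience and
# post-task user ratings for customer experience.
health_checks = [True, True, False, True, True, True, True, True]  # up/down probes
user_ratings = [5, 4, 4, 3, 5]  # end-user feedback on a 1-5 scale

availability = sum(health_checks) / len(health_checks)
avg_rating = sum(user_ratings) / len(user_ratings)

print(f"Operational resilience: {availability:.1%} of probes succeeded")
print(f"Customer experience: {avg_rating:.1f}/5 average user rating")
```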

Of course, just writing a 14-page, easy to read (hopefully) document is only step one. Yeske said the DoN CIO’s office needs not just to get developers and mission owners to use it, but to get them to truly understand the value it brings.

Step one in that effort is delivering services that embody the capstone design document’s goals and objectives.

First enterprise service approved

Yeske said the Information Superiority Advisory Board recently approved the first enterprise service that exemplifies the architecture’s concepts.

“The Naval Integrated Modeling Environment hosts model-based systems engineering tools and provides a shared repository for the models to live in, so that people can iteratively, incrementally and collaboratively develop and deliver anything that you can do through a digital engineering approach,” he said. “It’s just a shared set of tools with a shared repository to do that digital engineering work. If we didn’t have that, or something like it as an enterprise service, what would we do? Well, I can tell you what we would do, because it’s what we’re doing now, everybody’s trying to create their own version of that. Everybody’s trying to create their own shared repository. Everybody’s trying to create their own standards around digital engineering.”

Through the Naval Integrated Modeling Environment, the DoN is creating a standard infrastructure with reusable services that is based on the architecture. But, for the most part, developers and users don’t need to know that.

The advisory board is likely to approve the Naval Identity Services (NIS) as its next enterprise service.

Yeske said because every application requires identity verification and authorization, it makes sense for the DoN to create that common platform.

“What we do right now is we all implement our own solutions for that. That’s a huge waste. And it’s also preventing us from getting after the next objective,” he said.
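The article doesn’t spell out how NIS works under the hood, but the underlying pattern is a familiar one: each application delegates login and token validation to one shared identity provider instead of rolling its own. A minimal sketch, assuming an OIDC-style provider and using the PyJWT library; the issuer URL and audience name are hypothetical placeholders:

```python
import jwt  # PyJWT
from jwt import PyJWKClient

# Hypothetical endpoint where a shared identity service would
# publish its public signing keys.
JWKS_URL = "https://identity.example.mil/.well-known/jwks.json"

def verify_user(token: str) -> dict:
    # Fetch the provider's signing key and validate the token's
    # signature, expiry and audience in one shared code path,
    # instead of every application implementing its own checks.
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience="example-app",  # hypothetical
    )
```

The design payoff Yeske points to is exactly this consolidation: one hardened implementation of identity checks rather than dozens of divergent ones.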

In the end, the architecture, or Capstone Design Concept for Information Superiority, is just a tool to get the Department of the Navy to its end goal: systems that serve the warfighter’s needs and that are secure, agile and standards-based.

Pretty simple to understand, even for an architect.

 

 


Martorana pressed about IT project oversight, role of Federal CIO by House lawmakers

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

Federal Chief Information Officer Clare Martorana’s time before the House Oversight and Reform Subcommittee on Government Operations on Sept. 16 lacked many of the trite lines of questioning that have usually come with federal IT hearings.

There were no complaints about the definition of a data center. Lawmakers did a nice job of keeping big “P” politics to a minimum. And concerns about specific constituent issues were mostly left out during questioning.

At the same time, Martorana, who reached her 18-month mark on the job earlier this month, kept the lawmakers at bay at least for a few more months around hot topics like cyber and customer service metrics.

What we did learn from the hearing, however, puts the Office of Management and Budget and the Office of Federal CIO on record to produce public, transparent metrics and deliver on promises in fiscal 2023.

Here are my three takeaways from the hearing:

IT funding questions

One long-held view across the federal sector over the last two decades is that agencies need more money to get out from under their technical debt.

OMB hasn’t shared a new estimate about how much truly old technology agencies are working with for at least six years. Former Federal CIO Tony Scott projected in 2016 that federal technical debt topped $7 billion.

This is why when Congress included $1 billion for the Technology Modernization Fund in the American Rescue Plan Act, Rep. Gerry Connolly (D-Va.) and others called it a down payment.

But more than a year after receiving the money, not every member of the subcommittee is convinced that more money for federal IT is the answer.

Rep. Jody Hice (R-Ga.), ranking member of the subcommittee, raised concerns about how agencies are spending money to modernize technology and the Federal CIO’s oversight of that spending.

Rep. Jody Hice (R-Ga.) is the ranking member of the Oversight and Reform Subcommittee on Government Operations.

“There’s an underlying assumption that the vast amounts of funding somewhere in the neighborhood of $100 billion a year will somehow deliver the intended results. But in my time in Congress, at least, and certainly during my time as ranking member of this subcommittee, I’ve learned that it’s probably not wise to make that assumption,” he said. “While my Democratic colleagues claimed the source of the problem is lack of funding, I, quite frankly, reject that premise. Simply pouring more money into a black hole is not a solution. What we need is solid oversight that is backed by reliable information in order to determine the true state of our federal IT and to determine whether federal IT projects are delivered on time and on budget.”

Hice’s comment may come off as partisan given Republicans’ general dislike for spending more money.

But stepping back from the “politics” of the concept, OMB’s oversight of federal spending has become less transparent.

The PortfolioStat and TechStat processes from the Obama administration have been in a deep slumber for more than five years. Former OMB staff have said PortfolioStats have not been regularly performed for several years. Instead, OMB reassigned resources to other priorities.

The PortfolioStat implementation guidance hasn’t been updated since 2015, and there has been no public discussion by OMB about how it is using the process to address IT projects that may be in trouble. In fact, the Government Accountability Office made recommendations in 2015 to improve the PortfolioStat process; OMB implemented two of the four suggestions, and GAO closed out the two that weren’t implemented.

The Federal IT Dashboard does still discuss savings from PortfolioStat, more than $407 million in fiscal 2021 alone. But there is little public evidence for how agencies achieved those savings and what OMB’s role was in overseeing those efforts.

OMB designed TechStat and PortfolioStat to bring some much needed top-level oversight to federal IT projects. At one point, OMB encouraged agencies to do their own internal oversight sessions and several did early on.

But what has happened over the last five-plus years around oversight of IT projects is unclear, and that lack of transparency came out during the hearing.

Del. Eleanor Holmes Norton (D-D.C.) asked Martorana that exact question.

“Empowering CIOs and then holding them accountable for using their authorities effectively is the goal of our subcommittee through the biannual FITARA scorecard,” she said. “So may I ask you, how will you work with Congress to provide the public data and information that will help you and your efforts to highlight IT leadership and accountability?”

Earlier on during Holmes Norton’s questioning, Martorana offered some insights into how she views her role. She said the Federal CIO helps agency CIOs navigate a complex set of rules, regulations and laws that drive their operating environments.

“It is really incumbent upon this role to make sure we are playing an oversight role, that we are measuring and, where we are able to, that we are sharing best practices across every federal agency and CIO that I work with,” she said. “We’re all trying to solve the same problems. We don’t want to start from a blank piece of paper. So when one agency goes on an IT modernization journey, for example, we want to make sure that we share those best practices across the entire federal enterprise.”

Hice piled on this line of questioning later in the hearing.

“You bring up your position, and with the ability you do or do not have to actually produce change. I’m curious about that. I’m going to give you three questions that I would like for you to respond back to the committee,” he said. “Question number one, can you supply this committee with a copy of your job description? Secondly, who established that position? How did the process come about that the Federal CIO position was established? And then thirdly, do other CIOs recognize this position and do they submit to your proclaimed authority? If you can send me an answer to those questions here in the next week or so I would appreciate it.”

Rep. Gerry Connolly, chairman of the subcommittee, added to Hice’s request, seeking answers about the Federal CIO’s relationship with the Federal Chief Technology Officer, a position that is currently vacant because the Biden administration hasn’t nominated anyone, and about how the roles of those two offices have evolved over the past decade.

All good questions from Hice, Connolly and the members because throwing more money at a problem rarely has been the answer and usually just exacerbates the underlying issues for why more money is needed in the first place.

What is the Federal CIO’s oversight role and how are they ensuring agencies are accountable for IT spending? And please don’t tell me the budget side of OMB and desk officers are the first line of defense.

TMF slush fund?

Hice reiterated his concern from the July FITARA hearing that the Technology Modernization Fund amounts to a slush fund. It’s a good sound bite, for sure. While there is little evidence or truth behind that characterization, Hice, once again, highlighted OMB’s ongoing challenge to communicate and demonstrate the value of the TMF.

Hice’s comments focused on OMB’s reduced requirement for agencies to repay the “loans,” and whether OMB is ignoring the spirit and intent of the TMF’s underlying law, the Modernizing Government Technology (MGT) Act.

“The broader MGT Act meant doing away with the types of ancient systems that still run too many of our vital government programs. In addition, the tenet of the TMF was that it would create an efficient cycle,” he said. “The Biden administration has opted for partial or even minimal reimbursements. I want to know why. It’s also emphasizing cybersecurity and customer experience projects, which in and of themselves are fine, but doing so rather than retiring old systems. Again, it’s not that these practices in and of themselves are bad, but it simply and clearly is not the intent of Congress. So why is the administration doing this? We need answers. Does the savings based model of the TMF not work? Or is it simply inconvenient? This committee needs to know and what progress is being made to retire legacy systems.”

On a side note, Hice asked if there was a definition of legacy systems. Smart folks in industry pointed out to me that yes, of course there is, and it’s in the MGT Act, which describes them as “outdated or obsolete systems of IT.”

But going back to the TMF, questions about the repayment requirements have long been a sticking point for both agencies and Congress.

Clare Martorana is the federal chief information officer.

Martorana said the year before OMB changed the repayment process, the TMF Board saw only one proposal to obtain money. That may be the first time we’ve heard that tidbit about the lack of interest in applying for the TMF.

Martorana offered a few statistics about the impact of and excitement over the TMF since the repayment changes and the influx of money that came in.

She said the board received more than 150 TMF proposals for projects totaling over $2.8 billion.

“The TMF Board has invested more than half of the TMF ARP funding, and – as the board continues to invest the remaining ARP funds – our goal is to balance speed with ensuring we invest in high quality, impactful proposals that have a high likelihood of success,” Martorana said in her written testimony. “Looking ahead, we will focus on targeted investment areas, such as those in the Customer Experience (CX) Allocation announced in June 2022, as well as coordinate within OMB and with other key stakeholders to set goals for the next fiscal year that better integrate agency budget requests and results.”

Martorana promised Hice and the subcommittee that repayment remains a goal for every TMF project.

“I think within the next year you are going to see such dramatically improved outcomes from the TMF projects, because we are managing them in a completely different way than we did previously by having technologists upfront in every single part of the investment,” she said. “We review our investments quarterly, if people are not hitting their milestones, we do not give them additional funding. If teams are failing at a component, we rally people together to be able to support them with the subject matter expertise that will help them be effective and efficient.”

But as Martorana shared, calculating and achieving cost savings from IT modernization projects isn’t easy.

Before becoming Federal CIO, Martorana was the CIO at the Office of Personnel Management where she tried to modernize old mainframes and eventually move the workloads to the cloud.

“The challenging part was we weren’t able to recognize the cost savings as quickly as I would have hoped. You had to start first by reengineering all of your business processes because you can’t just lift and shift and do exactly what you did on the mainframe without interrogating the way that you do business because newer systems are differently efficient, and they potentially have the opportunity for us to really leapfrog. So you want to make sure that you’re thinking about the business process and not just moving old antiquated [systems], because that’s the way we did it 25 years ago, to the cloud, for example,” she said. “I had originally planned once we were able to get the new mainframes up and running, I thought we would be able to sunset the old equipment, so get rid of operations and the maintenance cost and all of the ancillary costs, and staffing that had to be burdened managing those systems. It took years of compliance activity that we needed to go through in order to actually get those offline and stop paying for both. So we were really challenged in recognizing cost savings.”

It’s clear OMB has to explain to Congress why achieving cost savings, while admirable, may not make the most sense as a key end goal. Martorana’s example is a good start, but they need about 20 more explained in great detail so it sinks in with members.

Law or not, FedRAMP must improve

Connolly has been on a bit of a mission to codify the Federal Risk and Authorization Management Program (FedRAMP) for the past few years. His FedRAMP Authorization Act of 2021 was the first bill the House passed in January. Additionally, the House adopted the bill as an amendment to the 2023 defense authorization act, giving it another path to become law.

It’s now a question of whether the Senate will support it, and previously, the Senate Homeland Security and Governmental Affairs Committee had been hesitant, particularly Ranking Member Sen. Rob Portman (R-Ohio).

But Sen. Gary Peters (D-Mich.) and others introduced the Federal Secure Cloud Improvement and Jobs Act last fall to provide “quicker, more secure commercial cloud capabilities in government, which will improve cybersecurity and empower agencies to deliver modern digital services to citizens.” The bill made it out of committee in May, but hasn’t advanced on the Senate floor.

No matter what happens with the FedRAMP bill, Martorana said OMB recognizes the program needs to improve.

“We’re on a path to really make sure that FedRAMP is the most robust marketplace it can possibly be. But there are many small companies with innovative software that we would love to be able to have go through the FedRAMP program, but it is cost prohibitive for some of these small organizations,” she said. “We have actually asked members of my team to work collaboratively with GSA and the program team and really roll up our sleeves. We need to fix this to make sure that not only are we supporting the supply chain issues, making sure there’s secure software development, but also making sure that we can meet the speed of the need of federal agencies to have some innovative technology available to them with the umbrella security of the FedRAMP seal of approval, in a way.”

What that effort will look like is unclear.

To its credit, the FedRAMP program management office has consistently looked for ways to improve the program’s speed without losing any of its rigor. That led it to develop the FedRAMP Tailored process, as well as to adopt the Open Security Controls Assessment Language (OSCAL) to automate the security documentation process and speed up approvals.
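OSCAL’s core idea is that a security package becomes machine-readable data rather than a static document, so reviewers can check it programmatically. A minimal sketch of what that enables, assuming a hypothetical file that follows NIST’s OSCAL system security plan (SSP) JSON model:

```python
import json

# The file name is hypothetical; the keys below follow NIST's OSCAL
# system security plan (SSP) JSON model.
with open("ssp.json") as f:
    ssp = json.load(f)["system-security-plan"]

print("System:", ssp["metadata"]["title"])

# Each implemented requirement maps to a control (e.g., NIST SP 800-53
# "ac-2"), which a reviewer can tally programmatically instead of
# reading hundreds of pages of prose.
reqs = ssp["control-implementation"]["implemented-requirements"]
for req in reqs:
    print("Implements control:", req["control-id"])
print(f"{len(reqs)} controls documented")
```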

Just last week, FedRAMP issued its draft Authorization Boundary Guidance, which is critical to helping cloud service providers prepare their security packages for the Joint Authorization Board (JAB). The guidance is open for public comment until Oct. 17.


New CISOs come on board at VA, Transportation

The federal cybersecurity community is seeing an unusual amount of change.

In the last five weeks, no fewer than six chief information security officers or deputy CISOs took on new positions across the government.

The movement among cyber executives may not be surprising given new data from ISC2 that says there are more than 2.72 million open cyber jobs worldwide, with openings reaching 3.5 million by 2025. Additionally, the Enterprise Strategy Group found in a recent study that 60% of respondents say it takes two-to-five years to become proficient in cybersecurity, and 17% say it takes more than five years.

At a micro level, agencies and contractors are using pay in some cases (think financial services agencies) and mission appeal in most others to attract experts from other organizations.

Basically, as we’ve heard over the last decade, the competition for cyber talent is hot, so these executives moving to new positions or taking on new duties is expected given the seemingly never-ending demand for these skillsets.

Let’s start with Jay Ribeiro, who joined the Department of Transportation as its new CISO and associate chief information officer on Aug. 28.

He comes to DOT from the Justice Department’s Bureau of Alcohol, Tobacco, Firearms and Explosives, where he had been the CISO since 2018. Prior to that, Ribeiro worked at the Federal Election Commission and the State Department in senior IT roles.

Ribeiro takes over for Andrew Orndorff, who had been DOT’s CISO and associate CIO for strategic portfolio management for the last two years.

Jay Ribeiro is the chief information security officer at the Transportation Department.

In coming to DOT, Ribeiro inherits a $345 million cybersecurity budget in fiscal 2022, up from $334 million last year. DOT requested $391 million for 2023.

More specifically, Ribeiro is on tap to receive as much as $48 million in direct cyber funding from Congress, up from $39 million in 2022. In the House version of the 2023 spending bill, lawmakers wrote the money would be for “essential program enhancements, infrastructure improvements and contractual resources to enhance the security of the department’s computer network and to reduce the risk of security breaches.”

VA promotes Sherrill, Roy

Another CFO Act agency turned to a familiar face to be its new CISO.

The Department of Veterans Affairs named Lynette Sherrill as its new deputy assistant secretary for information security and CISO also on Aug. 28.

In an email to staff, Assistant Secretary for OI&T and CIO Kurt DelBene said Sherrill, who had been acting CISO for seven months, will lead cybersecurity programs and risk management activities.

“In her seven months as acting CISO, Ms. Sherrill has already led high-profile efforts, including the development of VA’s new zero trust first cybersecurity strategy — the heart of OIT’s approach to security excellence. Additionally, she is driving efforts to implement continuous evaluation of systems and metrics, allowing OIT to respond to cyber threats in real time,” he wrote. “As she begins her role as the permanent CISO, I’m confident she will continue to lead with vision and passion in service of our nation’s veterans.”

Lynette Sherrill is the Veterans Affairs Department’s new deputy assistant secretary for information security and CISO.

Sherrill has been with VA since 2004, starting out in IT security after working in industry and for the Army earlier in her career. Before she became acting CISO when Paul Cunningham retired in February, Sherrill was executive director of enterprise command operations, where she oversaw tools and capabilities to understand the dependencies across VA’s large network and to monitor the IT infrastructure so problems could be addressed before they impacted the network.

As the CISO, Sherrill inherits a cyber budget of $450 million in 2022. VA requested a $137 million increase in 2023.

Joining Sherrill is Faith Roy as her new deputy CISO and executive director for cybersecurity integrations, logistics and planning in the Office of Information Security.

DelBene said Roy is responsible for implementing cybersecurity programs, policies and strategies. She had been acting deputy CISO since Sherrill moved up in February.

“Ms. Roy brings a wealth of public and private sector expertise in information technology, human capital and financial management. She is also a U.S. Army Veteran,” DelBene wrote.

Similar to Sherrill, a few others ascended to new positions in their agencies.

Treasury, CBP hire new executives

The Treasury Department named Christopher Adams its new CISO for departmental offices at headquarters in August as well. Sarah Nur remains the Treasury CISO.

He has spent much of his career working for the Air Force and is currently an Air Force reservist with the 7th Space Operations Squadron, where he is assistant director of operations.

Christopher Adams is the Treasury Department’s new CISO for departmental offices.

Treasury has an $829 million cyber budget in 2022, which would grow significantly to $970 million if Congress funds the 2023 request. Adams doesn’t control the entire budget, but some of the funding will go to securing departmental offices’ systems and data.

More specifically, House lawmakers approved $135 million for Treasury’s cybersecurity enhancement account, which is $55 million more than it received in 2022, but $80 million less than it requested.

Lawmakers said in their report on the bill that the CEA is “a dedicated account designed to identify and support departmentwide investments for critical IT improvements, including the systems identified as high value assets.”

Once the spending bill becomes law, Treasury will have 60 days to submit a quarterly spend plan to Congress detailing how it will obligate funds, any carryover funding from previous years and how that money will be spent.

After serving for two years as the deputy CISO, Scott Davis took over as the top cyber executive at Customs and Border Protection in the Department of Homeland Security.

He joined CBP in 2020 after spending two years as the Labor Department’s deputy CISO. He joined the government in 2010, coming from industry to work on cyber issues for the old National Protection and Programs Directorate at DHS. NPPD is now the Cybersecurity and Infrastructure Security Agency.

Finally, the Defense Department brought in a familiar face to take over some key cyber activities.

Ray Letteer started in a new position as the principal deputy director for risk assessment and operational integration in the DoD CISO’s office on Aug. 15.

“It has been an honor and privilege to serve in my prior roles in the Marine Corps, and I will carry with me those lessons and examples learned over the past 19 years into my new position. Semper Fi!!” he wrote on LinkedIn.

Letteer spent the previous 19 years with the Marine Corps, most recently as the compliance branch deputy chief for cybersecurity and the service’s authorizing official for the last two-plus years. He also served as the Marine Corps’ CISO and chief of the cybersecurity division for 16 years.

SSA’s new cyber, technology leaders

One last new person in the cybersecurity community is Tim Amerson, who became the deputy CIO and deputy CISO at the Social Security Administration on Aug. 12.

He joins SSA from VA, where he was the director of infrastructure operations cybersecurity management for the last four years. Amerson worked at VA for nine years and spent 32 years serving in the Army National Guard before retiring in 2018.

And finally, one non-cyber move that is worth noting.

Sudhanshu ‘Sid’ Sinha is the new chief technology officer (CTO) at SSA, filling a position that has been vacant for some time.

Sinha comes to SSA after spending the last eight years with the IRS, where he was director of enterprise architecture. In that role over the last 11 months, he helped lead the architecture strategy and modernization planning and execution for the American Rescue Plan Act (ARPA).

“[I] had a great start the first week, meeting with the solid leadership team at SSA. I am looking forward to continuing my public service, improving outcomes and experience for the American public that rely on the SSA,” Sinha wrote on LinkedIn. “[I] wish to also convey thanks to my IRS colleagues and collaborators, for an amazing run over the last nine years.”

He previously worked as the deputy CIO for the U.S. Mint and worked in assorted IT roles in industry.

 


The fate of the SBIR program hangs in the balance of the next month

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

In about 30 days, one of the longest running and most successful small business programs will expire.

The House will have 14 days in September with votes scheduled to reauthorize the Small Business Innovation Research (SBIR) program when it returns to Washington, D.C. after Labor Day. Meanwhile, the Senate reconvenes on Sept. 6 and doesn’t spell out how many days it plans to be in D.C. and voting on bills.

To be sure, the fate of the SBIR program hangs in the balance of what Congress can do by Sept. 30.

If Congress doesn’t act, and it’s still a pretty big if at this point, the SBIR program would come to an end after 40 years. And this would be a travesty.

As Emily Murphy, the former administrator of the General Services Administration and long-time Hill staff member who worked on small business acquisition issues, wrote in April when she warned of the program’s impending expiration, the results of the SBIR program speak for themselves.

“Companies such as iRobot, Sonicare and Symantec are household names. Per the Small Business Administration, 70,000 patents and 700 public companies have resulted from the program. A recent study by scholars at Rutgers and the University of Connecticut looked at SBIR awards at the National Science Foundation and found that the SBIR program allowed the government to select risky but high impact ventures,” she said.

Now it seems likely that Congress will renew SBIR and all this consternation will have been for naught. But the question remains whether Congress lets it expire on Sept. 30 and renews it in October or November, or whenever; either way, the impact on agencies and industry alike would be significant. The threat of expiration has already put agencies behind the time curve, Murphy said.

Emily Murphy is the former GSA administrator and spent nine years serving as a staff member on Capitol Hill.

“Some small businesses will see an expiration as a sign the program isn’t stable, and may go elsewhere instead of becoming the next generation government contractors we need. They will look for other R&D funding paths that don’t promote federal mission needs, or may simply stick to traditional lines of business,” Murphy said in an interview with Federal News Network. “The government risks foreign countries closing the innovation gap, and warfighters, medical researchers, and others not receiving the support, tools and technologies they need to meet their mission.”

An expiration would impact the Defense Department in a big way, but it also would affect the National Institutes of Health, the Energy Department, NASA and the National Science Foundation.

Are “SBIR mills” a problem?

At the heart of the matter are concerns from Sen. Rand Paul (R-Ky.), ranking member of the Small Business Committee, about SBIR and how some companies game the system.

Paul outlined his concerns about what he calls SBIR mills at a hearing last September on SBIR and its cousin, the Small Business Technology Transfer (STTR) program.

“But only a select few win and have figured out how to make the SBIR program work for them and them alone. Some companies have been so successful in creating an entire business model and revenue stream that is solely for these grants they are known as SBIR mills. An analysis by the State Science and Technology Institute (SSTI) showed that from 2009 to 2019, 21% of the awards were made to the mills, which the institute defined as ‘firms who receive more than 40 phase one awards,’” Paul said. “Forty grants to just one company, may raise a few eyebrows as unnecessary and excessive and someone abusing the system. According to SBA’s public data, 196 businesses received more than 100 awards each. Some businesses received more than 900 awards. Sounds like somebody has figured out the system here.”

There is a lot of debate about whether SBIR mills really are a problem. As Paul highlighted, 21% of the SBIR awards from 2009 to 2019 went to these “mills,” but what he doesn’t mention is SSTI found 41.5% of the awards went to companies winning just one award and 56% of the awards went to companies winning two-to-19 awards, which is a pretty big spread. This means a majority of all awards went to companies winning fewer than 20 total awards over a 10-year period, an average of at most about two awards a year. Given the Defense Department made almost 17,000 individual phase 2 awards worth $14.4 billion between 1995 and 2018, two a year doesn’t seem too crazy.
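To make the back-of-the-envelope math explicit, here is a minimal sketch that simply reproduces the arithmetic from the figures cited above (no new data; the numbers come from SSTI and DoD as reported):

```python
# Companies winning 2-19 awards over the 2009-2019 window average at
# most about two awards a year.
max_awards, years = 19, 10
print(f"At most {max_awards / years:.1f} awards per year")  # 1.9

# DoD scale for context: ~17,000 phase 2 awards worth $14.4 billion
# between 1995 and 2018.
avg_award = 14.4e9 / 17_000
print(f"Average DoD phase 2 award: ${avg_award:,.0f}")  # ~$847,000
```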

A spokesperson for Paul said in an email to Federal News Network that ongoing negotiations are close to coming up with a compromise bill.

“Legislative aides for the Senate and House Small Business Committees and the House Science Committee have been in bipartisan bicameral negotiations every day for the last few weeks. As they were putting the final touches on compromise legislation to reauthorize the program, lobbyists for some of the worst offenders tried to stop Congress from taking action to curb SBIR mill abuse and destroy critical research security measures to secure the taxpayer’s investments in R&D from China, Russia, and other foreign influence,” the spokesperson said. “While our team has continued to push for a deal and work towards a compromise, Democrats are now the ones backing away from their own proposal to establish a benchmark for commercialization rather than a cap on awards.”

Cautious optimism remains

Sen. Ben Cardin (D-Md.), chairman of the Small Business Committee, offered a little more optimism about the likelihood of reauthorization.

“The SBIR and STTR programs harness the creativity and ingenuity of America’s entrepreneurs and innovators to solve the most pressing public health and national security challenges confronting our nation. Congress must reauthorize SBIR and STTR before they expire on Sept. 30 to avoid disrupting the research of small businesses participating in the program,” Cardin said in a statement to Federal News Network. “I will continue working in good faith with my colleagues to reach a bipartisan compromise that will reauthorize SBIR and STTR before it expires while protecting our national security and bringing more of the programs’ technologies to market.”

The Defense Department, which is one of the largest users of SBIR, offered its feedback on the reauthorization in July as well as comments on what the expiration would mean.

“Failure to reauthorize the [SBIR/STTR] programs will result in approximately 1,200 warfighter needs not being addressed through innovative research and technology development,” wrote Heidi Shyu, the undersecretary of Defense for research and engineering, and Bill LaPlante, the undersecretary of Defense for acquisition and sustainment, in a letter to the House Small Business Committee. “Without a program targeted towards small businesses, the department will potentially lose access to talent and innovation inherent in America’s small businesses. In addition, uncertainty in the program will discourage small companies from doing business with DoD in the future.”

DoD says the return on investment from SBIR is substantial and makes the program increasingly valuable.

(Chart source: DoD SBIR-STTR National Economic Impact Study 1995-2018)

DoD, for instance, already put out a notice on Aug. 24 saying if Congress doesn’t reauthorize SBIR by Sept. 30, it will not move forward with its current broad agency announcement.

Eric Blatt, a lawyer with Scale LLP who advises startups that engage with the SBIR program, said the program’s disruption also would mean companies possibly having to lay off employees, close altogether or seek funding elsewhere, including from China, a concern Paul brought up as a reason to hold up the reauthorization.

“SBIR is an important source of funding for companies, which are using this in addition to venture capital and other sources of funding. It’s very difficult to take an early stage company and bring the technology to market, and disrupting any source of funding can be disruptive or even fatal to that effort,” Blatt said in an interview with Federal News Network. “A lot of the technology that DoD wants to fund is hard to get venture capital dollars for because the DoD market is a difficult nut to crack and VCs look at that market as less attractive. This makes SBIR an incredibly important program to fund defense-oriented technology.”

A scalpel, not a blunt change to SBIR

Murphy, Blatt and other experts say concerns about SBIR mills are overblown. Yes, there will be some bad actors and they do need to be addressed and potentially removed from participating in the program.

“Any program of this size has underachievers, but the test Congress is promoting is a very blunt tool when we need a scalpel to determine who is producing results and who is not,” Murphy said. “There are better ways to address possible problems, such as requiring more reporting of the outcomes of phase II awards and if or when there are phase III awards or other types of commercialization.”

Blatt added many companies agree with the notion of adding more rigor to commercialization requirements of SBIR, some of which Congress is considering.

“Misuse of the SBIR program is not a significant issue in the scheme of things,” he said. “I don’t think reauthorization should hinge on this issue. It’s not a bad thing for companies to have incentives to do everything they can to turn their technology into competitive products. There are a lot of proposals on the table that have broad-based support and would further incentivize effective commercialization efforts.”

In the end, that’s what Paul is trying to do: make the program more effective and ensure the money that DoD and other agencies award is creating the next Pixar or global positioning satellite technology vs. the next Betamax or LaserDisc.

 

