Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

DHS to use federal procurement to further reduce risks to the supply chain

Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.

The Homeland Security Department’s initiatives over the past year to address supply chain risks aren’t even close to hitting a crescendo. But the pace and volume of the drumbeat are distinctly mounting.

If the efforts to ban Kaspersky Lab, ZTE and Huawei products were just the prelude to the symphony, then the National Risk Management Center’s initial sprint topics, the business due diligence request for information and the latest effort to use the power of federal procurement are the opening sonata.

Chris Krebs is the DHS undersecretary of NPPD.

“There is a growing awareness and understanding of this issue. Our biggest challenge today is not having a national strategy around it while other countries do,” said Jennifer Bisceglie, president and CEO of Interos Solutions, which provides risk assessment services. “Until we have a national strategy, you will have pop-up policies or programs or studies, like the one from MITRE. The time is beyond here to have a national strategy.”

The White House’s National Cyber Strategy gave a brief mention to supply chain risk management, saying the government should “improve awareness of supply chain threats and reduce duplicative supply chain activities within the United States government, including by creating a supply chain risk assessment shared service.” But it offered no specific details or initiatives.

Only now are those starting to emerge through a series of DHS-led efforts.

Chris Krebs, the DHS undersecretary of NPPD, offered further insights at several events over the last few weeks, setting up bigger expectations for 2019.

The National Risk Management Center seems to be one major hub of activity for many of the supply chain initiatives.

Among the first sprints the NRMC is undertaking is one around information and communications technology (ICT), with a new task force. Krebs said the kick-off meeting is this week, when the group will convene under the critical infrastructure partnership advisory council. He said it will be the government’s nexus for addressing supply chain risks.

A fact sheet on the task force provided by DHS details some of its initial goals and plans.

DHS said the group will “examine and develop consensus recommendations for action to address key strategic challenges to identifying and managing risk associated with the global ICT supply chain and related third-party risk.” It also will “focus on potential near- and long-term solutions to manage strategic risks through policy initiatives and opportunities for innovative public-private partnership.”

DHS formally announced plans for the task force in July. Without a doubt, one major focus area in 2019 will be reducing risk in federal acquisition.

“On the one hand, we have to make sure in the procurement cycle we are enabling the contracting officers to write the contracts the right way with cybersecurity in mind. But also as the decision process comes through it can be intelligence and threat informed so that we can knock off the bad options if and when they are presented,” Krebs said at the CyberNext conference. The event was sponsored by the Coalition for Cybersecurity Policy & Law, the Cyber Threat Alliance, and the National Security Institute at George Mason University’s Antonin Scalia School of Law in Washington on Oct. 4. “We also are looking at when [we] are in the deployment phase and something is out there, how do we operationalize what we know so if we have information about a compromise or some other sort of actions, how can we take the appropriate risk management steps to protect federal networks.”

Headquarters of Kaspersky Lab in Moscow

Krebs said DHS wants to get out of reactive mode when it comes to addressing these real and potential risks. The effort to ban Kaspersky Lab products, which several cyber experts have said DHS and the intelligence community knew were a problem for years, required nearly a year-long push to get the software off of federal networks, and left the government embroiled in a lawsuit.

“I don’t ever want to be in a position to have to issue a [binding operational directive] like that ever again. We want to stop those deployments from happening in the first place so how do we operationalize intelligence, how do we get it into the procurement cycle as early as possible to write smart contracts and inform the decision makers,” Krebs said. “We must have good options on the table when [we] take bad ones off the table. One of the things the ICT task force will consider is what are those incentives to drive more trustworthy options? The federal government has a great incentive package through the procurement cycle and the power of the purse.”

New details on DHS RFI

The idea of writing smarter procurements is behind the request for information DHS released Aug. 17; the agency recently made public the questions and answers from the Sept. 27 industry day.

In the RFI, DHS wants to see what capabilities exist to provide ICT information through “due diligence” research based on publicly and commercially available unclassified data.

“DHS seeks information about capabilities that address risk as a function of threat, vulnerability, likelihood, and consequences, and aggregate multiple data sets into structured archives suitable for analysis and visualization of the relationships of businesses, individuals, addresses, supply chains, and related information,” the RFI states. “The information generated through the due diligence capability will be shared between organizations and may be used in combination with other information to broadly address supply chain risks to federal, state, local, tribal and territorial governments, and critical infrastructure owners and operators.”
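The RFI’s framing of risk as “a function of threat, vulnerability, likelihood, and consequences” can be illustrated with a toy scoring model. This is purely an illustrative sketch: the RFI specifies no scoring formula, and the function name, 0-to-1 factor scale and multiplicative weighting below are all assumptions, not anything DHS has published.

```python
# Illustrative only: the DHS RFI does not define a scoring model.
# This toy function treats risk as a function of the four factors
# the RFI language names, each scaled 0.0 (none) to 1.0 (maximum).

def risk_score(threat, vulnerability, likelihood, consequence):
    """Combine four 0.0-1.0 factors into a single 0-100 risk score."""
    for factor in (threat, vulnerability, likelihood, consequence):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each factor must be between 0.0 and 1.0")
    # Multiplicative model: risk collapses if any single factor is near zero,
    # e.g. a highly targeted supplier whose product has little consequence.
    return round(100 * threat * vulnerability * likelihood * consequence, 1)

# A supplier facing a serious threat, but selling a low-consequence product
print(risk_score(0.9, 0.8, 0.7, 0.2))  # 10.1
```

The multiplicative choice is one of many; an agency with a different risk tolerance (the “calibration” DHS mentions later in its answers) might weight consequence more heavily or use a weighted sum instead.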

The General Services Administration ran a similar effort several years ago, but it didn’t get a lot of traction.

Interos’ Bisceglie said the recent RFI is addressing many of the same issues as the GSA pilot, but what’s changed is the understanding of the supply chain risks agencies and industry are facing. Interos ran four of the pilots under the GSA effort in 2016 and 2017. GSA also tried to stand up a business due diligence shared service for agencies, but it didn’t get consistent long-term support.

“[We] had several civilian agencies use it, and those that did made defendable acquisition or market decisions based on the GSA pilot. The challenge was we couldn’t get executive leadership support or get the program resourced correctly,” she said. “There is a clear need and clear void for a due diligence program. I think DHS will see how the market has matured in four years, and then put out [a] larger multi-year contract for these services. It will be interesting to have [a] multi-year program that is shared between DHS, GSA, NASA SEWP, the National Institutes of Health’s acquisition organization and others. That would get a lot of the large IT acquisition buying under one program where you could collect once and share often.”

DHS said in the questions and answers that it has not yet determined if there will be a solicitation in 2019.

“The Commerce, Justice, and Science Appropriations Act has a requirement that certain agencies (e.g. Commerce, Justice, NASA and National Science Foundation) conduct supply chain risk assessments for all of their FIPS high and moderate IT purchases. DHS is engaged with these stakeholders and reached out to them for help when drafting the RFI,” DHS states in its answers. “There is no way to ingest all data feeds but the desired outcome is to improve awareness. DHS wants to be able to calibrate the risk assessment to the risk tolerance of the end user/company.”

DHS said one less rigorous example of this type of effort already in place is with the continuous diagnostics and mitigation (CDM) program. In August 2017, DHS and GSA updated the CDM cyber supply chain risk management plan, requiring vendors to answer some basic questions related to manufacturing and tracking of the product before being added to the approved products list.

DHS states that it is working with agencies this year to discover “actionable information” that would be shared across government.

“For each risk indicator, we need to figure out what the appropriate shelf life is. Continuous data monitoring will also have an impact. Veracity: we want data from an authoritative source,” DHS states.

Connected to National Cyber Strategy

And both the business due diligence and NRMC supply chain sprint tag back to the National Cyber Strategy.  In the document, the White House makes a specific point to say DHS will have greater insight and oversight of contractor systems from a cyber perspective if they hold federal data, particularly high value assets.

Krebs said while it’s still too early to determine the exact direction of this effort, there are several questions and facets to consider.

“This is a longer term cycle where we have to look at whether GSA has the appropriate authorities. Do we have the appropriate authorities under FISMA? Do we need other federal acquisition authorities to ensure the supply chain is secure? We have a suite of tools [and] capabilities at NPPD, things like cyber hygiene scanning, things like Automated Indicator Sharing (AIS), so what sort of umbrella can we extend across the contractor base, particularly those who touch high value assets,” Krebs said. “Alternatively, what are the security outcomes we really want to achieve through contracting and we expect of our contractors, not just in the first tier but second, third and fourth tier, and how do they attest to that? There is a lot more to come here. This is a significant opportunity space.”

It’s been over a year since agencies, and DHS more specifically, started to apply a much finer and public focus on supply chain risks. The signs are clear from the White House, from DHS and from Congress that contractors and agencies can no longer be passive participants in this effort.

Having a consistent OMB DDM can impact federal management more than any specific agenda

From left: Jonathan Bruel talks with current OMB DDM Margaret Weichert and three former DDMs, Andrew Mayock, Clay Johnson and Sally Katzen, at the 20th anniversary event for the IBM Center for the Business of Government.

If you were to rank the three most important roles in government management today, Margaret Weichert currently holds two of them — the deputy director for management at the Office of Management and Budget and the director of the Office of Personnel Management.

Weichert took over the OPM role on an acting basis just over a week ago, when Jeff Pon suddenly resigned, was asked to resign or was dismissed. Nobody is really sure what happened, including most of those inside OPM, as its email was conveniently down for most of the four days when the change happened, and the White House has said little about the reasons for Pon’s departure. But that’s a story for a different time (hint, hint: DM @jmillerwfed if you want to talk).

The third role, of course, is the administrator of the General Services Administration, which Emily Murphy currently holds.

For this discussion, let’s just focus on Weichert’s role as DDM. Over the eight months since the Senate confirmed her, Weichert has rolled out the full President’s Management Agenda, initiated the Government Effectiveness Advanced Research (GEAR) effort to create an applied research effort to tackle management challenges, and is spearheading a major reorganization and reform effort across the government.

More important than these and many other initiatives, though, is that Weichert has garnered widespread respect on Capitol Hill, among industry experts and inside agencies. As we saw time and again during the Obama administration, the lack of a consistent DDM stunted far-reaching management changes.

The benefits of consistent messaging and leadership were probably the biggest takeaways from the former DDMs who took the stage Oct. 10 to help commemorate and recognize the IBM Center for the Business of Government’s 20-year anniversary.

Clay Johnson, who served six years as the DDM during the George W. Bush administration, is the standard bearer when it comes to consistency in the role. Many times during his tenure from 2003 to 2009, the OMB leaders underneath him, whether Karen Evans, Robert Shea or Paul Dennett, knew Johnson had their back, and they routinely used him to address grumblings at the agency level.

Johnson’s consistency in the position meant the Bush administration’s PMA had one voice to continually push progress.

The longer Weichert stays, the better for federal management

If Weichert does nothing else, simply staying in the DDM position for three or four years will make the kind of impact on federal management areas like IT modernization, better use of data and a reskilled workforce that the Trump administration hopes to achieve.

Weichert said one of her biggest surprises over the last eight months is the size of the appetite for change in the federal workforce.

“We can connect to the power of 2.1 million civilian workers in our workforce, who are dedicated, have passed our background investigations and have been here and know all the problems,” she said at the IBM Center event in Washington. “We have to enable the power of the people in government. They are our best brand actors. They are our best storytellers. They are the emblems of what we are all trying to do.”

So how can Weichert continue to harness the power of the federal civilian workers and enable them to power government? Here is what the former DDMs said she and others should keep in mind as the government reform effort continues to evolve:

“The way [the PMA] was designed to be done made it successful, which I think is a huge lesson for DDMs of the future. It was not done as a separate deal. Some deal across the street from OMB where a bunch of smart people were trying to figure out how the government ought to work and then go in and do it to the agencies. It was done within OMB. Most intelligently and brilliantly, it was done with the resource management officers (RMOs) at OMB. One of the most important facts of fiscal life is everybody wants their budget officer to be really happy. So if the budget officer was involved in the management programs, they were going to be paying a lot of attention at the agencies on what the management folks wanted done because it almost certainly will impact what kind of budget they got.” — Clay Johnson, DDM from 2003 to 2009.

“I think OMB is uniquely situated to provide conventional wisdom to sometimes unconventional senior people to say exactly what could or should be done maybe by taking your idea and saying, ‘We tried that once and it failed miserably. That doesn’t mean you have to abandon your idea, but it does mean you have to come to grips with the sources of the failure and you can learn from that.’ The thing I did when I was in the Obama transition was to preach nonstop, ‘LISTEN to the civil service for god’s sake. Bring them in and ask them questions.’ I kept saying the same thing because it was critically important.” — Sally Katzen, DDM from 1999 to 2001.

“Be creative and be creative also by embracing your career colleagues, in particular, as you work with them on your creative ideas. I go back to the U.S. Digital Service (USDS) example in that, one could’ve planned and drafted legislation and a whole detailed approach to this concept, then put it in a budget and asked Congress for the money and maybe some authorizing legislation to go try to pursue this thing. Or, one could just go do it within existing authorities. One had to look at that authority, look at OPM and find the right kind of hiring authorities that existed and put all the pieces together to create out of basically whole cloth this group of basically 200-plus technologists that came in overnight and worked for a couple of years. As we found out over the couple of years, that concept was validated by Congress by bringing more money to the table as they did their work and Congress bringing even broader authority through the form of legislation.” — Andrew Mayock, DDM 2016-2017.

And, of course, Weichert and other DDMs always can rely on the IBM Center for the Business of Government. Over the last 20 years, it has issued more than 350 reports, published 23 books and conducted more than 500 radio and podcast interviews with government leaders who shared their insights about the challenges and opportunities agencies face.

As the DDMs said time and again, federal management is a non-partisan issue and good government is easy to coalesce around, so let’s hope Weichert sticks around long enough to accomplish some of these important tasks.

Google decides not to bid on DoD’s $10B cloud procurement


Google will not submit a bid for the Defense Department’s $10 billion cloud procurement known as JEDI.

Aileen Black, Google’s executive director, industry lead and group leader for U.S. government, said in an exclusive interview with Federal News Network that the Joint Enterprise Defense Infrastructure (JEDI) solicitation was not right for the company for several different reasons.

“We couldn’t be assured that it would be aligned with our artificial intelligence (AI) principles. There is one single cloud vendor,” she said. “We determined there were portions of the contract that were out of scope given current government certifications and requirements. Had JEDI allowed the opportunity to have multiple vendors we could’ve submitted a very compelling solution for portions of it. Google believes a multi-cloud approach is in the best interest of government agencies because it allows them to choose the right cloud for the right workload. At a time when new technology is constantly becoming available, customers really should, like DoD, take advantage of that innovation.”

Black said Google will continue to go after cloud opportunities within DoD as well as with other federal agencies that are more open and more multi-cloud oriented.

While not surprising to industry observers, the decision by the Mountain View, California, company is a clear signal to the Pentagon that the JEDI strategy is well outside of the norm.

Alfred Rivera, the former principal director of enterprise services for the Defense Information Systems Agency and now a principal at Breakwater Solutions, said Google’s decision wasn’t surprising for several reasons.

First, he said, the company has been reluctant to “even consider offering a separate infrastructure beyond what they currently have. With JEDI, the fact that a separate dedicated infrastructure for DOD would be required doesn’t seem to fit their delivery model.”

Second, because JEDI would require some level of cybersecurity oversight by DoD, Google also hasn’t been keen on giving direct access to their systems either through the review of code or management of infrastructure components.

DoD not following cloud trends

As far as the signal that DoD is outside the norm, just take a look at what CompTIA reported in its May 2018 report on cloud computing. It found a “vast majority of companies — 83 percent — have performed some type of secondary migration [to the cloud]. Most of those have been a move of either infrastructure or applications to a second cloud provider. There are a variety of motivations here. Better offerings or features top the list, with 44 percent of companies saying this was the reason for their move. Security followed close behind, with 41 percent of companies citing concerns with their original provider. Other common reasons for a move are high costs (37 percent), more open standards (35 percent), and problems with outages (30 percent).”

Going one step further to talk just about the federal market, Nutanix found in a recent survey of federal IT managers that 20 percent of all respondents are using a multi-cloud approach, and of them, 75 percent say it’s working well or very well. Additionally, 44 percent of the respondents recognized that using multiple clouds makes them more secure.

And Deltek, the market research firm, says on average each military department already has 77 cloud providers. DoD officials have been clear that JEDI will not be the only cloud instance across the services and agencies, but will account for only about 15-to-20 percent of all cloud services.

So this takes us back to Google’s decision, which Black said was pretty straightforward once her team reviewed the strategy.

“We are aligning ourselves to contract vehicles that allow a multi-cloud approach and we are heavily pursuing those,” she said. “This certainly wasn’t an opportunity that very many cloud vendors took to support Google. Leaning forward and looking at the overall, the fact of the matter is the DoD is a multi-cloud environment and will continue to be one, and Google will pursue those multi-cloud, open source type environments because we believe that’s the right thing for our customers.”

The other big issue for Google, and possibly for other vendors, is there are requirements in the JEDI solicitation that the company couldn’t meet.

One of those was being certified at Level 6 under the DoD Cloud Computing Security Requirements Guide.

It seems only Amazon Web Services has met the Level 6 requirement. Microsoft has received the Level 5 certification.

“Our plans are to continue to meet some of the requirements, but Google is well known for our ability to provide secure solutions,” Black said. “We are continuing to scale the compliance regimes required throughout the government, however at this time in the way JEDI was currently positioned, there were some compliance or specifications that we do not meet.”

Google’s AI principles at risk

Another issue for Google, Black said, was a single cloud approach may violate the company’s AI principles. Google released its AI principles in June after employees raised concerns about its work with DoD on Project Maven.

Black said while the decision not to bid on JEDI isn’t related to Project Maven, DoD’s strategy to go with a single vendor could put Google in a tough situation.

An industry observer, who requested anonymity since their company does business with DoD, said it’s clear that with Google deciding not to bid, JEDI will come down to AWS or Microsoft.

“The CIA chose AWS so a lot of people seem to think that makes it the favorite to win JEDI.  If that happens, two questions come to mind: First, does the U.S. government care that it is on course to effectively creating a cloud monopoly?  And second, and this is probably more urgent, what are the security and insider threat implications of entrusting so much of the nation’s national security data to one cloud provider?” the observer said. “For the sake of competition and national security, I hope someone is considering both. If a top-three member of the cloud industrial base has decided not to bid on a premier opportunity like JEDI, what does that say about the DoD’s ability to leverage the breadth of American innovation the way the Chinese leverage their own?”

One question Google’s decision immediately brings up is how it will impact Oracle’s bid protest of the JEDI solicitation. Oracle submitted three amendments to its protest, including one as recently as Oct. 1.

Sources say with Google dropping out the likely bidders are Microsoft, AWS, IBM and Oracle.

Rivera said Google’s decision to withdraw likely will not impact the overall competition.

“First of all, a key part of providing the cloud solution would be support of a migration approach for DoD’s current systems (both legacy apps and current cloud based systems).  I’m not confident that Google is positioned to support a strategy to assist in migrating legacy systems into their cloud-based solution.  If Google doesn’t provide such an approach, each component would have to acquire these services through other means, thus making transitions more complex. That defeats the purpose of having a single cloud solution approach,” he said. “Finally, I think the other players that are potential candidates do have all these back-end services in place to support a single-cloud solution as well as a migration strategy for DoD‘s mission application.”

Black said while Google would’ve liked to support DoD on JEDI, it knows there are plenty of other opportunities in DoD and across the civilian sector to work on.

“Certainly large contracts like that are something every company wants to pursue. But under these circumstances, it makes sense for Google, for where we are in the market, our go-to-market model and our principles, to pursue support of the government in other ways,” she said.

Read more of the Reporter’s Notebook

Why USDA’s IT modernization effort is different this time


For the Agriculture Department, everything about its IT modernization effort is different this time.

Over the course of the last 15 years, USDA has tried to reduce, consolidate and upgrade its networks, its web services and other facets of its technology infrastructure. But agency chief information officers have found only limited success.

Gary Washington, the chief information officer at the Agriculture Department, said he has so much faith in the Centers of Excellence approach to IT modernization that he believes this time will be different.

“There is an extraordinary amount of support and commitment. The employees see that. The team here at [the General Services Administration] sees that. The team here [at] USDA sees that,” Washington said in an interview with Federal News Network. “I think probably in the past we’ve talked a lot about modernization, but I think there is a very strong commitment to modernization. Whether it is the technology itself, the policy or funding, the relationship between GSA and OMB has been phenomenal for USDA. At every level you can actually see there is a team effort to make it successful.”

It’s also more than just a commitment to change. If you look back at what previous USDA CIOs have said, whether it was Jonathan Alboum, Cheryl Cook or Chris Smith, the pledge to move to better technology and services has been a traditional talking point. In fact, in 2011, then-Secretary Tom Vilsack approved a report detailing 379 recommendations for improving agency operations and saving administrative money to reinvest into citizen services.

And each CIO made some progress. Smith took USDA’s email to the cloud, becoming an early adopter in 2010.

Cook consolidated tier one help desk services in 2014.

During his tenure, Alboum developed a cloud strategy and reduced duplicative software by reworking the agency’s email archiving contract and consolidating the contract for its emergency notification system subscription service, saving or avoiding $9 million in spending.

58 percent of USDA’s IT projects on schedule

Despite all of these efforts, USDA continued to struggle with modernization. The Federal IT Dashboard says only 58 percent of all agency projects are on schedule, while 70 percent are on budget. The IT Dashboard in 2017 reported USDA is spending about 80 percent of its IT budget on legacy IT and the remaining 20 percent on either development, modernization or enhancement, or provisioned services. Data for 2018 is not yet available.

Washington said USDA, OMB and other agencies know what the challenges are so it’s time to find a solution.

“This is modernization on a massive scale. It’s challenging, it’s hard, but it’s fun and I think, we as a government, will benefit from this,” he said.

While fun may not be the word most CIOs would use for an IT modernization initiative, Washington clearly understands all eyes are on USDA.

That started with Phase 1 of the Centers of Excellence initiative, which focused on developing an updated view of the current state of USDA, a transition plan to the new technology infrastructure and services, and a cost-benefit analysis.

And recently, GSA and USDA made contract awards under Phase 2 of the CoE effort, which will implement the recommendations made under Phase 1.

USDA picked 12 companies across all five CoEs, including 10 firms to provide cloud adoption and infrastructure optimization services. Additionally, USDA awarded a Business Modernization Office Support Services contract.

“Over the next 12-to-18 months we [will] be implementing modern solutions in those five CoEs,” Washington said. “Some of the activities we have already started implementing. Some of the quick wins will be the closure of our data centers. We were slotted to close 39 and we’ve closed 21 of the 39 already. In the data analytics CoE, we’ve rolled out a dashboard on our administrative areas across the department. In the next fiscal year, we will be focusing on program data and putting program data in the dashboard so executives and managers can make informed decisions on the same data. We will continue to improve on [that] to make the customer experience better for our farmers in the field, and we have some other functional areas we plan to address as well.”

Washington said he expects the new vendor partners to hit the ground running as they begin arriving at the agency anywhere between Oct. 11 and Oct. 18, unless there is a bid protest on the awards, which wouldn’t be at all surprising.

“We don’t have a lot of time and this is an aggressive schedule. We already have defined goals in mind that have been laid out in an approach,” he said. “The vendors have already been made aware that they have to come in here and it’s going to be an intense pace.”

CoEs to rely on agile development

That pace, and the expectation that the vendors and CoE teams will make change quickly, is another big difference. In many federal IT programs, the pace tends to be slower and there is more discovery over the initial 30 days.

“This pace makes people focus on why we are here to conduct business properly,” he said. “In this environment, in Phase 1 and Phase 2, the goal posts [have] been set and you have to meet your marks. People are really focused on making sure we implement these solutions, they work and they provide value to our customers and citizens, and to the USDA employees. There is not a lot of wiggle room to sit down and analyze things forever. It makes you think about what you are doing very quickly and [roll] things out in an expedient manner.”

Washington said Agriculture is leaning on the agile or iterative methodology, particularly through the customer experience CoE, to roll out functionality in short time frames.

“We measure success on a monthly basis, however, we meet weekly to discuss where we are. We have defined metrics going into this. We know what we want to achieve, and we manage toward those goals and performance metrics,” he said. “We already know what we want to look like and where we want to go, now it’s just about getting there and [taking] the steps to realize those goals.”

Another way this IT modernization effort is different than previous attempts is the oversight and attention senior leadership at the agency and in the White House is paying to the CoE initiative. This is especially true considering the centers of excellence are a Trump administration invention.

“I have a weekly meeting with the CoE management team. I meet with my deputy secretary on a biweekly basis. I meet with the secretary once a month,” Washington said. “We meet with the Office of American Innovation and we brief them monthly on where we are on the milestones, what we are implementing and does it bring value. We have a steering committee.”

40 percent of USDA’s apps ready for the cloud

Even with the oversight, Washington knows quite well that any IT modernization effort is really all about change management.

He said that means getting the mission and program offices to understand and accept the new ways of doing business, which include moving applications and systems to the cloud.

“We have identified systems and applications that we are going to move, and have partnered with our business folks. Beyond that, we have to be in lockstep with any larger migrations because there is a capacity part of this too. I don’t see cloud being a problem. We have a plan moving forward.”

Washington said about 40 percent of all systems and applications are ready to move to the cloud today, while for the remaining 60 percent, USDA needs to decide whether they are duplicative or necessary, and what it would take to modernize them.

Washington said the ultimate goal is for USDA to deliver digital services that are driven by data and ride on a modern infrastructure that includes commercial cloud and internal cloud providers.

It seems USDA has all the internal and external pieces in place to finally make major changes to the direction of its technology systems and services. There are a lot of eyes in other agencies, across the administration, in industry and on Capitol Hill watching closely, making the pressure to succeed even higher. Let’s hope Washington and the team of CoEs are up to it.

Read more of the Reporter’s Notebook

OPM, USCIS seeking new deputy CIOs

It has been a slow few weeks for personnel changes in the federal IT community toward the end of the summer and into early fall, after a busy summer that saw three long-time chief information officers leave their roles.

Let’s recap the summer changes: Beth Killoran moved to a new role at the Department of Health and Human Services, and Sylvia Burns took a new role at the Federal Deposit Insurance Corporation from the Interior Department. Pam Dyson left the Securities and Exchange Commission after eight years for a new job at the Federal Reserve Bank of New York.

Joining the trail of IT executives changing roles is Robert Leahy, the deputy CIO at the Office of Personnel Management.

Leahy returned to the IRS after spending almost two years with OPM, according to an Oct. 2 post on Twitter by former OPM Director Jeff Pon. Pon either resigned or was fired just three days later.

Rob Leahy (left) accepts a certificate of recognition for his almost two years at OPM from former Director Jeff Pon.

Leahy worked for the IRS for 26 years before joining OPM in January 2017 to lead the development of the agency’s IT risk management function and CIO strategic plan. He served for a short time as the acting CIO of OPM as well.

He also oversaw the OCIO’s budget and contracting functions, which managed nearly $500 million.

It’s unclear what Leahy will do in his return to the IRS. During his time with the tax agency, Leahy was the associate CIO for strategy and planning, director of enterprise technology implementation and chief in the Office of Compliance Analytics.

Over at U.S. Citizenship and Immigration Services, Deputy CIO Keith Jones retired, according to a posting on LinkedIn.

Jones had been deputy CIO since 2012, had worked at DHS since 2006 and had more than 36 years of federal service.

A few other IT executives are moving to new roles.

Paul Tibbits received a promotion to executive director of Electronic Health Record Modernization Integration at the Veterans Affairs Department.

Tibbits moves into the new role after spending the last 11 years in the VA CIO’s office. Tibbits was the program executive officer for Financial Management Business Transformation since April 2017, and before that he served in various deputy CIO roles.

He replaces Genevieve Morris, who unexpectedly left in August after coming over on detail from the Office of the National Coordinator for Health IT.

In his new role, Tibbits takes over OEHRM, which VA created in June to manage the preparation, deployment and maintenance of its new electronic health record system and the associated health IT tools.

VA and the Defense Department are working closely on this electronic health record implementation and Tibbits brings more than 28 years of experience working for the military.

Finally, Kevin Youel Page, the former deputy commissioner of the Federal Acquisition Service at the General Services Administration who left in 2017, decided to join Deloitte and end his time as an independent consultant.

At Deloitte, Youel Page is a specialist executive focusing on shared services and service delivery transformation.

Youel Page started Onetegrity with former FAS Commissioner Tom Sharpe in July 2017. Sharpe continues to run the company, according to his LinkedIn page.

Read more of the Reporter’s Notebook

With CoE approach, HUD CIO putting the business in charge of IT modernization


The decision by the Office of Management and Budget to name the Department of Housing and Urban Development as the next agency to use the Centers of Excellence (CoE) approach to modernize is both not surprising and a huge risk.

HUD has been trying to move to a new contracting approach to modernize its technology infrastructure for the better part of the last three years. Its current contract, named HUD HITS — a $1 billion award made in 2005 to Lockheed Martin and Hewlett-Packard to run its infrastructure under a managed services contract — is now three years past its initial expiration date, providing the true definition of operations and maintenance.

And as one industry source told me several years ago, HUD remains a tough place to work because the ingrained culture at the agency continues to frustrate senior executives and the management support to change the culture also comes in waves.

So putting the spotlight on HUD under the CoE initiative means OMB and the General Services Administration are betting big on the kind of top-level support that previous chief information officers didn’t have enough of over the last decade.

David Chow is HUD’s new CIO, coming over from the National Credit Union Administration about six weeks ago. He walks into the agency with a laundry list of priorities, at the top of which are the CoE initiatives and the application modernization project for which HUD received an extra $20 million from the Technology Modernization Fund (TMF).

David Chow is the HUD CIO and is leading the new CoE IT modernization effort.

Chow is the second CIO for HUD over the last year, replacing Johnson Joy, who resigned in March after only nine months.

“There has been a lack of consistency at the Office of the CIO’s level. There have been constant changes at the CIO leadership level. Each CIO comes in with a vision and it doesn’t translate to a long-term roadmap. That has been consistently an issue within the organization,” Chow said in an interview with Federal News Radio. “What the CoEs does a little differently here is we are actually having the business lead the overall effort. So let’s say one day if I’m not here, there’s still the roadmap that we are developing the foundation to be put in place. The CIO is actually interchangeable because once we have the roadmap, we can have other people help to execute the overall solution. This is part of the reason we are taking this initiative a little differently.”

The recognition by OMB that past IT modernization efforts at HUD led by Jerry Williams, the CIO from 2009 to 2013, and Rafael Diaz, the CIO from 2014 to 2017, failed to make a difference is why HUD is the perfect second agency to use the CoE approach. The Agriculture Department kicked off the CoE initiative in December 2017.

HUD’s challenges are as deep as they are wide. Just look at what the Government Accountability Office reported in 2017: HUD would spend about 87 percent of its IT budget on operations and maintenance (O&M). That was actually good news considering HUD spent 95 percent, 92 percent and 94 percent on O&M during 2014, 2015 and 2016, respectively. Final data for 2017 and 2018 was unavailable.

During his tenure, Williams focused on improving HUD’s overall project management efforts as well as improving the CIO’s oversight of IT spending. Diaz picked up on some of what Williams did and focused on creating a more accurate view of the agency’s architecture. He wanted to do away with shadow IT and take control of IT investment planning.

Without a doubt, the efforts by Williams and Diaz made some progress. The Federal IT Dashboard shows 57 percent of all HUD IT projects are using iterative or agile development methodologies, while 79 percent of all IT investments are on time and on budget.

Chow said, like many agencies, HUD continues to struggle in two key areas: IT projects are not delivering capabilities quickly enough, and there is a lack of a strategic plan to meet mission goals.

“The CoEs are a great opportunity to make sure that we are engaging the business using the General Services Administration’s proven methodology to go through the assessment and really having the business leading the initial effort under Phase 1 to look at the business processes that could be convoluted and creating difficulties for the public to use, or also internally that it’s not providing the necessary benefits from an IT standpoint,” Chow said. “Phase 1 of the initiative is to have the business look at the overall process from the business aspect and what do we need to improve upon, and then translate that into IT requirements that in phase 2 we are looking to build out.”

Chow said this approach is much different because the CIO and program managers are not dictating the overall solutions.

HUD will take a similar, though not identical, approach to that of the Agriculture Department, which moved into the second phase of the CoE effort earlier this summer.

Chow said HUD and GSA signed the interagency agreement last week to kick off phase 1, which will be a six-to-eight month effort with a planned completion date of March 31.

Then, HUD will move to phase 2, which over the following 18 months will implement the plans developed under the initial planning stage.

In the meantime, HUD is holding an IT industry day led by its Office of Small and Disadvantaged Business Utilization (OSDBU) later this fall, where Chow plans to talk about his vision and roadmap.

Phase 1 efforts will look across five areas:

  • Business process reengineering with a specific focus on the customer experience around the lifecycle of grants;
  • Cloud adoption for those associated business processes;
  • Data analytics to ensure the data is of high enough quality, using business intelligence and artificial intelligence capabilities to help leaders make better decisions;
  • Transformational changes to the CIO’s office, where the task force will address human resource challenges and oversight of IT spending;
  • Contact center, where the task force will focus on the customer experience when citizens interact with HUD around the status of grants or other transactions, as well as look at different ways to present information consistently.

Chow said GSA is committed to helping out under phase 1, but it’s unclear the direction HUD will go for phase 2 right now.

“We have this convoluted way where each office has its own grant process. At the same time, there are a number of systems in place and there is not a good way of managing the grant process through applications, which is causing confusion to the public and causing unnecessary burden on the public,” he said. “Under the secretary’s OneHUD initiative, we want to bring everyone together to look at the 80 percent solution from the business process standpoint. We want to reengineer our current grant lifecycle process to that 80 percent solution. Then for the other 20 percent, we are looking to configure a tailored solution for each of the office’s needs.”

Chow said that 80 percent solution also will help HUD focus on the data, using AI and other emerging technologies based on the risks to the agency, which, in turn, will save analysts time in reviewing and processing grant applications.

As HUD moves into phase 1 of the CoE effort and uses the money it received under the TMF, Chow said there are several short-term goals to create confidence in this latest effort across the agency.

“I want to make sure we have transparency with our project management. I want to elevate our project managers. I want to make sure I welcome people to poke into our projects and ask necessary questions. I want to partner with program offices and stakeholders to make sure they have a critical seat at the table,” he said. “It’s not going to be me that is setting out the direction of the IT, but it’s going to be a collective effort working with the program office to make sure our IT investments align to the overall HUD objectives. It’s not for me to go out there and tell them what technology we want to use, or toy we want to buy. This should be a collective effort with the program office.”

Let’s hope the CoE approach, combined with OMB and agency leadership support, finally moves HUD into the 21st century with its mission systems, as we’ve seen enough fits and starts over the last decade.

Read more of the Reporter’s Notebook

Labor pulls back telecommunications RFP, to rethink strategy after protest

Chalk up a win for the “little guys” under the $50 billion Enterprise Infrastructure Solutions (EIS) telecommunications contract.

Granite, one of the six new vendors on the governmentwide contract run by the General Services Administration, realized the first big win on EIS, coming out on top of a bid protest by convincing the Labor Department to change its “winner-take-all” strategy for its network modernization effort.

Labor told the Government Accountability Office it would take corrective action by re-releasing its fair opportunity solicitation under EIS. With this notice from Labor, GAO dismissed Granite’s protest on Sept. 25.

“Give Labor credit. In my view, they looked at the protest and said, ‘Yes, the way this is structured, we unintentionally have eliminated some of the new players and created a situation where we might not get the best opportunity for the government,’” said Sam Kline, general manager for Granite Government Solutions, in an interview with Federal News Radio. “They said they will go and change it. I thought this was going to be a long drawn out process and I think they realized this was the best thing for the department and for the government because more competition is a good thing. They will get better prices and services.”

Granite filed the first bid protest of an EIS solicitation Aug. 27 alleging the request for quotes penalized any offeror that doesn’t already have all the required services on its contract, thus favoring the incumbents of the Networx contract and eliminating many of the new vendors from competition.

Multiple emails to Labor seeking comment on the EIS decision were not returned.

Kline said as part of Granite’s protest filings, it offered an idea for a new approach Labor could take to open up the competitive landscape.

“We have suggested a different grouping that may make sense. I’m not sure how they will change it, but I think they understand the logic behind different grouping and will do something that allows for more competition,” Kline said. “We suggested grouping all voice services, all data services and all wireless services would be more logical groupings. Some of the more difficult security services that the new folks wouldn’t have easy access to could be separated out.”

Granite’s win is important for several reasons.

First, several other agencies, including the Justice Department and the Social Security Administration, were taking similar “winner-take-all” approaches to EIS. So when these and other agencies see Labor’s decision to rethink its approach, they should pause before releasing their solicitations.

Without knowing the specific reasons behind Labor’s decision, one can only guess that after reviewing the initial filings of the bid protest, the government’s lawyers likely felt they would lose the protest and it was just quicker to pull back and update the RFQ strategy.

Second, Granite, MetTel, Core Technologies, MicroTech, Harris and BT Federal need to compete on a level playing field to make any inroads into a federal telecommunications sector stocked with 25-year incumbents. This protest win is a step toward keeping the competition fair.

And third, Labor’s decision also reemphasizes the Office of Management and Budget’s goal of EIS—to modernize agency networks and infrastructure. Labor can use this pause to reconsider its modernization strategy.

“We hope agencies want the same thing as Labor, which is good competition and great services,” Kline said. “There are more competitors with the new players who are a little smaller, more limber and hopefully we will be in the mix. Hopefully, agencies which are waiting a little bit to release their RFPs will see this and it will give them some direction.”

Granite and other EIS vendors are expecting a busy winter around EIS. As of July 31, only 10 civilian agencies — eight large and two small — had released at least one fair opportunity solicitation under EIS.

The American Council for Technology and Industry Advisory Council (ACT-IAC) released two white papers and a report from its June EIS Network Modernization Summit.

From the summit, the report highlighted five big ideas:

  1. Go big, and when in doubt, go bigger. There is tremendous change capacity under EIS and chances are your plans will change before you finish. (Tim Quinn, Department of Interior)
  2. If you are not prepared to manage it, you are not prepared to transition it. (Gary Wall, Coastal Communications Consulting Group)
  3. There is no end to this journey. Be willing to change plans. (Suzette Kent, OMB)
  4. Modernizing and being late is still preferred to not modernizing and being on time. (Margie Graves, OMB)
  5. Get everyone on board – business, financial, and IT resources should all be focused on the same objective. (Crystal Philcox, GSA)

The two white papers focus on developing high quality proposals and alternatives to the “full service option” GSA used to offer under the old telecommunications contracts.

Read more of the Reporter’s Notebook

OFPP administrator, where art thou?


In about a week, the Office of Federal Procurement Policy will have been without a permanent leader for two years. That’s 720 days without a Senate confirmed, presidentially appointed executive to lead administration acquisition reform and deregulation priorities.

And Lesley Field, the deputy OFPP administrator, will become the longest-serving OFPP administrator ever, racking up more than 4.25 years as acting administrator over the last 10 years.

Field, who became OFPP deputy administrator in July 2008, has been acting four times during her tenure. The first was in September 2008 when Paul Dennett stepped down, and then three more times, including since October 2016 when Anne Rung left for Amazon Business.

While Field is technically no longer the acting administrator, as the 210-day limit under the Federal Vacancies Act kicked in months ago, she is still considered by almost everyone in the acquisition community to be the leader of OFPP.

Meanwhile, the White House has not nominated anyone to be the OFPP administrator and it’s unclear when a candidate will emerge. Several sources confirmed four potential candidates didn’t make it through the vetting process over the last two years, and the one qualified executive who would’ve made it through the process, Emily Murphy, ended up running the General Services Administration.

“[The Office of Management and Budget]’s Deputy Director for Management Margaret Weichert was just recently confirmed in February of 2018 and is working to build out her team and find the right person to lead OFPP,” OMB spokesman Jacob Wood said in an email to Federal News Radio. “Because of the great work Lesley Field, Mathew Blum, and the OFPP team do in the absence of a Senate-confirmed OFPP administrator, it has afforded Ms. Weichert the ability to selectively search for a quality caliber candidate to help drive the procurement policies laid out in the President’s Management Agenda.”

So all of this raises several questions: Does the government even need an OFPP administrator anymore? And if not, would Congress even consider changing the position from one that is Senate confirmed to one that is just presidentially appointed?

Or if so, why can’t the administration find someone to take the position?

“I would support making OFPP administrator a career position,” said Rob Burton, a former deputy OFPP administrator and now an attorney with Crowell & Moring. “Historically political appointees don’t stay longer than about two years, and that becomes a drain on office resources to prepare for confirmation hearings and new appointees. There is an enormous amount of work going into that.”

But Burton, like other current and former acquisition executives, says that while Field and the OFPP staff are among the best in government, not having a permanent OFPP administrator is problematic in specific instances.

“We are at a point in time when you could really make some significant changes to the way our federal government buys, but you have to have someone at the political level to lead that,” said Angela Styles, a former OFPP administrator during the administration of former President George W. Bush. “You have industry, Congress, the Defense Department and really everyone understands the need to simplify the acquisition system and make it easier to access technology and commercial products and services. But there is not one person leading the effort. For at least right now, that is the most significant problem.”

This is true for many of the Trump administration’s priorities ranging from IT modernization to category management to federal spending transparency to improving the management of major acquisitions. Styles also said the Section 809 panel will be making recommendations in the coming year around acquisition reforms. The OFPP administrator normally would be leading the decision process of which reforms to implement.

While Field, Blum and others in OFPP are quite capable of leading these efforts, Styles, Burton and other experts say having a political appointee in place would make a huge difference in the success of these initiatives.

“It’s the gravitas and being in the room,” said Styles, who now is a partner with Bracewell. “Lesley and staff are incredible, but you have to have somebody who is comfortable leading in that role. Only a political appointee who is confirmed will have enough gravitas to be in the room to get the okay from OMB Director Mick Mulvaney. Otherwise, it’s hard for Lesley and OFPP staff to make decisions when leading any priority. That is why we are not really seeing anything new out of OFPP. That’s not Lesley’s job. Her job is to tend the house.”

Burton added it’s well known and widely recognized that it’s easier to promote an aggressive agenda with political appointees in OFPP because politicals tend to want to deal with politicals.

At the same time, the case for not having a permanent OFPP leader any longer is getting easier to make.

A current federal acquisition official, who requested anonymity because they didn’t get permission to talk to the press, said because the acquisition community respects and admires Field and OFPP career staff so much, progress is being made.

“I think from a procurement perspective when we bring issues to her, they do get addressed. Does Lesley have the ear of all politicals? I’m not always sure,” the official said. “I’m not sure she can or is willing to escalate certain issues that may be political bombs up there. But that being said, we haven’t had anything come up in the last 15 years where the world would fall apart in procurement.”

The official said the federal acquisition system is one of the more mature areas of government, so the process for change, whether around the President’s Management Agenda or from new laws, is well known and understood.

“Lesley has the ability to push back, but has to be more graceful and know how to push back,” the source said. “You want people sitting in the agencies to help to push back against a bad idea. A lot of us raise issues to our political leadership if they have the clout to whisper back in OMB’s ears. She has been able to slow things down and have them rethink certain things because she is well admired and successful. What I like about what we do is it’s not political so it’s easier to talk common sense and impact. There is not a lot of politics surrounding what we do so that helps us have the ability for her to have good fact-based conversation.”

But others say not having a permanent OFPP administrator is slowing down the one sure way to change the acquisition process: regulations. From Jan. 20 to Dec. 31, 2017, only one final rule came out under the Federal Acquisition Regulation. And this year hasn’t been much better, with 52 open FAR cases as of Sept. 14 and only 12 final rules.

Without an OFPP administrator, and with Weichert’s focus necessarily on big-picture issues, there isn’t anyone in OMB ensuring the Trump administration’s deregulation effort is moving forward.

Additionally, sources say agencies are considering deviations to the FAR instead of proposing new rules because there is little confidence that new rules would happen anytime soon.

“It’s important to have someone who can clearly define what our path for acquisition is in regards to other elements of OMB’s priorities, like IT modernization,” the federal official said. “I’m not sure if Lesley carries the clout or has ability to push the envelope. It’s not good to have someone in there with an acting title. It implies that they are not the final decision maker. But if you have to have someone, Lesley is good and the right person to have as acting.”

But nearly all the experts say having Field continue to act is better than bringing in an underqualified or unqualified appointee. Additionally, experts say that with Murphy leading GSA, the administration may not be in a hurry to name a permanent OFPP administrator.

“The fact that Emily has a strong background in acquisition, is a political appointee and in a position to influence governmentwide policy, maybe there is a lot of communications between GSA and OFPP and that is filling the void,” Burton said. “The relationship between OFPP and the GSA administrator may mitigate the fact there isn’t a political arm at OFPP.”

There are many who believe the lack of a permanent OFPP administrator is a symptom of a bigger problem that every administration seems to have when they take office.

“One day someone will wake up and realize that procurement is really a critical piece. They brush it off as always being late or slow or because of the FAR. But they are not thinking about the fact that we are heroes in fixing the problem. If you don’t have a good contract in place, you can’t get anything done,” the federal official said.

Read more of the Reporter’s Notebook

National Cyber Strategy: 4 things agencies, vendors should know about

The White House rolled out a new cyber strategy for the first time in 15 years.

While most of the coverage of the National Cyber Strategy focused on the Trump administration’s decision to roll back Presidential Policy Directive-20 and give the Defense Department and the intelligence community more flexibility and authority to conduct offensive cyber operations, John Bolton, the national security adviser, said the real goal of the unclassified and classified versions of the strategy was to deter adversaries from attacking the government, critical infrastructures and businesses, while also preparing for the future.

“The strategy directs the federal government to take action that ensures long-term improvements to cybersecurity for all Americans,” Bolton said, during a Sept. 20 press briefing. “Recognizing that cyber must be integrated into other elements of national power, the strategy is structured around the four pillars of the National Security Strategy.  Each of the four pillars includes a number of focus areas with associated priority actions to secure and preserve cyberspace.”

The reaction to the strategy was decidedly mixed.

Rep. Mike McCaul (R-Texas), chairman of the Homeland Security Committee, said in a statement, “This strategy will help better combat malicious cyber acts from foreign adversaries like Russia, China, Iran, and North Korea. I have consistently said we must call out our enemies, send a strong message that we will respond when attacked, and ensure there are real consequences if we are.”

While Rep. Jim Langevin (D-R.I.), co-founder and co-chair of the Congressional Cybersecurity Caucus and a senior member of the Committees on Armed Services and Homeland Security, said in a statement: “While I appreciate that the Trump National Cyber Strategy is in line with the bipartisan progress that has been made over the past two decades, it does not go far enough in accelerating the reforms that need to be made. Cybersecurity is the national and economic security challenge of the 21st Century, and it deserves a whole-of-government treatment. Unfortunately, the strategy is largely a restatement of recommendations that have carried through the last several administrations.”

Industry reaction was mostly vanilla too. Many experts congratulated the White House on the strategy update and for taking a harder stance to call out and respond to cyber attacks from nation states.

For our purposes, let’s just focus on the areas where federal agencies and contractors will be impacted the most.

Here are four items from the strategy that you need to know about:

More aggressive oversight of contractor systems

While there is little new or interesting under Pillar One of the strategy, which focuses on securing federal networks and data, the section around vendors stands out. The strategy states:

“Going forward, the federal government will be able to assess the security of its data by reviewing contractor risk management practices and adequately testing, hunting, sensoring, and responding to incidents on contractor systems. Contracts with federal departments and agencies will be drafted to authorize such activities for the purpose of improving cybersecurity.”

This is, by far, the most aggressive stance the government has taken with contractors who host federal data on their networks.

And it comes after reports found Russian hackers known as “Fancy Bear” exploited small and large defense contractors.

The government has for years tried to work with contractors to protect federal data. In 2013, the Defense Department required vendors to meet National Institute of Standards and Technology (NIST) Special Publication (SP) 800-171 regulations to safeguard controlled unclassified information by Dec. 31, 2017.

The Office of Management and Budget also released similar guidance aimed at vendors in 2015.

But based on what the Trump administration is seeing, a more aggressive stance now is expected.

Read between the federal workforce lines

The clamor for more and better-trained cybersecurity workers is never ending in both the public and private sectors. Agencies have an even tougher time, as few have anything more than direct-hire authority to attract workers with this expertise.

This is why the Homeland Security Department’s new personnel readiness system, combined with its authority to pay cyber workers 20-to-25 percent more, figures so prominently in the administration’s plan to change how cybersecurity workers are managed.

“[T]he administration will explore appropriate options to establish distributed cybersecurity personnel under the management of DHS to oversee the  development, management, and deployment of cybersecurity personnel across federal departments and agencies with the exception of DoD and the IC. The administration will promote appropriate financial compensation for the United States Government workforce, as well as unique training and operational opportunities to effectively recruit and retain critical cybersecurity talent in light of the competitive private sector environment.”

To understand this concept more, check out the administration’s reorganization plan, which calls for completing the identification of gaps in the cyber workforce and creating new programs to help fill them.

Securing the federal supply chain

Over the last two years, the focus on better securing the federal government’s technology supply chain has turned up several notches. The strategy highlights the need to better integrate supply chain risk management into the acquisition process. Some agencies, such as the National Nuclear Security Administration and the Defense Logistics Agency, are out ahead of most of their peers.

“This includes ensuring better information sharing among departments and agencies to improve awareness of supply chain threats and reduce duplicative supply chain activities within the United States government, including by creating a supply chain risk assessment shared service. It also includes addressing deficiencies in the federal acquisition system, such as providing more streamlined authorities to exclude risky vendors, products, and services when justified. This effort will be synchronized with efforts to manage supply chain risk in the nation’s infrastructure.”

DHS launched its supply chain initiative earlier this year and released a request for information in August seeking to establish a business due diligence capability. Responses to the RFI are due Oct. 19.

All of this is part of the pre-planning to create this shared service and address the deficiencies in agency supply chain programs.

Legislative actions in the short term

Among the biggest holes in current federal law are the computer crime statutes, which are severely outdated and hamper the FBI and other law enforcement agencies.

The 1984 Computer Fraud and Abuse Act has been updated six times over the last 24 years, but many experts believe the current state of the law is well behind the times.

“The administration will work with the Congress to update electronic surveillance and computer crime statutes to enhance law enforcement’s capabilities to lawfully gather necessary evidence of criminal activity, disrupt criminal infrastructure through civil injunctions, and impose appropriate consequences upon malicious cyber actors.”

The goal now is to convince Congress that changing the law is both necessary and among its top priorities.

Read more of the Reporter’s Notebook

SBA, Justice, Energy innovate to deal with a 10-year-old cyber policy

Few would dispute that the Trusted Internet Connections (TIC) initiative is among the most frustrating of all the cybersecurity requirements agencies must adhere to.

Some vendors may tell agency chief information officers that meeting the requirements under TIC isn’t difficult, but it would be hard to find an agency CIO who agrees.

This is why so many CIOs are waiting with bated breath for the Office of Management and Budget to finalize new TIC policy and requirements.

Rest assured it’s coming: Federal CIO Suzette Kent said the TIC policy is one of several updates expected in the coming weeks.


Before we jump into what this new policy may look like, let’s go back in time. OMB launched TIC in 2007 around the concept of reducing the number of internet access points and then deploying advanced software tools to monitor traffic coming into and going out of agency networks.

But a lot has changed in federal technology over the past 11 years and many say the old policy is causing more problems than it’s solving, including making the full adoption of cloud services much more difficult than it needs to be.

“The goal of TIC was simple, but it was about network as the boundary,” said Susie Adams, CTO of Microsoft federal, in an interview. “Now the network isn’t the boundary anymore so what are you trying to protect against? Clearly there are legacy systems that need to be protected against threats on the internet, but when it comes to cloud the edge really moves. There is no edge anymore. If you connect to multiple clouds like most are today, how will they manage that environment? That is the real problem and TIC wasn’t built to address that problem.”

Adams said as TIC merges into the Homeland Security Department’s continuous diagnostics and mitigation (CDM) program, agencies are struggling to follow best practices where they focus on the application layer and use machine learning and artificial intelligence to monitor potential and real threats.

DHS and OMB recognized this problem and kicked off several pilots for how TIC could be upgraded.

SBA pilots new approach to TIC

The Small Business Administration was one of those agencies and recently finished its test and sent the results to OMB.

Sanjay Gupta, the SBA CTO, said at a recent cyber event sponsored by FCW that the 90-day pilot fully integrated with CDM tools to meet the requirements of the policy, but without the challenges that usually come with TIC, such as latency and complexity.

“We have out of the box functionality and when we demonstrated it to DHS, they were impressed that we have full visibility into our network,” Gupta said. “Our goal was to improve the cyber posture of SBA. We have one set of tools that oversee our entire IT environment.”

Guy Cavallo, the SBA deputy CIO, said the agency used cloud security tools to look at both on-premise and cloud network services.

“We are not matching control by control of the current on-premise TIC or CDM requirements,” he said. “We are getting alerts when people sign in from weird places or other potential threats.”

Cavallo said SBA also has to manage fewer tools, which means fewer things to patch, and it can use 100 percent of the functionality of each tool instead of 5-to-10 percent of the functionality of 30 tools.

Gupta said SBA has provided details of its TIC pilot to about 30 agencies, and 300-to-400 people have seen its demonstrations.

While Cavallo and Gupta couldn’t offer too many more details about the SBA pilot as they await OMB’s final comments, Microsoft issued a blog in June that captures more than enough of the basics to understand the pilot.

“SBA is using modern tools through the Azure security center. These are cloud tools to gather analytics that look at the metadata. We can tell if a user’s identity has been compromised, we can flag it and then ask the administrator to look at it. They can then ask the person to reset their password,” Adams said. “There are modern ways to look at the hygiene of systems, making sure they are patched and looking at it from a digital state perspective. This is true if you are managing across multiple clouds. SBA is using modern technology to get more in-depth telemetry. The current TIC is only looking at net flow data, but through the pilot SBA has all kinds of data. It takes a lot more than TIC to manage on-premise and cloud assets. SBA took a real different approach to doing that.”

Energy, Justice find alternate approaches

Along with SBA, the Energy and Justice departments are taking on the challenge of the current TIC requirements.

Like SBA, Energy was a pilot agency. Max Everett, the Energy CIO, said he has been working with OMB and DHS to improve the TIC process especially as it relates to cloud services.

“We just wrapped up the first round of a pilot for cloud email where we were looking at different options,” he said at the recent Tech Trends conference sponsored by the Professional Services Council. “We need security, but need to move forward with cloud and mobility so the TIC model and architecture has to change.”

Joe Klimavicz, the Justice CIO, said his agency moved to two TIC stacks and identical configurations for cloud services, one for Azure and one for Amazon Web Services.

“We have deployed a unique solution so our cloud is optimized under TIC. We go through the complete security stack and create a super highway to get to Internet providers,” Klimavicz said at the Tech Trends event. “We have a limited stack of security controls if we go to Azure or AWS where we know solutions are secure on the other end. There is no latency and you get some visibility into the traffic, which is great, but you don’t create any bottlenecks.”

Klimavicz said Justice is making it as fast and convenient as possible to get to the cloud services.

“The thing that a lot of folks are thinking about is having the cloud providers pick up a lot of the security controls,” he said. “It works great for the bigger players like Google, Amazon and Microsoft. But we want to be sure we can get to all the cloud service providers. We have over two dozen cloud service partners in Justice, and for the smaller ones, it’s a big burden to ask them to provide the security services. We like where we are right now.”

Beyond the new OMB policy that is expected “soon,” the work by SBA, Energy, Justice and others is changing how security experts think about cloud services.

Jim Quinn, the lead system engineer for the CDM program at DHS, said at the FCW event that SBA’s success shifted DHS’s mindset around how to integrate cloud and CDM.

“Under TIC, we are trying to figure out how we are providing a shared network service as we face the challenge of dealing with cloud. We became much less prescriptive with the DEFEND task orders,” he said.

Microsoft’s Adams said she expects the new TIC policy to be less prescriptive than the previous policy, and for OMB to rely more on the tools from CDM.

Now that Kent has said the policy is imminent, relief from TIC can only be good news for agencies.

