Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

Baldwin moves from Transportation HQ to FAA for CIO role

For the first time in recent memory, the Transportation Department and the Federal Aviation Administration will have chief information officers who are on the same page.

This is not because former FAA CIOs — there hasn’t been a permanent one in three years — were unwilling or unable to work with the larger DoT. The optimism comes from the fact that the FAA named Kristen Baldwin, the DoT deputy CIO, as its new CIO.

Sources told Federal News Network that Baldwin will start at the FAA on Feb. 17. She replaces Sean Torpey, who had been acting CIO from December 2016 until October, when he became the FAA’s executive director of National Engagement and Regional Administration. Melanie Boteler, the FAA chief information security officer, has been acting CIO since Torpey moved to his new role.

The FAA, like other large bureaus across government, has not always played well with, or taken instruction from, its parent organization. This was especially true in 2016, when former Transportation CIO Richard McKinney brought the Federal IT Acquisition Reform Act (FITARA) hammer down around two-factor authentication for privileged users, and during the 2017 budget planning process, when he required components to file a spend plan with the CIO’s office.

Current DoT CIO Ryan Cote has a plan to centralize all back-office or non-mission IT by 2021, except for the FAA. But with Baldwin on board at the administration, Cote should have an easier time aligning DoT and the FAA.

The FAA has been trying to modernize and centralize some common IT functions. In a 2017 interview, Torpey said email and other similar shared services had been successful in saving money and providing better services.

One of Baldwin’s biggest challenges will be cybersecurity. In its 2019 report on actions taken to address the FAA’s top management challenges, the DoT inspector general wrote that cybersecurity has consistently been an area of concern.

The FAA’s strategic plan for 2019-2022 also provides a clue for where Baldwin will have to focus. The plan calls for technology innovation by using data to improve decision making through artificial intelligence and advanced algorithms.

“By 2022, the FAA will have realized expanded capabilities to conduct its mission support services through more efficient use of human capital resources,” the plan stated. “The agency will have expanded its alignment with DoT as well as shared services and personnel reform goals. More efficient use of resources will have led the FAA to improve its use of taxpayer funds in continued support for American industry and increased economic opportunities. Internal process and automation improvements will have resulted in a more streamlined and efficient shared service environment within the FAA, and will factor in the reduction of regulatory costs to regulated parties.”

While FAA gains a CIO, the Agriculture and Homeland Security departments are losing key IT executives.

USDA, DHS lose IT officials

Chad Sheridan, USDA’s chief of service delivery operations, is heading to the private sector.

Sheridan said it was time to try something else after 26 years in government, including the last 10 with USDA.

He will start on Feb. 18 as the chief innovation officer for an IT services and consulting company called Net Impact Strategies.

Sheridan said he’s worked on three major programs during his career, ranging from the next-generation aircraft carrier to the development of the farmers.gov portal.

Meanwhile, Sean Hughes, the director of the Department of Homeland Security’s Enterprise Network Modernization Program/Enterprise Infrastructure Solutions (EIS) transition, is leaving federal service.

Hughes confirmed that he will be the assistant vice president for application management and operations at the Navy Federal Credit Union. He joins former DHS deputy CIO Stephen Rice, who is the NFCU deputy CIO.

His last day at DHS will be Feb. 14.

Hughes’ departure adds to a growing number of holes in the DHS CIO’s office. The CIO and deputy CIO roles are filled in an acting capacity, and soon the enterprise network modernization program director role will be too.

Speaking of the private sector, Beth Angerman, who had been the General Services Administration’s principal deputy associate administrator in the Office of Governmentwide Policy since February 2018, is joining Slalom, a technology and business consulting firm, on Feb. 12. She will be a client service partner with the company.

“Slalom is excited to have her join our federal leadership team,” said Luanne Pavco, a Slalom general manager, in an email to Federal News Network. “Her expertise and passion for people and transformation will accelerate Slalom’s vision: bringing our core capabilities to federal clients focused on modern technology, modern culture, and modern ways of working.”

Angerman left GSA in December after almost 15 years in government.

Two other changes in the federal technology sector worth noting: Andre Mendes is now the permanent CIO at the Commerce Department, according to his LinkedIn page. He had been acting since August when Terryne Murphy left for a new position at the Railroad Retirement Board.

Finally, Justin Marsico is the new chief data officer at the Bureau of Fiscal Service at the Treasury Department. He added the CDO hat to his existing role as deputy assistant commissioner.


Enhanced debriefings work, so why can’t GSA move faster to make them permanent?

Play the “breaking news” sounder. Ring the alarm bells. Get the 1920s newsboy out on the corner of 18th and F Streets in Northwest Washington, D.C., yelling to all who will listen: “If you give contractors more details on why they lost a bid, they will be happier and less likely to protest.”

That was one of the major findings of the General Services Administration’s pilot on enhanced debriefings. And it’s one we’ve known for a long time. It’s one of those results that most industry experts could’ve predicted with little fear of being wrong.

Under the pilot, called In-depth Feedback through Open Reporting Methods (INFORM), GSA tested the approach across 50 acquisitions starting in October 2018 and found the overwhelming majority of contractors and federal acquisition workers said giving more information about why a bid failed was valuable.

“Our goal was to provide industry with unsolicited insight into why they did or did not win the award, with the hope that the added information would help them improve future submissions,” wrote Jeff Koses, GSA’s senior procurement executive, in a blog post.

GSA compared the results of the INFORM briefings to those of traditional post-award discussions.

“The pilot received very favorable reviews from industry. Overall, when compared to our traditional model, industry rated the INFORM pilot higher in perception of the fairness of GSA’s evaluation and selection process (average rating of 4.58 out of a possible 5 for the test group vs. 4.14 for the control group). The pilot group also gave a higher score to the quality of information being provided (4.50 vs. 4.10) and found the information useful in improving their future submissions (4.50 vs. 4.33),” Koses wrote. “The acquisition workforce also provided feedback on their experience with the pilot through the Acquisition Workforce Survey. Overall, the acquisition workforce preferred INFORM over traditional methods with 73% saying it was a good idea and 74% said the INFORM process did not cause any delays.”

The fact that industry and government liked the enhanced debriefings is not surprising.

Anecdotal evidence going back decades shows that the more agencies and contractors communicate, share information and explain their positions, the more successful acquisitions turn out.

Now add GSA’s pilot to those of the Defense Department and the Department of Homeland Security, both of which have been testing out a similar enhanced debriefing approach for much of the last two years, and it’s clear the time has come to push through a Federal Acquisition Regulation (FAR) Council rule to define, standardize and mandate enhanced debriefings.

Congress also is on board, putting a provision in the 2018 National Defense Authorization Act that required DoD to test out this approach. The Section 809 Panel of acquisition experts also recommended the Pentagon use enhanced debriefings on a larger number of procurements with the idea that DoD will face fewer protests.

“Most debriefings are handled through short, written explanations about why you lost, and they are totally ineffective,” said Rob Burton, an attorney with Crowell & Moring and former deputy administrator in the Office of Federal Procurement Policy, back in February 2019 when GSA publicly announced the launch of INFORM. “They raise all sorts of questions for contractors which many times are answered only through the bid protest process.”

So why is GSA taking on INFORM 2.0 and testing this enhanced debriefing approach on another 300 procurements? There are some who believe in the scientific method: test, review the results and learn, then test on a larger scale to learn more before wide-scale implementation.

“Testing it out at this large scale will let us formalize and expand workforce training, and incorporate refinements as a result of lessons learned through the INFORM pilot,” Koses wrote.

There is nothing wrong with that approach. It has worked well for hundreds of years.

But acquisition experts, procurement attorneys and other observers will tell you the government doesn’t need a slow, risk-averse approach for this. The evidence is clear, and the time for enhanced debriefings is now.

Schedules consolidation phase 2

The INFORM 2.0 pilot is one piece of GSA’s Federal Marketplace (FMP) Strategy, and it has been a busy winter for moving the FMP forward.

GSA announced on Jan. 31 phase 2 of the multiple award schedule consolidation effort, which includes releasing a mass modification to all contract holders.

“During the ‘mass mod,’ MAS Schedule holders are asked to update their contracts so their terms and conditions match those in the new MAS solicitation,” GSA stated in its release.

GSA completed Phase 1 in October with a review of the schedule contracts and a roughly two-thirds reduction in the number of Special Item Numbers (SINs) as part of a move to the North American Industry Classification System (NAICS).

Federal Acquisition Service Commissioner Julie Dunne said, “There will be no change to contract numbers which makes the transition less burdensome overall. And, we’ve been steadily training our contracting workforce to ensure a seamless transition. Soon we’ll have just one schedule, with a single set of terms and conditions, making it much easier to buy and offer complete solutions.”

The mass mod also will give vendors access to SINs that they previously couldn’t bid against unless they received another contract.

GSA also is updating its templates for the schedules, including the price proposal document, to ensure vendors know what information to provide and contracting officers know what to expect and how to evaluate the data.

GSA said Phase 3 of the schedule consolidation effort is expected in the second half of 2020.

The plan to modernize and consolidate 24 schedule contracts into one started in November 2018. The goal is to create a single point of entry for the $31 billion program.


Does the e-commerce executive order throw a wrench in GSA’s effort?

The e-commerce platform that Congress mandated and charged the General Services Administration with building may face bigger challenges than any protest or industry complaint.

The White House is cracking down on counterfeit products and threatening Amazon, Walmart.com, eBay and others with suspension and debarment unless they address this growing concern.

President Donald Trump signed an executive order on Jan. 31 outlining the actions the government will take should these e-commerce platforms fail to solve these problems.

“Counterfeiting is the purest expression of intellectual property theft. And much of the counterfeit trafficking we are observing today is facilitated by e-commerce platforms like Amazon, Alibaba, eBay, Shopify, JD.com, and Walmart.com,” said Peter Navarro, the assistant to the President for trade and manufacturing policy, during a press call. “Today, when people shop online in what they think is the safety of their homes and on their favorite e-commerce platforms, they have an unacceptably high risk of being defrauded or even harmed by everything from contaminated infant formula, inferior child car seats, exploding batteries, and dangerous electronics, to deadly substances like fentanyl which can appear in fake prescription opioids. Put simply, what’s being sold on the Internet these days aren’t goods in a lot of cases — they’re ‘bads.’ This crisis is not about any one e-commerce platform. This is about e-commerce platforms as a class playing by a different set of rules that simultaneously hammer brick-and-mortar retailers, defraud consumers, steal American jobs and rip off intellectual property rights holders.”

White House trade adviser Peter Navarro speaks during a television interview at the White House in Washington. (AP Photo/Alex Brandon)

The Wall Street Journal reported in late November that Amazon, Walmart and eBay were among the companies that had expressed interest in bidding on the e-commerce solicitation.

It’s those different sets of rules Navarro describes that have been at the center of much of the uneasiness about the e-commerce platform over the last 18 months.

Questions about how e-commerce platform providers will comply with laws governing federal procurements, and whether GSA will create two separate and unequal procurement systems for the government, have consistently come up during discussions with industry.

This executive order could throw more of a wrench into GSA’s plans than any bid protest, of which there have been two. The first was an agency-level protest by Amazon. The second, more recent one, is from Overstock.com, which is challenging the terms of the solicitation, arguing that it imposes unreasonable requirements that restrict competition. GAO has until April 24 to decide this protest.

New requirements from executive order

This executive order ups the ante for GSA, raising the question of whether it can address the concerns the White House is highlighting. GSA plans to launch the proof-of-concept this spring and run it for three years.

The e-commerce platforms do not have to comply with specific laws like the Trade Agreements Act or the Buy American Act because purchases are under the micro-purchase threshold of $10,000.

But this is not the case for the new requirements under the executive order, which is why they are a bigger threat to the e-commerce platform than anything else. GSA estimated agencies spend about $6 billion a year on purchases below the MPT.

“The Trump administration is also going to hold both the counterfeit traffickers in China and the counterfeit enablers, like Amazon and Alibaba and Shopify, accountable for the great harm they are doing to American consumers, manufacturers, and workers,” Navarro said. “E-commerce hubs like Amazon, Alibaba, Shopify, and Walmart.com are putting most of the burden of monitoring counterfeit trafficking on American intellectual property rights holders rather than taking any effective responsibility. As a further blow to informed consumer choice, the e-commerce platforms are not required by law to provide country-of-origin labeling on their websites, and most have steadfastly refused to do so voluntarily.”

The EO calls on the Department of Homeland Security and its Customs and Border Protection component to take steps to establish criteria for companies or individuals to protect intellectual property and the supply chain.

“E-commerce platforms like Amazon, Alibaba, JD.com, and Shopify also are the great enablers of counterfeit trafficking,” Navarro said. “Within days that a new American innovation appears for sale on the web, counterfeit — counterfeiters in places like China can set up competing websites offering knockoffs made with inferior materials at a third of the cost and sold at half the cost. As soon as you put up a new innovation for sale on the internet, it’s likely to be copied.”

DHS advises e-commerce platforms

DHS also issued a Jan. 24 report to the President on how it can combat counterfeit and pirated goods. In the report, DHS included an entire section on e-commerce platforms, particularly those that use warehouses in the U.S. to distribute products.

“The platforms that use this model may also coordinate with customs brokers, as well as provide third-party logistics and freight forwarding services to assist with the initial delivery of goods to the warehouse. Although this model is a significant innovation for legitimate commerce and provides benefits to consumers in the form of reduced costs and shipping time, it creates a mechanism that allows counterfeit traffickers to minimize transportation costs as well, while intermingling harmful goods among legitimate goods,” the report states. “From a risk perspective, this model allows goods to enter the United States in a decentralized manner, allowing a counterfeit trafficker to spread the risk of seizure across a number of low-value packages. In situations where the fulfillment center is outside the U.S. Customs area, this model provides the opportunity to use ocean container shipping as the primary mode of transit for the shipment, which keeps overall shipping costs relatively low as ocean cargo is much cheaper than air delivery. It is in part because of these incentives that these fulfillment centers have emerged as an important element of the supply chains for many counterfeit traffickers.”

Navarro said a simple fix for Amazon, for example, would be to list the country of origin and hold suppliers accountable by taking them down once they find counterfeit or pirated items.

“There’s absolutely awful vetting, in many cases, of the third-party sellers, to the point where maybe they don’t look at them much at all, or even if they look at them, they don’t bother to see how many aliases they’re operating under. It’s almost as if they don’t want to know,” he said. “One of the most important things that Amazon could do tomorrow to show good faith to the American people would be to begin identifying country of origin on their websites. This is a tremendous disadvantage to bricks-and-mortar retailers. If you go onto a brick-and-mortar retail store, you will see the country of origin clearly on the label. And if you get a counterfeit, you can sue them. You can do neither with Amazon.”

Few grains of sand on a huge beach

He added that Amazon has refused to put that kind of information on its websites.

“My judgment borders on the criminal for them to do that because they — on the one hand, they say that they want to crack down on this stuff; on the other hand, they do not provide consumers with the appropriate information to solve the problem,” Navarro said. “E-commerce platforms like Amazon, eBay, and Alibaba claim they’re spending considerable time and money battling counterfeit trafficking, but their expenditures amount to little more than a few grains of sand on a huge beach of tainted profits they are creaming from the counterfeit trade.”

Amazon said on Jan. 24 that this year it will begin reporting all confirmed counterfeiters that it has blocked to law enforcement so they can build cases against them.

“Amazon’s anti-counterfeiting efforts are best in class but we recognize they are not perfect and will continue to innovate and work with policymakers and law enforcement to protect brands and customers,” the company said.

A Walmart spokesman told Reuters in a statement that the company takes reports of counterfeit goods very seriously and works proactively to prevent them.

“In the rare case that someone reports what they believe is a counterfeit item, we quickly block the item and then investigate promptly,” the Walmart statement said. “Today, we only see this on a very small fraction of less than one percent of total items available for sale on Walmart.com.”

Given that the bids are in for the proof-of-concept for the e-commerce platform, it’s unclear if GSA will have to issue an amendment to address the executive order requirements or will address them in another way. No matter the approach, the executive order is one more reason why getting the e-commerce platform over the finish line is getting more difficult.


Army, Navy facing critical moments as deployments of modern contract writing systems near

Raise your hand if you’ve heard this story before: An agency neglects legacy technology systems for decades, receives multiple congressional mandates to change, picks a vendor who struggles, and millions of dollars later, the agency has received little to no value.

This tale of IT modernization is playing out once again in the Army’s and the Navy’s attempts to consolidate and modernize their contract writing systems.

The question is whether the two services, which are working together but not going down the exact same path, can write different endings than those of the FBI’s Virtual Case File, the Defense Department’s Defense Integrated Military Human Resources System or the Army’s Future Combat Systems.

This month brings the Army’s first test of whether it can flip the script so the hero saves the town and everyone lives happily ever after.

“Our first deployment for a test is coming up here in February 2020, which will be at one of our Army Corps of Engineers sites, which is our Humphreys engineering support center located at Fort Belvoir, Virginia, and we’re very excited about putting that out there,” said Stuart Hazlett, the deputy assistant secretary of the Army for procurement, in an interview with Federal News Network. “They work with a financial system so there’s going to be interoperability there, and we’ll be able to see this used in a construction and facilities type manner. It will be followed in June of 2020 at the Mission and Installation Contracting Command — our command that provides installation support for the Army around the globe. So we’re very excited about these two events coming up in 2020.”

A ‘cure’ notice to CGI

The buildup to this initial deployment has faced several delays, ranging from funding to integration to the contractor, CGI Federal, struggling to meet the Army’s requirements. The Army awarded CGI a 10-year, $133.9 million contract in June 2017 with a goal of implementing the company’s Momentum software in a pilot within one year of the award.

That pilot is now two years late and the Army has had to rebaseline the program.

In fact, the project got off to such a rocky start that the Army sent CGI a “cure” notice last year.

CGI spokeswoman Jennifer Horowitz confirmed the company received a mandate from the Army to address schedule concerns and complex interface requirements with other systems.

“CGI invested additional resources in the project and partnered with Army to improve overall project performance,” she said in an email to Federal News Network. “The project is within budget and schedule parameters while meeting functionality performance parameters for deployment.”

While the Army wouldn’t comment on the cure letter sent to CGI, Cherie Smith, the Army’s program executive officer for enterprise information systems (PEO EIS), said the “stutter steps” the program has faced so far are a stark reminder of the challenge ahead of them as well.

“Right now formal testing with [the system] is scheduled for April 2020 and we have had some challenges as we prepare for the integration testing, how all the different pieces come together. But that hasn’t been the total reason for the delays in the program. We’ve not had consistent funding from the beginning. There had been some funding cuts to the program, which affected the program,” Smith said. “We had a change in direction on where we were going to host the application, so the Department of Defense is moving to cloud hosting now. We didn’t think it would make sense to build it in one environment and then turn around and move it to the cloud afterwards. We did a tactical pause and moved it to the cloud so that we would be in the cloud when we go live, and not have to worry about a subsequent move later.”

She added that those changes caused the schedule to move to the right, as did the challenges of data cleansing and of the Army and CGI understanding where all the data needed to populate the new system comes from.

Navy moving closer to testing

The Navy, meanwhile, followed the Army’s path in many regards, but is a year or more behind. The Navy awarded CGI a 10-year, $222.9 million contract in March.

While it’s too early to say if the Navy is facing the same struggles as the Army, it too faces important milestones in 2020 that could make or break the effort.

“We have validated the electronic procurement systems’ program requirements, completed all of the ‘to be’ processes. We’ve also conducted working sessions and really laid out the requirements for some of our financial interface partners, such as Navy enterprise resource planning (ERP),” said Ruth Youngs Lew, the Navy’s program executive officer for enterprise information systems (PEO-EIS), in an interview with Federal News Network. “We established a pre-production hosting environment and that hosting environment will be used for development configuration and testing activities. And we’re currently working through the configuration process.”

Youngs Lew said the Navy already is taking advantage of the lessons learned from the Army’s challenges.

“We’ll definitely be benefiting from some of the common work already completed on the Army contract writing system. From a gap perspective, as part of our source selection, the vendors had to pass the gate review that required that 80% of the product capability requirements had to be met and then the remaining requirements are part of a gap closure plan,” she said. “Over the next 6-to-12 months, we’re addressing that. We’re also cleansing the data in our legacy systems, working on configuration adaptations, with a particular focus on workflows, interfaces and data migration. We anticipate that in the next six months, we will have our interim authority to test and will conduct our integration testing. In the next 12 months, we anticipate issuance of our authority to operate, on our way to completing our user training and conducting mock data migrations, and then a limited deployment go-live to approximately 300 users.”

Go live date is July

CGI’s Horowitz said the company, the Army and the Navy are collaborating to share and reuse software configurations, interface designs, cybersecurity artifacts, software testing and consolidation of software license purchases to enable shared configurations and reduced sustainment costs.

The Army and Navy also are meeting at the program manager level on a regular basis.

“We also have a designated liaison co-located with the Army team. We coordinate reviews of deliverables, discuss and work through areas of potential efficiencies, and this includes reusing documentation such as interface designs, cybersecurity artifacts and other things,” Youngs Lew said. “We’re also looking at working on aligning software configuration decisions, coordinating software testing, enhancement requests and potentially license purchases.”

The Navy has a 15-month goal, from contract award to go-live, for its contract writing system.

Youngs Lew said deployment will roll out across the service from July to November.

“The limited deployment right now, when we talk about the capabilities and what it’s going to bring, is going to include FAR and DFARS regulations, and general procurement functionality. It will also include the financial interface to the Navy ERP system and connections to many federal and DoD procurement related systems.”

Like the Army, the Navy’s success depends on its ability to change processes and migrate data.

Cindy Shaver, the deputy assistant secretary of the Navy for acquisition and procurement, said the Navy has been working on its data for the better part of a decade. She said through tools, adopting the common DoD data taxonomy and ensuring contracting officers and other acquisition workers understand why clean data is so important, the Navy should be in good shape to migrate data to the new contracting systems.

“We’ve seen a marked increase in completeness and the quality of our data since we began monitoring, which was actually over seven years ago. We tracked a composite data score for each of our contracting activities that analyzes the number of contracts captured, and we report that score on a regular basis to the heads of those procuring activities,” Shaver said. “We’ve had some success in increasing our overall Navy data composite score. We were scoring under 50% on our scale, and now we are over 90%.”
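
The Navy hasn’t published the formula behind that composite score, but the general idea, scoring each contracting activity on how completely its contract records are captured and averaging the results, is simple enough to sketch. Here is a minimal illustration in Python; the required fields and sample records are assumptions for illustration, not the Navy’s actual method:

```python
# Hypothetical composite data-quality score for a contracting activity.
# The Navy's real formula isn't public; this only illustrates the general idea
# of scoring contract records on completeness of required fields.

REQUIRED_FIELDS = ["contract_number", "vendor", "award_date", "obligated_amount"]

def record_score(record: dict) -> float:
    """Fraction of required fields that are present and non-empty."""
    filled = sum(1 for f in REQUIRED_FIELDS if record.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

def composite_score(records: list[dict]) -> float:
    """Average completeness across all contracts captured by an activity."""
    return sum(record_score(r) for r in records) / len(records) if records else 0.0

contracts = [
    {"contract_number": "N00024-20-C-0001", "vendor": "ACME Shipyards",
     "award_date": "2020-01-15", "obligated_amount": 1_200_000},
    {"contract_number": "N00024-20-C-0002", "vendor": "",
     "award_date": "2020-02-01", "obligated_amount": None},
]
print(f"{composite_score(contracts):.0%}")  # 75% for this toy sample
```

Reporting a score like that for each procuring activity and trending it over time is consistent with the seven years of monitoring Shaver describes.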

Funding decreased in 2020

It’s that data piece that caused the Army to initially falter. Hazlett said the Army’s data grew in volume and complexity over the last three decades and the use of disparate systems didn’t make things any easier.

Smith added that the Army is trying to normalize and standardize its data to get to one version.

The assortment of data, the different business processes and the entire complexity of the systems is part of why DoD has been trying to move to a standard enterprisewide contract writing system since 2011. It’s also why Congress eventually got involved in the fiscal 2018 defense authorization bill.

Lawmakers wrote in the fiscal 2020 NDAA that the Senate Armed Services Committee “remains concerned about the inability of disparate acquisition strategies to leverage common requirements, commercial processes and solutions for writing contracts.”

Senators recommended significant funding cuts for the Army’s efforts: $19 million from procurement and $15 million from the research and development account. In the end, lawmakers agreed to cut $19 million from procurement but only $6 million from R&D.

The funding reprieve means lawmakers trust the Army and Navy, at least for 2020, to make real progress in consolidating dozens of systems.

Addressing problems sooner

The Army and CGI say they have fixed the initial troubles with the program.

The Army’s Smith said the project team meets monthly for two or more hours with CGI’s vice president for investment, the most senior executive working on the Momentum software install, to go over opportunities and challenges.

She said the Navy also joins the conversation, as do Army functional experts from PEO-EIS and contracting officers.

“That has paid great benefits, because sometimes at a lower level, people just work something and work something without raising their hand and saying, ‘Hey, we need some help.’ And because all of the senior leaders are there, we can streamline things and answer challenges or address issues quicker,” Smith said. “It’s been refreshing to have the customer and the person responsible for integrating that software and fielding it to the Army all together to cut through any bureaucracy or any challenges, to help prioritize the work, to make quick decisions if they need to be made. Nothing’s going longer than a month that we’re not all talking about it.”

The Navy’s Shaver said the service is testing its version of Momentum against a performance baseline to know where the risks are and the work that has to be done.

“In working with the company, I think that they have a plan to mitigate those risks to get us the performance that we need out of the product,” she said. “We actually put out a fairly detailed requirements document, which was a somewhat different approach. We did a test drive in our source selection where we actually put the product through its paces. So I feel like we knew exactly what the product baseline was and where the issues were, and had talked with the Army to know where their gaps were, so that we could make sure that we’re leveraging each other’s lessons-learned as best as possible.”

For both services, success is more than just launching a new, modern system. For the Navy, it means reducing the 65 databases and 75 interfaces across five legacy systems — some showing their fragility after more than 20 years — making it easier for contracting officers to do their jobs and driving the service toward auditability.

For the Army, success is defined as helping 8,000 contracting experts move away from disparate systems that are 20-to-40 years old, use data to make better decisions and reduce the burden of the acquisition process.

To achieve that success, the Army, the Navy and CGI must write the projects’ next chapters to reach the happy ending they so desire.


To keep cyber workers, Army opens up its wallet

Forget all the talk about training and the mission when it comes to hiring and retaining cyber expertise in the government.

Just look to the 1996 Tom Cruise and Cuba Gooding Jr. movie “Jerry Maguire,” and yell “Show me the money!”

The Army is taking a page right out of that script to offer service members and civilians bonuses to keep their skill sets in-house and away from the alluring private sector. And while it may be too early to say for sure it’s working, initial results are promising.

Ron Pontius, the deputy to the commanding general of the Army’s Cyber Command, said that two years into the effort to pay cyber-expert service members to stay, retention rates have increased compared to the service’s average.

“We have aligned our talent management and people strategy with the Army — acquire, develop, employ and retain. That is really what the focus is,” Pontius said at the AFCEA NOVA Army IT day on Jan. 21 in Falls Church, Virginia. “On the military side on acquire, we have a lot of ROTC and West Point cadets that want to be commissioned as a cyber officer, and we have many, many people knocking on recruiters’ doors wanting to be a cyber operator. There is not a lack of people who want to join us. So how do we look at the attributes based on what we’ve learned about who gets really good in this space and let’s be more discerning about who we commission and who we enlist to make sure we are getting what will turn out to be the best talent.”

Once the Army trains those experts, the concern is losing them to private sector jobs.

In 2017, Congress gave the Defense Department authority to offer skill incentive or bonus pay to try to stem the tide of service members leaving for higher paying jobs.

“For enlisted personnel, we have the Selected Reenlistment Bonus that focuses on junior grade personnel for retention. Cyber has the highest retention incentive across the Army; an eligible soldier can receive up to $82,000 for a six-year re-enlistment. We have paid over $1.4 million in SRB to retain our junior cyber soldiers,” Pontius said in email responses to questions. “For our senior non-commissioned officers (NCOs), we developed the enlisted Written Bonus Agreement in 2017. Eligible NCOs could receive up to $100,000 for a four-year service obligation; we have paid $3 million to 28 soldiers of the 32 that are eligible. These individuals typically have 20 years of experience; so for $3 million we retained a collective 560 years of experience for an additional 112 ‘employed-years.’”

In fiscal 2020, the Army introduced the Warrant Officer Retention Bonus (WORB) for cyber soldiers with specific technical capabilities.

“The warrant officer could sign an agreement with the Army to serve four years for an $80,000 lump sum bonus. We had a budget limitation of $1 million for this program,” Pontius said. “To date, we have had 10 personnel accept the WORB and that resulted in three warrant officers withdrawing retirement paperwork to continue to serve. The acceptance of WORB by these 10 personnel results in 40 additional ‘employed-years’ of experienced personnel contributing to the cyber mission set.”
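
The “employed-years” figures Pontius cites are straightforward multiplication: soldiers retained times years of additional service obligated. Here is a quick check of the arithmetic using only the numbers quoted above (the $800,000 WORB total is an inference from 10 acceptances at $80,000 each, not a figure the Army provided):

```python
# Checking the retention math Pontius describes, using figures quoted above.
# "Employed-years" = soldiers retained x years of additional service obligated.

wba_soldiers, wba_term, wba_experience = 28, 4, 20    # Written Bonus Agreement NCOs
print(wba_soldiers * wba_experience)  # 560 collective years of experience retained
print(wba_soldiers * wba_term)        # 112 additional employed-years for $3 million

worb_accepted, worb_term, worb_bonus = 10, 4, 80_000  # Warrant Officer Retention Bonus
print(worb_accepted * worb_term)      # 40 additional employed-years
print(worb_accepted * worb_bonus)     # 800,000 -- within the $1 million budget cap
```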

For civilian employees, who, for example, make up 20% of the typical cyber mission force team, the Army has fewer retention options.

“We’ve had to do an engagement, a marketing effort with selected colleges and universities and we are building internships and fellowship programs to bring young men and women into our force,” Pontius said. “Can we truly compete with what they can get from industry for the high end operators on compensation? The answer is no. But what we can compete on is mission, serving the nation, teamwork and camaraderie. That is what we are really working on and trying to make sure that the compensation is the best that we can do it.”

Career path for civilian cyber workers

Over the long term, the Army is considering recommending a proposal in the fiscal 2022 defense authorization bill to develop cyber professional pay for both military and civilian personnel.

“This may require modification of 5 USC and 37 USC,” Pontius said. “We are also exploring a civilian equivalent of Assignment Incentive Pay (AIP) for that percentage of the workforce qualified within those critical work roles.”

Along with the Army, the Department of Homeland Security has found success in using bonuses to retain key cyber workers. In 2016, DHS began testing whether an additional 20% to 25% on top of an employee’s annual pay, depending on the certifications they’ve earned and the position they occupy, would keep them in government longer. This proof of concept is now part of DHS’ new personnel system that the agency is expected to finalize in early 2020.

As much as more pay is nice, the Army also is taking on another potential obstacle to keeping civilian workers in their jobs longer. The Army created a career path for civilian cyber workers, called Career Program 71, to help ensure they have the training and skill sets needed for today and the future.

First planning meeting in February

Pontius said about 600 civilian employees are in CP 71 today.

“With a renewed, centralized focus on this newly defined community, CP 71 aims to bring equity across the Army cyber effects community for training, education and development opportunities, provide more employee and leadership engagement activities and a concentrated effort to identify and address any challenges that exist across the force, specific to this demographic,” Pontius said. “This new community mirrors the 17 series Army Cyber Branch for civilian training, education and professional development of the non-military cyberspace operators who work alongside their soldier counterparts in this mission area. The CP 71 Career Program Office develops, empowers and advocates on behalf of the Army’s civilian cyberspace effects workforce. This includes affecting policy on sustainment, retention and employee engagement. The cyberspace effects career program provides a central hub of workforce coordination that trains, educates, and develops our globally-distributed team of professionals. The new program is anchored in a four-point training strategy which includes functional/technical training; self-development; academics; and broadening opportunities.”

In February, the CP 71 career program planning board will hold its first meeting to discuss strategic direction and needs as well as receive feedback from current CP 71 employees.

“A myriad of topics will be discussed, including workforce development, strategy and continued development of the program,” Pontius said.

Pontius added that the Army also is working with the Office of Personnel Management to address a special pay rate for cyber positions, particularly in Augusta, Georgia, the new home of the Army Cyber Command.

“We have programmed additional incentive funds to be used to provide equity in incentives for the cyber civilian workforce to address the gap,” he said.


GSA clearing path for ‘pay by the drink’ cloud-buying model

A vendor recently talked about their predictions for 2020 and called it the year that the consumption IT model takes hold.

Initially, it was easy to place that prediction into the “buzzword of the day” file given it’s been a common refrain from this contractor for much of the past few years.

But wait, maybe, just maybe they actually are on to something?

For evidence, look no further than the General Services Administration’s draft memo on letting agencies buy cloud services off the schedule contract through this consumption or “pay by the drink” approach.

GSA senior procurement executive Jeff Koses sent industry a draft acquisition letter for comment detailing how buying consumption-based cloud services could work.

“While offering cloud computing on a consumption basis is already permissible under the FSS program, the procedures outlined in this acquisition letter further improve this strategy while seeking to realize the best practices in the private-sector of cost transparency and efficiency, increased cyber security and more robust competition,” Koses wrote in the letter, which Federal News Network obtained. “By tying cloud computing procurements to commercial market prices, still in the fixed price family, this approach provides cost transparency without burdening contractors with additional transactional price reporting requirements. This approach promotes cost efficiency as it reduces the need to lock into long term contracts in markets where falling prices are reasonably anticipated. And with a contract structure more closely tied to real time demand, this approach also provides greater flexibility to take advantage of technology improvements and better support cyber security.”

The draft letter also said using this consumption-based approach to buying cloud services will increase competition because it moves agencies toward commercial best practices.

“Buying cloud computing on a consumption basis will also provide GSA with information to evaluate the potential to purchase other types of information technology on a consumption basis,” Koses wrote.

5 questions on buying cloud

Federal News Network confirmed GSA sent this letter to several industry associations seeking feedback.

“Per-consumption is a very popular way to acquire cloud services. It is an established acquisition method in the commercial market. GSA is moving in the right direction by allowing the consumption model to be used on the schedules program. It was essential if the schedules were to remain competitive in this area,” said Larry Allen, managing director of BDO and a federal procurement expert. “I think industry will have to weigh in on what the draft policy said about fixed-pricing and discounts. I am not sure how compatible that is with how consumption is sold commercially. I suspect that will be an area of particular feedback.”

GSA laid out five questions for industry to answer. These range from whether a fixed-price contract makes sense, to price fluctuations, to questions trying to better understand the pay-as-you-go model in the private sector.

Additionally, GSA is asking for input on the “requirements task order concept” proposed as part of the consumption model.

“The concept we’re exploring is different in that the government’s obligation to satisfy its requirements is limited to a task order issued against an FSS contract; instead of orders against a requirements contract, future requirements will be satisfied by activating contract line item numbers (CLINs) on a task order,” the draft letter states. “From a fiscal standpoint, this concept is similar to a typical requirements contract in that funds for future requirements will be obligated when the government activates a CLIN and obtains those services.”

The approach GSA lays out in the draft memo offers several key concepts for the consumption model:

  • Agencies must use no-year or multi-year funding that isn’t expiring in the fiscal year they are buying the services.
  • Each contract must have a ceiling price for all estimated cloud computing, and orders may not exceed 50% of the initial quantity ordered for the same line item.
  • The Price Reduction Clause will not be applied to these orders; instead, contractors must use the Transactional Data Reporting standards.
  • Agencies will use a firm fixed-price contract that is based on the vendor’s market price list or an index with a discount that will remain constant as the price list or index changes (a sketch of this pricing mechanic follows the list).
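
To make those last two bullets concrete, here is a minimal sketch of how the pricing mechanic might work. The class name, the 15% discount and the unit prices are illustrative assumptions, not anything from the draft letter; the point is that the government’s price floats with the commercial index while the negotiated discount stays fixed, and no funds are obligated until a CLIN is activated:

```python
# Hypothetical sketch of the consumption pricing mechanic described above.
# Class name, discount and prices are illustrative assumptions, not GSA's model.

from dataclasses import dataclass

@dataclass
class ConsumptionCLIN:
    """A contract line item priced against a commercial index with a fixed discount."""
    name: str
    discount: float       # negotiated discount off the index, constant for the order
    active: bool = False  # funds are obligated only when the government activates it

    def unit_price(self, index_price: float) -> float:
        # The government's price floats with the commercial index,
        # but the discount off that index never changes.
        return index_price * (1.0 - self.discount)

    def monthly_charge(self, index_price: float, units_consumed: float) -> float:
        if not self.active:
            raise RuntimeError(f"CLIN {self.name} is not activated; no funds obligated")
        return self.unit_price(index_price) * units_consumed

# Example: the commercial list price drops mid-year; the agency's effective
# price drops with it while the assumed 15% discount holds constant.
storage = ConsumptionCLIN(name="cloud-storage-GB-month", discount=0.15)
storage.active = True  # activation is the point at which funds are obligated
print(storage.monthly_charge(index_price=0.023, units_consumed=50_000))  # ~977.50
print(storage.monthly_charge(index_price=0.021, units_consumed=50_000))  # ~892.50
```

This also mirrors the “requirements task order concept” in the draft letter: future requirements sit as inactive CLINs on a task order, and activating one, rather than awarding a traditional requirements contract up front, is when the government obligates funds.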

“[T]he Office of Acquisition Policy will partner with the Federal Acquisition Service’s IT Category office in collecting and analyzing data on key metrics,” the memo states. “Final metrics are intended to address whether the government realized cost transparency and efficiency, increased cyber security and obtained more robust competition.”

GSA’s ITC office is developing a best practices guide for purchasing cloud computing on a consumption basis using GSA Schedules.

Clearing the path as spending goes up

Industry experts praised GSA for taking this initial step to lay out these proposed changes.

“The coalition supports efforts to streamline and improve the flexibility of cloud contracting in furtherance of agency mission fulfillment,” said Roger Waldron, the president of The Coalition for Government Procurement, in a statement to Federal News Network.

Buying cloud services has faced significant obstacles for some time. The Office of Management and Budget has called out the need for a flexible model for buying cloud services since the initial 25-point IT modernization plan, offered up in 2010 during the Obama administration.

The Trump administration, in its Cloud Smart strategy, said agencies often buy cloud services through contracts that are not marketed specifically for that purpose.

Clearing the path to buy cloud services becomes even more important as agencies are expected to increase their spending over the next few years. IDC Government Insights predicts significant growth through 2021, when spending is expected to reach $3.3 billion. Deltek, a market research firm, estimates cloud spending will account for 9% of the federal civilian IT market by 2024.

An industry source, who requested anonymity in order to talk candidly about the draft memo, said this is another example of how GSA has been leaning forward on several contracting initiatives.

“I think some of this is about educating the workforce since the schedules already allow for this type of buying,” said the source. “This draft letter gives GSA a broader opportunity to write down these authorities that allow them to use existing authorities to do their job. What is expressed here is similar to the way GSA has bought cloud as an agency. This consumption model has been done and tried in some variation. So the fact GSA is thinking about it means they are putting a stake in the ground to say cloud is important and essential for government to adopt in a bigger way.”

This isn’t the first time GSA has tried to make it easier for agencies to buy cloud services. In 2016, the agency began work to change the Federal Acquisition Regulation and even considered legislative remedies, but it’s unclear if anything came from those initiatives. A year later, an interagency working group developed a best practices guide for buying cloud services.

The source added the draft memo is less about the federal acquisition process and more about how buying cloud services butts up against the federal appropriations process.

“It is telling that the memo calls out multi-year or no-year money as the money to do this with because single-year money has too many challenges,” the source said. “Buying large chunks of cloud services is not necessarily efficient and there are more ways to be efficient, especially given how industry is selling these services.”

In the end, that is what GSA is trying to do here — give agencies some top cover to address one of the obstacles in moving to the cloud.


After canceling 2 contracts, DISA tells industry: ‘We need you to deliver’

Tony Montemarano recently made a plea to a packed room of vendors: “We need you to deliver on contracts.”

The executive deputy director of the Defense Information Systems Agency isn’t telling industry anything it doesn’t already know. But after canceling, or in government speak not picking up the options of, two contracts last year, Montemarano felt it was worth pressing that message into the hearts and minds of contractors.

“The biggest issue I have as the acquisition executive as I overlook all the programs, I see failures of development efforts on the part of our managers and the industry supports who are involved. When you bid the contract, we love it for you to be as aggressive as possible. But we need you to deliver,” Montemarano said at the AFCEA DC lunch on Jan. 16 in Arlington, Virginia. “The fact of the matter is we’ve had to terminate more than one contract this year because the vendor hadn’t been developing the product we needed them to develop.”

DISA executives (from left) Tony Montemarano, Dave Bennett, Roger Greenwell, Jason Martin and Steve Wallace spoke at the AFCEA DC lunch on Jan. 16. (Photo courtesy DISA)

He declined to offer any more specifics about the contracts or who the vendors were, saying his goal was not to embarrass the contractor, but to educate the broader audience.

Montemarano is quick to point out the blame doesn’t sit solely on the shoulders of the vendors. DISA also was responsible for the contract problems, which caused one of the programs, which focused on command and control systems, to have its schedule slip by a year.

The message to industry and the lessons learned by DISA were clear. Contractors need to be smarter about how they bid on contracts, providing DISA with aggressive, yet well-thought-out plans, while DISA is getting away from the “big bang” and moving toward a more modular approach to contracting.

“We collectively have to be a good team. One thing we can’t control is if a vendor low balls it and then can’t deliver. Given the current personnel situation we have in our business, it’s hard for vendors to hire the right people and it’s hard for us to have the right people,” Montemarano said in an interview after the panel. “What’s very important is we come to grips with the fact this is not 1990 anymore. In 1990, DoD wrote the code for everything and did all the development. We don’t do that anymore. It doesn’t work. Given the business we are in, it’s pretty much exploiting commercial solution sets and just adapting to military applications. Given that’s the environment we are in today, we should be able to produce better.”

He said DISA is now using an iterative approach for the command and control program that fell behind and for other similar efforts.

Don’t promise us the world

The message to industry about how DISA’s expectations are changing isn’t just coming from acquisition, but also from the technical side.

Steve Wallace, a system innovation scientist in DISA’s Emerging Technology Directorate, said any successful program begins by government and industry having honest conversations about what is possible.

“Don’t promise us the world. Let’s have a legitimate conversation about technology, the weaknesses and the strengths. Anytime someone comes in and tells me about all the great and wonderful things, but doesn’t talk about any of the potential pitfalls or where things may need to get better, it instantly sends me into a defensive, ‘What are they not telling me?’” he said. “As we meet with you all and have those conversations, let’s really talk about those strengths and weaknesses. It’s not because I’m trying to use it against you, it’s because I’d rather find it out upfront and figure out a way together that we can work it out.”

Roger Greenwell, the DISA chief information officer and risk management executive, said industry must act more quickly, especially in addressing cybersecurity threats.

Dave Bennett, the director of DISA’s operations center, added that vendor proprietary solutions aren’t helpful when the agency’s environment is so disparate, with hardware and software that is 30, 40, even 50 years old.

Montemarano and the others’ comments are both refreshing and poignant.

Too often agencies do not cancel, or do not pick up the option, on contracts because it’s just too difficult to start over. And too often, agencies are not willing to stand up to vendors when programs go awry for fear of lawsuits and more delays.

The fact that DISA not only terminated two contracts, but took some of the responsibility for the failure and talked about it publicly, is an important step toward making the procurement process better.

While DISA is far from perfect with its contracting efforts — there still is a hangover in industry about the agency’s use of lowest-price technically acceptable (LPTA) — there is an important recognition of how it needs to change to ensure long-term success of its programs.


IT executive retirements hit USDA, Commerce while HHS, GSA lose key personnel

The end of any calendar year usually means change in the federal community. Whether because of retirements or federal executives moving to new jobs in the public or private sector, there are a lot of people on the move.

So here are some of the prominent changes that happened over the last two months. To be sure, this is by no means a complete list but one that we’ve cobbled together from assorted sources. If we missed anyone, please let me know.

The Federal Deposit Insurance Corporation has a new chief information officer. The departments of Commerce, Agriculture, and Health and Human Services; and the U.S. Agency for International Development have or will have openings in their IT shops for executive-level positions.

These are among the notable comings and goings in the federal IT community over the last two months.

Moves at FDIC, USDA, Commerce

Sylvia Burns is the new CIO at the FDIC, replacing Howard Whyte, who left earlier this month to take a new job in the private sector. Whyte spent his three years at the FDIC addressing long-time technology challenges around cybersecurity, workforce skilling and moving the agency to the cloud.

Meanwhile, Burns has been with the FDIC for almost two years as its deputy CIO. Previously, she spent four years as the Interior Department’s top technology executive.

In a release, FDIC Chairwoman Jelena McWilliams called Burns the “perfect choice” to continue to modernize and secure the agency’s data and systems.

Burns is expected to continue in that same direction, bringing her experience from Interior, where she pushed that agency into the cloud early and attempted to modernize other services.

While the FDIC has a new CIO, USDA and Commerce are both looking for new IT executives.

Rory Schultz, the client executive for customer service and former deputy CIO at USDA, announced he is retiring from government on Feb. 28. Schultz came into that role as part of the USDA reorganization and modernization effort.

“I plan to be off for a couple months and then go back to work as a contractor in May of 2020,” Schultz wrote on LinkedIn. “I’m interested in work as a consultant, as a CTO, or as an operations manager, all of which I am well qualified for and which interests me. No decisions yet although I have started some discussions with small and medium-sized women-owned businesses. That said I am pretty well able to work anywhere, be it small, large or medium.”

Schultz has been with USDA for almost 10 years and before that worked at the Treasury Department for seven years as the director of headquarters IT. In all, Schultz has 33 years in government, starting as a summer intern and research analyst with the Executive Office of the President during the Reagan administration and then returning as a GS-4 research analyst with the Records Management Branch at the Treasury Department.

Over at Commerce, Renee Macklin, the agency’s director of IT shared services, retired on Dec. 31, according to a note sent to colleagues at AFFIRM. Macklin was the government vice president of programs at the organization.

As the director of IT shared services, Macklin oversaw enterprise services across areas like human resources and acquisition for Commerce bureaus. She helped implement automation and improved processes, leading to cost savings and better decision making for the agency.

She joined government service in 2001 as the CIO of the International Trade Administration. Macklin spent just under two years as the Small Business Administration’s CIO before coming to Commerce in 2015.

GSA’s NewPay loses second top executive since September


Finally, HHS is looking for a new deputy CIO for enterprise services after Amy Haseltine joined the General Services Administration in October as the NewPay director under GSA’s human resources quality service management office (QSMO). She replaced David Vargas, who joined the Department of Housing and Urban Development in September as the acting deputy assistant secretary of the Real Estate Assessment Center. Vargas had joined GSA in August 2018 from the Office of Personnel Management to oversee GSA’s new service management office.

This means the top two executives for NewPay, the administration’s strategy to modernize federal payroll services using software-as-a-service, have left since September. Beth Angerman, who was the principal deputy associate administrator in the Office of Governmentwide Policy, left in December to take some time off and potentially join the private sector.

Haseltine, who spent 13 years at HHS, will try to bring some momentum to NewPay, which the Government Accountability Office reported in April faces ongoing planning and implementation challenges.

Haseltine, who has spent more than 30 years in government, is no stranger to this type of challenge, helping to turn around HHS’ Federal IT Acquisition Reform Act (FITARA) efforts.

There are a few other notable changes in the federal technology sector, but outside of the CIO world.

Kelly Morrison, who came to USAID in April to help the CIO’s office coordinate technology across the agency’s development assistance efforts, is leaving to join Grant Thornton at the end of January.


At the private sector firm, Morrison will be a director leading Grant Thornton’s Technology Business Management (TBM) practice.

HHS CDO moves on after 3 years

Also at HHS, Mona Siddiqui announced she was leaving as the agency’s chief data officer to become vice president at Humana Corp., according to a report in Politico.

She has been CDO for three years and before that worked at the Johns Hopkins School of Medicine as an associate professor and researcher.

“I am immensely proud of everything that the data team has accomplished and will continue to drive: developing a technology stack to enable enterprise data sharing, implementing a data governance approach, establishing a data science and artificial intelligence training program and consistently working with a broad group of stakeholders to move the department forward in its strategy on opioids, AI, data privacy, collaboration with states and social determinants of health,” Siddiqui wrote in a Jan. 16 blog post.

Finally, GSA also is losing a key technology acquisition leader. Sources have confirmed that Amando Gavino, the director for the Office of Information Technology Services for the Information Technology Category (ITC) at the Federal Acquisition Service, left on Jan. 17.


Sources said he is heading to a new agency, but didn’t disclose which one. A GSA spokesman wouldn’t confirm Gavino is leaving.

But sources said Gavino will take some time off before returning to government. Carlton Shufflebarger, the deputy director in the Office of Telecommunications Services in the ITC at FAS, is expected to be the interim director.

Before coming to GSA in 2014, where he led the Office of Telecommunications, Gavino spent 27 years in the Air Force, retiring as a colonel. He held numerous communications leadership positions, including director of command, control, communications and computer (C4) systems for the Air Force’s Central Command and the Combined Air and Space Operations Center.

Shufflebarger has been with GSA since 2008, previously spending 19 years at the U.S. Postal Service.

Acquisition, human resources changes

Two other significant but non-IT-related changes in the federal community are worth mentioning.

Bob Gibbs, the NASA chief human capital officer, is getting a promotion to become the associate administrator for the Mission Support Directorate. He will oversee not just human resources, but all headquarters back-office operations, including procurement, protective services and shared services.


Gibbs came to NASA from the Energy Department in May 2017 and has been in government since 1987, when he started as a Navy supply officer.

Over at the Department of Veterans Affairs, Anil Tilbe took over as the new director at the Office of Accountability and Whistleblower Protection.

He spent the last three years as VA’s deputy director of enterprise measurement and design at the Veterans Experience Office. He began working at VA in 2011 in the Office of Information and Technology.

Finally, Ben McMartin, who spent the last 10 years working in Army contracting, joined the Public Spend Forum in December as a managing partner. McMartin did more than work in contracting: he ran the Army’s Other Transaction Agreement consortium around ground vehicles and led the Acquisition Innovation Roadshow to help promote new federal procurement concepts across the military.

At the Public Spend Forum, McMartin will focus on training, education and consulting to improve public sector acquisition processes.


When CMS comes out of hiding, there’s plenty of progress to see

The Centers for Medicare and Medicaid Services is one of those “shy” agencies. It seems like the folks at the Baltimore headquarters don’t like to make public appearances at events and subscribe to the theory, “no news is no news.”

But when CMS executives make their way to the Washington, D.C., area for assorted conferences they bring a plethora of good news and progress.

Over the past few months, CMS officials have come out of hibernation at a couple of events, most recently at the AFCEA Bethesda Health IT Day and before that at the National Contract Management Association’s Government Contract Management Symposium, to highlight not just talking points but actual progress in changing the agency.

At the AFCEA event, Rajiv Uppal, the CMS chief information officer, detailed the work the agency is doing to move more toward meeting customer expectations.

Agile and the product mindset

He said CMS does a lot of its own software development, and most of those teams now use the agile methodology and are “taking on the product mindset” that focuses on outcomes that meet customer needs.


Uppal said IT modernization, cybersecurity and, maybe most importantly, CMS employee upskilling are among his big priorities.

Uppal’s office’s work to get systems and networks in better shape isn’t happening just because CMS is filled with legacy systems. It’s happening, in part, because CMS executives outside of IT are realizing just how valuable the agency’s data is to their decision making.

Jennie Main, the chief operating officer at CMS, said whether it’s contracting, technology or mission, the sharing of data is helping the agency break out of long-time silos, where offices focused mainly on the work in front of them and not the agency’s mission holistically.

“I think it starts from the top by sending a message that we will work together and do the hard stuff. We are going to get people in a room together and look at both sides of an issue,” Main said during an interview at the NCMA event in December. “A lot of it comes from mindset and really coming from a place of what I call ‘non-judgmental curiosity.’ If you assume the person on the other side of the table is actually just trying to do their job, and you go and sit in their shoes for a little bit and find out what they are doing and what they see, it’s just remarkable how many people come back and say, ‘oh my gosh, I never knew that they needed these things.’”

Main said one way to do this is by opening up opportunities for employees to work in another mission or administrative functional area for a short amount of time. For example, the contracting office sent a group of folks on a detail to mission areas to see what it’s like to be the customer.

Main said it’s about taking a little risk for a big benefit.

“We are trying to build this culture. I think this incoming generation is demanding it. They are not planning on coming to stay for 30 years. They are happy to come in as a term employee, get some cool experience and do some things, and then they are going to move on,” she said. “We are trying to bring that mindset to our organization and help people get past the fear factor or resentment of people changing positions.”

This is a culture change not just for CMS, but for most agencies. The Trump and Obama administrations have promoted the concept of moving members of the Senior Executive Service to different jobs, but only the ones that seem like punishment get public attention.

What CMS is doing, regardless of whether it involves members of the SES or the general schedule, is trying to create a customer-centric organization.

Main said the workforce is interested in making better use of CMS’s data to answer customer or mission questions.

“If you don’t have the data in your current wheelhouse, but you need it to answer a question you really care about, then you need to figure out how to get the data,” she said. “One of our questions is about hiring and how long it takes us to hire. But everyone will say I don’t want to quickly hire the wrong person and I’m willing to wait a few months to hire the right person. We don’t have data for that and we have to find a way to create that data.”

To benchmark the hiring process, Main said, CMS is using a Gartner system that surveys hiring managers after they’ve brought on a new employee.

Taking smart risks

Another example is how Main is bringing together all the CXO leaders to address systemic pain points as a group.

“Most functioning organizations can solve things in their own silo, if you will, but it’s when you have stuff that has to cross it that it becomes more challenging,” she said. “We have been actively working together as a leadership team to identify things that will solve multiple groups of problems.”

She said CMS spends almost $2.7 billion on technology so there is a green field of opportunity to save money through consolidation and better contract management.

All of these changes come down to understanding and accepting the risk of the culture changes.

“One of the challenges in the federal space is there is so much risk aversion, just culturally, from top to bottom,” Main said. “So you put in someone like me, who is not risk-averse, and I like to think of myself at least as a responsible risk manager, and it makes a big difference. What are we aiming for? Are we at the lowest common denominator to avoid anything bad, or are we lifting our eyes up 45 degrees and taking a look around? That is where we all need to be, seeing ourselves as part of the bigger picture.”

Main said one example of a recent risk she took the lead on was notifying a subset of beneficiaries about something happening with their account even though the general counsel’s office said they shouldn’t do it.

“The team wanted to send the notice and I said ‘I would be the senior official who signed off on this,’” she said. “That’s the kind of thing where you have to be smart about it and have good judgment. If you just live in a world where you are constantly afraid of what might happen … I try to step back and say ‘is this the right outcome?’ If I feel comfortable with it, then I’ll go ahead and do it.”

These types of examples from Main and Uppal give credence and hope for how CMS is changing for the better. There probably are dozens of other mission and back office areas where CMS employees are saving money, serving citizens better and fulfilling the reason why they joined the government in the first place. Now if only others in CMS leadership saw the value of celebrating their successes.


EINSTEIN and TIC never got along, and TIC 3.0 makes their break-up official

Don’t start playing a dirge for the 16-year-old cybersecurity program known as EINSTEIN just yet. But with the release of the draft Trusted Internet Connections (TIC) 3.0 implementation guidance, industry experts agree the end is near for the long-running intrusion detection and prevention program, whose value has sometimes been questioned.

“Today’s concept of EINSTEIN is going away. It kind of has to happen,” Stephen Kovac, vice president of global government and corporate compliance at Zscaler, said in an interview. “TIC 3.0 isn’t here to kill EINSTEIN, but to decouple it from TIC. I think some form of EINSTEIN will still need to exist. Agencies and DHS still need to collect telemetry data.”

Kovac echoed what many federal chief information security officers and chief information officers have said over the years: “EINSTEIN today is not providing very useful data.”

Kovac said Zscaler collects 93 data fields through its sensors, while EINSTEIN is focused mainly on netflow data and blocking known threats and signatures.
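
To make that contrast concrete, here is a minimal sketch of the difference between a flow-level record and the richer, per-transaction telemetry Kovac describes. Both record classes and all field names are illustrative assumptions, not EINSTEIN’s or Zscaler’s actual schemas.

```python
from dataclasses import dataclass

# A classic netflow-style record: who talked to whom, over which ports
# and protocol, and how much data moved. It can flag a connection to a
# known-bad address, but says nothing about the content or context of
# the traffic. (Illustrative fields only.)
@dataclass
class NetflowRecord:
    src_ip: str
    dst_ip: str
    src_port: int
    dst_port: int
    protocol: str        # e.g. "tcp", "udp"
    bytes_sent: int
    start_time: float    # epoch seconds
    end_time: float

# A richer, cloud-era telemetry record layers application context on
# top of the flow: user identity, device posture, URL category, TLS
# details and a verdict. (Hypothetical fields chosen to illustrate the
# contrast; not any vendor's actual schema.)
@dataclass
class RichTelemetryRecord(NetflowRecord):
    user_id: str = ""
    device_posture: str = ""    # e.g. "managed", "unmanaged"
    url_category: str = ""      # e.g. "webmail", "file-sharing"
    tls_version: str = ""
    threat_verdict: str = ""    # e.g. "allowed", "blocked", "suspicious"
```

The point is less the specific fields than the questions each record can answer: flow data supports blocking known-bad destinations, while richer telemetry supports asking who did what, from which device, inside which application.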

Susie Adams, the chief technology officer for federal at Microsoft, said the draft guidance from DHS makes it clear that EINSTEIN’s shelf life is limited, even if it doesn’t specifically say that.

“The existing TIC architecture that’s in place is to protect agency networks as they were developed over the last 20 years,” Adams said in an interview. “But for the cloud, it looks like they are trying to go to the right place by storing data in the cloud and using machine learning or advanced analytics to understand what’s going on. This is why the traditional EINSTEIN will only exist for agency traffic coming out of their own network. I think DHS is trying to evolve EINSTEIN as well.”

The need to update TIC, and thus move away from EINSTEIN, became clear as agencies suffered from latency and other delays when integrating cloud services with these security tools and architectures.
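
The structural cause is worth spelling out: under the legacy model, a cloud-bound request from a field office is hairpinned through a centralized TIC access point before it ever reaches the cloud provider, lengthening every round trip. A back-of-the-envelope sketch makes the effect visible; all latency figures below are invented for illustration.

```python
# Back-of-the-envelope comparison of direct-to-cloud access vs.
# backhauling through a centralized TIC access point. All latency
# figures are invented for illustration.

RTT_OFFICE_TO_CLOUD = 20   # ms, field office direct to a nearby cloud region
RTT_OFFICE_TO_TIC = 35     # ms, field office to the agency's TIC access point
RTT_TIC_TO_CLOUD = 30      # ms, TIC access point on to the cloud region
INSPECTION_OVERHEAD = 10   # ms, per-request security stack processing

direct = RTT_OFFICE_TO_CLOUD
backhauled = RTT_OFFICE_TO_TIC + INSPECTION_OVERHEAD + RTT_TIC_TO_CLOUD

print(f"direct-to-cloud:    {direct} ms per round trip")
print(f"backhauled via TIC: {backhauled} ms per round trip")
# A chatty web app making dozens of round trips per page multiplies
# this gap, which is roughly the latency agencies reported when
# routing cloud traffic through the old TIC architecture.
```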

The separation from EINSTEIN, however, is more like the cherry on the TIC modernization cake.

Kovac, Adams and other federal cyber experts said DHS’s five draft guidance documents to implement TIC 3.0 are well thought out and well-constructed, giving agencies a less prescriptive and more flexible approach to securing data and using the cloud. Comments on the draft documents are due Jan. 31.



The guidance follows the updated memo the Office of Management and Budget released in September.

“Through the new guidance, agencies now can understand what risks they are trying to mitigate, what services they are trying to use and then the steps for how they can do it,” said Josh Moses, a former chief of the cyber and national security branch in the office of the Federal CIO. “I do think this makes moving to the cloud easier. The reference architecture shows that there now are many roads that lead to Rome versus the one or two ways under the previous TIC architectures. This new TIC architecture is much more flexible in the way agencies can access the internet as well as from a security and cost perspective. It frees up agencies to make better risk informed decisions.”

That has been the goal of many of OMB’s updated policies. Experts say the decision by federal leaders to have an “assume breach” mentality instead of a “protect everything” approach is clear in the TIC documents.

Cloud bottleneck should be gone

DHS isn’t so much telling agencies what to do as telling them what outcomes they should aim to accomplish.

“Detection is the most important piece of this. If you assume you’ve been breached, then you need to spend time on detection and automating that detection, and these new TIC documents are a step in the right direction. It’s part of the zero trust framework,” Adams said. “The bad thing about not being prescriptive, like TIC 2.0 was, is it leaves a lot of things for agencies to decide, and that could cause things to slow down because there may not be agreement on security control implementation and risk posture for the data and where it’s stored. We are hoping that being more subjective in how you meet TIC will provide more leeway for agencies and not inhibit cloud adoption.”
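
As a rough illustration of the zero trust idea Adams references, the sketch below evaluates each request on identity, device posture and observed behavior rather than on network location. The attributes, sensitivity scale and thresholds are all invented for illustration and are not drawn from the TIC 3.0 documents.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # strong (e.g. MFA) authentication succeeded
    device_compliant: bool     # device posture check passed
    resource_sensitivity: int  # 1 (low) to 3 (high), illustrative scale
    anomaly_score: float       # 0.0 (normal) to 1.0 (highly anomalous)

def allow(req: AccessRequest) -> bool:
    """Decide per request, not per network: there is no trusted 'inside'.
    Thresholds are invented for illustration."""
    if not (req.user_authenticated and req.device_compliant):
        return False
    # The more sensitive the resource, the less anomaly is tolerated.
    max_anomaly = {1: 0.8, 2: 0.5, 3: 0.2}[req.resource_sensitivity]
    return req.anomaly_score <= max_anomaly

# A compliant, authenticated user with mildly unusual behavior can still
# reach a low-sensitivity resource...
print(allow(AccessRequest(True, True, 1, 0.6)))   # True
# ...but the same behavior blocks access to a high-sensitivity one.
print(allow(AccessRequest(True, True, 3, 0.6)))   # False
```

Under an “assume breach” posture, the hard engineering work is in producing and automating that anomaly signal, which is why Adams puts detection first.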

Adams said the new TIC approach removes the choke point that was EINSTEIN and the managed trusted internet protocol services (MTIPS).

“Agencies now can define their own path to secure their internet connections, and that is huge,” she said. “It gets rid of the bottleneck.”

Kovac said the move away from MTIPS may be difficult for some agencies, and especially the telecommunications providers, because they have used it for so long and are comfortable with the security services. He estimates agencies spend about $1 billion a year on MTIPS.

Ross Nodurft, another former chief of OMB’s cyber branch and now a senior director for cybersecurity services at Venable, praised OMB and DHS’s work on TIC 3.0, but said the one thing that is missing is an incentive to move to the new architectures.

“The documents make assumptions that agencies already are motivated to adopt these new technologies and want to move to a new architecture, but what is motivating them to adopt these new tools? What is the driving force?” he said. “The TIC memo rescinded the other TIC requirements and gives agencies the ability to build out a TIC architecture with more of a risk-based view. But why make the change unless you have a reason to? What are the drivers for agencies who are using MTIPS or another approach and not having any problems? I would like to see a more active solicitation of pilots to show why moving to 3.0 is worthwhile.”

Still need to connect programmatic dots

In the draft documents, DHS highlights two use cases, but also tells agencies how to develop and submit plans for additional proofs of concept.

Nodurft said he’d like to see vendors take a more aggressive role in developing use cases, which could help be a driving force to modernize TIC architectures.

Moses, the other former OMB cyber chief, said he’d like to see DHS and the federal CIO’s office clarify how all of the current cyber programs, like TIC, continuous diagnostics and mitigation (CDM), high-value assets and the Federal Information Security Management Act (FISMA), fit together and what benefits agencies are receiving from them all.

“How can agencies get to good, reduce their compliance burden and how do all of these controls come together and make a difference to secure agency systems and data?” he said.

Kovac said there is a lot of pent-up demand for a more flexible approach to TIC. He said several agencies have prepared or are preparing TIC 3.0 use cases to begin to move away from the current approach.

“I think the vision of this will be a catalog of use cases,” he said. “The remote worker, the traditional worker, the international user and the bring-your-own-device user. There are five to 10 solid use cases, so people can find what they want to accomplish.”

