“DoD Reporter’s Notebook” is a biweekly feature focused on news about the Defense Department and defense contractors, as gathered by Federal News Network DoD Reporter Jared Serbu.
Submit your ideas, suggestions and news tips to Jared via email.
Best listening experience is on Chrome, Firefox or Safari. Subscribe to Federal Drive’s daily audio interviews on Apple Podcasts or PodcastOne.
Later this year, the Navy will make billions of dollars in awards for the next phase of its Next Generation Enterprise Network contract. But the longer-term future of the Navy’s ashore networks might be determined by a series of smaller, upcoming other transaction agreements.
Next week, the Navy plans to issue three separate problem statements to pave the way for what it’s calling Modern Service Delivery: a cloud-centric concept that posits its IT users ought to be able to access roughly the same services, whether they’re physically located on a Navy installation or connected remotely via a mobile device.
“Today we have capabilities available to us when we’re on the network, and then a different set when we’re mobile. We want to drive parity for access to services and systems and data, whether we are at work, at home or on the go,” said Andrew Tash, the technical director for the Navy’s Program Executive Office for Enterprise Information Systems (PEO-EIS).
As a starting point for ubiquitous data access, the Navy thinks it needs a new approach to identity and access management. So the first problem statement will ask vendors for an “integrated suite” of identity management capabilities that can work across the service’s systems.
“Identity is the new boundary. It assumes breach and focuses our security in a different place: it focuses on the data, and it focuses on user behavior as opposed to the network boundary,” Tash said. “That’s really the main thrust of our strategy, getting away from tightly-coupled services that force people to be on a network to access them. It’s just users connecting to data and services. It may seem pretty simple, but it removes a lot of silos and bad IT behavior that we’ve grown up with over the past 15 or so years.”
The Navy is already beginning to move some of its largest business IT systems — including its main enterprise resource planning system and some of its personnel databases — into the cloud.
And since its key data assets are already beginning to live in a mixture of on-premises systems and commercial hosting environments, giving location-agnostic access to all the data a particular user might be entitled to isn’t just a matter of creating a secure tunnel to the Navy-Marine Corps Intranet.
So the second problem statement will aim to build out the concept of Network-as-a-Service — a structure that would virtualize the Navy’s networks in the same way cloud computing lets it virtualize its servers.
“If all of our business systems are moving to commercial cloud, then shouldn’t we have the most efficient connectivity to the commercial cloud, to the point where I can do 100 percent of my job from commercial cloud services? So now we’ve got to rethink what the [DoD Information Network] means in that context,” Tash said. “Yes, the data is part of the DoDIN, but how we consume that and how we inter-operate with the rest of the DoD community is up for debate. But we look at the business systems as being the primary opportunity right now to leverage things like network as a service.”
The third task, directed to members of the Information Warfare Research Project (IWRP) consortium, will look for technologies the Navy could use to manage its users’ access to the systems it’s moving to commercial cloud environments.
That approach, called Cloud Access Security Broker (CASB), tries to monitor users’ interaction with cloud services and enforce an organization’s security policies even when they’re connecting directly to the cloud provider from an “off-premises” location like their home or mobile device.
Many of the long-term details of the Navy’s cloud-dependent vision will depend on the final disposition of DoD’s forthcoming Joint Enterprise Defense Infrastructure (JEDI) contract.
The highly contentious solicitation is still tied up in federal court, but in its new cloud computing strategy earlier this month, the Pentagon made clear that it expects the military services and Defense agencies to use JEDI as their primary solution for cloud services.
But Navy officials say they are not waiting for JEDI before beginning some of their major cloud transitions. Navy ERP is the first major application it’s moving to the cloud, but the service is also in the process of consolidating nine separate manpower and personnel databases into a single authoritative data source; it’s looking, for now, to put that information in Amazon Web Services’ GovCloud.
Notionally, the DoD strategy allows the military services to keep their data in non-JEDI clouds, called “fit-for-purpose” clouds, but only by special permission from the DoD CIO.
“Defining what those are will be key as we move forward,” said Ruth Youngs Lew, the Navy’s Program Executive Officer for Enterprise Information Systems. “We have a couple of Navy cloud contracts right now, but those are intermediate steps. Our plan is to fully transition to JEDI at some point in the future, when they get it awarded.”
When Congress gives the Defense Department new reform-oriented legal authorities, there’s never a guarantee that DoD will ever pick them up and run with them.
But when it comes to the powers that officially took effect for the DoD chief information officer at the beginning of this year, officials don’t intend to waste any time.
The new laws, meant to give the Pentagon more authority to enforce common IT standards throughout the military services, are coming into play already, officials told the Senate Armed Services Committee last week.
After nearly a decade in which DoD has used more carrots than sticks to push IT consolidation, they suggested that the department will take a much more directive approach over the coming year.
“You have a set of leaders that are very impatient, including myself, that are done admiring the problem and are moving on to tasking,” DoD CIO Dana Deasy said. “This includes being less tolerant with people being able to go off and use their own solutions. The authorities that you all gave me starting this year around being able to set architectural standards are quite significant, and we are now starting to use those new authorities.”
Defense officials offered few specifics about the types of changes they intend to direct, but much of the way forward will be laid out by a cross-functional team led by Brig. Gen. Dennis Crall, the principal deputy cyber adviser to the secretary of Defense.
Crall said the department spent most of the last year setting the stage for the reforms it plans to implement, including by publishing a new cyber strategy. This year, he said, will be all about starting to deliver meaningful results.
“So while it’s a good year for implementation, I would say it may not be a good year for some other things,” he said. “The first is stovepiped solutions. It’s a bad year for those who like to approach this in a way that we have endless niche capabilities and do business their own way: lack of standards, individual development, and difficulty in integrating. We’re putting an end to that practice, which has really robbed us of success. It’s also a bad year for those who don’t like measures of effectiveness or discussions on data-driven return on investments. We owe an accountability for how we’ve spent our money and also a level of accountability on what capabilities we’ve achieved in the expenditure of that money and effort.”
The sentiments Deasy and Crall expressed appeared to be in tune with the frustrations Congress laid out when it passed the 2018 National Defense Authorization Act, which contained the new authorities.
Specifically, the legislation gives the CIO the authority to set departmentwide IT and cyber standards, plus the mandate to conduct annual reviews of the military services’ budgets to ensure they’re setting aside enough funds to implement a common cyber approach.
“Countless efforts across the Department of Defense are plagued by poorly enforced standards and a CIO position whose policy and guidance are largely considered as advisory by the services,” the Senate Armed Services Committee wrote in a report accompanying the legislation. “As a result, each service continues to pursue disparate information technology and business systems efforts.”
That same year, lawmakers vented frustration that the cyber adviser position now held by Crall has historically been a weak player in the Pentagon’s overall power structure.
“The office of the [principal cyber adviser] has been chronically under-resourced since its establishment, and (members) are concerned about the impact of under-resourcing on the PCA’s ability to effectively execute its assigned roles and responsibilities,” they wrote in the final NDAA’s explanatory statement. “The conferees believe that the PCA should be robustly manned and resourced.”
But Crall suggested the tide had turned since that language was written. He said the cross-functional team Congress ordered to staff the office has proven effective over the past year.
“Congress got that right. The cross functional team works, and it has several advantages,” he said. “It’s only as good as it’s paid attention to, but the cross functional team that’s involved under the PCA is well-resourced in the sense that we’ve got the right people. The participating agencies that provide representation of the workforce sent us their best. We got good people. The second piece is we can approach problems in ways that don’t have some of the biases. We don’t have any stake in the legacy systems that we hold onto, it really is about the mission. So we normally come to the table with an advantage in solving some of those problems. It’s been instrumental in moving the strategy into implementation.”
DoD first started drawing up plans to collapse its legacy networks and IT services in 2010 under an initiative known as the Joint Information Environment, and began taking serious steps toward implementation — with an emphasis on shared cyber infrastructure, enterprise services and cloud computing — in 2012.
However, the Pentagon has struggled to herd the large number of cats involved in implementing the vision, and to define the project’s cost and scope, according to the Government Accountability Office. And in recent years, the “JIE” moniker has begun to fall out of favor: It was not uttered a single time during last week’s 90-minute hearing on cyber standards.
In its annual report last week, the office of the Director of Operational Test and Evaluation did offer a short description of JIE’s status, but said that DoD’s Joint Regional Security Stacks — one of its key foundations — is performing poorly, and “calls into question the current JIE cybersecurity approach.”
A long-awaited update to the Defense Department’s cloud strategy gives a nod to the “multi-cloud” approach many of its vendors have urged the Pentagon to adopt. But the same document makes clear that the department still intends to move the majority of its applications to a cloud service operated by a single vendor.
The new strategy, released on Monday, envisions a mix of hosting environments: A “general purpose” cloud that will be provided through the department’s controversial JEDI contract, several “fit for purpose” clouds operated by other vendors and some remaining DoD-operated data centers that will continue to house applications that still aren’t cloud-ready.
But the goal is for systems to migrate to the “general purpose” infrastructure- and platform-as-a-service cloud unless there’s a compelling reason not to.
“Only when mission needs cannot be supported by general purpose will fit for purpose alternatives be explored,” Defense officials wrote in the strategy, signed by acting Defense Secretary Patrick Shanahan. “In such a case, a mission owner will be required to submit for approval an exception brief to the office of the DoD CIO describing the capability and why the general purpose cloud service does not support their mission.”
The document also makes clear who will be in charge of implementing, overseeing and governing the department’s cloud plans: DoD’s chief information officer. Initially, the JEDI effort had been led by a steering group made up of multiple DoD organizations; the first instantiation did not include the CIO’s office as a voting member.
“At some future date, once the general purpose cloud environment is fully implemented and fit for purpose implementations have matured, it is possible that overall leadership could be transitioned to a different organization inside DoD,” officials wrote. “The DoD CIO will establish an enterprise cloud organization with appropriate leadership and the required governance forums to ensure that overall objectives and implementation plans as described in this strategy are enacted. The DoD CIO will leverage existing governance forums to the greatest extent possible.”
One of the CIO office’s tasks will be to begin scrutinizing the cloud environments the military services and defense agencies have already created or contracted for on their own. The office will work with those DoD organizations to either come up with “thoughtful” migration strategies to move to JEDI, or potentially give the clouds they’re already using an official “fit for purpose” blessing.
But the strategy strongly suggests that many of those existing clouds will not survive the process, because the Pentagon believes a proliferation of cloud environments across the department is already hindering interoperability.
“A lack of guidance has led to departmental inefficiencies and has hindered the department in IT modernization efforts,” the officials said. “It has led to disparate efforts with siloed teams, disjointed implementations with limited capability, siloed data and inefficient acquisitions that cannot take advantage of economies of scale.”
But more formal and written guidance is coming, officials promised. It will be drawn up in cross-agency forums organized by the CIO. Those forums will also come up with detailed implementation plans for how the department will move to the cloud.
The strategy itself offered few details about how DoD will accomplish its cloud migrations. Rather, it reads largely as a statement of principles and an explanation for why the department believes it must finally adopt the “cloud first” stance that the entire government was supposed to embrace more than eight years ago.
But the document did lay out five “lines of effort” DoD considers essential, which will form the basis for a still-forthcoming “cloud migration playbook.”
In a statement, DoD CIO Dana Deasy said the strategy would pave the way for his office to consolidate the department’s IT services into a construct that’s more secure, scalable and reliable.
“This marks a milestone in our efforts to adopt the cloud and also in our larger efforts to modernize information technology across the DoD enterprise,” he said. “A modern digital infrastructure is critical to support the warfighter, defend against cyberattacks and enable the department to leverage emerging technologies like machine learning and artificial intelligence.”
The centralized cyber defense centers at the core of the Pentagon’s years-long effort to consolidate its IT networks still aren’t working as advertised, and the Defense Department needs to stop deploying them until major problems are resolved, according to the Pentagon’s independent testing office.
In its annual report, released on Thursday, the Office of the Director of Operational Test and Evaluation found that DoD’s Joint Regional Security Stacks (JRSS) are neither operationally effective nor operationally suitable.
It’s the second time DOT&E has reviewed JRSS as part of the portfolio of major defense expenditures it oversees, and the second year in which it’s reached the same conclusion.
The office’s findings are based largely on an operational assessment DoD’s Joint Interoperability Test Command conducted last March.
In that event, an Air Force red team acting as a cyber attacker managed to penetrate one of the stacks’ defenses without being detected at all, a result that was one factor leading DOT&E to conclude that JRSS’ overall performance had not improved since an earlier test in mid-2017.
The security stacks are meant to make use of best-of-breed commercial hardware and software products, but the off-the-shelf approach the department pursued has led to a situation in which JRSS now includes security products from more than three dozen separate vendors. And DOT&E questioned whether such a wide variety of solutions was manageable for the cyber defense personnel tasked with operating the stacks.
“JRSS operator training still lags behind JRSS deployment, and is not sufficient to prepare operators to effectively integrate and configure the complex suite of JRSS hardware and associated software,” according to the report.
DOT&E found that the Defense Information Systems Agency’s Global Operations Command does not have enough personnel to properly operate the stacks, and will not until July of this year. The same appears true of the Army, which wasn’t able to certify that it has sufficient manning to handle its JRSS responsibilities.
DoD, the Army and the Air Force initially conceived JRSS as a way to improve their cyber posture by centralizing about 5,000 separate firewalls into a shared infrastructure that relieved individual military bases of the burden of monitoring and filtering all of their network traffic.
And in conjunction with the JRSS deployments, DISA and the services have also invested heavily in network capacity upgrades so that all of the military’s traffic can be funneled through the stacks. The upgrades to multiprotocol label switching technology will eventually boost the department’s network backbones from 10-gigabit to 100-gigabit connections.
But DOT&E suggested the notion of trying to monitor those vast traffic flows from a relative handful of locations may have been too ambitious a goal.
“It is inherently difficult to effectively manage the very large amount of data designed to traverse each JRSS,” the office wrote. “The DoD CIO and the services should consider the possibility that the data flow designed to traverse each JRSS may be too large to enable secure data management, and if that is the case, refine the JRSS deployment plans to reduce the required data flow through each JRSS.”
And the report said DoD and the services still do not have mature standard operating procedures for the stacks, and haven’t done enough to test them under “operationally realistic” conditions.
“DISA and the services should conduct routine cyber assessments of deployed JRSSs, using a threat-representative Persistent Cyber Opposing Force, to discover and address critical cyber vulnerabilities,” the report recommended.
The department did not answer questions from Federal News Network about whether it would accept DOT&E’s recommendation to suspend further JRSS deployments until it addresses the problems.
But DoD did temporarily pause the deployments last year, including by delaying the installation of new stacks in U.S. Central Command and Southwest Asia. It also deferred the Marine Corps’ migration to JRSS and the activation of the versions designed to protect classified networks until 2019.
“It was to try to improve the way we handle training, the way we handle the actual processes so we could improve the deliveries,” Rory Kinney, the principal director for information enterprise in the DoD CIO’s office, told an AFCEA conference in December. “But we have about a million people behind JRSS right now, and the big push is now in the Pacific, with the intent of beginning migration in January. So JRSS is still alive and well.”
So far, DoD has activated 14 of the 24 JRSS sites planned for its unclassified (NIPR) network. It plans another 25 for its classified (SIPR) network.
According to DOT&E, two more operational assessments were scheduled for January and July of this year. The six-month cycle would repeat until JRSS undergoes its formal initial operational test and evaluation sometime in fiscal year 2020.
In statements to the press, industry and Congress, the Pentagon has consistently maintained for several years that JRSS is the most critical near-term component of a vision it calls the Joint Information Environment.
DoD does not report the cost of the overall effort as a separate budget item, since it is not an official program of record. But as of 2016, officials pegged the department’s five-year expenditures for JRSS at $1.7 billion. The Government Accountability Office has disputed that figure, arguing that it does not include $900 million DoD and the services had spent on JRSS between 2013 and 2016.
Oracle America is telling a federal court that the Defense Department violated federal procurement laws and regulations in at least seven significant ways when it designed its multi-billion dollar JEDI Cloud computing contract.
As a result, the court should order DoD to revise its request for proposals before it’s allowed to make an award, attorneys for the tech firm said.
The arguments largely echo the complaints Oracle lodged with the Government Accountability Office in an earlier, unsuccessful pre-award bid protest.
But in its lawsuit – filed last Friday and unsealed by the U.S. Court of Federal Claims on Monday – the company said GAO had been too deferential to the defenses DoD offered in that prior proceeding, including its claims that a single-award contract was justified for reasons of national security, cost and speed.
Oracle, Microsoft, and several industry associations have argued that the Pentagon’s decision to award the up-to-$10 billion contract to only one company was imprudent, since it would lock in one vendor for up to a decade and deprive the military of the benefits of competition.
But in its filing, Oracle also asserted the single-award decision was patently illegal.
“This is a straight statutory construction issue; it is not a matter of deference or national security as claimed by DoD,” attorneys for the company wrote in the complaint. “Congress has prohibited the very contract approach DoD has implemented.”
Lawyers were referring to a federal law that requires agencies to grant multiple awards whenever possible if they’re issuing large indefinite-delivery/indefinite-quantity (ID/IQ) contracts.
There are some exceptions to that rule, and DoD believes it satisfied one of them when Ellen Lord, the undersecretary for acquisition and sustainment, signed written findings saying the department would only be issuing task orders at firm, fixed prices that would be set at the time of the contract award. And since those prices would have been set after a competition with other vendors, the government could be assured it was getting fair prices, the department reasoned.
But Oracle said that argument simply isn’t true, partly because the contract also explicitly requires the winning bidder to adjust its prices to align with what it’s offering to business customers over the course of up to the next 10 years.
One clause “will require the awardee to regularly add (as frequently as daily or weekly) its new commercial offerings onto the JEDI Cloud at yet undetermined prices, and contemplates the awardee working with DoD to develop new classified offerings — none of which offerors in the JEDI Cloud competition will specify or price in their proposals,” according to the complaint.
Separately, Oracle claims it was unfairly shut out of a fair chance of winning the contract because the department deliberately set up “gate criteria” that only two large cloud companies – Amazon and Microsoft – could possibly hope to satisfy, even though the department’s own market research showed its cloud requirements could be served by multiple companies.
To pass through those gates and be considered for the JEDI award, companies needed to show, among other things, that their existing commercial cloud offerings met a series of threshold requirements.
Oracle contends that some of those requirements go beyond the department’s actual needs, are barred by the Competition in Contracting Act, or, in the case of the first gate criterion, that bidders shouldn’t be measured against the cloud services they were offering more than a year before the department could possibly hope to issue any orders against the contract.
Like another JEDI protester, IBM, Oracle contends those requirements were set with one particular company, AWS, in mind.
To back that up, the complaint offered what Oracle said was evidence of conflicts of interest involving two defense officials who helped spearhead the procurement: A chief of staff to the deputy secretary of Defense who’d previously been an AWS consultant, and a member of the Defense Digital Service who planned JEDI’s requirements. That employee had previously worked for AWS, and returned to the company as a general manager in 2017.
With regard to the former DDS official, Oracle claimed the department hadn’t done nearly enough to probe a potential conflict of interest; the JEDI contracting officer’s investigation fit on a single page, it said.
The company also reproduced documentary evidence it had obtained through its GAO protest that it characterized as “attacks” on other defense personnel or industry groups who were pushing for a multi-cloud approach to JEDI.
In Slack messages quoted in the court complaint, the former DDS employee mocked one senior defense IT official, calling her a “dum dum” after she had apparently expressed satisfaction with Microsoft cloud services. In others, he appeared to respond with vulgar dismissals to an industry group’s presentation that summarized previous government research and experience with cloud acquisition strategies.
In testimony to GAO arbiters, the department said its contracting officer would “continue to comply with her conflict of interest duties concerning this acquisition.”
The Pentagon contended any potential conflict of interest issues shouldn’t stand in the way of the procurement, and that it would investigate the matter further, “if appropriate,” before it makes a contract award. It also minimized the DDS official’s involvement, saying he only worked on the JEDI procurement for seven weeks.
GAO did not foreclose the possibility that there was a conflict of interest, but said the matter is irrelevant until and unless AWS wins the contract and a protestor can show that a conflict actually played a role in the final decision.
DoD’s answer to Oracle’s court complaint is not due until Feb. 4.
In the earlier GAO protest, the department successfully fought back the Oracle challenge by arguing it had wide latitude to make procurement decisions on national security grounds, and that a multiple-award strategy would only cost the government more time and money.
“Managing security and data accessibility between clouds creates seams that increase security risk for multiple reasons,” the JEDI contracting officer told GAO. “Crossing clouds requires complex manual configuration that is prone to human error and introduces security vulnerabilities…Systems in different clouds, even when designed to work together, require complex integration. Connections that are not correctly configured and managed at both endpoints introduce new attack vectors … I find that multiple awards increase security risks.”
And on cost and schedule grounds, the department said a single-award contract was imperative when it comes to getting modern cloud services to the tactical edge as quickly as possible, and to integrate the various cloud services and legacy systems it already has in place.
“Doing that for a single solution provided to the department by either a vendor or a team of vendors is a big lift already. Trying to do that for multiple solutions, with the department operating as the integrator, would be exceedingly complex,” said Tim Van Name, the DDS deputy director. “Part of this effort is to work with the winner of the JEDI Cloud contract, so that we can help the department better understand the risk [it is] accepting, better manage that risk, but also do so in a more timely manner, so that our war fighters get access to applications and services much faster. Trying to do that with one vendor is a thing, I think, the department knows how to do. It’s going to take a considerable amount of our technical experts. Trying to do that with multiple vendors simultaneously, I just don’t think we have the technical expertise to do that well.”
The Air Force is about to join the still-small group of federal agencies that have found ways to dramatically accelerate the process of granting cybersecurity approvals for IT systems.
The Authority to Operate (ATO) process, a paperwork gauntlet that routinely consumes months of time before new systems are allowed to be connected to government networks, is a requirement of the Federal Information Security Management Act. FISMA tells CIOs they must know and accept the security risks each system carries with it.
But there’s no particular reason the system can’t work much more quickly, said Bill Marion, the Air Force’s deputy CIO. Service officials are expected to sign off on a new “fast-track” ATO policy within a matter of days, he said.
“We fundamentally believe this is going to help us bring capability faster,” he said last week at AFCEA NoVA’s annual Air Force IT Day. “It will bring us software modernization at a faster clip, but also provide better security.”
Marion said the new policy won’t be appropriate for every IT system, but in some ways, it will turn the traditional ATO process on its head. Rather than assessing every single system against the entire catalog of NIST security controls, the goal is to make intelligent decisions about which of those assessments really need to be performed at all for a particular system.
He offered an example: If the Army has already gone through the Risk Management Framework (RMF) and deployed a system the Air Force wants to use, does the Air Force really need to put itself through every one of those same painful paces?
“What do I think I’m going to find in that whole other 900 controls in RMF that we didn’t already flush out when we put that system in a hardened cloud computing center and put it through penetration tests? What do we expect to find, and is the juice worth the squeeze? Part of this is getting the decision in front of the approving official sooner, to then determine what parts of the RMF you even need to go through,” he said. “In some cases it may be very, very short. In some cases it may be truncated by a third, or half. It’s a fundamental retooling, but we are in a different world in how we’re managing risk.”
One reason the Air Force may feel comfortable with less quadruple-checking of those security controls on the front-end is that it’s become increasingly confident that it can spot and fix genuine cybersecurity problems after a given system is deployed.
In early 2017, it deployed a commercial tool, developed by Tanium, that lets Air Force cyber defenders scan the service’s entire network within a matter of minutes and automatically patch any security holes they find in real time.
Officials ordered that the tool, which the Air Force calls Automated Remediation and Discovery (ARAD), be deployed on virtually all of its IT systems by May of 2017. Any systems that couldn’t employ the tool for one reason or another were deemed “high risk.”
The timing was fortuitous. The WannaCry ransomware attack struck computers across the globe that same month. But because of ARAD, the Air Force managed to effectively immunize its entire network from the malware in less than an hour, Marion said.
“That was game changing for us,” he said. “We had never done that before in our history. While we had been pretty fast, it typically took days or weeks to remediate something of that magnitude. And we did it at scale across the Air Force in 41 minutes. We have to be able to act when something happens. This belief in defense-in-depth and network-perimeter-only security, I would argue, is a failing one in this globally connected world.”
Aside from the new availability of the ARAD tool, Marion said the Air Force’s move to the new, faster ATO process will be guided by two other major factors.
Authorizing officials will need to see demonstrable evidence that any new system adheres to basic cyber hygiene, and at least some of those systems will be subjected to a new generation of penetration tests once they’re up and running, including the “bug bounties” that are becoming increasingly pervasive across government.
“I liken it to the USDA meat inspection process,” Marion said. “We don’t inspect every piece of meat, but every piece of meat could kill you. So we inspect and we review and we check our processes to make sure that bad things aren’t creeping their way back into the system. We’re finishing Hack the Air Force 3.0 right now, but we’ve got a whole series of pen tests and bug bounties planned for fiscal year 19, and they’re funded.”
It’s not yet clear how long the revamped ATO process will take, but Kessel Run, the Air Force’s new agile software development office, has been working on a “continuous ATO” model it calls “ATO in a day.”
“So this is the new world order: Make sure you’ve got a basic level of hygiene coming into the mix – that’s the price of entry – bringing the sensors and remediation tools that sit on top, and then bringing a bug bounty pen testing process,” he said.
Similar concepts have been proven out in other federal agencies, including at the National Geospatial-Intelligence Agency, which used the same terminology when it began working on its own speedier security approval process.
NGA has managed to get the process down to three days.
“We are continuing to build the telemetry necessary, the business rules, the promotion path for code committed to our dev/ops pipeline and to promote that as quickly as possible to operational,” Matt Conner, the agency’s chief information security officer, said in an August interview with Federal News Network. “We still haven’t realized the one-day ATO, but it’s out there.”
The Defense Department has dragged its feet in implementing a host of new measures lawmakers ordered as part of a crackdown on lowest-price, technically acceptable (LPTA) contracts.
The findings are part of an annual Government Accountability Office review of DoD’s use of LPTA source selections. In its report, released Tuesday, GAO estimated LPTA is used for about a quarter of the Pentagon’s contracts and task orders that are worth $5 million or more.
But it also pointed out that DoD still has not issued the rules Congress ordered it to draft two years ago to ensure the department is only using the cost-conscious procurement method when appropriate.
As part of the 2017 National Defense Authorization Act, Congress — responding to longstanding industry complaints about a perceived overuse of LPTA — ordered DoD to make sure it satisfies eight separate criteria before deciding to go that route on a given contract, rather than using a “best value” method for picking the winner.
The legislation was enacted in December 2016. By September of this year, DoD still had not begun moving on the rulemaking process needed to implement the law.
Officials eventually opened a case to amend the Defense Federal Acquisition Regulation Supplement (DFARS) on Oct. 25, several weeks after GAO gave them a draft copy of the final report it issued this week. But they do not expect the final rules to go into effect until the fourth quarter of 2019.
Among the new mandates Congress issued was that when DoD contracting officers are using LPTA, they need to be able to clearly define the government’s minimum requirements. They should be sure that there’s no value in buying products or services that exceed those minimums, and be confident that DoD would be wasting its time if it delved deeply into questions about whether one vendor might be able to offer a more innovative approach. That might be the case if the department is buying commodities such as natural gas or off-the-shelf computers.
DoD does not keep detailed records about how often it uses LPTA contracts, but GAO’s review of a sample of 2017 contracts and task orders found that, by and large, the department is almost always complying with those principles, even without the rule change.
But for three of the other tests Congress ordered, it’s clearly not.
The law also told the department it needs to provide a written justification for why it decided to use LPTA in any given contract. That happened only three times in the sample of 14 large procurements GAO examined.
In that same sample, there was only one case in which DoD was able to show that the “lowest price” it settled on included the operations and support or “lifecycle” costs for what it was buying, another requirement of the 2017 NDAA.
And in only two cases was DoD able to satisfy the law’s requirement that it only use LPTA for goods that are “predominantly expendable in nature, nontechnical, or have a short life expectancy or shelf life.”
At least some of the contracting officers GAO interviewed said that determination is a tough call, especially without DoD-level rules that might provide some more guidance.
“Specifically, a Marine Corps contracting official who purchased general use computers stated it was unclear if a computer that will be replaced every 5 years would be considered to have a short shelf life,” auditors wrote. “Additionally, an Air Force contracting official who purchased Blackberry licenses stated that it was unclear if this criterion would apply to such licenses, and if it did, whether a one-year license would be considered a short-shelf life. As a result, this contracting official stated he would not know how to consider this criterion in similar acquisitions.”
Amid the Trump administration’s somewhat murky efforts to reorganize the federal government, the future role of the Office of Personnel Management is, to put it mildly, uncertain.
As Federal News Network first reported, the administration is aiming to diminish OPM’s central role as the governmentwide belly button for HR and civilian hiring.
Another massive step in that direction would be to remove the Defense civilian workforce from OPM’s administrative authority. Those workers make up nearly 40 percent of the total population of federal civil servants, but Mark Esper, the secretary of the Army, strongly believes they should be removed from OPM’s control altogether.
“They’re not bad people. They’re trying to construct a system that’s as fair as possible for a lot of patriotic Americans who want to work for the federal government,” he said. “But it’s not working, and I’d like to get control of it.”
“Control” would mean that the Defense Department would manage its own processes for hiring and managing its civilian workforce. Esper’s comments, made at a luncheon address to Army civilian employees, mostly addressed the federal hiring system.
The vast majority of open civilian positions are advertised via the USAJobs.gov website. The recruiting process that flows from those job advertisements, he said, is badly broken.
“I think any system where you have to go on a website and assert that you’re an expert in anything forces people to be dishonest,” he said. “If the tricks of the trade are to read the job description and then mimic it back, it’s a fundamentally flawed system.”
Esper said he has not delved into too many details about how a replacement for the current hiring system would work in practice, but said that he’s held some initial discussions with members of Congress about a new one that would be operated by the Defense Department instead.
“And then I can have input on it,” he said. “I can work with the secretary of Defense, and the deputy secretary, to build a system that gets rid of all of those artificialities and all of the gaming that’s inherent in [USAJobs], and maybe takes a forward-looking approach. I’m not satisfied with a hiring system that takes 140 days.”
Even worse, from Esper’s point of view: the current goal for improvement is to reduce the government’s time-to-hire to about 80 days.
“That’s not how the private sector works,” he said. “If you were to say ‘I’m going to hire you in 80 days,’ people would walk. The goal I’ve given my folks is 30 to 45 days. I don’t know if we’re going to get there, but we’re going to push hard.”
Shortcomings in the governmentwide approach to civilian personnel onboarding aren’t limited to the initial hiring process, Esper said. He believes the background investigations that are needed for many of its civilian personnel in order to gain security clearances could be finished in less than a week, using one page of documentation, if the clearance process were to be conducted “as the law intended.”
The secretary said he’s trying to “peel off” challenges in the civilian hiring process on a weekly basis.
How successful he’ll be remains to be seen, but it’s far from the first time the Defense Department has attempted to assert more control over its civilian workforce and differentiate itself from the rest of the government’s civil service.
In 2015, advisors to Defense Secretary Ashton Carter urged the Obama administration to remove DoD employees from the jurisdiction of Title 5, the section of the U.S. Code that governs the federal workforce, and place them under Title 10, the section that governs military members and is controlled by DoD.
The draft recommendation would have required congressional approval, but was never formally sent to Capitol Hill.
More recently, the department has explicitly sought to exempt itself from broader Trump administration efforts to shrink the federal workforce, saying its civilian workforce is essential to its core missions, and growth is warranted in at least some areas.
“DoD’s civilian workforce is in the business of protecting the American way of life, not regulating or governing it,” Defense officials wrote in a wide-ranging “Business Operations Plan” they quietly posted online earlier this year. “While it may be appropriate for other federal agencies to reduce their civilian workforce, for the DoD, right-sizing will necessitate targeted growth to both restore readiness and increase the lethality, capability, and capacity of our military force.”
Virginia’s senior senator is pressing the Defense Department for answers after yet another exhaustive media report documenting health, safety and other deficiencies in military base housing.
In a letter to Secretary of Defense James Mattis, Sen. Mark Warner (D-Va.) demanded a briefing from Pentagon officials on what they’re doing to handle alleged failures on the part of the companies that operate the military services’ privatized on-base housing.
The issues were raised in the latest of several lengthy investigative reports on military housing by the Reuters news agency. It found multiple cases in which the firms allegedly refused to deal with serious mold, water intrusion and vermin problems, sometimes forcing families to move off-base at their own expense, and in at least one case, leaving a family to conclude its best course of action was an earlier-than-planned departure from military service.
“This is not the first time that unhealthy conditions in military housing have been documented. In November 2011, I was made aware of similar complaints regarding mold in private military housing in the Hampton Roads area in Virginia,” Warner wrote. “Working with Navy officials and impacted military families, I strove to ensure that both the Navy and Lincoln Military Housing implemented a plan to reduce these hazards. As a result, LMH agreed to offer free mold inspection to any resident requesting the service, to hire an independent professional engineering firm to survey the conditions, to update training for maintenance teams and more; the Navy also committed to improving tracking tools and enhancing oversight of property management performance. But today it appears that these changes were insufficient or ignored.”
The military services’ decision to privatize their housing was made in the 1990s at a time when officials were searching for ways to improve living conditions in what were then government-owned-and-operated homes. Housing was falling into disrepair at an alarming rate back then, and as a general matter, the privatization program has been viewed by policymakers as an overwhelming success.
However, concerns about companies’ failure to deal with substandard conditions are not isolated to Virginia, nor to Navy installations. This fall, the Army devised a plan to begin testing houses that were built before 1978 for lead exposure. Those actions were taken in response to another Reuters investigation, which documented more than 1,000 small children whose blood tests, administered by on-base clinics, had shown elevated levels of lead.
The Army is starting with a sample of 10 percent of those homes, and plans to finish the inspections by the end of the year, said Jordan Gillis, the assistant secretary of the Army for installations, energy and environment.
“The Army Corps of Engineers is going to produce a report for us that will really give us a baseline understanding of where we are, so then we’ll be able to decide what the appropriate next steps are,” he said. “Whether that’s additional inspections or additional mitigation, we don’t know for sure, but that should give us a good baseline to move out from.”
The inquiry involves not just lead paint — which the Army does not consider dangerous if it’s been properly covered and contained by subsequent layers of non-leaded paint — but other sources of lead as well.
“We believe encapsulation is an effective approach, but we’re conducting inspections to ensure that it is in fact effective,” Gillis said. “While we’re in there, we will also test water at the tap for the presence of lead. You can be contaminated or exposed through sources other than lead paint, and water is one of them. So we’ll test water at the tap, and we’ll do a visual inspection for the condition of any asbestos as well.”
The Army says it’s accommodating families who have asked to move out of their current houses over lead or asbestos concerns, but says only a handful have asked to do so. However, a fact sheet produced by the Army itself notes that most children who’ve been exposed to lead don’t exhibit any immediate symptoms, and blood tests are the only way to know for sure that they’ve been exposed in ways that might be harmful.
The Reuters investigations are not the only indication of substandard living conditions in privately-managed on-base housing.
The Defense Department’s inspector general has several open recommendations that the Pentagon has not resolved to the IG’s satisfaction, including from a 2015 inspection of military housing in the national capital region.
That review, which examined a sample of housing at two bases — Joint Base Anacostia-Bolling in Washington, D.C., and Fort Belvoir in Northern Virginia — found 316 separate electrical, fire, and other safety and health problems.
Among many other recommendations in its 2015 report, the IG said the Army and Navy should conduct more routine inspections to ensure their housing contractors were complying with standards.
The Army disagreed.
“Guidance received from the [office of the assistant secretary for installations] prohibits Army personnel from conducting health and welfare inspections of privatized homes,” officials wrote at the time. “Lack of available resources and projected future reductions in resources do not adequately provide for or allow additional oversight of housing facilities.”
That, in a nutshell, is the upshot of the latest Reuters investigation.
Families living in military housing can’t complain to state and local officials who would otherwise handle health concerns or myriad other tenant protections offered under state laws. That’s because the houses are on military property. But the military services themselves have tended to take the position that it is not their business to involve themselves in individual landlord-tenant disputes.
The Naval Facilities Engineering Command, which oversees all of the Navy and Marine Corps’ privatized housing contracts, has only “a limited role” in day-to-day operations, NAVFAC’s assistant commander, Scott Forrest, told the news agency.
During last week’s formal activation ceremony for the Army’s new Futures Command — held, coincidentally, just 24 hours before the death of Sen. John McCain (R-Ariz.) — Army leaders disclosed something about the command’s origins not yet publicly discussed.
The senator himself played an instrumental role in the command’s creation, starting more than two years before officials publicly announced their plans to stand it up, they said.
“None of this would be happening without someone who’s not here today, and that’s Sen. John McCain, an American hero,” Gen. Mark Milley, the Army’s chief of staff said at the Austin ceremony on Friday.
Milley went so far as to characterize the creation of AFC – which the Army calls its most significant reorganization since Vietnam – as “Senator McCain’s idea.”
Although the new four-star command took more than two years to plan and design, with deep involvement by the service’s most senior leaders, Milley recalled that the genesis was a private meeting he had with McCain, the chairman of the Senate Armed Services Committee, just before his August 2015 confirmation hearing to become Army chief.
“He talked to me about a lot of the problems and challenges he thought the Army had in the area of acquisition, procurement, science, technology, research, development, modernization, and all of those things,” Milley told reporters later. “And he said, ‘You really have some significant challenges here, and I want you to think about how you’re going to reform the Army.’ I wasn’t the first person he ever told that to, but I was on the eve of confirmation. I said, ‘Oh, I’d better be paying attention to this guy, because this is the guy who’s either going to confirm you or not.’”
Milley said the planning efforts for AFC included months of quiet, ongoing conversations with McCain about how the command would be structured and how the Army could solve some of its procurement challenges by moving its modernization functions “under one roof.” Those plans were eventually revealed to the public in October 2017.
There were times during the planning process when the idea of a new modernization command appeared to have ground to a halt, in part because of a lack of political appointees within the Army who could champion the idea, according to Gen. John Murray, who became AFC’s first commander on Friday.
But Ryan McCarthy received Senate confirmation as undersecretary of the Army in August 2017, and quickly got behind it. So did Mark Esper, the secretary of the Army, when he was confirmed in November.
“I really thought that it was dead,” Murray said. “We worked very hard for about a year on it, and then Gen. Milley just quit mentioning it. But sir, I think you were really waiting for the right political leadership, and I believe the right political leadership arrived. You very quickly gained the backing of Secretary Esper and then really the muscle and the horsepower of the dynamic duo: the undersecretary and [Vice Chief of Staff] Gen. [James] McConville that really drove this home. And none of it would have been possible without Senator McCain and the entire congressional support.”
Two senior staff members from McCain’s office also attended the ceremony, which was held just an hour after the senator’s family announced he would be discontinuing medical treatment.
John Cornyn, the senior senator from Texas, also noted McCain’s role.
“I wish he could be here, because I know that he was key,” Cornyn said. “I know he would love to be here and be pleased.”