Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.

Sign up for our Reporter’s Notebook email alert.

A deeper dive into Energy’s cyber defenses

The Energy Department’s cybersecurity is awful — well, at least that’s what many people believe based on the recent USA Today story.

The news organization found hackers were successful 159 times in penetrating Energy’s network between 2010 and 2014, including 53 instances where the attackers took control of the “root” servers.

USA Today found that the National Nuclear Security Administration (NNSA) experienced 19 successful attacks during the four-year period, according to the Freedom of Information Act records it obtained.

On the surface, it seems the Energy Department is just another federal agency that is, in the words of former White House cybersecurity official Melissa Hathaway, complacent, apathetic and/or negligent when it comes to securing its networks and data.

But when you take a closer look at the statistics, the picture isn’t all bad.

First off, the USA Today story highlights that the 159 successful intrusions came out of 1,131 attempted cyber attacks, a success rate of about 14 percent.

Industry averages for successful cyber attacks are hard to come by. Trustwave, a cyber company, said at the BlackHat conference in August that one widely used attack tool, the RIG exploit kit, had a success rate of 34 percent.

While the saying goes that even one successful attack is too many, Energy’s defenses stopped attacks 86 percent of the time. That is a much different story than saying Energy has been breached more than 150 times.
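
For readers who want to check the math, here is a quick back-of-the-envelope calculation in Python, using only the figures reported above:

    # Figures from the USA Today/FOIA reporting cited above
    total_attacks = 1131
    successful_intrusions = 159

    success_rate = successful_intrusions / total_attacks   # about 0.14
    stopped_rate = 1 - success_rate                         # about 0.86

    print(f"Attacker success rate: {success_rate:.0%}")   # 14%
    print(f"Attacks stopped by Energy: {stopped_rate:.0%}")  # 86%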

Michael Johnson, Energy’s chief information officer, opened the door a bit wider at the AFCEA-Intelligence and National Security Alliance conference on Sept. 10 to just what the department is up against and the progress it’s making.

While Johnson didn’t directly address the USA Today story, he did offer some numbers about what the agency is trying to protect:

  • DoE’s cyber budget is $1.5 billion
  • More than 600 total information systems
  • More than 300,000 unclassified end points
  • About 200 data centers
  • More than 37,000 mobile devices.

“We at DoE are putting in place a framework to simultaneously advance and safeguard the mission of the Department of Energy and enhance how we deter and defend against adversarial actors in the cyber space,” Johnson said. “There are five main ways we are doing this. Number one, we are advancing cyber as both information sharing and information safeguarding. It’s hugely important to have a common semantic framework when you are working cyber issues so you know and realize that cyber is about both advancing the mission of the department that I represent and also about safeguarding it and what you do to keep it safe.”

The second approach focuses on making better use of cyber intelligence, such as the department’s Joint Cyber Coordination Operations Center, which it launched in 2014 with a handful of initial capabilities.

“It’s important to provide operational and situational awareness to inform, for example, the kill chain, when working through threats. What we must support is instantaneous, real time and automated sharing of cyber threat information,” Johnson said. “With instantaneous and automated sharing, an attack only succeeds once, and hence we lower the total number of successful attacks. So if there is a successful attack, for example a zero day, it should only work once, and that forces the bad guys to start over and over again.”

The third piece is moving off of legacy systems. The fourth is implementing cyber best practices, such as two-factor authentication and application whitelisting.
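
Application whitelisting, for those unfamiliar, means only pre-approved executables are allowed to run. Below is a deliberately simplified Python sketch of the idea (not Energy’s actual tooling, and the hash value is just a placeholder): a binary’s SHA-256 digest is checked against an approved list before it is allowed to execute.

    import hashlib
    import sys

    # Simplified illustration of application whitelisting, not any agency's real tooling.
    # Only executables whose SHA-256 hash appears on the approved list may run.
    APPROVED_HASHES = {
        "b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c",  # placeholder hash
    }

    def is_allowed(path: str) -> bool:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest in APPROVED_HASHES

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            print(path, "allowed" if is_allowed(path) else "blocked")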

The final piece is investing in cyber research and development to improve trust in cyberspace.

There is a lot going on at Energy. It’s not that these 159 successful cyber attacks aren’t worrisome. As we’ve seen with the Office of Personnel Management and so many others, one bad day can ruin millions of people’s lives.

But the Energy Department, like every other agency, isn’t just sitting back and waiting for an attack through gaping holes in its networks, as some would insinuate. The complexity of why hackers succeed goes far beyond thousands of pages of FOIA’d documents, and that’s what USA Today and others too often overlook.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


New cyber threat center to hit initial stride in October

Just in time for cybersecurity awareness month in October, the White House will launch the initial operating capability of the cyber threat intelligence integration center (CTIIC).

Michael Daniel, the White House cybersecurity coordinator, said Sept. 10 at the NIST cybersecurity event in Washington that some of the initial capabilities are moving in place.

“We are hopeful we can get all of that together and have it start producing some of its products in the first part of the fiscal year,” Daniel said.

As a quick refresher, the White House announced the creation of the CTIIC in February, modeling it after the approach used after the Sept. 11, 2001 attacks to better bring together terrorism-related information. The broader goal of the CTIIC is to look beyond the ones and zeros, and combine what the intelligence community knows about malware with what it also knows about the rest of the world, including the state and non-state actors who are using it.

Daniel said he hopes the new cyber information sharing construct plays mostly a behind-the-scenes role.

“If it does its job well, it will not be something that is terribly visible to the outside world,” he said. “It’s really designed to actually enable the government to get its act together and really understand that intelligence picture much more effectively.”

Daniel said creating this full picture of the government’s intelligence capabilities will take some time.

But Congress and the White House already are fighting over the CTIIC even before it gets off the ground.

In the House’s version of the 2016 Intelligence Authorization Act, lawmakers detail prescriptive rules around the creation of the organization. The bill, H.R. 2596, which passed the full House in June, would cap the CTIIC at 50 permanent positions, bar it from augmenting that staff with contractors, detailees or other typical arrangements, require it to be located in an office owned or run by the intelligence community, and lay out five primary mission areas.

The White House pushed back in its Statement of Administration Policy on the bill, saying it objects to the House’s provisions because it expands the role of the CTIIC into functions that already are being performed in the government.

“Given the rapidly changing nature of cyber threats to the United States, the CTIIC will require flexibility in executing its core functions,” the administration wrote. “Furthermore, the limits this bill would place on CTIIC’s resources, and the expansive approach the bill would take with regard to CTIIC’s missions, are unnecessary and unwise, and would risk the CTIIC being unable to fully perform the core functions assigned to it in the bill.”

The Senate’s version of the intelligence authorization bill, S.1705, received approval from the Intelligence Committee in July. The report on the bill, however, doesn’t mention the CTIIC or even cyber.

Related to the CTIIC, Daniel said that in the summer of 2014 his office reconstituted an interagency group focused on cyber responses.

“It’s the body I use to help coordinate interagency response to major cyber incidents,” he said. “We also are looking to mature how we engage with the private sector. We’ve spent an extensive amount of time to-date with the financial services industry, talking about how we can better partner with that industry. To that end, we’ve been convening a series of table-top exercises to better understand how both sides actually respond to incidents. One of the key lessons we’ve learned is that neither side really understands what the other one does when the balloon actually goes up.”

Beyond the CTIIC, Daniel offered a few other tidbits worth noting.

He praised the Office of Management and Budget’s efforts around cyber, saying Tony Scott, the federal chief information officer, and his staff have really taken critical steps through the cyber sprint and other actions to improve the security of federal data and networks.

“We see many systemic weaknesses across federal IT networks and we really have to work to improve the IT security of those networks,” he said. “We worked very close with Tony Scott to help create and work on the cybersecurity sprint, which really focused on patching critical vulnerabilities rapidly within the federal government, actually figuring out how many privileged users we really had across the federal government, and tightening down on those, making sure we are deactivating accounts when they are no longer needed, and dramatically accelerating implementation of multi-factor authentication, especially for privileged user.”
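
To make those sprint priorities concrete, here is a minimal sketch, with invented account data and field names rather than any agency’s actual schema, of the sort of check the sprint called for: flag privileged accounts that still lack multi-factor authentication, and flag accounts that have gone unused long enough to warrant deactivation.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Invented account records for illustration; real agency inventories differ.
    @dataclass
    class Account:
        user: str
        privileged: bool
        mfa_enabled: bool
        last_login: date

    def sprint_findings(accounts, today, stale_after=timedelta(days=90)):
        needs_mfa = [a.user for a in accounts if a.privileged and not a.mfa_enabled]
        deactivate = [a.user for a in accounts if today - a.last_login > stale_after]
        return needs_mfa, deactivate

    accounts = [
        Account("admin01", privileged=True, mfa_enabled=False, last_login=date(2015, 9, 1)),
        Account("analyst7", privileged=False, mfa_enabled=True, last_login=date(2015, 2, 1)),
    ]
    needs_mfa, deactivate = sprint_findings(accounts, today=date(2015, 9, 14))
    print("Privileged users still needing MFA:", needs_mfa)  # ['admin01']
    print("Accounts to deactivate as stale:", deactivate)    # ['analyst7']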

Daniel said his office is working with OMB on implementing a plan to further operationalize the protection of federal networks, including the need to standardize and automate many of the manual processes, strengthening the security across the entire lifecycle of a network or system, retiring legacy systems and reducing the attack surface agencies face by segmenting networks.

“This surge to better protect the federal enterprise will be augmented by a range of policy tools. For the first time in 15 years, OMB is updating Circular A-130. As long as I’ve been doing cybersecurity, there has been talk of updating A-130, so I think it’s great we are finally getting around to it,” Daniel said. “Our goal is by the end of 2015 to have this updated foundational document for federal IT policies.”

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


Inside the Reporter’s Notebook – NASA, Energy, Commerce rearrange seats in CIO’s office

Inside the Reporter’s Notebook is a biweekly dispatch of news and information you may have missed or that slipped through the cracks at conferences, hearings and other events.

This is not a column or commentary — it’s news tidbits, strongly-sourced buzz, and other items of interest that have happened or are happening in the federal IT and acquisition communities.

As always, we encourage you to submit ideas, suggestions and, of course, news to Jason via email.

Be the first to know when a new Inside the Reporter’s Notebook is posted. Sign up today for our new Reporter’s Notebook email alert.


GSA plays Captain Ahab in latest effort to streamline schedules

The General Services Administration has been taking small steps to modernize its schedule contracts for much of the last five years.

But now GSA is going after the white whale of schedule contracts: the IT Schedule, known as Schedule 70.

Schedule 70 accounted for $14 billion out of $32 billion in agency spending against all the GSA schedules in 2014.

And the agency is looking for help.

GSA issued a request for information Sept. 3 asking if the requirement for vendors to have at least two years of experience is necessary anymore to be included on Schedule 70 for IT products and services. GSA also wants to know whether this requirement is preventing cutting-edge IT firms from providing products or services to the government.

GSA is proposing that companies without two years of corporate experience instead describe at least three relevant projects that demonstrate they are qualified to work for the government.

Responses to the RFI are due by Sept. 18.

“This is a positive step by GSA and the administrator,” said Roger Waldron, president of the Coalition for Government Procurement, in an email statement to Federal News Radio. “The coalition is pleased to see GSA acting on recommendations we have made regarding streamlining IT Schedule 70. We look forward to working with GSA to simplify and streamline the proposal submission and evaluation process for IT Schedule 70.”

Waldron said the proposed changes in the RFI are consistent with the “agile” approach to reducing barriers to entry for commercial firms seeking to enter the federal marketplace.

“The coalition also believes that the lessons learned and insights gained from this streamlining initiative can be applied across the entire GSA schedules program,” he said.

GSA joins a growing list of agencies wanting to reduce barriers for new companies. The Defense Department has opened up an office in Silicon Valley to recruit and help companies become defense contractors.

The Air Force also is trying to attract non-traditional vendors through a series of efforts such as Plugfest-Plus.

GSA’s 18F, the Department of Health and Human Service’s Buyer’s Club and others also are trying to take steps to draw the interest of these cutting-edge companies.

This update to Schedule 70 is part of a broader initiative to improve the GSA schedules.

In June 2012, GSA announced it would move to a demand-based approach to the contracts. But that soon was abandoned for new approaches that focused on category management and strategic sourcing.

Then in April, GSA announced another set of efforts around collecting pricing data, reducing the burden of selling commercial items to the government and adding flexibility at the order level for products and services.

GSA also has said it will open its e-Buy tool to the general public to view what requests for quotes and requests for information are available — closing what has been a major hole in the government’s quest for greater transparency for the last 25 years.

The agency also recently launched an improved automated price reductions tool for schedule contract holders, and is in the midst of upgrading GSA Advantage and the DoD EMALL, with a goal of migrating these tools to a common platform so it’s easier for vendors to upload and modify their product or service catalogs instead of having to upload the catalog to multiple systems.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


Bid-protest decision impacting mergers and acquisitions

An interesting bid protest decision came down from the Government Accountability Office recently that could have broad repercussions, especially in the wake of all the mergers and acquisitions happening in the federal marketplace.

First the basics of what happened:

GAO ruled in favor of FCI Federal in its protest of a $209 million contract award to USIS’ Professional Services Division in July 2014 to support 68 Homeland Security Department Citizenship and Immigration Services field offices and 10 asylum offices throughout the United States.

FCI Federal claimed CIS didn’t reasonably consider and document how it reviewed the allegations of fraud against the awardee’s parent company in determining USIS PSD’s responsibility.

“We sustained the protest, finding that the record showed that the contracting officer failed to obtain and consider the specific allegations of fraud alleged by the Department of Justice (DOJ) against the awardee’s parent, relying instead on general media reports,” GAO wrote in its decision on Aug. 5, which was posted later in the month. “We also found that the contracting officer failed to consider the close relationship between the awardee and its then parent company, USIS LLC, with respect to the contemplated approach to contract performance, mistakenly believed that the two companies were separate, and misunderstood the legal standards related to affirmative responsibility determinations.”

Basically what GAO is saying is when a company is sold from one parent to another or if one company takes over another in a merger or acquisition, the contracting officer needs to reassess the proposal and whether the new company is capable of meeting the solicitation’s requirements. The agency also must document this evaluation as part of its review of the bids.

Bill Shook, a procurement attorney who didn’t represent FCI Federal or USIS, said this decision is one of the few times GAO deals directly with determinations of responsibility.

“The facts are very unusual — a parent company in trouble that spins off a subsidiary with the subsidiary having relied upon the parent to perform a significant portion of the contract. Although government contractors are bought and sold on a regular basis so that an award may be made to a ‘new’ company, generally the contractor being bought or sold remains responsible for performance or the seller would still be capable of performing as needed because it was not under a fraud investigation,” he said. “So the case applies to the narrow circumstance where the offeror is sold during the course of a procurement and its parent is not able to participate as a subcontractor in the subsequent performance because of fraud concerns. It does happen but rarely. A far more common factual pattern would be an offeror is sold and is still responsible because all of the assets necessary for performance go with it.”

This is especially important as the Justice Department continues its aggressive pursuit of False Claims Act violations. DoJ reported a record number of fines levied on firms in 2014, and 2015 is shaping up to be no different based on what we’ve seen so far. USIS’ new parent company Altegrity was part of a settlement recently where it gave up $30 million in fees that it said the government owed.

An FCI Federal spokesman said the company was pleased with GAO’s decision.

Its attorney, Claude Goddard of the Polsinelli law firm, said the bid protest judgment clarifies the linking of responsibility determinations and awards.

“A responsibility determination, by law, must be made before an award decision, but the award decision should be made as soon as possible after the responsibility determination,” Goddard said. “The agency failed to follow those rules here. It tried to take a short cut by using a responsibility determination to justify an award decision made nearly 10 months earlier. In doing so, the agency ignored important developments — the sale of the business entity that submitted the offer — that rendered the original award decision outdated and obsolete. The awardee in its proposal had relied extensively on the past performance, corporate experience and corporate resources of its parent corporation, and the agency had relied on those same factors in assigning the awardee’s proposal high technical ratings. None of those factors applied, however, once the awardee was sold to another contractor. The agency erred by trying to use the later responsibility determination to justify an award that no longer was rationally based given the changed circumstances.”

GAO recommended CIS reopen discussions with all offerors remaining in the competition, request revised proposals, undertake a new evaluation of those revised proposals and make a new selection decision.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


NASA, Energy, Commerce rearrange seats in CIO’s office

NASA and the Energy Department are going through major changes in their chief information officer’s shop.

Let’s start with Energy. Rod Turk, the department’s chief information security officer for about a year, is heading back to the Commerce Department in the same capacity. Turk came to Energy from Commerce in August 2014.

Sources say the opportunity to work for Commerce CIO Steve Cooper was too good for Turk to pass up.

But other sources say there may be something more going on at Energy.

Michael Johnson, Energy CIO since March, appears to be planning a major reorganization of his office. Sources say other long-time Energy IT officials may be on the move as well, or at least changing roles in a significant way.

Turk is at least the second senior executive to leave Energy since April when Don Adcock, the deputy CIO at the time, took a job with the private sector. Adcock now is executive director of global operations at ActioNet, an IT security and software company.

So be on the lookout for other major personnel and organizational changes at Energy in the coming weeks.

Meanwhile at NASA, as we reported in last week’s notebook, Larry Sweet is indeed retiring, but it’s happening sooner — by Nov. 30 — and he’ll be out as the CIO later this month.

An email obtained by Federal News Radio says Sweet told staff members that Renee Wynn will be his permanent replacement, starting Sept. 26.

Meanwhile, Sweet will work on what one source termed “special projects” for his final two months before retirement.

Wynn came over to NASA in July after spending her 24-year federal career with the Environmental Protection Agency.

It’s interesting that NASA chose Wynn so quickly without going through a formal hiring process. It’s not that Wynn isn’t qualified — she served as acting EPA CIO previously — but the NASA CIO position is a coveted executive role for its mission, its opportunities and its stature. But maybe NASA management didn’t want another internal candidate after Sweet and Linda Cureton, whom Sweet replaced, both came from NASA centers.

The other question is whether Wynn will follow in the footsteps of Johnson, Veterans Affairs Department CIO LaVerne Council and Transportation Department CIO Richard McKinney, and change over staff and organizational focus.

Other changes in the CIO community

Commerce’s CIO Cooper is not only adding Turk, but Renee Macklin too.

Macklin left as the CIO at the Small Business Administration earlier this summer and joined Commerce as its director for IT services, heading up the agency’s new shared services organization. Sources say she is leading Commerce’s work for IT shared services and supporting the technology underlying shared services for acquisition, human resources and finance.

Macklin came to SBA in December 2013 after spending the previous 14 years at Commerce’s International Trade Administration.

Keith Bluestein, who has been deputy CIO at SBA since May, takes over as interim CIO until the agency hires a permanent executive.

The Social Security Administration also quickly moved to replace its CIO. Bill Zielinski, SSA’s deputy commissioner for systems and CIO since August 2013, left for a detail at the Office of Management and Budget to lead the “agency oversight team.”

SSA updated its website saying Robert Klopp is the new CIO and deputy commissioner. Klopp has been SSA’s chief technology officer since January and before that worked in industry, most recently at SAP and EMC.

Zielinski’s new role at OMB isn’t entirely clear. The agency oversight team for the CIO’s office is a new construct.

Zielinski will oversee part of the FedSTAT process, which OMB introduced in May as part of the agencies’ fiscal 2017 budget process. But the federal CIO’s portion of FedSTAT isn’t entirely clear, nor is it certain from the link on the CIO Council’s blog whether it is indeed the aforementioned FedSTAT process or the Fedstats effort around better managing federal statistics.

NSA, FERC also shuffle CIO roles

The National Security Agency brought on Greg Smithberger to be its CIO earlier this summer, replacing Lonny Anderson, who remains with the agency in a different position. No further details were available about either gentleman.

Sanjay Sardar, a well-respected and familiar figure in the federal community, moved on in July from his post as CIO of the Federal Energy Regulatory Commission (FERC).

According to Sardar’s LinkedIn profile, he joined SAIC as its vice president of data sciences.

FERC hasn’t updated its website about who is the acting CIO.

In non-CIO news, the General Services Administration and the Environmental Protection Agency join the growing ranks of agencies with a chief data officer or chief data scientist.

Kris Rowley became GSA’s CDO in late April after previously working for the agency’s Office of Governmentwide Policy, where he stood up a performance management line of business and developed an application to standardize and collect performance management data.

Rowley has been with the government for more than 13 years, which included stints at the IRS, OMB and the Treasury Department.

Over at EPA, Robin Thottungal will be joining as the division director for the Environmental Analysis Division (EAD) within the Office of Information Analysis and Access, and as the chief data scientist.

An email from EPA CIO Ann Dunkin, which Federal News Radio obtained, said Thottungal starts later this month after spending most of his career in the private sector.

Most recently, Thottungal worked at Deloitte Consulting where he focused on large scale analytics projects for public sector and commercial clients. He also led the global big data community of practice for Deloitte, developing analytical frameworks and go-to market strategy for big data and analytics solutions.

Additionally, Thottungal is the vice chairman of the Institute of Electrical and Electronics Engineers (IEEE) Washington, D.C. section, as well as the chapter chairman for the IEEE Computational Intelligence Society.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


Jason Miller: Who should own veterans.gov?

Veterans Affairs wants to make it easier for veterans to find its services. One suggestion by Secretary Bob McDonald is to create a veterans.gov website. But there's a problem. The Labor Department owns that URL. Federal News Radio's executive editor Jason Miller writes about this situation in his weekly feature, "Inside the Reporter's Notebook." He joined the Federal Drive with Tom Temin to discuss a potential solution to Labor and VA's problem. Read Jason's full notebook.


Inside the Reporter’s Notebook – Veterans, Labor already have the answer to Veterans.gov question

Inside the Reporter’s Notebook is a biweekly dispatch of news and information you may have missed or that slipped through the cracks at conferences, hearings and other events. This is not a column or commentary – it’s news tidbits, strongly-sourced buzz, and other items of interest that have happened or are happening in the federal IT and acquisition communities. As always, we encourage you to submit ideas, suggestions and, of course, news to Jason via email. Be the first to know when a new Inside the Reporter’s Notebook is posted. Sign up today for our new Reporter’s Notebook email alert.


Veterans, Labor already have the answer to Veterans.gov question

Behind closed doors, there is a growing disagreement — I wouldn’t call it a dispute quite yet — about how to make services for veterans easier to find.

Sources say the departments of Labor and Veterans Affairs don’t quite see eye-to-eye about how best to use the domain veterans.gov.

VA Secretary Bob McDonald brought the discussion semi-public recently in comments made during an event hosted by Politico where he said, “Our websites have unusual names. E-Benefits, MyHealtheVet, etc. What’s wrong with Veterans.gov or Vets.gov? Rather than looking at everything through the lens of the bureaucracy toward the customer, let’s look at everything from the lens of the customer.”

Sources say there have been some discussions between Labor and VA about whether VA should own the veterans.gov website or whether Labor should continue to run it — as they have since 2001. By the way, veterans.gov redirects to the Veterans Employment and Training Service.

Read more


Sequestration rears its ugly head in year-end spending trends

The final push to spend fiscal 2015 money begins Sept. 1. And with sequestration rearing its ugly head once again and Congress speaking out of both sides of its mouth about a shutdown or no shutdown starting Oct. 1, agencies seem to be going on a spending spree before the 2016 horror story returns.

“We are looking at agencies motivated right now to spend against any authority they have, even multi-year money because theoretically that multi-year money could be reduced by a certain authority come Oct. 1,” said Steve Charles, co-founder of the ImmixGroup. “It’s a little like the year before sequestration kicked in, I guess two years ago. We saw a real cleaning of the pipes so to speak, a real spurt at year-end. This is similar, maybe not quite as dramatic, and so it’s going to be big year end.”

Take the Alliant IT services multiple award contract (MAC) run by the General Services Administration.

Casey Kelley, the Alliant program manager, said as of Aug. 21, agencies have obligated more than $211.3 million against the contract. That is way up from August 2014 when agencies only spent $13.2 million, and even higher than 2012 and about equal to 2011. In 2013, Alliant saw a huge spike of $954 million in revenue in August.

Read more


DoD’s new cyber rule misses critical consideration

The Defense Department’s interim rule detailing new cybersecurity requirements for contractors and for cloud services caused a lot of excitement in the federal community last week.

But what was lost in the discussion is whether the new cyber standards the Pentagon is laying out actually can be enforced and if so, what’s it going to cost DoD in the long run.

Rob Carey, a former DoD deputy chief information officer and now vice president of Navy Marine Corps programs at Vencore, said without a doubt, the new standards make sense and will help.

“This has been bubbling for years,” Carey said. “It’s good that the government and DoD are pushing expectations to industry in order to do business with the government by saying you will do the following things to protect information. I get it. The downside, however, is industry is ill prepared for this level of granularity and structure in the defensive infrastructure of their networks. In this case, the government is better at this, especially DoD, and yet they still have their hands full.”

Carey said the challenges for industry in meeting DoD’s requirements are two-fold.

First, industry doesn’t have the same requirements as the government — meaning standards from the National Institute of Standards and Technology or the National Security Agency, or even laws such as the Federal Information Security Management Act — so companies aren’t currently prepared to easily answer many of the questions the military is asking in the case of a cyber breach. And as we all know, there are two kinds of companies: those that have been breached and know it, and those that have been breached and don’t know it.

Second, Carey said there aren’t any details on how the government will oversee or ensure vendors are meeting these cyber requirements.

“Where we are heading is to ask who is liable if there is a breach, and I’m not sure if this gets us there,” he said. “It will raise the bar, but also drive up the cost of doing business with industry because they have to pay for the new controls, both hardware and software.”

Both of Carey’s points are valid. Over the years, industry executives have said they have the ultimate motive to be cyber secure: profit. But hacks of government and non-government organizations continue to happen at an alarming rate. So if the profit motive is pushing industry to secure its systems and networks, why isn’t it doing a better job? And why would DoD need these requirements in the first place? (I know–because Congress passed a law. But lawmakers wouldn’t have seen a need for this law if there weren’t a rash of cyber attacks, and support from DoD.)

DoD reported in June that more and more the military services are adding contract language into awards requiring vendors to report cyber breaches.

DoD even admits this rule will cause vendors to provide a different or new type of effort.

“This rule requires that contractors report cyber incidents to the DoD. Of the required reporting fields several of them will likely require an information technology expert to provide information describing the cyber incident or at least to determine what information was affected, to be noted in the report,” the interim rule stated.
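
To give a sense of what “reporting fields” means in practice, here is a purely hypothetical sketch of a structured incident report; the field names are invented for illustration and are not the rule’s actual fields. Filling in entries such as the incident description or the affected systems is the part that will usually require an IT expert.

    from dataclasses import dataclass, field
    from typing import List

    # Hypothetical incident-report structure for illustration only;
    # the actual DFARS reporting fields differ and are defined in the rule.
    @dataclass
    class CyberIncidentReport:
        contractor_name: str
        contract_number: str
        date_discovered: str            # e.g. "2015-09-10"
        incident_description: str       # typically written with an IT expert's help
        systems_affected: List[str] = field(default_factory=list)
        data_compromised: List[str] = field(default_factory=list)

    report = CyberIncidentReport(
        contractor_name="Example Defense Co.",
        contract_number="HQ0000-15-C-0000",   # placeholder, not a real contract number
        date_discovered="2015-09-10",
        incident_description="Suspicious outbound traffic from an engineering workstation.",
        systems_affected=["CAD file server"],
        data_compromised=["unclassified technical drawings"],
    )
    print(report)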

The challenges are especially true for small firms. DoD estimated this rule would affect about 10,000 contractors, less than half of which are small firms.

“These are steps down the right path, but I’m not sure how you enforce this,” Carey said. “I don’t think anyone should get excited about how this will fix all the cyber issues we face. Until there are huge investments in research and development, we are going to struggle with cybersecurity. What this is, is a set of things that will drive industry to get better and be more proactive, but also drive prices up.”

The question of burden is one that is coming up often, particularly around acquisition reporting. But the cyber burden question also needs to be considered.

So, comments to DoD on these new requirements are due by Oct. 26.

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.


GSA gives agile development a shot in the arm with awards to 16 vendors

The much-anticipated blanket purchase agreement for agile development services went to 16 vendors late on Friday, setting up the latest gold rush in federal procurement.

The General Services Administration’s 18F chose the six large and 10 small contractors after they demonstrated a working prototype based on a public dataset — and then showed their work in a publicly available git repository.

“By finding design firms that can turn around a prototype in just a few weeks, user researchers that can survey users to provide information before an opening sprint, and developers that use open source code rather than reinventing the wheel, 18F will be able to turn projects around faster for our federal partners, saving time and taxpayer dollars, as well as creating products better focused on the end user,” GSA said in the release.

The BPA has a ceiling of $25 million over five years. 18F issued the request for proposals in June, planning to award places on the BPA to contractors who will work only on 18F projects. Later, 18F hopes to establish a second BPA for governmentwide agile services.

GSA said it’s still evaluating other bids under the small business set-aside portions of the BPA — one for design and one for development.

The goal from the beginning was two-fold.

First, 18F wanted to take a different approach to awarding a contract, not basing it on long written qualifications, but actually having vendors show what they can do.

Second, 18F wanted an award process that didn’t take a year.

On the surface, the year-old organization seems like it met both goals for the full-and-open portion of the BPA.

GSA says 88 firms submitted proposals and time to award was less than 70 days.

“We were looking for design firms that could turn around a prototype in just a few weeks, user researchers that could survey users to provide information before an opening sprint, and developers that use open source code rather than ‘reinventing the wheel,’” 18F stated in the Q&A.

So what comes next for the BPA and awardees?

“First, they should soon expect to begin seeing opportunities to do what they do best: Deliver working solutions to government customers that showcase the ability of agile processes to transform the way we buy and build digital services,” 18F stated in the Q&A. “Second, they should expect to maintain an ongoing dialogue with us about what works and what doesn’t on a project-by-project basis. We view the vendor community as ‘users’ of the agile BPA, and their feedback, knowledge, and insight will help us continue to find ways to innovate and improve the acquisition process. Third, they should expect to see us continue to experiment with ways to improve the procurement process. Our end goal is to reduce the friction associated with doing business with the government. This allows delivery teams to focus on the end user experience, driving better solutions at lower costs.”

The question now is whether the unsuccessful bidders will protest, and if these agile concepts can move from burgeoning idea to fully institutionalized approach.

And, of course, if this BPA is successful, how long will other agencies wait until they start setting up their own contracts and create an entirely new proliferation of multiple award contracts?

This post is part of Jason Miller’s Inside the Reporter’s Notebook feature. Read more from this edition of Jason’s Notebook.

