Reporter’s Notebook

“Reporter’s Notebook” is a weekly dispatch of news tidbits, strongly-sourced buzz, and other items of interest happening in the federal IT and acquisition communities.

Submit ideas, suggestions and news tips to Jason via email.


Draft FITARA policy elicits few comments, but new IT commission to help

Just as agencies and vendors are telling the Office of Management and Budget what they think about the draft Federal IT Acquisition Reform Act (FITARA) guidance the administration issued in early May, a new public-private organization is offering its help.

The Technology Business Management (TBM) Council launched the Commission on IT Cost Opportunity, Strategy and Transparency (IT COST) on May 14, led by public and private sector chief information officers. The council is a nonprofit focused on developing a definitive framework for running technology organizations like a business.

The commission says it will “define a set of recommendations and best practices for federal departments and agencies to transparently measure and communicate their IT costs so that federal CIOs are better equipped to govern their IT spending and support agency missions with limited resources.”

Those are pretty big plans for the commission, and it seems to be digging up ideas that have been around for some time. Remember Raines Rules?

Raines Rules, developed by former OMB Director Frank Raines during the Clinton administration, is really just basic IT management guidance that helps ensure agencies ask the right questions and focus on changing the business model instead of just applying technology.

But as OMB finalizes the FITARA guidance, the council believes it has a real opportunity to bring some commercial best practices into government.

The commission plans to hold its first meeting by September and issue a series of recommendations in early 2016.

The commission says it will:

  • Aid in the implementation of FITARA, which gives federal CIOs centralized control over their agency’s spending for commodity IT.
  • Reduce waste and increase efficiency of federal sector IT spend.
  • Empower federal CIOs to demonstrate the cost, quality and value of their IT spend.

“We realize that commercial and federal CIOs approach technology cost accounting and management very differently,” said Doug Lane, CEO of Capgemini Government Solutions. “However, there are key private sector learnings that can be applied to the federal space that we believe will have a substantive, lasting impact on the way public sector CIOs manage and communicate the value of their technology investments.”

Lane is one of 12 private sector technology experts on the commission. Others include Sunny Gupta, CEO of Apptio; Ralph Kahn, vice president of federal for Tanium; Rebecca Jacoby, CIO of Cisco; and George Westerman, a research scientist at the Massachusetts Institute of Technology.

Five federal CIOs also are on the commission, including Frank Baitman, CIO of the Department of Health and Human Services; Richard McKinney, the CIO of the Department of Transportation; Sylvia Burns, the CIO of the Department of the Interior; Steve Cooper, the CIO of the Department of Commerce; and Joyce Hunter, the acting CIO of the Department of Agriculture.

By the way, OMB hasn’t received a lot of public feedback yet on the FITARA draft guidance. According to the GitHub site, OMB has received only six comments so far: two were technical corrections and four were more substantive.

One commenter focused on the need to give agency inspectors general more independence, and another focused on the lack of any mention of enterprise architecture in the draft policy.

Federal CIO Tony Scott asked for a majority of the initial comments to come in the first two weeks so his office could improve the guidance and put it out again for a second round of public comment.



Marine Corps CIO Nally to retire

After 34 years in the Marine Corps, Brig. Gen. Kevin Nally is retiring.

Nally surprised most people at the AFCEA Naval IT Day May 14 by announcing his decision to leave the Corps in July.

“It’s time to transition and I’m really looking forward to the transition,” Nally said.

When asked what he plans on doing after his service is over, Nally only would say, “I know what I don’t want to do.”

And that response is why Nally was a favorite on the speaker’s circuit and with reporters. He is honest, candid and funny, three traits not seen nearly enough among federal officials.

Of course, that got Nally in some minor trouble once in a while, too. And he let this reporter know, usually publicly, when something didn’t sit right with his brass.

But Nally was always accessible to industry and the press at conferences, usually came with something to say, and you knew you were getting his true view of the world.

During his five-year tenure as Marine Corps CIO, Nally moved the organization forward in several regards: he brought it closer to bring-your-own-device (BYOD) policies, took over the operation and ownership of the Navy-Marine Corps Intranet (NMCI) and its follow-on, the Next Generation Enterprise Network (NGEN), and pushed to get data and systems to the tactical edge.

One other personnel note of interest: Dan Twomey, who spent much of his career in industry, decided to take a job as a fed.

Twomey, whom many might know from his time at Unisys and GigaTrust, as well as from serving in an executive volunteer position with the Industry Advisory Council, jumped into a new role at the General Services Administration as an industry expert, working on IT category management in the Office of Strategic Programs in the Integrated Technology Solutions shop in the Federal Acquisition Service.

Twomey started in mid-April in his new role.

This is the first of several category management-related positions that GSA and the Office of Management and Budget are filling.



Navy: A glutton for RFP punishment

Really, Department of the Navy, you want to go through this again?!

The recompete of the Navy-Marine Corps Intranet (NMCI) contract was arduous: it took more than a year to award the deal and resolve the protests, and then more than 15 months to transition to the Next Generation Enterprise Network (NGEN) contract.

But like a patient dreading the dentist, the Navy knows it’s time to start thinking about that root canal it doesn’t want but desperately needs.

The DoN plans to release the first of two requests for information this summer for the follow-on to NGEN, which could be worth $3.5 billion over five years.

Phil Anderson, the deputy program manager for Naval Enterprise Networks Acquisition in the Navy’s Program Executive Office-Enterprise Information Systems (PEO-EIS), said they just started putting the teams together for the recompete.

“What is being communicated to us, which I think is a good thing, is to start early with the interaction with industry. You’ve been a part of efforts before where the PEO has sponsored industry days and other kinds of things, the team is looking really hard at other ways we can do that. We will do the industry days, but there are other things as well where we can reach out and get that information from you,” Anderson said at the 14th annual Naval IT Day sponsored by AFCEA Northern Virginia chapter on May 14. “My understanding — and I wasn’t a part of the NGEN competition — but there was a lot of ‘Here’s all the information the Navy has, industry you are welcome to look at it.’ Maybe you learn some things from that, maybe we learn some things. There has to be some input you have for us that we can use. We want to build contracts — could be a contract or contracts that are attractive to you all. You probably won’t see the same effort we had with NGEN on the recompete.”

In all, NGEN includes 34 services for 700,000 users across 2,500 sites.

Anderson said he expects the first RFI to be out in July, and another one to follow later in 2016.

Some of the questions the Navy will try to answer through the RFI process include where the best fit is between contractor expertise and the government’s needs, whether to use different contract types, whether to bundle services, whether to use the cloud and whether to offer more flexible contract options.

Anderson said the follow-on likely will include on-shore, shipboard and pier connectivity, data center management and cloud, and expanded command and control technologies.

“The NGEN contract, we started off with a 13-month transition. We’ve been told under no circumstances will we get 13 months of transition to the vendor this time around,” he said. “As you think about your participation in this contract, your bids, your responses to the RFI, we will be looking at that as well.”

The transition from NMCI to NGEN cost the Navy an extra $20 million a month in missed savings opportunities, and it had to extend NMCI several times, costing it another $1.2 billion.

The Marine Corps took a slightly different path in the transition to NGEN. Instead of the Navy’s government-owned, contractor-operated model, the Marine Corps went with a government-owned, government-operated model with contractor support.

Tom LaTurno, the enterprise data manager for Marine Corps Systems Command, said the Corps’ part of the recompete will build on its ongoing work to unify its five separate networks, integrate its secret and unclassified networks and push the Marine Corps Enterprise Network to the tactical edge.

“We do not use the NGEN contract today to buy the infrastructure. We use other methods to buy our infrastructure for our hardware, software and maintenance contracts,” LaTurno said. “So all that is potential opportunity to pull into the contract as well. All the enterprise initiatives are off the contract as well.”



TSA, HHS speed up move to cyber dashboards

The hope for and expectations of the Homeland Security Department’s continuous diagnostics and mitigation (CDM) program haven’t diminished as the cyber program has rolled out over the past year.

Neither bid protests nor miscommunications have stymied the program, which Congress funded at $183 million in 2014 and for which the White House requested another $103 million in 2016.

But two recent agency procurements make one wonder if patience is running short among agencies.

The Transportation Security Administration and the Department of Health and Human Services both issued separate procurement actions to either buy or look into buying their own cyber dashboards.

TSA, most recently, issued a request for quote under the CDM blanket purchase agreement.

“The Information Assurance and Cyber Security Division (IAD) requires (IT) tools and services to support the performance goal of integrating IT infrastructure and data to provide senior management greater insight into high risk areas while prioritizing investment in areas in which senior management and other stakeholders have the most interest,” stated the RFQ, which Federal News Radio obtained. “Integration of IT infrastructure and data allows for pinpointing areas of concern that helps TSA assess and mitigate risk, and will result in the improvement of the overall protection of TSA IT infrastructure assets and applications.”

TSA wants a contractor to install, configure and maintain a continuous monitoring and risk management tool to collect data from a variety of disparate sensors and provide a centralized risk report to senior officials.

“A tool that gives TSA the ability to provide accurate and automated risk reports to DHS, a centralized CMRM dashboard, and a centralized electronic document routing and approval process for workflow management,” the RFQ stated.

The TSA RFQ comes as HHS is reviewing comments from vendors on its request for information issued in March. Comments on the RFI were due March 26.

Now, we know an RFI is far from an active procurement, but it does signal that HHS is starting to look at its options.

The HHS RFI asks vendors for input on an “Enterprise Governance, Risk and Compliance (eGRC) tool to support HHS in managing security policies, controls, risks, assessments and weaknesses through a single platform. The goal is to develop a single repository to document and store security information, allowing HHS to review and assess the security stance of systems. This approach facilitates real-time approval, monitoring, and reporting for all HHS systems. An automated GRC tool would allow system owners, information system security officers (ISSOs), privacy officers and security analysts to easily identify system weaknesses and issue resolutions with improved efficiencies.”

An industry source, who requested anonymity in order to talk about the active procurements, said there are signs in the RFI that HHS may be considering going its own way.

The source said the term continuous monitoring is defined and used 11 times in HHS’ RFI, while CDM is not defined and is used only once.

“Its existence and these points cast a light of independent selection,” the source said. “The DHS CDM program conceived a consensus cyber risk management path, as the Office of the Director of National Intelligence and the Defense Information Systems Agency are trying for the intelligence community and for Defense Department agencies. It’s unclear if DHS executed far and fast enough internally for large civilian agencies to follow suit. Of the 33 civilian agencies, it is likely smaller ones will adopt a DHS template, while medium to large agencies will run their own course.”

That question of how quickly DHS has rolled out the CDM tools and services continues to come up.

Sources say House and Senate oversight committees are growing concerned about the time it’s taking to get CDM tools and services to the agencies as the cyber threat grows.

Much of the issue around speed is not DHS’ fault. DHS continues to say the program is on schedule, but at least two RFQ protests have caused some delays in awarding contracts and getting agencies started down their implementation paths.

Additionally, the program is complicated, prompting vendors to ask DHS more questions about the groupings of agencies, and some vendors have said the “reading room” concept is making the bidding process longer than expected.

While vendors will put up with the extra time, the question now is whether agencies will.

Do the TSA and HHS procurement actions signal the start of agencies going their own way? Or are the TSA RFQ and the HHS RFI just an aberration, given that DHS holds the funding support for the program?



Vendors call GSA’s estimates for data reporting rule ‘irrational’

Contractor associations are raising red flags over the General Services Administration’s proposed rule requiring vendors to report 11 transactional data elements. And they are using data and expertise to make their case.

First, a little background: GSA released the transactional data reporting requirement as a proposed rule on March 4. The agency is trying to address several long-standing complaints from vendors and apply new data collection requirements as part of its category management effort. But the data collection proposal quickly raised industry’s ire.

By May 8, both the Coalition for Government Procurement and the Professional Services Council had submitted strongly worded comments opposing the rule, focusing specifically on the burden the requirements would place on industry.

CGP surveyed its membership, of which 98 percent hold GSA schedule contracts and 42 percent have governmentwide acquisition contracts, and found companies estimated it would take 30 times longer to meet the proposed requirements than GSA initially estimated.

GSA says it would take companies, on average, 6 hours to initially change their systems to collect the 11 data elements, and then an average of 31 minutes a month to do the reporting.

CGP says its small business members estimated it would take 232 hours to set up the initial requirement and 38 hours a month to do the reporting. Large businesses said the rule would be even more burdensome, taking almost 1,200 hours of setup and 68 hours a month.
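
For the spreadsheet-minded, here is a rough back-of-the-envelope comparison of those dueling estimates over a first year of reporting, sketched in Python. The setup and monthly figures come straight from the estimates cited above; the year-one totals and the derived multiples are this column’s own arithmetic, not anyone’s official math.

# Rough comparison of GSA's burden estimates against the CGP survey figures
# cited above. Inputs come straight from the article; the first-year totals
# and multiples are back-of-the-envelope additions, not official numbers.

estimates = {
    "GSA estimate":  {"setup_hours": 6,     "monthly_hours": 31 / 60},
    "CGP small biz": {"setup_hours": 232,   "monthly_hours": 38},
    "CGP large biz": {"setup_hours": 1_200, "monthly_hours": 68},
}

baseline = estimates["GSA estimate"]
baseline_year_one = baseline["setup_hours"] + 12 * baseline["monthly_hours"]

for who, e in estimates.items():
    year_one = e["setup_hours"] + 12 * e["monthly_hours"]
    print(f"{who}: {year_one:,.1f} hours in year one "
          f"({year_one / baseline_year_one:.0f}x GSA's figure)")

# CGP's headline "30 times" comparison is about total dollars, not hours:
print(f"Cost multiple: {814_700_534 / 24_000_000:.0f}x")  # roughly 34x of ~$24M

However CGP arrived at its 30-times figure, the hour estimates its members reported imply first-year multiples well beyond that.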

“Further, GSA contractors estimated that the total cost of implementing the transactional data would be $814,700,534 — 30 times the government’s estimate of approximately $24 million,” CGP said in its written comments to GSA. “The government already has much of the data that it requests from contractors, however that data is not aggregated in a way that makes the data useful. Before GSA increases the reporting burden on industry or expends money and personnel resources to build a system to collect, analyze and communicate billions of data points, GSA should conduct an internal pilot test using its own assisted acquisition organizations. Such a test could validate the data elements to be collected and assess the actual cost vs. benefit of doing so.”

While PSC didn’t survey its members, the association also says GSA’s estimates of 6 hours for set up and 31 minutes a month for reporting are “grossly underestimated.”

“The estimates do not account for costly modifications to information systems that will be required to accurately and completely capture the data elements required by the rule (previously noted as costing in the millions of dollars), nor do they sufficiently account for the time required to perform quality control on draft submissions and investigation into potential data anomalies that frequently arise with transactional data reporting,” PSC wrote in comments to GSA. “Industry is especially concerned that inaccuracies in data reports will be yet another platform for turning innocent mistakes into allegations of fraud under the civil False Claims Act. In light of the risks and liabilities that could result from erroneous submissions, companies will invest heavily in time and manpower to ensure accurate reporting, making GSA’s assumption that contractors will spend only 6 hours to establish and only 31 minutes per month to maintain these reports irrational.”

PSC also pointed out in its comments that GSA’s inspector general believes the agency’s estimates are understated.

“Also, GSA’s estimates do not account for the proposed rule’s anticipation of more frequent Commercial Sales Practices (CSP) submissions. As evidenced by recent GSA OIG findings and litigation, the government takes a very strict — in our view, unreasonably strict — interpretation of what constitutes a current, accurate, and complete CSP disclosure,” PSC wrote. “Government auditors often apply the clause in oversight activities with a heavy hand and point to transactional data and/or informal practices utilized in a single part of an organization or program as a basis for claiming contractor liability. The government’s strict interpretation of the CSP, in turn, is driving contractors to spend substantial resources on the CSP submission process. Depending on the size and complexity of the business, a contractor’s cost of preparing a single CSP could cost in the hundreds of thousands of dollars.”

So with industry raising serious concerns, the question is whether GSA has the will to act. The issue isn’t whether the agency believes industry is correct, or at least potentially so, but whether there’s a higher power pushing this new requirement.

Some sources say the push for more and more data is coming not just from the Office of Federal Procurement Policy, but potentially from the West Wing.

It’s unclear why the White House is getting so involved in the minutiae of procurement, but it’s clear industry, even if it is overstating its estimates by 50 percent, will not go away without a fight.



Work officially sets up Pentagon reservation IT shared services

The Joint IT Single Service Provider-Pentagon is a much better name than the Pentagon Defense Information Systems Agency Field Activity. But in the end, no matter what this new shared services office is called, Deputy Defense Secretary Bob Work signed off on the consolidation of redundant IT services across the Pentagon Reservation and in the National Capital Region.

As we first reported in April, the Pentagon shared services provider will bring together the Army IT Agency, the Office of the Secretary of Defense’s Enterprise Information Technology Service Division (EITSD) and possibly some smaller IT service providers in the new organization.

The change comes after Work asked the Defense Department deputy chief management officer and chief information officer to review the cost of IT operations across the DoD headquarters.

Work signed off on the memo initiating the change May 1.

“This review has challenged many of our organizational and institutional interests, but it is essential that we undertake this IT transition aimed at achieving significant savings to preserve our war-fighting capabilities,” Work wrote in the memo. “I expect and appreciate your full support, leadership and personal engagement as the department implements these IT decisions.”

Reading between the lines a bit, Work is basically saying, “I know a lot of people are unhappy about this decision, but this is what’s best, so deal with it and make it successful.”

In fact, I heard from one of those people by “old-fashioned” Postal Service mail.

After writing the story in April, a letter arrived from an anonymous reader, starting out with “Shame on you Jason.”

The concern from this reader centered on the lack of recognition, in my article as well as from DoD, of the potential, or real, jobs that will be lost because of this consolidation.

This person said “shared services” is a “friendly, well thought-out term that harkens back to kindergarten when we are all supposed to share and make nice.” Instead, they say what’s really happening is some employees will have to take “the bullet” for the good of the broader community.

While this reader is obviously distressed about the change, DoD continues to make the case for why this is important.

Lt. Col. Valerie Henderson, Defense Department spokeswoman, said in an email that the Pentagon expects savings from people and infrastructure.

“From a labor perspective, we are taking a diligent approach to ensure that the end-state manpower levels are appropriate to ensure the long-term viability of the SSP and the expertise of the employees,” Henderson wrote. “Any adjustments in manpower will be determined once there is greater understanding of the best way forward. Manpower adjustments will take place as appropriate to maintain the most efficient expertise to meet the SSP’s mission and will be in line with all human resources processes and guidance.”

From an infrastructure perspective, Henderson said DoD will benefit from greater consistency, increased efficiency and better overall cybersecurity.

“This effort will provide a leaner DoD IT delivery organization with less overhead and reformed business and acquisition practices that will enable end-to-end visibility into the investment and accounting structures that support our IT mission,” she said. “This will enable the DoD to truly understand where our mission spend is required and then leverage the power of DoD spending to drive costs down.”

The joint IT provider will begin by consolidating common IT services, including the technical, contractual and organizational activities needed to reengineer and consolidate specific services such as computer network defense (CND), the service help desk and video teleconferencing (VTC).

DoD’s decision to consolidate redundant services is as much about saving money as it is about doing what makes sense. The Pentagon doesn’t need multiple organizations providing commodity IT services and capabilities. Even if it means some long-time employees need to find new jobs or gain new skills, no organization in the private sector or government can expect to stay the same.



Cyber desktop standard gets wakeup call from CIO Council

The Chief Information Officers Council held a séance, pulled out its Ouija board and asked Rip Van Winkle for his blessing to raise the United States Government Configuration Baseline (USGCB), otherwise known as the Federal Desktop Core Configuration, from its long slumber.

The council’s Information Security and Identity Management Committee took on this dormant effort to develop and approve standard security and other configuration settings for common IT products.

In a May 6 blog post, the council said the committee “updated configuration settings in current USGCB platforms (including Windows 7 and Windows Vista), reviewed a series of proposed settings, and prioritized a list of new baselines for existing platforms. These new baselines include Windows 8/8.1, IE 10, Windows Server 2012 (Domain Controller), Windows Server 2012 (Member Server), and Red Hat 6.”

The committee will approve new operating systems or versions as they become public, and create security automation, checklists and Security Content Automation Protocol (SCAP) tools with appropriate stakeholders.
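
For readers who don’t live in this world, a configuration baseline is essentially a published checklist of required settings that scanning tools compare machines against. Here is a toy sketch of that comparison in Python; the setting names and values are invented for illustration and are not actual USGCB entries.

# Toy illustration of a baseline compliance check. The "baseline" below is a
# made-up stand-in for a USGCB-style settings list; real SCAP tools do this
# at scale against machine-readable checklist content.

baseline = {
    "password_min_length": 12,
    "screen_lock_timeout_minutes": 15,
    "firewall_enabled": True,
}

host_settings = {
    "password_min_length": 8,      # weaker than the baseline requires
    "screen_lock_timeout_minutes": 15,
    "firewall_enabled": True,
}

# Collect every setting where the host deviates from the baseline.
findings = {
    setting: (expected, host_settings.get(setting))
    for setting, expected in baseline.items()
    if host_settings.get(setting) != expected
}

for setting, (expected, actual) in findings.items():
    print(f"NON-COMPLIANT: {setting} is {actual}, baseline requires {expected}")

SCAP content does the same job in a standardized, machine-readable format, which is what lets agencies automate these checks across hundreds of thousands of desktops.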

The fact that the council reinvigorated this effort is a huge deal, especially at a time when agencies continue to face the same cyber problems but an exponentially larger set of vulnerabilities and risks.

The idea of a standard configuration for commodity IT has been around for some time and it’s a proven security approach. When the Air Force moved to a gold disk standard for Microsoft Windows in 2005, it cut its patch time from 57 days to 72 hours and saved $100 million per year in patch testing alone.

Karen Evans, the former Office of Management and Budget administrator for E-Government and IT, who initiated the Federal Desktop Core Configuration program in 2007 after the Air Force’s success, told the House Oversight and Government Reform Committee in 2008 that “by implementing a common configuration, we are gaining better control of our federal desktops, allowing for closer monitoring and correction of potential vulnerabilities. We are also working with the vendor community to make their applications safer.”

Nearly seven years later, Evans’ comments still ring true.

The problem is the Obama administration has not emphasized the FDCC/USGCB at the senior level over the past several years. Even though the National Institute of Standards and Technology included the concepts of standard configurations in its Special Publication 800-128, it was not included in the administration’s Cross-Agency Priority goals.

So, why the CIO Council’s renewed interest? It likely can be traced back to OMB’s resurgence in leading cybersecurity activities. It falls in line with how OMB is leading the ongoing work around rethinking identity management, the E-Gov Cyber Task Force and oversight of the Cross-Agency Priority goal for cybersecurity.



OMB starts DATA Act implementation in high gear

For all of the Obama administration’s reluctance and push-back against the Digital Accountability and Transparency Act, or DATA Act, as it was going through Congress, give the administration credit for meeting the law’s first major statutory deadline on May 7.

The Office of Management and Budget and the Treasury Department capped off Public Service Recognition Week on May 8 by showing what a little hard work and intragovernmental collaboration can produce. OMB and Treasury released a set of 12 finalized data elements, 15 data elements out for final review and 30 others that still need to go through the public and agency comment phase. The two agencies also made public the process for digitally tagging award data through the eXtensible Business Reporting Language (XBRL) format, known as the DATA Schema.

OMB also issued the first DATA Act guidance, which calls for agencies to designate a senior accountable official to lead the law’s implementation.

Finally, Treasury borrowed a page from OMB and created a DATA Act Playbook, highlighting eight steps for implementing the law.

“Where we are now is at a very critical point. By releasing concrete deliverables, the first one deals with OMB policy guidance, we are sending clear direction from OMB to governmentwide entities saying here are DATA Act requirements and here is what’s expected over the next two years,” said OMB Controller David Mader in an interview with Federal News Radio. “I think this begins the significant effort on the part of the departments across the government to begin their own planning and some implementation at their level.”

Mader said that by designating a senior accountable official in charge of the implementation, OMB ensures there is a single point of contact to deal with issues. The lack of such a point person is one of the main reasons previous data transparency attempts found limited success at best.

The DATA Act is the follow-on to the Federal Funding Accountability and Transparency Act (FFATA). FFATA was sponsored by former Sens. Tom Coburn (R-Okla.) and Barack Obama (D-Ill.) and signed into law by President George W. Bush in 2006. While the Bush administration supported the ideas in FFATA, implementation fell well short of the spirit and intent of the law.

So as Congress recognized the changing technology and data environment, lawmakers set out to create a new law.

OMB pushed back against early versions of the DATA Act, saying it wasn’t necessary and would be too burdensome for agencies.

But as Congress inched closer to passage, OMB warmed up to the law, and it now is taking on the enormous challenge of making all spending data more accessible, usable and transparent with more vigor and dedication than many expected.

Treasury’s playbook may, in the end, prove to be one of the key implementation documents released Friday.

“We are asking them to put together a team in each agency that is familiar with the systems and data within the agency, and are asking them then to take a look at the data standards and see where the data resides in their internal systems,” said David Lebryk, Treasury’s fiscal assistant secretary, in an interview with Federal News Radio. “We put together a team that took a typical agency to see where the data elements reside, the grant or the procurement or the financial management system. In some cases, it resided in multiple systems, and sometimes the data was defined and used in different ways.”

Treasury also conducted a mapping exercise between the data elements and where they currently reside in the agency’s systems.

Lebryk said one of the plays is for agencies to do their own mapping and then move toward the concept of data tagging.

“Once you identify where the authoritative source is, then how do you map and move to a data broker, at which point the information is translated into the schema, which will make the data more usable and transparent,” he said. “This will be a significant undertaking so that’s why what we did was so important. It gives them a fair amount information to get them way down the path. We said in a standard organization here are the processes that provide information, the systems the information resides in and gave them charts of the different kind of mapping we’ve done. We are not asking them to start from scratch. It is complicated and that is one of reasons we have not had more transparent data sooner.”
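
To make Lebryk’s mapping-then-tagging exercise concrete, here is a minimal sketch in Python. Every element and system name below is hypothetical; the real standardized elements are the ones OMB and Treasury just released. But the shape of the work, finding the authoritative system for each element and pulling the pieces into one common record, looks roughly like this.

# Hypothetical illustration of the DATA Act mapping exercise: standardized
# data elements on one side, the internal systems where each element lives
# on the other, and a step that assembles one tagged award record. All the
# element and system names here are invented for the example.

element_sources = {
    "awardee_name":         "grants_system",
    "award_amount":         "financial_system",
    "funding_agency":       "financial_system",
    "place_of_performance": "procurement_system",
}

agency_systems = {
    "grants_system":      {"awardee_name": "Acme University"},
    "financial_system":   {"award_amount": "1250000.00", "funding_agency": "XYZ"},
    "procurement_system": {"place_of_performance": "Springfield, Va."},
}

def tag_award_record() -> dict:
    """Pull each element from its authoritative source into one record."""
    return {
        element: agency_systems[source][element]
        for element, source in element_sources.items()
    }

print(tag_award_record())
# {'awardee_name': 'Acme University', 'award_amount': '1250000.00', ...}

The schema then dictates how that unified record is formatted for publication, so the same award looks the same no matter which agency reports it.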

OMB, which will issue additional guidance around the DATA Act implementation plans later this month, said in the memo that agencies should propose an implementation timeline, identify potential challenges and ways to mitigate them, and provide resource estimates.

Mader continues to push the fact that the implementation goal focuses on a data-centric approach rather than building new systems or new databases.

Reaction to the policy and related data standards, schema and playbook has been mostly positive.

“Our industry is encouraged by Treasury and OMB’s evident willingness to eventually enforce the data standards they’ve announced. If these common elements and schema are imposed and used consistently across all federal financial, budget, payment, grant, and contract reporting, then our government will be able to deliver on the promises that Rep. [Darrell] Issa and Sen. [Mark] Warner first made when they introduced the DATA Act four years ago next month,” said Hudson Hollister, president of the Data Transparency Coalition. “Federal spending reports can be accessible to citizens, searchable for managers, and automatic for grantees and contractors.”

Mader said he and Lebryk have met with staff members on the House Oversight and Government Reform and Senate Homeland Security and Governmental Affairs committees. The congressional staffers mostly offered positive feedback on the data schema, standards and other initial releases, Mader said.

Mader said the two also plan on meeting in the coming week with agency inspectors general and with Comptroller General Gene Dodaro, who has promised Congress he would pay close attention to the law’s implementation.

“Streamlining this data will be no small task and I applaud them for putting forward a comprehensive and innovative plan to achieve these important goals,” Warner said in a release. “I look forward to working with the administration on these continued efforts to create a more data-driven and transparent government and I will keep monitoring this process to make sure that this progress continues.”



IT Job of the Week

There’s still time to apply to take over a key cybersecurity position at the Homeland Security Department. As the legislative and executive focus on cyber information sharing continues to increase, the director of the National Cybersecurity and Communications Integration Center (NCCIC) will become more important.

DHS says the director of the NCCIC will unify vital information technology and communications operations centers, along with intelligence analysis and national situational awareness, thereby merging operations of existing incident response mechanisms and better reflecting the reality of technological convergence.

DHS is looking for a new director of the NCCIC to replace Larry Zelvin, who left in July 2014.

Applications are due April 23, so hurry!



Military associations embrace some compensation and retirement recommendations

Five military associations are calling on House Armed Services Committee members to take a closer look at several of the Military Compensation and Retirement Modernization Commission’s recommendations.

The Air Force Association, the Enlisted Association of the National Guard, the National Guard Association, the Reserve Officers Association and the Veterans of Foreign Wars wrote to Reps. Mac Thornberry and Adam Smith on April 16, highlighting their support for creating a matching Thrift Savings Plan for servicemembers and increasing financial literacy training.

The five associations together represent more than 3 million current and former servicemembers.

“We urge the committee to support legislating expanding TSP, along with financial literacy training to all military members,” the letter stated. “We believe that the recommendations enhance the current retirement system and are a valuable recruiting tool for a new generation of warfighters. We also believe whatever Congress passes should maintain the overall value of the retirement system, should not adversely affect retention and the TSP match should continue throughout an individual’s career.”

The military commission made 15 recommendations in late January to improve management and delivery of DoD pay and retirement services.

The House and Senate Armed Services committees have held a series of hearings exploring the recommendations, and legislation is expected in the coming months.

The fact that these military organizations publicly voiced their support for at least some of the non-controversial recommendations is a good sign. There is wide recognition that something has to be done to improve benefits and potentially lower costs for DoD, but little agreement yet on what that is or will be.


