- The Pentagon’s inspector general said Defense vendors are getting millions of dollars they did not earn because of shortcomings in how DoD manages cost-plus contracts. At issue is the “award fees” the department pays to incentivize good performance on those contracts. A new audit found DoD usually has good justification to support the fees, but there were some glaring exceptions in a handful of Army and Air Force contracts over the last several years. The IG also found problems with the Pentagon’s system for tracking those award-fee contracts. Auditors said without solid data, it’s hard to determine whether those award fees do much good.
- The Department of Homeland Security is working hard to build a new cadre of artificial intelligence experts. DHS is trying to recruit 50 AI technology experts in the coming year. While that talent is in high demand across the world, DHS Chief Information Officer Eric Hysen said the department’s mission is a big draw. Hysen told Federal News Network, “Where else are you going to go where you can not just get to work on cutting-edge technology, but you can apply it to missions, like combating the flow of fentanyl into the United States, like combating child sexual abuse and exploitation, like making it easier to become an American citizen? These are just such critical activities.” DHS’s AI experts will be eligible for a GS-15-level salary and fully remote positions.
- The Army is looking for international innovators to share their novel technology solutions. Through the xTechInternational competition, the Army is inviting small and medium-sized businesses and academic institutions abroad to showcase technologies related to quantum sensing and AI for intelligence and decision-making applications. Participants have the opportunity to win monetary prizes and receive education and mentorship through the accelerator program. White papers are due by March 7, and winners will be announced on August 23. Army Futures Command, the Army Combat Capabilities Development Command and the Office of Naval Research Global are organizing the event.
- The Office of Personnel Management is looking for federal employees performing work in the artificial intelligence field, as part of a new governmentwide survey. OPM kicked off its AI Job Analysis survey last Friday, as part of meeting the requirements of the AI in Government Act and the October executive order on AI. The goal of the survey is to validate AI competencies identified by technical and human resources subject-matter experts. Last July, OPM identified 43 general competencies and 14 technical competencies for AI. Now, OPM says it will use the job analysis survey results to finalize the AI competency model. Survey responses are due by March 1.
- The Cybersecurity and Infrastructure Security Agency is renewing a task force so it can examine artificial intelligence and other critical technology issues. This month, CISA extended the charter through 2026 for the Information and Communications Technology Supply Chain Risk Management Task Force. The ICT Task Force has developed guidance around hardware bills of material and other supply chain issues. Now, CISA said the task force can continue to work on software assurance guides and start to examine how AI could mitigate supply chain risks.
- The Office of Management and Budget is updating its nearly five-year-old policy implementing the Congressional Review Act. The CRA requires agencies to submit "major" rules to Congress and the Government Accountability Office for review before they take effect. Under the new guidance, OMB details how agencies and the Office of Information and Regulatory Affairs determine whether a rule is considered "major." One change in the guidance reflects the April 2023 executive order that raised the monetary impact threshold from $100 million to $200 million. OMB said the new policy goes into effect on March 17.
- The Navy is looking into incorporating natural language capabilities into classification guides to enable automated security classification processes. Security classification guides are general by nature and tend to specify only what information needs protection. If the guides could also explain why certain information requires a specific classification label, analysts could combine those documents with large language models to actively probe datasets. While this falls short of full automation, it could be a first step toward automating the classification process.
Copyright © 2024 Federal News Network. All rights reserved.