Three agencies launched the Big Data Challenge Wednesday, asking for ideas on bringing together disparate data sets to help agencies better meet their missions.
wfedstaff | April 17, 2015 4:14 pm
Three agencies are giving the rest of government a challenge and an opportunity to use data differently.
NASA, the National Science Foundation and the Energy Department launched the first initiative under the Big Data Challenge Wednesday. The goal is to bring together information from several different agencies and figure out how it could help every participant’s mission.
And failure, or at least the effort to try different ideas, is not only an option, but strongly encouraged.
“It’s important to recognize that government agencies have for years been working on very sophisticated and complex analytic projects in many different domains. Many of those projects, before anyone used the term big data, could certainly have been categorized as big data-related projects,” said Steve Mills, IBM’s senior vice president and group executive. “What’s changed is the cost of computing has been coming down.”
And it’s that lower cost of storage and compute power that is letting agencies try data analysis at much lower risk.
“You don’t need to spend a lot of money or take a lot of time. You can take data sets first and try it, and analyze it, and then make decisions about how you pull in more,” said Teresa Carlson, the vice president of Amazon’s global public sector. “Because the costs of compute and storage are exponentially lower, that’s what makes it all possible. You don’t have to buy things and set them up, you can take advantage of what’s possible.”
Answering a central question
The Big Data Challenge is giving agencies the impetus to take some chances.
Suzi Iacono, the co-chairwoman of the Interagency Big Data Senior Steering Group, which is a part of the Networking and Information Technology Research and Development Program at NSF, said the goal of the contest is to figure out how to use the data agencies hold and create.
“The big data senior steering group came up with a central big question that we would like to pose in a contest, that is: how can we make heterogeneous data sets seem more homogeneous so one could take action,” Iacono said today during a panel discussion in Washington about TechAmerica Foundation’s report on big data. “So the data can be interoperable and federated, so we can bring data from many different spheres that are today siloed and bring them together to ask very big questions.”
TechAmerica Foundation released its report, Demystifying Big Data, as a guide to help agencies understand how to use and take advantage of big data.
Iacono said agencies, or teams of agencies, have until Oct. 13 to enter the contest.
“You have to use at least two government data sets such as those that are available on Data.gov. You also could bring your own data into the mix to try to show us your great idea,” she said. “It must be publicly available. It cannot be proprietary.”
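The challenge’s central question is, at its core, a data-integration exercise, and a small sketch makes it concrete. The snippet below, written in Python with pandas, shows one simple way to make two heterogeneous public data sets "seem more homogeneous": reconcile their schemas, join them on a shared key, and derive a measure neither set could answer alone. The file names, column names and join key are hypothetical placeholders for illustration, not actual Data.gov data sets.

```python
import pandas as pd

# Two publicly available data sets with different schemas (hypothetical files
# standing in for Data.gov downloads).
energy = pd.read_csv("state_energy_use.csv")    # e.g. columns: State, BTU_Consumed
census = pd.read_csv("state_population.csv")    # e.g. columns: state_name, population

# Normalize the differing schemas to a shared key and naming convention.
energy = energy.rename(columns={"State": "state", "BTU_Consumed": "btu_consumed"})
census = census.rename(columns={"state_name": "state"})
energy["state"] = energy["state"].str.strip().str.title()
census["state"] = census["state"].str.strip().str.title()

# Federate the two sets on the shared key and ask a question that spans both.
combined = energy.merge(census, on="state", how="inner")
combined["btu_per_capita"] = combined["btu_consumed"] / combined["population"]

print(combined.sort_values("btu_per_capita", ascending=False).head())
```

Real entries would involve far messier reconciliation, but the pattern is the same one the contest asks teams to demonstrate at scale: siloed data sets brought onto a common footing so they can be queried together.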
Iacono said a panel of expert judges will review the applications against a set of criteria.
The contest website states first place wins $1,000, second place $500 and third place $250.
The big data challenges are part of the effort started in March by the White House Office of Science and Technology Policy.
OSTP and six other agencies launched the national big data development effort. They committed $200 million to invest in science, engineering, cybersecurity, workforce education, and competitions and contests to promote the development of ways to deal with the large amounts of data that government, industry, researchers and many others are producing.
Additionally, federal Chief Technology Officer Todd Park has been holding datapaloozas and other events to bring public and private sector experts together to solve problems.
Expanding the big data view
The goal of the report from the TechAmerica Foundation is to help move big data to a wider audience.
Amazon’s Carlson said technologies such as cloud computing and business intelligence and analysis tools are making it easier for agencies and companies to take advantage of all the data that’s out there.
Bill Perlowitz, chief technology officer for the science, technology and engineering group for Wyle, said the report tried to highlight what big data can do and what could be done using all the information available.
He said the report is trying to help agencies consider four key questions.
“It’s not about getting data and it’s not about processing data,” Perlowitz said. “It’s about getting information into the hands of people who can use it.”
Among the report’s recommendations is for agencies to name a Chief Data Officer (CDO). The Federal Communications Commission is one of the few agencies to have done just that.
The report stated a CDO would “generate and promulgate a governmentwide data vision, to coordinate activities, and to minimize duplication.” The commission said that, just as there is a federal chief information officer, there should be a federal CDO inside the Office of Management and Budget, who would “bring cohesive focus and discipline to leveraging the government’s data assets to drive change, improve performance and increase competitiveness.”
Steve Lucas, the global executive vice president and general manager for SAP’s database and technology unit, said chief data officers are becoming more common across industry.
“You might have a CIO who is incredibly adept at delivering information. We joke and say a CTO is a CIO who delivers software and not data,” he said. “The reality right now is it’s happening in industry, not universally, but we see it evolving over the next three-to-five years.”
Lucas said the commission wasn’t concerned that a CDO would be another hat for someone to wear and may not get the attention it needs.
“I think it was universal agreement. This is what we need,” he said. “The economy we are in today is an information economy. It’s not about the software or the hardware. It’s about the information. This draws attention to the subject.”
At SAP, the CDO role is dual-hatted with the CIO, Lucas said.
The TechAmerica Foundation report also includes case studies, from the National Archives and Records Administration and NASA, and a recommended roadmap to get started with big data.
Iacono said more and more agencies are interested in joining the big data steering committee, so the report is coming at an important time.
“The report goes a long way in clarifying and articulating the promise of big data,” she said.
RELATED STORIES:
White House, agencies commit $200M to solving ‘big data’ quandary
White House looking for a few ‘bad asses’ to kick-start 5 projects
Navy struggles to find the way ahead on big data
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.