Most federal listeners understand that big data is a constant presence today, but Levenson raises an uncomfortable point: this data is not static. It changes continually. Given that reality, how does a federal professional ensure that key decision-makers are working from real-time information?
Levenson offered the example of a military jet aircraft: each flight produces one terabyte of data. Multiply that across a fleet in constant motion, and the result is a data set that is perpetually being amended and altered, as the rough calculation below illustrates.
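To make that scale concrete, here is a back-of-the-envelope sketch. Only the one-terabyte-per-flight figure comes from the discussion; the fleet size and sortie rate below are hypothetical assumptions chosen purely for illustration.

```python
# Back-of-the-envelope estimate of daily data growth for a jet fleet.
# The 1 TB-per-flight figure is from the discussion; the fleet size
# and sortie rate are hypothetical assumptions.
TB_PER_FLIGHT = 1      # terabytes of data produced per flight
FLEET_SIZE = 100       # hypothetical number of aircraft
SORTIES_PER_DAY = 2    # hypothetical flights per aircraft per day

daily_tb = TB_PER_FLIGHT * FLEET_SIZE * SORTIES_PER_DAY
print(f"New data per day:  {daily_tb} TB")                     # 200 TB
print(f"New data per year: {daily_tb * 365 / 1024:.1f} PB")    # ~71 PB
```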
The federal community has largely gotten a handle on harnessing the data. The harder question is how to keep that ever-changing data in a state of readiness for analysis. Even if you start with a normalized data set, the new information arriving on top of it is not automatically ready for analytics.
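As a minimal sketch of what "ready for analytics" can mean in practice, the snippet below validates and normalizes newly arriving records before they join an analysis-ready table. The field names and rules are hypothetical assumptions, not details drawn from Levenson's remarks.

```python
from datetime import datetime, timezone

# Hypothetical schema for an incoming flight-sensor record.
REQUIRED_FIELDS = {"tail_number", "timestamp", "engine_temp_c"}

def normalize(record: dict) -> dict | None:
    """Return an analysis-ready record, or None if it cannot be repaired."""
    if not REQUIRED_FIELDS.issubset(record):
        return None  # missing fields: quarantine rather than feed to analytics
    try:
        ts = datetime.fromisoformat(record["timestamp"]).astimezone(timezone.utc)
        temp = float(record["engine_temp_c"])
    except (ValueError, TypeError):
        return None  # unparseable values are not ready for analysis
    return {
        "tail_number": record["tail_number"].strip().upper(),
        "timestamp": ts.isoformat(),
        "engine_temp_c": temp,
    }

# Raw records arrive continuously; only clean ones reach the analytics layer.
raw = [
    {"tail_number": " af-1234 ", "timestamp": "2024-05-01T12:00:00+00:00", "engine_temp_c": "612.5"},
    {"tail_number": "AF-5678", "timestamp": "not-a-date", "engine_temp_c": "598"},
]
clean = [r for r in (normalize(x) for x in raw) if r is not None]
print(f"{len(clean)} of {len(raw)} records are analysis-ready")
```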
The analytics Levenson referred to are, of course, machine learning and artificial intelligence. Even the most brilliant computer scientist at MIT cannot build a useful algorithm without clean data.
Levenson suggested that an open-source, platform-based approach is the most efficient way to keep data up to date in today's environment.