Handling ‘Big Data’

October 18, 2011 — Big data doesn’t necessarily mean big headaches.

Let’s outline the problem: combine data from mobile devices, RFID, aerial sensors, software logs, and social media, and the volume can overwhelm a typical analyst.

Furthermore, information can reside in secure silos and proprietary data stores. The challenge for federal IT professionals is to derive deep insights from this proliferation of information.

GCE Federal has earned its stripes helping federal agencies in financial areas.

As the volume of generated data has exploded, GCE Federal has developed expertise in handling what is now called “Big Data.”

President Ray Muslimani gives a good technical overview of a technology called Hadoop.

Hadoop originated in 2006 as an outgrowth of Apache Nutch, an open source web crawler project.

It can give you a way to manage terabytes of information. James Kobielus of Forrester Research writes that “Hadoop will be the nucleus of next-generation data warehouses.”
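At its core, Hadoop manages those terabytes with the MapReduce programming model: a map phase emits key-value pairs from raw input, and a reduce phase aggregates the pairs by key, with the framework distributing both phases across a cluster. As a rough illustration only, here is a minimal local sketch of the word-count pattern in plain Python; the data and function names are hypothetical, and no actual Hadoop APIs are used:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(line):
    """Map step: emit a (word, 1) pair for each word in a line of input."""
    for word in line.lower().split():
        yield (word, 1)

def reduce_phase(pairs):
    """Reduce step: after grouping pairs by key, sum the counts per word.
    Sorting stands in for Hadoop's shuffle-and-sort between the phases."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

# Illustrative input; a real job would read blocks from a distributed filesystem.
lines = ["big data big insights", "data at scale"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = dict(reduce_phase(pairs))
print(counts)  # {'at': 1, 'big': 2, 'data': 2, 'insights': 1, 'scale': 1}
```

Because each map call sees only one line and each reduce call sees only one key’s pairs, the same logic can be spread over many machines, which is what lets the model scale to terabytes.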

Copyright © 2023 Federal News Network. All rights reserved.

Federal Tech Talk

TUESDAYS at 1:00 P.M.

Host John Gilroy of The Oakmont Group speaks the language of federal CISOs, CIOs and CTOs, and gets into the specifics for government IT systems integrators. Follow John on Twitter. Subscribe on Apple Podcasts or Podcast One.