As Navy ships near the end of each deployment cycle, crews plan and prepare for maintenance and scheduled system and compartment modernization that can’t be performed underway. These tasks require accurate, up-to-date data on each ship’s configuration and system status.
Historically, large fly-away teams assessed the ship’s condition. They used tape measures and clipboards to manually capture details. Given the size and complexity of Navy ships, the undertaking was slow, costly and prone to human error.
As the Navy adopted electronic records and engineering drawings, it sought to gain the benefits of data analytics to improve planning. But the limited quality and timeliness of manually collected data reduced the value that could be derived from the analytics.
In response, the Navy implemented more digitized, automated approaches to data collection. Solutions such as whole-ship LIDAR scanning have improved data quality and timeliness while substantially reducing the size and cost of fly-away teams.
Today the Navy uses automated sensors to capture a rapidly growing amount of data. The process is fast and cost-efficient, and the data is accurate, timely and relevant. As a result, the information provides the raw material for advanced analytics to achieve insights, inform decisions, and drive the right actions.
More government organizations want to take advantage of analytics, including artificial intelligence, to support their missions and achieve desired outcomes. But many are still working with inaccurate, outdated or irrelevant data that often depends on costly and time-consuming human interaction. And running analytics on the wrong data can result in unreliable or outright incorrect insights, decisions and actions.
To fully benefit from analytics, agencies need to understand how their data is collected and mitigate common data-collection problems. In other words, they must become "data ready." That means aligning how they evaluate, generate, collect and manage their data, and recognizing the strategic value of data management to the mission, so they can realize the full value of their information.
Three steps in the data-readiness journey
Many agencies are held back in their data readiness because of legacy systems, resource constraints or outmoded processes – all of which can contribute to poor data quality. But the bigger issue is mindset.
You probably already recognize that analytics can extract strategic value from your data. But you also need to acknowledge that the value is contained within the data itself. That is, the insights you can gain from your data depend on managing the data from collection through analytics to results. Some organizations take the attitude that "quantity has a quality all its own." It's true that data quantity can overcome some limitations of data quality. But large quantities of unreliable data can also lead to undesirable results.
Becoming a data-ready organization is about achieving that data quality. Getting there is a three-step journey.
First, consider the results you want to achieve. How do you hope to derive value from your data? What is the mission outcome you’re trying to support? Yes, analytics can uncover unexpected insights with additional serendipitous benefits. But you need a specific outcome in mind if you hope to achieve predictable, meaningful results. Don’t start by trying to “boil the ocean.” Narrowing your focus to a particular outcome will enable you to identify the data you need and how that data informs the outcome.
Second, assess your data repositories. Take a complete inventory of where the data comes from, how it’s collected, where it’s stored, and how it’s shared. That will enable you to create a knowledge graph, mapping your data to how you’ll use it to create value. It will also give you early insights into potential problems with data availability, timeliness, accuracy and interoperability.
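As an illustrative sketch of this inventory step (the source names, fields and outcomes below are hypothetical, not drawn from any Navy system), each data source can be recorded with how it is collected, where it lives and which mission outcomes it informs. Even a simple mapping like this surfaces problems early, such as sources that feed no outcome at all:

```python
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str
    collection: str   # how it's collected, e.g. "automated sensor" or "manual entry"
    store: str        # where it's stored
    outcomes: list = field(default_factory=list)  # mission outcomes this source informs

def orphaned_sources(sources):
    """Sources mapped to no outcome -- candidates for review or retirement."""
    return [s.name for s in sources if not s.outcomes]

# Hypothetical inventory entries for illustration only
inventory = [
    DataSource("hull-lidar", "automated sensor", "ship data lake",
               outcomes=["maintenance planning"]),
    DataSource("compartment-log", "manual entry", "legacy database"),
]

print(orphaned_sources(inventory))  # -> ['compartment-log']
```

The same inventory can later seed a fuller knowledge graph, with edges from sources through transformations to the decisions they support.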
Third, implement the processes that will permit you to capture the timely, accurate and relevant data you need. Be clear about what data you’re gathering and why. Make it as easy as possible for stakeholders to collect information, because if the process is onerous, you’ll end up with missing or inaccurate data. The more fully and effectively you can automate data capture, the more you’ll be able to avoid human error, reduce the time required, and ensure your data is accurate and up-to-date.
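One way to make that automation concrete is to validate records at the moment of capture, so missing fields and stale readings are caught before they reach your analytics. This is a minimal sketch under assumed field names and thresholds, not a reference implementation:

```python
from datetime import datetime, timedelta, timezone

REQUIRED_FIELDS = {"sensor_id", "timestamp", "value"}
MAX_AGE = timedelta(hours=24)  # staleness threshold chosen for illustration

def validate_record(record, now=None):
    """Return a list of problems; an empty list means the record is usable."""
    now = now or datetime.now(timezone.utc)
    problems = [f"missing field: {f}"
                for f in sorted(REQUIRED_FIELDS - record.keys())]
    ts = record.get("timestamp")
    if ts is not None and now - ts > MAX_AGE:
        problems.append("stale reading")
    return problems

# Hypothetical sensor record with a missing measurement value
record = {"sensor_id": "hull-07", "timestamp": datetime.now(timezone.utc)}
print(validate_record(record))  # -> ['missing field: value']
```

Rejecting or flagging records at ingestion keeps quality problems visible to the people collecting the data, rather than discovering them months later during analysis.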
Technology for data readiness
New technologies are making data readiness more achievable. Today’s processors allow organizations to place more sensors in more places for automated data streams. They also enable data to be managed and consumed at the edge, where it’s generated – and where the most time-critical, data-informed actions often need to be taken.
For instance, new system-on-chip designs combine CPU, memory, data storage and other components on a single piece of silicon. The result is higher performance, lower power consumption and smaller space requirements. That addresses the distinctive size, weight, power and cost (SWaP-C) constraints of edge computing while delivering far more compute where it counts. It also provides the capacity to perform analytics or even AI at the edge, rather than only in centralized data centers, better equipping agencies to benefit from analytics once they're data-ready.
What’s exciting is that advanced analytics can truly help organizations perform better. Yet if you aren’t evaluating, generating, collecting and managing your data effectively, analytics might merely take you down a false path. In becoming data ready, you’ll position your organization to leverage analytics and AI to make smarter decisions, take more informed actions, and achieve better outcomes for your mission.
Retired U.S. Navy Rear Adm. Ron Fritzemeier is director of Mission Solutions for Intel.