Technological advances and the exponential growth of information are reshaping how business is conducted across many sectors, including the public sector. Driven by the rapid proliferation of mobile devices and applications, smart sensors, cloud computing solutions, and citizen-facing portals, government data generation and digital archiving rates are climbing. As digital information expands and grows more complex, so too do the challenges of managing, processing, storing, securing, and disposing of it. Emerging tools for capture, search, discovery, and analysis are enabling organisations to extract valuable insights from unstructured data. The government sector stands at a critical juncture, recognising that information is a strategic asset. To better serve the public and meet mission requirements, government bodies must protect, leverage, and analyse both structured and unstructured information. As leaders strive to evolve into data-driven organisations capable of successfully fulfilling their missions, they are laying the groundwork to correlate dependencies across events, people, processes, and information.
High-value government solutions will emerge from a convergence of the most disruptive technologies:
- Mobile devices and applications
- Cloud services
- Social business technologies and networking
- Big Data and analytics
Big Data stands as one of the key intelligent industry solutions, empowering government to make better decisions by acting on patterns revealed through the analysis of vast data volumes—both related and unrelated, structured and unstructured.
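As a purely illustrative sketch of what "acting on patterns" in unstructured data can mean at the smallest scale, the snippet below counts keyword occurrences across hypothetical free-text citizen service requests to surface the most common issues. The records, keywords, and function name are invented for this example; real government analytics pipelines operate on vastly larger and more varied data.

```python
from collections import Counter

# Hypothetical free-text service requests (unstructured data);
# these records are illustrative only.
reports = [
    "water main break on Elm Street, road flooded",
    "pothole on Elm Street near the school",
    "street light out on Oak Avenue",
    "flooded basement reported after water main break",
    "pothole reported on Oak Avenue",
]

def keyword_frequencies(texts, keywords):
    """Count how often each keyword of interest appears across the texts."""
    counts = Counter()
    for text in texts:
        lower = text.lower()
        for kw in keywords:
            if kw in lower:
                counts[kw] += 1
    return counts

# Frequent issues surface as candidate patterns for follow-up analysis.
freq = keyword_frequencies(reports, ["water main", "pothole", "street light"])
```

In practice this kind of term counting is only a first step; production systems layer natural-language processing, entity resolution, and correlation with structured records on top of it.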
However, achieving these outcomes requires far more than merely accumulating massive quantities of data. "Making sense of these volumes of Big Data requires cutting-edge tools and technologies that can analyse and extract useful knowledge from vast and diverse streams of information," wrote Tom Kalil and Fen Zhao of the White House Office of Science and Technology Policy in a post on the OSTP Blog.
Taking a step towards helping agencies identify these technologies, the White House established the National Big Data Research and Development Initiative in 2012. This initiative included over $200 million in funding to maximise the benefits of the Big Data explosion and the tools required to analyse it.
The challenges posed by Big Data are nearly as daunting as its promise is encouraging. Efficient data storage is one such challenge. As always, budgets remain tight, meaning agencies must minimise the per-megabyte cost of storage while ensuring data remains easily accessible, so users can retrieve it when needed and in the format they require. Backing up massive quantities of data further heightens this challenge.
Effectively analysing the data presents another major hurdle. Many agencies employ commercial tools that enable them to sift through mountains of data, identifying trends that can improve operational efficiency. (A recent study by MeriTalk found that federal IT executives believe Big Data could help agencies save more than $500 billion while simultaneously fulfilling mission objectives.)
Custom-developed Big Data tools are also enabling agencies to meet their data analysis needs. For instance, the Oak Ridge National Laboratory's Computational Data Analytics Group has made its Piranha data analytics system available to other agencies. The system has assisted medical researchers in identifying links that can alert doctors to aortic aneurysms before they strike. It is also employed for more routine tasks, such as screening resumes to connect job candidates with hiring managers.