Governmental institutions and companies worldwide are tapping into giant data pools to combat the public health and economic challenges arising from the COVID-19 crisis. During these troubling times, big data analytics is proving to be an ally for first responders and other frontline workers in healthcare, food service, manufacturing, and other essential services, helping them grasp the true scale of the challenges.
In the manufacturing world, the spread of COVID-19 has generated large volumes of data (both structured and unstructured) across the entire value chain. With regard to analyzing these large data sets, or “big data,” a few basic analytical steps can pave the way for a more strategic and structured response to the global pandemic.
Pooling data from a valid source
As the COVID-19 pandemic continues to threaten and take lives around the world, finding a valid source of information (data) becomes a key factor in mounting an effective response. Major enterprises with extensive and robust digital platforms, such as Google Cloud, AWS, and Microsoft Azure, are collaborating with new “open data” initiatives designed to promote big data sharing during the crisis. These platforms can capture a real-time view of the pandemic by leveraging large, publicly available data sets, including local shelter-in-place policies, health reporting, transit resources, and mobility patterns, to show how public behavior is affecting the spread of the virus.
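As a rough sketch of what pooling such data sets can look like in practice, the snippet below joins two hypothetical public extracts, case counts and mobility indices keyed by region, into one combined view. The region names, field names, and values are all made up for illustration; they do not come from any of the platforms named above.

```python
import csv
import io

# Hypothetical extracts from two public data feeds (values are made up).
cases_csv = """region,new_cases
Cook,350
DuPage,120
Will,80
"""

mobility_csv = """region,mobility_change_pct
Cook,-42
DuPage,-31
Will,-18
"""

def load_by_region(text, value_field):
    """Index a CSV extract by its 'region' column."""
    reader = csv.DictReader(io.StringIO(text))
    return {row["region"]: float(row[value_field]) for row in reader}

cases = load_by_region(cases_csv, "new_cases")
mobility = load_by_region(mobility_csv, "mobility_change_pct")

# Join the two sources on region so each record carries both signals.
combined = {
    region: {"new_cases": cases[region],
             "mobility_change_pct": mobility[region]}
    for region in cases.keys() & mobility.keys()
}

for region, stats in sorted(combined.items()):
    print(region, stats)
```

Real feeds would arrive as files or API responses rather than inline strings, but the join-on-a-shared-key step is the same.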
Big Data Interoperability Framework
After pooling the right data sets, crafting an agile response can be time consuming, especially during a crisis. Certain businesses might have the resources for suitable data analytics to create that quick impact, but others may not. To assist, the National Institute of Standards and Technology (NIST) has developed a Big Data Interoperability Framework (NBDIF).
- NIST Big Data Definitions & Taxonomies Subgroup
  - NIST SP 1500-1r2, Volume 1: Definitions
  - NIST SP 1500-2r2, Volume 2: Big Data Taxonomies
- NIST Big Data Use Cases & Requirements Subgroup
  - NIST SP 1500-3r2, Volume 3: Use Cases and General Requirements
- NIST Big Data Security & Privacy Subgroup
  - NIST SP 1500-4r2, Volume 4: Security and Privacy
- NIST Big Data Reference Architecture Subgroup
  - NIST SP 1500-5, Volume 5: Architectures White Paper Survey
  - NIST SP 1500-6r2, Volume 6: Reference Architecture
- NIST Big Data Technology Roadmap Subgroup
  - NIST SP 1500-7r2, Volume 7: Standards Roadmap
- Two New Volumes
  - NIST SP 1500-9r1, Volume 8: Reference Architecture Interface
  - NIST SP 1500-10r1, Volume 9: Adoption and Modernization
This framework is intended to help create “a vendor-neutral, technology- and infrastructure-independent ecosystem” to “enable Big Data stakeholders (e.g. data scientists, researchers, etc.) to utilize the best available analytics tools to process and derive knowledge through the use of standard interfaces between swappable architectural components.” In other words, it supports the analysis of large data sets on any computing platform and allows data to move from one platform to another. It also supports scaling from a small desktop setup to a larger environment with many processor nodes, delivering time-critical data and promoting informational insights. If your organization is new to big data analytics, this nine-volume framework can be a useful guide.
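The “swappable architectural components behind standard interfaces” idea can be sketched in a few lines of code. The class and method names below are my own, not NBDIF terminology: analysis code is written against one interface, so a desktop-scale backend and a toy multi-node backend are interchangeable.

```python
from abc import ABC, abstractmethod

class DataStore(ABC):
    """Standard interface; concrete backends are interchangeable."""

    @abstractmethod
    def write(self, key, value): ...

    @abstractmethod
    def read(self, key): ...

class InMemoryStore(DataStore):
    """Small desktop-scale backend: one dict."""
    def __init__(self):
        self._data = {}
    def write(self, key, value):
        self._data[key] = value
    def read(self, key):
        return self._data[key]

class ShardedStore(DataStore):
    """Toy stand-in for a multi-node backend: shards keys across buckets."""
    def __init__(self, nodes=4):
        self._shards = [{} for _ in range(nodes)]
    def _shard(self, key):
        return self._shards[hash(key) % len(self._shards)]
    def write(self, key, value):
        self._shard(key)[key] = value
    def read(self, key):
        return self._shard(key)[key]

def total_cases(store, regions):
    """Analysis code depends only on the DataStore interface."""
    return sum(store.read(r) for r in regions)

# The same analysis runs unchanged on either backend.
for backend in (InMemoryStore(), ShardedStore()):
    backend.write("Cook", 350)
    backend.write("Will", 80)
    print(type(backend).__name__, total_cases(backend, ["Cook", "Will"]))
```

Swapping the backend changes nothing in `total_cases`, which is the property the framework’s standard interfaces are meant to guarantee at ecosystem scale.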
Large data sets can be overwhelming, but during these uncertain times they are a valuable commodity, and having a structure to collect, store, and analyze big data leads the way to impactful results. IMEC is ready to help you Plan, Implement and Excel when it comes to Industry 4.0 needs and Big Data framework development. We stand poised with the National Manufacturing Extension Partnership (MEP) Network and partners to bring awareness and practical solutions to meet your digital transformation needs.
Get more insights on navigating COVID-19.