The HDF Group is collaborating with the University of California, Santa Barbara, and the Data Observation Network for Earth (DataONE) to help scientific research communities enhance the consistency and quality of their metadata, fostering discovery, access, and understanding of data resources. As part of this collaboration, on February 9, 2016, The HDF Group’s Ted Habermann, Director of Earth Science, and Lindsay Powers, Deputy Director of Earth Science, will co-lead the webinar “Sharing Data Through Guided Metadata Improvement” along with Matthew Jones, Director of Informatics Research at the National Center for Ecological Analysis and Synthesis. Continue reading →
The ESIP Federation comes together twice each year to discuss topics around changing technology, data, information and knowledge in support of society. ESIP meetings are interdisciplinary and inclusive. Among the attendees are Earth science data and information technology practitioners; researchers representing a variety of scientific domains that include land, atmosphere, ocean, solid earth, ecology, data and social sciences; science educators; and anyone working in science and technology-related fields who is interested in advancing Earth science information best practices in an open and transparent fashion. Continue reading →
“Any software used in the computational sciences needs to excel in the area of high performance computing (HPC).”
The Computational Fluid Dynamics (CFD) General Notation System (CGNS) is an effort to standardize CFD input and output data, including grid (both structured and unstructured), flow solution, connectivity, boundary conditions, and auxiliary information. It provides a general, portable, and extensible standard for the storage and retrieval of CFD analysis data. The system consists of two parts: (1) a standard format for recording the data, and (2) software that reads, writes, and modifies data in that format. Continue reading →
I first heard of HDF during the “Data Format Wars” of the 1990s. These “battles” centered on the selection of a format for the emerging NASA Earth Observing System archives, and there were a number of contenders. HDF won that battle in the end because of the inherent flexibility of the format and the tools for reading and writing it.
Now, twenty years later, HDF has emerged as the foundation format for an incredibly diverse and growing selection of scientific and commercial disciplines.
Is it the inherent flexibility of the format that has led to this success? Maybe, but I would pick information integration as the killer HDF feature. Continue reading →
“A strong foundation is being built for sharing data and information to create community knowledge and wisdom. This foundation includes HDF5 as the data layer with community conventions and ISO metadata facilitating use and understanding.”
We have experienced so many monumental technological shifts during the last several decades that, like the diurnal cycle of light and dark, the technology life cycle (shown below) is becoming instinctual.
It starts with a new idea (usually aimed at new customers) that destroys existing organizational expertise and threatens the continued existence of established processes and organizations. These disruptions raise a variety of difficult questions and initiate an Era of Ferment during which established enterprises gauge the impact of the disruption on their worlds and try to adjust. The ferment creates uncertainty, high risk, considerable wasted resources, and no interoperability.
The ferment ends when the community agrees on a dominant design and works together to make that design succeed. Instead of debating what they are going to do, they work on doing it better. Continue reading →
Building a well-designed data standard that incorporates the needs of a science community has a long-lasting value to that community (and beyond).
It vastly outweighs the momentary benefits of particular hardware or software choices at any individual experimental site – the science data lifecycle involves more than just “speeds & feeds” during production. Creating a standard that captures the metadata required to characterize experimental and simulation data, while accommodating future expansion and providing flexibility for the special needs of individual researchers, is a challenging but worthwhile endeavor.
Community data standards have taken root in many domains, giving researchers the ability to collaborate on larger science projects than previously possible. For example, Continue reading →
Fifteen years ago, NASA selected HDF as the format for the data products produced by NASA Satellites for the NASA Earth Observing System (EOS).
The HDF Earth Science Program is well aware of this important legacy. We focus on continuing support of U.S. environmental satellite programs (the NASA Earth Observing System and the Joint Polar Satellite System, JPSS), ongoing quality assurance of the HDF libraries, and helping data users access and understand products written in HDF. The HDF-EOS Information Center (#hdfeos) includes code examples in MATLAB, IDL, NCL, and Python, many driven by user questions. The site also provides information on other HDF tools.
NASA’s decision ensured a role for HDF in Earth Science and set an important precedent. HDF developers, along with the U.S. and other Earth Observing nations, developed a clear distinction between Earth Science Data Objects (grids, swaths, profiles…); the metadata required to describe them; and the HDF objects (datasets, groups, attributes, etc.) that make them up.
The critical realization was that communities like EOS needed conventions for describing Earth Science objects to enable using and sharing those objects. These conventions, termed HDF-EOS, have been used successfully in hundreds of NASA products that can be easily shared among multiple users using standard tools.
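The layering described here – generic HDF objects (groups, datasets, attributes) carrying convention-defined science objects and their descriptive metadata – can be sketched with the h5py Python bindings to HDF5. This is a minimal illustration, not actual HDF-EOS; the group name, attribute names, and values below are hypothetical stand-ins for whatever a community convention would prescribe:

```python
# Sketch: a swath-like "science object" expressed as plain HDF5 objects,
# with convention-style attributes carrying the descriptive metadata.
# (Assumes h5py is installed; names and attributes are illustrative only.)
import h5py
import numpy as np

with h5py.File("swath_example.h5", "w") as f:
    # An HDF5 group acts as the container for the science object...
    swath = f.create_group("BrightnessTemperature")
    # ...HDF5 attributes carry the metadata the convention requires...
    swath.attrs["long_name"] = "Brightness Temperature"
    swath.attrs["units"] = "K"
    # ...and an HDF5 dataset holds the actual measurements.
    swath.create_dataset("values",
                         data=np.array([[250.0, 251.5], [249.8, 252.1]]))

# Because the file is ordinary HDF5, any generic HDF5 tool can read
# both the data and the metadata that describes it.
with h5py.File("swath_example.h5", "r") as f:
    print(f["BrightnessTemperature"].attrs["units"])
```

The point of the sketch is that the convention lives entirely in agreed-upon names and attributes; the file format itself stays generic, which is what lets standard tools share the products.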
Many other Earth Science communities have used the powerful combination of conventions and HDF. Continue reading →