Our Commitment to HDF5’s Diverse Community

David Pearah, The HDF Group

Hello HDF Community!

Thanks for the warm welcome into the HDF family: in my 4+ months as the new CEO, I've been blown away by your passion, your diversity of interests and applications, and your willingness to provide feedback on two questions: why do you use HDF5, and how can HDF5 be improved? I also want to thank my predecessor, Mike Folk, for his invaluable and ongoing support.

The HDF community is growing fast: when I last checked, there were nearly 700 HDF5 projects on GitHub! I've had the privilege of connecting via phone and web with dozens of you over the past few months. Across all of my discussions, one piece of feedback came back loud and clear: The HDF Group needs to be more engaged with its users and help foster the community. We hear you, and here are two actions we're taking to demonstrate this commitment.

HDF5 and The Big Science of Nuclear Stockpile Stewardship

The August 2016 issue of Physics Today includes a fascinating piece titled, “The Big Science of stockpile stewardship.”1

The article leads with, "In the quarter century since the US last exploded a nuclear weapon, an extensive research enterprise has maintained the resources and know-how needed to preserve confidence in the country's stockpile." It goes on to recount how the US Department of Energy (DOE) and its Los Alamos, Sandia, and Lawrence Livermore national laboratories pioneered the use of high-performance computing, replacing the actual building and testing of the US nuclear weapons stockpile with computer simulation.

Although HDF5 is not named in this article, the history of The HDF Group and HDF5 is closely linked to this larger story of American science and geopolitics. In 1993, DOE determined that its computing capabilities would require massive improvements, as the article says, to "ramp up computation speeds by a factor of 10,000 over the highest performing computers at the time, equivalent to a factor of 1 million over computers routinely used for nuclear calculations… To meet the [ten-year] goal, the DOE laboratories had to engage the computer industry in massively parallel processing, a technology that was just becoming available, to develop not just new hardware but new software and visualization techniques."