HDF: The Next 30 Years (Part 1)

Dave Pearah, The HDF Group

How can users of open source technology ensure that the open source solutions they depend on every day don’t just survive, but thrive?

While on my flight home from New York, I’m reflecting on The Trading Show, which focused on tech solutions for the small but influential world of proprietary and quantitative financial trading. I participated in a panel called “Sharing is Caring,” regarding the industry’s broad use of open source technology.

The panel featured a mix of companies that both provide and use open source software. Among the topics:

  • Are cost pressures the only driving force behind the open source movement among trading firms, hedge funds and banks?
  • How will open source solutions shape the future of quant and algorithmic trading?
  • And, of particular interest: how can we create an environment that encourages firms with proprietary technology to contribute back to open source projects?

The issue of open source sustainability received vigorous discussion. Many people make the mistake of assuming that open source software packages just “take care of themselves” or “will always be around,” but the evidence suggests otherwise.

The recent Ford Foundation report, “Roads and Bridges: The Unseen Labor Behind Our Digital Infrastructure,” paints a grim picture of poorly maintained or abandoned open source projects that are relied upon by large communities of users (i.e., the issue is support, not adoption).

Klint Finley’s recent Wired article, Open Source Won. So, Now What? says, “Despite this mainstream success, many crucial open source projects—projects that major companies rely on—are woefully underfunded. And many haven’t quite found the egalitarian ideal that can really sustain them in the long term.”

This topic is near and dear to me as the CEO of The HDF Group. HDF has a long history of integrity and is very committed to its user community – survival is mandatory. At the same time, I bear the responsibility to ensure that The HDF Group’s technologies – a vitally important and broadly adopted technology portfolio – not only survive, but thrive.

In order to achieve this, we have to grow the HDF business. Why?

The HDF Group is a not-for-profit organization that makes money through consulting, typically in two forms:

  1. Adding functionality to The HDF Group’s software portfolio (e.g., HDF5, HDF4, HDFView, etc.)
  2. Helping people be successful with HDF (review, tune, correct, coach, train, educate, advise, etc.)

The profit from these activities funds the sustainability and evolution of HDF5. This is actually a fairly common open source business model and one that has worked well for us for nearly 30 years. So why change? Costs are increasing because the user base is growing: more user support, more testing, more configurations, etc. There is no corresponding increase in revenue to offset these costs.

We have an amazingly talented + passionate + dedicated team of folks who focus entirely on the HDF library. With either the sweat equity or financial equity of the user community, we can not only continue these efforts, but make plans to address new features and functions that benefit the entire user community.

In my next post – Part 2 – I’ll outline some of the ideas around how we plan to engage the HDF community and create a conversation around how to ensure HDF’s viability and relevance for the next 30 years. I’ll also outline a number of ways that you and your organizations can partner with us to make this a win-win for all stakeholders.

I look forward to this dialogue with you and I’m eager to see your blog comments!

Dave Pearah

 

…HDF5 has broad adoption in the financial services industry, including high-frequency trading (HFT) firms, hedge funds, investment banks, pension boards and data syndicators. Financial firms of all sizes and types rely on massive amounts of data for trading, risk analysis, customer portfolio analysis, historical market research, and many other data-intensive functions.

Many HDF adopters in finance have extremely large and complex datasets with very fast access requirements. Others turn to HDF because it allows them to easily share data across a wide variety of computational platforms using applications written in different programming languages. Some use HDF to take advantage of the many HDF-friendly tools used in financial analysis and modeling, such as MATLAB, Pandas, PyTables and R.
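For readers who live in that Pandas/PyTables corner of the ecosystem, here is a minimal sketch of the kind of workflow those tools enable: pandas persists a DataFrame to HDF5 (via PyTables) and later pulls back only the rows matching an on-disk query. The file name, key, and column names are invented purely for illustration.

    import numpy as np
    import pandas as pd

    # Build a toy minute-bar price series (illustrative data only).
    idx = pd.date_range("2016-01-04 09:30", periods=390, freq="min")
    bars = pd.DataFrame(
        {"price": 100 + np.random.randn(len(idx)).cumsum(),
         "volume": np.random.randint(1, 1000, len(idx))},
        index=idx,
    )

    # Write to HDF5 in table format so the file can be queried on disk.
    bars.to_hdf("bars.h5", key="acme", format="table", mode="w",
                data_columns=["volume"])

    # Read back only the rows that match a condition, without loading it all.
    big_trades = pd.read_hdf("bars.h5", "acme", where="volume > 900")
    print(big_trades.head())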

HDF technologies are relevant when the data challenges being faced push the limits of what can be addressed by traditional database systems, XML documents, or in-house data formats. Leveraging the powerful HDF products and the expertise of The HDF Group, organizations realize substantial cost savings while solving challenges that seemed intractable using other data management technologies. For more information, please visit our website product pages.

 

 

HDFql – the new HDF tool that speaks SQL

Rick, HDFql team, HDF guest blogger

HDFql (Hierarchical Data Format query language) was recently released to enable users to handle HDF5 files with a language as easy and powerful as SQL. 

By providing a simpler, cleaner, and faster interface for HDF across C/C++/Java/Python/C#, HDFql aims to ease scientific computing, big data management, and real-time analytics. As the author of HDFql, Rick is collaborating with The HDF Group by integrating HDFql with tools such as HDF Compass, while continuously improving HDFql to meet user needs.

Introducing HDFql

If you’re handling HDF files on a regular basis, chances are you’ve had your (un)fair share of programming headaches. Sure, you might have gotten used to the hassle, but navigating the current APIs probably feels a tad like filing expense reports: rarely a complete pleasure!

If you’re new to HDF, you might seek to avoid the format altogether. Even trained users have been known to occasionally scout for alternatives. One doesn’t have to have a limited tolerance for unnecessary complexity to get queasy around these APIs – one simply needs a penchant for clean and simple data management.

This is what we heard from scientists and data veterans when asked about HDF. It’s what challenged our own synapses and inspired us to create HDFql. Because on the flip-side, we also heard something else:

  • HDF has proven immensely valuable in research and science
  • The data format pushes the boundaries of what is achievable with large and complex datasets
  • And it provides an edge in speed and access times, which is critical in the big data / advanced analytics arena

With an aspiration of becoming the de facto language for HDF, we hope that HDFql will play a vital role in the future of HDF data management by:

  • Enabling current users to arrive at (scientific) insights faster via cleaner data handling experiences
  • Inspiring prospective users to adopt the powerful data format HDF by removing current roadblocks
  • Perhaps even grabbing a few HDF challengers or dissenters along the way…
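To give a flavor of the cleaner experience we are aiming for, here is a minimal sketch of an HDFql session driven from Python. It assumes the HDFql Python wrapper is installed and exposes an execute() entry point; the exact wrapper names should be checked against the HDFql reference manual, and the file and dataset names below are purely illustrative.

    # A minimal sketch of driving HDF5 through HDFql's SQL-like operations from
    # Python. Assumes the HDFql Python wrapper is installed and exposes an
    # execute() entry point (see the HDFql reference manual for exact names);
    # the file and dataset names are illustrative.
    import HDFql

    HDFql.execute("CREATE FILE example.h5")
    HDFql.execute("USE FILE example.h5")

    # Create a one-dimensional dataset of three integers and populate it.
    HDFql.execute("CREATE DATASET readings AS INT(3)")
    HDFql.execute("INSERT INTO readings VALUES(7, 42, 13)")

    # Query the values back (results are walked with HDFql's cursor functions,
    # omitted here for brevity).
    HDFql.execute("SELECT FROM readings")

    HDFql.execute("CLOSE FILE")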

Continue reading

The HDF Group welcomes new CEO Dave Pearah

Pearah joins The HDF Group as new Chief Executive Officer

Champaign, IL —  The HDF Group today announced that its Board of Directors has appointed David Pearah as its new Chief Executive Officer. The HDF Group is a software company dedicated to creating high performance computing technology to address many of today’s Big Data challenges.

Pearah replaces Mike Folk upon his retirement after ten years as company President and Board Chair. Folk will remain a member of the Board of Directors, and Pearah will become the company’s Chairman of the Board of Directors.

Pearah said, “I am honored to have been selected as The HDF Group’s next CEO. It is a privilege to be part of an organization with a nearly 30-year history of delivering innovative technology to meet the Big Data demands of commercial industry, scientific research and governmental clients.”

The company’s client list includes industry leaders in fields ranging from aerospace and biomedicine to finance. In addition, government entities such as the Department of Energy and NASA, numerous research facilities, and scientists in disciplines from climate study to astrophysics depend on HDF technologies.

Pearah continued, “We are an organization led by a mission to make a positive impact on everyone we engage, whether they are individuals using our open-source software, or organizations who rely on our talented team of scientists and engineers as trusted partners. I will do my best to serve the HDF community by enabling our team to fulfill their passion to make a difference.  We’ve just delivered a major release of HDF5 with many additional powerful features, and we’re very excited about several innovative new products that we’ll soon be making available to our user community.”

“Dave is clearly the leader for HDF’s future, and …” Continue reading

Announcing HDF5 1.10.0

We are excited and pleased to announce HDF5-1.10.0, the most powerful version of our flagship software ever.

HDF5 1.10.0 is now available

This major new release of HDF5 is more powerful than ever before and packed with new capabilities that address important data challenges faced by our user community.

HDF5 1.10.0 contains many important new features and changes, including those listed below. The features marked with * use new extensions to the HDF5 file format.

  • The Single-Writer / Multiple-Reader (SWMR) feature enables users to read data while it is concurrently being written (see the sketches following this list). *
  • The virtual dataset (VDS) feature enables users to access data in a collection of HDF5 files as a single HDF5 dataset and to use the HDF5 APIs to work with that dataset (see the sketches following this list). *   (NOTE: There is a known issue with the h5repack utility when using it to modify the layout of a VDS. We understand the issue and are working on a patch for it.)
  • New indexing structures for chunked datasets were added to support SWMR and to optimize performance. *
  • Persistent free file space can now be managed and tracked for better performance. *
  • The HDF5 Collective Metadata I/O feature has been added to improve performance when reading and writing data collectively with Parallel HDF5.
  • The Java HDF5 JNI has been integrated into HDF5.
  • Changes were made in how autotools handles large file support.
  • New options for the storage and filtering of partial edge chunks have been added for performance tuning.*

* Files created with these new extensions will not be readable by applications based on the HDF5-1.8 library.
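To make the two starred headline features concrete, here are minimal sketches using h5py built against HDF5 1.10 (a sufficiently recent h5py release is assumed); the file names, dataset names and shapes are our own, purely for illustration. First, SWMR – in practice the writer and the reader run as separate processes, both opening the file with the latest file-format version:

    import h5py
    import numpy as np

    # --- Writer process: append data while readers watch the file grow ---
    w = h5py.File("stream.h5", "w", libver="latest")
    dset = w.create_dataset("samples", shape=(0,), maxshape=(None,),
                            dtype="f8", chunks=(1024,))
    w.swmr_mode = True                      # switch the file into SWMR mode

    block = np.random.random(100)
    dset.resize((dset.shape[0] + block.size,))
    dset[-block.size:] = block
    dset.flush()                            # make the new data visible to readers

    # --- Reader process: open concurrently and pick up newly appended data ---
    r = h5py.File("stream.h5", "r", libver="latest", swmr=True)
    samples = r["samples"]
    samples.refresh()                       # re-read metadata to see new rows
    print(samples.shape)

And second, a virtual dataset that presents four per-file datasets as one logical two-dimensional dataset (the hypothetical source files part0.h5 through part3.h5 are each assumed to hold a 100-element dataset named "data"):

    # Stitch four source datasets into one logical (4, 100) dataset.
    layout = h5py.VirtualLayout(shape=(4, 100), dtype="f8")
    for i in range(4):
        layout[i] = h5py.VirtualSource("part{}.h5".format(i), "data", shape=(100,))

    with h5py.File("vds.h5", "w", libver="latest") as f:
        f.create_virtual_dataset("combined", layout, fillvalue=-1)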

We would like to thank you, our user community, for your support and for the input and feedback that helped shape this important release.

The HDF Group

Solutions to Data Challenges

Please refer to the following document which describes the new features in this release:   https://www.hdfgroup.org/HDF5/docNewFeatures/

All new and modified APIs are listed in detail in the “HDF5 Software Changes from Release to Release” document:     https://www.hdfgroup.org/HDF5/doc/ADGuide/Changes.html

For detailed information regarding this release see the release notes:     https://www.hdfgroup.org/ftp/HDF5/releases/hdf5-1.10/hdf5-1.10.0/src/hdf5-1.10.0-RELEASE.txt

For questions regarding these or other HDF issues, contact:      help@hdfgroup.org

Links to the HDF5 1.10.0 source code, documentation, and additional materials can be found on the HDF5 web page at:     https://www.hdfgroup.org/HDF5/

The HDF5 1.10.0 release can be obtained directly from:   https://www.hdfgroup.org/HDF5/release/obtain5110.html

User documentation for 1.10.0 can be accessed from:   https://www.hdfgroup.org/HDF5/doc/

Answering biological questions using HDF5 and physics-based simulation data

David Dotson, doctoral student, Center for Biological Physics, Arizona State University; HDF Guest Blogger

Recently I had the pleasure of meeting Anthony Scopatz for the first time at SciPy 2015, and we talked shop. I was interested in his opinions on MDSynthesis, a Python package our lab has designed to help manage the complexity of raw and derived data sets from molecular dynamics simulations, about which I was presenting a poster.

Figure 1: Example of a molecular dynamics simulation in a simple system: deposition of a single Cu atom on a Cu(001) surface. Each circle illustrates the position of a single atom; note that the actual atomic interactions used in current simulations are more complex than those of 2-dimensional hard spheres. https://en.wikipedia.org/wiki/Molecular_dynamics Image: Kai Nordlund, professor of computational materials physics, University of Helsinki.

In particular, I wanted his thoughts on how we are leveraging HDF5, and whether we could be doing it better.  The discussion gave me plenty to think about going forward, but it also put me in contact with some of the other folks involved in the Python ecosystem surrounding HDF5. Long story short, I was asked to share how we were using HDF5 with a guest post on the HDF Group blog.

First, a bit of background. At the Beckstein Lab we perform physics-based simulations of proteins, the molecular machines of life, in order to get at how they do what they do. These simulations may include thousands to millions of atoms, with the raw data being a trajectory of their positions over time, which can run from hundreds to millions of frames.
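To make the storage problem concrete: a trajectory is naturally a three-dimensional array of shape (frames, atoms, 3). The sketch below shows one straightforward way to hold such an array in HDF5 with h5py, chunked by frame and compressed; the file name, sizes, and layout are illustrative and are not a description of how MDSynthesis itself organizes its files.

    import h5py
    import numpy as np

    # Small numbers for a runnable sketch; real trajectories reach thousands to
    # millions of atoms and hundreds to millions of frames.
    n_atoms = 1_000
    n_frames = 100

    with h5py.File("trajectory.h5", "w") as f:
        # Chunking by frame keeps whole-frame reads cheap; gzip trims file size.
        pos = f.create_dataset(
            "positions",
            shape=(0, n_atoms, 3),
            maxshape=(None, n_atoms, 3),
            chunks=(1, n_atoms, 3),
            dtype="f4",
            compression="gzip",
        )
        # Append frames as the simulation produces them.
        for _ in range(n_frames):
            coords = np.random.random((n_atoms, 3)).astype("f4")  # stand-in data
            pos.resize(pos.shape[0] + 1, axis=0)
            pos[-1] = coords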
Continue reading

Letter to the HDF User Community

Lindsay Powers – The HDF Group

The HDF Group provides free, open-source software that is widely used in government, academia and industry. The goal of The HDF Group is to ensure the sustainable development of HDF (Hierarchical Data Format) technologies and the ongoing accessibility of HDF-stored data, because users and organizations rely on these technologies for mission-critical systems and archives. These users and organizations are a critical element of the HDF community and an important source of new and innovative uses of, and sustainability for, the HDF platforms, libraries and tools.

We want to create a sustainability model for the open access platforms and libraries that can serve these diverse communities in the future use and preservation of their data. As a step towards engaging this community, we are seeking partners for a National Science Foundation Research Coordination Network (RCN).

The National Science Foundation supports RCNs in order to foster collaboration and communication among scientists and technologists in the areas of research coordination, education and training, collaborative technologies, and standards development. Our vision of this RCN is to develop a core community of experienced and dedicated HDF users to:

  1. Foster education and training of new and existing users through development of teaching modules, workshops and other mechanisms for sharing knowledge and experience,
  2. Provide a forum for sharing tools and techniques related to HDF technologies,
  3. Convene diverse users to foster interdisciplinary collaboration, and
  4. Formalize a community of committed HDF users invested in the sustainability of HDF products.

Continue reading

Worried about your unlimited data plan bills? Cut them with OPeNDAP

Joe Lee, The HDF Group

Sprint has recently hit the airwaves with a promotion claiming that they will cut your data bill in half.  But there’s no free lunch in this connected world we live in. Unlimited data plans always come with a steep price tag.

While the internet has been around for a while, there has recently been an explosion of data – email, the World Wide Web, social media, cloud computing, mobile apps for everything, and Big Data. At the same time, the overall global population of people using the internet has skyrocketed, as has the “Internet of Things.” Simply finding your way around all that data can be a challenge.

The overcrowded and congested internet will continue to throw more data at us, so getting the right amount of the right data can also be a great challenge. When data is delivered over the internet, requesting only the data you need dramatically shortens delivery time and minimizes delivery cost.  Continue reading
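A hint at where this is going: OPeNDAP lets a client ask the server for just a slice of a variable, so only the requested subset travels over the wire. The sketch below uses the netCDF4 Python module (which can open OPeNDAP URLs when the underlying library is built with DAP support) against a hypothetical endpoint; the URL and variable name are placeholders, not a real service.

    from netCDF4 import Dataset

    # Hypothetical OPeNDAP endpoint serving an HDF5/netCDF dataset.
    URL = "http://example.org/opendap/hyrax/granule.h5"

    ds = Dataset(URL)                    # opens the remote dataset lazily
    temp = ds.variables["temperature"]   # no data transferred yet

    # Slicing sends a constrained request: only this subset crosses the network.
    subset = temp[0, 100:200, 100:200]
    print(subset.shape)

    ds.close()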