One of the advantages of using HDF5 is that data stored on disk can be compressed, reducing both the space required to store them and the time needed to read those data. This data compression is applied as part of the HDF5 "filter pipeline", which modifies data during I/O operations. HDF5 includes several filter algorithms as standard, and the version of the HDF5 library found in `r Biocpkg("Rhdf5lib")` is additionally compiled with support for the deflate and szip compression filters, which rely on third-party compression libraries. Collectively, HDF5 refers to these as the "internal" filters. It is possible to use any combination of these (including none) when writing data using `r Biocpkg("rhdf5")`. The default filter pipeline is shown in Figure \@ref(fig:filter-pipeline).
```r
knitr::include_graphics("filter_pipeline.png")
```
This pipeline approach has been designed so that filters can be chained together (as in the diagram above) or easily substituted for alternative filters. This allows tailoring the compression approach to best suit the data or application.
It may be the case that, for a specific use case, an alternative third-party compression algorithm is the most appropriate to use. Such filters, which are not part of the standard HDF5 distribution, are referred to as "external" filters. To allow their use without requiring either the HDF5 library or applications to be built with support for every possible filter, HDF5 is able to use dynamically loaded filters. These are compiled independently of the HDF5 library, but are available to an application at run time.
This package currently distributes external HDF5 filters employing bzip2 and the Blosc meta-compressor. In total `r Biocpkg("rhdf5filters")` provides access to seven^[zlib compression is almost always available in a standard HDF5 installation, but is also available via Blosc.] compression filters that can be applied to HDF5 datasets. The full list of filters currently provided by the package is:
r Biocpkg("rhdf5filters")
is principally designed to be used via the r Biocpkg("rhdf5")
package, where several functions are able to utilise the compression filters. For completeness those functions are described here and are also documented in the r Biocpkg("rhdf5")
vignette.
The function `h5createDataset()` within `r Biocpkg("rhdf5")` takes the argument `filter`, which specifies which compression filter should be used when a new dataset is created.
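As a brief sketch of how this looks in practice, the example below creates a small chunked dataset compressed with the bzip2 filter by passing a filter name to the `filter` argument. The exact set of accepted filter names (`"BZIP2"` is assumed here) is described on the `h5createDataset()` help page.

```r
## a minimal sketch, assuming "BZIP2" is an accepted value for 'filter'
library(rhdf5)

h5file <- tempfile(fileext = ".h5")
h5createFile(h5file)

## datasets must be chunked before a compression filter can be applied
h5createDataset(h5file, "dset_bzip2", dims = c(100, 100),
                storage.mode = "double", chunk = c(20, 20),
                filter = "BZIP2")

h5write(matrix(rnorm(10000), ncol = 100), file = h5file, name = "dset_bzip2")
```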
Also available in `r Biocpkg("rhdf5")` are the functions `H5Pset_bzip2()` and `H5Pset_blosc()`. These are not part of the standard HDF5 interface, but are modelled on the `H5Pset_deflate()` function and allow the bzip2 and blosc filters to be set on dataset creation property lists.
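The sketch below shows how this might look via the low-level interface, applying `H5Pset_bzip2()` to a dataset creation property list before creating a dataset with it. The compression level passed to `H5Pset_bzip2()` and the `H5Dcreate()` argument names used here are assumptions; consult the functions' help pages for the exact signatures.

```r
## a minimal sketch of the low-level route; argument names are assumptions
library(rhdf5)

h5file <- tempfile(fileext = ".h5")
fid <- H5Fcreate(h5file)

## dataspace and dataset creation property list; a chunked layout is
## required before any compression filter can be applied
sid <- H5Screate_simple(c(100, 100))
pid <- H5Pcreate("H5P_DATASET_CREATE")
H5Pset_chunk(pid, c(20, 20))
H5Pset_bzip2(pid, 2)  ## second argument assumed to be the compression level

## create the dataset using the property list, then write some values
did <- H5Dcreate(fid, "dset_bzip2", dtype_id = "H5T_IEEE_F64LE",
                 h5space = sid, dcpl = pid)
H5Dwrite(did, matrix(rnorm(10000), ncol = 100))

## close all the open handles
H5Dclose(did); H5Pclose(pid); H5Sclose(sid); H5Fclose(fid)
```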
As long as `r Biocpkg("rhdf5filters")` is installed, `r Biocpkg("rhdf5")` will be able to transparently read data compressed using any of the filters available in the package, without requiring any action on your part.
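For example, reading the blosc-compressed example file distributed with this package requires nothing more than a normal call to `h5read()`:

```r
## reading a blosc-compressed dataset works exactly like reading any other;
## the filter plugin shipped with rhdf5filters is found automatically
library(rhdf5)
blosc_example <- system.file("h5examples/h5ex_d_blosc.h5",
                             package = "rhdf5filters")
vals <- h5read(blosc_example, name = "/dset")
dim(vals)
```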
The dynamic loading design of the HDF5 compression filters means that you can use the versions distributed with `r Biocpkg("rhdf5filters")` with other applications, including other R packages that interface with HDF5 as well as external applications not written in R, e.g. HDFView. The function `hdf5_plugin_path()` will return the location in your package library where the compiled plugins are stored. You can then set the environment variable `HDF5_PLUGIN_PATH` to this location, and other applications will be able to dynamically load the compression plugins found there if needed.
```r
rhdf5filters::hdf5_plugin_path()
```
```r
## error code 127 indicates the command could not be run
h5dump_found <- (system2('h5dump') != 127)
```
The next example demonstrates how the filters distributed by `r Biocpkg("rhdf5filters")` can be used by external applications to decompress data. To do this we'll use the version of `h5dump` installed on the system^[If `h5dump` is not found on your system these examples will fail.] and a file distributed with this package that has been compressed using the blosc filter. Since `r Biocpkg("rhdf5filters")` sets the `HDF5_PLUGIN_PATH` environment variable in an R session, we will manually unset it to demonstrate the typical behaviour.
```r
## blosc compressed file
blosc_file <- system.file("h5examples/h5ex_d_blosc.h5", package = "rhdf5filters")

## unset environment variable
Sys.setenv("HDF5_PLUGIN_PATH" = "")
```
Now we use `system2()` to call the system version of `h5dump` and capture the output, which is then printed below. The most important parts to note are the `FILTERS` section, which shows the dataset was indeed compressed with blosc, and `DATA`, where the error shows that `h5dump` is currently unable to read the dataset.
```r
h5dump_out <- system2('h5dump', args = c('-p', '-d /dset', blosc_file), 
                      stdout = TRUE, stderr = TRUE)
cat(h5dump_out, sep = "\n")
```
```r
cat(
'HDF5 "rhdf5filters/h5examples/h5ex_d_blosc.h5" {
DATASET "/dset" {
   DATATYPE  H5T_IEEE_F32LE
   DATASPACE  SIMPLE { ( 30, 10, 20 ) / ( 30, 10, 20 ) }
   STORAGE_LAYOUT {
      CHUNKED ( 10, 10, 20 )
      SIZE 3347 (7.171:1 COMPRESSION)
   }
   FILTERS {
      USER_DEFINED_FILTER {
         FILTER_ID 32001
         COMMENT blosc
         PARAMS { 2 2 4 8000 4 1 0 }
      }
   }
   FILLVALUE {
      FILL_TIME H5D_FILL_TIME_IFSET
      VALUE  H5D_FILL_VALUE_DEFAULT
   }
   ALLOCATION_TIME {
      H5D_ALLOC_TIME_INCR
   }
   DATA {h5dump error: unable to print data
   }
}
}'
)
```
Next we set `HDF5_PLUGIN_PATH` to the location where `r Biocpkg("rhdf5filters")` has stored the filters and re-run the call to `h5dump`. Printing the output^[The dataset is quite large, so we only show a few lines here.] no longer returns an error in the `DATA` section, indicating that the blosc filter plugin was found and used by `h5dump`.
```r
## set environment variable to hdf5filter location
Sys.setenv("HDF5_PLUGIN_PATH" = rhdf5filters::hdf5_plugin_path())

h5dump_out <- system2('h5dump', args = c('-p', '-d /dset', '-w 50', blosc_file), 
                      stdout = TRUE, stderr = TRUE)

## find the data entry and print the first few lines
DATA_line <- grep(h5dump_out, pattern = "DATA \\{")
cat( h5dump_out[ (DATA_line):(DATA_line+2) ], sep = "\n" )
```
```r
cat(
'   DATA {
   (0,0,0): 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10,
   (0,0,11): 11, 12, 13, 14, 15, 16, 17, 18,'
)
```
```r
sessionInfo()
```