How to read HDF data in Octave

I am working on a remote sensing project. Handling HDF in MATLAB is easy, but I want to run this on a grid computing setup (Ubuntu), so I am trying Octave. I have HDF4 files of chlorophyll data. Ordinary image processing is easy enough in Octave, and I have already installed the image package. What I want to know is what replaces hdfread and hdftool in Octave. Can anyone tell me how to read and work with HDF data, and whether there is a package I need to add? Specifically:

  • How to read HDF data
  • How to load HDF data
  • How to retrieve an image from HDF data
Apopemptic answered 29/7, 2013 at 9:59 Comment(1)
hdfread and hdftool are not yet implemented in Octave (I'm looking at version 3.6.4)...Bott

For HDF5, Octave can load the file without any additional package.

For HDF4, you can convert the file to NetCDF using h4tonccf or the OPeNDAP hdf4_handler, and then read it through NetCDF calls with Octave's octcdf package. We provide a complete example with full Octave source code here:

http://hdfeos.org/software/octave.php
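
As a rough sketch of the HDF4 route (assuming the file has already been converted to NetCDF with h4tonccf and that the octcdf package is installed; the file name 'chlorophyll.nc' and the variable name 'chlor_a' below are only placeholders, not from the original post):

% Rough sketch: read a converted NetCDF file with the octcdf package.
% File and variable names are placeholders.
pkg load octcdf
nc  = netcdf('chlorophyll.nc', 'r');  % open the converted file read-only
chl = nc{'chlor_a'}(:);               % read the whole variable into a matrix
close(nc);                            % release the NetCDF handle
imagesc(chl); colorbar;               % quick look at the data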

Below is a sample plot of a NASA HDF4 product created by Octave via OPeNDAP.

[Figure: Octave plot of a NASA AIRS HDF4 product]

Haihaida answered 10/2, 2014 at 16:8 Comment(0)

The HDF-specific functions haven't been implemented in Octave yet. However, Octave can handle that format with the more general load command: just do load path-to-hdf-file and you'll get a struct in memory.
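
For example (a minimal sketch; the file name and field name are placeholders, and this requires an Octave build with HDF5 support):

% Minimal sketch of reading an HDF5 file with plain load.
% 'mydata.h5' and the field 'chlor_a' are placeholders for your own names.
s = load('-hdf5', 'mydata.h5');  % '-hdf5' forces the HDF5 reader; returns a struct
fieldnames(s)                    % one field per dataset that could be read
imagesc(s.chlor_a)               % access a dataset through its field name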

See these posts on the help mailing list archive: How to read HDF data, and read data subsets from HDF5.

Phanerozoic answered 29/7, 2013 at 15:16 Comment(2)
Thank you for your answer. I have already tried this, but it gives an error. If Octave supported HDF4 and its operations, that would be very helpful for me, instead of having to use MATLAB.Apopemptic
@Apopemptic you need to specify the error you get (and Octave version) if you want people to help you.Phanerozoic

The HDF5 support in Octave <= 4.0 is only intended for files which were written by Octave itself, via the Simple File I/O functions. It has many drawbacks if your data comes from elsewhere; for example, it is not possible to read a single arbitrary dataset, or only part of one.

At the moment, for more complete and Matlab-compatible functions which read and write datasets and attributes, see the module

https://github.com/stegro/hdf5oct
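
As a rough sketch of what reading with hdf5oct can look like (based on the Matlab-style interface the module aims for; the file name, dataset path, attribute name and subset sizes here are assumptions):

% Rough sketch using hdf5oct's Matlab-style functions (assumed to be installed).
% File name, dataset path, attribute name and sizes are placeholders.
d    = h5read('data.h5', '/some/dataset');                  % whole dataset
part = h5read('data.h5', '/some/dataset', [1 1], [10 10]);  % 10x10 subset
a    = h5readatt('data.h5', '/some/dataset', 'units');      % read an attribute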

Edit: I contributed to this project.

Colbert answered 1/6, 2015 at 14:46 Comment(1)
Please disclose your affiliation with the project somewhere, otherwise your post could be considered advertisement or spam, which might be wrong.Darkroom

I believe you can convert HDF4 data to HDF5 with the appropriate conversion tools, e.g. h4toh5.
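
For example, the conversion can be scripted from inside Octave with a system call (a sketch; h4toh5 from The HDF Group's h4h5tools must be installed separately, and the file names are placeholders):

% Sketch: convert HDF4 to HDF5 by calling the external h4toh5 tool.
% File names are placeholders; h4toh5 must be installed on the system.
status = system('h4toh5 chlorophyll.hdf chlorophyll.h5');
if status == 0
  load chlorophyll.h5   % then load the converted file, as shown below
end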

In Octave, loading a .h5 file (and checking its contents) is as simple as:

octave:1> load secondhdf5.h5 
octave:2> whos
Variables in the current scope:

   Attr Name         Size                     Bytes  Class
   ==== ====         ====                     =====  ===== 
        dbldata      4x3                         96  double
        fltdata      4x3                         96  double
        intdata      4x3                         48  int32

Total is 36 elements using 240 bytes

octave:3> size(dbldata)
ans =

   4   3

By the way, the contents of 'secondhdf5.h5' were as follows:

$ h5dump secondhdf5.h5 
HDF5 "secondhdf5.h5" {
GROUP "/" {
   DATASET "dbldata" {
      DATATYPE  H5T_IEEE_F64LE
      DATASPACE  SIMPLE { ( 3, 4 ) / ( 3, 4 ) }
      DATA {
      (0,0): 1.1, 1.2, 1.3, 1.4,
      (1,0): 2.1, 2.2, 2.3, 2.4,
      (2,0): 3.1, 3.2, 3.3, 3.4
      }
   }
   DATASET "fltdata" {
      DATATYPE  H5T_IEEE_F32LE
      DATASPACE  SIMPLE { ( 3, 4 ) / ( 3, 4 ) }
      DATA {
      (0,0): 1.1, 1.2, 1.3, 1.4,
      (1,0): 2.1, 2.2, 2.3, 2.4,
      (2,0): 3.1, 3.2, 3.3, 3.4
      }
   }
   DATASET "intdata" {
      DATATYPE  H5T_STD_I32BE
      DATASPACE  SIMPLE { ( 3, 4 ) / ( 3, 4 ) }
      DATA {
      (0,0): 1, 2, 3, 4,
      (1,0): 5, 6, 7, 8,
      (2,0): 9, 10, 11, 12
      }
   }
}
}
Dymphia answered 27/3, 2015 at 7:34 Comment(1)
I would like to add that Octave does not support HDF5 files with compound types, so every dataset must be of a primitive type (int, double, etc.).Easting
