"ValueError: Not a location id (Invalid object id)" while creating HDF5 datasets
I have a NumPy array (arr) of shape (3997, 29). I am using this array to create a dataset. The array contains both integer and float values, so I used a reference dtype. But when I execute the code I get the error below.

"ValueError: Not a location id (Invalid object id)"

with h5py.File("test1.h5", 'w') as f:
    grp = f.create_group('Nodes')

with h5py.File("test1.h5", 'r+') as f:
    grp = f.require_group('Nodes')

ref_dtype = h5py.special_dtype(ref=h5py.Reference)

arrshape = np.shape(arr)

dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)

The error occurs in the last line. The full traceback is below:

 dset = f.create_dataset('Init' ,arrshape, dtype = ref_dtype , data= arr)

  File "C:\Users\rupesh.n\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\_hl\group.py", line 108, in create_dataset
    dsid = dataset.make_new_dset(self, shape, dtype, data, **kwds)

  File "C:\Users\rupesh.n\AppData\Local\Continuum\anaconda3\lib\site-packages\h5py\_hl\dataset.py", line 137, in make_new_dset
    dset_id = h5d.create(parent.id, None, tid, sid, dcpl=dcpl)

  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper

  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper

  File "h5py\h5d.pyx", line 79, in h5py.h5d.create

ValueError: Not a location id (Invalid object id)
Willms answered 7/3, 2018 at 11:41 Comment(3)
I can guess that the error occurs in the last line, but we usually ask for the traceback from errors to be sure. Another wild guess is that f is not open when you try to access grp, and that the invalid object id refers to grp. For more than guesses we need a minimal reproducible example. – Opposition
@Vovanrock2002 Yes, the error occurs in the last line. I have updated the question with the traceback. – Willms
The issue is resolved. The file object was closed as soon as we left the with block. – Willms
This error frequently occurs when you try to create a new dataset using a closed file handle. If you are iterating, make sure you are not closing the file inside the loop. I had the same problem as the OP.
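A minimal sketch of that pitfall, using a plain Python file object as a stand-in (h5py handles follow the same lifetime rules; the path and loop body here are hypothetical):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

# Broken pattern: closing the handle inside the loop invalidates it
# for the next iteration.
f = open(path, "w")
caught = None
try:
    for i in range(3):
        f.write("row %d\n" % i)
        f.close()  # bug: the second write hits a closed handle
except ValueError as exc:
    caught = exc  # "I/O operation on closed file"

# Correct pattern: open once, loop inside, and let the context
# manager close the handle only after the loop finishes.
with open(path, "w") as f:
    for i in range(3):
        f.write("row %d\n" % i)
```

The same restructuring applies with h5py: move the loop inside the single `with h5py.File(...)` block instead of reopening or closing the file per iteration.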

Whinchat answered 30/8, 2018 at 18:27 Comment(0)
This question is a bit old, but in case anyone else ends up here with the same problem, I'll clarify the answer a little. @WilderField is correct, but to be more precise:

In the last line:

dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)

grp points to the now-closed h5py.Group that was obtained in:

with h5py.File("test1.h5", 'r+') as f:
    grp = f.require_group('Nodes')

Because grp was bound inside the with... context manager, it is only an open group within that context manager. The HDF5 file and all groups/datasets associated with it are closed when the context manager exits. This behaviour prevents the file from being held open by stray references to HDF5 objects.

The solution is to create the h5py.Dataset inside of the context manager, i.e.:

with h5py.File("test1.h5", 'r+') as f:
    grp = f.require_group('Nodes')
    dset = grp.create_dataset('Init', arrshape, dtype=ref_dtype, data=arr)

Again, as soon as the context manager exits, dset will point to a closed h5py.Dataset, so unless you actually want to do something more with it inside the block, it is sufficient to call grp.create_dataset(...) without assigning the return value to dset.
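The lifetime rule can be demonstrated with a plain Python file object, which follows the same context-manager behaviour as h5py handles (the file path here is hypothetical):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt")

with open(path, "w") as f:
    g = f                   # any name bound inside the block...
    open_inside = f.closed  # False: the handle is live here

# ...survives the block, but the underlying resource is closed,
# so later I/O through g (or f) raises an error.
after_exit = g.closed       # True
```

In the same way, grp and dset remain valid Python names after the `with h5py.File(...)` block, but the HDF5 objects behind them are closed and any further access fails.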

Leporid answered 1/12, 2020 at 19:33 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.