Hi, we are using the CU-3 variant. We have a question regarding dataset size. Here is a brief introduction to how we are using ARTIQ:
- We have an experiment script that runs a `for` loop, scanning through an experimental parameter.
- With each iteration of the loop, we take a picture of our setup (646 × 482 px) and append it to our dataset with `append_to_dataset()` (see the sketch after this list).
- At the end of the script, ARTIQ saves the dataset to the HDF5 file as usual.
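In case it helps, this is roughly what the experiment looks like. It is a minimal sketch, not our actual code: `grab_frame()` is a hypothetical stand-in for our camera readout, and the dataset name is illustrative.

```python
import numpy as np
from artiq.experiment import EnvExperiment


class ImageScan(EnvExperiment):
    def build(self):
        pass  # device setup omitted

    def run(self):
        # Start from an empty dataset; broadcast so the applet can show frames.
        self.set_dataset("images", [], broadcast=True, archive=True)
        for i in range(120):
            img = self.grab_frame()  # hypothetical helper, see below
            self.append_to_dataset("images", img)

    def grab_frame(self):
        # Placeholder for the real camera acquisition (482 x 646 frame).
        return np.zeros((482, 646), dtype=np.uint16)
```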
We noticed that if we take more than 100 pictures, we cannot open the HDF5 file; it shows the error `OSError: Unable to open file (bad object header version number)`. If we stay below 100 pictures, this error does not occur. It also does not look like a problem with the camera or the images themselves, because we use the dashboard applet to show the last image taken, and every image displayed by the applet looks fine.
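For reference, this is how we try to read the file back (the filename is just an example; ARTIQ stores run datasets under the `datasets` group in the results file):

```python
import h5py

# Filename is illustrative. The OSError is raised already at open():
# OSError: Unable to open file (bad object header version number)
with h5py.File("000012345-ImageScan.h5", "r") as f:
    images = f["datasets/images"][:]
```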
At 120 pictures of the aforementioned size, the HDF5 file is 148 MB on disk. We initially thought this could be a RAM issue, but the problem persists even after we upgraded the RAM on the computer running `artiq_master` from 16 GB to 64 GB.
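For scale, the file size is consistent with 4-byte pixel values (an assumption on our part, but the numbers line up):

```python
# 120 frames of 646 x 482 px at 4 bytes per pixel:
n_bytes = 120 * 646 * 482 * 4
print(n_bytes / 1e6)  # ~149 MB, matching the 148 MB we see on disk
```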
We also thought it could be the time elapsed between the start and end of the experiment, but we ran a very long experiment (using `time.sleep()`) without saving any images, and that HDF5 file opens fine.
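Roughly, that long-running test looked like this (again a sketch; the duration is arbitrary):

```python
import time

from artiq.experiment import EnvExperiment


class SleepTest(EnvExperiment):
    def build(self):
        pass

    def run(self):
        # Run for a long time without appending any images.
        time.sleep(6 * 3600)
```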
Is there a limit on the size of a dataset, or does anyone know what could be going wrong when the HDF5 file is saved?
Please help, thank you!