Hi,
This is probably a bit of a stupid question, but it is not entirely clear to me what the best way around this is.
We have written an experimental sequence that takes 1.5 s, which we run in a loop (one that also collects data from a NIDAQ etc.). However, we have found that the deadtime between runs is quite long (2-3 s), and it appears to be dominated by communication between the PC and ARTIQ: sending the sequence to the core device each iteration seems to take up most of this time.
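For concreteness, the loop is structured roughly like the sketch below. The function names are illustrative placeholders, not our actual code; the point is that the full sequence is re-sent to the core device on every shot, so the per-shot overhead includes the whole compile/upload step:

```python
def compile_and_upload(sequence):
    """Placeholder for the ARTIQ compile/upload step (the ~2-3 s overhead we see)."""
    return f"binary({sequence})"

def run_shot(binary):
    """Placeholder for the 1.5 s sequence run plus the NIDAQ readout."""
    return {"shot_data": binary}

results = []
for shot in range(100):
    # The sequence is re-submitted every iteration, so the
    # communication cost is paid on every single shot.
    binary = compile_and_upload("our_sequence")
    results.append(run_shot(binary))
```

Presumably the fix is some way of keeping the compiled sequence on the core device (or amortising the upload across many shots), rather than paying it once per iteration as above.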
I was wondering whether there is any advice on the most sensible way to reduce this deadtime, as it is currently the limiting factor in our experiments.
Thank you in advance for your help.