This is very straightforward code, but I do not understand the result:
from artiq.experiment import *


class TimeTest(EnvExperiment):
    def build(self):
        self.setattr_device("core")

    @kernel
    def run(self):
        self.core.reset()
        time1 = self.core.get_rtio_counter_mu()  # read the hardware RTIO counter
        delay(1*s)
        time2 = self.core.get_rtio_counter_mu()
        print("True kernel time: ", self.core.mu_to_seconds(time2 - time1))
When I run this, I would expect the print statement to report something on the order of 1 second, but regardless of the value passed to delay(), it returns ~1 µs. I tried timing it with time.time() like this:
import time

from artiq.experiment import *


class TimeTest(EnvExperiment):
    def build(self):
        self.setattr_device("core")

    def run(self):  # runs on the host
        t = time.time()
        self.delay_experiment()
        print("Runtime: ", time.time() - t)

    @kernel
    def delay_experiment(self):  # runs on the core device
        time1 = self.core.get_rtio_counter_mu()
        delay(1*s)
        time2 = self.core.get_rtio_counter_mu()
        print("True kernel time: ", self.core.mu_to_seconds(time2 - time1))
This returns the same ~1 µs for the 'True kernel time' and 138 ms for the 'Runtime'. Again, I would expect both values to be ~1 s.
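My current guess is that delay() only advances the timeline cursor rather than making the CPU wait. If that is right, then I would expect a variant like the following sketch to show ~1 s between the two counter reads; my use of wait_until_mu() and now_mu() here is an assumption based on my reading of the manual:

@kernel
def run(self):
    self.core.reset()
    time1 = self.core.get_rtio_counter_mu()
    delay(1*s)
    # Assumption: this blocks the CPU until the hardware RTIO counter
    # reaches the timeline cursor that delay() advanced.
    self.core.wait_until_mu(now_mu())
    time2 = self.core.get_rtio_counter_mu()
    print("True kernel time: ", self.core.mu_to_seconds(time2 - time1))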
Is this a bug, or do I just not understand how the timeline works?