I have a question regarding the pl1000SetInterval() function.
In the programming guide (https://www.picotech.com/download/manuals/picolog-1000-series-data-loggers-programmers-guide.pdf), the sampling interval is calculated as follows.

In BM_SINGLE mode:

i = 1 μs x us_for_block / (ideal_no_of_samples x no_of_channels)

In other modes:

i = 10 μs x us_for_block / (ideal_no_of_samples x no_of_channels)

where ideal_no_of_samples is the number of samples one wants to collect per channel.
However, in the example code here (https://github.com/picotech/picosdk-c-examples/blob/master/pl1000/pl1000Con/pl1000Con.c), line 522 of the streaming-mode example calculates the sampling interval as

samplingIntervalUs = (usForBlock / (nSamplesPerChannel * nChannels));

Which calculation is correct?