Picolog1216-SDK: Pl1000SetInterval

roma
Newbie
Posts: 1
Joined: Sun Oct 13, 2019 9:35 am

Picolog1216-SDK: Pl1000SetInterval

Post by roma » Sun Oct 13, 2019 9:49 am

Hi,
I have a question regarding the Pl1000SetInterval() function.
In the programming guide (https://www.picotech.com/download/manuals/picolog-1000-series-data-loggers-programmers-guide.pdf), the sampling interval is calculated as:

in BM_SINGLE mode:
i = 1 μs × us_for_block / (ideal_no_of_samples × no_of_channels)

in other modes:
i = 10 μs × us_for_block / (ideal_no_of_samples × no_of_channels)

where ideal_no_of_samples is the number of samples you want to collect per channel.

However, in the example code here (https://github.com/picotech/picosdk-c-examples/blob/master/pl1000/pl1000Con/pl1000Con.c), line 522 of the streaming-mode example calculates the sampling interval as

samplingIntervalUs = (usForBlock / (nSamplesPerChannel * nChannels));

i.e. without the 10 μs multiplier. Which calculation is correct?

disys
Newbie
Posts: 1
Joined: Wed Oct 16, 2019 1:40 pm

Re: Picolog1216-SDK: Pl1000SetInterval

Post by disys » Wed Oct 16, 2019 1:42 pm

Good morning,

We are having the same issue. Have you found out which formula is correct?
