The attached image shows the two trigger signals I have at my disposal. The blue trace ("Scan") goes high when an MS scan is initiated and stays high for the duration of the scan, and the red trace ("Trigger") goes high every time ions hit the detector. The way I currently operate is to use "Scan" as the external trigger and trigger on that, feed "Trigger" into channel B, and feed the raw output of the detector into channel A. I set up a for loop that bins all of the ADC counts each time the trigger goes high. This works okay, but not great.
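For reference, this is roughly the kind of binning I mean, written out as a NumPy sketch rather than LabVIEW (the names and the threshold test are just placeholders to show the idea, not my actual VI):

```python
import numpy as np

def bin_counts_per_trigger(adc, gate, threshold):
    """Sum channel-A ADC counts over each interval where the gate
    (channel B, the "Trigger" line) is high.  adc and gate are arrays
    of the same length from one "Scan"-triggered capture."""
    high = gate > threshold
    # Indices where the gate transitions low->high or high->low
    edges = np.flatnonzero(np.diff(high.astype(np.int8)))
    rising = edges[~high[edges]] + 1      # first sample after a low->high step
    falling = edges[high[edges]] + 1      # first sample after a high->low step
    # If the record starts with the gate already high, drop the orphan falling edge
    if falling.size and rising.size and falling[0] < rising[0]:
        falling = falling[1:]
    n = min(rising.size, falling.size)
    return np.array([adc[r:f].sum() for r, f in zip(rising[:n], falling[:n])])
```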
What I would like to do is drop "Scan" entirely and instead trigger the PicoScope on "Trigger", taking a series of captures every time "Trigger" goes high, and only for the duration that "Trigger" is high. In the attached example there are about 36 captures (though eventually there would be at least 1000). The problem, as you can see, is that "Trigger" does not stay high or low for a set duration of time; instead it stays high or low for a set number of periods of the corresponding frequencies.
So my question is: is there a way to get the Pico to take a capture when a channel goes high, and only for as long as it stays high? And to do that repeatedly until the channel has stayed low for some set amount of time? If I can get that, I'm pretty sure I can stitch all of the captures together and put them in a queue to actually see what I'm looking for.
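To show what I mean by stitching, here's a rough Python sketch of the post-processing I have in mind: keep only the channel-A samples taken while channel B is high, concatenate them in order, and stop once channel B has been quiet for some timeout (again, just a sketch with made-up names, not my actual code):

```python
import numpy as np

def stitch_gated_segments(adc, gate, threshold, idle_samples):
    """Concatenate the channel-A samples from every interval where the
    gate (channel B) is high, stopping once the gate has stayed low for
    `idle_samples` consecutive samples."""
    pieces = []
    start = None          # index where the current high interval began
    low_run = 0           # how many consecutive samples the gate has been low
    for i, g in enumerate(gate > threshold):
        if g:
            if start is None:
                start = i
            low_run = 0
        else:
            if start is not None:
                pieces.append(adc[start:i])
                start = None
            low_run += 1
            if pieces and low_run >= idle_samples:
                break     # gate has been quiet long enough: scan is over
    if start is not None:
        pieces.append(adc[start:])
    return np.concatenate(pieces) if pieces else np.array([], dtype=adc.dtype)
```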
I'm currently using the RapidBlock mode of the PicoScope in LabVIEW. I've seen some people mention that Streaming mode works for longer acquisition times, but I can't for the life of me understand the example or how to use it. The .vi I'm using is fairly substantial, since it does both the acquisition and the data analysis, but I can upload a portion of it if needed. I'm really just using the pre-made PS5000 Settings and Block VIs in a while loop, so I'm not currently doing anything crazy in my program as far as the PicoScope is concerned.
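From what I can tell, the streaming approach would just mean doing the same gating chunk by chunk as the data comes in (a while loop around the streaming read VI), something like this sketch; `get_next_chunk` here is only a stand-in for whatever the streaming read actually returns, not a real Pico call:

```python
import numpy as np

def process_stream(get_next_chunk, threshold, idle_samples):
    """Conceptual streaming loop: gate channel A on channel B chunk by
    chunk instead of over one long block, and stop once the gate has
    been low for `idle_samples` consecutive samples."""
    pieces, current, low_run = [], [], 0
    while True:
        chan_a, chan_b = get_next_chunk()      # placeholder for the real streaming read
        for a, g in zip(chan_a, chan_b > threshold):
            if g:
                current.append(a)
                low_run = 0
            else:
                if current:
                    pieces.append(np.array(current))
                    current = []
                low_run += 1
        if pieces and low_run >= idle_samples:
            break                              # quiet long enough: stop collecting
    return pieces                              # one array per "Trigger"-high interval
```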
Any help would be greatly appreciated
