Streaming and Logging



Hello All,

My company purchased a LabJack UE9 with DAQFactoryExpress. I installed the software and the device last week and am very impressed with it. Currently, I'm monitoring a pressure transducer and logging data on an event (when psi > 20).

I've been studying the technical material, and I'm fairly comfortable with the scripting associated with streaming and the scripting associated with logging. However, the logging examples (writing to a file from a sequence) really only show how to write a single line of data to a file.

Is it possible to write a sequence so that the large block of data that is pulled in during a streaming operation can then be written to my log file?

Let's assume that my stream rate is 1000 samples per second and I'm pulling the data in once every second. How would I log those 1000 samples to my log file each time I pull the data?

Thanks for any help you can provide!

Jeremy N. Choate

Senior Manufacturing Controls Engineer

OPTIMedical Systems, Inc.

Roswell, Georgia


You can write a block of data using the file.writedelim() function. It takes an array, so, for example, you could specify mychannel[0,999] to write the last 1000 points. The problem is that there would be no timestamp and only one column of data, so you really need to assemble all the data you want into a multidimensional array first, then call writedelim(). Something like this:

private dataout
dataout[][0] = mychannel.time[0,999]
dataout[][1] = mychannel[0,999]
dataout[][2] = myotherchannel[0,999]

Then you can use writedelim() to write the whole array. This is MUCH faster than writing a for loop and writing each line individually.
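Putting it together, a sketch of a complete logging sequence (the file path is a placeholder, and the File. function arguments should be double-checked against the DAQFactory help):

```
private dataout
private handle
// assemble time stamps and channel values into columns
dataout[][0] = mychannel.time[0,999]
dataout[][1] = mychannel[0,999]
dataout[][2] = myotherchannel[0,999]
// open the log file for append in text mode (placeholder path)
handle = file.Open("c:\mylog.csv", 0, 1, 1, 1)
// write the whole block in one call
file.WriteDelim(handle, dataout, ",")
file.Close(handle)
```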


  • 1 month later...

Thanks for the tip. I have a follow-up, however.

If I use the following command to retrieve the stream data from the hardware buffer on the UE9:

eGet(lngHandle, LJ_ioGET_STREAM_DATA, LJ_chALL_CHANNELS, &numScans, scandata)

The stream data is stored in the one-dimensional array ("scandata") that I created. Even though the channel data is being written to this array, will your method still work? In other words, will the time stamp data be written to "mychannel" even though the data itself is being written to "scandata"?

For example, what if I stream data in, and it looks like this:

scandata[0] = 20.123
scandata[1] = 21.254
scandata[2] = 20.456


Where will the time stamp for those individual data points be stored (in mychannel[0], [1], [2], etc.)?

Thanks so much.


Oh, well, I wouldn't use GET_STREAM_DATA at all. DAQFactory does some extra work internally to be more efficient and to apply the correct time stamps when you use its built-in stream capture. Also, DAQFactory will still do that internal processing once you do START_STREAM, which means you'll get missing chunks of data: DAQFactory processes some of it internally and sends it to the channels, while you query the UD driver directly. Let DAQFactory put the data into the channels, then process it from there.


I'm starting to follow a little better now. Such are the hazards of consulting both the DAQFactory Express User's Guide and the LabJack UD manual at the same time. So, I assume that whenever the LabJack is streaming, DAQFactory Express is pulling the data into the channel?

I started a basic stream and noticed that DAQFactory Express is putting the stream data directly into the channel. Now your example makes more sense.

I set the scan rate with a statement in my sequence, because I only need about 100 scans per second.
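For reference, one plausible form of such a scan-rate statement, using the standard UD constants LJ_ioPUT_CONFIG and LJ_chSTREAM_SCAN_FREQUENCY (the handle name follows the eGet() call earlier in the thread; exact usage inside DAQFactory Express may differ):

```
// request a 100 Hz stream scan frequency from the UD driver
AddRequest(lngHandle, LJ_ioPUT_CONFIG, LJ_chSTREAM_SCAN_FREQUENCY, 100, 0, 0)
GoOne(lngHandle)
```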

Now I wish to log the data whenever a certain threshold is crossed (say, MyChannel[0] > 3), at a rate of once per second. So for every second that the channel is above that threshold, we would be logging 100 data points.

As an experiment, I put the following code in the event tab of MyChannel:

global array[1][2]
if (MyChannel[0] > 3)
   wait(1)   //wait one second to accumulate 100 data points
   array[0][0] = MyChannel.Time[0]   //write the time stamp to row 1, column 1
   array[0][1] = MyChannel[0]   //write the value to row 1, column 2
endif

I did this to see how well the program keeps up. I'm watching the value of array[0][0] in the Watch window to see whether it updates once per second. It's really slowing my application down, and it's not updating exactly every second. Close, but not exact (average: 1.08 sec).

I intended to replace the code after the WAIT(1) statement with a sequence call. That sequence would handle the writing of my 100 data points from the channel to an array, opening a file, and using the File.WriteDelim function that you spoke of.

After that, the program would return to the event tab, evaluate the value of MyChannel[0], and repeat, if necessary. Does this sound like a good strategy, or am I going about it all wrong?



You'd be better off just using an export set and referencing the channel data directly. The export set expression would simply reference the last 100 points of the channel. Then create a sequence that triggers the export set every second.
Note that normally we recommend against wait(), but this is one of the cases where it is appropriate to use it. The beginexport() statement will always return quickly, and using wait() instead of delay() ensures that the loop runs exactly once per second.
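A minimal sketch of the two pieces (the export set name MyExport is a placeholder; MyChannel is the channel from earlier in the thread):

```
// export set expression, entered in the export set itself:
//   MyChannel[0,99]

// sequence that triggers the export set once per second:
while(1)
   beginexport(MyExport)
   wait(1)
endwhile
```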


