Simultaneous Streaming and Command-Response

I've been working with a couple of LabJack units for a while and am trying to increase their performance by using streaming mode, but I'm a little confused about what streaming mode can and can't do. I am using DAQFactory Base Edition. The channels that I need to utilize are:

2 quadrature inputs (2 timers each on separate LabJacks)

15 analog inputs (high sample rate desired)

2 digital outputs (only occasionally used)

Here are the questions that I have:

1. What is the difference between data averaging using the export function and averaging using the option in the channel list? I am looking for the method that will provide the fastest sample rate. All 15 of my analog channels need to be averaged.

2. Is it possible to operate some channels in streaming mode and others in command-response mode on the same LabJack unit? If not, could I operate two LabJacks in command-response mode and one in streaming mode?

3. If I am streaming data, can I still access the channels as if they were in command-response mode? Specifically, I'd like to stream the data at a high rate, average 10 history points per channel, and then use an export set to record them to a file at a non-constant time interval based on an event. Basically, I am not sure how my control of the data is limited while I am in streaming mode.

Any suggestions would be appreciated. I have limited experience with scripting, but if I need to do all of this using sequences to get the best performance, I'm willing to attempt it. If you need more info, or need me to upload my .ctl file, let me know.




First, which LabJacks? I'm going to assume not the U12, given your specs. The U12 has very different capabilities from the others.

1. Don't use data averaging in export sets; it's not going to work the way you want. Use the option in the channel list. That will cause the system to take X data points and reduce them to a single averaged value. Only that averaged value will appear in the channel, so you can treat it as a single value and ignore the fact that it's actually oversampled. I believe this works fine with streaming, but there may be some limitations based on the data block size that comes back from the device.

2. Yes, I believe so. With the U12 you definitely can't, but I'm pretty sure with the U3/U6/UE9 they made it possible. It's easy to check: just fire up streaming, then try to set an output or read another channel. Maybe the LabJack folks can provide confirmation. Even with the U12 you can stream one unit and poll another, though I believe you can only stream from one unit at a time.

3. Yes. The main DAQFactory application doesn't really pay attention to whether the data is streamed or not; that's all controlled by the driver. There is some blocking of data which, for example, causes the channel Event to be called once per block rather than once per data point, but otherwise, once the data is in DAQFactory, it's just data.
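For instance, once streamed data lands in a channel, the usual subsetting expressions evaluate the same way they would for a polled channel. A quick sketch (the channel name is hypothetical):

```
// These expressions behave identically for polled and streamed channels:
? MyChannel[0]            // most recent reading
? mean(MyChannel[0,99])   // average of the last 100 readings
? MyChannel.time[0]       // time stamp of the most recent reading
```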


It also sounds to me like you are talking about the U3/U6/UE9, which all use the UD driver on Windows. Streaming has to take control of the entire analog input system, so while streaming, analog inputs are not available via command-response. Everything besides the analog inputs can still be used in command-response mode while streaming.
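For example, with all the analog inputs streaming, a digital output channel can still be set from a sequence through command-response. A sketch with a hypothetical channel name:

```
// Analog inputs are streaming, but digital I/O still works
// through command-response:
MyDigitalOut = 1   // set the output high
delay(1)           // wait one second
MyDigitalOut = 0   // set the output low
```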


OK, thanks for the response. That info is going to help me out a bunch. For the record, I'm using U3s.

Thanks again,



  • 2 weeks later...

With your advice, I now have my system working the way I wanted, with one exception: I am using an event to trigger a function that writes data to a file, but it is skipping some data points. I think this is because the event is only called once for each block, as you stated above. I need to log one line of data every time the quadrature input is incremented.

To remedy the problem I have tried the following code in an event on the quadrature channel:

// This event scans the last 100 entries of the quadrature encoder
// for changes and logs a line of data for each one.
private i = 0
while (i < 100)
   if ((Front_Timer0[i] > Front_Timer0[i+1]) && Snd_Man)  // Snd_Man allows the datalogging to be turned on and off.
	  Func_Export(i) // This is a function that exports a single line of data.
   endif
   i = i + 1
endwhile

This works OK, but it will either skip a data point every once in a while or log some data points twice, depending on the limit used in the while loop and the stream rate. Right now the stream rate is set to 700. Is there a way I can fix this problem and make the data logging independent of the stream rate? I imagine the solution may require me to figure out the size of each block of data that is created while streaming. Is that information available? I also thought I might be able to use the time stamp as a reference instead of the array index, but I don't know how to do that, since I need to write data several times a second.




You are going to run into big problems doing a while() loop inside an event. It will be slow, and that will carry back into the acquisition and eventually cause the streaming to stop. As a side note, you really should check snd_man outside the loop; otherwise you are wasting a bunch of time running through a loop that won't do anything.

I would use a time stamp as you suggested. First, declare a global variable "starttime" in a startup sequence:

global starttime = systime()

then, subset by time:

private curtime = front_timer.time[0]

private group = front_timer[starttime, curtime]

Then, to process, use search() instead of a while loop; it's much faster. I'm still using a while loop here, but it only iterates over each match, not every element:

private i = -1
while (snd_man)
   i = search(group > front_timer[1,1000], i+1)
   if (i == -1)
	  starttime = curtime + 0.00001
	  break // no more changes in this block
   endif
   // handle the match here, e.g. Func_Export(i)
endwhile

In my search() I use a trick: group is an array of the values from the last scan. Front_timer[1,1000], even though it's subsetted by index, is an array of the last 1000 values, skipping the most recent one. When we compare the two, the bigger array is truncated, presumably to the last scan, since that array will certainly be under 999 elements. It then compares them one to one, but since the right side is offset by one, each value gets compared against the reading just before it, which gives us exactly what we want.
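To make that concrete, here is a sketch with made-up readings (remember index 0 is the most recent point):

```
// Hypothetical data to illustrate the comparison:
//   group               = {7, 6, 6, 5, 5}   (values from the last scan,
//                                            newest first)
//   front_timer[1,1000] = {6, 6, 5, 5, 5}   (the same history shifted back
//                                            one point, truncated to match)
// The element-wise comparison gives:
//   group > front_timer[1,1000]   ->   {1, 0, 1, 0, 0}
// so successive search() calls return 0, then 2: the two points where
// the encoder count incremented.
private found = search(group > front_timer[1,1000], 0)
? "First change at index: " + found
```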


  • 3 weeks later...

Hello again,

I've used the script that you gave me, and it seems to work perfectly as long as the quadrature input is incremented very slowly. If I turn the encoder at even a moderate speed (10 RPM or so), quite a bit of data is skipped. I don't completely understand the script you gave me, but I suspect it is only identifying a single change in the encoder count per data packet. Is there a way I can catch every change in the encoder value? I thought about just reducing the streaming rate, but that kind of negates the purpose of using streaming mode.




OK, in the event tab of the channel that is set up for the quadrature encoder, the code that I have is as follows:

private curtime = Front_Timer0.time[0]
private group = Front_Timer0[starttime, curtime]
private i = -1
private j = -1

while (snd_man && FT)

   i = search(group > Front_Timer0[1,1000], i+1)
   j = search(group > Front_Timer0[1,1000], j+1)

   if (i == -1)
	  break
   endif
   Average(i)

   if (j == -1)
	  break
   endif
   Average(j)

endwhile
starttime = curtime + 0.00001

It is basically the same as the bit of code that you gave me, but I repeated the process using the 'j' variable to try to catch a second change in the encoder each time the event is executed. It does not seem to help very much. The 'FT' variable is used to conditionally execute the code. I also declare some global variables in a start-up sequence:

global starttime = systime()

global data[23] = 0

The Average() function averages the data, stores the averaged values to a variable, and then starts a sequence that writes those values to a file. I'll include the Average() function and the WriteToFile sequence code below.

The Average() function:

function Average(i)
   private numave = 40

   try
	  data[0] = Front_Timer0.time[i]
	  data[1] = Package_length
	  data[2] = Module_Contact
	  data[3] = Ht_cal
	  data[4] = mean(Analog_In_0[i,numave + i])
	  data[5] = mean(Analog_In_1[i,numave + i])
	  data[6] = mean(Analog_In_2[i,numave + i])
	  data[7] = mean(Analog_In_3[i,numave + i])
	  data[8] = mean(Analog_In_4[i,numave + i])
	  data[9] = mean(Analog_In_5[i,numave + i])
	  data[10] = mean(Analog_In_6[i,numave + i])
	  data[11] = mean(Analog_In_7[i,numave + i])
	  data[12] = mean(Analog_In_8[i,numave + i])
	  data[13] = mean(Analog_In_9[i,numave + i])
	  data[14] = mean(Analog_In_10[i,numave + i])
	  data[15] = mean(Analog_In_11[i,numave + i])
	  data[16] = mean(Analog_In_12[i,numave + i])
	  data[17] = mean(String_Pot[i,numave + i])
	  data[18] = mean(Inclinometer[i,numave + i])
	  data[19] = mean(Supply_Voltage[i,numave + i])
	  data[20] = Front_Timer0[i]
	  data[21] = Rear_Timer0[i]
	  data[22] = Wheel_diam
	  beginseq(WriteToFile) // start the sequence that writes the values to the file
   catch()
	  ?"ERROR in Average function! i = " + i
	  ?"i + numave = " + (i+numave)
   endcatch

The WriteToFile sequence:

   private string DataLine= DoubletoStr(data[0]) +\
	  "," + DoubletoStr(data[1]) +\
	  "," + DoubletoStr(data[2]) +\
	  "," + DoubletoStr(data[3]) +\
	  "," + DoubletoStr(data[4]) +\
	  "," + DoubletoStr(data[5]) +\
	  "," + DoubletoStr(data[6]) +\
	  "," + DoubletoStr(data[7]) +\
	  "," + DoubletoStr(data[8]) +\
	  "," + DoubletoStr(data[9]) +\
	  "," + DoubletoStr(data[10]) +\
	  "," + DoubletoStr(data[11]) +\
	  "," + DoubletoStr(data[12]) +\
	  "," + DoubletoStr(data[13]) +\
	  "," + DoubletoStr(data[14]) +\
	  "," + DoubletoStr(data[15]) +\
	  "," + DoubletoStr(data[16]) +\
	  "," + DoubletoStr(data[17]) +\
	  "," + DoubletoStr(data[18]) +\
	  "," + DoubletoStr(data[19]) +\
	  "," + DoubletoStr(data[20]) +\
	  "," + DoubletoStr(data[21]) +\
	  "," + DoubletoStr(data[22]) 

   File.Write(handle1, DataLine)

In case I wasn't clear initially, my problem is that as the rate at which the encoder spins increases, the amount of data that is skipped in the output file increases. Thanks.


OK, first, I'd change it so that instead of doing Front_Timer0[1,1000], you do group[1,1000].

Next, dump the duplicate. You're chasing the symptom, not the problem.

Next, I'm unclear about your logging. You appear to be logging the mean of the next 41 data points after the encoder change.

Finally, I'd simplify to figure out where the problem is. Instead of calling Average(), print out the index and more importantly the time:

? "Index: " + i

? "Time Stamp: " + formatDateTime("%c",group.time)

Then run it for a short time and see what you get. You might also add a:

? "Block received"

just before your while() loop.



This topic is now archived and is closed to further replies.