Logging more than 10 data points per second (LabJack T7)


Andreas


Hello,

I'm using a LabJack T7 in stream mode to collect data from analogue inputs. Streaming works fine using sequences, and I receive 10000 data points per second in my "Test" channels with no problems. I can see this data in the channel tables, the channel graph and the page graph. There really are 10000 points per second.

However, I am unable to log more than exactly (!) 10 data points per second using a logging set in ASCII mode. The Users Guide says in chapter 9.2 that 10000 points per second should be achievable with this method. It doesn't matter how many channels I try to log or what other settings I use (Mode, Time Format, Interval, Threshold, ...). 10 points per second is the maximum. As the logged data points are equally spaced at 0.1 seconds, I suspect an artificial limit somewhere.


Hello,

I found the problem with "logging" myself. It was not the alignment threshold or the logging functionality per se (I had tried different values before).

The problem actually lies in the way the stream data is read and put into the channel: in a "while" loop, every 1/10th of a second, an array of data (for 10000 values per second, the array is 1000 values big) is read from the LJ stream and put into the channel using the "Channelname.AddValue(Array)" method. It appears that the logging functionality only writes/logs a single value when data is written to the channel history. So in this case, only every 1/10th of a second, a single value is written to the log.

To prove this, I tried a loop every 1/100th of a second and received a maximum of 100 values per second in the log.

I based this stream-reading method on the application example from the Users Guide (chapter 17.9). It appears to me that the logging functionality is incompatible with this streaming method of putting arrays into the channel. At least I have not managed to write more than one value per "logging event".
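For illustration, here is a hedged sketch of what would be needed to get one logging event per value: feeding the array to the channel one element at a time. (Far too slow at 10000 values per second, so this is not a real solution.)

private i = 0
while (i < numrows(data))
   channelA.AddValue(data[i])   // one AddValue() call per value = one logging event per value
   i++
endwhile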

I managed to write a sequence with a loop which uses an Export Set to write data (10000 points) every second. It basically works. However, with this I get "alignment" issues (time gaps), as the loop for the Export Set is not exactly 1 second long.

So the sequence with the Export Set loop is a step forward, but I am still looking for a way to continuously (without gaps) log about 10000 values per second from my LabJack T7 stream to an ASCII file.
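Roughly, the Export Set loop looks like this (sketch only; "myExport" is an illustrative name for an export set assumed to be configured elsewhere):

while (1)
   beginexport(myExport)   // writes the current 10000 points
   delay(1)                // the loop is not exactly 1 s long, hence the time gaps
endwhile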


We will be addressing streaming with the T series soon.  In the meantime, you will probably need to script the logging into your streaming sequence.  The original sample is this:

private scanRate = 10000
private scansPerRead = 1000  // assumed value (10 reads/s); scansPerRead was undeclared in the original sample
device.labjackM.LJM_eStreamStart("ANY", {"AIN0", "AIN1"}, scanRate, scansPerRead, 1)
private dataIn
private data
private st = systime()
global backlog
while(1)
   dataIn = device.labjackM.LJM_eStreamRead("ANY")
   data = insertTime(dataIn.data.AIN0, st, 1 / scanRate)
   channelA.AddValue(data)
   data = insertTime(dataIn.data.AIN1, st, 1 / scanRate)
   channelB.addValue(data)
   st += scansPerRead / scanRate
   backlog = dataIn.ljmscanBacklog
endwhile

You'd want to make it into something like this:

private scanRate = 10000
private scansPerRead = 1000  // assumed value (10 reads/s); scansPerRead was undeclared in the original sample
device.labjackM.LJM_eStreamStart("ANY", {"AIN0", "AIN1"}, scanRate, scansPerRead, 1)
private dataIn
private data
private st = systime()
global backlog
global string streamFileName = "c:\myData\streamData.csv"
private string curFileName = ""
private handle = 0
while(1)
   dataIn = device.labjackM.LJM_eStreamRead("ANY")
   data = insertTime(dataIn.data.AIN0, st, 1 / scanRate)
   channelA.AddValue(data)
   data = insertTime(dataIn.data.AIN1, st, 1 / scanRate)
   channelB.addValue(data)
   st += scansPerRead / scanRate
   backlog = dataIn.ljmscanBacklog
   // setup logging:
   // check for change in file name and open file
   if (streamFileName != curFileName)
      if (handle)
         file.close(handle)
      endif
      handle = file.open(streamFileName, 0, 1, 1, 1)
      curFileName = streamFileName
   endif
   data[][0] = seqadd(st, 1/scanRate, numrows(dataIn.data.AIN0))
   data[][1] = dataIn.data.AIN0
   data[][2] = dataIn.data.AIN1
   file.writedelim(handle,data, ",", chr(10))  
endwhile

The only trick is that if you stop this sequence (as you do with stopStream), the file will still be open.  Use file.closeAll() to close all the open handles.  You could also make the file handle and the file opening global, and then just have the sequence write to the handle when it is non-zero.
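A sketch of that global-handle variant (the sequence split and file name are illustrative):

// in a startup sequence:
global handle = 0
handle = file.open("c:\myData\streamData.csv", 0, 1, 1, 1)

// in the streaming loop, write only when the handle is valid:
if (handle)
   file.writedelim(handle, data, ",", chr(10))
endif

// when stopping (e.g. from stopStream):
file.closeAll()
handle = 0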

  • 2 weeks later...

Hello,

I finally had some time to test that logging code. It basically works. I added some features (like start/stop from the page, choosing the file destination, adjusting scans per second, ... from the page). However, there are still some points/issues:

- IMO, this line:

st += scansPerRead / scanRate

should come after the logging part; otherwise the time stamps of the logged data and the channel data will always be "scansPerRead/scanRate" apart. However, when only the logged data and its time deltas are relevant, this does not matter.

- The whole thing is still quite unwieldy. Everything must be (hard-)coded; e.g. there is again no conversion before logging. One must code it, or open the data and add a conversion afterwards. Native streaming and corresponding logging support would be much appreciated.

- I have not managed to get the "file.writedelim()" command to write the time column with all six decimals (double precision, µs). The time format of the variables up to this point (systime(), the st variable and the data[][0] array) is double precision with six decimals. In the resulting log file, however, there are always only 3 decimals (ms). With only 3 decimals, there are 10 identical times for ten data points when logging at full speed (10000 scans/s). Probably this is just a formatting issue, but I could not find anything in the Users Guide.
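If it is indeed a formatting issue, one possible workaround (untested sketch, assuming DAQFactory's sprintf-style Format() function and file.Write()) is to format each row manually so the time column keeps all six decimals:

private string line
private i = 0
while (i < numrows(data))
   // %.6f keeps microsecond precision on the time column
   line = Format("%.6f,%g,%g", data[i][0], data[i][1], data[i][2]) + chr(10)
   file.write(handle, line)
   i++
endwhile

Writing line by line like this is slower than file.writedelim(), so it may need checking against the 10000 scans/s rate.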
