Missing data and duplicate lines in CSV log-file


Martin


We need some help and/or advice.

We have DAQFactory Pro (Release 17.1, build 2309) with a LabJack T7 (firmware 1.0255) logging three experiments.

We are running it on Windows 7 Pro SP1, on an Intel i7 with 4 GB of RAM. The machine is fully patched, with automatic updates enabled and sleep disabled. The LabJacks are hard-wired back to the PC via Ethernet.

Each experiment consists of two analog inputs, a K-type thermocouple and a Hall-effect current sensor, plus one digital input counting pulses from a Geiger detector (counts per second). We have two additional channels monitoring background radiation counts and ambient temperature for all three experiments.

We are using a one-second time base to log all three experiments at the same time.
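As a rough illustration of that polling pattern (this is LabJack's LJM Python library rather than our DAQFactory setup, and the channel names AIN0/AIN1 and the file name below are placeholders only):

    import csv
    import time
    from labjack import ljm   # pip install labjack-ljm

    # Open the first T7 found over Ethernet (placeholder identifier).
    handle = ljm.openS("T7", "ETHERNET", "ANY")
    names = ["AIN0", "AIN1"]   # e.g. thermocouple input and current-sensor input

    with open("experiment_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp"] + names)
        while True:
            # Poll both analog inputs once per second and append a CSV row.
            values = ljm.eReadNames(handle, len(names), names)
            writer.writerow([time.time()] + values)
            f.flush()
            time.sleep(1)   # one-second time base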

We have tried many configurations now, from logging at "Fixed Interval" to "All Data Points", with the "Align Threshold" set to 0.1, 0.7, and even 0.99. Each time, DAQFactory and our T7 run well for perhaps a few hours, and then the log starts showing duplicated lines, blank lines, lines with only one or two fields, and fields randomly captured or missed. There is no consistent pattern.

Then, for no apparent reason, it will start logging perfectly again.

One thing that strikes me as odd: despite setting the current column in the CSV file to 2 significant figures, when our current sensors detect a small negative signal DAQFactory insists on logging it to 5 significant figures. Why is this?

We need to log these over a 24-hour period.

I have attached the latest CSV file from last night along with the CTL file.

The log file starts going strange at lines 15719, 15731, 15738-18453, 41930, and 41948-44665, and is then fine down to line 60002.
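(For anyone checking their own logs, a quick Python scan like the rough sketch below will flag the same kinds of rows; the expected field count of 9 is an assumption and should be adjusted to your own column layout.)

    import csv

    EXPECTED_FIELDS = 9   # assumed: time stamp plus eight data columns; adjust to suit

    previous = None
    with open("Master_180806_160103.CSV", newline="") as f:
        for lineno, row in enumerate(csv.reader(f), start=1):
            filled = [cell for cell in row if cell.strip()]
            if not filled:
                print(f"line {lineno}: blank")
            elif len(filled) < EXPECTED_FIELDS:
                print(f"line {lineno}: only {len(filled)} field(s)")
            elif row == previous:
                print(f"line {lineno}: duplicate of the previous line")
            previous = row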


Master_180806_160103.CSV

Ecalox_LJ_Live_06_08_2018_d_2_Added_Channel.ctl


It looks like the alignment of the data is just off. You probably want an alignment threshold of 0.9, or maybe even 1. It's hard to say without a time stamp that includes milliseconds. You might try a run with DAQFactory time instead of custom time.
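To make the threshold's effect concrete, here is a simplified sketch (ordinary Python, not DAQFactory's actual logging code) of how points from two channels only land on the same row when their time stamps fall within the threshold of each other; with too small a threshold you get partial rows instead:

    def align(ch_a, ch_b, threshold):
        rows = []
        i = j = 0
        while i < len(ch_a) and j < len(ch_b):
            t_a, v_a = ch_a[i]
            t_b, v_b = ch_b[j]
            if abs(t_a - t_b) <= threshold:     # close enough: merge into one complete row
                rows.append((min(t_a, t_b), v_a, v_b))
                i += 1
                j += 1
            elif t_a < t_b:                     # too far apart: partial row, one column blank
                rows.append((t_a, v_a, None))
                i += 1
            else:
                rows.append((t_b, None, v_b))
                j += 1
        return rows

    # Channel B lags channel A by 0.4 s.
    a = [(0.0, 20.1), (1.0, 20.2), (2.0, 20.3)]
    b = [(0.4, 1.01), (1.4, 1.02), (2.4, 1.03)]
    print(align(a, b, 0.1))   # partial rows: None wherever a field is missed
    print(align(a, b, 0.9))   # every row complete

With a threshold of 0.1 the two channels never merge and every row has a gap; at 0.9 they always do.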

As for sig figs, it is important to understand that significant figures are different from what we call "Precision" in Variable Value components. Precision is the number of digits displayed to the right of the decimal place. Sig figs are just that, significant figures, though DAQFactory automatically adds 1 to the sig figs you specify, basically to avoid having your least significant figure rounded away. So, when you specify 2 sig figs, you'll actually get three significant digits. So:

123.4567 becomes 123

but: 1234567 becomes 1230000

and: 0.000123456 becomes 0.000123

That last one shows more digits because of the leading 0's, but that's needed: the leading zeros aren't significant, so they don't count. Note that 102.345 becomes 102; that zero is significant because it has a non-zero digit to its left. Likewise, 123000.567 becomes 123000; those trailing zeros aren't significant, but they are required to actually write 123 thousand.
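As a worked illustration in ordinary Python (not DAQFactory's own formatting code), rounding to significant figures looks like this; passing 3 here mimics the behaviour you see when you set 2 in the logging set:

    import math

    def to_sig_figs(x, figs):
        # Round x to `figs` significant figures.
        if x == 0:
            return 0.0
        exponent = math.floor(math.log10(abs(x)))
        return round(x, figs - 1 - exponent)

    # Setting 2 sig figs in DAQFactory behaves like passing 3 here.
    for value in (123.4567, 1234567, 0.000123456, 102.345, 123000.567, -0.012345):
        print(value, "->", to_sig_figs(value, 3))

The last value also shows why a small negative current still comes out with several decimal places: the leading zeros after the decimal point don't count as significant figures.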


3 weeks later...

Thanks for the guidance, Guru.

Increasing the threshold to 0.9 and then to 1 helped a bit; the data drop-out was reduced. However, what really made the biggest difference was taking the data acquisition off WiFi and hard-wiring the lot back to a hub with Ethernet.

Even though the WiFi path was relatively short, with only one wall between us and the wireless router, it was enough, when logging all data points, to create missed cells of data and the dreaded duplicate lines.

So, with all the tweaking guidance from yourselves in Support, and then hard-wiring the system, we now have very clean data.

I hope this pair of postings proves useful to others as a pointer to a possible cause of data errors.

Thanks again.

Regards - Martin


Yes, that is good to know. WiFi is notoriously finicky, I find, which is why I never understood the push for it in industrial settings. We have a commercial-level WiFi system here and it still drops out for very short periods. It's almost unnoticeable, but if you were streaming from a LabJack I can see how it would become an issue. Thank you for sharing!

