

I'm using a LabJack U3 to read a Hall sensor in my DAQFactory (non-streaming) application. The pulse counter itself works just fine, but the time difference, dt, that I resolve between two pulses is only stable at low speeds. At higher speeds (the Hall sensor monitors a DC motor shaft) the dt becomes pretty shaky, ranging from 0.342 s up to 0.387 s, even though the motor frequency is stable (~2.7 Hz at the highest speed), so it's not really a high speed. Any ideas on how to improve the dt stability would be much appreciated, preferably without switching to streaming or using averaging...

My sequence for the speed calculation looks like:

while (1)
  private theTime = Wear_sensor_mm.Time[0]  // Wear_sensor_mm is an FIO channel unrelated to the motor speed; I "borrow" its time stamp
  distance.AddValue(InsertTime(travel, theTime, 0))  // distance is just a test channel that monitors the total travel of the motor
  if (temp_rev_counter < rev_counter)
    if (rev_counter % 2 == 1)  // delta_T1 is stamped on every other turn of the motor, delta_T2 on the turns in between
      speed.AddValue(InsertTime(0.19468 / Abs(delta_T2 - delta_T1), theTime, 0))  // speed calculation: stable at really slow speeds, but (delta_T2 - delta_T1) is shaky at "higher" speeds
    endif
  endif
  // Hall sensor code
  AddRequest(ID, LJ_ioGET_COUNTER, 0, 0, 0, 0)
  delay(1/sampling_frequency)  // sampling_frequency is set to 40Hz
endwhile
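For reference, the core of that speed calculation can be modeled in plain Python (a hedged sketch, not DAQFactory code: it assumes 0.19468 is the travel per motor revolution in metres, and that delta_T1/delta_T2 are the timestamps of two consecutive revolutions, so their difference is one revolution period):

```python
# Hypothetical Python model of the sequence's speed calculation.
TRAVEL_PER_REV = 0.19468  # assumed units: metres per revolution

def speed_from_stamps(delta_t1: float, delta_t2: float) -> float:
    """Linear speed (m/s) from the timestamps of two consecutive revolutions.

    Mirrors 0.19468 / Abs(delta_T2 - delta_T1) from the sequence above.
    """
    return TRAVEL_PER_REV / abs(delta_t2 - delta_t1)

# A ~2.7 Hz shaft has a ~0.370 s revolution period:
print(speed_from_stamps(10.000, 10.370))  # ~0.526 m/s
```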



I don't really get what you are doing.  It seems like you are reading the counter at some interval and then making time measurements based on that.  The problem is that the minimum resolvable time interval is going to be 1/sampling_frequency, which is 25 ms based on your comments.  Throw in whatever slop you have between the reading of the FIO that you are using for time (and its minimum sampling rate) and the sequence loop, and I can easily see 45 ms of error in your measurement.  You need to do a few things:

1) set the sampling frequency higher to reduce slop with the counter.  No matter what you do elsewhere, if you only sample at an interval of X milliseconds, you are going to get noise of at least X milliseconds in your delta T, probably more like 2 * X.  This is because the counter might increment 1 microsecond after a reading, but you aren't reading again for another X milliseconds, so the timestamp is off by nearly X milliseconds.
2) use the timestamp from rev_counter, which is read inside the loop, and don't rely on times taken from elsewhere
3) replace delay() with wait() so your loop interval is actually 1/sampling_frequency and not something longer.
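Point 1 can be demonstrated with a quick simulation (a sketch in Python, not DAQFactory code: it assumes a perfectly steady 2.7 Hz pulse train polled at 40 Hz, with each counter increment timestamped at the first poll at or after it):

```python
import math

# Assumed setup, matching the numbers in the thread:
true_period = 1 / 2.7   # ~0.370 s between pulses from a steady 2.7 Hz shaft
poll_dt = 1 / 40.0      # 25 ms polling interval (sampling_frequency = 40 Hz)

def quantize(t, step):
    """Timestamp assigned to an event: the first poll at or after it."""
    return math.ceil(t / step) * step

pulses = [k * true_period for k in range(1, 50)]        # true pulse times
stamps = [quantize(t, poll_dt) for t in pulses]         # polled timestamps
periods = [b - a for a, b in zip(stamps, stamps[1:])]   # measured dt values

# The measured periods jitter by roughly one poll interval even though
# the true period is perfectly constant:
print(min(periods), max(periods))
```

With these numbers the measured dt spreads over about one 25 ms poll interval around the true 0.370 s period, the same order as the 0.342 to 0.387 s jitter reported above; slop between the borrowed FIO timestamp and the loop would widen it further.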




This topic is now archived and is closed to further replies.