Hi,
Thanks again for the prompt reply.
I believe I have traced it back to my startup sequence; as soon as I run it, the timing lag issues begin. I have pasted the code below:
global Offset = 0
using("device.labjack.")
include("c:\program files (x86)\labjack\drivers\labjackud.h")
AddRequest(0, LJ_ioPUT_CONFIG, LJ_chAIN_RESOLUTION, 12, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 0, LJ_rgBIP0.1V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 10, LJ_rgBIP5V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 11, LJ_rgBIP5V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 2, LJ_rgBIP10V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 3, LJ_rgBIP10V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 4, LJ_rgBIP5V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 5, LJ_rgBIP5V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 6, LJ_rgBIP10V, 0, 0)
AddRequest(0, LJ_ioPUT_AIN_RANGE, 7, LJ_rgBIP10V, 0, 0)
GoOne(0)
What I am trying to do here is set the range of the channels and specify certain ones as differential (as in Ch0 is the + and Ch1 is the -).
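For reference, my understanding from the UD documentation is that on the U6 the differential pairing is requested at read time (the negative channel goes in the x1 parameter of the request), rather than in the range request. This is just a sketch of the pattern as I understand it, and the variable name Reading is something I made up, so the exact calls may be off:

AddRequest(0, LJ_ioGET_AIN_DIFF, 0, 0, 1, 0)   // read AIN0 (+) minus AIN1 (-)
GoOne(0)
GetResult(0, LJ_ioGET_AIN_DIFF, 0, @Reading)   // fetch the differential voltage

If that is the right approach, please let me know, since my range requests above never actually name the negative channels.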
It seems the issue is with the following line:
AddRequest(0, LJ_ioPUT_CONFIG, LJ_chAIN_RESOLUTION, 12, 0, 0)
Can you spot anything wrong? I am using a LabJack U6-Pro. Maybe the line isn't needed, or I am setting the resolution wrong.
Thanks as always!