Search the Community

Showing results for tags 'timing'.



Found 3 results

  1. We'd like to dynamically select which Channels are in use during testing sessions. Switching Channel.Timing between 0.00 and 0.01 is reported correctly in the Channel Table, but Variable Values derived from Channels, for instance, do not track the change (by showing or hiding the beloved Red X). Stopping the Document and reloading it after the .Timing is changed DOES produce the desired result. We'd like to dynamically re-select Channels during the same run of the Document, as well as change some .Expressions, .LogFileName, and the like, to reflect the newly selected Channel set without having to manually stop and reload the Document. Is this possible? (See the first sketch after these results.)
  2. The Overall Question: Is there a difference in how DAQFactory or the LabJack hardware assigns timestamps during streaming vs. command/response mode? For example, if I were to record a single 20 Hz square wave input on two different LabJacks, one streaming data and the other using C&R, with both LabJacks collecting data at the same frequency (250 Hz), would I not only see the same-frequency square wave but also have the square waves line up fairly closely with each other (within 1/250 s) if I plotted them using the data and respective timestamps provided by each LabJack?

     The Context: I'm currently streaming data from one LabJack U6 (call it LJ1_stream) and using command/response to collect data on another U6 (LJ2_CR). The reasons are that (a) I need 6 timers, hence 2 LabJacks, and (b) one of the timers records a rotary quadrature encoder that rotates too quickly for streaming (due to the 1 edge per 33 µs processing limitation). Therefore, 4 rotary encoders using duty cycle timing (mode 4) are read by LJ1_stream, and 2 quadrature rotary encoders using quadrature timing (mode 8) are read by LJ2_CR. I've set the streaming to 250 Hz and the C&R Timing to 0.004 seconds (which I realize is faster than recommended for C&R and could be part of the issue, as discussed further below) to also provide 250 Hz measurement. A common 20 Hz square wave generated by a function generator is also recorded on both LabJacks, as well as an experiment start signal and a valve command signal, which both start low and go high when the respective buttons are pushed. Data is collected and exported (not logged) using expressions of the form Channelx_Data[Experiment_startTime,Experiment_endTime], where Experiment_startTime = systime() at the beginning of the experiment based on a button press, and similarly for Experiment_endTime. The data from LJ1_stream and LJ2_CR are exported as two separate export sets, with all data points (aligned) and zero align threshold. Channel histories are 3000 for LJ1_stream and 4000 for LJ2_CR, since my experiments only run for ~2-3 seconds at a time. LJ1_stream, in addition to the 8 channels occupied by the four timers and the three common digital input signal channels, also has 5 analog signal channels and one other digital input, for a total of 17 streamed channels. According to the 50,000 samples/s maximum stream rate, 17 channels should correspond to a maximum scan rate of 2941 Hz, or a minimum of 0.00034 seconds between scans, so running at 250 Hz should be slow enough. LJ2_CR has the 4 channels for the 2 quadrature timers and the three common digital inputs.

     The Detailed Problem: There are experiments where the common signals are recorded at very different timestamps on LJ1_stream compared to LJ2_CR. For example, in one experiment the experiment start signal went high at -0.193 seconds on LJ2_CR vs. at 0 for LJ1_stream (using the LJ1_stream experiment start signal time as the basis for converting the DAQFactory time into usable time for both LJ1_stream and LJ2_CR). There are other experiments where the effect is less pronounced but still off by more than 2 timesteps (e.g. the experiment start signal goes high at +0.009 s on LJ2_CR but is recorded at 0 for LJ1_stream).

     Some Thoughts: 1) Is running LJ2_CR at 250 Hz too fast? According to the LabJack specifications (sec. 3.1 and 3.2 of the LabJack U6 manual), one complete scan of 6 channels (2 quadrature timers and 2 digital inputs) in C&R mode would take about 8.5 ms, which corresponds to a maximum record rate of roughly 117 Hz. However, in examining the quadrature encoder data recorded at 250 Hz using C&R on LJ2_CR, there don't appear to be any major issues other than some occasional data dropouts. The data still looks fairly smooth and uninterrupted. Is it possible that the timer channels are recorded more accurately because they're interrupt driven, rather than the digital signals (are they polled?)? 2) Is this an issue with using export rather than logging? Any other suggestions or thoughts as to why the timestamps aren't lining up would be greatly appreciated, and I'm happy to clarify anything, as I realize this is a very specific and detailed question that may not be as applicable to the general public. (The second sketch after these results works through the scan-rate arithmetic and the start-signal re-basing.)
  3. crossbowjapan

     Logging Frequency And Timing Value

     Hi, we have been using the attached script with a U3-HV and DAQFactory Express over the years and it used to work well. Recently we seem to be unable to set low-frequency logging with this script. It used to be that, by setting "Timing" in the Channel Table View, we could change the logging frequency to lower values: while the default is 1/1000 sec, changing the Timing from 0.00 to, say, 120.00 (seconds) would make it sample every 2 minutes. Recently, however, this change does not take effect when Apply is clicked, and it keeps sampling and logging at the 1/1000 sec interval. Could anyone help us find what in the script is wrong and what should be done to enable slow logging, such as every 6000 seconds, using this script (or a modification thereof)? Thanks - CN u3hv_4ch.ctl (See the third sketch after these results.)
    Hi We have been using the attached script to work with U3-HV with DAQ Factory Express over the years and it used to work well. Recently we seem to be unable to set low frequency logging with this script. In the script, it used to be, by setting "Timing" in the Channel Table View, we used to change logging frequency to lower values. (While the default is 1/1000 sec), by changing the Timing from 0.00 to say 120.00 (second) the sampling would be every 2 minutes. However, recently we see that this change does not happen when Apply is clicked, and keeps sampling and logging at 1/1000 sec interval. Could anyone help us where in the script is wrong and what should be done to enable slow logging like every 6000 seconds using this script (or modification thereof)? Thanks - CN u3hv_4ch.ctl