Log Averages As Samples Are Taken


mkelly


Background:

I'm running an experiment that gets sampled every 10 minutes to 1 hour, depending on what we're trying to accomplish that day.

I have DAQF logging all the data being displayed.

Problem:

I want to output an average for each channel over each 10, 20, or 60 minute period, depending on when a "sample" is taken. The sample number is an incrementing value that we log in DAQF to show when a sample was taken.

Current Progress:

I figured out how to use the built-in logging function to log the average every 10 minutes. It appears this needs to be started at the same time I start my experiment in order to line up with my timing.

The issues I have with this are that the interval is fixed unless I manually modify it every time, and that I have to start logging at the same moment I start my experiment or else the data will be slightly off.

Neither of these is a huge issue, but it would be beneficial for the average to represent the whole sample period instead of half or 1/6 of it. I already have the complete log set; this is merely a means of logging the average.

Ideas:

I know there's a GetTime() function that can pull the time a sample was taken, though I'm not sure GetTime can work in the opposite direction.
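For example, I can pull the timestamp of the most recent sample with something like this (just a sketch; Sample_Number is my incrementing counter channel):

// GetTime() returns the timestamp attached to a value
private lastSampleTime = GetTime(Sample_Number[0])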

My idea in pseudocode:

if (sample number increases)
   get time of last sample number
   average points between last sample number and current sample
   log calculated averages
endif

The flaw I see is that the computer would have to somehow determine how many data points on each channel correspond to that span of time, which is what I meant by GetTime working in the opposite direction.

Is there a simple way to do this, or would it be best to just do the standard logging procedure for averages?


Let me just make sure I understand: you are sampling an input channel (call it myChannel), say every 10 seconds or some relatively fast interval, and you want to be able to log that data every 10, 20, or 60 minutes, where the value logged is the average of the last 10, 20, or 60 minutes' worth of readings (basically a boxcar average). You are only going to log at one of these intervals in each experiment. If this is correct, then this is what you do:

1) create a new channel to hold the average. I'm going to call it myAverage. It should be a Test channel, I/O type A/D, Timing = 0.

2) create an autostart sequence to initialize some variables:

global averageTime = 10   // averaging interval, in minutes
global lastTime = systime()   // start of the current averaging window
global expRunning = 0   // 1 while an experiment is running, 0 otherwise

3) in the event for myChannel put this:


if (expRunning && ((myChannel.Time[0] - lastTime) > averageTime*60))
   // average every reading in the window that just completed; the -0.001
   // keeps the point right at the boundary out of this window
   private average = mean(myChannel[lastTime, lastTime + averageTime*60 - 0.001])
   // stamp the average with the end of the window, then record it
   average.time = lastTime + averageTime*60
   myAverage.addValue(average)
   // advance to the next window
   lastTime += averageTime*60
endif


4) now log myAverage

5) using screen controls, you can change the averageTime variable.
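For example (just a sketch; the 20 is arbitrary), a button's quick sequence could simply do:

// switch to 20 minute averaging; takes effect the next time the event runs
averageTime = 20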

6) when you want to start an experiment, run this script:

lastTime = systime()
expRunning = 1

7) when you want to stop the experiment do:

expRunning = 0

to keep myAverage from accumulating new values.
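If it's more convenient, those two scripts can live in a pair of short sequences triggered from buttons on your page (the sequence names here are just examples):

// StartExperiment: reset the averaging window and enable averaging
lastTime = systime()
expRunning = 1

// StopExperiment: disable averaging
expRunning = 0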


I should have been clearer on my setup. Let me try and expand a bit.

I have my main data logging recording 20 channels' worth of information approximately once every 2 seconds.

While I'm observing the experiment, I have a sheet of paper in front of me on which I write down 11 channels' worth of information every time we take a sample. I need this kind of document to show the boss, but it's not crucial as long as the data is being logged.

When I take a "sample" I'm extracting liquid from my system. I use a sequence to open and close a series of solenoid valves, and I have a quick sequence that increments the sample number every time the button is pressed (sketched after the example below).

This means we'd see samples at times like:

Sample_Number[1] @ 10:40:15 = 1
Sample_Number[0] @ 11:20:45 = 2
etc.
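That quick sequence is essentially just this (a minimal sketch of what mine does; it assumes Sample_Number is a Test channel so it can be written to directly):

// bump the sample counter; AddValue() stamps the new value with the current time
Sample_Number.AddValue(Sample_Number[0] + 1)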

I only have 12 channels remaining on my license, so I'd favor a method that doesn't involve adding channels. I need the flexibility to expand in the future, which could mean more channels being used for thermocouples and the like.

Now, looking at the code you provided, it does seem possible to take an average between two timestamps, if I understand your code correctly:

if (Sample_Number[0] > 0)   // only runs once the first sample has been taken (numbering starts at 1)
   // average the thermocouple readings between the previous sample's timestamp and the current one
   private average = mean(Reactor_Thermocouple[GetTime(Sample_Number[1]), GetTime(Sample_Number[0])])
   Reactor_Thermocouple_Average.addValue(average)
endif

Would this work, or does the time stamping not work like this?


