Logging And Averaging


I would like to be able to log average values (data sampled every second, averaged over 60 seconds) so that the log interval is one minute.  I would ALSO like to keep the original one-second samples available going back, say, 10 minutes, so that in the event of a failure I can see what led up to it.  Is this possible without having duplicate channels for each point being sampled?


Yup.  What you do is set up the channel to read at your desired fast data rate.  Then create a logging set for your averaged data.  On the Details page select "Fixed Interval", then to the right set the Interval to 60 and make sure Average is selected.  That's it.  If you also want to log the 1 second data, create another logging set with "All Data Points" selected.

For display, do the same thing.  Keep the channel data coming in fast.  To display the most recent 1 second reading, it's just myChannel[0].  To display the average of the last 60 seconds, it's mean(myChannel[systime(), systime()-60]), or with 1 second data, mean(myChannel[0,59]).  To graph the average, use the appropriate boxcar() or smooth() function in your graph expression.
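For example, a graph trace showing the rolling one-minute average over the last ten minutes might use an expression along these lines (a sketch only; the channel name is assumed, and you should check the boxcar() argument order against your DAQFactory version's documentation):

// Y expression for a graph trace: 60-point boxcar average
// of the last 600 one-second samples (myChannel is hypothetical)
boxcar(myChannel[systime(), systime()-600], 60)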


Great, thanks for the reply.


That being said, is there a way to create a logging set with a maximum file size of 600 rows that continuously overwrites on a FIFO basis?  I'm trying to see if I can have an event trigger the capture of a log file of the last "x" seconds for all channels.  The simplest way I can think of would be to have a log file continuously logging all channels, then save that file upon an event (i.e. myTemp[0] > 300).  That way the last ten minutes of once-per-second sampling would be captured, which would help in determining why myTemp[0] went above 300.




Three choices:


1) use an Export set where the Expression for each row has [0,599] in it,




then run the export set when you want to log the last 600 rows.


2) if you want this continuously, I suppose you could just keep rerunning the export set after setting the "Existing File" parameter to "Overwrite".  You'd just need to make sure to wait for the export set to stop running before running it again; just look at the export.myExport.running flag.  If you try to run it while it's still busy, the command will be ignored.  Of course, it should export very quickly.
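As a rough sketch, a sequence that reruns the export set only when the previous run has finished might look like this (the export set name myExport is an assumption, and the loop interval is arbitrary):

// rerun the export set continuously, but never while it is still busy
while(1)
   if (!export.myExport.running)
      beginexport(myExport)
   endif
   delay(1)
endwhile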


3) you could quickly script it up yourself using a combination of file.open(), file.writeDelim() and file.close().  With writeDelim() you can dump a channel to a file in a single command.  If you want to dump multiple channels (or have a time stamp, come to think of it), you'd have to build up an array first, but that is easy too:


private out

// column 0: time stamps, columns 1 and 2: the last 600 readings
out[][0] = myChannel.time[0,599]

out[][1] = myChannel[0,599]

out[][2] = myOtherChannel[0,599]

// open for overwrite, then write the array as comma-delimited rows
private handle = file.open("c:\data\myFile.csv",0,1,0,1)

file.writeDelim(handle, out, ",", chr(10))

file.close(handle)



The Export set is easier, but the file. functions are, as always, more flexible.


So would I simply put an event similar to this in my channel's Event box:


  if(myTemp[0] > 300)




and, following your #1 suggestion, have an export set named myExport defined with channel expressions myChan1[0,599], myChan2[0,599], etc.?


