Simple Way To Track Total Volume Over Time?


EA1


I am running into problems trying to come up with a simple way to record the total volume of air sampled from start time to stop time. I need similar functions to collect samples for a specified time, for a specified volume, or manually. Right now I am trying to debug the timed sample process. After initiating the sample, I record start_time = systime(), then call the function that collects samples for the time specified in an edit box (= sample_time). The problem seems to be related to the way I calculate the time increment. Run_time is a test channel. The initial variables have been previously defined as global variables.

 

The code is:

//This function takes a sample for a specified time period (= sample_time, set on Page 1)
function time_sample(start_time)
del_time=0 //defines and sets the value of time increment to 0
avg_flow=0
max_flow=0
min_flow=0 //calculate initial volume.  Here the volume=scc, time=sec, flow=std cc/sec
end_time=0
tot_time=0
delay(2)
run_time=systime()-start_time
private xvolume = run_time*mean(flow_rate_smp[systime(),start_time])//flow_rate_smp is an analog input from a flow meter.    

while (run_time<sample_time)
   delay(1)
   del_time=run_time[1]-run_time[0] //del_time is time increment since last checked
   del_volume=del_time*flow_rate_smp[0] //del_volume calculates the volume increment from current flow rate and del_time
   xvolume = xvolume + del_volume //calculates accumulated volume
   delay(1)
   run_time=systime()-start_time//keeps track of total sampling time
   continue
 endwhile  
endtime=systime()//records the end time of the sample
SET_POSN_B() //change 4 port to bypass when runtime>set_time
   total_volume=xvolume //records total calculated volume
   tot_time=run_time//records total_time of sample
   avg_flow=mean(flow_rate_smp[endtime,start_time])//calc average flow over sample time
   max_flow=max(flow_rate_smp[endtime,start_time])//calc max flow over sample time
   min_flow=min(flow_rate_smp[endtime,start_time])//calc min flow over sample time
   beginExport(Sample_info)//record the sample info
Return

 

It is likely obvious to you what I am doing wrong.  Can you help?

 

Thanks.


I didn't take time to step through your logic, so there might be other issues, but one is that you're mixing syntax when addressing run_time, for example.

 

DAQFactory supports scalar variables as well as arrays, both generic arrays and a specific array type called a channel, which automatically associates a time entry with each value.

 

Variables must be declared for scope (global, private, static, etc.) somewhere before first use, as in

global MyGlobalVar  = {1, 2, 4, 8}
private MyPrivateVar
global string MyStringVar = "Hello, World!"

If it's a channel it should be defined in your channel table, accessible from the workspace.

 

It's unclear from your code whether run_time is a scalar, a generic array, or a channel. Only arrays (including channels) are referenced using the square-bracket nomenclature, and arrays need it for assignment, as in

MyArray[3] = systime()

Channels use a different method when populated from script:

MyChannel.AddValue(newvalue)

Your code mixes these styles (unless run_time is declared somewhere else), so that's one issue. You have to decide which data type to use and treat it as such throughout.
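For instance, if run_time is only meant to hold elapsed seconds, treating it as a private scalar everywhere is the simplest fix. A minimal sketch, assuming sample_time and start_time are already defined as in the original post:

private run_time = 0
while (run_time < sample_time)
   delay(1)
   run_time = systime() - start_time // scalar assignment throughout, no brackets
endwhile

If instead you want the history that a channel gives you, then every assignment should go through AddValue() and every read through the bracket notation, never a mix of the two.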

 

Also, if run_time ends up as a channel or other array populated channel-style, pushing new values on from the front, then run_time[1] will be older than run_time[0], so run_time[1] - run_time[0] will yield a negative result.
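To get a positive increment out of a newest-first channel, subtract the other way around. A sketch, assuming run_time is a channel with at least two values in its history:

del_time = run_time[0] - run_time[1] // newest minus next-newest gives a positive increment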

 

Finally, since iterations are spaced with two Delay(1)'s, why are you computing the time between loops?  Isn't it by definition 2?  Or do you need more precision to account for any minimal delay in execution?
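If you do need the extra precision, measure the actual interval with systime() instead of assuming 2 seconds per pass. A hedged sketch of the accumulation loop, assuming flow_rate_smp is a channel and start_time and sample_time are defined as in the original post:

private xvolume = 0
private last_time = systime()
while (systime() - start_time < sample_time)
   delay(1)
   private now = systime()
   xvolume = xvolume + (now - last_time) * flow_rate_smp[0] // measured interval, not an assumed 2 s
   last_time = now
endwhile

This way any jitter in loop execution is absorbed into the measured del_time rather than accumulating as error in the total volume.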

