BStewart Posted July 16, 2008

Hi,

I am trying to control an analog output channel (DAC0_fanout) that varies with time, using the following piece of sequence:

global step
private j
step = (FanSetPoint - DAC0_fanout) / FanRampTime
j = 0
while (j < FanRampTime)
   DAC0_fanout = DAC0_fanout + step
   j++
   wait(1)
endwhile

So basically it is supposed to update DAC0_fanout to its previous value plus an increment (step) once every second.

On inspection of the time history of DAC0_fanout, the update intervals actually range from tens of milliseconds up to approximately one second (which also confuses me, but is not the main point of my dilemma). It seems that when DAC0_fanout is updated to a new value, if only a short interval passes before the next update of the channel, it reverts back to the value it had before the most recent update (as if the new value didn't have time to register or something).

I hope I have explained well enough so you know what I am getting at... Thanks for any help...

Ben
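For clarity, the ramp the sequence above intends can be sketched in plain Python (a hypothetical illustration, not DAQFactory syntax; FanSetPoint, FanRampTime, and the starting DAC0_fanout value are stand-ins for the channels/variables in the post):

```python
def ramp(start, setpoint, ramp_time):
    """Return the successive output values the loop should write,
    one per second, stepping linearly from start toward setpoint."""
    step = (setpoint - start) / ramp_time   # step = (FanSetPoint - DAC0_fanout) / FanRampTime
    value = start
    values = []
    for _ in range(ramp_time):
        value += step          # DAC0_fanout = DAC0_fanout + step
        values.append(value)   # in the sequence, wait(1) would pause here
    return values

print(ramp(0.0, 10.0, 5))  # -> [2.0, 4.0, 6.0, 8.0, 10.0]
```

If the loop behaves as intended, the channel should land exactly on the setpoint after FanRampTime one-second updates; the behavior described in the post (intermediate values reverting) departs from this.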