SteveMyres Posted March 8, 2013
File.ReadDelim() appears to traverse the entire file even when a number of lines is specified for reading, rather than leaving the read/write pointer at the beginning of the next line after it has read the specified number of lines. If so, this prevents using multiple consecutive calls to populate unrelated arrays. (I'm not saying it returns too much data, just that it leaves the pointer in an unexpected place.) The way to do it then, I guess, is to read with an unspecified number of lines into a 2D buffer array, then move individual rows from there into the 1D target arrays. You may want to document the pointer behavior in the manual section on File.ReadDelim().
AzeoTech Posted March 8, 2013
That's because it's terribly inefficient to read one line at a time. Instead, DAQFactory reads a big block from the file (possibly even the whole file, if it's not too big) and then processes it. If it finds the desired number of rows, it stops processing, leaving the file pointer wherever it happened to be. The difference is significant, at least several orders of magnitude. Alas, we can't document everything. For one thing, the manual is already 440 pages long; for another, the behavior is simply unspecified, which means you can't rely on it being anything in particular. Once we specify it one way or the other, we have to maintain that or break existing code. Note that ReadInter() works better here because with fixed record lengths, DAQFactory can predetermine how many bytes it needs to read. ASCII delimited records, however, have variable lengths, so that's impossible to know in advance.
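The pointer behavior described above is easy to demonstrate. This sketch is in Python rather than DAQFactory script, purely as an analogy (the file name and row contents are made up, and DAQFactory's internal buffering may differ in detail): after one big buffered read, the file pointer sits at the end of the block that was read, not at the end of the last row that was actually consumed.

```python
import os
import tempfile

def demo_pointer():
    # Write a small delimited file with five rows.
    path = os.path.join(tempfile.mkdtemp(), "data.csv")
    with open(path, "w") as f:
        for i in range(5):
            f.write(f"{i},{i * 10}\n")

    with open(path, "rb") as f:
        block = f.read(1024)                # one big buffered read
        first_two = block.splitlines()[:2]  # keep only the rows we wanted
        # The pointer reflects the bytes read, not the rows consumed.
        return first_two, f.tell()

rows, pos = demo_pointer()
print(rows, pos)  # pos = 24 (whole 24-byte file), not 9 (end of row 2)
```

A second "read the next array" call on this handle would start at byte 24 and find nothing, which is exactly the symptom reported above with consecutive File.ReadDelim() calls.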
SteveMyres (Author) Posted March 9, 2013
OK, that all makes sense. Is reading it all at once into a 2D array, then distributing rows to 1D arrays, the best approach for variable-field-length, variable-byte-length records?
SteveMyres (Author) Posted March 9, 2013
Here we go, a better idea than my first one:

MyArray1 = StrToDouble(Parse(File.Read()))
MyArray2 = StrToDouble(Parse(File.Read()))
AzeoTech Posted March 9, 2013
I don't know what you're trying to do. Are you trying to find a particular row?
SteveMyres (Author) Posted March 9, 2013
There are several rows in one file, each of which backs up data for one array, and they're not necessarily all the same length. I'm only talking two or three lines, and the file read only happens on DAQFactory startup anyway, so I don't mind even a two-order-of-magnitude inefficiency in reading one line at a time. The writes are more frequent, but use File.WriteDelim(). It's a small number of arrays, but probably 100 total values. I was doing it with registry variables, but then the total number of values stored in the arrays started to grow, plus some of them are naturally floats, so..... The approach from the 8:05 post works OK and is the cleanest I've come up with so far.
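The one-row-per-array backup scheme described here can be sketched in Python as an analogy (the file name and values are invented for illustration; the actual code uses DAQFactory's File.Read(), Parse(), and StrToDouble()): each array is written as one delimited line, and restore reads one line per array, splits it, and converts the fields to floats, so rows of different lengths coexist in one file.

```python
import os
import tempfile

# Hypothetical backup file: one comma-delimited row per array,
# rows need not be the same length (mirrors File.WriteDelim output).
path = os.path.join(tempfile.mkdtemp(), "backup.csv")
with open(path, "w") as f:
    f.write("1.5,2.5,3.5\n")  # first array, three values
    f.write("10,20\n")        # second array, shorter row

# Restore at startup: one line per array, parse and convert,
# analogous to MyArray1 = StrToDouble(Parse(File.Read())).
with open(path) as f:
    my_array1 = [float(x) for x in f.readline().split(",")]
    my_array2 = [float(x) for x in f.readline().split(",")]

print(my_array1)  # [1.5, 2.5, 3.5]
print(my_array2)  # [10.0, 20.0]
```

Because each restore call reads exactly one line, the pointer lands at the start of the next row, sidestepping the block-buffering issue discussed earlier in the thread.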
This topic is now archived and is closed to further replies.