Count length of varying series


Greetings, all, and thank you for taking the time to read this.

 

What I have is a large data file output by an automated data collection device. The device performs an operation on a sample and takes measurements at roughly one-second intervals, writing a new row to an Excel spreadsheet for each measurement. Each row records the timestamp, a count indicating which measurement it is within the sample (1 up through 200 or so), and other information that interests me, such as temperature. Each sample involves dozens or hundreds of measurements, and all the samples from several weeks are in the same spreadsheet.

 

What I want to do is to determine the length of time that each sample took. It would be simple enough if I weren't dealing with thousands of rows, but since I am, I need a way to automate it.

 

I want to take the timestamp for the first count and subtract it from the timestamp for the last count in each sample. But since each sample takes a different number of counts to complete, I don't know how to tell Excel to find the highest-numbered count and use it to determine the elapsed time (or, put another way, to take the count from the row just above the next row where the count resets to 1).

 

Is there a way to do this? I know it's kind of an eccentric problem, but I thought maybe it was something people elsewhere have dealt with before. To say it one more time for clarity's sake, I just want Excel to subtract the timestamp in the row for count 1 from the timestamp in the row with the highest count before it resets to 1 again for the next sample. Thanks again for taking the time to read this.

 

Alan T.


Hi Alan

 

If you have Excel 2016 or later, you could use MAXIFS and MINIFS.
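For what it's worth, here is one way it might look. This is a sketch, and the column layout is an assumption: it assumes timestamps are in column A, the per-sample count is in column B, and row 1 holds headers; adjust the references to match your actual sheet. A helper column (C here) assigns each row a sample number by incrementing whenever the count resets to 1, and MAXIFS/MINIFS then pull the last and first timestamps for each sample.

```
' Helper column C: sample number, incrementing each time the count resets to 1.
' Enter in C2 and fill down. N(C1) returns 0 for the text header, so the
' first sample gets number 1.
=IF(B2=1, N(C1)+1, C1)

' Duration of sample n (put the sample number n in a cell, e.g. E2):
=MAXIFS($A:$A, $C:$C, E2) - MINIFS($A:$A, $C:$C, E2)
```

Format the result cell as a time (e.g. h:mm:ss) so the difference displays as a duration rather than a serial number.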

 

I've attached a demo file.