Hello
I have an interesting problem I am trying to solve. I have two datasets that show the distance from a sensor to the bottom of a pipe. As the water rises, the distance (in inches) gets smaller.
For example, the sensor starts at 95 inches, and as the water rises, the next data point (anywhere from 1 to 12 minutes later) might show 92 inches, and so on.
Eventually, if the water rises high enough, the sensor "flatlines" and no longer provides an accurate reading. For example, it might report a constant 16 inches from then on.
So here is what I have been racking my brain about:
One of the sensors "flatlined" at a certain time, say 2:00 PM, at a reading of 22 inches. We know for a fact that the water kept climbing past that 22 inches all the way to level, i.e., 0 inches.
The second dataset continues, say until 6:00 PM, before "flatlining" at a reading of 16 inches.
To solve this, I took the average rise from 2:00 PM until 6:00 PM on the second dataset, which was 2.86 inches every 15 minutes, and applied it to the first dataset to estimate when it reached 0 inches. A rough sketch of that calculation is below.
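Here is a minimal Python sketch of that extrapolation, assuming a made-up date and using the example numbers above; the variable names and values are only for illustration, not my actual data:

```python
from datetime import datetime, timedelta

# Last trustworthy reading from the sensor that flatlined at 2:00 PM.
# (Hypothetical date; substitute the real timestamp and reading.)
last_time = datetime(2024, 1, 1, 14, 0)   # 2:00 PM
last_reading_in = 22.0                     # inches above the pipe bottom

# Average rise observed on the second sensor between 2:00 PM and 6:00 PM.
rise_per_15_min = 2.86                     # inches per 15 minutes

# Number of 15-minute intervals needed to close the remaining 22 inches
# at that constant average rate.
intervals_needed = last_reading_in / rise_per_15_min

estimated_zero_time = last_time + timedelta(minutes=15 * intervals_needed)
print(f"Estimated time water reached 0 inches: {estimated_zero_time:%I:%M %p}")
```

With these numbers it works out to roughly 115 minutes after the flatline, so about 3:55 PM. The key assumption is that the water rose at the same constant rate at both sensors, which is exactly what I am unsure about.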
I am not entirely sure that this is the best way to forecast the rise for the other sensor. Is there a better way to do this?