How Does Sample Rate Impact the Data Logger?
The maximum required sample rate impacts several key areas in the design and operation of a data logger. Most obviously, the logger has to have enough on-board memory to store all of the data collected between upload intervals. You can easily estimate the required memory size (in data points) by
# of inputs x sample rate x upload interval
For example, sampling 4 channels once a minute and uploading weekly would require:
4 inputs x 1/minute x 60 minutes/hour x 24 hours/day x 7 days/week = 40,320 points
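The sizing formula above is simple enough to capture in a few lines. This is a minimal sketch; the function name and units (samples per hour, upload interval in hours) are my own choices for illustration.

```python
def required_points(num_inputs, samples_per_hour, upload_interval_hours):
    """Estimate the data points that must fit in on-board memory between uploads."""
    return num_inputs * samples_per_hour * upload_interval_hours

# 4 channels, once a minute (60 samples/hour), weekly upload (24 h x 7 days)
points = required_points(4, 60, 24 * 7)
print(points)  # → 40320, matching the worked example above
```

Running the same numbers as the example confirms the 40,320-point figure, and makes it easy to check whether a candidate logger's memory is large enough before committing to a sample rate.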
While this can easily be accommodated by larger systems like the dataTaker DT80 or Grant Squirrel 2020, it is well beyond the capability of smaller units like the TandD TR-71NW.
Faster sample rates and larger memory sizes also drive the requirements for the processor embedded in the data logger. Many of the smaller loggers with low sample rates rely on simple 8- or 16-bit microprocessors. Higher sample rates and larger memory require faster 32-bit controllers to provide the data throughput and the larger memory address space needed to handle the data. This impacts not only the processor but also all of the support circuitry around it and the power required to run it all.
Finally, the required sample rate also has a large impact on the power requirements for the logger. Most modern loggers use an internal sleep mode that reduces power consumption by turning off most of the circuitry between measurements. If the logger is set to sample every 10 minutes but only needs to wake for 2 seconds to take each measurement before going back to sleep, it will draw less than 0.5% of the power it would need if it were operating at a sample rate of 2 seconds or faster. This makes a huge difference in projects where mains power is not available and the logger must operate off batteries or solar power.
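The duty-cycle arithmetic behind that claim can be sketched as follows. The current figures here (20 mA active, 0.01 mA asleep) are illustrative assumptions, not specifications for any real logger.

```python
def average_current_ma(active_ma, sleep_ma, awake_s, period_s):
    """Average current draw for a logger that wakes for awake_s out of every period_s."""
    duty = awake_s / period_s
    return active_ma * duty + sleep_ma * (1 - duty)

# Awake 2 s out of every 600 s (10-minute sample interval)
slow = average_current_ma(20.0, 0.01, 2, 600)
# Sampling every 2 s leaves no time to sleep, so the logger runs continuously
fast = average_current_ma(20.0, 0.01, 2, 2)
print(round(slow / fast * 100, 2))  # → 0.38 (percent of the full-rate power)
```

With these assumed currents the 10-minute schedule needs about 0.38% of the continuous-sampling power, consistent with the "less than 0.5%" figure above; a real estimate would also account for the logger's wake-up transients and radio usage.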
How Will You Process the Data?
One thing that is often not considered up front is how the data will be processed. The current version of Excel supports 1,048,576 rows, which sounds like a lot. I recently had a customer ask me to convert a data file to CSV format so they could import it into Excel. It was 6 months of data sampled once a second, which works out to roughly 15.8 million rows – sorry, you can’t read it in as a single file, it has to be split up. And even if you could read it in, how would you go about analyzing that amount of data?
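Splitting an oversized CSV into Excel-loadable pieces is straightforward to script. This is a sketch using only the standard library; the output naming scheme (`.partN.csv`) is my own convention.

```python
import csv
import itertools

EXCEL_MAX_ROWS = 1_048_576  # row limit in current Excel worksheets

def split_csv(path, rows_per_file=EXCEL_MAX_ROWS - 1):
    """Split a large CSV into Excel-sized parts, repeating the header in each."""
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        for part in itertools.count(1):
            # islice pulls the next chunk of rows without loading the whole file
            chunk = list(itertools.islice(reader, rows_per_file))
            if not chunk:
                break
            with open(f"{path}.part{part}.csv", "w", newline="") as dst:
                writer = csv.writer(dst)
                writer.writerow(header)
                writer.writerows(chunk)
```

The default chunk size leaves one row free for the header, so each part opens cleanly in Excel; for 15.8 million rows this would produce 16 files.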
It’s really important that you think about this before jumping in. It’s easy to get a fast PC with a big hard disk to store all this data, but how will you go about digging through it to find the important information? If you really need to sample fast, consider a data logger with more advanced features, such as the built-in statistical data processing found in the dataTaker DT8x family or the tolerance-based data reduction featured in the Delphin products.
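To make the idea of tolerance-based data reduction concrete, here is a minimal sketch of the general technique: keep a sample only when it differs from the last kept value by more than a set tolerance. This illustrates the principle, not any vendor's exact algorithm.

```python
def reduce_by_tolerance(samples, tolerance):
    """Keep only samples that move more than `tolerance` from the last kept value."""
    if not samples:
        return []
    kept = [samples[0]]  # always keep the first reading as the baseline
    for value in samples[1:]:
        if abs(value - kept[-1]) > tolerance:
            kept.append(value)
    return kept

readings = [20.0, 20.1, 20.05, 21.0, 21.1, 25.0]
print(reduce_by_tolerance(readings, 0.5))  # → [20.0, 21.0, 25.0]
```

For slowly changing signals this can cut stored data dramatically while preserving every significant move, which is exactly why it matters when you are forced into a fast sample rate.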
It may sound like overkill, but when selecting the sample rate and data logger for a project you really need to look at it as a system. Think about the goal when choosing a sample rate to make sure it’s realistic. Realize that it’s not just the logger; you also have to think about how fast the things you are measuring really change, how the sensor responds to those changes, and how much noise you can tolerate in the measured data. The sample rate affects several aspects of the logger design, including on-board memory, processor requirements, and power consumption, all of which impact cost. Finally, consider how you will manage and analyze the data – just remember, more is not always better!