Average 6000 data points in sets of 10
Hello everyone,
I have 6,000 data points, which is too many to work with. The data points are
aligned with a timeline, sampled 10 times a second. I want to take the average
of each second (i.e. each block of ten data points). How do I do that?
I can't do it manually, and if I drag down the AVERAGE function it averages
points 1-10, then 2-11, and so on, when it should average 1-10, then 11-20,
and so on in non-overlapping blocks.
Please help. Thank you.
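In case a scripted route is an option, here is a minimal sketch of non-overlapping block averaging in Python (the function name and sample data below are made up for illustration; this assumes the readings are in a flat list ordered by time):

```python
def block_average(data, block_size=10):
    """Average consecutive, non-overlapping blocks of `block_size` points.

    Steps by `block_size` each iteration, so blocks are 1-10, 11-20, ...
    rather than the sliding 1-10, 2-11, ... you get from dragging AVERAGE.
    """
    return [
        sum(data[i:i + block_size]) / block_size
        for i in range(0, len(data) - block_size + 1, block_size)
    ]

# Example: 30 fake readings sampled 10x per second -> 3 per-second averages.
readings = list(range(30))          # 0, 1, ..., 29
per_second = block_average(readings)
print(per_second)                   # [4.5, 14.5, 24.5]
```

With 6,000 points this yields 600 per-second averages. The same idea works in a spreadsheet by making the row offset grow by 10 per output row instead of by 1.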