Fitness Challenge Blog
Tracker Comparison Part Two
Testing results and evaluation
May 26, 2019 by Joseph
In part one of this three-part series, I covered the fitness trackers used in the test and how the testing was conducted. While I am certainly not a statistician,
I wanted a decent sample size to add some confidence to the reliability of the testing. Therefore, this tracker comparison occurred over the course of three months.
Fitness trackers have some functions that are almost universally shared regardless of brand: steps, distance, active minutes, and calories burned.
Obviously, some devices track more activities than those listed, but it is difficult to build a fitness challenge around an activity that most devices don't support.
Therefore, the testing data collected focuses on these four functions.
When comparing the average data obtained from these trackers, we computed the relative percent range and standard deviation. The smaller the range and standard deviation, the smaller the differences in the recorded data, and the less likely it is that the choice of tracker brand will affect the final outcome of the challenge. In other words, the smaller these values, the "fairer" the challenge.
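To make the two metrics concrete, here is a minimal sketch of how they can be computed. The step counts below are illustrative numbers, not the actual test data:

```python
import statistics

def relative_percent_range(values):
    """Spread between the highest and lowest reading,
    expressed as a percentage of the mean.
    Smaller spread means a "fairer" challenge."""
    return (max(values) - min(values)) / statistics.mean(values) * 100

# Illustrative step counts for a 2-mile run (NOT the actual test data):
step_counts = [2900, 2964, 3000, 3075, 3100]

print(f"relative percent range: {relative_percent_range(step_counts):.1f}%")
print(f"standard deviation: {statistics.stdev(step_counts):.0f} steps")
```

Note that `statistics.stdev` computes the sample standard deviation; for a full population of readings, `statistics.pstdev` would be used instead.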
A majority of fitness challenges use steps walked as the mode of preference and there are several compelling reasons why:
- Everyone understands what constitutes a step and how it is measured.
- Steps can roughly approximate fitness – someone with a high daily step-count is generally more fit.
- Many other activities can convert to steps using readily available conversion charts.
Not surprisingly, all of the trackers and devices tested were very accurate when counting steps, and the relative percent range of the data collected was within 10%. This was much closer than I anticipated. When counting only dedicated trackers and excluding smartphones used as trackers, the range narrowed to 5%. What does this mean? Trackers and smartphones are very good at counting steps (at least under controlled conditions). Taking the analysis a bit further, the standard deviation was 93 steps over a two-mile run, meaning all trackers tested are roughly equal when it comes to counting steps.
The following table shows each fitness tracker and the associated average number of steps recorded during the test when running 2 miles on a treadmill:

| Fitness Tracker | Average Steps |
| --- | --- |
| Samsung Health App | 2964 |
| Apple Health App | 3075 |
Most trackers tested (excluding Polar) could be used to track distance. While very similar to steps, calculating distance requires that the fitness tracking
device knows the wearer’s “stride length” or the distance travelled with each step. Depending on the height of an individual and whether they are running or not,
one person may travel a much longer distance than another person with the same step count. For this reason, there was a much greater relative percent
range in the distance data recorded during our tests. In fact, there was a 46% difference between our shortest and longest distance and the sample had a
standard deviation of a quarter of a mile. Converting that to steps, the standard deviation for distance is roughly 500 steps.
The following table shows each fitness tracker and the associated average distance (in miles) recorded during the test when running 2 miles on a treadmill:

| Fitness Tracker | Average Distance (miles) |
| --- | --- |
| Samsung Health App | 2.28 |
| Apple Health App | 2.21 |
Of all challenge types, active minutes is my personal favorite. This challenge mode does not favor one exercise over another and can be used to track running,
biking, swimming, or any other high-intensity workout with ease. The biggest downside is that all trackers calculate "medium to high intensity" differently, and some (Fitbit, Garmin, and others) do not start counting active minutes until 10 minutes have elapsed. While this was done based on American Heart Association recommendations, it doesn't seem right when some of your workouts are very high intensity but last less than ten minutes.
Since each tracker manufacturer uses a different algorithm to compute active minutes, a "real world" (read: not controlled) test would probably render more widely dispersed results. However, in our controlled tests, the relative percent range of the results was just over 25%. The standard deviation turned out to be 1.8 minutes. At 160 steps per minute, that comes to 288 steps: better than a distance activity but worse than steps walked.
The following chart shows the Fitness Tracker and the associated average active minutes collected during the test when running 2 miles at 6 MPH on a treadmill:
While most trackers calculate calories burned, it is the least understood of the common tracker activities and the hardest to accurately compute.
Steps and distance are relatively concrete metrics. A step and a mile are the same for everyone even though it may take one person more steps to reach a mile.
An active minute is more complicated because you first must determine what counts as active (how vigorous the movement was). Calorie tracking takes this a step further: not only does it try to determine whether you are active, it also tries to calculate how many calories your body burns while being active. While there are formulas that can help, this is not a universally precise measurement. What may be close for one individual will be very inaccurate for another.
All of that said, running a competition on calories burned can be a lot of fun and lead to great results.
As expected, the relative percent range in calories burned was the highest of all tracker activities at 74%. The standard deviation was 63 calories.
When converted to steps, that comes to roughly 750 steps. If Polar (the outlier) were removed, the standard deviation would drop to 36 calories, or roughly 432 steps, which is less than the standard deviation for distance travelled. I will discuss Polar more in the next article and explain why this outlier was more a result of the testing method than an indictment of the device itself.
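The step equivalences quoted throughout this article are straightforward unit conversions. A quick sketch, assuming roughly 2,000 steps per mile, the 160 steps-per-minute running cadence mentioned earlier, and about 12 steps per calorie (the rate implied by the figures above; all three are rough approximations, not measured constants):

```python
# Rough conversion rates (assumptions, not measured constants):
STEPS_PER_MILE = 2000      # common walking/running approximation
STEPS_PER_MINUTE = 160     # running cadence used in this article
STEPS_PER_CALORIE = 12     # implied by 36 calories ~ 432 steps

# Convert each metric's standard deviation into equivalent steps:
print(f"distance:       {0.25 * STEPS_PER_MILE:.0f} steps")   # 0.25-mile std dev
print(f"active minutes: {1.8 * STEPS_PER_MINUTE:.0f} steps")  # 1.8-minute std dev
print(f"calories:       {63 * STEPS_PER_CALORIE:.0f} steps")  # 63-calorie std dev
```

The calorie figure works out to 756 steps, which the article rounds to "roughly 750."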
The following table shows each fitness tracker and the associated average calories burned during the test when running 2 miles at 6 MPH on a treadmill:

| Fitness Tracker | Average Calories Burned |
| --- | --- |
| Samsung Health App | 299.2 |
| Apple Health App | 228.0 |
We started testing these devices against one another for accuracy out of a desire to better help our clients evaluate trackers, and out of simple curiosity. What we found was that, while some were better than others at accurately tracking running on a treadmill, all were accurate enough to run a successful challenge and be confident of a fair outcome.
In the next and final article in this series, I will discuss some of the findings as they relate to specific tracker manufacturers and devices.