Published: Wed Aug 02 2023

Weekly reports from Garmin data

If you do a lot of running, then something you would probably benefit from knowing is how long you are spending in each heart rate zone per week. I have a Garmin watch that tracks all of this data, but, surprisingly, it doesn't appear to offer any kind of reporting of this sort of information.

If we could export the data to CSV or something similar it would be fairly easy to create these reports. Unfortunately, Garmin makes it pretty convoluted to export your own data.

Fortunately, there are open source projects that automate exporting your data from Garmin, which I think work by essentially scraping the Garmin Connect website. I have been using GarminDB, which takes all your data and puts it into a set of SQLite databases.

GarminDB is fairly easy to set up by following the instructions on GitHub. I have years' worth of data and the initial fetch of all the history took a few hours. After this, the one point lacking from the documentation is where to find the databases: they are created in ~/HealthData. There are various things under there, but the activities DB is ~/HealthData/DBs/garmin_activities.db.

The update command to fetch new activities isn't particularly memorable so I've put it in a shell script:

#!/usr/bin/env bash

garmindb_cli.py --all --download --import --analyze --latest

From there, it's just a matter of SQL. The table layout is a bit unintuitive if you're just thinking in terms of running, so it's worth reviewing. There is an activities table, which is mostly what you'd expect. Then there's a steps_activities table which adds step specific data to the activity, like running dynamics data. There's also an activity_laps table, which gives you some of the same data but at the split/lap level.
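A quick way to get a feel for how the tables fit together is to join activities to steps_activities on activity_id. The snippet below sketches that join against a tiny in-memory stand-in for the GarminDB schema; the column subset is an assumption based on my database, so check yours with .schema in the sqlite3 shell.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Simplified stand-in for the GarminDB tables (assumed column subset).
con.executescript("""
CREATE TABLE activities (
    activity_id TEXT PRIMARY KEY,
    sport TEXT,
    start_time TEXT,
    distance REAL,      -- km
    moving_time TEXT    -- HH:MM:SS time span
);
CREATE TABLE steps_activities (
    activity_id TEXT PRIMARY KEY,
    steps INTEGER,
    avg_gct_balance REAL  -- running dynamics: ground contact balance (%)
);
INSERT INTO activities VALUES ('a1', 'running', '2023-07-31 07:00:00', 10.0, '00:55:00');
INSERT INTO steps_activities VALUES ('a1', 9500, 49.7);
""")

# One row per run, with the step-specific dynamics data joined on.
row = con.execute("""
    SELECT a.start_time, a.distance, sa.steps, sa.avg_gct_balance
    FROM steps_activities AS sa
    JOIN activities AS a ON a.activity_id = sa.activity_id
    WHERE a.sport = 'running'
""").fetchone()
print(row)  # ('2023-07-31 07:00:00', 10.0, 9500, 49.7)
```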

My weekly report is as follows. It totals up the weekly heart rate zones, includes the heart rate zone definitions, and adds various other things I'm interested in. I'm particularly interested in some of my dynamics data, because I'm trying to work on a left/right imbalance. So I've included this and weighted it and some other fields by distance.

DROP VIEW IF EXISTS report_activities_zones_weekly;

CREATE VIEW report_activities_zones_weekly AS
WITH data AS (
    SELECT
        STRFTIME('%Y_%W', a.start_time, 'localtime', 'weekday 0', '-6 days') AS week,
        TIME(SUM(STRFTIME('%s', a.hrz_1_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS hrz_1_total,
        TIME(SUM(STRFTIME('%s', a.hrz_2_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS hrz_2_total,
        TIME(SUM(STRFTIME('%s', a.hrz_3_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS hrz_3_total,
        TIME(SUM(STRFTIME('%s', a.hrz_4_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS hrz_4_total,
        TIME(SUM(STRFTIME('%s', a.hrz_5_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS hrz_5_total,
        ROUND(AVG(hrz_1_hr)) AS hrz_1_hr,
        ROUND(AVG(hrz_2_hr)) AS hrz_2_hr,
        ROUND(AVG(hrz_3_hr)) AS hrz_3_hr,
        ROUND(AVG(hrz_4_hr)) AS hrz_4_hr,
        ROUND(AVG(hrz_5_hr)) AS hrz_5_hr,
        TIME(SUM(STRFTIME('%s', a.moving_time)) - SUM(STRFTIME('%s', '2000-01-01')), 'unixepoch') AS total_time,
        COUNT(*) AS total_runs,
        ROUND(SUM(distance), 1) AS total_distance,
        SUM(steps) AS total_steps,
        ROUND(SUM(distance * 1000) / SUM(steps), 2) AS stride_avg,
        -- weighted averages, in terms of distance
        ROUND(SUM(avg_gct_balance * distance) / SUM(distance), 1) AS gct_balance,
        ABS(50 - ROUND(SUM(avg_gct_balance * distance) / SUM(distance), 1)) AS gct_balance_offset,
        ROUND(1000 * SUM(STRFTIME('%f', avg_ground_contact_time) * distance) / SUM(distance)) AS gct_ms,
        ROUND(SUM(avg_stance_time_percent * distance) / SUM(distance), 1) AS stance_time_percent,
        ROUND(AVG(vo2_max), 1) AS vo2_max
    FROM steps_activities AS sa
        JOIN activities AS a ON a.activity_id = sa.activity_id
    WHERE a.sport = 'running'
    GROUP BY week
    ORDER BY week DESC
)
SELECT *,
    ROUND((STRFTIME('%s', hrz_1_total) - STRFTIME('%s', '2000-01-01'))
        / CAST(STRFTIME('%s', total_time) - STRFTIME('%s', '2000-01-01') AS REAL) * 100) AS hrz_1_pct,
    ROUND((STRFTIME('%s', hrz_2_total) - STRFTIME('%s', '2000-01-01'))
        / CAST(STRFTIME('%s', total_time) - STRFTIME('%s', '2000-01-01') AS REAL) * 100) AS hrz_2_pct,
    ROUND((STRFTIME('%s', hrz_3_total) - STRFTIME('%s', '2000-01-01'))
        / CAST(STRFTIME('%s', total_time) - STRFTIME('%s', '2000-01-01') AS REAL) * 100) AS hrz_3_pct,
    ROUND((STRFTIME('%s', hrz_4_total) - STRFTIME('%s', '2000-01-01'))
        / CAST(STRFTIME('%s', total_time) - STRFTIME('%s', '2000-01-01') AS REAL) * 100) AS hrz_4_pct,
    ROUND((STRFTIME('%s', hrz_5_total) - STRFTIME('%s', '2000-01-01'))
        / CAST(STRFTIME('%s', total_time) - STRFTIME('%s', '2000-01-01') AS REAL) * 100) AS hrz_5_pct
FROM data;

SQLite is missing some common SQL features, like variables and user-defined functions, so there is a bit of repetition.
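If you query the database from Python rather than the sqlite3 shell, one workaround is to register a user-defined function with sqlite3's create_function, so the time-span conversion only has to be written once. A minimal sketch (the function name span_seconds is my own, not part of GarminDB or SQLite):

```python
import sqlite3

def span_seconds(t):
    """Convert an 'HH:MM:SS' time-span string to a number of seconds."""
    h, m, s = t.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

con = sqlite3.connect(":memory:")
# Make the Python function callable from SQL as span_seconds(value).
con.create_function("span_seconds", 1, span_seconds)

# The repeated STRFTIME pair in the view collapses to one call per column.
result = con.execute("SELECT span_seconds('01:30:00')").fetchone()[0]
print(result)  # 5400.0
```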

Probably the biggest thing to note is the handling of time spans, which are always a bit awkward. Here they are stored as time fields. STRFTIME('%s', value) interprets a bare time as a time of day on 2000-01-01 and returns the number of seconds since the Unix epoch, so subtracting STRFTIME('%s', '2000-01-01') from it recovers the span as an absolute number of seconds.
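You can see the trick in isolation with a throwaway query, run here through Python's sqlite3 module:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# SQLite treats a bare time as that time of day on 2000-01-01, so
# subtracting the epoch seconds of '2000-01-01' leaves just the span.
seconds = con.execute(
    "SELECT STRFTIME('%s', '01:30:00') - STRFTIME('%s', '2000-01-01')"
).fetchone()[0]
print(seconds)  # 5400, i.e. an hour and a half in seconds

# TIME(..., 'unixepoch') turns a number of seconds back into HH:MM:SS,
# which is how the view renders the summed totals.
back = con.execute("SELECT TIME(?, 'unixepoch')", (seconds,)).fetchone()[0]
print(back)  # '01:30:00'
```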

The end result looks something like this:

Garmin weekly data

The next step will be trying to plot some of this data, especially how the relative time in each zone is distributed from week to week, but that's a challenge for another time and will require some research into charting libraries!