Migrating from Simple Analytics to self-hosted Umami
If you are moving away from Simple Analytics and want to keep your historical numbers in a self-hosted Umami installation, the good news is: it is possible.
I recently migrated historical data from Simple Analytics into Umami and wrote a Python script that converts the Simple Analytics CSV export into SQL statements for Umami.
The script is available as a GitHub Gist: Simple Analytics to Umami migration script
This post explains the full workflow: exporting the data from Simple Analytics, generating the SQL file, and importing that SQL into a Docker-based Umami setup.
What this migration can and cannot do
Simple Analytics is intentionally privacy-friendly, and that means its raw export contains less tracking detail than many other analytics tools.
That has a direct impact on the migration:
- pageviews can be migrated well
- visits can be approximated
- visitors can be approximated
- deep session analysis will not be as accurate as native Umami data
In other words, this migration is a good fit if you mainly want to preserve historical traffic trends and compare them with current Umami data.
Prerequisites
You need:
- a self-hosted Umami installation with PostgreSQL
- the website_id of the target website in Umami
- a CSV export from Simple Analytics for each website/domain
- Python 3.12 or newer
I also recommend creating a PostgreSQL backup before importing anything.
Exporting your data from Simple Analytics
There is good documentation available on how to export your Simple Analytics data.
The short version is:
- Open your Simple Analytics dashboard
- Go to the export page in the UI
- Download the raw data as CSV
Simple Analytics also supports exporting via API, but for this migration the CSV export is the simplest option.
One CSV per website
In my case, each domain was tracked as a separate website in both tools:
- One CSV file per website in Simple Analytics
- One Umami website_id per website
That is important, because the migration script imports one CSV file into one Umami website.
Finding the Umami website ID
If you do not already have the website_id, you can query it from PostgreSQL:
docker compose exec db psql -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB -c "select website_id, name, domain from website;"
Pick the correct website_id for the website you want to import into.
Generating the SQL file
Download the Python script from the gist above and run it like this:
python3 migrate_simpleanalytics_to_umami.py \
--input-file /path/to/example.com.csv \
--website-id YOUR-UMAMI-WEBSITE-ID
By default, the script writes a SQL file next to the CSV, for example:
/path/to/example.com.csv.sql
You can also choose a custom output file:
python3 migrate_simpleanalytics_to_umami.py \
--input-file /path/to/example.com.csv \
--website-id YOUR-UMAMI-WEBSITE-ID \
--output-file /path/to/example.com-import.sql
How the script handles sessions and visitors
Simple Analytics exports do not provide Umami-ready sessions and visitors, so the script has to approximate them.
It does that by building a fingerprint from fields such as:
- browser name and version
- operating system name and version
- country code
- screen size
- user agent
If two matching events are less than 30 minutes apart, they are treated as the same visit. Otherwise, a new visit is created.
This is not perfect, but it is significantly better than treating every pageview as a separate visit.
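A minimal sketch of that idea looks like the following. This is not the exact code from the gist, and the CSV column names such as browser_name and added_iso are assumptions used only for illustration:

import hashlib
from datetime import datetime, timedelta

# Simplified illustration of the visit logic; the column names are assumptions
SESSION_GAP = timedelta(minutes=30)

def fingerprint(row: dict) -> str:
    # Build a stable pseudo-visitor ID from privacy-friendly fields
    raw = "|".join(
        row.get(key, "")
        for key in (
            "browser_name", "browser_version",
            "os_name", "os_version",
            "country_code", "screen_size", "user_agent",
        )
    )
    return hashlib.sha256(raw.encode()).hexdigest()

def assign_visit(row: dict, last_seen: dict) -> tuple[str, bool]:
    # Returns the visitor fingerprint and whether this row starts a new visit
    fp = fingerprint(row)
    ts = datetime.fromisoformat(row["added_iso"])
    previous = last_seen.get(fp)
    last_seen[fp] = ts
    is_new_visit = previous is None or ts - previous >= SESSION_GAP
    return fp, is_new_visit

Events with the same fingerprint that arrive within 30 minutes of each other end up in the same visit; anything else starts a new one.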
Designed for large exports
My largest CSV file was more than 800 MB.
Because of that, the script is designed to process the CSV in a streaming fashion:
- it does not load the full file into memory
- it reads rows sequentially
- it writes SQL in batches
That makes it suitable for large exports on a server.
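The pattern is roughly the following. This is a simplified sketch: the real SQL generation is reduced to a placeholder, the added_iso column name is an assumption, and the batch size is just an example value:

import csv

BATCH_SIZE = 1000  # example value, not the script's actual setting

def row_to_sql(row: dict) -> str:
    # Placeholder: the real script builds INSERT statements for Umami's
    # session and website_event tables from each CSV row
    return f"-- pageview at {row.get('added_iso', '?')}"

def convert(csv_path: str, sql_path: str) -> None:
    with open(csv_path, newline="") as src, open(sql_path, "w") as dst:
        batch = []
        # csv.DictReader yields one row at a time, so the file is never fully in memory
        for row in csv.DictReader(src):
            batch.append(row_to_sql(row))
            if len(batch) >= BATCH_SIZE:
                dst.write("\n".join(batch) + "\n")
                batch.clear()
        if batch:
            dst.write("\n".join(batch) + "\n")

Memory usage stays roughly constant regardless of how large the export is, because only one batch of rows is held at a time.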
Importing the SQL into Umami
My Umami setup uses Docker Compose with a separate PostgreSQL container.
In that setup, the easiest way to import the generated SQL is with psql inside the db container.
First, create a backup:
docker compose exec -T db pg_dump -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB > /root/umami-before-import.sql
Then import the generated SQL file:
docker compose exec -T db psql -v ON_ERROR_STOP=1 -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB < /path/to/example.com.csv.sql
The important parts are:
- -T disables TTY allocation so input redirection works correctly
- -v ON_ERROR_STOP=1 stops at the first SQL error
- -U and -d must match your PostgreSQL user and database
- db must be the name of your Umami PostgreSQL container
If you have multiple websites, import them one after another.
Verifying the import
After the import, you can run a few quick checks:
docker compose exec db psql -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB -c "select count(*) from website_event;"
docker compose exec db psql -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB -c "select count(*) from session;"
If you want to verify one specific website only:
docker compose exec db psql -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB -c "select count(*) from website_event where website_id = 'YOUR-UMAMI-WEBSITE-ID';"
docker compose exec db psql -U YOUR_POSTGRES_USER -d YOUR_POSTGRES_DB -c "select count(*) from session where website_id = 'YOUR-UMAMI-WEBSITE-ID';"
After that, open Umami and compare the imported historical numbers with your expectations.
Notes about bots
My migration script skips rows marked as robots in the Simple Analytics export.
That was the right choice for my setup, because Umami typically excludes bot traffic by default. If your current Umami setup is configured differently, you should review that behavior before importing historical data so your comparisons stay consistent.