Moves Import From SQLite Database (Instead of Export ZIP)

Hi, I used the Moves app back in the day but never got around to grabbing the export of my data. I do, however, have a copy of the SQLite database from an old iPhone backup.

I know that importing straight from the SQLite database is not possible. So I wrote a Python script that grabs the data from the database and converts it into daily storyline JSON files. The result looks promising (excerpt below).
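To give an idea of the approach: the script shapes the queried rows into the daily storyline envelope. A trimmed sketch of the output side (the helper names and the fixed +0200 offset are my own choices, not anything Moves-official):

```python
import json
from datetime import datetime, timezone, timedelta

def moves_time(ts, offset="+0200"):
    """Render a unix timestamp in Moves' compact format, e.g. 20131008T102000+0200.
    The +0200 default is just what my data happens to use."""
    tz = timezone(timedelta(hours=int(offset[:3]), minutes=int(offset[0] + offset[3:])))
    return datetime.fromtimestamp(ts, tz).strftime("%Y%m%dT%H%M%S") + offset

def daily_storyline(date, segments, summary=None):
    """Wrap converted segments in the daily storyline envelope (see excerpt below)."""
    return {"date": date, "summary": summary or [], "segments": segments}

def write_daily_file(date, segments, out_dir="."):
    """Write one storyline_YYYYMMDD.json file for a given day."""
    path = f"{out_dir}/storyline_{date}.json"
    with open(path, "w") as f:
        json.dump(daily_storyline(date, segments), f, indent=2)
    return path
```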

When I copy the JSON file into the Arc App iCloud folder, I can see the import badge light up for a second, but the data never appears in the app timeline. (I tried checking the iPhone logs too, but I can't make sense of Arc's log output.)

My question is: is there any place where I can read up on the file format, the directory structure, and whatever else is expected from a moves_export.zip, so I can replicate it as closely as possible in the hope of getting the old data imported?

What I tried:

  • added the storyline_YYYYMMDD.json file to the iCloud/Arc App/Import/ folder
  • added the storyline_YYYYMMDD.json file to the iCloud/Arc App/moves_export/json/daily/storyline/ folder
  • added a moves_export.zip file (with the json/daily/storyline/storyline_YYYYMMDD.json directory/file structure) to iCloud/Arc App/

The zip file is unpacked (by Arc, I assume), but that’s my only success so far. Since I don’t have a (somewhat) complete picture of what Arc expects for this to work properly, I’m afraid I’m stuck. Any kind of pointer, a hint at “official” documentation, or just a working (and redacted) example file would be greatly appreciated.

Thanks in advance!

Generated JSON data (excerpt)

(lat, lon, date/time fields redacted)

{
  "date": "20131008",
  "summary": [
    {
      "activity": "walking",
      "group": "walking",
      "duration": 1561,
      "steps": 2587,
      "distance": 1783
    },
    {
      "activity": "transport",
      "group": "transport",
      "duration": 1539,
      "steps": 0,
      "distance": 7023
    }
  ],
  "segments": [
    {
      "type": "move",
      "activities": [
        {
          "totalDistanceLoc": 771.0,
          "trackPoints": [
            {
              "time": "20131008T102000+0200",
              "lat": 1.0,
              "hacc": 1,
              "lon": 1.0
            },
            {
              "time": "20131008T102000+0200",
              "lat": 1.0,
              "hacc": 16.0,
              "lon": 1.0
            },
            {
              "time": "20131008T102000+0200",
              "lat": 1.0,
              "hacc": 4.0,
              "lon": 1.0
            }
          ],
          "startTime": "20131008T102000+0200",
          "totalDistanceStep": 719,
          "steps": 959,
          "bridged": false,
          "filteredSteps": 0,
          "totalTime": 545,
          "activity": "walking",
          "totalDistance": 771,
          "group": "walking"
        }
      ],
      "startTime": "20131008T102000+0200",
      "steps": 0,
      "bridged": false,
      "endTime": "20131008T102000+0200"
    },
    {
      "bridged": false,
      "place": {
        "id": 917312,
        "type": "unknown",
        "location": {
          "lat": 1.0,
          "lon": 1.0
        }
      },
      "type": "place",
      "startTime": "20131008T102000+0200",
      "activities": [
        {
          "totalDistanceLoc": 1,
          "trackPoints": [
            {
              "time": "20131008T102000+0200",
              "lat": 1.0,
              "hacc": 100.0,
              "lon": 1.0
            },
            {
              "time": "20131008T102000+0200",
              "lat": 1.0,
              "hacc": 100.0,
              "lon": 1.0
            }
          ],
          "startTime": "20131008T102000+0200",
          "totalDistanceStep": 15,
          "steps": 20,
          "bridged": false,
          "filteredSteps": 0,
          "totalTime": 30,
          "activity": "walking",
          "totalDistance": 10,
          "group": "walking"
        }
      ],
      "location": {
        "lat": 1.0,
        "lon": 1.0
      },
      "endTime": "20131008T102000+0200",
      "steps": 0
    }
  ]
}

Hi @dan!

If the Importing badge shows up (albeit briefly) then I’m guessing you’re already putting the files in the right place, right filenames, etc.

So I suspect what’s happening is that Arc’s Moves importer is broken. That code is over 6 years old now, and hasn’t been tested or updated in 4+ years. So yeah, my first guess is that the code just isn’t working properly anymore, due to other things around it having subtly or significantly changed over the years.

Aside: I was actually going to rip that Moves importing code out completely recently. But figured it at least wasn’t causing any problems elsewhere in the system, and might still be working correctly, so no harm in leaving it. Though you might now have proven that it’s no longer working correctly!

We could start diving into the debugging process, to figure out what’s going wrong in the importer. There’s a chance it might just be one or two minor disconnects between old code and new that can be patched up without too much fuss, if we can find them.

Though the other option is to convert your SQLite data to GPX instead of Moves storyline format. Arc’s GPX importer is very new, and much more advanced than the dodgy old Moves importer (which I wrote in a couple of weeks, in a panic, back when Facebook started shutting down Moves on short notice).

Though GPX is inherently going to be more fiddly to generate, given that it’s both XML and a somewhat different structure from what’s going to be in the Moves SQLite db.

Let me know which way you want to approach it. If you want to go with the storyline format, feel free to send me some example files to matt@bigpaua.com, and I’ll see if I can get them importing on a test device here, and get that debugging process going.

The advantage of going the GPX route would be that you’d have fine grained control over what gets imported, with the ability to correct timezone issues and activity type classification issues at import time in the GPX importer’s advanced interface. The disadvantage would be the likely tedious process of twisting the Moves data into the right shapes to fit into GPX.

Both options sound equally viable to me, so I’m happy to assist with either!

Hi @matt, thanks for the fast response. I was afraid I would be late for the Moves importer. :smile:

I’ve emailed you a couple of daily storyline exports, converted to what I believe is the Moves JSON format. It would be awesome if you could check the import and see if a quick patch is possible or if I am missing something about the expected format. There might be errors, especially with activities, as that part of the conversion involved some guesswork.

If that doesn’t work, I’ll try converting to GPX. How does the GPX importer recognize activities? Can they be included in the format, or does Arc infer them from track data (location + time)?
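In case I end up going the GPX route, this is roughly the generator I’d start from. It builds a minimal GPX 1.1 track; putting the activity in the track’s <type> element is pure guesswork on my part, and GPX wants full ISO 8601 times (2013-10-08T10:20:00+02:00) rather than Moves’ compact format:

```python
import xml.etree.ElementTree as ET

def trackpoints_to_gpx(points, activity="walking"):
    """Build a minimal GPX 1.1 document from (lat, lon, iso_time) tuples.
    Whether Arc reads the activity type from <type> is an assumption."""
    gpx = ET.Element("gpx", {
        "xmlns": "http://www.topografix.com/GPX/1/1",
        "version": "1.1",
        "creator": "moves-sqlite-converter",  # hypothetical creator string
    })
    trk = ET.SubElement(gpx, "trk")
    ET.SubElement(trk, "type").text = activity
    seg = ET.SubElement(trk, "trkseg")
    for lat, lon, iso_time in points:
        pt = ET.SubElement(seg, "trkpt", {"lat": str(lat), "lon": str(lon)})
        ET.SubElement(pt, "time").text = iso_time
    return ET.tostring(gpx, encoding="unicode")
```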

Thanks for the zip! I’m having a sift through the JSON and code now…

First thing to check, yep, it’s got the right filename. Arc looks specifically for moves_export.zip.

The code then unzips that, and looks for a json.zip, which it also unzips. If it’s already unzipped I would imagine it’d continue on happily to the next step, but I haven’t looked at this code in years, so that’s just a guess.

Ah, I think I see the first problem. The import code looks for json/monthly/storyline. Your zip has only daily, so the importer won’t find them.

So yeah, it basically wants to get moves_export/json/monthly/storyline, and will then attempt to work through the files in there.

I’ve got a feeling “activities” in Moves data were a detail that couldn’t be imported. Like, they were just a list of physical activities inside a Visit - kind of the equivalent of HealthKit Workouts that don’t have Workout Routes - so there wasn’t any location data to import. But I might be wrong on that… I’ll keep having a poke around the import code.

Oh, yeah, I don’t have the monthly storyline exports. Two reasons for that:

  1. the SQLite database schema is structured around daily storylines, and
  2. I couldn’t find any JSON example files for monthly storylines.

Do you have an idea of the fields or schema that the importer expects the monthly storyline files to have? Also, which filenames would be expected to be in the moves_export/json/monthly/storyline directory?

I could then prepare a monthly export to test if the import works.

Good questions. I’ll email you one of the moves_export.zip files I’ve kept sitting around… K, sent!

Thanks! So, it looks like the monthly storyline exports are located in:

moves_export/json/monthly/storyline/storyline_YYYY-MM.json

Monthly storyline JSON files appear to be a list of daily storyline objects, i.e. the same objects I had in the moves_export/json/daily/storyline/storyline_YYYYMMDD.json files, grouped by month into one big array.
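The merge itself was simple. Here’s the sketch I used, assuming each daily file holds a single storyline object with a "date" field like in my excerpt above:

```python
import json
from collections import defaultdict
from pathlib import Path

def merge_daily_to_monthly(daily_dir, monthly_dir):
    """Group daily storyline files into one JSON array per month,
    named storyline_YYYY-MM.json as seen in the sample export."""
    months = defaultdict(list)
    for path in sorted(Path(daily_dir).glob("storyline_*.json")):
        day = json.loads(path.read_text())  # one daily storyline object
        date = day["date"]                  # e.g. "20131008"
        months[f"{date[:4]}-{date[4:6]}"].append(day)
    out = Path(monthly_dir)
    out.mkdir(parents=True, exist_ok=True)
    for month, days in months.items():
        (out / f"storyline_{month}.json").write_text(json.dumps(days, indent=2))
```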

Arc unpacks the updated ZIP file and its nested json.zip, but nothing more happens. Either I’m still missing something about the format, or the importer really isn’t working anymore.

Overview of the ZIP file’s contents

For future reference, this is an overview of the structure of the zip file.

What I didn’t know before is that moves_export.zip holds nested zip files (instead of directories). Here’s the structure:

moves_export.zip
  csv.zip
  geojson.zip
  georss.zip
  gpx.zip
  ical.zip
  json.zip

Where json.zip is structured as follows:

json/
  daily/
  monthly/
  weekly/
  yearly/

Each of these (daily, monthly, …) includes folders for:

  activities/
    activities_[YYYY-MM | YYYYMMDD].json
  places/
    places_[YYYY-MM | YYYYMMDD].json
  storyline/
    storyline_[YYYY-MM | YYYYMMDD].json
  summary/
    summary_[YYYY-MM | YYYYMMDD].json

Then there’s a:

json/
  full/
    activities.json
    places.json
    storyline.json
    summary.json
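For anyone wanting to rebuild this layout from converted files, here’s a sketch that packs monthly storyline files into the nested structure. It only produces the inner json.zip; whether Arc also needs the other nested zips (csv.zip, gpx.zip, …) is something I don’t know:

```python
import io
import zipfile
from pathlib import Path

def build_moves_export(storyline_dir, out_path="moves_export.zip"):
    """Pack storyline_YYYY-MM.json files into the nested layout above:
    moves_export.zip -> json.zip -> json/monthly/storyline/*.json."""
    # Build json.zip in memory first, then embed it in the outer archive.
    inner = io.BytesIO()
    with zipfile.ZipFile(inner, "w", zipfile.ZIP_DEFLATED) as jz:
        for path in sorted(Path(storyline_dir).glob("storyline_*.json")):
            jz.write(path, f"json/monthly/storyline/{path.name}")
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as outer:
        outer.writestr("json.zip", inner.getvalue())
```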

Yeah, there’s a good chance it’s just not working anymore :grimacing:

I think the next thing to do will be to send you a TestFlight build, so you can see the debug log views and see what’s being logged to console / log file.

Let’s see if I can also attach the Swift source files to this thread, so you’ve got visibility of those too… Hm, it won’t let me drag and drop Swift files in. Ok, hopefully this iCloud Drive share link will work: iCloud Drive - Apple iCloud

I’ll send you a TestFlight invite email. Looking at the logging code, it’s so archaic that it uses a global log() function I didn’t even realise still existed! But looks like it still gets to the debug log files, so should still help.

In the TestFlight build there are extra options at the bottom of the Settings tab. The one you’ll want is “Debug Logs”, with a new log file started at each app launch.

Cool, got the invite. Will I be able to install the TestFlight version without losing my current Arc Timeline data? It asks me to replace the currently installed (App Store) version and warns about potential data loss.

Yep. It’s safe to change between TestFlight and App Store builds.

The data is only at risk if the app is uninstalled. And even then, you have to delete all apps in the Arc family, to get iOS to delete the shared app container (which is where the databases live).