I’d never seen this before, or at least never paid attention to it until now. But last weekend I wondered why my location in the live view in Arc followed the road next to the tracks, even though I was on a train. At first I thought it was due to an inaccuracy in determining my location, but then I realized that the blue dot followed even smaller bends in the road. So I figured Arc was assuming I was in a car, and was tying my location to the road (as far as the light blue circle would allow).
I remembered someone here asking about such a feature for train travel a few years ago, and I thought maybe Matt had added this to Arc… but I don’t remember reading about it, and for car travel it doesn’t make that much sense: it’s useful for navigation, but not so much for tracking my position.
But then it dawned on me, and I checked Apple Maps and a few other apps: same behavior everywhere. So it’s not Arc’s fault, but Apple’s, I guess.
This is disappointing and quite a problem, because it means Apple is not trying to determine my location as accurately as possible, but is instead trying to guess it from the map data. Am I right about that?
Oh, someone else has seen it too now! I only saw it once, just recently, so wasn’t sure it was for real. But yep, “snap to roads”. I think you’re right about that.
What version of iOS are you on? I noticed it after updating to one of the iOS 18.4 betas, I think. So the hunch is that Apple have added it in just recently.
Snap To Roads has been a feature of navigation apps for a long time. Google Maps and Apple Maps will do it strongly when in active navigation sessions. Which makes sense - if you’re getting driving directions you’re very likely to be on the road.
I’ve also long toyed with the idea of using OpenStreetMap data to provide Arc’s underlying Kalman filter with road information as a weak hint/signal, to hopefully reduce noise, drift, etc. But it’s always been more of a “that would be cool to try” thing than a “that’s something we actually need” thing, so I’ve never got around to it.
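To make that “weak hint” idea concrete, here’s a minimal sketch of how a nearby road point could be fed into a simple Kalman-style filter as a low-weight pseudo-measurement. This is purely illustrative, not Arc’s actual filter: the `WeakHintFilter` type and the nearest-road lookup are invented for the example, and a real implementation would work in projected metres rather than raw lat/lon.

```swift
import Foundation
import CoreLocation

// Rough sketch only — not Arc's real filter. The nearest-road lookup is assumed
// to happen elsewhere; the hint is just one more measurement, with a deliberately
// large variance so it can only nudge the estimate rather than snap it to the road.
struct WeakHintFilter {
    var estimate: CLLocationCoordinate2D
    var variance: Double    // current estimate uncertainty (same units as measurement variances)

    // Standard scalar Kalman update, applied to lat/lon independently for brevity.
    mutating func update(with measurement: CLLocationCoordinate2D, measurementVariance: Double) {
        let gain = variance / (variance + measurementVariance)
        estimate.latitude  += gain * (measurement.latitude  - estimate.latitude)
        estimate.longitude += gain * (measurement.longitude - estimate.longitude)
        variance *= (1 - gain)
    }

    // Normal GPS fix: weighted by the reported horizontal accuracy.
    mutating func ingest(gpsFix: CLLocation) {
        update(with: gpsFix.coordinate,
               measurementVariance: gpsFix.horizontalAccuracy * gpsFix.horizontalAccuracy)
    }

    // The weak road hint: same update, but with a huge variance so its influence is tiny.
    mutating func ingest(nearestRoadPoint: CLLocationCoordinate2D) {
        update(with: nearestRoadPoint, measurementVariance: 10_000)
    }
}
```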
Anyway it seems like Apple might be experimenting with doing that automatically now, inside the Core Location framework that all iOS apps use for location data. Which is … I think not a great idea. And hopefully something we can turn off.
Well, unless it turns out to be actually helpful. But yeah, I’ll set aside some time to investigate more and figure out what’s best to do with it / about it.
Aside: My experience was very similar to yours. That blue live-updating map dot in Arc comes direct from iOS, not from Arc’s processed or filtered data, so it’s telling you purely what the phone itself is reporting, before Arc does anything with it. And I was watching it while travelling (probably also on a train) and saw it look suspiciously like it was following a nearby road. My first thought was “that’s a crazy coincidence”, but it… was too precise, and stuck at it for too long, to be just pure chance.
I updated to 18.4 on the morning of March 30th. (from 18.3, having skipped a few betas over the last few months). A few hours later, I noticed this “snap to roads” behavior for the first time. I already know it from navigation apps (e.g. in OsmAnd you can turn it off) and understand its purpose.
It would only be really useful for tracking train journeys. First, because OSM data on train routes is very good, while GPS reception on trains is often much worse than in cars. Secondly, in Germany/Europe there are often cycle paths alongside or near a road. I wouldn’t want this “snap-to-road” function there, not even as a weak hint.
But I really doubt Apple paid much attention to trains (or bikes) when developing this new feature for Maps/their Core Location Framework.
So far I have not found a way to disable this feature… There only seems to be this new toggle “Improve location accuracy” (Location Services > System Services > Product Improvement).
Yeah this is similar to the thoughts I’ve had when this idea has popped up from time to time. The main difficulty being … what line to snap to?
The activity type classifiers can do a really good job a lot of the time, but we don’t use them in real time: samples are classified later, to avoid the energy cost during real-time recording. So how would we know which line (train line, road, cycle path, etc.) to feed into the Kalman filter in real time? We… wouldn’t.
And even if we were classifying in real time, there’s still the common case of the classifiers having a hard time getting it right in complex situations. Having the wrong line fed to the filter as a hint could make things considerably worse (as we’re possibly going to see with this new iOS 18.4 behaviour).
Yep. It’s one of my favourite rants: Apple are very US-focused and often do a terrible job for trains, or anything that isn’t a car, especially outside the US. I don’t have faith in them either understanding the complexity well or managing it well in this case.
There’s a setting on the CLLocationManagers used for recording, to indicate what kind of travel it is. But its types are:
other
automotiveNavigation
fitness
otherNavigation
airborne
The same issues described above mean that Arc can’t do real time changes to that with any expectation of accuracy, so it’s left as “other”. But my hunch is that Apple have used the “automotiveNavigation” and “otherNavigation” settings to implicitly enable Snap To Roads in the past, and are now also enabling it for “other”. Which… does sound like a bug to me.
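For reference, that setting is the standard activityType property on CLLocationManager, and the values above are the CLActivityType cases. A minimal example of how a recorder sets it (real Core Location API; the navigation line is just shown for contrast):

```swift
import CoreLocation

let manager = CLLocationManager()
manager.desiredAccuracy = kCLLocationAccuracyBest

// What Arc effectively does: leave the activity type at the generic default,
// since it can't know the mode of travel accurately in real time.
manager.activityType = .other

// A navigation app would instead hint that the user is on roads:
// manager.activityType = .automotiveNavigation
```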
I’d file it as a bug with Apple, but their bug reporting system is where happiness goes to die. There’s little to no expectation of any useful outcome. Sigh
The lack of OSM data being used to hint the tagging system has always been frustrating for me, particularly because I’m an avid OSM contributor! I can also think of a handful of instances where it would be incredibly useful! [1]
If my journey didn’t start or end at an airport (intersection of Arc track and aeroway=runway), I definitely wasn’t in an airplane.
If I am traveling under 250 km/h I am also not in a plane!
Trains and trams cannot travel off of tracks (unless they get loaded into a ferry or something goes horribly wrong).
The same applies to chairlifts, funiculars, ski lifts, and cable cars. None of these should probably be auto-suggested unless they are present on the map. Ideally this would let Arc surface these under-used types to the user on the first try based on map data, instead of requiring them to be manually tagged to train the engine that these transport options are frequently used in those locations.
If I am over water and traveling over 8 km/h I am likely in a boat.
Golf only occurs on golf courses
Rowing, kayaking, and surfing only really occur over water
If I am over landuse=farmland it’s more likely (though not a complete guarantee) that I am in a tractor!
I’m sure there are more!
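A hedged sketch of how a few of those rules could be expressed as hard constraints for the classifiers. All of the type and function names here are invented for illustration (nothing like this exists in Arc or LocoKit today), and the map lookups are assumed to happen elsewhere.

```swift
import Foundation

// Hypothetical map context derived from OSM data near a sample (all names invented).
enum MapContext {
    case overWater, onRailway, onRunway, onGolfCourse, onFarmland
}

struct SampleContext {
    let speedKmh: Double
    let mapContexts: Set<MapContext>
}

// Activity types the classifier could rule out from map context and speed alone,
// following the rules listed above.
func ruledOutActivities(for ctx: SampleContext) -> Set<String> {
    var out: Set<String> = []
    if ctx.speedKmh < 250 { out.insert("airplane") }                    // too slow to be a plane
    if !ctx.mapContexts.contains(.onRailway) { out.formUnion(["train", "tram"]) }  // trains/trams stay on tracks
    if !ctx.mapContexts.contains(.overWater) { out.formUnion(["boat", "rowing", "kayaking", "surfing"]) }
    if !ctx.mapContexts.contains(.onGolfCourse) { out.insert("golf") }  // golf only occurs on golf courses
    return out
}
```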
I don’t think “snap to roads” is a good idea for everything (especially not for vehicles that can notably go off-road). There’s a lot of value in maintaining an accurate log of GPS data; if it is to be added, I would absolutely give it a toggle.
Yeah… I feel that, you’re not the only one with this experience
I’ll also note that I tag my Arc data religiously every day. It still routinely gets things wrong without manual intervention, and is especially bad when traveling as it often doesn’t have much initial data to go off of in those regions. ↩︎
Hahah. @hwilkinson you’ve just listed off a whole bunch of my wishlist fantasies
The kinds of things I think about while not working, like “wouldn’t it be SO cool if we were doing [thing] or [other thing]”. But then I have a little quiet cry while I remind myself that there’s only one of me, and many more higher priority tasks I have to do instead
But a very strong YES to all of those ideas! God I wish I had time to explore them. Not only would they be potentially great power ups for the models / classifiers, but it’d be genuinely incredibly fun stuff to work on too.
I wonder if the hardest part would just be surfacing contextually relevant OSM data to the classifiers in real time (and, to a lesser degree, at model rebuild time). Adding new model features to the models themselves is fairly trivial; it’s surfacing the data when it’s needed that’s often the much bigger challenge.
Though for the new heart rate model feature I’ve added in LocoKit2 / the Arc Editor app, I’m doing it delayed. Feeding in real-time heart rate data would be prohibitively energy expensive, but collecting it up later in the foreground and annotating the existing LocomotionSamples isn’t too much of a struggle. And then the reclassification of those samples gets that added accuracy (with some really satisfying results showing up so far).
Maybe OSM data could be a similar delayed annotation process… Yes it means initial classification will still be more naive, but once the added metadata is in, reclassification… Yeah… I SO want to spend some time on this!
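To make the delayed-annotation idea a bit more concrete, it could look something like the sketch below. The sample shape and the tag lookup are purely hypothetical (not LocoKit’s actual LocomotionSample or any real API); the point is just that the map lookup and reclassification happen later, in the foreground, rather than during real-time recording.

```swift
import Foundation
import CoreLocation

// Hypothetical sample shape for illustration only (not LocoKit's LocomotionSample).
struct AnnotatedSample {
    let coordinate: CLLocationCoordinate2D
    var osmTags: [String: String] = [:]    // e.g. ["railway": "rail"] or ["leisure": "golf_course"]
}

// Later, in the foreground (so the energy cost doesn't land on real-time recording),
// look up nearby OSM features for each sample, store them, then reclassify.
func annotateWithOSM(
    _ samples: inout [AnnotatedSample],
    lookupTags: (CLLocationCoordinate2D) async throws -> [String: String]
) async {
    for i in samples.indices {
        samples[i].osmTags = (try? await lookupTags(samples[i].coordinate)) ?? [:]
    }
    // reclassify(samples)   // the classifiers re-run with the richer map features
}
```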