With the launch of iOS 17.2, Apple has explained the map-related data it collects to improve its augmented reality location features. In a new support document, Apple says it aims to improve the speed and accuracy of the augmented reality functionality in the Maps app.
When you use augmented reality features in Maps, including walking directions or the location refinement option, Apple collects information through “feature points” that represent the shape and appearance of stationary objects such as buildings. The data does not include photos or images.
Apple explains that the app uses on-device machine learning to compare feature points with Apple Maps reference data sent to your iPhone. Moving objects like people and vehicles are filtered out of the camera data, so Apple only collects feature points from stationary objects.
By comparing feature points to Apple Maps reference data, Maps can pinpoint a user’s location and provide detailed directions in AR. Using AR directions or the location accuracy improvement feature also updates Apple’s reference data, improving the accuracy of augmented reality over time.
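To make the matching idea concrete, here is a deliberately simplified sketch. Apple has not published its matching algorithm, and a real localizer solves a full 6-degree-of-freedom camera pose; this toy version just matches 2-D points to reference landmarks by nearest neighbor and averages the displacements to estimate where the device actually is. All names and numbers below are illustrative assumptions.

```python
import math

def estimate_offset(observed, reference):
    """Estimate a positional offset by matching each observed feature
    point to its nearest reference point and averaging the displacements.

    Hypothetical illustration only: Apple's actual AR localization is
    undisclosed and far more sophisticated than this 2-D translation fit.
    """
    dx = dy = 0.0
    for ox, oy in observed:
        # nearest reference landmark to this observed point
        rx, ry = min(reference, key=lambda p: math.hypot(p[0] - ox, p[1] - oy))
        dx += rx - ox
        dy += ry - oy
    n = len(observed)
    return (dx / n, dy / n)

# Reference landmarks (e.g. building corners) in map coordinates.
reference = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
# The same landmarks as seen by a device whose position estimate
# is off by (-1, -1): every point appears shifted.
observed = [(-1.0, -1.0), (9.0, -1.0), (-1.0, 9.0)]
offset = estimate_offset(observed, reference)  # correction to apply
```

The averaged displacement recovers the (1.0, 1.0) correction needed to bring the device's estimated position back in line with the reference data.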
The data Apple collects is encrypted and is not associated with an individual user or Apple ID. Apple also uses on-device machine learning to add “noise” to the feature point data, introducing irregular variations that complicate any attempt to reconstruct an image from those points.
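The noise step can be sketched as follows. Apple has not described its actual noise model, so this is only an assumed illustration: each coordinate of a 3-D feature point is jittered by a small uniform random offset, which preserves the rough shape of the scene for matching while degrading any pixel-accurate reconstruction. The function name and scale are invented for the example.

```python
import random

def add_noise(points, scale=0.05, seed=None):
    """Perturb each 3-D feature point with small random offsets.

    Hypothetical sketch: Apple's real noise model is undisclosed;
    this simply jitters each coordinate uniformly within +/- scale.
    """
    rng = random.Random(seed)
    return [
        (x + rng.uniform(-scale, scale),
         y + rng.uniform(-scale, scale),
         z + rng.uniform(-scale, scale))
        for x, y, z in points
    ]

points = [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
noisy = add_noise(points, scale=0.05, seed=42)
```

Because the perturbation is bounded and random, nearby points stay near their originals (so coarse matching still works) while the exact geometry needed to rebuild an image is lost.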
According to Apple, only an extremely sophisticated attack with access to the company’s encryption system would be able to recover an image from the feature points, and because the data is encrypted and accessible only to Apple, “an attack and data recovery is extremely unlikely.”
AR data collection can be disabled to prevent Apple from gathering this information. You can find the Improve AR Accuracy toggle in the Settings app by going to Privacy & Security, then tapping Analytics & Improvements.