An iOS library that combines ARKit's local accuracy with CoreLocation's GPS scale to place AR objects using real-world coordinates.
ARKit-CoreLocation is an open-source iOS library that merges ARKit's precise local mapping with CoreLocation's global GPS data. It enables developers to create augmented reality experiences anchored to real-world geographic coordinates, bridging the gap between high-fidelity AR and large-scale location-based applications.
iOS developers building location-based augmented reality apps, such as navigation aids, tourism guides, or geospatial data visualization tools, who need to place AR objects at specific geographic coordinates.
Developers choose this library because it uniquely combines ARKit's high local accuracy with CoreLocation's global scale. It provides the SceneLocationView class to simplify scene and location-node management, and uses sensor fusion to improve location accuracy, reducing GPS error in AR contexts.
Combines the high accuracy of AR with the scale of GPS data.
Enables precise placement of AR objects at geographic coordinates, allowing developers to create location-based experiences like navigation overlays or points of interest, as shown in the demos.
Combines ARKit's motion tracking with CoreLocation data to reduce GPS errors, especially over short distances, leveraging sensor fusion as described in the improved location accuracy section.
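The core idea behind this sensor fusion can be sketched in plain Swift. The snippet below is illustrative only, not ARCL's actual implementation: it anchors a confirmed GPS fix to an ARKit camera position, then derives later coordinates from ARKit's drift-resistant short-range translation instead of noisy raw GPS.

```swift
import Foundation
import CoreLocation
import simd

// Illustrative sketch, not the library's code: pair a confirmed GPS reading
// with the ARKit camera position recorded at the same moment.
struct LocationAnchor {
    let gpsFix: CLLocation          // confirmed GPS reading
    let arPosition: simd_float3     // ARKit camera position at that moment
}

// Estimate the current coordinate by translating the anchored GPS fix by the
// metres ARKit reports the device has moved since the anchor was taken.
func estimatedCoordinate(anchor: LocationAnchor,
                         currentARPosition: simd_float3) -> CLLocationCoordinate2D {
    let delta = currentARPosition - anchor.arPosition
    // With a gravityAndHeading world alignment, ARKit's +x points east
    // and -z points north.
    let eastMetres = Double(delta.x)
    let northMetres = Double(-delta.z)

    let metresPerDegreeLat = 111_111.0  // rough spherical-Earth approximation
    let lat = anchor.gpsFix.coordinate.latitude + northMetres / metresPerDegreeLat
    let lon = anchor.gpsFix.coordinate.longitude
        + eastMetres / (metresPerDegreeLat * cos(lat * .pi / 180))
    return CLLocationCoordinate2D(latitude: lat, longitude: lon)
}
```

Over short distances ARKit's translation estimate is far more stable than consecutive GPS readings, which is why fusing the two reduces apparent jitter.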
Provides SceneLocationView class that handles AR rendering and location node updates, reducing boilerplate code for integrating ARKit and CoreLocation.
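A minimal setup following the pattern in the library's quick start might look like this (the coordinate, altitude, and "pin" asset name are placeholder values):

```swift
import UIKit
import CoreLocation
import ARCL  // ARKit-CoreLocation

class POIViewController: UIViewController {
    let sceneLocationView = SceneLocationView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneLocationView.frame = view.bounds
        view.addSubview(sceneLocationView)

        // Place a pin 300 m above a real-world coordinate (placeholder values).
        let coordinate = CLLocationCoordinate2D(latitude: 51.504571,
                                                longitude: -0.019717)
        let location = CLLocation(coordinate: coordinate, altitude: 300)
        let image = UIImage(named: "pin")!  // your own asset
        let annotationNode = LocationAnnotationNode(location: location, image: image)
        sceneLocationView.addLocationNodeWithConfirmedLocation(locationNode: annotationNode)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneLocationView.run()   // starts the AR session and location updates
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneLocationView.pause()
    }
}
```

SceneLocationView owns the AR session lifecycle and keeps location nodes positioned as new GPS fixes arrive, which is the boilerplate reduction the library advertises.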
Supports annotations using UIImage, UIView, or CALayer, offering customizable options for points of interest that can face the user or scale with distance, as detailed in the LocationAnnotationNode usage.
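Assuming the view-based initializer available in recent ARCL releases, a UIView-backed annotation can be built like this (the label styling and the `scaleRelativeToDistance` property name follow the README; verify against the version you install):

```swift
import UIKit
import CoreLocation
import ARCL

// Hedged sketch: builds a label-backed annotation node at a coordinate.
func makeLabelNode(at location: CLLocation, text: String) -> LocationAnnotationNode {
    let label = UILabel(frame: CGRect(x: 0, y: 0, width: 120, height: 40))
    label.text = text
    label.textAlignment = .center
    label.backgroundColor = .white

    // UIView-based initializer; the view is rendered to an image internally.
    let node = LocationAnnotationNode(location: location, view: label)
    // Keep the annotation's apparent size roughly constant with distance.
    node.scaleRelativeToDistance = true
    return node
}
```

Because the view is rasterized when the node is created, any later changes to the UILabel will not appear in the scene.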
The improved-accuracy feature is experimental and can be thrown off by ARKit's occasional errors in estimating the user's position and heading, as acknowledged in the issues section of the README.
Requires manual adjustment of the scene heading because the device's True North calibration is only accurate to within about 15º, adding setup complexity and user intervention, as noted in the True North calibration section.
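The README's True North calibration section describes exposing heading-adjustment controls to the user; a sketch of that hookup is below. The helper names follow earlier releases and may differ in current versions, so treat them as assumptions:

```swift
import UIKit
import ARCL

class CalibrationViewController: UIViewController {
    let sceneLocationView = SceneLocationView()

    // Wired to on-screen buttons so the user can nudge the scene heading
    // until annotations line up with the real world.
    @objc func rotateLeftTapped() {
        sceneLocationView.moveSceneHeadingAntiClockwise()  // assumed helper name
    }

    @objc func rotateRightTapped() {
        sceneLocationView.moveSceneHeadingClockwise()      // assumed helper name
    }
}
```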
UIView annotations are rendered to a static UIImage internally, so dynamic content updates and real-time interactivity are not supported, a constraint noted in the quick start guide.