At WWDC 2022, Apple introduced the RoomPlan API for Swift, which produces room scans using the camera and LiDAR sensor on iPhone and iPad. It may look similar to the Scene Reconstruction API, which was introduced earlier and also relies on LiDAR. That API produces a polygonal mesh of the environment, which essentially captures the shape of the surroundings.
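To make the RoomPlan workflow concrete, here is a minimal sketch of starting a scan and exporting the result. It assumes iOS 16+, a LiDAR-equipped device, and a view controller that owns a `RoomCaptureView`; the class name and export path are illustrative, not from the original article.

```swift
import UIKit
import RoomPlan

// Hypothetical view controller: hosts a RoomCaptureView and reacts to
// the end of a capture session (names are illustrative).
final class RoomScanViewController: UIViewController, RoomCaptureSessionDelegate {
    private var captureView: RoomCaptureView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView = RoomCaptureView(frame: view.bounds)
        captureView.captureSession.delegate = self
        view.addSubview(captureView)

        // Start scanning with a default configuration.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called when the session ends; the raw CapturedRoomData is turned
    // into a CapturedRoom via RoomBuilder and exported as USDZ.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        guard error == nil else { return }
        Task {
            let room = try await RoomBuilder(options: []).capturedRoom(from: data)
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("Room.usdz")
            try room.export(to: url)
        }
    }
}
```

In a real app you would typically also implement the `RoomCaptureViewDelegate` callbacks to show Apple's built-in post-scan UI before exporting.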
Apple events always amaze the world, and 2020 was no exception. Apple presented the first mobile devices equipped with LiDAR: the iPad Pro 11 and the iPhone 12 Pro (and Pro Max). This active sensor measures physical distances to objects on a two-dimensional spatial grid. Today it is widespread in the automotive industry for object detection and collision avoidance.
Our client’s goal was to enhance various printed media (magazines, posters, banners, etc.) with interactive experiences using augmented reality. With AR, certain areas of the reading materials can be overlaid with digital content of various kinds: videos, images, and 3D models, as well as weather information, buttons that add extra functionality, and more.
Indoor positioning systems are becoming increasingly popular. Indeed, there are plenty of opportunities for real-time user navigation in GPS-denied environments; some interesting use cases are shown in Fig. 1.

Fig. 1. Indoor navigation use cases

There are several hardware options (see the It-Jim blog post). We have developed a positioning algorithm based on cheap Bluetooth beacons and the built-in IMU sensors of a mobile device.
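A core building block of beacon-based positioning is converting a beacon's received signal strength (RSSI) into an approximate distance. The sketch below uses the standard log-distance path-loss model; it is an illustrative assumption, not necessarily the exact model used in our algorithm, and the default calibration values are hypothetical.

```swift
import Foundation

// Log-distance path-loss model:
//   rssi(d) = txPower - 10 * n * log10(d)
// where txPower is the calibrated RSSI at 1 m (assumed -59 dBm here)
// and n is the path-loss exponent (assumed 2.0, i.e. free space).
// Solving for d gives the estimate below.
func estimateDistance(rssi: Double,
                      txPower: Double = -59.0,
                      pathLossExponent: Double = 2.0) -> Double {
    pow(10.0, (txPower - rssi) / (10.0 * pathLossExponent))
}

// An RSSI equal to txPower corresponds to roughly 1 m:
// estimateDistance(rssi: -59.0) ≈ 1.0
```

In practice, raw RSSI is noisy, so such estimates are usually smoothed (e.g. with a moving average or Kalman filter) and fused with IMU dead reckoning before trilateration.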