Augmented reality overlays digital information and interactive content onto real-world scenes and objects. To register data tightly to objects in a scene, applications most commonly rely on fiducial markers, such as QR codes and ArUco tags.
LightAnchors enables spatially-anchored data in augmented reality applications without special hardware.
Unlike most prior tracking methods, which instrument objects with markers, LightAnchors takes advantage of point lights already found in many objects and environments. For example, most electrical appliances now feature small LED status lights, and light bulbs are common in both indoor and outdoor settings.
Software-only device communication
LightAnchors requires no extra hardware; it simply takes advantage of the high-speed cameras already found on recent smartphones.
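To give a flavor of the receiving side, the sketch below decodes a binary payload from per-frame brightness samples of one tracked point light. The adaptive threshold and one-bit-per-frame framing here are simplifying assumptions for illustration, not the exact LightAnchors modulation scheme described in the paper.

```python
# Illustrative decoder: recover bits from per-frame brightness samples
# of a tracked point light. Assumes one bit per camera frame and a
# simple on/off (amplitude) encoding -- a sketch, not the paper's protocol.

def decode_blinks(brightness, threshold=None):
    """Map one brightness sample per frame to a list of bits (1 = light on)."""
    if threshold is None:
        # Adaptive threshold: midpoint between the darkest and brightest samples.
        threshold = (min(brightness) + max(brightness)) / 2
    return [1 if b > threshold else 0 for b in brightness]

def bits_to_bytes(bits):
    """Pack bits (MSB first) into bytes, dropping any trailing partial byte."""
    out = bytearray()
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for bit in bits[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

# Example: eight frames of brightness encoding the ASCII byte 'A' (0b01000001).
samples = [10, 200, 12, 9, 11, 8, 10, 210]
assert bits_to_bytes(decode_blinks(samples)) == b"A"
```

In practice the brightness samples would come from cropping a small window around the detected light in each high-speed camera frame; that detection and tracking step is omitted here.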
Bring existing devices into AR experiences
Many devices already contain microprocessors that can control status lights and can be LightAnchor-enabled with a firmware update.
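The transmitting side of such a firmware update can be sketched as turning a payload into a per-frame on/off schedule for the status LED. The preamble pattern and one-bit-per-frame framing below are hypothetical placeholders, not the actual LightAnchors encoding.

```python
# Illustrative encoder: turn a payload into the LED on/off state for each
# camera frame. The preamble and framing are assumptions for illustration.

PREAMBLE = [1, 0, 1, 0, 1, 0]  # hypothetical synchronization pattern

def byte_to_bits(byte):
    """Expand one byte into 8 bits, MSB first."""
    return [(byte >> shift) & 1 for shift in range(7, -1, -1)]

def encode_payload(payload):
    """Return the LED state (1 = on, 0 = off) for each successive frame."""
    bits = list(PREAMBLE)
    for byte in payload:
        bits.extend(byte_to_bits(byte))
    return bits

# A device might broadcast a one-byte sensor reading, e.g. a temperature of 72:
frames = encode_payload(bytes([72]))
```

On a real microcontroller, the firmware would step through this schedule at a fixed rate matched to the camera's frame rate, toggling the existing status LED instead of building a list.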
Dynamic payloads without the cloud
Unlike conventional markers, LightAnchors can transmit dynamic payloads without the need for Wi-Fi, Bluetooth, or any other connectivity.
LightAnchors is brought to you by the collaborative efforts of the Future Interfaces Group and SMASH Lab at Carnegie Mellon University. This research was generously supported with funds from the CONIX Research Center, one of six centers in JUMP, a Semiconductor Research Corporation (SRC) program. We are also grateful to Anthony Rowe and his lab for early help on this project.
To cite LightAnchors, please use the following reference:
Karan Ahuja, Sujeath Pareddy, Robert Xiao, Mayank Goel, and Chris Harrison. 2019. LightAnchors: Appropriating Point Lights for Spatially-Anchored Augmented Reality Interfaces. In Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST '19). ACM, New York, NY, USA, 189-196. DOI: https://doi.org/10.1145/3332165.3347884