28.06.2023, 22:45. Source: Engadget

Researchers reconstruct 3D environments from eye reflections

Researchers at the University of Maryland have turned eye reflections into (somewhat discernible) 3D scenes. The work builds on Neural Radiance Fields (NeRF), an AI technique that can reconstruct environments from 2D photos. Although the eye-reflection approach has a long way to go before it spawns any practical applications, the study (first reported by Tech Xplore) provides a fascinating glimpse into a technology that could eventually reveal an environment from a series of simple portrait photos.

The team used subtle reflections of light captured in human eyes (using consecutive images shot from a single sensor) to try to discern the person's immediate environment. They began with several high-resolution images from a fixed camera position, capturing a moving individual looking toward the camera. They then zoomed in on the reflections, isolating them and calculating where the eyes were looking in the photos.

The results (here's the entire set animated) show a decently discernible environmental reconstruction.
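The geometric idea behind reading scenes out of eye reflections is that the cornea behaves like a small convex mirror: a camera ray that hits it bounces off toward the environment, so tracing those reflected rays tells you which part of the scene each reflection pixel saw. Here is a minimal sketch of that ray geometry, assuming a simple spherical cornea model; the function names and the constants are illustrative, not taken from the Maryland paper.

```python
import numpy as np

# Average radius of curvature of the human cornea, roughly 7.8 mm.
# Illustrative constant; the actual study fits its own eye model.
CORNEA_RADIUS_MM = 7.8

def reflect(d, n):
    """Mirror-reflect direction d about unit surface normal n."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def cornea_reflection_ray(ray_origin, ray_dir, eye_center):
    """Intersect a camera ray with a spherical cornea centered at
    eye_center and return (hit_point, reflected_direction),
    or None if the ray misses the eye."""
    d = ray_dir / np.linalg.norm(ray_dir)
    oc = ray_origin - eye_center
    # Quadratic t^2 + b*t + c = 0 for the ray/sphere intersection.
    b = 2.0 * np.dot(d, oc)
    c = np.dot(oc, oc) - CORNEA_RADIUS_MM ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the corneal sphere
    t = (-b - np.sqrt(disc)) / 2.0  # nearest intersection
    if t <= 0:
        return None
    hit = ray_origin + t * d
    normal = (hit - eye_center) / CORNEA_RADIUS_MM  # outward sphere normal
    return hit, reflect(d, normal)

# A ray aimed straight at the eye center reflects straight back,
# while off-center rays fan out across the scene, which is what lets
# a small eye reflection cover a wide field of view.
hit, rdir = cornea_reflection_ray(
    np.array([0.0, 0.0, 0.0]),    # camera at the origin
    np.array([0.0, 0.0, 1.0]),    # looking down +z
    np.array([0.0, 0.0, 100.0]),  # eye 100 mm away
)
```

Collecting such reflected rays across several portrait photos yields the multi-view supervision that a NeRF-style optimizer can then fit a radiance field to.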

Read more at Engadget


JustMac.info © Thomas Lohner