Tokyo University engineer Tsuyoshi Horo has developed a novel system for controlling robots (or in this case, a moving stool) using a simple set of hand and body gestures. The researcher uses a circular array of cameras to track body movement within a controlled environment and then translates those movements into actions for an automaton. The cameras create a real-time, 3D, volumetric model of objects and people in the space, which is converted into a psychedelic stack of virtual cubes that are read and processed as data. That tracked movement lets a user control something like a bot's direction simply by pointing the way it should go.
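To make the pointing-to-motion idea a bit more concrete, here's a minimal Python sketch of the last step, not the team's actual software: the camera array and volumetric reconstruction are stood in for by a synthetic set of occupied voxel coordinates, and the body centroid, axis convention, and command names (move_forward and friends) are all illustrative assumptions.

```python
# Minimal sketch: estimate a pointing direction from a voxelized "arm"
# and map it to a coarse drive command. The voxel data below is synthetic;
# a real system would get it from the multi-camera volumetric model.
import numpy as np


def pointing_direction(arm_voxels: np.ndarray, body_centroid: np.ndarray) -> np.ndarray:
    """Unit vector from the body centroid to the farthest occupied
    "arm" voxel, which we take to be the pointing hand."""
    distances = np.linalg.norm(arm_voxels - body_centroid, axis=1)
    hand = arm_voxels[np.argmax(distances)]
    direction = hand - body_centroid
    return direction / np.linalg.norm(direction)


def direction_to_command(direction: np.ndarray) -> str:
    """Map the horizontal (x, y) component of the pointing vector to a
    coarse drive command, assuming x = right and y = forward."""
    angle = np.degrees(np.arctan2(direction[1], direction[0]))
    if -45 <= angle < 45:
        return "move_right"     # pointing along +x
    if 45 <= angle < 135:
        return "move_forward"   # pointing along +y
    if -135 <= angle < -45:
        return "move_backward"  # pointing along -y
    return "move_left"          # pointing along -x


if __name__ == "__main__":
    # Synthetic stand-in for the stack of virtual cubes: an "arm" of
    # voxels reaching forward along +y at roughly shoulder height.
    rng = np.random.default_rng(0)
    arm = np.column_stack([
        rng.normal(0.0, 0.05, 200),   # x: little sideways spread
        np.linspace(0.0, 0.6, 200),   # y: arm extends forward
        rng.normal(1.3, 0.05, 200),   # z: shoulder height (metres)
    ])
    torso_centroid = np.array([0.0, -0.15, 1.2])

    d = pointing_direction(arm, torso_centroid)
    print("pointing direction:", np.round(d, 2))
    print("robot command:", direction_to_command(d))
```

The real pipeline would also have to segment the arm out of the full volumetric model and smooth the command stream over time; this sketch only shows how a cloud of cubes can be reduced to a single "go that way" signal for the robot.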