For years now, researchers have been exploring ways to create devices that understand the nonverbal cues we take for granted in human-human interaction. One of the more interesting projects we've seen of late is led by Professor Peter Robinson at the University of Cambridge's Computer Laboratory, who is working on what he calls "mind-reading machines": systems that can infer people's mental states from their body language. By analyzing faces, gestures, and tone of voice, the hope is that machines could be made more helpful (hell, we'd settle for "less frustrating"). Peep the video after the break to see Robinson pit a traditional (and annoying) satnav against one that can both read his emotional state and express a few of its own.