A new Apple patent reveals that the company has been working on gesture-based commands detected by audio transducers placed at the corners of a given surface. In other words, as your fingers press and tap on a surface (like a keyboard, a computer casing, or even the bezel around an iPad), the audio receivers would determine where and how you touched it, and that would drive a user interface. The patent outlines a few different ways this could be done, from listening to vibrations in the housing itself to just keeping an electronic ear out for the sounds of your fingers touching the surface.
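The filing doesn't spell out the math, but corner-mounted receivers like this typically work by comparing when a tap's vibration reaches each sensor. Here's a minimal sketch of that general idea (time-difference-of-arrival localization); the sensor layout, surface size, sound speed, and brute-force solver below are purely illustrative assumptions on our part, not details from Apple's patent.

```python
# Rough TDOA (time-difference-of-arrival) tap localization sketch.
# Everything here -- sensor positions, sound speed, grid-search solver --
# is an illustrative assumption, not taken from the patent.
import itertools
import math

SPEED_OF_SOUND = 3000.0  # m/s, ballpark for vibration through a solid casing (assumption)

# Hypothetical corner sensors on a 0.3 m x 0.2 m surface
SENSORS = [(0.0, 0.0), (0.3, 0.0), (0.0, 0.2), (0.3, 0.2)]


def arrival_times(tap_xy):
    """Time for a tap's vibration to reach each corner sensor."""
    tx, ty = tap_xy
    return [math.hypot(tx - sx, ty - sy) / SPEED_OF_SOUND for sx, sy in SENSORS]


def locate_tap(measured_times, step=0.005):
    """Grid-search the surface for the point whose pairwise arrival-time
    differences best match the measured ones (least squares)."""
    pairs = list(itertools.combinations(range(len(SENSORS)), 2))
    measured_deltas = [measured_times[i] - measured_times[j] for i, j in pairs]

    best_xy, best_err = None, float("inf")
    x = 0.0
    while x <= 0.3:
        y = 0.0
        while y <= 0.2:
            t = arrival_times((x, y))
            err = sum((t[i] - t[j] - d) ** 2
                      for (i, j), d in zip(pairs, measured_deltas))
            if err < best_err:
                best_xy, best_err = (x, y), err
            y += step
        x += step
    return best_xy


if __name__ == "__main__":
    # Simulate a tap at (0.21, 0.07) and recover its position from timing alone.
    true_tap = (0.21, 0.07)
    print(locate_tap(arrival_times(true_tap)))  # roughly (0.21, 0.07)
```

Only the relative arrival times matter here, which is why a handful of cheap transducers at the corners could, in principle, cover an entire casing or bezel without any dedicated touch hardware under it.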
It seems like this would all be done via interaction with the surface itself, though of course we've