MIT researchers have presented a paper describing a new approach to fast Fourier transform math that could provide a major lift to image compression and other signal processing technology. The new technique, developed by associate professor Dina Katabi and professor Piotr Indyk, divides a signal into sections and looks for the "sparse" but strong frequencies within each one. Because it only needs to sample a small, random subset of the signal rather than processing it in full, it could speed up processing by as much as ten times, MIT said.
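
To make the idea concrete, here is a minimal Python sketch of the bucketing trick that sparse-FFT methods rely on: the signal is aliased down into a small number of buckets, only the strongest buckets are kept, and a phase comparison between two shifted bucketings locates each dominant frequency. The bucket count, the example frequencies, and the phase-offset step are illustrative assumptions for a clean, collision-free case, not the exact algorithm from the MIT paper.

```python
# Sketch of frequency bucketing for a spectrally sparse signal.
# Assumes N is a multiple of B and no two tones land in the same bucket;
# the real algorithm uses random permutations and repeated rounds to
# handle such collisions.
import numpy as np

N = 2**14                          # signal length
B = 64                             # number of buckets (one small FFT of this size)
true_freqs = [123, 4567, 11111]    # example tones, chosen to fall in distinct buckets

t = np.arange(N)
# Signal whose spectrum is sparse: a sum of a few complex tones.
x = sum(np.exp(2j * np.pi * f * t / N) for f in true_freqs)

s = N // B                   # subsampling stride
y0 = np.fft.fft(x[0::s])     # aliased spectrum: bucket j collects frequencies ≡ j (mod B)
y1 = np.fft.fft(x[1::s])     # same buckets, but each tone gains a phase of 2*pi*f/N

# Keep only the strongest buckets; in this sketch each holds one dominant tone.
strong = np.argsort(np.abs(y0))[-len(true_freqs):]
recovered = []
for j in strong:
    # The phase difference between the two shifted bucketings encodes f/N (mod 1).
    phase = np.angle(y1[j] / y0[j]) / (2 * np.pi)
    recovered.append(int(round(phase * N)) % N)

print("true:     ", sorted(true_freqs))
print("recovered:", sorted(recovered))
```

The point of the sketch is the cost profile: instead of an FFT over all N samples, it touches only a couple of length-B slices of the signal and runs two small FFTs, which is where the claimed speedup for sparse signals comes from.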