Smartphones will get smarter with “deep learning” developments

Just as augmented reality through devices like Google Glass or Nokia City Lens helps us identify points of interest in our surroundings, soon the cameras on our smartphones will be able to identify nearly everything in a given field of vision.

No, this is not just about facial recognition; this is about “everything” recognition. Researchers at Purdue University are developing an AI that can recognize general surroundings for what they are.

Imagine this example: years ago, you took a picture of yourself and your friends at a concert. You want to pull up that picture again, but do not want to scroll through the thousands of pictures on your phone. What if you could initiate a search based on the surroundings in the picture, like “concert” or “stage,” and pull it up that way?

Researchers are building what they call a “deep learning” AI function that makes the machine process information in much the same manner humans do. It learns to recognize things like a “tree” or a “car” and creates layers of information that can then be indexed and searched. Naturally, processing power has not been sufficient for such development to be feasible in mobile devices, but as you might expect, the forward march of technology is tearing down that obstacle.
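To illustrate the idea of indexing and searching those recognized labels, here is a minimal sketch in Python. The tags, filenames, and structure are all hypothetical, invented for illustration; the article does not describe the actual system's implementation.

```python
from collections import defaultdict

def build_index(photo_tags):
    """Map each tag to the set of photos labeled with it.

    photo_tags: dict of filename -> set of tags, where the tags
    would come from a recognition model (hypothetical here).
    """
    index = defaultdict(set)
    for photo, tags in photo_tags.items():
        for tag in tags:
            index[tag].add(photo)
    return index

def search(index, *tags):
    """Return photos that carry every requested tag."""
    results = [index.get(tag, set()) for tag in tags]
    return set.intersection(*results) if results else set()

# Hypothetical example photo library with model-assigned tags.
photos = {
    "img_001.jpg": {"concert", "stage", "crowd"},
    "img_002.jpg": {"beach", "sunset"},
    "img_003.jpg": {"concert", "friends"},
}

index = build_index(photos)
print(search(index, "concert", "stage"))  # -> {'img_001.jpg'}
```

Once a model has tagged each photo, a search like “concert” plus “stage” becomes a simple set intersection over the index, which is why per-image labels make a photo library searchable.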

Associate Professor at Purdue’s Weldon School of Biomedical Engineering and Department of Psychological Science, Eugenio Culurciello summarized, “It analyzes the scene and puts tags on everything. When you give vision to machines, the sky’s the limit.”

The artificial intelligence “upgrade” potential this presents to mobile devices is phenomenal. The methodology is efficient, too: the research shows the approach being pursued is about 15 times more efficient than conventional graphics processing. It will have medical applications as well; as devices process a patient’s scans, they could eventually learn to detect signs of cancer or other risk factors.

This development truly is a matter of “when,” not “if,” we will see it in our smartphones and wearables. Funding for this research is being provided by the Office of Naval Research, the National Science Foundation, and DARPA. Culurciello has also started a company, called TeraDeep, to develop commercial applications of the research.

source: Purdue University
