Apps and robots!
Researchers at Carnegie Mellon are teaching robots.
Apparently, our robotic servants aren’t cutting it, because a new research program takes lessons from how babies learn and applies them to robots. They’re teaching robots how to use everyday physical objects.
“Psychological studies have shown that if people can’t affect what they see, their visual understanding of that scene is limited,” said Lerrel Pinto, a PhD student in the research group. “Interaction with the real world exposes a lot of visual dynamics.”
Google has given up to $1.5 million to the researchers. Hopefully this is the next step to robots washing my car and cooking.
Next is an app that can replace your doctor.
Not really. But it can help you with everyday questions like, what’s that mole on my arm?
A new algorithm for a mobile app has been trained on a database of close to 130,000 images of skin diseases.
The programmers, again funded by Google, hope to produce a sort of smart x-ray app that can identify skin cancer and other dermatological problems.
Or hopefully just tell you it’s a mole.
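For the curious, here’s a toy sketch of the basic idea behind image classification, using a nearest-neighbor lookup instead of the deep learning the real app almost certainly uses. Everything in it is made up for illustration: the feature names, the numbers, and the tiny database all stand in for the ~130,000 real training images.

```python
# Toy sketch (NOT the researchers' actual method): label a skin lesion
# by finding the closest match in a labeled database. Each "image" is
# reduced to a hypothetical 3-number feature vector, e.g. (asymmetry,
# border irregularity, color variance).

from math import dist

# Hypothetical labeled examples standing in for the real image database.
database = [
    ((0.1, 0.2, 0.1), "benign mole"),
    ((0.2, 0.1, 0.2), "benign mole"),
    ((0.9, 0.8, 0.9), "melanoma"),
    ((0.8, 0.9, 0.7), "melanoma"),
]

def classify(features):
    """Return the label of the database entry nearest to `features`."""
    _, label = min(database, key=lambda entry: dist(entry[0], features))
    return label

print(classify((0.15, 0.18, 0.12)))  # prints "benign mole"
```

A real diagnostic system would use a neural network trained on the full image set, but the core loop is the same: compare a new image against what the model learned from labeled examples, and report the closest diagnosis.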