Google is teaching computers to trip balls. Nah, not really. But it is reverse engineering its neural networks, teaching the A.I. to spit back what it sees when it sorts through and recognises images, so engineers can better understand whether the algorithms are working and how to improve them. As Geoffrey Hinton discussed in that podcast I linked to the other day, neural nets are getting better and better at doing the things we ask them to do, but in a twist on the proverb, teach a computer to fish and you’ll spend a whole lifetime wondering how the hell it caught the damn thing… (h/t moonmilk via mefi, AKA @ranjit)
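For the curious, the "spit back what it sees" trick boils down to gradient ascent on the image itself. Here's a minimal sketch in PyTorch, not Google's actual code: the layer pick (inception4c), step count, and learning rate are arbitrary choices of mine. You nudge the pixels so a chosen layer's activations grow stronger, and whatever patterns that layer has learned to detect start bleeding into the picture.

```python
import torch
import torchvision.models as models

# A pretrained convnet to "dream" with; GoogLeNet is the architecture
# Google's own post used, but any classifier would illustrate the idea.
model = models.googlenet(pretrained=True).eval()
for p in model.parameters():
    p.requires_grad_(False)  # we optimise the image, not the network

# Grab the activations of one intermediate layer via a forward hook.
activations = {}
model.inception4c.register_forward_hook(
    lambda module, inputs, output: activations.update(feat=output)
)

# Start from noise (a real photo works too) and run gradient ascent:
# tweak the pixels so the chosen layer's response gets stronger.
img = torch.rand(1, 3, 224, 224, requires_grad=True)
optimizer = torch.optim.Adam([img], lr=0.05)

for step in range(100):
    optimizer.zero_grad()
    model(img)
    loss = -activations["feat"].norm()  # negative loss, so a step "down" amplifies
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        img.clamp_(0, 1)  # keep pixels in a displayable range
```

Save `img` out as a picture every few steps and you get the trippy feedback-loop images Google showed off.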
IBM’s Jeopardy-champ Watson may soon be hitting your smartphone, reports Niv Elis at the Jerusalem Post
Helicopter designers are turning to automation to improve safety, reports the WSJ
NASA would like to swing on a star. Or at the very least grab hold of an asteroid, and it’s got just the grippy bot to do it, reports Evan Ackerman at IEEE Spectrum
iRobot co-founder and now CyPhy Works CEO Helen Greiner with a pretty touching recounting of what drove her into robotics
John Walker of Rock Paper Shotgun contends that the supposedly revolutionary Oculus Rift and its ilk are basically like 3D TV… you know, doomed
And in the latest instalment of the NY Times’ Robotica series, the paper talks to a bunch of Aibo robot dog owners who wanted to fight on when Sony decided to send the pups to the big server farm in the sky.