Google is teaching computers to trip balls. Nah, not really. But it is reverse engineering neural networks, teaching the A.I. to spit back what it sees when it tries to sort through and recognise images, in order to better understand whether its algorithms are working and how to improve them. As Geoffrey Hinton discussed in that podcast I linked to the other day, neural nets are getting better and better at doing the things we ask them to do, but in a twist on the proverb, teach a computer to fish and you’ll spend a whole lifetime wondering how the hell it caught the damn thing… (h/t moonmilk via mefi, AKA @ranjit)
Helicopter designers are turning to automation to improve safety, reports the WSJ
Helen Greiner, co-founder of iRobot and now head of CyPhy Works, with a pretty touching recounting of what drove her into robotics
And the latest instalment of the NY Times’ Robotica series talks to a bunch of Aibo robot dog owners who wanted to fight on when Sony decided to send the pups to the big server farm in the sky.