Artificial Intelligence Bulletin – Normal Social Conventions and Waymo

Posted on December 14, 2016



Each Wednesday, futurist Nikolas Badminton highlights the top stories from the past week on the incredible rise of artificial intelligence and its application in society, the workplace, cities, and our lives.

In Artificial Intelligence Bulletin – Normal Social Conventions and Waymo we see AI needing to behave itself, Google’s Waymo, fly monitoring with TensorFlow, and the hearing aid reinvented.

Jerry Kaplan: Artificial Intelligence & Normal Social Conventions

Stanford’s Jerry Kaplan says that the biggest challenge for the artificial intelligence field is making sure that robots adhere to “normal social conventions,” i.e. driverless cars not stealing parking spots.

Say Hello to Google’s Waymo

Waymo stands for a new way forward in mobility. We started as the Google self-driving car project in 2009 and we are now an independent self-driving technology company with a mission to make it safe and easy for people and things to move around. In 2015, we invited Steve Mahan, former CEO of the Santa Clara Valley Blind Center, for a special ride. Steve had ridden in our cars in the past—first accompanied by a test driver in 2012 and then on a closed course in 2013. This time was different. Steve experienced the world’s first fully self-driving ride on public roads, navigating through everyday traffic with no steering wheel, no pedals, and no test driver.

flyAI with TensorFlow

This installation by David Bowen features a colony of houseflies whose wellbeing is controlled by artificial intelligence.

flyAI creates a situation where the fate of a colony of living houseflies is determined by the accuracy of artificial intelligence software.

The installation uses the TensorFlow machine learning image recognition library to classify images of live houseflies. As the flies fly and land in front of a camera, their image is captured. The image recognition software classifies the captured image and produces a list of its top five guesses, ranked 1 through 5. Each guess is assigned a confidence percentage indicating how likely the software thinks that item is what it sees.

If “fly” is ranked number 1 on the list, a pump delivers water and nutrients to the colony in proportion to that ranking’s confidence percentage. If “fly” is not ranked number 1, the pump delivers nothing. The system is set up to run indefinitely with an indeterminate outcome.
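The feeding mechanism described above can be sketched as a simple decision rule. This is a hypothetical illustration, not Bowen’s actual code: the function name, dose constant, and the stubbed top-5 lists are all assumptions, and the real installation feeds live TensorFlow classifications into this step rather than hand-written lists.

```python
# Sketch of flyAI's feeding logic (hypothetical names; the real
# installation classifies live camera frames with TensorFlow).

MAX_DOSE_ML = 1.0  # assumed maximum water/nutrient dose per cycle


def feed_amount(top5):
    """top5: list of (label, confidence) pairs ranked best-first,
    as a top-5 image classifier would produce.

    Returns the dose (ml) to pump: proportional to the top guess's
    confidence if that guess is 'fly', otherwise zero."""
    if not top5:
        return 0.0
    label, confidence = top5[0]
    if label == "fly":
        return MAX_DOSE_ML * confidence
    return 0.0


# Example rankings of the kind the recognition software might emit:
confident = [("fly", 0.83), ("bee", 0.07), ("ant", 0.04),
             ("wasp", 0.03), ("moth", 0.03)]
mistaken = [("bee", 0.40), ("fly", 0.35), ("ant", 0.10),
            ("wasp", 0.08), ("moth", 0.07)]

print(feed_amount(confident))  # fly ranked #1 -> dose delivered
print(feed_amount(mistaken))   # fly not #1 -> no dose
```

The colony’s survival thus hinges entirely on the classifier’s top-1 accuracy, which is what gives the piece its indeterminate outcome.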



Deep Learning Reinvents the Hearing Aid

Without a better hearing aid, the world’s hearing will get worse. The World Health Organization estimates that 15 percent of adults, or roughly 766 million people, suffer from hearing loss. That number is rising as the population expands and the proportion of older adults becomes larger. And the potential market for an advanced hearing aid isn’t limited to people with hearing loss. Developers could use the technique to improve smartphone speech recognition. Employers could use it to help workers on noisy factory floors, and militaries could equip soldiers to hear one another through the noisy chaos of warfare.

It all adds up to a big potential market. The global US $6 billion hearing aid industry is expected to grow at 6 percent every year through 2020, according to the market research firm MarketsandMarkets, in Pune, India. Satisfying all those new customers, though, means finding a way to put the cocktail party problem behind us. At last, deep neural networks are pointing the way forward.

Read more at IEEE Spectrum


