
“The limits of my language are the limits of my world” – Ludwig Wittgenstein | Highlights Section 1

16 Feb, 2017

• At the beginning of the fourth industrial revolution, we are at a turning point in human history, one in which humans are shaping how we will interact with machines in our everyday lives.

• Human language is rich and complex; we need to teach computers how to process it so that they can understand us.

Mark Riedl, Director of the Entertainment Intelligence Lab, talked about the intersection of artificial intelligence and storytelling. Riedl explained that so-called machine enculturation, that is, machines interacting with us and becoming part of our real world, is something we are going to see in the near future. Through stories, we can teach computers how society works. He also stated that we need to teach machines human values, social norms, and rules in order to prevent conflict. These developments will result in many practical, real-world applications of narrative intelligence. A better understanding of humans makes machines appear less “alien” to us, as something we do not have to fear. Ultimately, in doing all of this, they could become part of our culture in a way that is safer for society.

Wendy Johannson, Director of Product and User Experience at Wizeline, talked about the applications of language and speech recognition. She stressed that in user experience, humans are very much involved: things do not create themselves; humans, the creators, are the ones who matter in the system. “For the last 60 years we have been adapting to computers, learning how to use them… and in the coming years, computers will be adapting to us.” The four main applications of language and speech recognition are: 1) Command systems – talking to machines, through the phone for example; 2) Dictation – avoiding texting while driving, which improves our lives by reducing risky behavior; 3) Agents – like Siri (Apple) or Alexa (Amazon), which reflect brand personality; and 4) Identification – such as voice recognition, which many banks are using for security. These applications share common limitations: limited support for different languages, problems with acoustic modeling, such as recognizing accents, and usability.

Jean Pierre Kloppers, CEO of BrandsEye, a platform that combines machine learning and crowdsourcing for data analysis, predicted the two main political turning points of our time: the Brexit Leave vote and Trump’s victory in the US presidential election. Kloppers said that we study sentiment analysis because we know that the way people feel today will influence what they do tomorrow. Understanding massive amounts of data is hard for computers, even more so when people are being sarcastic or making jokes. BrandsEye uses crowdsourcing so that real people read and interpret online information, and combining that human analysis with machine learning yields 97% accuracy.
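As a rough illustration of the approach Kloppers described, the sketch below trains a simple sentiment classifier on labels supplied by human crowd workers. This is not BrandsEye’s actual pipeline; the example texts, labels, and model choice are assumptions made purely to show the general idea of combining human judgment with machine learning.

```python
# Minimal sketch (not BrandsEye's actual system): train a sentiment
# classifier on labels supplied by human crowd workers, so the model
# learns from people who can read the sarcasm and jokes that trip up machines.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical crowd-labelled examples: each post was read and tagged
# by a person as "positive" or "negative".
crowd_texts = [
    "Loving the new service, signup took two minutes",
    "Oh great, another outage. Just what I needed today.",  # sarcasm a human catches
    "Support resolved my issue quickly, very happy",
    "Worst experience ever, switching banks tomorrow",
]
crowd_labels = ["positive", "negative", "positive", "negative"]

# TF-IDF features plus logistic regression: a common, simple baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(crowd_texts, crowd_labels)

# New, unlabelled posts are then scored by the trained model.
print(model.predict(["The app keeps crashing, so annoying"]))
```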

To wrap up, in the Zoom In conclusions session, the speakers had the opportunity to expand on the implications, barriers, and goals of language and analytics, focusing on the role of humans in shaping where artificial intelligence goes from here, and on how its applications across the social, political, and economic spheres of our daily lives will shape our world.