People communicate verbally, by speaking to one another, and non-verbally, through our actions, expressions and gestures. To realise the potential of human-robot interaction, we need our robots to master both modes of communication. In our final ‘Tech Me Out’ event, Edinburgh researchers Dr Frank Broz and Dr Korin Richmond will expand on their respective research in non-verbal human-robot interaction and in automatic speech synthesis. Please note that this event takes place on the lower ground floor and is not accessible for those with impaired mobility.
"Repeat after me!" - Teaching machines to talk like us
People have long wanted machines to speak; producing speech artificially is a process called "speech synthesis". After decades of work, devices that were once the dreams of science fiction are appearing in daily life as innovations like sat navs and smartphone assistants (Siri or Cortana) - computers that can speak! How did we get to this point? Why is getting computers to speak so hard? This talk will feature research from the University of Edinburgh that uses human speech to try to get machines to speak more (and more!) like us.
Face-to-face: Social human-robot interaction
To integrate well into our workplaces and homes, robots need to be able to understand, respond to and produce socially appropriate behaviour. Non-verbal interaction is especially important, because humans use subtle movements and cues such as gaze to express information both explicitly and implicitly. This talk will show how studying human-human interaction can help us design more natural human-robot interaction. I will also explore whether we can use social robots to help teach people with autism to correctly interpret social behaviour.