How do robots actually perceive the world? Sci-fi movies such as Star Wars and Ex Machina have shown us how the robots of the future might become perfectly autonomous machines, capable of interpreting their surroundings, making decisions, and sensing human emotions. But how close are we to achieving this human-level artificial intelligence?
This event takes place on the first floor, only accessible by stairs.
What can Automatic Face Analysis do for us?
Professor Maja Pantic
(Affective & Behavioural Computing, Imperial College London & University of Twente)
The human face is our main means of communicating our identity and sending social signals. This talk will demonstrate how facial expressions can be automatically sensed and analysed by computers. Maja will present her research on machine understanding of human behaviour, including vision-based detection, tracking, and analysis of behaviours such as facial expressions, body gestures, laughter, and social signals.
Domestic robots for the future
Dr Edward Johns
(Dyson Fellow at Imperial College London)
Imagine that you are a robot and you need to clear a messy table. You observe your surroundings through your visual sensors, which present the environment as nothing more than a collection of numbers. How would you interpret these data? Edward will discuss how robots can learn visual perception autonomously and show you how the robots of the future may be able to clean up your house without burning it down!