© Pint of Science, 2018. All rights reserved.
Electronic devices have drastically changed the world we live in. At the heart of this “digital revolution” lie basic principles of Mathematics and Physics. One such principle is that information may be encoded as a sequence of zeroes and ones. Today we will explain the Maths (why 0s and 1s?) and the Physics (how electronic devices actually work) behind this idea. The event will be in the first-floor room, and there will be games and special Pint of Science and Mendeley goodies to be won!
How to Tell a Computer What to Do
Davide Bianchini (Theoretical Physicist)
Humans have the ability to interpret different symbols and to give them a particular meaning. If I say the word “dog” or “3”, you immediately understand what I mean without any further explanation. But how can a computer, a piece of plastic and metal, understand? How can it process information? After explaining why binary signals (sequences of zeros and ones) are more suitable as electronic signals, I will show how to translate 'human' signals into binary sequences. I will conclude with some examples of how logic gates can implement simple and complex operations on binary signals.
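The two ideas in this abstract can be sketched in a few lines of code. This is only an illustration (Python, and the specific encoding and gates chosen, are assumptions; the talk does not name a language): a 'human' symbol such as the word “dog” is translated into a binary sequence via character codes, and simple logic gates are combined into a circuit that adds two bits.

```python
def to_binary(text):
    """Encode each character as its 8-bit ASCII code (an example encoding)."""
    return ' '.join(format(ord(c), '08b') for c in text)

# Basic logic gates acting on single bits (0 or 1).
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Combine two gates to add two bits: XOR gives the sum, AND the carry."""
    return XOR(a, b), AND(a, b)

print(to_binary("dog"))  # 01100100 01101111 01100111
print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 = 10
```

The half adder is a classic example of a “complex” operation (addition) built from nothing but simple gates on binary signals.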
From Zeroes and Ones to Quantum Computers
Olalla Castro Alvaredo (Senior Lecturer in Mathematics)
Have you ever watched “The Matrix”? If so, you will be vaguely familiar with sequences of zeroes and ones somehow being associated with computers. In this talk I will expand on Davide’s introduction by explaining some of the mathematical reasons why zeroes and ones are at the basis of how electronic devices work. I will then go on to explain how this technology may one day be replaced by a new concept, the quantum bit or qubit, and why quantum bits could revolutionize computation and the world we live in by giving us new electronic devices with enormous potential.
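A classical bit is either 0 or 1, while a qubit can be in a superposition of both. A minimal sketch of this idea (Python used purely for illustration; the representation below is a standard textbook model, not material from the talk) treats a qubit as a pair of amplitudes whose squared magnitudes give the measurement probabilities:

```python
import math

def hadamard(q):
    """Apply the Hadamard gate, which turns a definite bit into an
    equal superposition of 0 and 1."""
    a, b = q
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(q):
    """Probability of measuring 0 or 1: the squared amplitudes."""
    a, b = q
    return abs(a) ** 2, abs(b) ** 2

zero = (1.0, 0.0)           # the classical bit 0
plus = hadamard(zero)       # a superposition of 0 and 1
print(probabilities(plus))  # roughly (0.5, 0.5): a 50/50 coin until measured
```

Unlike a classical bit, the qubit here carries both outcomes at once until it is measured, which is one reason quantum computers promise such enormous potential.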