A collection of interviews and films that offer a fascinating insight into the lives of very different people who have helped to shape the information society we live in today. Including five short films that were created by an award-winning production team, this collection demonstrates how informat…
BCS, The Chartered Institute for IT
Vint Cerf, Google's Chief Internet Evangelist, talks about receiving a BCS Distinguished Fellowship, how we can make the internet safer, the future of Google and tweeting aliens.
While he was over in the UK for a book tour and lecture series, Professor Donald Knuth made time to talk to BCS editor Justin Richards about his life and works. Donald is the author of the hugely respected book series The Art of Computer Programming and dozens of highly regarded academic papers on computer science.
BCS, The Chartered Institute for IT, is sharing the stories of five Information Pioneers who truly enabled the information society.
Cambridge, 1936. While the world was being shaped by events in Europe - the Spanish Civil War, the Nazis retaking the Rhineland - Alan Turing, a young mathematician, completed a new theory in his rooms in Cambridge. The theory was originally described as an imaginary machine that crunched imaginary numbers, but it went on to be the origin of Artificial Intelligence as we now know it. Turing was the first to understand that computers could learn and adapt to new stimuli, just as we humans can - it was simply a matter of giving them the right tools in the first place. Much like teaching a child to cross the road, Turing set about creating Artificial Intelligence in his machines. Today, self-parking cars, self-flying planes, even the Mars rover are all descendants of Turing's Machine.
London, 1833. Ada Lovelace, the 19-year-old daughter of Lord Byron and Annabella Milbanke, is introduced to an eccentric genius, Charles Babbage. Babbage shows her a prototype of a calculating machine he has invented called a Difference Engine. The two minds connect immediately, and Ada continues to work with Babbage until, in 1844, he shows her his plans for another machine, the Analytical Engine. This machine uses hole-punched cards as programmes which tell it how to calculate the problems presented to it. In this moment, Ada sees a future that would not come into being for another 100 years. She saw that if the Analytical Engine could be programmed to calculate, it could pretty much be programmed to do anything. And thus, she gave us the blueprints for computer programming as we know it.
Hollywood, 1940. Hedy Lamarr, known in Hollywood as "the most beautiful woman in films", already had quite a reputation. The first woman to perform a nude scene in mainstream cinema, Hedy had fled Europe and a marriage to Mussolini's arms dealer to become box office gold, starring alongside Clark Gable and Spencer Tracy. But it wasn't the life she wanted. Knowing, from her marriage in Vienna, that military use of torpedoes was being hampered by their single-frequency transmissions, Hedy, along with George Antheil, her piano-playing neighbour, set about creating a system called frequency hopping, in which parts of a signal were sent across different frequencies. The system was based on the piano rolls that Antheil used in his player pianos, and allowed torpedoes to be controlled without being intercepted. The patent she held is the basis for the Wi-Fi, GPS and mobile communications we use today.
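The principle behind Lamarr and Antheil's patent can be sketched in a few lines of Python. This is a toy illustration of the idea, not their actual mechanism: the channel count, shared key and message are invented for the example. Both ends derive the same pseudorandom channel sequence from a shared secret, so a synchronised receiver recovers the whole signal while a listener parked on one frequency catches only fragments.

```python
import random

CHANNELS = 8  # hypothetical number of radio channels

def hop_sequence(shared_key, length):
    """Both ends derive the same channel order from a shared secret."""
    rng = random.Random(shared_key)
    return [rng.randrange(CHANNELS) for _ in range(length)]

def transmit(message, shared_key):
    """Send one symbol per hop: (channel, symbol) pairs go out on air."""
    seq = hop_sequence(shared_key, len(message))
    return list(zip(seq, message))

def receive(on_air, shared_key):
    """A synchronised receiver listens on the right channel at each hop."""
    seq = hop_sequence(shared_key, len(on_air))
    return "".join(sym for (ch, sym), expected in zip(on_air, seq) if ch == expected)

def eavesdrop(on_air, fixed_channel):
    """A single-frequency listener hears only the hops that land on its channel."""
    return "".join(sym for ch, sym in on_air if ch == fixed_channel)

on_air = transmit("ATTACK AT DAWN", shared_key=1942)
print(receive(on_air, shared_key=1942))    # full message recovered
print(eavesdrop(on_air, fixed_channel=3))  # whatever fragments landed on channel 3
```

The piano rolls in the original design played the role of `hop_sequence` here: a pre-agreed, mechanically synchronised list of frequencies shared by transmitter and receiver.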
Britain, 1979. To most people in the 70s, computers were monstrous, bleeping big-brother machines the size of a bus with hundreds of valves and great reels of magnetic tape. They were expensive to run and difficult to understand, and certainly not something any of us would want at home. Clive Sinclair, a serial inventor, thought differently. He saw that the next step in modern computing was to create a small, affordable machine that could be used alongside our existing televisions and cassette players at home. His idea was to give the general public a tool to learn, organise and play on, that they could programme themselves. So, in 1979, he gave us the ZX80 - a home computer with 1KB of memory, no sound and a monochrome display. It may seem strange to us now, but that temperamental (and sometimes glitchy) little beauty launched the home computer industry that surrounds us today.
Geneva, 1980s. Based at CERN, the European Organisation for Nuclear Research in Switzerland, Tim Berners-Lee was suffering from a case of information overload at work. Desperately trying to co-ordinate a mass of research and data from incompatible computer systems around the globe, Berners-Lee figured there must be a better way to do things. So he set about creating a space where any piece of information could be linked to any other piece of information out in the world. To do this he joined two separate ideas that had been knocking about for some time - hypertext and the internet - and he created the World Wide Web. He gave his invention to us, the public, for free and, after a cautious start, we all leapt on board to create the huge collective brain that we now use daily. Imagine a world without the World Wide Web. You can't, can you?
Information Pioneers is a campaign from BCS, The Chartered Institute for IT, that seeks to show how the contributions of five very different people helped to shape the information society that we live in today. This video takes a look behind the scenes during the filming of Information Pioneers.
Co-founder of Wikipedia and Wikia Jimmy Wales speaks to BCS Editor Henry Tucker about politics online, the future of paper, why not everything should be on the internet, large video projects and the power of mobile carriers.
In an exclusive interview with the Institute, Sir Tim Berners-Lee spoke about the latest on the semantic web, his view on the advent of artificial life forms on the internet, the biggest barriers to enabling the information society for all, where the mobile web is going and more.
He hates cellphones, but thinks that acceptance of the open source concept is now taken for granted - in a good way. BCS managing editor Brian Runciman interviewed Linus Torvalds after he received the BCS Lovelace Medal.
Karen Spärck Jones is the winner of the 2007 BCS Lovelace Medal. BCS managing editor Brian Runciman interviewed her.
The first in a series of interviews with key figures in the IT industry for BCS's 50th anniversary, Brian Runciman spoke to Vint Cerf, Google's Chief Internet Evangelist.
BCS managing editor Brian Runciman interviews the inventor of the web, Sir Tim Berners-Lee.