POPULARITY
Chris Quigg, the celebrated theoretical physicist and co-author of Grace in All Simplicity, whisks us away on a journey through the wonders of particle physics, served with a dash of poetry and a sprinkle of grace! With a talent for turning complex ideas into accessible stories, Chris shares insights from his new book and reveals the inspiration behind its intriguing title. Listen for unforgettable moments in the history of science and get a sneak peek into life at iconic research centers like CERN and Fermilab. Thank you, Chris, for showing us how curiosity and a love for learning can transform our understanding of science and our approach to life.

About Chris Quigg: Chris Quigg has spent his career making particle physics approachable and fascinating for everyone. Having worked at world-renowned institutions like CERN and Fermilab, he's explored fundamental questions about the universe. In Grace in All Simplicity, co-authored with Bob Cahn, Chris combines science with storytelling, inviting readers to discover the personal journeys of scientists behind remarkable breakthroughs. His warm and engaging style draws in curious minds of all ages to experience the wonder of physics.

Resources Mentioned:
Grace in All Simplicity by Chris Quigg and Bob Cahn
CERN Science Gateway – A new visitor center bringing science to life for the public
Fermilab's Lederman Science Center – An educational space for kids and families to explore the wonders of science

Connect with Chris:
Twitter/X: @chrisquigg
bsky.social: @chrisquigg

Check out the reviews of the book:
Nature: https://www.nature.com/articles/d41586-023-03424-5
Science: https://www.science.org/doi/10.1126/science.adl2396

If you enjoyed this episode and would like to share, I'd love to hear it!
In response to escalating gang violence and severe food shortages, a U.S. government-chartered flight from Cap-Haitien brought 47 Americans to safety in Miami. This operation follows a series of evacuations and warnings of dire conditions in Haiti. With the arrival of spring, it's the perfect time to declutter your finances and address pressing financial matters. CBS News business analyst Jill Schlesinger offers expert advice on how to refresh and organize your financial life. In a heartfelt return to prime time, Oprah Winfrey confronts the complex issues of obesity and the associated shame, sharing her personal journey with weight and discussing the impact of weight-loss drugs like Ozempic. The children of late Run-D.M.C. star DJ Jam Master Jay are speaking out for the first time since two men were convicted last month of murdering their father more than 20 years ago. CBS New York anchor Maurice DuBois spoke to them at Scratch DJ Academy, which was co-founded by their father. "CBS Mornings" co-host Tony Dokoupil sits down with three computer scientists who helped create the internet, Bob Kahn, Vint Cerf and Steve Crocker, to see what they think of their creation now, and what our digital future may hold.
In this episode, we have the honor of talking to one of the pioneers of the internet: Vint Cerf. He is not only the co-inventor of the TCP/IP protocols and the internet architecture, but also a visionary leader and a passionate advocate for digital inclusion and accessibility. He tells us how he got interested in technology and software engineering, how he met and collaborated with Bob Kahn on creating the internet, and how he became an evangelist for bringing more people online. He also shares some of the challenges and achievements from his six-decade-long career, his thoughts on the future of the internet, and some tips on how to deal with stress, burnout, and harmful behaviors online. Don't miss this incredible and inspiring conversation with Vint Cerf.

"I'm smart enough to know that if you want to do anything big, you need to get help, preferably from people who are smarter than you are."

Vint's links:
LinkedIn
Twitter
Wikipedia
Communications of the ACM - Cerf's Up
Marconi Society

Thanks for being an imposter - a part of the Imposter Syndrome Network (ISN)! We'd love it if you connected with us at the links below:
The ISN LinkedIn group (community): https://www.linkedin.com/groups/14098596/
The ISN on Twitter: https://twitter.com/ImposterNetwork
Zoë on Twitter: https://twitter.com/RoseSecOps
Chris on Twitter: https://twitter.com/ChrisGrundemann

Make it a great day.
In the latest episode of Public Power Now, Bob Kahn, who in July became the new general manager of Texas public power utility Austin Energy, details how his wealth of experience in the power sector prepared him to hit the ground running. He also details the strategies Austin Energy is utilizing to address supply chain constraints.
Almost 50 years ago, Vint Cerf and Bob Kahn designed TCP/IP, a set of rules enabling computers to connect and communicate with each other. It led to the creation of a vast global network: the internet. TCP/IP is how almost the entirety of the internet still sends and receives information. Vint Cerf is now 80 and serves as the chief internet evangelist and a vice president at Google. He is also the chairman of the Marconi Society, a group that promotes digital equity. Alok Jha, The Economist's science and technology editor, asks Vint to reflect on the state of the internet today and the lessons that should be learned for the next disruptive technology: generative artificial intelligence. Vint Cerf explains how he thinks large language models can be regulated without stifling innovation - that is, more precisely, based on their specific applications. For full access to The Economist's print, digital and audio editions subscribe at economist.com/podcastoffer and sign up for our weekly science newsletter at economist.com/simplyscience.
This week, Stephanie Wong and Anthony Bushong introduce a special podcast of the Gtalk at Airbus speaker series, where prestigious Googlers have been invited to talk with Airbus. In this episode, Vint Cerf, who is widely regarded as one of the fathers of the Internet, talks with Rhys Phillips of Airbus and fellow Googler Rafael Lami Dozo. Vint tells us about his journey to Google, including his interest in science, which stemmed from a chemistry set he received as a child. After high school, he got a job writing data analysis software on the Apollo project. His graduate work at UCLA led him to the ARPANet project, where he developed host protocols, and eventually to his work on the original Internet with Bob Kahn. Vint tells us about the security surrounding this project and the importance of internet security still today. The open architecture of the internet, then and now, excites Vint because it allows new, interesting projects to contribute without barriers. Vint is also passionate about accessibility. At Google, he and his team continue to make systems more accessible by listening to clients and adapting software to make it usable. He sees an opportunity to train developers to optimize software to work with common accessibility tools like screen readers to ensure better usability. Later, Vint tells us about the Interplanetary Internet, describing how this system is being built to provide fast, effective Internet to every part of the planet. Along with groups like the Internet Engineering Task Force, this new Internet is being deployed and tested now to ensure it works as expected. He talks about his work with NASA and other space agencies to grow the Interplanetary Internet. Digital obsolescence is another kind of accessibility that concerns Vint. Over time, the loads of data we store, and their various storage devices, could become unreadable. The software needed to use or see this media could stop being supported as well, making the data inaccessible. Vint hopes we will begin practicing ways to perpetuate the existence of this data through copying and by making software more backward compatible. He addresses the issues with this, including funding.

Vint Cerf: While at UCLA, Vint Cerf worked on ARPANet - the very beginnings of what we know as the internet today - and is now, fittingly, Chief Internet Evangelist & VP at Google. He is an American Internet pioneer and is recognized as one of "the fathers of the Internet", sharing this title with TCP/IP co-developer Bob Kahn.

Rhys Phillips: Rhys Phillips is Change and Adoption Leader, Digital Workplace at Airbus.

Rafael Lami Dozo: Rafael Lami Dozo is Customer Success Manager, Google Cloud Workspace for Airbus.

Cool things of the week: Celebrating Pi Day with Cloud Functions blog; Apollo Scales GraphQL Platform using GKE blog.

Interview: Vinton G. Cerf Profile site; ARPANet on Wikipedia site; To Boldly Go Where No Internet Protocol Has Gone Before article; Building the backbone of an interplanetary internet video; IETF site; CCSDS site; IPNSIG site; The Internet Society site; NASA site.

What's something cool you're working on? Stephanie is working on new Discovering Data Centers videos. Anthony is working on content for building scalable GKE clusters.

Hosts: Stephanie Wong and Anthony Bushong.
Unfinished tasks as the Internet reaches 50% penetration of the world's population. View the full video interview here. Vinton Gray Cerf is an American Internet pioneer, who is recognized as one of “the fathers of the Internet”, sharing this title with TCP/IP co-inventor Bob Kahn. His contributions have been acknowledged and lauded, repeatedly, with honorary degrees and awards that include the National Medal of Technology, the Turing Award, the Presidential Medal of Freedom, the Marconi Prize and membership in the National Academy of Engineering.
Sometimes you learn an interesting fact that you really need time and space to absorb. If we were to tell you that it was actually Kid Rock, and not computer scientists Vinton Cerf and Bob Kahn, who came up with the concept for the internet, would you be surprised? We are joined by writer & comedian Lloyd Langford! Check out Lloyd on Instagram and get all his upcoming gig dates on his website HERE! Subscribe to us on Spotify, Apple Podcasts, Amazon Music, Google Podcasts, Deezer, Podchaser, iHeartRadio, TuneIn, Podcast Addict, Stitcher, Vurbl or wherever you get your podcasts. If you enjoy our podcast and can afford to shoot some shrapnel our way, we would be absolutely bloody stoked about it! You can sign up for as little as $2 a month and receive bonus episodes, extra content and even be a guest on the podcast if you're keen! Jump on our Patreon page now and sign up! Please tell your mates about the podcast and jump on Apple Podcasts/iTunes and give us a 5-star review! Support the show (https://www.patreon.com/1001songsthatmakeyouwanttodie)
It's Halloween time, and it's important to make certain our kids are safe when they go out trick-or-treating. But certainly, every day is that time, because as parents we must protect our kids from those who would do them harm -- like sexual predators and bullies. Today, we're with Bob Kahn, a retired deputy sheriff in Nevada and former elementary school teacher who has authored eight children's books about safety in an increasingly dangerous world. The "Stranger Danger" program he created for schools and community service organizations has been credited with foiling over 40 attempted abductions by strangers. We're going to explore issues of "good touch, bad touch" and what that's all about, and we're going to take a look at the ever-serious problem of bullying. Robert has given over 20,000 presentations covering 15 topics on children's safety issues. He also was an instructor for D.A.R.E., a program that gives children the life skills they need to avoid involvement with drugs, gangs, and violence. He holds a master's degree in education from the University of Phoenix and a B.S. in education from the University of Nevada, Reno. Take a listen. Become a supporter of this podcast: https://www.spreaker.com/podcast/the-lean-to-the-left-podcast--4719048/support.
Oxide and Friends Twitter Space: September 27th, 2021. The Books in the Box. We've been holding a Twitter Space weekly on Mondays at 5p for about an hour. Even though it's not (yet?) a feature of Twitter Spaces, we have been recording them all; here is the recording for our Twitter Space for September 27th, 2021. In addition to Bryan Cantrill and Adam Leventhal, speakers on September 27th included Tom Lyon, Dan Cross, Antranig Vartanian, Simeon Miteff, Matt Campbell, Jeremy Tanner, Joshua Clulow, Ian, Tim Burnham, and Nathaniel Reindl. (Did we miss your name and/or get it wrong? Drop a PR!) Some of the topics we hit on, in the order that we hit them: Not recommended :-( Dave Hitz and Pat Walsh (2008) How to Castrate a Bull book; Peter Thiel (2014) Zero to One book. [@2:45](https://youtu.be/zrZAHO89XGk?t=165) David Jacques Gerber (2015) The Inventor's Dilemma: The Remarkable Life of H. Joseph Gerber book. [@7:21](https://youtu.be/zrZAHO89XGk?t=441) Sidney Dekker (2011) Drift into Failure: From Hunting Broken Components to Understanding Complex Systems book. [@13:08](https://youtu.be/zrZAHO89XGk?t=788) Robert Buderi (1996) The Invention that Changed the World: The Story of Radar from War to Peace book; MIT Rad Lab Series info; Nuclear Magnetic Resonance wiki; Richard Rhodes (1995) Dark Sun: The Making of the Hydrogen Bomb book; Michael Riordan and Lillian Hoddeson (1997) Crystal Fire: The Birth of the Information Age book; Craig Canine (1995) Dream Reaper: The Story of an Old-Fashioned Inventor in the High-Tech, High-Stakes World of Modern Agriculture book; David Fisher and Marshall Fisher (1996) Tube: The Invention of Television book; Michael Hiltzik (2015) Big Science: Ernest Lawrence and the Invention that Launched the Military-Industrial Complex book. [@18:05](https://youtu.be/zrZAHO89XGk?t=1085) Ben Rich and Leo Janos (1994) Skunk Works: A Personal Memoir of My Years at Lockheed book; Network Software Environment; Lockheed SR-71 on display at the Sea, Air and Space Museum in NYC. [@26:52](https://youtu.be/zrZAHO89XGk?t=1612) Brian Dear (2017) The Friendly Orange Glow: The Untold Story of the Rise of Cyberculture book. [@30:15](https://youtu.be/zrZAHO89XGk?t=1815) Randall Stross (1993) Steve Jobs and the NeXT Big Thing book. [@32:21](https://youtu.be/zrZAHO89XGk?t=1941) Christophe Lécuyer and David C. Brock (2010) Makers of the Microchip: A Documentary History of Fairchild Semiconductor book. [@33:06](https://youtu.be/zrZAHO89XGk?t=1986) Lamont Wood (2012) Datapoint: The Lost Story of the Texans Who Invented the Personal Computer Revolution book; Charles Kenney (1992) Riding the Runaway Horse: The Rise and Decline of Wang Laboratories book; Tom's tweet. [@34:06](https://youtu.be/zrZAHO89XGk?t=2046) Bryan's Lost Box of Books! Edgar H. Schein et al (2003) DEC is Dead, Long Live DEC: The Lasting Legacy of Digital Equipment Corporation book. [@36:56](https://youtu.be/zrZAHO89XGk?t=2216) Alan Payne (2021) Built to Fail: The Inside Story of Blockbuster's Inevitable Bust book; Videotape format war wiki; Hackers (1995) movie.
Watch the trailer ~2mins. Steven Levy (1984) Hackers: Heroes of the Computer Revolution book. [@42:32](https://youtu.be/zrZAHO89XGk?t=2552) Paul Halmos (1985) I Want to be a Mathematician: An Automathography book; Paul Hoffman (1998) The Man Who Loved Only Numbers, about Paul Erdős, book; "Softporn Adventure," a 1981 text adventure game for the Apple II by Sierra On-Line (wiki). [@49:16](https://youtu.be/zrZAHO89XGk?t=2956) Douglas Engelbart, The Mother of All Demos wiki; John Markoff (2005) What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry book; Katie Hafner and Matthew Lyon (1998) Where Wizards Stay Up Late book; the 1972 documentary Computer Networks: The Heralds of Resource Sharing, ~26mins (wiki), which included big names like Corbató, Licklider, and Bob Kahn; Gordon Moore (1965) Cramming more components onto integrated circuits paper and Moore's Law wiki. [@52:37](https://youtu.be/zrZAHO89XGk?t=3157) Physicists, mathematicians, number theory, proofs; Wiles's 1993 proof of Fermat's Last Theorem wiki; Simon Singh (1997) Fermat's Last Theorem book; Ronald Calinger (2015) Leonhard Euler: Mathematical Genius in the Enlightenment, which purports to be the first full-scale "comprehensive and authoritative" biography. [@1:00:12](https://youtu.be/zrZAHO89XGk?t=3612) Robert X. Cringely (1992) Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can't Get a Date book; Jerry Kaplan (1996) Startup: A Silicon Valley Adventure book; Brian Kernighan (2019) UNIX: A History and a Memoir book. [@1:03:03](https://youtu.be/zrZAHO89XGk?t=3783) Douglas Coupland (1995) Microserfs book; Douglas Coupland (1991) Generation X: Tales for an Accelerated Culture book; Fry's Electronics wiki. [@1:06:49](https://youtu.be/zrZAHO89XGk?t=4009) Michael A. Hiltzik (1999) Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age book; Albert Cory (pen name for Bob Purvy) (2021) Inventing the Future book; Xerox Star wiki. [@1:11:20](https://youtu.be/zrZAHO89XGk?t=4280) Corporate espionage; VMWare and Parallels; Cadence v. Avanti wiki; Cisco and Huawei (article). If we got something wrong or missed something, please file a PR! Our next Twitter space will likely be on Monday at 5p Pacific Time; stay tuned to our Twitter feeds for details. We'd love to have you join us, as we always love to hear from new speakers!
The Internet is not a simple story to tell. In fact, every sentence here is worthy of an episode, if not a few. Many would claim the Internet began back in 1969, when the first node of the ARPAnet went online. That was the year we got the first color pictures of Earth from Apollo 10 and the year Nixon announced the US was leaving Vietnam. It was also the year of Stonewall, the moon landing, the Manson murders, and Woodstock. A lot was about to change. But maybe the story of the Internet starts before that, when the basic research to network computers began as a means of networking nuclear missile sites with fault-tolerant connections in the event of, well, nuclear war. Or the Internet began when a T3 backbone was built to host all the datas. Or the Internet began with the telegraph, when the first data was sent over electric current. Or maybe the Internet began when the Chinese used fires to send messages across the Great Wall of China. Or maybe the Internet began when drums sent messages over long distances in ancient Africa, like early forms of packets flowing over Wi-Fi-esque sound waves. We need to make complex stories simpler in order to teach them, so if the first node of the ARPAnet in 1969 is where this journey should end, feel free to stop here. To dig in a little deeper, though, that ARPAnet was just one of many networks that would merge into an interconnected network of networks. We had dial-up providers like CompuServe, America Online, and even The WELL. We had regional timesharing networks like the DTSS out of Dartmouth University and PLATO out of the University of Illinois, Urbana-Champaign. We had corporate time sharing networks and systems. Each competed or coexisted or took time from others or pushed more people to others through their evolutions. Many used their own custom protocols for connectivity. But most were walled gardens, unable to communicate with the others. So if the story is more complicated than "the ARPAnet was the ancestor to the Internet," why is that the story we hear? Let's start that journey with a memo that we did an episode on, called "Memorandum For Members and Affiliates of the Intergalactic Computer Network," sent by JCR Licklider in 1963, which can be considered the allspark that lit the bonfire called the ARPANet. Which isn't exactly the Internet but isn't not. In that memo, Lick proposed a network of computers available to research scientists of the early 60s - scientists from computing centers that would evolve into supercomputing centers, and then a network open to the world, even our phones, televisions, and watches. It took a few years, but eventually ARPA brought in Larry Roberts, and by late 1968 ARPA awarded an RFQ to build a network to a company called Bolt Beranek and Newman (BBN), which would build Interface Message Processors, or IMPs. The IMPs were computers that connected a number of sites and routed traffic. The first IMP, which might be thought of more as a network interface card today, went online at UCLA in 1969, with additional sites coming on frequently over the next few years. That system would become ARPANET. The first node of ARPAnet went online at the University of California, Los Angeles (UCLA for short). It grew as leased lines and more IMPs became available. As the network grew, the early computer scientists realized that each site had different computers running various and random stacks of applications and different operating systems. So we needed to standardize certain aspects of connectivity between different computers.
Given that UCLA was the first site to come online, Steve Crocker from there began organizing notes about protocols and how systems connected with one another in what they called RFCs, or Requests for Comments. That series of notes was then managed by a team that included Elizabeth (Jake) Feinler from Stanford once Doug Engelbart's project on the "Augmentation of Human Intellect" at Stanford Research Institute (SRI) became the second node to go online. SRI developed a Network Information Center, where Feinler maintained a list of host names (which evolved into the hosts file) and a list of address mappings, which would later evolve into the functions of InterNIC and be turned over to the US Department of Commerce when the number of devices connected to the Internet exploded. Feinler and Jon Postel from UCLA maintained those until his death 28 years later, and those RFCs came to include everything from opening terminal connections into machines to file sharing to addressing - now any place where networking needs a standard. The development of many of those early protocols that made computers useful over a network was also funded by ARPA. They funded a number of projects to build tools that enabled the sharing of data, like file sharing, and some advancements were loosely connected by people just doing things to make them useful, and so by 1971 we also had email. But all those protocols needed to flow over a common form of connectivity that was scalable. Leonard Kleinrock, Paul Baran, and Donald Davies were independently investigating packet switching, and Roberts brought Kleinrock into the project, as he was at UCLA. Bob Kahn entered the picture in 1972. He would team up with Vint Cerf from Stanford, who came up with encapsulation, and together they would define the protocol that underlies the Internet, TCP/IP (there's a toy sketch of encapsulation below). By 1974 Vint Cerf and Bob Kahn wrote RFC 675, where they coined the term internet as shorthand for internetwork. The number of RFCs was exploding, as was the number of nodes: the University of California, Santa Barbara came online, then the University of Utah, connecting Ivan Sutherland's work. The network was national when BBN connected to it in 1970. Now there were 13 IMPs, and by 1971, 18, then 29 in 72 and 40 in 73. Once the need arose, Kleinrock would go on to work with Farouk Kamoun to develop hierarchical routing theories in the late 70s. By 1976, ARPA became DARPA. The network grew to 213 hosts in 1981, and by 1982 TCP/IP became the standard for the US DOD; in 1983, ARPANET moved fully over to TCP/IP. And so TCP/IP, or Transmission Control Protocol/Internet Protocol, became the most dominant networking protocol on the planet. It was written to help improve performance on the ARPAnet, with the ingenious idea to encapsulate traffic. But in the 80s, it was still just for researchers. That is, until NSFNet was launched by the National Science Foundation in 1986. And the network had gone international early, with University College London connecting in 1973, which would go on to inspire a British research network called JANET that built its own set of protocols, the Coloured Book protocols. And the Norwegian Seismic Array connected over satellite in 1973. So networks were forming all over the place, often just time sharing networks where people dialed into a single computer. Another networking project going on at the time, also getting funding from ARPA as well as the Air Force, was PLATO. Out of the University of Illinois, it was meant for teaching and began on a mainframe in 1960.
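Here is the encapsulation idea Cerf and Kahn worked from, sketched as toy Python. The field names and payloads are made up for illustration; real TCP and IP headers are packed binary structures, not dictionaries.

    # Each layer wraps the payload from the layer above with its own header,
    # and the receiving host peels the layers off in reverse order. Routers
    # in the middle only ever need to understand the outermost (IP) layer.

    def tcp_wrap(payload: bytes, src_port: int, dst_port: int) -> dict:
        # Wrap application data in a TCP-like segment.
        return {"src_port": src_port, "dst_port": dst_port, "data": payload}

    def ip_wrap(segment: dict, src_ip: str, dst_ip: str) -> dict:
        # Wrap the segment in an IP-like packet for routing between networks.
        return {"src_ip": src_ip, "dst_ip": dst_ip, "segment": segment}

    message = b"hello from UCLA"
    packet = ip_wrap(tcp_wrap(message, 1027, 23), "10.0.0.1", "10.3.0.5")

    # Unwrapping on the far side recovers the original application data.
    assert packet["segment"]["data"] == message

The point of the design is that the inner layers ride along untouched, which is what let so many different underlying network technologies carry the same traffic.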
But by the time ARPAnet was growing, PLATO was on version IV and running on a CDC Cyber. The time sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. In fact, there were multiple CDC Cybers that could communicate with one another. And many on ARPAnet also used PLATO, cross-pollinating the defense-backed ARPAnet community with a number of academic institutions. The defense backing couldn't last forever. The Mansfield Amendment in 1973 banned general research by defense agencies. This meant that ARPA funding started to dry up, and the scientists working on those projects needed a new place to fund their playtime. Bob Taylor split to go work at Xerox, where he was able to pick the best of the scientists he'd helped fund at ARPA. He helped bring in people from Stanford Research Institute, where they had been working on the oN-Line System, or NLS, and people like Bob Metcalfe, who brought us Ethernet and better collision detection. Metcalfe would go on to found 3Com, a great switch and network interface company during the rise of the Internet. But there were plenty of people who could see the productivity gains from ARPAnet and didn't want it to disappear. And the National Science Foundation (NSF) was flush with cash. And the ARPA crew was increasingly aware of non-defense oriented use of the system. So the NSF started up a little project called CSNET in 1981, so the growing number of computer science departments could connect and the supercomputers that followed could be shared between all the research universities. It was free for universities that could get connected, and from 1985 to 1993 NSFNET surged from 2,000 users to 2,000,000 users. Paul Mockapetris made the Internet easier than when it was an academic-only network by developing the Domain Name System, or DNS, in 1983. That's how we can call up remote computers by names rather than IP addresses (see the snippet below). And of course DNS was yet another of the protocols in Postel's list of protocol standards at UCLA, which by 1986, after the selection of TCP/IP for NSFnet, would become the standardization body known as the IETF, or Internet Engineering Task Force for short. Maintaining a set of protocols that all vendors needed to work with was one of the best growth hacks ever. No vendor could have kept up with demand with a 1,000x growth in such a small number of years. NSFNet started with six nodes in 1985, connected by LSI-11 Fuzzball routers, and quickly outgrew that backbone. They put it out to bid, and Merit Network won out in a partnership between MCI, the State of Michigan, and IBM. Merit had begun before the first ARPAnet connections went online, as a collaborative effort by Michigan State University, Wayne State University, and the University of Michigan. They'd been connecting their own machines since 1971 and had implemented TCP/IP and bridged to ARPANET. The money was getting bigger: they got $39 million from NSF to build what would emerge as the commercial Internet. They launched in 1987 with 13 sites over 14 lines. By 1988 they'd gone nationwide, going from a 56k backbone to a T1 and then 14 T1s. But the growth was too fast for even that. They re-engineered, and by 1990 planned to add T3 lines running in parallel with the T1s for a time. By 1991 there were 16 backbones, with traffic and users growing by an astounding 20% per month.
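The DNS idea mentioned above fits in two lines of Python (example.com is just a stand-in domain):

    import socket

    # Ask the resolver to turn a name into an address - the job the hosts
    # file did by hand before Mockapetris automated it with DNS.
    print(socket.gethostbyname("example.com"))  # prints something like 93.184.216.34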
Vint Cerf ended up at MCI, where he helped lobby for the privatization of the internet and helped found the Internet Society in 1988. The lobbying worked and led to the Scientific and Advanced-Technology Act in 1992. Before that, use of NSFNET was supposed to be for research; now it could expand to non-research and education uses. This allowed NSF to bring on even more nodes. And so by 1993 it was clear that this was growing beyond what a governmental institution whose charge was science could justify as "research" any longer. By 1994, Vint Cerf was designing the architecture and building the teams that would build the commercial internet backbone at MCI. And so NSFNET began the process of unloading the backbone and helped the world develop the commercial Internet by sprinkling a little money and know-how throughout the telecommunications industry, which was about to explode. NSFNET went offline in 1995, but by then there were networks in England, South Korea, Japan, and Africa, and CERN was connected to NSFNET over TCP/IP. And Cisco was selling routers that would fuel an explosion internationally. There was a war of standards, and yet over time we settled on TCP/IP as THE standard. And those were just some of the nets. The Internet is really not just NSFNET or ARPANET but a combination of a lot of nets. At the time there were a lot of time sharing computers that people could dial into, and following the release of the Altair, there was a rapidly growing personal computer market, with modems becoming more and more approachable towards the end of the 1970s. You see, we talked about these larger networks but not hardware. The first modulator-demodulator, or modem, was the Bell 101 dataset, which had been invented all the way back in 1958, loosely based on a previous model developed to manage SAGE computers. But the transfer rate, or baud, had stopped being improved upon at 300 for almost 20 years, and not much had changed. That is, until Hayes Microcomputer Products released a modem designed to run on the Altair 8800 S-100 bus in 1978. Personal computers could talk to one another. One of those Altair owners, Ward Christensen, met Randy Suess at the Chicago Area Computer Hobbyists' Exchange, and the two of them had this weird idea: have a computer host a bulletin board on one of their computers. People could dial into it and discuss their Altair computers when it snowed too much to meet in person for their club. They started writing a little code, and before you know it we had a tool they called Computerized Bulletin Board System software, or CBBS. The software and, more importantly, the idea of a BBS spread like wildfire, right along with the Atari, TRS-80, Commodore, and Apple computers that were igniting the personal computing revolution. The number of nodes grew, and as people started playing games, the speed of those modems jumped up, with the v.32 standard hitting 9600 baud in 84 and over 25k in the early 90s. By the early 1980s we got FidoNet, which was a network of Bulletin Board Systems, and by the early 90s we had 25,000 BBSes. And other nets had been on the rise. And these were commercial ventures. The largest of those dial-up providers was America Online, or AOL. AOL began in 1985 and, like most of the other dial-up providers of the day, was there to connect people to a computer they hosted, like a timesharing system, and give access to fun things: games, news, stocks, movie reviews, chatting with your friends, etc.
There was also CompuServe, The WELL, PSINet, Netcom, Usenet, AlterNet, and many others. Some started to communicate with one another with the rise of the Metropolitan Area Exchanges, which got an NSF grant to establish switched ethernet exchanges, and the Commercial Internet Exchange in 1991, established by PSINet, UUNet, and CERFnet out of California. Those slowly moved over to the Internet, and even AOL got connected to the Internet in 1989. Thus the dial-up providers went from effectively being timesharing systems to Internet Service Providers, as more and more people expanded their horizons away from the walled garden of the time sharing world and towards the Internet. The number of BBS systems started to wind down. All these IP addresses couldn't be managed easily, and so address management evolved from contracts held by research universities to DARPA, then to IANA as a part of ICANN, and eventually to Regional Internet Registries, so AFRINIC could serve Africa; ARIN could serve Antarctica, Canada, the Caribbean, and the US; APNIC could serve South, East, and Southeast Asia as well as Oceania; LACNIC could serve Latin America; and RIPE NCC could serve Europe, Central Asia, and West Asia. By the 90s the Cold War was winding down (temporarily at least), so they even added Russia to RIPE NCC. And so, using tools like Winsock, any old person could get on the Internet by dialing up. Dial-up modems transitioned to DSL and cable modems. We got the emergence of fiber, with regional centers and even national FiOS connections. And because of all the hard work of all of these people, and the money dumped into it by the various governments and research agencies, life is pretty darn good. When we think of the Internet today we think of this interconnected web of endpoints and content that is all available. Much of that was made possible by the development of the World Wide Web by Tim Berners-Lee in 1991 at CERN, and Mosaic came out of the National Center for Supercomputing Applications, or NCSA, at the University of Illinois, quickly becoming the browser everyone wanted to use until Marc Andreessen left to form Netscape. Netscape's IPO is probably one of the most pivotal moments, when investors from around the world realized that all of this research and tech was built on standards, and while there were some patents, the standards were freely useable by anyone. Those standards led to an explosion of companies like Yahoo!, from a couple of Stanford grad students, and Amazon, started by a young hedge fund Vice President named Jeff Bezos, who noticed all the money pouring into these companies and went off to do his own thing in 1994. The rush of companies that arose to create and commercialize content and ideas, to bring every industry online, was ferocious. And there were the researchers still writing the standards, and even commercial interests helping with that. And there were open source contributors who helped make some of those standards easier for regular old humans to implement. And tools for those who build tools. And from there the Internet became what we think of today: quicker and quicker connections, more and more productivity gains, a better quality of life, better telemetry into all aspects of our lives, and, with the miniaturization of devices, support for wearables that even extends to our bodies. Yet it still sits on the same fundamental building blocks as before. The IANA functions that manage IP addressing have moved to the private sector, as have many an onramp to the Internet.
Especially as internet access has become more ubiquitous and we are entering the era of 5G connectivity. And it continues to evolve as we pivot due to the new needs and threats a globally connected world represents: IPv6, various secure DNS options, countermeasures for spam and phishing, and dealing with the equality gaps surfaced by our new online world. We have disinformation, so sometimes we might wonder what's real and what isn't. After all, any old person can create a web site that looks legit and put whatever they want on it. Who's to say what reality is other than what we want it to be? This was pretty much what Morpheus was offering with his choice of pills in The Matrix. But underneath it all, there's history. And it's a history as complicated as unraveling the meaning of an increasingly digital world. And it is wonderful and frightening and lovely and dangerous and true and false and destroying the world and saving the world all at the same time. This episode is pretty simplistic, and many of the aspects we cover have entire episodes of the podcast dedicated to them, from the history of Amazon to Bob Taylor to AOL to the IETF to DNS and even Network Time Protocol. It's a story that necessarily leaves people out; otherwise scope creep would go all the way back to include Volta and the constant electrical current humanity received with the battery. But hey, we also have an episode on that! And many an advance has plenty of books and scholarly works dedicated to it - all the way back to the first known computer (in the form of clockwork), the Antikythera device out of Ancient Greece. Heck, even Lou Gerstner deserves a mention for selling IBM's stake in all this to focus on things that kept the company going, not moonshots. But I'd like to dedicate this episode to everyone not mentioned in trying to tell a story of emergent networks. Just because the networks were growing fast and our modern infrastructure was becoming more and more deterministic doesn't mean there aren't so, so, so many people who are a part of this story - whether they were writing a text editor, helping fund, pushing paper, writing specs, selling network services, or getting zapped while trying to figure out how to move current. Each with their own story to be told. As we round the corner into the third season of the podcast we'll start having more guests. If you have a story and would like to join us, use the email button on thehistoryofcomputing.net to drop us a line. We'd love to chat!
Java, Ruby, PHP, Go. These are the languages behind web applications that dynamically generate content that is then interpreted as a file by a web browser. That file is rarely static these days, and the power of the web is that an app or browser can reach out and obtain some data, get back some XML or JSON or YAML, and provide an experience to a computer, mobile device, or even embedded system (a minimal example of the pattern follows below). The web is arguably the most powerful, transformational technology in the history of technology. But the story of the web begins in philosophies that far predate its inception. It goes back to a file, which we can think of as a document, on a computer that another computer reaches out to and interprets. A file composed of hypertext. Ted Nelson coined the term hypertext. Plenty of others put the concepts of linking objects into the mainstream of computing. But he coined the term that he's barely connected to in the minds of many. Why is that? Tim Berners-Lee invented the World Wide Web in 1989. Elizabeth Feinler developed a registry of names that would evolve into DNS, so we could find computers online and access those web sites without typing in impossible-to-remember numbers. Bob Kahn and Leonard Kleinrock were instrumental in the Internet Protocol, which allowed all those computers to be connected together, providing the schemes for those numbers. Some will know these names; most will not. But a name that probably doesn't come up enough is Ted Nelson. His tale is one of brilliance in the early days of computing, the spread of BASIC, and an urge to do more. It's a tale of the hacker ethic. And yet, it's also a tale of irreverence - to be used as a warning for those with aspirations to be remembered for something great. Or is it? Steve Jobs famously said "real artists ship." Ted Nelson did ship. Until he didn't. Let's go all the way back to 1960, when he started Project Xanadu. Actually, let's go a little further back first. Nelson was born to TV director Ralph Nelson and Celeste Holm, who won an Academy Award for her role in Gentleman's Agreement in 1947, took home another pair of nominations through her career, and was the original Ado Annie in Oklahoma!. His dad worked on The Twilight Zone - so of course Ted majored in philosophy at Swarthmore College and then went off to the University of Chicago and then Harvard for graduate school, taking a stab at film after he graduated. But he was meant for an industry that didn't exist yet but would some day eclipse the film industry: software. While in school he got exposed to computers and started to think about this idea of a repository of all the world's knowledge. And it's easy to imagine a group of computing aficionados sitting in a drum circle, smoking whatever they were smoking, and having their minds blown by that very concept. And yet, it's hard to imagine anyone in that context doing much more. And yet he did. Nelson created Project Xanadu in 1960. As we'll cover, he did a lot of projects during the remainder of his career. The journey is what is so important, even if we never get to the destination. Because sometimes we influence the people who get there. And the history of technology is as much about failed or incomplete evolutions as it is about those that become ubiquitous. It began with a project while he was enrolled in Harvard grad school. Other word processors were at the dawn of their existence. But he began thinking through and influencing how they would handle information storage and retrieval.
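As a minimal sketch of that fetch-and-render pattern from the top of this piece, in Python (the URL is a placeholder, not a real endpoint):

    import json
    from urllib.request import urlopen

    # An app reaches out, obtains some structured data as JSON, and builds
    # an experience from it; the payload could as easily be XML or YAML.
    with urlopen("https://api.example.com/articles/42") as resp:
        doc = json.load(resp)

    print(doc.get("title", "untitled"))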
Xanadu was supposed to be a computer network that connected humans to one another. It was supposed to be simple, and a scheme for world-wide electronic publishing. Unlike the web, which would come nearly three decades later, it was supposed to be bilateral, with broken links self-repairing, much as nodes on the ARPAnet did. His initial proposal was a program in machine language that could store and display documents. Being before the advent of Markdown, ePub, XML, PDF, RTF, or any of the other common open formats we use today, it was rudimentary and would evolve over time. Keep in mind, it was for documents, and as Nelson would say later, the web - which began as a document tool - was a fork of the project. The term Xanadu was borrowed from Samuel Taylor Coleridge's Kubla Khan, itself written after some opium-fueled dreams about a garden in Kublai Khan's Shangdu, or Xanadu. In his biography, Coleridge explained the rivers in the poem supply "a natural connection to the parts and unity to the whole," and he described a "stream, traced from its source in the hills among the yellow-red moss and conical glass-shaped tufts of bent, to the first break or fall, where its drops become audible, and it begins to form a channel." Connecting all the things was the goal, and so Xanadu was the name. He gave a talk and presented a paper called "A File Structure for the Complex, the Changing and the Indeterminate" at the Association for Computing Machinery in 1965 that laid out his vision. This was the dawn of interactivity in computing. Digital Equipment had launched just a few years earlier and brought the PDP-8 to market that same year. The smell of change was in the air, and Nelson was right there. After that, he started to see all these developments around the world. He worked on a project at Brown University to develop a word processor with many of his ideas in it. But the output of that project, as with most word processors since, was to get things printed. He believed content was meant to be created and live its entire lifecycle in digital form. This would provide perfect forward and reverse citations, text enrichment, and change management. And maybe, if we all stand on the shoulders of giants, it would allow us the ability to avoid rewriting or paraphrasing the works of others to include them in our own writings. We could do more without that tedious regurgitation. He furthered his counter-culture credentials by going to Woodstock in 1969. Probably not for that reason, but it happened nonetheless. And he traveled and worked with more and more people and companies, learning and engaging and enriching his ideas. And then he shared them. Computer Lib/Dream Machines was a paperback book. Or two. It had a cover on each side. Originally published in 1974, it was one of the most important texts of the computer revolution. Steven Levy called it an epic. It's rare to find it for less than a hundred bucks on eBay at this point because of how influential it was and what an amazing snapshot in time it represents. Xanadu was to be a hypertext publishing system in the form of Xanadocs, or files that could be linked to from other files. A Xanadoc used Xanalinks to embed content from other documents into a given document. These spans of text would become transclusions, changing in every document that included them whenever they changed in the live source (there's a toy sketch of the idea below). The iterations towards working code were slow, and the years ticked by. That talk in 1965 gave way to the 1970s, then the 80s.
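To make the transclusion idea concrete, here is a toy model in Python. The structures are hypothetical and vastly simpler than Nelson's actual design, but they show the key property: documents embed references to spans of other documents rather than copies.

    # A shared store of live source documents.
    sources = {"kubla": "In Xanadu did Kubla Khan a stately pleasure-dome decree"}

    class Transclusion:
        # A reference to a span of another document, resolved at render time.
        def __init__(self, doc_id, start, end):
            self.doc_id, self.start, self.end = doc_id, start, end

        def render(self):
            return sources[self.doc_id][self.start:self.end]

    def render(doc):
        return "".join(p.render() if isinstance(p, Transclusion) else p for p in doc)

    essay = ["Coleridge opens with: ", Transclusion("kubla", 0, 24)]
    print(render(essay))   # quotes the live source

    sources["kubla"] = "In XANADU did KUBLA KHAN a stately pleasure-dome decree"
    print(render(essay))   # the essay changes because the source did

Edit the source and every document that transcludes it reflects the change, which is exactly the behavior a copy-and-paste world gives up.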
Some thought him brilliant; others didn't know what to make of it all. But many knew of his ideas for hypertext, and once the ideas were out there, something like them started to feel inevitable. Byte Magazine published many of his thoughts in 1988 in a piece called "Managing Immense Storage," and by then the personal computer revolution had come in full force. Tim Berners-Lee put the first node of the World Wide Web online the next year, using a protocol they called Hypertext Transfer Protocol, or http. Yes, the hypertext philosophy was almost a means of paying homage to the hard work and deep thinking Nelson had put in over the decades. But not everyone saw it as though Nelson had made great contributions to computing. "The Curse of Xanadu" was an article published in Wired Magazine in 1995. In the article, the author points out that the web had come along using many of the ideas Nelson and his teams had worked on over the years but actually shipped, whereas Nelson hadn't. Once shipped, the web rose in popularity, becoming the ubiquitous technology it is today. The article looked at Xanadu as vaporware. But there is a deeper, much more important meaning to Xanadu in the history of computing. Perhaps inspired by the Wired article, the group released an incomplete version of Xanadu in 1998. But by then other formats, including PDF, which was invented in 1993, and .doc for Microsoft Word, were the primary mechanisms by which we stored documents, and first gopher and then the web were spreading to interconnect humans with content (video: https://www.youtube.com/watch?v=72M5kcnAL-4). The Xanadu story isn't a tragedy. Would we have had hypertext as a part of Douglas Engelbart's oNLine System without it? Would we have object-oriented programming or later the World Wide Web without it? The very word hypertext is almost an homage, even if they don't know it, to Nelson's work. And the look and feel of his work lives on in places like GitHub, whether directly influenced or not, where we can see changes in code side-by-side with actual production code, changes that are stored and perhaps rolled back forever. Larry Tesler coined the term Cut and Paste. While Nelson calls him a friend in Werner Herzog's Lo and Behold, Reveries of the Connected World, he also points out that Tesler's term is flawed. And I think this is where we as technologists have to sometimes trim down our expectations of how fast evolutions occur. We take tiny steps because as humans we can't keep pace with the rapid rate of technological change. We can look back and see a two-steps-forward, one-step-back approach since the dawn of written history. Nelson still doesn't think the metaphors that harken back to paper have any place in the online written word. Here's another important trend in the history of computing. As we've transitioned to more and more content living online exclusively, the content has become diluted. One publisher I wrote online pieces for asked that they all be +/- 700 words, that paragraphs be no more than 4 sentences long (preferably 3), and that the sentences be written at about a 5th or 6th grade level. Maybe Nelson would claim that this de-evolution of writing is due to search engine optimization gamifying the entirety of human knowledge, and that a tool like Xanadu would have been the fix. After all, if we could borrow the great works of others we wouldn't have to paraphrase them. But I think, as with most things, it's much more nuanced than that. Our always-online, always-connected brains can only accept smaller snippets. So that's what we gravitate towards.
Actually, we have plenty of capacity for whatever we actually choose to immerse ourselves in. But we have more options than ever before, and of course we immerse ourselves in video games or other less literary pursuits. Or are they more literary? Some generations thought books to be dangerous. As do all oppressors. So who am I to judge where people choose to acquire knowledge or what kind they indulge themselves in? Knowledge is power, and I'm just happy they have it. And they have it in part because others were willing to water down the concepts to ship a product. Because the history of technology is about evolutions, not revolutions. And those often take generations. And Nelson is responsible for some of the evolutions that brought us the ht in http or html. And for that we are truly grateful! As with the great journey from Lord of the Rings, rarely is greatness found alone. The Xanadu adventuring party included Cal Daniels, Roger Gregory, Mark Miller, Stuart Greene, Dean Tribble, and Ravi Pandya. The project became a part of Autodesk in the 80s, got rewritten in Smalltalk, and was considered a rival to the web, but really it is more of an evolutionary step on that journey. If anything it's a divergence then convergence to and from Vannevar Bush's Memex. So let me ask this as a parting thought: are the places where you are not willing to sacrifice any of your core designs or beliefs worth the price being paid? Are they worth someone else ending up with a place in the history books where (like with this podcast) we oversimplify complex topics to make them digestible? Sometimes it's worth it. In no way am I in a place to judge the choices of others. Only history can really do that - but when it happens it's usually an oversimplification anyways... So the building blocks of the web lie in irreverence - in hypertext. And while some grew out of irreverence and diluted their vision after an event like Woodstock, others like Nelson and his friend Douglas Engelbart forged on. And their visions didn't come with commercial success. But as an integral building block to the modern connected world today, they represent as great a mind as practically anyone else in computing.
Let's oversimplify something in the computing world. Which is what you have to do when writing about history. You have to put your blinders on so you can get to the heart of a given topic without overcomplicating the story being told. And in the evolution of technology we can't mention all of the advances that lead to each subsequent evolution. It's wonderful and frustrating all at the same time. And that value judgement of what goes in and what doesn't can be tough. Let's start with the fact that there are two main types of processors in our devices. There's the x86 chipset developed by Intel and AMD, and then there are the RISC-based processors, which include ARM and, for the old school people, PowerPC and SPARC. Today we're going to set aside the x86 chipset that was dominant for so long and focus on how the RISC, and so the ARM, family emerged. First, let's think about what the main difference is between ARM and x86. RISC, and so ARM, chips focus on reducing the number of instructions required to perform a task to as few as possible, and so RISC stands for Reduced Instruction Set Computing. Intel, other than the Atom series chips, has focused the x86 chips on high performance and high throughput. Big and fast, no matter how much power and cooling is necessary. The ARM processor uses simpler instructions, which means there's less logic on the chip, and so more instructions are required to perform certain logical operations (there's a toy contrast of the two styles below). This increases memory use and can increase the amount of time to complete an execution, which ARM developers address with techniques like pipelining, or instruction-level parallelism on a processor. Seymour Cray came up with this approach of splitting up instruction execution so different units handle different instructions at once, and the CDC STAR, Amdahl, and then ARM implemented it as well. The x86 chips are Complex Instruction Set Computing chips, or CISC. Those will do larger, more complicated tasks, like floating point operations or memory searches, on the chip. That often requires more consistent and larger amounts of power. ARM chips are built for low power. The reduced complexity of operations is one reason, but it's also in the design philosophy. This means fewer heat sinks and often accounting for less consistent streams of power. This 130-watt x86 vs 5-watt ARM tradeoff can mean slightly lower clock speeds, but the chips can cost more, as people will spend less on heat sinks and power supplies. This also makes ARM excellent for mobile devices. The inexpensive MOS 6502 chips helped revolutionize the personal computing industry in 1975, finding their way into the Apple II and a number of early computers. They were RISC-like but CISC-like as well. They took some of the instruction set architecture family from the IBM System/360 through to the PDP, the Data General Nova, the Intel 8080, and Zilog, and after the emergence of Windows, Intel finally captured the personal computing market and the x86 flourished. But the RISC architecture actually goes back to the ACE, developed in 1946 by Alan Turing. It wasn't until the 1970s that Carver Mead from Caltech and Lynn Conway from Xerox PARC saw that the number of transistors was going to plateau on chips while workloads on chips were growing exponentially. ARPA and other agencies needed more and more instructions, so they instigated what we now refer to as the VLSI project, a DARPA program initiated by Bob Kahn to push into the 32-bit world. They would provide funding to different universities, including Stanford and the University of North Carolina.
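To make the instruction-count tradeoff above concrete, here is a toy simulator in Python. The instruction names are hypothetical, not real x86 or ARM encodings:

    # A CISC-style machine might update memory in one complex instruction,
    # where a RISC-style machine uses several simple, pipeline-friendly ones.

    cisc_program = [
        ("ADD_MEM", "x", "y"),       # mem[x] = mem[x] + mem[y] in one instruction
    ]

    risc_program = [
        ("LOAD",  "r1", "x"),        # r1 = mem[x]
        ("LOAD",  "r2", "y"),        # r2 = mem[y]
        ("ADD",   "r1", "r1", "r2"), # r1 = r1 + r2
        ("STORE", "r1", "x"),        # mem[x] = r1
    ]

    def run(program, mem):
        regs = {}
        for op, *args in program:
            if op == "ADD_MEM":
                mem[args[0]] += mem[args[1]]
            elif op == "LOAD":
                regs[args[0]] = mem[args[1]]
            elif op == "ADD":
                regs[args[0]] = regs[args[1]] + regs[args[2]]
            elif op == "STORE":
                mem[args[1]] = regs[args[0]]
        return mem

    print(run(cisc_program, {"x": 2, "y": 3}))  # {'x': 5, 'y': 3}, 1 instruction
    print(run(risc_program, {"x": 2, "y": 3}))  # same result, 4 instructions

Same answer either way; the difference is how much work each instruction does, which is exactly the power and decoding-logic tradeoff described above.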
Out of those projects we saw the Geometry Engine, which led to a number of computer-aided design, or CAD, efforts to aid in chip design. Those workstations, when linked together, evolved into tools used on the Stanford University Network, or SUN, which would effectively spin out of Stanford as Sun Microsystems. And across the bay at Berkeley we got a standardized Unix implementation that could use the tools being developed there, the Berkeley Software Distribution, or BSD, which would eventually become the operating system used by Sun and SGI and lives on now in OpenBSD and other variants. And the efforts from the VLSI project led to Berkeley RISC in 1980 and Stanford MIPS, as well as the multi-chip wafer. The leader of that Berkeley RISC project was David Patterson, who still serves as vice chair of the RISC-V Foundation. The chips would add more and more registers but with fewer specializations. This led to the need for more memory. But UC Berkeley students shipped a faster chip than was otherwise on the market in 1981. And the RISC II was usually double or triple the speed of the Motorola 68000. That led to the Sun SPARC and the DEC Alpha. There was another company paying attention to what was happening in the RISC project: Acorn Computers. They had been looking into using the 6502 processor until they came across the scholarly works coming out of Berkeley about their RISC project. Sophie Wilson and Steve Furber from Acorn then got to work building an instruction set for the Acorn RISC Machine, or ARM for short. They had the first ARM working by 1985, which they used to build the Acorn Archimedes. The ARM2 would be faster than the Intel 80286, and by 1990 Apple was looking for a chip for the Apple Newton. A new company called Advanced RISC Machines, or ARM, would be founded, and from there they grew, with Apple being a shareholder through the 90s. By 1992 they were up to the ARM6, and the ARM610 was used for the Newton. DEC licensed the ARM architecture to develop the StrongARM, selling chips to other companies. Acorn would be broken up in 1998 and parts sold off, but ARM would live on until acquired by SoftBank for $32 billion in 2016. SoftBank is currently in acquisition talks to sell ARM to Nvidia for $40 billion. Meanwhile, John Cocke at IBM had been working on RISC concepts since 1975 for embedded systems, and by 1982 IBM moved on to start developing their own 32-bit RISC chips. This led to the POWER instruction set, which they shipped in 1990 as the RISC System/6000, or as we called them at the time, the RS/6000. They scaled that down to the PowerPC and in 1991 forged an alliance with Motorola and Apple. DEC designed the Alpha. It seemed as though the computer industry was Microsoft and Intel vs. the rest of the world, using a RISC architecture. But by 2004 the alliance between Apple, Motorola, and IBM began to unravel, and by 2006 Apple moved the Mac to an Intel processor. But something was changing in computing. Apple shipped the iPod back in 2001, effectively ushering in the era of mobile devices. By 2007, Apple released the first iPhone, which shipped with a Samsung ARM. You see, the interesting thing about ARM is they don't fab chips like Intel does - they license technology and designs. Apple licensed the Cortex-A8 from ARM for the iPhone 3GS by 2009, but had an ambitious lineup of tablets and phones in the pipeline. And so in 2010 Apple did something new: they made their own system on a chip, or SoC.
Continuing to license some ARM technology, Apple pushed on, getting between 800 MHz and 1 GHz out of the chip and using it to power the iPhone 4, the first iPad, and the long-overdue second-generation Apple TV. The next year came the A5, used in the iPad 2 and the first iPad mini, then the A6 at 1.3 GHz for the iPhone 5, and the A7 for the iPhone 5s and iPad Air. That was the first 64-bit consumer SoC. In 2014, Apple released the A8 processor for the iPhone 6, which came in speeds ranging from 1.1 GHz up to the 1.5 GHz chip in the 4th-generation Apple TV. By 2015, Apple was up to the A9, which clocked in at 1.85 GHz for the iPhone 6s. Then we got the A10 in 2016, the A11 in 2017, the A12 in 2018, the A13 in 2019, and the A14 in 2020, with neural engines, 4 GPUs, and 11.8 billion transistors, compared to the roughly 30,000 in the original ARM. And it's not just Apple. Samsung has been on a similar tear, firing up the Exynos line in 2011 and continuing to license ARM designs up to the Cortex-A55, with similar features to the Apple chips, used on devices like the Samsung Galaxy A21. The same goes for Qualcomm's Snapdragon line and the Broadcom SoCs. In fact, a Broadcom SoC was used in the Raspberry Pi (developed in association with Broadcom) in 2012. The five models of the Pi helped bring on a mobile and IoT revolution. And so nearly every mobile device now ships with an ARM chip, as do many of the devices we place around our homes so our digital assistants can help run our lives. Over 100 billion ARM processors have been produced, well over 10 for every human on the planet. And the number is about to grow even more rapidly. Apple surprised many by announcing they were leaving Intel to design their own chips for the Mac. Given that the PowerPC chips were RISC, the ARM chips in the mobile devices are RISC, and the history Apple has with the platform, it's no surprise that Apple is going back in that direction with the M1, Apple's first system on a chip for a Mac. And the new MacBook Pro screams. Even software running in Rosetta 2 on my M1 MacBook is faster than on my Intel MacBook. And at 16 billion transistors, with an 8-core GPU and a 16-core neural engine, I'm sure developers are hard at work developing the M3 on these new devices (since, you know, I assume the M2 is done by now). What's crazy is, I haven't felt like Intel had a competitor other than AMD in the CPU space since Apple switched from the PowerPC. Actually, those weren't great days. I haven't felt that way since I realized no one but me had a DEC Alpha, or since I took the SPARC off my desk so I could finally play Civilization. And this revolution has been a constant stream of evolutions, 40 years in the making. It started with an ARPA grant, though various evolutions from there died out. And so really, it all started with Sophie Wilson. She helped give us the BBC Micro and the ARM. She was part of the move from Acorn Computers to Element 14, and then ended up at Broadcom when they bought that company in 2000, where she continues to serve as Director of IC Design. We can definitely thank ARPA for sprinkling funds around prominent universities to get us past 10,000 transistors on a chip. Given that chips continue to advance at such a lightning pace, I can't imagine where we'll be in another 40 years. But we owe her (and her coworkers at Acorn, and the team at VLSI Technology, now part of NXP Semiconductors) thanks for their hard work and innovations.
How did the Internet come into being, and for what purpose? What were the early problems that led to its development? What were the fundamental concepts that allowed it to evolve over time into what it is today, a technology that impacts every aspect of our lives? Nomad Futurist is honored to feature Dr. Vint Cerf, one of the founding fathers of the Internet. Cerf played a key role in leading the development of the Internet and Internet-related data packet and security technologies. This is an opportunity to hear the fascinating story of the inception of the Internet directly from the co-designer of its key protocols and architecture. Cerf’s exposure to computers reflected a mix of hands-on experience as well as a deep academic background at UCLA and Stanford, where he focused on data transmission. This led to his work with Bob Kahn, one of the architects of ARPANET, funded by the U.S. Defense Department. He describes how Kahn invited him to help solve the challenge of developing a uniform delivery system across different types of network technologies, and how their collaboration led to the development of TCP/IP, the fundamental communication protocols at the heart of the Internet. He explains that he got drawn into the field inexorably: “Once the internet design work got started, it has been central in my career ever since.” Cerf, who is currently VP and Chief Internet Evangelist at Google, recounts his journey and the evolution of the system that allowed the development of a universal communications network where data can be transferred independently of the routing technologies deployed or the ultimate end-user applications. “This extensibility, this willingness to let people invent new things and fit them into the architecture that was not so rigid that it would reject that was really powerful.” He talks about his tremendous good fortune to have been at the right place at the right time, and even the good fortune to not have “known” everything. “Sometimes this idea of incompleteness and deliberate ignorance is sometimes your friend when you want to build something whose certainty is uncertain and whose design is open.” Cerf talks eloquently about the past, present trends, and the future to come, including a discussion of quantum computing and the establishment of Internet nodes throughout the solar system. He also mentions astrophysics, microbiology, neural interfaces, IoT, and better programming environments as areas needing deeper exploration. When asked if he could have anticipated what the Internet would become, he cites the World Wide Web and the birth of search engines as important developments that had not been foreseen. “It’s been 50 years… so this whole thing in some ways has been a personal experience for me. It’s something I have lived through day by day watching it grow… we didn’t imagine everything. But we did know that what we were doing had powerful enabling potential.” Cerf’s key advice for the young: “Take risks early so you have plenty of time to recover from a mistake… It's often the case that a foiled scientific experiment teaches you more than a successful one does… and don't be afraid to break out of conventional thinking… Much of what we discover turns out to be something that doesn't look like it's possible or gets rejected by the mainstream.” Widely known as a “Father of the Internet,” Dr. Vinton G. Cerf is the co-designer of the TCP/IP protocols and the architecture of the Internet. He contributes to global policy development and the continued spread of the Internet.
Since 2005, Cerf has served as Vice President and Chief Internet Evangelist for Google. In this role, he is responsible for identifying new enabling technologies to support the development of advanced, Internet-based products and services. He has served in executive positions at MCI, the Corporation for National Research Initiatives and the Defense Advanced Research Projects Agency and on the faculty of Stanford...
In the latest episode of ACM ByteCast, host Jessica Bell chats with former ACM President Vint Cerf, Vice President and Chief Internet Evangelist at Google, an Internet pioneer widely recognized as one of “the fathers of the Internet.” His many recognitions include the ACM A.M. Turing Award, the National Medal of Technology, the Presidential Medal of Freedom, and the Marconi Prize. Cerf takes us along on an amazing voyage from seeing his first tube-based computer in 1958 to his work on ARPANET and TCP/IP with Bob Kahn, providing a brief history of the Internet in the process. Along the way, he explains how they approached the problem of building a network architecture that scaled astronomically over time. Cerf also points to important social, business, and ethical problems yet to be resolved, and explains why it’s an exciting time to be a student in computing.
Months before the first node of ARPANET went online, the intrepid early engineers were just starting to discuss the technical underpinnings of what would some day evolve into the Internet. Here, we hear how hosts would communicate with the IMPs, the early routing devices (although maybe more like a Paleolithic version of what's in a standard network interface today). It's nerdy. There's discussion of packets and which bits might do what, and later Vint Cerf and Bob Kahn would redo most of this early work as the protocols evolved toward TCP/IP. But reading their technical notes and being able to trace them through thousands of RFCs that show the evolution into the Internet we know today is an amazing look into the history of computing.
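The layering those notes groped toward is easy to sketch: each protocol level wraps the bytes above it in its own header and treats the payload as opaque. Here is a toy Python illustration; the bracketed headers are fake placeholders, not real wire formats.

    # Toy sketch of encapsulation: every layer prepends its own header
    # and never inspects the layers stacked above it.

    def wrap(header: bytes, payload: bytes) -> bytes:
        return header + payload

    app_data  = b"GET /index.html"
    tcp_seg   = wrap(b"[TCP src=5000 dst=80]", app_data)
    ip_packet = wrap(b"[IP src=10.0.0.1 dst=10.0.0.2]", tcp_seg)
    frame     = wrap(b"[ETH]", ip_packet)

    # A router only needs to read the outer headers; the application
    # bytes ride through untouched.
    print(frame)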
What is the Internet? The development of computers in the 1950s put data transfer between devices on the agenda. Packet-switching designs for digital networks were developed in laboratories, especially in the USA but also in France and England. The history of the Internet started this way, with the US Department of Defense launching the first such network, ARPANET. The first message over this network was sent from a professor's lab at the University of California, Los Angeles. Development work carried out through 1981 gave ARPANET a structure that could connect to larger networks. Although the telegraph systems of the 19th century were among the pioneers of digital communication, digital communication in the modern sense developed in the early 20th century. The first networked computers could share data only at the level the era's technology allowed, that is, point to point between two main computers. Over time this system was developed further, and the distance between mainframes or terminals stopped mattering. With the arrival of high connection speeds, the history of the Internet moved into a different dimension: file sharing between two devices could now be done easily. The main problem that remained was the possibility of damage to the physical network carrying the data between two points. The term "internet" was used for the first time in the TCP protocol work of Vint Cerf and Bob Kahn. The system, which facilitates access to information by combining networks around the world, was developed further between 1973 and 1980. As of January 1, 1983, ARPANET required that every device connected to the network adopt the Transmission Control Protocol. Shortly after this protocol laid the foundations of the Internet, a document was published through the RFC process describing domain extensions such as .gov, .com, and .edu, and the history of the Internet entered a new era. The years between 1990 and 1995 brought important turning points for digital networks: ARPANET was retired in 1990, and the NSFNET backbone followed. The "world wide web" that we all know as "www", which came out of CERN, the research laboratory near Geneva, Switzerland, now provided access to the digital network even at home. This system, a revolution around the world, allowed many people to benefit from social networks and online shopping.
The History Of Cisco. Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we're going to talk about the history of Cisco. They have defined the routing and switching world for decades, practically since the beginning of the modern era. They've bought companies, they've grown and shrunk and grown again. And their story feels similar in many ways to those of the organizations that came out of the tail end of the grants tossed around by DARPA. These companies harnessed incredibly innovative ideas and technology, commercialized all of that amazing research, and changed the world. They ushered in a globally connected network, almost instantaneously transmitting thoughts and hopes and dreams and failures and atrocities. They made money. Massive, massive truckloads of money. But they changed the world for the better, hopefully in an irrevocable kind of way. The Cisco story is interesting because it symbolizes a time when we were moving beyond the beginnings of the Internet. Stanford had been involved in ARPANET since the late 60s, but Vint Cerf and Bob Kahn had been advancing TCP and IP in the 70s, establishing IPv4 in 1983. And inspired by ALOHAnet, Bob Metcalfe and the team at Xerox PARC had developed Ethernet in 1974. The computer science research community had embraced these, with the use of email and time sharing spurring more and more computers to be connected to the Internet. Raw research being done out of curiosity and to make the world a better place. The number of devices connected to the growing network kept increasing. And Stanford was right in the center of it. Silicon Valley founders just keep coming out of Stanford, but in this case they were university staff running campus computing, and early on. They invented the multi-protocol router and financed the startup with their own personal credit cards. Leonard Bosack and Sandy K. Lerner are credited with starting Cisco, but the company rose out of projects to network computers on the Stanford campus. The project got started after Xerox PARC donated some Alto workstations and Ethernet boards they didn't need anymore in 1980, shortly after Metcalfe left Xerox to start 3Com. By then Cerf was off to MCI to help spur faster development of the backbones. And CSNET came along in 1981, bringing even more teams from universities and private companies into the fold. The Director of Computer Facilities, Ralph Gorin, needed longer network cables to get even more devices connected. He got what would amount to a switch today. The team was informal. They used a motherboard from Andy Bechtolsheim, later a founder of Sun Microsystems. They borrowed boards from other people. Bosack himself, who had been an ARPANET contributor, donated a board. And amongst the most important pieces was the software William Yeager wrote: a little routing program that connected the medical center computers to the computer science department computers and could use the PARC Universal Packet (PUP), XNS, IP, and CHAOSnet protocols. The network linked all types of computers, from Xerox Altos to mainframes, using a number of protocols, including the most important one for the future: IP, or the Internet Protocol. They called it the Blue Box. And given the number of computers that were at Stanford, various departments around campus started asking for them, as did other universities.
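The Blue Box's actual source isn't reproduced here, but the heart of any multi-protocol router can be sketched as a dispatch table keyed on each frame's protocol type. This is a hypothetical Python sketch of the idea, not Yeager's code.

    # Hypothetical multi-protocol dispatch: look at the frame's protocol
    # field and hand the payload to the matching forwarding routine.

    def route_ip(payload):
        print("forwarding IP packet:", payload)

    def route_pup(payload):
        print("forwarding PUP packet:", payload)

    def route_chaosnet(payload):
        print("forwarding CHAOSnet packet:", payload)

    HANDLERS = {"IP": route_ip, "PUP": route_pup, "CHAOS": route_chaosnet}

    def dispatch(frame):
        handler = HANDLERS.get(frame["proto"])
        if handler is None:
            print("dropping frame, unknown protocol:", frame["proto"])
        else:
            handler(frame["payload"])

    dispatch({"proto": "IP", "payload": b"hello mainframe"})
    dispatch({"proto": "PUP", "payload": b"hello alto"})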
There were 5,000 computers connected at Stanford by the time they were done. Seeing a potential business here, Bosack, then running the computers for the Computer Science department, and Lerner, then the Director of Computer Facilities for the Graduate School of Business, founded Cisco Systems in 1984, short for San Francisco, and used an image of the Golden Gate Bridge as their logo. You can see the same pattern unfold all over. When people from MIT built something cool, it was all good. Until someone decided to monetize it. Same with chip makers and others. By 1985, Stanford formally started a new project to link all the computers they could on the campus. Yeager gave the source to Bosack and Kirk Lougheed so they could strip out everything but the Internet Protocol and beef that up. I guess Yeager saw routers as commercially viable, and he asked the university if he could sell the Blue Box. They said no. But Bosack and Lougheed were plowing ahead, using Stanford time and resources. Bosack and Lerner hadn't asked, and they were building these routers in their home, and it was basically the same thing as the Blue Box, including the software. Most of the people at Stanford thought they were crazy. They kept adding more code and logic, and the devices kept getting better. By 1986, Bosack's supervisor Les Earnest caught wind and started to investigate. He went to the dean, and Bosack was given an ultimatum: go do the wacky Cisco thing or stay at Stanford. Bosack quit to try to build Cisco into a company. Lougheed ran into something similar and quit as well. Lerner had already left, but Greg Satz and Richard Troiano left too, bringing them up to 5 people. Yeager was not one of them, even though he'd worked a lot on the software, including on nights and weekends. But everyone was learning, and when it was to benefit the university, it was fine. When things went commercial, though, Stanford got the lawyers involved. Yeager looked at the code and still saw some of his in there. I'm sure the Cisco team considered that technical debt. Cisco launched the Advanced Gateway Server (AGS) router in 1986, two years after the Mac was released. The software, initially written by Yeager and improved by Bosack and Lougheed, served as the operating system, later called Cisco IOS. Stanford thought about filing a criminal complaint of theft but realized it would be hard to prosecute, and ugly, especially given that Stanford itself is a non-profit. They had $200,000 in contracts and couldn't really be paying all this attention to lawsuits instead of building the foundations of the emerging Internet. So instead they all agreed to license the software, and the imprint of the physical boards being used (known as photomasks), to the fledgling Cisco Systems in 1987. This was crucial, as now Cisco could go to market with products without the fear of lawsuits. Stanford got discounts on future products, $19,300 up front, and $150,000 in royalties. No one knew what Cisco would become, so it was considered a fair settlement at the time. Yeager, being a mensch and all, split his 80% of the royalties between the team. He would go on to give us IMAP and Kermit before moving to Sun Microsystems. Speaking of Sun, there was bad blood between Cisco and Stanford, which I always considered ironic given that a similar thing happened when Sun was founded, in some part using Stanford intellectual property and unused hardware, back in 1982.
I think the difference is trying to hide things versus being effusive with the credit for code and inventions. But as sales increased, Lougheed continued to improve the code, and the company hired Bill Graves as CEO in 1987, who was replaced with John Morgridge in 1988. And the sales continued to skyrocket. Cisco went public in 1990, when they were valued at $224 million. Lerner was fired later that year, and Bosack decided to join her. And as is so often the case after a company goes public, the founders who had a vision of monetizing great research were no longer at the startup. Seeing a need for more switching, Cisco acquired a number of companies, including Grand Junction and Crescendo Communications, which formed like Voltron to become the Cisco Catalyst, arguably the most prolific switching line in computing. Seeing the success of Cisco and the needs of the market, a number of others started building routers and firewalls. The ocean was getting redder. John Mayes had the idea in 1994 to build a device that would be called the PIX, and Brantley Coile in Athens, Georgia programmed it; the name, short for Private Internet eXchange, was a play on PBX, and the box ran NAT over IP. We were running out of IP addresses because, at the time, organizations used public IPs everywhere. But NAT was about to become a thing, and RFC 1918 was being reviewed by the IETF. They brought in Johnson Wu and shipped a device that could run NAT that year, ushering in the era of the private local area network. Cisco quickly acquired the company, and the Cisco PIX would become the standard firewall used in organizations looking to get their computers on the Internets. The PIX would sell and make Cisco all the monies until it was replaced by the Cisco ASA in 2008. John T. Chambers replaced Morgridge in 1995 and led Cisco as its CEO until 2015. In 1996, Cisco's revenues hit $5.4 billion, making it one of Silicon Valley's biggest success stories. By 1998 they were up to $6B. Their stock peaked in 2000. By the end of the dot-com bubble in the year 2000, Cisco had a more than $500 billion market capitalization. They were building an industry. The CCNA, or Cisco Certified Network Associate, and the CCIE, or Cisco Certified Internetwork Expert, were the hottest certifications on the market. When I got mine it was much easier than it is today. The market started to fragment after that. Juniper came out strong in 1999 and led a host of competitors that landed in niche markets and expanded into core markets. But the ASA combined Cisco's IPS, VPN concentration, and NAT functionality into one simpler box that actually came with a decent GUI. The GUI seemed like sacrilege at the time. And instead of sitting on top of a network operating system, it ran on Linux. At the top end they could handle 10 million connections, important once devices established and maintained so many connections to various services. And you could bolt on antivirus and other features that were becoming increasingly necessary at various layers of connectivity at the time. They went down-market for routing devices with an acquisition of Linksys in 2003. They acquired WebEx in 2007 for over $3 billion, and that became the standard in video conferencing until a solid competitor called Zoom emerged recently. They acquired Sourcefire in 2013 for $2.7B and have folded the various services offered there into Cisco products, such as turning the antivirus into a client-side malware scanning tool called Cisco AMP.
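For a sense of what boxes like the PIX were translating: RFC 1918 reserves 10.0.0.0/8, 172.16.0.0/12, and 192.168.0.0/16 for private networks, and Python's standard ipaddress module can test membership (note that is_private also flags a few reserved ranges beyond RFC 1918, such as loopback and link-local).

    # Check which addresses fall in the private space a NAT device
    # would translate to public addresses.
    import ipaddress

    for addr in ["10.1.2.3", "172.20.0.1", "192.168.1.10", "8.8.8.8"]:
        ip = ipaddress.ip_address(addr)
        print(addr, "->", "private" if ip.is_private else "public")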
Juniper gave away free training, unlike the Cisco training that cost thousands of dollars, and Alcatel-Lucent, Linksys, Palo Alto Networks, Fortinet, SonicWall, Barracuda, CheckPoint, and rising giant Huawei led to a death by a thousand competitors and Cisco's first true layoffs by 2011. Cisco acquired OpenDNS in 2015 to establish a core part of what's now known as Cisco Umbrella. This gives organizations insight into what's happening on increasingly geographically distributed devices, especially mobile devices, thanks to a close partnership with Apple. And they acquired BroadSoft in 2017 to get access to even more sellers and technology in the cloud communication space. Why? Because while they continue to pump out appliances for IP connectivity, they probably just can't command a higher market share given the market dynamics. Every vendor they acquire in that space will spawn two or more new serious competitors. Reaching into other spaces provides a more diverse product portfolio, gives their sellers more SKUs in the quiver to make quotas, and pushes the world forward with newer concepts, like fog computing. Today, Cisco is still based in San Jose, makes around $50 billion a year in revenue, and boasts close to 75,000 employees. A lot has happened since those early days. Cisco is one of the most innovative and operationally masterful companies on the planet. Mature companies can have the occasional bumps in the road and will go through peaks and valleys, but their revenues, sitting around $50 billion, are a reflection of their market leadership. Yes, most of their true innovation comes from acquisitions today. However, the insight into whom to buy, how to combine technologies, and how to get teams to work well with one another reflects a crazy level of operational efficiency. There's a chance that the Internet explosion could have happened without Cisco, which in a weird kind of way took the mantle from BBN, selling and supporting routing when the storm came. There's also a chance that without a supply chain of routing appliances to help connect the world, the whole thing might have tumbled down. So consider this: technological determinism. If it hadn't been Cisco, would someone else have stepped up to get us to the period of the dot-com bubble? Maybe. And since they made so much money off the whole thing, I've heard it said that Cisco doesn't deserve our thanks for the part they played. But they do. Without their training and appliances, and then intrusion prevention, we might not be where we are today. So thank you, Cisco, for teaching me everything I know about OSI models and layers and all that. And you know... helping the Internet become ubiquitous and all. And thank you, listener, for tuning in to yet another episode of the History of Computing Podcast. We are so very lucky to have you. Have a great day!
Today we're going to look at what it really means to be a standard on the Internet and at the IETF, the governing body that sets those standards. When you open a web browser and visit a page on the Internet, there are rules that govern how that page is interpreted. When traffic sent from your computer over the Internet gets broken into packets and encapsulated, other brands of devices can interpret the traffic and react, provided that each device is compliant in how it handles the protocol being used. Those rules are set in what are known as RFCs. It's a wild concept. You write rules down and then everyone follows them. Well, in theory. It doesn't always work out that way, but by and large the industry that sprang up around the Internet has been pretty good at following the guidelines defined in RFCs. The Request for Comments process gives the Internet industry an opportunity to collaborate in a non-competitive environment. We engineers often compete on engineering topics, like what's more efficient or stable, and so we're just as likely to disagree with people at our own organization as with people at another company. But if we can all meet and hash out our differences, we're able to get emerging or maturing technology standards defined in great detail, leaving as little room for error in implementing the tech as possible. This standardization process can be lengthy and slows down innovation, but it ends up creating more innovation and adoption once processes and technologies become standardized. The concept of standardizing advancements in technologies is nothing new. Alexander Graham Bell saw this when he helped start the American Institute of Electrical Engineers in 1884 to standardize the new electrical inventions coming out of Bell Labs and others. That would merge with the Institute of Radio Engineers in 1963 to form the IEEE, which now boasts half a million members spread throughout nearly every company in the world. And the International Organization for Standardization was founded in 1947, a merger of sorts between the International Federation of the National Standardizing Associations, which had been founded in 1928, and the newly formed United Nations Standards Coordinating Committee. Based in Geneva, they've now set over 20,000 standards across a number of industries. I'll over-simplify this next piece and revisit it in a dedicated episode. The Internet began life as a number of US government funded research projects inspired by J.C.R. Licklider around 1962, out of ARPA's Information Processing Techniques Office, or IPTO. The packet switching network would evolve into ARPANET based on a number of projects he and his successor Bob Taylor at IPTO would fund straight out of the Pentagon. It took a few years, but eventually they brought in Larry Roberts, and by late 1968 they'd awarded a contract to a company called Bolt Beranek and Newman (BBN) to build Interface Message Processors, or IMPs, to connect a number of sites and route traffic. The first one went online at UCLA in 1969, with additional sites coming on frequently over the next few years. Given that UCLA was the first site to come online, Steve Crocker started organizing notes about protocols in what they called RFCs, or Requests for Comments. That series of notes would then be managed by Jon Postel until his death 28 years later. They were also funding a number of projects to build tools to enable the sharing of data, like file sharing, and by 1971 we also had email.
Bob Kahn was brought in in 1972, and he would team up with Vinton Cerf from Stanford, who came up with encapsulation, and together they would define TCP/IP. ARPA became DARPA in 1972; by 1982 TCP/IP became the standard for the US DoD, and in 1983 ARPANET moved over to TCP/IP. NSFNET would be launched by the National Science Foundation in 1986. And so it was in 1986 that the Internet Engineering Task Force, or IETF, was formed to do something similar to what the IEEE and ISO had done before them. By now, the inventors, coders, engineers, computer scientists, and thinkers had seen other standards organizations; they were able to take much of what worked and what didn't, and they were able to start defining standards. They wanted an open architecture. The first meeting was attended by 21 researchers who were all funded by the US government. By the fourth meeting later that year they were inviting people from outside the hallowed halls of the research community. And it grew, with meetings several times a year that continue on to today, open to anyone. Because of the rigor practiced by Postel and early Internet pioneers, you can still read those notes from the working groups and the RFCs from the 60s, 70s, and on. The RFCs were funded by DARPA grants until 1998 and then moved to the Internet Society, which runs the IETF, and the RFCs are discussed and sometimes ratified at those IETF meetings. You can dig into those RFCs and find the origins and specs for NTP, SMTP, POP, IMAP, TCP/IP, DNS, BGP, CardDAV, and pretty much anything you can think of that's become an Internet standard. A lot of companies claim to be “the” standard in something. And if they wrote the RFC, I might agree with them. At those first dozen IETF meetings, about 120 people were showing up. It grew with the advancements in routing, application protocols, other networks, and file standards, peaking in Y2K with 2,810 attendees. Now it averages around 1,200. It's also now split into a number of working groups with steering committees. While the IETF was initially funded by the US government, it's now funded by the Public Interest Registry, or PIR, whose sale to Ethos Capital was announced in November of 2019. Here's the thing about the Internet Society and the IETF. They're mostly researchers. They have stayed true to the mission since they took over from the Pentagon: a decentralized Internet. The IETF is full of super-smart people who are always trying to be independent and non-partisan. That independence and non-partisanship is the true Internet, the reason that we can type www.google.com and have a page load, and work, no matter the browser. The reason mail can flow if you know an email address. The reason the Internet continues to grow and prosper and, for better or worse, take over our lives. The RFCs they maintain, the standards they set, and everything else they do is not easy work. They iterate and often don't get credit individually for their work, other than a first initial and a last name as the authors of papers. And so thank you to the IETF and the men and women who put themselves out there through the lens of the standards they write. Without you, none of this would work nearly as well as it all does. And thank you, listeners, for tuning in for this episode of the History of Computing Podcast. We are so lucky to have you.
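And dig in you can; the archive is even scriptable. Here is a minimal sketch, assuming the RFC Editor's conventional URL layout at rfc-editor.org, that pulls down Steve Crocker's RFC 1, "Host Software":

    # Fetch the plain text of an RFC from the RFC Editor's archive.
    # (URL layout assumed from rfc-editor.org conventions.)
    import urllib.request

    def fetch_rfc(number):
        url = f"https://www.rfc-editor.org/rfc/rfc{number}.txt"
        with urllib.request.urlopen(url) as resp:
            return resp.read().decode("utf-8", errors="replace")

    print(fetch_rfc(1)[:200])  # the opening lines of RFC 1, April 1969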
Cybersecurity awareness and education are not just for a month, they are forever | A conversation with Jeff Wilbur | Director of the Online Trust Alliance Initiative at the Internet Society. It was about time that The Cyber Society met The Internet Society. Sounds like we have at least one thing in common: we think about the present and future of our lives in this digitally connected world. Other than that, the difference is enormous. This is just a small column that I run on ITSPmagazine with the mission to bring awareness, education, and a paradigm shift in the way our society perceives technology. The Internet Society is a large non-profit that has been around for 26 years, founded by two of the “fathers of the Internet,” Vint Cerf and Bob Kahn. When they created it, they saw the full potential and risk of what was coming and wanted to be sure that someone was looking after this fantastic communication system they helped to create: the Internet. Their vision? The Internet is for everyone. Their mission? To support and promote the development of the Internet as a global technical infrastructure, a resource to enrich people’s lives, and a force for good in society. Their main goal? For the Internet to be open, globally connected, secure, and trustworthy. An admirable quest and not an easy one, but hopefully a dream that many of us still share and will fight to support and help make a reality. This conversation happened during National Cybersecurity Awareness Month, and while that is an extra incentive for many, Jeff and I agreed that there’s no such thing as a bad month, day, or hour to talk about cybersecurity awareness and education. We spoke about how things have changed, especially with the advent of the Internet of Things and its massive adoption by industries and consumers. We talked about the Online Trust Alliance Framework, with its set of 40 principles intended to serve as a guide for IoT-solution vendors to ensure that devices and services follow security and privacy best practices. We discussed what happens when companies put unsafe IoT devices on the market and how consumers can choose a safe device to put in a smart home, instead of a Trojan horse. In a perfect world we shouldn’t have to worry about all this and, hopefully, soon we will have to be a bit less concerned about it. The Online Trust Alliance is working with other organizations all over the world to create a standard certification that will assure the quality and safety of the IoT devices on the market. Until then, and probably even then, remember that 100% cybersecurity simply does not exist. We, the users, cannot merely stand still waiting for technology to fix itself, for companies to unconditionally respect and protect us, or for the government to step in. We must do our part to protect ourselves, our businesses, and our families. We must educate ourselves and act. This is part of what living in a Cyber Society means and what awareness is about. Educate yourself and listen up. Read the full article and discover more articles on ITSPmagazine: https://www.itspmagazine.com/itsp-chronicles/cybersecurity-awareness-and-education-are-not-just-for-a-month-they-are-forever
Follow-up: Combipakket Love at Orange. Topics: Jaap Meijers recycled Marco's Kindle and did several other fun things: installing vintage radios in a hotel, and building a lightning-fast slide scanner. Not by Jaap, but very impressive: a self-solving Rubik's Cube. The first Android phone is 10 years old, and Android Pie is out. The ZeroPhone. Floris interviewed Bob Kahn about DOA. Tips: Jaap: Eindhoven Maker Faire. Floris: Pixlite LED controllers and Xmas-land.de. Ruurd: Podcastfestival, Solo: A Star Wars Story, and Le Dimmer. Toon: Hugo. Arnoud: Dogfight and Doom Royale.
For this episode of the Media Fast Forward podcast, Floris Daelemans and Fredo De Smet went to IBC in Amsterdam. It is the largest broadcast technology trade fair in Europe, and they went looking for sustainability and for practical applications of artificial intelligence. They also look back at everything that has been said in recent weeks about a possible "Flemish Netflix". Blijven Hangen: Fredo found a man on YouTube who makes chopped-up but interesting versions of well-known songs: "another one bites the dust but every other beat is missing [CC]", "chop suey but every other beat is missing", and "take on me but every other measure is missing". Floris met Bob Kahn, one of the inventors of the TCP/IP network protocol. Kahn is a true Internet pioneer and, at 79, is still busy reinventing "his" Internet. Bob Kahn on Digital Objects Architecture. More info about us? Have a look at the VRT Innovation website or at mediafastforward.be
An interview with "the father of the Internet”, TCP/IP co-inventor Vint Cerf. Cerf, born June 23, 1943, is an American Internet pioneer recognized as one of "the fathers of the Internet”, sharing this title with TCP/IP co-inventor Bob Kahn. His contributions have been acknowledged and lauded, repeatedly, with honorary degrees and awards that include the National Medal of Technology, the Turing Award, the Presidential Medal of Freedom, the Marconi Prize, and membership in the National Academy of Engineering. In the early days, Cerf was a manager for the United States' Defense Advanced Research Projects Agency (DARPA), funding various groups to develop TCP/IP technology. When the Internet began to transition to a commercial opportunity during the late 1980s, Cerf moved to MCI, where he was instrumental in the development of the first commercial email system (MCI Mail) connected to the Internet. Cerf was instrumental in the funding and formation of ICANN from the start. He waited a year before stepping forward to join the ICANN Board, and eventually became chairman. He was elected president of the Association for Computing Machinery in May 2012, and in August 2013 he joined the Council on CyberSecurity's Board of Advisors. Cerf is active in many organizations that are working to help the Internet deliver humanitarian value to the world. He is supportive of innovative projects that are experimenting with new approaches to global problems, including the digital divide, the gender gap, and the changing nature of jobs. Cerf is also known for his sartorial style, typically appearing in a three-piece suit, a rarity in an industry known for its casual dress norms. More here - https://en.wikipedia.org/wiki/Vint_Cerf
Ted Kahn, Atari Institute for Educational Action Research. Ted Kahn was creator of the Atari Institute for Educational Action Research, which awarded major grants of Atari home computer products and consulting services to individuals, schools, and non-profit organizations. The group granted more than $1.25 million in products and services to about 100 innovative people and projects around the US and overseas. He also co-wrote the books Atari Games and Recreations and Atari PILOT Activities and Games. This interview took place on October 9, 2015. In it, we discuss Ted's brother, Bob Kahn, and Tandy Trower, both of whom I have previously interviewed. Teaser quotes: "Its purpose is not just to give stuff away, but its purpose is to really make sure that if it's given away, it's going to be given to people and organizations who can make some impact with it." "A thing, behind closed doors, in Washington, in which we had an entire group of Senators and Congressmen, for a period of about a day, to learn about all this stuff..." Antic magazine article about the Atari Institute for Educational Action Research: http://www.atarimagazines.com/v2n6/insideatari.html
On this episode of ANTIC, the Atari 8-bit podcast: We discuss new Atari archive sites. Randy tells us all about BASIC XL. Kevin delves into the sordid history of Dorsett Educational Systems. Bill Kendrick fills us in on a panoply of stuff, including a brain transplant for your 8-bit.
Recurring Links: Floppy Days Podcast; AtariArchives.org; AtariMagazines.com; Kevin's book "Terrible Nerd"; new Atari book scans at archive.org; ANTIC feedback at AtariAge
What we've been up to: Jewel-encrusted Atari XL; Alan Watson's Gold Mine game at AtariMania; new Computing Pioneers interview transcript site; Bob Kahn digitized material at Archive.org; Atari PILOT II; "Inside Atari DOS" by Bill Wilkinson and Compute! Books; "Atari Player/Missile Graphics in BASIC" by Philip C. Seyer; "The VisiCalc Book, Atari Edition" by Donald H. Beil; podcast episode of Inverse ATASCII about VisiCalc for the Atari; Floppy Days episode on the Atari 400/800, part 3 of 3; SIO2SD
Interview Discussion: Atari interview discussion thread on AtariAge
News: "How To Turn Your Whole Car Into A Video Game Simulator" by Jason Torchinsky; pix from Atari Party 2015; Classic Gamefest, July 25-26, Austin, TX; VCF Midwest 10, August 29-30, 2015, Elk Grove Village, IL; Portland Retro Gaming Expo, October 17-18, Portland, OR; Maury Markowitz Wikipedia article on the Atari Sierra; new issue of Pro(c) magazine, issue #6; Ft. Apocalypse on GitHub; Ft. Apocalypse Twitter; Steve Hales Twitter; AtariAge thread on a modified version of Silent Service; download: SILENT SERVICE mod; Delta Space Arena on cartridge; more on Delta Space Arena; project to port Prince of Persia to the Atari; Atari 8-bit software preservation project; Atari XL BASIC listings (in German), a collection of type-in listings from magazines and books; new podcast on the Tandy/Radio Shack Color Computer, The CoCo Crew Podcast
New at Archive.org: newsletters; CompuServe Computer Room Operations Training Manual; a JavaScript-based Apple IIGS emulator; stuff Kevin has uploaded from Bob Kahn and other sources
Feature [K]: Dorsett Educational Systems and Lloyd Dorsett; Dorsett's letter in Antic; Dorsett's letter in A.N.A.L.O.G. Computing; Thomas Cherryhomes' demonstration of his edutape creation tool
Bill's Modern Segment: "Project Veronica" thread at AtariAge forums; post at "Vintage is the New Old" (based on a post at the Spanish "Atariteca" blog), including a side-by-side demo of "rotator"; Altirra website; Lotharek's Lair: SIO2PC; Atari Party 2015; "World 1-1" documentary film; Sal "kjmann" Esquivel (ANTIC interview ep. 17); Mike Albaugh (ANTIC ep. 6); Dan "Trak-ball man" Kramer (5200 Super Podcast ep. 3); Jerry Jessop (ANTIC interview ep. 30); "Transform the Gotek Floppy Emulator into an Amiga Floppy Emulator"; "ACA 500 Accelerator" for Amiga 500 (Sal Esquivel); "Tank" in Action!; atr2unix by Preston Crow; Internet Archive's Software Library: Atari Computer; atr_txt_dump extractor for Kevin Savetz / Internet Archive; Rob McMullen's "ATRCopy" on GitHub; "Space Wall" work-in-progress video; The Atari 5200 Super Podcast ep. 1: "The Pac(k) Ins!"; Atari 2600 Game by Game Podcast, ep. 100: "AX-018 Pitfall!"
Hardware of the Month: IDE Plus 2, Rev. D HDD interface for Atari XL/XE computers; eBay listing; description at Atari8; IDE Plus manual
Software/Website of the Month: SpartaDOS X; SpartaDOS X Upgrade Project; AtariAge thread
Programming Languages: BASIC A+, BASIC XL & BASIC XE; Wikipedia; BASIC A+ at AtariMania; BASIC XL at AtariMania; BASIC XL Toolkit at AtariMania; BASIC XE at AtariMania; BASIC XE Extension Disk at AtariMania; BASIC XE article, ANTIC vol. 4, no. 9 / January 1986
Feedback: Ed Rotberg wrote a magazine article; YouTube video showing a demo program that Rotberg created
Difference between WiMAX and Wi-Fi; profiles in IT (Vint Cerf and Bob Kahn, founders of the Internet); iPhone update (battery replacement, unlocking hacks, sensors); thermal expansion of gasoline; mechanical computer update; Internet and terrorism (arrests, methods, traffic patterns); traffic analysis using alexa.com; Wi-Fi Internet cameras; Google Image Labeler; Google lobbying focus; and Intel BIOS patch for Core 2 and Xeon chips. This show originally aired on Saturday, July 7, 2007, at 9:00 AM EST on Washington Post Radio (WTWP).