For Episode 66, Read Between the Wines ventures to New Zealand, uncovering the story of Dog Point Vineyard in Marlborough with General Manager Matt Sutherland. As the son of the winery's founders, Margaret and Ivan Sutherland, Matt grew up immersed in the family business before carving his own path in wine management and sales across New Zealand, Australia, and London. Returning home, Matt brought a global perspective to Dog Point, helping to cement its reputation as a pioneer in organic and sustainable winemaking. In this episode, Matt delves into Dog Point's unique approach, from handpicking fruit to crafting wines with low-intervention techniques. He shares the journey of converting their vineyards to certified organic practices, explaining how soil health, biodiversity, and a deep respect for the land underpin their philosophy. Listeners will discover how Dog Point's terroir, a combination of hillside blocks and clay-laden soils, contributes to the elegance and complexity of their Sauvignon Blanc, Chardonnay, and Pinot Noir. The conversation also explores New Zealand's wine industry, its rapid rise to prominence, and the distinctive profiles of Marlborough wines. Matt reflects on the challenges of modern winemaking, shifting consumer demands, and Dog Point's commitment to creating wines that cellar beautifully and tell a story of place and tradition. For more information about our podcast, visit us on the web: https://readbetweenthewinespodcast.com Follow us on Instagram: https://www.instagram.com/betweenthewinesmedia Connect with us on LinkedIn: https://www.linkedin.com/company/read-between-the-wines
Inventing on Principle, Stop Drawing Dead Fish, The Future of Programming. Yes, all three of them in one episode. Phew!
Links
- $ patreon.com/futureofcoding — Lu and Jimmy recorded an episode about Hest without telling me, and by total coincidence released it on my birthday. Those jerks… make me so happy.
- Lu's talk at SPLASH 2023: Cellpond: Spatial Programming Without Escape
- Gary Bernhardt's talk Wat
- Inventing on Principle by Bret Victor ("""Clean""" Audio)
- Braid, the good video game from the creator of The Witness
- David Hellman is the visual artist behind Braid, A Lesson Is Learned but the Damage Is Irreversible, Dynamicland, and… the Braid section of Inventing on Principle.
- Light Table by Chris Granger
- Learnable Programming by Bret Victor
- When Lu says "It's The Line", they're referring to this thing they're working on called Seet (or "see it"), and you can sneak a peek at seet right heet.
- Paris Fashion Week absolutely struts, and so can you! The Canadian Tuxedo. As the representative of Canada, I can confirm that I own both a denim jacket and denim pants. If you see me at a conference wearing this combo, I will give you a hug.
- Jimmy runs a personal Lichess data lake.
- Hot Module Replacement is a good thing.
- Pygmalion has a lot of juicy silly bits, 'parently.
- Cuttle is awesome! It's a worthy successor to Apparatus. Toby Schachman, Forrest Oliphant, I think maybe a few other folks too? Crushing it. Oh, and don't miss Toby's episode of this very podcast!
- Recursive Drawing, another Toby Schachman joint.
- Screens in Screens in Screens, another Lu Wilson joint.
- Larry Tesler. Not a fan of modes.
- Lu writes about No Ideas on their blog, which is actually just a wiki, but it's actually a blog, but it's actually just a garden.
- When we mention Rich Hickey, we're referring to the talk Simple Made Easy.
- Jacob Collier, ugh.
- Suffragettes, women advocating for their right to vote, absolutely had a principle. Not sure that we should be directly likening their struggle to what we do in tech. On the other hand, it's good to foster positive movements, to resist incel and other hateful ones.
- Instead of linking to e/ anything, I'm just gonna link to BLTC for reasons that only make sense to longtime listeners.
- Stop Writing Dead Programs by Jack Rusher. Jack Rusher? Jack Rusher!
- It's the fish one, the one with the fish. …Sorry, these aren't actually fish, or something, because they're just drawings.
- René Magritte is the creator behind La Trahison des Images, origin of "Ceci n'est pas une pipe". Or maybe it was Margit the Fell Omen?
- Magritte's Words and Images are lovely. Here's an English translation, though it's worth taking a look at the original in context.
- Acousmatic Music
- Lu has made art with behaviour — various sands, and CellPond, say.
- Barnaby Dixon? Barnaby Dixon. Barnaby Dixon! Barnaby Dixon!!
- You can listen to part of Ivan's """Metronome""", if you want. Or you can listen to an early version of the song he's using this metronome to write. Or you can hear snippets of it in the Torn Leaf Zero video (especially the ending). But, like, you could also go make yourself lunch. I recommend mixing up a spicy peanut sauce for your roasted carrots. Shred a bit of cheese, tomato. Toast the bread. Pull the sausages right when the oil starts to spit. Put them straight into the compost. Look at the bottom of the compost bucket. What's down there? It's shiny. Why are you reading this? Why am I writing this? Why do we make this podcast?
- Wintergatan — Marble Machine exists
- Oh, I forgot to add a link to Arroost earlier. You can also watch a pretty good video that is basically an Arroost tutorial, not much to it. There are also some nice examples of things people have made with Arroost.
- The Rain Room looks pretty cool. It's the exact inverse of how rain works in many video games.
- YOU MUST PLAY RAIN WORLD.
- Here's a beautiful demo of a microtonal guitar, and speaking of using complex machines to make music that would be "easier" to make with a computer, here's a microtonal guitar with mechanized frets that can change the tuning dynamically. This entire YT channel is gold.
- Shane Crowley wrote a lovely blog post about creating music with Arroost.
- blank.page is a fun experiment in writing with various frictions.
- Super Meat Boy (the successor to Meat Boy, a Flash game) and Celeste are great examples of communicating tacit knowledge through the design of a simulation.
- Newgrounds and eBaum's World and Homestar Runner were early examples of (arguably) computer-native media.
- Hey, here's this episode's requisite link to the T2 Tile Project and Robust-First Computing. I should probably just create a hard-coded section of the episode page template linking to T2, The Witness, and Jack Rusher.
- The pun-proof Ivan Sutherland made Sketchpad.
- Planner exists.
- The PlayStation 3 Cell processor was this weirdly parallel CPU that was a pain in the butt to program.
- The SpaceMouse
- Put all metal back into the ground.
Music featured in this episode:
- Fingers from This Score is Butt Ugly
- The Sailor's Chorus from Wagner's The Flying Dutchman
Send us email, share your ideas in the Slack, and catch us at these normal places:
Ivan: Mastodon • Website
Jimmy: Mastodon • Website
Lu: Mastodon • Website
See you in the future!
https://futureofcoding.org/episodes/71
See omnystudio.com/listener for privacy information.
Tom Furness has been working on virtual reality technologies since 1966, but most of his early work with the United States Air Force has remained fairly secret (see my previous interviews in episodes # and #). Ivan Sutherland and Morton Heilig are often cited as early VR pioneers, but Furness was also working secretly at Wright-Patterson Air Force Base on the first helmet-mounted displays, visually-coupled systems, and eventually The Super Cockpit. It wasn't until he was given permission to speak about The Super Cockpit project in the mid-80s that the world got to learn about the advances in VR technologies he'd been working on. I had a chance to catch up with Furness at AWE 2023 on June 1st, just ahead of Apple's announcement of the Apple Vision Pro that happened a few days later on June 5th. We talk about the Virtual World Society, and the meeting of XR industry CEOs and leaders at AWE to have an off-the-record conversation about the impacts of AI and how to collaborate to help bring about a more exalted future of the XR industry. Furness also shares a bit more context on the early history of VR, as he has been working continuously within the field for the past 57-58 years. I think it's really important to look back upon where these immersive and spatial computing technologies have come from in order to get a better idea for where they might be going. Furness also recently won the inaugural member's choice Ethical Values Award from the XR Guild for significant contributions to the XR industry, and continues to lead the Virtual World Society to promote the more pro-social uses of XR technologies.
In this episode of the Building the Open Metaverse podcast, hosts Marc Petit and Patrick Cozzi are joined by computer graphics legend Jim Blinn. Blinn recounts his pioneering work in CG, from developing bump mapping while a PhD student at the University of Utah in the 1970s to producing animations for the Voyager spacecraft's flybys of Jupiter and Saturn at NASA JPL. Blinn traces his journey from growing up fascinated by astronomy, animation, and education, to falling into computer programming and graphics at the University of Michigan. He recounts moving to Caltech and JPL, where through luck and persistence he got to work on animations for Carl Sagan's "Cosmos" series and the Annenberg Foundation's physics series "The Mechanical Universe." Along the way Blinn met pioneers like Ivan Sutherland and inspired innovations like ray tracing. He reflects on always focusing on improving images with the tools at hand rather than waiting for better tech. Blinn also looks to the future, discussing the potential and limitations of VR, AI, and the metaverse. He emphasizes the need for research to continue advancing graphics and for computer scientists to properly document their work. Overall, it's a fascinating insider perspective on the birth and evolution of computer graphics from one of its foremost founding fathers. Blinn imparts wisdom and humor drawn from a storied career at the forefront of CG. ==== Have any comments or questions? Email the show: Feedback@Buildingtheopenmetaverse.org Want more information? Visit our website: www.buildingtheopenmetaverse.org And make sure you follow us on LinkedIn for all of our show updates: https://www.linkedin.com/company/buildingtheopenmetaverse/ Building the Open Metaverse is a podcast hosted by Patrick Cozzi (Cesium) and Marc Petit (Epic Games) that invites a broad range of technical experts to share their insights on how the community is building the metaverse together. #BuildingTheOpenMetaversePodcast #MetaversePodcast #Metaverse
Happy 50th birthday, Marlborough wine region! Ivan Sutherland shares his experiences of starting out growing grapes in its infancy, through to a 30,000-hectare powerhouse. He joins REX producer Jo Grigg to discuss the threats to Sauvignon Blanc, why he thinks our wine is too cheap, and the joys of succession in the family business, Dog Point Vineyard. Tune in to REX every day for the latest and greatest rural content on your favourite streaming platform, visit rexonline.co.nz, and follow us on Instagram, Facebook and LinkedIn for more. See omnystudio.com/listener for privacy information.
Computer graphics (or CG) has changed the way we experience the art of moving images. Computer graphics is the difference between Steamboat Willie and Buzz Lightyear, between ping pong and PONG. It began in 1963 when an MIT graduate student named Ivan Sutherland created Sketchpad, the first true computer animation program. Sutherland noted: "Since motion can be put into Sketchpad drawings, it might be exciting to try making cartoons." This book, the first full-length history of CG, shows us how Sutherland's seemingly offhand idea grew into a multibillion dollar industry. In Moving Innovation, Tom Sito--himself an animator and industry insider for more than thirty years--describes the evolution of CG. His story features a memorable cast of characters--math nerds, avant-garde artists, cold warriors, hippies, video game enthusiasts, and studio executives: disparate types united by a common vision. Sito shows us how fifty years of work by this motley crew made movies like Toy Story and Avatar possible. Tom Sito has been a professional animator since 1975. One of the key players in Disney's animation revival of the 1980s and 1990s, he worked on such classic Disney films as The Little Mermaid (1989), Beauty and the Beast (1991), and The Lion King (1994). He left Disney to help set up the Dreamworks Animation Unit in 1995. He is Professor of Cinema Practice in the School of Cinematic Arts at the University of Southern California. Learn more about your ad choices. Visit megaphone.fm/adchoices Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/communications
Telle Whitney began her career in the tech industry in 1986 after earning a Ph.D. in computer science from Caltech. Her journey into graduate studies was sparked by an encounter with graphics during her undergraduate studies at the University of Utah. Although she initially wasn't interested in graphics, the idea of computer-aided design fascinated her, and she was drawn to work with Ivan Sutherland, a co-founder of the computer science department at Caltech. Throughout college, Telle learned various programming languages, starting with C as an undergraduate and later delving into object-oriented languages like Simula and Mainsail. While she hasn't programmed in years, Telle acknowledges that programming languages evolve and change rapidly, but once you understand the core concepts, transitioning to a new language becomes relatively easy. Reflecting on her path into computer science, Telle admits that she had no exposure to the field during high school, which is a common experience for many young girls. "It wasn't until my sophomore year, where I was at my wit's end of trying to figure out what to study, and I took this interest test that compared your interests to other people's interests and programming came out on top." From her first programming class, Telle knew she had found her calling, even though she started later than many of her peers. Telle's love for programming stems from its logical nature. "When you're writing a program, and you're trying to solve this problem, it is so absorbing. I would become completely captured with whatever I was working on at the time, and it was very fulfilling, no question." She advises aspiring coders to ignore the myth of natural ability in programming and the notion that girls are not good at math. Persistence and patience are key in navigating the challenges that arise, and the belief in one's ability to succeed is crucial. Discussing the persistent stereotypes and biases that deter women and people of color from pursuing careers in tech, Telle and Margot highlight the prevalence of these harmful beliefs even today. Despite efforts to increase diversity, Telle emphasizes that more needs to be done to ensure the best minds participate in shaping the future of technology. Both Telle and Margot stress the significance of representation, with Margot outlining the WiDS goal of achieving at least 30% female representation by 2030, given that the current representation stands at a mere 10%. Such representation can help drive a cultural shift and improve the treatment of underrepresented groups. Telle dedicated 20 years to working full-time in the chip industry, actively striving to bring about change within the field. Concurrently, she collaborated with her close friend Anita Borg on the Grace Hopper Celebration, an initiative aimed at celebrating women who create technology. When Anita fell ill with brain cancer, Telle was asked to step into the role of CEO. During her 15-year tenure, Telle successfully expanded the Anita Borg Institute into a prominent organization. Although she hadn't planned to take on this role initially, Telle saw it as a valuable opportunity and made a conscious pivot. She has since left the organization to establish her own consulting firm, proud of the impact she made and of its continued influence under new leadership. The lack of progress in achieving diversity in the tech industry is a cause of concern for Telle.
Breaking down barriers and changing the perception of what a technologist looks like remains an ongoing challenge. Telle's particular interest lies in fostering a more inclusive culture within organizations. While community plays a vital role, Telle believes that actual cultural change stems from providing equal opportunities for advancement. Offering advice to aspiring data scientists, Telle urges them to take risks, develop confidence in their ideas, and master effective communication. She emphasizes the importance of curiosity and creativity in shaping the future and encourages aspiring data scientists to be at the forefront of technological advancements. "I want you to be at the table creating a technology that's going to change our lives. That's what you should do."
RELATED LINKS
Connect with Telle Whitney on LinkedIn
Find out more about AnitaB.org
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Follow WiDS on Twitter (@WiDS_Worldwide), Facebook (WiDSWorldwide), and Instagram (wids_worldwide)
Listen and Subscribe to the WiDS Podcast on Apple Podcasts, Google Podcasts, Spotify, Stitcher
May a good morning find you with a full heart, coffee in hand, and in good cheer. We come to you from two different continents, nearly 10,000 km apart and with a time difference of almost 10 hours. In short, I'm in New York and Radu is in the studio. This time I'm the one who left, and he stayed behind to hold the fort. Even so, we have dozens of news stories, each more thrilling and interesting than the last, with the potential to change the world in less than 10 years. We talk about the present, a little about the past, and most of all about the future, on our favorite technology show, which is fast approaching 200 editions. Yes, yes, I know, we still have about half a year to go, and I believe we've managed not to miss a single week so far. I think.
The Mogollon culture was an indigenous culture in the Western United States and Mexico that ranged from New Mexico and Arizona to Sonora, Mexico and out to Texas. They flourished from around 200 CE until the Spanish showed up and claimed their lands. The cultures that pre-existed them date back thousands more years, although archaeology has yet to pinpoint exactly how those evolved. Like many early cultures, they farmed and foraged. As they farmed more, their homes became more permanent, and around 800 CE they began to create more durable homes that helped protect them from wild swings in the climate. We call those homes adobes today, and the people who lived in those pueblos and irrigated the land, often moving higher into the mountains, we call the Puebloans - or Pueblo Peoples. Adobe homes are similar to those found in ancient cultures in what we call Turkey today. It's an independent evolution. Adobe Creek was once called Arroyo de las Yeguas by the monks from Mission Santa Clara and then renamed San Antonio Creek by a soldier, Juan Prado Mesa, when the land around it was given to him by the governor of Alta California at the time, Juan Bautista Alvarado. That's the same Alvarado as the street if you live in the area. The creek runs for over 14 miles north from the Black Mountain and through Palo Alto, California. The ranchers built their adobes close to the creeks. American settlers led the Bear Flag Revolt in 1846, and took over the garrison of Sonoma, establishing the California Republic - which covered much of the lands of the Puebloans. There were only 33 of them at first, but after John Fremont (yes, the same Fremont the street is named after) encouraged the Americans, they raised an army of over 100 men and Fremont helped them march on Sutter's fort, now with the flag of the United States, thanks to Joseph Revere of the US Navy (yes, another street in San Francisco bears his name). James Polk had pushed to expand the United States. Manifest Destiny. Remember The Alamo. Etc. The fort at Monterey fell, the army marched south. Admiral Sloat got involved. They named a street after him. General Castro surrendered - he got a district named after him. Commodore Stockton announced the US had taken all of California soon after that. Manifest destiny was nearly complete. He's now basically the patron saint of a city, even if few there know who he was. The forts along the El Camino Real that linked the 21 Spanish Missions, a 600-mile road once walked by their proverbial father, Junípero Serra, following the Portolá expedition of 1769, fell. Stockton took each, moving into Los Angeles, then San Diego. Practically all of Alta California fell with few shots. This was nothing like the battles for the independence of Texas, like when Santa Anna reclaimed the Alamo Mission. Meanwhile, the waters of Adobe Creek continued to flow. The creek was renamed in the 1850s after Mesa built an adobe on the site. Adobe Creek it was. Over the next 100 years, the area evolved into a paradise with groves of trees and then groves of technology companies. The story of one begins a little beyond the borders of California. Utah was initially explored by Francisco Vázquez de Coronado in 1540 and settled by Europeans in search of furs and others who colonized the desert, including those who established the Church of Jesus Christ of Latter-day Saints, or the Mormons - who settled there in 1847, just after the Bear Flag Revolt.
The United States officially acquired the territory in 1848, Utah became a territory, and after a number of map changes where the territory got smaller, it was finally made a state in 1896. The University of Utah had been founded all the way back in 1850, though - and re-established in the 1860s. 100 years later, the University of Utah was a hotbed of engineers who pioneered a number of graphical advancements in computing. John Warnock went to grad school there and then went on to co-found Adobe and help bring us PostScript. Historically, PS, or postscript, was a message placed at the end of a letter, following the signature of the author. The PostScript language was a language to describe a page of text computationally. It was created at Adobe by Warnock, Doug Brotz, Charles Geschke, Bill Paxton (who worked on the Mother of All Demos with Doug Engelbart during the development of the oNLine System, or NLS, in the late 1960s, and then at Xerox PARC), and Ed Taft. Warnock invented the Warnock algorithm while working on his PhD and went to work at Evans & Sutherland with Ivan Sutherland, who effectively created the field of computer graphics. Geschke got his PhD at Carnegie Mellon in the early 1970s and then went off to Xerox PARC. They worked with Paxton at PARC, and before long these PhDs and mathematicians had worked out the algorithms and then the languages to display images on computers while working on Interpress graphics at Xerox, and Geschke left Xerox and started Adobe. Warnock joined him, and they went to market with Interpress as PostScript, which became a foundation for the Apple LaserWriter to print graphics. Not only that, PostScript could be used to define typefaces programmatically and later to display any old image. Those technologies became the foundation for the desktop publishing industry. Apple released the 1984 Mac and other vendors brought in PostScript to describe graphics in their proprietary fashion, and by 1991 they released PostScript Level 2 and then PostScript 3 in 1997. Other vendors made their own or furthered standards in their own ways, and Adobe could have faded off into the history books of computing. But Adobe didn't create one product, they created an industry, and the company they created to support that young industry created more products in that mission. Steve Jobs tried to buy Adobe before that first Mac was released, for $5,000,000. But Warnock and Geschke had a vision for an industry in mind. They had a lot of ideas, but development was fairly capital intensive, as were go-to-market strategies. So they went public on the NASDAQ in 1986. They expanded their PostScript distribution and sold it to companies like Texas Instruments for their laser printer, and other companies who made IBM-compatible computers. They got up to $16 million in sales that year. Warnock's wife was a graphic designer. This is where we see a diversity of ideas help us think about more than math. He saw how she worked and could see a world where Ivan Sutherland's Sketchpad could be much more, given how far CPUs had come since the TX-0 days at MIT. So Adobe built and released Illustrator in 1987. By 1988 they broke even on sales, and it raked in $19 million in revenue. Sales were strong in the universities, but PostScript was still the hot product, selling to printer companies, typesetters, and other places where Adobe signed license agreements. At this point, we see where the math comes in: Cartesian coordinates, drawn by geometric algorithms, put pixels where they should be.
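As an aside, here's what that "describe the page as a program" idea looks like in practice - a rough Python sketch that emits a tiny PostScript program. The operators used (findfont, scalefont, setfont, moveto, show, showpage) are standard PostScript; the font name, sizes, and coordinates are purely illustrative, not anything Adobe shipped.

```python
# A minimal sketch: describing a page as a program rather than as a grid of
# pixels. The PostScript operators used here are standard; all layout values
# are illustrative.

def make_page(lines, font="Helvetica", size=14, left=72, top=720, leading=18):
    """Build a tiny PostScript program that sets a font and places text."""
    ps = ["%!PS", f"/{font} findfont {size} scalefont setfont"]
    y = top
    for line in lines:
        # Escape characters that are special inside PostScript strings.
        escaped = line.replace("\\", r"\\").replace("(", r"\(").replace(")", r"\)")
        ps.append(f"{left} {y} moveto ({escaped}) show")
        y -= leading
    ps.append("showpage")
    return "\n".join(ps)

if __name__ == "__main__":
    with open("hello.ps", "w") as f:
        f.write(make_page(["Hello from a page description language.",
                           "The printer, not the host, turns this into pixels."]))
```

The point of the design is in that last comment: the host sends a compact program, and the device's interpreter rasterizes it at whatever resolution the device has.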
But while this was far more efficient than just drawing a dot in a coordinate for larger images, drawing a dot in a pixel location was still the easier technology to understand. They created Adobe Streamline in 1989 and Collector's Edition to create patterns. They listened to graphic designers and built what they heard humans wanted.
Photoshop
Nearly every graphic designer raves about Adobe Photoshop. That's because Photoshop is the best-selling graphics editing tool, one that has matured far beyond most other traditional solutions and now has thousands of features that allow users to manipulate images in practically any way they want. Adobe Illustrator was created in 1987 and quickly became the de facto standard in vector-based graphics. Photoshop began life in 1987 as well, when Thomas and John Knoll wanted to build a simpler tool to create graphics on a computer. Rather than vector graphics, they created a raster graphics editor. They made a deal with Barneyscan, a well-known scanner company that managed to distribute over two hundred copies of Photoshop with their scanners, and Photoshop became a hit as it was the first editing software people heard about. Vector images are typically generated with Cartesian coordinates based on geometric formulas and so scale more easily. Raster images are composed of a grid of dots, or pixels, and can be more realistic. Great products are rewarded with competition. CorelDRAW was created in 1989, when Michel Bouillon and Pat Beirne built a tool to create vector illustrations. Sales got slim after other competitors entered the market, so the Knoll brothers got in touch with Adobe and licensed the product through them. The software was then launched as Adobe Photoshop 1 in 1990. They released Photoshop 2 in 1991. By now it had support for paths and, given that Adobe also made Illustrator, EPS and CMYK rasterization - still features in Photoshop. They launched Adobe Photoshop 2.5 in 1993, the first version that could be installed on Windows. This version came with a toolbar for filters and 16-bit channel support. Photoshop 3 came in 1994, and Thomas Knoll created what was probably one of the most important features ever added, one that has since become a standard in graphical applications: layers. Now a designer could create a few layers that each had their own elements, and hide layers or make layers more transparent. These could separate the subject from the background and led to entirely new capabilities, like an almost faux three-dimensional appearance of graphics. Then came version 4 in 1996, one of the more widely distributed versions, and very stable. They added automation, which was later considered part of becoming a platform: open up a scripting language, or a subset of one, so others can build tools that integrate with or sit on top of your product, locking people in once they've automated tasks to increase their efficiency. Adobe Photoshop 5.0 added editable type; previously, text was rasterized as soon as it was placed. Keep in mind that Adobe owned technology like PostScript and so could bring technology from Illustrator to Photoshop or vice versa, and integrate with other products - like exporting to PDF by then. They also added a number of undo options, a magnetic lasso, and improved color management; it was now a great tool for more advanced designers. Then in 5.5 they added a save-for-web feature, in a sign of the times. Users could create vector shapes, and the user interface continued to improve. Photoshop 5 was also a big jump in complexity.
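To make the layers idea concrete, here's a rough sketch of what compositing actually computes - not Adobe's implementation, just the standard "over" alpha-blending math applied bottom-to-top, with a visibility flag and an opacity slider per layer. All sizes and colors are made up for illustration.

```python
# A rough sketch of layer compositing: each layer is a grid of RGBA pixels
# (channels in 0..1), plus a visibility flag and an opacity value, blended
# bottom-to-top with the standard "over" operator.

def over(dst, src, opacity):
    """Blend one RGBA pixel over another at the given layer opacity."""
    sr, sg, sb, sa = src
    a = sa * opacity                      # effective source alpha
    dr, dg, db, da = dst
    out_a = a + da * (1 - a)
    if out_a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda s, d: (s * a + d * da * (1 - a)) / out_a
    return (blend(sr, dr), blend(sg, dg), blend(sb, db), out_a)

def composite(layers, width, height):
    """Flatten visible layers, bottom first, into one RGBA image."""
    image = [[(0.0, 0.0, 0.0, 0.0)] * width for _ in range(height)]
    for pixels, visible, opacity in layers:
        if not visible:
            continue  # a hidden layer contributes nothing
        for y in range(height):
            for x in range(width):
                image[y][x] = over(image[y][x], pixels[y][x], opacity)
    return image

# Two 1x1 "layers": an opaque red background, a half-transparent blue subject.
red = [[(1.0, 0.0, 0.0, 1.0)]]
blue = [[(0.0, 0.0, 1.0, 0.5)]]
flat = composite([(red, True, 1.0), (blue, True, 1.0)], 1, 1)
print(flat[0][0])  # (0.5, 0.0, 0.5, 1.0); set blue's flag False for pure red
```

Hiding a layer or dragging its opacity just changes the inputs to this loop, which is why non-destructive editing fell out of the feature almost for free.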
Layers were easy enough to understand, but Photoshop was meant to be a subset of Illustrator features and had become far more than that. So in 2001 they released Photoshop Elements. By now they had a large portfolio of products, and Elements was meant to appeal to the original customer base - the ones who were beginners and maybe not professional designers. By now, some people spent 40 or more hours a week in tools like Photoshop and Illustrator.
Adobe Today
Adobe had released PostScript, Illustrator, and Photoshop. But they have one of the most substantial portfolios of products of any company. They also released Premiere in 1991 to get into video editing. They acquired Aldus Corporation to get into more publishing workflows with PageMaker. They used that acquisition to get into motion graphics with After Effects. They acquired dozens of companies and released their products as well. Adobe also released the PDF format to describe full pages of information (or files that spread across multiple pages) in 1993, and Adobe Acrobat to use those. Acrobat became the de facto standard for page distribution so people didn't have to download fonts to render pages properly. They dabbled in audio editing when they acquired Cool Edit Pro from Syntrillium Software, and so now sell Adobe Audition. Adobe's biggest acquisition was Macromedia in 2005. Here, they added a dozen new products to the portfolio, which included Flash, Fireworks, the WYSIWYG web editor Dreamweaver, ColdFusion, Flex, and Breeze, which is now called Adobe Connect. By now, they'd also created what we call Creative Suite - packages of applications that could be used for given tasks. Creative Suite also signaled a transition into a software-as-a-service, or SaaS, mindset. Now customers could pay a monthly fee for a user license rather than buy large software packages each time a new version was released. Adobe had always been a company that made products to create graphics. They expanded into online marketing and web analytics when they bought Omniture in 2009 for $1.8 billion. These products are now normalized into the naming convention used for the rest as Adobe Marketing Cloud. Flash fell by the wayside, and so the next wave of acquisitions was for more mobile-oriented products. This began with Day Software and then Nitobi in 2011. And they furthered their Marketing Cloud support by acquiring one of the larger competitors, Marketo, in 2018, and acquiring Workfront in 2020. Given how many people started working from home, they also extended their offerings into pure-cloud video tooling with an acquisition of Frame.io in 2021. And here we see a company started by a bunch of true computer scientists from academia in the early days of the personal computer that has become far more. They could have been rolled into Apple, but had a vision of a creative suite of products that could be used to make the world a prettier place. Creative Suite, then Creative Cloud, shows a move of the same tools into a more online delivery model. Other companies come along to do similar tasks, like the infinite digital whiteboard Miro - so they have to innovate to stay marketable. They have to continue to increase sales, so they expand into other markets, like the most adjacent: Marketing Cloud. At 22,500+ employees and with well over $12 billion in revenues, they have a lot of families dependent on maintaining that growth rate. And so the company becomes more than the culmination of their software.
They become more than graphic design, web design, video editing, animation, and visual effects. Because in software, if revenues don't grow at a rate greater than 10 percent per year, the company simply isn't outgrowing the size of the market, and likely won't be able to justify a stock price at the inflated price-to-earnings ratio that signals explosive growth. And yet once a company saturates sales in a given market, they have shareholders to justify their existence to. Adobe has survived many an economic downturn and boom time with smart, measured growth, and is likely to continue doing so for a long time to come.
Gutenberg shipped the first working printing press around 1450, and the typeface was born. Before then most books were handwritten, often in blackletter calligraphy. And they were expensive. The next few decades saw Nicolas Jenson develop the Roman typeface, and Aldus Manutius and Francesco Griffo create the first italic typeface. This represented a period where people were experimenting with making type that would save space. The 1700s saw the start of a focus on readability. William Caslon created the Old Style typeface in 1734. John Baskerville developed Transitional typefaces in 1757. And Firmin Didot and Giambattista Bodoni created two typefaces that would become the Modern family of serifs. Then slab serif, which we now call Antique, came in 1815, ushering in an era of experimenting with using type for larger formats, suitable for advertisements in various printed materials. These were necessary as more presses were printing more books, and made possible by new levels of precision in metal casting. People started experimenting with various forms of typewriters in the mid-1860s, and by the 1920s we got Frederic Goudy, the first real full-time type designer. Before him, it was part of a job. After him, it was a job. And we still use some of the typefaces he crafted, like Copperplate Gothic. And we saw an explosion of new fonts, like Times New Roman in 1931. At the time, most typewriters used typefaces on the end of a metal shaft. Hit a key, and the shaft hammers onto a strip of ink and leaves a letter on the page. Kerning, or the space between characters, and letter placement were often there to reduce the chance that those metal hammers jammed. And replacing a font would have meant replacing tons of precision parts. Then came the IBM Selectric typewriter in 1961. Here we saw precision parts that put all those letters on a ball. Hit a key, and the ball rotates and presses the ink onto the paper. And the ball could be replaced. A single document could now have multiple fonts without a ton of work. Xerox exploded that same year with the Xerox 914, one of the most successful products of all time. Now we could type amazing documents with multiple fonts in the same document quickly - and photocopy them. And some of the numbers on those fancy documents were being spat out by those fancy computers, with their tubes. But as computers became transistorized heading into the 60s, it was only a matter of time before we put fonts on computer screens. Here, we initially used bitmaps to render letters onto a screen. By bitmap we mean a series, or array, of bits that maps which pixels should be displayed where on a screen. We used to call these raster fonts, but the drawback was that to make characters bigger, we needed a whole new map of bits. To go to a bigger screen, we probably needed a whole new map of bits. As people thought about things like bold, underline, and italics - guess what - also a new file. But through the 50s, transistor counts weren't nearly high enough to do something different from bitmaps, as they rendered very quickly and, you know, displays weren't very high quality, so who could tell the difference anyway? Whirlwind was the first computer to project real-time graphics on the screen, and the characters were simple blocky letters. But as the resolution of screens and the speed of interactivity increased, so did what was possible with drawing glyphs on screens.
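As a toy illustration of that "map of bits" idea - not any historical font format, just the concept - here's a Python sketch where a glyph is a fixed grid of bits. It also shows exactly why raster fonts didn't scale: a bigger size means an entirely new map.

```python
# A toy "raster font": each glyph is a fixed map of bits. Nothing here is a
# historical format; it just illustrates the concept, and the drawback.

GLYPH_A = [  # 5x7 letter A, one row of bits per screen row
    0b01110,
    0b10001,
    0b10001,
    0b11111,
    0b10001,
    0b10001,
    0b10001,
]

def draw(glyph, width=5, on="#", off="."):
    """Print a glyph by testing each bit in each row."""
    for row in glyph:
        print("".join(on if row & (1 << (width - 1 - col)) else off
                      for col in range(width)))

draw(GLYPH_A)
# To show "A" at twice the size, nearest-neighbor doubling only makes bigger
# blocks - which is why every size (and every bold/italic variant) needed its
# own hand-made bitmap.
```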
Rudolf Hell was a German engineer experimenting with using cathode ray tubes to project a CRT image onto photosensitive paper, and thus print using a CRT. He designed a simple font called Digital Grotesk in 1968. It looked good on the CRT and on the paper, and so that font, loosely based on Neuzeit Book, would be used to digitize typesetting. We quickly realized bitmaps weren't efficient for drawing fonts to screen, and by 1974 moved to outline, or vector, fonts. Here, a Bézier curve was drawn onto the screen using an algorithm that created the character, or glyph, as an outline and then filled in the space between. These took up less memory and so drew on the screen faster. Those could be defined in an operating system, and were used not only to draw characters but also by some game designers to draw entire screens of information, by defining a character as a block and so taking up less memory to do graphics. These were scalable, and by 1979 another German, Peter Karow, wrote Ikarus, software that used spline algorithms to let a person draw a shape on a screen and rasterize it. Now we could graphically create fonts that were scalable. In the meantime, the team at Xerox PARC had been experimenting with different ways to send pages of content to the first laser printers. Bob Sproull and Bill Newman created the Press format for the Star. But this wasn't incredibly flexible like what Karow would create. John Gaffney, who was working with Ivan Sutherland at Evans & Sutherland, had been working with John Warnock on an interpreter that could pull information from a database of graphics. When he went to Xerox, he teamed up with Martin Newell to create JaM, which harnessed the latest chips to process graphics and character type onto printers. As it progressed, they renamed it Interpress. Chuck Geschke started the Imaging Sciences Laboratory at Xerox PARC and eventually left Xerox with Warnock to start a company called Adobe in Warnock's garage, which they named after a creek behind his house. Bill Paxton had worked on "The Mother of All Demos" with Doug Engelbart at the Stanford Research Institute (SRI), got his PhD at Stanford, and then moved to Xerox PARC. There he worked on bitmap displays, laser printers, and GUIs - and so he joined Adobe as a co-founder in 1983 and worked on the font algorithms, helping ship a page description language along with Chuck Geschke, Doug Brotz, and Ed Taft. Steve Jobs tried to buy Adobe in 1982 for $5 million. But instead they sold him just shy of 20% of the company and got a five-year license for PostScript. This allowed them to focus on making the PostScript language more extensible and creating the Type 1 fonts. These had two parts: one was a set of bitmaps, and the other was a font file that could be used to send the font to a device. We see this time and time again. The simpler an interface and the more down-market the science gets, the faster we see innovative industries come out of the work done. There were lots of fonts by now. The original 1984 Mac saw Susan Kare work with Jobs and others to ship a bunch of fonts named after cities like Chicago and San Francisco. She would design the fonts on paper and then conjure up the hex (that's hexadecimal) for graphics and fonts. She would then manually type the hexadecimal notation for each letter of each font. Previously, custom fonts were reserved for high-end marketing and industrial designers. Apple considered licensing existing fonts but decided to go their own route.
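To sketch the outline idea from above: a glyph outline is a chain of Bézier segments, each built from a handful of control points, and one vector description serves every size. Here's a minimal cubic Bézier evaluator using de Casteljau's construction; the control points are arbitrary, and a real rasterizer would chain many segments into closed contours and fill between them.

```python
# Minimal cubic Bezier evaluation via de Casteljau's construction. A real
# outline font chains many such segments into closed contours and fills the
# interior; the control points below are arbitrary.

def lerp(p, q, t):
    """Linearly interpolate between two 2D points."""
    return (p[0] + (q[0] - p[0]) * t, p[1] + (q[1] - p[1]) * t)

def cubic_bezier(p0, p1, p2, p3, t):
    """Point at parameter t in [0, 1] on the curve with 4 control points."""
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

# Sample one arch-shaped segment. Scaling the control points scales the whole
# curve, which is why outlines, unlike bitmaps, work at any size.
points = [cubic_bezier((0, 0), (0, 100), (100, 100), (100, 0), i / 10)
          for i in range(11)]
print(points[0], points[5], points[10])  # endpoints plus the curve's midpoint
```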
Kare painstakingly created new fonts and gave them the names of towns along train stops around Philadelphia, where she grew up. Steve Jobs went for the city approach but insisted they be cool cities. And so the Chicago, Monaco, New York, Cairo, Toronto, Venice, Geneva, and Los Angeles fonts were born, with her personally developing Geneva, Chicago, and Cairo. And she did it in 9 x 7. I can still remember the magic of sitting down at a computer with a graphical interface for the first time. I remember opening MacPaint and changing between the fonts, marveling at the typefaces. I'd certainly seen different fonts in books. But never had I made a document and been able to set my own typeface! Not only that, they could be in italics, outline, and bold. Those were all her. And she inspired a whole generation of innovation. Here, we see a clean line from Ivan Sutherland and the pioneering work done at MIT, to the University of Utah, to SRI through the oNLine System (or NLS), to Xerox PARC, and then to Apple. But then came the rise of Windows and other graphical operating systems. As Apple's five-year license for PostScript came and went, they started developing their own font standard as a competitor to Adobe, which they called TrueType. Here we saw Times Roman, Courier, and symbols that could replace the PostScript fonts, and updates to Geneva, Monaco, and others. They may not have gotten along with Microsoft, but they licensed TrueType to them nonetheless to make sure it was more widely adopted. And in exchange they got a license for TrueImage, a page description language that was compatible with PostScript. Given how high-resolution screens had gotten, it was time for the birth of anti-aliasing. Here we could clean up the blocky "jaggies", as the gamers call them. Vertical and horizontal lines looked fine in the 8-bit era, but everything else distorted at higher resolutions, and so spatial anti-aliasing and then post-processing anti-aliasing were born. By the 90s, Adobe was looking for the answer to TrueImage. So 1993 brought us PDF, now an international standard as ISO 32000-1:2008. PDF Reader and other tools were good to Adobe for many years, along with Illustrator and then Photoshop and then the other products in the Adobe portfolio. By this time, even though Steve Jobs was gone, Apple was hard at work on new font technology that resulted in Apple Advanced Typography, or AAT. AAT gave us ligature control, better kerning, and the ability to write characters on different axes. But negotiations between Apple and Microsoft to license AAT broke down. They were bitter competitors, and Windows 95 wasn't even out yet. So Microsoft started work on OpenType, its own standardized font language, in 1994, and Adobe joined the project to ship the next generation in 1997. That would evolve into an open standard by the mid-2000s - and once an open standard, sometimes the de facto standard, as opposed to those that need to be licensed. By then the web had become a thing. Early browsers, and the wars between them to increment features, meant developers had to build and test on potentially 4 or 5 different computers and often be frustrated by the results. So the W3C began standardizing how a lot of elements worked in Extensible Markup Language, or XML. Images, layouts, colors, even fonts. SVGs are XML-based vector images. In other words, the browser interprets a language that describes the image.
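Here's a tiny sketch of the spatial anti-aliasing idea mentioned above - supersampling, specifically: evaluate the shape on a finer grid, then average each block of samples into one screen pixel, so edge pixels get fractional coverage instead of hard stair-steps. The circle, grid sizes, and character shading are all just for illustration.

```python
# A tiny supersampling (spatial anti-aliasing) sketch: sample a shape on a
# finer grid, then average each block of samples into one screen pixel so
# edges get fractional coverage instead of on/off jaggies. The "scene" and
# all sizes are purely illustrative.

def inside(x, y):
    """The scene: a filled circle of radius 8 centered at (8, 8)."""
    return (x - 8) ** 2 + (y - 8) ** 2 <= 64

def render(size=16, factor=4):
    image = []
    for py in range(size):
        row = []
        for px in range(size):
            # factor*factor sub-samples per pixel, averaged into coverage
            hits = sum(inside(px + (i + 0.5) / factor, py + (j + 0.5) / factor)
                       for i in range(factor) for j in range(factor))
            row.append(hits / factor ** 2)  # coverage in [0, 1]
        image.append(row)
    return image

for row in render():
    # map coverage to a character ramp; edge pixels land in between
    print("".join(" .:-=+*#%@"[min(9, int(c * 9.999))] for c in row))
```

Setting factor=1 gives the hard-edged, jagged version; anything higher smooths the boundary, which is the whole trick.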
That became a way to render fonts on the web. The Web Open Font Format, or WOFF 1, was published in 2009 with contributions by Dutch educator Erik van Blokland, Jonathan Kew, and Tal Leming. This built on the CSS font-styling rules that had shipped in Internet Explorer 4 and would slowly be added to every browser shipped, including Firefox since 3.6, Chrome since 6.0, Internet Explorer since 9, and Apple's Safari since 5.1. Then WOFF 2 added Brotli compression to get sizes down and render faster. WOFF has been part of the W3C open web standard since 2011. Out of Apple's TrueType came TrueType GX, which added variable fonts. Here, a single font file could contain a number or range of variants of the initial font. So a family of fonts could be in a single file. OpenType added variable fonts in 2016, with Apple, Microsoft, and Google all announcing support. And of course the company that had been there since the beginning, Adobe, jumped on board as well. Fewer font files, faster page loads. So here we've looked at the progression of fonts from the printing press, becoming more efficient to conserve paper, through the advent of the electronic typewriter, to the early bitmap fonts for screens, to the vectorization led by Adobe into the Mac and then Windows. We also see rethinking the font entirely so multiple scripts, character sets, and axes can be represented and rendered efficiently. I am now converting all my user names into pig Latin for maximum security. Luckily those are character sets that are pretty widely supported. The ability to add color to pig Latin means that OpenType-SVG will allow me to add spiffy color to my glyphs. It makes us wonder what's next for fonts. Maybe being able to design our own or, more to the point, customize those developed by others to make them our own. We didn't touch on emoji yet. But we'll just have to save the evolution of character sets and emoji for another day. In the meantime, let's think on the fact that fonts are such a big deal because Steve Jobs took a calligraphy class from a Trappist monk named Robert Palladino while enrolled at Reed College. Today we can painstakingly choose just the right font with just the right meaning because Palladino left the monastic life to marry and have a son. He taught Jobs about serif and sans serif and kerning and the art of typography. That style and attention to detail was one aspect of the original Mac that taught the world that computers could have style and grace as well. It's not hard to imagine a world where entire computers still only supported one font, or even one font per document. Palladino never owned or used a computer, though. His influence can be felt through the influence his pupil Jobs had. And it's actually amazing how many people who had such dramatic impacts on computing never really used one. Because so many smaller evolutions came after them. What evolutions do we see on the horizon today? And how many who put a snippet of code on a service like GitHub may never know the impact they have on so many?
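If you want to poke at the variable-fonts idea mentioned above, here's a short sketch using the open-source fontTools library to list a variable font's axes; the fvar table is what makes a font variable. "MyVariable.ttf" is a hypothetical path - substitute any variable font you have on hand.

```python
# A short sketch using fontTools (pip install fonttools) to inspect a
# variable font's axes. "MyVariable.ttf" is a hypothetical path; use any
# variable font you have.

from fontTools.ttLib import TTFont

font = TTFont("MyVariable.ttf")
if "fvar" in font:  # the 'fvar' table is what makes a font variable
    for axis in font["fvar"].axes:
        # e.g. wght (weight) or wdth (width), with its allowed range
        print(f"{axis.axisTag}: {axis.minValue} .. "
              f"{axis.defaultValue} .. {axis.maxValue}")
else:
    print("Not a variable font: no fvar table, so one file = one style.")
```

One file carrying a whole family of weights and widths is precisely the "fewer font files, faster page loads" trade described above.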
We're finally taking a look at Sketchpad. This program was completed in 1963 as Ivan Sutherland's Ph.D. research. On the surface it looks like a very fancy drawing program. Under the hood it's hiding some impressive new programming techniques. Selected Sources: http://worrydream.com/refs/Sutherland-Sketchpad.pdf - Sutherland's Sketchpad thesis https://www.youtube.com/watch?v=495nCzxM9PI - Sketchpad in action https://www.computerhistory.org/collections/catalog/102738195 - Oral History transcripts
The technology behind the metaverse has its origins in the darkest days of the Cold War. This ‘Military metaverse' gave birth to the Internet, transforming warfare - and, a generation later, online gaming. Featuring Michael Zyda. JCR Licklider – "Lick" to his friends and colleagues – is little-known but an absolutely essential figure in the development of modern computing. Ivan Sutherland is probably the most influential computer scientist, full stop. Here's a video of Ivan Sutherland giving a demo of ‘Sketchpad', the first interactive computer drawing program. Bob Taylor is largely unknown outside the small community of individuals involved with the early Internet – but his work is profoundly influential. Here's an interview with Bob Taylor, talking about the origins of the Internet. Here's an excellent documentary on the recreation of the "Battle of 73 Easting" – the first tank battle captured in real time, then simulated endlessly using SIMNET. 1968: When the World Began, 3D graphics, A Brief History of the Metaverse, Bob Taylor, DARPA, Internet, IPTO, Ivan Sutherland, JCR Licklider, metaverse, Mike Zyda, military, SIMNET, simulation, SKETCHPAD, VR For more information about this and all our other 'The Next Billion Seconds' content, please check out https://nextbillionseconds.com This podcast is sponsored by the Digital Skills Organisation. The DSO is championing an employer-led, skills-based approach to digital literacy. Our offering is designed to support future-proofing the country, growing jobs, supporting our economic growth and ensuring that Australia remains a global leader in digital. If we are to equip our workforce with the skills to meet a rapidly changing, technological future, we need a new approach. We're working in collaboration with employers, trainers and employees. Their involvement is vital. We believe it's a better way to create consistent journey pathways and build relevant digital skills. We define the problem this way - digital skills training must: create value both internally and externally; improve customer experience; build operational capabilities. To deliver on these objectives we need to strengthen Australia's digital workforce. It's that simple. DSO - Digitally Upskilling Australia. To find out more, visit the DSO website: https://digitalskillsorg.com.au Mark Pesce - The Next Billion Seconds is produced by Ampel - visit https://ampel.com.au to find out what Ampel could do for you! If you are interested in sponsoring The Next Billion Seconds podcast, reach out to our Director of Media and Partnerships Lauren Deighton at lauren@ampel.com.au If you enjoyed this show, please leave a rating and/or review on Apple, Spotify or any other podcast platform. See omnystudio.com/listener for privacy information.
Our last episode was on Project MAC, a Cold War-era project sponsored by ARPA. That led to many questions, like what led to the Cold War and just what was the Cold War. We'll dig into that today. The Cold War was a period between 1946, in the days after World War II, and 1991, when the United States and western allies were engaged in a technical peace that was actually an aggressive time of arms buildup and proxy wars. Technology often moves quickly when nations or empires are at war. In many ways, the Cold War gave us the very thought of interactive computing and networking, and so is responsible for the acceleration towards our modern digital lives. And while I've never seen it referenced as such, this was more of a continuation of wars between the former British Empire and the imperialistic Russian Empire. These make up two of the three largest empires the world has ever seen and a rare pair of empires that were active at the same time. And the third, well, we'll get to the Mongols in this story as well. These were larger than the Greeks, the Romans, the Persians, or any of the Chinese dynasties. In fact, the British Empire that reached its peak in 1920 was 7 times larger than the land controlled by the Romans, clocking in at 13.7 million square miles. The Russian Empire was 8.8 million square miles. Combined, the two held nearly half the world. And their legacies live on in trade empires, in some cases run by the same families that helped fund the previous expansions. But the Russians and British were on a collision course going back to a time when their roots were not as different as one might think. They were both known to the Romans. But yet they both became feudal powers with lineages of rulers going back to Vikings. We know the Romans battled the Celts, but they also knew of a place that Ptolemy called Sarmatia Europea in around 150 AD, where a man named Rurik settled far later. He was a Varangian prince, which is the name Romans gave to Vikings from the area we now call Sweden. The 9th to 11th centuries saw a number of these warrior chiefs flow down rivers throughout the Baltics and modern Russia in search of riches from the dwindling Roman vestiges of empire. Some returned home to Sweden; others conquered and settled. They rowed down the rivers: the Volga, the Volkhov, the Dvina, and the networks of rivers that flow between one another, all the way down the Dnieper river, through the Slavic tribes Ptolemy described, which by then had developed into city-states, such as Kiev, past the Romanians and Bulgars and to the second Rome, or Constantinople. The Viking ships rowed down these rivers. They pillaged, conquered, and sometimes settled. The term for rowers was Rus. Some Viking chiefs set up their own city-states in and around the lands; some because their lands back home were taken while they were off on long campaigns. Charlemagne conquered modern-day France and much of Germany, from the Atlantic all the way down into the Italian peninsula, north into Jutland, and east to the border with the Slavic tribes. He weakened many, upsetting the balance of power in the area. Or perhaps there was never a balance of power. Empires such as the Scythians and Sarmatians and various Turkic or Iranian powers had come and gone, and each crossing the vast and harsh lands in their wake found only what Homer said of the area all the way back in the 8th century BCE: that the land was deprived of sunshine.
The Romans never pushed up so far into the interior of the steppes, as they were busy with more fertile farming grounds. But as the Roman Empire fell and the Byzantines flourished, the Vikings traded with them and even took their turn trying to loot Constantinople. And Frankish Paris. And again, settled in the Slavic lands, marrying into cultures and DNA.
The Rus
Rome retreated from her lands as her generals were defeated. The Merovingian dynasty rose in the 5th century with the defeat of Syagrius, the last Roman general in Gaul, and lasted until a family of advisors slowly took control of running the country, transitioning to the Carolingian Empire, of which Charlemagne, the Holy Roman Emperor, as he was crowned, was the most famous. He conquered and grew the empire. Charlemagne knew the empire had outgrown what one person could rule with the technology of the era, so it was split into three, which his son passed to his grandsons. And so the Carolingian empire had made the Eastern Slavs into tributaries of the Franks. There were hostilities, but by the Treaty of Meerssen in 870 the split of the empire generally looked like the borders of northern Italy, France, and Germany - although Germany also included Austria but not yet Bohemia. It split and re-merged and smaller boundary changes happened, but that left the Slavs aware of these larger empires. The Slavic peoples grew and mixed with people from the Steppes and Vikings. The Viking chiefs were always looking for new extensions to their trade networks. Trade was good. Looting was good. Looting and getting trade concessions to stop looting those already looted was better. The networks grew. One of those Vikings was Rurik, possibly the Danish Rorik, a well-documented ally who tended to play all sides of the Carolingians and a well-respected raider and military mind. Rurik was brought in as the first Viking, or rower, or Rus, ruler of the important trade city that would be known as New City, or Novgorod. Humans had settled Kiev since the Stone Age, then came the Polans, and a prince named Kyi, before Rurik's successor Oleg took Smolensk and Lyubech. Oleg extended the land of the Rus down the trading routes and conquered Kiev. Now they had a larger capital and were the Kievan Rus. Rurik's son Igor took over after Oleg and centralized power in Kiev. He took tribute from Constantinople after he attacked, plundered Arab lands off the Caspian Sea, and was killed overtaxing vassal states in his territory. His son Sviatoslav the Brave then conquered the Alans and, through other raiding, helped cause the collapse of the Khazar and Bulgarian empires. They expanded throughout the Volga River valley, then to the Balkans, and up the Pontic Steppe, and quickly became the largest empire in Europe of the day. His son Vladimir the Great expanded again, with the empire extending from the Baltics through Belarus down toward the Black Sea, and converted to Christianity, thus Christianizing the lands he ruled. He began marrying and integrating into the Christian monarchies, which his son continued. Yaroslav the Wise married the daughter of the King of Sweden, who gave him the area around modern-day Leningrad. He then captured Estonia in 1030 and, as with others in the Rurikid dynasty as they were now known, made treaties with others and then pillaged more Byzantine treasures.
He married one daughter to the King of Norway, another to the King of Hungary, another to the King of the Franks, and another to Edward the Exile of England, and thus was the grandfather of Edgar the Aetheling, who later became a king of England.
The Mongols
The next couple of centuries saw the rise of feudalism and the descendants of Rurik fighting amongst each other. The various principalities were, as with much of Europe during the Middle Ages, semi-independent duchies, similar to city-states. Kiev became one of the many, and around the mid-1100s Yaroslav the Wise's great-grandson, Yuri Dolgoruki, built a number of new villages and principalities, including one along the Moskva river they called Moscow. They built a keep there, of the kind the Rus called kremlins. The walls of those keeps didn't keep the Mongols out. They arrived in 1237. They moved the capital to Moscow, and Yaroslav II, Yuri's grandson, was poisoned in the court of Genghis Khan's grandson Batu. The Mongols ruled, sometimes through the descendants of Rurik, sometimes disposing of them and picking a new one, for 200 years. This is known as the time of the “Mongol yoke.” One of those princes the Mongols let rule was Ivan I of Moscow, who helped them put down a revolt in a rival area in the 1300s. The Mongols trusted Moscow after that, and so we see a migration of rulers of the land up into Moscow. The Golden Horde, like the Viking Danes and Swedes, settled in some lands. Kublai Khan made himself ruler of China. Khanates splintered off to form the ruling factions of weaker lands, such as modern India and Iran - once cradles of civilization. Those became the Mughal dynasties as they Islamized and moved south. And so the Golden Horde became the Great Horde. Ivan the Great expanded the Muscovite sphere of influence, taking Novgorod, Rostov, Tver, Vyatka, and up into the land of the Finns. They were finally strong enough to stand up to the Tatars, as they called their Mongol overlords, and made a Great Stand on the Ugra River. The great army they summoned simply frightened the Mongol Tatars off. Turns out the Tatars were going through their own power struggles between princes of their realm, and Akhmed was assassinated the next year, with his successor becoming Sheikh instead of Khan. Ivan's grandson, Ivan the Terrible, expanded the country even further. He made deals with various Khans and then conquered others, pushing east to conquer the Khanate of Sibir, and so conquered Siberia in the 1580s. The empire would eventually stretch all the way to the Pacific Ocean. He had a son who didn't have any heirs and so was the last in the Rurikid dynasty. But Ivan the Terrible had married Anastasia Romanov, who, when he crowned himself Caesar, or Tsar as they called it, became Tsaritsa. And so the Romanovs came to power in 1613 and, following the rule of Peter the Great from 1682 to 1725, brought the Enlightenment to Russia. He started the process of industrialization, built a new capital he called St Petersburg, built a navy, made peace with the Polish king, then the Ottoman sultan, and so took control of the Baltics, which the Swedes had controlled on and off since the time of Rurik.
Russian Empire
Thus began the expansion as the Russian Empire. They used an alliance with Denmark-Norway and chased the Swedes through the Polish-Lithuanian Commonwealth, unseating the Polish king along the way. He probably should not have allied with them.
They moved back into Finland, took the Baltics, so modern Latvia and Estonia, and pushed all the way across the Eurasian continent, across the frozen tundra, and into Alaska. Catherine the Great took power in 1762 and ignited a golden age. She took Belarus, parts of Mongolia, parts of modern-day Georgia, overtook the Crimean Khanate and modern-day Azerbaijan, and during her reign founded Odessa, Sevastopol, and other cities. She modernized the country like Peter had and oversaw nearly constant rebellions in the empire. And her three or four children went on to fill the courts of Britain, Denmark, Sweden, Spain, and the Netherlands. She set up a national network of schools, with teachings from Russian and western philosophers like John Locke. She collected vast amounts of art, including many pieces from China. She set up a banking system and issued paper money. She also started the process to bring about the end of serfdom, even though, between her personal holdings and the crown's, she held 3.3 million serfs herself. She planned on invading Persia, but passed away before her army got there. Her son Paul halted expansion. And probably just in time. Her grandson Alexander I supported other imperial powers against Napoleon and so had to deal with the biggest invasion Russia had seen. Napoleon moved in with his grand army of half a million troops. The Russians used a tactic that Peter the Great had used and mostly refused to engage Napoleon's troops, instead burning the supply lines. Napoleon lost 300,000 troops during that campaign. Soon after the Napoleonic wars ended, the railways began to appear. The country was industrializing and, with guns and cannons, growing stronger than ever. The Opium Wars, between China and the UK and then the UK and France, were not good to China. Even though Russia didn't really help, they ended up with a piece of the Chinese empire, and so in the last half of the 1800s the Russian Empire grew by another 300,000 square miles on the backs of a series of unequal treaties, as they came to be known in China following World War I. And so by 1895, the Romanovs had expanded past their native Moscow, driven back the Mongols, followed some of the former Mongol Khanates to their lands and taken them, taken Siberia, parts of the Chinese empire, the Baltics, and Alaska, and were sitting on the third largest empire the world had ever seen, which covered nearly 17 percent of the world. Some 8.8 million square miles. And yet, still just a little smaller than the British Empire. They had small skirmishes with the British but by and large looked to smaller foes or proxy wars, with the exception of the Crimean War.
Revolution
The population was expanding and industrializing. Workers flocked to factories on those train lines. And more people in more concentrated urban areas meant more ideas. Rurik came in 862 and his descendants ruled until the Romanovs took power in 1613. They ruled until 1917. That's over 1,000 years of kings, queens, Tsars, and Emperors. The ideas of Marx slowly spread. While the ruling family was busy with treaties and wars and empire, they forgot to pay attention to the troubles at home. People like Vladimir Lenin discovered books by people like Karl Marx. Revolution was in the air around the world. France had shown monarchies could be toppled. Some of the revolutionaries were killed, others put to work in labor camps, others exiled, and still others continued on. Still, the empire was caught up in global empire intrigues.
The German Empire had been growing and the Russians had the Ottomans and Bulgarians on their southern borders. They allied with France to take on Germany, just as they'd allied with Germany to take down Poland. And so after over 1.8 million dead Russians and another 3.2 million wounded or captured, and food shortages back home and in the trenches, the people finally had enough of their Tsar. They went on strike, but Tsar Nicholas ordered the troops to fire. The troops refused. The Duma stepped in and forced Nicholas to abdicate. Russia had revolted in 1917, sued Germany for peace, and gave up more territory than they wanted in the process: Finland, the Baltics, their share of Poland, parts of the Ukraine. It was too much. But those lands took a lot of German time and focus to occupy, and so helped to weaken Germany in the overall war effort. Back home, Lenin took a train home and his Bolshevik party took control of the country. After the war Poland was again independent. Yugoslavia, Czechoslovakia, Estonia, Lithuania, and Latvia became independent nations. In the wake of the war the Ottoman Empire was toppled and modern Turkey was born. The German Kaiser abdicated. And socialism and communism were on the rise. In some cases, that was really just a new way to refer to a dictator who pretended to care about the people. Revolution had come to China in 1911 and Mao took power in the 1940s. Meanwhile, Lenin passed in 1924, succeeded by Rykov, then Molotov, who helped spur a new wave of industrialization, and then Stalin, who led purges of the Russian people in a number of Show Trials before getting the Soviet Union, as the Russian Empire was now called, into World War II. Stalin encouraged Hitler to attack Poland in 1939. Let's sit on that for a second. He tried to build a pact with the Western powers, and after that broke down, he launched excursions annexing parts of Poland, Finland, Romania, Lithuania, Estonia, and Latvia. Many of those lands were parts of the former Russian Empire. The USSR had held chunks of Belarus and the Ukraine before, but by the 1950s had pulled Poland, East Germany, Czechoslovakia, Romania, and Bulgaria into the Warsaw Pact, a bloc of nations we later called the Soviet Bloc. They even built a wall between East and West Germany. During and after the war, the Americans whisked German scientists off to the United States. The Soviets were in no real danger from an invasion by the US, and the weakened French, Austrians, and military-less Germans were in no place to attack the Soviets. The UK had to rebuild, and the British Empire quickly fell apart. Even the traditional homes of the Vikings who'd rowed down the rivers would cease to be global powers. And thus there were two superpowers remaining in the world, the Soviets and the United States.
The Cold War
The Soviets took back much of the former Russian Empire, claiming they needed buffer zones or through subterfuge. At its peak, the Soviet Union covered 8.6 million square miles; just a couple hundred thousand shy of the Russian Empire. On the way there, they grew to a nation of over 290 million people with dozens of nationalities. And they expanded the sphere of influence even further, waging proxy wars in places like Vietnam and Korea. They never actually went to war with the United States, in much the same way they mostly avoided the direct big war with the Mongols and the British - and how Rorik of Dorestad played both sides of Frankish conflicts. We now call this period the Cold War. The Cold War was an arms race.
This manifested itself first in nuclear weapons. The US is still the only country to have detonated a nuclear weapon in wartime, with the bombings that caused the surrender of Japan at the end of the war. The Soviets weren't that far behind and detonated a bomb in 1949. That was the same year NATO was founded as a treaty organization between Belgium, Canada, Denmark, France, Iceland, Italy, Luxembourg, the Netherlands, Norway, Portugal, and the United States. The US upped the ante with the hydrogen bomb in 1952. The Soviets got the hydrogen bomb in 1955. And then came the Space Race. Sputnik launched in 1957. The Russians were winning the space race. They further proved that when they put Yuri Gagarin up in 1961. By 1969 the US put Neil Armstrong and Buzz Aldrin on the moon. Each side developed military coalitions, provided economic aid to allies, built large arsenals of weapons, practiced espionage against one another, deployed massive amounts of propaganda, and spread their ideology. Or at least that's what the modern interpretation of history tells us. There were certainly ideological differences, but the Cold War saw the spread of communism as a replacement for conquest. That started with Lenin trying to lead a revolt throughout Europe but shifted over the decades into, again, pure conquest. Truman saw the rapid expansion of the Soviets and, without the context that they were mostly reclaiming lands conquered by Russian imperial forces, won support for the Truman Doctrine to contain Soviet expansion in Eastern Europe. First, they supported Greece and Turkey. But the support extended throughout areas adjacent to Soviet interests. Eisenhower saw how swiftly the Russians were putting science into action with satellites and space missions and nuclear weapons - and responded with an emphasis on American science. The post-war advancements in computing were vast in the US. The industry moved from tubes and punch cards to interactive computing after the Whirlwind computer was developed at MIT, first to help train pilots and then to intercept Soviet nuclear weapons. Packet switching, and so the foundations of the Internet, came out of work to build a computer network that could withstand nuclear attack. Graphical interfaces got their start when Ivan Sutherland was working at MIT on the grandchild of Whirlwind, the TX-2 - a lineage that would evolve into the Digital Equipment PDP once commercialized. Drum memory, which became the foundation of storage, was developed to help break Russian codes and intercept messages. There isn't a part of the computing industry that isn't touched by the research farmed out by various branches of the military and by ARPA. Before the Cold War, Russia and then the Soviet Union were about half for and half against various countries when it came to proxy wars. They tended to play both sides. Once the Cold War started it was pretty much always the US or UK versus the Soviet Union: Algeria, Kenya, Taiwan, the Sudan, Lebanon, Central America, the Congo, Eritrea, Yemen, Dhofar, Malaysia, the Dominican Republic, Chad, Iran, Iraq, Thailand, Bolivia, South Africa, Nigeria, India, Bangladesh, Angola, Ethiopia, the Sahara, Indonesia, Somalia, Mozambique, Libya, and Sri Lanka. And the big ones were Korea, Vietnam, and Afghanistan. Many of these are still raging on today. The Soviet military grew to over 5 million soldiers. The US started with 2 nuclear weapons in 1945 and had nearly 300 by 1950, when the Soviets had just 5. The US stockpile grew to over 18,000 in 1960 and peaked at over 31,000 in 1965.
The Soviets had 6,129 by then but kept building until they got close to 40,000 by 1980. By then the Chinese, France, and the UK each had over 200, and India and Israel had developed nuclear weapons. Since then only Pakistan and North Korea have joined the ranks of nuclear powers, although there are US warheads located in Germany, Belgium, Italy, Turkey, and the Netherlands.
Modern Russia
The buildup was expensive. Research, development, feeding troops, supporting asymmetrical warfare in proxy states, and trade sanctions put a strain on the government and nearly bankrupted Russia. They fell behind in science, in part because Stalin had been anti-computer. Meanwhile, the US was able to parlay all that research spending into true productivity gains. The venture capital system also fueled increasingly wealthy companies who paid taxes. Those gains showed up in banking, supply chains, refrigeration, miniaturization, radio, television, and everywhere else we could think of. By the 1980s, the US had Apple and Microsoft and Commodore. The Russians were trading blat, an informal black market currency, to gain access to knock-offs of ZX Spectrums while graphical interface systems were being born. The system of government in the Soviet Union had become outdated. There were some who had thought to modernize it into more of a technocracy in an era when the US was just starting to build ARPANET - but those ideas never came to fruition. Instead it became almost feudalistic, with high-ranking party members replacing the boyars, or aristocrats, of the old Kievan Rus days. The standard of living suffered. So many cultures and tribes under one roof, but only the Slavs had much say. As the empire over-extended there were food shortages. If there are independent companies, the finger can be pointed in their direction; but when food is rationed by the Politburo, the blame lands there too, and the decline in agricultural production meant bringing food in from the outside. That meant paying for it. Pair that with uneven distribution and overspending on the military. Marxist-Leninist doctrine meant a one-party state: the Communist Party. Mikhail Gorbachev allowed countries in the Bloc to move in a democratic direction with multiple parties. The Soviet Union simply became unmanageable. And while Gorbachev took the blame for much of the downfall of the empire, there was already a deep decay - they were an oligarchy pretending to be a communist state. The countries outside of Russia quickly voted in non-communist governments, and by 1989 the Berlin Wall came down and the Eastern European countries began to seek independence, most moving towards democratic governments. The collapse of the Soviet Union resulted in 15 separate countries and left the United States standing alone as the global superpower. The Czech Republic, Hungary, and Poland joined NATO in 1999. 2004 saw Bulgaria, Estonia, Latvia, Lithuania, Romania, Slovakia, and Slovenia join. 2009 brought in Albania and Croatia. Montenegro joined in 2017 and then North Macedonia in 2020. Then came the subject of adding Ukraine. The country the Kievan Rus had migrated through the lands from; the stem from which the name, and possibly the soul, of the country had sprouted. How could Vladimir Putin allow that to happen? Why would it come up? As the Soviets pulled out of the Bloc countries, they left remnants of their empire behind. Belarus, Kazakhstan, and the Ukraine were left with plenty of weapons that couldn't be moved quickly. Ukraine alone had 1,700 nuclear weapons, which included 16 intercontinental ballistic missiles.
Add to that nearly 2,000 biological and chemical weapons. Those went to Russia or were disassembled once the Ukrainians were assured of their sovereignty. The Crimea, which had been fought over in multiple bloody wars, was added to Ukraine. At least until 2014, when Putin wanted the port of Sevastopol, founded by Catherine the Great. Now there was a gateway from Russia to the Mediterranean yet again. So the Kievan Rus under Rurik is really the root of the modern Ukraine, and the Russian Empire and then the Romanov dynasty flowed from that following the Mongol invasions. The Russian Empire freed other nations from the yoke of Mongolian rule but became something entirely different once it over-extended. Those countries in the empire often traded the Mongol yoke for the Soviet yoke. And the Soviet Union that fought the Cold War was entirely different again, as is the modern Russia we know today. Meanwhile, the states of Europe had been profoundly changed since the days of Thomas Paine's The Rights of Man and Marx. Many moved left of center and socialized parts of their economies. No one need ever go hungry in a Scandinavian country. Health care, education, even child care became free in many countries. Many of those same ideals that helped lift the standard of living for all in developed countries then spread, including to Canada and, some of them, to the US. And so we see socialism to capitalism as more of a spectrum than a boolean choice now. And totalitarianism, oligarchy, and democracy as a spectrum as well. Many could argue reforms in democratic countries are paid for by lobbyists who are paid for by companies, and are thus an effective oligarchy. Others might argue the elections in many countries are rigged, and so those countries aren't even oligarchies, they're monarchies. Putin took office in 1999, and while Dmitry Medvedev was the president for a time, Medvedev effectively ruled in a tandemocracy with Putin until Putin decided to get back into power. That's 23 years and counting, and just a few months behind when King Abdullah took over in Jordan and King Mohammed VI took over in Morocco. And so while democratic in name, they're not all quite so democratic. Yet they do benefit from technology that began in Western countries and spread throughout the world. Companies like the semiconductor manufacturer Sitronics even went public on the London Stock Exchange. Hard-line communists might (and do) counter that the US has an empire and that western countries conspire for the downfall of Russia or want to turn Russians into slaves to the capitalist machine. As mentioned earlier, there has always been plenty of propaganda in this relationship. Or gaslighting. Or fake news. Or disinformation. One of those American advancements that ties the Russians to the capitalist yoke is interactive computing. That could have been developed in Glushkov's or Kitov's labs in Russia, as they had the ideas and talent. But because of the oligarchy that formed around communism, the ideas were sidelined and it came out of MIT - and that led to Project MAC, which did as much to democratize computing as Gorbachev did to democratize the Russian Federation.
The Internet is not a simple story to tell. In fact, every sentence here is worthy of an episode, if not a few. Many would claim the Internet began back in 1969 when the first node of the ARPAnet went online. That was the year we got the first color pictures of Earth from Apollo 10 and the year Nixon announced the US was leaving Vietnam. It was also the year of Stonewall, the moon landing, the Manson murders, and Woodstock. A lot was about to change. But maybe the story of the Internet starts before that, when the basic research to network computers began as a means of networking nuclear missile sites with fault-tolerant connections in the event of, well, nuclear war. Or the Internet began when a T3 backbone was built to host all the datas. Or the Internet began with the telegraph, when the first data was sent over electronic current. Or maybe the Internet began when the Chinese used fires to send messages across the Great Wall of China. Or maybe the Internet began when drums sent messages over long distances in ancient Africa, like early forms of packets flowing over Wi-Fi-esque sound waves. We need to make complex stories simpler in order to teach them, so if the first node of the ARPAnet in 1969 is where this journey should end, feel free to stop here. To dig in a little deeper, though, that ARPAnet was just one of many networks that would merge into an interconnected network of networks. We had dialup providers like CompuServe, America Online, and even The WELL. We had regional timesharing networks like the DTSS out of Dartmouth College and PLATO out of the University of Illinois, Urbana-Champaign. We had corporate time sharing networks and systems. Each competed or coexisted or took time from others or pushed more people to others through their evolutions. Many used their own custom protocols for connectivity. But most were walled gardens, unable to communicate with the others. So if the story is more complicated than the ARPAnet simply being the ancestor of the Internet, why is that the story we hear? Let's start that journey with a memo we did an episode on, called “Memorandum For Members and Affiliates of the Intergalactic Computer Network,” sent by JCR Licklider in 1963, which can be considered the allspark that lit the bonfire called The ARPANet. Which isn't exactly the Internet but isn't not. In that memo, Lick proposed a network of computers available to the research scientists of the early 60s - scientists from computing centers that would evolve into supercomputing centers, and then a network open to the world, even our phones, televisions, and watches. It took a few years, but eventually ARPA brought in Larry Roberts, and by late 1968 ARPA awarded the contract to build the network to a company called Bolt Beranek and Newman (BBN), who would build the Interface Message Processors, or IMPs. The IMPs were computers that connected a number of sites and routed traffic. The first IMP, which might be thought of more as a network interface card today, went online at UCLA in 1969, with additional sites coming on frequently over the next few years. That system would become ARPANET. The first node of ARPAnet went online at the University of California, Los Angeles (UCLA for short). It grew as leased lines and more IMPs became available. As the network grew, the early computer scientists realized that each site had different computers running various and random stacks of applications and different operating systems, so certain aspects of connectivity between different computers needed to be standardized.
Given that UCLA was the first site to come online, Steve Crocker from there began organizing notes about protocols and how systems connected with one another in what they called RFCs, or Requests for Comments. That series of notes was then managed by a team that included Elizabeth (Jake) Feinler from Stanford once Doug Engelbart's project on the “Augmentation of Human Intellect” at Stanford Research Institute (SRI) became the second node to go online. SRI developed a Network Information Center, where Feinler maintained a list of host names (which evolved into the hosts file) and a list of address mappings that would later evolve into the functions of InterNIC, which would be turned over to the US Department of Commerce when the number of devices connected to the Internet exploded. Feinler and Jon Postel from UCLA would maintain those, Postel until his death 28 years later, and those RFCs came to include everything from opening terminal connections into machines to file sharing to addressing, and now any place where networking needs to become a standard. The development of many of those early protocols that made computers useful over a network was also being funded by ARPA. They funded a number of projects to build tools that enabled the sharing of data, like file sharing, and some advancements were loosely connected by people just doing things to make them useful, and so by 1971 we also had email. But all those protocols needed to flow over a common form of connectivity that was scalable. Leonard Kleinrock, Paul Baran, and Donald Davies were independently investigating packet switching, and Roberts brought Kleinrock into the project as he was at UCLA. Bob Kahn entered the picture in 1972. He would team up with Vint Cerf from Stanford, who came up with encapsulation, and so they would define the protocol that underlies the Internet, TCP/IP. By 1974 Vint Cerf and Bob Kahn wrote RFC 675, where they coined the term internet as shorthand for internetwork. The number of RFCs was exploding, as was the number of nodes. The University of California, Santa Barbara came online, then the University of Utah, to connect Ivan Sutherland's work. The network was national when BBN connected to it in 1970. Now there were 13 IMPs, and by 1971, 18, then 29 in '72 and 40 in '73. Once the need arose, Kleinrock would go on to work with Farouk Kamoun to develop hierarchical routing theories in the late 70s. ARPA, meanwhile, had become DARPA in 1972. The network grew to 213 hosts in 1981, and by 1982 TCP/IP became the standard for the US DOD, and in 1983 ARPANET moved fully over to TCP/IP. And so TCP/IP, or Transmission Control Protocol/Internet Protocol, is the most dominant networking protocol on the planet. It was written to help improve performance on the ARPAnet with the ingenious idea to encapsulate traffic. But in the 80s, it was still just for researchers. That is, until NSFNet was launched by the National Science Foundation in 1986. The ARPAnet had also gone international, with University College London connecting in 1973, which would go on to inspire a British research network called JANET that built its own set of protocols called the Coloured Book protocols, and the Norwegian Seismic Array connecting over satellite in 1973. So networks were forming all over the place, often just time sharing networks where people dialed into a single computer. Another networking project going on at the time, also getting funding from ARPA as well as the Air Force, was PLATO. Out of the University of Illinois, it was meant for teaching and began on a mainframe in 1960.
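Encapsulation deserves a moment, since it's the idea that made dissimilar networks interoperable: each layer wraps the data of the layer above in its own header and never needs to understand what's inside. Here's a toy sketch in Python; the field layouts are simplified stand-ins for illustration, not the real header formats from RFC 791 and RFC 793.

```python
import struct

def tcp_segment(src_port: int, dst_port: int, seq: int, payload: bytes) -> bytes:
    # Transport layer: wrap application data with ports and a sequence number.
    return struct.pack("!HHI", src_port, dst_port, seq) + payload

def ip_packet(src: bytes, dst: bytes, segment: bytes) -> bytes:
    # Network layer: wrap the whole segment with source/destination addresses.
    return struct.pack("!4s4s", src, dst) + segment

app_data = b"hello, internetwork"                         # application layer
seg = tcp_segment(1027, 23, seq=1, payload=app_data)      # TCP-ish wrapper
pkt = ip_packet(b"\x0a\x00\x00\x01", b"\x0a\x00\x00\x02", seg)  # IP-ish wrapper

# The wire only ever carries pkt; each layer peels off its own header and
# hands the rest up, which is why any network that can move packets can
# carry any application.
print(len(app_data), len(seg), len(pkt))  # payload grows by one header per layer
```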
But by the time ARPAnet was growing, PLATO was on version IV and running on a CDC Cyber. The time sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. In fact, there were multiple CDC Cybers that could communicate with one another. And many on ARPAnet also used PLATO, cross-pollinating the defense-backed network with non-defense academia across a number of academic institutions. The defense backing couldn't last forever. The Mansfield Amendment in 1973 banned general research by defense agencies. This meant that ARPA funding started to dry up and the scientists working on those projects needed a new place to fund their playtime. Bob Taylor split to go work at Xerox, where he was able to pick the best of the scientists he'd helped fund at ARPA. He helped bring in people from Stanford Research Institute, where they had been working on the oN-Line System, or NLS, and people like Bob Metcalfe, who brought us Ethernet and better collision detection. Metcalfe would go on to found 3Com, a great switch and network interface company, during the rise of the Internet. But there were plenty of people who could see the productivity gains from ARPAnet and didn't want it to disappear. And the National Science Foundation (NSF) was flush with cash. And the ARPA crew was increasingly aware of non-defense oriented use of the system. So the NSF started up a little project called CSNET in 1981 so the growing number of supercomputers could be shared between all the research universities. It was free for universities that could get connected, and from 1985 to 1993 NSFNET surged from 2,000 users to 2,000,000 users. Paul Mockapetris made the Internet easier to use than in its academic-only days by developing the Domain Name System, or DNS, in 1983. That's how we can call up remote computers by name rather than by IP address. And of course DNS was yet another of the protocols in Postel's list of protocol standards, work which by 1986, after the selection of TCP/IP for NSFnet, would be taken up by the standardization body known as the IETF, or Internet Engineering Task Force for short. Maintaining a set of protocols that all vendors needed to work with was one of the best growth hacks ever. No vendor could have kept up with demand with a 1,000x growth in such a small number of years. NSFNet started with six nodes in 1985, connected by LSI-11 Fuzzball routers, and quickly outgrew that backbone. They put it out to bid and Merit Network won out in a partnership between MCI, the State of Michigan, and IBM. Merit had begun before the first ARPAnet connections went online, as a collaborative effort by Michigan State University, Wayne State University, and the University of Michigan. They'd been connecting their own machines since 1971 and had implemented TCP/IP and bridged to ARPANET. The money was getting bigger: they got $39 million from NSF to build what would emerge as the commercial Internet. They launched in 1987 with 13 sites over 14 lines. By 1988 they'd gone nationwide, going from a 56k backbone to a T1 and then 14 T1s. But the growth was too fast for even that. They re-engineered, and by 1990 planned to add T3 lines running in parallel with the T1s for a time. By 1991 there were 16 backbones, with traffic and users growing by an astounding 20% per month.
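As a quick illustration of what Mockapetris' system buys us: every modern OS ships a resolver, so turning a name into addresses is one call away. A minimal sketch in Python using only the standard library; example.com is just a placeholder host.

```python
import socket

# Ask the system resolver (and so, ultimately, DNS) for the addresses
# behind a name - the job once handled by hand-edited hosts files.
for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo(
        "example.com", 80, proto=socket.IPPROTO_TCP):
    print(family.name, sockaddr[0])  # prints the address family and IP of each record
```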
Vint Cerf ended up at MCI, where he helped lobby for the privatization of the internet, and he helped found the Internet Society in 1992. The lobby worked and led to the Scientific and Advanced-Technology Act in 1992. Before that, use of NSFNET was supposed to be for research; now it could expand to non-research and education uses. This allowed NSF to bring on even more nodes. And so by 1993 it was clear that this was growing beyond what a governmental institution whose charge was science could justify as “research” any longer. By 1994, Vint Cerf was designing the architecture and building the teams that would build the commercial internet backbone at MCI. And so NSFNET began the process of unloading the backbone and helped the world develop the commercial Internet by sprinkling a little money and know-how throughout the telecommunications industry, which was about to explode. NSFNET went offline in 1995, but by then there were networks in England, South Korea, Japan, and Africa, and CERN was connected to NSFNET over TCP/IP. And Cisco was selling routers that would fuel an explosion internationally. There was a war of standards, and yet over time we settled on TCP/IP as THE standard. And those were just some of the nets. The Internet is really not just NSFNET or ARPANET but a combination of a lot of nets. At the time there were a lot of time sharing computers that people could dial into, and following the release of the Altair, there was a rapidly growing personal computer market, with modems becoming more and more approachable towards the end of the 1970s. You see, we talked about these larger networks but not the hardware. The first modulator-demodulator, or modem, was the Bell 101 dataset, which had been invented all the way back in 1958, loosely based on a previous model developed to manage SAGE computers. But the transfer rate, or baud, had stopped being improved upon at 300 for almost 20 years, and not much had changed. That is, until Hayes Microcomputer Products released a modem designed to run on the Altair 8800 S-100 bus in 1978. Personal computers could talk to one another. And one of those Altair owners was Ward Christensen, who met Randy Suess at the Chicago Area Computer Hobbyists' Exchange, and the two of them had this weird idea: have one of their computers host a bulletin board. People could dial into it and discuss their Altair computers when it snowed too much to meet in person for their club. They started writing a little code and before you know it we had a tool they called Computerized Bulletin Board System software, or CBBS. The software and, more importantly, the idea of a BBS spread like wildfire right along with the Atari, TRS-80, Commodore, and Apple computers that were igniting the personal computing revolution. The number of nodes grew, and as people started playing games, the speed of those modems jumped up, with the V.32 standard hitting 9600 baud in '84, and over 25k in the early 90s. By the early 1980s, we got FidoNet, which was a network of Bulletin Board Systems, and by the early 90s we had 25,000 BBSes. And other nets had been on the rise. And these were commercial ventures. The largest of those dial-up providers was America Online, or AOL. AOL began in 1985 and, like most of the other dial-up providers of the day, was there to connect people to a computer they hosted, like a timesharing system, and give access to fun things: games, news, stocks, movie reviews, chatting with your friends, etc.
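The CBBS idea is simple enough to sketch in a few dozen lines. Here's a toy, hypothetical bulletin board in Python, with a TCP socket standing in for the phone line and modem; the real CBBS was, of course, 8080 assembly answering a single dial-up line.

```python
import socket
import threading

messages = []            # the board: an in-memory list of posts
lock = threading.Lock()  # one lock so callers don't garble each other's posts

def handle(conn: socket.socket) -> None:
    with conn:
        conn.sendall(b"Welcome to the toy BBS. Commands: READ, POST <text>, QUIT\r\n")
        buf = b""
        while True:
            data = conn.recv(1024)
            if not data:
                return
            buf += data
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                cmd = line.strip().decode(errors="replace")
                if cmd.upper() == "QUIT":
                    return
                if cmd.upper() == "READ":
                    with lock:
                        board = "\r\n".join(messages) or "(no messages yet)"
                    conn.sendall(board.encode() + b"\r\n")
                elif cmd.upper().startswith("POST "):
                    with lock:
                        messages.append(cmd[5:])
                    conn.sendall(b"Posted.\r\n")
                else:
                    conn.sendall(b"Unknown command.\r\n")

# One thread per caller; a real single-line BBS had the ultimate
# concurrency model: the busy signal.
server = socket.create_server(("127.0.0.1", 2323))
print("Toy BBS listening on port 2323")
while True:
    conn, _addr = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

You can poke at it with telnet localhost 2323, which is not far from how dialing into a board felt, minus the modem handshake squeal.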
There was also CompuServe, The Well, PSINet, Netcom, Usenet, AlterNet, and many others. Some started to communicate with one another with the rise of the Metropolitan Area Exchanges, which got an NSF grant to establish switched Ethernet exchanges, and the Commercial Internet Exchange in 1991, established by PSINet, UUNET, and CERFnet out of California. Those slowly moved over to the Internet, and even AOL got connected to the Internet in 1989; thus the dial-up providers went from effectively being timesharing systems to Internet Service Providers as more and more people expanded their horizons away from the walled garden of the time sharing world and towards the Internet. The number of BBS systems started to wind down. All these IP addresses couldn't be managed easily, and so the IANA function evolved from being managed under contracts with research universities and DARPA, to IANA as a part of ICANN, and eventually to the development of Regional Internet Registries: AFRINIC to serve Africa; ARIN to serve Antarctica, Canada, the Caribbean, and the US; APNIC to serve South, East, and Southeast Asia as well as Oceania; LACNIC to serve Latin America; and RIPE NCC to serve Europe, Central Asia, and West Asia. By the 90s the Cold War was winding down (temporarily at least), so they even added Russia to RIPE NCC. And so, using tools like Winsock, any old person could get on the Internet by dialing up. Modems for dial-ups transitioned to DSL and cable modems. We got the emergence of fiber with regional centers and even national FiOS connections. And because of all the hard work of all of these people, and the money dumped into it by the various governments and research agencies, life is pretty darn good. When we think of the Internet today we think of this interconnected web of endpoints and content that is all available. Much of that was made possible by the development of the World Wide Web by Tim Berners-Lee in 1991 at CERN. Mosaic came out of the National Center for Supercomputing Applications, or NCSA, at the University of Illinois, quickly becoming the browser everyone wanted to use until Marc Andreessen left to form Netscape. Netscape's IPO is probably one of the most pivotal moments, where investors from around the world realized that all of this research and tech was built on standards, and while there were some patents, the standards were freely usable by anyone. Those standards led to an explosion of companies like Yahoo!, from a couple of Stanford grad students, and Amazon, started by a young hedge fund Vice President named Jeff Bezos who noticed all the money pouring into these companies and went off to do his own thing in 1994. The race among companies arising to create and commercialize content and ideas, and to bring every industry online, was ferocious. And there were the researchers still writing the standards, and even commercial interests helping with that. And there were open source contributors who helped make some of those standards easier to implement by regular old humans. And tools for those who build tools. And from there the Internet became what we think of today. Quicker and quicker connections and more and more productivity gains, a better quality of life, better telemetry into all aspects of our lives, and, with the miniaturization of devices to support wearables, it even extends to our bodies. Yet it still sits on the same fundamental building blocks as before. The IANA functions that manage IP addressing have moved to the private sector, as have many an onramp to the Internet.
Especially as internet access has become more ubiquitous and we are entering the era of 5G connectivity. And it continues to evolve as we pivot due to new needs and the threats a globally connected world represents: IPv6, various secure DNS options, defenses against spam and phishing, and dealing with the equality gaps surfaced by our new online world. We have disinformation, so sometimes we might wonder what's real and what isn't. After all, any old person can create a web site that looks legit and put whatever they want on it. Who's to say what reality is other than what we want it to be? This was pretty much what Morpheus was offering with his choice of pills in the Matrix. But underneath it all, there's history. And it's a history as complicated as unraveling the meaning of an increasingly digital world. And it is wonderful and frightening and lovely and dangerous and true and false and destroying the world and saving the world all at the same time. This episode is pretty simplistic, and many of the aspects we cover have entire episodes of the podcast dedicated to them. From the history of Amazon to Bob Taylor to AOL to the IETF to DNS and even Network Time Protocol. It's a story that necessarily leaves people out; otherwise scope creep would go all the way back to include Volta and the constant electrical current humanity received with the battery. But hey, we also have an episode on that! And many an advance has plenty of books and scholarly works dedicated to it - all the way back to the first known computer (in the form of clockwork), the Antikythera Device out of Ancient Greece. Heck, even Louis Gerstner deserves a mention for selling IBM's stake in all this to focus on things that kept the company going, not moonshots. But I'd like to dedicate this episode to everyone not mentioned due to trying to tell a story of emergent networks. Just because the networks were growing fast and our modern infrastructure was becoming more and more deterministic doesn't mean there aren't so, so, so many people who are a part of this story, whether they were writing a text editor, helping fund the work, pushing paper, writing specs, selling network services, or getting zapped while trying to figure out how to move current. Each with their own story to be told. As we round the corner into the third season of the podcast we'll start having more guests. If you have a story and would like to join us, use the email button on thehistoryofcomputing.net to drop us a line. We'd love to chat!
May 7th is Sauvignon Blanc Day, but we'll be celebrating all week! Join the Wine Thieves at ground zero for new world sauvignon blanc, Marlborough, New Zealand. John and Sara explore its rise to prominence from the first plantings in the 1970s, through to its explosion on the international scene in the 1990s, and now to the current ‘third wave' (a good third wave, that is) of producers that are breaking rules and leading the country in new stylistic directions. Joining the discussion is James Healy, co-founder, with Ivan Sutherland, of Dog Point wines in Marlborough. James was chief winemaker at Cloudy Bay in the 1990s before striking out on his own, and thus really was at ground zero of the New Zealand and Marlborough sauvignon blanc international explosion that occurred in the mid-1990s. He tells us about a new initiative called Appellation Marlborough Wine (AMW), its importance, and its growing prominence. We're also joined by Matt Deller, Master of Wine and Chief Global Sales & Marketing Officer for the Villa Maria group, founded by Sir George Fistonich in the middle of last century in Auckland. 'Villa', as it's known, has grown to be one of the largest and most admired players in the NZ wine industry, and Matt discusses the company's commitment to organics and sub-appellations. In the final segment of the show, the Thieves welcome Erica Crawford of Loveblock winery, co-creator, with husband Kim, of the hugely successful Kim Crawford brand, which they sold a decade and a half ago to Constellation Brands. Erica falls in love with sauvignon all over again making wines in the style she loves to drink, shares insight into the rapid commercial success of New Zealand sauvignon blanc, praises Lord of the Rings, and reveals how high grade tea powder might just change winemaking for the better. Grab a well-chilled glass of crunchy sauvignon and settle in for a great discussion. This episode was produced in partnership with New Zealand Winegrowers.
There was a nexus of Digital Research and Xerox PARC, along with Stanford and Berkeley, in the Bay Area. The rise of the hobbyists and the success of Apple attracted some of the best minds in computing to Apple. This confluence was about to change the world. One of those brilliant minds that landed at Apple started out as a technical writer. Apple hired Jef Raskin as their 31st employee, to write the Apple II manual. He quickly started harping on people to build a computer that was easy to use. Mike Markkula wanted to release a gaming console or a cheap computer that could compete with the Commodore and Atari machines of the time. He called the project “Annie.” The project began with Raskin, but he had a very different idea than Markkula's. He summed it up in an article called “Computers by the Millions” that wouldn't see publication until 1982. His vision was closer to his PhD dissertation, bringing computing to the masses. For this, he envisioned a menu-driven operating system that was easy to use and inexpensive. It was not yet a GUI in the sense of a windowing operating system, and so it could run on chips that were rapidly dropping in price. He planned to use the 6809 chip for the machine and give it a five-inch display. He didn't tell anyone that he had a PhD when he was hired, as the team at Apple was skeptical of academia. Jobs provided input, but was off working on the Lisa project, which used the 68000 chip. So they had free rein over what they were doing. Raskin quickly added Joanna Hoffman for marketing. She was on leave from getting a PhD in archaeology at the University of Chicago and was the marketing team for the Mac for over a year. They also added Burrell Smith, employee #282 from the hardware technician team, to do hardware. He'd run with the Homebrew Computer Club crowd since 1975 and had just strolled into Apple one day and asked for a job. Raskin also brought in one of his students from the University of California San Diego, who was taking a break from working on his PhD in neurochemistry. Bill Atkinson became employee 51 at Apple and joined the project. They pulled in Andy Hertzfeld, who Steve Jobs hired when Apple bought one of his programs as he was wrapping up his degree at Berkeley, and who'd been sitting on the Apple services team doing Apple III demos. They added Larry Kenyon, who'd worked at Amdahl and then on the Apple III team. Susan Kare came in to add art and design. They, along with Chris Espinosa, who'd been in the garage with Jobs and Wozniak working on the Apple I, ended up comprising the core team. Over time, the team grew. Bud Tribble joined as the manager for software development. Jerrold Manock, who'd designed the case of the Apple II, came in to design the now-iconic Macintosh case. The team would eventually expand to include Bob Belleville, Steve Capps, George Crow, Donn Denman, Bruce Horn, and Caroline Rose as well. It was still a small team. And they needed a better code name. But chronologically, let's step back to the early project. Raskin chose his favorite apple, the McIntosh, respelled as Macintosh, as the codename for the project. As far as codenames go it was a pretty good one. So their mission would be to ship a machine that was easy to use, would appeal to the masses, and be at a price point the masses could afford. They were looking at 64k of memory, a Motorola 6809 chip, and a 256-by-256 bitmap display. Small, light, and inexpensive. Jobs' relationship with the Lisa team was strained, so he was taken off of that project and started moving in on the Macintosh team.
It was quickly the Steve Jobs show. Having seen what could be done with the Motorola 68000 chip on the Lisa team, Jobs had them redesign the board to work with that. After visiting Xerox PARC at Raskin's insistence, Jobs finally got the desktop metaphor and true graphical interface design. Xerox had not been quiet about the work at PARC. Going back to 1972 there were even television commercials. And Raskin had done time at PARC while on sabbatical from Stanford. Information about Smalltalk had been published, and people like Bill Atkinson were reading about it in college. People had been exposed to the mouse all around the Bay Area in the 60s and 70s, or had read Engelbart's scholarly works on it. Many of the people that worked on these projects had doctorates and were academics. They shared their research as freely as love was shared during that counter-culture time. Just as it had passed from MIT to Dartmouth and then, in the back of Bob Albrecht's VW, spread around the country in the 60s. That spirit of innovation and the constant evolutions over the past 25 years found their way to Steve Jobs. He saw the desktop metaphor and mouse and fell in love with it, knowing they could build one for less than the $400 unit Xerox had. He saw how an object-oriented programming language like Smalltalk made all that possible. The team was already on their way to the same types of things, and so Jobs told the people at PARC about the Lisa project, but not yet about the Mac. In fact, he was as transparent as anyone could be. He made sure they knew how much he loved their work and disclosed more than I think the team planned on him disclosing about Apple. This is the point where Larry Tesler and others realized that the group of rag-tag garage-building Homebrew hackers had actually built a company that had real computer scientists and was on track to change the world. Tesler and some others would end up at Apple later - to see some of their innovations go to a mass market. Steve Jobs at this point totally bought into Raskin's vision. Yet he still felt they needed to make compromises on the price and use better hardware to make it all happen. Raskin couldn't make the kinds of compromises Jobs wanted. He also had an immunity to the now-infamous Steve Jobs reality distortion field, and they clashed constantly. So eventually Raskin left the project just when it was starting to take off. Raskin would go on to work with Canon to build his vision, which became the Canon Cat. With Raskin gone, and armed with a dream team of mad scientists, they got to work, tirelessly pushing towards shipping a computer they all believed would change the world. Jobs brought in Bill Fernandez to help with projects like the Mac's OS and, later, HyperCard. Wozniak had a pretty big influence over Raskin in the early days of the Mac project and helped here and there with the project, like with the bit-serial peripheral bus on the Mac. Steve Jobs wanted an inexpensive mouse that could be manufactured en masse. Jim Yurchenco from Hovey-Kelley, later called Ideo, got the task - given that trusted engineers at Apple had full dance cards. He looked at the Xerox mouse and other devices around - including trackballs in Atari arcade machines. Those used optics instead of mechanical switches. As the ball under the mouse rolled, beams of light would be interrupted, and the cost of those components had come down faster than the technology in the Xerox mouse. He used a ball from a roll-on deodorant stick and got to work.
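How do interrupted light beams become cursor movement? The usual trick in these ball mice was quadrature encoding: two detectors per axis, offset by a quarter slot, so the order in which they change reveals direction. A little sketch of the decoding logic in Python; this is the general scheme, not necessarily Yurchenco's exact circuit.

```python
# Each axis has two light detectors, A and B, read as a 2-bit state.
# Going one way the state cycles 00 -> 01 -> 11 -> 10; going the other
# way it cycles in reverse. Each valid transition is one tick of motion.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode(states):
    """Sum signed ticks from a stream of 2-bit (A, B) detector states."""
    position = 0
    for prev, new in zip(states, states[1:]):
        position += TRANSITIONS.get((prev, new), 0)  # ignore repeats/noise
    return position

# The wheel turning one way, then back the other way:
print(decode([0b00, 0b01, 0b11, 0b10, 0b00]))  # 4 ticks forward
print(decode([0b00, 0b10, 0b11, 0b01, 0b00]))  # 4 ticks back
```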
The rest of the team designed the injection-molded case for the mouse. That work began with the Lisa, and by the time they were done, the price was low enough that every Mac could get one. Armed with a mouse, they figured out how to move windows over the top of one another. Susan Kare designed iconography that today reads a bit less 8-bit, but is often every bit as true to form. Learning how they wanted to access various components of the desktop, or find things, they developed the Finder. Atkinson gave us marching ants, the concept of double-clicking, the lasso for selecting content, the menu bar, MacPaint, and later, HyperCard. It was a small team, working long hours, driven by Jobs' push for perfection. Jobs made the Lisa team the enemy. Everything not the Mac just sucked. He took the team to art exhibits. He had the team sign the inside of the case to infuse them with the pride of an artist. He killed the idea of long product specifications before writing code, and they just jumped in, building and refining and rebuilding and rapid prototyping. The team responded well to the enthusiasm and the demand for perfectionism. The Mac team was like a rebel squadron. They were like a start-up, operating inside Apple. They were pirates. They got fast and sometimes harsh feedback. And nearly all of them still look back on that time as the best thing they've done in their careers. As IBM and many others had learned the hard way before them, a small, inspired team can get a lot done. With such a small team and the ability to parlay work done for the Lisa, the R&D costs were minuscule until they were ready to release the computer. And yet, one can't change the world overnight. 1981 turned into 1982 turned into 1983. More and more people came in to fill gaps. Collette Askeland came in to design the printed circuit board. Mike Boich went to companies to get them to write software for the Macintosh. Berry Cash helped prepare sellers to move the product. Matt Carter got the factory ready to mass produce the machine. Donn Denman wrote MacBASIC (because every machine needed a BASIC back then). Martin Haeberli helped write MacTerminal and Memory Manager. Bill Bull got rid of the fan. Patti King helped manage the software library. Dan Kottke helped troubleshoot issues with motherboards. Brian Robertson helped with purchasing. Ed Riddle designed the keyboard. Linda Wilkin took on documentation for the engineering team. It was a growing team. Pamela Wyman and Angeline Lo came in as programmers. Hap Horn and Steve Balog came in as engineers. Jobs had agreed to bring in adults to run the company. So they recruited the 44-year-old hotshot John Sculley to change the world as their CEO rather than keep selling sugar water at Pepsi. Sculley and Jobs had a tumultuous relationship over time. While Jobs had made tradeoffs on cost versus performance for the Mac, Sculley ended up raising the price for business reasons. Regis McKenna came in to help with the marketing campaign. He would win over so much trust that he would later get called out of retirement to do damage control when Apple had an antenna problem on the iPhone. We'll cover Antenna-gate at some point. They spearheaded the production of the now-iconic 1984 Super Bowl XVIII ad, which showed a woman running from conformity and depicted IBM as the Big Brother from George Orwell's book 1984. Two days after the ad, the Macintosh 128k shipped for $2,495. The price had jumped because Sculley wanted enough money to fund a marketing campaign.
It shipped late, and the 128k of memory was a bit underpowered, but it was a success. Many of the concepts, such as the System and Finder, persist to this day. It came with MacWrite and MacPaint, and some of the other Lisa products were soon to follow, now as MacProject and MacTerminal. But the first killer app for the Mac was Microsoft Word, the first graphical version of Word ever shipped. Every machine came with a mouse. The machines came with a cassette that featured a guided tour of the new computer. You could write programs in MacBASIC and my second language, MacPascal. They hit the initial sales numbers despite the higher price. But over time that price bit them, and sales grew sluggish. Despite the early success, sales were declining. Yet the team forged on. They introduced the Apple LaserWriter at a whopping $7,000. This was a laser printer based on the Canon 300 dpi engine. Burrell Smith designed the board, and newcomer Adobe knew laser printers, given that the founders were Xerox alumni. They added PostScript, which had initially been thought up while John Warnock was working at Evans & Sutherland and was then implemented at PARC as Interpress, to make for perfect printing at the time. The sluggish sales caused internal issues. There's a hangover when we do something great. First there were the famous episodes between Jobs, Sculley, and the board of directors at Apple. Sculley seems to have been portrayed by many as either a villain or a court jester of sorts in the story of Steve Jobs. Across my research, which began with books and notes and expanded to include a number of interviews, I've found Sculley to have been admirable in the face of what many might consider a petulant child. But they all knew he was a brilliant one. But amidst Apple's first quarterly loss, Sculley and Jobs had a falling out. Jobs tried to lead an insurrection and ultimately resigned. Wozniak had left Apple already, pointing out that the Apple II was still 70% of the revenues of the company. But the Mac was clearly the future. They had reached a turning point in the history of computers. The first mass-marketed computer featuring a GUI and a mouse came and went. And so many others were in development that a red ocean was forming. Microsoft released Windows 1.0 in 1985. Acorn, Amiga, IBM, and others were in rapid development as well. I can still remember the first time I sat down at a Mac. I'd used the Apple IIs in school and we got a lab of Macs. It was amazing. I could open a file, change the font size and print a big poster. I could type up my dad's lyrics and print them. I could play SimCity. It was a work of art. And so it was signed by the artists that brought it to us: Peggy Alexio, Colette Askeland, Bill Atkinson, Steve Balog, Bob Belleville, Mike Boich, Bill Bull, Matt Carter, Berry Cash, Debi Coleman, George Crow, Donn Denman, Christopher Espinosa, Bill Fernandez, Martin Haeberli, Andy Hertzfeld, Joanna Hoffman, Rod Holt, Bruce Horn, Hap Horn, Brian Howard, Steve Jobs, Larry Kenyon, Patti King, Daniel Kottke, Angeline Lo, Ivan Mach, Jerrold Manock, Mary Ellen McCammon, Vicki Milledge, Mike Murray, Ron Nicholson Jr., Terry Oyama, Benjamin Pang, Jef Raskin, Ed Riddle, Brian Robertson, Dave Roots, Patricia Sharp, Burrell Smith, Bryan Stearns, Lynn Takahashi, Guy "Bud" Tribble, Randy Wigginton, Linda Wilkin, Steve Wozniak, Pamela Wyman and Laszlo Zidek. Steve Jobs left to found NeXT. Some, like George Crow, Joanna Hoffman, and Susan Kare, went with him.
Bud Tribble would become a co-founder of NeXT and then the Vice President of Software Technology after Apple purchased NeXT. Bill Atkinson and Andy Hertzfeld would go on to co-found General Magic and usher in the era of mobility. One of the best teams ever assembled slowly dwindled away. And the oncoming dominance of Windows in the market took its toll. It seems like every company has a “lost decade.” Some, like Digital Equipment, don't recover from it. Others, like Microsoft and IBM (who has arguably had a few), emerge as different companies altogether. Apple seemed to go dormant after Steve Jobs left. They had changed the world with the Mac. They put swagger and an eye for design into computing. But in the next episode we'll look at that long hangover, where they were left by the end of it, and how they emerged to change the world yet again. In the meantime, Walter Isaacson weaves together this story about as well as anyone in his book Steve Jobs. Steven Levy brilliantly tells it in his book Insanely Great. Andy Hertzfeld gives some of his stories at folklore.org. And countless other books, documentaries, podcasts, blog posts, and articles cover various aspects as well. The reason it's gotten so much attention is that where the Apple II was the watershed moment that introduced the personal computer to the mass market, the Macintosh was that moment for the graphical user interface.
Robert Taylor was one of the true pioneers in computer science. In many ways, he is the string (or glue) that connected the US government's era of supporting computer science through ARPA to the innovations that came out of Xerox PARC and then to the work done at Digital Equipment Corporation's Systems Research Center. Those are three critical aspects of the history of computing, and while Taylor didn't write any of the innovative code or develop any of the tools that came out of those three research environments, he saw people and projects worth funding and made sure the brilliant scientists got what they needed to get things done. The 31 years in computing that his stops represented were some of the most formative years for the young computing industry, and the advances he inspired ran from Vannevar Bush's 1945 article “As We May Think” to the explosion of the Internet across personal computers. Bob Taylor inherited a world where computing was waking up with large, crusty, but finally fully digitized mainframes stuck to its eyes in the morning, and went to bed the year Corel bought WordPerfect because PCs needed applications, the year the Pentium 200 MHz was released, the year the Palm Pilot and eBay were founded, the year AOL started to show articles from the New York Times, the year IBM opened a web shopping mall, and the year the Internet reached 36 million people. Excite and Yahoo went public. Sometimes big, sometimes small, all of these can be traced back to Bob Taylor - kinda like how we can trace all actors to Kevin Bacon. But more like if Kevin Bacon found talent and helped them get started, by paying them during the early years of their careers… How did Taylor end up as the glue for the young and budding computing research industry? Going from tween to teenager during World War II, he went to Southern Methodist University in 1948, when he was 16. He jumped into the US Naval Reserves during the Korean War and then got his master's in psychology at the University of Texas at Austin using the GI Bill. Many of those pioneers in computing in the 60s went to school on the GI Bill. It was a big deal across every aspect of American life at the time - paving the way to home ownership, college educations, and new careers in the trades. From there, he bounced around, taking classes in whatever interested him, before taking a job at Martin Marietta, helping design the MGM-31 Pershing, and ending up at NASA, where he discovered the emerging computer industry. Taylor was working on projects for the Apollo program when he met JCR Licklider, known as the Johnny Appleseed of computing. Lick, as his friends called him, had written an article called Man-Computer Symbiosis in 1960 and had laid out a plan for computing that influenced many. One such person was Taylor. And so it was that Lick began at ARPA in 1962 and in 1965 succeeded in recruiting Taylor away from NASA to take his place running ARPA's Information Processing Techniques Office, or IPTO. Taylor had funded Douglas Engelbart's research on computer interactivity at Stanford Research Institute while at NASA. He continued to do so when he got to ARPA, and that project resulted in the invention of the computer mouse and the Mother of All Demos, one of the most inspirational moments and a turning point in the history of computing. They also funded a project to develop an operating system called Multics. This was a two-million-dollar project run by General Electric, MIT, and Bell Labs.
Run through Project MAC at MIT, there were just too many cooks in the kitchen. Later, some of those Bell Labs cats would just do their own thing. Ken Thompson had worked on Multics and took the best and worst of it into account when he wrote the first lines of Unix and the B programming language, and then came one of the most important languages of all time, C. Interactive graphical computing and operating systems were great, but IPTO - and so Bob Taylor and team - would fund, straight out of the Pentagon, the ability for one computer to process information on another computer. Which is to say, they wanted to network computers. It took a few years, but eventually they brought in Larry Roberts, and by late 1968 they'd run an RFQ and awarded the contract to build the network to a company called Bolt Beranek and Newman (BBN), who would build the Interface Message Processors, or IMPs. The IMPs would connect a number of sites and route traffic, and the first one went online at UCLA in 1969, with additional sites coming online frequently over the next few years. That system would become ARPANET, the commonly accepted precursor to the Internet. There was another networking project going on at the time that was also getting funding from ARPA as well as the Air Force: PLATO, out of the University of Illinois. PLATO was meant for teaching and had begun in 1960, but by then they were on version IV, running on a CDC Cyber, and the time-sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch-screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. Then things get weird. Taylor was sent to Vietnam as a civilian, although his rank equivalent would be a brigadier general. He helped develop the Military Assistance Command in Vietnam. Battlefield operations and reporting were entering the computing era. Only problem is, while Taylor was a war veteran and had been deep in the defense research industry for his entire career, Vietnam was an incredibly unpopular war, and seeing it first hand and getting pulled into the theater of war had him ready to leave. That, combined with interpersonal problems with Larry Roberts - who was running the ARPANET project by then and chafed at Taylor being his boss without a PhD or direct research experience - sealed it. And so Taylor left ARPA and joined a project ARPA had funded at the University of Utah. There, he worked with Ivan Sutherland, who wrote Sketchpad and is known as the Father of Computer Graphics, until he got another offer. This time, from Xerox, to go to their new Palo Alto Research Center, or PARC. One rising star in the computer research world was pretty against the idea of a centralized, mainframe-driven time-sharing system. This was Alan Kay. In many ways, Kay was like Lick. And unlike the time-sharing projects of the day, the Licklider and Kay inspiration was for dedicated cycles on processors. This meant personal computers. The Mansfield Amendment in 1973 banned general research by defense agencies. This meant that ARPA funding started to dry up and the scientists working on those projects needed a new place to fund their playtime. Taylor was able to pick the best of the scientists he'd helped fund at ARPA. He helped bring in people from Stanford Research Institute, where they had been working on the oN-Line System, or NLS.
This new Computer Science Laboratory landed people like Charles Thacker, David Boggs, Butler Lampson, and Bob Sproull, and would develop the Xerox Alto, the inspiration for the Macintosh. The Alto contributed the very ideas of overlapping windows, icons, menus, cut and paste, and word processing. In fact, Charles Simonyi from PARC would work on Bravo before moving to Microsoft to spearhead Microsoft Word. Bob Metcalfe on that team was instrumental in developing Ethernet so workstations could communicate with ARPANET all over the growing campus-connected environments. Metcalfe would leave to form 3Com. SuperPaint would be developed there, and Alvy Ray Smith would go on to co-found Pixar, continuing the work begun by Richard Shoup. They developed the laser printer and some of the ideas that ended up in TCP/IP, and their research into page layout languages would end up with Chuck Geschke, John Warnock, and others founding Adobe. Kay would bring us the philosophy behind the Dynabook, which decades later would effectively become the iPad. He would also develop Smalltalk with Dan Ingalls and Adele Goldberg, ushering in the era of object-oriented programming. They would do pioneering work on VLSI semiconductors, ubiquitous computing, and anything else to prepare the world to mass-produce the technologies that ARPA had been spearheading for all those years. Xerox famously did not mass-produce those technologies. Nor could they have cornered the market on all of them. The coming waves were far too big for one company alone. And so it was that PARC, unable to bring the future to the masses fast enough to impact earnings per share, got a new director in 1983, and William Spencer was yet another of the three bosses that Taylor clashed with. Some resented that he didn't have a PhD in a world where everyone else did. Others resented the close relationship he maintained with the teams. Either way, Taylor left PARC in 1983 and many of the scientists left with him. It's both a curse and a blessing to learn more and more about our heroes. Taylor was one of the finest minds in the history of computing. His tenure at PARC certainly saw a lot of innovation, and one of the most innovative teams ever assembled. But as many of us who have been put into positions of leadership know, it's easy to get caught up in the politics. I am ashamed every time I look back and see examples of building political capital at the expense of a project, or of letting an interpersonal problem get in the way of the greater good for a team. But also, we're all human, and the people that I've interviewed seem to match the accounts I've read in other books. And so Taylor's final stop was Digital Equipment Corporation, where he was hired to form their Systems Research Center in Palo Alto. They brought us the AltaVista search engine, the Firefly computer, Modula-3, and a few other advances. Taylor retired in 1996; DEC was acquired by Compaq in 1998, and when they were acquired by HP, the SRC would get merged with other labs at HP. From ARPA to Xerox to Digital, Bob Taylor certainly left his mark on computing. He had a knack for seeing the forest for the trees and inspired engineering feats the world is still wrestling with how to bring to fruition. Raw, pure science. He died in 2017. He worked with some of the most brilliant people in the world at ARPA.
He inspired passion, and sometimes drama, in what Stanford's Donald Knuth called “the greatest by far team of computer scientists assembled in one organization.” In his final email to his friends and former coworkers, he said, “You did what they said could not be done, you created things that they could not see or imagine.” The Internet, the personal computer, the tech that would go on to become Microsoft Office, object-oriented programming, laser printers, tablets, ubiquitous computing devices. So no, he wasn't understating what they accomplished out of some false sense of humility. I guess you can't do that often if you're going to inspire the way he did. So feel free to abandon the pretense as well, and go inspire some innovation. Heck, who knows where the next wave will come from. But if we aren't working on it, it certainly won't come. Thank you so much and have a lovely, lovely day. We are so lucky to have you join us on yet another episode.
For this episode, Pablo, Ariel, and Luis offer an interesting analysis of the letter the AEC community sent directly to Autodesk CEO Andrew Anagnost, commenting on its points and developing their opinions. There are also announcements about the Ivan Sutherland series coming soon to the Shared Coordinates lineup. The BrainStorm section arrives loaded with information: this time Víctor and Luis interview Alex Ziebert, Lead Marketing Manager for Nvidia in Latin America, and friend of the podcast Ninja Pollo joins in to demystify what is happening in the world of GPUs and share the perspective of a hardware manufacturer like Nvidia. The Diary of a BIM Manager section brings us closer to the episode's central topic through the eyes and experience of David Barco, from Spain and Europe. Plus BIM Reflections, BIM Events, Who's Who, and interesting available resources related to the hardware and software theme. BIMnomad travels from Michigan to Manchester to bring us an interview with Tamara Briseño covering the human factor in BIM and its correlation with tools and selection processes. We close the show with TurnOff, community comments, greetings, and good vibes. ////////// PODCAST NOTES ////////// AEC Magazine: https://bit.ly/30wr2Wb Nvidia Omniverse: https://bit.ly/3gzT3lj Tortilla Squad: https://bit.ly/2C7CDSo Diario de un BIM Manager: https://bit.ly/33uQxcv WhatsApp: +1 619 535 6032
Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we're going to cover yet another of the groundbreaking technologies to come out of MIT: Sketchpad. Ivan Sutherland is a true computer scientist. After getting his master's from Caltech, he migrated to the land of the hackers and got a PhD from MIT in 1963. The great Claude Shannon supervised his thesis and Marvin Minsky was on the thesis review committee. But he wasn't just surrounded by awesome figures in computer science; he would develop a critical piece between the Memex in Vannevar Bush's “As We May Think” and the modern era of computing: graphics. What was it that propelled him from PhD candidate to becoming the father of computer graphics? The 1962-1963 development of a program called Sketchpad. Sketchpad was the ancestor of the GUI, object-oriented programming, and computer graphics. In fact, it was the first graphical user interface. And it was all made possible by the TX-2, a computer developed at the MIT Lincoln Laboratory by Wesley Clark and others. The TX-2 was transistorized, and so it was fast. Fast enough to be truly interactive. A lot of innovative work had come with the TX-0, and that program would effectively spin off as Digital Equipment Corporation and the PDP series of computers. So it was bound to inspire a lot of budding computer scientists to build some pretty cool stuff. Sutherland's Sketchpad used a light pen. These were photosensitive devices that worked like a stylus but sensed the light from the display, letting the computer work out which dots on the cathode ray tube (CRT) the pen was pointed at. Users could draw shapes on a screen for the first time. Whirlwind at MIT had allowed highlighting objects, but this graphical interface to create objects was a new thing altogether: inputting data into a computer as an object instead of loading it as code, as could then be done using punch cards. Suddenly the computer could be used for art. There were toggle-able switches that made lines bigger. The extra memory that was pretty much only available in the hallowed halls of government-funded research in the 60s opened up so many possibilities. Suddenly, computer-aided design, or CAD, was here. Artists could create a master drawing and then additional instances on top, with changes to the master reverberating through each instance. They could draw lines, concentric circles, change ratios. And it would be two decades before MacPaint would bring the technology into homes across the world. And of course AutoCAD would follow, making Autodesk one of the greatest software companies in the world. The impact of Sketchpad would be profound. Sketchpad would be another of Doug Engelbart's inspirations when building the oN-Line System, and there are clear correlations in the human interfaces. For more on NLS, check out the episode of this podcast called the Mother of All Demos, or watch it on YouTube. And Sutherland's work would inspire the next generation: people who read his thesis, as well as his students and coworkers. Sutherland would run the Information Processing Techniques Office for the US Defense Department's Advanced Research Projects Agency after Lick returned to MIT. He also taught at Harvard, where he and his students developed the first virtual reality system in 1968, years before VPL Research patented similar technology in 1984.
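That master/instance idea maps neatly to code. Here's a toy Python sketch - with purely illustrative names, nothing like Sketchpad's actual ring-list data structures - of how edits to a master drawing reverberate through every instance:

```python
# Toy sketch of Sketchpad's master/instance idea: instances hold a
# reference to a shared master, so editing the master once updates all.

class Master:
    def __init__(self, points):
        self.points = points          # the shared definition

class Instance:
    def __init__(self, master, dx, dy):
        self.master = master          # a reference, not a copy
        self.dx, self.dy = dx, dy     # per-instance placement
    def render(self):
        return [(x + self.dx, y + self.dy) for x, y in self.master.points]

square = Master([(0, 0), (1, 0), (1, 1), (0, 1)])
a, b = Instance(square, 0, 0), Instance(square, 5, 5)
square.points[2] = (2, 2)             # edit the master once...
print(a.render(), b.render())         # ...and both instances reflect it
```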
Sutherland then went to the University of Utah, where he taught Alan Kay, who gave us object-oriented programming in Smalltalk and the concept of the tablet in the Dynabook, and Ed Catmull, who co-founded Pixar, among many other computer graphics pioneers. He founded Evans & Sutherland with David Evans, the man who built the computer science department at the University of Utah, and their company launched the careers of John Warnock, the founder of Adobe, and Jim Clark, the founder of Silicon Graphics. His next company would be acquired by Sun Microsystems and become Sun Labs. He would remain a Vice President and fellow at Sun and a visiting scholar at Berkeley. For Sketchpad and his other contributions to computing, he would be awarded a Computer Pioneer Award, become a fellow at the ACM, receive a John von Neumann Medal, receive the Kyoto Prize, become a fellow at the Computer History Museum, and receive a Turing Award. I know we're not supposed to make a piece of software an actor in a sentence, but thank you, Sketchpad. And thank you, Sutherland. And his students and colleagues who continued to build upon his work.
The Einstein of interactivity, Ivan Sutherland conceived of ‘drawing’ onto a computer screen - going on to create the ‘ultimate display’.
Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today we're going to cover the first real object-oriented programming language, Smalltalk. Many people outside of the IT industry would probably know the terms Java, Ruby, or Swift. But I don't think I've encountered anyone outside of IT that has heard of Smalltalk in a long time. And yet… Smalltalk influenced most languages in use today and even a lot of the base technologies people would readily identify with. As with PASCAL from Episode 3 of the podcast, Smalltalk was designed and created in part for educational use, but more so for constructionist learning for kids. Smalltalk was first designed at the Learning Research Group (LRG) of Xerox PARC by Alan Kay, Dan Ingalls, Adele Goldberg, Ted Kaehler, Scott Wallace, and others during the 1970s. Alan Kay had coined the term object-oriented programming in the late 60s. Kay took the lead on a project which developed an early mobile device called the Dynabook at Xerox PARC, as well as the Smalltalk object-oriented programming language. The first release was called Smalltalk-72 and was really the first real implementation of this weird new programming philosophy Kay had called object-oriented programming. Although… Smalltalk was inspired by Simula 67, from Norwegian developers Kristen Nygaard and Ole-Johan Dahl. Even before that, Stewart Nelson and others from MIT had been using a somewhat object-oriented model when working on Lisp and other programs. Kay had heard of Simula and how it handled passing messages, and wrote the initial Smalltalk in a few mornings. He'd go on to work with Dan Ingalls to help with implementation and Adele Goldberg to write documentation. This was Smalltalk 71. Object-oriented programming is a programming model where programs are organized around data, also called objects. This is a contrast to programs being structured around functions and logic. Those objects can hold data fields, attributes, behaviors, etc. For example, a product you're selling can have a SKU, a price, dimensions, quantities, etc. This means you figure out what objects need to be manipulated and how those objects interact with one another. Objects are generalized as a class of objects. These classes define the kind of data and the logic used when manipulating data. Within those classes, there are methods, which define the logic and interfaces for object communication, known as messages. As programs grow and people collaborate on them together, an object-oriented approach allows projects to more easily be divided up among various team members to work on different parts. Parts of the code are more reusable. The way programs are laid out is more efficient. And in turn, the code is more scalable. Object-oriented programming is based on a few basic principles. These days those are interpreted as encapsulation, abstraction, inheritance, and polymorphism. Although to Kay, encapsulation and messaging are the most important aspects, and all the classing and subclassing isn't nearly as necessary. Most modern languages that matter are based on these same philosophies, such as Java, JavaScript, Python, C++, .NET, Ruby, Go, Swift, etc. Although Go is arguably not really object-oriented because there's no type hierarchy and some other differences, but when I look at the code it looks object-oriented!
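To ground the ideas above - objects bundling data and behavior, classes, messages, inheritance, polymorphism - here's a minimal sketch in Python rather than Smalltalk, fleshing out the product example from the text. The Product class and its fields are purely illustrative:

```python
class Product:
    """Data (attributes) and behavior (methods) bundled into one object."""
    def __init__(self, sku, price, quantity):
        self.sku = sku              # encapsulated state
        self.price = price
        self.quantity = quantity

    def restock(self, amount):      # a "message" the object responds to
        self.quantity += amount

    def total_value(self):
        return self.price * self.quantity

class PerishableProduct(Product):
    """Inheritance: a subclass specializes the general class."""
    def __init__(self, sku, price, quantity, shelf_life_days):
        super().__init__(sku, price, quantity)
        self.shelf_life_days = shelf_life_days

# Polymorphism: any object answering the same messages can stand in.
inventory = [Product("CHAIR-1", 120.0, 4), PerishableProduct("MILK-2", 2.5, 30, 10)]
print(sum(item.total_value() for item in inventory))  # 555.0
```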
So there was this new programming paradigm emerging, and Alan Kay really let it shine in Smalltalk. At the time, Xerox PARC was in the midst of revolutionizing technology. The MIT hacker ethic had seeped out to the west coast with John McCarthy's AI lab, SAIL, at Stanford and got all mixed into the fabric of chip makers in the area, such as Fairchild. That Stanford connection is important. The Augmentation Research Center is where Engelbart introduced the NLS computer and invented the mouse. And that work resulted in advances like hypertext links, back in the 60s. Many of those Stanford Research Institute people left for Xerox PARC. Ivan Sutherland's work on Sketchpad was known to the group, as was the mouse from NLS, and because the computing community that was into research was still somewhat small, most were also aware of the graphic input language, or GRAIL, that had come out of Rand. Sketchpad had handled each drawing element as an object, making it a predecessor to object-oriented programming. GRAIL ran on the Rand Tablet and could recognize letters, boxes, and lines as objects. Smalltalk was meant to show a dynamic book. Kinda like the epub format that iBooks uses today. The use of objects similar to those in Sketchpad and GRAIL just made sense. One evolution led to another and another, from Lisp and the batch methods that came before it through to modern models. But the Smalltalk stop on that model railroad was important. Kay and the team gave us some critical ideas. Things like overlapping windows. These were made possible by the inheritance model of execution, a standard class library, and a code browser and editor. This was one of the first development environments that looked like a modern version of something we might use today, like an IntelliJ or an Eclipse for Java developers. Smalltalk was the first implementation of the Model View Controller, in 1979, a pattern that is now standard for designing graphical software interfaces. MVC divides program logic into the Model, the View, and the Controller in order to separate how data is represented internally from how it is presented. Decoupling the model from the view and the controller allows for much better reuse of libraries of code as well as much more collaborative development (a minimal sketch of the pattern follows at the end of this passage). Another important thing happened at Xerox in 1979, as they were preparing to give Smalltalk to the masses. There are a number of different interpretations of the stories about Steve Jobs and Xerox PARC. But in 1979, Jobs was looking at how Apple would evolve. Andy Hertzfeld and the original Mac team were mostly already at Apple, but Jobs wanted fresh ideas and traded a million bucks in Apple stock options to Xerox for a tour of PARC. The Lisa team came with him and got to see the Alto. The Alto prototype was part of the inspiration for a GUI-based Lisa and Mac, which of course inspired Windows and many advances since. Smalltalk was finally released to other vendors and institutions in 1980, including DEC, HP, Apple, and Berkeley. From there a lot of variants have shown up. Instantiations partnered with IBM, and in 1984 Tektronix had the first commercial version. A few companies tried to take Smalltalk to the masses, and by the late 80s vendors were starting to add SQL connectivity. The Smalltalk companies often had "object" or "visual" in their names. This is a great leading indicator of what Smalltalk is all about. It's visual and it's object-oriented.
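As promised, here's that minimal sketch of the Model-View-Controller split - plain Python with made-up class names, not the original Smalltalk-80 classes - just to show how the three roles decouple:

```python
class CounterModel:                  # Model: owns the data
    def __init__(self):
        self.count = 0
        self.observers = []
    def increment(self):
        self.count += 1
        for view in self.observers:  # notify views of the change
            view.render(self)

class CounterView:                   # View: presentation only
    def render(self, model):
        print(f"count = {model.count}")

class CounterController:             # Controller: interprets user input
    def __init__(self, model):
        self.model = model
    def handle_click(self):
        self.model.increment()

model = CounterModel()
model.observers.append(CounterView())
CounterController(model).handle_click()  # prints: count = 1
```

Because the view only observes and the controller only forwards input, either can be swapped out - a different rendering, a different input device - without touching the model, which is the reuse the text describes.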
Those companies slowly merged into one another and went out of business through the 90s. Instantiations was acquired by Digitalk. ParcPlace owed its name to where the language was created. The biggest survivor was ObjectShare, which was traded on NASDAQ, peaking at $24 a share, until 1999. From an LA Times article: “ObjectShare Inc. said its stock has been delisted from the Nasdaq national market for failing to meet listing requirements. In a press release Thursday, the company said it is appealing the decision.” And while the language is still maintained by companies like Instantiations, in the heyday there was even a version from IBM called IBM VisualAge Smalltalk. And of course there were combo-language abominations, like a Smalltalk Java add-on. Just trying to breathe some life in. This was the era when FileMaker, FoxPro, and Microsoft Access were giving developers the ability to quickly build graphical tools for managing data, the next generation past what Smalltalk provided. And on the larger side, products like JDS, Oracle, and PeopleSoft really jumped to prominence. And on the education side, the industry segmented into learning management systems and various application vendors. Until iOS and Google came along, and apps for those platforms became all the rage. Smalltalk does live on in other forms though. As with many dying technologies, an open source version of Smalltalk came along in 1996. Squeak was written by Alan Kay, Dan Ingalls, Ted Kaehler, Scott Wallace, John Maloney, Andreas Raab, and Mike Rueger, and continues today. I've tinkered with Squeak here and there, and I have to say that my favorite part is just getting to see the work of people who actually, truly care about teaching languages to kids. And how some have been doing that for 40 years. A great quote from Alan Kay, discussing a parallel between Vannevar Bush's “As We May Think” and the advances they made to build the Dynabook: If somebody just sat down and implemented what Bush had wanted in 1945, and didn't try and add any extra features, we would like it today. I think the same thing is true about what we wanted for the Dynabook. There's a direct path from some of the developers of Smalltalk to deploying MacBooks and Chromebooks in classrooms. And the influence these more mass-marketed devices have will be felt for generations to come. Even as we move on from object-oriented programming to new models and new languages. The research that went into these early advances, and the continued adoption and research, have created a new world of teaching. At first we just wanted to teach logic and fundamental building blocks. Now kids are writing code. This might be writing Java programs in robotics classes, HTML in Google Classroom, or beginning iOS apps in Swift Playgrounds. So until the next episode, think about this: Vannevar Bush pushed for computers to help us think, and we have all of the world's data at our fingertips. With all of the people coming out of school that know how to write code today, with the accelerometers, with the robotics skills, what is the next stage of synthesizing all human knowledge and truly making computers help with “As We May Think”? So thank you so very much for tuning into another episode of the History of Computing Podcast. We're lucky to have you. Have a great day!
Today my conversation is with David Smith. He's the CEO and Founder of Croquet Studios. David Smith is a computer scientist and entrepreneur who has focused on interactive 3D, and on using 3D as a basis for new user environments and entertainment, for over thirty years. His specialty is system design and advanced user interfaces. He is a pioneer in 3D graphics, robotics, telepresence, artificial intelligence (AI) and augmented reality (AR). He creates world-class teams and ships impossible products. In 1987, Smith created The Colony, the very first real-time 3D adventure game/shooter and the precursor to today's first-person shooters. The game was developed for the Apple Macintosh and won the "Best Adventure Game of the Year" award from MacWorld Magazine. In 1990, Smith founded Virtus Corporation and developed Virtus Walkthrough, the first real-time 3D design application for personal computers. Virtus Walkthrough won the very first MacWorld/MacUser Breakthrough Product of the Year. David was Chief Innovation Officer at Lockheed Martin and a Senior Fellow at Lockheed Martin MST, focused on next-generation, human-centric computing and collaboration platforms. There he developed a number of key technologies and won the Lockheed Martin TLS Inventor of the Year award four years running (every year he was eligible). What's really, really interesting is that he worked closely with authors Tom Clancy (Rainbow Six, Hunt for Red October) and Michael Crichton (Andromeda Strain, Jurassic Park) to develop games. But that's only the beginning. . . . David believes that the year 1968 was the most critical year in computer science. In that one year, three key individuals launched what he considers, and what he's continuing to build upon, the goal of enhancing humans' ability to solve hard problems by using computers to think in a different way. Again: enhancing humans' ability to solve hard problems by using computers to think in a different way. He's building upon the work of, really, the pioneers of the internet - Doug Engelbart, Alan Kay, and Ivan Sutherland - all connected to the Xerox Alto project from a long time ago, close to 50 years ago. Some of these breakthroughs even amazed Steve Jobs, as you can see in YouTube videos from years ago where he is stunned as he looks at the Xerox Alto project. At that time, what really stuck out for Steve Jobs was the GUI. This was really that first graphical interface between a computer and a human. David's passion is to continue to use his skills, competencies, and capabilities in 3D engineering and design. His goal is to develop applications, systems, and platforms that are really going to transform how we use computers and solve big problems in the coming years. He's exploring the use of 3D and graphical situations that we can't even imagine right now, and problem solving, using computers to tackle interesting challenges and complex problems moving forward. So, with that, I wanted to introduce you to my conversation and wonderful interview with David Smith. Read the full transcript here Major Take-Aways from This Episode: What is an Augmented Conversation? The future of turning a computer into a vehicle to exchange ideas in real time and in sophisticated areas. How a fusion of ideas reinvents and redefines the vision of what computing is. How to get in touch with David A.
Smith: LinkedIn Blog Twitter YouTube Resources Referenced: Platform for the Future of AR & VR | David Smith | TEDxBeacon Street ARIA - AR in Action, David Smith Croquet Demo, Part 1, David A. Smith Croquet Demo, Part 2, David A. Smith This episode is sponsored by the CIO Innovation Forum, dedicated to Business Digital Leaders who want to be a part of the 20% of the planet and help their businesses win with innovation and transformation. Credits OUTRO music provided by Bensound: http://www.bensound.com/ Other Ways to Listen to the Podcast iTunes | Libsyn | Soundcloud | RSS | LinkedIn Leave a Review If you enjoyed listening to my podcast, please take a minute to leave a review here! Click here for instructions on how to leave an iTunes review if you're doing this for the first time. About Bill Murphy Bill Murphy is a world-renowned IT Security Expert dedicated to your success as an IT business leader. Follow Bill on LinkedIn and Twitter.
Ed Catmull has been at the forefront of computer graphics since his time at the University of Utah. His position as CTO and one of the two founding members of Pixar - which later evolved into a role as President of Disney Animation Studios and Pixar Animation Studios - is just the tip of the iceberg when describing Ed's career. It is no exaggeration to say Catmull is part of the fabric of the 20th and early 21st centuries. That is, via technical achievements like subdivision surfaces (work for which he was awarded an Oscar by The Academy of Motion Picture Arts and Sciences) and through the stories created by Pixar. The people Ed has called peers throughout his life include many of the architects of personal computing as we know it, such as Alan Kay, as well as iconic figures like Steve Jobs and George Lucas. Ed sits down with Adam and the Remotely Interested Podcast to discuss his life in animation and computing. Part one of the interview includes his time at the University of Utah; ARPA and DARPA; Tubby the Tuba; Lucasfilm, Star Wars and George Lucas; Steve Jobs; and the grandfather of AR/VR, Ivan Sutherland. Creativity Inc: http://www.creativityincbook.com/about/ Oscar: https://www.hollywoodreporter.com/behind-screen/oscars-sci-tech-awards-ed-catmull-john-knoll-receive-honors-1168791 Walt Disney Innovator: https://www.pbs.org/wgbh/theymadeamerica/whomade/disney_hi.html University of Utah School of Computing: https://en.wikipedia.org/wiki/University_of_Utah_School_of_Computing Sword of Damocles: https://youtu.be/eVUgfUvP4uk Gertie The Dinosaur: https://youtu.be/TGXC8gXOPoU Computer Animated Hand: https://youtu.be/fAhyBfLFyNA Tubby The Tuba: https://youtu.be/ENKHzTREALo Droid Maker: https://www.droidmaker.com/ Pixar Image Computing: https://youtu.be/PhhGfdkK9Ek Alan Kay https://en.wikipedia.org/wiki/Alan_Kay
Hello. Today we're going to talk about Professor Ivan Sutherland, the godfather of computer graphics. The technology he began developing in the 1960s forms the foundation of the computer-based digital design, virtual reality, and rendering we use today. We'll briefly go over that work, and then talk about his life. The first virtual reality system (Ivan Sutherland). No. 72 - Ivan Sutherland, the Godfather of Computer Graphics - podcast episode
CADnoob Podcast #2, July 2016: a noob history of CAD. In this episode CADnoob and Jess look into a little bit of the history of CAD and Autodesk. Andygator Tips: VIEWRES; change .bak to .dwg. Links. Inspiration for the episode: http://www.autodesk.com/campaigns/inspired-by-autocad/cad-innovation Additional podcast with history: http://www.learncadd.info/Podcast.html Ivan Sutherland's Sketchpad: https://archive.org/details/AlanKeyD1987 https://www.youtube.com/watch?feature=player_embedded&v=USyoT_Ha_bA https://www.youtube.com/watch?feature=player_embedded&v=BKM3CmRqK2o John Walker: https://www.fourmilab.ch/autofile/ Mike Riddle: http://www.michaelriddle.com/ http://www.digibarn.com/stories/mike-riddle/index.html Byte magazine (page 172 for CAD): https://archive.org/details/byte-magazine-1984-01 Additional CAD history: http://mbinfo.mbdesign.net/CAD-History.htm http://www.cadazz.com/cad-software-history.htm http://design.osu.edu/carlson/history/lesson10.html#mcs Check out some old CAD running (circa 1980): http://autodesk.blogs.com/between_the_lines/2013/11/friday-flashback-run-1980s-software.html
Reality - what for? In this installment we talk about virtual reality: what it is, what its scope is, and what applications it offers. To that end, we review its great contributors and devices, such as the first stereoscope, Nintendo's failed Virtual Boy, and the novel Oculus Rift. To close, the highlight was @EmbassyCat, Assange's new companion during his stay at the Ecuadorian embassy in London.
TAG Interview with Tom Sito - 2. Find all TAG Interviews on the TAG website at this link. TAG's President Emeritus Tom Sito (who is also an animator, director, storyboard artist and college professor) has written a fine book on the history of Computer Generated Imagery: Computer graphics (or CG) has changed the way we experience the art of moving images. Computer graphics is the difference between Steamboat Willie and Buzz Lightyear, between ping pong and PONG. It began in 1963 when an MIT graduate student named Ivan Sutherland created the first true computer animation program. Instead of presenting a series of numbers, Sutherland's Sketchpad program drew lines that created recognizable images. Sutherland noted: "Since motion can be put into Sketchpad drawings, it might be exciting to try making cartoons." This book, the first full-length history of CG, shows us how Sutherland's seemingly offhand idea grew into a multibillion-dollar industry. ... And Tom takes us through that long-ago beginning to right now. ... This is Tom's second TAG podcast. The first covered his animation career; the second is centered on his just-released book and the history of CG. (This audio interview is broken into three parts of thirty minutes each. Starting tomorrow, the video/YouTube versions - each 45 minutes in length - will appear. So choose your format.)