Fri, 01 Mar 2024 - A Layer of Proscenium (Reconcilable Differences 229) - Merlin Mann and John Siracusa - http://relay.fm/rd/229
Follow-up, Billy Joel, and Apple Vision Pro. Subtitle: You have desk at home.
This episode of Reconcilable Differences is sponsored by: Squarespace: Save 10% off your first purchase of a website or domain using code DIFFS.
Links and Show Notes: John doesn't like how Merlin starts the show but, unsurprisingly, has nothing better to offer. Counting is discussed. Then, there's a tangent on the peculiar joys of long-running video series. Then there's a lot of really good listener Follow-Up, including new Things It Took You Too Long to Realize plus more examples of movies that don't seem to understand the sport that they're about. Merlin confesses that Ry is his only friend. Next up is some fascinating FU on the ways non-English languages handle plurals. John demands Merlin find out whether his kid knows what a jellyroll is and directly implies that Merlin also does not know what a jellyroll is. It seems like your hosts are about to talk about Apple Vision Pro when Merlin suddenly remembers there's very important new content from Dr. William Martin Joel that requires immediate discussion. John thinks he kinda looks like a Sontaran. Then your hosts talk about Apple Vision Pro for a pretty long time. (Recorded on Tuesday, February 20, 2024)
Credits - Audio Editor: Jim Metzendorf; Admin Assistance: Kerry Provenzano; Music: Merlin Mann; The Suits: Stephen Hackett, Myke Hurley. Get an ad-free version of the show, plus a monthly extended episode.
Links: Scansion; Rebuilding the historic Tally Ho sailing ship: Sampson Boat Co - YouTube Playlist; On Cinema at the Cinema - YouTube Playlist; Tim Heidecker's On Cinema at the Cinema: An Under-Appreciated Masterpiece - YouTube ("Tim Heidecker's On Cinema at the Cinema is a web series created by Tim Heidecker and Gregg Turkington. Well, web series is reductive... it's more like a cinematic universe... the OCU?"); Hero (2002); Dual (Grammatical Number); Japanese Counters; A funny video about Japanese counters - YouTube; Language Plural Rules from the Unicode CLDR; Jelly Roll; Billy Joel: Turn the Lights Back On (Music Video); The Sunrise Wildfire of 1995; The Mother of All Demos, presented by Douglas Engelbart (1968); Ry :ducked: (@ry@duck.haus) - duck.haus
RCR Episode 269: Considered harmful - Panelists: Paul Hagstrom (hosting), Quinn Dunki, and Carrington Vanston. Topic: 1968-1969. In 1968 and 1969, we had SHRDLU, the Mother of All Demos, Go To being considered harmful, and Unix. Segments: Topic/Feedback links; Retro Computing News; Vintage Computer(-related) commercials; Retro Computing Gift Idea; Auction Picks; A2Stream file; Feedback/Discussion. Intro / Closing Song: Back to Oz by …
This week's challenge: look at old computers. You can hear the after show and support Do By Friday on Patreon! Produced and Edited by Alex Cox. Show Links: Bill Murray Accuses Phil Donahue Of Dating His Grandmother | Letterman; The inbox makeover | Macworld; Computer Chronicles - Wikipedia; Computer History Archives Project ("CHAP") - YouTube; The Computer Chronicles - YouTube; Palm VII - Wikipedia; The Mother of All Demos, presented by Douglas Engelbart (1968) - YouTube; Alex's Old Computers Playlist; Merlin's Old Computers Playlist. Recorded Wednesday, July 19, 2023. Next week's challenge: have Coxmas in July!
Almost every single person listening to this podcast right now is doing so on some sort of personal computing device. Many of the things that we consider part of a modern personal computer, such as windows, hyperlinks, a mouse, and a text editor, were all released upon the world in a single 90-minute demo in 1968. The ideas were so advanced it would take over two decades before most of them found themselves in everyone's homes. Learn more about the Mother of All Demos and the birth of personal computing, on this episode of Everything Everywhere Daily. Sponsors Expedition Unknown Find out the truth behind popular, bizarre legends. Expedition Unknown, a podcast from Discovery, chronicles the adventures of Josh Gates as he investigates unsolved iconic stories across the globe. With direct audio from the hit TV show, you'll hear Gates explore stories like the disappearance of Amelia Earhart in the South Pacific and the location of Captain Morgan's treasure in Panama. These authentic, roughshod journeys help Gates separate fact from fiction and learn the truth behind these compelling stories. InsideTracker provides a personal health analysis and data-driven wellness guide to help you add years to your life—and life to your years. Choose a plan that best fits your needs to get your comprehensive biomarker analysis, customized Action Plan, and customer-exclusive healthspan resources. For a limited time, Everything Everywhere Daily listeners can get 20% off InsideTracker's new Ultimate Plan. Visit InsideTracker.com/eed. Subscribe to the podcast! https://link.chtbl.com/EverythingEverywhere?sid=ShowNotes -------------------------------- Executive Producer: Charles Daniel Associate Producers: Peter Bennett & Thor Thomsen Become a supporter on Patreon: https://www.patreon.com/everythingeverywhere Update your podcast app at newpodcastapps.com Discord Server: https://discord.gg/UkRUJFh Instagram: https://www.instagram.com/everythingeverywhere/ Facebook Group: https://www.facebook.com/groups/everythingeverywheredaily Twitter: https://twitter.com/everywheretrip Website: https://everything-everywhere.com/ Learn more about your ad choices. Visit megaphone.fm/adchoices
Douglas Engelbart invented the 21st Century! In 1968 his Mother of All Demos showed people working together in social networks, using computer screens with a graphical user interface, a mouse, word processors, hyperlinks, online document sharing, and video-conferencing, as part of his plan to Augment Human Intellect. Produced and hosted by Ian Woolf. Support Diffusion by making a contribution.
When we peel back the layers of the stack, there's one human characteristic we're sure to find: errors. Mistakes, mishaps, and miscalculations are fundamental to being human, and as such, error is built into every piece of infrastructure and code we create. Of course, learning from our errors is critical in our effort to create functional, reliable tech. But could our mistakes be as important to technological development as our ideas? And what happens when we try to change our attitude towards errors… or remove them entirely? In this fascinating episode of Traceroute, we start back in 1968, when "The Mother of All Demos" was supposed to change the face of personal computing… before the errors started. We're then joined by Andrew Clay Shafer, a DevOps pioneer who has seen the evolution of "errors" to "incidents" through practices like Scrum, Agile, and Chaos Engineering. We also speak with Courtney Nash, a cognitive neuroscientist and researcher whose Verica Open Incident Database (VOID) has changed the way we look at incident reporting. Additional Resources: Connect with Amy Tobey: LinkedIn or Twitter; Connect with Fen Aldrich: LinkedIn or Twitter; Connect with John Taylor on LinkedIn; Connect with Courtney Nash on Twitter; Connect with Andrew Clay Shafer on Twitter; Visit Origins.dev for more information. Enjoyed this episode? If you did, be sure to follow and share it with your friends! We'd also appreciate a five-star review on Apple Podcasts - it really helps people find the show! Traceroute is a podcast from Equinix and is a production of Stories Bureau. This episode was produced by John Taylor with help from Tim Balint and Cat Bagsic. It was edited by Joshua Ramsey and mixed by Jeremy Tuttle, with additional editing and sound design by Mathr de Leon. Our theme song was composed by Ty Gibbons.
The Mogollon culture was an indigenous culture in the Western United States and Mexico that ranged from New Mexico and Arizona to Sonora, Mexico and out to Texas. They flourished from around 200 CE until the Spanish showed up and claimed their lands. The cultures that pre-existed them date back thousands more years, although archaeology has yet to pinpoint exactly how those evolved. Like many early cultures, they farmed and foraged. As they farmed more, their homes became more permanent, and around 800 CE they began to create more durable homes that helped protect them from wild swings in the climate. We call those homes adobes today, and the people who lived in those pueblos and irrigated water, often moving higher into the mountains, we call the Puebloans - or Pueblo Peoples. Adobe homes are similar to those found in ancient cultures in what we call Turkey today. It's an independent evolution. Adobe Creek was once called Arroyo de las Yeguas by the monks from Mission Santa Clara and then renamed to San Antonio Creek by a soldier, Juan Prado Mesa, when the land around it was given to him by the governor of Alta California at the time, Juan Bautista Alvarado. That's the same Alvarado as the street if you live in the area. The creek runs for over 14 miles north from Black Mountain and through Palo Alto, California. The ranchers built their adobes close to the creeks. American settlers led the Bear Flag Revolt in 1846, and took over the garrison of Sonoma, establishing the California Republic - which covered much of the lands of the Puebloans. There were only 33 of them at first, but after John Fremont (yes, he after whom that street is named as well) encouraged the Americans, they raised an army of over 100 men and Fremont helped them march on Sutter's fort, now with the flag of the United States, thanks to Joseph Revere of the US Navy (yes, another street in San Francisco bears his name). James Polk had pushed to expand the United States. Manifest Destiny. Remember the Alamo. Etc. The fort at Monterey fell, the army marched south. Admiral Sloat got involved. They named a street after him. General Castro surrendered - he got a district named after him. Commodore Stockton announced the US had taken all of California soon after that. Manifest destiny was nearly complete. He's now basically the patron saint of a city, even if few there know who he was. The forts along the El Camino Real that linked the 21 Spanish Missions, a 600-mile road once walked by their proverbial father, Junípero Serra, following the Portolá expedition of 1769, fell. Stockton took each, moving into Los Angeles, then San Diego. Practically all of Alta California fell with few shots. This was nothing like the battles for the independence of Texas, like when Santa Anna reclaimed the Alamo Mission. Meanwhile, the waters of Adobe Creek continued to flow. The creek was renamed in the 1850s after Mesa built an adobe on the site. Adobe Creek it was. Over the next 100 years, the area evolved into a paradise with groves of trees and then groves of technology companies. The story of one begins a little beyond the borders of California. Utah was initially explored by Francisco Vázquez de Coronado in 1540 and settled by Europeans in search of furs and others who colonized the desert, including those who established the Church of Jesus Christ of Latter-day Saints, or the Mormons - who settled there in 1847, just after the Bear Flag Revolt.
The United States officially acquired the territory in 1848, and Utah became a territory and, after a number of map changes where the territory got smaller, was finally made a state in 1896. The University of Utah had been founded all the way back in 1850, though - and re-established in the 1860s. 100 years later, the University of Utah was a hotbed of engineers who pioneered a number of graphical advancements in computing. John Warnock went to grad school there and then went on to co-found Adobe and help bring us PostScript. Historically, a PS, or postscript, was a message placed at the end of a letter, following the signature of the author. The PostScript language was a language to describe a page of text computationally. It was created at Adobe by Warnock, Doug Brotz, Charles Geschke, Bill Paxton (who had worked on the Mother of All Demos with Doug Engelbart during the development of the oN-Line System, or NLS, and then at Xerox PARC), and Ed Taft. Warnock invented the Warnock algorithm while working on his PhD and went to work at Evans & Sutherland with Ivan Sutherland, who effectively created the field of computer graphics. Geschke got his PhD at Carnegie Mellon in the early 1970s and then went off to Xerox PARC. They worked with Paxton at PARC and before long, these PhDs and mathematicians had worked out the algorithms and then the languages to display images on computers while working on Interpress graphics at Xerox. Geschke left Xerox and started Adobe; Warnock joined him and they went to market with Interpress as PostScript, which became a foundation for the Apple LaserWriter to print graphics. Not only that, PostScript could be used to define typefaces programmatically and later to display any old image. Those technologies became the foundation for the desktop publishing industry. Apple released the Mac in 1984 and other vendors brought in PostScript to describe graphics in their proprietary fashion; Adobe released PostScript Level 2 by 1991 and then PostScript 3 in 1997. Other vendors made their own or furthered standards in their own ways and Adobe could have faded off into the history books of computing. But Adobe didn't create one product, they created an industry, and the company they created to support that young industry created more products in that mission. Steve Jobs tried to buy Adobe before that first Mac was released, for $5,000,000. But Warnock and Geschke had a vision for an industry in mind. They had a lot of ideas but development was fairly capital intensive, as were go-to-market strategies. So they went public on the NASDAQ in 1986. They expanded their PostScript distribution and sold it to companies like Texas Instruments for their laser printer, and other companies who made IBM-compatible computers. They got up to $16 million in sales that year. Warnock's wife was a graphic designer. This is where we see a diversity of ideas help us think about more than math. He saw how she worked and could see a world where Ivan Sutherland's Sketchpad could become much more, given how far CPUs had come since the TX-0 days at MIT. So Adobe built and released Illustrator in 1987. By 1988 they broke even on sales and it raked in $19 million in revenue. Sales were strong in the universities but PostScript was still the hot product, selling to printer companies, typesetters, and other places where Adobe signed license agreements. At this point, we see the math at work: Cartesian coordinates, drawn by geometric algorithms, put pixels where they should be.
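Since the text describes PostScript as a language for describing a page computationally, a tiny illustration may help. The sketch below is a hypothetical example in Python that simply emits a few standard PostScript operators; the typeface, coordinates, and file name are arbitrary choices, not anything from Adobe's actual tooling.

```python
# A minimal sketch of a page described as a program rather than as pixels.
# The PostScript operators (findfont, scalefont, setfont, moveto, show,
# showpage) are standard; the page content itself is an invented example.
page = "\n".join([
    "%!PS",                                      # PostScript header
    "/Helvetica findfont 24 scalefont setfont",  # pick a typeface and size
    "72 720 moveto",                             # move to a point on the page
    "(Hello, page description) show",            # paint the glyphs there
    "showpage",                                  # emit the finished page
])

with open("hello.ps", "w") as f:
    f.write(page)

print(page)
```

A PostScript printer or interpreter would execute that little program to place the glyphs, which is the sense in which the page is "computed" rather than stored as a picture.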
But while this was far more efficient than just drawing a dot at a coordinate for larger images, drawing a dot in a pixel location was still the easier technology to understand. They created Adobe Streamline in 1989 and Collector's Edition to create patterns. They listened to graphic designers and built what they heard humans wanted. Photoshop Nearly every graphic designer raves about Adobe Photoshop. That's because Photoshop is the best-selling graphics editing tool, one that has matured far beyond most other traditional solutions and now has thousands of features that allow users to manipulate images in practically any way they want. Adobe Illustrator was created in 1987 and quickly became the de facto standard in vector-based graphics. Photoshop began life in 1987 as well, when Thomas and John Knoll wanted to build a simpler tool to create graphics on a computer. Rather than vector graphics, they created a raster graphical editor. They made a deal with Barneyscan, a well-known scanner company, which managed to distribute over two hundred copies of Photoshop with their scanners, and Photoshop became a hit as it was the first editing software people heard about. Vector images are typically generated with Cartesian coordinates based on geometric formulas and so scale out more easily. Raster images are comprised of a grid of dots, or pixels, and can be more realistic. Great products are rewarded with competition. CorelDRAW was created in 1989 when Michel Bouillon and Pat Beirne built a tool to create vector illustrations. Sales got slim after other competitors entered the market, and the Knoll brothers got in touch with Adobe and licensed the product through them. The software was then launched as Adobe Photoshop 1 in 1990. They released Photoshop 2 in 1991. By now they had support for paths and, given that Adobe also made Illustrator, EPS and CMYK rasterization - still features in Photoshop. They launched Adobe Photoshop 2.5 in 1993, the first version that could be installed on Windows. This version came with a toolbar for filters and 16-bit channel support. Photoshop 3 came in 1994, and Thomas Knoll created what was probably one of the most important features ever added, one that's become a standard in graphical applications since: layers. Now a designer could create a few layers that each had their own elements and hide layers or make layers more transparent. These could separate the subject from the background and led to entirely new capabilities, like an almost three-dimensional appearance of graphics. Then came version four in 1996, one of the more widely distributed versions and very stable. They added automation, and this was later considered part of becoming a platform: open up a scripting language, or a subset of a language, so others can build tools that integrate with or sit on top of a product, thus locking people into using products once they've automated tasks to increase human efficiency. Adobe Photoshop 5.0 added editable type, which could be rasterized when needed. Keep in mind that Adobe owned technology like PostScript and so could bring technology from Illustrator to Photoshop or vice versa, and integrate with other products - like export to PDF by then. They also added a number of undo options, a magnetic lasso, and improved color management, and it was now a great tool for more advanced designers. Then in 5.5 they added a Save for Web feature, in a sign of the times. They could create vector shapes and continued to improve the user interface. Photoshop 5 was also a big jump in complexity.
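The vector-versus-raster distinction above can be made concrete with a toy sketch: a "vector" circle is just a center and a radius, and rasterizing it means sampling that formula onto a fixed grid of pixels. The grid sizes and the circle itself are invented for the example.

```python
# A toy illustration of the vector/raster distinction.
# The vector form is an equation (center + radius); rasterizing samples that
# equation onto a fixed grid of pixels, which is what a raster editor stores.
def rasterize_circle(cx, cy, r, width, height):
    rows = []
    for y in range(height):
        row = ""
        for x in range(width):
            # A pixel is "on" if it falls inside the circle's equation.
            row += "#" if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2 else "."
        rows.append(row)
    return "\n".join(rows)

# The same vector description rendered at two grid resolutions:
print(rasterize_circle(8, 8, 6, 16, 16))     # coarse grid: blocky edge
print()
print(rasterize_circle(16, 16, 12, 32, 32))  # finer grid: smoother edge
```

The same description scales cleanly to the finer grid because only the sampling changes; a raster image, by contrast, would have to be resampled from its fixed grid of dots.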
Layers were easy enough to understand, but Photoshop was meant to be a subset of Illustrator features and had become far more than that. So in 2001 they released Photoshop Elements. By now they had a large portfolio of products and Elements was meant to appeal to the original customer base - the ones who were beginners and maybe not professional designers. By now, some people spent 40 or more hours a week in tools like Photoshop and Illustrator. Adobe Today Adobe had released PostScript, Illustrator, and Photoshop. But they have one of the most substantial portfolios of products of any company. They also released Premiere in 1991 to get into video editing. They acquired Aldus Corporation to get into more publishing workflows with PageMaker. They used that acquisition to get into motion graphics with After Effects. They acquired dozens of companies and released their products as well. Adobe also released the PDF format, to describe full pages of information (or files that spread across multiple pages), in 1993, and Adobe Acrobat to use those. Acrobat became the de facto standard for page distribution so people didn't have to download fonts to render pages properly. They dabbled in audio editing when they acquired Cool Edit Pro from Syntrillium Software and so now sell Adobe Audition. Adobe's biggest acquisition was Macromedia in 2005. Here, they added a dozen new products to the portfolio, which included Flash, Fireworks, the WYSIWYG web editor Dreamweaver, ColdFusion, Flex, and Breeze, which is now called Adobe Connect. By now, they'd also created what we call Creative Suite, packages of applications that could be used for given tasks. Creative Suite also signaled a transition into a software-as-a-service, or SaaS, mindset. Now customers could pay a monthly fee for a user license rather than buy large software packages each time a new version was released. Adobe had always been a company who made products to create graphics. They expanded into online marketing and web analytics when they bought Omniture in 2009 for $1.8 billion. These products are now normalized into the naming convention used for the rest as Adobe Marketing Cloud. Flash fell by the wayside, and so the next wave of acquisitions was for more mobile-oriented products. This began with Day Software and then Nitobi in 2011. And they furthered their Marketing Cloud support with the acquisition of one of the larger competitors when they acquired Marketo in 2018, and then acquired Workfront in 2020. Given how many people started working from home, they also extended their offerings into pure-cloud video tooling with an acquisition of Frame.io in 2021. And here we see a company, started by a bunch of true computer scientists from academia in the early days of the personal computer, that has become far more. They could have been rolled into Apple but had a vision of a creative suite of products that could be used to make the world a prettier place. Creative Suite and then Creative Cloud show a move of the same tools into a more online delivery model. Other companies come along to do similar tasks, like the infinite digital whiteboard Miro - so they have to innovate to stay marketable. They have to continue to increase sales, so they expand into other markets like the most adjacent Marketing Cloud. At 22,500+ employees and with well over $12 billion in revenues, they have a lot of families dependent on maintaining that growth rate. And so the company becomes more than the culmination of their software.
They become more than graphic design, web design, video editing, animation, and visual effects. Because in software, if revenues don't grow at a rate greater than 10 percent per year, the company simply isn't outgrowing the market and likely won't be able to justify stock prices at an inflated price-to-earnings ratio, the kind that assumes explosive growth. And yet once a company saturates sales in a given market, they have shareholders to justify their existence to. Adobe has survived many an economic downturn and boom time with smart, measured growth and is likely to continue doing so for a long time to come.
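For a rough sense of why that 10 percent threshold matters, here is a back-of-the-envelope sketch; the starting revenue loosely echoes the figure mentioned above, and the growth rates are hypothetical, not Adobe guidance.

```python
# Hypothetical compound-growth comparison to make the 10% point concrete.
revenue = 12.0  # billions, a rough starting point echoing the text above
for rate in (0.05, 0.10, 0.20):
    projected = revenue * (1 + rate) ** 5   # compound growth over five years
    print(f"{rate:.0%} annual growth -> ~${projected:.1f}B after 5 years")
```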
Gutenberg shipped the first working printing press around 1450 and the typeface was born. Before then most books were handwritten, often in blackletter calligraphy. And they were expensive. The next few decades saw Nicolas Jenson develop the Roman typeface, and Aldus Manutius and Francesco Griffo create the first italic typeface. This represented a period where people were experimenting with making type that would save space. The 1700s saw the start of a focus on readability. William Caslon created the Old Style typeface in 1734. John Baskerville developed Transitional typefaces in 1757. And Firmin Didot and Giambattista Bodoni created two typefaces that would become the modern family of serifs. Then slab serif, which we now call Antique, came in 1815, ushering in an era of experimenting with using type for larger formats, suitable for advertisements in various printed materials. These were necessary as more presses were printing more books, and made possible by new levels of precision in metal-casting. People started experimenting with various forms of typewriters in the mid-1860s, and by the 1920s we got Frederic Goudy, the first real full-time type designer. Before him, it was part of a job. After him, it was a job. And we still use some of the typefaces he crafted, like Copperplate Gothic. And we saw an explosion of new fonts like Times New Roman in 1931. At the time, most typewriters used typefaces on the end of a metal shaft. Hit a key and the shaft hammers onto a strip of ink and leaves a letter on the page. Kerning, or the space between characters, and letter placement were often there to reduce the chance that those metal hammers jammed. And replacing a font would have meant replacing tons of precision parts. Then came the IBM Selectric typewriter in 1961. Here we saw precision parts that put all those letters on a ball. Hit a key and the ball rotates and presses the ink onto the paper. And the ball could be replaced. A single document could now have multiple fonts without a ton of work. Xerox was exploding around the same time with the Xerox 914, one of the most successful products of all time. Now, we could type amazing documents with multiple fonts in the same document quickly - and photocopy them. And some of the numbers on those fancy documents were being spat out by those fancy computers, with their tubes. But as computers became transistorized heading into the 60s, it was only a matter of time before we put fonts on computer screens. Here, we initially used bitmaps to render letters onto a screen. By bitmap we mean that an array of pixels on a screen is a map of bits, recording which pixels should be lit and where. We used to call these raster fonts, but the drawback was that to make characters bigger, we needed a whole new map of bits. To go to a bigger screen, we probably needed a whole new map of bits. As people thought about things like bold, underline, and italics, guess what - also a new file. But through the 50s, transistor counts weren't nearly high enough to do something different from bitmaps; they rendered very quickly and, you know, displays weren't very high quality, so who could tell the difference anyway. Whirlwind was the first computer to project real-time graphics on the screen, and its characters were simple blocky letters. But as the resolution of screens and the speed of interactivity increased, so did what was possible with drawing glyphs on screens.
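To picture what "a map of bits" means for a raster font, here is a toy sketch of a single made-up glyph stored as one byte per row, along with naive scaling that shows why enlarged bitmap characters come out blocky. The glyph data is invented for illustration.

```python
# A toy bitmap glyph: each row is one byte whose bits say which pixels are on.
# The shape (a rough letter "A") is invented for the example.
GLYPH_A = [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42]  # 7 rows of 8 bits

def draw(rows, scale=1):
    for byte in rows:
        line = "".join("#" if byte & (1 << (7 - i)) else "." for i in range(8))
        # Naive integer scaling just repeats pixels, which is why enlarged
        # bitmap fonts look blocky: there is no extra detail to recover.
        for _ in range(scale):
            print("".join(ch * scale for ch in line))

draw(GLYPH_A)            # the glyph as designed
print()
draw(GLYPH_A, scale=2)   # "bigger" only by repeating the same bits
```

Each hex value here plays the same role as the hexadecimal rows Susan Kare typed by hand, as described below.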
Rudolf Hell was a German engineer experimenting with using cathode ray tubes to project an image onto photosensitive paper, and thus print using a CRT. He designed a simple font called Digital Grotesk in 1968, loosely based on Neuzeit Book. It looked good on the CRT and on the paper, and that font would go on to be used to digitize typesetting. We quickly realized bitmaps weren't efficient for drawing fonts to screen, and by 1974 moved to outline, or vector, fonts. Here a Bézier curve was drawn onto the screen using an algorithm that created the character, or glyph, as an outline and then filled in the space between. These took up less memory and so drew on the screen faster. Those could be defined in an operating system, and were used not only to draw characters but also by some game designers to draw entire screens of information by defining a character as a block, and so taking up less memory to do graphics. These were scalable, and by 1979 another German, Peter Karow, used spline algorithms to write Ikarus, software that allowed a person to draw a shape on a screen and rasterize it. Now we could graphically create fonts that were scalable. In the meantime, the team at Xerox PARC had been experimenting with different ways to send pages of content to the first laser printers. Bob Sproull and Bill Newman created the Press format for the Star. But this wasn't incredibly flexible like what Karow would create. John Gaffney, who was working with Ivan Sutherland at Evans & Sutherland, had been working with John Warnock on an interpreter that could pull information from a database of graphics. When he went to Xerox, he teamed up with Martin Newell to create JaM, which harnessed the latest chips to process graphics and character type onto printers. As it progressed, they renamed it to Interpress. Chuck Geschke started the Imaging Sciences Laboratory at Xerox PARC and eventually left Xerox with Warnock to start a company called Adobe in Warnock's garage, which they named after a creek behind his house. Bill Paxton had worked on "The Mother of All Demos" with Doug Engelbart while getting his PhD at Stanford, and then moved to Xerox PARC. There he worked on bitmap displays, laser printers, and GUIs - and so he joined Adobe as a co-founder in 1983 and worked on the font algorithms and helped ship a page description language, along with Chuck Geschke, Doug Brotz, and Ed Taft. Steve Jobs tried to buy Adobe in 1982 for $5 million. But instead they sold him just shy of 20% of the company and got a five-year license for PostScript. This allowed them to focus on making the PostScript language more extensible, and on creating the Type 1 fonts. These had two parts: one was a set of bitmaps, and the other was a font file that could be used to send the font to a device. We see this time and time again. The simpler an interface and the more down-market the science gets, the faster we see innovative industries come out of the work done. There were lots of fonts by now. The original 1984 Mac saw Susan Kare work with Jobs and others to ship a bunch of fonts named after cities like Chicago and San Francisco. She would design the fonts on paper and then conjure up the hex (that's hexadecimal) for graphics and fonts. She would then manually type the hexadecimal notation for each letter of each font. Previously, custom fonts were reserved for high-end marketing and industrial designers. Apple considered licensing existing fonts but decided to go their own route.
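The outline-font idea above can be made concrete with a minimal sketch: one quadratic Bézier curve evaluated from three control points. The points are invented; real Type 1 and TrueType glyphs chain many such curves into closed outlines and then fill them.

```python
# A minimal sketch of an outline-font building block: a quadratic Bézier
# curve sampled from three invented control points.
def quad_bezier(p0, p1, p2, steps=8):
    points = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        points.append((round(x, 2), round(y, 2)))
    return points

# Because the curve is a formula, scaling a glyph is just scaling the control
# points: no new "map of bits" is needed at a larger size.
for pt in quad_bezier((0, 0), (50, 100), (100, 0)):
    print(pt)
```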
She painstakingly created new fonts and gave them the names of towns along train stops around Philadelphia where she grew up. Steve Jobs went for the city approach but insisted they be cool cities. And so the Chicago, Monaco, New York, Cairo, Toronto, Venice, Geneva, and Los Angeles fonts were born - with her personally developing Geneva, Chicago, and Cairo. And she did it in 9 x 7. I can still remember the magic of sitting down at a computer with a graphical interface for the first time. I remember opening MacPaint and changing between the fonts, marveling at the typefaces. I'd certainly seen different fonts in books. But never had I made a document and been able to set my own typeface! Not only that, they could be in italics, outline, and bold. Those were all her. And she inspired a whole generation of innovation. Here, we see a clean line from Ivan Sutherland and the pioneering work done at MIT, to the University of Utah, to Stanford through the oN-Line System (or NLS), to Xerox PARC, and then to Apple. But then came the rise of Windows and other graphical operating systems. As Apple's five-year license for PostScript came and went, they started developing their own font standard as a competitor to Adobe, which they called TrueType. Here we saw Times Roman, Courier, and symbols that could replace the PostScript fonts, and updates to Geneva, Monaco, and others. They may not have gotten along with Microsoft, but they licensed TrueType to them nonetheless to make sure it was more widely adopted. And in exchange they got a license for TrueImage, which was a page description language that was compatible with PostScript. Given how high-resolution screens had gotten, it was time for the birth of anti-aliasing. Here we could clean up the blocky "jaggies," as the gamers call them. Vertical and horizontal lines in the 8-bit era looked fine but distorted at higher resolutions, and so spatial anti-aliasing and then post-processing anti-aliasing were born. By the 90s, Adobe was looking for the answer to TrueImage. So 1993 brought us PDF, now an international standard in ISO 32000-1:2008. But PDF Reader and other tools were good to Adobe for many years, along with Illustrator and then Photoshop and then the other products in the Adobe portfolio. By this time, even though Steve Jobs was gone, Apple was hard at work on new font technology that resulted in Apple Advanced Typography, or AAT. AAT gave us ligature control, better kerning, and the ability to write characters on different axes. But even though Jobs was gone, negotiations between Apple and Microsoft to license AAT broke down. They were bitter competitors and Windows 95 wasn't even out yet. So Microsoft started work on OpenType, their own standardized font format, in 1994, and Adobe joined the project to ship the next generation in 1997. And that would evolve into an open standard by the mid-2000s. And once an open standard, sometimes the de facto standard, as opposed to those that need to be licensed. By then the web had become a thing. Early browsers and the wars between them to increment features meant developers had to build and test on potentially 4 or 5 different computers and often be frustrated by the results. So the W3C began standardizing how a lot of elements worked, in Extensible Markup Language, or XML. Images, layouts, colors, even fonts. SVGs are XML-based vector images. In other words, the browser interprets a language that describes the image.
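As a rough illustration of the spatial anti-aliasing just mentioned, the toy sketch below shades each pixel by how much of it an example shape covers (supersampling). The shape, grid size, and character ramp are arbitrary choices for the demo.

```python
# Toy spatial anti-aliasing by supersampling: instead of a pixel being all-on
# or all-off, its shade reflects how much of it the shape covers.
def coverage(px, py, samples=4):
    hits = 0
    for sy in range(samples):
        for sx in range(samples):
            # Sample points spread inside the pixel square.
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y > x * 0.5:          # "inside" test for an example diagonal edge
                hits += 1
    return hits / (samples * samples)

SHADES = " .:-=+*#%@"                # darker character = more coverage
for py in range(8):
    row = ""
    for px in range(16):
        row += SHADES[int(coverage(px, py) * (len(SHADES) - 1))]
    print(row)
```

Pixels that straddle the diagonal edge get intermediate shades, which is exactly how anti-aliasing softens the "jaggies" on a real display.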
That became a way to render fonts on the web as well. The Web Open Font Format, or WOFF 1, was published in 2009 with contributions by Dutch educator Erik van Blokland, Jonathan Kew, and Tal Leming. This built on the CSS font styling rules that had shipped in Internet Explorer 4 and would slowly be added to every browser shipped, including Firefox since 3.6, Chrome since 6.0, Internet Explorer since 9, and Apple's Safari since 5.1. Then WOFF 2 added Brotli compression to get sizes down and render faster. WOFF has been a part of the W3C open web standard since 2011. Out of Apple's TrueType came TrueType GX, which added variable fonts. Here, a single font file could contain a number of variants, or a range of variants, of the initial font. So a family of fonts could be in a single file. OpenType added variable fonts in 2016, with Apple, Microsoft, and Google all announcing support. And of course the company that had been there since the beginning, Adobe, jumped on board as well. Fewer font files, faster page loads. So here we've looked at the progression of fonts from the printing press, becoming more efficient to conserve paper, through the advent of the electronic typewriter, to the early bitmap fonts for screens, to the vectorization led by Adobe into the Mac and then Windows. We also see rethinking the font entirely so multiple scripts and character sets and axes can be represented and rendered efficiently. I am now converting all my user names into pig Latin for maximum security. Luckily those are character sets that are pretty widely supported. The ability to add color to pig Latin means that OpenType-SVG will allow me to add spiffy color to my glyphs. It makes us wonder what's next for fonts. Maybe being able to design our own, or, more to the point, customize those developed by others to make them our own. We didn't touch on emoji yet. But we'll just have to save the evolution of character sets and emoji for another day. In the meantime, let's think on the fact that fonts are such a big deal because Steve Jobs took a calligraphy class from a Trappist monk named Robert Palladino while enrolled at Reed College. Today we can painstakingly choose just the right font with just the right meaning because Palladino left the monastic life to marry and have a son. He taught Jobs about serif and sans serif and kerning and the art of typography. That style and attention to detail was one aspect of the original Mac that taught the world that computers could have style and grace as well. It's not hard to imagine a world where entire computers still only supported one font, or even one font per document. Palladino never owned or used a computer, though. His influence can be felt through the influence his pupil Jobs had. And it's actually amazing how many people who had such dramatic impacts on computing never really used one. Because so many smaller evolutions came after them. What evolutions do we see on the horizon today? And how many who put a snippet of code on a service like GitHub may never know the impact they have on so many?
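A minimal sketch of the variable-font idea described above: two invented "master" outlines and an interpolation between them standing in for a weight axis. Real variable fonts store masters or per-axis deltas inside one file, but the principle of computing an in-between instance is the same.

```python
# Toy variable-font interpolation: one "file" holds a light and a bold master,
# and any in-between weight is computed rather than shipped separately.
# The coordinates are invented for the example.
light = [(0, 0), (10, 100), (20, 0)]   # control points of a thin stroke
bold  = [(0, 0), (10, 100), (40, 0)]   # the same stroke drawn heavier

def instance(weight):
    """weight=0.0 gives the light master, 1.0 the bold master."""
    return [
        (l[0] + (b[0] - l[0]) * weight, l[1] + (b[1] - l[1]) * weight)
        for l, b in zip(light, bold)
    ]

for w in (0.0, 0.5, 1.0):
    print(f"weight {w}: {instance(w)}")
```

One file, many weights: that is the "fewer font files, faster page loads" trade the text describes.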
## Episode summary: In this episode I start from Douglas Engelbart's "Augmenting Human Intellect: A Conceptual Framework" and discuss the relationship between tool design and augmenting human intellect, that is, tools as a way of improving a person's ability to work through problems and make decisions. I then describe in detail the components of the augmentation system and what each contributes: language, tools, methodology, and training. Finally, drawing on the ideas in Max Weber's lecture "Science as a Vocation," I respond to the important role of guiding motifs (母题) and the practical problems of putting them to work. ### References: - Doug Engelbart's landmark report Augmenting Human Intellect: A Conceptual Framework (1962) - The Mother of All Demos, presented by Douglas Engelbart (1968) - Vannevar Bush - Dynamicland - Bret Victor - 汪丁丁, 《行为金融学讲义》 (Lectures on Behavioral Finance) - 设计乘数 #41: From the fundamental questions of behavioral and social science - Max Weber, "Science as a Vocation"
We're getting back to my hypertext series with a bit of an obscure tale. ZOG is a hypertext system that was first developed in 1972 at Carnegie Mellon University. It then stagnated until the latter half of the 1970s, when it was picked back up. By 1983 it was cruising on a US Navy aircraft carrier. ZOG presents a hypertext system with some very modern notions. But here's the part that gets me excited: ZOG was developed after Doug Engelbart's Mother of All Demos. So, in theory, ZOG should take cues from this seminal event. Right? ... right? Selected sources: https://www.campwoodsw.com/mentorwizard/PROMISHistory.pdf - History of PROMIS https://apps.dtic.mil/sti/pdfs/ADA049512.pdf - 1977 ZOG Report https://apps.dtic.mil/docs/citations/ADA158084 - 1984 USS Carl Vinson Report
We had this Mac lab in school. And even though they were a few years old at the time, we had a whole room full of Macintosh SEs. I'd been using the Apple IIcs before that, and these just felt like Isaac Asimov himself dropped them off just for me to play with. Only thing: no BASIC interpreter. But in the Apple menu, tucked away in the corner, was a little application called HyperCard. HyperCard wasn't left by Asimov, but instead burst from the mind of Bill Atkinson. Atkinson was the 51st employee at Apple and a former student of Jef Raskin, the initial inventor of the Mac before Steve Jobs took over. Steve Jobs convinced him to join Apple, where he started with the Lisa and then joined the Mac team, until he left with the team who created General Magic and helped bring shape to the world of mobile devices. But while at Apple he was on the original Mac team, developing the menu bar, the double-click, Atkinson dithering, MacPaint, QuickDraw, and HyperCard. Those were all amazing tools and many came out of his work on the original 1984 Mac and the Lisa days before that. But HyperCard was something entirely different. It was a glimpse into the future, even if self-contained on a given computer. See, there had been this idea floating around for a while. Vannevar Bush introduced the world to a device with all the world's information available in his article "As We May Think" in 1945. Doug Engelbart had a team of researchers working on the oN-Line System, and that saw him give "The Mother of All Demos" in 1968, where he showed how that might look, complete with a graphical interface and hypertext, including linked content. Ted Nelson furthered the ideas of linked content in 1969, which evolved into what we now call hyperlinks. Nelson also thought ahead to include the idea of what he called transclusions, or snippets of text displayed on the screen from their live, original source. HyperCard built on that wealth of information with a database that had a graphical front-end that allowed inserting media, and a programming language they called HyperTalk. Databases were nothing new. But a simple form creator that supported graphics, and again stressed simple, was new. Something else that was brewing was this idea of software economics. Brooks' Law laid it out, but Barry Boehm's book on Software Engineering Economics took the idea of rapid application development another step forward in 1981. People wanted to build smaller programs faster. And so many people wanted to build tools that we needed to make it easier to do so, in order for computers to make us more productive. Against that backdrop, Atkinson took some acid and came up with the idea for a tool he initially called WildCard. Dan Winkler signed onto the project to help build the programming language, HyperTalk, and they got to work in 1986. They changed the name of the program to HyperCard and released it in 1987 at MacWorld. Regular old people could create programs without knowing how to write code. There were a number of User Interface (UI) components that could easily be dropped on the screen, and true to his experience there was a panel of elements like boxes, erasers, and text, just like we'd seen in MacPaint. Suppose you wanted a button: just pick it up from the menu and drop it where it goes. Then make a little script using HyperTalk, which read more like the English language than a programming language like LISP. Each stack might be synonymous with a web page today. And a card was a building block of those stacks.
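To give a feel for the English-like, weakly typed scripting described here (and picked up again below), here is a toy interpreter; the put/add/answer commands mimic HyperTalk's flavor, but the implementation is an invented Python sketch, not Apple's HyperTalk.

```python
# A toy, HyperTalk-flavored interpreter. Every value is stored as text and is
# coerced to a number only when arithmetic needs one, echoing weak typing.
fields = {}

def run(script):
    for line in script.strip().splitlines():
        words = line.split()
        if not words:
            continue
        if words[0] == "put" and "into" in words:
            into = words.index("into")
            value = " ".join(words[1:into]).strip('"')
            fields[words[into + 1]] = value          # stored as plain text
        elif words[0] == "add":
            # "add 5 to total": coerce text to a number, add, store back as text
            fields[words[3]] = str(float(fields.get(words[3], "0")) + float(words[1]))
        elif words[0] == "answer":
            print(fields.get(words[1], words[1]))

run("""
put "3" into total
add 5 to total
answer total
""")
```

The author of such a script never declares a type; the burden of making "3" behave like a number falls on the tool, which is the trade-off discussed in the next paragraph.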
Consider the desktop metaphor extended to a rolodex of cards. Those cards can be stacked up. There were template cards, and if the background on a template changed, that flowed to each card that used the template, like styles in Keynote might today. The cards could have text fields, video, images, buttons, or anything else an author could think of. And the word author is important. Apple wanted everyone to feel like they could author a HyperCard stack or program or application or… app. Just as they do with Swift Playgrounds today. That never left the DNA. We can see that ease of use in how scripting is done in HyperTalk. Not only the word scripting rather than programming, but how HyperTalk is weakly typed. This is to say there's no strict type checking, so a variable might be used as an integer or a boolean. That either involves more work by the interpreter or compiler - or programs tend to crash a lot. Put the work on the programmers who build programming tools rather than the authors of HyperCard stacks. The ease of use and visual design made HyperCard popular instantly. It was the first of its kind. It didn't compile at first, and larger stacks got slow because HyperTalk was interpreted, so the team added a just-in-time compiler in 1989 with HyperCard 2.0. They also added a debugger. There were some funny behaviors. Like some cards could have objects that other cards in a stack didn't have. This led to many a migration woe for larger stacks that moved into modern tools. One that could almost be considered HyperCard 3 was FileMaker. Apple spun their software business out as Claris, who bought Nashoba Systems, which had this interesting little database program called Nutshell. That became FileMaker in 1985. By the time HyperCard was ready to become 3.0, FileMaker Pro was launched in 1990. Attempts to make HyperCard 3.0 were still made, but HyperCard had had its run by the mid-1990s and died a nice quiet death. The web was here and starting to spread. The concept of a bunch of stacks on just one computer had run its course. Now we wanted pages that anyone could access. HyperCard could have become that, but that isn't its place in history. It was a stepping stone and yet a milestone and a legacy that lives on. Because it was a small tool in a large company. Atkinson and some of the other team that built the original Mac were off to General Magic. Yet there was still this idea, this legacy. HyperCard's interface inspired many modern applications we use to create applications. The first was probably Delphi, from Borland, then Microsoft's Visual Basic and, over time, Visual Studio (which we still use today). Even PowerPoint has some similarities with HyperCard's interface. WinPlus was similar to HyperCard as well. Even today, several applications and tools use HyperCard's ideas, such as HyperNext, HyperStudio, SuperCard, and LiveCode. HyperCard also certainly inspired FileMaker and every Apple development environment since - and through that, most every tool we use to build software, which we call the IDE, or Integrated Development Environment. The most important IDE for any Apple developer is Xcode. Open Xcode to build an app and look at Interface Builder, and you can almost feel Bill Atkinson's dilated pupils looking back at you, 10 hours into a trip.
And within those pupils, visions: visions of graphical elements being dropped onto a card as people digitized CD collections, built repositories for their book collections, put all the Grateful Dead shows they'd recorded into a stack, or even built applications to automate their businesses. Oh, and let's not forget the zine, the music and scene magazines that were so popular in the era that saw photocopying come down in price. HyperCard made for a pretty sweet zine. HyperCard sprang from a trip when the graphical interface was still just coming into its own. Digital computing might have been 40 years old, but the information theorists and engineers hadn't been as interested in making things easy to use. They wouldn't have been against it, but they weren't trying to appeal to regular humans. Apple was, and still is. The success of HyperCard seems to have taken everyone by surprise, and it made a huge impact on the technology that followed. Its popularity declined in the mid-1990s and it died quietly when Apple sold the last copy in 2004. But it surely left a legacy that has inspired many - especially old-school Apple programmers - in today's "there's an app for that" world.
It's human nature to make everything we do competitive. I've played football, run track at times, competed in hacking competitions at Def Con, and even participated in various gaming competitions like Halo tournaments. I always got annihilated by kids whose voices were still cracking, but I played! Humans have been competing in sports for thousands of years. The Lascaux cave paintings in France show people sprinting over 15,000 years ago. The Egyptians were bowling in the 5,000s BCE. The Sumerians were wrestling 5,000 years ago. Mesopotamian art shows boxing in the second or third millennium BCE. The Olmecs in Mesoamerican societies were playing games with balls around the same time. Egyptian monuments show a number of individual sports being practiced in Egypt as far back as 2,000 BCE. The Greeks evolved the games, first with the Minoans and Mycenaeans between 1,500 BCE and 1,000 BCE, and then they first recorded their Olympic games in 776 BCE, although historians seem to agree the games were practiced at least 500 years before that, potentially evolving from funeral games. Sports competitions began as ways to showcase an individual's physical prowess. Weightlifting, discus: whether individual or team sports, sports rely on physical strength, coordination, repetitive action, or some other feat that allows one person or team to stand out. Organized team sports first appeared in ancient times, with the Olmecs in Mesoamerica, but hurling supposedly evolved before 1000 BCE, although written records of it only begin around the 16th century, and it could be that it was borrowed through the Greek game harpaston, which the Romans evolved into the game harpastum and spread with their conquests. But the exact rules and timelines of all of these are lost to written history. Instead, written records back up that team sports in western civilization began with polo, appearing about 2,500 years ago in Persia. The Chinese gave us a form of kickball they called cuju, around 200 BCE. Football, or soccer for the American listeners, started in 9th century England but evolved into the game we think of today in the 1850s, then a couple of decades later into American football. Meanwhile, cricket came around in the 16th century, and then hockey and baseball came along in the mid 1800s, with basketball arriving in the 1890s. That's also around the same time the modern darts game was born, although that started in the Middle Ages, when troops threw arrows or crossbow bolts at wine barrels turned on their sides or at sections of tree trunks. Many of these sports are big business today, netting multi-billion dollar contracts for media rights to show and stream games, naming rights to stadiums for hundreds of millions, and players signing contracts for hundreds of millions across all major sports. There's been a sharp increase in sports contracts since the roaring 1920s, rising steadily as the television started to show up in homes around the world, until ESPN solidified a new status in our lives when it was created in 1979. Then came the Internet, and the money got crazy town. All that money leads the occasional entrepreneurial-minded sports enthusiast to try something new. We got the World Wrestling Body in the 1950s, which evolved out of Vincent McMahon's father's boxing promotions and put him to work with Toots Mondt on what they called Western Style Wrestling. Beating people up has been around since the dawn of life but became an official sport when UFC 1 was launched in 1993. We got the XFL in 1999.
So it's no surprise that we would take a sport that requires hand-eye coordination and turn that into a team endeavor. That's been around for a long time, but we call it Esports today. Video Game Competitions Competing in video games is almost as old as, well, video games. Spacewar! was written in 1962, and students from MIT competed with one another for dominance of deep space, dogfighting little ships, which we call sprites today, into oblivion. The game spread to campuses and companies as the PDP minicomputers spread. Countless hours were spent playing, and by 1972 there were enough players that they held the first Esports competition, appropriately called the Intergalactic Spacewar! Olympics. Of course, Stewart Brand would report on that for Rolling Stone, having helped mouse inventor Doug Engelbart with the "Mother of All Demos" just four years before. Pinball had been around since the 1930s, or the 1940s with flippers. They could be found around the world by the 1970s, and 1972 was also the first year there was a Pinball World Champion. So game leagues were nothing new. But Brand and others, like Atari founder Nolan Bushnell, knew that video games were about to get huge. Tennis was invented in the 1870s in England, with roots going back to 11th century France. Tennis on a screen would make loads of sense as well when Tennis for Two debuted in 1958. So when Pong came along in 1972, the world (and the ability to mass-produce technology) was ready for the first video game hit. So when people flowed into bars, first in the San Francisco Bay Area, then around the country, to play Pong, it's no surprise that people would eventually compete in the game. Going from competing at billiards to a big game console just made sense. Now it was a quarter a game instead of just a dart board hanging in the corner. And so when Pong went to home consoles, of course people competed there as well. Then came Space Invaders in 1978. By 1980 we got the first statewide Space Invaders competition, and 10,000 players showed up. The next year there was a Donkey Kong tournament, and Billy Mitchell set the record for the game at 874,300, a record that stood for 18 years. We got the US National Video Game Team in 1983, and competitions for arcade games sprung up around the world. A syndicated television show called Starcade even ran to show competitions, which now we might call streaming. And Tron came in 1982. Then came the video game crash of 1983. But games never left us. The next generation of consoles and arcade games gave us competitions and tournaments for Street Fighter and Mortal Kombat, then first-person games like GoldenEye and other first-person shooters later in the decade, paving the way for games like Call of Duty and World of Warcraft. Then in 1998 a legendary StarCraft tournament was held and 50 million people around the world tuned in on the Internet. That's a lot of eyeballs. Team options were also on the rise. Netrek had been written to be played over the Internet by 16 players at once. Within a few years, massive multiplayer games could have hundreds of players duking it out in larger battle scenes. Some were recorded and posted to web pages. There was appetite for tracking scores for games, and competing, and even watching games, which we've all done over the shoulders of friends since the arcades and consoles of old. Esports and Twitch As the 2000s came, Esports grew in popularity. Esports is short for the term electronic sports, and refers to competitive video gaming, which includes tournaments and leagues.
Let's set aside the earlier gaming tournaments and think of those as classic video games. Let's reserve the term Esports for events held after 2001. That's because the World Cyber Games was founded in 2000 and initially held in 2001, in Seoul, Korea (although there was a smaller competition in 2000). The haul was $300,000, and events continue on through the current day, having been held in San Francisco, Italy, Singapore, and China. Hundreds of people play today. That started a movement. Major League Gaming (MLG) came along in 2002 and is now regarded as one of the most significant Esports hosts in the world. The Electronic Sports World Cup came in 2003. These were the first major tournaments, and they were followed by the introduction of the ESL Intel Extreme Masters in 2007 and many others. The USA Network broadcast their first Halo 2 tournament in 2006. We've gone from 10 major tournaments held in 2000 to an incalculable number today. That means more teams. Most Esports companies are founded by former competitors, like Cloud9, 100 Thieves, and FaZe Clan. Team SoloMid is the most valuable Esports organization. Launched by League of Legends star Dan Dinh and his brother in 2009, it is now worth over $400 million and has fielded teams and players like ZeRo for Super Smash Bros., Excelerate Gaming for Rainbow Six Siege, Team Dignitas for Counter-Strike: Global Offensive, and even chess grandmaster Hikaru Nakamura. The analog counterpart would be sports franchises. Most of those were started by athletic clubs or people from the business community. Gaming has much lower startup costs and thus far has been more democratic in the ability to start a team with higher valuations. Teams play in competitions held by leagues, of which there seem to be new ones all the time. The NBA 2K League and the Overwatch League are two new leagues that have had early success. One reason for teams and leagues like this is naming and advertising rights. Another is events like The International 2021, with a purse of over $40M. The inaugural League of Legends World Championship took place in 2011. In 2013 another tournament was held in the Staples Center in Los Angeles (close to their US offices). Tickets for the event sold out within minutes. The purse for that was originally $100,000 and has since risen to over $7M. But others are even larger. The Honor of Kings (Arena of Valor) World Champion Cup purse is $7.7M, and the Fortnite World Cup Finals purse has gone as high as $15M. One reason for the leagues and teams is that companies that make games want to promote their games. The video game business is almost an 86 billion dollar industry. Another is that people started watching other people play on YouTube. But then, YouTube wasn't really purpose-built for gaming. Streamers made do using cameras to stream images of themselves in a picture-in-picture frame, but that still wasn't optimal. Esports had been broadcast (the original form of streaming) before, but streaming wasn't all that commercially successful until the birth of Twitch in 2011. YouTube had come along in 2005, and Justin Kan and Emmett Shear created Justin.tv in 2007 as a place for people to broadcast video, or stream, online. They started with just one channel: Justin's life. Like 24 by 7 life. They did Y Combinator and managed to land an $8M seed round. Justin had a camera mounted to his hat, and left that outside the bathroom since it wasn't that kind of site. They made a video chat system, and not only was he streaming, but he was interacting with people on the other side of the stream.
It was like the Truman Show, but for reals. A few more people joined up, but then came other sites to provide this live streaming option. They added forums, headlines, comments, likes, featured categories of channels, and other features but just weren't hitting it. One aspect was doing really well: gaming. They moved that to a new site in 2011 and called that Twitch. This platform allowed players to stream themselves and their games. And they could interact with their viewers, which gave the entire experience a new interactive paradigm. And it grew fast, with the whole company being rebranded as Twitch in 2014. Amazon bought Twitch in 2014 for $1B. They made $2.3 billion in 2020, with an average of nearly 3 million concurrent viewers watching nearly 19 billion hours of content provided monthly by nearly 9 million streamers. Other services like YouTube Gaming have come and gone, but Twitch remains the main way people watch others game. ESPN and others still have channels for Esports, but Twitch is purpose-built for gaming. And watching others play games is no different than Greeks showing up for the Olympics or watching someone play pool or watching Liverpool play Man City. In fact, the money they make is catching up. Platforms like Twitch allow professional gamers and those who announce the games to become their own unique class of celebrities. The highest paid players have made between three and six million dollars, with the top 10 living outside the US and making their hauls from Dota 2. Others have made over a million playing games like Counter-Strike, Fortnite, League of Legends, and Call of Duty. None are likely to hold a record for any of those games for 18 years. But they are likely to diversify their sources of income. Add a YouTube channel, Twitch stream, product placements, and appearances - and a gamer could be looking at doubling what they bring in from competitions. Esports has come far but has far further to go. The total Esports market was just shy of $1B in 2020 and is expected to reach $2.5B in 2025 (which the pandemic may push even faster). Not quite the 100 million that watch the Super Bowl every year or the half billion that tune into the World Cup finals, but growing at a faster rate than the Super Bowl, which has actually declined in the past few years. And the International Olympic Committee recognized the tremendous popularity of Esports throughout the world in 2017 and left open the prospect of Esports becoming an Olympic sport in the future (although with the number of vendors involved that's hard to imagine happening). Perhaps some day when archaeologists dig up what we've left behind, they'll find some Egyptian obelisk or gravestone with a controller and a high score. Although they'll probably just scoff at the high score, since they already annihilated that when they first got their neural implants and have since moved on to far better games! Twitch is young in the context of the decades of history in computing. However, the impact has been fast, and along with Esports it shows us a window into how computing has reshaped not only the ways we seek entertainment, but also how we make a living. In fact, the US Government recognized League of Legends as a sport as early as 2013, allowing people to get visas to come into the US and play. And where there's money to be made, there's betting and abuse. 2010 saw SaviOr and some of the best StarCraft players to ever play embroiled in a match-fixing scandal. That almost destroyed the Esports gaming industry.
And yet, as with the video game crash of 1983, the industry has always bounced back, orders of magnitude larger than before.
Mozilla's "BigSig" buffer overflow hole. UK to put IoT vendors on notice. The Mother of All Demos. Cryptocurrency company catastrophe. Firefox gets an extra sandbox. And an access point from outer space (OK, from home). Original music by Edith Mudge Got questions/suggestions/stories to share? Email tips@sophos.com Twitter @NakedSecurity Instagram @NakedSecurity
Mozilla's "BigSig" buffer overflow hole. UK to put IoT vendors on notice. The Mother of All Demos. Cryptocurrency company catastrophe. Firefox gets an extra sandbox. And an access point from outer space (OK, from home). https://nakedsecurity.sophos.com/mozilla-patches-exploitable-bigsig https://nakedsecurity.sophos.com/iot-devices-must-protect-consumers https://nakedsecurity.sophos.com/cryptocurrency-startup-fails-to-subtract https://nakedsecurity.sophos.com/firefox-update-brings-a-whole-new With Paul Ducklin and Doug Aamoth. Original music by Edith Mudge (https://www.edithmudge.com) Got questions/suggestions/stories to share? Email: tips@sophos.com Twitter: NakedSecurity (https://twitter.com/nakedsecurity) Instagram: NakedSecurity (https://instagram.com/nakedsecurity)
NLS, or the oN-Line System, is often looked at as a mile marker in the development of modern computing. It was the first system to use a mouse, one of the first functional examples of hypertext, pioneered remote collaboration, and so much more. But how much do you know about NLS itself? In this series of episodes I'm picking apart the system behind the legend. In Part 2 we are looking at the development of NLS itself. Along the way we talk timesharing, strange custom hardware, and complex programming practices. Does NLS live up to the hype? You'll have to listen to find out. Selected Sources: https://dougengelbart.org/content/view/374/ - Go watch the Mother of All Demos https://www.dougengelbart.org/content/view/140/ - 1968 NLS progress report http://web.archive.org/web/20160210002938/https://web.stanford.edu/dept/SUL/library/extra4/sloan/mousesite/EngelbartPapers/B2_F5_ARNAS1.html - 1966 progress report
The Osborne Effect isn't an episode about Spider-Man that covers turning green or orange and throwing bombs off little hoverboards. Instead it's about the impact of the Osborne 1 computer on the history of computers. Although many might find discussing the Green Goblin or Hobgoblin much more interesting. The Osborne 1 has an important place in the history of computing because when it was released in 1981, it was the first portable computer that found commercial success. Before the Osborne, there were portable teletype machines for sure, but computers were just starting to get small enough that a fully functional machine could be taken on an airplane. It ran version 2.2 of the CP/M operating system and came with a pretty substantial bundle of software. Keep in mind, there weren't internal hard drives in machines like this yet; instead, CP/M came on a set of floppies. It came with MBASIC from Microsoft, dBASE II from Ashton-Tate, the WordStar word processor, SuperCalc for spreadsheets, the Grammatik grammar checker, the Adventure game, early ledger tools from Peachtree Software, and tons of other software. By bundling so many titles, they created a climate where other vendors did the same thing, like Kaypro. After all, nothing breeds competitors like the commercial success of a given vendor. The Osborne came before flat panel screens, so it had a built-in CRT. That, the power supply, and the heavy case meant it weighed almost 25 pounds and came in at just shy of $1,800. Imagine two disk drives with a 5 inch screen in the middle. The keyboard, complete with a full 10-key pad, was built into a cover that could be pulled off and used to interface with the computer. The whole thing could fit under a seat on an airplane. Airplane seats were quite a bit larger back then than they are today! We think of this as a luggable rather than a portable because of that, and because computers didn't have batteries yet. Instead it pulled up to 37 watts of power. All that in a 20 inch wide case that stood 9 inches tall. The two people most commonly associated with the Osborne are Adam Osborne and Lee Felsenstein. Osborne got his PhD from the University of Delaware in 1968 and went to work in chemicals before he moved to the Bay Area, started writing books about computers, and founded a company called Osborne and Associates to publish them. He sold that to McGraw-Hill in 1979. By then he'd been hanging around the Homebrew Computer Club for a few years and there were some pretty wild ideas floating around. He saw Jobs and Wozniak demo the Apple I and watched their rise. Founders and engineers from Cromemco, IMSAI, Tiny BASIC, and Atari were also involved there - mostly before any of those products were built. So with the money from McGraw-Hill and sales of some of his books like An Introduction To Microcomputers, he set about thinking through what he could build. Lee Felsenstein was another guy from that group. He'd gotten his degree in electrical engineering at Berkeley before co-creating Community Memory, a project to build an early bulletin board system on top of an SDS 940 timesharing mainframe with links to terminals like a Teletype Model 33 sitting at Leopold's Records in Berkeley. That had started up back in 1973 when Doug Engelbart donated his machine from The Mother of All Demos, and it eventually moved to minicomputers as those became more available.
Having seen the world go from a mainframe the size of a few refrigerators to minicomputers and then to early microcomputers like the Altair, when a hardware hacker like Felsenstein paired up with someone with a little seed money like Osborne, magic was bound to happen. The design was similar to the NoteTaker that Alan Kay had built at Xerox in the 70s - but hacked together from parts they could find, like 5¼ inch Fujitsu floppy drives. They made 10 prototypes with metal cases and quickly moved to injection molded plastic cases, taking them to the 1981 West Coast Computer Faire and getting a ton of interest immediately. Some thought the screen was a bit too small, but at the time the software bundle alone justified the price. By the end of 1981 they'd had months where they did a million dollars in sales, and they fired up the assembly line. People bought modems to hook to the RS-232 compatible serial port and printers to hook to the parallel port. Even external displays. Sales were great. They were selling over 10,000 computers a month, and Osborne was lining up more software vendors, offering stock in the Osborne Computer Corporation. By 1983 they were preparing to go public and developing a new line of computers, one of which was the Osborne Executive. That machine would come with more memory, a slightly larger screen, an expansion slot, and of course more software, using sweetheart licensing deals that accompanied stock in the company to keep the per-unit cost down. He also announced the Vixen - same chipset but lighter and cheaper. The only issue is that this created a problem, which we now call the Osborne Effect. People didn't want the Osborne 1 any more. Seeing something new was on the way, people cancelled their orders in order to wait for the Executive. Sales disappeared almost overnight. At the time, computer dealers pushed a lot of hardware, and the dealers didn't want to be stuck with stock of an outdated model. Revenue disappeared, and this came at a terrible time. The market was changing. IBM showed up with a PC, Apple had the Lisa and was starting to talk about the Mac. Kaypro had come along as a fierce competitor. Other companies had clued in on the software bundling idea. The Compaq portable wasn't far away. The company ended up cancelling the IPO and instead filing for bankruptcy. They tried to raise money to build a luggable or portable IBM clone - and if they had done so, maybe they'd be what Compaq is today - a part of HP. The Osborne 1 was cannibalized by the Osborne Executive that never actually shipped. Other companies would learn the same lesson as the Osborne Effect throughout history. And yet the Osborne opened our minds to this weird idea of having machines we could take with us on airplanes. Even if they were a bit heavy and had pretty small screens. And while the timing of announcements is only one aspect of the downfall of the company, the Osborne Effect is a good reminder to be deliberate about how we talk about future products. That's especially true for hardware, but we also have to be careful not to sell features that don't exist yet in software.
Paris Marx is joined by Brian Merchant to discuss the development of the iPhone, how Apple manages the press, and the parts of the company's supply chain that get too little attention. Brian Merchant is the author of The One Device: The Secret History of the iPhone and Blood in the Machine, coming in 2022. Follow Brian on Twitter at @bcmerchant.
Robert Taylor was one of the true pioneers in computer science. In many ways, he is the string (or glue) that connected the US government's era of supporting computer science through ARPA to the innovations that came out of Xerox PARC and then to the work done at Digital Equipment Corporation's Systems Research Center. Those are three critical aspects of the history of computing, and while Taylor didn't write any of the innovative code or develop any of the tools that came out of those three research environments, he saw people and projects worth funding and made sure the brilliant scientists got what they needed to get things done. The 31 years in computing that his stops represented were some of the most formative years for the young computing industry, and the advances he inspired ran from Vannevar Bush's 1945 article “As We May Think” to the explosion of the Internet across personal computers. Bob Taylor inherited a world where computing was waking up to large, crusty, but finally fully digitized mainframes stuck to its eyes in the morning, and he went to bed the year Corel bought WordPerfect because PCs needed applications, the year the Pentium 200 MHz was released, the year the Palm Pilot and eBay were founded, the year AOL started to show articles from the New York Times, the year IBM opened a web shopping mall, and the year the Internet reached 36 million people. Excite and Yahoo went public. Sometimes big, sometimes small, all of these can be traced back to Bob Taylor - kinda like how we can trace all actors to Kevin Bacon. But more like if Kevin Bacon found talent and helped them get started, by paying them during the early years of their careers… How did Taylor end up as the glue for the young and budding computing research industry? Going from tween to teenager during World War II, he went to Southern Methodist University in 1948, when he was 16. He jumped into the US Naval Reserves during the Korean War and then got his masters in psychology at the University of Texas at Austin using the GI Bill. Many of those pioneers in computing in the 60s went to school on the GI Bill. It was a big deal across every aspect of American life at the time - paving the way to home ownership, college educations, and new careers in the trades. From there, he bounced around, taking classes in whatever interested him, before taking a job at Martin Marietta, helping design the MGM-31 Pershing, and ending up at NASA, where he discovered the emerging computer industry. Taylor was working on projects for the Apollo program when he met JCR Licklider, known as the Johnny Appleseed of computing. Lick, as his friends called him, had written an article called Man-Computer Symbiosis in 1960 and had laid out a plan for computing that influenced many. One such person was Taylor. And so it was that Lick began at ARPA in 1962, and by 1965 he had succeeded in recruiting Taylor away from NASA to take his place running ARPA's Information Processing Techniques Office, or IPTO. Taylor had funded Douglas Engelbart's research on computer interactivity at Stanford Research Institute while at NASA. He continued to do so when he got to ARPA, and that project resulted in the invention of the computer mouse and the Mother of All Demos, one of the most inspirational moments and a turning point in the history of computing. They also funded a project to develop an operating system called Multics. This would be a two million dollar project run by General Electric, MIT, and Bell Labs.
Run through Project MAC at MIT, it had just too many cooks in the kitchen. Later, some of those Bell Labs cats would just do their own thing. Ken Thompson had worked on Multics and took the best and worst of it into account when he wrote the first lines of Unix and the B programming language, and then one of the most important languages of all time, C. Interactive graphical computing and operating systems were great, but IPTO - and so Bob Taylor and his team - would fund, straight out of the Pentagon, the ability for one computer to process information on another computer. Which is to say, they wanted to network computers. It took a few years, but eventually they brought in Larry Roberts, and by late 1968 they'd awarded the contract from their RFQ to a company called Bolt Beranek and Newman (BBN), who would build the Interface Message Processors, or IMPs. The IMPs would connect a number of sites and route traffic, and the first one went online at UCLA in 1969, with additional sites coming on frequently over the next few years. That system would become ARPANET, the commonly accepted precursor to the Internet. There was another networking project going on at the time that was also getting funding from ARPA as well as the Air Force: PLATO, out of the University of Illinois. PLATO was meant for teaching and had begun in 1960, but by then they were on version IV, running on a CDC Cyber, and the time sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. Then things get weird. Taylor is sent to Vietnam as a civilian, although his rank equivalent would be a brigadier general. He helped develop the Military Assistance Command in Vietnam. Battlefield operations and reporting were entering the computing era. The only problem is, while Taylor was a war veteran and had been deep in the defense research industry for his entire career, Vietnam was an incredibly unpopular war, and seeing it first hand and getting pulled into the theater of war had him ready to leave. This was combined with interpersonal problems with Larry Roberts, who was running the ARPANET project by then and chafed at Taylor being his boss even without a PhD or direct research experience. And so Taylor joined a project ARPA had funded at the University of Utah and left ARPA. There, he worked with Ivan Sutherland, who wrote Sketchpad and is known as the Father of Computer Graphics, until he got another offer. This time, from Xerox, to go to their new Palo Alto Research Center, or PARC. One rising star in the computer research world was pretty against the idea of a centralized, mainframe-driven time sharing system. This was Alan Kay. In many ways, Kay was like Lick. And unlike the time sharing projects of the day, the Licklider and Kay inspiration was for dedicated cycles on processors. This meant personal computers. The Mansfield Amendment, passed in 1973, limited defense research funding to projects with direct military application. This meant that ARPA funding started to dry up and the scientists working on those projects needed a new place to fund their playtime. Taylor was able to pick the best of the scientists he'd helped fund at ARPA. He helped bring in people from Stanford Research Institute, where they had been working on the oN-Line System, or NLS.
This new Computer Science Laboratory landed people like Charles Thacker, David Boggs, Butler Lampson, and Bob Sproull, and would develop the Xerox Alto, the inspiration for the Macintosh. The Alto contributed the very ideas of overlapping windows, icons, menus, cut and paste, and word processing. In fact, Charles Simonyi from PARC would work on Bravo before moving to Microsoft to spearhead Microsoft Word. Bob Metcalfe on that team was instrumental in developing Ethernet so workstations could communicate with ARPANET all over the growing campus-connected environments. Metcalfe would leave to form 3COM. SuperPaint would be developed there, and Alvy Ray Smith would go on to co-found Pixar, continuing the work begun by Richard Shoup. They developed the laser printer, some of the ideas that ended up in TCP/IP, and their research into page layout languages would end up with Chuck Geschke, John Warnock, and others founding Adobe. Kay would bring us the philosophy behind the DynaBook, which decades later would effectively become the iPad. He would also develop Smalltalk with Dan Ingalls and Adele Goldberg, ushering in the era of object oriented programming. They would do pioneering work on VLSI semiconductors, ubiquitous computing, and anything else to prepare the world to mass produce the technologies that ARPA had been spearheading for all those years. Xerox famously did not mass produce those technologies. And nor could they have cornered the market on all of them. The coming waves were far too big for one company alone. And so it was that PARC, unable to bring the future to the masses fast enough to impact earnings per share, got a new director in 1983, and William Spencer was yet another of the three bosses that Taylor clashed with over the years. Some resented that he didn't have a PhD in a world where everyone else did. Others resented the close relationship he maintained with the teams. Either way, Taylor left PARC in 1983 and many of the scientists left with him. It's both a curse and a blessing to learn more and more about our heroes. Taylor was one of the finest minds in the history of computing. His tenure at PARC certainly saw a lot of innovation and one of the most innovative teams to have ever been assembled. But as with many of us who have been put into a position of leadership, it's easy to get caught up in the politics. I am ashamed every time I look back and see examples of building political capital at the expense of a project or letting an interpersonal problem get in the way of the greater good for a team. But also, we're all human, and the people that I've interviewed seem to match the accounts I've read in other books. And so Taylor's final stop was Digital Equipment Corporation, where he was hired to form their Systems Research Center in Palo Alto. They brought us the AltaVista search engine, the Firefly computer, Modula-3, and a few other advances. Taylor retired in 1996. DEC was acquired by Compaq in 1998, and when Compaq was acquired by HP, the SRC was merged with other labs at HP. From ARPA to Xerox to Digital, Bob Taylor certainly left his mark on computing. He had a knack for seeing the forest for the trees and inspired engineering feats the world is still wrestling with how to bring to fruition. Raw, pure science. He died in 2017. He worked with some of the most brilliant people in the world at ARPA.
He inspired passion, and sometimes drama, in what Stanford's Donald Knuth called “the greatest by far team of computer scientists assembled in one organization.” In his final email to his friends and former coworkers, he said, “You did what they said could not be done, you created things that they could not see or imagine.” The Internet, the personal computer, the tech that would go on to become Microsoft Office, object oriented programming, laser printers, tablets, ubiquitous computing devices. So he wasn't understating what they accomplished out of some false sense of humility. I guess you can't do that often if you're going to inspire the way he did. So feel free to abandon the pretense as well, and go inspire some innovation. Heck, who knows where the next wave will come from. But if we aren't working on it, it certainly won't come. Thank you so much, and have a lovely, lovely day. We are so lucky to have you join us on yet another episode.
It’s not easy to learn how to use computers when you can’t actually touch them. But that’s how Dr. Clarence Ellis started his career of invention—which would ultimately lead to reimagining how we all worked with computers and each other.Martez Mott describes the “Mother of all Demos” that would inspire a generation of builders. Gary Nutt recounts working with Dr. Clarence Ellis at Xerox PARC, and the atmosphere at the coveted research lab. Chengzheng Sun and Paul Curzon explain how Operational Transformation—the project to which Dr. Ellis devoted so much time and effort—laid the foundation for the collaborative tools many of us use every day. And Delilah DeMers shares how humble her father was, and how he loved teaching people that technology can be a force for good.“Mother of All Demos” clip courtesy of SRI International. To learn more about Operational Transformation, you can check out this FAQ written by Chengzheng Sun. If you want to read up on some of our research on Dr. Clarence Ellis, you can check out all our bonus material over at redhat.com/commandlineheroes. Follow along with the episode transcript.
Jason Yuan believes that we all should feel empowered to think about ways to improve our computer's operating system. He joins Mark and Adam to talk about stage design, dreaming big versus delivering practical products, and why software should be fun. @MuseAppHQ hello@museapp.com Show notes Jason Yuan / @jasonyuandesign Mercury OS MakeSpace Screenotate Omar Rizwan Tyler Angert Repl.it Weiwei Hsu Desktop Neo Artifacts iOS 14 widgets Sketch Origami Quartz Composer Android launchers Gall's law the iPod click wheel virtual workspaces Dynamicland Bret Victor spring damping David Attenborough: A Life on Our Planet Xanadu philosopher's stone The Mother of All Demos
Through the history of computing, user interfaces (UIs) have evolved from punch cards to voice interaction. In this episode we track that evolution, discussing each paradigm and the machine that popularized it. We primarily focus on personal computer UIs, covering command-line interfaces (CLIs), graphical user interfaces (GUIs), touch-screen interaction, and voice interfaces. We also imagine the future, including neural interfaces, virtual reality, and augmented reality. This episode is an introductory guide to the interfaces available and a short history, not a comprehensive tour. Show Notes Episode 16: The Personal Computer Revolution The Mother of All Demos via Wikipedia Fingerworks (developer of modern multi-touch) via Wikipedia Neuralink via Wikipedia Follow us on Twitter @KopecExplains. Theme “Place on Fire” Copyright 2019 Creo, CC BY 4.0 Find out more at http://kopec.live
My guest in this episode is Molly Wright Steenson, a designer, author, professor, and international speaker whose work focuses on the intersection of design, architecture, and artificial intelligence. In this wide-ranging discussion, Molly explains how the history of computational technologies, architecture, pattern language and AI combined to define the fields of Agile, interaction design, UX, AI and pretty much the rest of today's digital world. Show Notes This episode's archive and transcript (https://pln.me/p10) Molly at Girlwonder.com (http://www.girlwonder.com/) Molly on Twitter (https://twitter.com/maximolly) Molly on LinkedIn (https://www.linkedin.com/in/mollysteenson/) Molly at CMU (https://design.cmu.edu/people/faculty/molly-steenson#profile-main) Architectural Intelligence: How Designers and Architects Created the Digital Landscape (https://amzn.to/32ZApiF) Bauhaus Futures (https://amzn.to/361wHGP) Molly talking pneumatic tubes (https://soundcloud.com/roman-mars/61-a-series-of-tubes) Mark Pesce and Dr. Genevieve Bell’s podcast on the Mother of All Demos (https://nextbillionseconds.com/2018/12/07/1968-when-the-world-began-the-mother-of-all-demos/) Doug Engelbart’s Mother of All Demos (https://www.youtube.com/watch?v=yJDv-zdhzMY) Doctor’s Note newsletter (https://pln.me/nws) Andy on Twitter (https://twitter.com/apolaine) Andy on LinkedIn (https://linkedin.com/in/andypolaine) Polaine.com (https://www.polaine.com/) Get in touch! (https://www.polaine.com/contact) And if you like Power of Ten, please consider giving it a rating or review on iTunes.
Mother of All Demos. Homo Deus. Hackers: Heroes of the Computer Revolution. The Map is not the territory. Neural Engine. A Game of Giants by WBW. Your hosts are Arvind Vermani and Ilya Belikin. Thank you for listening. You might learn more about us and join the conversations at our place.
Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we're going to cover yet another of the groundbreaking technologies to come out of MIT: Sketchpad. Ivan Sutherland is a true computer scientist. After getting his masters from Caltech, he migrated to the land of the Hackers and got a PhD from MIT in 1963. The great Claude Shannon supervised his thesis and Marvin Minsky was on the thesis review committee. But he wasn't just surrounded by awesome figures in computer science, he would develop a critical piece between the Memex in Vannevar Bush's “As We May Think” and the modern era of computing: graphics. What was it that propelled him from PhD candidate to becoming the father of computer graphics? The 1962-1963 development of a program called Sketchpad. Sketchpad was the ancestor of the GUI, object oriented programming, and computer graphics. In fact, it was the first graphical user interface. And it was all made possible by the TX-2, a computer developed at the MIT Lincoln Laboratory by Wesley Clark and others. The TX-2 was transistorized and so fast. Fast enough to be truly interactive. A lot of innovative work had come with the TX-0, and the program would effectively spin off as Digital Equipment Corporation and the PDP series of computers. So it was bound to inspire a lot of budding computer scientists to build some pretty cool stuff. Sutherland's Sketchpad used a light pen. These were photosensitive devices that worked like a stylus, sensing light from the dots on a cathode ray tube (CRT) so the computer could tell where you were pointing. Users could draw shapes on a screen for the first time. Whirlwind at MIT had allowed highlighting objects, but this graphical interface to create objects was a new thing altogether, inputting data into a computer as an object instead of loading it as code, as could then be done using punch cards. Suddenly the computer could be used for art. There were toggle-able switches that made lines bigger. The extra memory that was pretty much only available in the hallowed halls of government-funded research in the 60s opened up so many possibilities. Suddenly, computer-aided design, or CAD, was here. Artists could create a master drawing and then additional instances on top, with changes to the master reverberating through each instance. They could draw lines, concentric circles, change ratios. And it would be two decades before MacPaint would bring the technology into homes across the world. And of course AutoCAD, making Autodesk one of the greatest software companies in the world. The impact of Sketchpad would be profound. Sketchpad would be another of Doug Engelbart's inspirations when building the oN-Line System, and there are clear correlations in the human interfaces. For more on NLS, check out the episode of this podcast called the Mother of All Demos, or watch it on YouTube. And Sutherland's work would inspire the next generation: people who read his thesis, as well as his students and coworkers. Sutherland would run the Information Processing Techniques Office for the US Defense Department's Advanced Research Projects Agency after Lick returned to MIT. He also taught at Harvard, where he and his students developed the first virtual reality system in 1968, years before VPL Research patented similar technology in 1984.
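A quick aside on that master-and-instance idea: it maps neatly onto modern terms, where an instance keeps a reference to the master rather than a private copy, so an edit to the master shows up in every instance. Here's a loose Python sketch of just that relationship - not Sketchpad's actual constraint engine, and the class and variable names are purely illustrative:

```python
class MasterDrawing:
    """A shared definition, like a Sketchpad master drawing."""
    def __init__(self, points):
        self.points = list(points)

class Instance:
    """An instance refers back to its master instead of copying it."""
    def __init__(self, master, offset):
        self.master = master
        self.offset = offset

    def rendered_points(self):
        # Render by reading the master's current points, shifted by the offset.
        dx, dy = self.offset
        return [(x + dx, y + dy) for x, y in self.master.points]

rivet = MasterDrawing([(0, 0), (1, 0), (1, 1)])
a = Instance(rivet, offset=(10, 10))
b = Instance(rivet, offset=(20, 5))

rivet.points.append((0, 1))        # edit the master once...
print(a.rendered_points())         # ...and both instances pick up the change
print(b.rendered_points())
```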
Sutherland then went to the University of Utah, where he taught Alan Kay, who gave us object oriented programming in Smalltalk and the concept of the tablet in the Dynabook, and Ed Catmull, who co-founded Pixar, along with many other computer graphics pioneers. He founded Evans and Sutherland with David Evans, the man who built the computer science department at the University of Utah, and their company launched the careers of John Warnock, the founder of Adobe, and Jim Clark, the founder of Silicon Graphics. His next company would be acquired by Sun Microsystems and become Sun Labs. He would remain a Vice President and fellow at Sun and a visiting scholar at Berkeley. For Sketchpad and his other contributions to computing, he would be awarded a Computer Pioneer Award, become a fellow of the ACM, receive a John von Neumann Medal, receive the Kyoto Prize, become a fellow at the Computer History Museum, and receive a Turing Award. I know we're not supposed to make a piece of software an actor in a sentence, but thank you, Sketchpad. And thank you, Sutherland. And his students and colleagues who continued to build upon his work.
In this inaugural “Points of Entry” episode, host Katie Kheriji-Watts speaks with culturebot.org founder Andy Horwitz about economic shame and labor advocacy in the arts world, the intersection of counterculture and online technology, the patience required to effect change in large institutions, and what it's like to experience parenthood for the first time at the age of 50. Mentioned in this episode: Andy Horwitz http://www.andyhorwitz.com “The Future, Revisited: ‘The Mother of All Demos’ at 50” - LA Review of Books https://lareviewofbooks.org/article/the-future-revisited-the-mother-of-all-demos-at-50 The Brooklyn Commune https://brooklyncommune.org Culturebot https://www.culturebot.org Follow Points of Entry: http://www.pointsofentry.com https://www.instagram.com/pointsofentry Special thanks: Solene de Bony, Fabrice Allain, Gabriel Nazoa, Zied Kheriji
Today we're going to honor Larry Tesler, who died on February 17th, 2020. Larry Tesler is probably best known for early pioneering work on graphical user interfaces. He was the person that made up cut, copy, and paste as a term. Every time you say “just paste that in there,” you're honoring his memory. I've struggled with how to write the episode or episodes about Xerox PARC. It was an amazing crucible of technical innovation. But they didn't materialize huge commercial success for Xerox. Tesler was one of the dozens of people who contributed to that innovation. He studied with John McCarthy and other great pioneers at the Stanford Artificial Intelligence Laboratory in the 60s. What they called artificial intelligence back then we might call computer science today. Being in the Bay Area in the 60s, Tesler got active in anti-war demonstrations and disappeared off to a commune in Oregon until he got offered a job by Alan Kay. You might remember Kay from earlier episodes as the one behind Smalltalk and the DynaBook. They'd both been at The Mother of All Demos, where Doug Engelbart showed the mouse, the first hyperlinks, and the graphical user interface, and they'd been similarly inspired about the future of computing. So Tesler moves back down in 1970. I can almost hear Three Dog Night's Mama Told Me Not To Come booming out of the 8-track of his car stereo on the drive. Or hear Nixon and Kissinger on the radio talking about why they invaded Cambodia. So he gets to PARC and there's a hiring freeze at Xerox, which after monster growth was starting to get crushed by bureaucracy. Les Earnest from back at Stanford had him write one of the first markup language implementations, which he called Pub. That became the inspiration for Don Knuth's TeX and Brian Reid's Scribe, and an ancestor of JavaScript and PHP. They find a way to pay him, basically bringing him on as a contractor. He works on Gypsy, the first real word processor. At the time, they'd figured out a way of using keystrokes to switch modes for documents. Think of how in vi or pico you switch to a mode in order to insert or move; here they were applying metadata to an object, like making text bold or copying text from one part of a document to another. Those modes were terribly cumbersome, and due to very simple mistakes, people would delete their documents. So he and Tim Mott started looking at ways to get rid of modes. That's when they came up with the idea to make a copy and paste function. And to use the term cut, copy, and paste. These are now available in all “what you see is what you get,” or WYSIWYG, interfaces. Oh, he also coined that term while at PARC, although maybe not the acronym. And he became one of the biggest proponents of making software “user-friendly” when he was at PARC. By the way, that's another term he coined, with relation to computing at least. He also seems to be the first to have used the term browser, after building a browser for a friend to more easily write code. He'd go on to work on the Xerox Alto and NoteTaker. That team, which would be led by Adele Goldberg after Bob Taylor and then Alan Kay left PARC, got a weird call to show these kids from Apple around. The scientists from PARC didn't think much of these hobbyists, but in 1979, despite Goldberg's objections, Xerox management let the fox into the chicken coop when they let Steve Jobs and some other early Apple employees get a tour of PARC. Tesler would be one of the people giving Jobs a demo.
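A quick aside to make the modeless idea concrete: the shift Tesler and Mott made was from modes to a selection plus a clipboard - you select some text, issue a command, and the editor stays in one state the whole time. Here's a tiny Python sketch of that interaction model; it's only an illustration, not anything from Gypsy's actual code, and all the names are made up:

```python
class Editor:
    """A toy modeless editor: one state, a selection range, and a clipboard."""

    def __init__(self, text: str):
        self.text = text
        self.clipboard = ""

    def copy(self, start: int, end: int) -> None:
        # Remember the selected range without changing the document.
        self.clipboard = self.text[start:end]

    def cut(self, start: int, end: int) -> None:
        # Copy the selection, then remove it from the document.
        self.copy(start, end)
        self.text = self.text[:start] + self.text[end:]

    def paste(self, at: int) -> None:
        # Insert the clipboard contents at a position.
        self.text = self.text[:at] + self.clipboard + self.text[at:]

doc = Editor("Hello world")
doc.cut(5, 11)     # clipboard now holds " world", text is "Hello"
doc.paste(0)       # text becomes " worldHello"
print(doc.text)
```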
And it's no surprise that after watching Xerox not ship the Alto, Tesler would end up at Apple 6 months later. After Xerox bonuses were distributed, of course. At Apple, he'd help finish the Lisa. It cost far less than the Xerox Star, but it wouldn't be until it went even further down-market to become the Macintosh that all of their hard work at Xerox and then Apple would find real success. Kay would become a fellow at Apple in 1984, as many of the early great pioneers left PARC. Tesler was the one that added object-oriented programming to Pascal, used to create the Lisa Toolkit, and then he helped bring those into MacApp as class libraries for developing the Mac GUI. By 1990, Jobs had been out of Apple for 5 years and Tesler became the Vice President of the Newton project at Apple. He'd see Alan Kay's concept of the digital assistant made into a reality. He would move into the role of Chief Scientist at Apple once the project was complete. There, he made his own mini-PARC, but would shut down the group and leave after Apple entered its darkest age in 1997. Tesler had been a strong proponent of networking, acting as the VP of AppleNet and pushing more advanced networking options prior to his departure. He would strike out on his own and build Stagecast, a visual programming language that began life as an object-oriented teaching language called Cocoa. Apple would reuse the name Cocoa when they ported in OpenStep, so not the Cocoa many developers will remember or maybe even still use. Stagecast would run until Larry decided to join the executive team at Amazon. At Amazon, Larry was the VP of Shopping Experience and would start a group on usability, doing market research, usability research, and lots of data mining. He would stay there for 4 years before moving on to Yahoo!, spreading the gospel about user experience and design, managing up to 200 people at a time and embedding designers and researchers into product teams, a practice that's become pretty common in UX. He would also be a fellow at Yahoo! before taking a role at 23andMe and ending his long and distinguished career as a consultant, helping make the world a better place. He conceptualized the Law of Conservation of Complexity, or Tesler's Law, in 1984. It states that “Every application has an inherent amount of irreducible complexity. The only question is: Who will have to deal with it—the user, the application developer, or the platform developer?” But one of my favorite quotes of his is: “I have been mistakenly identified as ‘the father of the graphical user interface for the Macintosh’. I was not. However, a paternity test might expose me as one of its many grandparents.” The first time I got to speak with him, he was quick to point out that he didn't come up with much; he was simply carrying on the work started by Engelbart. He was kind and patient with me. When Larry passed, we lost one of the founders of the computing world as we know it today. He lived and breathed user experience and making computers more accessible. That laser focus on augmenting human capabilities by making the inventions easier to use and more functional is probably what he'd want to be known for above all else. He was a good programmer, but almost too empathetic not to end up with a focus on the experience of the devices. I'll include a link to an episode he did with 99% Invisible in the show notes if you want to hear more from him directly ( https://99percentinvisible.org/episode/of-mice-and-men ).
Everyone except the people who get royalties from White Out loved what he did for computing. He was a visionary and one of the people that ended up putting the counterculture into computing culture. He was a pioneer in User Experience and a great human. Thank you Larry for all you did for us. And thank you, listeners, in advance or in retrospect, for your contributions.
In a world of rapidly changing technologies, few have lasted as long, in as unaltered a fashion, as the mouse. The party line is that the computer mouse was invented by Douglas Engelbart in 1964 and that it was a one-button wooden device that had two metal wheels. Those used an analog to digital conversion to input a location to a computer. But there's a lot more to tell. Engelbart had read an article in 1945 called “As We May Think” by Vannevar Bush. He was in the Philippines working as a radio and radar tech. He'd return home, get his degree in electrical engineering, then go to Berkeley for first his masters and then a PhD, still in electrical engineering. At the time there were a lot of military grants in computing floating around, and a Navy grant saw him work on a computer called CALDIC, short for the California Digital Computer. By the time he completed his PhD he was ready to start a computer storage company, but he ended up at the Stanford Research Institute in 1957. He published a paper in 1962 called Augmenting Human Intellect: A Conceptual Framework. That paper would guide the next decade of his life and help shape nearly everything in computing that came after. Keeping with the theme of “As We May Think,” Engelbart was all about supplementing what humans could do. The world of computer science had been interested in selecting things on a computer graphically for some time. And Engelbart would have a number of devices that he wanted to test in order to find the best possible device for humans to augment their capabilities using a computer. He knew he wanted a graphical system and wanted to be deliberate about every aspect in a very academic fashion. And a key aspect was how people that used the system would interact with it. The keyboard was already a mainstay, but he wanted people pointing at things on a screen. While Engelbart would invent the mouse, pointing devices certainly weren't new. Pilots had been using the joystick for some time, and an electrical joystick had been developed at the US Naval Research Laboratory in 1926, with the concept of unmanned aircraft in mind. The Germans would end up building one in 1944 as well. But it was Alan Kotok who brought the joystick to the computer game in the early 1960s to play Spacewar on minicomputers. And Ralph Baer brought it into homes in 1967 for an early video game system, the Magnavox Odyssey. Another input device that had come along was the trackball. Ralph Benjamin of the British Royal Navy's Scientific Service invented the trackball, or ball tracker, for radar plotting on the Comprehensive Display System, or CDS. The computers were analog at the time, but they could still use the X-Y coordinates from the trackball, which they patented in 1947. Tom Cranston, Fred Longstaff and Kenyon Taylor had seen the CDS trackball and used that as the primary input for DATAR, a radar-driven battlefield visualization computer. The trackball stayed in radar systems into the 60s, when Orbit Instrument Corporation made the X-Y Ball Tracker and then Telefunken turned it upside down to control the TR 440, making an early mouse type of device. The last of the options Engelbart decided against was the light pen. Light guns had shown up in the 1930s when engineers realized that a vacuum tube was light-sensitive. You could shoot a beam of light at a tube and it could react. Robert Everett worked with Jay Forrester to develop the light pen, which would allow people to interact with a CRT using light sensing to cause an interrupt on a computer.
This would move to the SAGE computer system from there and eke its way into the IBM mainframes in the 60s. While the technology used to track the coordinates is not even remotely similar, think of this as conceptually similar to the styluses used with tablets and on Wacom tablets today. Paul Morris Fitts had built a model in 1954, now known as Fitts's Law, to predict the time that's required to move things on a screen. He defined the difficulty of hitting a target as a function of the ratio between the distance to the target and the width of the target. If you listen to enough episodes of this podcast, you'll hear a few names repeatedly. One of those is Claude Shannon. He brought a lot of the math to computing in the 40s and 50s and helped with the Shannon-Hartley Theorem, which defined information transmission rates over a given medium. So these were the main options at Engelbart's disposal to test when he started ARC. But in looking at them, he had another idea. He'd sketched out the mouse in 1961 while sitting in a conference session about computer graphics. Once he had funding, he brought in Bill English to build a prototype in 1963. The first model used two perpendicular wheels attached to potentiometers that tracked movement. It had one button to select things on a screen. It tracked x,y coordinates as had previous devices. NASA funded a study to really dig in and decide which was the best device. He, Bill English, and an extremely talented team spent two years researching the question, publishing a report in 1965. They really had the blinders off, too. They looked at the DEC Grafacon, joysticks, light pens, and even what amounts to a knee-operated mouse. Two years of what we'd call UX research or user research today. Few organizations would dedicate that much time to study something. But the result would be patenting the mouse in 1967, an innovation that would last for over 50 years. I've heard Engelbart criticized for taking so long to build the oN-Line System, or NLS, which he showcased at the Mother of All Demos. But it's worth thinking of his research as academic in nature. It was government funded. And it changed the world. His paper on Computer-Aided Display Controls was seminal. Vietnam caused a lot of those government funded contracts to dry up. From there, Bill English and a number of others from Stanford Research Institute, which ARC was a part of, moved to Xerox PARC. English and Jack Hawley iterated and improved the technology of the mouse, ditching the analog to digital converters, and over the next few years we'd see some of the most substantial advancements in computing. By 1981, Xerox had shipped the Alto and the Star. But while Xerox would be profitable with their basic research, they would miss something that a sandal-clad hippy wouldn't. In 1979, Xerox let Steve Jobs make three trips to PARC in exchange for the opportunity to buy 100,000 shares of Apple stock pre-IPO. The mouse by then had evolved to a three button mouse that cost $300. It didn't roll well and had to be used on pretty specific surfaces. Jobs would call Dean Hovey, a co-founder of IDEO, and demand they design one that would work on anything, including, quote, “blue jeans.” Oh, and he wanted it to cost $15. And he wanted it to have just one button, which would be an Apple hallmark for the next 30ish years. Hovey-Kelley would move to optical encoder wheels, freeing the tracking ball to move however it needed to, and then use injection molded frames. And thus make the mouse affordable.
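Back to Fitts's Law for a moment: the model is simple enough to compute directly. Below is a minimal Python sketch using the common Shannon-style formulation of the index of difficulty; the a and b constants here are placeholder values only, since in a real study they'd be fit by regression to measured pointing times for a particular device.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts's index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted time (seconds) to acquire a target of a given width
    at a given distance. a and b are illustrative constants, not
    values measured for any real device."""
    return a + b * index_of_difficulty(distance, width)

# A small nearby target and a large distant one with the same
# distance-to-width ratio get the same predicted time (~0.52 s here).
print(movement_time(distance=96, width=16))
print(movement_time(distance=960, width=160))
```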
It's amazing what can happen when you combine all that user research and academic rigor from Engelbart's team and the engineering advancements documented at Xerox PARC with world-class industrial design. You see this trend played out over and over with the innovations in computing that are built to last. The mouse would ship with the Lisa and then with the 1984 Mac. Logitech had shipped a mouse in 1982 for $300. After leaving Xerox, Jack Hawley founded a company to sell a mouse for $400 the same year. Microsoft released a mouse for $200 in 1983. But Apple changed the world when Steve Jobs demanded the mouse ship with all Macs. The IBM PC would use a mouse, and from there it would become ubiquitous in personal computing. Desktops would ship with a mouse. Laptops would have a funny little button that could be used as a mouse when the actual mouse was unavailable. The mouse would ship with extra buttons that could be mapped to additional workflows or macros. And even servers were outfitted with KVM switches that shared a keyboard, video display, and mouse between machines during the rise of the large server farms that ran the upcoming dot com revolution. Trays would be put into most racks, with a single U, or unit, of the rack being used to see what you're working on, especially after Windows or windowing servers started to ship. As various technologies matured, other innovations came along for input devices. The mouse would go optical in 1980 and ship with early Xerox Star computers, but what we think of as an optical mouse wouldn't really ship until 1999, when Microsoft released the IntelliMouse. Some of that tech came to them via Hewlett-Packard, which had picked up DEC by way of Compaq, and some of those same Systems Research Center engineers had been brought in from the original mainstreamer of the mouse, PARC, when Bob Taylor started SRC. The LED sensor on the mouse stuck around. And thus ended the era of the mouse pad, once a hallmark of many a marketing give-away. Finger tracking devices came along in 1969 but were far too expensive to produce at the time. As capacitive sensing pads, or trackpads, came down in price and the technology matured, those began to replace the previous mouse-types of devices. The 1982 Apollo computers were the first to ship with a touchpad, but it wasn't until Synaptics launched the TouchPad in 1992 that they began to become common, showing up in 1995 on Apple laptops and then becoming ubiquitous over the coming years. In fact, the IBM ThinkPad and many others shipped laptops with little red nubs in the keyboard for people that didn't want to use the TouchPad for a while as well. Some advancements in the mouse didn't work out. Apple released the hockey puck shaped mouse in 1998, when they released the iMac. It was USB, which replaced the ADB interface. USB lasted. The shape of the mouse didn't. Apple would go to the monolithic surface mouse in 2000, go wireless in 2003, and then release the Mighty Mouse in 2005. The Mighty Mouse would have a capacitive touch sensor and, since people wanted to hear a click, would produce one with a little speaker. This also signified the beginning of Bluetooth as a means of connecting a mouse. Laptops began to replace desktops for many, and so the mouse itself isn't as dominant today. And with mobile and tablet computing, touchscreens rose to replace many uses for the mouse.
But even today, when I edit these podcasts, I often switch over to a mouse simply because other means of dragging around timelines aren't as graceful. And using a pen, as Engelbart's research from the 60s indicated, simply gets fatiguing. Whether or not it's always obvious, we have an underlying story we're often trying to tell with each of these episodes. We obviously love unbridled innovation and a relentless drive towards a technologically utopian multiverse. But taking a step back during that process and researching what people want means less work and faster adoption. Doug Engelbart was a lot of things, but one net-new point we'd like to make is that he was possibly the most innovative in harnessing user research to make sure that his innovations would last for decades to come. Today, we'd love to research every button and heat map and track eyeballs. But remembering, as he did, that our job is to augment human intellect, and that this is best done when we make our advances useful, helps keep us, and the forks in technology that branch off from our work, from having to backtrack decades of work in order to take the next jump forward. We believe in the reach of your innovations. So next time you're working on a project, save yourself time, save your code a little cyclomatic complexity, and save your users the frustration of having to relearn a whole new thing: research what you're going to do first. Because you never know. Something you engineer might end up being touched by nearly every human on the planet the way the mouse has. Thank you, Engelbart. And thank you to NASA and to Bob Taylor at ARPA for funding such important research. And thank you to Xerox PARC, for carrying the torch. And to Steve Jobs for making the mouse accessible to everyday humans. As with many an advance in computing, there are a lot of people that deserve a little bit of the credit. And thank you, listeners, for joining us for another episode of the History of Computing Podcast. We're so lucky to have you. Now stop consuming content and go change the world.
50 years later, both creators and keepers of the flame for the ‘Mother of All Demos’ reflect on how 1968 changed the world - for all of us.
December 9, 536 - Belisarius's troops occupied an abandoned Rome. December 9, 1688 - James II Stuart was defeated by his son-in-law and daughter at the Battle of Reading. December 9, 1803 - the 12th Amendment was adopted in the USA, and since then the president and vice president have been voted for separately. December 9, 1824 - the Battle of Ayacucho in Peru. December 9, 1922 - Gabriel Narutowicz was elected by the National Assembly as the first president of the Republic of Poland. December 9, 1960 - the first episode of Coronation Street aired. December 9, 1968 - the demonstration known as The Mother of All Demos was presented. December 9, 1990 - Wałęsa defeated Tymiński in the second round of the presidential election. December 9, 1991 - Radio Maryja began broadcasting. December 9, 1997 - it was ruled that Polish adjectival participles are always written together with the particle "nie".
I'm excited to be going back to the future with my guest today. His name is Leland Russell and he is well known for his work with GEO Group, helping entrepreneurs, non-profits and Fortune 500 companies with mission critical teams survive in times of rapid change. Leland saw how the world transformed through the 60's, 70's, 80's, 90's and 2000's. Today, he comes out of retirement to share his thoughts on what the future holds for businesses today. Leland started his career in the music business, became a CEO of a mid-sized company and presented as a keynote speaker on topics of leadership and technology. He co-wrote a book called Winning in Fast Times, which is available to be purchased online. If you've ever been feeling overwhelmed with Artificial Intelligence, Blockchain and other technological advancements, and you're wondering how you're going to manage your business in times of rapid change, then you must tune into this episode to learn how Leland Russell helps businesses stay calm in the face of a technological storm. You can follow Leland on Twitter: GEO_Leadership LinkedIn: https://www.linkedin.com/in/lelandrussell/ Website: https://geogroup.net/ Big Picture Future: https://news.microsoft.com/on-the-issues/tools-and-weapons/ Mother of All Demos: https://www.youtube.com/watch?v=B6rKUf9DWRI ========== To keep in touch with us, you can follow me on INSTAGRAM @bumperleads, the FACEBOOK Page called Bumper Leads, Twitter @JovanaTheGreat, and LinkedIn. Does Your Website Pass The 5-Second Test? Download this checklist and find out. You can find all episodes here: https://www.bumperleads.com/podcast SUBSCRIBE & REVIEW my podcast so I can continue delivering awesome shows (I do a happy dance around the house every time I see a 5-star review)
Howard Rheingold calls himself an Independent Instigator & Observer, and we are so glad he is. Since the 60s, when the first concepts of personal computing and connecting humankind through networks were merely ideas on an LSD trip, he’s been watching how technology and human minds interrelate. Howard brings us up to the macro level, discussing how modes of communication shape the way societies behave, or how expressing hope can save an entire generation. Then he takes us to the micro level, comparing Facebook to other platforms that better encourage self-expression, telling us about the ‘five essential literacies’ for surviving in the modern world, and giving a huge list of recommended reading—like any good professor would. LINKS MENTIONED IN THIS EPISODE (FOR FULL LIST OF LINKS, SEE OUR FULL WEBPAGE): Michael Pollan’s book: How to Change Your Mind Higher Creativity: Liberating the Unconscious for Breakthrough Insights by Howard Rheingold and Willis Harman The WELL Howard’s TED Talk: “The New Power of Collaboration” Alan Kay’s article in Scientific American “Microelectronics in the Personal Computer" Future Shock by Alvin Toffler Whole Earth Catalog (on Amazon and Wikipedia) and the Millennium Whole Earth Catalog Our Minds Have Been Hijacked by Our Phones. Tristan Harris Wants to Rescue Them from Wired.com Connected Learning Alliance What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry by John Markoff Doug Engelbart’s “Mother of All Demos” video— the world debut of personal and interactive computing in 1968! Howard’s Patreon profile MUSICAL INSPIRATION FOR THIS EPISODE ON SPOTIFY: "People Got to Be Free" by The Rascals ABOUT THIS PODCAST Stayin' Alive in Tech is an oral history of Silicon Valley and technology. Melinda Byerley, the host, is a 20-year veteran of Silicon Valley and the founder of Timeshare CMO, a digital marketing intelligence firm, based in San Francisco. We really appreciate your reviews, shares on social media, and your recommendations for future guests. And check out our Spotify playlist for all the songs we refer to on our show.
Panelists: Earl Evans (hosting), Paul Hagstrom, and Jack Nutting Topic: Most innovative / forward-looking technology in the 60s/70s/80s Mostly, an excuse to talk about Douglas Engelbart's "mother of all demos." But other topics are welcome too. Topic and feedback notes: The Mother of All Demos, Douglas Engelbart (1968) Retrobits Podcast episode reviewing What the Dormouse Said (covering events of the era) Lisa slideshow Merlin in a browser (and links to other IXO things) Byte 1982 article on IXO telecomputer Paul's Retrochallenge 2014 WW on IXO AmigaOS 3.1.4 Kano (Raspberry Pi-based programming kits for kids) Ad for the Portbubble terminal X-keys XK-16 Stick Pac-Man: The untold story of how we really played the game HN discussion on arcade cabinet restoration Retro Computing News: Kickstarter: Transparent toaster Mac case Beto O'Rourke's history with Cult of the Dead Cow Twitter thread from author with further details Creating new LCD displays for repairs 8-bit symphony Anti-M (pre-boot for protected Apple II disks) Upcoming Shows: Computer Conservation Society lecture series, Manchester and London VCF Pacific Northwest, Living Computers: Museum+Labs, Seattle, WA, Mar 23-24, 2019 VCF East, InfoAge Science Center, Wall, NJ, May 3-5, 2019 GORF (The Great Oz-stralian Retro-technology Festival), Melbourne, Apr 24-28, 2019 CoCoFest, Lombard, IL, May 4-5, 2019 WOzFest 12:00, Sydney, May 25, 2019 QFest 12, Brisbane, May 25, 2019 KansasFest, Kansas City, MO, Jul 15-21, 2019 Fujiama 2019, Lengenfeld, Germany, Aug 26-Sep 4, 2019 Vintage Computer-related Commercial: IBM 5100 (1977) Retro Computing Gift Idea: 1977 Pillow Auction Picks: Earl: Vintage Burroughs B 9974-5 Disk Pack Jack: eMate 300 with all its stuff Paul: IBM Microsoft Adventure New Old Computer See also: Electronika KR-03 See also: @Foone discussing one on Twitter ORA RedGuard Data Protection Osborne car adapter Maxell 10-minute cassette Quazon Quik-Link 300 CP+ Telecoupler Closing comments: Scooping the Soviets Feedback/Discussion: @rcrpodcast on Twitter Vintage Computer Forum RCR Podcast on Facebook Throwback Network Throwback Network on Facebook Intro / Closing Song: Back to Oz by John X - link Show audio files hosted by CyberEars Listen/Download:
Recorded 8th December 2018 This week a slightly different show as I am deeply privileged to have as my special guest David Acklam, not a name you may know, but he was part of the development team for the Global Positioning System. Something that we now take for granted but, as one of the team described it, a technology that went "from obscurity to ubiquity" and started out at what became known as "The Lonely Halls Meeting", which sounds more like a Peter Jackson fantasy epic than a world-changing technology meeting! You can watch the documentary featuring David and many more of the team and telling the story of GPS on Amazon Prime here GIVEAWAYS Skylum has gifted us 5x of Luminar 2019 and Aurora 2019 to give away! Send an email to essentialapple@sudomail.com mentioning Luminar or Aurora and the phrase I give in the show to enter. PLUS Listeners of this show can claim $10 off purchases of Luminar and/or Aurora HDR 2019 with discount code EssentialApple (If you buy Luminar 2018 you'll get all the 2019 updates for free. Learn more.) Also we have two licenses for BeLight Live Home 3D to give away... 1x iOS and 1x Mac. Email the show on essentialapple@sudomail.com mentioning Live Home 3D and the phrase I give out in the show. Winners for both will be announced on the Christmas Party Podcast which is recording on the 23rd December. Why not come and join the Slack community? You can now just click on this Slackroom Link to sign up and join in the chatter! We can now also be found on Spotify, Soundcloud and even YouTube. Essential Apple Recommended Services: 33mail.com – Never give out your real email address online again. Sudo – Get up to 9 "avatars" with email addresses, phone numbers and more to mask your online identity. Free for the first year and priced from $0.99 US / £2.50 UK per month thereafter... ProtonMail – End-to-end encrypted, open source, based in Switzerland. Prices start from FREE... what more can you ask? ProtonVPN – a VPN to go with it perhaps? Prices also starting from nothing! Fake Name Generator – So much more than names! Create whole identities (for free) with all the information you could ever need. Wire – Free for personal use, open source and end-to-end encrypted messenger and VoIP. Pinecast – a fabulous podcast hosting service with costs that start from nothing. Everyone should have a font manager... I really do believe that. So I highly recommend FontBase: All platforms. Professional features. Beautiful UI. Totally free. FontBase is the font manager of the new generation, built by designers, for designers. Essential Apple is not affiliated with or paid to promote any of these services... We recommend services that we use ourselves and feel are either unique or outstanding in their field, or in some cases are just the best value for money in our opinion. On this week's show: DAVID ACKLAM Retired Professional Engineer (PE inactive) BSEE, MS, University of Arizona Career Air Force Officer, retired (1966-1987) Systems integration and test engineer with Texas Instruments Defense Group and Raytheon Missile Systems, retired (1987-2002) Part of the development program for Global Positioning System Documentary available on Amazon Prime Community outreach volunteer since retirement in 2003 Member, University of Arizona Lunar and Planetary Laboratory Director's Advisory board 2006 to present.
Chairman of the UA Lunar and Planetary Laboratory Kuiper Circle Community Outreach committee OSIRIS-REx Ambassador OSIRIS-REx on NASA Friend and Docent at the Planetary Science Institute Member of the Tucson Mac User Group Tucson MUG on Facebook Staff reviewer with MyMac.com (http://mymac.com/) APPLE Apple released a clear case for the iPhone Xr – 9to5 Mac Apple released watchOS 5.1.2 with the ECG function for Apple Watch 4... Apple Watch user discovers A-fib heart issue with new ECG app – 9to5 Mac TECHNOLOGY Corning is building impossibly thin, flexible Gorilla Glass for foldable phones – BGR 50 years ago, Douglas Engelbart's ‘Mother of All Demos' changed personal technology forever – Mashable JUST A SNIPPET For things that are not worth more than a flypast Ask Siri “Why are fire trucks red?” for a fun answer WORTH-A-CHIRP / ESSENTIAL TIPS Faraday Bags for your kit MOSISO Keyboard Cover for Macbook Pro £6 UK or $7 US Non Plastic Beach offers alternatives to single use & disposable plastics to help you turn the tide on plastic waste, one sustainable product at a time. Force Apple's iOS apps to use Dropbox or any other storage – Cult of Mac Apple's iOS apps can store their files anywhere. You just have to know how to tell them. Ghostery Nemo's Hardware Store (49:53) Bonx Grip earpiece - $139 US each or $260 US for twin packs Social Media and Slack You can follow us on: Twitter / Slack / EssentialApple.com / Spotify / Soundcloud / YouTube / Facebook / Pinecast Also a big SHOUT OUT to the members of the Slack room without whom we wouldn't have half the stories we actually do – we thank you all for your contributions and engagement. You can always help us out with a few pennies by using our Amazon Affiliate Link so we get a tiny kickback on anything you buy after using it. If you really like the show that much and would like to make a regular donation then please consider joining our Patreon or using the Pinecast Tips Jar (which accepts one off or regular donations) And a HUGE thank you to the patrons who already do. This podcast is powered by Pinecast.
Panel: Dave Kimura Eric Berry Catherine Meyers Special Guests: Vishal Telangre In this episode of Ruby Rogues, the panelists talk to Vishal Telangre about his blog post entitled Ruby 2.5 prints backtrace and error message in reverse order. Vishal is working remotely for BigBinary where he works with Ruby on Rails, Kubernetes, and Elm. They talk about the power of blog posts at BigBinary, give suggestions for people wanting to get into blogging, and inspiration for blog posts. They also touch on his blog post, the changes to backtrace in Ruby 2.5, and more! In particular, we dive pretty deep on: Vishal intro BigBinary posts a lot of blogs Write about the experiences that they encounter while working Plan-free Fridays Is there any type of motivation or culture that adds to people wanting to provide so many blog posts? Suggestions for someone trying to get into blogging Vishal’s blog posts at BigBinary Start with a simple topic Your blog post doesn’t have to “change the world” Blogging about new things coming up Ruby 2.5 backtrace His blog post Changes to backtrace in Ruby 2.5 Makes debugging convenient Huge change for companies who do logs Effect of change from a developer standpoint Time saved Mixed sentiments on this change When this feature is enabled And much, much more! Links: Ruby 2.5 prints backtrace and error message in reverse order BigBinary Ruby on Rails Kubernetes Elm Vishal’s blog posts at BigBinary Vishal’s GitHub Vishaltelangre.com @suruwat Sponsors FreshBooks Loot Crate Picks: Dave Husky 20 Gallon Air Compressor Eric Developer conundrum Catherine LeetCode.com Marcella Hazan Pesto Recipe Vishal The Mother of All Demos by Douglas Engelbart
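For readers who want to see what the backtrace change discussed in that episode looks like in practice, here is a minimal illustrative sketch (assuming Ruby 2.5 or later with the error printed to a terminal; the filename, line numbers, and sample output below are hypothetical, not taken from the episode):

# boom.rb - a tiny script that raises from a nested call
def inner
  raise "boom"   # the error originates here
end

def outer
  inner          # inner is called from outer
end

outer            # entry point

# On Ruby 2.5+ writing to a TTY, the backtrace is printed oldest frame first
# and the error message appears last, closest to the prompt, roughly like:
#
#   Traceback (most recent call last):
#           2: from boom.rb:10:in `<main>'
#           1: from boom.rb:7:in `outer'
#   boom.rb:3:in `inner': boom (RuntimeError)
#
# Earlier Rubies (and Ruby 2.5 when stderr is redirected to a file or pipe)
# print the error message first with the frames listed underneath it, which
# is the "reverse order" change the blog post describes.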
The Valley comes of age as the center of innovation and personal computing. Doug Engelbart delivers the Mother of All Demos. Steve Jobs makes a fateful visit to Xerox PARC. On The WELL, people learn what it means to socialize online. Guests: Leslie Berlin, John Markoff, and Howard Rheingold.
Chris speaks with Daniel G. Siegel about his talk "the bullet hole misconception", covering topics such as how developers should challenge themselves and their preconceptions. The Future of Programming, Bret Victor http://worrydream.com/dbx/ The Mother of All Demos, Doug Engelbart http://www.dougengelbart.org/firsts/dougs-1968-demo.html Augmenting Human Intellect, Doug Engelbart http://www.dougengelbart.org/pubs/augment-3906.html The Surrender of Culture to Technology, Neil Postman https://www.youtube.com/watch?v=hlrv7DIHllE Neil Postman talks at Apple https://www.youtube.com/watch?v=QqxgCoHv_aE The bullet hole misconception https://www.dgsiegel.net/talks/the-bullet-hole-misconception The lost medium https://www.dgsiegel.net/talks/the-lost-medium Previous episodes - http://gregariousmammal.com/podcast Support the show - http://gregariousmammal.com/support --- Send in a voice message: https://anchor.fm/theweeklysqueak/message
Topics Leadership by example Picks Craig Clean Architecture (https://www.amazon.com/Clean-Architecture-Craftsmans-Software-Structure/dp/0134494164) by Robert “Uncle Bob” Martin Building Evolutionary Architectures (https://www.amazon.com/gp/product/1491986360/) by Neal Ford and Rebecca Parsons John The Mother of All Demos (https://youtu.be/r6iADUTlqwQ), presented by Douglas Engelbart in 1968 The Five Dysfunctions of a Team (https://smile.amazon.com/Five-Dysfunctions-Team-Leadership-Fable/dp/0787960756/ref=sr_1_sc_1?ie=UTF8&qid=1508296754&sr=8-1-spell&keywords=five+dystfunctions) Jason Liberating Structures (http://www.liberatingstructures.com) - helpful and well-designed frameworks for specific types of strategic conversations - references available on the web, plus a Mobile App and also in print. LIVE EVENT - Imposter Syndrome: Innovation Killer Among Us (https://www.eventbrite.com/e/imposter-syndrome-innovation-killer-among-us-tickets-38074391530) - Wednesday morning, November 8, 2017 in the Kansas City area - Join Billie Schuttpelz after the Lean Agile KC conference to recreate the audacious salon experience from Agile2017 and learn more about Imposter Syndrome and how it impacts all of us.
Recapping the previous episode, the pirates gave an update on the not-so-positive state of electronic voting in the country. But, so as not to leave the classic characters aside, they introduced us to Douglas Engelbart, the guy who invented the mouse. An engineer and inventor, he championed the link between the intuitive capabilities of the human mind and the processing capabilities of computers. The Mother of All Demos was one of his great milestones. To close, they said goodbye and farewell forever to one of the best private trackers for downloading torrents.
Recorded on January 19th, 2015 - Francis Crick, Nobel Prize-winning father of modern genetics, deduced the double-helix structure of DNA: may have been influenced by LSD. - Kary Mullis, inventor of PCR, a scientific breakthrough that accelerated the sequencing of the human genome: "I found it to be a mind-opening experience. It was certainly much more important than any courses I ever took. [...] What if I had not taken LSD ever; would I have still invented PCR? I don't know. I doubt it. I seriously doubt it." - Steve Jobs: “Taking LSD was a profound experience, one of the most important things in my life. LSD shows you that there’s another side to the coin, and you can’t remember it when it wears off, but you know it. It reinforced my sense of what was important—creating great things instead of making money, putting things back into the stream of history and of human consciousness as much as I could.” “When you grow up you tend to get told that the world is the way it is and your life is just to live your life inside the world. Try not to bash into the walls too much. Try to have a nice family life, have fun, save a little money. That's a very limited life. Life can be much broader once you discover one simple fact: Everything around you that you call life was made up by people that were no smarter than you. And you can change it, you can influence it… Once you learn that, you'll never be the same again.” - Steve Jobs Also: "Here's to the Crazy Ones" Douglas Engelbart, early computer scientist, presenter of the Mother of All Demos, had "two LSD experiences." - Kevin Herbert, early Cisco engineer: "When I'm on LSD and hearing something that's pure rhythm, it takes me to another world and into another brain state where I've stopped thinking and started knowing. It must be changing something about the internal communication in my brain." References: Interview with Patrick Lundborg: 60’s psych & garage guru, psychedelic culture scholar and author of the brilliant “Psychedelia” and “Acid Archives” books, discussed in Entheogen #003 What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry by John Markoff "Shaking one's snow globe" with LSD: Entheogen 002: Psychedelic Research Renaissance, Part 2