U.S. computer manufacturer 1957-1998
Kevin Mitnick was a legendary hacker. During the 1980s he broke into the systems of companies like Digital Equipment and, after violating his probation, spent more than two years as a fugitive. He served five years in prison after his arrest in 1995, an experience that reformed him; he went on to dedicate himself to digital education through conferences and work as a cybersecurity consultant. He died in 2023, leaving behind an important legacy in the history of hacking. Discover more curious stories on the National Geographic channel and on Disney+.
For the last 200 years, innovation and technology have produced dramatic increases in living standards and our quality of life. Yet today there is a widespread and growing belief that technology has become the root of all evils, with all sorts of claims being made that it destroys privacy, spreads misinformation, undermines trust and democracy, eliminates jobs, discriminates by race and gender, increases inequality, rips off the consumer, harms children, and even threatens the human race. This is quite a bill of indictment! But is any of it true? Bill's guests on this show, Rob Atkinson and David Moschella, believe this is mostly agenda-driven bunk and have written a persuasive book to prove it: "Technology Fears and Scapegoats: 40 Myths About Privacy, Jobs, AI, and Today's Innovation Economy."

Robert D. Atkinson is the founder and president of ITIF, the Information Technology and Innovation Foundation, and the author of many books, including "Innovation Economics: The Race for Global Advantage." David Moschella is a nonresident senior fellow at ITIF and was previously head of worldwide research for IDC, the largest market analysis firm in the information technology industry.

"America's always flourished more than anybody else in the world," declares Atkinson, "because we have had this underlying faith in innovation, in the future, in taking risks, in going forward into the unknown. And now, that's really at risk. People are saying things like, 'Wait a minute, we shouldn't deploy facial recognition because it's racially biased,' or that 'technology innovation has not improved the average worker's living standards.' Well, both those statements are wrong. They are 100% myth."

"As someone who grew up in the Boston area in the 60s and 70s," says Moschella, "Massachusetts was considered a dead economy with no future. And then this thing called the minicomputer was designed out of MIT and created companies like Digital Equipment, and Prime, and Wang, and all the others. And all of a sudden you had the so-called Massachusetts Miracle. People are forgetting these realities."

Some of the most damaging myths stem from a deep-seated rejection of the Western capitalist system. But to gain traction for this agenda, anti-capitalists must first convince voters that the current system is failing, and a top target is technology-driven innovation. "Also, what's happened is that some of the legitimate criticism of globalization has morphed into criticism of automation," says Moschella. "People didn't like globalization, but then said, 'Well, automation is really more the cause than globalization, and of course technology drives automation, so we can blame technology and automation for the problems globalization has created.'"

"I've actually heard members of Congress say, 'The pace of change is so rapid and we have to slow it down,'" worries Rob. "Now think about that. When has America ever said that? That the average person can't handle change. That it's too fast. We've got to slow things down."

In this episode we also take on the myth that the pace of technological change is accelerating. Compare the current era to the early 20th century, which saw the introduction of transformative technologies like electricity, radio, automobiles, and airplanes. The perception of rapid change today is often skewed by the digital revolution's visibility, but in reality, the physical and infrastructural advancements of the past were probably more transformative.
America has shifted its focus from delivering technological wonders to preventing "harmful" change. Once widely seen as a savior of humanity, technology is increasingly used as a scapegoat for just about every societal ill. But if we see innovation as a necessary force for good, with government's role as a constructive enabler, there will be thoughtful innovation policies and more innovation. If the dominant narrative is instead that technology is an out-of-control force for harm, there will be destructive policies and a stultifying future. Agree or disagree, this conversation wanders into some interesting waters and challenges a lot of today's conventional wisdom about technology.
Paul Nerger is a seasoned industry expert with a diverse background spanning technology, business management, and academia. He holds a bachelor's degree in computer science from Washington State University with a focus in artificial intelligence. His academic journey continued with coursework at Templeton College at the University of Oxford, as well as serving as a lecturer at MIT's Applied Innovation Institute and the University of California, Berkeley's Center for Entrepreneurship and Technology. Paul's entrepreneurial spirit shines through his roles as a co-founder and CTO of Happtique, Inc., AppCentral, dotMobi, and Argogroup, all of which were sold to larger companies. Paul also has experience at larger companies, including senior positions at Digital Equipment, Sequent Computer Systems, Silicon Graphics, and Informix Software. Currently, as a VP at Blue Prism, Paul leads a team dedicated to fostering collaboration and innovation within the RPA ecosystem, including how to apply advanced AI to solving business problems. Paul has served on the board of directors and technical advisory boards of several companies, including the Boeing Company and British Aerospace. He served as a U.S. Air Force officer and the Air Defense Advisor to NATO. With a rich tapestry of experiences encompassing academia, entrepreneurship, and corporate leadership, Paul continues to drive innovation and empower organizations to thrive in the ever-evolving landscape of technology and business. Paul is a member of the Royal Air Force Club and a Fellow of the Royal Institute of Arts in London.
MORNING DEVOTIONAL FOR TEENS 2022, "A LEAP IN TIME." Narrated by Doriany Sánchez, from Peru. A courtesy of DR'Ministries and Canaan Seventh-Day Adventist Church. December 19: Personal Computers.

"But you, Daniel, shut up the words and seal the book until the time of the end. Many shall run to and fro, and knowledge shall increase" (Daniel 12:4).

Bill Gates and Paul Allen of Microsoft got their start in the computer industry working from home. Strange as it may seem, they delivered their first personal computer program to the manufacturer on something resembling an audio cassette. The personal computer certainly had humble beginnings, but great things often start that way.

Here are some (rather mistaken) comments people made about the personal computer in its early days:

"Computers in the future may weigh no more than 1.5 tons." (Popular Mechanics, 1949)
"I think there is a world market for maybe five computers." (the president of IBM, 1943)
"The microchip? But what... is it good for?" (an IBM engineer, 1968)
"There is no reason anyone would want a computer in their home." (Ken Olsen, president and founder of Digital Equipment, 1977)

How far from the truth these people were! Computers have evolved into what they are today: the greatest tool for accessing knowledge since Gutenberg's printing press. From the Commodore 64 and its cassette tapes to floppy drives; from 3.5-inch diskettes to USB flash drives; from email to digital registrations over the airwaves; computers thoroughly dominate society. Almost every home has at least one, to say nothing of industry and the robotic technology taking over our culture.

Today we also use computers to carry the gospel of Jesus to people who do not know it. Bible studies are given online, broadcasts go out by satellite, and all of this speeds the arrival of the gospel to every home. Knowledge has increased, just as Daniel prophesied, and the computer has been part of it. Even so, Inspiration tells us that the last great message of salvation to the world will spread through individual, one-on-one evangelism. That means that, however marvelous computers may be, they will never have that personal, face-to-face touch.
In this episode, we went to Stockholm to meet Sören Enholm, CEO of TCO Certified.
Taiwan is a country about half the size of Maine with about 17 times the population of that state. Taiwan sits just over a hundred miles off the coast of mainland China. It's home to some 23 and a half million humans, roughly halfway between the populations of Texas and Florida, or, for the Europeans, a few more than live in Romania. Taiwan was connected to mainland China by a land bridge in the Late Pleistocene, and human remains have been found dating back 20,000 to 30,000 years. About half a million people on the island nation are aboriginal, or their ancestors are from there, but the population became more and more Chinese in recent centuries. Taiwan had not been part of China during the earlier dynastic ages but had been used by dynasties in exile to attack one another, and so became a part of the Chinese empire in the 1600s. Taiwan was won by Japan in the late 1800s and held by the Japanese until World War II.

During that time, a civil war had raged on the mainland of China, with the Republic of China eventually formed as the replacement government for the Qing dynasty following a bloody period of turf battles by warlords and then civil war. Taiwan was under martial law from the time the pre-communist government of China retreated there during the exit of the Nationalists from mainland China in the 1940s until the late 1980s. During that time, just like the exiled Han dynasty, they orchestrated war from afar. They stopped fighting, much like the Koreans, but have still never signed a peace treaty. And so large parts of the world remained in stalemate. As the years became decades, Taiwan, or the Republic of China as they still call themselves, has always had an unsteady relationship with the People's Republic of China, or China as most in the US call it. The Western world recognized the Republic of China, and the Soviet- and Chinese-aligned countries recognized the mainland government. US President Richard Nixon visited mainland China in 1972 to reopen relations with the communist government there, and relations slowly improved. The early 1970s was a time when much of the world still recognized the ruling government of Taiwan as the official Chinese government, and there were proxy wars the two continued to fight. The Taiwanese and Chinese still aren't besties. There are deep scars and propaganda that keep relations from being repaired.

Before World War II, the Japanese also invaded Hong Kong. During the occupation there, Morris Chang's family became displaced and moved between a few cities during his teens before he moved to Boston to go to Harvard and then MIT, where he did everything to get his PhD except defend his thesis. He then went to work for Sylvania Semiconductor and then Texas Instruments, finally getting his PhD from Stanford in 1964. He became a Vice President at TI and helped build an early semiconductor designer and foundry relationship when TI designed a chip and IBM manufactured it. The Premier of Taiwan at the time, Sun Yun-suan, played a central role in Taiwan's transformation from an agrarian economy to a large exporter. His biggest win was recruiting Chang to move to Taiwan and found TSMC, or Taiwan Semiconductor Manufacturing Company. Some of this might sound familiar, as it mirrors stories from companies like Samsung in South Korea.
In short: Japanese imperialism, democracies versus communists, then rapid economic development as a massive manufacturing powerhouse, in large part due to the fact that semiconductor designers were split from semiconductor foundries, or where chips are actually made. In this case, a former Chinese national was recruited to return as founder, and he led TSMC for 31 years before he retired in 2018. Chang could see from his time with TI that more and more companies would design chips for their needs and outsource manufacturing. They worked with Texas Instruments, Intel, AMD, NXP, Marvell, MediaTek, and ARM, and then came the big success when they started to make the Apple chips. The company started down that path in 2011 with the A5 and A6 SoCs for iPhone and iPad on trial runs, but picked up steam with the A8 and A9 through the A14 and the Intel replacement for the Mac, the M1. They now sit on a half-trillion-US-dollar market cap and are the largest company in Taiwan. For perspective, their market cap only trails the GDP of the whole country by a few billion dollars.

TSMC is also a foundry Nvidia uses. As of the time of this writing, Nvidia is the 8th largest semiconductor company in the world. We've already covered Broadcom, Qualcomm, Micron, Samsung, and Intel. Nvidia is a fabless semiconductor company and so designs chips that vendors like TSMC manufacture. Nvidia was founded by Jensen Huang, Chris Malachowsky, and Curtis Priem in 1993 in Santa Clara, California (although it is now incorporated in Delaware). Not all who leave the country they were born in due to war, or during times of war, return. Huang was born in Taiwan and his family moved to the US right around the time Nixon re-established relations with mainland China. Huang then went to grad school at Stanford before he became a CPU designer at AMD and a director at LSI Logic, so he had experience as a do-er, a manager, and a manager's manager. He was joined by Chris Malachowsky and Curtis Priem, who had designed the IBM Professional Graphics Adapter and then the GX graphics chip at Sun. They saw the graphical interfaces of the Mac, Windows, and Amiga OS, they saw the games one could play on those machines, and they thought graphics cards would be the next wave of computing. And so for a long time, Nvidia managed to avoid competition with other chip makers with a focus on graphics. That initially meant gaming and higher-end video production but has expanded into much more, like parallel programming and even cryptocurrency mining.

They were more concerned about the next version of the idea, or chip, or company, and used NV in the naming convention for their files. When it came time to name the company, they looked up words that started with those letters, which of course don't exist, so they instead chose Invidia, or Nvidia for short, as it's Latin for envy: what everyone who saw those sweet graphics the cards rendered would feel. They raised $20 million in funding and got to work. First with SGS-Thomson Microelectronics in 1994 to manufacture what they were calling a graphical user interface accelerator that they packaged on a single chip. They worked with Diamond Multimedia Systems to install the chips onto the boards. In 1995 they released the NV1. The PCI card was sold as the Diamond Edge 3D and came with a 2D/3D graphics core with quadratic texture mapping. It was screaming fast, and Virtua Fighter from Sega was ported to the platform. DirectX had come in 1995, so Nvidia released DirectX drivers that supported Direct3D, the API that Microsoft developed to render 3D graphics.
This was a time when 3D was on the rise for consoles and desktops. Nvidia timed it perfectly and reaped the rewards when they hit a million sold in the first four months for the RIVA, a 128-bit 3D processor that got used as an OEM part in 1997. Then came the RIVA ZX and the RIVA TNT for multi-texture 3D processing in 1998. They also needed more manufacturing support at this point and entered into a strategic partnership with TSMC to manufacture their boards. A lot of vendors had a good amount of success in their niches. By the late 1990s there were companies who made memory, or the survivors of the DRAM industry after ongoing price-dumping issues. There were companies that made central processors, like Intel. Nvidia led the charge for a new type of chip, the GPU.

They invented the GPU in 1999 when they released the GeForce 256. This was the first single-chip GPU processor. That means integrated lighting, triangle setup, and rendering, like the old math coprocessor but for video. Millions of polygons could be drawn on screens every second. They also released the Quadro Pro GPU for professional graphics and went public in 1999 at an IPO of $12 per share. Nvidia used some of the funds from the IPO to scale operations, organically and inorganically. In 2000 they released the GeForce2 Go for laptops and acquired 3dfx, closing deals to get their 3D chips into devices from OEM manufacturers who made PCs and into the new Microsoft Xbox. By 2001 they hit $1 billion in revenues and released the GeForce 3 with a programmable GPU, using APIs to make their GPU a platform. They also released the nForce integrated graphics, and so by 2002 hit 100 million processors out on the market. They acquired MediaQ in 2003 and partnered with game designer Blizzard to make Warcraft. They continued their success in the console market when the GeForce platform was used in the PlayStation 3 in 2005, and by 2006 they had sold half a billion processors. They also added the CUDA architecture that year to put a general-purpose GPU on the market, and acquired Hybrid Graphics, who developed 2D and 3D embedded software for mobile devices.

In 2008 they went beyond the consoles and PCs when Tesla used their GPUs in cars. They also acquired PortalPlayer, who supplied semiconductors and software for personal media players, and launched the Tegra mobile processor to get into the exploding mobile market. There were more acquisitions in 2008, but a huge win came when the GeForce 9400M was put into Apple MacBooks. Then more, smaller chips in 2009, when the Tegra processors were used in Android devices. They also continued to expand how GPUs were used. They showed up in ultrasound machines and, in 2010, in Audis. By then they had the Tianhe-1A ready to go, with their GPUs showing up in supercomputers, and they had released Optimus. All these types of devices that could use a GPU meant they hit a billion processors sold in 2011, which is when they went dual-core with the Tegra 2 mobile processor and entered into cross-licensing deals with Intel. At this point TSMC was able to pack more and more transistors into smaller and smaller places. This was a big year for larger jobs on the platform. By 2012, Nvidia had the Kepler-based GPUs out, and their chips were used in the Titan supercomputer. They also released GRID, a virtualized GPU for cloud processing. It wasn't all about large-scale computing efforts. The Tegra 3 and GTX 600 came out in 2012 as well. Then in 2013 came the Tegra 4, a quad-core mobile processor; a 4G LTE mobile processor; Nvidia Shield for portable gaming; the GTX Titan; and a GRID appliance.
In 2014 came the Tegra K1 with 192 cores, the Shield tablet, and Maxwell. In 2015 came the Tegra X1 with 256 cores for deep learning, the Titan X and Jetson TX1 for smart machines, and Nvidia Drive for autonomous vehicles. They continued that deep learning work with an appliance in 2016, the DGX-1. The Drive got an update in the form of the PX 2 for in-vehicle AI. By then, they were a 20-year-old company working on the 11th generation of the GPU, and most CPU architectures had dedicated cores for machine learning options of various types. 2017 brought Volta and the Jetson TX2, and SHIELD was ported over to the Google Assistant. 2018 brought the Turing GPU architecture, the DGX-2, AGX Xavier, and Clara. 2019 brought AGX Orin for robots and autonomous or semi-autonomous piloting of various types of vehicles. They also made the Jetson Nano and Xavier, and EGX for edge computing. At this point there were plenty of people using the GPUs to mine hashes for various blockchains, like with cryptocurrencies, and ARM had finally given Intel a run for its money, with designs from the ARM alliance showing up in everything but Windows devices (so Apple and Android). So they tried to buy ARM from SoftBank in 2020. That deal eventually fell through, but it would have been an $8 billion windfall for SoftBank, since they paid $32 billion for ARM in 2016. We probably don't need more consolidation in the CPU sector. Standardization, yes. Some of Nvidia's top competitors include Samsung, AMD, Intel, Qualcomm, and even companies like Apple who make their own CPUs (but not their own GPUs as of the time of this writing). In their niche they can still make well over $15 billion a year.

The invention of the MOSFET came from immigrants Mohamed Atalla, originally from Egypt, and Dawon Kahng, originally from Seoul, South Korea. Kahng was born in Korea in 1931 but immigrated to the US in 1955 to get his PhD at The Ohio State University and then went to work for Bell Labs, where he and Atalla invented the MOSFET, and where Kahng retired. The MOSFET was an important step on the way to a microchip. That microchip market, with companies like Fairchild Semiconductor, Intel, IBM, Control Data, and Digital Equipment, saw a lot of chip designers who maybe had their chips knocked off, either legally in a clean room or illegally outside of a clean room. Some of those ended in legal action, some didn't. But the fact that factories overseas could reproduce chips was a huge part of the movement that came next, which was that companies started to think about whether they could just design chips and let someone else make them. That was in an era of increasing labor outsourcing, when factories could build cars offshore, and the foundry movement was born: companies that just make chips for those who design them. As we have covered in this section and many others, many of the people who work on these kinds of projects moved to the United States from foreign lands in search of a better life. That might have been to flee Europe or Asian theaters of Cold War jackassery, or might have been a civil war like in Korea or Taiwan. They had contacts and were able to work with places to outsource to, and this happened at the same time that Hong Kong, Singapore, South Korea, and Taiwan became safe and largely free of violence.
And so the Four Asian Tiger economies exploded, fueled by exports and a rapid period of industrialization that began in the 1960s and continues through to today, with companies like TSMC, a pure-play foundry, or Samsung, a mixed foundry, aided by companies like Nvidia who continue to effectively outsource their manufacturing operations to companies in the region. At least, while it's safe to do so. We certainly hope the entire world becomes safe. But it currently is not. There are currently nearly a million Rohingya refugees fleeing war in Myanmar. Over 3.5 million have fled the violence in Ukraine. 6.7 million have fled Syria. 2.7 million have left Afghanistan. Over 3 million are displaced between Sudan and South Sudan. Over 900,000 have fled Somalia. Before Ukrainian refugees fled, mostly to Eastern European countries, the world's refugees had mainly settled in Turkey, Jordan, Lebanon, Pakistan, Uganda, Germany, Iran, and Ethiopia. Comparatively few settled in the largest countries in the world: China, India, or the United States. It took decades for the children of those who moved, or who sent their children abroad for a better life, to be able to find that better life. But we hope that history teaches us to get there faster, for the benefit of all.
"We were actually founded by two folks out of the engineering department of Digital Equipment. They were phenomenally big in computers early on, and had a very large manufacturing facility in Augusta just 10 miles from here. Throughout our history, we have had one big event that really moved us along, and early on at AMI was Stanley Toolworks. They're the Stanley tools that we still hear about and they were moving into electronics, and they had an electronic stapler. That couldn't just have a manual switch, because if it did, you'd click the switch and it would shoot off 15 staples all at once. So it needed a little electronic switch to just allow that to fire once and AMI landed that contract. At some point they decided to do a reverse auction, which was in effect, how low will you go to build our product and they took quotes to Asia, this was just starting to happen then and people were moving products offshore. Sure enough on something like 15 cents, we lost the product line to a manufacturer in Asia. So we had to start working again and we happen to be working with a smaller company in Massachusetts called Software House and they were designing devices that would control access to buildings. So instead of having a key, you'd go up and slide a wave a car would let you in," said Kim Vandermeulen, CFO of Alternative Manufacturing Inc. (AMI).AMI has been building chipboards for various tech companies across the globe for decades. Today they have helped companies with the transportation of COVID vaccines by tracking the temperature and other important factors within the crate that the vaccines are being transported in. But the one important thing AMI cares about is supporting start-up companies."What's unique about AMI, is that we're not afraid of the small guy, we're not afraid of a startup. We'll build one [chip], I'll build 10 [chips]. A lot of the bigger companies, they don't want that in their process. Our process is set up uniquely where we have modeled call 11. Many lines, most companies want volume, so they can set up their line run one pot, 10,000 to 100,000 parts, that's not us, we're the little train that could so we have the ability to use those small guys and we have the ability to take it to the next level of pre-production and production when it gets to the super-high volumes, I want to buy 100,000 or a million of something, it tends to fall out of our hands, we try to hand it off to a local partner try. But a lot of times they'll take it offshore to Asia, basically, for cost-driving purposes to drive as much of the cost out of it. I will say that I think the number is 85% of the clients who have started down the path with us are still with us," said Jim Barry, VP of Sales & Marketing.One start-up that is mentioned in the interview is called iTell which gives seniors the ability to not forget their walkers when traveling about their day. This gadget provides a solution to reducing falls and medical issues in the senior community.Tune in to learn more about AMI's history, how they have helped not only start-ups but international companies with their tech needs, and how Kim is not only a maker focused on tech but of delicious beverages.To learn more about AMI please visit their website.
Check out this conversation with Laury D'Oliveira. This high-energy Director of Workplace Experience shared her story, from working at Digital Equipment as a technician, to moving into office management, to AP/AR and her current role. This travel lover is always thinking about what else she can do to help make her company better, a trait she got from her parents encouraging her to conquer the world. She talks about being self-aware and shares great advice for folks at all places in their career. Enjoy! Check out Workhuman. Connect with Laury on LinkedIn.
Imagine a game that begins with a printout that reads: You are standing at the end of a road before a small brick building. Around you is a forest. A small stream flows out of the building and down a gully. In the distance there is a tall gleaming white tower. Now imagine typing some information into a teletype and then reading the next printout. And then another. A trail of paper lists your every move. This is interactive gaming in the 1970s. Later versions had a monitor, so a screen could just show a cursor, and the player needed to know what to type. Type N and hit enter and the player travels north. "Search" doesn't work but "look" does. "Take water" works, as does "Drink water," but it takes hours to find dwarves and dragons and figure out how to battle or escape. This is one of the earliest games we played, and it was marvelous. The game was called Colossal Cave Adventure, and it was one of the first conversational adventure games. Many came after it in the 70s and 80s, in an era before good graphics were feasible. But the imagination was strong.

The Oregon Trail was written before it, in 1971, and Trek73 came in 1973, both written for HP minicomputers. Dungeon was written in 1975 for a PDP-10. The author, Don Daglow, went on to work on games like Utopia and Neverwinter Nights. Another game called Dungeon showed up in 1975 as well, on the PLATO network at the University of Illinois at Urbana-Champaign. As the computer monitor spread, so spread games.

William Crowther got his degree in physics at MIT and then went to work at Bolt Beranek and Newman during the early days of the ARPANET. He was on the IMP team, the people who developed the Interface Message Processor, the first nodes of the packet-switching ARPANET, the ancestor of the Internet. They were long hours, but when he wasn't working, he and his wife Pat explored caves. She was a programmer as well. Or he played the new Dungeons & Dragons game that was popular with other programmers. The two got divorced in 1975, and like many suddenly single fathers he searched for something for his daughters to do when they were at the house. Crowther combined exploring caves, Dungeons & Dragons, and FORTRAN to get Colossal Cave Adventure, often just called Adventure. And since he worked on the ARPANET, the game found its way out onto the growing computer network. Crowther moved to Palo Alto and went to work for Xerox PARC in 1976 before going back to BBN and eventually retiring from Cisco.

Crowther loosely based the game mechanics on ELIZA, the natural language processing work done by Joseph Weizenbaum at the MIT Artificial Intelligence Laboratory in the 1960s. That had been a project to show how computers could appear to understand text typed to them. It was most notably used in tests to have a computer provide therapy sessions. And writing software for the kids, or gaming, can be therapeutic as well. As can replaying happier times. Crowther had explored Mammoth Cave National Park in Kentucky in the early 1970s. The locations in the game follow his notes about the caves, and the player explores the area using natural language while the computer looks for commands in what was entered. The original FORTRAN code for the PDP-10 he had at his disposal at BBN ran about 700 lines. When he was done he went off on vacation, and the game spread. Programmers in that era just shared code. Source needed to be recompiled for different computers, so they had to. Another programmer was Don Woods, who also used a PDP-10.
He went to Princeton in the 1970s and was working at the Stanford AI Lab, or SAIL, at the time. He came across the game, asked Crowther if it would be OK to add a few features, and did. His version got distributed through DECUS, or the Digital Equipment Computer Users Society. A lot of people went there for software at the time. The game was up to 3,000 lines of code when it left Woods. The adventurer could now enter the mysterious cave in search of the hidden treasures. The concept of the computer as a narrator began with Colossal Cave Adventure and is now widely used, although we now have vast scenery rendered and can point and click where we want to go, so we don't need to type commands as often. The interpreter looked for commands like "move," "interact" with other characters, "get" items for the inventory, and so on. Woods went further and added more words and the ability to interpret punctuation as well. He also added over a thousand lines of text used to identify and describe the 40 locations. Woods continued to update that game until the mid-1990s.

James Gillogly of RAND ported the code to C so it would run on the newer Unix architecture in 1977, and it's still part of many a BSD distribution. Microsoft published a version of Adventure in 1979 that was distributed for the Apple II and TRS-80, and followed that up in 1981 with a version for Microsoft DOS, or MS-DOS. Adventure was now a commercial product. Kevin Black wrote a version for IBM PCs. Peter Gerrard ported it to the Amiga. Bob Supnik rose to Vice President at Digital Equipment, not because he ported the game, but it didn't hurt. And throughout the 1980s, the game spread to other devices as well. Peter Gerrard implemented the version for the Tandy 1000. The Original Adventure was a version that came out of Aventuras AD in Spain. They gave it one of the biggest updates of all.

Colossal Cave Adventure was never forgotten, even though it was eventually eclipsed by Zork. Zork came along in 1977 and Adventureland in 1979. Ken and Roberta Williams played the game in 1979. Ken had bounced around the computer industry for a while and had a teletype terminal at home when he came across Colossal Cave Adventure in 1979. The two became transfixed and opened their own company to make the game they released the next year, called Mystery House. And the text adventure genre moved to a new level when they sold 15,000 copies and it became the first hit. Rogue and others followed, increasingly interactive, until fully immersive graphical games replaced the adventure genre in general. That process began when Warren Robinett of Atari created the 1980 game Adventure. Robinett saw Colossal Cave Adventure when he visited the Stanford Artificial Intelligence Laboratory in 1977. He was inspired into a life of programming by a professor he had in college, Ken Thompson, who was teaching while on sabbatical from Bell Labs. Bell Labs is where Thompson, with Dennis Ritchie and one of the most amazing teams of programmers ever assembled, gave the world Unix and the C programming language. The Atari Adventure went on to sell over a million copies, and the genre of fantasy action-adventure games moved from text to video.
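The parser described above is essentially a two-word verb-noun interpreter with a synonym table. As a rough illustration only, here is a tiny sketch of that idea in Python rather than Crowther's and Woods's FORTRAN; the vocabulary, rooms, and messages are invented for the example and are not the game's real data.

    VERBS = {"go": "go", "n": "go", "take": "take", "get": "take", "drink": "drink", "look": "look"}
    ROOMS = {
        "road": {"description": "You are standing at the end of a road before a small brick building.",
                 "exits": {"north": "building"}},
        "building": {"description": "You are inside a small brick building, a wellhouse for a spring.",
                     "exits": {"south": "road"}},
    }

    def parse(line):
        """Split raw input into a (verb, noun) pair, mapping synonyms like GET onto TAKE."""
        words = line.lower().split()
        if not words:
            return None, None
        verb = VERBS.get(words[0])                   # unknown verbs come back as None
        noun = words[1] if len(words) > 1 else None
        if words[0] == "n":                          # single-letter movement shortcut
            noun = "north"
        return verb, noun

    def step(room, line, inventory):
        """Apply one command to the game state and return (new_room, message)."""
        verb, noun = parse(line)
        if verb == "go" and noun in ROOMS[room]["exits"]:
            room = ROOMS[room]["exits"][noun]
            return room, ROOMS[room]["description"]
        if verb == "look":
            return room, ROOMS[room]["description"]
        if verb == "take" and noun:
            inventory.append(noun)
            return room, "You now have the " + noun + "."
        if verb == "drink" and noun == "water":
            return room, "The water tastes strongly of minerals."
        return room, "I don't understand that."      # "search" fails, "look" works

    inventory, room = [], "road"
    for command in ["search", "look", "n", "take water", "drink water"]:
        room, message = step(room, command, inventory)
        print("> " + command)
        print(message)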
The first operating systems as we might think of them today (or at least anything beyond a basic task manager) shipped in the form of Multics in 1969. Some of the people who worked on that then helped create Unix at Bell Labs in 1971. Throughout the 1970s and 1980s, Unix flowed to education, research, and corporate environments through minicomputers, and many in those environments thought a flavor of BSD, or the Berkeley Software Distribution, might become the operating system of choice on microcomputers. But the microcomputer movement had a whole other plan, if only in spite of the elder minicomputers. Apple DOS was created in 1978, in a time when most companies who made computers had to make their own DOS as well, if only so software developers could build disks capable of booting the machines. Microsoft created their Disk Operating System, or MS-DOS, in 1981. They followed with Windows 1 to sit on top of MS-DOS in 1985, which was built in Intel 8086 assembler and called operating system services via interrupts. That led to programmers poking directly at memory addresses and writing code that assumed a single-user operating system. Then came Windows 2 in 1987 and Windows 3 in 1992, and one of the most anticipated operating systems of all time was released in 1995 with Windows 95. 95 turned into 98, and then Millennium Edition in 2000.

But in the meantime, Microsoft began work on another generation of operating systems based on a fusion of ideas between work they were doing with IBM, work architects had done at Digital Equipment Corporation (DEC), and rethinking all of it with modern foundations of APIs and layers of security sitting atop a kernel. Microsoft worked on OS/2 with IBM from 1985 to 1989. This was to be the IBM-blessed successor for the personal computer. But IBM was losing control of the PC market with the rise of cloned IBM architectures. IBM was also big and corporate, and the small, fledgling Microsoft was able to move quicker. Really small companies that find success often don't mesh well with really big companies that have layers of bureaucracy. The people Microsoft originally worked with were nimble and moved quickly. The ones presiding over the massive sales and go-to-market efforts and the explosion in engineering team size were back to the old IBM. OS/2 had APIs for most everything the computer could do. This meant that programmers weren't just calling assembly any time they wanted and invading whatever memory addresses they wanted. They also wanted preemptive multitasking and threading. And a file system, since by then computers had internal hard drives.

The Microsoft and IBM relationship fell apart and Microsoft decided to go their own way. Microsoft realized that DOS was old and that building on top of DOS was going to be a big, big problem someday. Windows 3 was closer, as was 95, so they continued on with that plan. But they started something similar to what we'd call a fork of OS/2 today. So Gates went out to recruit the best in the industry. He hired Dave Cutler from Digital Equipment to take on the architecture of the new operating system. Cutler had worked on the VMS operating system and helped lead efforts at DEC for a next-generation operating system they called MICA. And that moment began the march towards a new operating system called NT, which borrowed much of the best from VMS, Microsoft Windows, and OS/2, and had little baggage.
Microsoft was supposed to make version 3 of OS/2, but NT OS/2 3.0 would become just Windows NT when Microsoft stopped developing on OS/2. It took 12 years, because, um, they had a loooooot of customers after the wild success of first Windows 3 and then Windows 95, but eventually Cutler and team's NT would replace all other operating systems in the family with the release of Windows 2000. Cutler wanted to escape the confines of what was by then the second largest computing company in the world. Cutler had worked on VMS and RSX-11M before he got to Microsoft. There were constant turf battles and arguments about microkernels and system architecture, and meetings weren't always conducive to actually shipping code. So Cutler went somewhere he could. At least, so long as they kept IBM at bay. Cutler brought some of the team from Digital with him and they got to work on that next generation of operating systems in 1988.

They sat down to decide what they wanted to build, using the NT OS/2 operating system they had as a starting point. Microsoft had sold Xenix, and the team knew about most every operating system on the market at the time. They wanted a multi-user environment like a Unix. They wanted programming APIs, especially for networking, but different than what BSD had. In fact, many of the paths and structures of networking commands in Windows still harken back to emulating those structures. The system would be slow on the 8086 processor, but ever since the days of Xerox PARC, everyone knew Moore's Law was real and that processors would double in speed every other year. Especially since Moore was still at Intel and could make his law remain true with the 286 and 386 chips in the pipeline. They also wanted the operating system to be portable, since IBM had selected the Intel CPU but there were plenty of other CPU architectures out there as well.

The original name for NT was to be OS/2 3.0. But the IBM and Microsoft relationship fell apart and the two companies took their operating systems in different directions. OS/2 went the direction of Warp and IBM never recovered. NT went in a direction where some ideas came over from Windows 95 or 3.1, but mostly the team just added layers of APIs and focused on making NT a fully 32-bit version of Windows that could be ported to other platforms, including ARM, PowerPC, and the DEC Alpha that Cutler had exposure to from his days at Digital. The name became Windows NT, and NT began with version 3, as it was in fact the third installment of OS/2. The team began with Cutler and a few others, grew to eight, and by the time it finally shipped as NT 3.1 in 1993 there were a few hundred people working on the project. Where Windows 95 became the mass-marketed operating system, NT took lessons learned from the Unix, IBM mainframe, and VMS worlds and packed them into an operating system that could run on a corporate desktop computer, as microcomputers were called by then. The project cost $150 million, about the same as the first iPhone. It was a rough start. But that core team and those who followed did what Apple couldn't, in a time when a missing modern operating system nearly put Apple out of business. Cutler inspired, good managers drove teams forward, some bad managers left, other bad managers stayed, and in an almost agile development environment they managed to break through the conflicts and ship an operating system that didn't actually seem like it was built by a committee. Bill Gates knew the market and was patient enough to let NT 3 mature.
They took parts of OS/2, like LAN Manager. They took parts of Unix, like ping. But those were at the application level. The microkernel was the most important part. And that was a small core team, like it always is. The first version they shipped to the public was Windows NT 3.1. The salespeople often found it easiest to say that NT was the business-oriented operating system. Over time, the Windows NT series was slowly enlarged to become the company's general-purpose OS product line for all PCs, and thus Microsoft abandoned the Windows 9x family, which might or might not have a lot to do with the poor reviews Millennium Edition had. Other aspects of the application layer the original team didn't do much with included the GUI, which was much more similar to Windows 3.x. But based on great APIs they were able to move faster than most, especially in that era when Unix was in weird legal territory, changing hands from Bell to Novell, and BSD was also in dubious legal territory. The Linux kernel had been written in 1991 but wasn't yet a desktop-class operating system. So the remaining choices most businesses considered were really Mac, which had serious operating system issues at the time and seemed to lack a vision since Steve Jobs left the company, or Windows.

Windows NT 3.5 was introduced in 1994, followed by 3.51 a year later. During those releases they shored up access control lists for files, functions, and services, services being similar in nearly every way to a process in Unix. It sported a TCP/IP network stack, but also NetBIOS for locating computers to establish a share, and a file sharing stack in LAN Manager based on the Server Message Block, or SMB, protocol that Barry Feigenbaum wrote at IBM in 1983 to turn a DOS computer into a file server. Over the years, Microsoft and 3Com added additional functionality, and Microsoft later built out the full directory stack, with LDAP (out of the University of Michigan) as a backend and Kerberos (out of MIT) to provide single sign-on services. 3.51 also brought a lot of user-mode components from Windows 95. That included the Windows 95 common control library, which included the rich edit control, and a number of tools for developers. NT could already run DOS software; now they were getting it to run Windows 95 software without sacrificing the security of the operating system where possible. It kinda' looked like a slightly more boring version of 95. And some of the features were a little harder to use, like configuring a SCSI driver to get a tape drive to work. But they got the ability to run Office 95, and it was the last version that ran the old Program Manager graphical interface.

Cutler had been joined by Moshe Dunie, who led the management side of NT 3.1 through NT 4 and became the VP of the Windows Operating System Division, so he also had responsibility for Windows 98 and 2000. For perspective, that operating system group grew to include 3,000 badged Microsoft employees and about half that number of contractors. Mark Lucovsky and Lou Perazzoli joined from Digital. Jim Allchin came in from Banyan Vines. Windows NT 4.0 was released in 1996, with a GUI very similar to Windows 95. NT 4 became the workhorse of the field that emerged for large deployments of computers, what we now refer to as enterprise computing. It didn't have all the animation-type bells and whistles of 95 but did perform about as well as any operating system could. It had the NT Explorer to browse files and a Start menu, for which many of us just clicked Run and typed cmd.
It had a Windows Desktop Update and a task scheduler. They released a number of features that would take years for other vendors to catch up with. DCOM, or the Distributed Component Object Model, and Object Linking & Embedding (or OLE) were core aspects any developer had to learn. The Telephony API (or TAPI) allowed access to the modem. The Microsoft Transaction Server allowed developers to build network applications on their own sockets. The Crypto API allowed developers to encrypt information in their applications. The Microsoft Message Queuing service allowed queued data transfer between services. They also built in DirectX support and already had OpenGL support. The Task Manager in NT 4 was like an awesome graphical version of the top command on Unix. And it came with Internet Explorer 2 built in. NT 4 would be followed by a series of service packs for 4 years before the next generation of operating system was ready. That was version 5, more colloquially called Windows 2000. In those years NT became known as NT Workstation, the server became known as NT Server, and they built out Terminal Server Edition in collaboration with Citrix. And across 6 service packs, NT became the standard in enterprise computing. IBM released OS/2 Warp version 4.52 in 2001, but never had even a fraction of the sales Microsoft did. By contrast, NT 5.1 became Windows XP and NT 6 became Vista, while OS/2 was cancelled in 2005.
Dell is one of the largest technology companies in the world, and it all started with a small startup that sold personal computers out of Michael Dell's dorm room at the University of Texas. From there, Dell grew into a multi-billion dollar company, bought and sold other companies, went public, and now manufactures a wide range of electronics including laptops, desktops, servers, and more. After graduating high school, Michael Dell enrolled at the University of Texas at Austin with the idea that he would someday start his own company. Maybe even in computers. He had an Apple II in school, and Apple and other companies had done pretty well by then in the new microcomputer space. He took it apart, and these computers were just a few parts that were quickly becoming standardized. Parts that could be bought off the shelf at computer stores. So he opened a little business that he ran out of his dorm room fixing computers and selling little upgrades. Many a student around the world still does the exact same thing. He also started buying up parts and building new computers.

Texas Instruments was right up the road in Dallas. And there was a price war in the early 80s between Commodore and Texas Instruments. Computers could be big business. And it seemed clear that this IBM PC that was introduced in 1981 was going to be more of a thing, especially in offices. Especially since there were several companies making clones of the PC, including Compaq, who were all over the news as the Silicon Cowboys, having gotten to $100 million in sales within just two years. So from his dorm room in 1984, Dell started a little computer company he called PCs Limited. He built PCs using parts and experimented with different combinations. One customer led to another, and he realized that a company like IBM bought a few hundred dollars worth of parts, put them in a big case, and sold it for thousands of dollars. Any time a company makes too much margin, smaller and more disruptive companies will take the market away. Small orders turned into bigger ones, and he was able to parlay each into being able to build bigger orders.

They released the Turbo PC in 1985. A case, a motherboard, a keyboard, a mouse, some memory, and a CPU chip. Those first computers he built came with an 8088 chip. Low overhead meant he could be competitive on price: $795. No retail storefront and no dealers, who often took 25 to 50 percent of the money spent on computers, let the company run out of a condo. He'd sold newspapers as a kid, so he was comfortable picking up the phone and dialing for dollars. He managed to make $200,000 in sales in that first year. So he dropped out of school to build the company. To keep costs low, he sold through direct mail and over the phone. No high-paid sellers in blue suits like IBM, even if the computers could run the same versions of DOS. He incorporated as Dell Computer Company in 1987 and started to expand internationally on the back of rapid revenue growth and good margins. They hit $159 million in sales that year. So they took the company public in 1988. The market capitalization when they went public was $30 million and quickly rose to $80 million. By then we'd moved past the 8088 chips and the industry was standardizing on the 80386 chip, following the IBM PS/2. By the end of 1989 sales hit $250 million. They needed more research and development firepower, so they brought in Glenn Henry.
He'd been at IBM for over 20 years and had managed multiple generations of mid-range mainframes, then servers, and then RISC-based personal computers. He helped grow the R&D team into the hundreds, and the quality of the computers went up, which paired well with prices that remained affordable compared to the rest of the market. Dell was, and to a large degree still is, a direct-to-consumer company. They experimented with the channel in the early 1990s, which is to say third parties that were authorized to sell their computers. They signed deals to sell through distributors, computer stores, warehouse clubs, and retail chains. But the margins didn't work, so within just a few years they cancelled many of those relationships. Instead they went from selling to companies to the adjacent home market. It seems like that's the last time in recent memory that direct mailing as a massive campaign worked. Dell was able to undercut most other companies who sold laptops at the time by going direct to consumers. They brought in marketing execs from other companies, like Tandy. The London office was a huge success, bringing in tens of millions in revenue, so they brought on a Munich office and then slowly expanded into other countries. They were one of the best sales and marketing machines in that direct-to-consumer and business market. Customers could customize orders, so maybe add a faster CPU, some extra memory, or even a scanner, modem, or other peripheral. They got the manufacturing to the point where they could turn computers around in five days. Just a decade earlier people waited months for computers.

They released their first laptop in 1989, which they called the 316LT. Just a few years earlier, Michael Dell was in a dorm room. If he'd completed a pre-med degree and gotten into medical school, he'd likely be in his first or second year. He was now a millionaire, and just getting started. With the help of their new R&D chief, they were able to get into the server market, where the margins were higher, and that helped get more corporate customers. By the end of 1990, they were the sixth largest personal computer company in the US. To help sales in the rapidly growing European and Middle Eastern offices, they opened another manufacturing location in Ireland. And by 1992, they became one of the top 500 companies in the world. Michael Dell, instead of being on an internship in medical school and staring down the barrel of school loans, was the youngest CEO in the Fortune 500.

The story is almost boring. They just grow and grow. Especially when rivals like IBM, HP, Digital Equipment, and Compaq made questionable finance and management choices that didn't allow those companies to remain competitive. They all had better technology at many times, but none managed to capitalize on the markets. Instead of becoming the best computer makers they could be, they played corporate development games and wandered away from their core businesses. Or, like IBM, they decided that they didn't want to compete with the likes of Dell and just sold off their PC line to Lenovo. But Dell didn't make crappy computers. They weren't physically inspiring like some computers at the time, but they got the job done, and offices that needed dozens or hundreds of machines often liked working with Dell. They continued the global expansion through the 90s and added servers in 1996. By now there were customers buying their second or third generation of computer, going from DOS to Windows 3.1 to Windows 95.
And they did something else really important in 1996: they began to sell through the web at dell.com. Within a few months they were doing a million dollars a day in sales, and the next year they hit 10 million PCs sold. Little Dell magazines showed up in offices around the world. Web banners appeared on web pages. Revenues responded and went from $2.9 billion in 1994 to $3.5 billion in 1995. And they were running at margins over 20 percent. Revenue hit $5.3 billion in 1996, $7.8 billion in 1997, $12.3 billion in 1998, $18.2 billion in 1999, and $25.3 billion in 2000. The 1990s had been good to Dell. Their stock split 7 times. It wouldn't double every other year again, but would double again by 2009.

In the meantime, the market was changing. The Dell OptiPlex is one of the best-selling lines of computers of all time and offers a glimpse into what was changing. Keep in mind, this was the corporate enterprise machine; home machines could be better or lesser, depending on the vendor. The processors ranged from a Celeron up to a Core i9 at this point. Again, we needed a motherboard, usually an ATX or a derivative. They started with that standard ATX motherboard form factor, but the line later grew to come in the tower, the micro, and everything in between, including an all-in-one. That Series 1 was beige and just the right size to put a big CRT monitor on top of it. It sported a 100 MHz 486 chip and could take up to 64 megabytes of memory across a pair of SIMM slots. The Series 2 was about half the size, and by now we saw those small early LCD flat panel screens. They were still beige though. As computers went from beige to black with the Series 3, we started to see the iconic metallic accents we're accustomed to now. They followed along with the Intel replacement for the ATX motherboard, the BTX, and we saw those early PCI form factors traded for PCIe. By the end of the Series 3 in 2010, the OptiPlex 780 could have up to 16 gigs of memory as a max, although that would set someone back a pretty penny in 2009. And the processors ranged from 800 MHz to 1.2 GHz. We'd also gone from PS/2 ports with serial and parallel to USB 2 ports, and from SIMM to DIMM slots, up to DDR4 with the memory about as fast as a CPU. But they went back to the ATX and newer Micro ATX with the Series 4. They embraced the Intel i-series chips and we got all the fun little metal designs on the cases. Cases that slowly shifted to being made of recycled parts. The Latitude laptops followed a similar pattern. Bigger, faster, and heavier.

They released the Dell Dimension and acquired Alienware in 2006, at the time the darling of the gamer market. Higher-margin hardware, like screaming fast graphics cards. But also lower R&D costs for the Dell lines, as there was the higher-end line that flowed down to the OptiPlex and then the Dimension. Meanwhile, there was this resurgent Apple. They'd released the iMac in 1998 and helped change the design language for computers everywhere. Not that everyone needed clear cases. Then came the iPod in 2001. Beautiful design could sell products at higher prices. But they needed to pay a little more attention to detail. More importantly, those Dells were getting bigger and faster and heavier while the Apple computers were getting lighter, and even the desktops more portable. The iPhone came in 2007. The Intel MacBook Air came 10 years after that iMac, in 2008. The entire PC industry was in a race for bigger power supplies to push more and more gigahertz through a CPU without setting the house on fire, and Apple changed the game.
The iPad was released in 2010. Apple finally delivered on the promise of the Dynabook, a concept that began life at Xerox PARC. Dell had been in the driver's seat. They became the top personal computer company in 2003 and held that spot until the combined HP-Compaq overtook them. They never regained it, as revenue slowed for almost a decade from the time the iPad was released, even contracting at times. See, Dell had a close partnership with Intel and Microsoft. Microsoft made operating systems for mobile devices, but the Dell Venue was not competitive with the iPhone. They also tried making a mobile device using Android, but the Streak never sold well either and was discontinued. While Microsoft retooled their mobile platforms to compete in the tablet space, Dell tried selling Android tablets but discontinued those in 2016. To make matters worse for Dell, they'd ridden a Microsoft Windows alliance where they never really had to compete with Microsoft for nearly 30 years, and then Microsoft released the Surface in 2012. The operating systems hadn't been pushing people to upgrade their computers, and Microsoft even started selling Office directly and online, so Dell lost the revenue from bundling Office with computers. They too had taken their eye off the market. HP bought EDS in 2008, diversifying into a services organization, something IBM had done well over a decade before. Except rather than sell their PC business, they made a go at both. So Dell did the same, acquiring Perot Systems, the company Perot started after he sold EDS and ran for president, for $3.9 billion, which came in at a solid $10 billion less than what HP paid for EDS. The US was in the midst of a recession, so that didn't help matters either. But it did make for an interesting investment climate. Interest rates were down, so large investors needed to put money to work to show good returns for customers. Dell had acquired just 8 companies before the Great Recession, but acquired an average of 5 in each of the next four years. This allowed them to diversify. And Michael Dell made another savvy finance move: he took the company private in 2013 with the help of Silver Lake Partners. Five years off the public market was just what they needed. In 2018 they went public again on the backs of revenues that had shot up to $79 billion from a low of around $50 billion in 2016. And they exceeded $94 billion in 2021. The acquisition of EMC, and with it VMware, was probably the most substantial, at $67 billion. That put them in the enterprise server market and gave them a compelling offer at pretty much every level of the enterprise stack. Although at this point it maybe remains to be seen whether the enterprise server and storage stack is still truly a thing. A Dell OptiPlex costs about the same amount today as it did when Dell sold that first Turbo PC. They can be had cheaper, but probably shouldn't be. Adjusted for an average 2.6 percent inflation rate, that brings those first Dell PCs to just north of $2,000 as of the time of this writing (a quick check of that math in a moment). Yet the computer costs about the same in nominal dollars, with fairly consistent margins. That means the components have gotten roughly half as expensive in real terms, partly because they're made in places with cheaper labor than they were in the early 1980s. It also means there are potentially fewer components - like a fan for certain chips, or separate RAM when memory is integrated into a SoC. But the world is increasingly mobile. Apple, Google, and Microsoft sell computers for their own operating systems now. Dell doesn't make phones, and they aren't in the top 10 for the tablet market.
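Here's that quick check - a minimal back-of-the-envelope sketch, where the roughly $795 launch price for the 1985 Turbo PC and the 37-year window are assumptions for illustration, not figures from the episode:

```python
# Back-of-the-envelope inflation check (illustrative assumptions only).
launch_price = 795.0       # assumed 1985 Turbo PC price in dollars (commonly cited figure)
annual_inflation = 0.026   # the average inflation rate mentioned above
years = 37                 # roughly 1985 through the early 2020s

adjusted = launch_price * (1 + annual_inflation) ** years
print(f"${adjusted:,.0f}")  # prints about $2,055 - "just north of $2,000"
```

The exact number moves around with the assumed start price and end year, but the order of magnitude is the point.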
People don't buy products from magazines that show up in the mail any longer. Now it's a quick search on Amazon. And looking for a personal computer there, the results right this second (that is, while writing this paragraph) showed the exact same order as vendor market share for 2021: Lenovo, followed by HP, then Dell. All of the devices looked about the same. Kinda' like those beige injection-molded devices looked about the same. HP couldn't keep such a large company under one roof and eventually spun HP Enterprise out into its own entity. Dell sold Perot Systems to NTT Data to get the money to buy EMC on leverage. Not only do many of these companies have products that look similar, but their composition does as well. What doesn't look similar is Michael Dell. He's worth just shy of $60 billion (depending on the day and the markets). His book, Direct From Dell, is one of the best looks one can find at the inside of a direct mail-order business making the transition to early e-commerce. Oh, and it's not just him and some friends in a dorm room any more. It's 158,000 employees who help make up over a $42 billion market cap. And they've helped generations of people afford personal computers. That might be the best part of such a legacy.
Scanner School - Everything you wanted to know about the Scanner Radio Hobby
The RadioShack brand is making a comeback. They are getting into the crypto game and are entering the market with their own coin, $RADIO. What You Need To Know: RadioShack was started in 1921 by two brothers who wanted to provide equipment to the ham radio market. They started with one physical store and a mail order business in downtown Boston. In 1962, Tandy was looking for a hobby-related business, so they took over RadioShack, restructured it, and gave RadioShack a brand new foundation. RadioShack was known for carrying its own house brands like Tandy and Realistic. Tandy Corp not only manufactured products for RadioShack, but was also an OEM for hardware companies like Digital Equipment Corporation, GRiD, Olivetti, AST Computer, and Panasonic. RadioShack had several successes and, ultimately, failures. RadioShack had some iconic scanners, like the Realistic Pro-2006. If you were getting into the electronics hobby, you were getting into radios, and RadioShack is probably where you learned. That's where you cut your teeth in this hobby. We scanner radio users and amateur radio operators were their target audience and their target market from the beginning. But unfortunately, selling radios, antennas, and components is no longer enough to support a brick-and-mortar store.
In the previous episodes, we looked at the rise of patents and software and their impact on the nascent computer industry. But a copyright is a right. And that right can be given to others in whole or in part. We have all benefited from software where the right to copy was waived, and it's shaped the computing industry as much, if not more, than proprietary software. The term Free and Open Source Software (FOSS for short) is a blanket term to describe software that's free and/or whose source code is distributed for varying degrees of tinkering. It's a movement and a choice. We programmers can commercialize our software. But we can also distribute it free of copy protections. And there are about as many licenses as there are opinions about what is unique, what types of software they cover, what underlying components they apply to, and so on. But given that many choose to commercialize their work products, how did a movement arise that specifically didn't? The early computers were custom-built to perform various tasks. Then computers and software were bought as a bundle and organizations could edit the source code. But as operating systems and languages evolved and businesses wanted their own custom logic, a cottage industry for software started to emerge. We see this in every industry - as an innovation becomes more mainstream, the expectations and needs of customers progress at an accelerated rate. That evolution took about 20 years following World War II, and by 1969, the software industry had evolved to the point that IBM faced antitrust charges for bundling software with hardware. And after that, the world of software would never be the same. The knock-on effect was that in the 1970s, Bell Labs pushed away from MULTICS and developed Unix, which AT&T then gave away as compiled code to researchers. Proprietary software was a growing industry, and AT&T began charging for commercial licenses as the bushy hair and sideburns of the 70s were traded for the yuppie culture of the 80s. In the meantime, software had become copyrightable due to the findings of CONTU and the codifying of the Copyright Act of 1976. Bill Gates sent his infamous “Open Letter to Hobbyists” in 1976 as well, defending the right to charge for software in an exploding hobbyist market. And then Apple v. Franklin led to the ability to copyright compiled code in 1983. There was a growing divide between those who'd been accustomed to being able to copy software freely and edit source code, and those further up-market who just needed supported software that worked - and were willing to pay for it, seeing the benefits that automation was having on their ability to scale an organization. And yet there were plenty who considered copyrighting software immoral. One of the best remembered is Richard Stallman, or RMS for short. Steven Levy described Stallman as “The Last of the True Hackers” in his epic book “Hackers: Heroes of the Computer Revolution.” In the book, he describes the MIT Stallman joined, where there weren't passwords and people didn't yet pay for software, and then goes through the emergence of the LISP language and the divide that formed between Richard Greenblatt, who wanted to keep The Hacker Ethic alive, and those who wanted to commercialize LISP. The Hacker Ethic was born from the young MIT students who freely shared information and ideas with one another and helped push computing forward in an era they thought was purer in a way, as though it hadn't yet been commercialized.
The schism saw the death of that hacker culture, and two projects came out of Stallman's technical work: emacs, a text editor that is still included freely in most modern Unix variants, and the GNU project. Here's the thing: MIT was sitting on patents for things like core memory and thrived in part due to the commercialization, or weaponization, of the technology it was producing. The industry was maturing, and since the days when kings granted patents, maturing technology has been commercialized using that system. And so Stallman's nostalgia gave us the GNU project, born from an idea that the industry moved faster in the days when information was freely shared and that knowledge was meant to be set free. For example, he wanted the source code for a printer driver so he could fix it and was told it was protected by an NDA, so he couldn't have it. A couple of years later, in 1983, he announced GNU, a recursive acronym for GNU's Not Unix. He released the GNU Manifesto - often considered the charter of the free and open source software movement - and launched the Free Software Foundation in 1985, and shipped the first release of the GCC compiler in 1987. Over the next few years as he worked on GNU, he found emacs had a license, GCC had a license, and the rising tide of free software was all distributed with unique licenses. And so the GNU General Public License was born in 1989 - allowing organizations and individuals to copy, distribute, and modify software covered under the license, but with a key condition: if someone modified the source, they had to make that modified source available along with any binaries they distributed. The University of California, Berkeley had benefited from a lot of research grants over the years and many of their works could be put into the public domain. They had brought Unix in from Bell Labs in the 70s, and Sun cofounder and vi author Bill Joy worked under professor Fabry, who brought Unix in. After working on a Pascal compiler that Unix coauthor Ken Thompson left behind at Berkeley, Joy and others started working on what would become BSD - not exactly a clone of Unix, but with interchangeable parts. They bolted on a networking stack - the sockets API and TCP/IP - and through the 80s, as Joy left for Sun and DEC got ahold of that source code, there were variants and derivatives like FreeBSD, NetBSD, Darwin, and others. The licensing was pretty permissive and simple to understand: Copyright (c) <year> <copyright holder>. All rights reserved. Redistribution and use in source and binary forms are permitted provided that the above copyright notice and this paragraph are duplicated in all such forms and that any documentation, advertising materials, and other materials related to such distribution and use acknowledge that the software was developed by the <organization>. The name of the <organization> may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED "AS IS" AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, WITHOUT LIMITATION, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. By 1990 the Board of Regents at Berkeley accepted a four-clause BSD license that spawned a class of licenses. While it's matured into other forms, like a zero-clause license, it's one of my favorites as it is truest to the FOSS cause. And the 90s gave us the Apache License, from the Apache Group, loosely based on the BSD License, and then in 2004 leaning away from that with the release of the Apache License 2.0, which was more compatible with the GPL.
Given the modding nature of Apache, they didn't require derivative works to also be open sourced, but did require leaving the license in place for unmodified parts of the original work. GNU never really caught on as an OS in the mainstream, although a collection of its tools did. The main reason the OS didn't go far is probably that Linus Torvalds started releasing prototypes of his Linux operating system in 1991. Torvalds used the GNU General Public License v2, or GPLv2, to license his kernel, having been inspired by a talk given by Stallman. GPLv2 had been released in 1991, and something else was happening as we turned into the 1990s: the Internet. Suddenly the software projects being worked on weren't just distributed on paper tape or floppy disks; they could be downloaded. The rise of Linux and Apache coincided, and so many a web server and site ran the LAMP stack - Linux, Apache, MySQL, and PHP. All open source in varying flavors of what open source was at the time. And collaboration in the industry was at an all-time high. We got the rise of teams of developers who would edit and contribute to projects. One of these was a tool for another aspect of the Internet, email. It was called popclient. Eric S. Raymond, or ESR for short, picked it up and renamed it fetchmail, releasing it as an open source project. Raymond presented on his work at the Linux Congress in 1997, expanded that work into an essay, and then expanded the essay into “The Cathedral and the Bazaar,” where the bazaar is meant to be like an open market. That inspired many to open source their own works, including the Netscape team, which resulted in Mozilla and so Firefox - and another book called “Freeing the Source: The Story of Mozilla” from O'Reilly. By then, Tim O'Reilly was a huge proponent of this free or “source code available” type of software, as it was known. And companies like VA Linux were growing fast. And many wanted to congeal around some common themes. So in 1998, Christine Peterson came up with the term “open source” in a meeting with Raymond, Todd Anderson, Larry Augustin, Sam Ockman, and Jon “Maddog” Hall, author of the first book I read on Linux. Free software it may or may not be, but open source as a term quickly proliferated throughout the lands. By 1998 there was this funny little company called TiVo that was doing a public beta of a little box with a Linux kernel running on it that bootstrapped a pretty GUI to record TV shows on a hard drive and play them back. You remember when we had to wait for a TV show, right? Or back when some super-fancy VCRs could record a show at a specific time to VHS (but mostly failed for one reason or another)? Well, TiVo meant to fix that. We did an episode on them a couple of years ago, but we skipped the term Tivoization and the impact they had on the GPL. As the 90s came to a close, VA Linux and Red Hat went through great IPOs, bringing about an era where open source could mean big business. And true to the cause, they shared enough stock with Linus Torvalds to make him a millionaire as well. And IBM pumped a billion dollars into open source, with Sun open sourcing OpenOffice.org. Now, what really happened there might be that by then Microsoft had become too big for anyone to effectively compete with, and so they all tried to pivot around to find a niche - but it still benefited the world and open source in general. By Y2K there was a rapidly growing number of vendors out there putting Linux kernels onto embedded devices. TiVo happened to be one of the most visible.
Some in the Linux community felt like they were being taken advantage of, because suddenly you had a vendor making changes to the kernel, but their changes only worked on their hardware and they blocked users from modifying the software. So the Free Software Foundation updated the GPL, bundling in some other changes, and we got the GNU General Public License Version 3 in 2007. There was a lot more in GPLv3, given how many organizations were involved in open source software by then. Here, the full license text and original copyright notice had to be included, along with a statement of significant changes, and source code had to be made available with binaries. Meanwhile, commercial Unix variants struggled, with SGI going bankrupt in 2006 and use of AIX and HP-UX in decline. Many of these open source projects flourished because of version control systems and the web. SourceForge was created by VA Software in 1999 and is a free service that can be used to host open source projects. Concurrent Versions System, or CVS, had been written by Dick Grune back in 1986 and quickly became a popular way to have multiple developers work on projects, merging diffs of code repositories. CVS gave way in the hearts of many a programmer after Linus Torvalds wrote a new version control system called git in 2005. GitHub came along in 2008 and was bought by Microsoft in 2018 for $7.5 billion. Seeing a need for people to ask questions about coding, Stack Overflow was created by Jeff Atwood and Joel Spolsky in 2008. Now we could trade projects on one of the versioning tools, get help with projects or find smaller snippets of sample code on Stack Overflow, or even Google random things (and often find answers on Stack Overflow). And so social coding became a large part of many a programmer's day. As did dependency management, given how many tools are used to compile a modern web app or app. I often wonder how much of the code in many of our favorite tools is actually original. Another thought is that in an industry dominated by white males, it's no surprise that we often gloss over previous contributions. It was actually Grace Hopper's A-2 compiler that was the first software released freely with source for all the world to adapt. Sure, you needed a UNIVAC to run it, so it might fall into the mainframe era. With the emergence of minicomputers we got Digital Equipment's DECUS for sharing software, leading in part to the PDP-inspired need for source that Stallman was so adamant about. General Motors developed an early operating system for the IBM 701 that fed into the SHARE Operating System, made available through the IBM user group called SHARE. The ARPAnet was free if you could get to it. TeX from Donald Knuth was free. The BASIC distribution from Dartmouth was academic, and yet Microsoft sold it for up to $100,000 a license (see Commodore). So it's no surprise that people avoided paying upstarts like Microsoft for their software, or that it took until the late 70s to get copyright legislation and common law. But Hopper's contributions were kinda' like open source v1, the work from RMS to Linux was kinda' like open source v2, and once the term was coined and we got the rise of a name and more social coding platforms from SourceForge to git, we moved into a third version of the FOSS movement. Today, some tools are free, some are open source, some are free as in beer (as you find in many a gist), and some are proprietary. There are also about as many licenses as there are programmers putting software out there. And here's the thing: they're all valid.
You see, every creator has the right to restrict the ability to copy their software. After all, it's their intellectual property. Anyone who chooses to charge for their software is well within their rights. Anyone choosing to eschew commercialization also has that right. And every derivative in between. I wouldn't judge anyone based on whichever model they choose. Just as those who distribute proprietary software shouldn't be judged for retaining their rights to do so. Why not just post things we want to make free, with no license at all? Patents, copyrights, and trademarks are all a part of intellectual property - but as developers of tools we also need to limit our liability, as we're probably not out there buying large errors and omissions insurance policies for every script or project we make freely available. Also, we might want to limit the abuse of our marks. For example, Linus Torvalds monitors the use of the Linux mark through the Linux Mark Institute. Apparently one William Della Croce Jr. tried to register the Linux trademark in 1995 and Torvalds had to sue to get it back. He now provides use of the mark through a free and perpetual global sublicense. Given that his wife won the Finnish karate championship six times, I wouldn't be messing with his trademarks. Thank you to all the creators out there. Thank you for your contributions. And thank you for tuning in to this episode of the History of Computing Podcast. Have a great day.
Once upon a time, people were computers. It's probably hard to imagine teams of people spending their entire day toiling over large grids of paper, writing numbers and calculating numbers by hand or with mechanical calculators, then writing more numbers, and then repeating that. But that's the way it was before 1979. The term spreadsheet comes from back when a spread - like a magazine spread - of ledger cells was used for bookkeeping. There's a great scene in the Netflix show Halston where a new guy is brought in to run the company and he's flying through an electro-mechanical calculator. Halston just shuts the door. Ugh. Imagine doing by hand what we do in a spreadsheet in minutes today. Even really large companies jump over into a spreadsheet to do financial projections today - and with trendlines, tweaking this small variable or that, and even having different algorithms to project the future contents of a cell, the computerized spreadsheet is one of the most valuable business tools ever built. It's that instant change we see when we change one set of numbers and can see the impact down the line. Even with the advent of mainframe computers, accounting and finance teams had armies of people who calculated spreadsheets by hand, building complicated financial projections. If the formulas changed, it could take days or weeks to re-calculate and update every cell in a workbook. People didn't experiment with formulas. Computers up to this point had been able to calculate changes and, provided all the formulas were accurate, could output results onto punch cards or printers. But the cost had been in the millions before Digital Equipment and the Data General Nova came along and dropped it into the tens or hundreds of thousands of dollars. The first computerized spreadsheets weren't instant. Richard Mattessich developed an electronic, batch spreadsheet in 1961. He'd go on to write a book called “Simulation of the Firm Through a Budget Computer Program.” His work was more theoretical in nature, but IBM developed the Business Computer Language, or BCL, the next year. What IBM did got copied by their seven dwarves. Former GE employees Leroy Ellison, Harry Cantrell, and Russell Edwards developed AutoPlan/AutoTab, another scripting language for spreadsheets that worked over delimited files of numbers. And in 1970 we got LANPAR, which opened up more than reading files in from sequential, delimited sources. But then everything began to change. Dan Bricklin graduated from MIT and went to work for Digital Equipment Corporation on an early word processor called WPS-8. We were now in the age of interactive computing on minicomputers. He then went to work for FasFax in 1976 for a year, getting exposure to calculating numbers. And then he went off to Harvard in 1977 to get his MBA. But while he was at Harvard he started working on one of the timesharing systems to help do spreadsheet analysis and wrote his own tool that could do five columns and 20 rows. Then he met Bob Frankston, and they added Dan Fylstra, who thought it should be able to run on an Apple - and so they started Software Arts Corporation. Frankston got the programming bug while sitting in on a class during junior high. He then got his undergrad and Masters at MIT, where he spent 9 years in school and working on a number of projects with Project MAC (the lab that became CSAIL), including Multics. He'd been consulting and working at various companies for a while in the Boston area, which at the time was probably the major hub.
Frankston and Bricklin would build a visible calculator using 16k of memory that could fit on a floppy. They used a time sharing system, and because they were paying for time, they worked at night when time was cheaper, to save money. They named their visible calculator VisiCalc. Along came the Apple II, and computers were affordable. They ported the software to the platform and it was an instant success. It grew fast. Competitors sprung up. SuperCalc in 1980, bundled with the Osborne. The IBM PC came in 1981 and the spreadsheet appeared in Fortune for the first time. Then the cover of Inc Magazine in 1982. Publicity is great for sales and for inspiring competitors. Lotus 1-2-3 came in 1983, and even Boeing Computer Services got in the game with Boeing Calc in 1985. They extended the ledger metaphor to add sheets to the spreadsheet, which we think of as tabs today. Quattro Pro from Borland copied that feature and, despite having their offices effectively destroyed during an earthquake just before release, came to market in 1989. Ironically, they got the idea after someone falsely claimed they were making a spreadsheet a few years earlier. And so other companies were building visible calculators and adding new features to improve on the spreadsheet concept. Microsoft was one that really didn't make a dent in sales at first. They released an early spreadsheet tool called Multiplan in 1982. But Lotus 1-2-3 was the first killer application for the PC. It was more user friendly and didn't have all the bugs that had come up in VisiCalc as it was ported to run on platform after platform. Lotus was started by Mitch Kapor, who brought Jonathan Sachs in to develop the spreadsheet software. Kapor's marketing prowess would effectively obsolete VisiCalc in a number of environments. They made TV commercials, so you know they were big time! And it was written natively in x86 assembly, so it was fast. They added the ability to make bar charts, pie charts, and line charts. They added color and printing. One could even spread their sheet across multiple monitors, like in a magazine. It was 1 - spreadsheets, 2 - charts and graphs, and 3 - basic database functions. Heck, one could even change the size of cells and use it as a text editor. Oh, and macros would become a standard in spreadsheets after Lotus. And because VisiCalc had been around so long, Lotus of course was immediately capable of reading a VisiCalc file when released in 1983. As could Microsoft Excel, when it came along in 1985. And even Boeing Calc could read Lotus 1-2-3 files. After all, the concept went back to those mainframe delimited files, and to this day we can import and export to tab or comma delimited files. VisiCalc had sold about a million copies, but production ceased the same year Excel was released, even though its final release had come in 1983. Lotus had eaten VisiCalc's lunch in the market, and Borland had watched. Microsoft was about to eat both of theirs. Why? VisiCorp was off building a windowing system called Visi On. And Steve Jobs needed a different vendor to turn to. He looked to Lotus, who built a tool called Jazz that was too basic. But Microsoft completed Excel for the Mac in 1985 and went public the next year, raising plenty of money. And so Excel began on the Mac, and that first version was the first graphical spreadsheet. The other developers didn't think a GUI was gonna' be much of a thing.
Maybe graphical interfaces were a novelty! Version two was released for the PC in 1987, along with Windows 2.0. Sales were slow at first. But then came Windows 3. Add Microsoft Word to form Microsoft Office, and by the time Windows 95 was released, Microsoft had become the de facto market leader in documents and spreadsheets. That's the same year IBM bought Lotus, and they continued to sell the product until 2013, with sales steadily declining. And so without a lot of competition for Microsoft Excel, spreadsheets kinda' sat for a hot minute. Computers became ubiquitous. Microsoft released new versions for Mac and Windows, but they went into that infamous lost decade until… competition. There were always competitors, but now there was real competition, with something new to add to the mix. Google bought a company called 2Web Technologies in 2006, who made a web-based spreadsheet called XL2WEB. That would become Google Sheets. Google bought DocVerse in 2010 and we could suddenly have multiple people editing a sheet concurrently - and the files were compatible with Excel. By 2015 there were a couple million users of Google Workspace, growing to over 5 million in 2019 and another million in 2020. In the years since, Microsoft released Office 365, starting to move many of their offerings onto the web. That involved 60 million people in 2015 and has since grown to over 250 million. The statistics can be funny here, because it's hard to nail down how many free vs paid Google and Microsoft users there are. Statista lists Google as having nearly 60% market share, but Microsoft is clearly making more from their products. And there are smaller competitors all over the place taking on lots of niche areas. There are a few interesting tidbits here. One is that there's a clean line of evolution in features across these tools. Each new tool worked better, added features, and worked with previous file formats to ease the transition into the product. Another is how much we've all matured in our understanding of data structures. I mean, we have rows and columns. And sometimes multiple sheets - kinda' like multiple tables in a database. Our financial modeling and even scientific modeling has grown in acumen by leaps and bounds. Many still used those electro-mechanical calculators in the 70s, when you could buy calculator kits and build your own calculator. The personal computers that flowed out in the next few years gave every business the chance to track basic inventory and calculate simple information, like how much revenue we might expect from the inventory in stock. Now there are thousands of pre-built formulas supported across most spreadsheet tooling. Despite expensive tools and apps to do specific business functions, the spreadsheet is still one of the most enduring and useful tools we have. Even for programmers, where we're often just getting our data into a format we can dump into other tools! So think about this. What tools out there have common file types where new tools can sit on top of them? Which of those haven't been innovated on in a hot minute? And of course, what is that next bold evolution? Is it moving the spreadsheet from a book to a batch process? Or from a batch process to real-time? Or from real-time to relational with new tabs? Or to add a GUI? Or adding online collaboration? Or, like some big data companies, using machine learning to analyze large data sets and look for patterns automatically?
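Before moving on, here's a minimal sketch of the core idea that made the computerized spreadsheet so valuable - change one input and every dependent cell updates. This is a toy model in Python, not how VisiCalc, 1-2-3, or Excel actually work internally, and the cell names and formulas are invented for illustration:

```python
# Toy spreadsheet: cells hold either literal values or formulas over other cells.
values = {"price": 10.0, "units": 250}
formulas = {
    "revenue": lambda get: get("price") * get("units"),
    "tax":     lambda get: get("revenue") * 0.08,
    "total":   lambda get: get("revenue") + get("tax"),
}

def get(cell):
    # Resolve a cell on demand, walking its dependency chain.
    if cell in values:
        return values[cell]
    return formulas[cell](get)

print(get("total"))    # 2700.0
values["units"] = 300  # change one number...
print(get("total"))    # ...and everything downstream reflects it: 3240.0
```

Real spreadsheets track dependencies so they only recompute what changed, but that instant ripple-through is the same idea that replaced days of manual recalculation.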
Not only does the spreadsheet help us do the maths - it also helps us map the technological determinism we see repeated through nearly every single tool for any vertical or horizontal market. Those stuck need disruptive competitors if only to push them off the laurels they've been resting on.
Investors have pumped capital into emerging markets since the beginning of civilization. Egyptians explored basic mathematics and used their findings to build larger structures and even granaries, allowing merchants to store food and serve larger and larger cities. Greek philosophers expanded on those learnings and applied math to learn the orbits of planets, the size of the moon, and the size of the earth. Their merchants used the astrolabe to expand trade routes. They studied engineering and so learned how to leverage the six simple machines to automate human effort, developing mills and cranes to construct even larger buildings. The Romans developed modern plumbing and aqueducts and gave us concrete and arches and radiant heating and bound books and the postal system. Some of these discoveries were state sponsored; others came from wealthy financiers. Many an early investment went into trade routes, which fueled humanity's ability to understand the world beyond its own little piece of it, improved the flow of knowledge, and mixed knowledge from culture to culture. As we covered in the episode on clockworks and the series on science through the ages, many a scientific breakthrough was funded by religion as a means of wowing the people. And then by autocrats and families who'd made their wealth from those trade routes. Over the centuries of civilization we got institutions that could help finance industry. Banks loan money using an interest rate that matches the risk of their investment. It's illegal, going back to the Bible, to overcharge on interest. That's called usury, something the Romans came to grips with during their own cycles of too many goods driving down costs and too few fueling inflation. And yet innovation is an engine of economic growth - and so needs to be nurtured. The rise of capitalism meant more and more research was done privately and so needed to be funded. So did the rise of intellectual property as a good. Yet banks have never embraced startups. The early days of Britain's Royal Society were filled with researchers from the elite. They could self-fund their research, and the more people doing research, the more discoveries we made as a society. Early American inventors tinkered in their spare time as well. But the pace of innovation has advanced because of financiers as much as because of the hard work and long hours. Companies like DuPont helped fuel the rise of plastics with dedicated research teams. Railroads were built by raising funds. Trade grew. Markets grew. And people like J.P. Morgan knew those markets when they invested in new fields and were able to grow wealth and inspire new generations of investors. And emerging industries ended up dominating the places that merchants once held in the public financial markets. Going back to the Venetians, public markets have required regulation. As banking became more of a necessity for scalable societies, it too required regulation - especially after the Great Depression. And yet we needed new companies willing to take risks to keep innovation moving ahead, as we do today. And so the emergence of the modern venture capital market came in those years, with a few people willing to take on the risk of investing in the future. John Hay “Jock” Whitney was an old-money type who also started a firm. We might think of it more as a family office these days, but he had acquired 15% of Technicolor and then went on to get more professional and invest.
Jock's partner in the adventure was a fellow Delta Kappa Epsilon, from the University of Texas chapter: Benno Schmidt. Schmidt coined the term venture capital, and they helped pivot Spencer Chemical from a munitions plant to fertilizer - they're both nitrates, right? They helped bring us Minute Maid, and more recently have been in and out of Herbalife, Joe's Crab Shack, Igloo coolers, and many others. But again, it was mostly Whitney money, whereas we tend to think of venture capital funds as having more than one investor funding new and enterprising companies. And one of those venture capitalists stands out above the rest. Georges Doriot moved to the United States from France to get his MBA from Harvard. He became a professor at Harvard, and a shrewd business mind led to him being tapped as the Director of the Military Planning Division for the Quartermaster General. He would be promoted to brigadier general following a number of massive successes in research and development as part of the pre-World War II military-industrial-academic buildup. After the war, Doriot created the American Research and Development Corporation, or ARDC, with the former president of MIT, Karl Compton, and engineer-turned-Senator Ralph Flanders - all of them wrote books about finance, banking, and innovation. They proved that the R&D behind innovation could be capitalized for great returns. The best example of their success was Digital Equipment Corporation, which they invested $70,000 in in 1957 and turned into over $350 million by 1968, after DEC went public - netting over 100% a year of return. Unlike Whitney, ARDC took outside money, and so Doriot became known as the first true venture capitalist. Those post-war years led to a level of patriotism we arguably haven't seen since. John D. Rockefeller Jr. had inherited a fortune from his father, who built Standard Oil. To oversimplify, that company was broken up into a variety of companies including what we now think of as Exxon, Mobil, Amoco, and Chevron. But the family was one of the wealthiest in the world, and the five sons of John Jr. built a fund they called the Rockefeller Brothers Fund. We might think of it as a social good investment fund these days. Following the war, in 1951, John D. Rockefeller Jr. endowed the fund with $58 million, and in 1956, deep in the Cold War, fund president Nelson Rockefeller financed a study and hired Henry Kissinger to dig into the challenges of the United States. And then came Sputnik in 1957 and a failed run for the presidency of the United States by Nelson in 1960. Meanwhile, the fund was helping do a lot of good, but also helping to research companies Venrock would capitalize. The family had been investing since the 30s, but Laurance Rockefeller set up Venrock, a mashup of venture and Rockefeller. In Venrock, the five brothers, their sister, MIT's Ted Walkowicz, and Harper Woodward banded together to sprinkle funding into what is now over 400 companies that include Apple, Intel, PGP, CheckPoint, 3Com, DoubleClick - and the list goes on. Over 125 public companies have come out of the fund today, with an unimaginable amount of progress pushing the world forward. The government was still doing a lot of basic research in those post-war years that led to standards and patents and pushed innovation forward in private industry. ARDC caught the attention of a number of other people who had money they needed to put to work. Some were family offices increasingly willing to make aggressive investments.
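As a quick aside on that ARDC return: here's a back-of-the-envelope check of the "over 100% a year" figure, treating the $70,000 in 1957 growing to roughly $350 million by 1968 as a simple eleven-year compounding period (a simplification for illustration, not an exact accounting):

```python
# Compound annual growth rate implied by the ARDC / Digital Equipment figures above.
initial = 70_000          # 1957 investment
final = 350_000_000       # approximate value by 1968
years = 1968 - 1957       # 11 years, a simplifying assumption

cagr = (final / initial) ** (1 / years) - 1
print(f"{cagr:.0%}")      # roughly 117% per year - comfortably over 100%
```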
Others were started by ARDC alumni such as Charlie Waite and Bill Elfers, who with Dan Gregory founded Greylock Partners. Greylock has invested in everyone from Red Hat to Staples to LinkedIn to Workday to Palo Alto Networks to Drobo to Facebook to Zipcar to Nextdoor to OpenDNS to Redfin to ServiceNow to Airbnb to Groupon to Tumblr to Zenprise to Dropbox to IFTTT to Instagram to Firebase to Wandera to Sumo Logic to Okta to Arista to Wealthfront to Domo to Lookout to SmartThings to Docker to Medium to GoFundMe to Discord to Houseparty to Roblox to Figma. Going on 800 investments just since the 90s, they are arguably one of the greatest venture capital firms of all time. Other firms came out of pure security analyst work. Hayden, Stone & Co. was co-founded by another MIT grad, Charles Hayden, who made his name mining copper to help wire up what he expected to be an increasingly electrified world. Stone was a Wall Street tycoon, and the two of them founded a firm that employed Joe Kennedy, the Kennedy family patriarch, and Frank Zarb, a chairman of the NASDAQ. It also gave us one of the great venture capitalists to fund technology companies: Arthur Rock. Rock has often been portrayed as the bad guy in Steve Jobs movies, but he was the one who helped the “Traitorous 8” leave Shockley Semiconductor and, after Eugene Kleiner's father (who had an account at Hayden, Stone) passed word that they needed funding, got serial entrepreneur Sherman Fairchild to fund Fairchild Semiconductor. Fairchild's companies developed tech for the Apollo missions, camera flashes, and spy satellite photography - and that semiconductor business grew to 12,000 people and was a bedrock in forming what we now call Silicon Valley. Rock ended up moving to the area and investing, parlaying success in the Fairchild investment into an investment in Intel when Moore and Noyce left Fairchild to co-found it. Venture capital firms raise money from institutional investors that we call limited partners and invest that money. After moving to San Francisco, Rock set up Davis and Rock, got some limited partners, including friends from his time at Harvard, and invested in 15 companies, including Teledyne and Scientific Data Systems, which got acquired by Xerox - taking their $257,000 investment to a $4.6 million valuation in 1970 and getting him a seat on the board of Xerox. He dialed for dollars for Intel and raised another $2.5 million in a couple of hours, and became the first chair of their board. He made all of his LPs a lot of money. One of those Intel employees who became a millionaire retired young. Mike Markkula invested some of his money in a young Apple, and Rock put in $57,000 - growing it to $14 million - and went on to launch or invest in companies and make billions of dollars in the process. Another firm that came out of the Fairchild Semiconductor days was Kleiner Perkins. It started in 1972 with founding partners Eugene Kleiner, Tom Perkins, Frank Caufield, and Brook Byers. Kleiner was one of those Traitorous 8 who left William Shockley and founded Fairchild Semiconductor. He later hooked up with the former HP head of research and development, and yet another MIT and Harvard grad, Tom Perkins. Perkins would help Corning, Philips, Compaq, and Genentech, serving on boards and helping them grow. Caufield came out of West Point and got his MBA from Harvard as well. He'd go on to work with Quantum, AOL, Wyse, Verifone, Time Warner, and others.
Byers came to the firm shortly after getting his MBA from Stanford and started four biotech companies that were incubated at Kleiner Perkins - netting the firm over $8 billion. And they taught future generations of venture capitalists. People like John Doerr, who was a great salesman at Intel but by 1980 had graduated into venture capital, bringing in deals with Sun, Netscape, Amazon, Intuit, Macromedia, and one of the best gambles of all time - Google. His reward is a net worth of over $11 billion. But more importantly, he helped drive innovation and shape the world we live in today. Kleiner Perkins was the first to move onto Sand Hill Road. From there, they've invested in nearly a thousand companies that include pretty much every household name in technology. And from there, we got the rise of the dot-coms and sky-high rents, on par with Manhattan. Why? Because dozens of venture capital firms opened offices on that road, including Lightspeed, Highland, Blackstone, Accel-KKR, Silver Lake, Redpoint, Sequoia, and Andreessen Horowitz. Sequoia also started in the 70s, founded by Don Valentine, with leadership later passing to Doug Leone and Michael Moritz in the 90s. Valentine did sales for Raytheon before joining National Semiconductor, which had been founded by a few Sperry Rand traitors and brought in some execs from Fairchild. They were venture backed, and his background in sales helped propel some of their earlier investments in Apple, Atari, Electronic Arts, LSI, Cisco, and Oracle to success. And that allowed them to invest in a thousand other companies, including Yahoo!, PayPal, GitHub, Nvidia, Instagram, Google, YouTube, Zoom, and many others. So far, most of the firms we've mentioned have been in the US. But venture capital is a global trend. Masayoshi Son founded SoftBank in 1981 to sell software, then published some magazines and grew the circulation to the point that they were Japan's largest technology publisher by the end of the 80s, and then went public in 1994. They bought Ziff Davis publishing and COMDEX, and seeing so much technology and the money in technology, Son inked a deal with Yahoo! to create Yahoo! Japan. They pumped $20 million into Alibaba in 2000, and by 2014 that investment was worth $60 billion. In that time they became more aggressive with where they put their money to work. They bought Vodafone Japan, took over competitors, and then the big one - they bought Sprint, which they merged with T-Mobile and now own a quarter of the combined company. An important aspect of venture capital and private equity is multiple expansion; the market capitalization of Sprint more than doubled, with shares shooting up over 10%. They bought Arm Limited, the semiconductor company that designs the chips in many a modern phone, IoT device, tablet, and even computer now. As with other financial firms, not all investments go great. SoftBank pumped nearly $5 billion into WeWork. Wag failed. 2020 saw many staff reductions. They had to sell tens of billions in assets to weather the pandemic. And yet, even with some high profile losses, they sold ARM for a huge profit, Coupang went public, and investors in their Vision Funds are seeing phenomenal returns across over 200 companies in the portfolios. Most of the venture capitalists we've mentioned so far invested as early as possible and stuck with the company until an exit - be it an IPO, acquisition, or even a move into private equity. Most got a seat on the board in exchange for not only their seed capital, or the money to take products to market, but also their advice.
In many a company the advice was worth more than the funding. For example, Randy Komisar, now at Kleiner Perkins, famously recommended TiVo sell monthly subscriptions - the growth hack they needed to get profitable. As the venture capital industry grew and more and more money was pumped into fueling innovation, different accredited and institutional investors emerged with different tolerances for risk and different skills to bring to the table. Someone who built an enterprise SaaS company and sold it within three years might be best suited to invest in and advise another company doing the same thing. Just as someone who had spent 20 years running later-stage companies and taking them to IPO was better at advising later-stage startups that maybe weren't startups any more. Here's a fairly common startup story. After finishing a book on Lisp, Paul Graham decided to found a company with Robert Morris. That was Viaweb in 1995, one of the earliest SaaS startups, which hosted online stores - similar to a Shopify today. Viaweb had an investor named Julian Weber, who invested $10,000 in exchange for 10% of the company. Weber gave them invaluable advice, and they were acquired by Yahoo! for about $50 million in stock in 1998, becoming the Yahoo Store. Here's where the story gets different. In 2005, Graham decided to start doing seed funding for startups, following the model that Weber had established with Viaweb. He and Viaweb co-founders Robert Morris (the guy who wrote the Morris worm) and Trevor Blackwell started Y Combinator, along with Jessica Livingston. They put in $200,000 to invest in companies, and with successful investments grew to funding a few dozen companies a year. They're different because they pick a lot of technical founders (like themselves) and help those founders find product-market fit, finish their solutions, and launch. And doing so helped them bring us Airbnb, Doordash, Reddit, Stripe, Dropbox, and countless others. Notice that many of these firms have funded the same companies. This is because multiple funds investing in the same company helps distribute risk. But it's also because, in an era where we've put everything from cars to education to healthcare to innovation on an assembly line, we have an assembly line for companies too. We have thousands of angel investors - humans who put capital to work by investing in companies they find through friends, family, and now portals that connect angels with companies. We also have incubators, a trend that began in the late 50s in New York when Joe Mancuso bought a warehouse to help the town of Batavia and opened it up to small tenants. The Batavia Industrial Center provided office supplies, equipment, secretaries, a line of credit, and, most importantly, advice on building a business. The Mancusos had made plenty of money on chicken coops and thought that maybe helping companies start was a lot like incubating chickens - and so incubators were born. Others started incubating. The concept expanded from local entrepreneurs helping other entrepreneurs, and now cities, think tanks, companies, and even universities offer incubation within their walls. Keep in mind, many a university owns a lot of patents developed there, and plenty of companies have sprung up to commercialize the intellectual property incubated there. Seeing that, and how technology companies needed to move faster, we got accelerators like Techstars, founded by David Cohen, Brad Feld, David Brown, and Jared Polis in 2006 out of Boulder, Colorado.
They have worked with over 2,500 companies and run a couple of dozen programs. Some of the companies fail by the end of their cohort, and yet many, like Outreach and SendGrid, grow into great organizations or get acquired. The line between incubator and accelerator can be pretty slim today. Many of the earlier companies mentioned are now the more mature venture capital firms. Many have moved to focus on later-stage companies, with YC and Techstars investing earlier. They attend the demos of companies being accelerated and invest. And given that founding companies and innovating is now on an assembly line, the firms that invest in an A round of funding, which might come after an accelerator, will look to exit in a B round, C round, etc. Or they may elect to continue their risk all the way to an acquisition or IPO. And we have a bevy of investing companies focusing on the much later stages. We have private equity firms and family offices that look to outright own, expand, and either harvest dividends from or sell an asset, or company. We have traditional institutional lenders who provide capital but also invest in companies. We have hedge funds who hedge puts and calls or other derivatives on a variety of asset classes. Each has their sweet spot, even if most will opportunistically invest in diverse assets. Think of the investments made as horizons. The angel investor might have their shares acquired in later rounds in order to clean up the cap table - who owns which parts of a company. This simplifies the shareholder structure as the company takes on larger institutional investors to sprint towards an IPO or an acquisition. People like Arthur Rock, Tommy Davis, Tom Perkins, Eugene Kleiner, Doerr, Masayoshi Son, and so many others have proven that they could pick winners. Or did they prove they could help build winners? Let's remember that investing knowledge and operating experience were as valuable as their capital. Especially when the investments were adjacent to other successes they'd found. Venture capitalists invested more than $10 billion in 1997. $600 million of that found its way to early-stage startups. But most went to preparing a startup with a product to take it to mass market. Today we pump more money than ever into R&D - and our tax systems support doing so more than ever. And so more than ever, venture money plays a critical role in the life cycle of innovation. Or does venture money play a critical role in the commercialization of innovation? Seed accelerators, startup studios, venture builders, public incubators, venture capital firms, hedge funds, banks - they'd all have a different answer. And they should. Few would stick with an investment like Digital Equipment for as long as ARDC did. And yet few provide over 100% annualized returns like they did. As we said in the beginning of this episode, wealthy patrons from pharaohs to governments to industrialists to now venture capitalists have long helped to propel innovation, technology, trade, and intellectual property. We often focus on the technology itself in computing - but without the money, the innovation either wouldn't have been developed or, if developed, wouldn't have made it to the mass market and so wouldn't have had an impact on our productivity or quality of life. The knowledge that comes with those who provide the money can be met with irreverence. Taking an innovation to market means market-ing. And sales.
Most generations see the previous generations as almost comedic, as we can see in the HBO show Silicon Valley when the cookie cutter industrialized approach goes too far. We can also end up with founders who learn to sell to investors rather than raising capital in the best way possible, selling to paying customers. But there's wisdom from previous generations when offered and taken appropriately. A coachable founder with a vision that matches the coaching and a great product that can scale is the best investment that can be made. Because that's where innovation can change the world.
Java, Ruby, PHP, Go. These are languages used to build web applications that dynamically generate code that is then interpreted as a file by a web browser. That file is rarely static these days, and the power of the web is that an app or browser can reach out and obtain some data, get back some XML or JSON or YAML, and provide an experience to a computer, mobile device, or even embedded system. The web is arguably the most powerful, transformational technology in the history of technology. But the story of the web begins with philosophies that far predate its inception. It goes back to a file, which we can think of as a document, on a computer that another computer reaches out to and interprets. A file made up of hypertext. Ted Nelson coined the term hypertext. Plenty of others put the concept of linking objects into the mainstream of computing. But he coined the term, and he's barely connected to it in the minds of many. Why is that? Tim Berners-Lee invented the World Wide Web in 1989. Elizabeth Feinler developed a registry of names that would evolve into DNS, so we could find computers online and access those web sites without typing in impossible-to-remember numbers. Bob Kahn and Leonard Kleinrock were instrumental in the Internet Protocol, which allowed all those computers to be connected together, providing the schemes for those numbers. Some will know these names; most will not. But a name that probably doesn't come up enough is Ted Nelson. His tale is one of brilliance and the early days of computing and the spread of BASIC and an urge to do more. It's a tale of the hacker ethic. And yet, it's also a tale of irreverence - to be used as a warning for those with aspirations to be remembered for something great. Or is it? Steve Jobs famously said “real artists ship.” Ted Nelson did ship. Until he didn't. Let's go all the way back to 1960, when he started Project Xanadu. Actually, let's go a little further back first. Nelson was born to TV director Ralph Nelson and Celeste Holm, who won an Academy Award for her role in Gentleman's Agreement in 1947, took home another pair of nominations through her career, and was the original Ado Annie in Oklahoma!. His dad worked on The Twilight Zone - so of course he majored in philosophy at Swarthmore College and then went off to the University of Chicago and then Harvard for graduate school, taking a stab at film after he graduated. But he was meant for an industry that didn't exist yet and would some day eclipse the film industry: software. While in school he got exposed to computers and started to think about this idea of a repository of all the world's knowledge. And it's easy to imagine a group of computing aficionados sitting in a drum circle, smoking whatever they were smoking, and having their minds blown by that very concept. And yet, it's hard to imagine anyone in that context doing much more. And yet he did. Nelson created Project Xanadu in 1960. As we'll cover, he did a lot of projects during the remainder of his career. The journey is what is so important, even if we never get to the destination. Because sometimes we influence the people who do get there. And the history of technology is as much about failed or incomplete evolutions as it is about those that become ubiquitous. It began with a project while he was enrolled in Harvard grad school. Other word processors were at the dawn of their existence. But he began thinking through and influencing how they would handle information storage and retrieval.
Xanadu was supposed to be a computer network that connected humans to one another. It was supposed to be simple: a scheme for world-wide electronic publishing. Unlike the web, which would come nearly three decades later, it was supposed to be bidirectional, with broken links self-repairing, much as nodes on the ARPAnet did. His initial proposal was a program in machine language that could store and display documents. Coming before the advent of Markdown, ePub, XML, PDF, RTF, or any of the other common open formats we use today, it was rudimentary and would evolve over time. Keep in mind: it was for documents, and as Nelson would say later, the web - which began as a document tool - was a fork of the project.

The term Xanadu was borrowed from Samuel Taylor Coleridge's Kubla Khan, itself written after some opium-fueled dreams about a garden in Kublai Khan's Shangdu, or Xanadu. In his biography, Coleridge explained that the rivers in the poem supply "a natural connection to the parts and unity to the whole," describing a "stream, traced from its source in the hills among the yellow-red moss and conical glass-shaped tufts of bent, to the first break or fall, where its drops become audible, and it begins to form a channel." Connecting all the things was the goal, and so Xanadu was the name.

He gave a talk and presented a paper called "A File Structure for the Complex, the Changing and the Indeterminate" at the Association for Computing Machinery in 1965 that laid out his vision. This was the dawn of interactivity in computing. Digital Equipment had launched less than a decade earlier and brought the PDP-8 to market that same year. The smell of change was in the air and Nelson was right there. After that, he started to see all these developments around the world. He worked on a project at Brown University to develop a word processor with many of his ideas in it. But the output of that project - as with most word processors since - was to get things printed. He believed content was meant to be created and live its entire lifecycle in digital form. This would provide perfect forward and reverse citations, text enrichment, and change management. And maybe, if we all stand on the shoulders of giants, it would allow us to avoid rewriting or paraphrasing the works of others just to include them in our own writings. We could do more without that tedious regurgitation.

He furthered his counter-culture credentials by going to Woodstock in 1969. Probably not for that reason, but it happened nonetheless. And he traveled and worked with more and more people and companies, learning and engaging and enriching his ideas. And then he shared them. Computer Lib/Dream Machines was a paperback book. Or two. It had a cover on each side. Originally published in 1974, it was one of the most important texts of the computer revolution. Steven Levy called it an epic. It's rare to find it for less than a hundred bucks on eBay at this point because of how influential it was and what an amazing snapshot in time it represents.

Xanadu was to be a hypertext publishing system in the form of Xanadocs, or files that could be linked to from other files. A Xanadoc used Xanalinks to embed content from other documents into a given document. Those spans of text would become transclusions: when they changed in the live source document, they changed in every document that included them. The iterations towards working code were slow and the years ticked by. That talk in 1965 gave way to the 1970s, then the 80s. Some thought him brilliant.
Others didn't know what to make of it all. But many knew of his ideas for hypertext, and once the idea was out there, it started to seem inevitable. Byte Magazine published an article of his in 1988 called "Managing Immense Storage," and by then the personal computer revolution had come in full force. The next year, Tim Berners-Lee proposed the World Wide Web, putting its first node online in 1990 using a protocol called the Hypertext Transfer Protocol, or http. Yes, the hypertext philosophy was almost a means of paying homage to the hard work and deep thinking Nelson had put in over the decades. But not everyone saw it as though Nelson had made great contributions to computing.

"The Curse of Xanadu" was an article published in Wired Magazine in 1995. In the article, the author points out that the web had come along using many of the ideas Nelson and his teams had worked on over the years - but the web actually shipped, whereas Xanadu hadn't. Once shipped, the web rose in popularity, becoming the ubiquitous technology it is today. The article looked at Xanadu as vaporware. But there is a deeper, much more important meaning to Xanadu in the history of computing. Perhaps inspired by the Wired article, the group released an incomplete version of Xanadu in 1998. But by then other formats - including PDF, which was invented in 1993, and .doc for Microsoft Word - were the primary mechanisms by which we stored documents, and first gopher and then the web were spreading to interconnect humans with content.

https://www.youtube.com/watch?v=72M5kcnAL-4

The Xanadu story isn't a tragedy. Would we have had hypertext as a part of Douglas Engelbart's oNLine System without it? Would we have object-oriented programming or, later, the World Wide Web without it? The very word hypertext is almost an homage, even if they don't know it, to Nelson's work. And the look and feel of his work lives on in places like GitHub, whether directly influenced or not, where we can see changes in code side-by-side with actual production code - changes that are stored and can perhaps be rolled back forever.

Larry Tesler coined the term Cut and Paste. While Nelson calls him a friend in Werner Herzog's Lo and Behold, Reveries of the Connected World, he also points out that Tesler's term is flawed. And I think this is where we as technologists have to sometimes trim down our expectations of how fast evolutions occur. We take tiny steps because as humans we can't keep pace with the rapid rate of technological change. We can look back and see a two-steps-forward, one-step-back approach since the dawn of written history. Nelson still doesn't think the metaphors that harken back to paper have any place in the online written word.

Here's another important trend in the history of computing. As more and more content has come to live exclusively online, the content has become diluted. One publisher I wrote online pieces for asked that they all be +/- 700 words, that paragraphs be no more than 4 sentences long (preferably 3), and that sentences be written at about a 5th or 6th grade level. Maybe Nelson would claim that this de-evolution of writing is due to search engine optimization gamifying the entirety of human knowledge, and that a tool like Xanadu would have been the fix. After all, if we could borrow the great works of others we wouldn't have to paraphrase them. But I think, as with most things, it's much more nuanced than that. Our always-online, always-connected brains can only accept smaller snippets. So that's what we gravitate towards.
Actually, we have plenty of capacity for whatever we choose to immerse ourselves in. But we have more options than ever before, and of course we immerse ourselves in video games or other less literary pursuits. Or are they more literary? Some generations thought books to be dangerous. As do all oppressors. So who am I to judge where people choose to acquire knowledge, or what kind they indulge themselves in? Knowledge is power and I'm just happy they have it. And they have it in part because others were willing to water down the concepts to ship a product. Because the history of technology is about evolutions, not revolutions. And those often take generations. And Nelson is responsible for some of the evolutions that brought us the ht in http or html. And for that we are truly grateful!

As with the great journey from Lord of the Rings, rarely is greatness found alone. The Xanadu adventuring party included Cal Daniels, Roger Gregory, Mark Miller, Stuart Greene, Dean Tribble, and Ravi Pandya. The project became a part of Autodesk in the 80s, got rewritten in Smalltalk, and was considered a rival to the web, but it's really more of an evolutionary step on that journey. If anything it's a divergence from, and then a convergence back toward, Vannevar Bush's Memex. So let me ask this as a parting thought: are the places where you aren't willing to sacrifice any of your core designs or beliefs worth the price being paid? Are they worth someone else ending up with a place in the history books where (like with this podcast) we oversimplify complex topics to make them digestible? Sometimes it's worth it. In no way am I in a place to judge the choices of others. Only history can really do that - but when it happens it's usually an oversimplification anyway… So the building blocks of the web lie in irreverence - in hypertext. And while some grew out of irreverence and diluted their vision after an event like Woodstock, others like Nelson and his friend Douglas Engelbart forged on. Their visions didn't come with commercial success. But as integral building blocks of the modern connected world, they represent minds as great as practically any in computing.
Key Insights:
Cory Doctorow is AWESOME!
It is depressing. We once, with the creation of the market economy, got interoperability right. But now the political economy blocks us from there being any obvious path to an equivalent lucky historical accident in our future.
The problems in our society are not diametrically opposed: addressing the problems of one thing doesn't necessarily create equal and opposite problems on the other side—but it does change the trade-offs, and so things become very complex and very difficult to solve.
Always keep a trash bag in your car.
Hexapodia!

References:
Books:
Cory Doctorow: How to Destroy Surveillance Capitalism
Cory Doctorow: Attack Surface
Cory Doctorow: Walkaway
Cory Doctorow: Down & Out in the Magic Kingdom
Cory Doctorow: Little Brother
William Flesch: Comeuppance: Costly Signaling, Altruistic Punishment, and Other Biological Components of Fiction
Daniel L. Rubinfeld: A Retrospective on U.S. v. Microsoft: Why Does It Resonate Today?
Louis Galambos & Peter Temin: The Fall of the Bell System: A Study in Prices & Politics
Websites:
Electronic Frontier Foundation: Adversarial Interop Case Studies
Privacy without Monopoly
Cory Doctorow: Craphound
Cory Doctorow: Pluralistic
&, of course:
Vernor Vinge: A Fire Upon the Deep

(Remember: You can subscribe to this… weblog-like newsletter… here: There's a free email list. There's a paid-subscription list with (at the moment, only a few) extras too.)

Grammatized Transcript:
Brad: Noah! What is the key insight?
Noah: Hexapodia is the key insight! Six feet!
Brad: And what is that supposed to mean?
Noah: That there is some nugget of fact that, if you grasp it correctly and place it in the proper context, will transform your view of the situation and allow you to grok it completely.
Brad: And in the context of Vernor Vinge's amazing and mind-bending science-fiction space-opera novel A Fire Upon the Deep?
Noah: The importance of "hexapodia" is that those sapient bushes…
Brad: …riding around on six-wheeled scooters have been genetically…
Noah: …programmed to be a fifth column of spies and agents for the Great Evil.
Brad: However, here we seek different key insights than "hexapodia". Today we seek them from the genius science-fiction author and social commentator Cory Doctorow. I think of him as—it was Patrick Nielsen Hayden, I think, who said around 2004 that he felt like he was living in the future of Scottish science fiction author Ken MacLeod. And he wished Ken would just stop. At times I feel that way about Cory. But we are very happy to have him here. His latest book is, IIRC, How to Destroy Surveillance Capitalism; his latest fiction is Attack Surface. My favorite two books of his are Walkaway and—I think it was your first—Down & Out in the Magic Kingdom.
Cory: That's right. Yes. Thank you. Thank you for that very effusive introduction. I decry all claims of genius, though.
Brad: Well, we know this is a problem when one is dealing with an author whose work one has read a lot of. By now, reading your books, I've spent forty hours of my life looking at squiggles on a page or on a screen and, through a complicated mental process, downloaded to my wetware and then run on it a program that is my image of a sub-Turing instantiation of your mind, who has then told me many very entertaining and excellent stories. So I feel like I know you very well…
Cory: There's this infamous and very funny old auto-reply that Neal Stephenson used to send to people who emailed him. It basically went: "Ah, I get it.
You feel like you were next to me when we were with Hiro Protagonist in Alaska fighting off the right-wing militias. But while you were there with me, I wasn't there with you. And so I understand why you want to, like, sit around and talk about our old military campaigns. But I wasn't on that campaign with you."
Brad: Yes. It was only my own imago, my created sub-Turing instantiation of your mind, that was there…
Cory: Indeed. We are getting off of interoperability, which is what I think we're mostly going to talk about. But this is my cogpsy theory of why fiction works, and where the fanfic dispute comes from. Writers have this very precious thing they say. It is: "I'm writing and I'm writing and all of a sudden the characters start telling me what they want to do." I think that what they actually mean by that is that we all have this completely automatic process by which we try and create models of the people we encounter. Sometimes we never encounter those people. We just encounter second-hand evidence of them. Sometimes those people don't live at all. Think about the people who feel great empathy for imaginary people that cruel catfishers have invented on the internet to document their imaginary battles with cancer. They then feel deeply hurt and betrayed and confused when this person they've come to empathize with turns out to be a figment of someone else's imagination. I think what happens when you write is that you generate this optical link between two parts of your brain that don't normally talk to each other. There are these words that you are explicitly thinking up that show up on your screen. And then those words are being processed by your eyeballs and being turned into fodder for a model in this very naive way. And then the model gets enough flesh on the bones—so it starts telling you what it wants to do. At this point you are basically breathing your own exhaust fumes here. But it really does take what is at first a somewhat embarrassing process of putting on a puppet show for yourself: "Like, everybody, let's go on a quest!" "That sounds great!" "Here we go!" It just becomes something where you don't feel like you're explicitly telling yourself a story. Now the corollary of this is that it sort of explains the mystery of why we like stories, right? Why we have these completely involuntary, emotional responses to the imaginary experiences of people who never lived and died and have no consequence. The most tragic death in literature, that of Romeo and Juliet, is as nothing next to the death of the yogurt I digested with breakfast this morning, because that yogurt was alive and now it's dead, and Romeo and Juliet never lived, never died; nothing that happened to them happened. Yet you hear about Romeo and Juliet…
Noah: …except that a human reads about Romeo and Juliet and cares…
Cory: That is where it matters, yes indeed. But the mechanism by which we care is that we build this model, which is then subjected to the author's torments, and then we feel empathy for the model. What that means is that the reader, when they're done—if the book hit its aesthetic marks, if it did the thing that literature does to make it aesthetically pleasing—still has a persistent model, in the same way that if your granny dies, you still have a model of your granny, right? You are still there. That is why fanfic exists. The characters continue to have imagined lives. If the characters don't go on having imagined lives, then the book never landed for you.
And that's why authors get so pissy about fanfic. They too have this model that they didn't set out to explicitly create, but it's there. And it's important to their writing process. And if someone is putting data in about that modeled person that is not consistent with the author's own perception of them, that creates enormous dissonance. I think that if we understood this, we would stop arguing about fanfic.
Noah: We argue about fanfic?
Brad: Oh yes, there are people who do. I remember—in some sense, the most precious thing I ever read was Jo Walton saying that she believed that Ursula K. Le Guin did not understand her own dragons at all…
Noah: …Yep, correct…
Cory: Poppy Z. Brite—back when Poppy Z. Brite was using that name and had that gender identity—was kicked out of a fan group for Poppy Z. Brite fans on LiveJournal for not understanding Poppy Z. Brite's literature. I think that's completely true. Ray Bradbury to his dying day insisted that Fahrenheit 451 had nothing to do with censorship but was about the dangers of television…
Brad: Fanfic is an old and wonderful tradition. It goes back to Virgil, right? What is the Aeneid but Iliad fanfic?
Cory: And what is Genesis but Babylonian fanfic? It goes a lot further back than that…
Brad: Today, however, we are here to talk not about humans as narrative-loving animals, not about the sheer weirdness of all the things that we run on our wetware, but about "mandated interoperability" and similar things—how we are actually going to try to get a handle on the information and attention network economy that we are building out in a more bizarre and irrational way than I would have ever thought possible.
Cory: Yes. I don't know if the audience will see this, but the title that you've chosen is: "Mandated Interoperability Is Not Going to Work". I am more interested in how we make mandated interoperability work. I don't think it's a dead letter. I think that to understand what's happened you have to understand that the main efficiency that large firms bring to the market is regulatory capture. In an industry with only four or five major companies, all of the executives almost by definition must have worked at one or two of the other ones. Think of Sheryl Sandberg, moving from Google to Facebook. They form an emerging consensus. Sometimes they all sit around the same boardroom table. Remember that photo of the tech leaders around the table at the top of Trump Tower? They converge on a set of overlapping lobbying priorities. They have a lot of excess rents that they can extract to mobilize lobbying in favor of that. One of the things that these firms have done in the forty years of the tech industry is to move from a posture where they were all upstarts and were foursquare for interoperability with the existing platforms—because they understood that things like network advantages were mostly important inasmuch as they conferred a penalty for switching, and that if you could switch easily then the network advantage disappeared. If you could read Microsoft Office documents on a Mac, then the fact that there's a huge network effect of Microsoft Office documents out there is irrelevant. Why? Because you can just run switch ads, and say every document ever created with Microsoft Office is now a reason to own a Mac. But as they became dominant, and as their industries have become super-concentrated, they have swung against interoperability. I think that we need a couple of remedies for that.
I think that we need some orderly, structured remedies in the form of standards. We need to check whether or not those standards are mandated. And we've seen how those standards can be subverted. And so I think we need something that stops dominant firms from subverting standards—a penalty that they pay that is market-based, that impacts their bottom line, and that doesn't rely on a slow-moving or possibly captured regulator but that, instead, can actually just emerge in real time. That is what I call "adversarial interoperability": reverse engineering and scraping and bots. Steve Jobs paying some engineers to reverse engineer Microsoft Office file formats and make the iWork suite, instead of begging Bill Gates to rescue the Mac…
Brad: …But he did beg Bill Gates to rescue the Mac…
Cory: He did that as well. But that wasn't the whole story. He had a carrot and a stick. He had: let's have a managed, structured market. Right. And then he had: what happens if you don't come up to my standards is that we have alternatives, because we can just reverse-engineer your stuff. Look at, for example, the way that we standardized the formatting of personal finance information. There were standards that no one adopted. Then Mint came along, and they wrote bots, and you would give the bots your login credentials for your bank, and they would go and scrape your account data and put it into a single unified interface. This was adversarial interoperability. This spurred the banks to actually come into compliance with the standard. Rather than having this guerrilla warfare, they wanted a quantifiable business process that they could understand from year to year, one that wouldn't throw a lot of surprises that would disrupt their other plans.
Brad: Let me back up: In the beginning, the spirit of Charles Babbage moved upon the face of the waters, and Babbage said: "Let there be electromechanical calculating devices". And there was IBM. And IBM then bred with DARPA in the form of the Sage Air Defense, and begat generation upon generation of programmers. And from them was born FORTRAN and System 360. And FORTRAN and IBM System 360 bestrode the world like the giants of the Nephilim, and Babbage saw it, and it was good. And there was nibbling around the edges from Digital Equipment and Data General. Yea, until one day out of Silicon Valley there emerged crystallized sand doped with germanium atoms, and everything was upset as out of CERN there emerged the http protocol. All the companies that had been constructing their own walled information gardens, requiring you to sign up with AOL and CompuServe and GEnie and four or five others in order to access databases through gopher and whatever—they found themselves overwhelmed by the interoperability tide of the internet. And for fifteen years there was interoperability and openness and http and rss, and everyone frantically trying to make their things as interoperable as possible so that they could get their share of this absolutely exploding network of human creativity and ideas. And then it all stopped. People turned on a dime. They began building their own walled gardens again.
Noah: I feel like we did just get Neal Stephenson on this podcast…
Brad: Sub-Turing! It's a sub-Turing instantiation of a Neal Stephenson imago!
Cory: I think that your point of view or generational outlook or whatever creates a different lens than mine. I think about it like this: In 1979 we got an Apple II+. In 1980, we got a modem card for it. Right.
By 1982, there were a lot of BBS's and that was great. Even though we were in Canada, the BBS software was coming up from the American market. We had local dial-up BBS's running software that was being mailed around on floppies…
Brad: Whish whish whine… Beep beep… Whish… I am trying to make modem noises…
Cory: That sounded like V.42bis. And then by 1984 there were the PC clones. Everyone had a computer. This company that no one had ever heard of—Microsoft—suddenly grew very big. They created this dynamism in the industry. You could have a big old giant, like IBM. You could have two guys in a garage, like Microsoft. The one could eclipse the other. IBM couldn't even keep control of its PCs. They were being cloned left and right. And then Microsoft became the thing it had slain. It became a giant. And the DOJ intervened. Even though Microsoft won the suit ultimately—they weren't broken up…
Brad: They did back off from destroying Google…
Cory: What's missing from that account are the specific mechanisms. We got modems because we got cheap long distance. We got that because in 1982 we had the AT&T breakup. The lead-up to the breakup shifted the microeconomics. People at AT&T were all: don't do that. It's going to piss off the enforcers. We've got this breakup to deal with.
Brad: Yes. The enforcers, the enforcers are important. Both the Modification of Final Judgment and AT&T's anticipatory reaction to it. Plus the periodic attempted antitrust kneecappings of IBM. They meant that when people in IBM turned around and said: "Wait a minute. When we started the PC project, John F. Akers told us we needed to find something for Mary Gates's boy Bill to do, because he sat next to her at United Way board meetings. But this is turning into a monster. We need to squelch them." And from the C-suite came down: "No, our antitrust position is sufficiently fraught that we can't move to squash Microsoft."
Cory: Yes. IBM spent 12 years in antitrust litigation. Hell, they called it: antitrust as Vietnam. They essentially had been tied by the ankles to the back of DOJ's bumper and dragged up and down a gravel road for 12 years. They were outspending the entire DOJ legal department every single year for that one case. And one of the things that DOJ really didn't like about IBM was tying software to hardware. And so when Phoenix makes the IBM ROM clone, IBM is like: Yeah, whatever. Any costs we pay because of the clone ROM are going to be lower than the costs we will incur if we get back into antitrust hell—and the same goes for Microsoft. They got scared off. What we were seeing, what it felt like, the optimism that I think we felt and of which we were aware was—it looked like we'd have protocols and not products, and we'd have a pluralistic internet, not five giant websites filled with screenshots of text from the other four. But our misapprehension was not due to technological factors. It was our failing to understand that Bork and Reagan had shivved antitrust in the guts in 1980, and it was bleeding out. So by the time Google was big enough to do to everyone else what Microsoft had not been able to do to them, there was no one there to stop Google.
Noah: Cory, let me ask a question here. I'm the designated grump of the podcast. Brad is the designated history expounder. I want to know: Why do we care right now? I've written about interoperability with regard to electric cars and other emerging technologies. What things in the software world are people hurt by not having interoperability for?
What are the big harms in software to consumers or to other stakeholders from lack of interoperability?
Cory: Let me frame the question before I answer it. We have market concentration in lots of different sectors for similar reasons - mergers. We should have different remedies for them. We heard about Babbage. I would talk about Turing and the universality of the computer. Interoperability represents a pro-competitive remedy to anti-competitive practices that is distinct and specific to computers. I don't know if you folks know about the middle-gauge muddle in Australia. Independent states and would-be rail barons laid their own gauge of rail across the country. You can't get a piece of rolling stock from one edge of the country to the other. For 150 years they have been trying to build designs that can drop one set of wheels where the track needs it. And none of them have worked. And now their solution is to tear up rails and put down new rails. If that were a software object, we'd just write a compatibility layer. Where we have these durable anti-competitive effects in the physical world, ones that sometimes necessitate these very difficult remedies, we can actually facilitate decentralized remedies where people can seize the means of computation to create digital remedies: self-determination, the right to decide how to talk to their friends and under what circumstances, as opposed to being forced to choose between being a social person and being private…
Brad: For me, at least, there are lots and lots of frictions that keep me from seeing things that I would like to see, and keep me from cross-referencing things that I would like to cross-reference. There are bunches of things I've seen on Twitter and Facebook in the past that, because they are inside the walled gardens, I am definitely not able to get out quickly and easily and cheaply enough to put them into the wider ideas flow. And I feel stupider as a result. And then there are all the people who have been trapped by their own kind of cognitive functioning, so that they are now a bunch of zombies with eyeballs glued to the screen being fed terror so that they can be sold fake diabetes cures and overpriced gold funds…
Noah: That's a good angle right here. If we look at the real harms that are coming through the internet right now—I worry about Kill Zones, and of course I worry about the next cool thing getting swallowed up by predatory acquisitions. That's our legitimate worry for sure. When I look at the internet and what bad the internet is causing, I do not see the lack of alternative information sources as the biggest problem. I see the people who are the biggest problem as coming precisely from alternative information sources. This is not to say we should get rid of those sources. This is not to say we should have mass censorship and ban all the anti-vax sites. I'm not saying that. But if we look at the issues—there was a mass banning of Trump and many of the Q-Anons from the main social media websites, and yet a vast underground network of alternative right-wing media has sprung up.
Cory: It seems like they were able to. Let me redirect from the harms that Brad raised. I think those are perfectly good harms. But I want to go to some broader harms. In the purely digital online world, we had some people we advised at EFF who were part of a medical cancer previvor group—women who have a gene that indicates a very high likelihood of cancer.
They had been aggressively courted by Facebook at a time when they were trying to grow their medical communities. And one of the members of this group, who wasn't a security researcher or anything, was just noodling around on Facebook and found that you could enumerate the membership of every group on Facebook, including hers. That's obviously a really significant potential harm to people in medical communities. She reported it to Facebook. Facebook characterized her report as a feature request and wouldn't fix it. She made more of a stink. They said: fine, we're going to do a partial fix, because it would have interfered with their ad-tech stack to do a full fix. So you have to be a member of a group to enumerate the group. This was still insufficient. But they had this big problem with inertia—with the collective action problem of getting everyone who's now on Facebook to leave Facebook and go somewhere else. They were all holding each other mutually hostage. Now you could imagine that they could have set up a Diaspora instance, and they could have had either a mandated or standards-defined interface that allowed those people to talk to their friends on Facebook. And they could have a little footer at the bottom of each message: today 22% of the traffic in this group originated on our Diaspora instance; once that tips to 60%, we're all leaving and quitting Facebook. They might do this with a bot, without Facebook's cooperation, in the absence of Facebook's legal right to prevent those bots. Facebook has weaponized the Computer Fraud and Abuse Act and other laws to prevent people from making these bots that would allow them to interoperate with Facebook—even though, when Facebook started, the way that it dealt with its issues with MySpace was creating MySpace bots, where you could input your login and password, and it would get your waiting MySpace messages and put them in your Facebook inbox and let you reply to them. Facebook has since sued Power Ventures for doing the same thing. They're engaged in legal activity against other bot producers that are doing beneficial pro-user things. That's one harm. Another harm that I think is really important here is repair. Independent repairs are about 5% of US GDP. The lack of access to repair is of particular harm to people who are already harmed the most: it raises the cost of being poor. The ability to control repair is a source of windfall profits. Tim Cook advised his investors in 2019, the year after he killed twenty right-to-repair bills at the state level, that the biggest threat to Apple's profits was that people were fixing their devices instead of throwing them away. It's an environmental problem, and so on. The biggest problem with right-to-repair is not that the companies don't provide their data or their diagnostic codes, or that they encrypt diagnostic codes. The problem is that you face felony prosecution under the CFAA and DMCA, as well as ancillary stuff like non-compete and non-disclosure and so on through federal trade secrecy law, if you create tools for repairs without the cooperation of the vendors. This is a real harm that arises out of the rules that have been exploited to block interoperability.
Brad: This goes deep, right?
This affects not just tech but the world - or, rather, because tech has eaten the world, hard-right, unsympathetic state representatives from rural Missouri are incredibly exercised about right-to-repair, and about the fact that John Deere does not have enough internal capacity to repair all the tractors that need to be repaired in the three weeks before the most critical-need part of the year.
Cory: This is an important fracture line. There are people who have a purely instrumental view: my constituents need tractor repair, so I will do whatever it takes to get them tractor repair. In California we got a terrible compromise on this brokered with John Deere—it was basically a conduct remedy instead of a structural change. Right. Something I questioned a lot about Klobuchar's antitrust story is that she keeps saying: I believe that we need to jettison the 40-year consumer-welfare standard and return to a more muscular antitrust that is predicated on social harms that include other stakeholders besides consumers paying higher prices, and I have a bipartisan consensus on this because Josh Hawley agrees with me. But Josh Hawley does not agree with her. Josh Hawley just wants to get Alex Jones back on Twitter, right. And it begins and ends there. She might be able to get the inertia going where Josh Hawley is put in the bind where he either has to brief for a broader antitrust cause of action that includes social harms, or he has to abandon Alex Jones to not being on Twitter. And maybe he'll take Alex Jones if that's the price. But I do think that that's a huge fracture line: there are honest brokers who care about the underlying principle and the long-run effects of bad policy, and there are people who just want to fix something for a political point or immediate benefit.
Brad: Fixing it to the extent that fixing something scores a political point—that does mean actually doing good things for your constituents, who include not just Alex Jones, but the guys in rural Missouri who want their John Deere tractors repaired cheaply.
Cory: This is how I feel about deplatforming. I was angry about deplatforming for 10 years, when it was pipeline activists and sex workers and drag queens who were being forced to use their real names, and trans people were forced to use their dead names, and political dissidents in countries where they could be rounded up and tortured and murdered if they adhered to Facebook's real-names policy, and all of that stuff. First they came for the drag queens, and I said nothing because I wasn't a drag queen. Then they came for the far-right conspiratorialists. But they're fair-weather friends. It's like the split between open source and free software where, you know, the benefits of technological self-determination were subsumed into the instrumental benefits of having access to the source so you could improve it. What we have is free software for the tech monopolists, for they can see the source and modify the source of everything on their backend. And we have open source for the rest of us. We can inspect the source, we can improve their software for them, but we don't get to choose how their backends run. And since everything loops through their backends, we no longer have software freedom. That's the risk if you decouple instrumental from ethical propositions. You can end up with a purely instrumental fix that leaves the ethical things that worry you untouched, and in fact in a declining spiral.
Noah: I want to argue.
I don't think we get enough argument on this podcast. I want to inject a little here. A turning point for my generation in terms of our use of the internet was Gamergate. That happened in 2014. Gamergate largely morphed after that into the Trump movement and the alt-right. Gamergate destroyed what I knew as online nerd culture. It was an extinction-level event for the idea that nerd culture existed apart from the rest of society. It was a terrible thing. Maybe nerd culture couldn't have lasted, but a giant subculture that I enjoyed and partially defined myself by as a young person was gone. And not only that, not only me—I'm centering myself and making it all about me here, but a lot of people got harassed. Some good friends of mine got harassed. It was really terrible as an event in and of itself, irrespective of the long-term effects. Even Moot, a big, huge defender of anonymity and free speech, eventually banned Gamergate topics from 4chan. That was the moment when I realized that the idea of free speech as something guarded by individual forums or platforms separately from the government—that that idea was dead. When Moot banned Gamergate from 4chan, I said: okay, we're in a different era. That was the Edward R. Murrow moment. That was the moment we started going back toward Dan Rather and Edward R. Murrow and the big three television companies in the 1950s—when Moot banned Gamergate. Maybe this just has to happen. Maybe bad actors are always able to co-opt a fragmented internet. There's no amount of individual Nazi punching that can get the Nazis out. If you have people whose speech is entirely focused on destroying other people's right to speak, as Gamergate was, then free speech means nothing, because no one feels free to speak. I wonder whether fragmentation of platforms makes it harder to police things like Gamergate and thus causes Nazis to fractally permeate each little space on the internet and every little pool of the internet. Wherever we have one big pool, we have economies of scale in guarding that pool.
Brad: That is: what you are saying is that an information world of just four monopolistic, highly oligopolistic walled gardens is bad, but an internet in which you cannot build any wall around your garden is bad as well. Then what we really need is a hundred walled gardens blooming, perhaps. But I want to hear what Cory has to say about this and interoperability.
Cory: I found that so interesting. I had to get out some paper and take notes. First of all, I would trace it back before the Gamergate issue. Before Gamergate there were the Sad Puppies - the disruption of the Hugo Awards by far-right authors. It was the same ringleaders. Gamergate was the second act of the Sad Puppies. So I'm there with you. I was raised by Trotskyists. I want to say that, listening to you describe how you feel about nerd culture after you discovered that half of your colleagues and friends were violent misogynists—it sounds a lot like how Trotskyists talk about Stalinists, right. You have just recounted the internet nerd version of Homage to Catalonia. Orwell goes to Spain to fight the fascists and a Stalinist shoots him through the throat. We in outsider or insurgent or subcultural movements often have within our conception of a group people who share some characteristics and diverge on others. We paper over those divergences until they fracture. Think about the punk/Nazi-punk split.
This anti-authoritarian movement is united around a common aesthetic and music and a shared cultural identity. And there's this political authoritarian/anti-authoritarian thing sitting in the middle. And they just don't talk about it until they start talking about it—the Dead Kennedys record: Nazi Punks F--- Off. And here we are, still in the midst of that reckoning. That's where Stormfront comes from and all the rest of it. This is not distinct to the internet. It is probably unrealistic, definitely unrealistic, for there to be a regime in which conduct that is lawful can find no home. Not that it won't happen in your home, but that it won't happen in anyone's home. The normative remedy, where we just make some conduct that is lawful so far beyond the pale that everyone ceases to engage in it—that has never really existed. Right. You can see that with conduct that we might welcome today as, you know, socially fine, and conduct that we dislike—whether that's, you know, polyamory. You go back to the Futurian house, where Judy Merril and Fred Pohl and C.M. Kornbluth lived in the thirties, and they had this big, weird polyamorous household of leftist science fiction writers right at a time when it was unmentionably weird to do it. And today it's pretty mainstream—at least in some parts of California. In the absence of an actual law against it, it's probably going to happen. The first question is: is our response to people who have odious ideas that we want there to be nowhere where they can talk about it? If that's the case, we'll probably have to make a law against them.
Noah: Right. But hold on. Is it ideas, or is it actions? If you harass someone you're not expressing an idea, you're stopping them from expressing theirs.
Cory: Absolutely. So the issue is: Nazis talking to other Nazis is okay. It's when Nazis talk to other Nazis and figure out how to go harass someone that it becomes a problem. Let me give you an example of someone I know who is in the midst of one of these harassment campaigns. There's a brilliant writer, a librettist, novelist, and comics author named Cecil Castellucci. She also used to be a pioneering riot grrrl and toured with Sloan. So she's just this great polymath person. And because she's a woman who writes comics, men on the internet hate her. And there's a small and dedicated cadre of these men who figured out a way to mess with women on Twitter. They send you a DM that is really violent and disgusting. They wait until they see the read receipt, and then they delete it. Twitter, to its credit, will not accept screenshotted DMs as evidence of harassment, because it would be very easy for those same men to forge DMs from their targets and get those people kicked off Twitter. Then what they do is they revictimize their targets by making public timeline mentions that comport with Twitter's rules unless you've seen the private message. And they make references to the private message that trigger the emotions from the private message over and over again. It is a really effective harassment technique. The women they use it against are stuck on Twitter, because their professional lives require them to be on Twitter, right. Their careers would end to some important degree if they weren't part of this conversation on Twitter. Now, imagine if you had Gotham Clock Tower, Barbara Gordon's secret home, which was a Mastodon instance that was federated with Twitter, either through a standard or through a mandate or through adversarial interoperability.
There could be a dozen women there who could agree among themselves that they're willing to treat screenshotted DMs as evidence of harassment, so that they could block and silence and erase all presence of these horrible men. We'd still want Twitter to do something about them. But some of those men will slip through Twitter's defenses - not just because Twitter can't catch everyone at that scale, but because the range of normal activity at scale is so broad: a hundred million people have a hundred and one million use cases every day. Those women could still be on Twitter, but not subject to the harassment on Twitter. It's a way out for them. Maybe, in the way that we talk about states being democracy's laboratories, these satellite communities could pioneer moderation techniques that range beyond takedowns or account terminations or warning labels. There are so many different ways we could deal with this. You could render some comments automatically in Comic Sans. They could try them and see if they work. And they could be adopted back into main Twitter. That's what self-determination gets you: it gets you the right to set the rules of your discourse, and it gets you the right to decide who you trust to be within the group of people who make those rules.
Brad: So if we had the real interoperable world, we would have bots that would screen things according to someone's preferences. And you could sign up to have that bot included in your particular bot list to pre-process and filter, so that you don't have to wade through the garbage.
Cory: Sure. And there might be some conduct that we consider so far beyond the pale that we actually criminalize it. Then we can take on the platforms where that conduct routinely takes place, and things like reforms to Section 230 would cease to be nearly so important. We would be saying that if you are abetting unlawful conduct, and we see a remedy for preventing this unlawful conduct, and you refuse to implement that remedy, we might defenestrate you. We might do something worse. Think of how the phone network works. It is standardized. There are these standard interchanges. There are lots of ways it can be abused. Every now and then, from some Caribbean island, we get a call that fakes a number from a Caribbean island, and if you call it back, you're billed at $20 a minute for a long-distance call to have someone go: no, it was a wrong number. When that happens, the telco either cleans up its act or all the other telcos break their connection to it. There's certain conduct that's unlawful on the phone network, not unlawful because it cheats the phone company—not toll fraud—but unlawful because it's bad for the rest of the world, like calling in bomb threats. Either the customer gets terminated or the operator is disciplined by law. All of those things can work without having to be in this regime where you have paternalistic control, where you vest all of your hope in a God-King who faces no penalty if he makes a bad call. They say: we'll defend your privacy when the FBI wants to break the iPhone. But when they threaten to shut down our manufacturing, we'll let them spy on you, even as they're opening up concentration camps and putting a million people in them.
Brad: Was that the real serpent in all of these walled gardens? Was the advertising-supported model the thing that turns your eyeballs into the commodity to be enserfed?
If we had the heaven of micropayments, would we manage to avoid all of this?
Cory: We've had advertising for a long time. The toxicity of advertising is pretty new. Mostly what's toxic about advertising is surveillance, and not because I think the surveillance allows them to do feats of mind control. I think everyone who's ever claimed to have mind control turned out to be lying to themselves or everyone else. Certainly there is not a lot of evidence for it. You have these Facebook large-scale experiments: 60 million people subjected to a nonconsensual psychological intervention to see if they can be convinced to vote. And you get a 0.38% effect size. Facebook should be disqualified from running a lemonade stand if we catch them performing nonconsensual experiments on 60 million people. But, at the same time, 0.38% effect sizes are not mind control. They do engage in a lot of surveillance. It's super-harmful because it leaks, because it allows them to do digital redlining, because it allows them to reliably target fascists with messages that, if they were uttered in public where everyone could see them, might cause the advertiser to be in bad odor. They can take these dog whistles and they can whisper them to the people who won't spread them around. Those are real harms. You have to ask yourself: why don't we have a privacy law that prohibits the nonconsensual gathering of data and imposes meaningful penalties on people who breach data? I was working in the EU when GDPR was passed. The commissioners I spoke to there said: no one has ever lobbied me as hard as I've been lobbied now. Right now we have more concentration in ad tech than in any other industry, I think, except for maybe eyeglasses, glass bottles, and professional wrestling.
Brad: Are we then reduced to: "Help us, Tim Cook! You are our only hope!"?
Cory: I think that that's wrong, because Tim Cook doesn't want to give you self-determination. Tim wants you to be subject to his determinations. Among those determinations are some good ones. He doesn't want Facebook to own your eyeballs. You go, Tim. But he also wants you to drop your iPhone in a shredder every 18 months, rather than getting it fixed.
Brad: Although I must say, looking at the M1 chip, I'm very tempted to take my laptop and throw it in the shredder today to force me to buy a new one.
Noah: It's interesting how the iPhone conquered. And yet very few people still use Macs. Steve Jobs's dream was never actualized.
Cory: Firms that are highly concentrated distort policy outcomes, and ad tech is highly concentrated. And we have some obviously distorted policy outcomes. We don't have a federal privacy law with a private right of action. There are no meaningful penalties for breaches. We understand that breaches have compounding effects. A breach that doesn't contain any data that is harmful to the user can be merged with another breach, and together they can be harmful—and that's cumulative. And data has a long half-life. Just this week, Ed Felten's old lab published a paper on how old phone numbers can be used to defeat two-factor authentication. You go through a breach and find all the phone numbers that are associated with two-factor authentication. Then you can go to Verizon and ask: which of these phone numbers is available? Which of these people has changed their phone number? Then you can request that phone number on a new signup—and then you can break into their bank account and steal all their money. Old breaches are cumulative.
Yet we still have this actual-damages regime for breaches, instead of statutory damages that take account of the downstream effects and the unquantifiable risks that are imposed on the general public through the nonconsensual collection and retention of data under conditions that inevitably lead to breaches.
Brad: Okay. Well, I'm very down. So are we ready to end? I think we should end on this downer note.
Noah: My favorite Cory Doctorow books also end on a downer note.
Brad: Yes. Basically, the political economy does not allow us to move out of this particular fresh semi-hell in which we're embedded. But you had something to say?
Cory: Everybody hates monopolies now. So we'll just team up with the people angry about professional wrestling monopolies and eyeglass monopolies and beer monopolies, and we'll form a Prairie Fire United Front of people who will break the monopoly because we're all on the same side—even though we're fighting our different corners of it—the same way that ecology took people who cared about owls and put them on the side of people who care about ozone layers, even though charismatic nocturnal birds are not the gaseous composition of the upper atmosphere.
Brad: Hey, if you have the charismatic megafauna on your side, you're golden.
Noah: How did the original Prairie Fire work out? Let's wrap it up there. This is a really great episode. Cory, you're awesome. Thanks so much for coming on, and feel free to come back any time.
Cory: I'd love to. I've just turned in a book about money laundering and cryptocurrency—a noir cyberthreat thriller. Maybe when that comes out, I can come on and we can talk about that. That feels like it's up your guys' alley.
Brad: That would be great. Okay. So, as we end this: Noah, what is the key insight?
Noah: Hexapodia is the key insight. And what are the other key insights that we got from this day?
Brad: I'm just depressed. I had a riff about how we got interoperability right with the creation of the market economy and the end of feudalism—and how that was a very lucky historical accident. But I don't see possibilities for an equivalent lucky historical accident in our future.
Noah: I have a key insight. It is a little vague, but hopefully it will be good fodder for future episodes. The problems in our society are not diametrically opposed. We have to find optimal interior-solution trade-offs between things that have a non-zero dot product. Sometimes solving the problem with one thing doesn't necessarily create exactly equal and opposite problems on the other side. Instead, it changes the trade-offs that you face with regard to other problems. These things become very complex. You have things like the antitrust problem and things like the Nazi problem. In our society, addressing one doesn't necessarily worsen the other. More action against Nazis doesn't necessarily mean less action in antitrust. It simply means you have to think about antitrust in a slightly different way, and vice versa. That does make these institutional problems very difficult to solve.
Brad: Cory, do you wish to add a key insight?
Cory: A key insight is: always keep a trash bag in your car.
Brad: This has been Brad DeLong and Noah Smith's podcast this week with the amazing Cory Doctorow. Thank you all very much for listening. Get full access to Brad DeLong's Grasping Reality at braddelong.substack.com/subscribe
I often think of companies in relation to their contribution to the next evolution in the forking and merging of disciplines in computing that brought us to where we are today. Many companies have multiple contributions. Few have as many such contributions as Apple. But there was a time when they didn't seem so innovative. This lost decade began about halfway through the tenure of John Sculley and can be seen through the lens of the CEOs.

There was Sculley, CEO from 1983 to 1993. Co-founders and spiritual centers of Apple, Steve Jobs and Steve Wozniak, left Apple in 1985 - Jobs to create NeXT and Wozniak to jump into a variety of ventures, like making universal remotes and wireless GPS trackers, and other adventures. This meant Sculley was finally in a position to be fully in charge of Apple. His era would see sales grow 10x, from $800 million to $8 billion. Operationally, he was one of the more adept CEOs at cash management, putting $2 billion in the bank by 1993. Suddenly the vision of Steve Jobs was paying off. That original Mac started to sell and grow markets. But during this time, first the IBM PC and then the clones, all powered by the Microsoft operating system, completely took the operating system market for personal computers. Apple had high margins yet struggled for relevance. Under Sculley, Apple released HyperCard, funded a skunkworks team in General Magic - arguably the beginning of ubiquitous computing - and, using many of those same ideas, backed the Newton, coining the term personal digital assistant. Under his leadership, Apple marketing sent 200,000 people home with a Mac to try it out. Putting the device in the hands of the people is probably one of the more important lessons they still teach newcomers who work in Apple Stores.

Looking at the big financial picture, it seems like Sculley did alright. But in Apple's fourth-quarter earnings call in 1993, they announced a 97 percent drop in profit from the same time in 1992. This was also when a serious technical debt problem began to manifest itself. The Mac operating system grew from the system those early pioneers built in 1984 into Macintosh System Software, going from version 1 to version 7. But after annual releases leading up to version 6, it took three years to develop System 7, and the direction to take with the operating system caused a schism in Apple engineering around what would happen once 7 shipped. It seems like most companies go through almost the exact same schism. Microsoft quietly grew NT to resolve its issues with Windows 3 and 95 until NT finally became the thing in 2000. IBM had invested heavily into that same code, basically, with Warp - but wanted something new.

Something happened while Apple was building System 7. They lost Jean-Louis Gassée, who had been head of development since Steve Jobs left. When Sculley gave everyone a copy of his memoir, Gassée provided a copy of The Mythical Man-Month, from Fred Brooks' experience with the IBM System/360. It's unclear today if anyone read it. To me this is really the first big sign of trouble. Gassée left to build another OS, BeOS. By the time System 7 was released, it was clear that the operating system was bloated and needed a massive object-oriented overhaul, and under Sculley the teams were split, with one team eventually getting spun off into its own company that then became a part of IBM to help with their OS woes. The team at Apple took six years to release the next operating system. Meanwhile, one of Sculley's most defining decisions was to avoid licensing the Macintosh operating system.
Probably because it was just too big a mess to do so. And yet everyday users didn't notice all that much, and most loved it. But third-party developers left. And that was at one of the most critical times in the history of personal computers, because Microsoft was gaining a lot of developers for Windows 3.1 and then released the wildly popular Windows 95. The Mac accounted for most of the revenue of the company, but under Sculley the company dumped a lot of R&D money into the Newton. As with other big projects, the device took too long to ship, and when it did, the early PDA market was a red ocean with inexpensive competitors. The Palm Pilot effectively ended up owning that pen computing market. Sculley was a solid executive. And he played the part of visionary from time to time. But under his tenure Apple faced operating system problems, rumors about Windows 95, and developers leaving Apple behind for the Windows ecosystem, and whether those technical issues are on his lieutenants or on him, the buck stops there. The Windows clone industry led to PC price wars that caused Apple revenues to plummet. And so Markkula was off to find a new CEO. Michael Spindler became CEO from 1993 to 1996. The failures of the Newton and the Copland operating system are placed at his feet, even though they began in the previous regime. Markkula had hired Digital Equipment and Intel veteran Spindler to assist in European operations, and he rose to President of Apple Europe and then ran all international operations. He would become the only CEO to have no new Mac operating system released in his tenure. Missed deadlines abounded with Copland and then Tempo, which would become Mac OS 8. And those aren't the only products that came out at the time. We also got the PowerCD, the Apple QuickTake digital camera, and the Apple Pippin. Bandai had begun trying to develop a video game system with a scaled-down version of the Mac. The Apple Pippin realized Markkula's idea from when the Mac was first conceived as an Apple video game system. There were a few important things that happened under Spindler, though. First, Apple moved to the PowerPC architecture. Second, he decided to license the Macintosh operating system to companies wanting to clone the Macintosh. And he held discussions with IBM, Sun, and Philips about acquiring Apple. Dwindling reserves, increasing debt. Something had to change, and within three years, Spindler was gone. Gil Amelio was CEO from 1996 to 1997. He moved from Apple's board, where he sat while CEO of National Semiconductor, to CEO of Apple. He inherited a company short on cash and high on expenses. He quickly began pushing forward OS 8, cut a third of the staff, streamlined operations, dumped some poor-quality products, and released new products Apple needed to be competitive, like the Apple Network Server. He also tried to acquire BeOS for $200 million, which would have brought Gassée back, but instead acquired NeXT for $429 million. But despite the good trajectory he had the company on, the stock was still dropping, Apple continued to lose money, and an immovable force was back - now with another decade of experience launching two successful companies: NeXT and Pixar. The end of the lost decade can be seen as the return of Steve Jobs. Apple didn't have an operating system. They were in a lurch, so to speak. I've seen or read it portrayed that Steve Jobs intended to take control of Apple. And I've seen it portrayed that he was happy digging up carrots in the backyard but came back because he was inspired by Jony Ive.
But I remember the feel around Apple changed when he showed back up on campus. As with other companies that dug themselves out of a lost decade, there was a renewed purpose. There was inspiration. By 1997, one of the heroes of the personal computing revolution, Steve Jobs, was back. But not quite… He became interim CEO in 1997 and immediately turned his eye to making Apple profitable again. Over the past decade, the product line had expanded to include a dozen models of the Mac. Anyone who's read Geoffrey Moore's Crossing the Chasm, Inside the Tornado, and Zone To Win knows this story all too well. We grow, we release new products, and then we eventually need to take a look at the portfolio and make some hard cuts. Apple released the Macintosh II in 1987, then the Macintosh Portable in 1989, then the IIcx and IIci in '89 along with the Apple IIgs, the last of that series. Facing competition in different markets, we saw the LC line come along in 1990 and the Quadra in 1991, the same year three models of the PowerBook were released. Different printers, scanners, and CD-ROM drives had come along by then, and in 1993 we got a Macintosh TV, the Apple Newton, and more models of the LC, and by 1994 even more of those plus the QuickTake, the Workgroup Server, and the Pippin, and by 1995 there were a dozen Performas, half a dozen Power Macintosh 6400s, the Apple Network Server, and yet another version of the Performa 6200, and we added the eMate and the beige G3 in 1997. The SKU list was a mess. Cleaning that up took time but helped prepare Apple for a simpler sales process. Today we have a good, better, best lineup for each device, with many a computer being built-to-order. Jobs restructured the board, ending the long tenure of Mike Markkula, who'd been so impactful at each stage of the company so far. One of the forces behind the rise of the Apple computer and the Macintosh was about to change the world again, this time as the CEO.
There was a nexus of Digital Research and Xerox PARC, along with Stanford and Berkeley, in the Bay Area. The rise of the hobbyists and the success of Apple attracted some of the best minds in computing to Apple. This confluence was about to change the world. One of those brilliant minds that landed at Apple started out as a technical writer. Apple hired Jef Raskin as their 31st employee, to write the Apple II manual. He quickly started harping on people to build a computer that was easy to use. Mike Markkula wanted to release a gaming console or a cheap computer that could compete with the Commodore and Atari machines at the time. He called the project "Annie." The project began with Raskin, but he had a very different idea than Markkula's. He summed it up in an article called "Computers by the Millions" that wouldn't see publication until 1982. His vision was closer to his PhD dissertation, bringing computing to the masses. For this, he envisioned a menu-driven operating system that was easy to use and inexpensive. It wasn't yet a GUI in the sense of a windowing operating system, so it could run on chips that were rapidly dropping in price. He planned to use the 6809 chip for the machine and give it a five-inch display. He didn't tell anyone that he had a PhD when he was hired, as the team at Apple was skeptical of academia. Jobs provided input, but was off working on the Lisa project, which used the 68000 chip. So they had free rein over what they were doing. Raskin quickly added Joanna Hoffman for marketing. She was on leave from getting a PhD in archaeology at the University of Chicago and was the marketing team for the Mac for over a year. They also added Burrell Smith, employee #282 from the hardware technician team, to do hardware. He'd run with the Homebrew Computer Club crowd since 1975 and had just strolled into Apple one day and asked for a job. Raskin also brought in one of his students from the University of California San Diego who was taking a break from working on his PhD in neurochemistry. Bill Atkinson became employee 51 at Apple and joined the project. They pulled in Andy Hertzfeld, whom Steve Jobs hired when Apple bought one of his programs as he was wrapping up his degree at Berkeley and who'd been sitting on the Apple services team and doing Apple III demos. They added Larry Kenyon, who'd worked at Amdahl and then on the Apple III team. Susan Kare came in to add art and design. They, along with Chris Espinosa - who'd been in the garage with Jobs and Wozniak working on the Apple I - ended up comprising the core team. Over time, the team grew. Bud Tribble joined as the manager for software development. Jerrold Manock, who'd designed the case of the Apple II, came in to design the now-iconic Macintosh case. The team would eventually expand to include Bob Belleville, Steve Capps, George Crow, Donn Denman, Bruce Horn, and Caroline Rose as well. It was still a small team. And they needed a better code name. But chronologically, let's step back to the early project. Raskin chose his favorite apple, the Macintosh, as the codename for the project. As far as codenames go, it was a pretty good one. So their mission would be to ship a machine that was easy to use, would appeal to the masses, and be at a price point the masses could afford. They were looking at 64k of memory, a Motorola 6809 chip, and a 256x256 bitmapped display. Small, light, and inexpensive. Jobs' relationship with the Lisa team was strained, so he was taken off of that project and started moving in on the Macintosh team.
It was quickly the Steve Jobs show. Having seen what could be done with the Motorola 68000 chip on the Lisa team, Jobs had them redesign the board to work with that. After visiting Xerox PARC at Raskin's insistence, Jobs finally got the desktop metaphor and true graphical interface design. Xerox had not been quiet about the work at PARC. Going back to 1972 there were even television commercials. And Raskin had done time at PARC while on sabbatical from Stanford. Information about Smalltalk had been published, and people like Bill Atkinson were reading about it in college. People had been exposed to the mouse all around the Bay Area in the 60s and 70s or had read Engelbart's scholarly works on it. Many of the people that worked on these projects had doctorates and were academics. They shared their research as freely as love was shared during that counter-culture time, just as it had passed from MIT to Dartmouth and then, in the back of Bob Albrecht's VW, had spread around the country in the 60s. That spirit of innovation and the constant evolutions over the past 25 years found their way to Steve Jobs. He saw the desktop metaphor and mouse and fell in love with it, knowing they could build one for less than the $400 unit Xerox had. He saw how an object-oriented programming language like Smalltalk made all that possible. The team was already on their way to the same types of things, and so Jobs told the people at PARC about the Lisa project, but not yet about the Mac. In fact, he was as transparent as anyone could be. He made sure they knew how much he loved their work and disclosed more than I think the team planned on him disclosing about Apple. This is the point where Larry Tesler and others realized that the group of rag-tag garage-building Homebrew hackers had actually built a company that had real computer scientists and was on track to change the world. Tesler and some others would end up at Apple later - to see some of their innovations go to a mass market. Steve Jobs at this point totally bought into Raskin's vision. Yet he still felt they needed to make compromises with the price and better hardware to make it all happen. Raskin couldn't make the kinds of compromises Jobs wanted. He also had an immunity to the now-infamous Steve Jobs reality distortion field, and they clashed constantly. So eventually Raskin left the project just when it was starting to take off. Raskin would go on to work with Canon to build his vision, which became the Canon Cat. With Raskin gone, and armed with a dream team of mad scientists, they got to work, tirelessly pushing towards shipping a computer they all believed would change the world. Jobs brought in Fernandez to help with projects like the Mac OS and later HyperCard. Wozniak had a pretty big influence over Raskin in the early days of the Mac project and helped here and there with the project, like with the bit-serial peripheral bus on the Mac. Steve Jobs wanted an inexpensive mouse that could be manufactured en masse. Jim Yurchenco from Hovey-Kelley, later called IDEO, got the task - given that trusted engineers at Apple had full dance cards. He looked at the Xerox mouse and other devices around - including trackballs in Atari arcade machines. Those used optics instead of mechanical switches. As the ball under the mouse rolled, beams of light would be interrupted, and the cost of those components had come down faster than the technology in the Xerox mouse. He used a ball from a roll-on deodorant stick and got to work.
The rest of the team designed the injection-molded case for the mouse. That work began with the Lisa, and by the time they were done, the price was low enough that every Mac could get one. Armed with a mouse, they figured out how to move windows over the top of one another, and Susan Kare designed iconography that is a bit less 8-bit today but often every bit as true to form. Learning how they wanted to access various components of the desktop, or find things, they developed the Finder. Atkinson gave us marching ants, the concept of double-clicking, the lasso for selecting content, the menu bar, MacPaint, and later, HyperCard. It was a small team, working long hours, driven by Jobs' quest for perfection. Jobs made the Lisa team the enemy. Everything not the Mac just sucked. He took the team to art exhibits. He had the team sign the inside of the case to infuse them with the pride of an artist. He killed the idea of long product specifications before writing code, and they just jumped in, building and refining and rebuilding and rapid prototyping. The team responded well to the enthusiasm and the need for perfectionism. The Mac team was like a rebel squadron. They were like a start-up, operating inside Apple. They were pirates. They got fast and sometimes harsh feedback. And nearly all of them still look back on that time as the best thing they've done in their careers. As IBM and many others had learned the hard way before them, a small, inspired team can get a lot done. With such a small team and the ability to parlay work done for the Lisa, the R&D costs were minuscule until they were ready to release the computer. And yet, one can't change the world overnight. 1981 turned into 1982 turned into 1983. More and more people came in to fill gaps. Collette Askeland came in to design the printed circuit board. Mike Boich went to companies to get them to write software for the Macintosh. Berry Cash helped prepare sellers to move the product. Matt Carter got the factory ready to mass-produce the machine. Donn Denman wrote MacBASIC (because every machine needed a BASIC back then). Martin Haeberli helped write MacTerminal and Memory Manager. Bill Bull got rid of the fan. Patti King helped manage the software library. Dan Kottke helped troubleshoot issues with motherboards. Brian Robertson helped with purchasing. Ed Riddle designed the keyboard. Linda Wilkin took on documentation for the engineering team. It was a growing team. Pamela Wyman and Angeline Lo came in as programmers. Hap Horn and Steve Balog came in as engineers. Jobs had agreed to bring in adults to run the company. So they recruited 44-year-old hotshot John Sculley to come change the world as their CEO rather than sell sugar water at Pepsi. Sculley and Jobs had a tumultuous relationship over time. While Jobs had made tradeoffs on cost versus performance for the Mac, Sculley ended up raising the price for business reasons. Regis McKenna came in to help with the marketing campaign. He would win over so much trust that he would later get called out of retirement to do damage control when Apple had an antenna problem on the iPhone. We'll cover Antennagate at some point. They spearheaded the production of the now-iconic 1984 Super Bowl XVIII ad, which shows a woman running from conformity and depicts IBM as Big Brother from George Orwell's book, 1984. Two days after the ad, the Macintosh 128k shipped for $2,495. The price had jumped because Sculley wanted enough money to fund a marketing campaign.
It shipped late, and the 128k of memory was a bit underpowered, but it was a success. Many of the concepts, such as the System and the Finder, persist to this day. It came with MacWrite and MacPaint, and some of the other Lisa products were soon to follow, now as MacProject and MacTerminal. But the first killer app for the Mac was Microsoft Word, one of the first versions of Word ever shipped. Every machine came with a mouse. The machines came with a cassette that featured a guided tour of the new computer. You could write programs in MacBASIC and my second language, MacPascal. They hit the initial sales numbers despite the higher price. But over time, that price bit them, and sales turned sluggish. Despite the early success, sales were declining. Yet the team forged on. They introduced the Apple LaserWriter at a whopping $7,000. This was a laser printer based on the Canon 300 dpi engine. Burrell Smith designed the board, and newcomer Adobe knew laser printers, given that the founders were Xerox alumni. They added PostScript, which had initially been thought up during work with Ivan Sutherland and then implemented at PARC, to make for perfect printing at the time. The sluggish sales caused internal issues. There's a hangover when we do something great. First there were the famous episodes between Jobs, Sculley, and the board of directors at Apple. Sculley seems to have been portrayed by many as either a villain or a court jester of sorts in the story of Steve Jobs. Across my research, which began with books and notes and expanded to include a number of interviews, I've found Sculley to have been admirable in the face of what many might consider a petulant child. But they all knew a brilliant one. But amidst Apple's first quarterly loss, Sculley and Jobs had a falling out. Jobs tried to lead an insurrection and ultimately resigned. Wozniak had left Apple already, pointing out that the Apple II was still 70% of the revenues of the company. But the Mac was clearly the future. They had reached a turning point in the history of computers. The first mass-marketed computer featuring a GUI and a mouse came and went. And so many others were in development that a red ocean was forming. Microsoft released Windows 1.0 in 1985. Acorn, Amiga, IBM, and others were in rapid development as well. I can still remember the first time I sat down at a Mac. I'd used the Apple IIs in school, and then we got a lab of Macs. It was amazing. I could open a file, change the font size and print a big poster. I could type up my dad's lyrics and print them. I could play SimCity. It was a work of art. And so it was signed by the artists that brought it to us: Peggy Alexio, Colette Askeland, Bill Atkinson, Steve Balog, Bob Belleville, Mike Boich, Bill Bull, Matt Carter, Berry Cash, Debi Coleman, George Crow, Donn Denman, Christopher Espinosa, Bill Fernandez, Martin Haeberli, Andy Hertzfeld, Joanna Hoffman, Rod Holt, Bruce Horn, Hap Horn, Brian Howard, Steve Jobs, Larry Kenyon, Patti King, Daniel Kottke, Angeline Lo, Ivan Mach, Jerrold Manock, Mary Ellen McCammon, Vicki Milledge, Mike Murray, Ron Nicholson Jr., Terry Oyama, Benjamin Pang, Jef Raskin, Ed Riddle, Brian Robertson, Dave Roots, Patricia Sharp, Burrell Smith, Bryan Stearns, Lynn Takahashi, Guy "Bud" Tribble, Randy Wigginton, Linda Wilkin, Steve Wozniak, Pamela Wyman and Laszlo Zidek. Steve Jobs left to found NeXT. Some, like George Crow, Joanna Hoffman, and Susan Kare, went with him.
Bud Tribble would become a co-founder of NeXT and then the Vice President of Software Technology after Apple purchased NeXT. Bill Atkinson and Andy Hertzfeld would go on to co-found General Magic and usher in the era of mobility. One of the best teams ever assembled slowly dwindled away. And the oncoming dominance of Windows in the market took its toll. It seems like every company has a "lost decade." Some, like Digital Equipment, don't recover from it. Others, like Microsoft and IBM (which has arguably had a few), emerge as different companies altogether. Apple seemed to go dormant after Steve Jobs left. They had changed the world with the Mac. They put swagger and an eye for design into computing. But in the next episode we'll look at that long hangover, where they were left by the end of it, and how they emerged to change the world yet again. In the meantime, Walter Isaacson weaves together this story about as well as anyone in his book Steve Jobs. Steven Levy brilliantly tells it in his book Insanely Great. Andy Hertzfeld gives some of his stories at folklore.org. And countless other books, documentaries, podcasts, blog posts, and articles cover various aspects as well. The reason it's gotten so much attention is that where the Apple II was the watershed moment that introduced the personal computer to the mass market, the Macintosh was that moment for the graphical user interface.
I've been struggling with how to cover a few different companies, topics, or movements for a while. The lack of coverage of their stories thus far has little to do with their impact and more to do with trying to find where to put them in the history of computing. One of the most challenging is Apple. This is because there isn't just one Apple. Instead there are stages of the company, each with their own place in the history of computers. Today we can think of Apple as one of the Big 5 tech companies, which include Amazon, Apple, Google, Facebook, and Microsoft. But there were times in the evolution of the company where things looked bleak. Like maybe they would get gobbled up by another tech company. To oversimplify the development of Apple, we'll break up their storied ascent into four parts:

Apple Computers: This story covers the mid-1970s to mid-1980s and covers Apple rising out of the hobbyist movement and into a gangbuster IPO. The Apple I through III families all centered on one family of chips and took the company into the 90s.

The Macintosh: The rise and fall of the Mac covers the introduction of the now-iconic Mac through to the Power Macintosh era.

Mac OS X: This part of the Apple story begins with the return of Steve Jobs to Apple and the acquisition of NeXT, looks at the introduction of the Intel Macs, and takes us through to the transition to the Apple M1 CPU.

Post PC: Steve Jobs announced the "post PC" era in 2007, and in the coming years the sales of PCs fell for the first time, while tablets, phones, and other devices emerged as the primary computing devices people used.

We'll start with the early days, which I think of as one of the four key Apple stages of development. And those early days go back far past the days when Apple was hawking the Apple I. They go back to high school.

Jobs and Woz

Bill Fernandez and Steve Wozniak built a computer they called "The Cream Soda Computer" in 1970, when Bill was 16 and Woz was 20. It was a crude punch card processing machine built from some parts Woz got from the company he was working for at the time. Fernandez introduced Steve Wozniak to a friend from middle school because they were both into computers and both had a flair for pranky rebelliousness. That friend was Steve Jobs. By 1972, the pranks turned into their first business. Wozniak designed Blue Boxes, initially conceived by Cap'n Crunch John Draper, who got his phreaker name from a whistle in a Cap'n Crunch box that made a tone at 2600 Hz that sent AT&T phones into operator mode. Draper would actually be an Apple employee for a bit. They designed a digital version and sold a few thousand dollars worth. Jobs went to Reed College. Wozniak went to Berkeley. Both dropped out. Woz got a sweet gig at HP designing calculators, where Jobs had worked a summer job in high school. Jobs went to India to find enlightenment. When Jobs became employee number 40 at Atari, he got Wozniak to help create Breakout. That was the year the Altair 8800 was released, and Wozniak went to the first meeting of a little club called the Homebrew Computer Club in 1975, when they got an Altair so the People's Computer Company could review it. And that was the inspiration. Having already built one computer with Fernandez, Woz designed schematics for another. Going back to the Homebrew meetings to talk through ideas and nerd out, he got it built and, proud of his creation, returned to Homebrew with Jobs to give out copies of the schematics for everyone to play with. This was the age of hackers and hobbyists.
But that was about to change, ever so slightly.

The Apple I

Jobs had this idea: what if they sold the boards? They came up with a plan. Jobs sold his VW Microbus and Wozniak sold his HP-65 calculator and they got to work. Simple math. They could sell 50 boards for $40 each and make some cash like they'd done with the blue boxes. But you know, a lot of people didn't know what to do with the board. Sure, you just needed a keyboard and a television, but that still seemed a bit much. Then came a little bigger plan - what if they sold 50 full computers? They went to the Byte Shop and talked them into buying 50 at $500 each. They dropped $20,000 on parts and netted a $5,000 return. They'd go on to sell about 200 of the Apple Is between 1976 and 1977. It came with a MOS 6502 chip running at a whopping 1 MHz and with 4KB of memory, which could go to 8. They provided Apple BASIC, as most vendors did at the time. That MOS chip was critical. Before it, many used an Intel chip or the Motorola 6800, which went for $175. But the MOS 6502 was just $25. It was an 8-bit microprocessor designed by a team that Chuck Peddle ran after leaving the 6800 team at Motorola. Armed with that chip at that price, and with Wozniak's understanding of what it needed to do and how it interfaced with other chips to access memory and peripherals, the two could do something new. They started selling the Apple 1, and to quote an ad: "the Apple comes fully assembled, tested & burned-in and has a complete power supply on-board, initial set-up is essentially “hassle free” and you can be running in minutes." This really tells you something about the computing world at the time. There were thousands of hobbyists and many had been selling devices. But this thing had on-board RAM and you could just add a keyboard and video and not have to read LEDs to get output. The marketing descriptions were pretty technical by modern Apple standards, telling us something of the users. It sold for $666.66. They got help from Patty Jobs building logic boards. Jobs' friend from college Daniel Kottke joined for the summer, as did Fernandez and Chris Espinosa - now Apple's longest-tenured employee. It was a scrappy garage kind of company. The best kind. They made the Apple I until a few months after they released the successor. But the problem with the Apple I was that there was only one person who could actually support it when customers called: Wozniak. And he was slammed, busy designing the next computer and all the components needed to take it to the mass market, like monitors, disk drives, etc. So they offered a discount for anyone returning the Apple I and destroyed most of the returned units. Those Apple I computers have now been auctioned for hundreds of thousands of dollars, all the way up to $1.75 million.

The Apple II

They knew they were on to something. But a lot of people were building computers. They needed capital if they were going to bring in a team and make a go at things. But Steve Jobs wasn't exactly the type of guy venture capitalists liked to fund at the time. Mike Markkula was a product-marketing manager at chip makers Fairchild and Intel who retired early after making a small fortune on stock options. That is, until he got a visit from Steve Jobs. He brought money, but more importantly the kind of assistance only a veteran of a successful corporation who'd ridden that wave could bring. He brought in Michael "Scotty" Scott, employee #4, to be the first CEO, and they got to work on mapping out an early business plan.
If you notice the overlapping employee numbers, Scotty might have had something to do with that… As you may notice by Wozniak selling his calculator, at the time computers weren't that far removed from calculators. So Jobs brought in a calculator designer named Jerry Manock to design a plastic injection-molded case, or shell, for the Apple II. They used the same chip and a similar enough motherboard design. They stuck with the default 4KB of memory and provided jumpers to make it easier to go up to 48KB. They added a cassette interface for I/O. They had a toggle circuit that could trigger the built-in speaker. And they would include two game paddles. This is similar to bundles provided with the Commodore and other vendors of the day. And of course it still worked with a standard TV - but now that TVs were mostly color, so was the video coming out of the Apple II. And all of this came at a starting price of $1,298. The computer initially shipped with a version of BASIC written by Wozniak, but Apple later licensed the Microsoft 6502 BASIC to ship what they called Applesoft BASIC, short for Apple and Microsoft. Here, they turned to Randy Wigginton, who was Apple's employee #6 and had gotten rides to the Homebrew Computer Club from Wozniak as a teenager (since he lived down the street). He and others added features onto Microsoft BASIC to free Wozniak to work on other projects. They also decided they needed a disk operating system, or DOS. Here, rather than license CP/M, the industry standard at the time, Wigginton worked with Shepardson, who did various projects for CP/M and Atari. The motherboard on the Apple II remains an elegant design. There were certain innovations that Wozniak made, like cutting down the number of DRAM chips by sharing resources between other components. The design was so elegant that Bill Fernandez had to join them as employee number four, in order to help take the board and create schematics to have it silkscreened. The machines were powerful. All of that needed juice. Jobs asked his former boss Al Alcorn for someone to help out with that. Rod Holt, employee number 5, was brought in to design the power supply. By implementing a switching power supply, as Digital Equipment had done in the PDP-11, rather than a transformer-based power supply, the Apple II ended up being far lighter than many other machines. The Apple II was released in 1977 at the West Coast Computer Faire. It, along with the TRS-80 and the Commodore PET, would become the 1977 Trinity, which isn't surprising. Remember Peddle, who ran the 6502 design team - he designed the PET. And Steve Leininger was also a member of the Homebrew Computer Club who happened to work at National Semiconductor when Radio Shack/Tandy started looking for someone to build them a computer. The machine was stamped with an Apple logo. Jobs hired Rob Janoff, a local graphic designer, to create the logo. This was a picture of an apple made out of a rainbow, showing that the Apple II had color graphics. This rainbow apple stuck and became the logo for Apple Computer until 1998, after Steve Jobs returned to Apple, when the apple went all black - but the silhouette is now iconic, serving Apple for 45 years and counting. The computers were an instant success and sold quickly. But others were doing well in the market. Some incumbents and some new. Red oceans mean we have to improve our effectiveness. So this is where Apple had to grow up to become a company.
Markkula made a plan to get Apple to $500 million in sales in 10 years on the back of his $92,000 investment and another $600,000 in venture funding. They did $2.7 million in sales in 1977. This idea of selling a pre-assembled computer to the general public was clearly resonating. Parents could use it to help teach their kids. Schools could use it for the same. And when we were done with all that, we could play games on it. Write code in BASIC. Or use it for business. Make some documents in WordStar, spreadsheets in VisiCalc, or use one of the thousands of titles available for the platform. Sales grew 150x until 1980. Given that many thought cassettes were for home machines and floppies were for professional machines, it was time to move away from tape. Markkula realized this and had Wozniak design a floppy disk drive for the Apple II, which went on to be known as the Disk II. Wozniak had experience with disk controllers and studied the latest available. Wozniak again managed to come up with a value-engineered design that allowed Apple to produce a good drive for less than any other major vendor at the time. Wozniak would actually later go on to say that it was one of his best designs (and many contemporaries agreed). Markkula filled gaps as well as anyone. He even wrote free software programs under the name of Johnny Appleseed, a name also used for years in product documentation. He was a classic hacker type of entrepreneur on their behalf, sitting in the guerrilla marketing chair some days, acting as president of the company on others, and mentoring Jobs on others still.

From Hobbyists to Capitalists

Here's the thing - I've always been a huge fan of Apple. Even in their darkest days, which we'll get to in later episodes, they represented an ideal. But going back to the Apple 1, they were nothing special. Even the Apple II. Osborne, Commodore, Vector Graphics, Atari, and hundreds of other companies were springing up, inspired first by that Altair and then by the rapid drop in the prices of chips. The impact of the 1 megahertz barrier and the cost of those MOS 6502 chips was profound. The MOS 6502 chip would be used in the Apple II, the Atari 2600, the Nintendo NES, and the BBC Micro. And along with the Zilog Z80 and Intel 8080, it would spark a revolution in personal computers. Many of those companies would disappear in what we'd think of as a personal computer bubble if there had been more money in it. But those that survived took things an order of magnitude higher. Instead of making millions they were making hundreds of millions. Many would even go to war in a race to the bottom on price. And this is where Apple started to differentiate themselves from the rest. For starters, due to how anemic the default Altair was, most of the hobbyist computers were all about expansion. You can see it on the Apple I schematics and you can see it in the minimum of 7 expansion slots in the Apple II lineup of computers. Well, all of them except the IIc, marketed as a more portable type of device, with a handle and an RCA connection to a television for a monitor. The media seemed to adore them. In an era of JR Ewing of Dallas, Steve Jobs was just the personality to emerge and still somewhat differentiate the new wave of computer enthusiasts. Coming at the tail end of an era of social and political strife, many saw something of themselves in Jobs. He looked the counter-culture part. He had the hair, but also this drive. The early 80s were going to be all about the yuppies though - and Jobs was putting on a suit.
Many identified with that as well. Fueled by the 150x sales performance shooting them up to $117M in sales, Apple filed for an IPO, going public in 1980, creating hundreds of millionaires, including at least 40 of their own employees. It was the biggest IPO since Ford in 1956, the same year Steve Jobs was born. The stock was filed at $14 and shot up to $29 on the first day alone, leaving Apple sitting pretty on a $1.778 billion valuation. Scotty, who brought the champagne, made nearly $100M in profit. One of the venture capitalists, Arthur Rock, made over $21M on a $57,600 investment. Rock had been the one to convince the Shockley Semiconductor team to found Fairchild, a key turning point in putting silicon into the name of Silicon Valley. When Noyce and Moore left there to found Intel, he was involved. And he would stay in touch with Markkula, who was so enthusiastic about Apple that Rock invested and began a stint on the board of directors at Apple in 1978 - a role in which he's often portrayed as the villain in the story of Steve Jobs. But let's think about something for a moment. Rock was a backer of Scientific Data Systems, purchased by Xerox in 1969, becoming the Xerox 500. Certainly not Xerox PARC and in fact the anti-PARC, but certainly helping to connect Jobs to Xerox later, as Rock served on the board of Xerox.

The IPO Hangover

Money is great to have but also causes problems. Teams get sidetracked trying to figure out what to do with their hauls. Like Rod Holt's $67M haul that day. It's a distraction in a time when executional excellence is critical. We have to bring in more people fast, which created a scenario Mike Scott referred to as a "bozo explosion." Suddenly, more people actually makes us less effective. Growing teams all want a seat at a limited table. Innovation falls off as we rush to keep up with the orders and needs of existing customers. Bugs, bigger code bases to maintain, issues with people doing crazy things. Taking our eyes off the ball and normalizing the growth can be hard. By 1981, Scotty was out after leading some substantial layoffs. Apple stock was down. A big IPO also creates investments in competitors. Some of those would go on a race to the bottom in price. Apple didn't compete on price. Instead, they started to plan the next revolution, a key piece of Steve Jobs emerging as a household name. They would learn what the research and computer science communities had been doing - and bring a graphical interface and mouse to the world with the Lisa and with a smaller project brought forward at the time by Jef Raskin, one that Jobs tried to kill but that Markkula not only approved but kept Jobs from killing: the Macintosh. Fernandez, Holt, Wigginton, and even Wozniak just drifted away or got lost in the hyper-growth of the company, as is often the case. Some came back. Some didn't. Many of us go through the same in rapidly growing companies.

Next (but not yet NeXT)

But a new era of hackers was on the way. And a new movement as counter to the big computer culture as Jobs. But first, they needed to take a trip to Xerox. In the meantime, the Apple III was an improvement but proved that the Apple computer line had run its course. They released it in 1980, recalled the first 14,000 machines, and never passed 75,000 machines sold, killing off the line in 1984. A special year.
Douglas Shinsato is the managing director of Anthill Ventures and a member of the advisory board of Via Equity, one of Northern Europe's most successful investment firms. Currently based out of Hawaii, Doug has been a member of the Board of Regents of the University of Hawaii and has served in senior leadership roles at many leading technology companies, including IBM and Xerox, as well as on the boards of Mphasis and Deloitte Japan. I love seeking Doug's mentorship around leadership and venture building, and he has been kind enough to advise me on my journey towards tech and entrepreneurship. Here are some key themes we talked about on this episode:

Background:
2:07 Growing up, Education
4:22 Why did Doug study law at Stanford?
30:32 Doug's current work adventures

Anecdotes:
5:42 Doug's role in deregulation of airlines and working for the US Assistant Attorney General
8:29 Anti-trust in the 70s and 80s compared with anti-trust now (big tech)
9:26 Working in Japan and learning about working in a new culture
12:40 Role of luck in career success - joining Mphasis' board when they had 60 employees
55:27 Doug's Xerox story. Have fun along the way :)

Advice on venture building:
10:45 What new leaders should be really focusing on when they go about expanding their ventures?
34:31 Thoughts on starting up and building an early-stage team
37:45 On leaders
39:59 Amazon vs American Airlines culture
40:56 How is a good board helpful at a company?
54:21 Single most valuable trait you look for when you go about building a team or partnering with someone

Tech Trends:
17:22 Excitement about telemedicine and edtech
19:57 Why is there an opportunity to reinvent the university business model with COVID-19 as a driver of change?
22:37 Bigger universities vs smaller universities

Advice to students and professionals:
25:36 What would you study if you were a student entering college in 2020?
28:54 Reinventing yourself to scale the career ladder

On innovation and change management:
43:06 Kodak and Digital Equipment started off as innovators
48:15 Future of air travel
51:35 Change management is about changing management, not management processes

Productivity and Rituals:
15:49 Doug's routine in quarantine
44:39 On networking and building relationships. Role of a personal CRM 30 years ago and now
58:03 How greeting his wife every morning has been the game-changing ritual all these years for Doug
In this episode, we talk to Dr John Carr. John is a recognized international expert in supply chain management. His career stretches back to the 1980s, with management positions at companies such as Digital Equipment, EMC, Samsung, Irish Express Cargo and Flex International. John has a doctorate in supply chain and master's degrees in Advanced Manufacturing and Business Administration, and currently heads the supply chain consultancy Offerre, with clients in both the multinational and SME sectors. John is currently involved with several initiatives with IMC in Ireland and MIT in the US to ramp up the production of ventilators in the COVID-19 crisis. In this interview, we discuss this exciting initiative and we exchange views on the past, present and likely future of global manufacturing and the supply chains that underpin it. See acast.com/privacy for privacy and opt-out information.
It doesn't have to be expensive, at least according to Del Winn. I recently spoke with Del, a global expansion coach, business leader, complex problem solver, and international deal-making attorney, who really can solve complex manufacturing and sales expansion problems. Del was asked to accompany InterDigital Communications Corporation's salesman to Nanjing, China to negotiate a [...]
Michele Hoskins is a former educator, professional leadership development trainer, and motivational speaker with more than 29 years of experience. Currently, Michele and her husband Paul are Urban Air Adventure Park franchisees, have two (2) parks in San Antonio, and are planning to open an additional park in 2020. Michele is also a licensed Minister, former McDonald's Franchisee, author, motivational speaker, teacher, and active community leader. Michele is very competent in establishing goals, developing strategic action plans, and executing plans for running successful operations to the benefit of all. Michele loves to dance, read, travel, and spend time with her family. Michele has been happily married for 28 years to her husband Paul; they have 10-year-old twins, Hayley and Hayden, and reside in Garden Ridge, TX. Paul Hoskins, along with his wife Michele, is an owner of two (2) Urban Air Adventure Parks in San Antonio. Previously, Paul owned and operated five (5) McDonald's Franchises in San Antonio. Paul has over 28 years of executive sales, management, and computer technology experience. Paul has worked for Hewlett-Packard Company, Digital Equipment, and NCR Corporation. During Paul's tenure at Hewlett-Packard he was responsible for developing new accounts, territory management, and marketing, and exceeded annual sales goals every year. Paul enjoys reading, traveling, cooking, and exercising. Paul holds a Bachelor of Science Degree in Marketing from Arizona State University. Paul and Michele are the proud parents of their twins Hayley and Hayden, and they reside in Garden Ridge, Texas.

RICH RELATIONSHIPS PODCAST IS THE GENESIS OF THE SPEAK FREELY WITH GIL & RENÉE APP

The Speak Freely with Gilbert J and Renée M. Beavers app was released February 1, 2020. Renée M. Beavers is the developer of the app. Gil J. Beavers and M. Renée Beavers are the hosts and creators of the Rich Relationships Podcast with Gil & Renée. Gil & Renée have been introducing the world to Rich Relationships, as evidenced by television and radio interviews on CBS, NBC, TBN, Atlanta Live, HOT 108 FM, and Blog Talk Radio, just to name a few. Renée has also been featured in the Huffington Post and Sheen Magazine.

Apple App Store: https://apps.apple.com/us/app/speak-freely-with-gil-renee/id1497628436?ls=1
Google Play App Store: https://play.google.com/store/apps/details?id=com.app.speakfreely

Gil & Renée's 37-year relationship might mean they know what it takes to maintain and grow healthy relationships. They have a passion for marriage and relationships. Their 37 years together also provide evidence of something worth exploring and modeling. What is their secret, or are the answers hidden in plain sight?

Gil & Renée are available for the following platforms and venues: conferences, panel discussions, magazine interviews, television interviews, radio interviews, and newspaper interviews.

www.richrelationshipsus.com
Today we're going to talk through the history of the Data General Nova. Digital Equipment was founded in 1957 and released a game-changing computer, the PDP-8, in 1965. We covered Digital in a previous episode, but to understand the Data General Nova, you kinda' need to understand the PDP. It was a fully transistorized computer and it was revolutionary in the sense that it brought interactive computing to the masses. Based in part on research from work done for MIT in the TX-0 era, the PDP made computing more accessible to companies that couldn't spend millions on computers, and it was easier to program - and the PDP-1 could be obtained for less than a hundred thousand dollars. You could use a screen and type commands on a keyboard for the first time, and it would actually output to the screen rather than you reading teletypes or punch cards. That interactivity unlocked so much. The PDP began the minicomputer revolution. The first real computer game, Spacewar!, was played on it, and adoption increased. The computers got faster. They could do as much as large mainframes. The thousands of transistors were faster and less error-prone than the old tubes. In fact, those transistors signaled that the third generation of computers was upon us. And people who liked the PDP were life-long converts. Fanatical, even. The PDP evolved until 1965, when the PDP-8 was released. This is where Edson de Castro comes in, acting as the project manager for the PDP-8 development at Digital. Three years later, he, Henry Burkhardt, and Richard Sogge of Digital would be joined by Herbert Richman, a salesperson from Fairchild Semiconductor. They were proud of the PDP-8. It was a beautiful machine. But they wanted to go even further. And they didn't feel like they could do so at Digital. They would build a less expensive minicomputer that opened up even more markets. They saw new circuit board manufacturing techniques, new automation techniques, and new reasons to abandon the 12-bit CPU designs of the day. Edson had wanted to build a PDP with all of this and the ability to use 8-bit, 16-bit, or 32-bit architectures, but it got shut down at Digital. So they got two rounds of venture capital at $400,000 each and struck out on their own. They wanted the computer to fit into a 19-inch rack mount. That choice would basically make the 19-inch rack the standard from then on. They wanted the machines to be 16-bit, moving past the 8- or 12-bit computers common in minicomputing at the time. They used an accumulator-based architecture, which is to say that the CPU had a register that stored the results of various bits of code. This way you weren't writing the results of all the math out to memory and then sending it right back to the CPU. Suddenly, you could do infinitely more math! Having someone from Fairchild really unlocked a lot of knowledge about what was happening in the integrated circuit market. They were able to get the price down into the thousands, not tens of thousands. You could actually buy a computer for less than 4 thousand dollars. The Nova would ship in 1969 and be an instant success with a lot of organizations, especially smaller science labs, like the one at the University of Texas that was their first real paying customer. Within 6 months they sold 100 units, and within the first few years, they were over $100 million in sales. They were eating into Digital's profits. No one would have invested in Digital had they tried to compete head-on with IBM.
Digital had become the leader in the minicomputer market, effectively owning the category. But the Nova posed a threat - until Data General decided to get into a horse race with Digital and released the SuperNOVA to compete with the PDP-11. They used space-age designs. They were great computers. But Digital was moving faster. And Data General started to have production and supply chain problems, which led to lawsuits and angry customers. Never good. By 1977 Digital came out with the VAX line, setting the standard at 32 bits. Data General was late to that party and, honestly, after being a market leader in low-cost computing they started to slip. By the end of the 70s, microchips and personal computers would basically kill minicomputers, and while transitioning from minicomputers to servers, Data General never made quite the same inroads that Digital Equipment did. Data General would end up with their own DOS, their own UNIX System V variant (like everyone else), and one of the first portable computers. But by the mid-80s IBM showed up on the market, and Data General moved into databases and a number of other areas to justify what was becoming a server market. In fact, the eventual home for Data General would be to get acquired by EMC and become CLARiiON under the EMC imprint. It was an amazing rise. Hardware that often looked like it came straight out of Buck Rogers. Beautiful engineering. But you just can't compete on price and stay in business forever. Especially when you're competing with your former bosses who have much, much deeper pockets. EMC benefited from a lot of these types of acquisitions over the years, to become a colossus by the end of the 2010s. We can thank Data General, and specifically the space-age Nova, for helping set many standards we use today. We can thank them for helping democratize computing in general. And if you're a heavy user of EMC appliances, you can probably thank them for plenty of underlying bits of what you do even through to today. But the minicomputer market required companies to make their own chips in that era, and that was destroyed by the dominance of Intel in the microchip industry. It's too bad. So many good ideas. But the costs to keep up turned out to be too much for them, as with many other vendors. One way to think about this story: you can pick up on new manufacturing and design techniques and compete with some pretty large players, especially on price. But when the realities of scaling an operation come, you can't stumble, or customer confidence will erode and there's a chance you won't get to compete for deals again in the future. But try telling that to your growing sales team. I hear people say you have to outgrow the growth rate of your category. You don't. But you do have to do what you say you will and deliver. And when changes in the industry come, you can't be all over the place. A cohesive strategy will help you weather the storm. So thank you for tuning into this episode of the History of Computing Podcast. We are so lucky you chose to join us and we hope to see you next time! Have a great day!
BASIC

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate the future! Today we're going to look at the history of the BASIC programming language. We say BASIC, but really BASIC is more than just a programming language. It's a family of languages and stands for Beginner's All-purpose Symbolic Instruction Code. As the name implies, it was written to help students that weren't math nerds learn how to use computers. When I was selling a house one time, someone was roaming around in my back yard - apparently they'd been to an open house - and they asked if I'm a computer scientist after they saw a dozen books I'd written on my bookshelf. I really didn't know how to answer that question. We'll start this story with Hungarian John George Kemeny. This guy was pretty smart. He was born in Budapest and moved to the US with his family in 1940 when his family fled anti-Jewish sentiment and laws in Hungary. Some of his family would go on to die in the Holocaust, including his grandfather. But safely nestled in New York City, he would graduate high school at the top of his class and go on to Princeton. Check this out: he took a year off to head out to Los Alamos and work on the Manhattan Project under Nobel laureate Richard Feynman. That's where he met fellow Hungarian immigrant John von Neumann - two of a group George Marx wrote about in his book on great Hungarian emigrant scientists and thinkers called The Martians. When he got back to Princeton he would get his doctorate and act as an assistant to Albert Einstein. Seriously, THE Einstein. Within a few years he was a full professor at Dartmouth and would go on to publish great works in mathematics. But we're not here to talk about those contributions to making the world an all-around awesome place. You see, by the 60s math was evolving to the point that you needed computers. And Kemeny and Thomas Kurtz would do something special. Now Kurtz was another Dartmouth professor who got his PhD from Princeton. He and Kemeny got thick as thieves and wrote the Dartmouth Time-Sharing System (keep in mind that time-sharing was all the rage in the 60s, as it gave more and more budding computer scientists access to those computer-things that, prior to the advent of Unix and the PC revolution, had mostly been reserved for the high priests of places like IBM). So time-sharing was cool, but the two of them would go on to do something far more important. In 1956, they would write DARSIMCO, or Dartmouth Simplified Code. As with Pascal, you can blame Algol. Wait, no one has ever heard of DARSIMCO? Oh… I guess they wrote that other language you're here to hear the story of as well. So in '59 they got a half-million-dollar grant from the Alfred P. Sloan Foundation to build a new department building. That's when Kurtz actually joined the department full time. Computers were just going from big batch-processed behemoths to interactive systems. They tried teaching with DARSIMCO, FORTRAN, and the Dartmouth Oversimplified Programming Experiment, a classic acronym for 1960s-era DOPE. But they didn't love the command structure, nor the fact that the languages didn't produce feedback immediately. What was it called? Oh, so in 1964, Kemeny wrote the first iteration of the BASIC programming language, and Kurtz joined him very shortly thereafter. They did it to teach students how to use computers. It's that simple.
And as most software was free at the time, they released it to the public. We might think of this as open source-ish by today's standards. I say ish as Dartmouth actually chose to copyright BASIC. Kurtz has said that the name BASIC was chosen because "We wanted a word that was simple but not simple-minded, and BASIC was that one." The first program I wrote was in BASIC. BASIC used line numbers and read kinda' like the English language. The first line of my program said 10 PRINT "Charles was here", and the computer responded that "Charles was here". The second program I wrote just added a second line that said 20 GOTO 10 (the full two-line listing is below). Suddenly "Charles was here" took up the whole screen and I had to ask the teacher how to terminate the signal. She rolled her eyes and handed me a book. And that, my friend, was the end of me for months. That was on an Apple IIc. But a lot happened with BASIC between 1964 and then. As with many technologies, it took some time to float around and evolve. The syntax was kinda' like a simplified FORTRAN, making my FORTRAN classes in college a breeze. That initial distribution evolved into Dartmouth BASIC, and they received a $300k grant and used student slave labor to write the initial BASIC compiler. Mary Kenneth Keller was one of those students and went on to finish her doctorate in '65 along with Irving Tang, the two becoming the first two PhDs in computer science. After that she went off to Clarke College to found their computer science department. The language is pretty easy. I mean, like Pascal, it was made for teaching. It spread through universities like wildfire during the rise of minicomputers like the PDP from Digital Equipment and the resultant Data General Nova. This led to the first text-based games in BASIC, like Star Trek. And then came the Altair and one of the most pivotal moments in the history of computing: the porting of BASIC to the platform by Microsoft co-founders Bill Gates and Paul Allen. But Tiny BASIC had appeared a year before, and suddenly everyone needed "a BASIC." You had Commodore BASIC, BBC BASIC, BASIC for the Trash-80, the Apple II, Sinclair and more. Programmers from all over the country had learned BASIC in college on minicomputers, and when the PC revolution came, a huge part of that was the explosion of applications, most of which were written in… you got it, BASIC! I typically think of the end of BASIC coming in 1991, when Microsoft bought Visual Basic off of Alan Cooper and object-oriented programming became the standard. But the things I could do with a simple IF THEN ELSE statement. Or a FOR TO statement, or a WHILE or REPEAT or DO loop. Absolute values, exponential functions, cosines, tangents, even super-simple random number generation. And input and output was just INPUT and PRINT, or LIST for source. Of course, functional programming was always simpler and more approachable. So there, you now have Kemeny as a direct connection between Einstein and the modern era of computing. Two immigrants that helped change the world. One famous, the other with a slightly more nuanced but probably no less important impact in a lot of ways. Those early BASIC programs opened our eyes. Games, spreadsheets, word processors, accounting, human resources, databases. Kemeny would go on to chair the commission investigating Three Mile Island, a partial nuclear meltdown that was a turning point for nuclear power.
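For anyone who has never seen line-numbered BASIC, here is the two-line program described above, exactly as told in the episode (the string is just the episode's example):

10 PRINT "Charles was here"
20 GOTO 10

Run it and the message repeats forever; on most of those early machines you broke out of a loop like this with an interrupt key, which varied by system, so treat that as a general note rather than a claim about any one computer. And as a rough sketch of the other statements mentioned above - INPUT, PRINT, IF/THEN, FOR/TO, and RND - here is a tiny, illustrative guessing game in generic old-school BASIC; dialects differed, so this is not tied to any particular machine:

10 REM PICK A RANDOM NUMBER FROM 1 TO 10
20 N = INT(RND(1) * 10) + 1
30 INPUT "GUESS A NUMBER FROM 1 TO 10"; G
40 IF G = N THEN PRINT "YOU GOT IT"
50 IF G <> N THEN PRINT "NOPE, IT WAS "; N
60 REM SAY THANKS THREE TIMES WITH A FOR/NEXT LOOP
70 FOR I = 1 TO 3
80 PRINT "THANKS FOR PLAYING"
90 NEXT I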
I wonder what Kemeny thought when he read the following on the Statue of Liberty: Give me your tired, your poor, Your huddled masses yearning to breathe free, The wretched refuse of your teeming shore. Perhaps, like many before and after, he thought that he would breathe free and, with that breath, do something great, helping bring the world into the nuclear era and preparing thousands of programmers to write software that would change the world. When you wake up in the morning, you have crusty bits in your eyes and things seem blurry at first. You have to struggle just a bit to get out of bed and see the sunrise. BASIC got us to that point. And for that, we owe them our sincerest thanks. And thank you, dear listeners, for your contributions to the world in whatever way they may be. You're beautiful. And of course thank you for giving me some meaning on this planet by tuning in. We're so lucky to have you. Have a great day!
In this episode, Klee and I discuss creative (music and art) stigmas that other creatives or professionals in the art world use to criticize you or your craft. We discuss three specific stigmas that are usually used as a weapon against creatives. These stigmas are: Age, you are either too young or too old. Education, you lack the credentials, thus you are no good. Digital equipment, you use a computer to create your craft, thus you are not a real artist. There might be a bit of a rant in there, so be prepared.
Dave Oran joins Donald and me to talk about the history of DECnet at Digital Equipment, including the venerable IS-IS interior gateway protocol. Outro Music: Danger Storm by Kevin MacLeod (incompetech.com) Licensed under Creative Commons: By Attribution 3.0 License http://creativecommons.org/licenses/by/3.0/
Martyn R. Lewis is an acclaimed business professional with a vast background in all aspects of revenue generation. Born and educated in the UK, he emigrated to Canada in 1980 and now resides in the Sonoma area of northern California. Prior to founding his own company, he had extensive executive management, sales and marketing experience. His positions included President and CEO of Drake International, North America, where he led a large, multi-divisional sales force to reverse a declining trend and grow the business at more than three times the rate of the industry. Prior to Drake, Martyn was with Digital Equipment of Canada for ten years. During his tenure at DEC he led several hundred sales professionals, culminating in the position of VP Marketing and Sales Services. Prior to joining DEC, Martyn worked in the major accounts division of ICL in the UK. During these years, he participated in countless sales training programs; his proficiency rose to the point that he became certified to deliver one of the leading sales methodologies. He attended peer-level executive retreats, where he met and worked with the key opinion leaders and authors on the topics of business, sales, and marketing. In the mid-90s, he left corporate life to start his own consulting company, Market-Partners Inc. It would be easy to say that he did so to escape the clichéd “corporate grind,” but the reasons were much more complex, because as successful as he had become, he was not what most would define as a “natural salesperson.” His interests ran to engineering: tearing down and rebuilding his Lotus sports car, designing and handcrafting high-end audio systems; science, methodology, planning, and results. Yet throughout his sales career, Martyn continually ran into the myths and legends of the “art” and “black magic” supposedly utilized by top-drawer sales wizards. And perhaps for a tiny minority of overachievers, that might have been true. But his concern had always been for the majority, the whole sales force, to succeed: perhaps not spectacularly, but certainly consistently and profitably. Martyn wanted to apply the elements and rules of science to sales and marketing, to prescribe method where madness often prevailed. And he wasn't going to be shy about it: his company's original tagline was “The Science of Sales and Marketing.” Of course, with science comes research, and he knew that if they were to bring true scientific discipline to the needs of their clients, they had to exercise that same discipline in their own efforts of observing, recording, and analyzing what they found. Working now at arm's length with his clients' sales forces brought a much wider view of all the inherent issues that affect sales success. And it was at this point that he and his team started to see a repeating pattern. These organizations all had offerings that delivered value to a particular market, yet too often they were falling short of hoped-for results. They uncovered the disconcerting fact that while most of their clients were doing good work and offering a good product, the results all too often suggested otherwise. After seeing this phenomenon so many times, it was obvious to Martyn that something was missing; there was something more at play, and he needed to find out what that was. As their work continued to evolve and they talked to ever more individuals across organizations about how they buy and why they don't, a number of observations began to materialize.
And once recognized, defined and documented, these revelatory observations crystallized into hard data and now form the basis of his foundational concept of Outside-In Revenue Generation. Martyn and his team have since distilled the theory, shaken out the bugs as it were, and, by putting it into practice, have seen the results. He has seen their work create over a billion dollars of prospective business for a leading high-technology company. He has seen it enable a small company that was stuck at revenue of some $20m to sell itself for well over $500m within a few years. He now knows that the approach they have created provides significant, swift, scalable and sustainable results. Martyn has consulted in 33 countries, with the result that his work has been used across 44 countries, in 17 languages, and has impacted over 85,000 sales professionals. He is internationally recognized as an extraordinary speaker, having delivered numerous keynote addresses live, on radio and on television, and he is a pioneer in utilizing webinar technology through his sister company, 3GS. Martyn acts as an advisor to a number of executives in the high technology industry and is active on several advisory boards and boards of directors. On a personal note, Martyn lives in Northern California wine country, where he enjoys the local food and beverages. He also enjoys music, cooking, hiking and the performing arts, and has a particular passion for airplanes and English sports cars. He gained his first pilot's license at 16 and has maintained a lifelong interest in aviation. He now flies his own plane, combining his passion for aviation with his love of dogs by flying rescue dogs for charity. He also coaches and mentors young adults into the business world and works with young entrepreneurs in the townships of Johannesburg with the Ubuntu Mission. https://buyingjourneydna.com
Martyn R. Lewis was born and educated in the UK, emigrated to Canada in 1980 and now resides in northern California. Initially a programmer, Martyn worked in the major accounts division of ICL in the UK. After moving to Ottawa, he joined Digital Equipment of Canada in sales, where he spent the next ten years, eventually rising to the position of VP Marketing and Sales Services. On leaving DEC, he was appointed CEO of Drake International, North America, where he led a large, multi-divisional sales force to reverse a declining trend and grow the business at more than three times the industry rate. In the mid-90s, he left corporate life to start his own consulting company, Market-Partners Inc. A singular and far-reaching element of his methodology was his insistence that they research their clients' customers as assiduously as the clients' own sales force. He and his team's work continued to evolve as they talked to ever more customers about how they buy and why they don't, and a number of startling observations began to materialize. Once recognized and defined, these revelations crystallized into hard data and now form the basis of his theory of Outside-In Revenue Generation. The practical application of this ground-breaking concept has since evolved into market-proven results that are consistent, significant, scalable and sustainable. Martyn has now consulted in 33 countries, with the result that his work has been used across 44 countries, in 17 languages, and has impacted over 85,000 sales professionals. He is also internationally recognized as an extraordinary speaker, having delivered numerous keynote addresses live, on radio and on television. Martyn acts as an advisor to a number of executives in the high technology industry and is active on several advisory boards and boards of directors. Learn More: www.buyingjourneydna.com Influential Influencers with Mike Saunders http://businessinnovatorsradio.com/influential-entrepreneurs-with-mike-saunders/
Learn about embedded passives technology with Bruce Mahler from Ohmega Technologies. OhmegaPly® embedded resistor-conductor material is popular, but it’s not new. Ohmega has been making this product since 1972. So why is it getting so much attention lately? It’s reliable and has stood the test of time for five decades, and emerging technologies are making it more relevant than ever. Tune in to learn more about embedded passives and embedded components and find out if they may be the key to solving your current PCB design challenges. Show Highlights: OhmegaPly® is a true thin-film, nickel-phosphorous (NiP) alloy. In the manufacturing process, about 0.05 to 1.00 microns of the alloy is electrodeposited onto the rough, or “tooth”, side of electrodeposited copper foil. Embedded passives and embedded components: ER, embedded resistors; EC, embedded capacitors. People tend to think of this technology as something new, but Ohmega has been making this product since 1972; it’s the oldest new technology out there. Functionality: it can be used in so many different ways. Mica, an old copper-clad laminator, conceived the technology as a way to add functionality to a copper material and developed it in the early 70s as a new product. First users of the technology: Canon electronics in Japan saw the potential of the product for cameras. The other early user was Control Data Corporation, and from there a lot of mainframes utilized the technology. OhmegaPly is a thin-film resistive foil: a plated process, nickel phosphorus, with varied thickness and sheet resistivity, the film a fraction of a micron thick. Very linear; as the film deposit gets thinner, resistivity goes up. Thin-film technology, made in Culver City, CA for 40+ years. Works with Rogers/Arlon, Taconic, Isola, Nelco and other laminators. Tiny discrete resistive elements can be hard to handle; etching a 5 or 10 mil trace is no problem. Space restrictions? The solution: print and etch a resistor. Why would I want to use OhmegaPly? What are the cost, reliability, and performance indicators? “There’s no other way I can design this unless I get rid of my resistors!” Most designers use OhmegaPly for densification. It helps when routing is hard, there are too many passives, or the board is a little too thick. Example: MEMS (Micro-Electro-Mechanical Systems) microphones for cell phones. Applications: military and space-based applications such as satellites. Uses include: sensors, IoT, wearables, automotive, memory, heaters, biomedical. Ohmega wants to talk technology with PCB designers. Leverage their expertise; they operate as a part of your design team and are happy to be a resource for you. Technical people are available to help. Ohmega and Oak Mitsui are technology partners: Ohmega/FaradFlex is a combined resistor/capacitor core consisting of OhmegaPly RCM laminated to Oak-Mitsui’s FaradFlex capacitive laminate materials. Printed circuit board copper lead times are getting longer. A self-reliant company with very close relationships with raw material suppliers. Links and Resources: Ohmega Technologies Website, Ohmega Technical Library and Tools, Ohmega Products, Bruce Mahler on LinkedIn. Hey everybody, it's Judy Warner again with Altium's OnTrack Podcast. Thanks for joining us again. We have yet another amazing guest on a fascinating topic that I hope you will enjoy and learn about today. But before we get started I wanted to invite you to please connect with me on LinkedIn.
I like to share a lot of content relevant to designers and engineers and I'd be happy to connect with you personally, and on Twitter I'm @AltiumJudy, and Altium is on Facebook, Twitter and LinkedIn. We also record this podcast simultaneously on video, so on the Altium YouTube channel you can find us under videos, and there you will see the whole series of podcasts that we record. So that is all the housekeeping we have for the moment. So let's jump right into our topic today, which is embedded passives, and I have a wonderful expert for you today, and an old friend, Bruce Mahler of Ohmega Technologies. Bruce, welcome, thanks so much for joining us and giving us a lesson today on embedded technology. Thank you, Judy, it's great being on board here and I look forward to talking to you and the audience about embedded resistors in particular, as well as other embedded passives. Okay, so before we get going I want to make sure that I'm calling this technology the right thing, because I always think of them as being embedded passives but I don't think I'm right. How would you characterize the technology exactly? Well, the OhmegaPly® product, our embedded resistive product, goes by many names: ER, embedded resistors; EC, embedded capacitors; PCT, planar component technology. I think the most common now are ER, embedded resistors, and EC, embedded capacitors, in particular. When we're talking about passive elements, those are the two main ones that are really driving the embedded passive and embedded component world right now, so yeah, OhmegaPly® is just fine with me. Okay, so let's jump in now. You told me something recently that I was kind of shocked to learn about, and I'd like you to give us a brief history of Ohmega Technologies and sort of the evolution of this technology. What I was really shocked to learn is the age of the company. So can you tell us more about that? Sure. Many people who are looking at using embedded passives think of it as a new technology, something just on the market - it's been out a year or two, no new applications yet, but people are looking at it. So when we're asked, this OhmegaPly® product, how long have you been making it for? And I said, oh, since about 1972, and they said, wait a second, 1972? I said, yeah, actually we're going on 46 years now, and it's amazing that it's probably the oldest new technology out there. [laughter] That's a good way to put it. I think that has a lot to do with the functionality of the material, how it can be used in so many different ways. And so, just briefly, a history of the technology: originally the OhmegaPly® embedded resistive thin film material was conceived and developed by Mica Corporation. Many of your longtime listeners know Mica used to be a copper clad laminator; they supplied epoxy glass laminates and polyimide glass, did a number of other things, and the material was conceived in the early 70s as a way of adding functionality to a laminate material. So rather than just getting copper foil bonded to a dielectric, it was a copper foil that had a functional purpose beyond copper traces bonded to a dielectric. And so, after many years of development at Mica, a product, OhmegaPly®, was developed; the Mica laminate product was MicaPly, and that's how the name originally came about. It was originally developed in the early 70s as a new product.
Now with any new product, somebody had to be the first to go ahead and try it, you know; who was going to be on the bleeding edge of any new technology, who was going to lead the way? And the interesting thing is that back in the early seventies - about '72, '73 - the first users of the technology were two absolutely opposite companies in absolutely opposite areas of the electronics industry. One of those happened to be Canon electronics in Japan. Canon, making AE-1 SLR cameras at the time, looked at the technology as a great way of making a step potentiometer that could eliminate the ceramic potentiometer circuits they were using at the time, and it fit very neatly into their camera system. So these were very simple surface resistors on FR4, making the resistive elements in the potentiometers, and they started using it in their AE-1 camera. Very quickly Nikon and Pentax started doing the same thing. The other first user happened to be somebody completely opposite - now we're talking about the early 70s - and that user was Control Data Corporation, which used to be in business a long time ago. CDC's aerospace group had some very dense multi-layer boards of mixed dielectrics - layers of PTFE Teflon, layers of FR4 - with ECL logic, lots and lots of termination needs, and absolutely no real estate on some of their high-speed digital boards for termination. So the idea of being able to print and etch a resistive element and embed it within a circuit layer, particularly underneath an IC package, freed up board area for them and allowed them to terminate. They got some other benefits of better electricals. They started using us, and then very quickly thereafter other divisions of CDC started using us in things like their Cyber mainframe computer systems, and it kind of dovetailed into people like Cray Research and their supercomputers, and we went from there to superminicomputers, places like Digital Equipment and Prime and Wang, and Data General and Harris. All the guys in the 80s who had ECL logic termination needs. So it was the heyday back in the 80s: a lot of mainframes, supercomputers, superminicomputers, those very, very powerful systems that people now carry in their cellular phones- In their pocket, right? -at the time it was very, very powerful though. And so, across two different areas of growth in the 70s and 80s, we found new applications in digital, particularly termination, but we also started working very closely with the military aerospace industry, where they saw the elimination of solder joints as a very positive thing. You know, high g-force doesn't affect it, vibration - there's no joint there in the resistor circuits. So we started working with a lot of them in military aerospace, space-based applications, radars, antenna power dividers, high-speed digital systems - just a variety of different things. And it's evolved from there; it seems that every five years a new technology comes along that says, I need to use that. We can talk more about that - we'll get back to maybe the basics of what we actually do, how we make it. Yeah, so let's talk about OhmegaPly®: what is it? What is it like to process? Let's just go in - tell us the whole story. Oh man, you want to go right back to the beginning again. Okay, the OhmegaPly® technology is a thin film resistive foil.
Now, we became Ohmega Technologies - a spinoff of Mica - which started as a separate independent company in 1983, and we basically took over that whole technology from Mica. What that technology involves is taking copper foil - a standard ED, electrodeposited, copper foil that the printed circuit industry uses - and running it through a reel-to-reel deposition process, a plating process. We plate a very thin coating of a nickel-phosphorous, NiP, resistive alloy onto the matte, or tooth, side of that copper, and by varying the thickness of that resistive coating we can vary the sheet resistivity. And so this product is a true thin film nickel phosphorus alloy - we're talking about fractions of a micron of film thickness, so it's truly thin film. So we have a variety of different sheet resistivities: 10 ohm per square is about a 1 micron thick film, 25 ohm per square is 0.4 micron, 50 ohm is 0.2 micron. So it's very linear; as the film that we deposit gets thinner, the sheet resistivity goes up. Now we start getting into the dangerous territory of talking about things like ohms per square, and I don't want your listeners' eyes to cross over some strange area, but suffice it to say, it follows thin film technology. So what we do is make a resistive foil: a copper foil with a resistive coating. That's what we make at our facility, our factory in Culver City, California, very close to LAX, just a few miles away. We've been doing it now for literally 40-plus years at that facility. That resistive foil then gets laminated, or bonded, to a variety of dielectrics. We work with people like Rogers, Arlon, Taconic, we work with Isola, we work with Nelco, we do some work with DuPont, we're working with others out there, but essentially the resistive foil can be bonded to almost any kind of dielectric just like any other copper foil. Standard pressing, heat, pressure - it bonds to a variety of dielectrics. Now that laminate product - a copper clad laminate with the resistive film between the copper and the substrate - goes to the printed circuit board community, the PCB community, which then prints and etches copper circuitry. They normally will do a print, develop, etch process to create copper circuits. Then they go through a separate, additional print, develop, etch, strip - so it's a two-print operation. The first print defines where they have copper traces; then they etch away all excess copper and they etch away all excess resistive film underneath their copper. Now they have copper circuitry. Underneath all that copper circuitry is a resistive material, but electrically it's shorted out by the copper above it. Well, you have a spot for traces. Makes sense. That's the point: think of it as a treatment on the copper only, like a zinc or a brass. Okay. Now the board shops come back and they apply more photoresist over that copper circuitry and they print a second piece of artwork, and that artwork protects all the areas that they wish to keep as copper and exposes for etching the copper that will be the resistive element. Now in almost all cases, the first etch will define the width of that copper, and that will be the width of that resistive element. So the second-image artwork defines the length of copper that will be the length of the resistor (the quick arithmetic below shows how that width and length set the value).
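A rough, back-of-the-envelope note on the ohms-per-square idea: the value of a printed element is roughly the sheet resistivity times the number of squares, that is, R is about Rs times (length divided by width). So a 10 mil wide by 20 mil long element etched on 25 ohm-per-square material is two squares, or about 50 ohms, and the same geometry on 50 ohm-per-square material would be about 100 ohms. The exact value in practice depends on etch tolerances, which is one reason finished boards get tested against spec.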
So it's a very simple piece of artwork to use, very easy to register, and after protecting the copper with photoresist, they now etch away the exposed copper using alkaline-based etchants, leave behind the resistive film that was underneath it, and they have a resistive element. Interesting. Stripping the photoresist off the board leaves them with copper circuitry with resistive elements that are integral to that copper plane. Those resistors can be tested for value, and they can go through standard multi-layer processing, laid up with other cores, pressed - and then you can forget you have the resistive elements embedded; it goes through the traditional drilling, print, develop, etch, strip process, or plate process I should say. So you do a drilling and you desmear, you plate, you etch, and your embedded resistor is inside; and as a bare board now, prior to shipping for assembly, the board shop can do traditional testing, and they can measure resistor values to ensure they're within spec. They can also be used on the surface of a board, in which case you solder mask over the resistive elements along with your copper traces, and that protects them from abrasion and scratching. The key here is this, though: say you use a discrete resistive element, an 0402, an 0201. An 0201 is a 10 mil by 20 mil resistor. They're pretty small; Yeah. -hard to handle, hard to assemble. So if I go to a board shop now and say: hey guys, I want you to etch a copper trace that's 10 mil wide, they're gonna laugh and say: come on, you're insulting us!- Yeah. -we do 5 & 5, 4 & 4, 3 & 3, 2 & 2 technology. So etching a 10 mil trace isn't a big deal, and a five mil trace is not a biggie. When they etch that copper trace, they're essentially defining the width of the resistor, so it's like a controlled impedance trace: they're creating a resistive element of a certain width. Now you say: can you cover it with photoresist and have a little box window that's 20 mil long? Sure, that's not a big deal, and they etch the copper away. Now they've left themselves with a 10 mil by 20 mil resistive element, which does not push the state of the art at all - it's already built in, no assembly and all that. So if you say: hey, can they do a 5 mil by 10 mil resistor? Sure, we have applications that are using a 50 micron by 100 micron resistor. Whatever copper trace a board shop can etch, that's the limit of the resistor width you can print. So you can get significantly small, very, very precise resistors that can be located right where you want them, under a package, and that's where we're doing a lot of newer applications like microfluidic heaters - you're talking about a couple mils by four or five mils, and you can get very small heat rises in a very localized area, very low power - but I'm getting ahead of myself. Okay, yeah, well, I'm thinking about our audience right now, who are EEs doing design, or just purist board designers for the most part. Why would I want to use OhmegaPly® over traditional? I mean, you just mentioned one - if I had space restrictions and I didn't want to use these tiny, tiny parts, that seems like a no-brainer - but is it real estate, is it cost? What drives people - I think I'm opening a can of worms, sorry - but what are the cost, performance, and reliability implications? And if I was a designer, why would I want to use OhmegaPly®? Okay, it's a good question, and people use it for a variety of reasons.
The best reason we like to hear is: I have a design and there's no other way I can design this thing unless I get rid of my resistors - and so I kind of get a tear, I well up a bit, I get very emotional- [laughter] -with those. Because then it's all driven by performance and densification. Right. But realistically, for everybody, cost is a big driver, as is performance, and obviously densification - and it all goes hand in hand with reliability. I would say most designers design with us for a number of reasons. The key reason, the one we focus on, is densification, and that is this: say I have a certain number of resistors on a board and I say: I'm having a hard time routing. I have a lot of passives on my board; either I have to route in more layers, so I'm adding layers to a multilayer design if it's traditional through-hole, or I'm gonna have to go to HDI, which adds a lot of cost to my board. Or my form factor, my X and Y dimension, is a little too big and I need to shrink it down, or my board's a little too thick and I want to make it a little thinner. So here's a tool, a technology, that allows you to do that. So let's say I have one resistor in a unit area of a board, and somebody says, well gee, I want to etch in a resistor. Okay, what's the cost going to be? It's gonna cost whatever our material costs, divided by one; there's gonna be one resistor. Now instead I have ten resistors - what's the cost? It's our unit cost divided by ten, because it's the same material going through the same print and etch process. So the greater the number of resistors, the lower the cost per resistor. One application that uses our technology - and this is where it reinvents itself - a number of years ago, five or six years ago, it started being used in MEMS microphones. If any of your listeners out there, any of your designers, have a cell phone, you very likely have us in your cell phone, in the MEMS microphone that you're talking into or listening to right now. Now why use us in a MEMS microphone? We're part of an RC filter network which improves the sound fidelity significantly. So it's been found to be a very significant offering by the MEMS microphone makers and their end customers, who are the cell phone manufacturers - and in very massive, mass-quantity production, for many, many years, over in the Far East, particularly in China, where our product is used extensively. So in those applications it was a combination of densification - they can make these MEMS microphone boards with a thinner PCB because they eliminate the chip resistor, you don't have to assemble it, they can make them a little bit smaller - and because you're talking about such a small element, even just a few resistors, a couple of resistors in that design, you're talking about a fraction of a cent to put these resistive elements in a board. A fraction of a cent, no assembly- Yeah, when they're in the millions that matters. -all of that's very important. There's another example. Say I'm a designer and I say: hey, I have a high-density I/O IC. With my fast rise times I have some termination issues, but I'm on a 300 micron pad pitch and there's no way I can put a discrete component on my surface to go ahead and terminate - I have too far to go, I have too many of these lines. So I have an I/O of hundreds of traces, maybe a thousand traces. How do I do it? Well, guess what?
You're able to take every trace, every logic trace coming off that I/O, and build a resistor as part of that trace - the trace just has a little of the copper removed, leaving a resistive element behind. So it's a resistor built in trace, which is one of our products: ORBIT, Ohmega Resistor Built In Trace. You can terminate every one of those drive lines - they're underneath the IC package, so they take up no board area. You terminate off that drive line, you improve impedance, you naturally reduce line delay, and you also save money, because now you literally have hundreds of resistors in a square inch or a couple of square inches of area, and it saves a lot of cost by not having to assemble and put those discretes on your board. So cost is a big driver - I just mentioned a couple of them - and densification is as well, but our material is also essentially inductance-free. So, you know, it means that you have less inductive reactance with fast rise times. So what happens? You get less EMI coming off your board; it's a cleaner signal. Our material, because of that, is also used in certain applications for absorbers, or R-cards, where they use that resistive film to suppress some of the EMI coming off- -interesting -as a shielding agent. So there's another application. So we're used extensively, not just in power dividers and R-cards and absorbers, but obviously as terminators, in filters, pull-up/pull-down resistors, and now we're seeing a lot of activity in heater elements. The military aerospace side uses us as well for laser activation, where they have tiny resistive elements on PCBs that can go ahead and activate a laser for laser guidance for smart munitions and missile systems, or heater elements that can go ahead and maintain heat on critical components in avionics or even in space-based applications. Our product is used in satellites and even in deep space probes. We were on the Mars Express Beagle 2 lander, on the surface of Mars, where we had an Ohmega heater keeping key critical components above minus 15° C. It would have worked great if the parachute had not landed on top of the lander [laughter] and prevented the deployment of the solar array, but hey, it was a great application for our product. Well, again, I think it's just such a surprise - or at least it was to me - when I learned about, one, how old the technology is, and two, that it's really because of complexity and just all the different things that are going on in the industry right now that it's growing - it's growing at a quick pace. Significantly so; we've had a wonderful record year - every year is a record year. But that's the nice thing: the resistive film is like a blank slate, and what you do with it is up to the designer. So yeah, in the 80s it was all ECL logic termination, and then it goes into power dividers, and they're still doing all that stuff. But what's happening now is we're seeing it utilized in so many different ways. So we talked about the MEMS microphone; well, there's new sensor technology, there are accelerometers and other MEMS-type sensors that use us. Now we see automotive sensor technology that says: hey, we could use this - not only is it obviously super-high reliability, been out for decades, you know, can be done in high volumes, very cost-effective densification - but there are some critical components you could use it in, in automotive, 5G technology- What about IoT, Bruce? It seems ideal for IoT, provided the cost- -IoT, you're saying? Yes.
The Internet of Things - well, that's why I'm talking about sensor technologies. IoT is a combination of a lot of things. Yes. Technologies are getting into it; we see our stuff on flexible materials, and wearables. Your wearables, yeah, that was the other thing I was wondering about. Wearable devices, smaller home devices, home audio devices - and as things get thinner and smaller, everybody wants things densified. So getting rid of the passives especially really allows you to do that. So yeah, IoT is a big thing, automotive, even memory devices going to DDR4, going out to DDR5; those fast data rates are causing needs for termination again, and JEDEC has approved the embedded resistor within some of the DDR4 structures. So memory is another area. So between sensor technologies and automotive, and home devices, and things like memory devices, and things like microfluidic heaters and biomedical-type things, you know - we have micro heaters on an embedded board, you can have fluid come in and basically break down the protein to do analysis; they use us for things like that. It's pretty exciting - so yeah, it's been around for 45 years, but guess what, we think that with the new technologies and the new applications it's almost like just starting over again. Yeah, I can see that. Especially since you have the reliability, long-term use, high volume, low volume, high density, low density - so many different ways of doing it. It's nice to have that background to make people feel good about using the technology, while knowing that all these new things are developing. I mean, I can't wait for the next 45 years. That's fun. Well, a couple of things I wanted to ask you about. What made me think of calling you and wanting to do this - sort of a side note - is, you hear about passives being on allocation and all of that, and I'm like: I wonder if Bruce is seeing an uptick just because people are freaking out over automotive buying up whole lines - I don't know if you're seeing that, it was just a curiosity I had? Well yeah, I know what you're saying, we definitely see an uptick, and you have to put that uptick within the context of the industry. First off, I do want to tell your audience, especially your designers: we've been doing this for 40 - 45 years plus, as I mentioned - 46 years. I'd like to say that I was only 2 years old when I first got introduced to the technology- -We're going with that; I was three, you were two - let's go with that! But we also have designers at our company whose job it is to work with the design community, particularly PCB designers, to help them optimize their designs and develop real footprints for the resistors. What we don't want your listeners to do is reinvent the wheel; we want you to use our knowledge, talk to our people - say: hey, here's what I think I'd like to do, I have an application I want to use this in, does it make sense for your technology? If it doesn't, we don't want you wasting your time; ultimately you're gonna say we're not gonna use it anyway. Either way, we want you to have an optimized design, because we want you to be successful. So think of us as an extension of yourself, of your team. We're part of your design team, we're there to help and assist. If you go to our website, ohmega.com, there are a lot of white papers, there's a lot of good information there that people can read and reference. But more important is the communication with our staff, technical people who can really help you.
Now, talking about the industry in general, there is an uptick in that. We talked about passives, so I'll mention it: we're in filters and MEMS microphones, resistors and capacitors, and in one case one of the capacitive materials, the embedded capacitor material FaradFlex, is produced by Oak Mitsui. So Ohmega Technologies, my company, and Oak Mitsui got together and combined the materials, putting our resistive material on their capacitor material so we'd have a single-layer resistor-capacitor. What? My head just exploded! What we did was, we found that it's pretty simple, from a technology standpoint, to combine the two technologies; each separately has its own complexity, but working together they really work very well. Importantly enough, it had such synergistic effects - in terms of improved power, lower TCR characteristics (the change of resistance as a function of temperature) down to almost nothing, and stability that is astounding over a wide temperature range - that we applied for and got a jointly held patent for the combined technology, which we have in the US and also all over the world now. So it's a joint technology pact between Ohmega Technologies and Oak Mitsui and Mitsui Mining in Japan, and we see applications where, if somebody wants to embed a resistor, they also want to embed the capacitor. They get rid of capacitors that are passives, and a lot of times they want to get rid of resistors too. So it goes hand in hand with a lot of those. In general, there is a lot of movement in the industry toward embedding, and it's a growing thing because of densification and growing needs for real estate: smaller, thinner, lighter. You touched upon something, and that is material sources; right now the industry is going through some uptick. I think part of that is military aerospace, which has increased the amount of funding in a lot of military programs, but also other areas. So we've seen that as well, and our products are used in a lot of that stuff: radar systems, F-35, F-22, a lot of missile systems, Eurofighter, just all over the place. A lot of satellites, a lot of SATCOM, a lot of other things like that. A lot of radars on the ground as well. But we're also seeing that uptick because of the IoT, as you mentioned, the Internet of Things - there are more and more sensor technologies being designed into a lot of different products. People are amazed at how many sensors go into so many things these days, and the key with a lot of that is densification - smaller, faster, cheaper - so that goes hand-in-hand with 5G, with automotive, with the self-driving cars that are coming up; a lot of the sensors, the lidar, other sensor technologies, are going into self-driving automobiles. And what everybody says is: hey, this all sounds great, but you know what? If I have a printed circuit board in my computer and I have a failure in that, it's okay. It's annoying, my computer goes down, I swap a board, I put in a board. But I cannot afford to have any failure, I cannot afford to have anything go wrong, if I'm in an automobile that's driving itself. Do you have room for any kind of failure? And so it's taken very seriously in the industry; going to a lot of these conferences and hearing the talks, the people involved with testing a lot of these are very concerned. As tough as it was, they have to make testing even tougher. Nothing can fail, so a lot of that comes down to: what can we do to improve reliability? Hey, let's get rid of solder joints.
You don't want the kind of joint where something goes 'ding' and flies off a board anymore - or, you know, X and Y expansion or z-axis expansion, all those things. Get rid of those solder joints, those mechanical joints, and improve the reliability while you enhance, densify, and improve electrical performance. So we're seeing that that's going on right now. And the other thing is that companies are concerned; the industry is facing some interesting things right now - in the printed circuit industry, copper lead times are really stretching out. Yeah, that's crazy too. -Yeah, the industry is getting smaller and smaller, yet at the same time the end-users and designers have to rely more and more on fewer and fewer resources. So we've been around since, like I said, 1972 - for 46 years we've been supplying this technology, and we have never, ever not been able to supply it in those 46 years. It's important for us that, A, we manufacture everything ourselves: we make that resistive film, we test it, we have test facilities which make sure that the product is what it should be before it ships out the door. We have hands-on manufacturing; that's critical. We don't want to subcontract making our product, because we feel it's too important to our customers. They're relying on us. If we subcontracted, what would happen if whoever we had making it went out of business? Or they sold the business, or said, I don't want to do it anymore - and then we can't get product, and our customers can't. We don't want to rely on someone else; that's number one. Number two, we have very good, close working relationships with our raw material suppliers. Most of our raw materials are USA-based; we get them in from the US, you know. We want to have a critical supply chain. When you're talking about scarce resources like copper and other things, it's important that we have that kind of relationship with our suppliers so that we always have product. We're always there to support our customers' needs, when they need it, how they need it, and that to me is very, very important, because a lot of companies are coming to us saying: oh yeah, we're being given two months' lead time on getting product, how are we supposed to deal with that? And they say, what about you guys? I say: you want some of our stuff, we'll ship it tomorrow. To us that's very important. You've got to go ahead and satisfy customer needs, and especially their concerns - that's absolutely critical in the industry today. Yeah, and it's refreshing, because we get into this weird cultural thing as a business and it's like: oh, faster, cheaper, or we're gonna be the lean supply chain. We get into this whole frenetic thing, but we forget that if we're not meeting the needs of the customers, we'll be out of business. So I really love that philosophy. Now as far as our listeners go, Bruce, we're gonna share all of this in the show notes, right? Everything that's on your website - I encourage it - so we're going to supply all those links and the website, you guys, and if you're interested you can call Ohmega Technologies directly and get the help that Bruce alluded to. But they have a really great website with some really neat things that will go into even more depth than Bruce has gone into so far. So thank you so much. So Bruce, as we wrap up here - first of all, thank you. Bruce is joining us from IMS in Philadelphia today even though he's normally at Ohmega in Culver City, so thank you for hopping out of the show for a few minutes to give our listeners a treat; thank you for that.
When I wrap up the podcast I always like to have a little feature in here called 'designers after hours', because most of us techie weirdos have a little bit of a right brain and, I've found, have interesting hobbies. Is there anything that you do after hours that is creative, compelling, interesting or otherwise - or do you have any after hours? Do you just work all the time, Bruce? Do I have any after hours? That's a good question. Yeah, we encourage people to call us, and that keeps me and the staff at Ohmega rather active, and we welcome that; please, please, please call us, email us, we'd love to talk to you and listen to you. As for me, yeah, I enjoy travel, I enjoy writing, you know, I always have. Now it's mostly technical things or papers that I publish. But you know, I love doing fiction as well - I do do that and I get very involved. Between that and having a lot of crazy grandchildren running underfoot, that keeps me going. That fills up your plate. So also, would you say you are a geek or a nerd? I'm sorry? Would you say you are a geek or a nerd? That's a good question. I'm probably more geek than nerd - yeah, they've cleaned me up over the years, so I think I'm more geeky. Yeah, I would say you're more geeky, but you are walking on the razor's edge, my friend. You can dip into that nerd space pretty easily. [laughter] Oh man, and I've been so good, I haven't cracked any jokes you could be mad about. [laughter] Here I am, now you're telling me I'm close- -no, no, only in the best kind of way, that you like to go into this nerdy space of technology, but that's really- -you want to know something? It's been a long time, I've been doing this a long time, and I'm so excited - it's like a renewal. If people get that I'm excited about technology, about where Ohmega fits into technology, it's because I really AM. It's genuine. Our president, Allan Leve, over at Ohmega Technologies - here's a guy who's had the same kind of passion. So every time we see something, we're always sending articles: look at this, it's neat, isn't it? So if you call that nerdy, you call that geeky, that's fine. You know what we call it? Being enthused with technology and how we fit into that technology. Absolutely. -Because I've been called a nerd and a geek, I'm gonna drown myself in a Philly steak salad. -Extra cheese and onions. [laughter] No - when I say you teeter, it's only because I remember when I was working at Transline Technology, you came in and you were showing us how it's done, how it's processed at a board shop - and I remember listening to you going: this guy totally knows his stuff, and it was so articulate, and I'm like, boy, when I grow up I want to be able to talk like this. Like Bruce Mahler does, man, he's got it going on! So that's why I say- -just wait until I grow up, really. -Well, it is an exciting time in technology, there's no grass growing under our feet, so I share your enthusiasm for everything that's in the market, and you're seeing everything, so that is really exciting. Well, thank you again for- -thank you, I appreciate it Judy, the opportunity to spend time with you and with your audience, and hi to everyone out there - I look forward to talking with you, look forward to working with you, and like I said, there are a lot of exciting things out there right now in our industry, so we're working in the best industry out there. We are. Now we're gonna send poor Bruce back to booth duty, where he can stand in a booth. Sorry to send you back to the booth, but thank you so much.
Again, this has been Judy Warner with Altium's OnTrack Podcast and Bruce Mahler of Ohmega Technologies. Thanks for tuning in again; until we talk to you next time, always remember to stay on track.
Mikey Butler is the Senior Vice President of Engineering at New Relic. He's responsible for Product Engineering, Platform Engineering, and Site Engineering Operations. He brings more than 30 years of experience in the software industry: the first half focused on systems coding at Digital Equipment, Bank of America, Data General, Rolm and Sun, and the latter half on engineering management for Cisco Systems, Sybase, BEA Systems, Intuit and, most recently, Seagate. He was named one of the 50 Most Important African Americans in Technology at the 2009 Soul of Technology Innovation and Equity Symposium. LinkedIn: https://www.linkedin.com/in/mikey-butler-25602a1/ Twitter: https://twitter.com/mikeybutler
How contact centers can leverage agent knowledge to improve the products they support. Design for improved Customer Experience while sharing valuable input with other departments: marketing, product development and sales. Information gathering, analysis and distribution are keys to a quality customer service organization. Feedback shows you the weak spots, while analysis and communication with other parts of your enterprise bring value to the enterprise as a whole. Dr. Van Bennekom will draw the lines between inputs and great outcomes for contact center operations. Guest Host Bio: Dr. Fred Van Bennekom helps organizations collect and apply customer and employee feedback in a statistically valid manner. The author of “Customer Surveying: A Guidebook for Service Managers,” Fred teaches operations management in Northeastern University’s Executive MBA program. Before earning his doctorate, he served as an information systems consultant for Digital Equipment’s field service organization. Through his company, Great Brook, Dr. Van Bennekom conducts workshops and advises clients on their survey practices.
Sally-Ann is the Managing Director and founder of a worldwide series of conferences and exhibitions dedicated to e-learning. She has a long career in consulting and today specializes in eLearning, Competence Management and Performance Management. In 2013 and 2014 Sally-Ann was named one of the Top 10 Movers and Shakers of e-learning in Europe. Sally-Ann Moore is a BSc Honours graduate of the University of Manchester Institute of Science and Technology, and an Associate of the Textile Institute. She studied Finance at INSEAD and has published several articles and papers, lately on the subject of activity-based competence management and performance management. Sally-Ann's consulting career started with six years (from 1980 to 1986) at the prestigious Battelle Research Institute, where she led and managed techno-economic research and consulting programs. From 1986 to 1996, Sally-Ann was European portfolio manager for financial management reporting systems at Digital Equipment. Finally, before creating eLearn Expo in 2000, Sally-Ann co-founded and was responsible for the development of Global's Education Integration projects and consulting business in strategic accounts worldwide at Global Knowledge Network, the largest IT training company in the world.
This episode's guest is Spencer Ante, founder of WhoWeUse and author of Creative Capital: Georges Doriot and the Birth of Venture Capital. This is part 2 of our interview with Spencer. When we left off, Spencer had just told the story of Digital Equipment's IPO and its 70,000% ROI for ARD and its founder, Georges Doriot. This week, Dave and Spencer get into the drivers of economic progress, the roots of the difference between east coast and west coast founders, and more detail about Spencer's company, WhoWeUse. ====== Download WhoWeUse: http://whoweuse.net/
Laurie Burton, of Laurie Burton Training, is an internationally recognized innovator and author who has helped thousands of people improve their ability to communicate effectively, increasing success in business and in life. Laurie is also the author of Presenting You, a generous and very encouraging book on the art of communication. Laurie's techniques generate dramatic results in an amazingly short time, from one-on-one coaching, to seminars, to corporate groups. By promoting the use of each individual's unique passion, energy and animation, people discover the answer to creating more effective opportunities to lead, inspire and sell. From 1985 to 2008 Laurie was a senior lecturer at the University of Southern California School of Cinematic Arts, where she taught acting and directing for film in both graduate and undergraduate programs. Laurie has developed training sessions with a focus on what she believes is one of the most dynamic yet underutilized tools available for expressing ourselves: the human body. Laurie Burton Training refers to it as the Human Instrument, the engine that drives the way we speak, present and communicate, and a major contributor to the success or failure of reaching any audience, making that sale or getting that job. Laurie Burton Training clients include Fortune 500 companies such as IBM, Mattel Toys, Digital Equipment, Merrill Lynch, Medtronic and Twentieth Century Fox, as well as a variety of non-profit and educational institutions and associations.
Surveys are an important tool for measuring your customers' and agents' satisfaction and for identifying areas needing improvement. Listen in for 15 tips to optimize your surveys. Guest Dr. Fred Van Bennekom, the author of "Customer Surveying: A Guidebook for Service Managers," teaches operations management in Northeastern University's Executive MBA program and has served as an information systems consultant for Digital Equipment's field service organization. Through his company, Great Brook, Dr. Van Bennekom conducts workshops and advises clients on their survey practices. Survey Tips & Tricks – the Fast Fifteen, or a Baker's Dozen plus two:
Project Management
- Think about what you're trying to learn – statement of research objectives
- Line up your ducks – get the resources and sponsorship lined up
- Motivate, motivate, motivate
Questionnaire Design
- Talk to the people – do some field research for "attribute identification"
- Keep it short
- Organize into topical sections
- It's how you write it – look for multiple interpretations
- Data aren't all created equal – know your data types
Survey Administration
- Target the right people
- Target the right people the right way – avoid sample bias
- Know the math of statistical accuracy
Analysis & Reporting
- Fight the fires immediately
- Slice and dice – look for the interesting findings among subgroups
- Explain the impact
- Close the loop – tell them what you learned = motivate
Take Action Get Profits with Michele Scism with her guests Laurie Burton & Lisa Hall: Laurie Burton is internationally recognized as an innovator who has helped thousands of people improve their ability to communicate. Whether working with individuals in seminars and in one-on-one coaching, or with groups in the corporate world, Laurie's techniques generate dramatic results in an amazingly short time. Individuals at all levels learn to break through communication barriers and develop the skills needed to become effective leaders of groups, teams and businesses. A 30-year veteran performer of film, television and theater, Laurie brings the breadth of her experience to teach people the "art of communication." Laurie Burton Training's clients include Fortune 500 companies such as IBM, Mattel Toys, Digital Equipment, Merrill Lynch, ITT and Twentieth Century Fox. Laurie also served on the faculty of the University of Southern California School of Cinema/Television, teaching filmmakers and directors, in both graduate and undergraduate programs, to develop their communication skills for greater impact and effectiveness. Lisa Hall helps businesses meet the challenges of growth and expansion and cope with the flood of demands that hits every business owner every day. Coaching helps owners and managers be more effective; training helps employees be more productive and committed. Her services include one-on-one coaching, team training, remote webinars and phone coaching, value-added lessons in running your business, and speaking to your groups or offices – helping you navigate the choppy waters to profit and contentment.
Tough Talk with Tony Gambone with his guests, International Best-Selling Authors: Perceptive Marketing CEO and Founder, Sandy Lawrence, is the driving force behind the company. She is a marketing and promotions executive with over 30 years' experience at some of the largest and most well-known corporations in the US, such as Digital Equipment and HP/Compaq. Sandy has been a publicist for authors, publishers, speakers and small business owners for over 12 years. Belanie Dishong, Founder and CEO of Live At Choice, Live At Choice Media and the Starfisher Academy of Coaches (http://www.liveatchoice.com/), is an author, keynote speaker, course leader, personal coach and radio talk show host. In 1993, Belanie introduced a proven process that leads participants to critical self-discovery, helps them transform their beliefs and leads them to the realization that they can make new choices for success. Dr. Rorick has been in practice for over 30 years. He has spent the length of his career studying and specializing in diagnosing overlooked findings. He is able to obtain true results and solutions for patients by treating the body as a group of systems that need to work together, instead of troubleshooting with guesses and the general prescribing of medications. His passion, continuing education and the advanced technology used in his office have made him a highly respected wellness doctor in the greater Houston area.
Is the Customer Effort Score the new Net Promoter Score? Just because research is published on the internet or in a business journal does not mean the research findings are true. Guest Frederick Van Bennekom, Principal, Great Brook Consulting, will be talking about survey methodology and design and how to judge whether published findings are valid. Dr. Fred Van Bennekom helps organizations collect and apply customer and employee feedback in a statistically valid manner. The author of "Customer Surveying: A Guidebook for Service Managers," Fred teaches operations management in Northeastern University's Executive MBA program. Before earning his doctorate, he served as an information systems consultant for Digital Equipment's field service organization. Through his company, Great Brook, Dr. Van Bennekom conducts workshops and advises clients on their survey practices.
Unless we're in a churn-and-burn business, customer retention is critical to achieving long-term profitability. But how do we know what drives customer retention? We may have lots of anecdotal data, but can we be scientific in identifying those customers who are very likely to return and to give good word of mouth? Dr. Fred Van Bennekom of Northeastern University's Executive MBA Program joins us to shed some light on those questions! Recently, several key metrics have arisen to measure customer sentiment. The Net Promoter Score (NPS) is the best known, but the Customer Effort Score (CES) is a newer entrant in the field. Both are controversial. In this first presentation of two we will discuss NPS. Specifically, we will look at the research basis behind NPS and ask whether there are other metrics that might better indicate a loyal customer, or, perhaps more importantly, might identify a customer in need of a service recovery event to move them toward loyalty. Dr. Fred Van Bennekom founded Great Brook to help organizations collect and apply customer and employee feedback. Great Brook conducts workshops on survey practices and advises clients on their surveying practices. Fred authored Customer Surveying: A Guidebook for Service Managers and teaches operations management in Northeastern University's Executive MBA program. He served as an information systems consultant for Digital Equipment's field service organization before earning his doctorate.
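For listeners who want the mechanics behind the discussion: NPS is conventionally computed from the 0-10 "how likely are you to recommend us" question as the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). The sketch below shows that standard calculation; it is illustrative only, not taken from the guest's book, and the sample ratings are made up.

```python
from typing import Iterable

def net_promoter_score(ratings: Iterable[int]) -> float:
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    Promoters score 9-10, detractors 0-6, passives 7-8.
    NPS = % promoters - % detractors (a value between -100 and +100).
    """
    ratings = list(ratings)
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical batch of survey responses: 4 promoters, 2 detractors, 2 passives
print(net_promoter_score([10, 9, 8, 7, 6, 10, 3, 9]))  # -> 25.0
```

Because the score collapses an 11-point scale into three buckets, two very different distributions of responses can produce the same NPS, which is one reason the metric remains controversial as a loyalty indicator.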
Of the many presentations at CinemaCon this year, this was one of the more interesting on the technical side. The panel discusses […]
Crystal is a speaker, marketing strategist and owner of Houston-based marketing firm Black-Market Exchange. She teaches professionals of all knowledge levels how to gain measurable marketing results by leveraging unique strategy and technology. Perceptive Marketing CEO and Founder, Sandy Lawrence, is the driving force behind the company. She is a marketing and promotions executive with over 20 years' experience at some of the largest and most well-known corporations in the US, such as Digital Equipment and HP/Compaq. Sandy recognizes the entrepreneurial spirit and creativity of her clients and develops programs that help highlight their benefits for their customers. Her assistance results in increased business growth, customer satisfaction and customer loyalty. Sandy is a "people person" who works passionately and single-mindedly to take her clients' message to the world. Her marketing skills and wealth of knowledge are available to anyone who needs the competitive edge that only a truly perceptive marketing expert can give.
Laurie Burton is internationally recognized as an innovator who has helped thousands of people improve their ability to communicate. Whether working with individuals in seminars and in one-on-one coaching, or with groups in the corporate world, Laurie's techniques generate dramatic results in an amazingly short time. Individuals at all levels learn to break through communication barriers and develop the skills needed to become effective leaders of groups, teams and businesses. A 30-year veteran performer of film, television and theater, Laurie brings the breadth of her experience to teach people the "art of communication." Laurie Burton Training's clients include Fortune 500 companies such as IBM, Mattel Toys, Digital Equipment, Merrill Lynch, ITT and Twentieth Century Fox. Laurie also served on the faculty of the University of Southern California School of Cinema/Television, teaching filmmakers and directors, in both graduate and undergraduate programs, to develop their communication skills for greater impact and effectiveness. Email: laurie@laurieburtontraining.com If you love this show, please leave us a review. Go to https://ratethispodcast.com/rate and follow the simple instructions. Support this podcast at: https://redcircle.com/the-dave-pamah-show/donations