The computer... an invention that changed the entire world from top to bottom. Perhaps one of the most important discoveries in history. But this revolution certainly didn't happen overnight. The journey from a simple calculating tool all the way to artificial intelligence is really the story of humanity's effort to surpass itself. In Hiçbir Şey Tesadüf Değil we focus on the background of this technological revolution. In the first leg of this two-part mini series, we examine the technology that changed our lives, starting from its earliest, most primitive days.
In 1946, one of the world's first electronic computers was unveiled in Philadelphia, in the USA. It was called the Electronic Numerical Integrator and Computer, or ENIAC, and was initially designed to do calculations for ballistics trajectories. It was programmed by six female mathematicians. Rachel Naylor speaks to Gini Mauchly Calcerano, whose dad John Mauchly co-designed it, and whose mum, Kay McNulty, was one of the programmers. Eye-witness accounts brought to life by archive. (Photo: Computer operators programming the ENIAC. Credit: Corbis via Getty Images)
How do computers work? And can we really say they are intelligent? From their invention, built to do mathematical calculations, all the way to our smartphones... Julien explains it all in this new episode of "Qui a inventé ?"!
What I learned from reading The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson.
In 1946 John Eckert and John Mauchly left the Moore School, patented ENIAC, and founded a company. One of those discussions would have consequences that wouldn't be resolved until 1973. Today we close out our series on ENIAC with a look at the legal battle it spawned, and how it put ownership over the rights to basic digital technology on trial. Along the way we talk legal gobbledygook, conspiracy, and take a look at some of the earliest electronic computers. Like the show? Then why not head over and support me on Patreon. Perks include early access to future episodes, and bonus content: https://www.patreon.com/adventofcomputing
Today we're going to celebrate the birthday of the first real multi-purpose computer: the gargantuan ENIAC, which would have turned 74 years old today, on February 15th. Many generations ago in computing. The year is 1946. World War II raged from 1939 to 1945. We'd cracked Enigma with computers and scientists were thinking of more and more ways to use them. The press is now running articles about a "giant brain" built in Philadelphia. The Electronic Numerical Integrator and Computer was a mouthful, so they called it ENIAC. It was the first true electronic computer. Before that there were electromechanical monstrosities. Those had to physically move a part in order to process a mathematical formula. That took time. ENIAC used vacuum tubes instead. A lot of them. To put things in perspective: every hour of processing by the ENIAC was worth 2,400 hours of work calculating formulas by hand. And it's not like you can do 2,400 hours in parallel between people, or in a row for that matter. So it made the previously almost impossible, possible. Now you could figure out the settings to fire a shell where you wanted it to go in about a minute, rather than spending the better part of a day running calculations. But math itself, for the purposes of math, was about to get really, really cool. The Bush Differential Analyzer, an earlier mechanical computer, had been built in the basement of the building that is now the ENIAC museum. The University of Pennsylvania ran a class on wartime electronics, based on their experience with the Differential Analyzer. John Mauchly and J. Presper Eckert met in 1941 while taking that class, a topic that had included lots of shiny new or newish things like radar and cryptanalysis. That class was mostly on ballistics, a core focus at the Moore School of Electrical Engineering at the University of Pennsylvania. More accurate ballistics would be a huge contribution to the war effort. But Eckert and Mauchly wanted to go further, building a multi-purpose computer that could analyze weather and calculate ballistics. Mauchly got all fired up and wrote a memo about building a general purpose computer. But the University shot it down. And so ENIAC began life as Project PX, when Herman Goldstine acted as the main sponsor after seeing their proposal and digging it back up. Mauchly would team up with Eckert to design the computer and the effort was overseen and orchestrated by Major General Gladeon Barnes of the US Army Ordnance Corps. Thomas Sharpless was the master programmer. Arthur Burks built the multiplier. Robert Shaw designed the function tables. Harry Huskey designed the reader and the printer. Jeffrey Chu built the dividers. And Jack Davis built the accumulators. Ultimately it was just a really big calculator and not a computer that ran stored programs in the same way we do today. Although ENIAC did get an early version of stored programming that used a function table for read-only memory. The project was supposed to cost $61,700. The University of Pennsylvania Department of Computer and Information Science in Philadelphia actually spent half a million dollars worth of metal, tubes and wires. And of course the scientists weren't free. That's around six and a half million dollars in today's cash. And of course it was paid for by the US Army. Specifically the Ballistic Research Laboratory. It was designed to calculate firing tables to make blowing things up a little more accurate.
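Those firing-table runs were, at bottom, numerical integration of a projectile's equations of motion, the same arithmetic the human computers ground through by hand. As a rough, hedged illustration (this is not ENIAC's actual method; the drag coefficient, step size, and function names are invented for the sketch), a simple Euler integration in Python looks like this:

```python
import math

def trajectory(v0, angle_deg, drag_coeff=0.0001, dt=0.01, g=9.81):
    """Crudely integrate a projectile's flight with quadratic air drag.

    v0 is muzzle velocity in m/s, angle_deg is elevation in degrees.
    drag_coeff and dt are illustrative assumptions, not historical values.
    Returns (range_m, time_of_flight_s).
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    t = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # Quadratic drag opposes the velocity vector.
        ax = -drag_coeff * speed * vx
        ay = -g - drag_coeff * speed * vy
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return x, t

if __name__ == "__main__":
    rng, tof = trajectory(v0=500.0, angle_deg=45.0)
    print(f"range ≈ {rng:.0f} m, time of flight ≈ {tof:.1f} s")
```

Each pass through a loop like this stands in for the thousands of hand calculations behind a single firing-table entry.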
Herman Goldstine chose a team of programmers that included Betty Jennings, Betty Snyder, Kay McNulty, Fran Bilas, Marlyn Meltzer, and Ruth Lichterman. They were chosen from a pool of 200 and set about writing the necessary formulas for the machine to process the requirements provided by people using time on the machine. In fact, Kay McNulty invented the concept of subroutines while working on the project. They would flip switches and plug in cables as a means of programming the computer. And programming took weeks of figuring out complex calculations on paper. Then it took days of fiddling with cables, switches, tubes, and panels to input the program. Debugging was done step by step, similar to how we use breakpoints today. They would feed ENIAC input using IBM punch cards and readers. The output was punch cards as well, and these punch cards acted as persistent storage. The machine used standard octal-base radio tubes, 18,000 of them, and they were run at a lower voltage than they could have been in order to minimize tubes blowing out and creating heat. Each digit used in calculations took 36 of those vacuum tubes, and the machine had 20 accumulators that could run 5,000 operations per second. The accumulators used two of those tubes to form a flip-flop, and they got them from the Kentucky Electrical Lamp Company. Given the number that blew every day, they must have really loved life until engineers got it down to only blowing a tube every couple of days. ENIAC was a modular computer and used different panels to perform different tasks, or functions. It used ring counters with 10 positions for a lot of operations, making it a decimal computer as opposed to the binary computational devices we have today. The pulses between the rings were used to count. Suddenly computers were big money. A lot of research had happened in a short amount of time. Some had been government funded and some had been part of corporations, and it became impossible to untangle the two. This was pretty common with technical advances during World War II and the early Cold War years. John Atanasoff and Cliff Berry had ushered in the era of the digital computer in 1939 but hadn't finished. Mauchly had seen their work in 1941. ENIAC was used to run a number of calculations for the Manhattan Project, allowing us to blow more things up than ever. That project took over a million punch cards and took precedence over artillery tables. John von Neumann worked with a number of mathematicians and physicists including Stanislaw Ulam, who developed the Monte Carlo method. That led to a massive reduction in programming time. Suddenly programming became more about I/O than anything else. To promote the emerging computing industry, the Pentagon had the Moore School of Electrical Engineering at the University of Pennsylvania launch a series of lectures to further computing at large. These were called the Theory and Techniques for Design of Electronic Digital Computers, or just the Moore School Lectures for short. The lectures focused on the various types of circuits and the findings from Eckert and Mauchly on building and architecting computers. Goldstine would talk at length about math, and other developers would give talks, looking forward to the development of the EDVAC and back at how they got where they were with ENIAC. As the University began to realize the potential business impact and monetization, they decided to bring a focus to University-owned patents.
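Those ring counters with 10 positions are what made ENIAC count in decimal: each decade advances one position per pulse and sends a carry into the next decade when it wraps past 9. A minimal sketch of the idea (the class names and structure are mine, not ENIAC terminology, and a real accumulator also handled subtraction and signs):

```python
class DecadeRingCounter:
    """One decimal digit: a ring of 10 positions that advances on each pulse."""

    def __init__(self):
        self.position = 0  # which of the 10 ring positions is currently "on"

    def pulse(self):
        """Advance one position; return True when we wrap past 9 (a carry)."""
        self.position = (self.position + 1) % 10
        return self.position == 0


class Accumulator:
    """A chain of decade counters, loosely like one ENIAC accumulator. A toy model."""

    def __init__(self, digits=10):
        self.decades = [DecadeRingCounter() for _ in range(digits)]

    def add_pulses(self, n):
        """Add n by sending n pulses into the lowest decade, propagating carries."""
        for _ in range(n):
            carry = self.decades[0].pulse()
            i = 1
            while carry and i < len(self.decades):
                carry = self.decades[i].pulse()
                i += 1

    def value(self):
        return sum(d.position * 10 ** i for i, d in enumerate(self.decades))


acc = Accumulator()
acc.add_pulses(1234)
print(acc.value())  # 1234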
That focus on University-owned patents drove the original designers out of the University of Pennsylvania, and they started the Eckert-Mauchly Computer Corporation in 1946. Eckert-Mauchly would then build the EDVAC, making use of the progress the industry had made since ENIAC construction had begun. EDVAC would effectively represent the wholesale move away from decimal and into binary computing, and while it weighed tons - it would become the precursor to the microchip. After the ENIAC was finished, Mauchly filed for a patent in 1947. While a patent was granted, you could still count on your fingers the number of machines that were built at about the same time, including the Atanasoff Berry Computer, Colossus, the Harvard Mark I and the Z3. So luckily the patent was eventually voided, in 1973, and digital computers are a part of the public domain. By then, the Eckert-Mauchly Computer Corporation had been acquired by Remington Rand, which merged with Sperry and is now called Unisys. The next wave of computers would be mainframes built by GE, Honeywell, IBM, and a number of other vendors, and so the era of batch processing mainframes began. The EDVAC begat the UNIVAC, with Grace Hopper brought in to write an assembler for it. Computers would become the big mathematical number crunchers and slowly spread into being data processors from there. Following decades of batch processing mainframes we would get minicomputers and interactivity, then time sharing, and then the PC revolution. Distinct eras in computing. Today, computers do far more than just the types of math the ENIAC did. In fact, the functionality of ENIAC was duplicated onto a 20 megahertz microchip in 1996. You know, 'cause the University of Pennsylvania wanted to do something to celebrate the 50th birthday. And a birthday party seemed underwhelming at the time. And so the date of release for this episode is February 15th, now ENIAC Day in Philadelphia, dedicated as a way to thank the university, creators, and programmers. And we should all reiterate their thanks. They helped put computers front and center into the thoughts of the next generation of physicists, mathematicians, and engineers, who built the mainframe era. And I should thank you - for listening to this episode. I'm pretty lucky to have ya'. Have a great day!
The Microchip. Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future! Today's episode is on the history of the microchip, or microprocessor. This was a hard episode, because it was the culmination of so many technologies. You don't know where to stop telling the story - and you find yourself writing a chronological story in reverse chronological order. But few advancements have impacted humanity the way the introduction of the microprocessor has. Given that most technological advances are a convergence of otherwise disparate technologies, we'll start the story of the microchip with the obvious choice: the light bulb. Thomas Edison first demonstrated the carbon filament light bulb in 1879. William Joseph Hammer, an inventor working with Edison, then noted that if he added another electrode to a heated filament bulb, it would glow around the positive pole in the vacuum of the bulb and blacken the wire and the bulb around the negative pole. 25 years later, John Ambrose Fleming demonstrated that if that extra electrode is made more positive than the filament, the current flows through the vacuum, and that the current could only flow from the filament to the electrode and not the other direction. This converted AC signals to DC and represented a boolean gate. In 1904, Fleming was granted Great Britain's patent number 24850 for the vacuum tube, ushering in the era of electronics. Over the next few decades, researchers continued to work with these tubes. Eccles and Jordan invented the flip-flop circuit at London's City and Guilds Technical College in 1918, receiving a patent for what they called the Eccles-Jordan Trigger Circuit in 1920. Now, English mathematician George Boole, back in the earlier part of the 1800s, had developed Boolean algebra. Here he created a system where logical statements could be made in mathematical terms. Those could then be performed using math on the symbols. Only a 0 or a 1 could be used. It took a while, but John Vincent Atanasoff and grad student Clifford Berry harnessed these circuits in the Atanasoff-Berry Computer in 1938 at Iowa State University and, using Boolean algebra, successfully solved linear equations. They never finished the device due to World War II, during which a number of other technological advancements happened, including the development of the ENIAC by John Mauchly and J. Presper Eckert from the University of Pennsylvania, funded by the US Army Ordnance Corps, starting in 1943. By the time it was taken out of operation, the ENIAC had 20,000 of these tubes. Each digit in an algorithm required 36 tubes. Ten-digit numbers could be multiplied at 357 per second, showing the first true use of a computer. John von Neumann was the first to actually use the ENIAC, when they used one million punch cards to run the computations that helped propel the development of the hydrogen bomb at Los Alamos National Laboratory. The creators would leave the University and found the Eckert-Mauchly Computer Corporation. Out of that later would come the Univac and the ancestor of today's Unisys Corporation. These early computers used vacuum tubes to replace gears that were in previous counting machines and represented the First Generation. But the tubes for the flip-flop circuits were expensive and had to be replaced way too often. The second generation of computers used transistors instead of vacuum tubes for logic circuits.
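The Eccles-Jordan trigger circuit mentioned above is what we now call a flip-flop: two cross-coupled gates that hold a single bit, which is also what ENIAC's tube pairs did. Here's a rough logical model of a set-reset latch built from NOR gates (a logic sketch, not an electrical simulation; the settle loop and function names are assumptions of the example):

```python
def nor(a, b):
    """A NOR gate over 0/1 values."""
    return 0 if (a or b) else 1

def sr_latch(s, r, q_prev):
    """Cross-coupled NOR gates: a set/reset latch holding one bit.

    Iterate the two gates a few times until the outputs settle.
    """
    q, q_bar = q_prev, 1 - q_prev
    for _ in range(4):  # a few passes are enough for this tiny loop to settle
        q_new = nor(r, q_bar)
        q_bar_new = nor(s, q_new)
        if (q_new, q_bar_new) == (q, q_bar):
            break
        q, q_bar = q_new, q_bar_new
    return q

q = 0
q = sr_latch(s=1, r=0, q_prev=q)  # set
print(q)  # 1
q = sr_latch(s=0, r=0, q_prev=q)  # hold the stored bit
print(q)  # 1
q = sr_latch(s=0, r=1, q_prev=q)  # reset
print(q)  # 0
```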
The integrated circuit is basically a circuit of wires and components set into silicon or germanium that can be switched on or off based on the properties of the material. These replaced vacuum tubes in computers to provide the foundation of the boolean logic. You know, the zeros and ones that computers are famous for. As with most modern technologies, the integrated circuit owes its origin to a number of different technologies that came before it was able to be useful in computers. This includes the three primary components of the circuit: the transistor, resistor, and capacitor. The silicon that chips are so famous for was actually discovered by Swedish chemist Jöns Jacob Berzelius in 1824. He heated potassium chips in a silica container and washed away the residue and voilà - an element! The transistor is a semiconducting device with three connections that can amplify or switch a signal. One is the source, which is connected to the negative terminal on a battery. The second is the drain, a positive terminal; when a voltage is applied to the gate (the third connection), the transistor allows electricity through. The transistor then acts as an on/off switch. The fact they can be on or off is the foundation for Boolean logic in modern computing. The resistor controls the flow of electricity and is used to control the levels and terminate lines. An integrated circuit is also built using silicon, but you print the pattern into the circuit using lithography rather than painstakingly putting little wires where they need to go, like radio operators did with the cat's whisker all those years ago. The idea of the transistor goes back to the mid-30s, when William Shockley took the idea of a cat's whisker, or a fine wire touching a galena crystal. The radio operator moved the wire to different parts of the crystal to pick up different radio signals. Solid state physics was born when Shockley, who first studied at Caltech and then got his PhD in Physics, started working on a way to make these usable in everyday electronics. After a decade in the trenches, Bell gave him John Bardeen and Walter Brattain, who successfully finished the invention in 1947. Shockley went on to design a new and better transistor, known as a bipolar transistor, and helped move us from vacuum tubes, which were bulky and needed a lot of power, first to germanium, which they used initially, and then to silicon. Shockley got a Nobel Prize in physics for his work and was able to recruit a team of extremely talented young PhDs to help work on new semiconductor devices. He became increasingly frustrated with Bell and took a leave of absence. Shockley moved back to his hometown of Palo Alto, California and started a new company called the Shockley Semiconductor Laboratory. He had some ideas that were way before his time and wasn't exactly easy to work with. He pushed the chip industry forward, but in the process spawned a mass exodus of employees, whom he called the "Traitorous 8," who left in 1957 to create what would become Fairchild Semiconductor. The alumni of Shockley Labs ended up spawning 65 companies over the next 20 years that laid the foundation of the microchip industry to this day, including Intel. Ironically, if he'd been easier to work with, we might not have had the innovation that we've seen; in a way we have Shockley's abrasiveness to thank! All of these silicon chip makers clustering in a small area of California is what earned the region its Silicon Valley moniker.
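Treating the transistor as the gate-controlled on/off switch described above, you can stack switches to get Boolean logic. A toy abstraction in Python (purely logical, with invented helper names; real gates are wired differently, so take this as a sketch of the idea rather than circuit design):

```python
def n_transistor(gate, drain):
    """Toy switch: conducts drain through to the output only when gate is 1."""
    return drain if gate == 1 else 0

def not_gate(a):
    # If the switch conducts, the output is pulled low; otherwise it stays high.
    pulled_low = n_transistor(gate=a, drain=1)
    return 0 if pulled_low else 1

def nand_gate(a, b):
    # Two switches conceptually in series: output goes low only when both are on.
    return 0 if (n_transistor(a, 1) and n_transistor(b, 1)) else 1

def and_gate(a, b):
    return not_gate(nand_gate(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_gate(a, b))
```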
At this point, people were starting to experiment with computers using transistors instead of vacuum tubes. The University of Manchester created the Transistor Computer in 1953. The first fully transistorized computer came in 1955 with the Harwell CADET, MIT started work on the TX-0 in 1956, and the THOR guidance computer for ICBMs came in 1957. But the IBM 608 was the first commercial all-transistor solid-state computer. The RCA 501, Philco Transac S-1000, and IBM 7070 took us through the age of transistors, which continued to get smaller and more compact. At this point, we were really just replacing tubes with transistors. But the integrated circuit would bring us into the third generation of computers. The integrated circuit is an electronic device that has all of the functional blocks put on the same piece of silicon. So the transistor, or multiple transistors, is printed into one block. Jack Kilby of Texas Instruments patented the first miniaturized electronic circuit in 1959, which used germanium and external wires and was really more of a hybrid integrated circuit. Later in 1959, Robert Noyce of Fairchild Semiconductor invented the first truly monolithic integrated circuit, which he received a patent for. Because they did so independently, both are considered creators of the integrated circuit. The third generation of computers was from 1964 to 1971, and saw the introduction of metal-oxide-silicon and printing circuits with photolithography. In 1965 Gordon Moore, also of Fairchild at the time, observed that the number of transistors, resistors, diodes, capacitors, and other components that could be shoved into a chip was doubling about every year and published an article with this observation in Electronics Magazine, forecasting what's now known as Moore's Law. The integrated circuit gave us the DEC PDP and later the IBM S/360 series of computers, making computers smaller, and brought us into a world where we could write code in COBOL and FORTRAN. A microprocessor is one type of integrated circuit. They're also used in audio amplifiers, analog integrated circuits, clocks, interfaces, etc. But in the early 60s, the Minuteman missile program and the US Navy contracts were practically the only ones using these chips, at this point numbering in the hundreds, bringing us into the world of the MSI, or medium-scale integration chip. Moore and Noyce left Fairchild and founded NM Electronics in 1968, later renaming the company to Intel, short for Integrated Electronics. Federico Faggin came over in 1970 to lead the MCS-4 family of chips. These, along with other chips that were economical to produce, started to result in chips finding their way into various consumer products. In fact, the MCS-4 chips, which split RAM, ROM, CPU, and I/O, were designed for the Nippon Calculating Machine Corporation, and Intel bought the rights back, announcing the chip in Electronic News with an article called "Announcing A New Era In Integrated Electronics." Together, they built the Intel 4004, the first microprocessor that fit on a single chip. They buried the contacts in multiple layers and introduced 2-phase clocks. Silicon oxide was used to layer integrated circuits onto a single chip. Here, the microprocessor, or CPU, splits the arithmetic and logic unit, or ALU, the bus, the clock, the control unit, and registers up so each can do what they're good at, but live on the same chip. The 1st generation of the microprocessor was from 1971, when these 4-bit chips were mostly used in guidance systems.
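That split of ALU, registers, control unit, and clock onto one chip all serves a single loop: fetch an instruction, decode it, execute it. A toy fetch-decode-execute sketch in Python (the instruction set here is invented for illustration and is not the 4004's):

```python
def run(program, registers=None):
    """A toy CPU: fetch, decode, and execute over a tiny invented instruction set.

    Instructions are tuples: ("LOAD", reg, value), ("ADD", dst, src),
    ("PRINT", reg), ("HALT",).
    """
    regs = registers or {"A": 0, "B": 0}
    pc = 0  # program counter
    while True:
        op, *args = program[pc]              # fetch and decode
        pc += 1
        if op == "LOAD":                     # execute
            regs[args[0]] = args[1]
        elif op == "ADD":
            regs[args[0]] += regs[args[1]]   # the ALU's job
        elif op == "PRINT":
            print(regs[args[0]])
        elif op == "HALT":
            return regs

run([
    ("LOAD", "A", 2),
    ("LOAD", "B", 40),
    ("ADD", "A", "B"),
    ("PRINT", "A"),   # 42
    ("HALT",),
])
```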
The 4004 boosted processing speed by five times. The forming of Intel and the introduction of the 4004 chip can be seen as one of the primary events that propelled us into the evolution of the microprocessor and the fourth generation of computers, which lasted from 1972 to 2010. The Intel 4004 had 2,300 transistors. The Intel 4040 came in 1974, giving us 3,000 transistors. It was still a 4-bit data bus but jumped to 12-bit ROM. The architecture was also from Faggin but the design was carried out by Tom Innes. We were firmly in the era of LSI, or Large Scale Integration chips. These chips were also used in the Busicom calculator, and even in the first pinball game controlled by a microprocessor. But getting a true computer to fit on a chip, or a modern CPU, remained an elusive goal. Texas Instruments ran an ad in Electronics with a caption that the 8008 was a "CPU on a Chip" and attempted to patent the chip, but couldn't make it work. Faggin and Intel did make it work, giving us the first 8-bit microprocessor, the 8008, which was fabricated and put on the market in 1972 and later redesigned as the 8080. Intel made the R&D money back in 5 months and sparked the idea for Ed Roberts to build the Altair 8800. Motorola and Zilog brought competition in the 6800 and Z-80, the latter of which was used in the Tandy TRS-80, one of the first mass produced computers. N-MOS transistors on chips allowed for new and faster paths, and MOS Technology soon joined the fray with the 6501 and 6502 chips in 1975. The 6502 ended up being the chip used in the Apple I, Apple II, NES, Atari 2600, BBC Micro, Commodore PET and Commodore VIC-20. The MOS 6510 variant was then used in the Commodore 64. The 8086 was released in 1978 with around 29,000 transistors and marked the transition to Intel's x86 line of chips, setting what would become the standard in future chips. But the IBM PC wasn't the only place you could find chips. The Motorola 68000 was used in the Sun-1 from Sun Microsystems, the HP 9000, the DEC VAXstation, the Commodore Amiga, the Apple Lisa, the Sinclair QL, the Sega Genesis, and the Mac. The chips were also used in the first HP LaserJet and the Apple LaserWriter and used in a number of embedded systems for years to come. As we rounded the corner into the 80s it was clear that the computer revolution was upon us. A number of computer companies were looking to do more than what they could do with the existing Intel, MOS, and Motorola chips. And ARPA was pushing the boundaries yet again. Carver Mead of Caltech and Lynn Conway of Xerox PARC saw the density of transistors in chips starting to plateau. So with DARPA funding they went out looking for ways to push the world into the VLSI era, or Very Large Scale Integration. The VLSI project resulted in the concept of fabless design houses, such as Broadcom, 32-bit graphics, BSD Unix, and RISC processors, or Reduced Instruction Set Computer processors. Out of the RISC work done at UC Berkeley came a number of new options for chips as well. One of these designers, Acorn Computers, evaluated a number of chips and decided to develop their own, using VLSI Technology (a company founded by more Fairchild Semiconductor alumni) to manufacture the chip in their foundry. Sophie Wilson (then Roger Wilson) worked on an instruction set for the RISC. Out of this came the Acorn RISC Machine, or ARM chip. Over 100 billion ARM processors have been produced, well over 10 for every human on the planet. You know that fancy new A13 that Apple announced? It uses a licensed ARM core.
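The transistor counts rattled off in this stretch (2,300 in the 4004, around 29,000 in the 8086, and so on) are the data behind Moore's observation from earlier. A quick sanity check of a simple doubling model in Python (the two-year doubling period is the commonly quoted later revision of the law, used here as an assumption rather than anything from this episode):

```python
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2.0):
    """Project transistor counts under a simple Moore's-Law doubling model."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1978, 1989, 1993, 2005):
    print(year, f"{transistors(year):,.0f}")
```

The projection lands in the neighborhood of the 8086 for 1978 and the million-transistor 486 for 1989, though real chips drift above and below the curve.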
Another chip that came out of the RISC family was Sun's SPARC. Sun being short for Stanford University Network, and with co-founder Andy Bechtolsheim close to the action, they released the SPARC in 1986. I still have a SPARC 20 I use for this and that at home. Not that SPARC has gone anywhere. They're just made by Oracle now. The Intel 80386 chip was a 32-bit microprocessor released in 1985. The first chip had 275,000 transistors, taking plenty of pages from the lessons learned in the VLSI projects. Compaq built a machine on it, but really the IBM PC/AT made it an accepted standard, although this was the beginning of the end of IBM's hold on the burgeoning computer industry. And AMD, yet another company founded by Fairchild defectors, created the Am386 in 1991, ending Intel's nearly 5 year monopoly on the PC clone industry and ending an era where AMD was a second source of Intel parts; now AMD was competing with Intel directly. We can thank AMD's aggressive competition with Intel for helping to keep the CPU industry going along Moore's Law! At this point transistors were only 1.5 microns in size. Much, much smaller than a cat's whisker. The Intel 80486 came in 1989 and, again tracking against Moore's Law, we hit the first 1 million transistor chip. Remember how Compaq helped end IBM's hold on the PC market? When the Intel 486 came along they went with AMD. This chip was also important because we got L1 caches, meaning that chips didn't need to send instructions to other parts of the motherboard but could do caching internally. From then on, the L1 and later L2 caches would be listed on all chips. We'd finally broken 100MHz! Motorola released the 68040 in 1990, hitting 1.2 million transistors, and giving Apple the chip that would define the Quadra, and also that L1 cache. The DEC Alpha came along in 1992, also a RISC chip, but really kicking off the 64-bit era. While the most technically advanced chip of the day, it never took off, and after DEC was acquired by Compaq and Compaq by HP, the IP for the Alpha was sold to Intel in 2001, by which point the PC industry had pretty much decided Intel could have all their money. But back to the 90s, 'cause life was better back when grunge was new. At this point, hobbyists knew what the CPU was but most normal people didn't. The concept that there was a whole Univac on one of these never occurred to most people. But then came the Pentium. Turns out that giving a chip a name and some marketing dollars not only made Intel a household name but solidified their hold on the chip market for decades to come. While the Intel Inside campaign started in 1991, after the Pentium was released in 1993, the case of most computers would have a sticker that said Intel Inside. Intel really one-upped everyone. The first Pentium, the P5 or 586 or 80501, had 3.1 million transistors that were 16.7 micrometers. Computers kept getting smaller and cheaper and faster. Apple answered by moving to the PowerPC chip from IBM, which owed much of its design to RISC. Exactly 10 years after the famous 1984 Super Bowl commercial, Apple was using a CPU from IBM. Another advance came in 2001, when IBM developed the POWER4 chip and gave the world multi-core processors, or a CPU that had multiple CPU cores inside the CPU. Once parallel processing caught up to being able to have processes that consumed the resources on all those cores, we saw Intel's Pentium D and AMD's Athlon 64 X2, released in May 2005, bringing multi-core architecture to the consumer.
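Multi-core only pays off once the work is actually split across cores, which is what that remark about parallel processing catching up is getting at. A hedged sketch using Python's standard multiprocessing pool (the workload is arbitrary; the speedup you see depends on how many cores your machine has):

```python
from multiprocessing import Pool
import time

def busy_sum(n):
    """An arbitrary CPU-bound task to keep one core busy."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 8

    start = time.perf_counter()
    serial = [busy_sum(n) for n in jobs]          # one core, one job at a time
    serial_time = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:                          # one worker per core by default
        parallel = pool.map(busy_sum, jobs)
    parallel_time = time.perf_counter() - start

    assert serial == parallel
    print(f"serial: {serial_time:.2f}s, parallel: {parallel_time:.2f}s")
```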
Multi-core designs led to even more parallel processing, and an explosion in the number of cores helped us continue on with Moore's Law. There are now custom chips that reach into the thousands of cores today, although most laptops have maybe 4 cores in them. Setting multi-core architectures aside for a moment, back to Y2K, when Justin Timberlake was still a part of NSYNC. Then came the Pentium Pro, Pentium II, Celeron, Pentium III, Xeon, Pentium M, Xeon LV, Pentium 4. On the IBM/Apple side, we got the G3 with 6.3 million transistors, G4 with 10.5 million transistors, and the G5 with 58 million transistors and 1,131 feet of copper interconnects, running at 3GHz in 2002 - so much copper that NSYNC broke up that year. The Pentium 4 that year ran at 2.4 GHz and sported 50 million transistors. This is about 1 transistor per dollar made off Star Trek: Nemesis in 2002. I guess Attack of the Clones was better because it grossed over 300 million that year. Remember how we broke the million transistor mark in 1989? In 2005, Intel started testing Montecito with certain customers. The Itanium 2 64-bit CPU had 1.72 billion transistors, shattering the billion mark and hitting a billion two years earlier than projected. Apple CEO Steve Jobs announced Apple would be moving to Intel processors that year. NeXTSTEP had been happy as a clam on Intel, SPARC or HP RISC, so given the rapid advancements from Intel, this seemed like a safe bet and allowed Apple to tell directors in IT departments "see, we play nice now." And the innovations kept flowing for the next decade and a half. We packed more transistors in, more cache, cleaner clean rooms, faster bus speeds, with Intel owning the computer CPU market and ARM slowly growing out of Acorn Computers into the powerhouse that ARM cores are today, embedded in other chip designs. I'd say not much interesting has happened, but it's ALL interesting, except the numbers just sound stupid they're so big. And we had more advances along the way of course, but it started to feel like we were just miniaturizing more and more, allowing us to do much more advanced computing in general. The fifth generation of computing is all about technologies that we today consider advanced: artificial intelligence, parallel computing, very high level computer languages, the migration away from desktops to laptops and even smaller devices like smartphones. ULSI, or Ultra Large Scale Integration, chips not only tell us that chip designers really have no creativity outside of chip architecture, but also mean millions up to tens of billions of transistors on silicon. At the time of this recording, the AMD Epyc Rome is the single chip package with the most transistors, at 32 billion. Silicon is the seventh most abundant element in the universe and the second most in the crust of the planet Earth. Given that there's more chips than people by a huge percentage, we're lucky we don't have to worry about running out any time soon! We skipped RAM in this episode. But it kinda' deserves its own, since RAM is still following Moore's Law, while the CPU is kinda' lagging again. Maybe it's time for our friends at DARPA to get the kids from Berkeley working on Very Ultra Large Scale chips, or VULSIs! Or they could sign on to sponsor this podcast! And now I'm going to go take a Very Ultra Large Scale nap. Gentle listeners, I hope you can do that as well. Unless you're driving while listening to this. Don't nap while driving. But do have a lovely day.
Thank you for listening to yet another episode of the History of Computing Podcast. We're so lucky to have you!
The most popular story involving technology away from the front lines of World War II explains how a British mathematician named Alan Turing created a methodology capable of deciphering German codes, and how England managed to reverse a bad situation on the battlefield using the intercepted messages. The so-called "Bletchley bombe," the machine built by Turing and his team at Bletchley Park, automated and accelerated the breaking of messages encoded by the Nazis' Enigma system. The bombe is the best-known war machine, but it is not the only one. On the other side of the Atlantic Ocean, the United States was also racing to develop a machine capable of quickly calculating ballistic trajectories, the arcs described by shells and bullets from the moment they leave the gun until impact. Humans took, on average, 30 hours to complete one ballistic trajectory. Since the armies needed dozens per day, the answer was to find some way to automate the work. In 1942, Professor John Mauchly, of the Moore School of Engineering in Philadelphia, proposed building what he called an "electronic calculator": hardware that would use vacuum tubes, the most modern technology of the time, to compute. The following year, the government approved the project and funded the so-called Project PX. Only in November 1945, when the war had already ended, was the project completed and given the name Electronic Numerical Integrator and Computer, the ENIAC.
One of the earliest computing devices was the abacus. This number crunching device can first be found in use by Sumerians, circa 2700 BC. The abacus can be found throughout Asia, the Middle East, and India throughout ancient history. Don't worry, the rate of innovation always speeds up as multiple technologies can be combined. Leonardo da Vinci sketched out the first known plans for a calculator. But it was the 17th century, or the early modern period in Europe, that gave us the Scientific Revolution. Names like Kepler, Leibniz, Boyle, Newton, and Hooke brought us calculus, telescopes, microscopes, and even electricity. The term computer is first found in 1613, describing a person that did computations. Wilhelm Schickard built the first calculator in 1623, which he described in a letter to Kepler. Opening the minds of humanity caused people like Blaise Pascal to theorize about vacuums, and he then did something very special: he built a mechanical calculator that could add and subtract numbers, do multiplication, and even division. And more important than building a prototype, he sold a few! His programming language was a lantern gear. It took him 50 prototypes and many years, but he presented the calculator in 1645, earning him a royal privilege in France for calculators. That's feudal French for a patent. Leibniz added repetition to the mechanical calculator in his Step Reckoner. And he was a huge proponent of binary, although he didn't use it in his mechanical calculator. Binary would become even more important later, when electronics came to computers. But as with many great innovations it took a while to percolate. In many ways, the age of enlightenment was taking the theories from the previous century and building on them. The early industrial revolution, though, was about automation. And so the mechanical calculator was finally ready for daily use in 1820 when another Frenchman, Colmar, built the arithmometer, based on Leibniz's design. A few years earlier, another innovation had occurred: memory. Memory came in the form of punch cards, an innovation that would go on to last until well after World War II. The Jacquard loom was used to weave textiles. The punch cards controlled how rods moved and thus were the basis of the pattern of the weave. Punching cards was an early form of programming: you recorded a set of instructions onto a card and the loom performed them. The bash programming of today is similar (there's a sketch of the idea below). Charles Babbage expanded on the ideas of Pascal and Leibniz and added to mechanical computing, making the difference engine, the inspiration of many a steampunk. Babbage had multiple engineers building components for the engine and, after he scrapped his first, he moved on to the analytical engine, adding conditional branching, loops, and memory - and further complicating the machine. The engine borrowed the punch card tech from the Jacquard loom and applied that same logic to math. Ada Lovelace contributed an algorithm for computing Bernoulli numbers, giving us a glimpse into what an open source collaboration might some day look like. And she was in many ways the first programmer - and daughter of Lord Byron and Anne Isabella Milbanke, a math whiz. She became fascinated with the engine and ended up becoming an expert at creating a set of instructions to punch on cards, thus the first programmer of the analytical engine and far before her time. In fact, there would be no programmer for 100 years with her depth of understanding. Not to make you feel inadequate, but she was 27 in 1843.
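The Jacquard card-as-program idea mentioned above is simple enough to show directly: each card row says which warp threads lift for one pass, and running the "program" produces the pattern. A toy illustration in Python (not a real weaving model; the card data is made up):

```python
# Each "card" is one row of holes: 1 lifts that warp thread, 0 leaves it down.
cards = [
    [1, 0, 1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1, 0, 0],
]

def weave(cards):
    """'Run' the cards: print one row of fabric per card, X for a lifted thread."""
    for card in cards:
        print("".join("X" if hole else "." for hole in card))

weave(cards)
```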
Babbage's engine was a bit too advanced for its time. While Babbage is credited as the father of computing because of his ideas, shipping is a feature. Having said that, it has been proven that if the build had been completed to specifications, the device would have worked. Sometimes the best of plans just can't be operationalized unless you reduce scope. Babbage added scope. He had trouble keeping contractors who could build complex machinery, and his interests ranged widely: he looked to tree rings to predict weather, and he was a mathematician who worked with keys and ciphers. As with Isaac Newton 150 years earlier, the British government also allowed a great scientist/engineer to reform a political institution: the postal system. You see, he was also an early proponent of applying the scientific method to the management and administration of governmental, commercial, and industrial processes. He also got one of the first government grants in R&D to help build the difference engine, although he ended up putting some of his own money in there as well, of course. Babbage died in 1871 and thus ended computing. For a bit. The typewriter came in 1874, as parts kept getting smaller and people kept tinkering with ideas to automate all the things. Herman Hollerith filed for a patent in 1884 to use a machine to punch and count punched cards. He used that first in health care management and then in the 1890 census. He later formed the Tabulating Machine Company, in 1896. In the meantime, Julius E. Pitrap patented a computing scale in 1885. William S. Burroughs (not that one, the other one) formed the American Arithmometer Company in 1886. Sales exploded for these and they merged, creating the Computing-Tabulating-Recording Company. Thomas J. Watson, Sr. joined the company as president in 1914 and expanded business, especially outside of the United States. The name of the company was changed to International Business Machines, or IBM for short, in 1924. Konrad Zuse built the first electric computer from 1936 to 1938 in his parents' living room. It was called the Z1. OK, so electric is a stretch, how about electromechanical… In 1936 Alan Turing proposed the Turing machine, which printed symbols on tape while simulating a human following a set of instructions. Maybe he accidentally found one of Ada Lovelace's old papers. The first truly programmable electric computer came in 1943, with Colossus, built by Tommy Flowers to break German codes. The first truly digital computer came from Professor John Vincent Atanasoff and his grad student Cliff Berry from Iowa State University. The ABC, or Atanasoff-Berry Computer, took from 1937 to 1942 to build and was the first to add vacuum tubes. The ENIAC came from J. Presper Eckert and John Mauchly from the University of Pennsylvania from 1943 to 1946: 1,800 square feet, ten times that many vacuum tubes, and a weight of 50 tons. ENIAC is considered to be the first digital computer because unlike the ABC it was fully functional. The Small-Scale Experimental Machine from Frederic Williams and Tom Kilburn at the University of Manchester came in 1948 and added the ability to store and execute a program. That program was run by Tom Kilburn on June 21st, 1948. Up to this point, the computer devices were being built in universities, with the exception of the Z1. But in 1950, Konrad Zuse sold the Z4, thus creating the commercial computer industry. IBM got into the business of selling computers in 1952 as well, basically outright owning the market until grunge killed the suit in the 90s.
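The Turing machine mentioned above (a tape, a read/write head, and a table of rules) is also small enough to sketch. Here is a minimal simulator in Python running an invented rule table that walks right and flips bits until it hits a blank:

```python
def run_turing_machine(tape, rules, state="start", blank="_", max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left), 0 (stay), or +1 (right). Halts on state 'halt'.
    """
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return "".join(tape[i] for i in sorted(tape))

# A toy machine that walks right, flipping 0 <-> 1 until it reaches a blank cell.
flip_rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}

print(run_turing_machine("10110", flip_rules))  # 01001_
```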
MIT added RAM in 1955 and then transistors in 1956. The PDP-1 was released in 1960 from Digital Equipment Corporation (DEC). This was the first minicomputer. My first computer was a DEC. Pier Giorgio Perotto introduced the first desktop computer, the Programma 101, in 1964. HP began to sell the HP 9100A in 1968. All of this steam led to the first microprocessor, the Intel 4004, being released in 1971. The first truly personal computer was released in 1975 by Ed Roberts, who was the first to call it that. It was the Altair 8800. The IBM 5100 was the first portable computer, released the same year. I guess it's portable if 55 pounds is considered portable. And the end of ancient history came the next year, when the Apple I was developed by Steve Wozniak, which I've always considered the date that the modern era of computing began.
John Mauchly and Presper Eckert designed and built the first digital, electronic computer. Mauchly and Eckert met by chance in 1941 at the University of Pennsylvania's Moore School of Engineering. They soon developed a revolutionary vision: to use electricity as a means of computing - in other words, to make electricity "think." Ignored by their colleagues, in early 1943 they were fortuitously discovered and funded by the U.S. Army, itself in urgent need of a machine that could quickly calculate ballistic trajectories for the war in Europe and Africa.
Each week I undertake a bit of a historical deep dive into the life of someone I would consider a Thinker. These thinkers are often leaders, as you might imagine. You will of course agree with me that not every great thinker is a leader; there are differences, and the relationship is not a bi-directional equivalency. The reverse, though, I believe is quite often true: great leaders are often thinkers. I focused this week on an individual who may not be immediately recognized as a leader, but I think even in this we do him an injustice. John Mauchly was a leader in a very important way and exhibited a leadership trait which I believe we would do well to learn from and understand better. Great leaders understand the importance of helping others find success. They recognize their position and knowledge can be used to influence others.
- Great leaders show discretion in how they share their opinions and ideas.
- Great leaders use their position and knowledge to benefit others.
- Great leaders
From a small building in Pennsylvania to widespread usage across the world, we track the compelling story of one of the greatest technological innovations in history, setting the stage for the age of data science. Ginette: “I’m Ginette.” Curtis: “And I’m Curtis.” Ginette: “And you are listening to Data Crunch.” Curtis: “A podcast about how data and prediction shape our world.” Ginette: “A Vault Analytics production.” Ginette: “Today our story starts at a business building.” Curtis: “The building is in Philadelphia, Pennsylvania, on Broad and Spring Garden Streets to be precise. Envision the late 1940s.” Ginette: “You see a man absorbed in thought entering the building, and you decide to follow him in.” Curtis: “When you walk through his office, you find some bright engineering minds working on a fairly new startup in town: the Eckert-Mauchly Computer Corporation, or EMCC. It turns out, this is the very first large-scale computer business in the United States.” Ginette: “While this business environment on the surface is vibrant and innovative, behind the scenes, it’s a pressure cooker full of confusion.” Curtis: “The owners, John Mauchly, who you followed into the office, and his business partner, J. Presper Eckert, are talking about something strange that’s been happening: most of their clients had been from the government, and now they’re quietly pulling away from doing business with EMCC without any explanation, which is both alarming and confusing to the business owners. It’d be one thing if the government gave a reason each time it pulled out of a contract, but without one, they have no idea what’s wrong or how to try and fix the situation. It’s like going through several breakups where the only explanation offered is, ‘it’s not you; it’s me.’ “So what’s actually going on here?” Ginette: “The answer is woven into John’s backstory, a backstory that also includes the story of the ENIAC, the very first fully electric general purpose computer. “In John’s earlier career, he was involved with scientific clubs and academia. He started as an engineer and eventually became a professor at the prestigious Moore School of Engineering at UPENN. At one point, he got lucky. He asked essentially this question to the right military person on campus: what if I could build a machine that would significantly reduce your trajectory calculation time for projectiles?” Curtis: “So the military ends up formally accepting his proposal, and John and Presper team up for three years on this top-secret military project to build the ENIAC. “At the time, the ENIAC is really impressive in both size and ability. It weighs about the same as nine adult elephants, which is 27 tons, and it has about 17,500 vacuum tubes, each about the size of your average household light bulb. It has 5,000,000 hand-soldered joints. And it’s the size of a small house—about 1,800 square feet. And in today’s dollars, it costs about $7 million. “It’s the very first of its kind. It’s both completely electric and a general purpose machine, meaning you can use it to calculate almost anything as long as you give it the right parameters. The bottom line is that it’s a lot faster than anything before it. It’s 2,400 times faster than human computers, and 1,000 times faster than any other type of machine computer at the time. For example, it took the calculation of a 60-second projectile down from 20 hours to just 30 seconds.
To understand the magnitude of this, it's like moving from an average snail’s pace to the average speed of a car on a highway.” Ginette: “Here’s another way to look at this: if you drive your car (the ENIAC) across the country from L.A. to New York City at about 70 miles per hour without stopping, it would take you a little over a day and a half to drive there. In contrast, it’d take a snail (the human computer) without stopping about 11 years.” Curtis: “So it turns out the ENIAC isn’t ready in time f...
November 13, 2015 - Today we're going to hear from writer/journalist Walter Isaacson, president and CEO of the Aspen Institute and author of The Innovators: How a Group of Inventors, Hackers, Geniuses and Geeks Created the Digital Revolution, and the acclaimed biography: Steve Jobs. The Innovators includes names like Grace Hopper, Lord Byron’s daughter, Bletchley Park's Alan Turing, ENIAC, John Mauchly, J. Presper Eckert, and many others that gave us the computer devices we find indispensable to modern life. And remember to subscribe to the History Author Show on iTunes, like our iHeartRadio page, or make us appointment listening on your Android device, so you don’t miss an installment of History in Five Friday. It’s the perfect way to kick off your modern weekend… with people from the past.
The Electronic Numerical Integrator and Computer, or ENIAC, was created under the direction of John Mauchly and J. Presper Eckert of Penn's Moore School of Electrical Engineering (now the School of Engineering and Applied Science). Construction of the 27-ton, 680-square-foot computer began in July 1943 and was announced to the public on Feb. 14, 1946. It was built to calculate ballistic trajectories for the Army during World War II, a time- and labor-intensive process that had previously been performed by teams of mathematicians working with mechanical calculators. ENIAC stored information in the form of electrons trapped in vacuum tubes, making it the first all-electronic, general-purpose digital computer. The long string of adjectives distinguishes it from earlier mechanical computers, which were essentially gear-driven abacuses that could aid in complex math but could only calculate a small subset of equations.