Podcasts about Pentium III

  • 11 PODCASTS
  • 14 EPISODES
  • 1h 17m AVG DURATION
  • ? INFREQUENT EPISODES
  • Oct 15, 2023 LATEST

POPULARITY

[Popularity chart, 2017–2024]


Best podcasts about Pentium III

Latest podcast episodes about Pentium III

TNT Radio
Fred Litwin on Sky Dragon Slaying - 15 October 2023

TNT Radio

Oct 15, 2023 · 55:48


On today's show, Fred Litwin discusses theories about the JFK assassination. GUEST OVERVIEW: Fred Litwin, author of Oliver Stone's Film-Flam: The Demagogue of Dealey Plaza, is a marketing professional who worked nine years for Intel Corporation. In 1998-1999, Fred managed a team of twenty people organizing the launch of the Pentium III in Asia. Prior to joining Intel, Fred was Vice-President of Sales for LAN Systems in New York City.
https://www.onthetrailofdelusion.com/ · X: @FredLitwin

RETROMATICA
Este cartucho no es un juego

RETROMATICA

Sep 19, 2023 · 26:27


Slot 1, Slot 2, and Slot A. They tried, but they never took off. A story of the Pentium II, Pentium III, and Athlon chips that didn't succeed.

Screaming in the Cloud
Hacking Old Hardware and Developer Advocate Presentations with Darko Mesaroš

Screaming in the Cloud

Apr 18, 2023 · 27:46


Darko Mesaroš, Senior Developer Advocate at AWS, joins Corey on Screaming in the Cloud to discuss all the weird and wonderful things that can be done with old hardware, as well as the necessary skills for being a successful Developer Advocate. Darko walks through how he managed to deploy Kubernetes on a computer from 1986, as well as the trade-offs we've made in computer technology as hardware has progressed. Corey and Darko also explore the forgotten art of optimizing when you're developing, and how it can help to cut costs. Darko also shares what he feels is the key skill every Developer Advocate needs to have, and walks through how he has structured his presentations to ensure he is captivating and delivering value to his audience.

About Darko: Darko is a Senior Developer Advocate based in Seattle, WA. His goal is to share his passion and technological know-how with engineers, developers, builders, and tech enthusiasts across the world. If it can be automated, Darko will definitely try to do so. Most of his focus is on DevOps and management tools, where automation, pipelines, and efficient developer tools are the name of the game: click less and code more so you do not repeat yourself! Darko also collects a lot of old technology and tries to make it do what it should not, like deploy AWS infrastructure through a Commodore 64.

Links Referenced:
  • AWS: https://aws.amazon.com/
  • Blog post RE deploying Kubernetes on a TRS-80: https://www.buildon.aws/posts/i-deployed-kubernetes-with-a-1986-tandy-102-portable-computer
  • AWS Twitch: https://twitch.tv/aws
  • Twitter: https://twitter.com/darkosubotica
  • Mastodon: https://hachyderm.io/@darkosubotica

Transcript

Announcer: Hello, and welcome to Screaming in the Cloud with your host, Chief Cloud Economist at The Duckbill Group, Corey Quinn. This weekly show features conversations with people doing interesting work in the world of cloud, thoughtful commentary on the state of the technical world, and ridiculous titles for which Corey refuses to apologize. This is Screaming in the Cloud.

Corey: This episode is sponsored in part by our friends at Chronosphere. When it costs more money and time to observe your environment than it does to build it, there's a problem. With Chronosphere, you can shape and transform observability data based on need, context and utility. Learn how to only store the useful data you need to see in order to reduce costs and improve performance at chronosphere.io/corey-quinn. That's chronosphere.io/corey-quinn. And my thanks to them for sponsoring my ridiculous nonsense.

Corey: Do you wish your developers had less permanent access to AWS? Has the complexity of Amazon's reference architecture for temporary elevated access caused you to sob uncontrollably? With Sym, you can protect your cloud infrastructure with customizable, just-in-time access workflows that can be set up in minutes. By automating the access request lifecycle, Sym helps you reduce the scope of default access while keeping your developers moving quickly. Say goodbye to your cloud access woes with Sym. Go to symops.com/corey to learn more. That's S-Y-M-O-P-S.com/corey

Corey: Welcome to Screaming in the Cloud. I'm Cloud Economist Corey Quinn and my guest today is almost as bizarre as I am, in a somewhat similar direction. Darko Mesaroš is a Senior Developer Advocate at AWS. And instead of following my path of inappropriately using things as databases that weren't designed to be used that way, he instead uses the latest of technology with the earliest of computers.
Darko, thank you for joining me.

Darko: Thank you so much, Corey. First of all, you know, you tell me, Darko is a senior developer advocate. No, Corey. I'm a system administrator by heart. I happen to be a developer advocate these days, but I was born in the cold, cold racks of a data center. I've maintained systems, I've installed packages on Linux systems. I even set up Solaris Zones a long time ago. So yeah, but I happen to yell into the camera these days, [laugh] so thank you for having me here.

Corey: No, no, it goes well. I started my career as a sysadmin. And honestly, my opinion, if you asked me—which no one does, but I share it anyway—is that the difference between an SRE and a sysadmin is about a 40% salary bump.

Darko: Exactly.

Corey: That's about it. It is effectively the same job. The tools are different, the approach we take is different, but the fundamental mandate of 'keep the site up' has not materially changed.

Darko: It has not. I don't know, like, what the modern SREs do, but like, I used to also semi-maintain AC units. Like, you have to walk around with a screwdriver nonetheless, so sometimes, besides just installing the freshest packages on your Red Hat 4 system, you have to also change the filters in the AC. So, not sure if that belongs in the SRE manifesto these days.

Corey: Well, the reason that I wound up inviting you onto the show was a recent blog post you put up where you were able to deploy Kubernetes from the best computer from 1986, which is the TRS-80, or the Trash-80. For the record, the worst computer from 1986 was—and remains—IBM Cloud. But that's neither here nor there.

What does it mean to deploy Kubernetes? Because, to be direct, the way that I tend to deploy anything these days, if, you know, I'm sensible and being grown up about it, is a Git push and then the automation takes it away from there. I get the sense you went a little bit deep.

Darko: So, when it comes to deploying stuff from an old computer, like, you know, you kind of said the right thing here: like, I have the best computer from 1986. Actually, it's a portable version of the best computer from 1986; it's a TRS-80 Model 102. It's a portable, basically a little computer intended for journalists and people on the go to write stuff and send emails or whatever it was back in those days. And I deployed Kubernetes through that system. Now, of course, I cheated a bit because the way I did it is I just used it as a glorified terminal.

I just hooked up the RS-232, the wonderful serial connection, to a Raspberry Pi somewhere out there and it just showed the stuff from a Raspberry Pi onto the TRS-80. So, the TRS-80 didn't actually know how to run kubectl—or 'kube cuddle,' as they call it—it just asked somebody else to do it. But that's kind of the magic of it.

Corey: You could have done a Lambda deployment then just as easily.

Darko: Absolutely. Like, that's the magic of these old hunks of junk: when you get down to it, they still do things with numbers and transmit electrical signals through some wires somewhere out there. So, if you're capable enough, if you are savvy, or if you just have a lot of time, you can take any old computer and have it do modern things, especially now. Like, and I will say, 15 years ago we could not have done anything like this because 15 years ago a lot of the stuff at least that I was involved with, which was Microsoft products, was click-only. I couldn't, for the love of me, deploy a bunch of stuff on an Active Directory domain by using a command line.
PowerShell was not a thing back then. You could use VBScript, but sort of.

Corey: Couldn't you wind up using something like Selenium or whatnot that winds up emulating a user session and moving the mouse to certain coordinates and clicking, and then waiting some arbitrary time and clicking somewhere else?

Darko: Yes.

Corey: Which sounds like the absolute worst version of automation ever. That's like, "I deployed Kubernetes using a typewriter." "Well, how the hell did you do that?" "Oh, I used the typewriter to hit the enter key. Problem solved." But I don't think that counts.

Darko: Well, yeah, so actually even back then, like, taking a 10-, 12-year step back in my career, I automated stuff on Windows systems—like Windows 2000 and Windows 2003 systems—with a tool called AutoIt. It would literally emulate clicks of a mouse on a specific location on the screen. So, you were just really hoping that window pops up at the same place all the time. Otherwise, your automation doesn't work. So yeah, it was kind of like that.

And so, if you look at it that way, I could take my Trash-80, I could write an AutoIt script with specific coordinates, and I could deploy Windows things. So actually, yeah, you can deploy anything these days with an old computer.

Corey: I think that we've lost something in the world of computers. If I, like, throw a computer at you these days, you're going to be pretty annoyed with me. Those things are expensive, it'll probably break, et cetera. If I throw a computer from this era at you, your family is taking bereavement leave. Like, with those things there would be no second hit.

These things were beefy. There was a sense of solidity to them. The keyboards were phenomenal. We've been chasing that high ever since. And, yeah, they were obnoxiously heavy and the battery life was 20 seconds, but it was still something that—you felt like it was computer time. And now, all these things have faded into the background. I am not protesting the march of progress, particularly in this particular respect, but I do miss the sense of having keyboards that weren't overwhelmingly flimsy plastic.

Darko: I think it's just a fact of, like, we have computers as commodities these days. Back then computers were workstations, computers were something you would buy to perform specific tasks. Today, a computer is anything from watching Twitch, to going on Twitter, to complaining about Twitter, to deploying Kubernetes, right? So, they have become such commodities that… I don't want to call them single-use items, but they're becoming more like single-use items as time progresses because they're just not repairable anymore. Like, if you give me a computer that's five years old, I don't know what to do with it. I probably cannot fix it if it's broken. But if you give me a computer that's 35 years old, I bet you I can fix it no matter what happened.

Corey: And the sheer compute changes have come so fast and furious, it's easy to lose sight of them, especially with branding being more or less the same. But I saved up and took an additional loan out when I graduated high school to spend three grand on a Dell Inspiron laptop, this big beefy thing. And for fun, I checked the specs recently, and yep, that's a Raspberry Pi these days; they're $30, and it's not going to work super well to browse the web because it's underpowered.
And I'm sitting here realizing, wait a minute, even with a modern computer—forget the Raspberry Pi for a second—I'm sitting here and I'm pulling up web pages or opening Slack, or God forbid, Slack and Chrome simultaneously, and the fan spins up and it sounds incredibly anemic. And it's, these things are magical supercomputers from the future. Why are they churning this hard to show me a funny picture of a cat? What's going on here?

Darko: So, my theory on this is… because we can. We can argue about this, but we currently—

Corey: Oh, I think you're right.

Darko: We have unlimited compute capacity in the world. Like, you can come up with an idea, you're probably going to find a supercomputer out there, you're probably going to find a cloud vendor out there that's going to give you all of the resources you need to perform this massive computation. So, we didn't really think about optimization as much as we used to in the past. So, that's it: we can. Chrome doesn't care. You have 32 gigs of RAM, Corey. It doesn't care that it takes 28 gigs of that because you have—

Corey: I have 128 gigs on this thing. I bought the Mac Studio and maxed it out. I gave it the hostname of us-shitpost-1 and we run with it.

Darko: [laugh]. There you go. But like, I did some fiddling around, like, recently with—and again, this is just to torture myself—I did some 6502 assembly for the Atari 2600. The 6502 is a CPU that's been used in many things, including the Commodore 64, the NES, and even a whole lot of Apple IIs, and whatnot. So, when you go down to the level of a computer that has 1.19 megahertz and only 128 bytes of RAM, you start to think about, okay, I can move these two numbers in memory in the following two ways: "Way number one will require four CPU cycles. Way number two will require seven CPU cycles. I'll go with way number one because it will save me three CPU cycles."

Corey: Oh, yeah. You take a look at some of the most advanced computer engineering out there and it's for embedded devices where—

Darko: Yeah.

Corey: You need to wind up building code to run in some very tight constraints, and that breeds creativity. And I remember those days. These days, it's, well, my computer is super-overpowered, what's it matter? In fact, when I go in and I look at customers' AWS bills, very often I'll start doing some digging, and sure enough, EC2 is always the number one expense—we accept that—but we take a look at the breakdown and invariably, there's one instance family and size that is the overwhelming majority, in most cases. You'll often see a—I don't know—a c5.2xl or something or whatever it happens to be.

Great. Why is that? And the answer—[unintelligible 00:10:17] to make sense is, "Well, we just started with that size and it seemed to work so we kept using it as our default." When I'm building things, because I'm cheap, I take one of the smallest instances I possibly can—it used to be one of the Nanos and I'm sorry, half a gig or a gig of RAM is no longer really sufficient when I'm trying to build almost anything. Thanks, JavaScript. So okay, I've gone up a little bit.

But at that point, when I need to do something that requires something beefier, well, I can provision those resources, but I don't have it as a default.
That forces me to, at least in the back of my mind, have a little bit of a sense that I should be parsimonious with what it is that I'm provisioning out there, which is apparently anathema to every data scientist I've ever met, but here we are.

Darko: I mean, that's the thing: because we're so used to just having those resources, we don't really care about optimizations. Like, I'm not advocating that you all should go and just do assembly language. You should never do that, like, unless you're building embedded systems or you're working for something—

Corey: If you need to use that level of programming, you know.

Darko: Exactly.

Corey: You already know, and nothing you are going to talk about here is going to impact what people in that position are doing. Mostly you need to know assembly language because that's a weeder class in a lot of comp-sci programs and if you don't pass it, you don't graduate. That's the only reason to really know assembly language most of the time.

Darko: But you know, like, it's also a thing, like, as a developer, right, think about the person using your thing, right? And they may have the 128-gig us—what is it you called it? us-shitpost-1, right—that kind of power, kind of, the latest and greatest M2 Max Ultra Apple computer that just does all of the stuff. You may have a big ol' double Xeon workstation that does a thing.

Or you just may have a Chromebook. Think about us with Chromebooks. Like, can I run your website properly? Did you really need all of those animations? Can you think about reducing the amount of animations depending on screen size? So, there's a lot of things that we need to kind of think about. Like, it goes back to the thing where 'it works on my machine.' Oh, of course it works on your machine. You spent thousands of dollars on your machine. It's the best machine in the world. Of course, it runs smoothly.

Corey: Wait 20 minutes and they'll release a new one, and now, "Who sold me this ancient piece of crap?" Honestly, the most depressing thing is watching an Apple keynote, because I love my computer until I watch the Apple keynote, and it's like, oh, like, "Look at this amazing keyboard," and the keyboard I had was fine. It's like, "Who sold me this rickety piece of garbage?" And then we saw how the Apple butterfly keyboard worked out for everyone and who built that rickety piece of garbage. Let's go back again. And here we are.

Darko: Exactly. So, that's kind of the thing, right? You know, like, your computer is the best. And if you develop for it, that's great, but you always have to think of the other people who use it. Hence, containers are great to fix one part of that problem, but not all of the problems. So, there's a bunch of stuff you can do.

And I think, like, for all of the developers out there, it's great what you're doing, you're building us so many tools, but always take a step back and optimize stuff. Optimize both for the end-user, by the amount of JavaScript you're going to throw at me, and also for the back-end. Think about it: if you had to run your web server on a Pentium III server, could you do it? And if you could, how bad would it be? And you don't have to run it on a Pentium III, but like, try to think about what's the bottom 5% of the capacity you need. So yeah, it's just—you'll save money. That's it. You'll save money, ultimately.

Corey: So, I have to ask: what you do day to day is you're a senior developer advocate, which is, hmm, some words, yes.
You spend a lot of your free time and public time talking about running ancient computers, but you also talk to customers who are looking forward, not back. How do you reconcile the two?

Darko: So, I like to mix the two. There's a whole reason why I like old computers. Like, I grew up in Serbia. Like, when I was young in the '90s, I didn't have any of these computers. Like, I could only see, like, what was like a Macintosh from 1997 on TV and I would just drool. Like, I wouldn't even come close to thinking about getting that, let alone something better.

So, I kind of missed all of that part. But now that I've started collecting all of those old computers and just everything from the '80s and '90s, I've actually realized, well, these things are not that different from something else. So, I like to always make comparisons between, like, an old system—what does it actually do?—and how it compares to a new system.

So, I love to mix and match in my presentations. I like to mix and match in my videos. You saw my blog posts on deploying stuff. So, I think it's just a fun way to kind of create a little contrast. I do think we should still be moving forward. I do think that technology is getting better and better and it's going to help people do so many more things faster, hopefully cheaper, and hopefully better.

So, I do think that we should definitely keep on moving forward. But I always have this nostalgic feeling about, like, old things and… sometimes I don't know why, but I miss the world without the internet. And I think that—no, I think I miss the world with dial-up internet. Because back then you would go on the internet for a purpose. You have to do a thing, you have to wait for a while, you have to make sure nobody's on the phone. And then—

Corey: God forbid you dial into a long-distance call. And you have to figure out which town and which number would be long distance versus not, at least where I grew up, and your parents would lose their freaking minds because that was an $8 phone call, which, you know, back in the '80s and early '90s was significant. And yeah, great. Now, I still think it's a great prank opportunity to teach kids or something that it costs more to access websites that are far away, which I guess in theory it kind of does, but not to the end-user. I digress.

Darko: I have a story about this, and I'm going to take a little sidestep. But long-distance phone calls. Like, in the '80s, the World Wide Web was not yet a thing. Like, the www, the websites, all, just the general-purpose internet was not yet a thing. We had things called BBSes, or Bulletin Board Systems. That was the extreme version of a dial-up system.

You don't dial into the internet; you dial into a website. Imagine if you have the sole intent of visiting only one website, and the cost of visiting such a website would depend on where that website currently is. If the website is in Germany and you're calling from Serbia, it's going to cost you a lot of money because you're calling internationally. I had a friend back then. The best software you could get was from American BBSes, but calling America from Serbia back then would have been prohibitively expensive, like, just insanely expensive.

So, what this friend used to do, he figured out if he would be connected to a BBS six hours a day, it would actually reset the counter of his phone bill. It would loop through a mechanical counter from whatever number, it would loop back again to that number.
So, it would take around six and some hours to complete the loop of the entire phone counting metric—whatever they used back in the '80s—to kind of charge your bill, so it effectively cost him zero money back then. So yeah, it was more expensive, kids, back then to call websites, the further away the websites were.

[midroll 00:17:11]

Corey: So, developer advocates do a lot of things. And I think it is unfair, but also true, that people tend to shorthand those things to getting on stage and giving conference talks, because that at least is the visible part of it. People see that and it is viscerally understood that that takes work and a bit of courage for those who are not deep into public speaking, and those who are know it takes a lot of courage. Whereas writing a blog post: "Well, I have a keyboard and say dumb things on the internet all the time. I don't see why that's hard." So, there's a definite perception story there. What's your take on giving technical presentations?

Darko: So, yeah. Just as you said, like, I think being a DA, even in my head, was always represented as, like, oh, you're just on stage, you're traveling, you're doing presentations, you're doing all those things. But it's actually quite a lot more than that, right? We do a lot more. But still, we are the developer advocates. We are the front-facing thing towards you, the wonderful developers listening to this.

And we tend to be on stage, we tend to do podcasts with wonderful internet personalities, we tend to do live streams, we tend to do videos. And I think one of the key skills that a DA needs to have—a Developer Advocate needs to have—is presentations, right? You need to be able to present a technical message in the best possible way. Now, being a good technical presenter doesn't mean you're funny, doesn't mean you're entertaining; that doesn't have to be a thing. You just need to take a complex technical message and deliver it in the best way possible so that everybody who has just given you their time can get it fully.

And this means—well, it means a lot of things, but it means taking this complicated topic, distilling it down so it can be digested within 30 to 45 minutes, and it also needs to be… it needs to be interesting. Like, we can talk about the most interesting topic, but if I don't make it interesting, you're just going to walk out. So, I also lead, like, a coaching class internally, to teach people how to speak better, and I'm working with, like, really good speakers there, but a lot of the stuff I say applies no matter if you're a top-level speaker or if you're, like, just beginning out. And my challenge to all of you speakers out there, like, anybody who's listening to this and has a plan to deliver a video, a keynote, a live stream, or speak at a summit somewhere, is: get outside of that box. Get outside of that PowerPoint box.

I'm not saying PowerPoint is bad. I think PowerPoint is a wonderful tool, but I'm just saying you don't have to present the way everybody else presents. The more memorable your presentation is, the more outside of that box it is, the more people will remember it. Again, you don't have to be funny. You don't have to be entertaining. You just have to take the thing you are really passionate about and deliver it to us in the best possible way. What that best possible way is, well, it really depends.
Like a lot of things, there is no concrete answer to this.

Corey: One of the hard parts I found is that people will see a certain technical presenter that they like and want to emulate, and they'll start trying to do what they do. And that works to a point. Like, "Well, I really enjoy how that presenter doesn't read their slides." Yeah, that's a good thing to pick up. But past a certain point, other people's material starts to fit as well as other people's shoes, and you've got to find your own path.

My path has always been getting people's attention first via humor, but it's certainly not the only way. In many contexts, it's not even the most effective way. It works for me in the context in which I use it, but I assure you that when I'm presenting to clients, I don't start off with slapstick comedy. Usually. There are a couple of noteworthy exceptions because clients expect that from me, in some cases.

Darko: I think one of the important things is that emulating somebody is okay, as you said, to an extent, like, just trying to figure out what the good things are—but good, very objectively good things. Never try to be funny if you're not funny. That's the thing where you can try comedy, but it's very difficult to do comedy if you're not that good at it. And I know that's very much a given, but a lot of people try to be funny when they're obviously not funny. And that's okay. You don't have to be funny.

So, there are many ways to get people's attention besides, again, just throwing a joke. What I did once on stage: I said a thing and threw a bottle at the floor. Everybody started paying attention to me all of a sudden. I don't know why. So, it can be that. It can be something—it can be a shocking statement. When I say shocking, I mean something, well, not bad, but something that's potentially controversial. Like, for example, emacs is better than vim. I don't know, maybe—

Corey: "Serverless is terrible."

Darko: Serverl—yeah.

Corey: Like, it doesn't matter. It depends on the audience.

Darko: It depends on the audience.

Corey: "The cloud is a scam." I gave a talk once called, "The Cloud is A Scam," and it certainly got people's attention.

Darko: Absolutely. So, breaking up the normal flow. Because as a participant of a show, of a presentation, you go there and you expect: look, I'm going to sit down, Corey's going to come on stage, and Corey says, "Hi, my name is Corey Quinn. I'm the CEO of The Duckbill Group. This is what I do. And welcome to my talk about blah."

Corey: Technically, my business partner, Mike, is the CEO. I don't want to step too close to that fire, let's be clear.

Darko: Oh, okay [laugh]. Okay. Then, "Today's agenda is this. And slide one, slide two, slide three." And that's the expectation of the audience. And the audience comes in in this very autopilot way, like, "Okay, I'm just going to sit there and just nod my head as Corey speaks."

But then if Corey does a weird thing and Corey comes out in a bathtub. Just the bathtub and Corey. And Corey starts talking about how bathtubs are amazing, it's the best place to relax. "Oh, by the way, managing costs in the cloud is so easy, I can do it from a bathtub." Right? All of a sudden, whoa [laugh], wait a second, this is something that's interesting. And then you can go through the rest of your conversation. But you just ticked a box in our heads, like, "Oh, this is something weird. This is different.
I don't know what to expect anymore," and people start paying more attention.

Corey: "So, if you're managing AWS costs from your bathtub, what kind of computer do you use?" "In my case, a toaster."

Darko: [laugh]. Yes. But ultimately, like, some of those things are very good and they just kind of—they make you, as a presenter, unpredictable, and that's a good thing. Because people will just want to sit on the edge of their seat and, like, listen to what you say because, I don't know, maybe he throws that toaster in, right? I don't know. So, it is like that.

And one of the things that you'll notice, Corey, especially if you see people who have been presenting for a longer time, like, they've been very common at events and people know them by name and their face, is that it turns into, like, not just presenting, but somebody comes literally not because of the topic, but because they want to hear Corey talk about a thing. You can go there and talk about unicorns and cats; people will still come and listen to that because it's Corey Quinn. And that's where you, by getting outside of that box, getting outside of that 'this is how we present things at company X,' this is what you get in the long run. People will know who you are, people will know what not to expect from your presentations, and they will ultimately be coming to your presentations to enjoy whatever you want to talk about.

Corey: That is the dream. I really want to thank you for taking the time to talk so much about how you view the world and the state of ancient and modern technologies and the like. If people want to learn more, where's the best place for them to find you?

Darko: The best way to find me is on twitch.tv/aws these days. So, you will find me live streaming twice a week there. You will find me on Twitter at @darkosubotica, which is my Twitter handle. You will find me at the same handle on Mastodon. And just search for my name, Darko Mesaroš; I'm sure I'll pop up on MySpace as well or whatever. So, I post a lot of cloud-related things. I post a lot of old computer-related things, so if you want to see me deploy Kubernetes through an Atari 2600, click that subscribe button or follow or whatever.

Corey: And we will, of course, include a link to this in the show notes. Thank you so much for being so generous with your time. I appreciate it.

Darko: Thank you so much, Corey, for having me.

Corey: Darko Mesaroš, Senior Developer Advocate at AWS. I'm Cloud Economist Corey Quinn, and this is Screaming in the Cloud. If you've enjoyed this podcast, please leave a five-star review on your podcast platform of choice, whereas if you've hated this podcast, please leave a five-star review on your podcast platform of choice along with an angry and insulting comment that you compose and submit from your IBM Selectric typewriter.

Corey: If your AWS bill keeps rising and your blood pressure is doing the same, then you need The Duckbill Group. We help companies fix their AWS bill by making it smaller and less horrifying. The Duckbill Group works for you, not AWS. We tailor recommendations to your business and we get to the point. Visit duckbillgroup.com to get started.
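For readers who want to picture the "glorified terminal" setup Darko describes, here is a minimal sketch (not his actual code) of the Raspberry Pi side: a Python script that reads lines from the serial port, runs each one as a shell command (kubectl or anything else), and writes the output back down the wire to the TRS-80. The pyserial package and the port name /dev/ttyUSB0 are assumptions; in practice a serial login shell (getty) on the Pi does the same job with no code at all.

```python
# Sketch of a serial-to-shell bridge: the old computer is just a dumb terminal,
# and the Raspberry Pi does all the real work. Assumes pyserial is installed
# and a hypothetical USB-to-RS-232 adapter at /dev/ttyUSB0.
import subprocess

import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"  # hypothetical; depends on your serial adapter
BAUD = 9600            # old hardware likes slow, conservative baud rates


def main() -> None:
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        while True:
            # Read one line typed on the vintage machine's keyboard.
            line = link.readline().decode("ascii", errors="replace").strip()
            if not line:
                continue  # nothing typed yet; keep polling
            # Run the received command on the Pi and capture its output.
            result = subprocess.run(line, shell=True, capture_output=True, text=True)
            # Send stdout and stderr back for the old screen to display.
            link.write((result.stdout + result.stderr).encode("ascii", "replace"))


if __name__ == "__main__":
    main()
```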

The History of Computing
The Story of Intel

The History of Computing

Mar 7, 2023 · 16:51


We've talked about the history of microchips, transistors, and other chip makers. Today we're going to talk about Intel in a little more detail.

Intel is short for Integrated Electronics. They were founded in 1968 by Robert Noyce and Gordon Moore. Noyce was an Iowa kid who went off to MIT to get a PhD in physics in 1953. He joined the Shockley Semiconductor Lab to work with William Shockley, who'd developed the transistor as a means of bringing a solid-state alternative to vacuum tubes in computers and amplifiers. Shockley became erratic after he won the Nobel Prize and 8 of the researchers left, now known as the "traitorous eight." Between them came over 60 companies, including Intel—but first they went on to create a new company called Fairchild Semiconductor, where Noyce invented the monolithic integrated circuit in 1959: a single chip that contains multiple transistors.

After 10 years at Fairchild, Noyce joined up with coworker and fellow traitor Gordon Moore. Moore had gotten his PhD in chemistry from Caltech and had made an observation while at Fairchild that the number of transistors, resistors, diodes, or capacitors in an integrated circuit was doubling every year, and so coined Moore's Law: that it would continue to do so. They wanted to make semiconductor memory cheaper and more practical, and they needed money to continue their research. Arthur Rock had helped them find a home at Fairchild when they left Shockley, and now helped them raise $2.5 million in backing in a couple of days.

The first day of the company, Andy Grove joined them from Fairchild. He'd fled the Hungarian revolution in the '50s and gotten a PhD in chemical engineering at the University of California, Berkeley. Then came Leslie Vadász, another Hungarian emigrant. Funding and money coming in from sales allowed them to hire some of the best in the business—people like Ted Hoff, Federico Faggin, and Stan Mazor.

That first year they released 64-bit static random-access memory in the 3101 chip, doubling what was on the market, as well as the 3301 read-only memory chip and the 1101. Then came DRAM, or dynamic random-access memory, in the 1103 in 1970, which became the bestselling chip within the first couple of years. Armed with a lineup of chips and an explosion of companies that wanted to buy the chips, they went public within 2 years of being founded.

1971 saw Dov Frohman develop erasable programmable read-only memory, or EPROM, while working on a different problem. This meant they could reprogram chips using ultraviolet light and electricity. In 1971 they also created the Intel 4004 chip, which was started in 1969 when a calculator manufacturer out of Japan asked them to develop 12 different chips. Instead they made one that could do all of the tasks of the 12, outperforming the ENIAC from 1946, and so the era of the microprocessor was born. And instead of taking up a basement at a university lab, it took up an eighth of an inch by a sixth of an inch to hold a whopping 2,300 transistors.

The chip didn't contribute a ton to the bottom line of the company, but they'd built the first true microprocessor, which would eventually be what they were known for. Instead they were making DRAM chips. But then came the 8008 in 1972, ushering in an 8-bit CPU. The memory chips were being used by other companies developing their own processors, but Intel knew how to build processors, and the Computer Terminal Corporation was looking to develop what was a trend for a hot minute: programmable terminals.
And given the doubling of speeds, those gave way to microcomputers within just a few years. The Intel 8080 was a 2 MHz chip that became the basis of the Altair 8800, SOL-20, and IMSAI 8080. By then Motorola, Zilog, and MOS Technology were hot on their heels, releasing the Z80 and 6802 processors. But Gary Kildall wrote CP/M, one of the first operating systems, initially for the 8080 prior to porting it to other chips.

Sales had been good and Intel had been growing. By 1979 they saw the future was in chips and opened a new office in Haifa, Israel, where they designed the 8088, which clocked in at 4.77 MHz. IBM chose this chip to be used in the original IBM Personal Computer. IBM was going to use an 8-bit chip, but the team at Microsoft talked them into going with the 16-bit 8088, and thus created the foundation of what would become the Wintel or Intel architecture, or x86, which would dominate the personal computer market for the next 40 years.

One reason IBM trusted Intel is that they had proven to be innovators. They had effectively invented the integrated circuit, then the microprocessor, then coined Moore's Law, and by 1980 had built a 15,000-person company capable of shipping product in large quantities. They were intentional about culture, looking for openness, distributed decision making, and trading off bureaucracy for figuring out cool stuff.

That IBM decision to use that Intel chip is one of the most impactful in the entire history of personal computers. Based on Microsoft DOS, and then Windows, being able to run on the architecture, nearly every laptop and desktop would run on that original 8088/86 architecture. Based on the standards, Intel and Microsoft would both market that their products ran not only on those IBM PCs but also on any PC using the same architecture, and so IBM's hold on the computing world would slowly wither.

On the back of all these chips, revenue shot past $1 billion for the first time in 1983. IBM bought 12 percent of the company in 1982 and thus gave them the Big Blue seal of approval, something important even today. And the hits kept on coming with the 286 to 486 chips during the 1980s. Intel brought the 80286 to market and it was used in the IBM PC AT in 1984. This new chip brought new ways to manage addresses: it was the first that could do memory management and the first Intel chip where we saw protected mode, so we could get virtual memory and multitasking. All of this was made possible with over a hundred thousand transistors.

At the time the original Mac used a Motorola 68000, but its sales were sluggish while Intel's chips flourished at IBM, and slowly we saw the rise of the companies cloning the IBM architecture, like Compaq—still using those Intel chips.

Jerry Sanders had actually left Fairchild a little before Noyce and Moore to found AMD, and ended up cloning the instructions in the 80286 after entering into a technology exchange agreement with Intel. This led to AMD making the chips at volume and selling them on the open market. AMD would go on to fast-follow Intel for decades. The 80386 would go on to simply be known as the Intel 386, with over 275,000 transistors. It was launched in 1985, but we didn't see a lot of companies use it until the early 1990s. The 486 came in 1989. Now we were up to a million transistors as well as a math coprocessor. We were 50 times faster than the 4004 that had come out less than 20 years earlier.
I don't want to take anything away from the phenomenal run of research and development at Intel during this time, but the chips and cores and amazing developments were almost on autopilot. The '80s also saw them invest half a billion in reinvigorating their manufacturing plants. With quality manufacturing allowing for a new era of printing chips, the '90s were just as good to Intel. I like to think of this as the Pentium decade, with the first Pentium in 1993. 32-bit, here we come. Revenues jumped 50 percent that year, closing in on $9 billion.

Intel had been running an advertising campaign around Intel Inside. This represented a shift from the IBM PC to the Intel brand. The Pentium Pro came in 1995 and we'd crossed 5 million transistors in each chip. And the brand equity was rising fast. More importantly, so was revenue: 1996 saw revenues pass $20 billion. The personal computer was showing up in homes and on desks across the world, and most had Intel Inside—in fact, we'd gone from Intel Inside to Pentium Inside. 1997 brought us the Pentium II with over 7 million transistors, the Xeon came in 1998 for servers, and 1999 brought the Pentium III. By 2000 they introduced the first gigahertz processor at Intel, and they announced the next generation after Pentium: Itanium, finally moving the world to the 64-bit processor.

As processor speed increases slowed, they were able to bring multi-core processors and massive parallelism out of the hallowed halls of research and to the desktop computer in 2005. 2006 saw Intel go from just Windows to the Mac as well. And 45-nanometer logic technology in 2007, using hafnium-based high-k material for transistor gates, represented a shift from the silicon-gated transistors of the '60s and allowed them to move to hundreds of millions of transistors packed into a single chip. i3, i5, i7, and on. The chips now have over a couple hundred million transistors per core, with 8 cores on a chip potentially putting us over 1.7 or 1.8 billion transistors per chip.

Microsoft, IBM, Apple, and so many others went through huge growth and sales jumps, then retreated while dealing with how to run a company of the size they suddenly became. This led each to invest heavily in ending a lost decade with R&D—like when IBM built the S/360 or Apple developed the iMac and then the iPod. Intel's strategy had been research and development: build amazing products and they sold. Bigger, faster, better. The focus had been on power. But mobile devices were starting to take the market by storm, and the ARM chip was more popular on those because, with a reduced set of instructions, they could use less power and be a bit more versatile.

Intel coined Moore's Law. They know that if they don't find ways to pack more and more transistors into smaller and smaller spaces, then someone else will. And while they haven't been huge in the RISC-based System on a Chip space, they do continue to release new products and look for the right product-market fit, just like they did when they went from DRAM and SRAM to producing the types of chips that made them into a powerhouse. And on the back of a steadily rising revenue stream that's now over $77 billion, they seem poised to be able to weather any storm—not only on the back of R&D but also some of the best manufacturing in the industry.

Chips today are so powerful and small that they contain the whole computer from the era of those Pentiums, just as that 4004 chip contained a whole ENIAC. This gives us a nearly limitless canvas to design software.
Machine learning on a SoC expands the reach of what that software can process. Technology is moving so fast in part because of the amazing work done at places like Intel, AMD, and ARM. Maybe that positronic brain that Asimov promised us isn't as far off as it seems. But then, I thought that in the '90s as well, so I guess we'll see.
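As a quick check on the Moore's Law arithmetic this episode leans on, here is a short sketch comparing a doubling-every-two-years projection from the 4004 against the transistor counts the episode itself quotes. The counts are the episode's own figures, not independently verified data, and the two-year doubling period is the common modern statement of the law rather than Moore's original one-year observation.

```python
# Moore's Law back-of-the-envelope: project transistor counts forward from the
# Intel 4004 (1971, 2,300 transistors) assuming a doubling every two years, and
# compare against the counts quoted in this episode.
counts = {
    1985: 275_000,    # 80386, per the episode
    1989: 1_000_000,  # 80486
    1995: 5_000_000,  # Pentium Pro
    1997: 7_000_000,  # Pentium II ("over 7 million")
}

base_year, base_count = 1971, 2_300
for year, quoted in counts.items():
    projected = base_count * 2 ** ((year - base_year) / 2)
    print(f"{year}: projected ~{projected:,.0f}, episode quotes {quoted:,}")
```

Run it and the projection lands within a small factor of every quoted figure, which is about as well as a rule of thumb like this can be expected to do.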

Hemispheric Views
051: I Don't Like Green Bananas!

Hemispheric Views

Mar 10, 2022 · 52:47


Andrew shares his extensive knowledge of toilets, Jason learns the difference between 'Mate...', 'Mate!' and 'Maaaate' and Martin apologises to all listeners for the incessant lawnmowing outside his study window. #suburbia
Dark Mode, Focus Mode And Toilets
00:00:00 Dark Mode (https://support.apple.com/en-us/HT210332)

The History of Computing
The Evolution Of The Microchip

The History of Computing

Sep 13, 2019 · 31:14


The Microchip. Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us for the innovations of the future!

Today's episode is on the history of the microchip, or microprocessor. This was a hard episode, because it was the culmination of so many technologies. You don't know where to stop telling the story—and you find yourself writing a chronological story in reverse chronological order. But few advancements have impacted humanity the way the introduction of the microprocessor has.

Given that most technological advances are a convergence of otherwise disparate technologies, we'll start the story of the microchip with the obvious choice: the light bulb. Thomas Edison first demonstrated the carbon filament light bulb in 1879. William Joseph Hammer, an inventor working with Edison, then noted that if he added another electrode to a heated filament bulb, it would glow around the positive pole in the vacuum of the bulb and blacken the wire and the bulb around the negative pole. 25 years later, John Ambrose Fleming demonstrated that if that extra electrode is made more positive than the filament, current flows through the vacuum, and that the current could only flow from the filament to the electrode and not the other direction. This converted AC signals to DC and represented a boolean gate. In 1904, Fleming was granted Great Britain's patent number 24850 for the vacuum tube, ushering in the era of electronics.

Over the next few decades, researchers continued to work with these tubes. Eccles and Jordan invented the flip-flop circuit at London's City and Guilds Technical College in 1918, receiving a patent for what they called the Eccles-Jordan Trigger Circuit in 1920.

Now, English mathematician George Boole, back in the earlier part of the 1800s, had developed Boolean algebra. Here he created a system where logical statements could be made in mathematical terms. Those could then be performed using math on the symbols. Only a 0 or a 1 could be used. It took a while, but John Vincent Atanasoff and grad student Clifford Berry harnessed these circuits in the Atanasoff-Berry computer in 1938 at Iowa State University and, using Boolean algebra, successfully solved linear equations, but never finished the device due to World War II, when a number of other technological advancements happened, including the development of the ENIAC by John Mauchly and J Presper Eckert from the University of Pennsylvania, funded by the US Army Ordnance Corps, starting in 1943.

By the time it was taken out of operation, the ENIAC had 20,000 of these tubes. Each digit in an algorithm required 36 tubes. Ten-digit numbers could be multiplied at 357 per second, showing the first true use of a computer. John von Neumann was the first to actually use the ENIAC, when they used one million punch cards to run the computations that helped propel the development of the hydrogen bomb at Los Alamos National Laboratory. The creators would leave the University and found the Eckert-Mauchly Computer Corporation. Out of that later would come the Univac and the ancestor of today's Unisys Corporation.

These early computers used vacuum tubes to replace gears that were in previous counting machines and represented the First Generation. But the tubes for the flip-flop circuits were expensive and had to be replaced way too often. The second generation of computers used transistors instead of vacuum tubes for logic circuits.
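Since so much of this story hangs on Boolean gates and the Eccles-Jordan flip-flop, a toy sketch may help make the two concrete. This is an illustration in Python, not a circuit-accurate model: a NAND gate as a function, and a cross-coupled pair of NANDs acting as a set-reset latch that remembers one bit, which is essentially what the flip-flop contributed.

```python
# Toy illustration of Boolean logic (everything is 0 or 1) and of a flip-flop
# storing one bit by feeding two gates into each other.

def nand(a: int, b: int) -> int:
    """Universal gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def sr_latch(s: int, r: int, q: int) -> int:
    """One settle step of a cross-coupled NAND latch (active-low set/reset).

    Returns the new stored bit q. With s=1 and r=1 the latch just holds q.
    """
    q_bar = nand(r, q)   # one gate's output...
    return nand(s, q_bar)  # ...feeds the other gate's input

q = 0
q = sr_latch(s=0, r=1, q=q)  # pulse set (active low): q becomes 1
print(q)                     # 1
q = sr_latch(s=1, r=1, q=q)  # both inputs inactive: the bit is remembered
print(q)                     # 1 — one stored bit
q = sr_latch(s=1, r=0, q=q)  # pulse reset: q cleared back to 0
print(q)                     # 0
```

The point is the feedback: each gate's output feeds the other's input, so the pair holds its state between inputs, which is what made tube (and later transistor) memory elements possible.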
The integrated circuit is basically a wire set into silicon or germanium that can be set to on or off based on the properties of the material. These replaced vacuum tubes in computers to provide the foundation of boolean logic—you know, the zeros and ones that computers are famous for. As with most modern technologies, the integrated circuit owes its origin to a number of different technologies that came before it and made it useful in computers. This includes the three primary components of the circuit: the transistor, resistor, and capacitor.

The silicon that chips are so famous for was actually discovered by Swedish chemist Jöns Jacob Berzelius in 1824. He heated potassium chips in a silica container and washed away the residue and voilà—an element!

The transistor is a semiconducting device that has three connections that amplify data. One is the source, which is connected to the negative terminal on a battery. The second is the drain, a positive terminal that, when the gate (the third connection) is activated, the transistor allows electricity through. Transistors thus act as an on/off switch. The fact that they can be on or off is the foundation for Boolean logic in modern computing. The resistor controls the flow of electricity and is used to control the levels and terminate lines. An integrated circuit is also built using silicon, but you print the pattern into the circuit using lithography rather than painstakingly putting little wires where they need to go like radio operators did with the cat's whisker all those years ago.

The idea of the transistor goes back to the mid-'30s, when William Shockley took the idea of a cat's whisker, or fine wire touching a galena crystal. The radio operator moved the wire to different parts of the crystal to pick up different radio signals. Solid-state physics was born when Shockley, who first studied at Caltech and then got his PhD in physics, started working on a way to make these usable in everyday electronics. After a decade in the trenches, Bell gave him John Bardeen and Walter Brattain, who successfully finished the invention in 1947. Shockley went on to design a new and better transistor, known as a bipolar transistor, and helped move us from vacuum tubes, which were bulky and needed a lot of power, first to germanium, which they used initially, and then to silicon.

Shockley got a Nobel Prize in physics for his work and was able to recruit a team of extremely talented young PhDs to help work on new semiconductor devices. He became increasingly frustrated with Bell and took a leave of absence. Shockley moved back to his hometown of Palo Alto, California, and started a new company called the Shockley Semiconductor Laboratory. He had some ideas that were way before his time and wasn't exactly easy to work with. He pushed the chip industry forward, but in the process spawned a mass exodus of employees who left in 1957 to found Fairchild Semiconductor; he called them the "Traitorous 8." The alumni of Shockley Labs ended up spawning 65 companies over the next 20 years that laid the foundation of the microchip industry to this day, including Intel. If he were easier to work with, we might not have had the innovation that we've seen, if not for Shockley's abrasiveness! All of these silicon chip makers being in a small area of California then led to that area getting the Silicon Valley moniker, given all the chip makers located there.
At this point, people were starting to experiment with computers using transistors instead of vacuum tubes. The University of Manchester created the Transistor Computer in 1953. The first fully transistorized computer came in 1955 with the Harwell CADET, MIT started work on the TX-0 in 1956, and the THOR guidance computer for ICBMs came in 1957. But the IBM 608 was the first commercial all-transistor solid-state computer. The RCA 501, Philco Transac S-1000, and IBM 7070 took us through the age of transistors, which continued to get smaller and more compact. At this point, we were really just replacing tubes with transistors. But the integrated circuit would bring us into the third generation of computers.

The integrated circuit is an electronic device that has all of the functional blocks put on the same piece of silicon. So the transistor, or multiple transistors, is printed into one block. Jack Kilby of Texas Instruments patented the first miniaturized electronic circuit in 1959, which used germanium and external wires and was really more of a hybrid integrated circuit. Later in 1959, Robert Noyce of Fairchild Semiconductor invented the first truly monolithic integrated circuit, which he received a patent for. While they did so independently, they are both considered creators of the integrated circuit.

The third generation of computers was from 1964 to 1971, and saw the introduction of metal-oxide-silicon and printing circuits with photolithography. In 1965, Gordon Moore, also of Fairchild at the time, observed that the number of transistors, resistors, diodes, capacitors, and other components that could be shoved into a chip was doubling about every year, and published an article with this observation in Electronics Magazine, forecasting what's now known as Moore's Law. The integrated circuit gave us the DEC PDP and later the IBM S/360 series of computers, making computers smaller and bringing us into a world where we could write code in COBOL and FORTRAN.

A microprocessor is one type of integrated circuit. They're also used in audio amplifiers, analog integrated circuits, clocks, interfaces, etc. But in the early '60s, the Minuteman missile program and US Navy contracts were practically the only ones using these chips, at this point numbering in the hundreds, bringing us into the world of the MSI, or medium-scale integration, chip.

Moore and Noyce left Fairchild and founded NM Electronics in 1968, later renaming the company to Intel, short for Integrated Electronics. Federico Faggin came over in 1970 to lead the MCS-4 family of chips. These, along with other chips that were economical to produce, started to result in chips finding their way into various consumer products. In fact, the MCS-4 chips, which split RAM, ROM, CPU, and I/O, were designed for the Nippon Calculating Machine Corporation, and Intel bought the rights back, announcing the chip in Electronic News with an article called "Announcing A New Era In Integrated Electronics." Together, they built the Intel 4004, the first microprocessor that fit on a single chip. They buried the contacts in multiple layers and introduced 2-phase clocks. Silicon oxide was used to layer integrated circuits onto a single chip. Here, the microprocessor, or CPU, splits the arithmetic and logic unit, or ALU, the bus, the clock, the control unit, and the registers up so each can do what it's good at, but all live on the same chip. The first generation of the microprocessor was from 1971, when these 4-bit chips were mostly used in guidance systems.
This boosted the speed by five times. The forming of Intel and the introduction of the 4004 chip can be seen as one of the primary events that propelled us into the evolution of the microprocessor and the fourth generation of computers, which lasted from 1972 to 2010. The Intel 4004 had 2,300 transistors. The Intel 4040 came in 1974, giving us 3,000 transistors. It was still a 4-bit data bus but jumped to 12-bit ROM. The architecture was also from Faggin, but the design was carried out by Tom Innes. We were firmly in the era of LSI, or Large Scale Integration, chips. These chips were also used in the Busicom calculator, and even in the first pinball game controlled by a microprocessor.

But getting a true computer to fit on a chip, or a modern CPU, remained an elusive goal. Texas Instruments ran an ad in Electronics with a caption that the 8008 was a "CPU on a Chip" and attempted to patent the chip, but couldn't make it work. Faggin went to Intel and they did actually make it work, giving us the 8008, the first 8-bit microprocessor, in 1972. It was then redesigned as the 8080, which was fabricated and put on the market in 1974. Intel made the R&D money back in 5 months and sparked the idea for Ed Roberts to build the Altair 8800. Motorola and Zilog brought competition in the 6800 and Z-80, the latter used in the Tandy TRS-80, one of the first mass-produced computers. N-MOS transistors on chips allowed for new and faster paths, and MOS Technology soon joined the fray with the 6501 and 6502 chips in 1975. The 6502 ended up being the chip used in the Apple I, Apple II, NES, Atari 2600, BBC Micro, Commodore PET, and Commodore VIC-20. The MOS 6510 variant was then used in the Commodore 64.

The 8086 was released in 1978 with 29,000 transistors and marked the transition to Intel's x86 line of chips, setting what would become the standard in future chips. But IBM wasn't the only place you could find chips. The Motorola 68000 was used in the Sun-1 from Sun Microsystems, the HP 9000, the DEC VAXstation, the Commodore Amiga, the Apple Lisa, the Sinclair QL, the Sega Genesis, and the Mac. The chips were also used in the first HP LaserJet and the Apple LaserWriter, and in a number of embedded systems for years to come.

As we rounded the corner into the '80s it was clear that the computer revolution was upon us. A number of computer companies were looking to do more than what they could do with the existing Intel, MOS, and Motorola chips. And ARPA was pushing the boundaries yet again. Carver Mead of Caltech and Lynn Conway of Xerox PARC saw the density of transistors in chips starting to plateau. So with DARPA funding they went out looking for ways to push the world into the VLSI era, or Very Large Scale Integration. The VLSI project resulted in the concept of fabless design houses, such as Broadcom, 32-bit graphics, BSD Unix, and RISC processors, or Reduced Instruction Set Computer processors. Out of the RISC work done at UC Berkeley came a number of new options for chips as well.

One of these designers, Acorn Computers, evaluated a number of chips and decided to develop their own, using VLSI Technology, a company founded by more Fairchild Semiconductor alumni, to manufacture the chip in their foundry. Sophie Wilson (then Roger Wilson) worked on an instruction set for the RISC. Out of this came the Acorn RISC Machine, or ARM chip. Over 100 billion ARM processors have been produced, well over 10 for every human on the planet. You know that fancy new A13 that Apple announced? It uses a licensed ARM core.
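To make the "reduced instruction set" idea from the VLSI and ARM story concrete, here is a toy sketch of a RISC-style machine in Python: a handful of simple, uniform instructions (load, add, store, branch) from which everything else is composed. The instruction set here is invented for illustration; it is not ARM's.

```python
# A minimal register machine in the RISC spirit: few instructions, each simple,
# composed into programs by a fetch-decode-execute loop.

def run(program, memory):
    regs = [0] * 4  # four general-purpose registers
    pc = 0          # program counter
    while pc < len(program):
        op, a, b, c = program[pc]
        if op == "LOAD":     # regs[a] = memory[b]
            regs[a] = memory[b]
        elif op == "ADD":    # regs[a] = regs[b] + regs[c]
            regs[a] = regs[b] + regs[c]
        elif op == "STORE":  # memory[b] = regs[a]
            memory[b] = regs[a]
        elif op == "BNZ":    # branch to instruction c if regs[a] != 0
            if regs[a] != 0:
                pc = c
                continue
        pc += 1
    return memory

# Sum memory[0] and memory[1] into memory[2].
mem = {0: 40, 1: 2, 2: 0}
prog = [
    ("LOAD", 0, 0, 0),
    ("LOAD", 1, 1, 0),
    ("ADD", 2, 0, 1),
    ("STORE", 2, 2, 0),
]
print(run(prog, mem)[2])  # 42
```

The trade-off the episode describes falls out of this shape: with fewer, simpler operations, the hardware for each can be smaller and lower-power, which is exactly why ARM-style designs suited embedded and mobile devices.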
Another chip that came out of the RISC family was the Sun SPARC. Sun, short for Stanford University Network, had co-founder Andy Bechtolsheim close to the action, and they released the SPARC in 1986. I still have a SPARC 20 I use for this and that at home. Not that SPARC has gone anywhere; they're just made by Oracle now.

The Intel 80386 was a 32-bit microprocessor released in 1985. The first chip had 275,000 transistors, taking plenty of pages from the lessons learned in the VLSI projects. Compaq built a machine on it before IBM did, the beginning of the end of IBM's hold on the burgeoning computer industry. And AMD, yet another company founded by Fairchild defectors, created the Am386 in 1991, ending Intel's nearly five-year monopoly on the PC-clone industry and ending an era where AMD was merely a second source of Intel parts; from then on it competed with Intel directly. We can thank AMD's aggressive competition with Intel for helping to keep the CPU industry tracking along Moore's Law! At this point transistors were only 1.5 microns in size, much, much smaller than a cat's whisker.

The Intel 80486 came in 1989 and, again tracking against Moore's Law, gave us the first million-transistor x86 chip. Remember how Compaq helped end IBM's hold on the PC market? When the Intel 486 came along, they went with AMD. This chip was also important because we got L1 caches, meaning that chips didn't need to send instructions to other parts of the motherboard but could cache internally. From then on, the L1 and later L2 caches would be listed on all chips. With the 486DX4, we'd finally broken 100MHz! Motorola released the 68040 in 1990, hitting 1.2 million transistors and giving Apple the chip that would define the Quadra, along with that L1 cache. The DEC Alpha came along in 1992, also a RISC chip, and really kicked off the 64-bit era. While the most technically advanced chip of its day, it never took off, and after DEC was acquired by Compaq and Compaq by HP, the IP for the Alpha was sold to Intel in 2001, the PC industry having taken all their money by then.

But back to the 90s, 'cause life was better back when grunge was new. At this point, hobbyists knew what the CPU was, but most normal people didn't. The idea that there was a whole Univac on one of these chips never occurred to most people. But then came the Pentium. Turns out that giving a chip a name and some marketing dollars not only made Intel a household name but solidified their hold on the chip market for decades to come. The Intel Inside campaign started in 1991, and after the Pentium was released in 1993, the case of most computers carried a sticker that said Intel Inside. Intel really one-upped everyone. The first Pentium (the P5, also known as the 586 or 80501) had 3.1 million transistors on an 0.8-micron process. Computers kept getting smaller and cheaper and faster. Apple answered by moving to the PowerPC chip from IBM, which owed much of its design to IBM's RISC work. Exactly ten years after the famous 1984 Super Bowl commercial, Apple was using a CPU from IBM. Another advance came when IBM developed the POWER4 chip, released in 2001, and gave the world multi-core processors: a CPU with multiple CPU cores inside it. Once parallel software caught up and could consume the resources of all those cores, we saw Intel's Pentium D and AMD's Athlon 64 X2, both released in May 2005, bring multi-core architecture to the consumer.
This led to even more parallel processing and an explosion in the number of cores, which helped us continue on with Moore's Law. There are custom chips today that reach into the thousands of cores, although most laptops have maybe four. Setting multi-core architectures aside for a moment, back to Y2K, when Justin Timberlake was still a part of NSYNC. Then came the Pentium Pro, Pentium II, Celeron, Pentium III, Xeon, Pentium M, Xeon LV, and Pentium 4. On the IBM/Apple side, we got the G3 with 6.3 million transistors, the G4 with 10.5 million transistors, and the G5 with 58 million transistors and 1,131 feet of copper interconnects, running at 3GHz in 2002: so much copper that NSYNC broke up that year. The Pentium 4 that year ran at 2.4GHz and sported 50 million transistors. That's about one transistor per dollar made off Star Trek: Nemesis in 2002. I guess Attack of the Clones was better, because it grossed over 300 million that year.

Remember how we broke the million-transistor mark in 1989? In 2005, Intel started testing Montecito with certain customers: an Itanium 2 64-bit CPU with 1.72 billion transistors, shattering the billion mark and hitting it two years earlier than projected. Apple CEO Steve Jobs announced that year that Apple would be moving to Intel processors. NeXTSTEP had been happy as a clam on Intel, SPARC, or HP's PA-RISC, so given the rapid advancements from Intel, this seemed like a safe bet, and it allowed Apple to tell directors in IT departments, "see, we play nice now."

And the innovations kept flowing for the next decade and a half. We packed in more transistors, more cache, cleaner clean rooms, faster bus speeds, with Intel owning the computer CPU market and ARM slowly growing from the ashes of Acorn Computers into the powerhouse that ARM cores are today, embedded in other chip designs. I'd say not much interesting has happened, but it's ALL interesting; the numbers just sound stupid, they're so big. And we had more advances along the way, of course, but it started to feel like we were just miniaturizing more and more, allowing us to do much more advanced computing in general.

The fifth generation of computing is all about technologies that we today consider advanced: artificial intelligence, parallel computing, very high level computer languages, and the migration away from desktops to laptops and even smaller devices like smartphones. ULSI, or Ultra Large Scale Integration, not only tells us that chip designers really have no creativity outside of chip architecture, but also means millions up to tens of billions of transistors on silicon. At the time of this recording, the AMD Epyc Rome is the single-chip package with the most transistors, at 32 billion. Silicon is the seventh most abundant element in the universe and the second most abundant in the crust of the planet Earth. Given that there are more chips than people by a huge margin, we're lucky we don't have to worry about running out any time soon!

We skipped RAM in this episode. But it kinda' deserves its own, since RAM is still following Moore's Law while the CPU is kinda' lagging again. Maybe it's time for our friends at DARPA to get the kids from Berkeley working on Very Ultra Large Scale chips, or VULSIs! Or they could sign on to sponsor this podcast! And now I'm going to go take a Very Ultra Large Scale nap. Gentle listeners, I hope you can do that as well. Unless you're driving while listening to this. Don't nap while driving. But do have a lovely day.
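Since the episode keeps scoring chips against Moore's Law, here is a minimal sketch of that arithmetic in Python. It assumes a flat two-year doubling period starting from the 4004, and the "cited" figures are just the approximate transistor counts quoted in this episode, not authoritative die data.

# A minimal Moore's Law sketch: transistor counts doubling roughly every
# two years, starting from the Intel 4004 (2,300 transistors, 1971).
# The "cited" figures are the approximate counts quoted in this episode,
# not authoritative die data.

def projected_transistors(year, base_year=1971, base_count=2_300,
                          doubling_period_years=2.0):
    """Transistor count predicted by a simple fixed-period doubling model."""
    return base_count * 2 ** ((year - base_year) / doubling_period_years)

milestones = [
    (1985, 275_000, "Intel 80386"),
    (1989, 1_200_000, "Intel 80486 era, the first million-transistor x86"),
    (1993, 3_100_000, "Pentium P5"),
    (2005, 1_720_000_000, "Montecito, the first billion-transistor Intel CPU"),
]

for year, cited, name in milestones:
    print(f"{year} {name}: model {projected_transistors(year):,.0f} "
          f"vs cited {cited:,}")

Run it and the fixed two-year model lands in the right ballpark for the 386, 486, and Pentium, then falls more than fivefold short of Montecito's 1.72 billion: a reminder that Moore's Law was a trend line the industry kept refitting rather than a single fixed formula.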
Thank you for listening to yet another episode of the History of Computing Podcast. We're so lucky to have you!

Tech Café
Chronique des composants 99

Tech Café

Play Episode Listen Later Aug 6, 2019 72:16


Recorded on Friday, August 6, 1999.

The multimedia revolution!
MPEG 2, 3, 4: the future of video; the proof: Episode I on CD-ROM!
The MP3 player? Meh, no chance against the MiniDisc!
Shoutcast on Winamp: toward a new kind of radio?
USB goes mainstream! Modems, mice, joysticks, storage…
Apple's lineup, simplified and with an… original aesthetic.
iBook G3: the powder compact with wireless internet! And USB only.
The rest of the lineup, and Mac OS X Server (open source)?
AOL vs Microsoft, and Office 2000 ready for the web.

Time to upgrade your PC…
AMD comes back strong: the Athlon, the K6-III, and the low-cost Duron.
The CPU darlings of the year: Celeron 300A, Pentium II and III.
Serial numbers and SSE: the Pentium III.
Overclocking your CPU: a good idea?
3dfx Voodoo, NVIDIA TNT2, ATI RAGE, Matrox G400: it's war!
Bump mapping, Transform and Lighting: the cards of the future!

'99 in 3D: heading for Dreamcast domination!
3D acceleration now mandatory: Half-Life, Star Wars Racer, Dungeon Keeper 2, Descent 3…
But not for Outcast, the game of the year, rendered in voxels!
Or will it be Quake 3? Darkstone? Drakan? Homeworld? System Shock 2? Deus Ex? Unreal Tournament? Omikron?

E3 1999
400 exhibitors, 2,000 titles, and lots of girls!
Sony and the PlayStation at its peak.
Nintendo sweeping all the awards.
We saw the Dreamcast: why it's going to dominate everything.
Nintendo's Project Dolphin.

References: GameSpotTV, Puissance Nintendo, and also: PC Team issue 48 (July/August '99), Maximum PC April and July '99, Macworld July '99, Console+ issue 90, CD Consoles issue 52.

Rebobinando Fitas
Rebobinando Fitas #05 – Teclado e Mouse

Rebobinando Fitas

Play Episode Listen Later Apr 27, 2019 70:01


Hey everyone at REBOBINANDO FITAS, welcome to another PODCAST! Today it won't be TAPES but DISKETTES, because today I'm going to talk about the games that required a KEYBOARD AND MOUSE to play, which points straight to the PC, and I believe many of you fell into that world. In my case it was in 1989, at companies where I worked or at friends' houses, since I only got started at home ten years later, in 1999, with the Pentium III my father bought for me. So come get to know the classics I played back in the day!

Lore Boys
F.E.A.R. Lore - Projects Origin, Perseus, Paragon, and Harbinger

Lore Boys

Play Episode Listen Later Oct 17, 2018 65:11


Lore Boys are back, and this time we're talking about the First Encounter Assault Recon (F.E.A.R.) team and their backstory. Who is the spooky little girl running across the screen of my Pentium III laptop? Why am I standing in a hallway full of blood with a scary skeleton dooting right at me? Why shouldn't I feel bad about killing all these soldiers? These questions and more get answered as we delve into the Origins of the paranormal in the year 2025. Alma, Paxton Fettel, Harlan Wade, erm, Point Man: all the household names for the series make an appearance. We're also joined by self-proclaimed F.E.A.R. expert Martin Lis, writer for screengeek.com, who came on to help us out with today's episode. You can find all his links here:
https://www.youtube.com/user/WittyUsernameSA/videos
https://twitter.com/CCscorsese/status/931612998388670464
https://www.screengeek.net/author/protocol-tin/
Thanks so much for listening! If you like the show, think about telling one friend about it RIGHT NOW. It'd mean the world to us.

Tech Café
92. Connexion à hauts débats

Tech Café

Play Episode Listen Later Jun 26, 2018 100:07


Support Tech Café on Patreon. All the links at techcafe.fr. Chat with us on Telegram.

Highs and Debates
Project Debater: IBM launches a super chatbot that can debate with you. Even if it seems to be struggling to make Watson pay off…
Adobe builds an AI that detects photos doctored with Photoshop. Ironic, isn't it?
Big-game phishing: algorithms learn to fool anti-phishing tools.
Alexa takes up room service.
A big petition to stop Amazon from selling Rekognition.
Lobby, lobby, I'm flabbergasted: Jeff Bezos turns the Seattle City Council around.
Tesla slims down, against a backdrop of scandal, espionage, and sabotage.
Lobbied through: the European copyright law is almost passed.
After ads, Eyeo wants to fight fake news with MetaCert's blockchain.
Ad astra per aspera: Brave will test its token-based advertising model.
Fox wants to turn ads into serialized stories. AdFlix, coming soon.

Inside Intel
Too much love will kill you: Intel's CEO pushed toward the exit.
TLBleed and LazyFP: more hardware flaws for Intel CPUs.
Scoop: you have a Pentium III and Windows 7? Time for a change…
Microsoft researchers port Linux to an in-house processor.
Sale season: MIPS bought by Wave Computing.
The HoloLens 2 will cost less and be based on the Qualcomm XR1.
Qualcomm is working on a CPU exclusively for PCs.
NVIDIA reportedly has too much Pascal: "Passion cannot exist without excess."
Everything in our life is mini: the record for the smallest computer broken again.

In brief
No more gun ads shown to minors on Facebook. It's never too late…
Paid groups on Facebook.
Pokémon Go will soon allow trading. It's never too late. Actually… it is.
In the end, no VR for the Xbox One after all.
The Magic Leap makes an appearance.
Valve and its controller that recognizes handshakes. For an election simulator?
John Carmack pushes 5K into the Oculus Go. Respect.
An example of web AR for Chrome, and an online course if you're curious…

Bonus
Guillaume Poggiaspalla: charity speedrunning with Summer Games Done Quick.
Pierre-Olivier: political Facebook ads.
Guillaume: episode one hundred!

Participants: Guillaume Poggiaspalla, Pierre-Olivier Dybman. Presented by Guillaume Vendé.

Rozgrywka
Rozgrywka #118 – Karlica na wilku

Rozgrywka

Play Episode Listen Later Apr 11, 2016 235:56


One quick glance at the lineup and everything becomes clear. It's either fresh releases and triple-A titles, or games from a slightly lower shelf, some even smelling of mice from the royal dungeons. We do a bit of time traveling, but none of us felt like leaving the house for the new adventures of the heroes in the black and red capes, which probably says something about the quality of that production. Speaking of comics: Deusz was absent again. Yes, we know, please file all complaints in the comments. The delay in the episode's release can be blamed on a Pentium III; supposedly the best hardware on the market, yet somehow it isn't running all that fast. Oh, and there will also be talk about boobs.


Tech Talk Interviews
Tech Talk #003 Joey Andrews | Arkadelphia School District

Tech Talk Interviews

Play Episode Listen Later Aug 1, 2015 27:28


This is a fun show by two IT sales reps trying to add some color to the DMR [Direct Market Reseller] world. In it they interview customers and prospective clients on good IT planning and implementation strategies. Enjoy this episode with Joey Andrews. He is the IT Director over at Arkadelphia School District.

When did you get interested in technology? Joey got interested in technology back in college in the 80s. He started at Arkadelphia School District in 2005; when he first arrived, the district had Pentium IIIs and Windows 98! I remember those days as a gamer in high school.

What hobbies do you have besides deploying systems and implementing switches? Joey likes to hunt and fish. He also enjoys watching his kids play baseball.

Star Wars or Star Trek? And why? Star Trek! Because Star Trek is based on a human future.

Current / new technology you like: Joey likes their new Chromebooks, although they have Apple too. He is not a fan of Apple, however, because of the price and the difficulty of managing the devices with Active Directory.

Best IT moves of 2014-2015 YTD? Getting a metro area network and switching to fiber for bandwidth. The school is going to 500 megs of bandwidth this year [good for a school in Arkansas with 2,000 students]. He is also glad he went with Aerohive for their wireless network, although he notes that Xirrus wireless is a bit easier to program.

Biggest lesson learned? If Joey could go back to himself 10 years ago, right when he started, he would have told himself not to launch a 1:1 initiative [New Tech] until he had fully implemented a wireless system. His recommendation is one AP per classroom [1 AP : 15 clients runs best]. He says you also need to watch out for cable companies that will quote far more cabling than you actually need.

What brands do you like and why? Joey has used HP, Lenovo, and Dell. His favorite manufacturer for client machines is Dell.

If you were on a tropical island and could only bring one piece of technology, what would it be and why? Joey would bring his Dell laptop; that is all he really needs. He would watch YouTube and Netflix.

GO TO http://techtalkinterviews.com/techtalk003/ to see the funny video short from this interview. AND VISIT http://techtalkinterviews.com/ for more great interviews.
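Joey's wireless rule of thumb (at least one AP per classroom, roughly 15 clients per AP) turns into a quick piece of capacity math. Below is a minimal sketch in Python; the ratio is Joey's, but the classroom and client counts in the example are illustrative assumptions, not district data.

import math

# Quick wireless-sizing sketch based on Joey's rule of thumb:
# at least one AP per classroom, and no more than ~15 clients per AP.
# The example numbers below are illustrative assumptions, not real data.

def access_points_needed(classrooms, total_clients, clients_per_ap=15):
    """APs required: one per classroom, or enough to keep the per-AP
    client load at or below the target, whichever is larger."""
    by_room = classrooms
    by_load = math.ceil(total_clients / clients_per_ap)
    return max(by_room, by_load)

# Hypothetical example: a 1:1 initiative for 2,000 students
# spread across 90 classrooms.
print(access_points_needed(classrooms=90, total_clients=2_000))  # prints 134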

Overthinking It Podcast
Episode 250: Pointing a Pentium III at a Dinosaur

Overthinking It Podcast

Play Episode Listen Later Apr 14, 2013 72:58


The Overthinkers tackle Jurassic Park 3D. Episode 250: Pointing a Pentium III at a Dinosaur originally appeared on Overthinking It, the site subjecting the popular culture to a level of scrutiny it probably doesn't deserve.