Free, ungated access to all 250+ episodes of "It's 5:05!" on your favorite podcast platforms: https://bit.ly/505-updates.
Episode: 2828 Seymour Cray and Cray Supercomputers. Today, we go super.
Computer pioneer Seymour Cray died after a traffic accident in Colorado Springs in 1996. His computers were used in a variety of military and intelligence roles, so is it possible that Cray's death was something more than it seems? Possible, but not likely. We talk about the life and death of Seymour Cray. Sources: "Seymour Cray Biography"; "Colorado Springs computer firm broadens its reach with first commercial product" (Colorado Springs Gazette); "Assassination on X-Mas Eve" by Archers of Loaf --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app
Some people just gotta dig! Seymour Cray dug tunnels so the elves could finish his coding. William Lyttle said of his tunneling, "there is great beauty in inventing things that serve no purpose". Elton McDonald was "getting away from regular things, away from life. Nothing in particular. Just life itself." Episode 1 of Finding Quantum Quest is out now, and episode 2 drops tomorrow! Go listen, subscribe, and share!
Jim Grisanzio talks with John Spurling, a JVM engineer at Twitter, at UnVoxxed Hawaii 2020 about debugging and the mental process of solving difficult technical issues. John Spurling https://twitter.com/synecdotal Jim Grisanzio https://twitter.com/jimgris Video on YouTube https://youtu.be/6dwOPQSJwaI UnVoxxed Hawaii https://flic.kr/s/aHsmLF23KD https://twitter.com/UnVoxxedHawaii https://www.youtube.com/playlist?list=PLX8CzqL3ArzU0APb6QgpMMTMPEz1jok5Q Seymour Cray https://en.wikipedia.org/wiki/Seymour_Cray Make It Stick: The Science of Successful Learning https://www.amazon.com/Make-Stick-Science-Successful-Learning/dp/0674729013
Oxide and Friends Twitter Space: November 8th, 2021
Supercomputers, Cray, and How Sun Picked SGI's Pocket
We've been holding a Twitter Space weekly on Mondays at 5p for about an hour. Even though it's not (yet?) a feature of Twitter Spaces, we have been recording them all; here is the recording for our Twitter Space for November 8th, 2021.
In addition to Bryan Cantrill and Adam Leventhal, speakers on November 8th included Tom Lyon, Shahin Khan, Darryl Ramm, Dan Cross, Courtney Malone, MattSci, Aaron Goldman, Simeon Miteff, and Jason Ozolins. (Did we miss your name and/or get it wrong? Drop a PR!)
Some of the topics we hit on, in the order that we hit them:
- Bryan's tweet about George Brown's recommending "The Supermen"
- Charles Murray (1997) "The Supermen: The Story of Seymour Cray and the Technical Wizards Behind the Supercomputer" book
- [@1:28](https://youtu.be/y07PyBrrzMw?t=88) Tom's story meeting Boris: Tom's tweet on meeting Boris Babayan, Elbrus computers
- [@9:27](https://youtu.be/y07PyBrrzMw?t=567) Supercomputers and power
- [@15:16](https://youtu.be/y07PyBrrzMw?t=916) Cray designs: Engineering Research Associates wiki, Control Data Corporation wiki, CDC 1604
- [@20:36](https://youtu.be/y07PyBrrzMw?t=1236) ETA Systems wiki
- [@23:57](https://youtu.be/y07PyBrrzMw?t=1437) On to the next big thing: Steve Chen, Cray X-MP
- [@29:37](https://youtu.be/y07PyBrrzMw?t=1777) Supercomputers as one-offs: National Computational Infrastructure in Australia (NCI), gallium arsenide, GPGPU
- [@33:47](https://youtu.be/y07PyBrrzMw?t=2027) Shahin on interconnects, Jason on failure caused by a storm, Cray C90
- [@41:06](https://youtu.be/y07PyBrrzMw?t=2466) Courtney on bespoke toolchains and systems
- [@42:42](https://youtu.be/y07PyBrrzMw?t=2562) Influence of Cray on Sun: 1996 Sun to purchase Cray Business Systems Division (hpcwire), Floating Point Systems Inc wiki
  > Shahin: SGI really had no use for this system. They should have just killed it.
- [@50:10](https://youtu.be/y07PyBrrzMw?t=3010) Origin story of DTrace (2006 article), E10k
- [@56:14](https://youtu.be/y07PyBrrzMw?t=3374) Thinking Machines Corp, wiki
- [@57:36](https://youtu.be/y07PyBrrzMw?t=3456) Seymour Cray: Les Davis "The ultimate team player" write up, 2010 oral history of Les Davis (pdf)
- [@1:00:08](https://youtu.be/y07PyBrrzMw?t=3608) Business Systems Division history, long road to Starfire
- [@1:04:20](https://youtu.be/y07PyBrrzMw?t=3860) SGI and Sun early history, non-uniform memory access (NUMA)
- [@1:10:40](https://youtu.be/y07PyBrrzMw?t=4240) Cray T3E, massively parallel (MPP)
- [@1:12:33](https://youtu.be/y07PyBrrzMw?t=4353) E10k stories, boo.com wiki
- [@1:18:37](https://youtu.be/y07PyBrrzMw?t=4717) Cray, spooks, pop count
- [@1:20:45](https://youtu.be/y07PyBrrzMw?t=4845) Chen, Cray X-MP and Y-MP, Sequent
- [@1:24:04](https://youtu.be/y07PyBrrzMw?t=5044) An engineer sees his defunct machine being scrapped
- [@1:26:27](https://youtu.be/y07PyBrrzMw?t=5187) Jason's story of capacitors popping off the board, the capacitor plague
If we got something wrong or missed something, please file a PR! Our next Twitter space will likely be on Monday at 5p Pacific Time; stay tuned to our Twitter feeds for details. We'd love to have you join us, as we always love to hear from new speakers!
Gene Amdahl grew up in South Dakota and, as with many during the early days of computing, went into the Navy during World War II. He got his degree from South Dakota State in 1948 and went on to the University of Wisconsin-Madison for his PhD, where he got the bug for computers in 1952, joining the ranks of IBM that year. At IBM he worked on the iconic 704 and then the 7030 but found it too bureaucratic. And yet he came back to become the Chief Architect of the IBM S/360 project. They pushed the boundaries of what was possible with transistorized computing and along the way, Amdahl gave us Amdahl's Law, which is an important aspect of parallel computing: how much latency tasks take when split across different CPUs. Think of it like the law of diminishing returns applied to processing (there's a quick sketch of the math at the end of this story). Contrast this with Fred Brooks' Brooks' Law, which says that adding incremental engineers doesn't make projects happen faster by the same increment, and can even cause a project to take more time. As with Seymour Cray, Amdahl had ideas for supercomputers and left IBM again in 1970 when they didn't want to pursue them - ironically just a few years after Thomas Watson Jr admitted that just 34 people at CDC had kicked IBM out of their leadership position in the market. First he needed to be able to build a computer, then move into supercomputers. Fully transistorized computing had somewhat cleared the playing field. So he developed the Amdahl 470V/6 - more reliable, more pluggable, and so cheaper than the IBM S/370. He also used virtual machine technology so customers could simulate a 370 and run existing workloads cheaper. The first went to NASA and the second to the University of Michigan. During the rise of transistorized computing they just kept selling more and more machines. The company grew fast, taking nearly a quarter of the market share. As we saw in the CDC episode, the IBM antitrust case was again giving a boon to other companies. Amdahl was able to leverage the fact that IBM software was getting unbundled from the hardware as a big growth hack. As with Cray at the time, Amdahl wanted to keep to one CPU per workload and developed chips and electronics with Fujitsu to enable doing so. By the end of the 70s they had grown to 6,000 employees on the back of a billion dollars in sales. And having built a bureaucratic organization like the one he just left, he left his namesake company much as Seymour Cray had left CDC after helping build it (and would later leave Cray to start yet another Cray). That would be Trilogy Systems, which failed shortly after an IPO. I guess we can't always bet on the name. Then Andor International. Then Commercial Data Servers, now a part of Xbridge Systems. Meanwhile the 1980s weren't kind to the company with his name on the masthead. The rise of Unix and first minicomputers, then standard servers, meant people were building all kinds of new devices. Amdahl started selling servers, given the new smaller and pluggable form factors. They sold storage. They sold software to make software, like IDEs. The rapid proliferation of networking and open standards let them sell networking products. Fujitsu ended up growing faster, and with Gene Amdahl gone, in the face of mounting competition with IBM, Amdahl tried to merge with Storage Technology Corporation, or StorageTek as it might be considered today.
CDC had pushed some of its technology to StorageTek during their demise, and StorageTek, in the face of this new competition, ended up filing Chapter 11 and getting picked up by Sun for just over $4 billion. But Amdahl was hemorrhaging money as we moved into the 90s. They sold off half the shares to Fujitsu, laid off over a third of their now 10,000-plus workforce, and by the year 2000 had been lapped by IBM on the high end of the market. They sold off their software division, and Fujitsu acquired the rest of the shares. Many of the customers then moved to the then-new IBM Z series servers that were coming out with 64-bit G3 and G4 chips, as opposed to the 31-bit chips Amdahl - by then Fujitsu, under the GlobalServer mainframe brand - sold. Amdahl came out of the blue, or Big Blue. On the back of Gene Amdahl's name and a good strategy to attack that S/360 market, they took 8% of the mainframe market from IBM at one point. But they sold to big customers and eventually disappeared as the market shifted to smaller machines and a more standardized lineup of chips. They were able to last for a while on the revenues they'd put together, but ultimately, without someone at the top with a vision for the future of the industry, they just couldn't make it as a standalone company. The High Performance Computing server revenues steadily continue to rise at Fujitsu though, hitting $1.3 billion in 2020. In fact, in a sign of the times, the 20 million Euro PRIMEHPC FX700 that's going to the Minho Advanced Computing Centre in Portugal is a petascale computer built on an ARM plus x86 architecture. My how the times have changed. But as components get smaller, more precise, faster, and more mass producible, we see the same types of issues with companies being too large to pivot quickly from the PC to the post-PC era. Although at this point, it's doubtful they'll have a generation's worth of runway from a patron like Fujitsu to be able to continue in business. Or maybe a patron who sees the benefits downmarket from the new technology that emerges from projects like this and takes on what amounts to nation-building to pivot a company like that. Only time will tell.
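Since Amdahl's Law anchors this story, here's a minimal Python sketch of the math. The formula is the standard one; the 95% parallel fraction is an arbitrary illustrative assumption, not a figure from the episode.

```python
# Amdahl's Law: speedup is capped by the serial fraction of the work,
# no matter how many CPUs you add.

def amdahl_speedup(parallel_fraction: float, n_cpus: int) -> float:
    """Speedup = 1 / ((1 - p) + p / n) for parallel fraction p and n CPUs."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cpus)

# Even with 95% of the work parallelizable (an illustrative number),
# 1024 CPUs buy less than a 20x speedup: diminishing returns in action.
for n in (1, 2, 8, 64, 1024):
    print(f"{n:>5} CPUs -> {amdahl_speedup(0.95, n):6.2f}x")
```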
What Day Is September 28? Today Is the Birthday of Confucius
EVENTS
189 – Dong Zhuo deposed Emperor Liu Bian and installed Liu Bian's younger brother Liu Xie on the throne as Emperor Xian of Han, the last emperor of the Eastern Han dynasty.
1928 – Alexander Fleming noticed a bacteria-killing mold growing in his laboratory, discovering what would later be called penicillin.
1951 – CBS offered the first color televisions for sale to the public, but the product was discontinued less than a month later.
1988 – The Ilyushin Il-96, a four-engine long-range wide-body airliner, made its first flight.
2006 – Suvarnabhumi International Airport in Samut Prakan Province, Thailand, officially began operations, replacing Don Mueang International Airport.
2008 – The Singapore Grand Prix was held as Formula One's first night race, with Fernando Alonso winning the event.
Holidays and observances: World Rabies Day (international)
BIRTHS
551 BC – Confucius, teacher, editor, politician, and philosopher of the Spring and Autumn period in Chinese history
1925 – Seymour Cray, American computer scientist, founder of the CRAY computer company (died 1996)
1926 – Nguyễn Cảnh Toàn, Vietnamese professor of mathematics
1936 – Emmett Chapman, American guitarist, inventor of the Chapman Stick
1938 – Ben E. King, American singer, songwriter, and producer
DEATHS
1914 – Richard Warren Sears, American businessman, co-founder of Sears (born 1863)
1956 – William Boeing, American businessman, founder of the Boeing Company (born 1881)
2016 – Shimon Peres, former Prime Minister and President of Israel, winner of the 1994 Nobel Peace Prize
The "What Day Is It Today" program is now available on Youtube, Facebook and Spotify:
- Facebook: https://www.facebook.com/aweekmedia
- Youtube: https://www.youtube.com/c/AWeekTV
- Spotify: https://open.spotify.com/show/6rC4CgZNV6tJpX2RIcbK0J
- Apple Podcast: https://podcasts.apple.com/.../h%C3%B4m-nay.../id1586073418
#aweektv #28thang9 #AlexanderFleming #Suvarnabhumi #SingaporeGrandPrix #KhổngTử #penicillin
All videos are the property of Adwell jsc; any reuse of our content without permission is prohibited. --- Send in a voice message: https://anchor.fm/aweek-tv/message
Supercomputers are big, they're noisy, and they use more energy than a small town. They don't look like much from the outside, but the inside tells a different story! Supercomputers are helping us solve some of the world's biggest problems, and they could be coming soon to a desktop near you...
Let's oversimplify something in the computing world. Which is what you have to do when writing about history. You have to put your blinders on so you can get to the heart of a given topic without overcomplicating the story being told. And in the evolution of technology we can't mention all of the advances that lead to each subsequent evolution. It's wonderful and frustrating all at the same time. And that value judgement of what goes in and what doesn't can be tough. Let's start with the fact that there are two main types of processors in our devices. There's the x86 chipset developed by Intel and AMD, and then there are the RISC-based processors, which include ARM and, for the old school people, PowerPC and SPARC. Today we're going to set aside the x86 chipset that was dominant for so long and focus on how the RISC, and so the ARM, family emerged. First, let's think about what the main difference is between ARM and x86. RISC, and so ARM, chips focus on reducing the number of instructions required to perform a task to as few as possible, and so RISC stands for Reduced Instruction Set Computing. Intel, other than the Atom series chips, has focused the x86 chips on high performance and high throughput. Big and fast, no matter how much power and cooling is necessary. The ARM processor uses simpler instructions, which means there's less logic, and so more instructions are required to perform certain logical operations. This increases memory use and can increase the amount of time to complete an execution, which ARM designers address with techniques like pipelining, or instruction-level parallelism, on a processor (there's a quick sketch of the arithmetic below). Seymour Cray pioneered this kind of overlap, splitting up instructions so each functional unit handles a different one, and the STAR, Amdahl, and then ARM designs implemented it as well. The x86 chips are Complex Instruction Set Computing chips, or CISC. Those will do larger, more complicated tasks, like floating point computations or memory searches, on the chip. That often requires more consistent and larger amounts of power. ARM chips are built for low power. The reduced complexity of operations is one reason, but it's also in the design philosophy. This means fewer heat sinks and often accounting for less consistent streams of power. This 130-watt x86 vs 5-watt ARM tradeoff can mean slightly lower clock speeds, but systems can cost less overall, as people spend less on heat sinks and power supplies. This also makes the ARM excellent for mobile devices. The inexpensive MOS 6502 chips helped revolutionize the personal computing industry in 1975, finding their way into the Apple II and a number of early computers. They were RISC-like but CISC-like as well. They took some of the instruction set architecture family from the IBM System/360 through to the PDP, Data General Nova, Intel 8080, Zilog, and so on; after the emergence of Windows, Intel finally captured the personal computing market and the x86 flourished. But the RISC architecture actually goes back to the ACE, designed in 1946 by Alan Turing. It wasn't until the 1970s that Carver Mead from Caltech and Lynn Conway from Xerox PARC saw that the number of transistors on chips was exploding while the methods for designing chips weren't keeping up. ARPA and other agencies needed more and more computing power, so they instigated what we now refer to as the VLSI project, a DARPA program initiated by Bob Kahn to push into the 32-bit world. They would provide funding to different universities, including Stanford and the University of North Carolina.
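To make the pipelining point concrete, here's a toy Python model of the cycle counts involved. The five-stage pipeline and the no-stalls assumption are illustrative simplifications, not a description of any particular ARM core.

```python
# Toy model of instruction pipelining (instruction-level parallelism).
# Assumptions: a classic 5-stage pipeline and no hazards or stalls;
# real cores must handle both.

STAGES = 5  # e.g. fetch, decode, execute, memory access, write-back

def cycles_unpipelined(n_instructions: int) -> int:
    # Each instruction occupies the whole processor for all five stages.
    return n_instructions * STAGES

def cycles_pipelined(n_instructions: int) -> int:
    # Once the pipeline fills, one instruction completes per cycle.
    return STAGES + (n_instructions - 1)

n = 1_000_000
print(cycles_unpipelined(n) / cycles_pipelined(n))  # approaches 5x for long runs
```

So even though a RISC design may need more instructions for the same task, overlapping them in a pipeline claws back most of the cost.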
Out of those projects, we saw the Geometry Engine, which led to a number of computer aided design, or CAD, efforts to aid in chip design. Those workstations, when linked together, evolved into tools used on the Stanford University Network, or SUN, which would effectively spin out of Stanford as Sun Microsystems. And across the bay at Berkeley we got a standardized Unix implementation that could use the tools being developed there, the Berkeley Software Distribution, or BSD, which would eventually become the operating system used by Sun and SGI, and lives on in OpenBSD and other variants. And the efforts from the VLSI project led to Berkeley RISC in 1980 and Stanford MIPS, as well as the multi-chip wafer. The leader of that Berkeley RISC project was David Patterson, who still serves as vice chair of the RISC-V Foundation. The chips would add more and more registers but with less specialization. This led to the need for more memory. But UC Berkeley students shipped a faster chip than was otherwise on the market in 1981. And the RISC II was usually double or triple the speed of the Motorola 68000. That led to the Sun SPARC and DEC Alpha. There was another company paying attention to what was happening in the RISC project: Acorn Computers. They had been looking into using the 6502 processor until they came across the scholarly works coming out of Berkeley about their RISC project. Sophie Wilson and Steve Furber from Acorn then got to work building an instruction set for the Acorn RISC Machine, or ARM for short. They had the first ARM working by 1985, which they used to build the Acorn Archimedes. The ARM2 would be faster than the Intel 80286, and by 1990, Apple was looking for a chip for the Apple Newton. A new company called Advanced RISC Machines, or ARM, would be founded, and from there they grew, with Apple being a shareholder through the 90s. By 1992, they were up to the ARM6, and the ARM610 was used for the Newton. DEC licensed the ARM architecture to develop the StrongARM, selling chips to other companies. Acorn would be broken up in 1998 and parts sold off, but ARM would live on until acquired by SoftBank for $32 billion in 2016. SoftBank is currently in acquisition talks to sell ARM to Nvidia for $40 billion. Meanwhile, John Cocke at IBM had been working on the RISC concepts since 1975 for embedded systems and by 1982 moved on to start developing their own 32-bit RISC chips. This led to the POWER instruction set, which they shipped in 1990 as the RISC System/6000, or as we called them at the time, the RS/6000. They scaled that down to the PowerPC and in 1991 forged an alliance with Motorola and Apple. DEC designed the Alpha. It seemed as though the computer industry was Microsoft and Intel vs the rest of the world, using a RISC architecture. But by 2004 the alliance between Apple, Motorola, and IBM began to unravel, and by 2006 Apple moved the Mac to an Intel processor. But something was changing in computing. Apple shipped the iPod back in 2001, effectively ushering in the era of mobile devices. By 2007, Apple released the first iPhone, which shipped with a Samsung ARM. You see, the interesting thing about ARM is they don't fab chips, like Intel - they license technology and designs. Apple licensed the Cortex-A8 from ARM for the iPhone 3GS by 2009 but had an ambitious lineup of tablets and phones in the pipeline. And so in 2010 Apple did something new: they made their own system on a chip, or SoC.
Continuing to license some ARM technology, Apple pushed on, getting between 800 MHz and 1 GHz out of the chip and using it to power the iPhone 4, the first iPad, and the long overdue second-generation Apple TV. The next year came the A5, used in the iPad 2 and first iPad Mini, then the A6 at 1.3 GHz for the iPhone 5, and the A7 for the iPhone 5s and iPad Air. That was the first 64-bit consumer SoC. In 2014, Apple released the A8 processor for the iPhone 6, which came in speeds ranging from 1.1 GHz to the 1.5 GHz chip in the 4th generation Apple TV. By 2015, Apple was up to the A9, which clocked in at 1.85 GHz for the iPhone 6s. Then we got the A10 in 2016, the A11 in 2017, the A12 in 2018, the A13 in 2019, and the A14 in 2020 with neural engines, 4 GPUs, and 11.8 billion transistors, compared to the 30,000 in the original ARM. And it's not just Apple. Samsung has been on a similar tear, firing up the Exynos line in 2011 and continuing to license ARM designs up to the Cortex-A55, with similar features to the Apple chips, namely used on the Samsung Galaxy A21. And the Snapdragon. And the Broadcoms. In fact, the Broadcom SoC was used in the Raspberry Pi (developed in association with Broadcom) in 2012. The 5 models of the Pi helped bring on a mobile and IoT revolution. And so nearly every mobile device now ships with an ARM chip, as do many a device we place around our homes so our digital assistants can help run our lives. Over 100 billion ARM processors have been produced, well over 10 for every human on the planet. And the number is about to grow even more rapidly. Apple surprised many by announcing they were leaving Intel to design their own chips for the Mac. Given that the PowerPC chips were RISC, the ARM chips in the mobile devices are RISC, and the history Apple has with the platform, it's no surprise that Apple is going back in that direction with the M1, Apple's first system on a chip for a Mac. And the new MacBook Pro screams. Even software running in Rosetta 2 on my M1 MacBook is faster than on my Intel MacBook. And at 16 billion transistors, with an 8-core GPU and a 16-core neural engine, I'm sure developers are hard at work developing the M3 on these new devices (since, you know, I assume the M2 is done by now). What's crazy is, I haven't felt like Intel had a competitor other than AMD in the CPU space since Apple switched from the PowerPC. Actually, those weren't great days. I haven't felt that way since I realized no one but me had a DEC Alpha, or when I took the SPARC off my desk so I could finally play Civilization. And this revolution has been a constant stream of evolutions, 40 years in the making. It started with an ARPA grant, but various evolutions from there died out. And so really, it all started with Sophie Wilson. She helped give us the BBC Micro and the ARM. She was part of the move to Element 14 from Acorn Computers and then ended up at Broadcom when they bought the company in 2000, and she continues to act as the Director of IC Design. We can definitely thank ARPA for sprinkling funds around prominent universities to get us past 10,000 transistors on a chip. Given that chips continue to advance at such a lightning pace, I can't imagine where we'll be in another 40 years. But we owe her (and her coworkers at Acorn and the team at VLSI, now NXP Semiconductors) for their hard work and innovations.
The Cabin is presented by the Wisconsin Counties Association; this week we're featuring Sheboygan County. Campfire Conversation: This week we're talking about some legendary celebrities that hail from Wisconsin; there are more than you may think! We're talking about: Mark Ruffalo, Butch Vig, Liberace, Jane Wiedlin, Les Paul, Steve Miller, Harry Houdini, Willem Dafoe, Jim Abrahams, David Zucker, Jerry Zucker, Frank Lloyd Wright, Jane Kaczmarek, Al Jarreau, Dick Trickle, Jim Lovell, Seymour Cray, Kurtwood Smith, Gene Wilder, JJ Watt, and Chris Farley. Behind-the-Scenes: We're taking it wayyyy back and talking about Old World Wisconsin. For centuries, people left old worlds for a new life in Wisconsin. They faced unfamiliar landscapes and languages, learned to establish farms and communities, and blended the old with the new. Here are some of our favorite spots and activities: 1880s High Wheel Bikes, Blacksmithing, Historical Baseball Games, Collecting Eggs, Historical Chores, Historical Beer Brewing, Kettle Moraine Trail. Know Your Wisconsin: County Parks
Today we're going to talk through the history of Cray Computers. And really, this is then a history of supercomputers during Seymour Cray's life. If it's not obvious by his name, he was the founder of Cray. But before we go there, let's back up a bit and talk about some things that were classified for a long time. The post-World War II spending by the US government definitely leveled up the US computer industry. And defense was the name of the game in those early years. Once upon a time, the computer science community referred to the Minneapolis/St Paul area as the Land of 10,000 Top Secret Projects. And a lot of things ended up coming out of that. One of the most important in the history of computing, though, was Engineering Research Associates, or ERA. They built highly specialized computers, ones made for breaking Soviet codes. Honeywell had been founded in Minneapolis and, as with Vannevar Bush, had gone from thermostats to computers. Honeywell started pumping out the DATAmatic 1000 in 1957. With a computer shipping, Honeywell was well situated to capitalize on the growing mainframe computer market. ERA had some problems because the owners were embroiled in Washington politics, and so they were acquired by Sperry Rand, today's Unisys, but at the time one of the larger mainframe developers and the progeny of both the Harvard Mark series and ENIAC series of mainframes. The only problem was that the Sperry Rand crew were making a bundle off Univacs and so didn't put money into forward-looking projects. The engineers knew that there were big changes coming in computing. And they wanted to be at the forefront. Who wouldn't? But with Sperry Rand barely keeping up with orders, they couldn't focus on R&D the way many former ERA engineers wanted to. So many of the best and brightest minds from ERA founded Control Data Corporation, or CDC. And CDC built some serious computers that competed with everyone at the time. Because they had some seriously talented engineers. One, who had come over from ERA, was Seymour Cray. And he was a true visionary. And so you had IBM and their seven biggest competitors, known as Snow White and the Seven Dwarfs. Three of those dwarfs were doing a lot of R&D in Minneapolis (or at least the Minneapolis area). None are still based in the Twin Cities. But all three built ruggedized computers that could withstand nuclear blasts, corrosive elements, and anything you could throw at them. But old Seymour. He wanted to do something great. Cray had a vision of building the fastest computer in the world. And as luck would have it, transistors were getting cheaper by the day. They had initially been made with germanium, but Seymour Cray worked at CDC to repackage them with silicon and was able to pack enough in to make the CDC 6600 the fastest computer in the world in 1964. They had leapfrogged the industry and went to market, selling the machines like hotcakes. Now, CDC would build one of the first real supercomputers in that 6600. And supercomputers are what Cray is known for today. But there's a little more drama to get from CDC to Cray, and then honestly from Cray to the other Crays that Seymour founded. CDC went into a bit of a buying tornado as well. As with the Univacs, they couldn't keep up with demand and so suddenly were focused too much on Development to look beyond fulfillment and shipping and into the Research part of R&D.
Additionally, shipping all those computers and competing with IBM was rough, and CDC was having financial problems, so CEO William Norris wouldn't let them redesign the 6600 from the ground up. But Cray saw massive parallel processing as the future, which is kinda' what supercomputing really is at the end of the day, and was bitten by that bug. He wanted to keep building the fastest computers in the world. And he would get his wish. He finally left CDC in 1972 and founded Cray Research along with cofounding engineer Lester Davis. They went to Chippewa Falls, Wisconsin. It took him four years, but Cray shipped the Cray-1 in 1976, which became the best selling supercomputer in history (which means they sold more than 80 and less than a hundred). It ran at 80 MHz, good for roughly 160 megaFLOPS. And that was with vector processing: they would math faster by re-arranging the memory and registers to more intelligently process big amounts of data. He used Maxwell's equations on his boards. He designed it all on paper. The first Cray-1 would ship to Los Alamos National Laboratory. The Cray-1 was 5 and a half tons, cost around $8 million in 1976 money, and the fact that it was the fastest computer in the world, combined with the fact that it was space age looking, gave Seymour Cray instant star status. The Cray-1 would soon get competition from the ILLIAC IV out of the University of Illinois, an ARPA project. So Cray got to work thinkin'. He liked to dig when he thought, and he tried to dig a tunnel under his house. This kinda' sums up what I think of Wisconsin. The Cray-2 would come in 1985, which was Seymour Cray's first multiple-CPU design. It came in at 1.9 gigaflops. They rearranged memory to allow for more parallelization and used two sets of memory registers. It effectively set the stage for modern processing architectures in a lot of ways, with a dedicated foreground processor offloading tasks to CPUs connected to main memory over the fastest channels possible. But IBM wouldn't release the first real multi-core processor until 2001. And we see this with supercomputers. The techniques used in them come downmarket over time. But some of the biggest problems were how to keep the wires close together. The soldering of connectors at that level was nearly impossible. And the thing was hot. So they added, get this, liquid coolant, leading some people to call the Cray-2 "Bubbles." By now, Seymour Cray had let other people run the company, and there were competing projects like the Cray X-MP underway. Almost immediately after the release of the Cray-2, Seymour moved to working on the Cray-3, but the project was abandoned and, again, Cray found himself wanting to just go do research without shifting priorities dictating what he could do. But Seymour always knew best. Again, he's from Wisconsin. So he left the company with his name and started another company, this one called Cray Computer, where he did manage to finish the Cray-3. But that Cold War spending dried up. And while he thought of designs for a Cray-4, the company would go bankrupt in 1995. He was athletic and healthy, so in his 70s, why not keep at it? His next company would focus on massively parallel processing, which would be the trend of the future, but Seymour Cray died from complications from a car accident in 1996. He was one of the great pioneers of the computing industry.
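The vector idea mentioned above is easy to sketch in modern terms. Here's a minimal Python illustration, with NumPy's array operations standing in for vector registers and pipelined functional units; the array size is arbitrary, and this is an analogy, not a model of actual Cray hardware.

```python
# Scalar vs vector processing, sketched with NumPy as a stand-in for
# vector hardware. The array size is an arbitrary illustrative choice.
import numpy as np

a = np.random.rand(100_000)
b = np.random.rand(100_000)

# Scalar style: one multiply per loop iteration, paying per-element
# instruction overhead each time.
out = np.empty_like(a)
for i in range(len(a)):
    out[i] = a[i] * b[i]

# Vector style: one operation expressed over the whole array; the data
# streams through optimized inner loops instead of per-element dispatch.
out_vec = a * b

assert np.allclose(out, out_vec)
```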
He set a standard that computers like IBM's Blue Gene and then Summit, China's Sunway TaihuLight, Dell's Frontera, HPE's Cray machines, Fujitsu's ABCI, and Lenovo's SuperMUC-NG carry on. Those run at between 20 petaflops and close to 150 petaflops. Today, the Cray X1E pays homage to its ancestor, the great Cray-1. But no one does it with style the way the Cray-1 did. And think about this: Moore's Law says transistors will double every two years. Not to oversimplify things, but that means that since the Cray-2 we should have had a roughly 262-teraflop machine by now. But I guess he's not here to break down the newer barriers like he did with the von Neumann bottleneck. Also, think about this: those early supercomputers were funded by the departments that became the NSA. They even helped fund the development of Crays throughout history. So maybe the really wild numbers have been hit and are just classified. I swoon at that thought. But maybe it's just that this is where the move from bits to qubits and quantum computing becomes the next significant jump. Who knows? But hey, thanks for joining me on this episode of the History of Computing Podcast. Do you have a story you want to tell? I plan to run more interviews soon, and while we have a cast of innovators that we're talking to, we'd love even more weird and amazing humans. Hit us up if you want to! And in the meantime, thanks again for listening; we are so lucky to have you.
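For the curious, here's that doubling arithmetic made explicit as a minimal Python sketch. The 1985 Cray-2 baseline of 1.9 gigaflops is from the episode; the end year and the "performance doubles every two years" reading of Moore's Law are illustrative assumptions.

```python
# Back-of-the-envelope Moore's Law extrapolation from the Cray-2.
# Assumptions: 1.9 gigaflops in 1985 (from the episode), performance
# doubling every two years, and 2019 as "now".

base_gflops = 1.9
doublings = (2019 - 1985) // 2  # 17 doublings

projected_gflops = base_gflops * 2 ** doublings
print(f"{projected_gflops:,.0f} gigaflops, about {projected_gflops / 1e3:,.0f} teraflops")
# ~249,000 gigaflops, i.e. a few hundred teraflops; the petaflop-class
# machines named above blew well past that, so the extrapolation is conservative.
```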
Computers have undoubtedly evolved from their humble beginnings to the lifeline super machines they are today. Computer Museum of America (CMoA) Founder Lonnie Mimms and Vice President Karin Mimms join this week's Around Atlanta segment of Atlanta Real Estate Forum Radio to discuss the museum's mission, artifacts and exhibits with co-hosts Carol Morgan and Todd Schnick. What initially started as an effort to collect and protect vintage computers has evolved into one of the world's most all-inclusive collections of computing artifacts, helping preserve the history of computing for generations to come. CMoA serves as a permanent record of the computer evolution process and the market experiments that drove discovery forward. Visitors of all ages enjoy engaging exhibits that share stories and highlight artifacts throughout the history of computing. In addition to temporary pop-up exhibits, CMoA has loaned rare artifacts to other museums, including the Smithsonian Institution. Current exhibits include: A Tribute to Apollo 11: Beginning with an animated documentary, Getting to the Moon and Back, A Tribute to Apollo 11 shows the types of computers that NASA used, from an IBM 3420 and a front panel of an IBM 360 to modular computer systems and more. Completely immerse yourself in the history of rocketry, the race to space and 3D views of space from the Apollo missions. Supercomputing: Vanquishing the Impossible: Serving as a tribute to the father of supercomputing, Seymour Cray, this exhibit displays more than 70 supercomputers, showcasing weather prediction, artificial intelligence, cybersecurity and more. Artifacts range from the iconic Cray-1A, the Pixar machine and Sun Microsystems hardware to the Connection Machine 2 and a life-size mural of the IBM Summit. Timeline of Computer History: Beginning with the catalysts of the digital age, including an abacus, slide rules, a rotary telephone and a transistor radio, this exhibit takes you through the decades with its artifacts of the digital past, which include a Datapoint 2200, an original Apple-1, the infamous RadioShack TRS-80, a rare Apple Lisa 1, a Game Boy, a Nintendo, a Commodore 64 and more. Byte Magazine Wall Collection: A true display of nostalgia, CMoA also features a complete collection of Byte magazine covers, including special editions. Step back in time and see how Byte magazine covers told stories before the magazine moved toward being a product catalog. Two original Robert Tinney artwork covers are on display in the collection. The computer museum offers a fun, thought-provoking way to learn about the digital age. To learn more about CMoA, including its unique artifacts, plans to expand and more, listen to the complete interview above or visit www.ComputerMuseumOfAmerica.org. A special thank you to Jackson EMC for sponsoring Atlanta Real Estate Forum Radio. Jackson EMC offers homebuyers peace of mind and lower bills with its certified Right Choice™ new home program. These homes are built to be energy efficient and sustainable with improved indoor air quality, convenience and comfort. For more information on Right Choice new homes and Jackson EMC, visit https://RightChoice.JacksonEMC.com. Please subscribe to Atlanta Real Estate Forum Radio on iTunes. If you like this week's show, be sure to rate it. The "Around Atlanta" segment, sponsored by Denim Marketing, is designed to showcase the best of metro Atlanta: the communities, attractions and special events that make this city great.
To submit your event, community or attraction to the Around Atlanta edition of Atlanta Real Estate Forum Radio, contact Denim Marketing at 770-383-3360 or fill out the Atlanta Real Estate Forum contact form here.
Gabriel Broner hosts Irene Qualters to discuss her career and the evolution of HPC. Irene, an HPC pioneer, went from being a young female engineer working with Seymour Cray to become president of Cray. She then reinvented herself to work in the pharma space and then at the National Science Foundation. She was awarded the 2018 HPCwire Readers’ Award for Outstanding Leadership in HPC.
On Monday there was an earthquake in Caudete, Albacete, which was felt in Yecla, and seismic activity has been recorded in the Lorca area in recent days. We wanted to analyze these movements through geology, the science that studies the composition, structure, and movements of the Earth. The movements that were felt are normal in an active seismic zone like Murcia's, according to José Martínez, professor of Geology at the Universidad Complutense de Madrid. He adds that the Region of Murcia sees earthquakes of magnitude 1 to 3.5 because it is an active seismic zone, and he points out that there are movements almost daily that we don't perceive. According to the professor, geologists cannot predict when a major earthquake will occur, but they study past ones to characterize the activity. They are now investigating one that struck Lorca in 1674, along with others of greater magnitude than the 2011 quake, which, you may recall, measured 5.1. They investigate by analyzing the Alhama fault and the traces the earthquake left in the landscape. Why is Murcia a seismically active zone? The southeast of Spain sits very close to the European and African plates, which converge about 4 or 5 millimeters per year. That stress concentrates in the faults, which have to release it through earthquakes every so often. However, the most seismically active zone on the planet is not southeastern Spain; it is the Pacific Ring of Fire, above all Chile, Mexico, California, and the coasts of Japan and New Zealand. We also interview Mateo Valero, winner of the equivalent of the Nobel Prize in computer architecture, the Seymour Cray supercomputing award given by the IEEE Computer Society, and director of the Barcelona Supercomputing Center (Centro Nacional de Supercomputación), who visited the Region to give talks to secondary school students in Cartagena and Molina de Segura, organized by the Fundación de Estudios Médicos de Molina de Segura.
Fredrik sounds a bit choppy at times; that is entirely his own fault. Jocke quotes the wrong person; that is entirely his own fault. 0: Örnsköldsvik, supercomputers, fiber, and SAN. 21:47: Datormagazin has a BBS again! 30:17: IKEA Trådfri. 35:58: Apple has changed the icon for Maps. 36:25: Possible new CMSes for Macpro. 44:11: An important email, and ways to sponsor the podcast. If you don't want to use Patreon but still want to donate money, contact us for Swish details. 47:32: Fredrik has finally watched Mr Robot! Spoiler warning from 48:49. 52:59: Fredrik listens to talk about blockchains and sees the potential for bubbles. 1:01:50: Discord kicks out Nazis; Trump is awful. 1:10:19: Chris Lattner goes to Google Brain, and app sizes are ridiculous. 1:15:05: Jocke tries to build a new web cluster. 1:21:29: Jocke reviews his new USB hub. Links: Nationellt superdatorcentrum (the Swedish National Supercomputer Centre), SGI Origin 3200, Silicon Graphics, Cray, Seymour Cray, "the first computer worth criticizing" (apparently a quote from Alan Kay), Be and BeOS, InfiniBand, Fibre Channel, the R12000 processor, Ernie Bayonne, Jocke's supercomputer loot, CrayLink (also known as NUMAlink), Promise Thunderbolt-to-Fibre Channel adapter, Ali - AliExpress, Datormagazin BBS is back!, Fabbes BBS, SUGA, A590, TerribleFire, the Vampire accelerators, FPGA, Plipbox, SD2IEC (the 1541 emulator Jocke ordered from Poland), Satandisk, IKEA Trådfri, the article on MacRumors, AAPL's newsletter, Apple has changed the icon for Maps, Grav, Jekyll, Blosxom, SourceForge (where some people used to put their code), Ilir (a thousand thanks, dear OnePlus 5 sponsor), you can support the podcast on Patreon, but only if you want to, Mr Robot, the Incomparable episode about Mr Robot, we talked a little about blockchains in episode 67, Discord kicks out Nazis, Cloudflare too, Tim Cook's letter to the employees, the video clip where Anderson Cooper matter-of-factly takes Trump apart, Chris Lattner starts working at Google Brain, app sizes are still ridiculous, code is a depressingly large part of the Facebook app's file size, Acorn, Alpine Linux, PHP-FPM, Nginx, WP Super Cache, Varnish, Docker, OpenBSD, Ballmer peak, Jocke reviews his USB hub, Henge Dock, Jocke's USB graphics card. Full episode information is available here: https://www.bjoremanmelin.se/podcast/avsnitt-90-superdatorer-med-inbyggda-soffor.html.
We are back with a Feedback-a-palooza show for the fifteenth episode of NaPodPoMo 2012, November 15, 2012. Facebook feedback about TechPhx. Which 3 foods will Mother Superior never eat? Seymour Cray – Rockstar!!! Blue Laws – Origins. More attention from the presidential candidates and the 420 crowd for Colorado than it needs. Biodiesel for your used cooking oil? Put your box in the closet [...]