Leonard Kleinrock is considered one of the founding fathers of the internet, and he has served on the UCLA faculty for over 60 years. Podcasts contributor Marty Johnson sits down with Kleinrock to discuss his life, his career, and the advice he has for UCLA students.
Dr. Leonard Kleinrock is a computer scientist who developed the mathematical theory behind packet switching. He sent the first message between two computers on a network that was a precursor of the Internet. He is a Distinguished Professor of Computer Science at UCLA. He has published over 250 papers and authored six books. Follow the My Fame, Explained podcast on Facebook and Instagram.
Since January 1, 1983, the Internet has run exclusively on TCP/IP, completely displacing the predecessor protocol, NCP. For that reason, January 1, 1983 is often called the birthday of the modern Internet - not least by one of the developers of the protocols. In this episode we celebrate 40 years of the modern Internet and follow its development from the beginnings up to January 1, 1983. This is the early history of the Internet. More about Neulich im Netz at https://www.neulich-im.net/ music by scottholmesmusic.com Sources: Marking the birth of the modern-day Internet, Vint Cerf; Vint Cerf, Internet Hall of Fame; Paul Baran, Internet Hall of Fame; Paul Baran and the Origins of the Internet, RAND Corporation; Donald Davies, Internet Hall of Fame; A Brief History of the Internet, ISOC; J.C.R. Licklider, Internet Hall of Fame; "Man-Computer Symbiosis" and "The Computer as a Communication Device"; Robert Taylor, Internet Hall of Fame; Lawrence Roberts, Internet Hall of Fame; Multiple Computer Networks and Intercomputer Communication, Lawrence Roberts; The Birth of the Internet, Leonard Kleinrock; Peter Kirstein, Internet Hall of Fame; The ALOHAnet - Surfing for Wireless Data; Cerf and Kahn, A Protocol for Packet Network Intercommunication; Specification of Internet Transmission Control Program; NCP/TCP Transition Plan. --- Send in a voice message: https://podcasters.spotify.com/pod/show/neulich-im-netz/message
What's your favorite podcast? The perfect ANSWER would be Parlandom, but we'll settle for being in your top 3 ;) | Book: Me -> Cara E., perché non mi rispondi? P. S. Segue lettera - Paula Danziger and Ann M. Martin | Pi -> I dolori del giovane Werther (The Sorrows of Young Werther) - Johann Wolfgang Goethe || Song: Me -> Blowin' in the Wind - Bob Dylan -> https://spoti.fi/3SMjGqt | Risposta non c'è - lyrics by Mogol -> https://youtu.be/O_3QJypgJNs | Pi -> Standing Ovation - Grido -> https://youtu.be/tXqUyuJa54c || Film: Me -> The Imitation Game - directed by Morten Tyldum | Pi -> Slumdog Millionaire - directed by Danny Boyle | Find us on • Facebook: https://www.facebook.com/parlandompodcast • Instagram: https://www.instagram.com/parlandom_podcast/ • Telegram: https://t.me/parlandom • Spotify playlist: Parole d'Autore - Canzoni random del Parlandom Podcast -> https://spoti.fi/3zoBO2p You can review us from your phone on Spotify, or at: https://podcasts.apple.com/it/podcast/parlandom-parole-random/id1508896821 | Sources: https://it.wikipedia.org/wiki/Messaggio_di_Arecibo, https://it.wikipedia.org/wiki/Leonard_Kleinrock, https://www.treccani.it/vocabolario/risposta/ || Theme: Whiskey Blues - Ilya Truhanov - https://icons8.com/music/author/ilya-truhanov-1
October 29, 1969, 10:30 p.m.: Leonard Kleinrock is at the SDS Sigma 7 computer with his student programmer Charley Kline in the computer room of the computer science department at UCLA, the University of California, Los Angeles.
Qualcomm is the world's largest fabless semiconductor designer. The name Qualcomm is a mashup of "quality" and "communications," and communications has been a hallmark of the company since its founding. They began in satellite communications, and today nearly every smartphone has a Qualcomm chip. The ubiquity of communications in our devices and everyday lives has helped them reach a $182 billion market cap as of the time of this writing. Qualcomm began with far humbler beginnings. They emerged out of a company called Linkabit in 1985. Linkabit was started by Irwin Jacobs, Leonard Kleinrock, and Andrew Viterbi - all three former graduate students at MIT. Viterbi moved to California to take a job with JPL in Pasadena, where he worked on satellites. He then went off to UCLA, where he developed what we now call the Viterbi algorithm for encoding and decoding digital communications. Jacobs worked on a book called Principles of Communication Engineering after getting his doctorate at MIT. Jacobs then took a year of leave to work at JPL after he met Viterbi in the early 1960s and the two hit it off. By 1966, Jacobs was a professor at the University of California, San Diego. Kleinrock was at UCLA by then, and the three realized they had too many consulting efforts between them; if they consolidated the requests, they could pool their resources. Eventually Jacobs and Viterbi left, and Kleinrock got busy working on the first ARPANET node when it was installed at UCLA. Jerry Heller, Andrew Cohen, Klein Gilhousen, and James Dunn eventually moved into the area to work at Linkabit, and by the 1970s Jacobs was back to help design telecommunications for satellites. They'd been working to refine the theories from Claude Shannon's time at MIT and Bell Labs and were some of the top names in the industry doing that work. And the space race needed a lot of this type of work. They did their work on Scientific Data Systems computers in an era before that company was acquired by Xerox. Much as Claude Shannon got started thinking about data loss as it pertains to information theory while trying to send telegraphs over barbed wire, they refined that work thinking about sending images from Mars to Earth. Others from MIT worked on other space projects as part of various missions. Many of those early employees were Viterbi's PhD students, and they were joined by Joseph Odenwalder, who took Viterbi's decoding work and combined it with a previous dissertation out of MIT when he joined Linkabit. That got used in the Voyager space probes and put Linkabit on the map. They were hiring some of the top talent in digital communications and could promote not only the chance to work with some of the top minds in the industry but also the fact that they were in beautiful San Diego, which appealed to many in the Boston and MIT communities weathering harsh winters. As solid-state electronics got cheaper and transistors were packed more densely onto wafers, they were able to build hardware and software for military applications, shrinking digital signal processing that had previously required an SDS Sigma into smaller and smaller form factors like the Linkabit Microprocessor, which put Viterbi's algorithm for encoding data onto a breadboard and then a chip. The work continued with defense contractors and suppliers. They built modulation and demodulation for UHF signals for military communications. That evolved into a Command Post Modem/Processor they sold, or CPM/P for short.
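To make the Viterbi algorithm a little more concrete: it recovers the most likely transmitted bit sequence from a noisy channel by tracking, for each encoder state, the cheapest path through a trellis of possibilities. Here is a minimal, illustrative Python sketch - not Linkabit's implementation - using the textbook rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 in octal):

```python
# A minimal sketch of hard-decision Viterbi decoding (illustrative only,
# not Linkabit's implementation) for the classic rate-1/2 convolutional
# code with constraint length 3 and generator polynomials 7 and 5 (octal).

def encode(bits, state=0):
    """Convolutionally encode bits; state holds the two previous input bits."""
    out = []
    for b in bits:
        reg = (b << 2) | state                    # 3-bit register: newest bit first
        out += [bin(reg & 0b111).count("1") % 2,  # parity tap 111 (octal 7)
                bin(reg & 0b101).count("1") % 2]  # parity tap 101 (octal 5)
        state = reg >> 1                          # slide the register forward
    return out

def viterbi_decode(received, n_bits):
    """Find the input bits whose encoding is closest in Hamming distance."""
    INF = float("inf")
    metrics, paths = {0: 0}, {0: []}              # cost and survivor path per state
    for t in range(n_bits):
        r = received[2 * t:2 * t + 2]
        new_metrics, new_paths = {}, {}
        for state, cost in metrics.items():
            for b in (0, 1):                      # hypothesize each input bit
                reg = (b << 2) | state
                expect = [bin(reg & 0b111).count("1") % 2,
                          bin(reg & 0b101).count("1") % 2]
                branch = sum(e != x for e, x in zip(expect, r))
                nxt = reg >> 1
                if cost + branch < new_metrics.get(nxt, INF):
                    new_metrics[nxt] = cost + branch
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[min(metrics, key=metrics.get)]   # lowest-cost survivor wins

msg = [1, 0, 1, 1, 0]
tx = encode(msg + [0, 0])                         # two tail bits flush the encoder
rx = tx[:]
rx[3] ^= 1                                        # corrupt one channel bit
assert viterbi_decode(rx, len(msg) + 2)[:len(msg)] == msg
```

Real decoders add soft decisions, traceback windows, and hardware pipelines, but this survivor-path bookkeeping is the core of what Linkabit eventually squeezed onto a breadboard and a chip.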
They made modems for the military in the 1970s, some of which remained in production until the 1990s. And as they turned the corner into the 1980s, they had more than $10 million in revenue. The UC San Diego program grew in those years, and the Linkabit founders had more and more local talent to choose from. Linkabit developed tools to facilitate encoded communications over commercial satellites as well. They partnered with companies like IBM and developed smaller business units they were able to sell off. They also developed a tool they called VideoCipher to encode video, which HBO and others used to do what we later called scrambling on satellite signals. As we rounded the corner into the 1990s, though, they turned their attention to cellular services with TDMA (Time-Division Multiple Access), an early alternative to CDMA. Along the way, Linkabit got acquired by a company called M/A-COM in 1980 for $25 million. The founders liked that the acquirer was run by a fellow MIT PhD, and Linkabit stayed separate but grew quickly with the products they were introducing. As with most acquisitions, the culture changed, and by 1985 the founders were gone. The VideoCipher and other units were sold off, spun off, or people just left and started new companies. Information theory was decades old at this point, plenty of academic papers had been published, and everyone who understood the industry knew that digital telecommunications was about to explode; a perfect storm for defections.

Qualcomm

Over the course of the next few years over two dozen companies were born as the alumni left, and by 2003, 76 companies had been founded by Linkabit alumni, including four that went public. One of those companies brought back Linkabit founders Irwin Jacobs and Andrew Viterbi: begun in 1985, Qualcomm is also based in San Diego. The founders had put information theory into practice at Linkabit and seen that the managers who were great at finance just weren't inspiring to scientists. Qualcomm began with consulting and research, but this time looked for products to take to market. They merged with a company called Omninet and the two released the OmniTRACS satellite communication system for trucking and logistics companies. They landed Schneider National and a few other large customers and grew to over 600 employees in those first five years. It remained a Qualcomm subsidiary until recently. Even with tens of millions in revenue, they operated at a loss while researching what they knew would be the next big thing. Code-Division Multiple Access, or CDMA, is a technology that lets many users share the same band of radio spectrum at the same time: each transmitter spreads its signal with its own code, and each receiver uses that code to pull one signal out of the pile without much interference. The original research began all the way back in the 1930s, when Dmitry Ageyev in the Soviet Union researched the theory of code division of signals at the Leningrad Electrotechnical Institute of Communications. That work was furthered during World War II by German researchers like Karl Küpfmüller and Americans like Claude Shannon, who focused more on the information theory of communication channels. People like Lee Yuk-wing then took the cybernetics work from pioneers like Norbert Wiener and helped connect it with the work of others like Qualcomm's Jacobs, a student of Lee's when Lee was a professor at MIT. They were already working on CDMA jamming in the early 1950s at MIT's Lincoln Lab.
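Here's a toy illustration of that code-division idea, assuming idealized orthogonal Walsh codes and a noiseless channel (real CDMA contends with power control, timing, and noise): two users transmit simultaneously on the same band, and each receiver recovers its own bits by correlating against its own spreading code.

```python
# A minimal sketch of CDMA spreading and despreading with orthogonal
# Walsh codes on an idealized, noiseless shared channel.

WALSH = {"alice": [1, 1, 1, 1], "bob": [1, -1, 1, -1]}   # orthogonal chip codes

def spread(bits, code):
    """Map each data bit to +/-1 and multiply it across the chip sequence."""
    return [(1 if b else -1) * c for b in bits for c in code]

def shared_channel(*signals):
    """On air, simultaneous transmissions simply add together."""
    return [sum(chips) for chips in zip(*signals)]

def despread(channel, code):
    """Correlate each chip-length window with the code; the sign gives the bit."""
    n = len(code)
    bits = []
    for i in range(0, len(channel), n):
        corr = sum(x * c for x, c in zip(channel[i:i + n], code))
        bits.append(1 if corr > 0 else 0)
    return bits

alice_bits, bob_bits = [1, 0, 1], [0, 0, 1]
air = shared_channel(spread(alice_bits, WALSH["alice"]),
                     spread(bob_bits, WALSH["bob"]))
assert despread(air, WALSH["alice"]) == alice_bits
assert despread(air, WALSH["bob"]) == bob_bits
```

Because the two codes are orthogonal, each user's correlation cancels the other user's contribution to zero, which is why the simultaneous transmissions don't trample each other.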
Another Russian, named Leonid Kupriyanovich, put the concept of CDMA into practice in the late 1950s so the Soviets could track people using a service they called Altai. That made it perfect for tracking trucks, and within a few years, in 1965, it was released as a pre-cellular radiotelephone network that got bridged to standard phone lines. The Linkabit and then Qualcomm engineers had worked closely with satellite engineers at JPL, then Hughes, and other defense and then commercial contractors. They'd come into contact with that work and built their own intellectual property over decades. Bell was working on mobile, or cellular, technologies. Ameritech Mobile Communications launched the first US 1G network, based on the Advanced Mobile Phone System (AMPS), in 1983, and Vodafone launched their first service in the UK in 1985. Qualcomm filed their first patent for CDMA the next year. That patent is one of the most cited documents in all of technology. Qualcomm worked closely with the Federal Communications Commission (FCC) in the US and with industry consortiums, such as the CTIA, or Cellular Telephone Industries Association. Meanwhile Ericsson promoted the TDMA standard, claiming it was more standard; however, Qualcomm worked on additional patents and got to the point that they licensed their technology to early cell phone providers like Ameritech, who was one of the first to switch from the TDMA standard Ericsson promoted to CDMA. Other carriers switched to CDMA as well, which gave them data to prove the technology worked. The OmniTRACS service helped with revenue, but they needed more. So they filed for an initial public offering in 1991 and raised over $500 million in funding between then and 1995, when they sold another round of shares. By then, they had done the work to get CDMA encoding on a chip and it was time to go to the mass market. They made double what they raised back in just the first two years, reaching over $800 million in revenue in 1996.

Qualcomm and Cell Phones

One of the reasons Qualcomm was able to raise so much money in two substantial rounds of public funding is that the test demonstrations were going so well. They deployed CDMA in San Diego, New York, Hong Kong, and Los Angeles, and within just a few years had over a dozen carriers running substantial tests. The CTIA supported CDMA as a standard in 1993, and by 1995 they went from tests to commercial networks. The standard grew in adoption from there. South Korea standardized on CDMA between 1993 and 1996. The CDMA standard was embraced by PrimeCo in 1995, who used the 1900 MHz PCS band. PrimeCo was a joint venture between a number of vendors, including regional carriers spun off in the breakup of AT&T, and represented interests from Cox Communications and Sprint; it turned out to be a large undertaking. It was also the largest cellular launch, with services going live in 19 cities, and the first phones were from a joint venture between Qualcomm and Sony. Most of PrimeCo's assets were later merged with AirTouch Cellular and Bell Atlantic Mobile to form what we now know as Verizon Wireless. Along the way, there were a few barriers to mass proliferation of the Qualcomm CDMA standards. One is that they made phones. The Qualcomm Q cost them a lot to manufacture, and it was a market with a lot of competitors who had cheaper manufacturing ecosystems. So Qualcomm sold the manufacturing business to Kyocera, who continued to license Qualcomm chips.
Now they could shift all of their focus to encoding bits of data to be carried over multiple radio channels and do their part in paving the way for 2G and 3G networks with the chips that went into most phones of the era. Qualcomm couldn't have built out a mass manufacturing ecosystem to supply the world with every phone needed in the 2G and 3G era. Nor could they fabricate the chips that went in those phones. The mid and late 1990s saw them outsource and then simply license their patents and know-how to other companies, eventually serving a quarter of a billion 3G subscribers across over a hundred carriers in dozens of countries. They got in front of what came after CDMA and worked on multiple other standards, including OFDMA, or Orthogonal Frequency-Division Multiple Access. For those they developed the Qualcomm Flarion Flash-OFDM and 3GPP 5G NR, or New Radio. And of course a boatload of other innovative technologies and chips. All of that paved the way for Qualcomm to be instrumental in 5G and beyond, made possible by hyper-specialization. Many of the same people who developed the encoding technology for the Voyager probes decades prior helped pave the way for the mobile revolution. They ventured into manufacturing but, as with many of the designers of technology and chips, chose to license the technology in massive cross-licensing deals. These deals are so big that Apple recently sued Qualcomm for a billion dollars in missed rebates. But there were changes happening in the technology industry that would shake up those licensing deals. Broadcom was growing into a behemoth. Many of their designs went from stand-alone chips to being a small part of a SoC, or system on a chip. Suddenly, licensing the ARM architecture gave Qualcomm the ability to make full SoCs. Snapdragon has been the moniker of the current line of SoCs since 2007. Qualcomm has an ARM architectural license and uses the ARM instruction set to create their own CPUs; the most recent incarnation is known as Krait. They also create their own graphics processor (GPU) and digital signal processors (DSPs), known as Adreno and Hexagon respectively. They recently acquired Arteris' technology and engineering group, and they use Arteris' network-on-chip (NoC) technology. Snapdragon chips can be found in Samsung Galaxy, Vivo, Asus, and Xiaomi phones. Apple designs their own chips based on the ARM architecture, so in some ways they compete with Snapdragon, but those phones still use Qualcomm modems like nearly every other SoC-based device. Qualcomm also bought a new patent portfolio from HP, including the Palm patents and others, so who knows what we'll find in the next chips - maybe a chip in a stylus. Their slogan is "enabling the wireless industry," and they've certainly done that. From satellite communications that required a computer the size of a few refrigerators, to battlefield communications, to shipping trucks with tracking systems, to cell towers, and now the full processor on a cell phone. They've been with us since the beginning of the mobile era, and one has to wonder: if the next few generations of mobile technology involve satellites, will Qualcomm end up right back where they began - encoding bits of information theory into silicon?
The Internet is not a simple story to tell. In fact, every sentence here is worthy of an episode, if not a few. Many would claim the Internet began back in 1969, when the first node of the ARPANET went online. That was the year we got the first color pictures of Earth from Apollo 10 and the year Nixon announced the US was leaving Vietnam. It was also the year of Stonewall, the moon landing, the Manson murders, and Woodstock. A lot was about to change. But maybe the story of the Internet starts before that, when the basic research to network computers began as a means of networking nuclear missile sites with fault-tolerant connections in the event of, well, nuclear war. Or the Internet began when a T3 backbone was built to host all the datas. Or the Internet began with the telegraph, when the first data was sent over electric current. Or maybe the Internet began when the Chinese used fires to send messages along the Great Wall of China. Or maybe the Internet began when drums sent messages over long distances in ancient Africa, like early forms of packets flowing over Wi-Fi-esque sound waves. We need to make complex stories simpler in order to teach them, so if the first node of the ARPANET in 1969 is where this journey should end, feel free to stop here. To dig in a little deeper, though, that ARPANET was just one of many networks that would merge into an interconnected network of networks. We had dial-up providers like CompuServe, America Online, and even The WELL. We had regional timesharing networks like the DTSS out of Dartmouth College and PLATO out of the University of Illinois at Urbana-Champaign. We had corporate time sharing networks and systems. Each competed or coexisted or took time from others or pushed more people to others through their evolutions. Many used their own custom protocols for connectivity. But most were walled gardens, unable to communicate with the others. So if the story is more complicated than the ARPANET simply being the ancestor of the Internet, why is that the story we hear? Let's start that journey with a memo we did an episode on, called "Memorandum For Members and Affiliates of the Intergalactic Computer Network," sent by JCR Licklider in 1963, which can be considered the allspark that lit the bonfire called the ARPANET. Which isn't exactly the Internet, but isn't not. In that memo, Lick proposed a network of computers available to the research scientists of the early 60s - scientists from computing centers that would evolve into supercomputing centers and then a network open to the world, even our phones, televisions, and watches. It took a few years, but eventually ARPA brought in Larry Roberts, and by late 1968 ARPA awarded an RFQ to build a network to a company called Bolt Beranek and Newman (BBN), who would build the Interface Message Processors, or IMPs. The IMPs were computers that connected a number of sites and routed traffic. The first IMP, which might be thought of more as a network interface card today, went online at UCLA in 1969, with additional sites coming on frequently over the next few years. That system would become the ARPANET. It grew as leased lines and more IMPs became available. As the network grew, the early computer scientists realized that each site had different computers running various and random stacks of applications and different operating systems, so certain aspects of connectivity between different computers needed to be standardized.
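The eventual answer to that standardization problem, covered below, was layered protocols built around encapsulation: each layer wraps the data of the layer above in its own header, so the network underneath never has to understand the payload. Here's a toy Python sketch of the idea; the field names are simplified stand-ins, not the real header layouts from the TCP and IP specifications.

```python
# A toy model of protocol encapsulation (the idea Cerf and Kahn built
# TCP/IP around), with simplified stand-in fields rather than the real
# RFC 791/793 header layouts.

def tcp_segment(src_port, dst_port, seq, payload):
    """Wrap application data in a transport-layer header."""
    return {"tcp": {"src_port": src_port, "dst_port": dst_port, "seq": seq},
            "data": payload}

def ip_packet(src_ip, dst_ip, segment):
    """Wrap a transport segment in a network-layer header for routing."""
    return {"ip": {"src": src_ip, "dst": dst_ip, "ttl": 64, "proto": "TCP"},
            "data": segment}

# On the way down the stack, the message is wrapped twice...
message = "hello from one network to another"
packet = ip_packet("10.0.0.1", "10.0.0.2", tcp_segment(1024, 80, 0, message))

# ...routers only ever look at the outer IP header, and the receiving
# host peels the layers back off to recover the original message.
assert packet["data"]["data"] == message
```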
Given that UCLA was the first site to come online, Steve Crocker from there began organizing notes about protocols and how systems connected with one another in what they called RFCs, or Requests for Comments. That series of notes was then managed by a team that included Elizabeth (Jake) Feinler once Doug Engelbart's project on the "Augmentation of Human Intellect" at Stanford Research Institute (SRI) became the second node to go online. SRI developed a Network Information Center, where Feinler maintained a list of host names (which evolved into the hosts file) and a list of address mappings, which would later evolve into the functions of InterNIC and be turned over to the US Department of Commerce when the number of devices connected to the Internet exploded. Feinler and Jon Postel from UCLA would maintain those, though, until his death 28 years later, and the RFCs came to include everything from opening terminal connections into machines to file sharing to addressing - now any place where networking needs a standard. The development of many of those early protocols that made computers useful over a network was also funded by ARPA. They funded a number of projects to build tools that enabled the sharing of data, like file sharing, and some advancements were loosely connected by people just doing things to make them useful - so by 1971 we also had email. But all those protocols needed to flow over a common form of connectivity that was scalable. Leonard Kleinrock, Paul Baran, and Donald Davies were independently investigating packet switching, and Roberts brought Kleinrock into the project, as he was at UCLA. Bob Kahn entered the picture in 1972. He would team up with Vint Cerf from Stanford, who came up with encapsulation, and they would define the protocol that underlies the Internet, TCP/IP. By 1974 Vint Cerf and Bob Kahn had written RFC 675, where they coined the term internet as shorthand for internetwork. The number of RFCs was exploding, as was the number of nodes. The University of California, Santa Barbara came online, then the University of Utah, home of Ivan Sutherland's work. The network was national when BBN connected to it in 1970. Now there were 13 IMPs, and by 1971, 18, then 29 in '72 and 40 in '73. Once the need arose, Kleinrock would go on to work with Farouk Kamoun to develop hierarchical routing theories in the late 70s. By 1976, ARPA became DARPA. The network grew to 213 hosts in 1981, by 1982 TCP/IP had become the standard for the US DOD, and in 1983 the ARPANET moved fully over to TCP/IP. And so TCP/IP, or Transmission Control Protocol/Internet Protocol, became the most dominant networking protocol on the planet. It was written to help improve performance on the ARPANET with the ingenious idea to encapsulate traffic. But in the 80s, it was still just for researchers. That is, until NSFNET was launched by the National Science Foundation in 1986. And the network had gone international, with University College London connecting in 1973 - which would go on to inspire a British research network called JANET that built its own set of protocols, called the Coloured Book protocols - and the Norwegian Seismic Array connecting over satellite, also in 1973. So networks were forming all over the place, often just time sharing networks where people dialed into a single computer. Another networking project going on at the time, also getting funding from ARPA as well as the Air Force, was PLATO. Out of the University of Illinois, it was meant for teaching and began on a mainframe in 1960.
But by the time the ARPANET was growing, PLATO was on version IV and running on a CDC Cyber. The time sharing system hosted a number of courses, as they referred to programs. These included actual courseware, games, content with audio and video, message boards, instant messaging, custom touch screen plasma displays, and the ability to dial into the system over phone lines, making the system another early network. In fact, there were multiple CDC Cybers that could communicate with one another. And many on the ARPANET also used PLATO, cross-pollinating the defense-backed network with a number of academic institutions. The defense backing couldn't last forever. The Mansfield Amendment in 1973 banned general research by defense agencies. This meant that ARPA funding started to dry up and the scientists working on those projects needed a new place to fund their playtime. Bob Taylor split to go work at Xerox, where he was able to pick the best of the scientists he'd helped fund at ARPA. He helped bring in people from the Stanford Research Institute, where they had been working on the oNLine System, or NLS, and people like Bob Metcalfe, who brought us Ethernet and better collision detection. Metcalfe would go on to found 3Com, a great switch and network interface company during the rise of the Internet. But there were plenty of people who could see the productivity gains from the ARPANET and didn't want it to disappear. And the National Science Foundation (NSF) was flush with cash. And the ARPA crew was increasingly aware of non-defense-oriented use of the system. So the NSF started up a little project called CSNET in 1981 so the growing number of supercomputers could be shared between all the research universities. It was free for universities that could get connected, and from 1985 to 1993 NSFNET surged from 2,000 users to 2,000,000 users. Paul Mockapetris made the Internet easier than when it was an academic-only network by developing the Domain Name System, or DNS, in 1983. That's how we can call up remote computers by names rather than IP addresses. And of course DNS was yet another of the protocols on Postel's list of protocol standards at UCLA, which by 1986, after the selection of TCP/IP for NSFNET, would be overseen by the standardization body known as the IETF, or Internet Engineering Task Force for short. Maintaining a set of protocols that all vendors needed to work with was one of the best growth hacks ever. No vendor could have kept up with demand through a 1,000x growth in such a small number of years. NSFNET started with six nodes in 1985, connected by LSI-11 Fuzzball routers, and quickly outgrew that backbone. They put it out to bid, and Merit Network won out in a partnership between MCI, the State of Michigan, and IBM. Merit had begun before the first ARPANET connections went online, as a collaborative effort by Michigan State University, Wayne State University, and the University of Michigan. They'd been connecting their own machines since 1971 and had implemented TCP/IP and bridged to the ARPANET. The money was getting bigger: they got $39 million from NSF to build what would emerge as the commercial Internet. They launched in 1987 with 13 sites over 14 lines. By 1988 they'd gone nationwide, going from a 56k backbone to a T1 and then 14 T1s. But the growth was too fast for even that. They re-engineered, and by 1990 planned to add T3 lines running in parallel with the T1s for a time. By 1991 there were 16 backbone nodes, with traffic and users growing by an astounding 20% per month.
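As a quick illustration of what Mockapetris' DNS gives us - calling up a remote computer by name instead of by number - here's a minimal Python snippet; example.com is just a placeholder domain and the printed address is illustrative:

```python
# A minimal look at name resolution. Uses only the standard library;
# example.com is a placeholder and the resolved address will vary.

import socket

name = "example.com"
address = socket.gethostbyname(name)   # the resolver asks DNS on our behalf
print(f"{name} -> {address}")          # e.g. "example.com -> 93.184.216.34"
```

Before DNS, that mapping lived in the hosts file Feinler maintained, fetched from SRI's Network Information Center; DNS turned it into a distributed, hierarchical database that could keep up with the growth described above.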
Vint Cerf ended up at MCI, where he helped lobby for the privatization of the internet, and he helped found the Internet Society in 1992. The lobbying worked and led to the Scientific and Advanced-Technology Act of 1992. Before that, use of NSFNET was supposed to be for research; now it could expand to non-research and education uses. This allowed NSF to bring on even more nodes. And so by 1993 it was clear that this was growing beyond what a governmental institution whose charge was science could justify as "research" any longer. By 1994, Vint Cerf was designing the architecture and building the teams that would build the commercial internet backbone at MCI. And so NSFNET began the process of unloading the backbone and helped the world develop the commercial Internet by sprinkling a little money and know-how throughout the telecommunications industry, which was about to explode. NSFNET went offline in 1995, but by then there were networks in England, South Korea, Japan, and Africa, and CERN was connected to NSFNET over TCP/IP. And Cisco was selling routers that would fuel an explosion internationally. There was a war of standards, and yet over time we settled on TCP/IP as THE standard. And those were just some of the nets. The Internet is really not just NSFNET or ARPANET but a combination of a lot of nets. At the time there were a lot of time sharing computers that people could dial into, and following the release of the Altair there was a rapidly growing personal computer market, with modems becoming more and more approachable towards the end of the 1970s. You see, we've talked about these larger networks but not the hardware. The first modulator-demodulator, or modem, was the Bell 101 dataset, which had been invented all the way back in 1958, loosely based on a previous model developed to manage SAGE computers. But the transfer rate had stalled at 300 bits per second for almost 20 years, and not much had changed. That is, until Hayes Microcomputer Products released a modem designed to run on the Altair 8800 S-100 bus in 1978. Personal computers could talk to one another. One of those Altair owners, Ward Christensen, met Randy Suess at the Chicago Area Computer Hobbyists' Exchange, and the two of them had this weird idea: host a bulletin board on one of their computers. People could dial into it and discuss their Altair computers when it snowed too much to meet in person for their club. They started writing a little code, and before you know it we had a tool they called Computerized Bulletin Board System software, or CBBS. The software and, more importantly, the idea of a BBS spread like wildfire right along with the Atari, TRS-80, Commodore, and Apple computers that were igniting the personal computing revolution. The number of nodes grew, and as people started playing games, the speed of those modems jumped, with the v.32 standard hitting 9600 bits per second in '84 and over 25k in the early 90s. By the mid-1980s we got FidoNet, which was a network of Bulletin Board Systems, and by the early 90s we had 25,000 BBSes. And other nets had been on the rise. And these were commercial ventures. The largest of those dial-up providers was America Online, or AOL. AOL began in 1985 and, like most of the other dial-up providers of the day, was there to connect people to a computer they hosted, like a timesharing system, and give access to fun things: games, news, stocks, movie reviews, chatting with your friends, etc.
There was also CompuServe, The WELL, PSINet, Netcom, Usenet, Alternet, and many others. Some started to communicate with one another with the rise of the Metropolitan Area Exchanges, which got an NSF grant to establish switched ethernet exchanges, and the Commercial Internet Exchange in 1991, established by PSINet, UUNET, and CERFnet out of California. Those slowly moved over to the Internet, and even AOL got connected to the Internet in 1989. Thus the dial-up providers went from effectively being timesharing systems to Internet Service Providers as more and more people expanded their horizons away from the walled garden of the time sharing world and towards the Internet. The number of BBS systems started to wind down. All these IP addresses couldn't be managed easily, and so address management evolved from contracts with research universities to DARPA and then to IANA as a part of ICANN, and eventually to the Regional Internet Registries: AFRINIC to serve Africa; ARIN to serve Antarctica, Canada, the Caribbean, and the US; APNIC to serve South, East, and Southeast Asia as well as Oceania; LACNIC to serve Latin America; and RIPE NCC to serve Europe, Central Asia, and West Asia. By the 90s the Cold War was winding down (temporarily at least), so they even added Russia to RIPE NCC. And so, using tools like Winsock, any old person could get on the Internet by dialing up. Modems for dial-up transitioned to DSL and cable modems. We got the emergence of fiber with regional centers and even national FiOS connections. And because of all the hard work of all of these people, and the money dumped into it by the various governments and research agencies, life is pretty darn good. When we think of the Internet today we think of this interconnected web of endpoints and content that is all available. Much of that was made possible by the development of the World Wide Web by Tim Berners-Lee at CERN in 1991, and Mosaic came out of the National Center for Supercomputing Applications, or NCSA, at the University of Illinois, quickly becoming the browser everyone wanted to use until Marc Andreessen left to form Netscape. Netscape's IPO is probably one of the most pivotal moments, when investors from around the world realized that all of this research and tech was built on standards, and while there were some patents, the standards were freely usable by anyone. Those standards led to an explosion of companies like Yahoo!, from a couple of Stanford grad students, and Amazon, started by a young hedge fund Vice President named Jeff Bezos who noticed all the money pouring into these companies and went off to do his own thing in 1994. The rush of companies that arose to create and commercialize content and ideas and bring every industry online was ferocious. And there were the researchers still writing the standards, with even commercial interests helping with that. And there were open source contributors who helped make some of those standards easier to implement by regular old humans. And tools for those who build tools. And from there the Internet became what we think of today: quicker and quicker connections, more and more productivity gains, a better quality of life, better telemetry into all aspects of our lives, and, with the miniaturization of devices, support for wearables that even extend to our bodies. Yet it still sits on the same fundamental building blocks as before. The IANA functions that manage IP addressing have moved to the private sector, as have many an onramp to the Internet.
Especially as internet access has become more ubiquitous and we are entering the era of 5G connectivity. And it continues to evolve as we pivot due to new needs and threats that a globally connected world represents: IPv6, various secure DNS options, defenses against spam and phishing, and dealing with the equality gaps surfaced by our new online world. We have disinformation, so sometimes we might wonder what's real and what isn't. After all, any old person can create a web site that looks legit and put whatever they want on it. Who's to say what reality is other than what we want it to be? This was pretty much what Morpheus was offering with his choice of pills in The Matrix. But underneath it all, there's history. And it's a history as complicated as unraveling the meaning of an increasingly digital world. And it is wonderful and frightening and lovely and dangerous and true and false and destroying the world and saving the world all at the same time. This episode is pretty simplistic, and many of the aspects we cover have entire episodes of the podcast dedicated to them - from the history of Amazon to Bob Taylor to AOL to the IETF to DNS and even the Network Time Protocol. It's a story that leaves people out necessarily; otherwise scope creep would go all the way back to include Volta and the constant electrical current humanity received with the battery. But hey, we also have an episode on that! And many an advance has plenty of books and scholarly works dedicated to it - all the way back to the first known computer (in the form of clockwork), the Antikythera device out of Ancient Greece. Heck, even Louis Gerstner deserves a mention for selling IBM's stake in all this to focus on the things that kept the company going, not moonshots. But I'd like to dedicate this episode to everyone not mentioned while trying to tell a story of emergent networks. Just because the networks were growing fast and our modern infrastructure was becoming more and more deterministic doesn't mean there aren't so, so, so many people who are a part of this story - whether they were writing a text editor, helping fund projects, pushing paper, writing specs, selling network services, or getting zapped while trying to figure out how to move current. Each has their own story to be told. As we round the corner into the third season of the podcast we'll start having more guests. If you have a story and would like to join us, use the email button on thehistoryofcomputing.net to drop us a line. We'd love to chat!
Java, Ruby, PHP, Go. These are languages used to build web applications that dynamically generate content, which is then interpreted as a file by a web browser. That file is rarely static these days, and the power of the web is that an app or browser can reach out and obtain some data, get back some XML or JSON or YAML, and provide an experience to a computer, mobile device, or even embedded system. The web is arguably the most powerful, transformational technology in the history of technology. But the story of the web begins in philosophies that far predate its inception. It goes back to a file, which we can think of as a document, on a computer that another computer reaches out to and interprets. A file composed of hypertext. Ted Nelson coined the term hypertext. Plenty of others put the concepts of linking objects into the mainstream of computing. But he coined the term that he's barely connected to in the minds of many. Why is that? Tim Berners-Lee invented the World Wide Web in 1989. Elizabeth Feinler developed a registry of names that would evolve into DNS, so we could find computers online and access web sites without typing in impossible-to-remember numbers. Bob Kahn and Leonard Kleinrock were instrumental in the Internet Protocol, which allowed all those computers to be connected together, providing the schemes for those numbers. Some will know these names; most will not. But a name that probably doesn't come up enough is Ted Nelson. His tale is one of brilliance in the early days of computing, of the spread of BASIC, and of an urge to do more. It's a tale of the hacker ethic. And yet, it's also a tale of irreverence - to be used as a warning for those with aspirations to be remembered for something great. Or is it? Steve Jobs famously said "real artists ship." Ted Nelson did ship. Until he didn't. Let's go all the way back to 1960, when he started Project Xanadu. Actually, let's go a little further back first. Nelson was born to TV director Ralph Nelson and Celeste Holm, who won an Academy Award for her role in Gentleman's Agreement in 1947, took home another pair of nominations over her career, and was the original Ado Annie in Oklahoma!. His dad worked on The Twilight Zone - so of course Ted majored in philosophy at Swarthmore College, went off to the University of Chicago and then Harvard for graduate school, and took a stab at film after he graduated. But he was meant for an industry that didn't exist yet, one that would some day eclipse the film industry: software. While in school he got exposed to computers and started to think about this idea of a repository of all the world's knowledge. And it's easy to imagine a group of computing aficionados sitting in a drum circle, smoking whatever they were smoking, and having their minds blown by that very concept. And yet, it's hard to imagine anyone in that context doing much more. He did. Nelson created Project Xanadu in 1960. As we'll cover, he did a lot of projects during the remainder of his career. The journey is what is so important, even if we never get to the destination. Because sometimes we influence the people who get there. And the history of technology is as much about failed or incomplete evolutions as it is about those that become ubiquitous. It began with a project while he was enrolled in Harvard grad school. Other word processors were at the dawn of their existence. But he began thinking through and influencing how they would handle information storage and retrieval.
Xanadu was supposed to be a computer network that connected humans to one another. It was supposed to be simple: a scheme for world-wide electronic publishing. Unlike the web, which would come nearly three decades later, it was supposed to be bidirectional, with broken links self-repairing, much as routes between nodes on the ARPANET did. His initial proposal was a program in machine language that could store and display documents. Coming before the advent of Markdown, ePub, XML, PDF, RTF, or any of the other common open formats we use today, it was rudimentary and would evolve over time. Keep in mind, it was for documents, and as Nelson would say later, the web - which began as a document tool - was a fork of the project. The term Xanadu was borrowed from Samuel Taylor Coleridge's Kubla Khan, itself written after some opium-fueled dreams about a garden in Kublai Khan's Shangdu, or Xanadu. In his biography, Coleridge explained that the rivers in the poem supply "a natural connection to the parts and unity to the whole," describing a "stream, traced from its source in the hills among the yellow-red moss and conical glass-shaped tufts of bent, to the first break or fall, where its drops become audible, and it begins to form a channel." Connecting all the things was the goal, and so Xanadu was the name. He gave a talk and presented a paper called "A File Structure for the Complex, the Changing and the Indeterminate" at the Association for Computing Machinery in 1965 that laid out his vision. This was the dawn of interactivity in computing. Digital Equipment had launched just a few years earlier and brought the PDP-8 to market that same year. The smell of change was in the air, and Nelson was right there. After that, he started to see all these developments around the world. He worked on a project at Brown University to develop a word processor with many of his ideas in it. But the output of that project, as with most word processors since, was to get things printed. He believed content was meant to be created and to live its entire lifecycle in digital form. This would provide perfect forward and reverse citations, text enrichment, and change management. And maybe, if we all stand on the shoulders of giants, it would allow us to avoid rewriting or paraphrasing the works of others to include them in our own writings. We could do more without that tedious regurgitation. He furthered his counter-culture credentials by going to Woodstock in 1969. Probably not for that reason, but it happened nonetheless. And he traveled and worked with more and more people and companies, learning and engaging and enriching his ideas. And then he shared them. Computer Lib/Dream Machines was a paperback book. Or two. It had a cover on each side. Originally published in 1974, it was one of the most important texts of the computer revolution. Steven Levy called it an epic. It's rare to find it for less than a hundred bucks on eBay at this point, because of how influential it was and what an amazing snapshot in time it represents. Xanadu was to be a hypertext publishing system in the form of Xanadocs: files that could be linked to from other files. A Xanadoc used Xanalinks to embed content from other documents into a given document. Those embedded spans of text were called transclusions, and they would change in the including document whenever they changed in the live source document. The iterations towards working code were slow, and the years ticked by. That talk in 1965 gave way to the 1970s, then the 80s.
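To make the transclusion idea concrete, here's a tiny sketch under the assumption that a Xanalink can be modeled as a (document, start, end) reference; actual Xanadu designs are far richer. The including document stores a pointer to a span of the source rather than a copy, so it always renders whatever the live source currently says:

```python
# A toy model of transclusion: documents embed live references to spans
# of other documents instead of pasted copies. The (doc_id, start, end)
# link format is a simplification for illustration, not Xanadu's design.

library = {
    "memex-notes": "As We May Think imagines a desk that links all records.",
}

def transclude(doc_id, start, end):
    """Resolve a link to a span of the source document at render time."""
    return library[doc_id][start:end]

# A page mixes its own prose with links into other living documents.
page = ["Bush's essay matters: ", ("memex-notes", 0, 15), ", and more."]

def render(parts):
    return "".join(p if isinstance(p, str) else transclude(*p) for p in parts)

print(render(page))   # Bush's essay matters: As We May Think, and more.
library["memex-notes"] = "The Memex essay imagines a desk that links all records."
print(render(page))   # the transcluded span updates with its source
```

Notice the second render changes without the page itself being edited - the property Nelson wanted for citations that never go stale.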
Some thought him brilliant. Others didn't know what to make of it all. But many knew of his ideas for hypertext, and once they were known, their realization started to feel inevitable. Byte Magazine published many of his thoughts in a 1988 article called "Managing Immense Storage," and by then the personal computer revolution had come in full force. Tim Berners-Lee put the first node of the World Wide Web online the next year, using a protocol they called the Hypertext Transfer Protocol, or http. Yes, the hypertext philosophy was almost a means of paying homage to the hard work and deep thinking Nelson had put in over the decades. But not everyone saw Nelson as having made great contributions to computing. "The Curse of Xanadu" was an article published in Wired Magazine in 1995. In the article, the author points out that the web had come along using many of the ideas Nelson and his teams had worked on over the years - but the web actually shipped, whereas Nelson hadn't. Once shipped, the web rose in popularity, becoming the ubiquitous technology it is today. The article looked at Xanadu as vaporware. But there is a deeper, much more important meaning to Xanadu in the history of computing. Perhaps inspired by the Wired article, the group released an incomplete version of Xanadu in 1998. But by then other formats - including PDF, which was invented in 1993, and .doc for Microsoft Word - were the primary mechanisms by which we stored documents, and first gopher and then the web were spreading to interconnect humans with content. https://www.youtube.com/watch?v=72M5kcnAL-4 The Xanadu story isn't a tragedy. Would we have had hypertext as a part of Douglas Engelbart's oNLine System without it? Would we have object-oriented programming or later the World Wide Web without it? The very word hypertext is almost an homage, even if they don't know it, to Nelson's work. And the look and feel of his work lives on in places like GitHub, whether directly influenced or not, where we can see changes in code side-by-side with actual production code - changes that are stored and can be rolled back, perhaps forever. Larry Tesler coined the term Cut and Paste. While Nelson calls him a friend in Werner Herzog's Lo and Behold, Reveries of the Connected World, he also points out that Tesler's term is flawed. And I think this is where we as technologists have to sometimes trim down our expectations of how fast evolutions occur. We take tiny steps because as humans we can't keep pace with the rapid rate of technological change. We can look back and see a two-steps-forward, one-step-back approach since the dawn of written history. Nelson still doesn't think the metaphors that harken back to paper have any place in the online written word. Here's another important trend in the history of computing: as more and more content has come to live online exclusively, the content has become diluted. One publisher I wrote online pieces for asked that they all be +/- 700 words, that paragraphs be no more than 4 sentences long (preferably 3), and that the sentences be written at about a 5th or 6th grade level. Maybe Nelson would claim that this de-evolution of writing is due to search engine optimization gamifying the entirety of human knowledge, and that a tool like Xanadu would have been the fix. After all, if we could borrow the great works of others, we wouldn't have to paraphrase them. But I think, as with most things, it's much more nuanced than that. Our always-online, always-connected brains can only accept smaller snippets. So that's what we gravitate towards.
Actually, we have plenty of capacity for whatever we actually choose to immerse ourselves in. But we have more options than ever before, and of course we immerse ourselves in video games or other less literary pursuits. Or are they more literary? Some generations thought books to be dangerous. As do all oppressors. So who am I to judge where people choose to acquire knowledge or what kind they indulge themselves in? Knowledge is power, and I'm just happy they have it. And they have it in part because others were willing to water down the concepts to ship a product. Because the history of technology is about evolutions, not revolutions. And those often take generations. And Nelson is responsible for some of the evolutions that brought us the ht in http and html. And for that we are truly grateful! As with the great journey from Lord of the Rings, rarely is greatness found alone. The Xanadu adventuring party included Cal Daniels, Roger Gregory, Mark Miller, Stuart Greene, Dean Tribble, and Ravi Pandya. The project became a part of Autodesk in the 80s, got rewritten in Smalltalk, and was considered a rival to the web, but it is really more of an evolutionary step on that journey. If anything, it's a divergence then convergence to and from Vannevar Bush's Memex. So let me ask this as a parting thought: are the places where you are not willing to sacrifice any of your core designs or beliefs worth the price being paid? Are they worth someone else ending up with a place in the history books, where (like with this podcast) we oversimplify complex topics to make them digestible? Sometimes it's worth it. In no way am I in a place to judge the choices of others. Only history can really do that - but when it happens, it's usually an oversimplification anyway... So the building blocks of the web lie in irreverence - in hypertext. And while some grew out of irreverence and diluted their vision after an event like Woodstock, others like Nelson and his friend Douglas Engelbart forged on. And their visions didn't come with commercial success. But as integral building blocks of the modern connected world today, they represent as great minds as practically anyone else in computing.
Communications pioneer Andrew J. Viterbi - who in 1962 earned one of the first doctorates in electrical engineering granted at the University of Southern California - has forever changed how people everywhere connect and communicate. Dr. Viterbi's lifelong interest in communications began as a child, when his family fled Italy for America in 1939 to escape the persecution of Jews. Born into an analog world, this visionary thinker opened the doors to the digital age with the Viterbi Algorithm, a groundbreaking mathematical formula for eliminating signal interference. Today, the Viterbi Algorithm is used in all four international standards for digital cellular telephones, as well as in data terminals, digital satellite broadcast receivers and deep space telemetry. In the spring of 1967, Dr. Viterbi met Irwin Jacobs at a telecommunications conference in California. Both men, and another of Dr. Viterbi's colleagues, Leonard Kleinrock, shared an interest in forming a consulting group. With an investment of $1,500 - $500 from each man - the trio founded Linkabit. By the 1970s, Linkabit began providing technology for defense communications satellites using very large antennas. Dr. Viterbi and his Linkabit associates came up with a breakthrough computer to accomplish the task and dubbed it a "microprocessor," even though it was made up of many chips. His renown grew as fast as the company. In 1975, Italy's National Research Council awarded Dr. Viterbi one of its highest academic accolades, the Christopher Columbus Award. In 1980, Linkabit merged with M/A-COM of Boston. It soon produced the VSAT (Very Small Aperture Terminal), the foundation for private satellite communications networks. In 1985, the VSAT division was sold to Hughes. The team of Viterbi and Jacobs had a new dream: together, they founded Qualcomm Corp. to develop and manufacture satellite communications and digital wireless telephones. 00:00:00 Intro 00:01:12 How did you come to write your latest book? 00:03:02 Centro Primo Levi NYC and The Italian Jewish Experience 00:05:27 Knowing Primo Levi 00:17:53 Early days of wireless digital communications 00:23:33 Why didn't you patent the Viterbi Algorithm? 00:25:20 How do you separate tech hype from reality? 4G vs 5G? 00:30:47 The commercial value of basic research and how to keep funding it. 00:38:16 Would a tax on scientific innovation impede progress? 00:42:00 What do you think about SETI? How would you communicate with an alien civilization? 00:48:32 What would you put in your ethical will? 00:51:27 What would you put in your billion-year time capsule? 00:53:56 What advice would you give your younger self? Support the podcast: https://www.patreon.com/drbriankeating And please join my mailing list to get resources and enter giveaways to win a FREE copy of my book (and more): http://briankeating.com/mailing_list.php
Internet art is a broad term for the work of artists who use the internet as their canvas. Think of Flash animation, psychedelic glitch art, computer-generated art, GIFs, and many other examples. The internet has been around for five decades now, and internet art is falling victim to broken links, expired domains, and unsupported file types. "I believe we're in a crisis right now, where so much work is disappearing and will never be seen again. We want to be in communication with these artists while they're still with us," said Casey Reas, co-founder of the UCLA Arts Conditional Studio with Lauren Lee McCarthy and Chandler McWilliams, all professors in the Department of Design Media Arts. The UCLA Arts Conditional Studio is launching an initiative to collect internet art made by LA-based artists and preserve it for future generations. "Art and the Internet in LA 1969+" will explore the history of artists in Los Angeles who have worked with, responded to, and transformed the internet. Their starting point is November 21, 1969, when UCLA professor Leonard Kleinrock established the first permanent ARPANET link from his laboratory to the Stanford Research Institute. It continues through the emergence of the World Wide Web to the ubiquitous influence of the internet today.
JCR Licklider sent a memo called "Memorandum For Members and Affiliates of the Intergalactic Computer Network" in 1963 that is quite possibly the original spark that lit the bonfire called the ARPANET, the nascent beginning of what we now call the Internet. In the memo, "Lick," as his friends called him, documented early issues in building out a time-sharing network of computers available to research scientists of the early 60s. The memo is a bit long, so I'll include quotes followed by explanations, or I guess you might call them interpretations. Let's start with the second paragraph: The need for the meeting and the purpose of the meeting are things that I feel intuitively, not things that I perceive in clear structure. I am afraid that that fact will be too evident in the following paragraphs. Nevertheless, I shall try to set forth some background material and some thoughts about possible interactions among the various activities in the overall enterprise for which, as you may have detected in the above subject, I am at a loss for a name. Intuition, to me, is important. Lick had attended conferences on cybernetics and artificial intelligence going back to the 40s. He had been MIT faculty and was working for a new defense research organization. He was a visionary. The thing is, let's call his vision a hypothesis. During the 1960s, the Soviets would attempt to build multiple networks similar to the ARPANET. The difference is that, much like a modern product manager, Lick chunked up the work to be done and had various small teams tackle parts of projects, each building a part but on the whole proving the theory in a decentralized way - as compared to Soviet projects that went all-in. A couple of paragraphs later, Lick goes on to state: In pursuing the individual objectives, various members of the group will be preparing executive the monitoring routines, languages amd [sic.] compilers, debugging systems and documentation schemes, and substantive computer programs of more or less general usefulness. One of the purposes of the meeting–perhaps the main purpose–is to explore the possibilities for mutual advantage in these activities–to determine who is dependent upon whom for what and who may achieve a bonus benefit from which activities of what other members of the group. It will be necessary to take into account the costs as well as the values, of course. Nevertheless, it seems to me that it is much more likely to be advantageous than disadvantageous for each to see the others' tentative plans before the plans are entirely crystalized. I do not mean to argue that everyone should abide by some rigid system of rules and constraints that might maximize, for example, program interchangeability. Here, he's acknowledging that stakeholders have different needs, goals, and values, but stating that if everyone shared plans, the outcome could be greater across the board. He goes on to further state that: But, I do think that we should see the main parts of the several projected efforts, all on one blackboard, so that it will be more evident than it would otherwise be, where network-wide conventions would be helpful and where individual concessions to group advantage would be most important. These days we prefer a whiteboard, or maybe even a Miro board. But this act of visualization would let research from disparate fields - like Paul Baran's work on packet switching at RAND at the time - be pulled in to think about how networks would look and work.
While the government was providing money to different institutions, the research organizations were autonomous. By having each node able to operate on its own, rather than employing a centralized approach, the network could be built so that signals could travel along multiple paths in case one path broke down - getting at the heart of the matter: a network that could survive a nuclear attack, provided some link or links survived. He then goes on to state:

It is difficult to determine, of course, what constitutes "group advantage." Even at the risk of confusing my own individual objectives (or ARPA's) with those of the "group," however, let me try to set forth some of the things that might be, in some sense, group or system or network desiderata.

This is important. In this paragraph he acknowledges his own motive, but sets up a value proposition for the readers. He then goes on to lay out a future that includes an organization like what we now use the IETF for:

There will be programming languages, debugging languages, time-sharing system control languages, computer-network languages, data-base (or file-storage-and-retrieval languages), and perhaps other languages as well. It may or may not be a good idea to oppose or to constrain lightly the proliferation of such. However, there seems to me to be little question that it is desireable to foster "transfer of training" among these languages. One way in which transfer can be facilitated is to follow group consensus in the making of the arbitrary and nearly-arbitrary decisions that arise in the design and implementation of languages. There would be little point, for example, in having a diversity of symbols, one for each individual or one for each center, to designate "contents of" or "type the contents of."

The IETF and the IEEE now manage the specifications that define protocols and hardware, respectively. The early decisions were made for a small collection of nodes on the ARPANET; as the nodes grew and the industry matured, protocols began to be defined very specifically - such as DNS, covered in, what, the second episode of this podcast. The point is that Lick didn't yet know what we didn't know, but he knew that if things worked out, governing bodies like these would need to emerge to keep splinter nets to a minimum. At the time, though, they weren't thinking much about network protocols. They were speaking of languages. But he then goes on to lay out a network-control language, which would emerge as protocols:

Is the network control language the same thing as the time-sharing control language? (If so, the implication is that there is a common time-sharing control language.) Is the network control language different from the time-sharing control language, and is the network-control language common to the several netted facilities? Is there no such thing as a network-control language? (Does one, for example, simply control his own computer in such a way as to connect it into whatever part of the already-operating net he likes, and then shift over to an appropriate mode?)

In the next few paragraphs he lays out a number of tasks that he'd like to accomplish - or at least that he can imagine others would like to accomplish - such as writing programs to run on remote computers, accessing files over the net, or reading in teletypes remotely. And he lays out storing photographs on the network and running applications remotely, much the way we do with microservices today.
He refers to information retrieval, searching for files based on metadata, natural language processing, accessing research from others, and bringing programs into a system from a remote repository, much as we do with CPAN, Python imports, and GitHub today. Later, he looks at how permissions will be important on this new network:

There is the problem of protecting and updating public files. I do not want to use material from a file that is in the process of being changed by someone else. There may be, in our mutual activities, something approximately analogous to military security classification. If so, how will we handle it?

It turns out that the first security issues arose because of eased restrictions on resources - whether that meant viruses, spam, or simply access to protected data. Keep in mind, the original network was meant to facilitate research during the Cold War. Can't just have commies accessing raw military research, can we? As we near the end of the memo, he says:

The fact is, as I see it, that the military greatly needs solutions to many or most of the problems that will arise if we tried to make good use of the facilities that are coming into existence.

Again, it was meant to be a military network. It was meant to be resilient and withstand a nuclear attack. That had already been discussed in meetings before this memo. Here, he's posing questions to stakeholders. But consider the name of the memo: Memorandum For Members and Affiliates of the Intergalactic Computer Network. Not "a" network but "the" network. And not just any network, but THE Intergalactic Network. Sputnik had been launched in 1957. The next year, Eisenhower's response gave us ARPA, created to do basic research so the US could leapfrog the Soviets, and then NASA. The Soviets had beaten the US to a satellite by using military rocketry to get to space. The US chose to use civilian rocketry, and so set a standard that space (other than the ICBMs) would be outside the Cold War. Well, ish. But here, we were mixing military and civilian research in the hallowed halls of universities. We were taking the best and brightest and putting them in the employ of the military without putting them under the control of the military - a relationship that worked well until the Mansfield Amendment to the 1970 Military Authorization Act ended military funding of research that didn't have a direct or apparent relationship to a specific military function. What happened between the time Lick started handing out grants to people he trusted and that act would change the course of the world, allowing the US to do what the Soviets and other countries had been tinkering with: effectively build a nationwide network of computers that provided for one of the biggest eras of collaborative research the world has ever seen. What the world wanted was an end to the violence in Vietnam. What it got was a transfer of technology from the military-industrial complex to corporate research centers like Xerox PARC, Digital Equipment Corporation, and others. Lick then wraps the memo up:

In conclusion, then, let me say again that I have the feeling we should discuss together at some length questions and problems in the set to which I have tried to point in the foregoing discussion. Perhaps I have not pointed to all the problems. Hopefully, the discussion may be a little less rambling than this effort that I am now completing.

The researchers would continue to meet. They would bring the first node of the ARPANET online in 1969.
In that time they'd also help fund research such as NLS, the oN-Line System, which eventually resulted in mainstreaming the graphical user interface and the mouse. Lick would found the Information Processing Techniques Office and launch Project MAC, the first big, serious research into personal computing. They'd fund Transit, an important navigation system that ran until 1996, when it was replaced by GPS. They built Shakey the robot. And yes, they did a lot of basic military research as well. And today, modern networks are intergalactic. A bunch of nerds put in their time planning and designing, and took UCLA online, then the Stanford Research Institute, then UCSB, and then a PDP-10 at the University of Utah. Four nodes, four types of computers, four operating systems. Leonard Kleinrock and the next generation would then take the torch and bring us into the modern era. But that story is another episode. Or a lot of other episodes. We don't have a true Cold War today. We do have some pretty intense rhetoric. And we have a global pandemic. Kinda makes you wonder what basic research is being funded today, and how it will shape the world over the next 57 years the way this memo has. Or, given that there were programs in the Soviet Union and other countries to do something similar, was it really a matter of technological determinism? Not to take anything away from the hard work put in at ARPA and abroad, but for me at least, the jury is still out on that. I don't have any doubt, though, that the next wave of changes will be even more impactful. Crazy to think, right?
In this episode I speak with Leonard Kleinrock, one of the co-inventors of the Internet (no, it wasn't Al Gore), about the wake-up calls he had as a child to the wonder in the world.
The professor gives Cal a tour of the sacred space at UCLA where the first Internet message was sent in 1969. There's no better tour guide: Leonard was there when it happened. The computer scientist talks about the development of the Internet in a way that humanizes it for Cal. He also touches on subjects like the lack of privacy and what the Internet will mean to our future, making the podcast essential listening for everybody.
The earliest Unix code, how to replace fail2ban with blacklistd, OpenBSD crossed 400k commits, how to install Bolt CMS on FreeBSD, optimized hammer2, appeasing the OSI 7-layer burrito guys, and more.

Headlines

The Earliest Unix Code: An Anniversary Source Code Release (https://computerhistory.org/blog/the-earliest-unix-code-an-anniversary-source-code-release/)

What is it that runs the servers that hold our online world, be it the web or the cloud? What enables the mobile apps at the center of increasingly on-demand lives in the developed world, and of mobile banking and messaging in the developing world? The answer is the operating system Unix and its many descendants: Linux, Android, BSD Unix, macOS, iOS - the list goes on and on. Want to glimpse the Unix in your Mac? Open a Terminal window and enter "man roff" to view the Unix manual entry for an early text-formatting program that lives within your operating system. 2019 marks the 50th anniversary of the start of Unix. In the summer of 1969 - the same summer that saw humankind's first steps on the surface of the Moon - computer scientists at the Bell Telephone Laboratories, most centrally Ken Thompson and Dennis Ritchie, began the construction of a new operating system on a then-aging DEC PDP-7 computer at the labs.

This man sent the first online message 50 years ago (https://www.cbc.ca/radio/thecurrent/the-current-for-oct-29-2019-1.5339212/this-man-sent-the-first-online-message-50-years-ago-he-s-since-seen-the-web-s-dark-side-emerge-1.5339244)

As many of you have heard, the first online message ever sent between two computers was "lo", just over 50 years ago, on Oct. 29, 1969. It was supposed to say "log," but the computer sending the message - based at UCLA - crashed before the letter "g" was typed. A computer at Stanford, 560 kilometres away, was supposed to fill in the remaining characters "in," as in "log in." The CBC Radio show "The Current" has a half-hour interview with the man who sent that message: Leonard Kleinrock, distinguished professor of computer science at UCLA. "The idea of the network was you could sit at one computer, log on through the network to a remote computer and use its services there." 50 years later, the internet has become so ubiquitous that it has almost been rendered invisible. There's hardly an aspect of our daily lives that hasn't been touched and transformed by it.

Q: Take us back to that day 50 years ago. Did you have the sense that this was going to be something you'd be talking about half a century later?

A: Well, yes and no. Four months before that message was sent, there was a press release that came out of UCLA in which it quotes me describing what my vision for this network would become. Basically, what it said is that this network would be always on, always available. Anybody with any device could get on at any time from any location, and it would be invisible. Well, what I missed ... was that this was going to become a social network. People talking to people. Not computers talking to computers, but [the] human element.

Q: Can you briefly explain what you were working on in that lab? Why were you trying to get computers to actually talk to one another?

A: As an MIT graduate student, years before, I recognized I was surrounded by computers and I realized there was no effective [or efficient] way for them to communicate. I did my dissertation, my research, on establishing a mathematical theory of how these networks would work. But there was no such network existing.
AT&T said it won't work and, even if it does, we want nothing to do with it. So I had to wait around for years until the Advanced Research Projects Agency within the Department of Defence decided they needed a network to connect together the computer scientists they were supervising and supporting.

Q: For all the promise of the internet, it has also developed some dark sides that I'm guessing pioneers like yourselves never anticipated.

A: We did not. I knew everybody on the internet at that time, and they were all well-behaved and they all believed in an open, shared, free network. So we did not put in any security controls. When the first spam email occurred, we began to see the dark side emerge, as this network reached nefarious people sitting in basements with a high-speed connection, reaching out to millions of people instantaneously, at no cost in time or money, anonymously, until all sorts of unpleasant events occurred - what we called the dark side. But in those early days, I considered the network to be going through its teenage years: hacking, spam, annoying kinds of effects. I thought that one day this network would mature and grow up. Well, in fact, it took a turn for the worse when nation states, organized crime and extremists came in and began to abuse the network in severe ways.

Q: Is there any part of you that regrets giving birth to this?

A: Absolutely not. The greater good is much more important.

News Roundup

How to use blacklistd(8) with NPF as a fail2ban replacement (https://www.unitedbsd.com/d/63-how-to-use-blacklistd8-with-npf-as-a-fail2ban-replacement)

blacklistd(8) provides an API that network daemons can use to communicate with a packet filter through a daemon that opens and closes ports dynamically based on policy. The interface to the packet filter is /libexec/blacklistd-helper (currently designed for NPF), and the configuration file (inspired by inetd.conf) is /etc/blacklistd.conf. blacklistd(8) requires bpfjit(4) (the just-in-time compiler for the Berkeley Packet Filter) to work properly, in addition to, naturally, npf(7) as the frontend and syslogd(8) as a backend to print diagnostic messages. Also remember that NPF relies on the npflog* virtual network interface to provide logging for tcpdump(8) to use. Unfortunately (don't ask me why) in NetBSD 8.1 the required kernel components are still not compiled into the GENERIC kernel by default (though they are in HEAD) and are instead provided as modules. Enabling the NPF and blacklistd services would normally result in the modules being loaded automatically as root, but predictably, at securelevel=1 this is not going to happen.

FreeBSD's handbook chapter on blacklistd (https://www.freebsd.org/doc/en_US.ISO8859-1/books/handbook/firewalls-blacklistd.html)

OpenBSD crossed 400,000 commits (https://marc.info/?l=openbsd-tech&m=157059352620659&w=2)

Sometime in the last week OpenBSD crossed 400,000 commits (*) upon all our repositories since starting at 1995/10/18 08:37:01 Canada/Mountain. That's a lot of commits by a lot of amazing people. (*) by one measure. Since the repository is so large and old, there are a variety of quirks, including missing ChangeLog entries and branches not convertible to other repo forms, so measuring is hard. If you think you've got a great way of measuring, don't be so sure of yourself - you may have overcounted or undercounted.
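If you want to try a count anyway, one naive measure is to count every commit reachable in a Git conversion of a tree. A minimal sketch - the GitHub mirror URL is an assumption for illustration, and a CVS-to-Git conversion inherits exactly the quirks described above, so treat the number as a ballpark:

  # Clone a bare mirror of a converted repository, then count all
  # commits reachable from any ref. Conversion artifacts mean this
  # will over- or undercount relative to the original CVS history.
  git clone --mirror https://github.com/openbsd/src.git
  cd src.git
  git rev-list --all --count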
Subject to the notes Theo made about under- and over-counting, FreeBSD should hit 1 million commits (base + ports + docs) some time in 2020. NetBSD + pkgsrc are approaching 600,000, but of course pkgsrc covers other operating systems too.

How to Install Bolt CMS with Nginx and Let's Encrypt on FreeBSD 12 (https://www.howtoforge.com/how-to-install-bolt-cms-nginx-ssl-on-freebsd-12/)

Bolt is a sophisticated, lightweight and simple CMS built with PHP. It is released under the open-source MIT license, and the source code is hosted in a public repository on GitHub. Bolt is a tool for content management which strives to be as simple and straightforward as possible. It is quick to set up, easy to configure, and uses elegant templates. Bolt is created using modern open-source libraries and is best suited to building sites in HTML5 with modern markup. In this tutorial, we will go through the Bolt CMS installation on a FreeBSD 12 system using Nginx as the web server and MySQL as the database server; optionally, you can secure the transport layer by using the acme.sh client and the Let's Encrypt certificate authority to add SSL support. Requirements: the system requirements for Bolt are modest, and it should run on any fairly modern web server: PHP version 5.5.9 or higher with the following common PHP extensions: pdo, mysqlnd, pgsql, openssl, curl, gd, intl, json, mbstring, opcache, posix, xml, fileinfo, exif, zip. Access to SQLite (which comes bundled with PHP), or MySQL or PostgreSQL. Apache with mod_rewrite enabled (.htaccess files) or Nginx (virtual host configuration covered below). A minimum of 32MB of memory allocated to PHP.

hammer2 - Optimize hammer2 support threads and dispatch (http://lists.dragonflybsd.org/pipermail/commits/2019-September/719632.html)

Refactor the XOP groups in order to be able to queue strategy calls, whenever possible, to the same CPU as the issuer. This optimizes several cases and reduces unnecessary IPI traffic between cores. The next best thing would be to not queue certain XOPs to an H2 support thread at all, but I would like to keep the threads intact for later clustering work. The best scaling case for this is when one has a large number of user threads doing I/O. One instance of a single-threaded program on an otherwise idle machine might see a slight reduction in performance, but at the same time we completely avoid unnecessarily spamming all cores in the system on behalf of a single program, so overhead is also significantly lower. This will tend to increase the number of H2 support threads, since we need a certain degree of multiplication for domain separation. This should significantly increase I/O performance for multi-threaded workloads.

You know, we might as well just run every network service over HTTPS/2 and build another six layers on top of that to appease the OSI 7-layer burrito guys (http://boston.conman.org/2019/10/17.1)

I've seen the writing on the wall, and while for now you can configure Firefox not to use DoH, I'm not confident enough to think it will remain that way. To that end, I've finally set up my own DoH server for use at Chez Boca. It only involved setting up my own CA to generate the appropriate certificates, installing my CA certificate into Firefox, configuring Apache to run over HTTP/2 (THANK YOU SO VERY XXXXXXX MUCH, GOOGLE, FOR SHOVING THIS HTTP/2 XXXXXXXX DOWN OUR THROATS! - no, I'm not bitter) and writing a 150-line script that just queries my own local DNS, because, you know, it's more XXXXXXX secure or some XXXXXXXX reason like that. Sigh.
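For a sense of what such a script has to handle: a DoH client just sends a DNS question over HTTPS and gets an answer back. A quick way to eyeball the exchange from the shell, using Cloudflare's public resolver and its JSON flavour of the protocol purely as a stand-in for a self-hosted endpoint (RFC 8484 proper uses a binary application/dns-message body instead):

  # Ask a DoH endpoint for the A record of example.com and print the
  # JSON response; swap in your own server's URL to test a local setup.
  curl -s -H 'accept: application/dns-json' \
    'https://cloudflare-dns.com/dns-query?name=example.com&type=A'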
Beastie Bits

An Oral History of Unix (https://www.princeton.edu/~hos/Mahoney/unixhistory)
NUMA Siloing in the FreeBSD Network Stack [pdf] (https://people.freebsd.org/~gallatin/talks/euro2019.pdf)
EuroBSDCon 2019 videos available (https://www.youtube.com/playlist?list=PLskKNopggjc6NssLc8GEGSiFYJLYdlTQx)
Barbie knows best (https://twitter.com/eksffa/status/1188638425567682560)
For the #OpenBSD #e2k19 attendees, I did a pre-visit today. (https://twitter.com/bob_beck/status/1188226661684301824)
Drawer Find (https://twitter.com/pasha_sh/status/1187877745499561985)
Slides - Removing ROP Gadgets from OpenBSD - AsiaBSDCon 2019 (https://www.openbsd.org/papers/asiabsdcon2019-rop-slides.pdf)

Feedback/Questions

Bostjan - Open source doesn't mean secure (http://dpaste.com/1M5MVCX#wrap)
Malcolm - Allan is Correct. (http://dpaste.com/2RFNR94)
Michael - FreeNAS inside a Jail (http://dpaste.com/28YW3BB#wrap)

Send questions, comments, show ideas/topics, or stories you want mentioned on the show to feedback@bsdnow.tv
Scott Jennings, former adviser to U.S. Senator Mitch McConnell, and John Nichols, correspondent at the Nation, join Christiane Amanpour to explore the Democrats' big win in Kentucky. Iltija Mufti, daughter of Mehbooba Mufti, the former chief minister of Jammu and Kashmir, discusses India's crackdown in Kashmir. Our Miles O'Brien sits down with Leonard Kleinrock and Vint Cerf, two founding fathers of the internet, to reflect on their first pioneering steps and what they would do differently if they could go back.
Get is worst on customer satisfaction
Get set out to be the best at marketing and PR, but instead ended up worst on customer satisfaction. Rema 1000 wins awards for its marketing, but is losing revenue, reputation, and market share alike. Is it really true that award-winning advertising is 11 times more effective, and is it more important to be good at marketing and PR than to be best at your core business?

The internet turns 50
On October 29, 1969, computer science professor Leonard Kleinrock and his student, Charley Kline, sent a message from UCLA to Stanford Research Institute researcher Bill Duvall. The message was supposed to be "login", but only "lo" got through; the remaining three letters were lost to a computer crash at the Stanford end. This was the start of what would become the internet, which has changed the way we live, play, and learn.

Google has developed the world's first quantum computer
What is a quantum computer, and how fast is a quantum computer really compared to a classical supercomputer? Google claims to have developed a quantum computer that can complete in a few minutes a calculation that a traditional supercomputer would need 10,000 years for. The Washington Post compares this to the Wright brothers' first flight, which lasted 12 seconds, back in 1903.

8 out of 10 trust Norwegian media
Many have been skeptical of Aftenposten's Forklart, which on Saturday ran a sponsor-funded episode, delivered by Schibsted and Equinor. The skepticism has centered on whether it will undermine trust in the editorial episodes of Forklart. But will it really? A fresh survey from the Norwegian Media Authority shows that 8 out of 10 trust Norwegian media. Aftenposten, NRK and TV 2 score highest, while VG has seen the biggest gain in trust - owing not to VG's extensive use of sponsored content, but to the journalistic work it did on the "Giske dance". Is bad journalism a bigger threat to trust than clearly labelled sponsored content?

Other stories:
Screen time is good for kids
Madam Secretary warns against artificial intelligence

Episode sponsors:
Tripletex - accounting made smart, simple and efficient
CheckIn - simplifies registration and ticket sales

Links to everything I've talked about can be found at HansPetter.info. See acast.com/privacy for privacy and opt-out information.
The Internet is celebrating a birthday! On October 29, 1969, the worldwide computer network was created in an experiment led by researcher Leonard Kleinrock and, over the decades, it gained strength, became popular, and today is part of everyone's daily life. In that spirit, today's commentary looks at the habits of the past that people have retired thanks to the resources of the Web.
Professor Leonard Kleinrock is a distinguished professor of computer science at UCLA. His mathematical theory of packet networks is one of the technologies underlying the internet. Professor Kleinrock directed the first-ever transmission of a message on the internet, from his lab at UCLA to the Stanford Research Institute, on October 29, 1969. In 2007, the professor received the President's National Medal of Science, the highest honor in science bestowed by the President of the United States. During this podcast, Professor Kleinrock describes the moment the internet was born and explains why it was created. He also talks about the vision he and others had for the internet and what drove them in the early days to develop it. And Professor Kleinrock reveals one thing that he didn't anticipate about how the internet is used today.
Our Guest

Paolo Bergamo became a visiting professor at UCLA (EE dept) in 2002. He worked in the group of Leonard Kleinrock, the "father of the internet" who built the first three ARPANET nodes. Paolo then left academia to join a mobile technology startup called Sendia, based in Santa Monica, CA. Sendia became Salesforce's first acquisition, in 2006, and since then Paolo has led mobile products for Salesforce. In 2008, the Salesforce Mobile app was the first business app ever in the iPhone App Store. He's currently SVP for Salesforce1 and Mobile, defining the mobile strategy and leading mobile product management.

Here are the highlights of our conversation with our guest:

A combination of events shaped Paolo's interest in communications and drove him towards mobile. As a young student he considered himself more of a mathematician, until a teacher pointed out that he was an engineer. His father worked in the telco industry in Italy for 35 years, and Paolo was always obsessed with bundling and hiding all the different wires sticking out of his computers. This led him to explore mobile and wireless at an early age and encouraged an education and career in electronics. He was very much inspired by his experience learning from Leonard Kleinrock - literally the father of the internet.

After two years at UCLA, he learned that he really enjoyed research and teaching, and loved being around young people who are passionate and want to learn. However, academic research was a little too lengthy and offered almost no opportunity to productize inventions. To fulfill his calling as an engineer, he knew he needed to take an idea and actually get it out in front of people - to get feedback from actual users and consumers.

Then came the opportunity to join Sendia, a start-up building business tools for early-era mobile phones. They wanted to translate the most important business tools to mobile. The first web services were starting to appear in back-end systems like Salesforce, so they built a middle layer between the web services API and the mobile phone, with apps highly optimized for offline data communications. It was a success; they were acquired by Salesforce and he became a product manager. When the App Store opened to third-party developers, the Salesforce Mobile app caught Steve Jobs' eye and Sendia received a phone call. It was a success once more, and people trusted him to manage the mobile strategy, product, and platform more and more. With all this, Paolo believes that rethinking mobile is the key to innovation.

Paolo talks about Salesforce's legacy as one of the companies that built the cloud. They help customers do business in a whole new way and enable companies to be smarter and more efficient. He believes the mission of any large enterprise is to go back to an era where the quality of customer service is nothing less than excellent, and Salesforce aims to be the platform that enables businesses to do this.

As SVP of Mobile for Salesforce1, he is moving the needle towards that goal by mobilizing data, business processes, and collaboration. They also have optimized applications, like Wave, should businesses need to slice and dice data. And they enable customers to build their own applications for their own customers through mobile platforms and SDKs. They have a low-code, no-code mantra: every time they build a piece of code or functionality, anyone should be able to build an application with it in a visual way.
All these applications are connected rather than standalone, and this is where he comes in as a product manager. The critical challenges they face include tactically deciding which devices to support out of the box and which to delegate to partners; making it really easy for an intelligent admin to orchestrate events from multiple systems and channels, in line with the low-code, no-code mantra; and having a single center to cater to all systems.

Rapid Fire Questions

What is your definition of innovation?
Innovation is anything that frees us from mundane tasks and gives us back the time to do what we want.

Would you put more emphasis on the idea or the execution? How would you weigh each of them and why?
30% idea and 70% execution. When you do things right and execute well, you are in a good position to adjust your strategy the moment an idea is not successful.

What is your biggest learning lesson on your journey so far?
Adapting. What took you here will not take you there. You need to keep challenging yourself to scale, and be less of a control freak.

What is your favorite business book?
Multipliers: How the Best Leaders Make Everyone Smarter by Liz Wiseman; Crossing the Chasm by Geoffrey Moore; Escape Velocity by Geoffrey Moore

What is your favorite digital resource?
Mobile First; Freakonomics; The Rubin Report

What is your favorite app and why?
Slack; Fitbit
Episode 008: What to Say in Your Opening Statement

Hello, I'm Will Harris. Welcome to the Power Sales Guru podcast. The title of this episode is What to Say in Your Opening Statement. I love the story of the high school basketball coach who was attempting to motivate his players to persevere through a difficult season. Halfway through the season he stood before his team and said, "Did Michael Jordan ever quit?" The team responded, "No!" "What about the Wright brothers? Did they ever give up?" "No!" the team responded. "Did Steve Jobs ever quit?" Again the team yelled, "No!" "Did Elmer McAllister ever quit?" There was a long silence. Finally one player was bold enough to ask, "Coach... who's Elmer McAllister? We've never heard of him." The coach snapped back, "Of course you've never heard of him... he quit!"

I love that story. In this podcast we are going to discuss the four parts of a great opening statement - and how to make sure you never quit before you connect with a new customer. Allow me to paint a picture for you. You made a call to speak with a potential client. You have built rapport with the gatekeeper, who was happy to supply helpful information. When you are transferred to the decision maker, he answers the call and says, "Hello. This is Tom." The clock begins. You have ten seconds to generate interest before you lose them forever. The moment is here and the pressure is on. What you say next makes or breaks the call. What do you do when the pressure is on? What do you say to make progress?

The desires of our heart often require unquenchable spirit, sweat, energy beyond what we think is possible, and an undying commitment. These prerequisites to success culminate in one overriding quality: perseverance. I'm a big fan of the great psychologist William James. Dr. James said, "Fatigue gets worse up to a certain point, when gradually or suddenly, it passes away and we are fresher than before." At that point we have tapped a new level of energy. There may be layer after layer of this experience - a third, even a fourth "wind." We find amounts of ease and power we never dreamed ourselves to own, sources of strength habitually not taxed, because habitually we never push through the obstruction right in front of us.

Dr. Seuss's first children's book was rejected by 23 publishers. The twenty-fourth publisher sold six million copies, and Dr. Seuss realized his perseverance resulted in challenging and educating millions of children. After having been rejected by both Hewlett-Packard and Atari, Apple microcomputers had first-year sales of $2.5 million. And Formula 409 got its name because 408 times they submitted their compound and were rejected - but the 409th time, it was accepted. So how do you know when enough is enough? When we achieve what we set out to do, that is when it is enough. Are you willing to do enough?

To cold call, or to prospect within an account, you need four parts in your opening statement. For a great opening statement, four things are enough. The four major parts of a powerful opening statement are:

- Greeting
- Attention Statement
- Interest Statement
- Opening Question

The opening statement creates a bridge from "I don't know you" to "I need to know you." Different is the new great. The goal is to stand out among the sea of sales people fighting for your prospect's interest. Now let's take a look at the first thing to say in your opening statement: the Greeting. Some sales people whisper the name of their company as if they were passing notes to another student in class.
Or they say it so quickly that it lacks the luster and power the name deserves. The greeting is where you acknowledge who you are and which company employs you. This should be said slowly and with pride: "Hello. My name is Will Harris. I'm with Willpower Consultation." The greeting quickly says: Hi, Me, and Us. A great opening statement quickly shifts focus from the salesperson onto the prospect. Avoid the trap of focusing on yourself during the call. Introduce yourself and your company, then quickly move the conversation onto the prospect. I have been asked whether to use your first name only in your greeting, or your first and last name for ultimate professionalism. Whatever makes you feel comfortable - that's what you should do. If you have some super cool name like Alexander Fraiser III, then you may use it in order to stand out and be different. But if your style is more laid back, then you may say, "I'm Alex." You can phrase the greeting whichever way fits your style. Just remember, the greeting is simply "Hi, Me, and Us." Done! That was the first part of your opening statement.

The second thing to say in your opening statement is your attention statement. A salesperson traveled out of state for a big meeting with a prospect. He was dining at a restaurant chain he was familiar with back home. The usual steak, ordered well-done, was served rare. Irritated, he furiously motioned for the waiter. When the waiter came to his table, the salesperson blurted out, "I said WELL-DONE!" "Well, thank you," responded the waiter. "Your compliment is appreciated."

Have you ever received a phone call where it took a minute to understand what was being said to you - either you struggled to recognize the voice or to figure out why this person was calling? Maybe it was the accent, or the speed at which someone spoke. Many opening statements are great, but they have the content in the wrong order, and prospects miss what you actually say because they are still trying to process your voice and what the call is all about. Like the waiter, they totally miss your point. And when you're prospecting, they have the control to end the call quickly without ever finding out what you really want. The attention statement allows time for your prospect to process your speech pattern and recognize the reason for your call. But the main goal is to grab the prospect's attention. Before you called them, they were not sitting around waiting for you to call. Prospecting is an interruption in someone's day. So you have to say something that will make them stop, drop their pen, or stop typing long enough to hear you. Ten seconds, right?

The key to a great attention statement is that you must mention something familiar to them - something they recognize enough to make them pause. You may say, "But Will, if this is a cold call, if I've never spoken to them before, then how can I mention something familiar?" So, I have a free Power Tool for you called Grabbing Attention. You can find it at www.PowerSales.guru under Power Tools. You can use that Power Tool to know what to say in your opening statements. And I am going to lay it out for you right now. The various types of attention statements:

- Name recognition
- Industry (you can mention something specific around their vertical market)
- Company
- Job title

These four areas make you stand out when you say your opening statement. So grab that free Power Tool called Grabbing Attention.
The next part of the opening statement is the most important part: the Interest Statement. Different is the new great, because people do not just go through the motions when they are different. The ability to approach Power Prospecting from a fresh outlook, staying away from conventional ways of prospecting, is paramount. Different does not mean crazy. Different arises when people look at old problems in a new way. The interest statement is the most important part of the opening statement. While all parts are important, this one is the end-all-be-all of your opening statement. The WIIFM ("what's in it for me") message you developed gets condensed into one simple statement that says it all. Remember, your prospect will always wonder what's in it for them. While they may not say it while you are talking, they are definitely thinking it. If you cannot help them come up with a reason they should be talking to you, your call will come to an end.

You know, several bankers were once debating the question: who was the greatest inventor? One cast his vote for Stephenson, who invented the railroad; another for the Wright brothers for inventing the plane. One man even voted for Leonard Kleinrock, for contributing to the invention of the Internet! Finally, one banker turned to a man in the lobby who was listening but not contributing to the debate. The banker asked the man what he thought. "Well," the man replied with a big smile, "whoever created INTEREST was the greatest." The man's reply has a double meaning: interest attributed to a loan, or interest in buying a product. The salesperson in me likes the idea of creating interest in buying a product. No matter what product you sell, people buy because of their interests. Many business-to-business sales people launch into talking about their products without focusing on the prospect's interests. Since different is the new great, speak about the overall business problems the customer is interested in solving.

I had a coaching call with a client who sells barcode scanning equipment. She complained that when she prospects, she quickly hears that the potential customer does not have a need for her product. After listening to her opening statement, I informed her that the reason for her failure was that she was being too specific about how she was going to help the prospect. She had a hard time grasping this concept. She did not want to resort to tricks or gimmicks to sell her equipment. I asked her if she had ever heard of a company named Smith Corona. She knew that the company sold typewriters. Smith Corona was the largest manufacturer of typewriters in the world. In their heyday, a young executive suggested they take a look at data processing. His suggestion was quickly shut down by the other executives, who felt they were in the business of selling typewriters and only typewriters. Now, many years later, we are hard pressed to find a Smith Corona anywhere. They missed the fact that they were not in the business of selling typewriters; they were in the business of processing and transferring information. I explained to the sales rep that she was suffering from the same delusion. She did not sell barcode scanners. She was in the business of developing businesses. The goal is to find solutions for your customers, not customers for your solutions. By adjusting her opening statements to reflect her prospect's overall business needs, she would be able to increase her prospecting effectiveness.
Creatively pairing a customer's need to your product is the winning philosophy behind a great interest statement. So in this part of your opening statement you want to start with a business development need, not a product feature. There is a difference between saying, "I specialize in helping my customers find customers," and saying, "I sell barcode scanning equipment to help you have real-time information." What is the root business cause that you champion? Because that is where you will find what to say during your opening statement. Another Power Tool you can get on my site is Power Prospecting. It can guide you through writing your opening statement. Now we will look at the last part of what to say during your opening statement: the question.

Opening Question

At this point in your opening statement it is time to get them involved in the conversation. This is where you ask a question that serves as a bridge out of the opening statement and into a deeper conversation. The question must be properly selected for maximum engagement. Remember, you have been talking for the last eight seconds, and now you merely want to get them talking. Consider it the launch pad for closing your sale. This is a huge moment - the crashing point for many sales. There are tips for crafting the perfect questions that we will cover in upcoming episodes. But if you proactively select a great opening question, it will be enough to increase your success.

The four parts we discussed today around the opening statement are what I have worked on every day for the past 15 years. What you say in your opening statement can be used personally or professionally. It can be used over the phone or in email. By email, it only takes some tweaking of the last part: the question can become a suggested next step. But in order to have a next step, you have to take the first step, and then comes the next.

Charles Goodyear was obsessed with the idea of making rubber unaffected by extreme temperatures. Years of unsuccessful experimentation caused bitter disappointment, imprisonment for debt, family difficulties, and ridicule from friends. He persevered, and Goodyear discovered that adding sulfur to rubber achieved his purpose. I have four of them on my car right now. But he's not the only one with a great story of perseverance. In his first three years in the automobile industry, Henry Ford went bankrupt twice. Inventor Chester Carlson cold called for years before he could find backers for his Xerox photocopying process. Don't give up before you win. Develop an opening statement that includes the four parts, and you will have enough... for success. I hope I kept your attention during this episode. I do this every day in my own prospecting and in working with others, so I love it. This podcast's free Power Tool is available now on my site, www.PowerSales.guru - grab the one called Grabbing Attention. Thank you for listening to the Power Sales Guru podcast. I am your host, Will Harris, wishing you happy selling. And remember: different is the new great.
On Clave7 there is also room for culture - if anyone still thinks that what our team does is far removed from it. In the Canary Islands we have many young creators who see no barriers in the limited territory of a few islands in the middle of an immense sea. Imagination has no limits, and that is the real mystery hovering over this interview. Writer and screenwriter Jeniffer Castañeda García visits our program to talk about the "magical" ability to create scenes and characters in the two-dimensional space of a page that can convey emotion to whoever reads the lines in which they are, seemingly, hidden - something she knows well and handles with care in her works "Por una cabeza" and "El Sendero Bimbache". It is the catalyst that sets off the big bang of the reader's imagination, who is surprised to find a whole multidimensional universe where those written phrases take on a life of their own. If you want to learn about or purchase Jeniffer Castañeda's works, you can contact her through her website: http://xn--jeniffercastaeda-jub.es/ The internet is so integrated into our lives (at least in the so-called "first world") that it will end up becoming a "global nervous system", completely invisible to our eyes. So predicts Leonard Kleinrock, creator of "ARPANET", the embryo of today's World Wide Web. Many experts, including Kleinrock himself, are clear about the benefits "the net" has brought, but at the same time they do not forget its darker side. Will our excessive dependence on the internet, with its easy access to any data we need, make us lose the ability to think with our own judgment?
On 29 October 1969, Leonard Kleinrock's research team at UCLA transmitted a message from one computer to another located at Douglas Engelbart's research lab at the Stanford Research Institute. That transmission was the first to send a message via the ARPANET using packets, just as messages are sent via today's Internet. This presentation uses the occasion of the Internet's fortieth birthday to discuss research being undertaken at the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago and its consequences for future forms of computer-mediated communication and for the Internet.