On this episode of Christopher Lochhead: Follow Your Different, we have a dialogue with Steve Vassallo, a General Partner at Foundation Capital, to celebrate the firm's 30th anniversary and explore the evolution of venture capital (VC) in Silicon Valley. We discuss the shift of many traditional VC firms from early-stage investments to a more asset-management-oriented approach. Steve talks about the importance of maintaining a craft-oriented, personalized approach to VC, focusing on product excellence and effective distribution. He also highlights current trends in AI and blockchain, urging founders to prioritize innovation and core product differentiation in a rapidly changing market. With nearly 18 years at Foundation Capital, Steve shares unique insights into the changing VC landscape and the challenges faced by early-stage founders. You're listening to Christopher Lochhead: Follow Your Different. We are the real dialogue podcast for people with a different mind. So get your mind in a different place, and hey ho, let's go.

Celebrating 30 Years of Foundation Capital
Foundation Capital recently marked its 30th anniversary, a significant milestone in its journey. The firm has launched its 11th fund, a $600 million vehicle aimed at supporting extraordinary founders at the earliest stages of their ventures, particularly in the enterprise sector. Steve adds that more than half of their investments focus on early-stage companies, including seed and Series A rounds, with a strong emphasis on technology, particularly fintech and crypto.

Steve Vassallo on the Changing VC Landscape
Christopher and Steve then discuss the evolution of VC firms, noting that many traditional firms have transformed into asset managers rather than remaining true venture capitalists. Steve points out that the share of capital these firms raise for early-stage investing has dwindled significantly: he estimates that only about 20% of the capital raised by the larger firms is allocated to early-stage investments, with the majority directed toward growth-stage companies. In contrast, Foundation Capital dedicates approximately 70-80% of its recent fund to backing founders at the inception of their ideas. Steve humorously refers to these early-stage entrepreneurs as "pre-founders" or "pre-criminals," highlighting the raw potential and creativity that often characterize this stage of entrepreneurship.

The Craft of Venture Capital
The conversation then moves to the notion that venture capital is fundamentally a craft business rather than a scalable, factory-like operation. Christopher likens the venture capital process to crafting custom surfboards, where the quality and personal touch of the creator matter significantly. He argues that the best results come from a deep, personalized partnership with founders rather than a one-size-fits-all approach. Reflecting on his own background in product design, Steve adds that he initially believed the best product would always win in the market. However, he quickly learned that effective distribution often trumps product quality. This realization was humbling, as he came to recognize the critical role marketing and sales play in a product's success. He stresses that when extraordinary products are paired with exceptional distribution channels, remarkable outcomes can occur. To hear more from Steve Vassallo on the future of venture capital, download and listen to this episode.

Bio
Steve Vassallo is a General Partner at Foundation Capital, where he invests at the intersection of design, technology, and business. Since joining the firm in 2007, he has led investments in transformative companies such as Stripe, Sunrun, Cerebras Systems, and Solana. Steve co-leads Foundation's crypto practice and is known for backing product-first founders tackling consequential problems ...
Meet the CEO of Cerebras Systems in the latest installment of our oral history project. We Meet: Cerebras CEO Andrew Feldman. Credits: This episode of SHIFT was produced by Jennifer Strong with help from Emma Cillekens. It was mixed by Garret Lang, with original music from him and Jacob Gorski. Art by Meg Marco.
Hagay Lupesko is the SVP for AI Inference at Cerebras Systems. Subscribe to the Gradient Flow Newsletter
Forbes Assistant Managing Editor Katharine Schwab talks with Cerebras Systems' CEO and cofounder Andrew Feldman about his startup's AI chip, the impact of China's DeepSeek and its implications for the global AI landscape. See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
It's a front-page cartoon in Libération in Paris: two robots racing, one American, the other Chinese... and behind them, a small robot stamped "France-European Union," ridden by Emmanuel Macron, desperately trying to catch up with the other two... "The hunt for artificial intelligence," the paper headlines. "Heads of state and the world's AI elite gather for two days (in Paris). Faced with American-Chinese leadership, France and Europe are trying to stay in the race and to regulate it." For Le Figaro, "Europe cannot miss the artificial intelligence revolution. Its survival depends on it." Le Figaro asks: "Does Europe want to confine itself to the role of regulator, writing the rules of the road for technologies designed elsewhere? To content itself with erecting guardrails while others build empires? Or will it finally understand that it is urgent to pour in billions and tear down barriers, to build infrastructure worthy of the name, to foster cross-border alliances, to bring forth its own technological titans by backing them massively, with capital and with access to data? For the stakes go far beyond economic competitiveness alone, Le Figaro adds: this is a question of sovereignty. A continent that does not control its digital tools becomes dependent, vulnerable, and sooner or later a digital colony of the powers of Silicon Valley or Shenzhen..."
A "turbulent and uncertain context"
Le Monde asks: "Are these French and European efforts too limited to matter against the American and Chinese powers? 'The example of DeepSeek shows that hundreds of billions of dollars are not needed to develop AI. And that in AI, nothing is settled yet,' responds the Élysée.
'An enormous number of transformations are still to come, and the situation, and the strength of the various countries and AI start-ups, can still evolve very quickly.'" Le Monde's comment: "an obligatory optimism in a most turbulent and uncertain context." "The urgency of global AI governance," exclaims Le Soir in Brussels. For the Belgian daily, "at the Paris summit, it will be crucial to choose, collectively, whether AI will be a lever of emancipation or an uncontrollable threat. Letting the tech giants decide in our place is not an option."
Macron puts himself on stage...
The Corriere Della Sera in Rome dwells on the video released Saturday by the Élysée, a video created with artificial intelligence and featuring Emmanuel Macron... "The images that scroll by, rigorously fake, show a Macron in various guises, as no one has ever seen him," the Italian daily recounts. "Macron with long hair giving beauty tips, Macron with an '80s hairstyle, Macron at a rave party, Macron talking about his love of cars in a scene from the film OSS117, Macron as a rapper, Macron dressed as a woman... And at the end of the video, the French president explains what the international summit opening this Monday in Paris is for: 'More seriously,' he says, 'artificial intelligence can do great things for health, energy, and life in society. France and Europe must be at the heart of this revolution to defend the principles we believe in.'"
France as Europe's AI engine?
Indeed, the Corriere Della Sera notes, as a sign that Europe is not lagging behind: "the world's fastest artificial-intelligence assistant is European and is called Le Chat. It was designed by the French company Mistral in partnership with the AI chip maker Cerebras Systems, which is backed by the Emirati technology conglomerate G42.
(...) And," the Italian daily adds, "this is only one of the novelties emerging within the European Union to counter American supremacy in artificial intelligence." And back to Le Monde in Paris, which this weekend published an op-ed by Sam Altman, the head of OpenAI, which develops the ChatGPT tool. Sam Altman asserts that France has become a "nerve center of AI" on the Old Continent. For him, artificial intelligence is indispensable for stimulating the economy. And "if we want growth, jobs, and progress," he says, "we must let innovators innovate, builders build, and developers develop. The risk of inaction is too great to ignore, which is why the country that gave us the Enlightenment (France) is now taking steps to succeed in its transition to the age of intelligence."
This episode is sponsored by Shopify. Shopify is a commerce platform that allows anyone to set up an online store and sell their products. Whether you're selling online, on social media, or in person, Shopify has you covered on every base. With Shopify you can sell physical and digital products. You can sell services, memberships, ticketed events, rentals and even classes and lessons. Sign up for a $1 per month trial period at http://shopify.com/eyeonai In this episode of the Eye on AI podcast, Andrew D. Feldman, Co-Founder and CEO of Cerebras Systems, unveils how Cerebras is disrupting AI inference and high-performance computing. Andrew joins Craig Smith to discuss the groundbreaking wafer-scale engine, Cerebras' record-breaking inference speeds, and the future of AI in enterprise workflows. From designing the fastest inference platform to simplifying AI deployment with an API-driven cloud service, Cerebras is setting new standards in AI hardware innovation. We explore the shift from GPUs to custom architectures, the rise of large language models like Llama and GPT, and how AI is driving enterprise transformation. Andrew also dives into the debate over open-source vs. proprietary models, AI's role in climate mitigation, and Cerebras' partnerships with global supercomputing centers and industry leaders. Discover how Cerebras is shaping the future of AI inference and why speed and scalability are redefining what's possible in computing. Don't miss this deep dive into AI's next frontier with Andrew Feldman. Like, subscribe, and hit the notification bell for more episodes! Stay Updated: Craig Smith Twitter: https://twitter.com/craigss Eye on A.I. 
Twitter: https://twitter.com/EyeOn_AI (00:00) Intro to Andrew Feldman & Cerebras Systems (00:43) The rise of AI inference (03:16) Cerebras' API-powered cloud (04:48) Competing with NVIDIA's CUDA (06:52) The rise of Llama and LLMs (07:40) OpenAI's hardware strategy (10:06) Shifting focus from training to inference (13:28) Open-source vs proprietary AI (15:00) AI's role in enterprise workflows (17:42) Edge computing vs cloud AI (19:08) Edge AI for consumer apps (20:51) Machine-to-machine AI inference (24:20) Managing uncertainty with models (27:24) Impact of U.S.–China export rules (30:29) U.S. innovation policy challenges (33:31) Developing wafer-scale engines (34:45) Cerebras' fast inference service (37:40) Global partnerships in AI (38:14) AI in climate & energy solutions (39:58) Training and inference cycles (41:33) AI training market competition
Family offices like Maelstrom and Motier Ventures are significantly investing in AI startups, driving innovation and reshaping the tech landscape. Elon Musk's lawsuit against OpenAI reveals internal conflicts and strategic decisions, highlighting concerns about the shift from a nonprofit to a for-profit model and broader industry challenges. Elon Musk has expanded his lawsuit against OpenAI and Microsoft, alleging monopolistic practices and raising concerns over AI power consolidation and regulatory scrutiny. Microsoft aims to become carbon negative by 2030, investing in direct air capture technology and partnering with RBC and Deep Sky to fund innovative carbon capture projects. JobGet's acquisition of Snagajob aims to create a powerful platform for hourly and frontline workers, leveraging AI tools to enhance job matching and streamline the application process. Elon Musk's AI company, xAI, is raising $6 billion to invest in Nvidia chips, enhancing its AI infrastructure and capabilities, including its chatbot Grok. The integration of AI into daily life has raised significant privacy and safety concerns, prompting FTC Chair Melissa Holyoak to call for an investigation into AI data practices. O2 has introduced Daisy, an AI-powered tool designed to engage telephone scammers in meaningless conversations, developed with cybersecurity expert Jim Browning. The New York State Department of Financial Services has issued new guidance for financial institutions to mitigate cybersecurity risks associated with AI, emphasizing robust risk management frameworks and data quality. Xiaodi Hou, former CEO of TuSimple, is seeking a court order to prevent the company from transferring its U.S. assets to China and has launched a new autonomous trucking startup, Bot Auto. The startup ecosystem saw activity with Klarna filing for a U.S. IPO and PayU planning to go public in 2025, highlighting growth and regulatory challenges in fintech. 
In 2017, OpenAI considered acquiring Cerebras Systems to leverage its AI chip technology but ultimately shifted focus to collaborating with semiconductor firms. Bluesky is gaining traction as a decentralized social network prioritizing user privacy and control, attracting users disillusioned with traditional platforms like X. Cruise, General Motors' autonomous vehicle subsidiary, is addressing safety concerns and regulatory actions following a high-profile incident involving a robotaxi. Lenovo is diversifying its supply chain by establishing new manufacturing facilities outside China, including a significant investment in Saudi Arabia, and capitalizing on the AI PC market.
Follow Prof G Markets: Apple Podcasts Spotify Scott and Ed open the show by discussing the ongoing machinist strike at Boeing, Amazon's new AI tool for delivery drivers, the DOJ's suggested remedies for the Google antitrust case, and a potential delay in Cerebras Systems' IPO. Then they break down Hindenburg's accusations against Roblox and discuss why its business model is so problematic. Finally, they break down Germany's economic issues and why the country's lack of spending might be the root cause of its problems. For our take on the Tesla robotaxi event and the stock's resulting drawdown, tune in on Thursday. Order "The Algebra of Wealth," out now Subscribe to No Mercy / No Malice Follow the podcast across socials @profgpod: Instagram Threads X Reddit Learn more about your ad choices. Visit podcastchoices.com/adchoices
Follow Prof G Markets: Apple Podcasts Spotify Scott and Ed open the show by discussing Tesla's quarterly deliveries, a potential CVS breakup, and a venture capital firm's decision to return money to investors. Then Scott explains the biggest red flag he sees in chipmaker Cerebras Systems as it prepares to go public, but breaks down why he would still invest in the company. Scott and Ed debate about sovereign wealth funds in the Gulf and whether or not the funds make smart investments. Finally, they examine Nike's earnings and break down why Nike's dependence on its brand might have led to its downfall. Vote for the Prof G Pod at the Signal Awards Order "The Algebra of Wealth," out now Subscribe to No Mercy / No Malice Follow the podcast across socials @profgpod: Instagram Threads X Reddit Follow Scott on Instagram Follow Ed on Instagram and X Learn more about your ad choices. Visit podcastchoices.com/adchoices
Our 181st episode with a summary and discussion of last week's big AI news! With hosts Andrey Kurenkov and Jeremie Harris. Read our text newsletter and comment on the podcast at https://lastweekin.ai/ If you would like to become a sponsor for the newsletter, podcast, or both, please fill out this form. Email us your questions and feedback at contact@lastweekinai.com and/or hello@gladstone.ai In this episode: - Google's AI advancements with Gemini 1.5 models and AI-generated avatars, along with Samsung's lithography progress. - Microsoft's Inflection usage caps for Pi, new AI inference services by Cerebras Systems competing with Nvidia. - Biases in AI, prompt leak attacks, and transparency in models and distributed training optimizations, including the DisTrO optimizer. - AI regulation discussions including California's SB1047, China's AI safety stance, and new export restrictions impacting Nvidia's AI chips. Timestamps + Links: (00:00:00) Intro / Banter (00:03:08) Response to listener comments / corrections Tools & Apps (00:09:19) Google's custom AI chatbots have arrived (00:12:52) Google releases three new experimental AI models (00:17:14) Google Gemini will let you create AI-generated people again (00:22:32) Five months after Microsoft hired its founders, Inflection adds usage caps to Pi (00:26:42) Plaud takes a crack at a simpler AI pin Applications & Business (00:30:31) Cerebras Systems throws down gauntlet to Nvidia with launch of 'world's fastest' AI inference service (00:41:06) Nvidia announces $50 billion stock buyback (00:46:24) OpenAI in talks to raise funding that would value it at more than $100 billion (00:50:44) OpenAI Aims to Release New AI Model, 'Strawberry,' in Fall (00:52:53) 3 Co-Founders Leave French AI Startup H Amid 'Operational Differences' (00:57:29) Samsung to Adopt High-NA Lithography Alongside Intel, Ahead of TSMC (01:02:11) Unitree's $16,000 G1 could become the first mainstream humanoid robot Projects & Open Source (01:04:59) Meta leads open-source AI boom, Llama downloads surge 10x year-over-year (01:09:08) A_Preliminary_Report_on_DisTrO Research & Advancements (01:13:56) Diffusion Models Are Real-Time Game Engines (01:23:18) LLM Defenses Are Not Robust to Multi-Turn Human Jailbreaks Yet (01:32:21) Interviewing AI researchers on automation of AI R&D (01:40:33) Anthropic releases AI model system prompts, winning praise for transparency Policy & Safety (01:47:12) U.S. AI Safety Institute Signs Agreements Regarding AI Safety Research, Testing and Evaluation With Anthropic and OpenAI (01:50:46) China's Views on AI Safety Are Changing—Quickly (01:56:27) Poll: 7 in 10 Californians Support SB1047, Will Blame Governor Newsom for AI-Enabled Catastrophe if He Vetoes (02:01:31) Elon Musk voices support for California bill requiring safety tests on AI models (02:03:55) Chinese Engineers Reportedly Accessing NVIDIA's High-End AI Chips Through Decentralized "GPU Rental Services" (02:08:25) U.S. gov't tightens China restrictions on supercomputer component sales Synthetic Media & Art (02:11:13) Actors Say AI Voice-Over Generator ElevenLabs Cloned Likenesses (02:14:06) Outro
Andy Jassy is delighted that the upgrade time for Java applications has been cut from 50 developer-days to a few hours. Why does Klarna believe it can still lay off so many more people? Cerebras Systems has launched a new AI inference service that, according to the company, is 20 times faster than comparable cloud-based services running on Nvidia's most powerful GPUs, at a significantly lower cost per token. Tether, the company behind the cryptocurrency of the same name, is reportedly (per the WSJ) investing, with the help of Christian Angermayer, in seemingly unrelated companies such as Northern Data and BlackRock Neurotech. Advertisement: Sign up now for the LIQID webinar "Why professionals rely on venture capital" on September 7 at 11 a.m. Philipp Glöckler and Philipp Klöckner talk today about: (00:00:00) Intro (00:09:30) Klarna (00:22:00) Nvidia Friede Freude AI Kuchen (00:35:00) AI use cases: Andy Jassy, Amazon Q (00:40:10) Tether (00:44:30) Uber FSD (00:45:20) Yelp (00:47:00) Reddit (00:55:35) Crowdstrike (01:00:45) Salesforce (01:04:00) Birkenstock Shownotes: Klarna: Handelszeitung, Tech.eu, Pips LinkedIn OpenAI Funding: WSJ Cerebras: X, Siliconangel Angermayer Tether: WSJ Uber FSD: Reuters Yelp: The Information
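The "lower cost per token" claim is, at bottom, throughput arithmetic: at full utilization, cost per token is the instance's hourly cost divided by the tokens it generates per hour. A minimal sketch with purely hypothetical numbers (these are illustrative figures, not Cerebras or Nvidia pricing):

```python
def cost_per_million_tokens(tokens_per_second: float,
                            instance_cost_per_hour: float) -> float:
    """Hardware cost to generate one million tokens, assuming full utilization."""
    tokens_per_hour = tokens_per_second * 3600
    return instance_cost_per_hour / tokens_per_hour * 1_000_000

# Hypothetical numbers for illustration only (not vendor pricing):
baseline = cost_per_million_tokens(tokens_per_second=100, instance_cost_per_hour=4.0)
faster = cost_per_million_tokens(tokens_per_second=2000, instance_cost_per_hour=20.0)
print(f"baseline: ${baseline:.2f}/M tokens, faster: ${faster:.2f}/M tokens")
```

On these made-up numbers, a system with 20x the throughput can cost 5x more per hour and still come out roughly 4x cheaper per token, which is the shape of the argument being made.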
In this episode of Gradient Dissent, Andrew Feldman, CEO of Cerebras Systems, joins host Lukas Biewald to discuss the latest advancements in AI inference technology. They explore Cerebras Systems' groundbreaking new AI inference product, examining how their wafer-scale chips are setting new benchmarks in speed, accuracy, and cost efficiency. Andrew shares insights on the architectural innovations that make this possible and discusses the broader implications for AI workloads in production. This episode provides a comprehensive look at the cutting edge of AI hardware and its impact on the future of machine learning. ✅ *Subscribe to Weights & Biases* → https://bit.ly/45BCkYz
OpenAI has significantly influenced the M&A and IPO markets, with notable acquisitions and rapidly growing revenue. Recently, OpenAI acquired search and analytics startup Rockset and video collaboration startup Multi. These deals mark a shift towards inorganic growth, essential for large tech companies. This quarter, M&A activity has slightly increased with over 430 deals, though still lower than in previous years. Additionally, AI chips startup Cerebras Systems filed for an IPO, reflecting rising interest in AI technologies, which has also seen Nvidia become highly valuable. Investors watch closely as AI companies may drive the re-opening of the IPO and M&A markets, potentially leading to long-awaited returns. To learn more about this news, visit us at: https://greyjournal.net/news/ Hosted on Acast. See acast.com/privacy for more information.
In this episode, Mark and Shashank are joined by a special guest, Matt from Cerebras Systems. Matt, a key figure at Cerebras and a regular at the South Bay Generative AI Meetup, shares his wealth of knowledge about the cutting-edge advancements in AI hardware. He discusses how Cerebras is revolutionizing the field with their specialized ML training chips, which compete with Nvidia by optimizing for specific machine learning workloads. Tune in to learn about wafer-scale computing, the challenges and innovations in AI hardware, and how Cerebras is poised to lead the future of AI infrastructure.
Join us at our first in-person conference on June 25 all about AI Quality: https://www.aiqualityconference.com/ MLOps Coffee Sessions Special episode with Databricks, Introducing DBRX: The Future of Language Models, fueled by our Premium Brand Partner, Databricks. DBRX is designed to be especially capable of a wide range of tasks and outperforms other open LLMs on standard benchmarks. It also promises to excel at code and math problems, areas where others have struggled. Our panel of experts will get into the technical nuances, potential applications, and implications of DBRX for businesses, developers, and the broader tech community. This session is a great opportunity to hear from insiders about how DBRX's capabilities can benefit you. // Bio Denny Lee - Co-host Denny Lee is a long-time Apache Spark™ and MLflow contributor, Delta Lake maintainer, and a Sr. Staff Developer Advocate at Databricks. A hands-on distributed systems and data sciences engineer with extensive experience developing internet-scale data platforms and predictive analytics systems. He has previously built enterprise DW/BI and big data systems at Microsoft, including Azure Cosmos DB, Project Isotope (HDInsight), and SQL Server. Davis Blalock Davis Blalock is a research scientist and the first employee at MosaicML. He previously worked at PocketSonics (acquired 2013) and completed his PhD at MIT, where he was advised by John Guttag. He received his M.S. from MIT and his B.S. from the University of Virginia. He is a Qualcomm Innovation Fellow, NSF Graduate Research Fellow, and Barry M. Goldwater Scholar. He is also the author of Davis Summarizes Papers, one of the most widely-read machine learning newsletters. Bandish Shah Bandish Shah is an Engineering Manager at MosaicML/Databricks, where he focuses on making generative AI training and inference efficient, fast, and accessible by bridging the gap between deep learning, large-scale distributed systems, and performance computing. 
Bandish has over a decade of experience building systems for machine learning and enterprise applications. Prior to MosaicML, Bandish held engineering and development roles at SambaNova Systems where he helped develop and ship the first RDU systems from the ground up, and Oracle where he worked as an ASIC engineer for SPARC-based enterprise servers. Abhi Venigalla Abhi is an NLP architect working on helping organizations build their own LLMs using Databricks. Joined as part of the MosaicML team and used to work as a researcher at Cerebras Systems. Ajay Saini Ajay is an engineering manager at Databricks leading the GenAI training platform team. He was one of the early engineers at MosaicML (acquired by Databricks) where he first helped build and launch Composer (an open source deep learning training framework) and afterwards led the development of the MosaicML training platform which enabled customers to train models (such as LLMs) from scratch on their own datasets at scale. Prior to MosaicML, Ajay was co-founder and CEO of Overfit, an online personal training startup (YC S20). Before that, Ajay worked on ML solutions for ransomware detection and data governance at Rubrik. Ajay has both a B.S. and MEng in computer science with a concentration in AI from MIT. // MLOps Jobs board https://mlops.pallet.xyz/jobs // MLOps Swag/Merch https://mlops-community.myshopify.com/ // Related Links Website: https://www.databricks.com/ Databricks DBRX: https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm --------------- ✌️Connect With Us ✌️ ------------- Join our slack community: https://go.mlops.community/slack Follow us on Twitter: @mlopscommunity Sign up for the next meetup: https://go.mlops.community/register Catch all episodes, blogs, newsletters, and more: https://mlops.community/
Nvidia's new Blackwell GPU is HUGE, literally! If you're looking to be an Nvidia AI chip competitor, why not just make physically bigger chips? In this video, we explore the physics and economics behind AI chip design. We'll cover Nvidia's Blackwell packaging secrets, rival Cerebras Systems' wafer-scale chips, and the critical role of fab equipment makers in the race for AI system dominance.
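The economics behind "why not just make bigger chips?" are usually framed with the classic Poisson die-yield model: the chance a die has zero fabrication defects falls exponentially with its area. A back-of-the-envelope sketch, where the defect density and die areas are assumed illustrative values, not real fab data:

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_cm2: float) -> float:
    """Expected fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

d0 = 0.1             # assumed defects per cm^2 (illustrative only)
reticle_die = 8.0    # ~800 mm^2, near the reticle limit of a large GPU die
wafer_scale = 462.0  # ~46,000 mm^2, roughly the area of a wafer-scale engine

print(f"reticle-limit die yield: {poisson_yield(d0, reticle_die):.1%}")
print(f"naive wafer-scale yield: {poisson_yield(d0, wafer_scale):.1e}")
```

On these assumptions a reticle-sized die yields a workable fraction, while a naive wafer-scale part would essentially never be defect-free, which is why wafer-scale designs instead build in redundant cores and route around defects rather than demanding a perfect wafer.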
No Priors: Artificial Intelligence | Machine Learning | Technology | Startups
The GPU supply crunch is causing desperation amongst AI teams large and small. Cerebras Systems has an answer, and it's a chip the size of a dinner plate. Andrew Feldman, CEO and Co-founder of Cerebras and previously SeaMicro, joins Sarah Guo and Elad Gil this week on No Priors. They discuss why there might be an alternative to Nvidia, localized models and predictions for the accelerator market. Show Links: Andrew Feldman - Cerebras CEO & Co-founder | LinkedIn Cerebras Sign up for new podcasts every week. Email feedback to show@no-priors.com Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @andrewdfeldman Show Notes: (0:00:00) - Cerebras Systems CEO Discusses AI Supercomputers (0:07:03) - AI Advancement in Architecture and Training (0:16:58) - Future of AI Accelerators and Chip Specialization (0:26:38) - Scaling Open Source Models and Fine-Tuning
Cerebras Systems and G42 unveil Condor Galaxy, the world's largest AI training supercomputer, AI companies commit to safeguards at the request of the White House, Nous.co launches Generative AI to help manage household bills, ETHCC conference highlights Ethereum developments, GitHub's Copilot Chat enters limited public beta, OpenAI's ChatGPT Plus offers custom instructions, Google Messages adopts end-to-end encryption, car owners increasingly dissatisfied with infotainment systems, Games for Change announces winners of 2023 awards, Estée Lauder and Ofcom fall victim to MOVEit hack, Lazarus hackers breach JumpCloud to target cryptocurrency clients.
On this episode, we're joined by Andrew Feldman, Founder and CEO of Cerebras Systems. Andrew and the Cerebras team are responsible for building the largest-ever computer chip and the fastest AI-specific processor in the industry.

We discuss:
- The advantages of using large chips for AI work.
- Cerebras Systems' process for building chips optimized for AI.
- Why traditional GPUs aren't the optimal machines for AI work.
- Why efficiently distributing computing resources is a significant challenge for AI work.
- How much faster Cerebras Systems' machines are than other processors on the market.
- Reasons why some ML-specific chip companies fail and what Cerebras does differently.
- Unique challenges for chip makers and hardware companies.
- Cooling and heat-transfer techniques for Cerebras machines.
- How Cerebras approaches building chips that will fit the needs of customers for years to come.
- Why the strategic vision for what data to collect for ML needs more discussion.

Resources:
Andrew Feldman - https://www.linkedin.com/in/andrewdfeldman/
Cerebras Systems - https://www.linkedin.com/company/cerebras-systems/
Cerebras Systems | Website - https://www.cerebras.net/

Thanks for listening to the Gradient Dissent podcast, brought to you by Weights & Biases. If you enjoyed this episode, please leave a review to help get the word out about the show. And be sure to subscribe so you never miss another insightful conversation.

#OCR #DeepLearning #AI #Modeling #ML
Nathan Labenz sits down with Andrew Feldman, CEO and Co-Founder of Cerebras Systems, a company building a new class of computer system for accelerating AI and changing the future of work. Cerebras Systems is the creator of the world's largest chip, at 2.6 trillion transistors. In this episode, they discuss the founding story of Cerebras, the experience of creating the world's largest chip, and the process that goes into chip design and manufacturing for an AI-focused chip. This episode is the first part of our hardware exploration series, focused on the people building at the forefront of hardware applications in AI.

LINKS:
Cerebras: https://www.cerebras.net/
Book: Chip War by Chris Miller

RECOMMENDED PODCAST: The HR industry is at a crossroads. What will it take to construct the next generation of incredible businesses – and where can people leaders have the most business impact? Hosts Nolan Church and Kelli Dragovich have been through it all, the highs and the lows – IPOs, layoffs, executive turnover, board meetings, culture changes, and more. With a lineup of industry vets and experts, Nolan and Kelli break down the nitty-gritty details, trade-offs, and dynamics of constructing high-performing companies. Through unfiltered conversations that can only happen between seasoned practitioners, Kelli and Nolan dive deep into the kind of leadership-level strategy that often happens behind closed doors. Check out the first episode with the architect of Netflix's culture deck, Patty McCord. https://link.chtbl.com/hrheretics

PODCAST RECOMMENDATION: The AI Breakdown: https://pod.link/1680633614 As anyone in AI knows, the pace of new releases is relentless. The AI Breakdown is a daily podcast (10-20 min long) that helps ensure we don't miss anything important by curating news and analysis.

TIMESTAMPS:
(00:00) Preview
(04:27) Andrew's story of creating the world's largest chip and Cerebras
(07:19) What is a chip?
(08:14) The diversity of chips and what they can accomplish
(09:47) What is it like to design a 2.5 trillion transistor chip?
(12:41) The founding story of Cerebras and building the team
(14:20) Sponsor: Omneky
(23:00) What was the hardest part about building the company?
(26:11) What happens after designing the chip's blueprint?
(27:29) The tradeoffs needed in chipmaking
(34:08) The comparison between chips and neural networks
(38:31) The generalization vs. specialization of a chip
(40:11) Sparse compute vs. dense compute
(43:55) Ghost in the machine
(46:54) Supply chain challenges of the Cerebras chip
(54:59) The future for chips
(58:19) Building chip clusters
(58:57) The Cerebras business model
(01:00:41) Building a chip cluster vs. using a Cerebras chip
(01:02:57) Giant chips on the edge
(01:05:32) What is the edge?
(01:08:04) Andrew's favorite AI products
(01:10:08) Would Andrew get a Neuralink implant?
(01:14:16) Consciousness and chips
(01:17:50) AI hopes and fears

TWITTER:
@CogRev_Podcast
@andrewdfeldman (Andrew)
@labenz (Nathan)
@eriktorenberg (Erik)

Thank you Omneky for sponsoring The Cognitive Revolution. Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with a click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off.

Music Credit: MusicLM

More show notes and reading material released in our Substack: https://cognitiverevolution.substack.com
What does it take to solve an intimidating problem that many feel is unsolvable? Andrew Feldman, Co-founder and CEO of Cerebras Systems, can tell you, because he and his team engineered an unprecedented technological breakthrough. They set out to build a new class of computer system to accelerate artificial intelligence work. In the end, they built the fastest AI accelerator, based on the largest processor in the industry. Tune in to hear his story, his thoughts on building and selling companies, and his career advice for aspiring founders. Enjoy this episode.

Main Takeaways:
- Pioneering Solutions to Big Problems: Andrew explains his love of tackling big problems where there "isn't a safety net" and his love of "fearless engineering." He shares his experience searching for a solution that many thought couldn't be found.
- Artificial Intelligence: Andrew discusses his thoughts on AI, how it could be used for extraordinary good, and how it will permeate every facet of our lives moving forward.
- Building and Selling Companies: Andrew discusses his experience building, leading, and selling companies, diving into his decision-making process for when to sell.
- Advice for Aspiring Entrepreneurs: Andrew provides advice for aspiring entrepreneurs and also dives into the mistakes that he has made along the way. He stresses the need to focus on the customer, to build trust within your team, and to build the right professional network.

Key Quotes:
"In my experience, these aren't sitting on a park bench with an idea arriving like a child from Zeus's head fully formed, right? That's not the way they come. You articulate a problem… What choices are available to you to solve this? You have to decide in your career what sort of problems you're going to attack. And you have to decide if you're more afraid of failing in pursuit of a really interesting big problem or succeeding at a mediocre problem."
"I think like every technology, [AI] has the opportunity for tremendous good and tremendous evil, both. I'd say the same for nuclear power. I'd say the same for, you know, any number of monstrous technologies. It is that their very power can be used for good or for bad. And I think that the technologies in AI can be used for evil and the exact same technology can be used for such good, it's extraordinary. And so the challenge is on us to manage it."

"I think it's a tremendous mistake to build a company to sell it. I think it's a tremendous mistake to have a religious view that you have to go public. You are using other people's money in what we do. You are building a company in partnership with people who are lending you part of their career, and you're the steward of that."

"I think one of the things young people should think about is they see resumes and they see LinkedIn links and it's success, lots of bullets, another success, lots of bullets. I think you can just ignore all that because nobody puts up their failures. You know, bad idea, six months wasted on a bad idea, millions of dollars destroyed because of arrogance, right? Nobody puts that on their LinkedIn. And so you get this, sort of like the Instagram version of a career: perfect angles, perfect lighting, filters done properly. But that's not really the way careers went."
Have suggestions for future podcast guests (or other feedback)? Let us know here!

In episode 42 of The Gradient Podcast, Daniel Bashir speaks to Andrew Feldman. Andrew is the co-founder and CEO of Cerebras Systems, an AI accelerator company that has built the largest processor in the industry. Before Cerebras, Andrew co-founded and served as CEO of SeaMicro, which was acquired by AMD in 2012. He has also served in executive positions at Force10 Networks and RiverStone Networks.

Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter

Outline:
(00:00) Intro
(02:05) Andrew's trajectory, from business school to Cerebras
(10:00) The large model problem and Cerebras' approach
(19:50) Cerebras's GPT-J announcement
(22:20) Andrew explains weight streaming to Daniel
(32:30) Andrew's thoughts on the MLPerf benchmark
(38:20) The venture landscape for AI accelerator companies
(42:50) The hardware lottery, hardware support for sparsity
(45:40) The CHIPS Act, NVIDIA China ban and the accelerator industry
(48:00) Politics and chips, US and China
(52:20) Andrew's perspective on tackling difficult problems
(56:42) Outro

Links:
Cerebras' Homepage
GPT-J Announcement
TotalEnergies
GlaxoSmithKline (GSK)

Sources mentioned:
"Political Chips" by Ben Thompson (because Daniel's a fanboy)
Daniel's conversation with Sara Hooker
The Hardware Lottery

Get full access to The Gradient at thegradientpub.substack.com/subscribe
Jean-luc Chatelain, Applied Intelligence CTO, talks with Andrew Feldman, Founder and CEO of Cerebras Systems, about transformers, massive models, and the future of AI. They discuss how we are moving from a world with a large number of models to fewer, more powerful models known as transformers. Hear what they think this means for the future of AI.
Top threats of 2022, Corel acquires Awingu, Cerebras Systems on AI compute in the cloud, and more.

Cloud Security Alliance's top threats of 2022
Microsoft 365 function leaves SharePoint, OneDrive files open to ransomware attacks
Cisco Live announcement about AppDynamics
Ransomware gang creates a site for employees to search for their stolen data
Corel acquires Awingu
Cerebras Systems Founder and CEO Andrew Feldman on high-performance AI compute in the cloud

Hosts: Louis Maresca, Brian Chee, and Curt Franklin
Guest: Andrew Feldman

Download or subscribe to this show at https://twit.tv/shows/this-week-in-enterprise-tech. Get episodes ad-free with Club TWiT at https://twit.tv/clubtwit

Sponsors: CDW.com/IntelClient nureva.com linode.com/twiet
Vitaliy Chiley is a Machine Learning Research Engineer at the next-generation computing hardware company Cerebras Systems. We spoke about how DL workloads, including sparse workloads, can run faster on Cerebras hardware.

[00:00:00] Housekeeping
[00:01:08] Preamble
[00:01:50] Vitaliy Chiley introduction
[00:03:11] Cerebras architecture
[00:08:12] Memory management and FLOP utilisation
[00:18:01] Centralised vs decentralised compute architecture
[00:21:12] Sparsity
[00:23:47] Does sparse NN imply heterogeneous compute?
[00:29:21] Cost of distributed memory stores?
[00:31:01] Activation vs weight sparsity
[00:37:52] What constitutes a dead weight to be pruned?
[00:39:02] Is it still a saving if we have to choose between weight and activation sparsity?
[00:41:02] Cerebras is a cool place to work
[00:44:05] What is sparsity? Why do we need to start dense?
[00:46:36] Evolutionary algorithms on Cerebras?
[00:47:57] How can we start sparse? Google RIGL
[00:51:44] Inductive priors, why do we need them if we can start sparse?
[00:56:02] Why anthropomorphise inductive priors?
[01:02:13] Could Cerebras run a cyclic computational graph?
[01:03:16] Are NNs locality sensitive hashing tables?

References:
Rigging the Lottery: Making All Tickets Winners [RIGL] https://arxiv.org/pdf/1911.11134.pdf
[D] DanNet, the CUDA CNN of Dan Ciresan in Jurgen Schmidhuber's team, won 4 image recognition challenges prior to AlexNet https://www.reddit.com/r/MachineLearning/comments/dwnuwh/d_dannet_the_cuda_cnn_of_dan_ciresan_in_jurgen/
A Spline Theory of Deep Learning [Balestriero] https://proceedings.mlr.press/v80/balestriero18b.html
EPISODE NOTES
WiDS Executive Director Margot Gerritsen welcomes her new co-host, Cindy Orozco, in a wide-ranging conversation about their career paths and valuable learnings along the way. Cindy is thrilled to be joining as podcast co-host and believes that showcasing women at all stages of their careers shows that we "share the same fears or experiences every day. It's just that some of us have been on the path a little bit longer than others." Cindy is an applied mathematician currently working as a machine learning solutions engineer at Cerebras Systems. Originally from Colombia, she loved applied math, did a master's in civil engineering and mathematics at King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, and earned a PhD in Computational and Mathematical Engineering from ICME at Stanford. She met Margot at Stanford and has been contributing to WiDS for many years at conferences, workshops, and datathons.

After answering some questions about herself, Cindy stepped right into her co-host role to interview Margot. A native of the Netherlands, Margot said her career path was similar to Cindy's: she started in math, got excited about applied math, and decided to study fluid mechanics. After getting her PhD at Stanford, she became a professor at the University of Auckland in New Zealand and then returned to Stanford, where she has been a professor for 20 years. During this time, she has been an accomplished researcher, professor, mentor, and leader in the School of Earth, Energy & Environmental Sciences, the Institute for Computational & Mathematical Engineering (ICME), and Women in Data Science (WiDS).

When asked how she managed to juggle all of these things, Margot said she learned not to worry about making mistakes or striving for perfection, saying "80% is perfect," and adding, "I always felt I can't have it all. So you make choices, and there's always something that's got to give." Cindy agreed that the busier she is, the better she manages her time, and that when you have many balls in the air, what you learn in one area can often help you solve problems in another.

In discussing imposter syndrome, Margot said she had often felt like an imposter, and soon discovered this was a common feeling among students and faculty at Stanford. And it's even stronger when you stand out, like a woman in STEM; it puts an extra burden on you to succeed and set the example for those who come after you. The pace of research in AI and deep learning contributes to feeling like an imposter. People publish very quickly, and it's hard to tell what is really good, solid research and what is just an idea. It gives people the sense that they're not on top. They forget that the purpose of school is creating a lifelong interest in learning. "There's a lot of failure on the way to success. My favorite definition of an expert is somebody who's made every possible mistake."

RELATED LINKS
Connect with Cindy Orozco on LinkedIn
Find out more about Cerebras Systems
Connect with Margot Gerritsen on Twitter (@margootjeg) and LinkedIn
Find out more about Margot on her Stanford Profile
Innovators don't see limitations – they see challenges. And that's exactly what happened when Andrew Feldman and his team at Cerebras Systems were told it was impossible to build a computer chip that could deliver the same performance as hundreds of graphics processing units. They tackled that challenge head-on and created the CS-2, the fastest AI computer in existence. This mega-sized chip is being used to tackle the world's most pressing problems. None of this would have been possible without a bit of audacity and what Andrew calls "fearless engineering". Dive into the thought process of a status-quo challenger on this episode of IT Visionaries.

Tune in to learn:
What is Cerebras doing that others can't? (0:23)
What is the difference between a regular computer and an "AI" computer? (2:34)
How does a bigger chip make a difference? (6:55)
How do our work habits affect tech? (8:48)
What is the mindset of someone who tackles unsolvable problems? (13:34)
How is AI shifting the way we approach healthcare? (14:30)
How is creating new hardware similar to raising a baby? (21:05)
Why does a faster computer change everything? (23:21)
Will Cerebras ever try to go even faster? (27:43)

IT Visionaries is brought to you by Salesforce Platform. If you love the thought leadership on this podcast, Salesforce has even more meaty IT thoughts to chew on. Take your company to the next level with in-depth research and trends right in your inbox. Subscribe to a newsletter tailored to your role at Salesforce.com/newsletter.

Mission.org is a media studio producing content for world-class clients. Learn more at mission.org.
Michael chats with Andrew Feldman, co-founder and CEO of Cerebras Systems, about the role of AI in transforming health care. The team at Cerebras includes computer architects, system engineers, software engineers, and ML researchers who design and build systems to accelerate AI in multiple industries, including health care. Feldman is an entrepreneur dedicated to pushing boundaries in the compute space, and his experience in bringing innovative AI solutions to health care is reflected in his work with industry leaders such as GlaxoSmithKline, AstraZeneca, and Argonne National Laboratory. The role of AI in health care and life sciences is becoming more prominent than ever, he asserts, leading to efficiencies that just years ago would have been unheard of and thrusting health care into a new age of possibilities. This episode is sponsored by Cerebras Systems, www.cerebras.net.
Even though there are currently no truly self-driving cars on the market, autonomous vehicles are on the way to the future. https://www.cnet.com/roadshow/news/mercedes-benz-drive-pilot-hands-free-driving-traffic-jam-assist/ Cerebras Systems, the San Jose, California-based startup that makes computers for processing deep learning algorithms and other large-scale scientific computing tasks, announced Wednesday morning that it has sold its first "CS-2" computer to TotalEnergies, the 98-year-old Paris-based energy exploration and production company. https://www.zdnet.com/article/ai-computer-maker-cerebras-nabs-totalenergies-se-as-first-energy-sector-customer/ "Would you like to sign in with the palm of your hand?" https://www.nytimes.com/2022/02/28/technology/whole-foods-amazon-automation.html When assessing their current level of comfort, only 29% of respondents said they would be comfortable riding in their own fully autonomous vehicle in the future. https://www.theautochannel.com/news/2022/03/03/1113702-survey-says-consumers-arent-comfortable-with-fully-autonomous-vehicles.html In collaboration with AIM, Cedars-Sinai has developed several key programs in which artificial intelligence is increasingly used. https://hitconsultant.net/2022/03/01/cedars-sinai-establishes-artificial-intelligence-in-medicine-division/ Visit www.integratedaisolutions.com
The podcast is also available as a newsletter: https://ainewsletter.integratedaisolutions.com/ Even though there are currently no truly self-driving cars on the market, autonomous vehicles are on the way to the future. https://www.cnet.com/roadshow/news/mercedes-benz-drive-pilot-hands-free-driving-traffic-jam-assist/ Cerebras Systems, the San Jose, California-based startup that makes computers for processing deep learning algorithms and other large-scale computing tasks, announced Wednesday morning that it has sold its first "CS-2" computer to TotalEnergies, the 98-year-old Paris-based energy exploration and production company. https://www.zdnet.com/article/ai-computer-maker-cerebras-nabs-totalenergies-se-as-first-energy-sector-customer/ "Would you like to sign in with the palm of your hand?" https://www.nytimes.com/2022/02/28/technology/whole-foods-amazon-automation.html When assessing their current level of comfort, only 29% of respondents said they would be comfortable riding in their own fully autonomous vehicle in the future. https://www.theautochannel.com/news/2022/03/03/1113702-survey-says-consumers-arent-comfortable-with-fully-autonomous-vehicles.html In collaboration with AIM, Cedars-Sinai has developed several key programs in which artificial intelligence is increasingly used. https://hitconsultant.net/2022/03/01/cedars-sinai-establishes-artificial-intelligence-in-medicine-division/ Visit www.integratedaisolutions.com
Even though there are currently no truly self-driving cars on the market, autonomous vehicles are on the way to the future. https://www.cnet.com/roadshow/news/mercedes-benz-drive-pilot-hands-free-driving-traffic-jam-assist/ Cerebras Systems, the San Jose, California-based startup that makes computers for processing deep learning algorithms and other large-scale scientific computing tasks, announced Wednesday morning that it has sold its first "CS-2" computer to TotalEnergies, the 98-year-old Paris-based energy exploration and production company. https://www.zdnet.com/article/ai-computer-maker-cerebras-nabs-totalenergies-se-as-first-energy-sector-customer/ "Would you like to sign in with the palm of your hand?" https://www.nytimes.com/2022/02/28/technology/whole-foods-amazon-automation.html When assessing their current level of comfort, only 29% of respondents said they would be comfortable riding in their own fully autonomous vehicle in the future. https://www.theautochannel.com/news/2022/03/03/1113702-survey-says-consumers-arent-comfortable-with-fully-autonomous-vehicles.html In collaboration with AIM, Cedars-Sinai has developed several key programs in which artificial intelligence is increasingly used. https://hitconsultant.net/2022/03/01/cedars-sinai-establishes-artificial-intelligence-in-medicine-division/ Visit www.integratedaisolutions.com
It's a bird, it's a plane, it's the largest AI processor ever made! In this week's Fish Fry podcast, Andy Hock (Cerebras Systems) joins me to chat about the largest AI processor ever made - the 7 nm Wafer Scale Engine 2 - the details of their brain-scale AI training, and how Cerebras Systems is democratizing access to high-performance AI computation. Also this week, I check out a new Kickstarter campaign called the SPORTSMATE 5: the world's first and lightest portable wearable robotic exoskeleton, which aims to alter the way we interact with the world by applying exoskeletons to daily life.
Demand for AI compute is growing faster than conventional systems architecture can match, so companies like Cerebras Systems are building massive special-purpose processing units. In this episode, Andy Hock, VP of Product for Cerebras Systems, joins Frederic Van Haren and Stephen Foskett to discuss this new class of hardware. The Cerebras Wafer-Scale Engine (WSE-2) has 850,000 processors on a single chip the size of a dinner plate, along with 40 GB of SRAM and supporting interconnects. But Cerebras also has a software stack that integrates with standard ML frameworks like PyTorch and TensorFlow. Although the trillion-parameter model is a real need for certain applications, platforms need to be flexible enough to support both massive-scale and more mainstream workloads, and this is a focus for Cerebras as well.

Three Questions:
Frederic's Question: How small can ML get? Will we have ML-powered household appliances? Toys? Disposable devices?
Stephen's Question: Will we ever see a Hollywood-style "artificial mind" like Mr. Data or other characters?
Leon Adato, host of the Technically Religious Podcast: I'm curious, what responsibility do you think IT folks have to ensure the things that we build are ethical?

Guests and Hosts:
Andy Hock, VP of Product at Cerebras Systems. Connect with Andy on LinkedIn. Follow Cerebras Systems on Twitter at @CerebrasSystems.
Frederic Van Haren, Founder at HighFens Inc., Consultancy & Services. Connect with Frederic on Highfens.com or on Twitter at @FredericVHaren.
Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen's writing at GestaltIT.com and on Twitter at @SFoskett.

Date: 10/19/2021
Tags: @CerebrasSystems, @SFoskett, @FredericVHaren
Andrew Feldman, one of the founders and CEO of Cerebras Systems, talks about the company's wafer-scale computer chip optimized for machine learning, and about the network of chips the company has built that has as much computing power as a human brain.
Andrew Feldman, whose company Cerebras Systems makes the world's largest computer chip to accelerate grand artificial-intelligence applications ranging from drug discovery to new-materials research, discusses what it takes to launch a generational company, and where the biggest entrepreneurial opportunities lie as the AI revolution unfolds across society. In conversation with Eclipse Ventures' Kushagra Vaid, Andrew shares his experience with taking on mission-impossible challenges and explains that deep-tech builders would do better to fail in the pursuit of doing something extraordinary rather than thinking incrementally.

More info:
Cerebras Systems
IEEE Spectrum – Cerebras' New Monster AI Chip Adds 1.4 Trillion Transistors
Eclipse Ventures
In this episode, Marcus Edwardes speaks to Tom Case. Tom is the Head of Talent at Cerebras Systems, a computer systems company building a new type of computer optimized for AI work, which contains the world's largest computer chip. Tom has an executive search background that began in an agency; he then went to Facebook and worked at Intel Capital as a Talent Partner before joining Cerebras.

Tune in to listen to Tom sharing his passion for start-up recruiting and the nuances that make it such a giant leap from his Talent Partner role, where he was serving start-ups on a much more ad-hoc basis. Tom shares experiences working in different roles and how to establish a working recruitment team that benefits both the company and the employees. He also gives tips on building a talent strategy, how to bring in a new team, why diversity and inclusion are important for startups, and other informative topics.

[02:01] What Tom is most passionate about as a recruiter
[04:54] Working as a Talent Partner
[05:55] Should companies pay more attention to a working talent strategy?
[06:57] Perception of talent strategy as a roadmap
[11:25] Establishing a recruiting team
[16:30] Recruiting process and organization's culture
[21:18] Appreciating employees
[23:25] Flat structure vs. management structure
[24:26] How management helps people improve
[25:21] Diversity and inclusion for startups
[29:30] Key challenges when raising the number of employees
[32:55] Bringing in new people into the organization
[38:08] Figuring out what to take to the market as recruiters

Notable Quotes:
"Startups don't have many resources. They need to double down on the single one or two things that move the company forward."
"If you build a recruiting team, you have to get the right people to do that."
"If you are bringing in managers, you gotta make sure you're bringing in the right ones. And you gotta make sure that people see the value in going from a flat to a management structure."
"The more diversity you have, the more diversity you recruit."
"Sometimes it's like we only hire the most exceptional people. But it's not true. There are not many exceptional people, and when you get one, they move your company 10x more than anybody else."

Connect with Tom:
LinkedIn: https://www.linkedin.com/in/tom-case-81b8504/
Cerebras Systems Website: https://cerebras.net/
Understanding intent represents a major milestone in the world of Artificial Intelligence. The practice of ascertaining a person's intent (or that of a machine, frankly) is similar to the long-standing discipline of predictive analytics, but it's arguably a deeper view into the psyche (or inner workings). This is just another way in which AI is transforming business. Check out this episode of DM Radio to hear host @eric_kavanagh interview several experts, including Andrew Feldman of Cerebras Systems, Chris Nicholson of Pathmind, and a special guest!
Andrew Feldman is the co-founder and CEO of Cerebras Systems, a computer systems company dedicated to accelerating deep learning. The company has raised $200 million so far from top-tier investors like Benchmark, Foundation Capital, and Altimeter Capital. Prior to this, Andrew Feldman co-founded SeaMicro (acquired by AMD for $355M) and Riverstone Networks (acquired by YAGO for $280M).