Rob Zuber sits down with Tara Hernandez, VP of Developer Productivity at MongoDB and former Netscape engineer who helped create early continuous integration systems, to explore strategic frameworks for build vs. buy decisions in modern software delivery. Hernandez shares insights from scaling MongoDB's proprietary CI system—processing 10 engineer-years of compute daily—and reveals how organizations can evaluate when custom infrastructure drives competitive advantage versus when strategic partnerships accelerate growth. Her perspective on navigating the evolving landscape of CI/CD tooling offers actionable guidance for engineering leaders balancing innovation with operational efficiency. Have someone in mind you'd like to hear on the show? Reach out to us on X at @CircleCI!
In this episode of the Engineering Enablement podcast, host Abi Noda is joined by Quentin Anthony, Head of Model Training at Zyphra and a contributor at EleutherAI. Quentin participated in METR's recent study on AI coding tools, which revealed that developers often slowed down when using AI—despite feeling more productive. He and Abi unpack the unexpected results of the study, which tasks AI tools actually help with, and how engineering teams can adopt them more effectively by focusing on task-level fit and developing better digital hygiene.

Where to find Quentin Anthony:
• LinkedIn: https://www.linkedin.com/in/quentin-anthony/
• X: https://x.com/QuentinAnthon15

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro
(01:32) A brief overview of Quentin's background and current work
(02:05) An explanation of METR and the study Quentin participated in
(11:02) Surprising results of the METR study
(12:47) Quentin's takeaways from the study's results
(16:30) How developers can avoid bloated code bases through self-reflection
(19:31) Signs that you're not making progress with a model
(21:25) What is “context rot”?
(23:04) Advice for combating context rot
(25:34) How to make the most of your idle time as a developer
(28:13) Developer hygiene: the case for selectively using AI tools
(33:28) How to interact effectively with new models
(35:28) Why organizations should focus on tasks that AI handles well
(38:01) Where AI fits in the software development lifecycle
(39:40) How to approach testing with models
(40:31) What makes models different
(42:05) Quentin's thoughts on agents

Referenced:
DX Core 4 Productivity Framework
Zyphra
EleutherAI
METR
Cursor
Claude
LibreChat
Google Gemini
Introducing OpenAI o3 and o4-mini
METR's study on how AI affects developer productivity
Quentin Anthony on X: "I was one of the 16 devs in this study."
Context rot from Hacker News
Tracing the thoughts of a large language model
Kimi
Grok 4 | xAI
What happens after AI helps you write code faster? You create a bottleneck in testing, security, and operations. In part two of their conversation, SADA's Simon Margolis and Google Cloud's Ameer Abbas tackle this exact problem. They explore how Google's AI strategy extends beyond the developer's keyboard with Gemini Code Assist and Cloud Assist, creating a balanced and efficient software lifecycle from start to finish. We address the burning questions about AI's impact on the software development ecosystem: Is AI replacing developers? What does the future hold for aspiring software engineers? Gain insights on embracing AI as an augmentation tool, the concept of "intentional prompting" versus "vibe coding," and why skilled professionals are more crucial than ever in the enterprise. This episode offers practical advice for enterprises on adopting AI tools, measuring success through quantitative and qualitative metrics, and finding internal champions to drive adoption. We also peek into the near future, discussing the evolution towards AI agents capable of multi-step inferencing and full automation for specific use cases.

Key Takeaways:
• Gemini Code Assist: AI for developer inner-loop productivity, supporting various IDEs and SCMs.
• Gemini Cloud Assist: AI for cloud operations, cost optimization, and incident resolution within GCP.
• AI's Role in Development: Augmentation, not replacement; the importance of human agency and prompting skills.
• Enterprise Adoption: Strategies for integrating AI tools, measuring ROI, and fostering a culture of innovation.
• The Future: Agents with multi-step inferencing, automation for routine tasks, and background AI processes.

Relevant Links:
• Blog: A framework for adopting Gemini Code Assist and measuring its impact
• Gemini Code Assist product page
• Gemini Cloud Assist product page

Listen now to understand how AI is shaping the future of software delivery! Join us for more content by liking, sharing, and subscribing!
HTML All The Things - Web Development, Web Design, Small Business
In this episode, Mike explores his growing dependence on AI tools like Cursor and ChatGPT to handle everyday coding tasks. From code generation and DevOps to security reviews and task planning, AI is integrated into nearly every part of his workflow. But as these tools take over more responsibilities, Mike asks the hard questions: Am I losing my edge? Should I still code manually even if AI can do it faster—or better? He shares how he uses AI day-to-day, when he steps in to take control, and whether it's time to focus on solving tougher problems that AI can't yet tackle. Show Notes: https://www.htmlallthethings.com/podcasts/boosting-developer-productivity-with-ai-without-losing-your-edge Powered by CodeRabbit - AI Code Reviews: https://coderabbit.link/htmlallthethings Use our Scrimba affiliate link (https://scrimba.com/?via=htmlallthethings) for a 20% discount!! Full details in show notes.
Guest: Adir Ben-Yehuda, CEO at Autonomy AI

The go-to-market playbook for AI SaaS is being rewritten in real time, and those who cling to old models risk being left behind. In this episode, Adir Ben-Yehuda, founder and CEO of Autonomy AI, joins host Ken Lempit to share how his team went from zero to 70 customers in months by ditching outdated frameworks and building a brand buyers can trust.

We unpack:
✅ Why "Crossing the Chasm" no longer applies in the age of ChatGPT
✅ How brand marketing now beats lead gen in AI go-to-market
✅ What it really takes to convert skeptical enterprise buyers
✅ The shift from SEO to "share of response" in AI search platforms
✅ Why every SaaS GTM leader must become an orchestrator—not just a doer

Adir also breaks down the sales motion that helped Autonomy scale so quickly—and shares how his team leverages live demos and social proof to close deals in a single call. If you're a B2B SaaS CMO or CRO navigating AI adoption, rethinking pipeline strategy, or looking for a more effective way to win technical buyers, this episode is your cheat code.
---
Not Getting Enough Demos? Your messaging could be turning buyers away before you even get a chance to pitch.
How do you drive meaningful AI transformation across 150 software engineers without mandates or force? Peter Gostev, Head of AI at Moonpig, reveals the technical strategies and organizational approaches behind scaling AI adoption from 130 to 400+ users while navigating the gap between industry hype and implementation reality. From managing complex integration challenges, where 80% of AI projects involve traditional software engineering, to implementing three-pillar strategies (tool adoption, automation workflows, experimental features), Peter shares hard-earned insights on building AI capabilities through process re-engineering rather than simple automation.
Is generative AI just another tool in the belt, or is it a fundamental transformation of the developer profession? We kick off a two-part special to get to the bottom of how AI is impacting the enterprise. SADA's Associate CTO of AI & ML, Simon Margolis, sits down with Ameer Abbas, Senior Product Manager at Google Cloud, for an insider's look at the future of software development. They cut through the noise to discuss how tools like Gemini Code Assist are moving beyond simple code completion to augment the entire software delivery lifecycle, solving real-world challenges and changing the way we think about productivity, quality, and automation.

In this episode, you'll learn:
• What Gemini Code Assist is and the broad range of developer personas it serves.
• The critical debate: Is AI augmenting developer skills or automating their jobs?
• How to leverage AI for practical enterprise challenges like application modernization, improving test coverage, and tackling technical debt.
• Why the focus is shifting from developer productivity to overall software delivery performance.
• Ameer's perspective on the future of development careers and why students should lean into AI, not fear it.
• The limitations of "vibe coding" and the need for intentional, high-quality AI prompting in a corporate environment.

Join us for more content by liking, sharing, and subscribing!
In this episode, Abi Noda talks with Frank Fodera, Director of Engineering for Developer Experience at CarGurus. Frank shares the story behind CarGurus' transition from a monolithic architecture to microservices, and how that journey led to the creation of their internal developer portal, Showroom. He outlines the five pillars of the IDP, how it integrates with infrastructure, and why they chose to build rather than buy. The conversation also explores how CarGurus is approaching AI tool adoption across the engineering team, from experiments and metrics to culture change and leadership buy-in.

Where to find Frank Fodera:
• LinkedIn: https://www.linkedin.com/in/frankfodera/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: IDPs (Internal Developer Portals) and AI
(02:07) The IDP journey at CarGurus
(05:53) A breakdown of the people responsible for building the IDP
(07:05) The five pillars of the Showroom IDP
(09:12) How DevX worked with infrastructure
(11:13) The business impact of Showroom
(13:57) The transition from monolith to microservices and struggles along the way
(15:54) The benefits of building a custom IDP
(19:10) How CarGurus drives AI coding tool adoption
(28:48) Getting started with an AI initiative
(31:50) Metrics to track
(34:06) Tips for driving AI adoption

Referenced:
DX Core 4 Productivity Framework
Internal Developer Portals: Use Cases and Key Components
Strangler Fig Pattern - Azure Architecture Center | Microsoft Learn
Spotify for Backstage
The AI adoption playbook: Lessons from Microsoft's internal strategy
"There's a lot of hype with the AI agents and their productivity and potential outcomes. AI agents are quite amazing," says Eric Paulsen, EMEA Field CTO at Coder.

In this episode of the Tech Transformed podcast, Shubhangi Dua, Podcast Host and Producer at EM360Tech, talks to Paulsen about the constantly advancing role of AI agents in development environments. Paulsen explains how AI agents can help developers by handling simpler tasks, almost like having assistants or junior developers to assist them. Not only would this boost productivity and time efficiency, but the technology would also preserve human oversight. The conversation further explores how AI fits into cloud development environments, especially in regulated areas like finance, where security and scalability matter most. Paulsen stresses the value of internal AI models and points out Coder's unique role in offering infrastructure-neutral solutions that meet various enterprise needs.

AI Agents Are More Than Just Code Writers

When people hear "agentic AI" or "coding agents," there's often a misconception about fully autonomous coders. However, Paulsen clarifies, "That's a far stretch from where we currently have been, which is with just AI-assisted IDE extensions such as GitHub Copilot, Amazon Q Developer and systems of that nature." Coder focuses on agentic solutions that keep a human developer in the loop, Paulsen emphasises. "Think of an AI agent as a junior engineer working alongside you," Paulsen explains. "If anything, it's improving the output of the human engineer by having an autonomous or artificial or AI process in the same development environment, working on other tasks that might not necessarily be as complex," he adds.
This means developers can offload simple tasks like bug fixes or dependency updates, freeing them to focus on more complex features.

How to Scale AI Agents Securely in Enterprises?

For large financial institutions that have hundreds and even thousands of software engineers, deploying AI agents at scale requires a consistent and secure approach. Cloud development environments provide the best way to deliver and package these agents for developers. The main concern for enterprises is ensuring data security in addition to stopping AI agents from "running wild on a laptop." Paulsen stresses the need for agents to work within an "isolated compute," with "boundaries around those agents inside of that isolated compute." Such a secure environment provides guardrails to synchronise and boost productivity between humans and AI while preventing sensitive data breaches or "hallucinations" from the AI.

Additionally, financial institutions are now increasingly developing their own internal AI models. Paulsen mentions, "What these institutions need is an AI agent that is trained on the internal dataset and internal LLM that is built within the firm so that it can make those decisions and return the relevant output to the data scientist or software engineer." This move towards self-hosted LLMs and internal AI infrastructure is essential for adopting enterprise-grade AI.

The ultimate message is that cloud development environments should provide the framework where AI agents run inside an enterprise's infrastructure. "AI agents have access to the data, and they're observed and governed by a set of security standards that you have internally," says the EMEA Field CTO at Coder.

Takeaways
AI agents can assist developers by handling simpler...
Rachel Owen dives into the complexities of developer productivity with expert insights from Ohto Rainio (Team Tech Lead, Fortum), Tommi Kotikangas (CTO, Vihreä Älyenergia), Antti Jaakkonen (Senior Director Product Engineering, ICEYE), and Lukasz Walach (Software Team Lead, OptoFidelity). The conversation explores the practical and philosophical challenges of measuring output, balancing performance with developer experience, and aligning productivity metrics with business value. Ideal for tech leaders, software managers, and engineering teams striving for better performance outcomes in modern development environments.
"How do you measure the impact you have with your platform engineering initiative?" is a question you should be able to answer. To show improvement, you first need to know what the status quo is. This is where frameworks such as DX Core 4 come in. Never heard of it? Then tune into this episode, where we have Dušan Katona, Sr Director of Platform Engineering at Ataccama, who is a big fan of the DX Core 4 metrics and has just applied them in his current role to optimize developer experience.

Dušan explains the details behind those four core metrics: Speed, Effectiveness, Quality and Impact. He also shares how improving those metrics by a single point results in the equivalent of 10 hours saved per developer per year.

And here are the relevant links we discussed today:
Dušan's LinkedIn Profile: https://www.linkedin.com/in/dusankatona/
DX Core 4 Blog: https://getdx.com/research/measuring-developer-productivity-with-the-dx-core-4/
Marian's JIRA Analytics Open Source Project: https://github.com/marian-kamenistak/jira-lead-cycle-time-duration-extractor
In this episode, Abi Noda speaks with Gilad Turbahn, Head of Developer Productivity, and Amy Yuan, Director of Engineering at Snowflake, about how their team builds and sustains operational excellence. They break down the practices and principles that guide their work—from creating two-way communication channels to treating engineers as customers. The conversation explores how Snowflake fosters trust, uses feedback loops to shape priorities, and maintains alignment through thoughtful planning. You'll also hear how they engage with teams across the org, convert detractors, and use Customer Advisory Boards to bring voices from across the company into the decision-making process.

Where to find Amy Yuan:
• LinkedIn: https://www.linkedin.com/in/amy-yuan-a8ba783/

Where to find Gilad Turbahn:
• LinkedIn: https://www.linkedin.com/in/giladturbahn/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: an overview of operational excellence
(04:13) Obstacles to executing with operational excellence
(05:51) An overview of the Snowflake playbook for operational excellence
(08:25) Who does the work of reaching out to customers
(09:06) The importance of customer engagement
(10:19) How Snowflake does customer engagement
(14:13) The types of feedback received and the two camps (supporters and detractors)
(16:55) How to influence detractors and how detractors actually help
(18:27) Using insiders as messengers
(22:48) An overview of Snowflake's customer advisory board
(26:10) The importance of meeting in person (learnings from Warsaw and Berlin office visits)
(28:08) Managing up
(30:07) How planning is done at Snowflake
(36:25) Setting targets for OKRs, and Snowflake's philosophy on metrics
(39:22) The annual plan and how it's shared

Referenced:
CTO buy-in, measuring sentiment, and customer focus
Snowflake
Benoit Dageville - Snowflake Computing | LinkedIn
Thierry Cruanes - Snowflake Computing | LinkedIn
In this episode, Abi Noda speaks with DX CTO Laura Tacho about the real obstacles holding back AI adoption in engineering teams. They discuss why technical challenges are rarely the blocker, and how fear, unclear expectations, and inflated hype can stall progress. Laura shares practical strategies for driving adoption, including how to model usage from the top down, build momentum through champions and training programs, and measure impact effectively—starting with establishing a baseline before introducing AI tools.

Where to find Laura Tacho:
• LinkedIn: https://www.linkedin.com/in/lauratacho/
• Website: https://lauratacho.com/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: The full spectrum of AI adoption
(03:02) The hype of AI
(04:46) Some statistics around the current state of AI coding tool adoption
(07:27) The real barriers to AI adoption
(09:31) How to drive AI adoption
(15:47) Measuring AI's impact
(19:49) More strategies for driving AI adoption
(23:54) The methods companies are actually using to drive impact
(29:15) Questions from the chat
(39:48) Wrapping up

Referenced:
DX Core 4 Productivity Framework
The AI adoption playbook: Lessons from Microsoft's internal strategy
Microsoft CEO says up to 30% of the company's code was written by AI | TechCrunch
Viral Shopify CEO Manifesto Says AI Now Mandatory For All Employees
DORA | Impact of Generative AI in Software Development
Guide to AI assisted engineering
Justin Reock - DX | LinkedIn
Jack Herrington, podcaster, software engineer, writer and YouTuber, joins the pod to uncover the truth behind server functions and why they don't actually exist in the web platform. We dive into the magic behind frameworks like Next.js, TanStack Start, and Remix, breaking down how server functions work, what they simplify, what they hide, and what developers need to know to build smarter, faster, and more secure web apps. Links YouTube: https://www.youtube.com/@jherr Twitter: https://x.com/jherr Github: https://github.com/jherr ProNextJS: https://www.pronextjs.dev Discord: https://discord.com/invite/KRVwpJUG6p LinkedIn: https://www.linkedin.com/in/jherr Website: https://jackherrington.com Resources Server Functions Don't Exist (It Matters) (https://www.youtube.com/watch?v=FPJvlhee04E) We want to hear from you! How did you find us? Did you see us on Twitter? In a newsletter? Or maybe we were recommended by a friend? Let us know by sending an email to our producer, Em, at emily.kochanek@logrocket.com (mailto:emily.kochanek@logrocket.com), or tweet at us at PodRocketPod (https://twitter.com/PodRocketpod). Follow us. Get free stickers. Follow us on Apple Podcasts, fill out this form (https://podrocket.logrocket.com/get-podrocket-stickers), and we'll send you free PodRocket stickers! What does LogRocket do? LogRocket provides AI-first session replay and analytics that surfaces the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at LogRocket.com. Try LogRocket for free today. (https://logrocket.com/signup/?pdr) Special Guest: Jack Herrington.
Thomas Dohmke, CEO of GitHub, joins Azeem to explore how AI is fundamentally transforming software development. In this episode you'll hear:
(01:50) What's left for developers in the age of AI?
(04:54) How GitHub Copilot unlocks flow state
(07:09) Three big shifts in how engineers work today
(10:47) Is software development art or assembly line?
(15:26) Why developers are climbing the abstraction ladder
(19:35) Have we already lost control of the code?
(23:15) What it's actually like to work with AI coding agents
(39:35) Welcome to the age of ultra-personalized software
(45:37) Building the next-generation web

Thomas's links:
GitHub: https://github.com/
LinkedIn: https://www.linkedin.com/in/ashtom/
Twitter/X: https://x.com/ashtom

Azeem's links:
Substack: https://www.exponentialview.co/
Website: https://www.azeemazhar.com/
LinkedIn: https://www.linkedin.com/in/azhar
Twitter/X: https://x.com/azeem

Our new show: This was originally recorded for "Friday with Azeem Azhar", a new show that takes place every Friday at 9am PT and 12pm ET. You can tune in through Exponential View on Substack. Produced by supermix.io and EPIIPLUS1 Ltd
In this episode, Abi Noda speaks with Derek DeBellis, lead researcher at Google's DORA team, about their latest report on generative AI's impact on software productivity. They dive into how the survey was built, what it reveals about developer time and “flow,” and the surprising gap between individual and team outcomes. Derek also shares practical advice for leaders on measuring AI impact and aligning metrics with organizational goals.

Where to find Derek DeBellis:
• LinkedIn: https://www.linkedin.com/in/derekdebellis/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: DORA's new Impact of Gen AI report
(03:24) The methodology used to put together the surveys DORA used for the report
(06:44) An example of how a single word can throw off a question
(07:59) How DORA measures flow
(10:38) The two ways time was measured in the recent survey
(14:30) An overview of experiential surveying
(16:14) Why DORA asks about time
(19:50) Why Derek calls survey results ‘observational data'
(21:49) Interesting findings from the report
(24:17) DORA's definition of productivity
(26:22) Why a 2.1% increase in individual productivity is significant
(30:00) The report's findings on decreased team delivery throughput and stability
(32:40) Tips for measuring AI's impact on productivity
(38:20) Wrap up: understanding the data

Referenced:
DORA | Impact of Generative AI in Software Development
The science behind DORA
Yale Professor Divulges Strategies for a Happy Life
Incredible! Listening to ‘When I'm 64' makes you forget your age
Slow Productivity: The Lost Art of Accomplishment without Burnout
DORA, SPACE, and DevEx: Which framework should you use?
SPACE framework, PRs per engineer, AI research
In this episode, Abi Noda is joined by Laura Tacho, CTO at DX, engineering leadership coach, and creator of the Core 4 framework. They explore how engineering organizations can avoid common pitfalls when adopting metrics frameworks like SPACE, DORA, and Core 4. Laura shares a practical guide to getting started with Core 4—beginning with controllable input metrics that teams can actually influence. The conversation touches on Goodhart's Law, why focusing too much on output metrics can lead to data distortion, and how leaders can build a culture of continuous improvement rooted in meaningful measurement.

Where to find Laura Tacho:
• LinkedIn: https://www.linkedin.com/in/lauratacho/
• Website: https://lauratacho.com/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: Improving systems, not distorting data
(02:20) Goal setting with the new Core 4 framework
(08:01) A quick primer on Goodhart's law
(10:02) Input vs. output metrics—and why targeting outputs is problematic
(13:38) A health analogy demonstrating input vs. output
(17:03) A look at how the key input metrics in Core 4 drive output metrics
(24:08) How to counteract gamification
(28:24) How to get developer buy-in
(30:48) The number of metrics to focus on
(32:44) Helping leadership and teams connect the dots to how input goals drive output
(35:20) Demonstrating business impact
(38:10) Best practices for goal setting

Referenced:
DX Core 4 Productivity Framework
Engineering Enablement Podcast
DORA's software delivery metrics: the four keys
The SPACE of Developer Productivity: There's more to it than you think
DevEx: What Actually Drives Productivity
DORA, SPACE, and DevEx: Which framework should you use?
Goodhart's law
Nicole Forsgren - Microsoft | LinkedIn
Campbell's law
Introducing Core 4: The best way to measure and improve your product velocity
DX Core 4: Framework overview, key design principles, and practical applications
DX Core 4: 2024 benchmarks - by Abi Noda
This interview was recorded for GOTO Unscripted.
https://gotopia.tech
Read the full transcription of this interview here:
https://gotopia.tech/articles/347

Tudor Girba - Software Environmentalist and CEO of feenk
Julian Wood - Serverless Developer Advocate at AWS

RESOURCES
Tudor
https://bsky.app/profile/tudorgirba.com
https://github.com/girba
https://www.linkedin.com/in/girba
http://www.tudorgirba.com
https://medium.com/@girba
Julian
https://bsky.app/profile/julianwood.com
http://www.wooditwork.com
https://www.linkedin.com/in/julianrwood

DESCRIPTION
Get a sneak peek into the concept of moldable development as a transformative approach through a discussion between Tudor Girba and Julian Wood. By emphasizing the creation of tailored, flexible tools, Girba presents a way to reduce the friction of working with complex systems and legacy code. He explores how contextualized tools, such as those provided by the Glamorous Toolkit, allow developers to engage with software environments in a more intuitive and efficient manner.

The integration of generative AI is also examined, where Girba argues that AI's true potential lies not in providing definitive answers but in assisting developers by generating hypotheses and creating tools that support deeper engagement with code. Additionally, the concept of "habitability" is introduced, proposing that software systems, like physical spaces, should be navigable and comprehensible to developers, fostering a more productive and enjoyable experience.

This vision of a future where software systems are more accessible and adaptable reflects the growing need to rethink how we interact with code, empowering developers with the tools and perspectives necessary to navigate increasingly complex digital landscapes.

RECOMMENDED BOOKS
Richard P. Gabriel • Patterns of Software
Susanne Kaiser • Adaptive Systems With Domain-Driven Design, Wardley Mapping, and Team Topologies
Eric Evans • Domain-Driven Design
Matthew Skelton & Manuel Pais • Team Topologies
Heidi Helfand • Dynamic Reteaming
Vlad Khononov • Learning Domain-Driven Design
Erik Schön • The Art of Strategy
Thomas

Bluesky
Twitter
Instagram
LinkedIn
Facebook

CHANNEL MEMBERSHIP BONUS
Join this channel to get early access to videos & other perks:
https://www.youtube.com/channel/UCs_tLP3AiwYKwdUHpltJPuA/join
Looking for a unique learning experience?
Attend the next GOTO conference near you! Get your ticket: gotopia.tech
SUBSCRIBE TO OUR YOUTUBE CHANNEL - new videos posted daily!
This is the Engineering Culture Podcast, from the people behind InfoQ.com and the QCon conferences. In this podcast, Shane Hastie, Lead Editor for Culture & Methods, spoke with Trisha Gee about the challenges and importance of addressing flaky tests, their impact on developer productivity and morale, best practices for testing, and broader concepts of measuring and improving developer productivity.

Read a transcript of this interview: https://bit.ly/4iiUC6a

Subscribe to the Software Architects' Newsletter for your monthly guide to the essential news and experience from industry peers on emerging patterns and technologies: https://www.infoq.com/software-architects-newsletter

Upcoming Events:
InfoQ Dev Summit Boston (June 9-10, 2025) Actionable insights on today's critical dev priorities. devsummit.infoq.com/conference/boston2025
InfoQ Dev Summit Munich (October 15-16, 2025) Essential insights on critical software development priorities. https://devsummit.infoq.com/conference/munich2025
QCon San Francisco 2025 (November 17-21, 2025) Get practical inspiration and best practices on emerging software trends directly from senior software developers at early adopter companies. https://qconsf.com/
QCon AI NYC 2025 (December 16-17, 2025) https://ai.qconferences.com/

The InfoQ Podcasts: Weekly inspiration to drive innovation and build great teams from senior software leaders. Listen to all our podcasts and read interview transcripts:
- The InfoQ Podcast https://www.infoq.com/podcasts/
- Engineering Culture Podcast by InfoQ https://www.infoq.com/podcasts/#engineering_culture
- Generally AI: https://www.infoq.com/generally-ai-podcast/

Follow InfoQ:
- Mastodon: https://techhub.social/@infoq
- Twitter: twitter.com/InfoQ
- LinkedIn: www.linkedin.com/company/infoq
- Facebook: bit.ly/2jmlyG8
- Instagram: @infoqdotcom
- Youtube: www.youtube.com/infoq

Write for InfoQ: Learn and share the changes and innovations in professional software development.
- Join a community of experts.
- Increase your visibility.
- Grow your career.
https://www.infoq.com/write-for-infoq
Brian Houck from Microsoft returns to discuss effective strategies for driving AI adoption among software development teams. Brian shares his insights into why the immense hype around AI often serves as a barrier rather than a facilitator for adoption, citing skepticism and inflated expectations among developers. He highlights the most effective approaches, including leadership advocacy, structured training, and cultivating local champions within teams to demonstrate practical use cases. Brian emphasizes the importance of honest communication about AI's capabilities, avoiding over-promises, and ensuring that teams clearly understand what AI tools are best suited for. Additionally, he discusses common pitfalls, such as placing excessive pressure on individuals through leaderboards and unrealistic mandates, and stresses the importance of framing AI as an assistant rather than a replacement for developer skills. Finally, Brian explores the role of data and metrics in adoption efforts, offering practical advice on how to measure usage effectively and sustainably.

Where to find Brian Houck:
• LinkedIn: https://www.linkedin.com/in/brianhouck/
• Website: https://www.microsoft.com/en-us/research/people/bhouck/

Where to find Abi Noda:
• LinkedIn: https://www.linkedin.com/in/abinoda

In this episode, we cover:
(00:00) Intro: Why AI hype can hinder adoption among teams
(01:47) Key strategies companies use to successfully implement AI
(04:47) Understanding why adopting AI tools is uniquely challenging
(07:09) How clear and consistent leadership communication boosts AI adoption
(10:46) The value of team leaders ("local champions") demonstrating practical AI use
(14:26) Practical advice for identifying and empowering team champions
(16:31) Common mistakes companies make when encouraging AI adoption
(19:21) Simple technical reminders and nudges that encourage AI use
(20:24) Effective ways to track and measure AI usage through dashboards
(23:18) Working with team leaders and infrastructure teams to promote AI tools
(24:20) Understanding when to shift from adoption efforts to sustained use
(25:59) Insights into the real-world productivity impact of AI
(27:52) Discussing how AI affects long-term code maintenance
(29:02) Updates on ongoing research linking sleep quality to productivity

Referenced:
DX Core 4 Productivity Framework
Engineering Enablement Podcast
DORA Metrics
Dropbox Engineering Blog
Etsy Engineering Blog
Pfizer Digital Innovation
Brown Bag Sessions – A Guide
IDE Integration and AI Tools
Developer Productivity Dashboard Examples
Justin Reock has spent a lot of his career thinking about how to help developers be more productive. In this episode we learn about the methodologies that can help developers spend more time in "flow state" - happily coding the fun stuff. Further reading: Measuring developer productivity with the DX Core 4 Discuss this episode: discord.gg/XVKD2uPKyF
In this episode, we're joined by author and researcher Gene Kim for a wide-ranging conversation on the evolution of DevOps, developer experience, and the systems thinking behind organizational performance. Gene shares insights from his latest work on socio-technical systems, the role of developer platforms, and how AI is reshaping engineering teams. We also explore the coordination challenges facing modern organizations, the limits of tooling, and the deeper principles that unite DevOps, lean, and platform engineering.

Mentions and links:
Phoenix Project
Decoding the DNA of the Toyota Production System
Wiring the Winning Organization
ETLS Vegas
Find Gene on LinkedIn

Discussion points:
(0:00) Introduction
(2:12) The evolving landscape of developer experience
(10:34) Option Value theory, and how GenAI helps developers
(13:45) The aim of developer experience work
(19:59) The significance of layer three changes
(23:23) Framing developer experience
(32:12) GenAI's part in "the death of the stubborn developer"
(36:05) GenAI's implications for the workforce
(38:05) Where Gene's work is heading
In this episode, Airbnb Developer Productivity leader Anna Sulkina shares the story of how her team transformed itself and became more impactful within the organization. She starts by describing how the team previously operated: teams were delivering, but they needed more clarity and alignment across the organization. Then, the conversation digs into the key changes they made, including reorganizing the team, clarifying team roles, defining strategy, and improving their measurement systems.

Mentions and links:
Follow Anna on LinkedIn
For a deeper look into how our Engineers and Data Scientists build a world of belonging, check out The Airbnb Tech Blog

Discussion points:
(0:00) Intro
(1:40) Skills that make a great developer productivity leader
(4:36) Challenges in how the team operated previously
(10:49) Changing the platform org's focus and structure
(16:04) Clarifying roles for EMs, PMs, and tech leads
(20:22) How Airbnb defined its infrastructure org's strategy
(28:23) Improvements they've seen to developer experience satisfaction
(32:13) The evolution of Airbnb's developer experience survey
This interview was recorded for GOTO Unscripted.
https://gotopia.tech
Read the full transcription of this interview here

Dr. Gail Murphy - Vice-President Research & Innovation & Professor of Computer Science at The University of British Columbia
Charles Humble - Freelance Techie, Podcaster, Editor, Author & Consultant

RESOURCES
Gail
https://x.com/gail_murphy
https://social.sigsoft.org/@gail_murphy
https://www.linkedin.com/in/gailcmurphy
https://blogs.ubc.ca/gailcmurphy

Charles
https://bsky.app/profile/charleshumble.bsky.social
https://linkedin.com/in/charleshumble
https://mastodon.social/@charleshumble
https://conissaunce.com

DESCRIPTION
Charles Humble interviews Dr. Gail Murphy about the challenges in software engineering today. They discuss how productivity isn't just about lines of code but is more about focus and minimizing task-switching. Gail also talks about the difficulty of managing the rapid evolution of system architectures, stressing the need for regular restructuring and refactoring to avoid issues like increased coupling and decreased performance. The conversation moves to open-source development, where Gail highlights how using open-source components can create complex, brittle dependencies, and the need for better communication within these ecosystems. They wrap up by discussing the evolving role of technical leadership in navigating these challenges.
[...]

RECOMMENDED BOOKS
Heidi Helfand • Dynamic Reteaming
Heidi Helfand • How to Change Your Teams
Carl Larson & Frank M J LaFasto • Teamwork
Gene Kim & Steve Spear • Wiring the Winning Organization
Ichak Adizes • Managing Corporate Lifecycles
Henri Lipmanowicz & Keith McCandless • The Surprising Power of Liberating Structures
Matthew Skelton & Manuel Pais • Team Topologies
William Bridges & Susan Bridges • Transitions

CHANNEL MEMBERSHIP BONUS
Join this channel to get early access to videos & other perks:
https://www.youtube.com/channel/UCs_tLP3AiwYKwdUHpltJPuA/join

Looking for a unique learning experience?
Attend the next GOTO conference near you! Get your ticket: gotopia.tech

SUBSCRIBE TO OUR YOUTUBE CHANNEL - new videos posted daily!
Roy Derks, Developer Experience at IBM, talks about the integration of Large Language Models (LLMs) in web development. We explore practical applications such as building agents, automating QA testing, and the evolving role of AI frameworks in software development. Links https://www.linkedin.com/in/gethackteam https://www.youtube.com/@gethackteam https://x.com/gethackteam https://hackteam.io We want to hear from you! How did you find us? Did you see us on Twitter? In a newsletter? Or maybe we were recommended by a friend? Let us know by sending an email to our producer, Emily, at emily.kochanekketner@logrocket.com (mailto:emily.kochanekketner@logrocket.com), or tweet at us at PodRocketPod (https://twitter.com/PodRocketpod). Follow us. Get free stickers. Follow us on Apple Podcasts, fill out this form (https://podrocket.logrocket.com/get-podrocket-stickers), and we'll send you free PodRocket stickers! What does LogRocket do? LogRocket provides AI-first session replay and analytics that surfaces the UX and technical issues impacting user experiences. Start understanding where your users are struggling by trying it for free at LogRocket.com. Try LogRocket for free today (https://logrocket.com/signup/?pdr). Special Guest: Roy Derks.
In this podcast Michael Stiefel spoke with Lizzie Matusov about the dependency of effective, productive, and satisfied teams on good software architecture. Understanding this relationship requires understanding exactly what software productivity really is, how modern software engineering research has become more rigorous and practical, and how to apply that research to software development. Read a transcript of this interview: https://bit.ly/41trQt4 Subscribe to the Software Architects' Newsletter for your monthly guide to the essential news and experience from industry peers on emerging patterns and technologies: https://www.infoq.com/software-architects-newsletter Upcoming Events: QCon London (April 7-10, 2025) Discover new ideas and insights from senior practitioners driving change and innovation in software development. https://qconlondon.com/ InfoQ Dev Summit Boston (June 9-10, 2025) Actionable insights on today's critical dev priorities. devsummit.infoq.com/conference/boston2025 InfoQ Dev Summit Munich (October 15-16, 2025) Essential insights on critical software development priorities. https://devsummit.infoq.com/ QCon San Francisco 2025 (17-21, 2025) Get practical inspiration and best practices on emerging software trends directly from senior software developers at early adopter companies. https://qconsf.com/ InfoQ Dev Summit New York (Save the date - December 2025) https://devsummit.infoq.com/ The InfoQ Podcasts: Weekly inspiration to drive innovation and build great teams from senior software leaders. 
Listen to all our podcasts and read interview transcripts: - The InfoQ Podcast https://www.infoq.com/podcasts/ - Engineering Culture Podcast by InfoQ https://www.infoq.com/podcasts/#engineering_culture - Generally AI: https://www.infoq.com/generally-ai-podcast/ Follow InfoQ: - Mastodon: https://techhub.social/@infoq - Twitter: twitter.com/InfoQ - LinkedIn: www.linkedin.com/company/infoq - Facebook: bit.ly/2jmlyG8 - Instagram: @infoqdotcom - Youtube: www.youtube.com/infoq Write for InfoQ: Learn and share the changes and innovations in professional software development. - Join a community of experts. - Increase your visibility. - Grow your career. https://www.infoq.com/write-for-infoq
This interview was recorded for GOTO Unscripted.
https://gotopia.tech
Read the full transcription of this interview here

Daniel Terhorst-North - Originator of Behavior Driven Development (BDD) & Principal at Dan North & Associates
Julian Wood - Serverless Developer Advocate at AWS

RESOURCES
Daniel
https://bsky.app/profile/tastapod.com
https://www.linkedin.com/in/tastapod
https://github.com/tastapod
https://mastodon.social/@tastapod
http://dannorth.net/blog

Julian
https://bsky.app/profile/julianwood.com
https://twitter.com/julian_wood
http://www.wooditwork.com
https://www.linkedin.com/in/julianrwood
https://s12d.com/goto

DESCRIPTION
Daniel Terhorst-North and Julian Wood share decades of experience to offer a nuanced view of programming, governance, and product delivery. By framing programming as a socio-technical activity, they emphasize the critical role of collaboration, feedback, and sustainable practices. The conversation challenges traditional governance models, advocating for hypothesis-driven product management and continuous feedback mechanisms. Through humorous anecdotes and hard-won wisdom, Terhorst-North inspires people to look beyond technical expertise to the broader ecosystem of teams, culture, and organizational alignment.

[...]

RECOMMENDED BOOKS
Jez Humble & David Farley • Continuous Delivery
Nicole Forsgren, Jez Humble & Gene Kim • Accelerate
Kim, Humble, Debois, Willis & Forsgren • The DevOps Handbook
Jez Humble, Joanne Molesky & Barry O'Reilly • Lean Enterprise
Heidi Helfand • Dynamic Reteaming
Heidi Helfand • How to Change Your Teams
Carl Larson & Frank M J LaFasto • Teamwork
Gene Kim & Steve Spear • Wiring the Winning Organization
Matthew Skelton & Manuel Pais • Team Topologies
If you're a non-technical founder or leader, you might find developers frustrating to work with. (They also think the same thing about you.) Developers resist quick changes, seem annoyed by status meetings, and always want the most expensive equipment. That's not because they are prima donnas. This episode will explain why developers work so differently from other professionals, and show you how to create communication systems that work for both business and technical teams.

Listen to learn:
Why a "5-minute change" actually costs 2 hours of developer time
The hidden costs of cheap equipment that hurt your bottom line
How to structure meetings to maximise development speed
Practical communication strategies that get results

Resources mentioned in this episode: https://paulgraham.com/makersschedule.html

Timestamps
00:00 Introduction
01:10 Developer Mindsets
02:55 The Importance of Sprints
06:10 Managing Developer Interruptions
09:04 The Maker vs. Manager Schedule
11:59 Hiring and Equipment Considerations
14:50 Effective Communication
18:05 Developer Productivity
21:00 Conclusion

For the transcript, go to: https://www.techfornontechies.co/blog/243-how-to-work-with-developers-a-guide-for-non-technical-leaders

For more career & tech lessons, subscribe to Tech for Non-Techies on Apple, Spotify, YouTube, Amazon Podcasts, Stitcher, and Pandora.

FREE COURSE: 5 Tech Concepts Every Business Leader Needs To Know

Growth Through Innovation
If your organisation wants to drive revenue through innovation, book a call with us here. Our workshops and innovation strategies have helped Constellation Brands, the Royal Bank of Canada and Oxford University.
Many teams struggle to use developer productivity data effectively because they don't know how to use it to decide what to do next. We know that data is here to help us improve, but how do you know where to look? And even then, what do you actually do to put the wheels of change in motion? Listen to this conversation with Abi Noda and Laura Tacho (CEO and CTO at DX) about data-driven management and how to take a structured, analytical approach to using data for improvement.

Mentions and links:
Measuring developer productivity with the DX Core 4
Laura's developer productivity metrics course

Discussion points:
(0:00) Intro
(2:07) The challenge we're seeing
(6:53) Overview on using data
(8:58) Use cases for data - engineering organizations
(15:57) Use cases for data - engineering systems teams
(21:38) Two types of metrics - diagnostics and improvement
(38:09) Summary
In this episode, David Betts, leader of Twilio's developer platform team, shares how Twilio leverages developer sentiment data to drive platform engineering initiatives, optimize Kubernetes adoption, and demonstrate ROI for leadership. David details Twilio's journey from traditional metrics to sentiment-driven insights, the innovative tools his teams have built to streamline CI/CD workflows, and the strategies they use to align platform investments with organizational goals.

Mentions and links:
Find David on LinkedIn
Measuring developer productivity with the DX Core 4
Ask Your Developer by Jeff Lawson, former CEO of Twilio

Discussion points:
(0:00) Introduction
(0:49) Twilio's developer platform team
(2:03) Twilio's approach to release engineering and CD
(4:10) How they use sentiment data and telemetry metrics
(7:27) Comparing sentiment data and telemetry metrics
(10:25) How to take action on sentiment data
(13:16) What resonates with execs
(15:44) Proving DX value: sentiment, efficiency, and ROI
(19:15) Balancing quarterly and real-time developer feedback
Chris Chandler is a Senior Member of the Technical Staff for Developer Productivity at T-Mobile. Chris has led several major initiatives to improve developer experience, including their internal developer portal, Starter Kits (a patented developer platform that predates Backstage), and Workforce Transformation Bootcamps for onboarding developers faster.

Mentions and links:
Follow Chris on LinkedIn
Measuring developer productivity with the DX Core 4
Listen to Decoder with Nilay Patel

Discussion points:
(0:47) From developer experience to developer productivity
(7:03) Getting executive buy-in for developer productivity initiatives
(13:54) What Chris's team is responsible for
(17:02) How they've built relationships with other teams
(20:57) How they built and got funding for Dev Console and Starter Kits
(27:23) Homegrown solution vs Backstage
In this episode, Abi and Laura dive into the 2024 DX Core 4 benchmarks, sharing insights across data from 500+ companies. They discuss what these benchmarks mean for engineering leaders, how to interpret key metrics like the Developer Experience Index, and offer advice on how to best use benchmarking data in your organization.

Mentions and links:
DX Core 4 benchmarks
Measuring developer productivity with the DX Core 4
Developer Experience Index (DXI)
Will Larson's article on the Core 4 and power of benchmarking data

Discussion points:
(0:42) What benchmarks are for
(3:44) Overview of the DX Core 4 benchmarks
(6:07) PR throughput data
(11:05) Key insights related to startups and mobile teams
(14:54) Change fail rate data
(19:42) How to best use benchmarking data
Brad and Amy dive into their year-end tech reflections, discussing goal-setting strategies and Amy's ambitious Build 12 project for 2025. The hosts explore various database hosting solutions, share their favorite hardware purchases including cameras and peripherals, and examine how AI tools are reshaping development workflows. The episode concludes with insights into emerging tech trends and anticipated developments for 2025.

Chapter Marks
00:00 Episode introduction and host intros
00:41 Year-end goals discussion and 12-week planning
02:02 Amy's Build 12 project announcement
03:01 Goal setting strategies and focus
04:25 Brad's 2024 goals review
05:35 Travel plans and New York City trips
06:58 More 2024 goals: fitness, career, and finances
08:21 Technical stack discussion
13:22 AI tools and development workflows
17:19 Database hosting options comparison
25:45 Tech gear and hardware updates
33:47 Notable tech purchases review
43:29 AI tools and future tech discussion

Links
Build Twelve - Amy's upcoming project
The 12 Week Year, by Brian P. Moran (book)
Atomic Habits, by James Clear (book)
The Power of Habit, by Charles Duhigg (book)
Supabase
Neon database
Digital Ocean
Turso
Cursor IDE
Remarkable Tablet (v2)
Oura Ring
Razer Basilisk V3 Pro mouse
Swish app for Mac
Nuphy Air 75 keyboard
Drop keyboard
Insta360 One camera
Insta360 Go 3 camera
Nikon ZFC camera
Ray Deck - Episode 182: Low-Code as a Medium For High-Speed Developers
Marc Lou
Pieter Levels
WorkOS
The Best Way to Add Authentication to Your Astro Website (Amy's YouTube video)
Comparing Frameworks - Amy's project
GitHub Copilot
Claude
convertkit.com
loops.so
Prisma
Join Amy and Brad as they break down the latest developments in the React ecosystem following React Conf 2024. From quality-of-life improvements in React 19 to the introduction of the new React compiler, they analyze how these changes will impact everyday development. The episode features an in-depth discussion about Remix's strategic decision to focus on React Router, the ongoing debate between JavaScript frameworks and traditional backend frameworks, and thoughtful insights into choosing the right tools for your projects. Whether you're a seasoned React developer or just getting started, this episode offers valuable perspective on the future of web development.

Sponsors
Sanity delivers content anywhere (just like a headless CMS). Beyond that, Sanity gives you total composability. A fully decoupled, real-time content back end. Entirely customizable content workspaces.

Chapters
00:00 - Introduction
00:42 - Sponsor: Sanity
02:12 - React Conf Experience
05:00 - Conference Personalities
08:52 - React Compiler Deep Dive
13:20 - Remix "Taking a Nap" Discussion
26:41 - React 19 Features
33:54 - JavaScript vs PHP/Laravel Debate
41:11 - Framework Decision Fatigue
44:45 - Picks & Plugs
In this episode, Abi and Laura introduce the DX Core 4, a new framework designed to simplify how organizations measure developer productivity. They discuss the evolution of productivity metrics, comparing Core 4 with frameworks like DORA, SPACE, and DevEx, and emphasize its focus on speed, effectiveness, quality, and impact. They explore why each metric was chosen, the importance of balancing productivity measures with developer experience, and how Core 4 can help engineering leaders align productivity goals with broader business objectives.

Mentions and links:
Measuring developer productivity with the DX Core 4
Laura's developer productivity metrics course

Discussion points:
(2:42) Introduction to the DX Core 4
(3:42) Identifying the Core 4's target audience and key stakeholders
(4:38) Origins and purpose
(9:20) Building executive alignment
(14:15) Tying metrics to business value through output-oriented measures
(24:45) Defining impact
(32:42) Choosing between DORA, SPACE, and Core 4 frameworks
2025 will test every assumption about how engineering teams work. With the new year fast approaching, Ori Keren, CEO of LinearB, has some bold predictions that might surprise you, like why developer productivity could actually go down in 2025. Yep, you read that right. As AI tools flood the market, we might see a dip in both productivity and creativity before the long-term benefits kick in. It's a wake-up call for engineering leaders to rethink how they lead their teams.

Ori dives into the trends that'll dominate:
- AI's rise
- The ever-growing need for cybersecurity
- Why DevEx and developer productivity are heading for a showdown

His advice? Stop flying blind. "You can't optimize what you don't measure," he says. If you're leading an engineering org, this episode is your 2025 game plan: a mix of data-driven decision-making and people-first strategies to stay ahead in a year of change. Don't miss this insightful fusion of qual and quant.

Show Notes:
Join us at Dev Interrupted Live!
2025 Engineering Benchmarks Insights Webinar
Follow Ori
Follow Ben
Follow Andrew

Support the show:
Subscribe to our Substack
Leave us a review
Subscribe on YouTube
Follow us on Twitter or LinkedIn

Offers:
Learn about Continuous Merge with gitStream
Get your DORA Metrics free forever
Exciting news from AWS re:Invent! GitLab and AWS are joining forces to supercharge AI-powered software development!
In this episode, Brian Houck, Applied Scientist, Developer Productivity at Microsoft, covers SPACE, DORA, and some specific metrics the developer productivity research team is finding useful. The conversation starts by comparing DORA and SPACE. Brian explains why activity metrics were included in the SPACE framework, then dives into one metric in particular: pull request throughput. Brian also describes another metric Microsoft is finding useful, and gives a preview into where his research is heading.

Mentions and links:
Connect with Brian on LinkedIn
The SPACE of Developer Productivity: There's More to It Than You Think
Measuring developer productivity with the DX Core 4
DevEx in action
DORA, SPACE, and DevEx: Which framework should you use?

Discussion points:
(0:48) SPACE framework's growth and adoption
(3:47) Comparing DORA and SPACE
(6:30) SPACE misconceptions and common implementation challenges
(9:34) Whether PR throughput is useful
(15:13) Real-world example of using PR throughput
(21:33) Talking about metrics like PR throughput internally
(24:39) Where Brian's research is heading
If information is scattered across all the different sections of a developer portal, how can you make it easy for developers to find exactly what they are looking for? Is AI really the answer to challenges in data exploration? In this episode, Praneet Singh (Product Manager at Intuit) shares valuable insights about improving findability, as well as the emerging personas on developer portals and how repeating data patterns can be the key to serving them.
In this episode, we dive into the exciting announcements from MongoDB's recent keynote, featuring Gaurav, a Senior Product Manager specializing in developer tools. Discover how the new Intelligent Plugin and VSCode Copilot Extension are set to revolutionize the way developers interact with MongoDB.Join us as we explore the future of development with MongoDB and how these tools can significantly improve your workflow. Don't miss out on the opportunity to sign up for the Intelligent Plugin private preview and learn how to get started with the VSCode Copilot Extension today!
Hi, Spring fans! In this installment, I talk to legendary Gradle Developer Productivity Engineering guru (formerly of JFrog) and hero to the JVM-language community, Baruch Sadogursky, recorded live from Dr. Venkat Subramaniam's amazing conference, Dev2Next 2024!
Keri Olson (@ksolson20, VP AI for Code at @IBM) talks about coding assistants across the software development lifecycle, the future of agents, and domain-specific assistants.

SHOW: 869
SHOW TRANSCRIPT: The Cloudcast #869 Transcript
SHOW VIDEO: https://youtube.com/@TheCloudcastNET
CLOUD NEWS OF THE WEEK: http://bit.ly/cloudcast-cnotw
NEW TO CLOUD? CHECK OUT OUR OTHER PODCAST: "CLOUDCAST BASICS"

SHOW SPONSOR:
While data may be shaping our world, Data Citizens Dialogues is shaping the conversation. Follow Data Citizens Dialogues on Apple, Spotify, YouTube, or wherever you get your podcasts.

SHOW NOTES:
IBM Watsonx Code Assistant (homepage)
IBM Watsonx Code Assistant for Ansible Lightspeed (homepage)
IBM Watsonx Code Assistant for Z (homepage)

Topic 1 - Welcome to the show. Tell us about your background, and then give us a little bit of background on where you focus your time at IBM these days.
Topic 2 - Developer code assistants have become one of the most popular areas of GenAI usage. At a high level, how mature are the technologies that augment developers today?
Topic 3 - Software development has an entire lifecycle (Generate, Complete, Explain, Test, Transform, Document). It's easy for developers to just plug in a service, but is that often the most effective way to start using GenAI in the software development lifecycle?
Topic 4 - Software developers are notoriously picky about what tools they use and how they use them. GenAI doesn't "guarantee" outputs. Are there concerns that if different developers or groups use different coding assistants, it could create more challenges than it helps?
Topic 5 - What is a holistic way to think about code assistants? How much should be actively engaged with developers, how much should be behind the scenes, and how much will be automated or agentic in the future?
Topic 6 - In the past, we essentially had "real developers" (people who wrote code) and things like low-code for "citizen developers" on process tasks. Do you expect to see code assistants bringing more powerful skills to people who previously hadn't identified as real developers? (e.g., the great idea on a napkin that turns into a mobile app)

FEEDBACK?
Email: show at the cloudcast dot net
Twitter: @cloudcastpod
Instagram: @cloudcastpod
TikTok: @cloudcastpod
When measuring developer productivity, traditional output metrics might not tell the whole story, so it's essential to look at environmental, organizational, and cultural factors for a holistic view.
Become a more effective team with this CTO podcast featuring Rebecca Murphey, Field CTO of Swarmia and co-author of Build. From her years of experience working in the developer productivity organizations at Stripe and Indeed, and now at Swarmia, Rebecca knows this conversation isn't just about developer metrics and productivity - it's about the broader picture.
What are the limitations of general large language models, and when should you evaluate more specialized models for your team's most important use case? This week, Conor Bronsdon sits down with Brandon Jung, Vice President of Ecosystem at Tabnine, to explore the difference between specialized models and LLMs. Brandon highlights how specialized models outperform LLMs when it comes to specific coding tasks, and how developers can leverage tailored solutions to improve developer productivity and code quality. The conversation covers the importance of data transparency, data origination, cost implications, and regulatory considerations such as the EU's AI Act. Whether you're a developer looking to boost your productivity or an engineering leader evaluating solutions for your team, this episode offers important context on the next wave of AI solutions.

Topics:
00:31 Specialized models vs. LLMs
01:56 The problems with LLMs and data integrity
12:34 Why AGI is further away than we think
16:11 Evaluating the right models for your engineering team
23:42 Is AI code secure?
26:22 How to adjust to work with AI effectively
32:48 Training developers in the new AI world

Links:
Brandon Jung on LinkedIn
Brandon Jung (@brandoncjung) / X
Tabnine (@tabnine) / X
Tabnine AI code assistant | Private, personalized, protected
Managing Bot-Generated PRs & Reducing Team Workload by 6%

Support the show:
Subscribe to our Substack
Leave us a review
Subscribe on YouTube
Follow us on Twitter or LinkedIn

Offers:
Learn about Continuous Merge with gitStream
Get your DORA Metrics free forever
This Week in Machine Learning & Artificial Intelligence (AI) Podcast
Today, we're joined by Simon Willison, independent researcher and creator of Datasette to discuss the many ways software developers and engineers can take advantage of large language models (LLMs) to boost their productivity. We dig into Simon's own workflows and how he uses popular models like ChatGPT and Anthropic's Claude to write and test hundreds of lines of code while out walking his dog. We review Simon's favorite prompting and debugging techniques, his strategies for sidestepping the limitations of contemporary models, how he uses Claude's Artifacts feature for rapid prototyping, his thoughts on the use and impact of vision models, the role he sees for open source models and local LLMs, and much more. The complete show notes for this episode can be found at https://twimlai.com/go/701.
Ruben Casas discusses software migrations at scale, understanding different migration patterns, making critical decisions on whether a full rewrite is necessary, and more. This episode covers all the essentials you need to navigate your next big software transformation.

Links
https://www.linkedin.com/in/ruben-casas-17100383
github.com/infoxicator
https://www.infoxicator.com/
https://x.com/Infoxicador
https://www.youtube.com/c/RubenCasas
Damien Filiatrault is the founder and CEO of Scalable Path, a software staffing agency that matches companies and startups with vetted, remote software developers. The company was founded in 2010, and since then has worked on hundreds of client projects and has built a freelance network with 35,000 remote developers in 177 countries. Damien joins the show to talk about developer productivity.
Software Engineering Radio - The Podcast for Professional Software Developers
Hans Dockter, the creator of the Gradle build tool and founder of Gradle Inc, the company behind the developer productivity platform Develocity, joins SE Radio host Giovanni Asproni to talk about developer productivity. They start with some definitions and an explanation of the importance of developer productivity, its relationship with cognitive load, and the big impact that development tools have on it. Hans describes how to implement developer productivity metrics in an organization and warns about some pitfalls. The episode closes with some discussion of Hans's views on the future of this discipline, as well as some near-term developments and expectations. Brought to you by IEEE Computer Society and IEEE Software magazine.