Podcasts about generative AI

  • 179 PODCASTS
  • 477 EPISODES
  • 32m AVG DURATION
  • 5 WEEKLY NEW EPISODES
  • LATEST: Jun 25, 2025

POPULARITY (trend chart, 2017–2024)


Best podcasts about generative AI

Latest podcast episodes about generative AI

AWS for Software Companies Podcast
Ep111: The Architecture of Growth: Sonar's Evolution to Multi-Region SaaS

AWS for Software Companies Podcast

Play Episode Listen Later Jun 25, 2025 28:17


Andrea Malagodi, CTO of Sonar, discusses how the company successfully transitioned from on-premise to SaaS, leveraging AWS partnership and maintaining focus on developer-centric code quality and security solutions.

Topics Include:
• Andrea Malagodi is CTO of Sonar, guest on podcast
• Sonar founded 16+ years ago by three software engineers
• Founders wanted to help developers understand code quality issues
• Focus on giving developers precise, actionable insights for improvement
• Products include SonarQube Server, Cloud, and IDE versions
• Recent acquisitions: ACR, Tidelift, and Structure 101 companies
• SaaS journey began seven years ago with SonarQube Cloud
• Initially targeted individual developers, then expanded to enterprises
• Now multi-region with comprehensive enterprise features available
• Seven million developers rely on Sonar's solutions globally
• 400,000 organizations and 28,000 enterprise customers use Sonar
• Started SaaS to test market demand, not assumptions
• Engaged customers early to understand migration requirements needed
• Recommends alpha versions with design customers for feedback
• Free tier for open-source code enables quick trial
• Enterprise certifications (ISO 27001, SOC 2) build trust
• AWS partnership includes enterprise support and technical resources
• Used CDK for infrastructure-as-code, experienced early adoption challenges
• Multi-region strategy should be considered from the beginning
• AWS Learning partnership certified all engineers in cloud
• Cloud enables faster development cycles than traditional infrastructure
• Recommends avoiding architectural one-way doors during transition
• Consider data residency requirements for global customer base
• AI-generated code creates productivity gains but needs validation
• Sonar provides deterministic rules for AI-generated code review
• Working on MCP protocol and AI code quality solutions
• Security approach is "start left" not "shift left"
• Advanced Security offering includes dependency scanning and vulnerabilities
• Available on sonarsource.com and AWS Marketplace
• Free tier offers 50,000 lines of code analysis

Participants:
• Andrea Malagodi – Chief Technical Officer, Sonar

Further Links:
• Website: www.sonarsource.com
• Sonar in the AWS Marketplace

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep110: Redefining Network Detection & Response with Generative AI – The Partnership of ExtraHop Networks and AWS

AWS for Software Companies Podcast

Play Episode Listen Later Jun 23, 2025 18:01


Kanaiya Vasani, Chief Product Officer, explains how ExtraHop leverages AWS services and generative AI to help enterprise customers address the growing security challenges of uncontrolled AI adoption.

Topics Include:
• ExtraHop reinventing network detection and response category
• Platform addresses security, performance, compliance, forensic use cases
• Behavioral analysis identifies potential security threats in infrastructure
• Network observability and attack surface discovery capabilities included
• Application and network performance assurance built-in features
• Traditional IDS capability with rules and IOCs detection
• Packet forensics for investigating threats and wire evidence
• Cloud-native implementations and compromised credential investigation support
• ExtraHop partnership with AWS spans 35-40 different services
• AWS handles infrastructure while ExtraHop focuses core competencies
• ExtraHop early adopter of generative AI in NDR
• Natural language interface enables rapid data access queries
• English questions replace complex query languages for users
• Agentic AI experiments focus on SOC automation workflows
• L1 and L2 analyst workflow automation improves productivity
• Shadow AI creates major risk concern for customers
• Uncontrolled chatbot usage risks accidental data leakage
• Governance structures needed around enterprise gen AI usage
• Visibility required into LLM usage across infrastructure endpoints
• AI innovation pace challenges security industry keeping up
• Models evolved from billion to trillion parameters rapidly
• Traditional security tools focus policies, miss real-time activity
• "Wire doesn't lie" - network traffic reveals actual behavior
• ExtraHop maps baseline behavior patterns across infrastructure endpoints
• Anomalous behavioral patterns flagged through network traffic analysis
• MCP servers enable LLM access through standardized protocols
• Stolen tokens allow adversaries unauthorized MCP server access
• Machine learning identifies anomalous traffic patterns L2-L7 protocols
• Gen AI automates incident triage, investigation, response workflows
• Best practices include clear policies, governance, monitoring, education

Participants:
• Kanaiya Vasani – Chief Product Officer, ExtraHop Networks

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Process Transformers
Episode 25: Jobs-To-Be-Done in the Age of AI: What Stays, What Adapts?

Process Transformers

Play Episode Listen Later Jun 20, 2025 39:52


Join us in a transformative discussion with Tony Ulwick, who shares his expert insights on mastering innovation in the AI-driven world. Discover the importance of focusing on what customers truly need and how this can set your product apart. Explore the critical mindset shift from product features to job satisfaction, which could be the key to your next breakthrough. This episode is a treasure trove for anyone passionate about making a mark with innovation.

AWS for Software Companies Podcast
Ep109: Sustaining Data Quality and Quantity: How Cribl is helping Customers Control Costs and Unlock Value

AWS for Software Companies Podcast

Play Episode Listen Later Jun 18, 2025 20:54


Cribl's Field CISO Ed Bailey discusses how Cribl helps customers manage the quality and quantity of their data by providing intelligent controls between data sources and destinations.

Topics Include:
• Cribl company name origin
• Company helps organizations screen data to find valuable insights
• Ed Bailey was Cribl's first customer back in 2018
• Data growth of 25% yearly created seven-figure cost increases
• CEOs and CIOs complained about explosive data storage costs
• Users demanded more data while budgets remained constrained
• Bailey discovered Cribl through a random Facebook advertisement
• Cribl Stream sits between data sources and destinations
• No new agents required, uses existing infrastructure connections
• Reduced data growth from 28% to 8% within a year
• Development cycles shortened from six weeks to two weeks
• Bailey managed global security and telemetry data systems
• Operated large Splunk instance across forty different countries
• Team spent time collecting data instead of extracting value
• Cribl provided consistent data control plane for operations
• Smart engineers could focus on machine learning solutions
• Migrated from terrible SIEM to better security platform
• Data strategy should focus on business requirements first
• Not all data has the same business value
• Tier one: Critical data goes to expensive platforms
• Tier two: Important data stored in cheaper lakes
• Tier three: Compliance data in low-cost object storage
• SIEM costs around one dollar per gigabyte stored
• Data lakes cost twelve to eighteen cents per gigabyte
• Object storage costs fractions of pennies per gigabyte
• AWS partnership provides scalable infrastructure for rapid growth
• EC2, EKS, and S3 are heavily utilized services
• Cribl Search finds data directly in object storage
• Avoids costly data movement for search and analysis

Participants:
• Edward Bailey – Field CISO, Cribl

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep108: Getting Ahead of the Curve - How Saviynt Automates Identity Security at Scale

AWS for Software Companies Podcast

Play Episode Listen Later Jun 16, 2025 17:36


Saviynt Co-Founder Amit Saha discusses how their AWS partnership has enabled the identity security company to deliver comprehensive identity protection while minimizing organizational friction.

Topics Include:
• Saviynt is leading identity security provider in market
• Secures human, non-human, workforce, and privileged access identities
• Eliminates friction while automating organizational access management processes
• Biggest challenge: reducing friction in new access processes
• Second challenge: visibility into accumulated technical debt problems
• Lost business context makes access permissions difficult to unwind
• Saviynt provides quick visibility to prioritize identity risks
• Shadow IT creates ungoverned workloads and cloud applications
• Need integration with asset management and cloud providers
• Must derive intelligence from multiple disconnected information sources
• AWS partnership provides access to prolific customer base
• AWS security owners are same buyers for Saviynt
• Eleven-year AWS relationship with early security competency
• ISV Accelerate program connects with sellers and architects
• Rising Star program helps stand out in crowded marketplace
• Find mutual customers for successful AWS partnership stories
• GenAI in bad actors' hands compromises customer security
• Product engineering uses GenAI tools for better quality
• Agentic AI creates new paradigm between human/non-human identities
• Agentic AI requires dynamic, fluid access management approaches
• AI agents can generate their own bots needing access
• Zero trust principles needed at broader scale for AI
• Next twelve months: getting ahead of GenAI curve
• New AWS services launch daily in GenAI space
• Contributing to new standards like MCP and A2A protocols
• AWS Marketplace simplifies procurement and buyer discovery processes
• EDP program and migration incentives benefit ISV transactions
• AWS developer-friendly startup programs accelerate time to market
• Cloud-native approach enables predictable scaling and AWS integration
• AWS-Saviynt partnership aims for once-in-generation security impact

Participants:
• Amit Saha – Co-Founder and Chief Growth Officer, Saviynt

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep107: Cloud-Scale Security Monitoring – How Panther and AI are Revolutionizing Cybersecurity

AWS for Software Companies Podcast

Play Episode Listen Later Jun 11, 2025 23:54


Chief Architect Russell Leighton discusses how Panther's cloud platform revolutionizes security operations by treating detections as Python code and using AI-enabled alert vetting to turn responses from hours into minutes.

Topics Include:
• Panther is a cloud security monitoring tool (cloud SIEM)
• Works at massive scale, more cost-effective than legacy systems
• Key differentiator: "detections as code" written in Python
• Brings software engineering best practices to security operations
• Enables unit testing and version control for security detections
• Recently adopted generative AI to improve security workflows
• SOC burnout is well known due to tedious ticket processing
• AI has intelligence of security engineer, works much faster
• Example: Alert shows "Russ Leighton removed branch protection"
• Old way: Manual log analysis, checking user profiles manually
• Takes hours of squinting at detailed log data
• New AI way: Automatic vetting happens in minutes
• AI checks user profile in Okta or IDP
• Determines engineer status, assesses typical behavior patterns
• Provides risk assessment based on historical alert data
• Low risk for engineers, high risk for unusual users
• Example: HR person accessing production code is escalated
• Customer quote: Takes vetting "from hours to seconds"
• Panther customers get dedicated AWS accounts for security
• Company can't see customer data, only self-reported metrics
• AI provides summaries, risk assessments, timelines, visualizations
• Also suggests remediations like human security engineer would
• Initial concerns about putting AI in production environment
• Customer feedback exceeded expectations with feature requests
• AWS Bedrock integration addresses customer security concerns
• Uses Anthropic Claude as base LLM through Bedrock
• Customers can enable additional Bedrock guardrails independently
• AI transparency prevents hallucination concerns through explanations
• Claude's extended thinking mode shows reasoning process
• AI visualizes thinking with flowcharts explaining decision process

Participants:
• Russell Leighton – Chief Architect, Panther

Further Links:
• Website: Panther.com
• AWS Marketplace

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep106: Building Secure and Agile AI Agents at Scale with Anthropic and AWS

AWS for Software Companies Podcast

Play Episode Listen Later Jun 10, 2025 37:20


Security leaders from Anthropic and AWS discuss how agentic AI is transforming cybersecurity functions to autonomously handle everything from code reviews to SOC operations.

Topics Include:
• Agentic AI differs from traditional AI through autonomy and agency
• Traditional AI handles single workflow nodes, agents collapse multiple steps
• Higher model intelligence enables understanding of broader business contexts
• Agents make intelligent decisions across complex multi-step workflows
• Enterprise security operations are seeing workflow consolidation through GenAI
• Organizations embedding GenAI directly into customer-facing production applications
• Software-as-a-service transitioning to service-as-software through AI agents
• Securing AI requires guardrails to prevent hallucinations in applications
• New vulnerabilities appear at interaction points between system components
• Attackers target RAG systems and identity/authorization layers instead
• LLMs hallucinate non-existent packages, attackers create malicious honeypots
• Governance frameworks must be machine-readable for autonomous agent reasoning
• Amazon investing in automated reasoning to prove software correctness
• Anthropic uses Claude to write over 50% of code
• Automated code review systems integrated into CI/CD pipelines
• Security design reviews use MITRE ATT&CK framework automation
• Low-risk assessments enable developers to self-approve security reviews
• 40% reduction in application security team review workload
• Anthropic eliminated SOC, replaced entirely with Claude-based automation
• IT support roles transitioning to engineering as automation replaces frontline
• Compliance questionnaires fully automated using agentic AI workflows
• ISO 42001 framework manages AI deployment risks alongside security
• Executive risk councils evaluate AI risks using traditional enterprise processes
• AWS embeds GenAI into testing, detection, and user experience
• Finding summarization helps L1 analysts understand complex AWS environments
• Amazon encourages teams to "live in the future" with AI
• Interview candidates expected to demonstrate Claude usage during interviews
• Security remains biggest barrier to enterprise AI adoption beyond POCs
• Virtual employees predicted to arrive within next 12 months
• Model Context Protocol (MCP) creates new supply chain security risks

Participants:
• Jason Clinton – Chief Information Security Officer, Anthropic
• Gee Rittenhouse – Vice President, Security Services, AWS
• Hart Rossman – Vice President, Global Services Security, AWS
• Brian Shadpour – GM of Security and B2B Software Sales, AWS

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Process Transformers
Unplugged Episode 01: Cheap Content, Expensive Trust

Process Transformers

Play Episode Listen Later Jun 10, 2025 10:25


Join us in this episode to engage with the thought-provoking theme of AI's disruption of content creation as Lukas Egger elucidates its impact on business trust. Understand the transformation from a content-driven to a trust-driven market and grasp how leading businesses are leveraging trust as their currency of choice. This episode is a must-listen for anyone eager to navigate through the evolving challenges and opportunities presented by AI.

AWS for Software Companies Podcast
Ep105: Transforming B2B - How Spryker Powers Complex B2B Commerce with AWS

AWS for Software Companies Podcast

Play Episode Listen Later Jun 9, 2025 21:32


Spryker's Chief Product Officer, Elena Leonova, discusses the Spryker Business Intelligence platform and how working with AWS as a strategic advisor unlocked deeper opportunities for transformative growth.

Topics Include:
• Elena Leonova introduces Spryker as digital commerce platform
• Spryker focuses on sophisticated B2B commerce transactions
• Traditional industries: manufacturing, industrial goods, med tech
• Customers sell complex equipment like MRI machines, tractors
• Products are custom-built to order through procurement processes
• Extensive negotiation and aftermarket servicing are required
• Competitors focus on fashion, food - not complex equipment
• Spryker exclusively hosted on AWS cloud infrastructure
• AWS partnership enables new capabilities and customer innovation
• Business intelligence tools and AI capabilities now available
• Ricoh example: global manufacturer of industrial-grade printers
• Ricoh sells through dealers and distributors worldwide
• S-Diverse: new automotive software marketplace partnership platform
• Connects automotive manufacturers with embedded software producers
• Spryker Business Intelligence powered by Amazon QuickSight launched
• Commerce becoming more intelligent than traditional repeat purchases
• Complex equipment buyers don't purchase MRI machines weekly
• Platform provides insights into customer portal navigation patterns
• Combines commerce data with search, CRM, competitive intelligence
• Helps merchants identify revenue optimization signals from noise
• Business intelligence integrated directly within Spryker platform
• Customers should evaluate platform's future scalability and flexibility
• Revenue optimization requires understanding what metrics to improve
• Easy-to-use data analysis prevents information overload problems
• QuickSight's GenAI capabilities enable faster executive decision-making
• AWS partnership provided cost optimization and innovation confidence
• Elena initially viewed AWS as just hosting provider
• Building shared vision with AWS unlocked deeper collaboration
• AWS became trusted advisor for strategy and partnerships
• Generative AI enables multi-persona communication across customer types

Participants:
• Elena Leonova – Chief Product Officer, Spryker

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep104: Partnership in Innovation - How ActiveFence and AWS are De-risking AI

AWS for Software Companies Podcast

Play Episode Listen Later Jun 4, 2025 26:59


ActiveFence CEO Noam Schwartz discusses how his company evolved from protecting platforms against user-generated harmful content to helping companies deploy public-facing AI safely at scale.

Topics Include:
• Noam Schwartz introduces himself as ActiveFence CEO
• Former intelligence officer specializing in open source intelligence
• Mission: protect online experiences for everyone everywhere
• Online platforms constantly hammered by various attacks
• Attacks include cybersecurity, abuse, hate speech, spam
• Companies playing endless whack-a-mole game with violations
• Need scalable solution that works across languages/formats
• Developed enterprise-grade technology for sophisticated companies
• Amazon became customer and great partner early on
• Generative AI introduction changed the game completely
• LLMs non-deterministic unlike traditional programmed chatbots
• Same input produces different outputs each time
• AI deployed in customer support, healthcare, airlines
• New risks when models speak on company's behalf
• One bad output creates legal and reputational damage
• Companies need to deploy public-facing AI safely
• Transition affects healthcare, finance, gaming, government sectors
• Building on years of user-generated content expertise
• No specific ChatGPT moment triggered their AI pivot
• ActiveFence was AI company since day one
• Model companies like Amazon, Nvidia asked for help
• Realized their expertise perfectly suited for AI safety
• Staying on top of AI developments is impossible
• Focus on customer adoption, not every new release
• Main enterprise challenge is trusting AI technology
• Unrealistic expectations for 100% accuracy from AI
• Most companies will license existing models, not build
• Security solutions remain independent like traditional cybersecurity

Participants:
• Noam Schwartz – CEO and Co-Founder, ActiveFence
• Ofer Oringher – Software and Technology Account Manager, AWS

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep103: Supercharging Security with GenAI – Best Practice Sharing with Sonrai Security

AWS for Software Companies Podcast

Play Episode Listen Later Jun 2, 2025 17:04


Jeff Moncrief discusses Sonrai Security's Cloud Permissions Firewall, and the best practices for using AI-powered summaries and orchestration to ensure security at all points.

Topics Include:
• Jeff Moncrief introduces Sonrai Security and Cloud Permissions Firewall
• Focus on achieving least privilege access in AWS quickly
• Lightweight orchestration layer secures IAM from inside out
• Eliminates need to write hundreds of individual policies
• Customers struggle with identity risk in CNAP/CSPM tools
• Generative AI adoption driving top security use cases
• Bedrock and AI agents mentioned daily by customers
• Product managers should consider underlying platform security risks
• AI models have control over infrastructure they run on
• Identity is fundamental infrastructure enabling AWS AI models
• Sonrai uses Bedrock capability inside Cloud Permissions Firewall
• Just-in-time access provides temporary, time-boxed AWS access
• Bedrock generates session summaries from audit logs automatically
• Plain English insights show what happened during sessions
• Session summaries improve audit compliance and incident response
• Customer with 1000 accounts manually deployed service controls
• Friday afternoon deployment caused very bad weekend disaster
• Policy inheritance issues broke child accounts and OUs
• Planning and orchestration essential for scaling AI security
• Sonrai platform built 100% cloud-native on AWS
• Coordinates service control policies and resource control policies
• Just-in-time access relies on IAM Identity Center
• Participates in ISV Accelerate and AWS Marketplace
• Security best practices start with identity as foundation
• "Hackers don't hack, they just log in" philosophy
• Eliminate standing privileges with just-in-time access patterns
• Restrict AI services by user, location, and account
• Review over-permissioned or inactive third-party vendor access
• Actionable insights through useful logging and AI summarization
• Future focus on protecting new services and permissions

Participants:
• Jeff Moncrief – Field CTO & Director of Sales Engineering, Sonrai Security

Links:
• Website – Sonraisecurity.com
• AWS Marketplace – Sonrai Security

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep102: 500 Billion Connected Devices: Intel's Investment in improving Enterprise AI

AWS for Software Companies Podcast

Play Episode Listen Later May 29, 2025 16:35


Akanksha Bilani of Intel shares how businesses can successfully adopt generative AI with significant performance gains while saving on costs.

Topics Include:
• Akanksha runs go-to-market team for Amazon at Intel
• Personal and business devices transformed how we communicate
• Forrester predicts 500 billion connected devices by 2026
• 5,000 billion sensors will be smartly connected online
• 40% of machines will communicate machine-to-machine
• We're living in a world of data deluge
• AI and Gen AI help make data effective
• Goal is making businesses more profitable and effective
• Various industries need Gen AI and data transformation
• Intel advises companies as partners with AWS
• Three factors determine which Gen AI use cases to adopt
• Factor one: availability and ease of use cases
• How unique and important are they for business?
• Does it have enough data for right analytics?
• Factor two: purchasing power for Gen AI adoption
• 70% of companies target Gen AI but lack clarity
• Leaders must ensure capability and purchasing power exist
• Factor three: necessary skill sets for implementation
• Need access to right partnerships if lacking skills
• Intel and AWS partnered for 18 years since inception
• Intel provides latest silicon customized for Amazon services
• Engineer-to-engineer collaboration on each processor generation
• 92% of EC2 runs on Intel processors
• Intel powers compute capability for EC2-based services
• Intel ensures access to skillsets making cloud alive
• AWS services include Bedrock, SageMaker, DLAMIs, Kinesis
• Performance is among the top three priorities for success
• Not every use case requires expensive GPU accelerators
• CPUs can power AI inference and training effectively
• Every GPU has a CPU head node component

Participants:
• Akanksha Bilani – Global Sales Director, Intel

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Data Protection Gumbo
301: Why AI Fails Without the Right Data (And How to Fix It) - View Systems Inc.

Data Protection Gumbo

Play Episode Listen Later May 27, 2025 25:36


Joel Christner, CEO of View Systems, talks about why so many enterprise AI projects fall flat—despite powerful models and high expectations. The culprit? Bad or disorganized data. Joel explains how fragmented tools, shadow IT, and poor data hygiene create major roadblocks to AI success. He shares how View Systems built a platform to solve this—streamlining everything from data ingestion to AI-driven insights in just minutes. If you're tired of AI hype and want real answers, this episode is your blueprint for building smarter, faster, and more effective AI solutions.

AWS for Software Companies Podcast
Ep101: Beyond Chat - How Asana and Amazon Q Are Embedding AI Into Enterprise Workflows

AWS for Software Companies Podcast

Play Episode Listen Later May 27, 2025 25:13


Victoria Chin of Asana and Michael Horn of AWS demonstrate how Amazon Q integrates with Asana to enable AI-powered workflows while dramatically reducing manual work and improving cross-functional collaboration.

Topics Include:
• Victoria Chin introduces herself as Asana's CPO Chief of Staff
• Michael Horn from AWS discusses customer feedback on generative AI
• AI agents limited by quality of data pulled into them
• Amazon Q Business created to analyze information and take action
• Hundreds of customers using Q Business across various industries daily
• AWS hosts most business applications, ideal for AI journey
• Amazon Q has most built-in, managed, secure data connectors available
• Q Index creates comprehensive, accessible index of all company data
• Security permissions automatically pulled in, no manual configuration needed
• Supports both structured and unstructured data from multiple sources
• Victoria returns to discuss Asana's integration with Q Index
• Billions invested in integrations, but usage still lags behind
• Teams switch between apps 1000 times daily, missing connections
• Root problem: no reliable way to track who/what/when/why
• Content platforms store work but don't manage or coordinate
• Asana bridges content and communication for effective teamwork scaling
• AI disrupting software, but questions remain about real value
• Software must provide structured framework to guide LLMs effectively
• AI needs data AND structure to separate signal from noise
• Asana Work Graph maps how work actually gets done organizationally
• Work Graph visualized as interconnected data, not rows and columns
• Most strategic work is cross-functional, requiring multiple teams collaborating
• Traditional integrations require manual setup and knowing when to use
• Q Index gives Asana access to 40+ different data connectors
• Users can ask questions, get answers with cross-application context
• AI Studio enables no-code building of workflows with AI agents
• Product launch example shows intake, planning, execution, and reporting stages
• AI can surface relevant documents, research, and updates automatically
• Chat is tip of iceberg; real power comes from embedded workflows
• Integration evolves from feature-level to AI-powered product-level connections

Participants:
• Victoria J. Chin – Chief of Staff / Product Strategy, AI, Asana
• Michael Horn – Principal Head of Business Development – Artificial Intelligence & Machine Learning, AWS

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS for Software Companies Podcast
Ep100: The Power of ISV Community - Celebrating 100 Episodes with ISV Customers and AWS Leaders

AWS for Software Companies Podcast

Play Episode Listen Later May 22, 2025 17:59


AWS leaders commemorate the podcast's 100th episode while looking ahead to expanded coverage of technology partners and continued focus on generative AI, modern data strategies, agentic AI solutions and more!

Topics Include:
• Episode 100 celebrates milestone of AWS software companies podcast
• Weekly podcast shares ISV stories, best practices, guidance
• Today features AWS leader thoughts on ISV community
• Arym Diamond heads North America data and AI sales
• Specialist team helps win deals, create happy customers
• ISV customers do cutting-edge work on AWS platform
• ISVs create force multiplier effect for entire company
• Building community through podcast video and audio content
• Kristen Backeberg leads global ISV partner marketing at AWS
• Podcast featured 157 ISV leaders from 121 companies
• Reached over 30,000 listeners across 90+ countries worldwide
• ISV partners drive cloud innovation across all industries
• AWS supports growth from startups to enterprise leaders
• APN network designed to help partners succeed, scale
• Olawale Oladehin directs ISV solutions architecture in North America
• Podcast shares customer insights, journeys, and innovations
• AWS technology continues evolving to meet customer needs
• Carol Potts leads North America ISV sales at AWS
• Podcast started less than two years ago
• First episode titled "Data the Engine for Growth"
• Customer obsession drives everything AWS does for ISVs
• Deep collaboration focused on joint ISV success partnerships
• Vishal Sanghvi heads ISV marketing for North America
• ISVs face pressure delivering products at generative AI pace
• Modern data strategy foundational for ISV product success
• Favorite episodes include Snowflake, Wiz, Coupang discussions
• AWS offers programs for every ISV persona type
• Future episodes focus on generative AI, cybersecurity, data
• Agentic AI becoming important for production phase evolution
• Podcast expanding scope to include technology partners

Participants:
• Kristen Backeberg – Director, Global ISV, Solutions Enterprise and Alliance Partner Marketing, Amazon Web Services
• Arym Diamond – Director, US ISV Specialists, Amazon Web Services
• Olawale Oladehin – Director, ISV, Solutions Architecture, North America, Amazon Web Services
• Carol Potts – GM, ISV Sales Segment, North America, Amazon Web Services
• Vishal Sanghvi – Head of ISV Field Marketing, North America, Amazon Web Services

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Moore's Lobby: Where engineers talk all about circuits
Generative AI for Dirty Jobs—The Next Industrial Revolution?

Moore's Lobby: Where engineers talk all about circuits

Play Episode Listen Later May 20, 2025 54:05


Christopher Savoie, the founder and CEO of Zapata Computing, has had a fascinating career journey. After beginning as a young programmer working with early computers, he switched gears to immunology and biophysics in Japan and is now founding AI companies. Along the way, he was also involved in creating the foundational technology for Apple Siri, working on early language models embedded in agents to solve complex natural language problems. In this interview with our host, Daniel Bogdanoff, Savoie highlights the evolution of AI into specialized systems: like an orchestra, small, task-specific models working in ensembles are more effective than large, monolithic ones. He also shares how AI is transforming the automotive, motorsports, and grid management industries. Savoie recounts his experiences at Nissan with predictive battery analytics and at Andretti Autosport, where AI-driven simulations optimize race strategies. Savoie warned about the potential misuse of AI and big data, advocating for ethical considerations, especially around privacy and government control. Despite these challenges, he remains optimistic about AI's potential, expressing a desire for tools to handle complex personal organization tasks, such as multi-modal time and travel management.

High Agency: The Podcast for AI Builders
How Graphite's $50M Series B is Transforming AI Code Review

High Agency: The Podcast for AI Builders

Play Episode Listen Later May 20, 2025 43:15


Merrill Lutsky, co-founder and CEO of Graphite, discusses their evolution from stack diff workflows to Diamond, an AI code review agent that just helped secure their $50M Series B. He shares insights on building reliable AI review systems, why over-generating and pruning comments works better than single responses, and the shift from RAG to agentic code browsing. Merrill offers a provocative vision where developers define requirements and AI agents build the code, potentially eliminating traditional IDE coding. This episode provides valuable perspectives on how AI is fundamentally reshaping software development workflows and engineering roles.

Chapters:
• 00:00 - Introduction and Graphite overview
• 01:58 - Evolution from stack diffs to AI review
• 07:39 - Diamond: The AI code reviewer explained
• 10:13 - Human vs AI review: Finding the balance
• 11:44 - Engineering challenges of reliable AI review
• 17:38 - Over-generate and prune: A winning strategy
• 24:49 - From RAG to code browser agents
• 28:12 - The bitter lesson of AI engineering
• 30:48 - The future of software engineering
• 37:33 - Is AI over or under-hyped?

AWS for Software Companies Podcast
Ep099: Marketing Transformed: Reimagining Advertising and MarTech with Amazon Bedrock

AWS for Software Companies Podcast

Play Episode Listen Later May 13, 2025 28:08


Executive leaders from UneeQ and Zeta Global discuss the revolutionary impact of AI technologies that enable enhanced customer experiences and improved sales performances.

Topics Include:
• Dave Cristini introduces panel on AI in advertising and marketing.
• Panel explores personalized experiences at scale with privacy focus.
• UneeQ creates AI-powered digital humans for brand interactions.
• Zeta Global uses AI to optimize customer messaging.
• LLMs combined with traditional ML empowers marketers to create models.
• Marketers can now build models without needing data scientists.
• AI agents integrated into systems can take action, not just respond.
• Agent chaining orchestrates sophisticated marketing actions automatically.
• AWS Bedrock provides tools to shape AI marketing future.
• Hyper-personalization becoming more achievable through AI automation.
• Ethics requires authenticity in brand AI representation.
• Transparency about data usage builds customer trust.
• Win-win approach: AI should augment teams, not just reduce costs.
• Integration difficulties remain a major challenge for AI implementation.
• AI agents have limited context windows and memory.
• Solution: Create specialized agents with persistent viewpoints.
• Companies need strong integration capabilities before implementing AI.
• Privacy regulations impact AI use in global marketing.
• Highly regulated industries require careful AI implementation strategies.
• Generative AI creates compliance challenges with unpredictable outputs.
• Digital humans eliminate judgment, revealing new customer insights.
• Banking clients discovered customers didn't understand financial terminology.
• Zeta improved onboarding with AI agents for data mapping.
• AI data mapping increased NPS scores and accelerated monetization.
• CMOs and CIOs increasingly collaborating on AI initiatives.
• Tension exists between marketing (quick wins) and IT (security).
• Strategic alignment with approved infrastructure enables scaling AI solutions.
• CEOs have critical role in aligning AI goals across departments.
• Internal AI use case: practicing sales with digital humans.
• Sales teams achieved 500% higher sales through AI role-playing.

Participants:
• Danny Tomsett – Chief Executive Officer, UneeQ
• Roman Gun – Vice President, Product, Zeta Global
• David Cristini – Director, ISV Sales, North America – Business Applications, AWS

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

High Agency: The Podcast for AI Builders
The End of Language-Only Models l Amit Jain, Luma AI

High Agency: The Podcast for AI Builders

Play Episode Listen Later May 13, 2025 40:17


This week Raza is joined by Amit Jain, CEO and co-founder of Luma AI, to explore why the future of artificial intelligence lies beyond language. Amit shares Luma's bold mission to build world models through multimodal training and why video is the most overlooked and critical data source in AI today.

Chapters:
• 00:00 - Introduction
• 03:40 - Competing with Big AI Labs: Language vs. Multimodality
• 08:09 - Joint Training and Why Current Multimodal Models Fall Short
• 11:01 - Language is Discrete, the World is Continuous
• 14:36 - Do These Models Have World Models?
• 18:18 - Planning, Counterfactuals, and Causal Reasoning in AI
• 22:08 - Capabilities of Ray 2 and Real-World Use Cases
• 26:14 - Rethinking Video Length and Creative Workflows
• 29:18 - Solving Coherence Across Shots and Characters
• 30:00 - When Will AI Create a Feature-Length Film?
• 31:27 - What You Can Build with Luma's API Today
• 35:49 - Overlooked Ideas and Noise in the AI Industry
• 38:34 - Why Video is the Missing Link in AI

AWS for Software Companies Podcast
Ep098: From BI to Gen AI: A CTO's Journey Through Data Evolution

AWS for Software Companies Podcast

Play Episode Listen Later May 7, 2025 12:53


Ash Pembroke, Portfolio CTO of Caylent, discusses the critical balance of data accuracy in the era of Gen AI and how getting it right boosts innovation.

Topics Include:
• Ash Pembroke, Portfolio CTO of Caylent, self-identifies as a "recovering data scientist."
• Caylent is an AWS native services company.
• Data quality remains an issue despite Gen AI.
• Contrasts legalism versus mysticism in data quality.
• Legalism: accurate data when applications need it.
• Mysticism: insights that help decision-making.
• Traditional data foundations approach is being challenged weekly.
• Gen AI developments force rethinking of solution architectures.
• Teams share solutions through giant Slack threads.
• Example: Vector databases questioned after model context protocol.
• Still do traditional data assessments, but stay flexible.
• Integration and data processing constantly get abstracted.
• Data strategy equals architecture strategy equals business strategy.
• Traditional approach: standardize data across engineering teams.
• New approach: allow business users to innovate.
• Bring valuable techniques back to the organization.
• Case study: North Sea wind turbine alerts.
• Initially seen as data quality issue, revealed new predictive failure signal.
• Gen AI enables local experimentation by business users.
• Blurring lines between enterprise enablement and software building.
• BrainBox AI case study: energy optimization across buildings.
• Architecture decisions impact ability to scale products.
• Work with business edges rather than looking for patterns.
• Gen AI can process information from these working groups.
• Think about data as a product, not asset.
• Redimensionalize dependencies across your organization.
• Now's a good time to attack data quality.
• New tools help visualize complexity across organizations.

Participants:
• Ash Pembroke – Portfolio CTO, Caylent

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

AWS re:Think Podcast
Episode 40: AI Observability and Evaluation with Arize AI

AWS re:Think Podcast

Play Episode Listen Later May 7, 2025 39:04


AI can still sometimes hallucinate and give less than optimal answers. To address this, we are joined by Arize AI's Co-Founder, Aparna Dhinakaran, for a discussion on observability and evaluation for AI. We begin by discussing the challenges of AI observability and evaluation. For example, how does “LLM as a Judge” work? We conclude with some valuable advice from Aparna for first-time entrepreneurs.

Begin observing and evaluating your AI applications with open source Phoenix: https://phoenix.arize.com/

AWS Hosts: Nolan Chen & Malini Chatterjee
Email your feedback: rethinkpodcast@amazon.com

Title Agents Podcast
How Title Agents Can Help Realtors Sell Smarter with AI Video with Ajay Shah

Title Agents Podcast

Play Episode Listen Later Apr 29, 2025 39:46


From his accidental start in interactive TV to founding ListReports and now launching Ethica, Ajay Shah shares the strategic pivots, industry insights, and technical breakthroughs that powered his AI-video platform. Explore how animation, data-driven infographics, and voice-generated commentary combine to create listing videos that win more listings, streamline workflows, and redefine client expectations.

What you'll learn from this episode:
• The transformative power of AI in professional property marketing for agents
• Why title professionals should become “AI Sherpas” for their clients
• How generative AI tools like Ethica are transforming listing videos
• Ways to future-proof your real estate business amid rapid tech disruption
• Practical advice for adopting AI without getting overwhelmed

Resources mentioned in this episode:
• Highway

About Ajay Shah:
Ajay combines his extensive experience in developing technology solutions for consumers and businesses with a proven track record of leading international teams. His mission is to transform the way the residential real estate industry operates. Through his leadership, his team is building an industry-impacting business designed to generate billions of dollars in value for shareholders while creating significant opportunities for ListReports' employees, partners, and customers. With a diverse background spanning successful ventures in technology, SaaS, and entertainment startups, Ajay leverages nearly two decades of experience across multiple industries to solve complex problems and unlock new business opportunities. His ultimate goal is to create businesses that drive positive, meaningful change in the world.

Connect with Ajay:
• Website: Ethica AI

Connect With Us:
Love what you're hearing? Don't miss an episode! Follow us on our social media channels and stay connected.
• Explore more on our website: www.alltechnational.com/podcast
• Stay updated with our newsletter: www.mochoumil.com
• Follow Mo on LinkedIn: Mo Choumil

Stop waiting on underwriter emails or callbacks—TitleGPT.ai gives you instant, reliable answers to your title questions. Whether it's underwriting, compliance, or tricky closings, the information you need is just a click away. No more delays—work smarter, close faster. Try it now at www.TitleGPT.ai.

Closing more deals starts with more appointments. At Alltech National Title, our inside sales team works behind the scenes to fill your pipeline, so you can focus on building relationships and closing business. No more cold calling—just real opportunities. Get started at AlltechNationalTitle.com.

Extra hands without extra overhead—that's Safi Virtual. Our trained virtual assistants specialize in the title industry, handling admin work, client communication, and data entry so you can stay focused on closing deals. Scale smarter and work faster at SafiVirtual.com.

AWS for Software Companies Podcast
Ep097: Specialized Agents & Agentic Orchestration - New Relic and the Future of Observability

AWS for Software Companies Podcast

Play Episode Listen Later Apr 28, 2025 29:04


New Relic's Head of AI and ML Innovation, Camden Swita, discusses their four-cornered AI strategy and envisions a future of "agentic orchestration" with specialized agents.

Topics Include:
• Introduction of Camden Swita, Head of AI at New Relic.
• New Relic invented the observability space for monitoring applications.
• Started with Java workloads monitoring and APM.
• Evolved into full-stack observability with infrastructure and browser monitoring.
• Uses advanced query language (NRQL) with time series database.
• AI strategy focuses on AI ops for automation.
• First cornerstone: Intelligent detection capabilities with machine learning.
• Second cornerstone: Incident response with generative AI assistance.
• Third cornerstone: Problem management with root cause analysis.
• Fourth cornerstone: Knowledge management to improve future detection.
• Initially overwhelmed by "ocean of possibilities" with LLMs.
• Needed narrow scope and guardrails for measurable progress.
• Natural language to NRQL translation proved immensely complex.
• Selecting from thousands of possible events caused accuracy issues.
• Shifted from "one tool" approach to many specialized tools.
• Created routing layer to select right tool for each job.
• Evaluation of NRQL is challenging even when syntactically correct.
• Implemented multi-stage validation with user confirmation step.
• AWS partnership involves fine-tuning models for NRQL translation.
• Using Bedrock to select appropriate models for different tasks.
• Initially advised prototyping on biggest, best available models.
• Now recommends considering specialized, targeted models from start.
• Agent development platforms have improved significantly since beginning.
• Future focus: "Agentic orchestration" with specialized agents.
• Envisions agents communicating through APIs without human prompts.
• Integration with AWS tools like Amazon Q.
• Industry possibly plateauing in large language model improvements.
• Increasing focus on inference-time compute in newer models.
• Context and quality prompts remain crucial despite model advances.
• Potential pros and cons to inference-time compute approach.

Participants:
• Camden Swita – Head of AI & ML Innovation, Product Management, New Relic

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Play the King & Win the Day!
Episode 29- Ethical AI, Human Connection & the Future of Innovation with Noah Kenney

Play the King & Win the Day!

Play Episode Listen Later Apr 25, 2025 64:55


In this thought-provoking episode of Play the King Win the Day, host Brad Banyas sits down with Noah Kenney, an AI research and development expert, ethical AI strategist, entrepreneur, and founder of Disruptive AI Lab. Together, they explore the transformative impact of artificial intelligence (AI) across major industries—including healthcare, finance, and transportation.

Noah Kenney shares expert insights on the future of autonomous vehicles, the role of AI in medical diagnostics, and the growing challenge of misinformation and algorithmic bias. The discussion dives into the ethical complexities of AI-generated content, intellectual property in the age of generative tools, and the global push for standardized, human-centered AI governance.

This episode also introduces the Global Artificial Intelligence Framework (GAIF)—a pioneering set of guidelines designed to ensure AI is developed and deployed responsibly.

Noah Kenney offers fresh insights into AI that will broaden your perspective—whether you're in technology, business, policy, or simply curious.

Play the King, Win the Day! Wisdom to power your success.

AWS for Software Companies Podcast
Ep096: Navigating Cloud Marketplaces: How Suger is Streamlining Software Distribution

AWS for Software Companies Podcast

Play Episode Listen Later Apr 22, 2025 15:53


Jon Yoo, CEO of Suger, shares how his company automates the complex & challenging workflows of selling software through cloud marketplaces like AWS.

Topics Include:
• Jon Yoo is co-founder/CEO of Suger.
• Suger automates B2B marketplace workflows.
• Handles listing, contracts, offers, billing for marketplaces like AWS.
• Co-founder previously led Confluent's marketplace enablement product.
• Confluent had 40-50% revenue through cloud marketplaces.
• Required 10-20 engineers working solely on marketplace integration.
• Engineers prefer core product work over marketplace integration.
• Product/engineering leaders struggle with marketplace deployment requirements.
• Marketplace customers adopt without marketing, creating unexpected management needs.
• Version control is challenging for marketplace-deployed products.
• License management through marketplace creates engineering challenges.
• Suger helps sell, resell, co-sell through AWS Marketplace.
• Marketplace integration isn't one-time; requires ongoing maintenance.
• Business users constantly request marketplace automation features.
• Suger works with Snowflake, Intel, and AI startups.
• Data security concerns drive self-hosted AI deployments.
• AI products increasingly deploy via AMI/container solutions.
• AI products use usage-based pricing, not seat-based.
• Usage-based pricing creates complex billing challenges.
• AI products are tested at unprecedented rates.
• Two deployment options: vendor cloud or customer cloud.
• SaaS requires reporting usage to marketplace APIs.
• Customer-hosted deployment simplifies some billing aspects.
• Marketplaces need integration with ERP systems.
• Version control particularly challenging for AI products.
• Companies need automated updates for marketplace-deployed products.
• License management includes scaling up/down and expiration handling.
• Suger aims to integrate with GitHub for automatic updates.

Participants:
• Jon Yoo – CEO and Co-founder, Suger

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

The .NET Core Podcast
Google Gemini in .NET: The Ultimate Guide with Jochen Kirstaetter

The .NET Core Podcast

Play Episode Listen Later Apr 18, 2025 55:01


RJJ Software's Software Development Service
This episode of The Modern .NET Show is supported, in part, by RJJ Software's Software Development Services. Whether your company is looking to elevate its UK operations or reshape its US strategy, we can provide tailored solutions that exceed expectations.

Show Notes
"So on my side it was actually, the interesting experience was that I kind of used it one way, because it was mainly about reading the Python code, the JavaScript code, and, let's say like, the Go implementations, trying to understand what are the concepts, what are the ways about how it has been implemented by the different teams. And then, you know, switching mentally into the other direction of writing than the code in C#."— Jochen Kirstaetter

Welcome friends to The Modern .NET Show; the premier .NET podcast, focusing entirely on the knowledge, tools, and frameworks that all .NET developers should have in their toolbox. We are the go-to podcast for .NET developers worldwide, and I am your host: Jamie “GaProgMan” Taylor.

In this episode, Jochen Kirstaetter joined us to talk about his .NET SDK for interacting with Google's Gemini suite of LLMs. Jochen tells us that he started his journey by looking at the existing .NET SDK, which didn't seem right to him, and wrote his own using the HttpClient and HttpClientFactory classes and REST.

"I provide a test project with a lot of tests. And when you look at the simplest one, is that you get your instance of the Generative AI type, which you pass in either your API key, if you want to use it against Google AI, or you pass in your project ID and location if you want to use it against Vertex AI. Then you specify which model that you like to use, and you specify the prompt, and the method that you call is then GenerateContent and you get the response back. So effectively with four lines of code you have a full integration of Gemini into your .NET application."— Jochen Kirstaetter

Along the way, we discuss the fact that Jochen had to look into the Python, JavaScript, and even Go SDKs to get a better understanding of how his .NET SDK should work. We discuss the “Pythonistic .NET” and “.NETy Python” code that developers can accidentally end up writing, if they're not careful when moving from .NET to Python and back. And we also talk about Jochen's use of tests as documentation for his SDK.

Anyway, without further ado, let's sit back, open up a terminal, type in `dotnet new podcast` and we'll dive into the core of Modern .NET.

Supporting the Show
If you find this episode useful in any way, please consider supporting the show by either leaving a review (check our review page for ways to do that), sharing the episode with a friend or colleague, buying the host a coffee, or considering becoming a Patron of the show.
Full Show Notes
The full show notes, including links to some of the things we discussed and a full transcription of this episode, can be found at: https://dotnetcore.show/season-7/google-gemini-in-net-the-ultimate-guide-with-jochen-kirstaetter/

Jason's Links:
• JoKi's MVP Profile
• JoKi's Google Developer Expert Profile
• JoKi's website

Other Links:
• Generative AI for .NET Developers with Amit Bahree
• curl
• Noda Time with Jon Skeet
• Google Cloud samples repo on GitHub
• Google's Gemini SDK for Python
• Google's Gemini SDK for JavaScript
• Google's Gemini SDK for Go
• Vertex AI
• JoKi's base NuGet package: Mscc.GenerativeAI
• JoKi's NuGet package: Mscc.GenerativeAI.Google
• System.Text.Json
• gcloud CLI
• .NET Preprocessor directives
• .NET Target Framework Monikers
• QUIC protocol
• IAsyncEnumerable
• Microsoft.Extensions.AI

Supporting the show:
• Leave a rating or review
• Buy the show a coffee
• Become a patron

Getting in Touch:
• Via the contact page
• Joining the Discord

Remember to rate and review the show on Apple Podcasts, Podchaser, or wherever you find your podcasts, this will help the show's audience grow. Or you can just share the show with a friend. And don't forget to reach out via our Contact page. We're very interested in your opinion of the show, so please get in touch. You can support the show by making a monthly donation on the show's Patreon page at: https://www.patreon.com/TheDotNetCorePodcast.

Music created by Mono Memory Music, licensed to RJJ Software for use in The Modern .NET Show

Wise Decision Maker Show
#315: Gen AI Helps Our Staff Delight Our Clients: Andrew Min, SVP, Strategy & Digital Initiatives, RXR

Wise Decision Maker Show

Play Episode Listen Later Apr 17, 2025 28:00


In this episode of the Wise Decision Maker Show, Dr. Gleb Tsipursky speaks to Andrew Min, SVP, Strategy & Digital Initiatives at RXR, about how Gen AI helps their staff delight clients. You can learn about RXR at https://rxr.com/

On Cloud
How AI can help make transportation safer and smarter

On Cloud

Play Episode Listen Later Apr 16, 2025 23:48


How can AI make streets safer and smarter? With AI, predictive analytics, and human-centered design that combines technology with collaboration.

AWS for Software Companies Podcast
Ep095: AI and Cybersecurity - How SentinelOne Is Changing the Game

AWS for Software Companies Podcast

Play Episode Listen Later Apr 16, 2025 15:20


SentinelOne's Ric Smith shares how Purple AI, built on Amazon Bedrock, helps security teams handle increasing threat volumes while facing budget constraints and talent shortages.

Topics Include:
• Introduction of Ric Smith, President of Product Technology and Operations
• SentinelOne overview: cybersecurity company focused on endpoint and data security
• Customer range: small businesses to Fortune 10 companies
• Products protect endpoints, cloud environments, and provide enterprise observability
• Ric oversees 65% of company operations
• Purple AI launched on AWS Bedrock
• Purple AI helps security teams become more efficient and productive
• Security teams face budget constraints and talent shortages
• Purple AI helps teams manage increasing alert volumes
• Top security challenge: increased malware variants through AI
• AI enables more convincing spear-phishing attempts
• Identity breaches through social engineering are increasing
• Voice deepfakes used to bypass security protocols
• Future threats: autonomous AI agents conducting orchestrated attacks
• SentinelOne helps with productivity and advanced detection capabilities
• SentinelOne primarily deployed on AWS infrastructure
• Using SageMaker and Bedrock for AI capabilities
• Best practice: find partners for AI training and deployment
• Customer insight: Purple AI made teams more confident and creative
• AI frees security teams from constant anxiety
• SentinelOne's hyper-automation handles cascading remediation tasks
• Multiple operational modes: fully automated or human-in-the-loop
• Agent-to-agent interactions expected within 24 months
• Common misconception: generative AI is infallible
• AI helps with "blank slate problem" providing starting frameworks
• AI content still requires human personalization and review
• AWS partnership provides cost efficiency and governance benefits

Participants:
• Ric Smith – President – Product, Technology and Operations, SentinelOne

See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Oracle University Podcast
AI-Assisted Development in Oracle APEX

Oracle University Podcast

Play Episode Listen Later Apr 15, 2025 12:57


Get ready to explore how generative AI is transforming development in Oracle APEX. In this episode, hosts Lois Houston and Nikita Abraham are joined by Oracle APEX experts Apoorva Srinivas and Toufiq Mohammed to break down the innovative features of APEX 24.1. Learn how developers can use APEX Assistant to build apps, generate SQL, and create data models using natural language prompts.   Oracle APEX: Empowering Low Code Apps with AI: https://mylearn.oracle.com/ou/course/oracle-apex-empowering-low-code-apps-with-ai/146047/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Nikita: Welcome back to another episode of the Oracle University Podcast! I'm Nikita Abraham, Team Lead of Editorial Services with Oracle University, and I'm joined by Lois Houston, Director of Innovation Programs. Lois: Hi everyone! In our last episode, we spoke about Oracle APEX and AI. We covered the data and AI -centric challenges businesses are up against and explored how AI fits in with Oracle APEX. Niki, what's in store for today? Nikita: Well, Lois, today we're diving into how generative AI powers Oracle APEX. With APEX 24.1, developers can use the Create Application Wizard to tell APEX what kind of application they want to build based on available tables. Plus, APEX Assistant helps create, refine, and debug SQL code in natural language. 01:16 Lois: Right. Today's episode will focus on how generative AI enhances development in APEX. We'll explore its architecture, the different AI providers, and key use cases. Joining us are two senior product managers from Oracle—Apoorva Srinivas and Toufiq Mohammed. Thank you both for joining us today. We'll start with you, Apoorva. Can you tell us a bit about the generative AI service in Oracle APEX? Apoorva: It is nothing but an abstraction to the popular commercial Generative AI products, like OCI Generative AI, OpenAI, and Cohere. APEX makes use of the existing REST infrastructure to authenticate using the web credentials with Generative AI Services. Once you configure the Generative AI Service, it can be used by the App Builder, AI Assistant, and AI Dynamic Actions, like Show AI Assistant and Generate Text with AI, and also the APEX_AI PL/SQL API. You can enable or disable the Generative AI Service on the APEX instance level and on the workspace level. 02:31 Nikita: Ok. Got it. So, Apoorva, which AI providers can be configured in the APEX Gen AI service? Apoorva: First is the popular OpenAI. If you have registered and subscribed for an OpenAI API key, you can just enter the API key in your APEX workspace to configure the Generative AI service. APEX makes use of the chat completions endpoint in OpenAI. Second is the OCI Generative AI Service. Once you have configured an OCI API key on Oracle Cloud, you can make use of the chat models. The chat models are available from Cohere family and Meta Llama family.  The third is the Cohere. The configuration of Cohere is similar to OpenAI. 
You need to have your Cohere API key. And it provides a similar chat functionality using the chat endpoint.  03:29 Lois: What is the purpose of the APEX_AI PL/SQL public API that we now have? How is it used within the APEX ecosystem? Apoorva: It models the chat operation of the popular Generative AI REST Services. This is the same package used internally by the chat widget of the APEX Assistant. There are more procedures around consent management, which you can configure using this package. 03:58 Lois: Apoorva, at a high level, how does generative AI fit into the APEX environment? Apoorva: APEX makes use of the existing REST infrastructure—that is the web credentials and remote server—to configure the Generative AI Service. The inferencing is done by the backend Generative AI Service. For the Generative AI use case in APEX, such as NL2SQL and creation of an app, APEX performs the prompt enrichment.  04:29 Nikita: And what exactly is prompt enrichment? Apoorva: Let's say you provide a prompt saying "show me the average salary of employees in each department." APEX will take this prompt and enrich it by adding in more details. It elaborates on the prompt by mentioning the requirements, such as Oracle SQL syntax statement, and providing some metadata from the data dictionary of APEX. Once the prompt enrichment is complete, it is then passed on to the LLM inferencing service. Therefore, the SQL query provided by the AI Assistant is more accurate and in context.  05:15 Unlock the power of AI Vector Search with our new course and certification. Get more accurate search results, handle complex datasets easily, and supercharge your data-driven decisions. From now to May 15, 2025, we are waiving the certification exam fee (valued at $245). Visit mylearn.oracle.com to enroll. 05:41  Nikita: Welcome back! Let's talk use cases. Apoorva, can you share some ways developers can use generative AI with APEX? Apoorva: SQL is an integral part of building APEX apps. You use SQL everywhere. You can make use of the NL2SQL feature in the code editor by using the APEX Assistant to generate SQL queries while building the apps. The second is the prompt-based app creation. With APEX Assistant, you can now generate fully functional APEX apps by providing prompts in natural language. Third is the AI Assistant, which is a chat widget provided by APEX in all the code editors and for creation of apps. You can chat with the AI Assistant by providing your prompts and get responses from the Generative AI Services. 06:37 Lois: Without getting too technical, can you tell us how to create a data model using AI? Apoorva: A SQL Workshop utility called Create Data Model Using AI uses AI to help you create your own data model. The APEX Assistant generates a script to create tables, triggers, and constraints in either Oracle SQL or Quick SQL format. You can also insert sample data into these tables. But before you use this feature, you must create a generative AI service and enable the Used by App Builder setting. If you are using the Oracle SQL format, when you click on Create SQL Script, APEX generates the script and brings you to this script editor page. Whereas if you are using the Quick SQL format, when you click on Review Quick SQL, APEX generates the Quick SQL code and brings you to the Quick SQL page. 07:39 Lois: And to see a detailed demo of creating a custom data model with the APEX Assistant, visit mylearn.oracle.com and search for the "Oracle APEX: Empowering Low Code Apps with AI" course. 
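To make the prompt-enrichment flow described above concrete, here is a minimal Python sketch of the general pattern: the user's natural-language question is wrapped with syntax requirements and schema metadata before it is sent to a chat completions endpoint. This illustrates the idea only and is not APEX's internal code — the table metadata, model name, and helper functions are assumptions made for the example.

```python
import os
import requests

# Illustrative schema metadata; in APEX this would come from the data dictionary.
SCHEMA_METADATA = """
Table EMPLOYEES(EMPLOYEE_ID, FIRST_NAME, LAST_NAME, SALARY, DEPARTMENT_ID)
Table DEPARTMENTS(DEPARTMENT_ID, DEPARTMENT_NAME)
"""

def enrich_prompt(user_prompt: str) -> list[dict]:
    """Wrap the raw user question with syntax requirements and schema context."""
    system = (
        "You generate a single Oracle SQL SELECT statement. "
        "Use only the tables and columns listed below. "
        "Return SQL only, with no explanation.\n" + SCHEMA_METADATA
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_prompt},
    ]

def nl2sql(user_prompt: str) -> str:
    """Send the enriched prompt to a chat completions endpoint and return the SQL."""
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={"model": "gpt-4o-mini",  # illustrative model choice
              "messages": enrich_prompt(user_prompt)},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(nl2sql("show me the average salary of employees in each department"))
```

Because the schema and the "Oracle SQL only" requirement travel with every request, the model's reply comes back already scoped to the workspace's tables — which is why an enriched prompt yields more accurate, in-context SQL than the raw question would.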
Apoorva, what about creating an APEX app from a prompt. What's that process like? Apoorva: APEX 24.1 introduces a new feature where you can generate an application blueprint based on a prompt using natural language. The APEX Assistant leverages the APEX Dictionary Cache to identify relevant tables while suggesting the pages to be created for your application. You can iterate over the application design by providing further prompts using natural language and then generating an application based on your needs. Once you are satisfied, you can click on Create Application, which takes you to the Create Application Wizard in APEX, where you can further customize your application, such as application icon and other features, and finally, go ahead to create your application. 08:53 Nikita: Again, you can watch a demo of this on MyLearn. So, check that out if you want to dive deeper.  Lois: That's right, Niki. Thank you for these great insights, Apoorva! Now, let's turn to Toufiq. Toufiq, can you tell us more about the APEX Assistant feature in Oracle APEX. What is it and how does it work? Toufiq: APEX Assistant is available in Code Editors in the APEX App Builder. It leverages generative AI services as the backend to answer your questions asked in natural language. APEX Assistant makes use of the APEX dictionary cache to identify relevant tables while generating SQL queries. Using the Query Builder mode enables Assistant. You can generate SQL queries from natural language for Form, Report, and other region types which support SQL queries. Using the general assistance mode, you can generate PL/SQL JavaScript, HTML, or CSS Code, and seek further assistance from generative AI. For example, you can ask the APEX Assistant to optimize the code, format the code for better readability, add comments, etc. APEX Assistant also comes with two quick actions, Improve and Explain, which can help users improve and understand the selected code. 10:17 Nikita: What about the Show AI Assistant dynamic action? I know that it provides an AI chat interface, but can you tell us a little more about it?  Toufiq: It is a native dynamic action in Oracle APEX which renders an AI chat user interface. It leverages the generative AI services that are configured under Workspace utilities. This AI chat user interface can be rendered inline or as a dialog. This dynamic action also has configurable system prompt and welcome message attributes.  10:52 Lois: Are there attributes you can configure to leverage even more customization?  Toufiq: The first attribute is the initial prompt. The initial prompt represents a message as if it were coming from the user. This can either be a specific item value or a value derived from a JavaScript expression. The next attribute is use response. This attribute determines how the AI Assistant should return responses. The term response refers to the message content of an individual chat message. You have the option to capture this response directly into a page item, or to process it based on more complex logic using JavaScript code. The final attribute is quick actions. A quick action is a predefined phrase that, once clicked, will be sent as a user message. Quick actions defined here show up as chips in the AI chat interface, which a user can click to send the message to Generative AI service without having to manually type in the message. 12:05 Lois: Thank you, Toufiq and Apoorva, for joining us today. 
Like we were saying, there's a lot more you can find in the “Oracle APEX: Empowering Low Code Apps with AI” course on MyLearn. So, make sure you go check that out. Nikita: Join us next week for a discussion on how to integrate APEX with OCI AI Services. Until then, this is Nikita Abraham… Lois: And Lois Houston signing off! 12:28 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.  

AWS for Software Companies Podcast
Ep094: The DEX Factor – How Nexthink is Eliminating IT Headaches Before They Happen

AWS for Software Companies Podcast

Play Episode Listen Later Apr 14, 2025 32:41


Sam Gantner, Chief Product Officer of Nexthink, reveals how DEX is moving IT from reactive firefighting to proactive problem prevention and transforming enterprise productivity.Topics Include:DEX stands for Digital Employee ExperienceDEX eliminates IT issues preventing employee productivityShifts IT from reactive to proactive problem-solvingEmployees often serve as IT problem alerting systemsBest IT is transparent to employeesDEX solves device sluggishness and slow application issuesNetwork problems consistently appear across organizationsIT teams often lack visibility into employee experiencesMany organizations waste money on unused software licensesDEX Score measures comprehensive employee IT experienceSurveys capture subjective aspects of technology experienceReduction of actual problems differs from ticket reductionNexthink uses lightweight agents on employee devicesBrowser monitoring essential as browsers become application platformsEmployee engagement metrics capture real-time feedbackNexthink rebuilt as cloud-native platform using AWS servicesCompany deploys across 10+ global AWS regions30% of engineering resources dedicated to AI developmentOne customer eliminated 50% of IT ticketsAnother recovered 37,000 productivity hours worth $3M annuallyA third saved $1.3M by identifying unused licensesAI implementation requires dedicated employee trainingGood AI now better than perfect AI neverTechnology adoption is the next DEX frontierDigital dexterity becoming critical for maximizing IT investmentsParticipants:Samuele Gantner – Chief Product Officer, NexthinkSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep093: Forrester's Vision: Linda Ivy-Rosser on the Evolution and Future of Business Applications

AWS for Software Companies Podcast

Play Episode Listen Later Apr 11, 2025 44:37


Linda Ivy-Rosser, Vice President for Forrester, outlines the evolution of business applications and forward thinking predictions of their future.Topics Include:Linda Ivy-Rosser has extensive business applications experience since the 1990s.Business applications historically seen as rigid and lethargic.1990s: On-premise software with limited scale and flexibility.2000s: SaaS emergence with Salesforce, AWS, and Azure.2010s: Mobile-first applications focused on accessibility.Present: AI-driven applications characterize the "AI economy."Purpose of applications evolved from basic to complex capabilities.User expectations grew from friendly interfaces to intelligent systems.Four agreements: AI-infused, composable, cloud-native, ecosystem-driven.AI-infused: 69% consider essential/important in vendor selection.Composability expected to grow in importance with API architectures.Cloud-native: 79% view as foundation for digital transformation.Ecosystem-driven: 68% recognize importance of strategic alliances.Challenges: integration, interoperability, data accessibility, user adoption.43% prioritizing cross-functional workflow and data accessibility capabilities.Tech convergence recycles as horizontal strategy for software companies.Data contextualization crucial for employee adoption of intelligent applications.Explainable AI necessary to build trust in recommendations.Case study: 83% of operators rejected AI recommendations without explanations.Tulip example demonstrated three of four agreements successfully.Software giants using strategic alliances as competitive advantage.AWS offers comprehensive AI infrastructure, platforms, models, and services.Salesforce created ecosystem both within and outside their platform.SaaS marketplaces bridge AI model providers and businesses.Innovation requires partnerships between software vendors and ISVs.Enterprises forming cohorts with startups to solve business challenges.Software supply chain transparency increasingly important.Government sector slower to adopt cloud and AI technologies.Change resistance remains significant challenge for adoption.69% prioritize improving innovation capability over next year.Participants:Linda Ivy-Rosser - Vice President, Enterprise Software, IT services and Digital Transformation Executive Portfolio, ForresterSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep092: The Evolution of Monitoring: How New Relic is Transforming Cloud Operations

AWS for Software Companies Podcast

Play Episode Listen Later Apr 9, 2025 16:02


New Relic's Chief Customer Officer Arnaldo (Arnie) Lopez details how their observability platform helps 70,000+ customers monitor cloud performance through AWS infrastructure while introducing AI capabilities that simplify operations.Topics Include:Arnie Lopez is SVP, Chief Customer Officer at New Relic.Oversees pre-sales, post-sales, technical support, and enablement teams.New Relic University offers customer certifications.Founded in 2008, pioneered application performance monitoring (APM).Now offers "Observability 3.0" for full-stack visibility.Prevents interruptions during cloud migration and operations.Serves 70,000+ customers across various industries.16,000 enterprise-level paying customers.Platform consolidates multiple monitoring tools into one solution.Helps detect issues before customers experience performance problems.Market challenge: customers using disparate observability solutions.Reduces TCO by eliminating multiple monitoring tools.Targets VPs, CTOs, CIOs, and sometimes CEOs.Decade-long partnership with AWS.Platform built on largest unified telemetry data cloud.Uses AWS Graviton instances and Amazon EKS.AWS partnership enables innovation and customer trust.Three AI approaches: user assistance, LLM monitoring, faster insights.New Relic AI helps write query language (NURCLs).Monitors LLMs in customer environments.Uses AI to accelerate incident resolution.Lesson learned: should have started AI implementation sooner.Many customers still cautiously adopting AI technologies.Goal: continue growth with AWS partnership.Offers compute-based pricing model.Customers only pay for what they use.Announced one-step AWS monitoring for enterprise scale.Amazon Q Business and New Relic AI integration.Agent-to-agent AI eliminates data silos.Embeds performance insights into business application workflows.Participants:Arnie Lopez – SVP Chief Customer Officer, New RelicSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep091: The Business of Better Decisions: PDI's Journey in Retail and Energy Tech

AWS for Software Companies Podcast

Play Episode Listen Later Apr 7, 2025 29:45


PDI Technologies' Steve Antonakakis shares how his company is implementing generative AI across their fuel and retail technology ecosystem through a practical, customer-focused approach using Amazon Bedrock.Topics Include:PDI's COO/CTO discussing generative AI implementationPractical step-by-step approach to AI integrationTesting in real-world settings with customer feedbackAWS Bedrock and Nova models exceeded expectationsEarly adoption phase with huge potentialFuel/retail industry processes many in-person transactionsPDI began in 1983 as ERP providerGrew through 33+ acquisitionsProvides end-to-end fuel industry solutionsOwns GasBuddy and Shell Fuel RewardsProcesses millions of transactions dailyGenerative AI fits into their intelligence plane architectureAWS Bedrock integrates well with existing infrastructureFocus on trusted, controlled, accountable AIProductizing AI features harder than traditional featuresCreated entrepreneurial structure alongside regular product teamsTeam designed to fail fast but stay customer-focusedAI features can access databases without disrupting applicationsCustomers want summarization across different business areasAI provides insights and actionable recommendationsConversational AI replaces traditional reporting limitationsWorking closely with customers to solve problems togetherBeyond prototyping phase, now in implementationAWS Nova provides excellent cost-to-value ratioFocus on measuring customer value over immediate profitabilityRFP use case saved half a million dollarsEarly prompts were massive, now more structuredSetting realistic customer expectations is importantData security approach same as other applicationsTreating AI outputs with same data classification standardsParticipants:Steve Antonakakis – COO & CTO, Retail & Energy, PDI TechnologiesSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep090: Triple-Digit Growth - How Leading Partners Successfully Leverage AWS Marketplace

AWS for Software Companies Podcast

Play Episode Listen Later Apr 3, 2025 48:21


AWS partners Braze, Qualtrics, and Tealium share strategies for marketplace success, vertical industry expansion, and generative AI integration that have driven significant business growth. Topics Include:Jason Warren introduces AWS Business Application Partnerships panel.Three key topics: Marketplace Strategy, Vertical Expansion, Gen-AI Integration.Alex Rees of Braze, Matthew Gray of Tealium, and Jason Mann of Qualtrics join discussion.Braze experienced triple-digit percentage growth through AWS Marketplace.Braze dedicating resources specifically to Marketplace procurement.Tealium accelerated deal velocity by listing on Marketplace.Tealium saw broader use case expansion with AWS co-selling.Qualtrics views Marketplace listing as earning a "diploma."Understanding AWS incentives and metrics is crucial.Knowing AWS "love language" helps partnership success.Braze saw transaction volume increase between Q1 and Q4.Aligning with industry verticals unlocked faster growth.Tealium sees bigger deals and faster close times.Tealium moved from transactional to strategic marketplace approach.Private offers work well for complex enterprise agreements.Qualtrics measures AWS partnership through "influence, intel, introductions."AWS relationships help navigate IT and procurement challenges.Propensity-to-buy data guides AWS engagement strategy.Marketplace strategy evolving with new capabilities and international expansion.Brazilian marketplace distribution reduces currency and tax challenges.Partnership evolution: sell first, then market, then co-innovate.Braze penetrated airline market through AWS Travel & Hospitality.RFP introductions show tangible partnership benefits.Tealium partnering with Virgin Australia and United Airlines.MUFG bank case study shows joint AWS-Tealium success.Qualtrics won awards despite not completing formal competencies.Focus on fewer verticals yields better results.Gen AI brings both opportunities and regulatory concerns.First-party data rights critical for AI implementation.AWS Bedrock integration provides security and prescriptive solutions.Participants:Alex Rees – Director Tech Partnerships, BrazeJason Mann – Global AWS Alliance Lead, QualtricsMatthew Gray - SVP, Partnerships & Alliances, TealiumJason Warren - Head of Business Applications ISV Partnerships (Americas), AWSSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep089: Enterprise AI at Scale: Box's Approach to Secure Content Intelligence

AWS for Software Companies Podcast

Play Episode Listen Later Mar 31, 2025 31:12


Yashodha Bhavnani, Head of AI at Box, reveals Box's vision for intelligent content management that transforms unstructured data into actionable insights. Topics Include:Yashodha Bhavnani leads AI products at Box.Box's mission: power how the world works together.Box serves customers globally across various industries.Works with majority of Fortune 500 companies.AI agents will join workforce for repetitive tasks.Workflows like hiring will become easily automated with AI.Content will work for users, not vice versa.Customers demand better experiences with generative AI.Box calls this shift "intelligent content management."90% of enterprise content is unstructured data.AI thrives on unstructured data.Current content systems are unproductive and unsecured.AI can generate insights from scattered company knowledge.AI extracts metadata automatically from documents like contracts.Automated workflows triggered by AI-extracted data.Box provides enterprise-grade AI connected to your content.AI follows same permissions as the content itself.Customer data never used to train AI models.AI helps classify sensitive data to prevent leaks.Box offers choice of AI models to customers.AI is seamlessly connected with customer content.Administrators control AI deployment across their organization.Partnership with AWS Bedrock brings frontier models to Box.Box supports customers using their own custom models.Box preparing for AI agents to join workforce.Introduced "AI Units" for flexible pricing.Basic AI included free with Business Plus tiers.Both horizontal and vertical multi-agent architectures planned.Working toward agent-to-agent communication protocols.Participants:Yashodha Bhavnani - VP of Product Management, AI products, BoxSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

Permanently Moved
2506: Information Age Iconoclasm

Permanently Moved

Play Episode Listen Later Mar 30, 2025 5:02 Transcription Available


Ghibli images aren't really about copyright or ethics, they're about unexamined questions of power. Who gets to make images? What gives them meaning? And what is their value when machines can produce them at scale? Full Show Notes: https://thejaymo.net/2025/03/30/2506-information-age-iconoclasm/ Experience.Computer: https://experience.computer/ Worldrunning.guide: https://worldrunning.guide/ Subscriber Zine! https://startselectreset.com/ Permanently moved is a personal podcast 301 seconds in length, written and recorded by @thejaymo Subscribe to the Podcast: https://permanentlymoved.online/

AWS for Software Companies Podcast
Ep088: Monetizing and Productizing Generative AI for SaaS with RingCentral & Zoom

AWS for Software Companies Podcast

Play Episode Listen Later Mar 27, 2025 36:30


Tech leaders from RingCentral, Zoom and AWS discuss how generative AI is transforming business communications while balancing challenges & regulatory concerns in this rapidly evolving landscape.Topics Include:Introduction of panel on generative AI's impact on businesses.How to transition AI from prototypes to production.Understanding value creation for customers through AI.Introduction of Khurram Tajji from RingCentral.Introduction of Brendan Ittleson from Zoom.How generative AI fits into Zoom's product offerings.Zoom's AI companion available to all paid customers.Zoom's federated approach to AI model selection.RingCentral's new AI Receptionist (AIR) launch.How AIR routes calls using generative AI capabilities.AI improving customer experience through sentiment analysis.The disproportionate value of real-time AI assistance.Economics of delivering real-time AI capabilities.Real-time AI compliance monitoring in banking.Value of preventing regulatory fines through AI.Voice cloning detection through AI security.Democratizing AI access across Zoom's platform.Monetizing specialized AI solutions for business value.Challenges in taking AI prototypes to production.Importance of selecting the right AI models.Privacy considerations when training AI models.Maintaining quality without using customer data for training.Co-innovation with customers during product development.Scaling challenges for AI businesses.Case study of AI in legal case assessment.Ensuring unit economics work before scaling AI applications.Zoom's approach to scaling AI across products.Importance of centralizing but federating AI capabilities.Breaking down data silos for effective AI context.Navigating evolving regulations around AI.EU AI Act restrictions on emotion inference.Balancing regulations with customer experience needs.Future of AI agents interacting with other agents.How AI enhances human connection by handling routine tasks.Impact of AI on company valuations and M&A activity.Participants:Khurram Tajji – Group CMO & Partnerships, RingCentralBrendan Ittleson – Chief Ecosystem Officer, ZoomSirish Chandrasekaran – VP of Analytics, AWSSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

On Cloud
How higher education is using AI to innovate for growth

On Cloud

Play Episode Listen Later Mar 26, 2025 30:08


Declining enrollment was a problem at WKU. The university used AI-powered analytics to optimize outreach, which helped increase enrollment and revenue.

AWS for Software Companies Podcast
Ep087: The Multi-Agent Advantage: How Sumo Logic Leverages AI for Observability

AWS for Software Companies Podcast

Play Episode Listen Later Mar 25, 2025 23:25


CEO Joe Kim shares how Sumo Logic has implemented generative AI to democratize data analytics, leveraging AWS Bedrock's multi-agent capabilities to dramatically improve accuracy.Topics Include:Introduction of Joe Kim, CEO of Sumo Logic.Question: Overview of Sumo Logic's products and customers?Sumo Logic specializes in observability and security markets.Company leverages industry-leading log management and analytics capabilities.Question: How has generative AI entered this space?Kim's background is in product, strategy and engineering.Non-experts struggle to extract value from complex telemetry data.Generative AI provides easier interface for interacting with data.Question: How do you measure success of AI initiatives?Focus on customer problems, not retrofitting AI everywhere.Launched "Mo, the co-pilot" at AWS re:Invent.Mo enables natural language queries of complex data.Mo suggests visualizations and follow-up questions during incidents.Question: What challenges did you face implementing AI?Team knew competitors would eventually implement similar capabilities.Single model approach topped out at 80% accuracy.Multi-agent approach with AWS Bedrock achieved mid-90% accuracy.Bedrock offered security benefits and multiple model capabilities.Question: How was working with the AWS team?Partnered with Bedrock team and tribe.ai for implementation.Partners helped avoid pitfalls from thousands of prior projects.Question: What advice for other software leaders?Don't implement AI just to satisfy board pressure.Identify problems without mentioning generative AI first.Innovation should come from listening to customers.Question: Future plans with AWS partnership?Moving toward automated remediation beyond just analysis.Question: Has Sumo Logic monetized generative AI?Changed pricing from data ingestion to data usage.New model encourages more data sharing without cost barriers.Participants:Joe Kim – Chief Executive Officer, Sumo LogicSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/
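The accuracy jump mentioned above came from moving to a multi-agent approach on Amazon Bedrock. As a rough sketch of what that pattern can look like — not Sumo Logic's implementation; the agent roles, prompts, and model ID below are illustrative assumptions — a small router call classifies the request and hands it to a specialist agent, with each step being a separate Bedrock Converse call:

```python
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # illustrative model choice

def ask(system_prompt: str, user_text: str) -> str:
    """One Bedrock Converse call with a role-specific system prompt."""
    resp = bedrock.converse(
        modelId=MODEL_ID,
        system=[{"text": system_prompt}],
        messages=[{"role": "user", "content": [{"text": user_text}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]

# Specialist agents: each owns a narrow task instead of one prompt doing everything.
AGENTS = {
    "query": "You translate observability questions into log search queries.",
    "visualize": "You suggest the best chart type and fields for the data question.",
    "triage": "You summarize a production incident and propose next diagnostic steps.",
}

def route(user_text: str) -> str:
    """Router agent picks which specialist should answer."""
    choice = ask(
        "Classify the request as exactly one of: query, visualize, triage. "
        "Reply with that single word.",
        user_text,
    ).strip().lower()
    return choice if choice in AGENTS else "query"

def handle(user_text: str) -> str:
    return ask(AGENTS[route(user_text)], user_text)

if __name__ == "__main__":
    print(handle("Errors spiked on the checkout service in the last hour - what should I look at?"))
```

Splitting the work this way lets each specialist carry a narrow, well-tested prompt, which is typically where the accuracy gains over a single catch-all prompt come from.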

AWS for Software Companies Podcast
Ep086: Battling Fraud in the Age of Generative AI: Socure's Mission to Secure the Internet

AWS for Software Companies Podcast

Play Episode Listen Later Mar 21, 2025 25:53


CTO Arun Kumar discusses how Socure leverages AWS and generative AI to collect billions of data points each day in order to combat sophisticated online fraud at scale.Topics Include:Introduction of Arun Kumar, CTO of SocureWhat does Socure specialize in?KYC and anti-money laundering checksMission: eliminate 100% fraud on the internetFraud has increased since COVIDSocure blocks fraud at entry pointWorks with top banks and government agenciesCTO responsibilities include product and engineeringFocus on increasing efficiency through technologyTwo goals: internal efficiency and combating fraudCountering tools like FraudGPT on dark webMeasuring success through reduced human capital needsFraud investigations reduced from hours to minutesImproved success rates in uncovering fraud ringsDetecting multi-hop connections in fraud networksQuestion: Who's winning - fraudsters or AI?It's a constant "cat and mouse game"Creating a fraud "red team" similar to cybersecurityPartnership details with AWSAmazon Bedrock provides multiple LLM optionsBuilding world's largest identity graph with NeptuneReal-time suspicious activity detectionBlocking account takeovers through phone number changesSuccess story: detecting deepfake across 3,000 IDsCollecting hundreds of data points per identityChallenges: adding selfie checks and liveness detectionFuture strategy: 10x-100x performance improvementsCreating second and third-order intelligence signalsInternal efficiency applications of generative AIAI-powered sales tools and legal document reviewParticipants:Arun Kumar – Chief Technical Officer, SocureSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

The Product Podcast
Postscript CPO on Driving Product Revenue with SMS | Chiara McPhee | E260

The Product Podcast

Play Episode Listen Later Mar 19, 2025 43:53


In this episode, Carlos Gonzalez de Villaumbrosia interviews Chiara McPhee, Chief Product Officer at Postscript, a leader in SMS marketing for e-commerce. Postscript is revolutionizing how e-commerce brands engage with customers. With over $100 million in Annual Recurring Revenue (ARR) and more than 20,000 Shopify merchants using their platform, Postscript has become a powerhouse in conversational commerce. Their SMS marketing tools have helped generate over $2 billion in e-commerce revenue for their customers annually, with open rates exceeding 90%, far outpacing traditional email marketing.In this episode, Chiara shares her insights on:* Leveraging generative AI to create personalized, one-to-one conversations that drive revenue.* How AI agents are outperforming human customer support in certain areas.* Key revenue leading indicators in SMS marketing and setting up effective attribution models.* Overcoming challenges in scaling infrastructure to achieve $100M in Annual Recurring Revenue.* The benefits of focusing exclusively on Shopify.* The "Horizon Strategy" approach to building teams tailored for Horizon 1 (cash cow), Horizon 2 (growth), and Horizon 3 (moonshots), balancing short-term wins with long-term ambitious goals in product development.In this episode, we'll explore how Postscript is leveraging cutting-edge technology to deliver personalized customer experiences, driving revenue and redefining e-commerce marketing. We'll discuss leveraging generative AI to create personalized, one-to-one conversations that drive revenue, how AI agents are outperforming human customer support in certain areas, overcoming challenges in scaling infrastructure to achieve $100M in Annual Recurring Revenue, and the benefits of focusing exclusively on Shopify. What you'll learn:* Chiara's journey to becoming CPO at Postscript and her insights on the power of SMS marketing.* How generative AI enables personalized, one-to-one conversations that drive revenue.* Key strategies for SMS marketing, including compliance, personalization, and integration with other channels.* How to structure product teams using the "Horizon Strategy" to balance short-term wins with long-term innovation. Key Takeaways:*Personalized Conversations: Chiara emphasizes the importance of leveraging generative AI to create personalized, one-to-one conversations that drive revenue.*Focus on Shopify: Chiara highlights the company's strategic decision to focus exclusively on Shopify, and the impact it had on business outcomes.*Horizon Strategy: Chiara shares the benefits of the "Horizon Strategy" approach to building product teams tailored for different stages of growth and innovation.

AWS for Software Companies Podcast
Ep085: Securing the AI Frontier: Overcoming Security Risks featuring Oron Noah of Wiz

AWS for Software Companies Podcast

Play Episode Listen Later Mar 19, 2025 12:30


Oron Noah of Wiz outlines how organizations evolve their security practices to address new vulnerabilities in AI systems through improved visibility, risk assessment, and pipeline protection.Topics Include:Introduction of Oron Noah, VP at Wiz.Wiz: largest private service security company.$1.9 billion raised from leading VCs.45% of Fortune 100 use Wiz.Wiz scans 60+ Amazon native services.Cloud introduced visibility challenges.Cloud created risk prioritization issues.Security ownership shifted from CISOs to everyone.Wiz offers a unified security platform.Three pillars: Wiz Cloud, Code, and Defend.Wiz democratizes cloud security for all teams.Security Graph uses Amazon Neptune.Wiz has 150+ available integrations.Risk analysis connects to cloud environments.Wiz identifies critical attack paths.AI assists in security graph searches.AI helps with remediation scripts.AI introduces new security challenges.70% of customers already use AI services.AI security requires visibility, risk assessment, pipeline protection.AI introduces risks like prompt injection.Data poisoning can manipulate AI results.Model vulnerabilities create attack vectors.AI Security Posture Management (ASPM) introduced.Four key questions for AI security.AI pipelines resemble traditional cloud infrastructure.Wiz researchers found real AI security vulnerabilities.Wiz AI ASPM provides agentless visibility.Supports major AI services (AWS, OpenAI, etc.).Built-in rules detect AI service misconfigurations.Participants:Oron Noah – VP Product Extensibility & Partnerships, WizSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep084: Accelerating ISV Modernization: SoftServe's Six-Month Success Formula

AWS for Software Companies Podcast

Play Episode Listen Later Mar 17, 2025 23:28


Ruslan Kusov of SoftServe presents how their Application Modernization Framework accelerates ISV modernization, assesses legacy code, and delivers modernized applications through platform engineering principles.Topics Include:Introduction of Ruslan Kusov, Cloud CoE Director at SoftServeSoftServe builds code for top ISVsSuccess case: accelerated security ISV modernization by six monthsHealthcare tech company assessment: 1.6 million code lines in weeksBusiness need: product development acceleration for competitive advantageBusiness need: intelligent operations automationBusiness need: ecosystem integration and "sizeification" to cloudBusiness need: secure and compliant solutionsBusiness need: customer-centric platforms with personalized experiencesBusiness need: AWS marketplace integrationDistinguishing intentional from unintentional complexityPlatform engineering concept introductionSelf-service internal platforms for standardizationApplying platform engineering across teams (GenAI, CSO, etc.)No one-size-fits-all approach to modernizationSAMP/SEMP framework introductionCore components: EKS, ECS, or LambdaModular structure with interchangeable componentsCase study: ISV switching from hardware to software productsFour-week MVP instead of planned ten weeksSix-month full modernization versus planned twelve monthsAssessment phase importance for business case developmentCalculating cost of doing nothing during modernization decisionsHealthcare customer case: 1.6 million code lines assessedBenefits: platform deployment in under 20 minutesBenefits: 5x reduced assessment timeBenefits: 30% lower infrastructure costsBenefits: 20% increased development productivity with GenAIIntegration with Amazon Q for developer productivityClosing Q&A on security modernization and ongoing managementParticipants:Ruslan Kusov – Cloud CoE Director, SoftserveSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep083: Navigating the AWS Bedrock Journey: Planview's AI Evolution

AWS for Software Companies Podcast

Play Episode Listen Later Mar 13, 2025 32:24


Richard Sonnenblick and Lee Rehwinkel of Planview discuss their transition to Amazon Bedrock for a multi-agent AI system while sharing valuable implementation and user experience lessons.Topics Include:Introduction to Planview's 18-month journey creating an AI co-pilot.Planview builds solutions for strategic portfolio and agile planning.5,000+ companies with millions of users leverage Planview solutions.Co-pilot vision: AI assistant sidebar across multiple applications.RAG used to ingest customer success center documents.Tracking product data, screens, charts, and tables.Incorporating industry best practices and methodologies.Can ingest customer-specific documents to understand company terminology.Key benefit: Making every user a power user.Key benefit: Saving time on tedious and redundant tasks.Key benefit: De-risking initiatives through early risk identification.Cost challenges: GPT-4 initially cost $60 per million tokens.Cost now only $1.20 per million tokens.Market evolution: AI features becoming table stakes.Performance rubrics created for different personas and applications.Multi-agent architecture provides technical and organizational scalability.Initial implementation used Azure and GPT-4 models.Migration to AWS Bedrock brought model choice benefits.Bedrock allowed optimization across cost, benchmarking, and speed dimensions.Added AWS guardrails and knowledge base capabilities.Lesson #1: Users hate typing; provide clickable options.Lesson #2: Users don't like waiting; optimize for speed.Lesson #3: Users take time to trust AI; provide auditable answers.Question about role-based access control and permissions.Co-pilot uses user authentication to access application data.Question about subscription pricing for AI features.Need to educate customers about AI's value proposition.Question about reasoning modes and timing expectations.Showing users the work process makes waiting more tolerable.Participants:Richard Sonnenblick - Chief Data Scientist, PlanviewLee Rehwinkel – Principal Data Scientist, PlanviewSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/

AWS for Software Companies Podcast
Ep082: Accelerating Profitable Growth with SaaS with DataRobot, LaunchDarkly and ServiceNow

AWS for Software Companies Podcast

Play Episode Listen Later Mar 11, 2025 62:21


Executives from DataRobot, LaunchDarkly and ServiceNow share strategies, actions and recommendations to achieve profitable growth in today's competitive SaaS landscape.Topics Include:Introduction of panelists from DataRobot, LaunchDarkly & ServiceNowServiceNow's journey from service management to workflow orchestration platform.DataRobot's evolution as comprehensive AI platform before AI boom.LaunchDarkly's focus on helping teams decouple release from deploy.Rule of 40: balancing revenue growth and profit margin.ServiceNow exceeding standards with Rule of 50-60 approach.Vertical markets expansion as key strategy for sustainable growth.AWS Marketplace enabling largest-ever deal for ServiceNow.R&D investment effectiveness through experimentation and feature management.Developer efficiency as driver of profitable SaaS growth.Competition through data-driven decisions rather than guesswork.Speed and iteration frequency determining competitive advantage in SaaS.Balancing innovation with early customer adoption for AI products.Product managers should adopt revenue goals and variable compensation.Product-led growth versus sales-led motion: strategies and frictions.Sales-led growth optimized for enterprise; PLG for practitioners.Marketplace-led growth as complementary go-to-market strategy.Customer acquisition cost (CAC) as primary driver of margin erosion.Pricing and packaging philosophy: platform versus consumption models.Value realization must precede pricing and packaging discussions.Good-better-best pricing model used by LaunchDarkly.Security as foundation of trust in software delivery.LaunchDarkly's Guardian Edition for high-risk software release scenarios.Security for regulated industries through public cloud partnerships.GenAI security: benchmarks, tests, and governance to prevent issues.M&A strategy: ServiceNow's 33 acquisitions for features, not revenue.Replatforming acquisitions into core architecture for consistent experience.Balancing technology integration with people aspects during acquisitions.Trends in buying groups: AI budgets and tool consolidation.Implementing revenue goals in product teams for new initiatives.Participants:Prajakta Damle – Head of Product / SVP of Product, DataRobotClaire Vo – Chief Product & Technology Officer, LaunchDarklyAnshuman Didwania – VP/GM, Hyperscalers Business Group, ServiceNowAkshay Patel – Global SaaS Strategist, Amazon Web ServicesSee how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon/isv/
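For readers new to the Rule of 40 referenced in the topics above: the rule of thumb says a healthy SaaS company's revenue growth rate plus its profit margin should sum to at least 40 percent. A quick sketch of the arithmetic, using made-up company figures rather than any panelist's numbers:

```python
def rule_of_40(revenue_growth_pct: float, profit_margin_pct: float) -> float:
    """Return the Rule of 40 score: growth rate plus profit margin, in percent."""
    return revenue_growth_pct + profit_margin_pct

# Hypothetical examples: a fast grower with thin margins vs. a slower, highly profitable firm.
examples = {
    "Hypothetical SaaS A": (35.0, 10.0),  # 45 -> clears the bar
    "Hypothetical SaaS B": (15.0, 20.0),  # 35 -> falls short
}

for name, (growth, margin) in examples.items():
    score = rule_of_40(growth, margin)
    verdict = "meets" if score >= 40 else "misses"
    print(f"{name}: {score:.0f} ({verdict} the Rule of 40)")
```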

Thoughts on the Market
Why Uncertainty Won't Slow AI Hardware Investment

Thoughts on the Market

Play Episode Listen Later Mar 10, 2025 4:03


Our Head of U.S. IT Hardware Erik Woodring gives his key takeaways from Morgan Stanley's Technology, Media and Telecom (TMT) conference, including why there appears to be a long runway ahead for AI infrastructure spending, despite macro uncertainty. ----- Transcript -----Welcome to Thoughts on the Market. I'm Erik Woodring, Morgan Stanley's Head of U.S. IT Hardware Research. Here are some reflections I recorded last week at Morgan Stanley's Technology, Media, and Telecom Conference in San Francisco. It's Monday, March 10th at 9am in New York. This was another year of record attendance at our TMT Conference. And what is clear from speaking to investors is that the demand for new, under-discovered or under-appreciated ideas is higher than ever. In a stock-pickers' market – like the one we have now – investors are really digging into themes and single name ideas. Big picture – uncertainty was a key theme this week. Whether it's tariffs and the changing geopolitical landscape, market volatility, or government spending, the level of relative uncertainty is elevated. That said, we are not hearing about a material change in demand for PCs, smartphones, and other technology hardware. On the enterprise side of my coverage, we are emerging from one of the most prolonged downcycles in the last 10-plus years, and what we heard from several enterprise hardware vendors and others is an expectation that most enterprise hardware markets – PCs , Servers, and Storage – return to growth this year given pent up refresh demand. This, despite the challenges of navigating the tariff situation, which is resulting in most companies raising prices to mitigate higher input costs. On the consumer side of the world, the demand environment for more discretionary products like speakers, cameras, PCs and other endpoint devices looks a bit more challenged. The recent downtick in consumer sentiment is contributing to this environment given the close correlation between sentiment and discretionary spending on consumer technology goods. Against this backdrop, the most dynamic topic of the conference remains GenerativeAI. What I've been hearing is a confidence that new GenAI solutions can increasingly meet the needs of market participants. They also continue to evolve rapidly and build momentum towards successful GenAI monetization. To this point, underlying infrastructure spending—on servers, storage and other data center componentry – to enable these emerging AI solutions remains robust. To put some numbers behind this, the 10 largest cloud customers are spending upwards of [$]350 billion this year in capex, which is up over 30 percent year-over-year. Keep in mind that this is coming off the strongest year of growth on record in 2024. Early indications for 2026 CapEx spending still point to growth, albeit a deceleration from 2025. And what's even more compelling is that it's still early days. My fireside chats this week highlighted that AI infrastructure spending from their largest and most sophisticated customers is only in the second inning, while AI investments from enterprises, down to small and mid-sized businesses, is only in the first inning, or maybe even earlier. So there appears to be a long runway ahead for AI infrastructure spending, despite the volatility we have seen in AI infrastructure stocks, which we see as an opportunity for investors. I'd just highlight that amidst the elevated market uncertainty, there is a prioritization on cost efficiencies and adopting GenAI to drive these efficiencies. 
Company executives from some of the major players this week all discussed near-term cost efficiency initiatives, and we expect these efforts to both help protect the bottom line and drive productivity growth amidst a quickly changing market backdrop. Thanks for listening. If you enjoy the show, please leave us a review wherever you listen and share Thoughts on the Market with a friend or colleague today.

Thoughts on the Market
Funding the Next Phase of AI Development

Thoughts on the Market

Play Episode Listen Later Mar 6, 2025 10:39


Recorded at our 2025 Technology, Media and Telecom (TMT) Conference, TMT Credit Research Analyst Lindsay Tyler joins Head of Investment Grade Debt Coverage Michelle Wang to discuss the how the industry is strategically raising capital to fund growth.----- Transcript -----Lindsay Tyler: Welcome to Thoughts on the Market. I'm Lindsay Tyler, Morgan Stanley's Lead Investment Grade TMT Credit Research Analyst, and I'm here with Michelle Wang, Head of Investment Grade Debt Coverage in Global Capital Markets.On this special episode, we're recording at the Morgan Stanley Technology, Media, and Telecom (TMT) Conference, and we will discuss the latest on the technology space from the fixed income perspective.It's Thursday, March 6th at 12 pm in San Francisco.What a week it's been. Last I heard, we had over 350 companies here in attendance.To set the stage for our discussion, technology has grown from about 2 percent of the broader investment grade market – about two decades ago – to almost 10 percent now; though that is still relatively a small percentage, relative to the weightings in the equity market.So, can you address two questions? First, why was tech historically such a small part of investment grade? And then second, what has driven the growth sense?Michelle Wang: Technology is still a relatively young industry, right? I'm in my 40s and well over 90 percent of the companies that I cover were founded well within my lifetime. And if you add to that the fact that investment grade debt is, by definition, a later stage capital raising tool. When the business of these companies reaches sufficient scale and cash generation to be rated investment grade by the rating agencies, you wind up with just a small subset of the overall investment grade universe.The second question on what has been driving the growth? Twofold. Number one the organic maturation of the tech industry results in an increasing number of scaled investment grade companies. And then secondly, the increasing use of debt as a cheap source of capital to fund their growth. This could be to fund R&D or CapEx or, in some cases, M&A.Lindsay Tyler: Right, and I would just add in this context that my view for this year on technology credit is a more neutral one, and that's against a backdrop of being more cautious on the communications and media space.And part of that is just driven by the spread compression and the lack of dispersion that we see in the market. And you mentioned M&A and capital allocation; I do think that financial policy and changes there, whether it's investment, M&A, shareholder returns – that will be the main driver of credit spreads.But let's turn back to the conference and on the – you know, I mentioned investment. Let's talk about investment.AI has dominated the conversation here at the conference the past two years, and this year is no different. Morgan Stanley's research department has four key investment themes. One of those is AI and tech diffusion.But from the fixed income angle, there is that focus on ongoing and upcoming hyperscaler AI CapEx needs.Michelle Wang: Yep.Lindsay Tyler: There are significant cash flows generated by many of these companies, but we just discussed that the investment grade tech space has grown relative to the index in recent history.Can you discuss the scale of the technology CapEx that we're talking about and the related implications from your perspective?Michelle Wang: Let's actually get into some of the numbers. 
So in the past three years, total hyperscaler CapEx has increased from [$]125 billion three years ago to [$]220 billion today; and is expected to exceed [$]300 billion in 2027.The hyperscalers have all publicly stated that generative AI is key to their future growth aspirations. So, why are they spending all this money? They're investing heavily in the digital infrastructure to propel this growth. These companies, however, as you've pointed out, are some of the most scaled, best capitalized companies in the entire world. They have a combined market cap of [$]9 trillion. Among them, their balance sheet cash ranges from [$]70 to [$]100 billion per company. And their annual free cash flow, so the money that they generate organically, ranges from [$]30 to [$]75 billion.So they can certainly fund some of this CapEx organically. However, the unprecedented amount of spend for GenAI raises the probability that these hyperscalers could choose to raise capital externally.Lindsay Tyler: Got it.Michelle Wang: Now, how this capital is raised is where it gets really interesting. The most straightforward way to raise capital for a lot of these companies is just to do an investment grade bond deal.Lindsay Tyler: Yep.Michelle Wang: However, there are other more customized funding solutions available for them to achieve objectives like more favorable accounting or rating agency treatment, ways for them to offload some of their CapEx to a private credit firm. Even if that means that these occur at a higher cost of capital.Lindsay Tyler: You touched on private credit. I'd love to dig in there. These bespoke capital solutions.Michelle Wang: Right.Lindsay Tyler: I have seen it in the semiconductor space and telecom infrastructure, but can you please just shed some more light, right? How has this trend come to fruition? How are companies assessing the opportunity? And what are other key implications that you would flag?Michelle Wang: Yeah, for the benefit of the audience, Lindsay, I think just to touch a little bit…Lindsay Tyler: Some definitions,Michelle Wang: Yes, some definitions around ...Lindsay Tyler: Get some context.Michelle Wang: What we're talking about.Lindsay Tyler: Yes.So the – I think what you're referring to is investment grade companies doing asset level financing. Usually in conjunction with a private credit firm, and like all financing trends that came before it, all good financing trends, this one also resulted from the serendipitous intersection of supply and demand of capital.On the supply of capital, the private credit pocket of capital driven by large pockets of insurance capital is now north of $2 trillion and it has increased 10x in scale in the past decade. So, the need to deploy these funds is driving these private credit firms to seek out ways to invest in investment grade companies in a yield enhanced manner.Lindsay Tyler: Right. And typically, we're saying 150 to 200 basis points greater than what maybe an IG bond would yield.Michelle Wang: That's exactly right. That's when it starts to get interesting for them, right? And then the demand of capital, the demand for this type of capital, that's always existed in other industries that are more asset-heavy like telcos.However, the new development of late is the demand for capital from tech due to two megatrends that we're seeing in tech. The first is semiconductors. Building these chip factories is an extremely capital-intensive exercise, so creates a demand for capital. 
And then the second megatrend is what we've seen with the hyperscalers and GenerativeAI needs. Building data centers and digital infrastructure for GenerativeAI is also extremely expensive, and that creates another pocket of demand for capital that private credit conveniently kinda serves a role in.Lindsay Tyler: Right.Michelle Wang: So look, think we've talked about the ways that companies are using these tools. I'm interested to get your view, Lindsay, on the investor perspective.Lindsay Tyler: Sure.Michelle Wang: How do investors think about some of these more bespoke solutions?Lindsay Tyler: I would say that with deals that have this touch of extra complexity, it does feel that investor communication and understanding is all important. And I have found that, some of these points that you're raising – whether it's the spread pickup and the insurance capital at the asset managers and also layering in ratings implications and the deal terms. I think all of that is important for investors to get more comfortable and have a better understanding of these types of deals.The last topic I do want us to address is the macro environment. This has been another key theme with the conference and with this recent earnings season, so whether it's rate moves this year, the talk of M& A, tariffs – what's your sense on how companies are viewing and assessing macro in their decision making?Michelle Wang: There are three components to how they're thinking about it.The first is the rate move. So, the fact that we're 50 to 60 basis points lower in Treasury yields in the past month, that's welcome news for any company looking to issue debt. The second thing I'll say here is about credit spreads. They remain extremely tight. Speaking to the incredible kind of resilience of the investment grade investor base. The last thing I'll talk about is, I think, the uncertainty. [Because] that's what we're hearing a ton about in all the conversations that we've had with companies that have presented here today at the conference.Lindsay Tyler: Yeah. For my perspective, also the regulatory environment around that M&A, whether or not companies will make the move to maybe be more acquisitive with the current new administration.Michelle Wang: Right, so until the dust settles on some of these issues, it's really difficult as a corporate decision maker to do things like big transformative M&A, to make a company public when you don't know what could happen both from a the market environment and, as you point out, regulatory standpoint.The thing that's interesting is that raising debt capital as an investment grade company has some counter cyclical dynamics to it. Because risk-off sentiment usually translates into lower treasury yields and more favorable cost of debt.And then the second point is when companies are risk averse it drives sometimes cash hoarding behavior, right? So, companies will raise what they call, you know, rainy day liquidity and park it on balance sheet – just to feel a little bit better about where their balance sheets are. To make sure they're in good shape…Lindsay Tyler: Yeah, deal with the maturities that they have right here in the near term.Michelle Wang: That's exactly right. So, I think as a consequence of that, you know, we do see some tailwinds for debt issuance volumes in an uncertain environment.Lindsay Tyler: Got it. Well, appreciate all your insights. This has been great. 
Thank you for taking the time, Michelle, to talk during such a busy week.
Michelle Wang: It's great speaking with you, Lindsay.
Lindsay Tyler: And thanks to everyone listening in to this special episode recorded at the Morgan Stanley TMT Conference in San Francisco. If you enjoy Thoughts on the Market, please leave us a review wherever you listen and share the podcast with a friend or colleague today.

AWS for Software Companies Podcast
Ep081: Customer-First AI: DTEX Systems' Journey with Generative AI and AWS

AWS for Software Companies Podcast

Play Episode Listen Later Mar 4, 2025 28:19


Ryan Steeb shares DTEX Systems' strategic approach to implementing generative AI with AWS Bedrock, reducing risk while focusing on meaningful customer outcomes.
Topics Include:
Introduction of Ryan Steeb, Head of Product at DTEX Systems
Explanation of insider risk challenges
Three categories of insider risk (malicious, negligent, compromised)
How DTEX Systems is using generative AI
Collection of proprietary data to map human behavior on networks
Three key areas leveraging Gen AI: customer value, services acceleration, operations
How partnership with AWS has impacted DTEX's AI capabilities
Value of AWS expertise for discovering AI possibilities
AWS Bedrock providing flexibility in AI implementation
Collaboration on unique applications beyond conventional chat assistants
AWS OpenSearch as a foundational component
Creating invisible AI workflows that simplify user experiences
The path to monetization for generative AI
Three approaches: direct pricing, service efficiency, operational improvements
Second and third-order effects (retention, NPS, reduced churn)
How DTEX prioritizes Gen AI projects
Starting with customer problems vs. finding problems for AI solutions
Business impact prioritization framework
Technical capability considerations
Benefits of moving AI solutions to AWS Bedrock
Fostering a culture of experimentation and innovation
Adopting Amazon's "working backwards" philosophy
Balancing customer-driven evolution with original innovation
Time machine advice: start experimenting with Gen AI earlier
Importance of leveraging peer groups and experts
Future outlook: concerns about innovation outpacing risk mitigation
Security implications of Gen AI adoption
Participation in the OpenSearch Linux Foundation initiative
Final thoughts on the DTEX-AWS partnership
Participants:
Ryan Steeb – Head of Product, DTEX Systems
See how Amazon Web Services gives you the freedom to migrate, innovate, and scale your software company at https://aws.amazon.com/isv/

Thoughts on the Market
Will GenAI Turn a Profit in 2025?

Thoughts on the Market

Play Episode Listen Later Mar 3, 2025 12:49


Our Semiconductors and Software analysts Joe Moore and Keith Weiss dive into the biggest market debate around AI and why it's likely to shape conversations at Morgan Stanley's Technology, Media and Telecom (TMT) Conference in San Francisco.
----- Transcript -----
Joe Moore: Welcome to Thoughts on the Market. I'm Joe Moore, Morgan Stanley's Head of U.S. Semiconductors.
Keith Weiss: And I'm Keith Weiss, Head of U.S. Software.
Joe Moore: Today on the show, one of the biggest market debates in the tech sector has been around AI and the return on investment, or ROI. In fact, we think this will be the number one topic of conversation at Morgan Stanley's annual Technology, Media and Telecom (TMT) conference in San Francisco. And that's precisely where we're bringing you this episode from. It's Monday, March 3rd, 7am in San Francisco. So, let's get right into it.
ChatGPT was released in November 2022. Since then, the biggest tech players have gained more than $9 trillion in combined market capitalization. They're up more than double the S&P 500 index. And there's a lot of investor expectation for a new technology cycle centered around AI; that's what's driving a lot of this momentum. That said, there's also significant investor concern around this topic of ROI, especially given the unprecedented level of investment we've seen and the still-sparse data points on returns. So where are we now? Is 2025 going to be the year when the ROI on GenAI finally turns positive?
Keith Weiss: If we take a step back and think about the staging of how innovation cycles tend to play out, I think it's helpful context. And it starts with research. I would say the period up until ChatGPT was released – up until that November 2022 – was a period where the fundamental research was being done on the transformer models, utilizing machine learning. What fundamental research is, is trying to figure out whether these fundamental capabilities are realistic – whether we can do this in software, if you will. And with the release of ChatGPT, it was a very strong stamp of approval of "Yes, these transformer models can work."
Then you start stage two. And I think that's basically November 2022 through where we are today, where you have two tracks going on. One is development. These large language models can do natural language processing well. They can contextually understand unstructured and semi-structured data. They can generate content – they can create text; they can create images and videos. So, there's these fundamental capabilities. But you have to develop a product to get work done. How are we going to utilize those capabilities? So, we've been working on development of product over the past two years. And at the same time, we've been scaling out the infrastructure for that product development.
And now, heading into 2025, I think we're ready to go into the next stage of the innovation cycle, which will be market uptake. That's when revenue starts to flow to the software companies that are trying to automate business processes. We definitely think that monetization starts to ramp in 2025, which should prove out a better ROI – or start to prove out the ROI – of all this investment that we've been making.
Joe Moore: Morgan Stanley Research projects that GenAI can potentially drive a $1.1 trillion revenue opportunity in 2028, up from $45 billion in 2024.
Can you break this down for our listeners?
Keith Weiss: We recently put out a report where we tried to size what the revenue generation capability is from generative AI, because that's an important part of this ROI equation. You have the return on top, where you could actually monetize this; on the bottom, obviously, the investment. And we took a look at all the investment needed to serve this type of functionality.
The $1.1 trillion, if you will, breaks down into two big components. One side of the equation is in my backyard, and that's the enterprise software side. It's about a third of that number. What we see occurring is the automation of more and more of the work being done by information workers – and by people overall. We see about 25 percent of overall labor being impacted today, and we see that growing to over 45 percent over the next three years. So, what that's going to look like from a software perspective is an opportunity ramping up to just about $400 billion of software opportunity by 2028. At that point, generative AI will represent about 22 percent of overall software spending, and we expect the overall software market to be about a $1.8 trillion market.
The other side of the equation, the bigger side, is actually the consumer platforms. And that kind of makes sense if you think about the broader economy: it's basically one-third B2B, two-thirds B2C. The automation is relatively equivalent on both sides of the equation.
Joe Moore: So, let's drill further into your outlook for software. What are the biggest catalysts you expect to see this year, and then over the coming three years?
Keith Weiss: The key catalyst for this year is proving out the efficacy of these solutions – proving out that they're going to drive productivity gains and yield real hard-dollar ROI for the end customer. And I think where we'll see that is from labor savings. Once that occurs – and I think it's going to be over the next 12 to 18 months – then we go into the period of mainstream adoption. You need to start utilizing these technologies to drive efficiencies within your business to be able to keep up with your competitors. So, that's the main thing that we're looking for in the near term.
Over the next three years, what you're looking for is the breakthrough technologies. Where can we find opportunities not just to create efficiencies within existing processes, but to completely rewrite the business process? That's where you see new big companies emerge within the software opportunity – the people that really fundamentally change the equation around some of these processes.
So, Joe, turning it over to you: hardware remains a bottleneck for AI innovation. Why is that the case? And what are the biggest hurdles in the semiconductor space right now?
Joe Moore: Well, this has proven to be an extremely computationally intensive application, and I think it started with training – where you started seeing tens of thousands of GPUs or XPUs clustered together to train these big models, these large language models. And you started hearing comments two years ago, around the development of ChatGPT, that the scaling laws are tricky. You might need five times as much hardware to make a model that's 10 percent smarter. But the challenge of making a model that's 10 percent smarter – the table stakes of that are very significant. And so, you see those investments continuing to scale up.
And that's been a big debate for the market. But we've heard from most of the big spenders in the market that we are continuing to scale up training. And then, after that happened, we started seeing inference suddenly become a big user of advanced processors, GPUs, in a way that it hadn't before. And that was sort of simple, conversational types of AI. Now, as you start migrating into more of a reasoning AI, a multi-pass approach, you're looking at a really dramatic scaling in the amount of hardware that's required, from both GPUs and XPUs.
And at the same time, the hardware companies are focused a lot on how to deliver that so that it doesn't become prohibitively expensive – which it is very expensive, but there's a lot of improvement. And that's where you're sort of seeing this tug of war in the stocks: when you see something that's deflationary, it becomes a big negative. But the reality is the hardware is designed to be deflationary, because the workloads themselves are inflationary. And so I think there's a lot of growth still ahead of us, a lot of investment, and a lot of rich debate in the market about this.
Keith Weiss: Let's pull on that thread a little bit. You talked initially about the scaling of the GPU clusters to support training. Over the past year, we've gotten a little bit more pushback on the efficacy of those scaling laws; they've come more under question. And at the same time, we've seen the availability of some lower cost, but still very high-performance, models. Is this going to reshape the investments from the large semiconductor players in terms of how they're looking to address the market?
Joe Moore: I think we have to assess that over time. Right now, there are very clear comments from everybody who's in charge of scaling large models that they intend to continue to scale. I think there is a benefit to doing so from the standpoint of creating a richer model, but is the ROI there? That's where I think your numbers do a very good job of justifying our model for our core companies – where we can say, okay, this is not a bubble. This is investment that's driven by these areas of economic benefit that our software and internet teams are seeing.
And I think there is a bit of an arms race at the high end of the market, where people just want to have the biggest cluster. We think that's about 30 percent of the revenue right now in hardware – supporting those really big models. But we're also seeing, to your point, a very rich hardware configuration on the inference side, post-training model customization. Nvidia said on their earnings call recently that they see several orders of magnitude more compute required for those applications than for that pre-training. So, I think over time that's where the growth is going to come from. But right now, we're seeing growth really from all aspects of the market.
Keith Weiss: Got it. So, a lot of really big opportunities out there utilizing these GPUs and ASICs, but also a lot of unknowns and potential risks. What are the key catalysts that you're looking for in the semiconductor space over the course of this year, and maybe over the next three years?
Joe Moore: Well, 2025 is a year that is really mostly about supply. We're ramping up new hardware, but also several companies are doing custom silicon.
We have to ramp all that hardware up, and it's very complicated. It uses every kind of trick and technique that semiconductors use to do advanced packaging and things like that. And so, it's a very challenging supply chain, and it has been for two years. Fortunately, it's happened at a time when there's plenty of semiconductor capacity out there. But we're ramping very quickly, and I think the things that matter this year are going to be more about how quickly we can get that supply, what the gross margins on hardware are, things like that.
Beyond that, we have to really get a sense of these ROI questions, which are really important beyond 2025. Because again, this is not a bubble. But hardware is cyclical, and it doesn't slow gracefully. So, there will be periods where investment may fall off and it'll be a difficult time to own the stocks. And we do think that over time, the value sort of transitions from hardware to software. But we model for 2026 to be a year where it starts to slow down a little bit and we start to see some consolidation in these investments. Now, 12 months ago, I thought that about 2025. So, the timeframe keeps getting pushed out; it remains very robust. But I think at some point it will plateau a little bit, we'll start to see some fragmentation, and we'll start to see markets like reasoning models and inference models becoming more and more critical. And that's where – when I hear you and Brian Nowak talking about the early stage we're at in actually implementing this stuff – inference has a long way to go in terms of growth.
So, we're optimistic around the whole AI space for semiconductors. Obviously, the market is as well, so there are expectations and challenges there. But there's still a lot of growth ahead of us. So Keith, looking towards the future, as AI expands the functionality of software, how will that transform the business models of your companies?
Keith Weiss: We're also fundamentally optimistic about software and what generative AI means for the overall software industry. If we look at software companies today, particularly application companies, a lot of what you're trying to do is make information workers more productive. So, it made a lot of sense to price based upon the number of people who are using your software – you've got a lot of seat-based models. Now we're talking about completely automating some of those processes, taking people out of the loop altogether. You have to price differently. You have to price based upon the number of transactions you're running, or some type of consumptive element of the amount of work that you're getting done. I think the other thing that we're going to see is the market opportunity expanding well beyond information workers.
So, the way that we count the value, the way that we accrue the value, might change a little bit. But the underlying value proposition remains the same. It's about automating, creating productivity in those business processes, and then the software companies pricing for their fair share of that productivity.
Joe Moore: Great. Well, let me just say this has been a really useful process for me. The collaboration between our teams is really helpful, because as a semiconductor analyst, you can see the data points, you can see the hardware being built, and I know the enthusiasm that people have on a tactical level.
But understanding where the returns are going to come from, and what milestones we need to watch to see any potential course correction, is very valuable. So on that note, it's time for us to get to the exciting panels at the Morgan Stanley TMT conference. We'll have more from the conference on the show later this week. Keith, thanks for taking the time to talk.
Keith Weiss: Great speaking with you, Joe.
Joe Moore: And thanks for listening. If you enjoy Thoughts on the Market, please leave us a review wherever you listen and share the podcast with a friend or colleague today.