Podcasts about streaming analytics

  • 21 podcasts
  • 29 episodes
  • 41m average episode duration
  • Episode frequency: infrequent
  • Latest episode: Feb 4, 2025

Popularity trend (chart), 2017–2024


Best podcasts about streaming analytics

Latest podcast episodes about streaming analytics

Next in Marketing
Why CTV Ad Targeting is Much Harder Than You Think

Feb 4, 2025 · 24:24


Next in Media spoke with David Levy, CEO of OpenAP, about some of the misconceptions in the market when it comes to data-driven TV advertising, and how TV networks can balance collaboration and competition in the face of the growth of Big Tech in TV.

Takeaways:

  • The Evolution of OpenAP: Standardizing TV Advertising. OpenAP was founded to unify ad innovations across TV networks, creating scalable, standardized solutions for advertisers.
  • Bridging Linear TV and Streaming for Advertisers. While linear TV dominated ad spending, OpenAP has expanded its infrastructure to solve audience fragmentation in Connected TV (CTV).
  • The Role of First-Party Data in TV Advertising. Brands investing in first-party data need standardized methods to target consumers consistently across multiple media platforms.
  • The Transparency Challenge in CTV Ad Targeting. Unlike digital, CTV ad targeting lacks transparency due to multiple data transformations from audience lists to device-level identifiers.
  • Solving Audience Measurement Discrepancies. Differences in identity-matching methods among media companies create inconsistencies in audience targeting and measurement.
  • The Push for a Unified Ad Planning Infrastructure. OpenAP aims to establish a seamless way for advertisers to plan and execute campaigns across different streaming services and TV networks.
  • The Need for Scalable TV Advertising for Small Businesses. Unlike Google and Meta, TV lacks an easy-to-use, cost-effective ad buying platform for small advertisers, a gap the industry must address.

Guest: David Levy · Host: Mike Shields · Sponsor: Epsilon · Producer: FEL Creative

Needs Some Introduction - House of the Dragon/The Patient
'Severance' Season 2 Episode 2 'Goodbye Mrs. Selvig'

Jan 26, 2025 · 74:26


Contact: needssomeintroduction@gmail.com

In this episode of 'Needs Some Introduction,' hosts Sona and Victor dive deep into the second episode of Severance's second season, titled 'Goodbye, Mrs. Selvig.' They break down key scenes and character motivations, and speculate on future plot developments. From Helena's fascination with her innie's life to the mysterious intentions behind Mark's decisions, this discussion covers themes of duality, manipulation, and the emotional weight of personal traumas. Key points include Mark's confrontation with Ms. Cobel, Dylan's ill-fated job interview, and the revelation of how Lumon's insiders are controlling the narrative. Special focus is given to the new intro sequence and its potential clues, alongside an analysis of the intrigue surrounding characters like Irv and Helena. The episode concludes with anticipation for further developments in next week's longer episode, directed by Ben Stiller.

Chapters:

  • 00:00 Introduction
  • 02:43 Streaming Analytics and the Value of AppleTV Programming
  • 07:37 Severance Theories and Speculations
  • 11:28 Character Dynamics and Emotional Moments
  • 21:30 Helena's Duality and Family Tensions
  • 36:40 New Intro Sequence Analysis
  • 40:19 Devon and Mark's Tense Interactions
  • 42:19 Milchick's Manipulative Visit
  • 46:29 Helena's Fascination with Her Innie
  • 50:28 Dylan's Job Interview Disaster
  • 54:06 Mark's Struggle with Grief and Denial
  • 58:08 Irv's Mysterious Actions
  • 01:07:47 Mark's Confrontation with Selvig
  • 01:10:57 Speculations and Final Thoughts

Robots and Red Tape: AI and the Federal Government
Streaming, Analytics, and AI with Miguel Curiel

Oct 7, 2024 · 47:37


Miguel and I chat about the streaming industry and AI's role in humanity's future!

UBC News World
Spotify Music Distribution With Automatic Royalty Splits & Streaming Analytics

Jan 25, 2024 · 3:06


Audience engagement will be critical to artists' financial success under Spotify's new royalties policies. Good Morning Music, a new artist-centered distribution service, can help position emerging artists to connect with listeners. Visit https://GoodMornMusic.com for more.

Good Morning Music · Boise · 2417 N Bank Dr · https://GoodMornMusic.com · +1 310 8954930 · rory@goodmornmusic.com

The Analytics Engineering Podcast
Operationalizing Your Warehouse, Streaming Analytics, and Cereal (W/ Arjun Narayan of Materialize and Nathan Bean of General Mills)

Oct 6, 2023 · 42:23


It turns out data plays a big role in getting cereal manufactured and delivered so you can enjoy your Cheerios reliably for breakfast. We talk with Arjun Narayan, CEO of Materialize, a company building an operational warehouse, and Nathan Bean, a data leader at General Mills responsible for all of the company's manufacturing analytics and insights.  We discuss Materialize's founding story, how streaming technology has matured, and how exactly companies are leveraging their warehouse to operationalize their business—in this case, at one of the largest consumer product companies in the United States.  For full show notes and to read 6+ years of back issues of the podcast's companion newsletter, head to https://roundup.getdbt.com.  The Analytics Engineering Podcast is sponsored by dbt Labs.

BI or DIE
Vertrauenswürdige Daten - Data Quality als Teamsport | Im Gespräch mit Steffen Bischoff & Helmut Plinke, Talend

Oct 3, 2023 · 36:39


Every company wants to make data-driven decisions, but data quality is often forgotten along the way. Helmut and Steffen show Kai how trust in data can be built and maintained. What you can learn in this episode:

  • What exactly is data quality?
  • What do studies say about data quality in companies?
  • What happens when companies don't take data quality seriously?
  • How do you approach data quality the right way?
  • Which use cases are there for improved data quality?

Helmut Plinke currently serves as Lead Solutions Engineer for Talend (a Qlik Company). His focus is on making the constantly growing data landscape more accessible and usable for customers, which means working comprehensively across all aspects of data management solutions. Over the last 20 years, Helmut has watched the industry's software solutions and methods evolve from batch ETL through data quality, ELT, and data governance to current trends such as streaming analytics, self-service data preparation, and data mesh. Helmut is a data health enthusiast who wants to improve the fitness of data in order to help companies use their data more efficiently and gain competitive advantages. Steffen Bischoff is Senior Manager, Solutions Engineering for EMEA Central and Eastern Europe. In this role, Steffen leads the solutions engineering team and uses his technical knowledge and sales experience to provide professional consulting and support for all Talend products. Before joining Talend, Steffen worked as a pre-sales consultant and as a server and network administrator.

BI or DIE
Mission Modernisation - Wie sieht moderne Datenintegration aus? | Im Gespräch mit Steffen Bischoff & Helmut Plinke

Sep 12, 2023 · 31:14


Steffen and Helmut have been working with data for years; now the task is to modernize and integrate it. They explain to Kai how they do that quickly and efficiently with Talend. Also in this episode:

  • What is modern data integration?
  • What is data onboarding?
  • Are there trend topics in integration?
  • How good is Talend at data integration?

Helmut Plinke currently serves as Lead Solutions Engineer for Talend (a Qlik Company). His focus is on making the constantly growing data landscape more accessible and usable for customers, which means working comprehensively across all aspects of data management solutions. Over the last 20 years, Helmut has watched the industry's software solutions and methods evolve from batch ETL through data quality, ELT, and data governance to current trends such as streaming analytics, self-service data preparation, and data mesh. Helmut is a data health enthusiast who wants to improve the fitness of data in order to help companies use their data more efficiently and gain competitive advantages. Steffen Bischoff is Senior Manager, Solutions Engineering for EMEA Central and Eastern Europe. In this role, Steffen leads the solutions engineering team and uses his technical knowledge and sales experience to provide professional consulting and support for all Talend products. Before joining Talend, Steffen worked as a pre-sales consultant and as a server and network administrator.

BI or DIE
Schneller in der Cloud - Daten Onboarding mit Talend Stitch | Im Gespräch mit Steffen Bischoff & Helmut Plinke

Aug 22, 2023 · 32:12


Helmut and Steffen explain to Kai how Stitch can help companies make better, data-driven decisions for their products. We also learn:

  • What has changed since the acquisition by Qlik?
  • How quickly can you get working with Stitch?
  • How is self-service supported?
  • How are customers enabled?
  • Is Qlik now a must for Stitch?

Helmut Plinke currently serves as Lead Solutions Engineer for Talend (a Qlik Company). His focus is on making the constantly growing data landscape more accessible and usable for customers, which means working comprehensively across all aspects of data management solutions. Over the last 20 years, Helmut has watched the industry's software solutions and methods evolve from batch ETL through data quality, ELT, and data governance to current trends such as streaming analytics, self-service data preparation, and data mesh. Helmut is a data health enthusiast who wants to improve the fitness of data in order to help companies use their data more efficiently and gain competitive advantages. Steffen Bischoff is Senior Manager, Solutions Engineering for EMEA Central and Eastern Europe. In this role, Steffen leads the solutions engineering team and uses his technical knowledge and sales experience to provide professional consulting and support for all Talend products. Before joining Talend, Steffen worked as a pre-sales consultant and as a server and network administrator.

Software Engineering Daily
Streaming Analytics with Hojjat Jafarpour

Apr 6, 2023 · 46:48


Streaming analytics refers to the process of analyzing real-time data that is generated continuously and rapidly from various sources, such as sensors, applications, social media, and other internet-connected devices. Streaming analytics platforms enable organizations to extract business value from data in motion, similar to how traditional analytics tools derive insights from data at rest. In this episode, Hojjat Jafarpour joins the show to discuss DeltaStream and the streaming analytics landscape.
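The "data in motion versus data at rest" distinction above lends itself to a tiny illustration. Below is a minimal sketch in plain Python (no streaming framework; the sensor readings and the one-minute window size are invented for the example) of aggregating a continuous stream in tumbling windows as events arrive, instead of querying a stored table later.

```python
# Minimal sketch of "data in motion": aggregate a continuous event stream
# in fixed one-minute tumbling windows instead of querying data at rest.
# Events are (timestamp_seconds, value) pairs, assumed to arrive in order.
from collections import defaultdict

WINDOW_SECONDS = 60

def window_key(ts: float) -> int:
    """Bucket an event timestamp (epoch seconds) into its tumbling window start."""
    return int(ts // WINDOW_SECONDS) * WINDOW_SECONDS

def streaming_average(events):
    """Consume (timestamp, value) pairs and yield a per-window average
    as soon as each window closes."""
    sums, counts = defaultdict(float), defaultdict(int)
    current = None
    for ts, value in events:
        key = window_key(ts)
        if current is not None and key > current:
            yield current, sums[current] / counts[current]
            del sums[current], counts[current]
        current = key
        sums[key] += value
        counts[key] += 1
    if current is not None and counts[current]:
        yield current, sums[current] / counts[current]   # flush the last window

if __name__ == "__main__":
    sample = [(0, 10.0), (20, 14.0), (61, 9.0), (90, 11.0), (130, 20.0)]
    for start, avg in streaming_average(sample):
        print(f"window starting at {start}s: avg={avg:.1f}")
```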

CISO Tradecraft
#118 - Data Engineering (with Gal Shpantzer)

Feb 27, 2023 · 44:45


Our systems generate fantastic amounts of information, but do we have a complete understanding of how we collect, analyze, manage, store, and retrieve possibly petabytes a day? Gal Shpantzer has been doing InfoSec for over 20 years, has managed some huge data engineering projects, and offers a lot of actionable insights in this CISO Tradecraft episode.

Gal's LinkedIn Page - https://www.linkedin.com/in/riskmanagement/
Gal's Twitter Page - https://twitter.com/Shpantzer
Full Transcript - https://docs.google.com/document/d/14RXnsVttvKlRi6VL94BTrItCjOAjgGem/

Chapters:

  • 00:00 Introduction
  • 02:00 How do you Architect Big Data Infrastructure
  • 03:33 Are you taking a look at Ransomware?
  • 06:11 Web Scale Technologies are used mostly in Marketing & Fraud Detection
  • 08:11 Data Engineering - The Mindset Shift
  • 10:51 The Iron Triangle of Data Engineering
  • 13:55 Can I Outsource My Logging Pipeline to a Vendor
  • 15:37 Kafka & Flink - Data Engineering in the Pipeline
  • 18:12 Streaming Analytics & Kafka
  • 22:08 How to Enable Data Science Analytics with Streaming Analytics
  • 26:33 Streaming Analytics
  • 30:25 Data Engineering - Is there a Security Log
  • 32:30 Streaming Analytics is a Weird Thing
  • 35:50 How to Get a Handle on a Big Data Pipeline
  • 39:11 Data Engineering Hacks for Big Data Analytics

Streaming Audio: a Confluent podcast about Apache Kafka
Streaming Analytics and Real-Time Signal Processing with Apache Kafka

Jul 14, 2022 · 66:33 · Transcription available


Imagine you can process and analyze real-time event streams for intelligence to mitigate cyber threats or keep soldiers constantly alerted to risks and precautions they should take based on events. In this episode, Jeffrey Needham (Senior Solutions Engineer, Advanced Technology Group, Confluent) shares use cases on how Apache Kafka® can be used for real-time signal processing to mitigate risk before it arises. He also explains the classic Kafka transactional processing defaults and the distinction between transactional and analytic processing. Jeffrey is part of the customer solutions and innovations division (CSID), which involves designing event streaming platforms and innovations to improve productivity for organizations by pushing the envelope of Kafka for real-time signal processing.

What is signal intelligence? Jeffrey explains that it's not always affiliated with the military. Signal processing improves your operational or situational awareness by understanding the petabyte datasets of clickstream data, or the telemetry coming in from sensors, which could be the satellite or sensor arrays along a water pipeline. That is, bringing in event data from external sources to analyze, and then finding the pattern in the series of events to make informed decisions.

Conventional On-Line Analytical Processing (OLAP) or data warehouse platforms evolved out of the transaction processing model. However, when analytics or even AI processing is applied to any data set, these algorithms never look at a single column or row, but look for patterns within millions of rows of transactionally derived data. Transaction-centric solutions are designed to update and delete specific rows and columns in an "ACID" compliant manner, which makes them inefficient and usually unaffordable at scale because this capability is less critical when the analytic goal is to look for a pattern within millions or even billions of these rows.

Kafka was designed as a step forward from classic transaction processing technologies, which can also be configured in a way that's optimized for signal processing high velocities of noisy or jittery data streams, in order to make sense, in real time, of a dynamic, non-transactional environment. With its immutable, write-append commit logs, Kafka functions as a flight data recorder, which remains resilient even when network communications, or COMMs, are poor or nonexistent.

Jeffrey shares the disconnected-edge project he has been working on: smart soldier, which runs Kafka on a Raspberry Pi and x64-based handhelds. These devices are ergonomically integrated on each squad member to provide real-time visibility into the soldiers' activities or situations. COMMs permitting, the topic data is then mirrored upstream and aggregated at multiple tiers (mobile command post, battalion, HQ) to provide ever-increasing views of the entire battlefield, or whatever the sensor array is monitoring, including the all-important supply chain. Jeffrey also shares a couple of other use cases on how Kafka can be used for signal intelligence, including cybersecurity and protecting national critical infrastructure.

EPISODE LINKS

  • Using Kafka for Analytic Processing
  • Watch the video version of this podcast
  • Streaming Audio Playlist
  • Learn more on Confluent Developer
  • Use PODCAST100 to get $100 of free Confluent Cloud usage (details)
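The "flight data recorder" analogy above rests on Kafka's immutable, write-append commit log and offset-based consumption. Here is a deliberately simplified, illustrative sketch of that property in plain Python (not Kafka's actual implementation; the record values and class names are invented): producers only append, and a consumer that tracks its own offset can catch up after a disconnect.

```python
# Illustrative sketch of an append-only, offset-addressed log.
# Records are never updated or deleted; consumers keep their own position
# and can replay or catch up after an outage.
class AppendOnlyLog:
    def __init__(self):
        self._records = []

    def append(self, record) -> int:
        """Append a record and return its offset."""
        self._records.append(record)
        return len(self._records) - 1

    def read(self, offset: int, max_records: int = 100):
        """Return up to max_records starting at offset (replay-friendly)."""
        return self._records[offset:offset + max_records]

class Consumer:
    def __init__(self, log: AppendOnlyLog):
        self.log = log
        self.offset = 0          # position survives disconnects

    def poll(self):
        batch = self.log.read(self.offset)
        self.offset += len(batch)
        return batch

if __name__ == "__main__":
    log = AppendOnlyLog()
    for event in ["sensor:ok", "sensor:warn", "sensor:ok"]:
        log.append(event)

    c = Consumer(log)
    print(c.poll())              # catches up on everything appended so far
    log.append("sensor:alert")   # produced while the consumer was "offline"
    print(c.poll())              # resumes from its stored offset
```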

Streaming Audio: a Confluent podcast about Apache Kafka
Streaming Analytics on 50M Events Per Day with Confluent Cloud at Picnic

May 5, 2022 · 34:41 · Transcription available


What are useful practices for migrating a system to Apache Kafka® and Confluent Cloud, and why use Confluent to modernize your architecture?

Dima Kalashnikov (Technical Lead, Picnic Technologies) is part of a small analytics platform team at Picnic, an online-only, European grocery store that processes around 45 million customer events and five million internal events daily. An underlying goal at Picnic is to try and make decisions as data-driven as possible, so Dima's team collects events on all aspects of the company: from new stock arriving at the warehouse, to customer behavior on their websites, to statistics related to delivery trucks. Data is sent to internal systems and to a data warehouse.

Picnic recently migrated from their existing solution to Confluent Cloud for several reasons:

  • Ecosystem and community: Picnic liked the tooling present in the Kafka ecosystem, since being a small team means they aren't able to devote extra time to building boilerplate-type code such as connectors for their data sources or functionality for extensive monitoring capabilities. Picnic also has analysts that use SQL, so they appreciated the processing capabilities of ksqlDB. Finally, they found that help isn't hard to locate if one gets stuck.
  • Monitoring: They wanted better monitoring; specifically, they found it challenging to measure for SLAs with their former system as they couldn't easily detect the positions of consumers in their streams.
  • Scaling and data retention times: Picnic is growing, so they needed to scale horizontally without having to worry about manual reassignment. They also hit a wall with their previous streaming solution with respect to the length of time they could save data, which is a serious issue for a company that makes data-first decisions.
  • Cloud: Another factor of being a small team is that they don't have resources for extensive maintenance of their tooling.

Dima's team was extremely careful and took their time with the migration. They ran a pilot system simultaneously with the old system, in order to make sure it could achieve their fundamental performance goals: complete stability, zero data loss, and no performance degradation. They also wanted to check it for costs.

The pilot was successful, and they actually have a second, IoT pilot in the works that uses Confluent Cloud and Debezium to track the robotics data emanating from their automatic fulfillment center. And it's a lot of data; Dima mentions that the robots in the center generate data sets as large as their customer event streams.

EPISODE LINKS

  • Picnic Analytics Platform: Migration from AWS Kinesis to Confluent Cloud
  • Picnic Modernizes Data Architecture with Confluent
  • Data Engineer: Event Streaming Platform
  • Watch this podcast in video
  • Kris Jenkins' Twitter
  • Streaming Audio Playlist
  • Join the Confluent Community
  • Learn more with Kafka resources on Confluent Developer
  • Live demo: Event-Driven Microservices with Confluent
  • Use PODCAST100 to get $100 of free Confluent Cloud usage
  • Building Data Streaming App | Coding In Motion
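The monitoring pain point mentioned above (not being able to see the positions of consumers in their streams) boils down to consumer lag. A minimal sketch of that metric, assuming lag is simply the log-end offset minus the last committed offset per partition; the topic name, offsets, and alert threshold below are invented for illustration.

```python
# Simplified sketch of consumer lag: how far a consumer group's committed
# offsets trail the end of each partition. Offsets here are hand-written
# illustrative numbers, not pulled from a real cluster.
from typing import Dict, Tuple

Partition = Tuple[str, int]  # (topic, partition number)

def consumer_lag(end_offsets: Dict[Partition, int],
                 committed: Dict[Partition, int]) -> Dict[Partition, int]:
    """Lag per partition = latest offset in the log minus last committed offset."""
    return {tp: end_offsets[tp] - committed.get(tp, 0) for tp in end_offsets}

if __name__ == "__main__":
    end_offsets = {("customer-events", 0): 1_204_331, ("customer-events", 1): 1_198_874}
    committed   = {("customer-events", 0): 1_204_331, ("customer-events", 1): 1_190_002}
    for tp, lag in consumer_lag(end_offsets, committed).items():
        status = "OK" if lag < 10_000 else "ALERT: falling behind SLA"
        print(f"{tp}: lag={lag} ({status})")
```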

Data Transforming Business
S5E4: Data in Overdrive

Aug 3, 2021 · 29:02


Episode 4 of our 12-part series with Cloudera. In this podcast, Simon Elliston Ball, Director of Product Management for Streaming Analytics and Cyber ML at Cloudera, explains why real-time streaming has become so important. Simon starts by talking about the role real-time data streaming plays in encouraging digitally driven innovation. He also explains the types of data structures organisations need, as well as how this differs from traditional data practices. Finally, Simon discusses the potential that ML brings to real-time data streaming and the considerations for compliance and governance. Next week's episode: Cloud Native Data Warehouse. Make sure you're subscribed so that you don't miss an episode of this fantastic series.

The Private Equity Digital Transformation Show
Streaming Analytics for Speedier Decision Making

May 14, 2021 · 69:24


Our guest, Rohit Chauhan, is the Founder & CEO at Topmist, a cloud-native real-time streaming data analytics firm. He has over 25 years of experience as a technology executive, data scientist, and business consultant with deep knowledge of industry processes and value drivers. In this episode, Rohit and Bruce discuss:

  • The tech architecture of a runtime streaming data pipeline.
  • What it really means to do analytics in "real time".
  • The differences between doing analytics on data collected in the past versus doing analytics on data being collected in the present.
  • Use cases that are a great fit for streaming analytics.
  • A case study on Coca-Cola that quantifies the value they created from using streaming analytics to make faster marketing decisions.

Related links you may find useful:

  • Season 2: Episodes and show notes
  • Season 2 book: The Private Equity Digital Operating Partner
  • Season 1: Episodes and show notes
  • Season 1 book: IoT Inc
  • Training: Digital transformation certification

Streaming Audio: a Confluent podcast about Apache Kafka
Change Data Capture and Kafka Connect on Microsoft Azure ft. Abhishek Gupta

Jan 11, 2021 · 43:04 · Transcription available


What's it like being a Microsoft Azure Cloud advocate working with Apache Kafka® and change data capture (CDC) solutions? Abhishek Gupta would know! At Microsoft, Abhishek focuses his time on Kafka, databases, Kubernetes, and open source projects. His experience in a wide variety of roles ranging from engineering, consulting, and product management for developer-focused products has positioned him well for developer advocacy, where he is now.

Switching gears, Abhishek proceeds to break down the concept of CDC, starting off with some of the core concepts such as "commit logs." Abhishek then explains how CDC can turn data around when you compare it to the traditional way of querying the database to access data: you don't call the database; it calls you. He then goes on to discuss Debezium, which is an open source change data capture solution for Kafka. He also covers some of the Azure connectors on Confluent, Azure Data Explorer, and use cases powered by the Azure Data Explorer Sink connector for Kafka.

EPISODE LINKS

  • Streaming data from Confluent Cloud into Azure Data Explorer
  • Integrate Apache Kafka with Azure Data Explorer
  • Change Data Capture with Debezium ft. Gunnar Morling
  • Tales From The Frontline of Apache Kafka DevOps ft. Jason Bell
  • MySQL CDC Source (Debezium) Connector for Confluent Cloud
  • MySQL, Cassandra, BigQuery, and Streaming Analytics with Joy Gao
  • Join the Confluent Community Slack
  • Learn more with Kafka tutorials, resources, and guides at Confluent Developer
  • Live demo: Kafka streaming in 10 minutes on Confluent Cloud
  • Use 60PDCAST to get an additional $60 of free Confluent Cloud usage (details)
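The "it calls you" framing of CDC can be made concrete with a small sketch. The change-event envelope below loosely mimics the op/before/after shape Debezium emits, but it is simplified and hand-written for illustration rather than tied to any specific connector version: a downstream process simply reacts to each change and keeps its own copy of the table current.

```python
# Hedged sketch of the CDC idea: react to a stream of change events instead
# of querying the database. Event fields loosely follow Debezium's envelope
# (op / before / after) but are simplified and invented for this example.
def apply_change(state: dict, event: dict) -> None:
    """Apply one change event to a local, keyed copy of the table."""
    op = event["op"]
    if op in ("c", "u", "r"):          # create, update, snapshot read
        row = event["after"]
        state[row["id"]] = row
    elif op == "d":                    # delete
        state.pop(event["before"]["id"], None)

if __name__ == "__main__":
    orders = {}
    change_stream = [
        {"op": "c", "before": None, "after": {"id": 1, "status": "NEW"}},
        {"op": "u", "before": {"id": 1, "status": "NEW"},
                    "after": {"id": 1, "status": "SHIPPED"}},
        {"op": "d", "before": {"id": 1, "status": "SHIPPED"}, "after": None},
    ]
    for event in change_stream:
        apply_change(orders, event)
        print(orders)
```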

DHG GrowthCast
Streaming Analytics for Manufacturers

Nov 18, 2020 · 8:31


DHG's Chief Data Officer Amit Arya discusses how streaming analytics can be a useful tool for manufacturers by utilizing real-time data for proactive insights and decision-making.

Hashmap on Tap
#3 Streaming Analytics

Mar 27, 2020 · 29:16


Kieran and Randy take a deeper dive into streaming analytics and how companies derive value from collecting, processing, and learning from their streaming data. On tap for today's episode: Eureka Heights Churro & White Claw.

Contact us: https://www.hashmapinc.com/datarebelsontap

Streaming Audio: a Confluent podcast about Apache Kafka
Apache Kafka and Apache Druid – The Perfect Pair ft. Rachel Pedreschi

Dec 23, 2019 · 50:12


As the head of global field engineering and community at Imply, Rachel Pedreschi is passionate about engaging both externally with customers and internally with departments all across the board, from sales to engineering. Rachel's involvement in the open source community focuses primarily on Apache Druid, a real-time, high-performance datastore that provides fast, sub-second analytics and complements another powerful open source project as well: Apache Kafka®. Together, Kafka and Druid provide real-time event streaming and high-performance streaming analytics with powerful visualizations.

EPISODE LINKS

  • How To Use Kafka and Druid to Tame Your Router Data
  • ETL and Event Streaming Explained ft. Stewart Bryson
  • Who is Abraham Wald?
  • How Not to Be Wrong: The Power of Mathematical Thinking
  • Join the Confluent Community Slack
  • Fully managed Apache Kafka as a service! Try free.

Data Engineering Podcast
Building The Materialize Engine For Interactive Streaming Analytics In SQL - Episode 112

Dec 22, 2019 · 48:07 · Transcription available


Transactional databases used in applications are optimized for fast reads and writes with relatively simple queries on a small number of records. Data warehouses are optimized for batched writes and complex analytical queries. Between those use cases there are varying levels of support for fast reads on quickly changing data. To address that need more completely the team at Materialize has created an engine that allows for building queryable views of your data as it is continually updated from the stream of changes being generated by your applications. In this episode Frank McSherry, chief scientist of Materialize, explains why it was created, what use cases it enables, and how it works to provide fast queries on continually updated data.
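The continually updated, queryable views described above can be illustrated with a toy example. This is not Materialize's engine (which is built on timely and differential dataflow); it only shows the underlying idea of incrementally maintaining a grouped count from a stream of +1/-1 changes instead of re-running the query. The key names and deltas are invented for the example.

```python
# Toy illustration of incremental view maintenance: keep a GROUP BY count
# up to date from a stream of (+1 / -1) changes instead of recomputing it.
from collections import defaultdict

class IncrementalCountByKey:
    """Maintains the result of SELECT key, count(*) ... GROUP BY key
    under a stream of insert/delete deltas."""
    def __init__(self):
        self.counts = defaultdict(int)

    def update(self, key, delta: int) -> None:
        self.counts[key] += delta      # +1 for an inserted row, -1 for a deleted row
        if self.counts[key] == 0:
            del self.counts[key]       # drop keys whose count falls to zero

    def query(self) -> dict:
        return dict(self.counts)       # always current, no batch recompute

if __name__ == "__main__":
    view = IncrementalCountByKey()
    for key, delta in [("checkout", +1), ("checkout", +1), ("refund", +1), ("checkout", -1)]:
        view.update(key, delta)
        print(view.query())
```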

Develomentor
Ep. 11 Biochemist to Principal Technologist, with Ellen Friedman

Nov 4, 2019 · 42:42


Our guest today is Ellen Friedman. Ellen is currently the Principal Technologist at MapR, a data platform company. And she took the scenic route to get there! These days, Ellen is well known as a keynote speaker, tech writer, and open source leader. But this transition didn't happen overnight. Listen to hear how Ellen went from a career as a biochemist to principal technologist! To find out more about careers in tech, click here. Ellen received a PhD from Rice University and started her career in scientific research. But her curiosity for cutting edge ideas and her passion for communication has created a truly unique voice in technology. For full show notes, click here.

CONNECT WITH ELLEN FRIEDMAN: LinkedIn · Twitter
CONNECT WITH GRANT INGERSOLL: LinkedIn · Twitter

Streaming Audio: a Confluent podcast about Apache Kafka
MySQL, Cassandra, BigQuery, and Streaming Analytics with Joy Gao

Oct 2, 2019 · 43:59


Joy Gao chats with Tim Berglund about all things related to streaming ETL: how it works, its benefits, and the implementation and operational challenges involved. She describes the streaming ETL architecture at WePay, from MySQL/Cassandra to BigQuery, using Apache Kafka®, Kafka Connect, and Debezium.

EPISODE LINKS

  • Cassandra Source Connector Documentation
  • Streaming Databases in Real Time with MySQL, Debezium, and Kafka
  • Streaming Cassandra at WePay
  • Change Data Capture with Debezium ft. Gunnar Morling
  • Join the Confluent Community Slack
  • Fully managed Apache Kafka as a service! Try free.

Microsoft Azure for Industry : Podcast
IoT with Streaming Data and Analytics and a Little Design Thinking with Element

Sep 1, 2018 · 27:01


This IoT episode ranges from oil and gas to Raspberry Pi, real-time streaming analytics, and even Design Thinking. Sameer Kalwani joins us to talk all things IoT, including the cultural changes data-informed decision making has on organizations.

Sameer Kalwani: A passion for infrastructure inspired Sameer to found Element, where he oversees product strategy, design, and marketing. Previously, he was CTO & Head of Product at Piramal Sarvajal, an innovative water utility with distributed water treatment plants. His patented solutions reduced Sarvajal's O&M costs by >60% and consumer water price by 60x. His products have been recognized with Frost & Sullivan's Enabling Technology Award of the Year and as FastCo's Top 10 Most Innovative Products. Earlier, Sameer consulted on IT M&A at Deloitte, managed product at Apcera, finance at Goldman Sachs, and manufacturing at Visteon. Sameer holds a BS in Electrical Engineering and a BS in Cognitive Psychology from the University of Illinois, Urbana-Champaign, and an MBA from Harvard Business School. He and his wife Ashi enjoy exploring regional cuisines. Find Element on the web and follow them on LinkedIn. Follow Sameer on LinkedIn and Twitter.

Diego Tamburini: Diego is the Principal Industry Lead for Azure Manufacturing in the Microsoft Industry Experience team, where he focuses on developing technical content to help manufacturing companies and software developers deliver their solutions on Azure, at scale. He also champions partners who deliver manufacturing solutions using Azure. Follow Diego on LinkedIn and Twitter.

T3chfest
Entendiendo el mundo real: streaming analytics para IoT - Álvaro Egea

Feb 27, 2018 · 31:43


If you want to watch the video with slides: https://www.youtube.com/watch?v=e7cqkS2uUqo Monitoring for IoT. Machine learning. Artificial intelligence. Distributed systems.

Roaring Elephant
Episode 20 – Dave’s Hadoop Summit San Jose 2016 Retrospective – Part 2

Jul 19, 2016 · 66:28


In this second part, we discuss the sessions that Dave attended at the San Jose Hadoop Summit and we go in depth on some related topics. Since we ran over an hour with the main topic, and we did not want to make this a three-parter, we decided to forgo the questions from the audience just this one time...

00:00 Recent events: Vacation time! Edx.Org Big Data Courses

04:00 Dave's Hadoop Summit San Jose 2016 Retrospective - Part 2

  • Session 1: End-to-End Processing of 3.7 Million Telemetry Events per Second Using Lambda Architecture, by Saurabh Mishra @ Hortonworks and Raghavendra Nandagopal @ Symantec. Talking point: Hero-culture, or why nobody wants to talk about failure anymore.
  • Session 2: Top Three - Big Data Governance Issues and How Apache ATLAS resolves it for the Enterprise, by Andrew Ahn @ Hortonworks. Talking point: Guaranteed Governance, who certifies the certificate?
  • Session 3: IoT, Streaming Analytics and Machine Learning: Delivering Real-Time Intelligence With Apache NiFi, by Paul Kent @ SAS and Dan Zaratsian @ SAS. Talking point: Commercial solutions versus build your own in open source.
  • Session 4: Productionizing Spark on YARN for ETL at Petabyte Scale, by Ashwin Shankar and Nezih Yigitbasi @ Netflix. Talking point: Is Hadoop still a low-cost commodity affair?
  • Session 5: Analyzing Telecom Fraud at Hadoop Scale, by Sanjay Vyas @ Diyotta. Talking point: Do commercial, proprietary products have a place at Hadoop Summit or are they just marketing fluff?

01:06:28 End

Please use the Contact Form on this blog or our twitter feed to send us your questions, or to suggest future episode topics you would like us to cover.

The Innovation Engine Podcast
Streaming Analytics & Data Visualization, with Chris Graham, Dan Greene, & Sayantam Dey

Apr 11, 2016 · 28:38


On this episode of the podcast we'll look at the present and future of streaming analytics & data visualization. We talk about a tool that a 3Pillar Global team built for digesting vast quantities of streaming video analytics data and for visualizing that data in a meaningful way. The tool we'll discuss was built as a proof of concept for a major public television network, and you'll hear about the motivation behind its creation and other potential applications for the data visualization aspect of the tool.

The guests on this episode of the podcast are Dan Greene, Chris Graham, and Sayantam Dey. Dan Greene is the Director of Architecture at 3Pillar Global. He has 18 years of software design and development experience, with software and product architecture expertise in areas including eCommerce, Geospatial Analysis, SOA architecture, Big Data, and Cloud Computing. Chris Graham leads the Media & Entertainment vertical at 3Pillar Global. He works with clients across the Media and Entertainment space including Broadcast Media, Online Media, and Publications and Newspapers. Sayantam Dey is the Director of 3Pillar's Advanced Technology Group. He has been with 3Pillar for a decade, delivering enterprise products and building frameworks for accelerated software development and testing in various technologies. His current areas of interest are data analytics, messaging systems, and cloud services.

Show Notes

  • Read Sayantam & Dan's piece on the technology behind the POC: http://www.3pillarglobal.com/insights/real-time-analytics-visualization-apache-spark
  • Follow Chris Graham on Twitter: https://twitter.com/ChrisGraham
  • Follow Dan Greene on Twitter: https://twitter.com/MrDanGreene

The Cloudcast
The Cloudcast #184 - Streaming Analytics for Distributed Applications

Mar 29, 2015 · 25:12


Aaron talks to Karthik Rau (@krrau; Founder/CEO of @SignalFx) about the launch of their advanced monitoring platform, doing streaming analytics for distributed applications, the new role of developers, and the mindset of technology-centric business groups.

Links:

  • SignalFx Website - https://signalfx.com/
  • SignalFx REST API - https://support.signalfx.com/hc/en-us/articles/201270489
  • TheNewStack covers SignalFx launch - http://thenewstack.io/signalfx-a-saas-to-monitor-apps-at-any-scale/
  • Ben's Blog - http://www.bhorowitz.com/the_past_and_future_of_systems_management

Music Credit: Nine Inch Nails (nin.com)