Here is the story behind the PDS1. David Wright talks to the man who came up with it, Peter Jones – and asks the question: is the Proficient Deer Stalking Certificate 1 for you? You can find the PDS1 in our shop, priced at £395. For more ways to listen to this podcast, visit FieldsportsChannel.tv/fieldsportschannelpodcast131
Have you ever considered how a single server can support countless applications and workloads at once? In this episode, hosts Lois Houston and Nikita Abraham, together with Principal OCI Instructor Orlando Gentil, explore the sophisticated technologies that make this possible in modern cloud data centers. They discuss the roles of hypervisors, virtual machines, and containers, explaining how these innovations enable efficient resource sharing, robust security, and greater flexibility for organizations. Cloud Tech Jumpstart: https://mylearn.oracle.com/ou/course/cloud-tech-jumpstart/152992 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. -------------------------------------------------- Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! For the last two weeks, we've been talking about different aspects of cloud data centers. In this episode, Orlando Gentil, Principal OCI Instructor at Oracle University, joins us once again to discuss how virtualization, through hypervisors, virtual machines, and containers, has transformed data centers. 00:58 Lois: That's right, Niki. We'll begin with a quick look at the history of virtualization and why it became so widely adopted. Orlando, what can you tell us about that? Orlando: To truly grasp the power of virtualization, it's helpful to understand its journey from its humble beginnings with mainframes to its pivotal role in today's cloud computing landscape. It might surprise you, but virtualization isn't a new concept. Its roots go back to the 1960s with mainframes. In those early days, the primary goal was to isolate workloads on a single powerful mainframe, allowing different applications to run without interfering with each other. As we moved into the 1990s, the challenge shifted to underutilized physical servers. Organizations often had numerous dedicated servers, each running a single application, leading to significant waste of computing resources. This led to the emergence of virtualization as we know it today, primarily from the 1990s to the 2000s. The core idea here was to run multiple isolated operating systems on a single physical server. This innovation dramatically improved the resource utilization and laid the technical foundation for cloud computing, enabling the scalable and flexible environments we rely on today. 02:26 Nikita: Interesting. So, from an economic standpoint, what pushed traditional data centers to change and opened the door to virtualization? Orlando: In the past, running applications often meant running them on dedicated physical servers. This led to a few significant challenges. First, more hardware purchases. Every new application, every new project often required its own dedicated server. This meant constantly buying new physical hardware, which quickly escalated capital expenditure. 
Secondly, and hand-in-hand with more servers, came higher power and cooling costs. Each physical server consumed power and generated heat, necessitating significant investment in electricity and cooling infrastructure. The more servers, the higher these operational expenses became. And finally, a major problem was unused capacity. Despite investing heavily in these physical servers, it was common for them to run well below their full capacity. Applications typically didn't need 100% of a server's resources all the time. This meant we were wasting valuable compute power, memory, and storage, effectively wasting resources and diminishing the return on investment from those expensive hardware purchases. These economic pressures became a powerful incentive to find more efficient ways to utilize data center resources, setting the stage for technologies like virtualization. 04:05 Lois: I guess we can assume virtualization emerged as a financial game-changer. So, what kind of economic efficiencies did virtualization bring to the table? Orlando: From a CapEx or capital expenditure perspective, companies spent less on servers and data center expansion. From an OpEx or operational expenditure perspective, fewer machines meant lower electricity, cooling, and maintenance costs. It also sped up provisioning. Spinning up a new VM took minutes, not days or weeks. That improved agility and reduced the operational workload on IT teams. It also created a more scalable, cost-efficient foundation, which made virtualization not just a technical improvement, but a financial turning point for data centers. This economic efficiency is exactly what cloud providers like Oracle Cloud Infrastructure are built on, using virtualization to deliver scalable, pay-as-you-go infrastructure. 05:09 Nikita: Ok, Orlando. Let's get into the core components of virtualization. To start, what exactly is a hypervisor? Orlando: A hypervisor is a piece of software, firmware, or hardware that creates and runs virtual machines, also known as VMs. Its core function is to allow multiple virtual machines to run concurrently on a single physical host server. It acts as a virtualization layer, abstracting the physical hardware resources like CPU, memory, and storage, and allocating them to each virtual machine as needed, ensuring they can operate independently and securely. 05:49 Lois: And are there types of hypervisors? Orlando: There are two primary types of hypervisors. Type 1 hypervisors, often called bare metal hypervisors, run directly on the host server's hardware. This means they interact directly with the physical resources, offering high performance and security. Examples include VMware ESXi, Oracle VM Server, and KVM on Linux. They are commonly used in enterprise data centers and cloud environments. In contrast, Type 2 hypervisors, also known as hosted hypervisors, run on top of an existing operating system like Windows or macOS. They act as an application within that operating system. Popular examples include VirtualBox, VMware Workstation, and Parallels. These are typically used for personal computing or development purposes, where you might run multiple operating systems on your laptop or desktop. 06:55 Nikita: We've spoken about the foundation provided by hypervisors. So, can we now talk about the virtual entities they manage: virtual machines? What exactly is a virtual machine and what are its fundamental characteristics?
Orlando: A virtual machine is essentially a software-based virtual computer system that runs on a physical host computer. The magic happens with the hypervisor. The hypervisor's job is to create and manage these virtual environments, abstracting the physical hardware so that multiple VMs can share the same underlying resources without interfering with each other. Each VM operates like a completely independent computer with its own operating system and applications. 07:40 Lois: What are the benefits of this? Orlando: Each VM is isolated from the others. If one VM crashes or encounters an issue, it doesn't affect the other VMs running on the same physical host. This greatly enhances stability and security. A powerful feature is the ability to run different operating systems side-by-side on the very same physical host. You could have a Windows VM, a Linux VM, and even other specialized operating systems, all running simultaneously. Consolidating workloads directly addresses the unused capacity problem. Instead of one application per physical server, you can now run multiple workloads, each in its own VM, on a single powerful physical server. This dramatically improves hardware utilization, reducing the need for constant new hardware purchases and lowering power and cooling costs. And by consolidating workloads, virtualization makes it possible for cloud providers to dynamically create and manage vast pools of computing resources. This allows users to quickly provision and scale virtual servers on demand, tapping into these shared pools of CPU, memory, and storage as needed, rather than being tied to a single physical machine. 09:10 Oracle University's Race to Certification 2025 is your ticket to free training and certification in today's hottest technology. Whether you're starting with Artificial Intelligence, Oracle Cloud Infrastructure, Multicloud, or Oracle Data Platform, this challenge covers it all! Learn more about your chance to win prizes and see your name on the Leaderboard by visiting education.oracle.com/race-to-certification-2025. That's education.oracle.com/race-to-certification-2025. 09:54 Nikita: Welcome back! Orlando, let's move on to containers. Many see them as a lighter, more agile way to build and run applications. What's your take? Orlando: A container packages an application and all its dependencies, like libraries and other binaries, into a single, lightweight executable unit. Unlike a VM, a container shares the host operating system's kernel, running on top of the container runtime process. This architectural difference provides several key advantages. Containers are incredibly portable. They can be taken virtually anywhere, from a developer's laptop to a cloud environment, and run consistently, eliminating "it works on my machine" issues. Because containers share the host OS kernel, they don't need to bundle a full operating system themselves. This results in significantly smaller footprints and less administration overhead compared to VMs. They are also faster to start. Without the need to boot a full operating system, containers can start up in seconds, or even milliseconds, providing rapid deployment and scaling capabilities. 11:12 Nikita: Ok. Throughout our conversation, you've spoken about the various advantages of virtualization, but let's consolidate them now. Orlando: From a security standpoint, virtualization offers several crucial benefits. Each VM operates in its own isolated sandbox.
This means if one VM experiences a security breach, the impact is generally contained to that single virtual machine, significantly limiting the spread of potential threats across your infrastructure. Containers also provide some isolation. Virtualization allows for rapid recovery. This is invaluable for disaster recovery or undoing changes after a security incident. You can implement separate firewalls, access rules, and network configuration for each VM. This granular control reduces the overall exposure and attack surface across your virtualized environments, making it harder for malicious actors to move laterally. Beyond security, virtualization also brings significant advantages in terms of operational and agility benefits for IT management. Virtualization dramatically improves operational efficiency and agility. Things are faster. With virtualization, you can provision new servers or containers in minutes rather than days or weeks. This speed allows for quicker deployment of applications and services. It becomes much simpler to deploy consistent environment using templates and preconfigured VM images or containers. This reduces errors and ensures uniformity across your infrastructure. It's more scalable. Virtualization makes your infrastructure far more scalable. You can reshape VMs and containers to meet changing demands, ensuring your resources align precisely with your needs. These operational benefits directly contribute to the power of cloud computing, especially when we consider virtualization's role in enabling cloud and scalability. Virtualization is the very backbone of modern cloud computing, fundamentally enabling its scalability. It allows multiple virtual machines to run on a single physical server, maximizing hardware utilization, which is essential for cloud providers. This capability is core of infrastructure as a service offerings, where users can provision virtualized compute resources on demand. Virtualization makes services globally scalable. Resources can be easily deployed and managed across different geographic regions to meet worldwide demand. Finally, it provides elasticity, meaning resources can be automatically scaled up or down in response to fluctuating workloads, ensuring optimal performance and cost efficiency. 14:21 Lois: That's amazing. Thank you, Orlando, for joining us once again. Nikita: Yeah, and remember, if you want to learn more about the topics we covered today, go to mylearn.oracle.com and search for the Cloud Tech Jumpstart course. Lois: Well, that's all we have for today. Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 14:40 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
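To make the VM-versus-container comparison from this episode a little more concrete, here is a minimal Python sketch that starts a short-lived container with the Docker SDK. It assumes the `docker` package is installed and a local Docker engine is running; the image name and command are purely illustrative, not anything referenced in the episode.

```python
# A sketch of the container workflow described above: the image bundles the
# application and its dependencies, the host OS kernel is shared, and startup
# takes seconds rather than the minutes a full VM needs to boot its own OS.
# Assumes: `pip install docker` and a running Docker engine.
import docker

client = docker.from_env()  # connect to the local container runtime

output = client.containers.run(
    "alpine:3.19",                        # small illustrative image
    ["echo", "hello from a container"],   # command to run inside it
    remove=True,                          # clean up the container on exit
)
print(output.decode().strip())
```

Running the same experiment as a VM would mean provisioning a guest OS image, booting it under a hypervisor such as KVM or VirtualBox, and waiting for a full operating system to start, which is exactly the overhead the episode contrasts containers against.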
In this episode of Sports the NEMO Way, we bring the best Mets to the table for discussion. This disappointing season had the Mets looking like kings at the start, then looking like they'd had a little too much to drink, stumbling out of the club trying to find an Uber to their ex-girlfriend's house, who wants nothing to do with them. So with that said, let's pump up the fan base and show them some love with the all-time great players they used to have and can be happy about. Get ready for this awesome Mets episode, and see if Brock can make some headway and get back on top in trivia. Have a great day and we'll see you all next week... Peace.
Have you ever wondered where all your digital memories, work projects, or favorite photos actually live in the cloud? In this episode, Lois Houston and Nikita Abraham are joined by Principal OCI Instructor Orlando Gentil to discuss cloud storage. They explore how data is carefully organized, the different ways it can be stored, and what keeps it safe and easy to find. Cloud Tech Jumpstart: https://mylearn.oracle.com/ou/course/cloud-tech-jumpstart/152992 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ------------------------------------------------------ Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Nikita: Welcome to the Oracle University Podcast! I'm Nikita Abraham, Team Lead of Editorial Services with Oracle University, and with me is Lois Houston, Director of Innovation Programs. Lois: Hey there! Last week, we spoke about the differences between traditional and cloud data centers, and covered components like CPU, RAM, and operating systems. If you haven't listened to the episode yet, I'd suggest going back and listening to it before you dive into this one. Nikita: Joining us again is Orlando Gentil, Principal OCI Instructor at Oracle University, and we're going to ask him about another fundamental concept: storage. 01:04 Lois: That's right, Niki. Hi Orlando! Thanks for being with us again today. You introduced cloud data centers last week, but tell us, how is data stored and accessed in these centers? Orlando: At a fundamental level, storage is where your data resides persistently. Data stored on a storage device is accessed by the CPU and, for specialized tasks, the GPU. The RAM acts as a high-speed intermediary, temporarily holding data that the CPU and the GPU are actively working on. This cyclical flow ensures that applications can effectively retrieve, process, and store information, forming the backbone for our computing operations in the data center. 01:52 Nikita: But how is data organized and controlled on disks? Orlando: To effectively store and manage data on physical disks, a structured approach is required, which is defined by file systems and permissions. The process began with disks. These are the raw physical storage devices. Before data can be written to them, disks are typically divided into partitions. A partition is a logical division of a physical disk that acts as if it were a separated physical disk. This allows you to organize your storage space and even install multiple operating systems on a single drive. Once partitions are created, they are formatted with a file system. 02:40 Nikita: Ok, sorry but I have to stop you there. Can you explain what a file system is? And how is data organized using a file system? Orlando: The file system is the method and the data structure that an operating system uses to organize and manage files on storage devices. It dictates how data is named, is stored, retrieved, and managed on the disk, essentially providing the roadmap for data. Common file systems include NTFS for Windows and ext4 or XFS for Linux. 
Within this file system, data is organized hierarchically into directories, also known as folders. These containers help to logically group related files, which are the individual units of data, whether they are documents, images, videos, or applications. Finally, overseeing this entire organization are permissions. 03:42 Lois: And what are permissions? Orlando: Permissions define who can access specific files and directories and what actions they are allowed to perform-- for example, read, write, or execute. This access control, often managed by user, group, and other permissions, is fundamental for security, data integrity, and multi-user environments within a data center. 04:09 Lois: Ok, now that we have a good understanding of how data is organized logically, can we talk about how data is stored locally within a server? Orlando: Local storage refers to storage devices directly attached to a server or computer. The three common types are hard disk drives, solid state drives, and NVMe drives. Hard disk drives are traditional storage devices using spinning platters to store data. They offer large capacity at a lower cost per gigabyte, making them suitable for bulk data storage when high performance isn't the top priority. Unlike hard disks, solid state drives use flash memory to store data, similar to USB drives but on a larger scale. They provide significantly faster read and write speeds, better durability, and lower power consumption than hard disks, making them ideal for operating systems, applications, and frequently accessed data. Non-Volatile Memory Express is a communication interface specifically designed for solid state drives that connects directly to the PCI Express bus. NVMe offers even faster performance than traditional SATA-based solid state drives by reducing latency and increasing bandwidth, making it the top choice for demanding workloads that require extreme speed, such as high-performance databases and AI applications. Each type serves different performance and cost requirements within a data center. While local storage is essential for immediate access, data centers also heavily rely on storage that isn't directly attached to a single server. 05:59 Lois: I'm guessing you're hinting at remote storage. Can you tell us more about that, Orlando? Orlando: Remote storage refers to data storage solutions that are not physically connected to the server or client accessing them. Instead, they are accessed over the network. This setup allows multiple clients or servers to share access to the same storage resources, centralizing data management and improving data availability. This architecture is fundamental to cloud computing, enabling vast pools of shared storage that can be dynamically provisioned to various users and applications. 06:35 Lois: Let's talk about the common forms of remote storage. Can you run us through them? Orlando: One of the most common and accessible forms of remote storage is Network Attached Storage or NAS. NAS is a dedicated file storage device connected to a network that allows multiple users and client devices to retrieve data from a centralized disk capacity. It's essentially a server dedicated to serving files. A client connects to the NAS over the network, and the NAS then provides access to files and folders. NAS devices are ideal for scenarios requiring shared file access, such as document collaboration, centralized backups, or serving media files, making them very popular in both home and enterprise environments.
While NAS provides file-level access over a network, some applications, especially those requiring high performance and direct block-level access to storage, need a different approach. 07:38 Nikita: And what might this approach be? Orlando: iSCSI, or Internet Small Computer System Interface, which provides block-level storage over an IP network. iSCSI is a standard that allows the SCSI protocol, traditionally used for local storage, to be sent over IP networks. Essentially, it enables servers to access storage devices as if they were directly attached even though they are located remotely on the network. This means it can leverage standard Ethernet infrastructure, making it a cost-effective solution for creating high-performance, centralized storage accessible over an existing network. It's particularly useful for server virtualization and database environments where block-level access is preferred. While iSCSI provides block-level access over standard IP, for environments demanding even higher performance, lower latency, and greater dedicated throughput, a specialized network is often deployed. 08:47 Nikita: And what's this specialized network called? Orlando: Storage Area Network or SAN. A Storage Area Network or SAN is a high-speed network specifically designed to provide block-level access to consolidated shared storage. Unlike NAS, which provides file-level access, a SAN presents storage volumes to servers as if they were local disks, allowing for very high performance for applications like databases and virtualized environments. While iSCSI SANs use Ethernet, many high-performance SANs utilize Fibre Channel for even faster and more reliable data transfer, making them a cornerstone of enterprise data centers where performance and availability are paramount. 09:42 Oracle University's Race to Certification 2025 is your ticket to free training and certification in today's hottest technology. Whether you're starting with Artificial Intelligence, Oracle Cloud Infrastructure, Multicloud, or Oracle Data Platform, this challenge covers it all! Learn more about your chance to win prizes and see your name on the Leaderboard by visiting education.oracle.com/race-to-certification-2025. That's education.oracle.com/race-to-certification-2025. 10:26 Nikita: Welcome back! Orlando, are there any other popular storage paradigms we should know about? Orlando: Beyond file-level and block-level storage, cloud environments have popularized another flexible and highly scalable storage paradigm, object storage. Object storage is a modern approach to storing data, treating each piece of data as a distinct, self-contained unit called an object. Unlike file systems that organize data in a hierarchy or block storage that breaks data into fixed-size blocks, object storage manages data as flat, unstructured objects. Each object is stored with unique identifiers and rich metadata, making it highly scalable and flexible for massive amounts of data. This service handles the complexity of storage, providing access to vast repositories of data. Object storage is ideal for use cases like cloud-native applications, big data analytics, content distribution, and large-scale backups thanks to its immense scalability, durability, and cost effectiveness. While object storage is excellent for frequently accessed data in rapidly growing data sets, sometimes data needs to be retained for very long periods but is accessed infrequently.
For these scenarios, a specialized low-cost storage tier, known as archive storage, comes into play. 12:02 Lois: And what's that exactly? Orlando: Archive storage is specifically designed for long-term backup and retention of data that you rarely, if ever, access. This includes critical information, like old records, compliance data that needs to be kept for regulatory reasons, or disaster recovery backups. The key characteristic of archive storage is an extremely low cost per gigabyte, achieved by optimizing for infrequent access rather than speed. Historically, tape backup systems were the common solution for archiving, where data from a data center is moved to tape. In modern cloud environments, this has evolved into cloud backup solutions. Cloud-based archiving leverages highly cost-effective cloud storage tiers that are purpose-built for long-term retention, providing a scalable and often more reliable alternative to physical tapes. 13:05 Lois: Thank you, Orlando, for taking the time to talk to us about the hardware and software layers of cloud data centers. This information will surely help our listeners to make informed decisions about cloud infrastructure to meet their workload needs in terms of performance, scalability, cost, and management. Nikita: That's right, Lois. And if you want to learn more about what we discussed today, head over to mylearn.oracle.com and search for the Cloud Tech Jumpstart course. Lois: In our next episode, we'll take a look at more of the fundamental concepts within modern cloud environments, such as Hypervisors, Virtualization, and more. I can't wait to learn more about it. Until then, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 13:47 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
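As a companion to the object storage discussion in this episode, here is a small Python sketch using the OCI SDK's Object Storage client. It is only an illustration of the put-and-get-by-name model described above: the bucket and object names are made up, and it assumes the `oci` package is installed with a valid ~/.oci/config profile.

```python
# A sketch of the object storage model described above: each object is a
# self-contained unit addressed by name within a bucket, with metadata kept
# alongside it instead of a file system hierarchy.
# Assumes: `pip install oci` and a configured ~/.oci/config profile;
# the bucket and object names below are hypothetical.
import oci

config = oci.config.from_file()                          # default ~/.oci/config
client = oci.object_storage.ObjectStorageClient(config)

namespace = client.get_namespace().data                  # tenancy's Object Storage namespace

# Store an object: any bytes, whether a document, an image, or a backup chunk.
client.put_object(namespace, "demo-bucket", "reports/2025/summary.txt",
                  b"quarterly summary goes here")

# Retrieve it later by the same flat name.
obj = client.get_object(namespace, "demo-bucket", "reports/2025/summary.txt")
print(obj.data.content.decode())
```

The archive storage discussed at the end of the episode is typically exposed as a lower-cost tier on the same kind of bucket, though archived objects generally have to be restored before they can be read.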
Living and Working in Spain with David Wright: The Ultimate Expat Guide
Curious about what really goes on inside a cloud data center? In this episode, Lois Houston and Nikita Abraham chat with Principal OCI Instructor Orlando Gentil about how cloud data centers are transforming the way organizations manage technology. They explore the differences between traditional and cloud data centers, the roles of CPUs, GPUs, and RAM, and why operating systems and remote access matter more than ever. Cloud Tech Jumpstart: https://mylearn.oracle.com/ou/course/cloud-tech-jumpstart/152992 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ------------------------------------- Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! Today, we're covering the fundamentals you need to be successful in a cloud environment. If you're new to cloud, coming from a SaaS environment, or planning to move from on-premises to the cloud, you won't want to miss this. With us today is Orlando Gentil, Principal OCI Instructor at Oracle University. Hi Orlando! Thanks for joining us. 01:01 Lois: So Orlando, we know that Oracle has been a pioneer of cloud technologies and has been pivotal in shaping modern cloud data centers, which are different from traditional data centers. For our listeners who might be new to this, could you tell us what a traditional data center is? Orlando: A traditional data center is a physical facility that houses an organization's mission critical IT infrastructure, including servers, storage systems, and networking equipment, all managed on site. 01:32 Nikita: So why would anyone want to use a cloud data center? Orlando: The traditional model requires significant upfront investment in physical hardware, which you are then responsible for maintaining along with the underlying infrastructure like physical security, HVAC, backup power, and communication links. In contrast, cloud data centers offer a more agile approach. You essentially rent the infrastructure you need, paying only for what you use. In the traditional data center, scaling resources up and down can be a slow and complex process. On cloud data centers, scaling is automated and elastic, allowing resources to adjust dynamically based on demand. This shift allows business to move their focus from the constant upkeep of infrastructure to innovation and growth. The move represents a shift from maintenance to momentum, enabling optimized costs and efficient scaling. This fundamental shift is how IT infrastructure is managed and consumed, and precisely what we mean by moving to the cloud. 02:39 Lois: So, when we talk about moving to the cloud, what does it really mean for businesses today? Orlando: Moving to the cloud represents the strategic transition from managing your own on-premise hardware and software to leveraging internet-based computing services provided by a third-party. 
This involves migrating your applications, data, and IT operations to a cloud environment. This transition typically aims to reduce operational overhead, increase flexibility, and enhance scalability, allowing organizations to focus more on their core business functions. 03:17 Nikita: Orlando, what's the “brain” behind all this technology? Orlando: A CPU or Central Processing Unit is the primary component that performs most of the processing inside the computer or server. It performs calculations handling the complex mathematics and logic that drive all applications and software. It processes instructions, running tasks, and operations in the background that are essential for any application. A CPU is critical for performance, as it directly impacts the overall speed and efficiency of the data center. It also manages system activities, coordinating user input, various application tasks, and the flow of data throughout the system. Ultimately, the CPU drives data center workloads from basic server operations to powering cutting edge AI applications. 04:10 Lois: To better understand how a CPU achieves these functions and processes information so efficiently, I think it's important for us to grasp its fundamental architecture. Can you briefly explain the fundamental architecture of a CPU, Orlando? Orlando: When discussing CPUs, you will often hear about sockets, cores, and threads. A socket refers to the physical connection on the motherboard where a CPU chip is installed. A single server motherboard can have one or more sockets, each holding a CPU. A core is an independent processing unit within a CPU. Modern CPUs often have multiple cores, enabling them to handle several instructions simultaneously, thus increasing processing power. Think of it as having multiple mini CPUs on a single chip. Threads are virtual components that allow a single CPU core to handle multiple sequence of instructions or threads concurrently. This technology, often called hyperthreading, makes a single core appear as two logical processors to the operating system, further enhancing efficiency. 05:27 Lois: Ok. And how do CPUs process commands? Orlando: Beyond these internal components, CPUs are also designed based on different instruction set architectures which dictate how they process commands. CPU architectures are primarily categorized in two designs-- Complex Instruction Set Computer or CISC and Reduced Instruction Set Computer or RISC. CISC processors are designed to execute complex instructions in a single step, which can reduce the number of instructions needed for a task, but often leads to a higher power consumption. These are commonly found in traditional Intel and AMD CPUs. In contrast, RISC processors use a simpler, more streamlined set of instructions. While this might require more steps for a complex task, each step is faster and more energy efficient. This architecture is prevalent in ARM-based CPUs. 06:34 Are you looking to boost your expertise in enterprise AI? Check out the Oracle AI Agent Studio for Fusion Applications Developers course and professional certification—now available through Oracle University. This course helps you build, customize, and deploy AI Agents for Fusion HCM, SCM, and CX, with hands-on labs and real-world case studies. Ready to set yourself apart with in-demand skills and a professional credential? Learn more and get started today! Visit mylearn.oracle.com for more details. 07:09 Nikita: Welcome back! We were discussing CISC and RISC processors. 
So Orlando, where are they typically deployed? Are there any specific computing environments and use cases where they excel? Orlando: On the CISC side, you will find them powering enterprise virtualization and server workloads, such as bare metal hypervisors and large databases where complex instructions can be efficiently processed; high-performance computing that includes demanding simulations, intricate analysis, and many traditional machine learning systems; and enterprise software suites and business applications like ERP, CRM, and other complex enterprise systems that benefit from fewer steps per instruction. Conversely, RISC architectures are often preferred for cloud-native workloads such as Kubernetes clusters, where simpler, faster instructions and energy efficiency are paramount for distributed computing; mobile device management and edge computing, including cell phones and IoT devices where power efficiency and compact design are critical; and cost-optimized cloud hosting supporting distributed workloads where the cumulative energy savings and simpler design lead to more economical operations. The choice between CISC and RISC depends heavily on the specific workload and performance requirements. While CPUs are versatile generalists, handling a broad range of tasks, modern data centers also heavily rely on another crucial processing unit for specialized workloads. 08:54 Lois: We've spoken a lot about CPUs, but our conversation would be incomplete without understanding what a Graphics Processing Unit is and why it's important. What can you tell us about GPUs, Orlando? Orlando: A GPU or Graphics Processing Unit is distinct from a CPU. While the CPU is a generalist excelling at sequential processing and managing a wide variety of tasks, the GPU is a specialist. It is designed specifically for parallel, compute-heavy tasks. This means it can perform many calculations simultaneously, making it incredibly efficient for workloads like rendering graphics, scientific simulations, and especially in areas like machine learning and artificial intelligence, where massive parallel computation is required. In the modern data center, GPUs are increasingly vital for accelerating these specialized, data-intensive workloads. 09:58 Nikita: Besides the CPU and GPU, there's another key component that collaborates with these processors to facilitate efficient data access. What role does Random Access Memory play in all of this? Orlando: The core function of RAM is to provide faster access to information in use. Imagine your computer or server needing to retrieve data from a long-term storage device, like a hard drive. This process can be relatively slow. RAM acts as a temporary high-speed buffer. When your CPU or GPU needs data, it first checks RAM. If the data is there, it can be accessed almost instantaneously, significantly speeding up operations. This rapid access to frequently used data and programming instructions is what allows applications to run smoothly and systems to respond quickly, making RAM a critical factor in overall data center performance. While RAM provides quick access to active data, it's volatile, meaning data is lost when power is off. That's unlike persistent data storage, which holds the information that needs to remain available even after a system shuts down. 11:14 Nikita: Let's now talk about operating systems in cloud data centers and how they help everything run smoothly. Orlando, can you give us a quick refresher on what an operating system is, and why it is important for computing devices?
Orlando: At its core, an operating system, or OS, is the fundamental software that manages all the hardware and software resources on a computer. Think of it as a central nervous system that allows everything else to function. It performs several critical tasks, including managing memory, deciding which programs get access to memory and when, managing processes, allocating CPU time to different tasks and applications, managing files, organizing data on storage devices, handling input and output, facilitate communication between the computer and its peripherals, like keyboards, mice, and displays. And perhaps, most importantly, it provides the user interface that allows us to interact with the computer. 12:19 Lois: Can you give us a few examples of common operating systems? Orlando: Common operating system examples you are likely familiar with include Microsoft Windows and MacOS for personal computers, iOS and Android for mobile devices, and various distributions of Linux, which are incredibly prevalent in servers and increasingly in cloud environments. 12:41 Lois: And how are these operating systems specifically utilized within the demanding environment of cloud data centers? Orlando: The two dominant operating systems in data centers are Linux and Windows. Linux is further categorized into enterprise distributions, such as Oracle Linux or SUSE Linux Enterprise Server, which offer commercial support and stability, and community distributions, like Ubuntu and CentOS, which are developed and maintained by communities and are often free to use. On the other side, we have Windows, primarily represented by Windows Server, which is Microsoft's server operating system known for its robust features and integration with other Microsoft products. While both Linux and Windows are powerful operating systems, their licensing modes can differ significantly, which is a crucial factor to consider when deploying them in a data center environment. 13:43 Nikita: In what way do the licensing models differ? Orlando: When we talk about licensing, the differences between Linux and Windows become quite apparent. For Linux, Enterprise Distributions come with associated support fees, which can be bundled into the initial cost or priced separately. These fees provide access to professional support and updates. On the other hand, Community Distributions are typically free of charge, with some providers offering basic community-driven support. Windows server, in contrast, is a commercial product. Its license cost is generally included in the instance cost when using cloud providers or purchased directly for on-premise deployments. It's also worth noting that some cloud providers offer a bring your own license, or BYOL program, allowing organizations to use their existing Windows licenses in the cloud, which can sometimes provide cost efficiencies. 14:46 Nikita: Beyond choosing an operating system, are there any other important aspects of data center management? Orlando: Another critical aspect of data center management is how you remotely access and interact with your servers. Remote access is fundamental for managing servers in a data center, as you are rarely physically sitting in front of them. The two primary methods that we use are SSH, or secure shell, and RDP, remote desktop. Secure shell is widely used for secure command line access for Linux servers. It provides an encrypted connection, allowing you to execute commands, transfer files, and manage your servers securely from a remote location. 
The remote desktop protocol is predominantly used for graphical remote access to Windows servers. RDP allows you to see and interact with the server's desktop interface, just as if you were sitting directly in front of it, making it ideal for tasks that require a graphical user interface. 15:54 Lois: Thank you so much, Orlando, for shedding light on this topic. Nikita: Yeah, that's a wrap for today! To learn more about what we discussed, head over to mylearn.oracle.com and search for the Cloud Tech Jumpstart course. In our next episode, we'll take a close look at how data is stored and managed. Until then, this is Nikita Abraham… Lois: And Lois Houston, signing off! 16:16 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
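Since this episode closes on SSH and RDP, here is a minimal Python sketch of scripted SSH access using the paramiko library, which is one common way to automate the command-line administration Orlando describes. The hostname, username, and key path are placeholders, and it assumes `pip install paramiko`.

```python
# A sketch of SSH-based remote administration: an encrypted connection to a
# Linux server, one command executed remotely, and its output read back.
# Assumes: `pip install paramiko`; the host, user, and key path are placeholders.
import os
import paramiko

client = paramiko.SSHClient()
# For a quick demo only; in production, verify host keys instead of auto-adding.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    "server.example.com",                             # placeholder hostname
    username="opc",                                   # placeholder user
    key_filename=os.path.expanduser("~/.ssh/id_rsa"),
)

stdin, stdout, stderr = client.exec_command("uptime")  # run a command remotely
print(stdout.read().decode().strip())

client.close()
```

RDP, by contrast, is a graphical protocol, so it is normally used interactively through a desktop client rather than scripted this way.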
AI is reshaping industries at a rapid pace, but as its influence grows, so do the ethical concerns that come with it. This episode examines how AI is being applied across sectors such as healthcare, finance, and retail, while also exploring the crucial issue of ensuring that these technologies align with human values. In this conversation, Lois Houston and Nikita Abraham are joined by Hemant Gahankari, Senior Principal OCI Instructor, who emphasizes the importance of fairness, inclusivity, transparency, and accountability in AI systems. AI for You: https://mylearn.oracle.com/ou/course/ai-for-you/152601/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ---------------------------------------------------- Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hey everyone! In our last episode, we spoke about how Oracle integrates AI capabilities into its Fusion Applications to enhance business workflows, and we focused on Predictive, Generative, and Agentic AI. Lois: Today, we'll discuss the various applications of AI. This is the final episode in our AI series, and before we close, we'll also touch upon ethical and responsible AI. 01:01 Nikita: Taking us through all of this is Senior Principal OCI Instructor Hemant Gahankari. Hi Hemant! AI is pretty much everywhere today. So, can you explain how it is being used in industries like retail, hospitality, health care, and so on? Hemant: AI isn't just for sci-fi movies anymore. It's helping doctors spot diseases earlier and even discover new drugs faster. Imagine an AI that can look at an X-ray and say, hey, there is something sketchy here before a human even notices. Wild, right? Banks and fintech companies are all over AI. Fraud detection. AI has got it covered. Those robo advisors managing your investments? That's AI too. Ever noticed how e-commerce companies always seem to know what you want? That's AI studying your habits and nudging you towards that next purchase or binge watch. Factories are getting smarter. AI predicts when machines will fail so they can fix them before everything grinds to a halt. Less downtime, more efficiency. Everyone wins. Farming has gone high tech. Drones and AI analyze crops, optimize water use, and even help with harvesting. Self-driving cars get all the hype, but even your everyday GPS uses AI to dodge traffic jams. And if AI can save me from sitting in bumper-to-bumper traffic, I'm all for it. 02:40 Nikita: Agreed! Thanks for that overview, but let's get into specific scenarios within each industry. Hemant: Let us take a scenario in the retail industry-- a retail clothing line with dozens of brick-and-mortar stores. Maintaining proper inventory levels in stores and regional warehouses is critical for retailers. In this low-margin business, being out of a popular product is especially challenging during sales and promotions. 
Managers want to delight shoppers and increase sales but without overbuying. That's where AI steps in. The retailer has multiple information sources, ranging from point-of-sale terminals to warehouse inventory systems. This data can be used to train a forecasting model that can make predictions, such as a demand increase due to a holiday or planned marketing promotion, and determine the time required to acquire and distribute the extra inventory. Most ERP-based forecasting systems can produce sophisticated reports. A generative AI report writer goes further, creating custom plain-language summaries of these reports tailored for each store, instructing managers about how to maximize sales of well-stocked items while mitigating possible shortages. 04:11 Lois: Ok. How is AI being used in the hospitality sector, Hemant? Hemant: Let us take an example of a hotel chain that depends on positive ratings on social media and review websites. One common challenge they face is keeping track of online reviews, leading to missed opportunities to engage unhappy customers complaining on social media. Hotel managers don't know what's being said fast enough to address problems in real time. Here, AI can be used to create a large data set from the tens of thousands of previously published online reviews. A textual language AI system can perform a sentiment analysis across the data to determine a baseline that can be periodically re-evaluated to spot trends. Data scientists could also build a model that correlates these textual messages and their sentiments against specific hotel locations and other factors, such as weather. Generative AI can extract valuable suggestions and insights from both positive and negative comments. 05:27 Nikita: That's great. And what about Financial Services? I know banks use AI quite often to detect fraud. Hemant: Unfortunately, fraud can creep into any part of a bank's retail operations. Fraud can happen with online transactions, from a phone or browser, and at offsite ATMs too. Without trust, banks won't have customers or shareholders. Excessive fraud and delays in detecting it can violate financial industry regulations. Fraud detection combines AI technologies, such as computer vision to interpret scanned documents, document verification to authenticate IDs like driver's licenses, and machine learning to analyze patterns. These tools work together to assess the risk of fraud in each transaction within seconds. When the system detects a high risk, it triggers automated responses, such as placing holds on withdrawals or requesting additional identification from customers, to prevent fraudulent activity and protect both the business and its clients. 06:42 Nikita: Wow, interesting. And how is AI being used in the health industry, especially when it comes to improving patient care? Hemant: Medical appointments can be frustrating for everyone involved—patients, receptionists, nurses, and physicians. There are many time-consuming steps, including scheduling, checking in, interactions with the doctors, checking out, and follow-ups. AI can fix this problem by working through electronic health records to analyze lab results, paper forms, scans, and structured data, summarizing insights for doctors alongside the latest research and patient history. This helps practices reduce costs, boost earnings, and deliver faster, more personalized care. 07:32 Lois: Let's take a look at one more industry. How is manufacturing using AI?
Hemant: A factory that makes metal parts and other products uses both visual inspections and electronic means to monitor product quality. A part that fails to meet the requirements may be reworked or repurposed, or it may need to be scrapped. The factory seeks to maximize profits and throughput by shipping as much good material as possible, while minimizing waste by detecting and handling defects early. The way AI can help here is with the quality assurance process, which creates X-ray images. This data can be interpreted by computer vision, which can learn to identify cracks and other weak spots, after being trained on a large data set. In addition, problematic or ambiguous data can be highlighted for human inspectors. 08:36 Oracle University's Race to Certification 2025 is your ticket to free training and certification in today's hottest tech. Whether you're starting with Artificial Intelligence, Oracle Cloud Infrastructure, Multicloud, or Oracle Data Platform, this challenge covers it all! Learn more about your chance to win prizes and see your name on the Leaderboard by visiting education.oracle.com/race-to-certification-2025. That's education.oracle.com/race-to-certification-2025. 09:20 Nikita: Welcome back! AI can be used effectively to automate a variety of tasks to improve productivity, efficiency, and cost savings. But I'm sure AI has its constraints too, right? Can you talk about what happens if AI isn't able to echo human ethics? Hemant: AI can fail due to a lack of ethics. AI can spot patterns, not make moral calls. It doesn't feel guilt, understand context, or take responsibility. That is still up to us. Decisions are only as good as the data behind them. For example, health care AI underdiagnosing women because research data was mostly male. Artificial narrow intelligence tends to automate discrimination at scale. Recruiting AI downgraded resumes just because they contained the word "women's" (for example, women's chess club). Who is responsible when AI fails? For example, if a self-driving car hits someone, we cannot blame the car. Then who owns the failure? The programmer? The CEO? Can we really trust corporations or governments to have correctly programmed AI not to be evil? So, it's clear that AI needs oversight to function smoothly. 10:48 Lois: So, Hemant, how can we design AI in ways that respect and reflect human values? Hemant: Think of ethics like a tree. It needs all parts working together. Roots represent intent. That is our values and principles. The trunk stands for safeguards, our systems, and structures. And the branches are the outcomes we aim for. If the roots are shallow, the tree falls. If the trunk is weak, damage seeps through. The health of roots and trunk shapes the strength of our ethical outcomes. Fairness means nothing without ethical intent behind it. For example, a bank promotes its loan algorithm as fair. But it uses zip codes in decision-making, effectively penalizing people based on race. That's not fairness. That's harm disguised as data. Inclusivity depends on the intent sustainability. Inclusive design isn't just a check box. It needs a long-term commitment. For example, controllers for gamers with disabilities are only possible because of sustained R&D and intentional design choices. Without investment in inclusion, accessibility is left behind. Transparency depends on the safeguard robustness. Transparency is only useful if the system is secure and resilient.
For example, a medical AI may be explainable, but if it is vulnerable to hacking, transparency won't matter. Accountability depends on the safeguard privacy and traceability. You can't hold people accountable if there is no trail to follow. For example, after a fatal self-driving car crash, deleted system logs meant no one could be held responsible. Without auditability, accountability collapses. So remember, outcomes are what we see, but they rely on intent to guide priorities and safeguards to support execution. That's why humans must have a final say. AI has no grasp of ethics, but we do. 13:16 Nikita: So, what you're saying is ethical intent and robust AI safeguards need to go hand in hand if we are to truly leverage AI we can trust. Hemant: When it comes to AI, preventing harm is a must. Take self-driving cars, for example. Keeping pedestrians safe is absolutely critical, which means the technology has to be rock solid and reliable. At the same time, fairness and inclusivity can't be overlooked. If an AI system used for hiring learns from biased past data, say, mostly male candidates being hired, it can end up repeating those biases, shutting out qualified candidates unfairly. Transparency and accountability go hand in hand. Imagine a loan rejection if the AI's decision isn't clear or explainable. It becomes impossible for someone to challenge or understand why they were turned down. And of course, robustness supports fairness too. Loan approval systems need strong security to prevent attacks that could manipulate decisions and undermine trust. We must build AI that reflects human values and has safeguards. This makes sure that AI is fair, inclusive, transparent, and accountable. 14:44 Lois: Before we wrap, can you talk about why AI can fail? Let's continue with your analogy of the tree. Can you explain how AI failures occur and how we can address them? Hemant: Root elements like do not harm and sustainability are fundamental to ethical AI development. When these roots fail, the consequences can be serious. For example, a clear failure of do not harm is AI-powered surveillance tools misused by authoritarian regimes. This happens because there were no ethical constraints guiding how the technology was deployed. The solution is clear-- implement strong ethical use policies and conduct human rights impact assessment to prevent such misuse. On the sustainability front, training AI models can consume massive amount of energy. This failure occurs because environmental costs are not considered. To fix this, organizations are adopting carbon-aware computing practices to minimize AI's environmental footprint. By addressing these root failures, we can ensure AI is developed and used responsibly with respect for human rights and the planet. An example of a robustness failure can be a chatbot hallucinating nonexistent legal precedence used in court filings. This could be due to training on unverified internet data and no fact-checking layer. This can be fixed by grounding in authoritative databases. An example of a privacy failure can be AI facial recognition database created without user consent. The reason being no consent was taken for data collection. This can be fixed by adopting privacy-preserving techniques. An example of a fairness failure can be generated images of CEOs as white men and nurses as women, minorities. The reason being training on imbalanced internet images reflecting societal stereotypes. And the fix is to use diverse set of images. 
17:18 Lois: I think this would be incomplete if we don't talk about inclusivity, transparency, and accountability failures. How can they be addressed, Hemant? Hemant: An example of an inclusivity failure can be a voice assistant not understanding accents. The reason being training data lacked diversity. And the fix is to use inclusive data. An example of a transparency and accountability failure can be teachers could not challenge AI-generated performance scores due to opaque calculations. The reason being no explainability tools are used. The fix being high-impact AI needs human review pathways and explainability built in. 18:04 Lois: Thank you, Hemant, for a fantastic conversation. We got some great insights into responsible and ethical AI. Nikita: Thank you, Hemant! If you're interested in learning more about the topics we discussed today, head over to mylearn.oracle.com and search for the AI for You course. Until next time, this is Nikita Abraham…. Lois: And Lois Houston, signing off! 18:26 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
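As a concrete companion to the fairness checks Hemant keeps returning to, here is a small, self-contained Python sketch that compares a model's approval rates across groups, a simple demographic parity style audit. The records and the 80% threshold (a common rule of thumb, sometimes called the four-fifths rule) are illustrative only and not drawn from the episode.

```python
# A sketch of a basic fairness audit: compare approval rates across groups
# before trusting a model's outcomes. The decisions list and the 0.8 threshold
# are illustrative, not a real dataset or a prescribed standard.
from collections import defaultdict

decisions = [  # (group, approved) pairs a real audit would pull from decision logs
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

approved = defaultdict(int)
total = defaultdict(int)
for group, ok in decisions:
    total[group] += 1
    approved[group] += int(ok)

rates = {group: approved[group] / total[group] for group in total}
print("approval rates:", rates)

# Flag any group whose rate falls below 80% of the best-performing group's rate.
best = max(rates.values())
flagged = [group for group, rate in rates.items() if rate < 0.8 * best]
print("groups needing review:", flagged or "none")
```

A check like this does not settle whether a system is fair, but it provides the kind of auditable trail the episode argues accountability depends on.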
The New Yorker: The Writer's Voice - New Fiction from The New Yorker
David Wright Faladé reads his story “Amarillo Boulevard,” from the October 6, 2025, issue of the magazine. Wright Faladé, the recipient of a Zora Neale Hurston/Richard Wright Award, is the author of a nonfiction book, “Fire on the Beach: Recovering the Lost Story of Richard Etheridge and the Pea Island Lifesavers,” and the novels “Black Cloud Rising” and “The New Internationals,” which was published earlier this year. Learn about your ad choices: dovetail.prx.org/ad-choices
Want to make AI work for your business? In today's episode, Lois Houston and Nikita Abraham continue their discussion of AI in Oracle Fusion Applications by focusing on three key AI capabilities: predictive, generative, and agentic. Joining them is Principal Instructor Yunus Mohammed, who explains how predictive, generative, and agentic AI can optimize efficiency, support decision-making, and automate tasks—all without requiring technical expertise. AI for You: https://mylearn.oracle.com/ou/course/ai-for-you/152601/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ------------------------------------------------------------ Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Nikita: Welcome to the Oracle University Podcast! I'm Nikita Abraham, Team Lead: Editorial Services with Oracle University, and with me is Lois Houston, Director of Innovation Programs. Lois: Hi there! In our last episode, we explored the essential components of the Oracle AI stack and spoke about Oracle's suite of AI services. Nikita: Yeah, and in today's episode, we're going to go down a similar path and take a closer look at the AI functionalities within Oracle Fusion Applications. 00:53 Lois: With us today is Principal Instructor Yunus Mohammed. Hi Yunus! It's lovely to have you back with us. For anyone who doesn't already know, what are Oracle Fusion Cloud Applications? Yunus: Oracle Fusion Applications are a suite of cloud-based enterprise applications designed to run your business across finance, HR, supply chain, sales, service, and more, all on a unified platform. They are designed to help enterprises operate smarter and faster by embedding AI directly into business processes. That means better forecasts in finance, faster hiring decisions in HR, optimized supply chains, and more personalized customer experiences. 01:42 Nikita: And we know they've been built for today's fast-paced, AI-driven business environment. So, what are the different functional pillars within Oracle Fusion Apps? Yunus: The first one is ERP, Enterprise Resource Planning, which supports financials, procurement, and project management. It's the backbone of many organizations' day-to-day operations. HCM, or Human Capital Management, handles workforce-related processes such as hiring, payroll, performance, and talent development, helping HR teams operate more efficiently. SCM, or Supply Chain Management, enables businesses to manage logistics, inventory, suppliers, and manufacturing. It's particularly critical in industries with complex operations like retail and manufacturing. CX, or Customer Experience, covers the full customer lifecycle, including sales, marketing, and service. These modules help businesses connect with their customers more personally and proactively, whether through targeted campaigns or responsive support. 03:02 Lois: Yunus, what sets Fusion apart? Yunus: What sets Fusion apart is how these applications work seamlessly together.
They share data natively and continuously improve with AI and automation, giving you not just tools, but intelligence at scale. Oracle applications are built to be AI first, with a complete suite of finance, supply chain, manufacturing, HR, sales, service, and marketing applications that is tightly coupled with our industry and data intelligence applications. The easiest and most effective way to start building your organization's AI muscle is with AI embedded in Fusion Applications. For example, if a customer needs to return a defective product, the service representative simply clicks Ask Oracle for the answer. Since the AI agent is embedded in the application, it has contextual information about the customer, the order, and any special service contract or other details required for this process. The AI agent automatically figures out the return policy, including the options to send a replacement product immediately, offer a discount for the inconvenience, or expedite shipping. Another AI agent sends a personalized email confirming the details of the return, and a different AI agent creates the replacement order for fulfillment and shipping. Our AI-embedded Fusion Applications can automate an end-to-end business process, from service request to return order to fulfillment, shipping, and then accounting. These are pre-built and tested, so the worry and hard work are removed from the implementation point of view. They cover the core workflows, the tasks that form part of the organization's everyday operations, and users require no technical knowledge in these scenarios. 05:16 Lois: That's great! So, you don't need to be an AI expert or a data scientist to get going. Yunus: The outcomes come fast in business software, and context is everything. Just having the right information isn't enough. This is about having the information in the right place at the right time so it is instantly actionable. These capabilities are ready from day one and can be optimized over time. They are powerful out of the box and only get better as they learn from day-to-day processes and performance. 05:55 Are you working towards an Oracle Certification this year? Join us at one of our certification prep live events in the Oracle University Learning Community. Get insider tips from seasoned experts and learn from others who have already taken their certifications. Go to community.oracle.com/ou to jump-start your journey towards certification today! 06:20 Nikita: Welcome back! So, when we talk about the AI capabilities in Fusion apps, I know we have different types. Can you tell us more about them? Yunus: Predictive AI is where it all started. These models analyze historical patterns and data to anticipate what might happen next, for example, predicting employee attrition, forecasting demand in the supply chain, or flagging potential late payments in finance workflows. These are embedded into business processes to surface insights before action is needed. Then we have generative AI, which takes this a step further. Instead of just providing insights, it creates content, such as auto-generating job descriptions, summarizing performance reviews, or even crafting draft responses to supplier queries. This saves time and boosts productivity across functions like HR, CX, and procurement. Last but not least, we have agentic AI, which is the most advanced layer. These agents don't just provide suggestions, they take actions on behalf of the users.
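As a rough illustration of an agent taking actions on behalf of users, here is a minimal sketch, in plain Python, of the defective-product return flow described above: look up order context, apply a return policy, and queue the follow-up steps. The function names, the policy table, and the data shapes are invented for this illustration; it shows the general pattern, not the Ask Oracle implementation.

```python
# Toy agentic flow: contextual lookup, policy decision, then queued follow-up actions.
# The policy table, order context, and action strings are hypothetical.
RETURN_POLICY = {"defective": {"replace": True, "expedite": True, "discount": 0.10}}

def get_order_context(order_id):
    # Stand-in for the contextual data an embedded agent already has access to.
    return {"order_id": order_id, "customer": "ACME Corp", "issue": "defective"}

def handle_return(order_id):
    ctx = get_order_context(order_id)
    policy = RETURN_POLICY.get(ctx["issue"], {})
    actions = []
    if policy.get("replace"):
        actions.append(f"create replacement order for {ctx['order_id']}")
    if policy.get("expedite"):
        actions.append("expedite shipping")
    if policy.get("discount"):
        actions.append(f"offer {policy['discount']:.0%} goodwill discount")
    actions.append(f"email confirmation to {ctx['customer']}")
    return actions

print(handle_return("SO-1042"))
```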
Think of an agent that not only recommends actions in a workflow, but also executes them, creating tasks, filing tickets, updating systems, and communicating with stakeholders, all autonomously but under user control. And importantly, many business scenarios today benefit from a blend of these types. For example, an AI assistant in Fusion HCM might predict employee turnover (predictive AI), generate tailored retention plans (generative AI), and initiate outreach or next steps (agentic AI). So, Oracle integrates these capabilities in a harmonious way, enabling users to act faster, personalize at scale, and drive better business outcomes. 08:39 Lois: Ok, let's get into the specifics. How does Oracle use predictive AI across its Fusion apps, helping businesses anticipate what's coming and act proactively? Yunus: So in HCM, take recommended jobs: candidates visiting a potential employer's website get an improved online experience where, if they have uploaded their resumes, they are shown job opportunities that match their mix of skills and experience. This helps candidates who are unsure what to search for by showing them roles and titles they may not have considered. Time to hire provides an estimate of how long it will take an HR team to fill an open role, which is useful not only for planning recruitment, but also for understanding whether you might need temporary cover and for how long. In supply chain management, predictive AI is used to estimate transit times and arrival times, enhancing efficiency and optimizing operations. It can flag abnormal patterns in supply or inventory, for example, a batch of parts behaving differently on the production line, and it can predict future demand, helping avoid overstocking or stockouts. In ERP, you can audit expenses, plan for future spend, and apply dynamic discounting for vendors who are likely to accept early-payment discounts; it can also speed up reimbursements through automated expense entries. In CX, adaptive intelligence for sales helps representatives prioritize leads by the likelihood that a specific lead will close, so they can focus their time and effort. And predictive scheduling and routing in service delivery ensures that the right resource is assigned to the right customer at the right time, boosting operational efficiency and customer satisfaction, alongside related capabilities such as fatigue analysis. 11:23 Lois: Now let's shift our focus to generative AI. How does Oracle implement generative AI across HCM, ERP, Supply Chain, and CX? Yunus: So, in HCM, generative AI can automatically generate performance review summaries from raw data, saving time for HR teams, and it can help provide candidates with summaries of their interview process, feedback, and next steps, all auto-generated. With AI assistance, goal creation for employees can be automated: the system analyzes performance data and trends to propose meaningful and attainable goals, aligning them with organizational objectives and employee capabilities.
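As a small sketch of how a generative summary like the performance review example above might be assembled, the code below builds a prompt from structured review data and hands it to a model. The data shape, the prompt wording, and the generate() stub are hypothetical placeholders, not the actual Fusion or OCI Generative AI interface.

```python
# Sketch: turn structured performance data into a prompt for a hosted language model.
# The record fields, prompt wording, and generate() stub are hypothetical assumptions.
def build_review_summary_prompt(employee):
    goals = "\n".join(f"- {g['goal']}: {g['status']}" for g in employee["goals"])
    return (f"Write a three-sentence performance review summary for "
            f"{employee['name']} ({employee['role']}), based on these goals:\n{goals}")

def generate(prompt):
    # Placeholder for a call to a hosted large language model.
    return f"[model response to a {len(prompt)}-character prompt]"

employee = {
    "name": "A. Rivera",
    "role": "Support Analyst",
    "goals": [
        {"goal": "Reduce average ticket resolution time", "status": "achieved"},
        {"goal": "Complete product certification", "status": "in progress"},
    ],
}
print(generate(build_review_summary_prompt(employee)))
```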
In SCM, similarly, generative AI helps with drafting summaries of purchase orders. It can automatically create clear, readable synopses and summarize complex negotiations and discussions, making it easier for supply chain managers to analyze supplier proposals, track negotiation processes, and understand key takeaways. It can also help generate summaries for repair master definitions, and it can generate descriptions for items based on their specifications, helping product teams automatically generate catalog content. With ERP, you can automate the creation of business reports, offering insights and actionable narratives rather than just showing the raw data. The AI can provide context, interpretations, and recommendations. It can also take raw project data and generate a comprehensive, easy-to-read project status report that stakeholders can quickly review. In CX, we have service request summarization, which condenses long customer service requests and tickets, allowing support teams to understand the key points in a fraction of the time. It can also create knowledge base articles directly from common service requests or inquiries, which not only improves internal knowledge management but also empowers customers by enabling self-service. And generative AI can automatically generate success stories or case studies from successful opportunities or sales, which can be used as marketing content or for internal knowledge sharing. 14:20 Nikita: And what about Oracle's Agentic AI? What are its capabilities across the different pillars? Yunus: In HCM, agentic AI handles the end-to-end onboarding experience, from explaining policies to guiding document submissions, even booking orientation sessions, allowing HR staff to focus on human engagement. It can further support HR teams during performance review cycles by surfacing high-potential employees, pulling in performance data, and recommending next actions like promotions or learning paths. It helps manage time-off requests by checking eligibility and policy constraints and suggesting appropriate substitutes, reducing administrative friction and errors. In SCM, agentic AI in Fusion Applications acts as a real-time assistant to ensure buyers follow procurement policies, reducing compliance risk and manual errors. It can also support sales representatives with real-time insights and next-best actions during the quoting or ordering process, improving customer satisfaction and sales performance. In ERP, it can handle document intake, extraction, and routing, saving significant time on manual document management across financial functions. AI automates reconciliation tasks by matching transactions, flagging anomalies, and suggesting resolutions. It helps reduce close-cycle timelines, continuously analyzes profit margins, and recommends pricing adjustments in your ERP.
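As a rough sketch of the reconciliation idea just mentioned, matching transactions and flagging anomalies, the example below pairs incoming transactions with open invoices by reference and amount and flags whatever doesn't match. The data shapes, field names, and tolerance are assumptions for illustration, not how the Fusion agent is implemented.

```python
# Toy reconciliation: match transactions to open invoices by reference, check
# amounts within a tolerance, and flag anything left over. All field names,
# the tolerance, and the sample data are illustrative assumptions.
def reconcile(transactions, invoices, tolerance=0.01):
    open_invoices = {inv["ref"]: inv for inv in invoices}
    matched, anomalies = [], []
    for txn in transactions:
        inv = open_invoices.pop(txn["ref"], None)
        if inv is None:
            anomalies.append({"txn": txn, "issue": "no matching invoice"})
        elif abs(inv["amount"] - txn["amount"]) > tolerance:
            anomalies.append({"txn": txn, "issue": f"amount mismatch vs {inv['amount']}"})
        else:
            matched.append(txn["ref"])
    anomalies += [{"invoice": inv, "issue": "unpaid invoice"} for inv in open_invoices.values()]
    return matched, anomalies

txns = [{"ref": "INV-101", "amount": 250.00}, {"ref": "INV-999", "amount": 80.00}]
invs = [{"ref": "INV-101", "amount": 250.00}, {"ref": "INV-102", "amount": 40.00}]
print(reconcile(txns, invs))
```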
In CX, agentic AI in Fusion Applications supports staff by instantly compiling full customer histories, including orders, service requests, and interactions. It can act like a real-time assistant, summarizing open tickets and resolutions so agents can take over or escalate without needing to dig through notes, and it can dynamically adjust technician schedules and routes based on traffic, priority, or cancellations, increasing field efficiency and customer satisfaction. 17:04 Lois: Thank you so much, Yunus. To learn more about the topics covered today, visit mylearn.oracle.com and search for the AI for You course. Nikita: Join us next week as we cover how AI is being applied across sectors like healthcare, finance, and retail, and tackle the big question: how do we keep these technologies aligned with human values? Until then, this is Nikita Abraham… Lois: And Lois Houston, signing off! 17:30 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
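Rounding out this episode's three capability types, here is a tiny sketch of the predictive side, the kind of model behind examples like attrition risk or late-payment flags: fit a classifier on historical records, then score new ones. The synthetic data, feature choices, and use of scikit-learn are assumptions for illustration; the models embedded in Fusion are pre-built and considerably more sophisticated.

```python
# Minimal predictive sketch: train on historical records, score new ones.
# Features, labels, and data are synthetic assumptions for illustration only.
from sklearn.linear_model import LogisticRegression

# features: [tenure_years, overtime_hours_per_week], label: 1 = left the company
X = [[1, 12], [2, 10], [8, 2], [6, 1], [0.5, 15], [7, 3]]
y = [1, 1, 0, 0, 1, 0]

model = LogisticRegression().fit(X, y)

new_employees = [[1.5, 11], [9, 0]]
for emp, risk in zip(new_employees, model.predict_proba(new_employees)[:, 1]):
    print(f"tenure={emp[0]}y overtime={emp[1]}h/wk -> attrition risk {risk:.2f}")
```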
In this special episode of Health Matters, host Courtney Allison visits Citi Field, home of the New York Mets, to speak with two guests: legendary Mets third baseman David Wright and Dr. Tony Puliafico, a psychologist with NewYork-Presbyterian and Columbia. Together, they discuss the importance of approaching challenges and failures with a growth mindset—in professional sports, at home, at work, at school, and beyond. Through the latest clinical research and stories from David's time with the Mets, they explore healthy habits for approaching failure, connecting to a supportive community, and building resilience for the long term.
Anthony Puliafico, Ph.D. is a psychologist with the Center for Youth Mental Health at NewYork-Presbyterian. He is also an associate professor of clinical psychology in the Division of Child and Adolescent Psychiatry at Columbia University and serves as Director of the Columbia University Clinic for Anxiety and Related Disorders (CUCARD)-Westchester, an outpatient clinic that specializes in the treatment of anxiety disorders, obsessive-compulsive disorder (OCD) and related disorders in children, adolescents and adults. Dr. Puliafico specializes in the assessment and cognitive-behavioral treatment of anxiety, mood and externalizing disorders. His clinical work and research have focused on the treatment of pediatric OCD, school refusal, and adapting treatments for young children with anxiety.
David Wright was a third baseman and captain for the New York Mets from 2004 to 2018. A seven-time All-Star, two-time Gold Glove Award winner, two-time Silver Slugger Award winner, and a member of the 30–30 club, Wright was recently inducted into the Mets Hall of Fame and had his number 5 retired by the team.
Health Matters is your weekly dose of health and wellness information, from the leading experts. Join host Courtney Allison to get news you can use in your own life. New episodes drop each Wednesday. If you are looking for practical health tips and trustworthy information from world-class doctors and medical experts you will enjoy listening to Health Matters. Health Matters was created to share stories of science, care, and wellness that are happening every day at NewYork-Presbyterian, one of the nation's most comprehensive, integrated academic healthcare systems. In keeping with NewYork-Presbyterian's long legacy of medical breakthroughs and innovation, Health Matters features the latest news, insights, and health tips from our trusted experts; inspiring first-hand accounts from patients and caregivers; and updates on the latest research and innovations in patient care, all in collaboration with our renowned medical schools, Columbia and Weill Cornell Medicine. To learn more visit: https://healthmatters.nyp.org Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
In this episode, Lois Houston and Nikita Abraham are joined by Principal Instructor Yunus Mohammed to explore Oracle's approach to enterprise AI. The conversation covers the essential components of the Oracle AI stack and how each part, from the foundational infrastructure to business-specific applications, can be leveraged to support AI-driven initiatives. They also delve into Oracle's suite of AI services, including generative AI, language processing, and image recognition. AI for You: https://mylearn.oracle.com/ou/course/ai-for-you/152601/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode. ------------------------------------------------------------- Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hey everyone! In our last episode, we discussed why the decision to buy or build matters in the world of AI deployment. Lois: That's right, Niki. Today is all about the Oracle AI stack and how it empowers not just developers and data scientists, but everyday business users as well. Then we'll spend some time exploring Oracle AI services in detail. 01:00 Nikita: Yunus Mohammed, our Principal Instructor, is back with us today. Hi Yunus! Can you talk about the different layers in Oracle's end-to-end AI approach? Yunus: The first layer is the foundation, AI infrastructure: the powerful compute and storage layer that enables scalable model training and inference. Sitting above the infrastructure, we have the data platform. This is where data is stored, cleaned, and managed. Without a reliable data foundation, AI simply can't perform. So the base of AI is data, and reliable data gives AI the support it needs to do its job. Then we have AI and ML services. These provide ready-to-use tools for building, training, and deploying custom machine learning models. Next to the AI/ML services, we have generative AI services. This is where Oracle enables advanced language models and agentic AI tools that can generate content, summarize documents, or assist users through chat interfaces. Then we have the top layer, the applications: things like Fusion Applications or industry-specific solutions where AI is embedded directly into business workflows for recommendations, forecasting, or customer support. Finally, Oracle integrates with a growing ecosystem of AI partners, allowing organizations to extend and enhance their AI capabilities even further. In short, Oracle doesn't just offer AI as a feature. It delivers it as a full-stack capability, from infrastructure to applications. 02:59 Nikita: Ok, I want to get into the core AI services offered by Oracle Cloud Infrastructure. But before we get into the finer details, broadly speaking, how do these services help businesses?
Yunus: These services make AI accessible, secure, and scalable, enabling businesses to embed intelligence into workflows, improve efficiency, and reduce human effort in repetitive or data-heavy tasks. And the best part is, Oracle makes it easy to consume these through application programming interfaces (APIs), software development kits (SDKs), and integration with Fusion Applications. So, you can add AI where it matters without needing a team of data scientists to do that work. 03:52 Lois: So, let's get down to it. The first core service is Oracle's Generative AI service. What can you tell us about it? Yunus: This is a fully managed service that allows businesses to tap into the power of large language models. You can use these models as they are or develop well-defined custom models from them, and apply them to a wide range of use cases like summarizing text, generating content, answering questions, or building AI-powered chat interfaces. 04:27 Lois: So, what will I find on the OCI Generative AI Console? Yunus: The OCI Generative AI Console highlights three key components. The first is dedicated AI clusters. These are GPU-powered environments used to fine-tune and host your own custom models, giving you control and performance at scale. The second is custom models. You can take a base language model and fine-tune it using your own data, for example, company manuals, HR policies, or customer interactions, which is your organization's own data. You can use this to create a model that speaks your business language. And last but not least, the endpoints. These are the interfaces through which your applications connect to the model. Once deployed, your app can query the model securely and at scale, and you don't need to be a developer to get started. Oracle offers a playground, a no-code environment where you can try out models, adjust parameters, and test responses interactively. So overall, the Generative AI service is designed to make enterprise-grade AI accessible and customizable, fitting directly into business processes, whether you are building a smart assistant or automating content generation. 06:00 Lois: The next key service is OCI Generative AI Agents. Can you tell us more about it? Yunus: OCI Generative AI Agents combines a natural language interface with generative AI models and enterprise data stores to answer questions and take actions. The agent remembers context, uses previous interactions, and retrieves deeper product-specific details. These aren't just static chatbots. They are context-aware, grounded in business data, and able to handle multi-turn, follow-up queries with relevant, accurate responses, driving productivity and decision-making across departments like sales, support, and operations. 06:54 Oracle University's Race to Certification 2025 is your ticket to free training and certification in today's hottest tech. Whether you're starting with Artificial Intelligence, Oracle Cloud Infrastructure, Multicloud, or Oracle Data Platform, this challenge covers it all! Learn more about your chance to win prizes and see your name on the Leaderboard by visiting education.oracle.com/race-to-certification-2025. That's education.oracle.com/race-to-certification-2025. 07:37 Nikita: Welcome back! Yunus, let's move on to the OCI Language service. Yunus: OCI Language helps businesses understand and process natural language at scale.
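Before going deeper into OCI Language, a quick aside on the endpoint idea above: once a model is deployed behind an endpoint, an application typically just sends an authenticated request with a prompt and reads back the generated text. The sketch below uses the generic requests library with an invented URL, header scheme, and payload; the real OCI Generative AI service is consumed through its SDK with signed requests, so treat this only as the general shape of "your app can query the model."

```python
# Hedged sketch of querying a deployed model endpoint. The URL, auth header,
# and JSON payload are hypothetical placeholders, not the OCI API contract.
import requests

def query_endpoint(endpoint_url, api_token, prompt, max_tokens=200):
    payload = {"prompt": prompt, "max_tokens": max_tokens}
    headers = {"Authorization": f"Bearer {api_token}"}
    resp = requests.post(endpoint_url, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Example (requires a real endpoint and token before the request would succeed):
# print(query_endpoint("https://example.invalid/generate", "TOKEN",
#                      "Summarize our HR travel policy in three bullet points."))
```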
It uses pretrained models, which means they are already trained on large industry data sets and are ready to be used right away without requiring AI expertise. It detects over 100 languages, including English, Japanese, Spanish, and more. This is great for global businesses that receive multilingual inputs from customers. It also identifies sentiment for different aspects of a sentence. For example, in a review like, “The food was great, but the service sucked,” OCI Language can tell that food has a positive sentiment while service has a negative one. This is called aspect-based sentiment analysis, and it is more insightful than just labeling the entire text as positive or negative. It can also identify key phrases representing important ideas or subjects, extracting the words or terms that capture the core message. These help automate tagging, summarizing, or even routing of content like support tickets or emails. In real life, businesses are using this for customer feedback analysis, support ticket routing, social media monitoring, and even regulatory compliance. 09:21 Nikita: That's fantastic. And what about the OCI Speech service? Yunus: OCI Speech is an AI service that transcribes speech to text. Think of it as an AI-powered transcription engine that listens to spoken English, whether in audio or video files, and turns it into usable, searchable, readable text. It provides timestamps, so you know exactly when something was said, a valuable feature for reviewing legal discussions, media footage, or compliance audits. OCI Speech even understands different speakers. You don't need to train this from scratch. It is a pretrained model hosted behind an API. Just send your audio to the service, and you get accurate, timestamped text back in return. 10:17 Lois: I know we also have a service for object detection… called OCI Vision? Yunus: OCI Vision uses pretrained deep learning models to understand and analyze visual content. Just like a human might, you can upload images or videos, and the AI can tell you what is in them and where. There are two primary use cases for OCI Vision. One is object detection. Take a red car: OCI Vision is not just identifying that it's a car, it is detecting and labeling parts of the car too, like the bumper, the wheels, and the design components. This is critical in industries like manufacturing, retail, or logistics. For example, in quality control, OCI Vision can scan product images to detect missing or defective parts automatically. Then we have image classification. This is useful in scenarios like automated tagging of photos, managing digital assets, and classifying the scene or context of an image. OCI Vision is a fully managed service, no complex model training is required, and it's available via API. You can also define your own custom models for your environment. 11:51 Nikita: And the final service is related to text and called OCI Document Understanding, right? Yunus: So OCI Document Understanding allows businesses to automatically extract structured insights from unstructured documents like invoices, contracts, receipts, and sometimes resumes or other business documents. 12:13 Nikita: And how does it work? Yunus: OCI reads the content from the scanned document, and the OCR is smart.
It recognizes both printed and handwritten text, and then determines what type of document it is. This is document classification: for example, whether it is a purchase order, a bank statement, or a medical report. If your business handles documents in multiple languages, the AI can also help with language detection, which helps you route or translate the document. Many documents contain structured data in table format, think pricing tables or line items, and OCI helps you extract these with high accuracy for reporting or for feeding into ERP systems. And finally, key value extraction. It pulls out critical business values like invoice numbers, payment amounts, or customer names from fields that may not always follow a fixed format. So, this service reduces the need for manual review, cuts down processing time, and ensures high accuracy for your systems. 13:36 Lois: What are the key takeaways our listeners should walk away with after this episode? Yunus: The first one: Oracle doesn't treat AI as just a standalone tool. Instead, AI is integrated from the ground up, whether you're talking about infrastructure, data platforms, machine learning services, or applications like HCM, ERP, or CX. In the real world, Oracle AI services prioritize data management, security, and governance, all essential for enterprise AI use cases. So, it is about trust. Can your AI handle sensitive data? Can it comply with regulations? Oracle builds its AI services with a strong foundation in data governance, robust security measures, and tight control over data residency and access. This makes Oracle AI especially well-suited for industries like health care, finance, logistics, and government, where compliance and control aren't optional. They are critical. 14:44 Nikita: Thank you for another great conversation, Yunus. If you're interested in learning more about the topics we discussed today, head on over to mylearn.oracle.com and search for the AI for You course. Lois: In our next episode, we'll get into Predictive AI, Generative AI, Agentic AI, all with respect to Oracle Fusion Applications. Until then, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 15:10 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.
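To ground the "key value extraction" step described in this episode, here is a toy sketch that pulls an invoice number and a total out of OCR text whose layout isn't fixed. The regex patterns, field names, and sample text are assumptions for illustration; the managed service does this with pretrained models rather than hand-written rules.

```python
# Toy key value extraction from OCR text with no fixed layout.
# Patterns, field names, and the sample document are illustrative assumptions.
import re

def extract_key_values(ocr_text):
    fields = {
        "invoice_number": r"invoice\s*(?:no\.?|number|#)?\s*[:\-]?\s*([A-Z0-9\-]+)",
        "total_amount": r"total\s*(?:due)?\s*[:\-]?\s*\$?\s*([\d,]+\.\d{2})",
    }
    out = {}
    for name, pattern in fields.items():
        match = re.search(pattern, ocr_text, flags=re.IGNORECASE)
        out[name] = match.group(1) if match else None
    return out

sample = "ACME Corp\nInvoice No: INV-2024-0918\nTotal Due: $12,480.50\n"
print(extract_key_values(sample))
# -> {'invoice_number': 'INV-2024-0918', 'total_amount': '12,480.50'}
```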
On this episode of CA Media Podcast, I had the honor and privilege to interview Rev. David Wright, who is the Pastor of New Grace Tabernacle Christian Center Church of God in Christ in Brooklyn, NY. He is the son of Legendary Gospel Singer Rev. Timothy Wright & Evangelist Betty Wright. We dive into his upbringing, his musical ministry, his call to ministry, and his current role in gospel music. We also did a round of rapid-fire questions, which you will enjoy. So come and enjoy this amazing interview with Rev. Wright.
You can follow Rev. David Wright:
IG: Instagram.com/pastordw3
Facebook: Facebook.com/david.wright.12023
New Grace Tabernacle Christian Center COGIC: / newgracetabernaclechristiancenter
You can listen to the podcast on the following platforms:
Apple Podcast: https://podcasts.apple.com/us/podcast...
SPOTIFY: https://open.spotify.com/show/0T1qlQv...
You can follow the podcast at
Facebook: facebook.com/CAMediaPodcast
Instagram: Instagram.com/CAMediaPodcast
Blue Sky: https://bsky.app/profile/camediapodca...
X: https://x.com/CAMediaPodcast
If you want to be on the podcast you can email the podcast at camediapodcast@gmail.com or book on linktree at linktr.ee/CAMediaPodcast and click on the booking link.
Visionary Minds Public Relations and Media is a founding supporting sponsor of the CAMedia Podcast.
Make sure you get your Publicity, Digital Marketing, Writing, Media Consulting Services at visionarymindsny@gmail.com where Tammy Reese is the owner.
Living and Working in Spain with David Wright: The Ultimate Expat Guide
Tune in to a selection of originals, throwbacks, and remixes from the 90s on, featuring uptempo Hip-Hop, Latin, R&B, Dancehall, and everything in between. Watch on YouTube: https://youtu.be/qU4iaR4OnHM ---------- Follow David Wright ◊ https://soundcloud.com/david-jamaal-wright ◊ https://www.instagram.com/davidjwright_/ ---------- Follow MSYH.FM » http://MSYH.FM » http://x.com/MSYHFM » http://instagram.com/MSYH.FM » http://facebook.com/MSYH.FM » http://patreon.com/MSYHFM ---------- Follow Make Sure You Have Fun™ ∞ http://MakeSureYouHaveFun.com ∞ http://x.com/MakeSureYouHave ∞ http://instagram.com/MakeSureYouHaveFun ∞ http://facebook.com/MakeSureYouHaveFun ∞ http://youtube.com/@makesureyouhavefun ∞ http://twitch.tv/@MakeSureYouHaveFun
428: David Wright and HFG Architecture's place in the world by Wichita Business Journal
The Mets' free fall since June 13th is discussed in detail on this week's show. From the starting rotation's inability to go deep in games, to the recent bullpen meltdowns, to the streaky offense - we go over it all here with top baseball insiders Andy Martino and Laura Albanese. We also look at the phenomenal debut of Nolan McLean on August 16th and discuss why he was not brought up sooner (the real insider will surprise you). The Mets fed off the energy of the kids at the Little League Classic - against the Seattle Mariners - but lose Francisco Alvarez in the process! Plus Juan Soto shuffling for the kids! August 2015 was a much different story for the Mets. We discuss that month, with the Mets marching to the NL Championship, with Terry Collins, Mets VP of Alumni Relations Jay Horwitz, and Andy Martino. Tunnel to Towers provides stories of inspiration with David Wright and Sylvester Stallone. Watch the entire episode on our YouTube channel - which includes highlights of August 2015! https://youtu.be/qhYovs8DLCU?si=uy40FHbaZasSdLGc Subscribe to our YouTube Channel here: https://www.youtube.com/@TheTerryCollinsShow Subscribe to the Terry Collins show on your favorite podcast platform. Like and Subscribe to our YouTube channel: / @theterrycollinsshow Follow The Terry Collins Show: X: https://x.com/TerryCollins_10 Instagram: / terrycollins_10 Facebook: https://www.facebook.com/profile.php?... Follow John Arezzi on X: https://x.com/johnarezzi Follow John Arezzi on Instagram:  / johnarezzi Donate $11 a month now to help first responders, veterans, and our military heroes. Go to Tunnel to Towers and help them do good: https://t2t.org/ Host: Terry Collins Co-Host: John Arezzi Creative Director: Marsh Researcher - Dominic DiBiase Executive Producer: John Arezzi Learn more about your ad choices. Visit megaphone.fm/adchoices
Wilmer Flores relives the night at Citi Field—finding out on his phone mid-game that he'd been traded, breaking down on the field, and the private words David Wright shared with him in the tunnel. He opens up about Jeff Wilpon telling him there was no deal, the next-day walk-off that made him a Mets legend, and why the love from Queens still follows him on every road trip. Plus: 2015 World Series takeaways (that Royals defense!), how Wright treated rookies, life now with family, and the story behind the name “Wilmer.” Mets fans, bring the tissues. Learn more about your ad choices. Visit megaphone.fm/adchoices
Barstool Sports' Chris Klemmer is back on the show talking all things Mets Baseball. Chris joins Dylan Campione & Matt Potter to discuss the recent cold streak, trade deadline, Pete Alonso's record setting Home Run and David Wright's jersey retirement! All that & more packed into our annual episode with Chris! Thanks so much for joining us again, appreciate all the time & insight as always.
The Mets and Yankees both suffered frustrating losses. Gio and Jerry agree the Yankees' offensive struggles against Joe Ryan are déjà vu. Gio believes Aaron Boone will be fired if the Yankees miss the playoffs, while Jerry is unsure. A positive Mets fan from LA, Gio's critique of the Infinity Sports Network talent draft, and C-Lo's updates are also discussed. Gio is hesitant to ask Boomer for a favor. David Wright dislikes the Mets' home pinstripes. C-Lo explains a TikTok trend Brian Daboll referenced. Taylor Swift discussed her relationship with Travis and Jason Kelce on their podcast. Eddie undermines Gio's caddy story. The Moment of the Day: "Down goes Straw" and Jerry is "our Jeff McNeil." The hour concludes with discussions about watching Hard Knocks and CBS celebrating "The NFL Today."
Hour 1 Returning from California, Gio and Jerry discuss the Mets' 6-0 lead blown last night, attributing the loss to David Peterson's poor performance and the team's recent pitching struggles. They question the Mets' future, especially with the pitching staff. C-Lo's update covers the Mets' loss, the Brewers' 12th consecutive win, the Yankees' inability to sweep, Giancarlo Stanton's strong play, injuries at the Jets and Giants joint practice, Breece Hall's contract extension comments, and Jerry Jones's experimental drug trial. The hour ends with Gio and Jerry still struggling with the Mets' ongoing issues. Hour 2 Gio and Jerry discuss their trip to Pebble Beach and their golf games. C-Lo provides an update, and they talk about the Mets' blown lead and Pete Alonso's out at the plate. C-Lo explains the Brewers' free burger promotion. They also discuss Cam Schlittler's early exit, Paul Goldschmidt's potential injury, and a spat between The 7 Line and BT & Sal. Finally, Gio praises Jaxson Dart as an NFL QB prospect, and a caller thanks them for the Pebble Beach trip. Hour 3 Gio thinks Jerry will have an issue with Pete Alonso writing “down goes Straw” on his 253rd home run ball but Jerry is ok with it. Gio says there was no reason at all to include that. They agree that Pete definitely didn't have any negative intentions when he wrote it. Gio wasn't a big fan of Gary Cohen's call of the home run but Jerry liked it. C-Lo returns for an update but first Gio asks him to chime in on the Pete and Darryl Strawberry debate. The Mets and Yankees are a combined 6-17 since the trade deadline. They're hanging onto playoff spots by one game. It's been a dark two weeks. Chris “Mad Dog” Russo was bothered by the Mets celebration of Pete Alonso's 253rd HR and Gary Cohen's call of it. Gio doesn't agree with Dog but he thinks the pregame celebration after the record-breaking game is unnecessary. In the final segment of the hour, Gio and Jerry wonder if Pete's home run ball will stay in the Mets Hall of Fame at Citi Field or at his home. A caller wants Jeremy Hefner fired. Gio says it could happen after the season but Jerry would put the blame on the players and David Stearns before Hefner. Hour 4 The Mets and Yankees both suffered frustrating losses. Gio and Jerry agree the Yankees' offensive struggles against Joe Ryan are déjà vu. Gio believes Aaron Boone will be fired if the Yankees miss the playoffs, while Jerry is unsure. A positive Mets fan from LA, Gio's critique of the Infinity Sports Network talent draft, and C-Lo's updates are also discussed. Gio is hesitant to ask Boomer for a favor. David Wright dislikes the Mets' home pinstripes. C-Lo explains a TikTok trend Brian Daboll referenced. Taylor Swift discussed her relationship with Travis and Jason Kelce on their podcast. Eddie undermines Gio's caddy story. The Moment of the Day: "Down goes Straw" and Jerry is "our Jeff McNeil." The hour concludes with discussions about watching Hard Knocks and CBS celebrating "The NFL Today."
On the latest episode of The Mets Pod presented by Tri-State Cadillac, Connor Rogers and Joe DeMayo have the perfect diversion from the struggling Mets - an exclusive interview with Mets legend David Wright! The guys talk to Number Five about stories from his number retirement day, behind the scenes tales from the production of his SNY documentary, the 2015 trade deadline, his thoughts on the current team, his choice of the best Mets uniform ever, and all the details of the Battle of the Badges Game between the NYPD and FDNY that David is hosting at Citi Field on Sunday August 17th. Later, Connor and Joe dive down deep (and low) to talk about the current mess that is the Mets, including the pitching problems, the hitting problems, and all the other problems. The show also goes Down on the Farm to reveal what's behind the recent success of Brandon Sproat, and opens up a loud Mailbag to let the listeners let it all out as well. Be sure to subscribe to The Mets Pod at Apple Podcasts, Spotify, or wherever you get your podcasts.
Today's Show:
00:00 Welcome to the show
00:20 David Wright joins the pod!
01:15 Stories from the number retirement day
03:05 How did David end up with number 5 in the first place?
05:15 David hosts Battle of the Badges, NYPD vs FDNY, at Citi Field 8/17!
07:10 Thoughts on SNY's documentary, “The Wright Way”
08:45 Behind the scenes with the crew making the show
10:25 Thoughts on the trade deadline, and adding Yoenis Cespedes in 2015
12:00 The modern MLB
14:00 Could David have stolen 40 bases with today's rules?
15:20 The big answer: what's the best Mets uniform?
17:10 The meaning of being a captain
17:20 Current Mets are in the rough, what should a captain do?
19:30 Recounting the catch: how the bare hand dive in SD went down
21:00 The Mets Pod Mount Rushmore: David's 4 favorite Shea Stadium memories
23:55 Goodbye to David Wright
26:00 The Week That Was…just terrible in every way
39:50 Mailbag – Ranking collapses
43:05 Mailbag – What can change to shake things up?
49:10 Mailbag – Questioning the starting pitching strategy of David Stearns?
56:25 The Scoreboard: last week's recap
59:50 Mailbag/Down on the Farm: Brandon Sproat deep dive
01:05:00 Any way to piggyback Brandon Sproat and Nolan McLean?
David Wright joins Jay Horwitz for a special Amazin' Conversation ahead of the 2025 Battle of the Badges at Citi Field — the annual showdown between New York's police and firefighters. Wright opens up about:
How Tom Seaver inspired one of his most memorable on-field moments
Why honoring first responders means so much to him and his family
The fierce competitiveness (and trash talk) at Battle of the Badges
His reflections on his jersey retirement day and the bond with Mets fans
The lasting friendships from his 15-year career in Queens
In today's episode we look back at two Hall of Fame interviews with Mets legend, David Wright, and NFL legend Eric Allen. Hosts: Cousin Sal Guest: David Wright, Eric Allen Producer: Michael Szokoli The Ringer is committed to responsible gaming, please visit theringer.com/RG to learn more about the resources and helplines available, and listen to the end of the episode for additional details. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Mike Francesa reaches into his inbox and reacts to listener emails. You'll hear his thoughts on David Wright's career, Caitlin Clark's cultural impact, favorite Florida golf courses, and much more. Get more Mike! Subscribe to the free Mike Francesa newsletter at mikefrancesapodcast.com
On the latest episode of The Mets Pod presented by Tri-State Cadillac, SNY Mets play-by-play broadcaster Gary Cohen joins Connor Rogers and Joe DeMayo to talk about the results of the trade deadline and the road ahead for the Mets. The crew covers the team's new additions, plans for the starting rotation, the streakiness of the Mets and their younger players, David Wright, Pete Alonso, plus Gary answers a listener's question about the idea of Juan Soto as a leadoff hitter. Later, Connor and Joe go Down on the Farm for a Carson Benge deep dive and a check-in on Jett Williams, then score the Scoreboard and open the Mailbag for questions answered about the race for the NL East and the Mets future starting staffs. Be sure to subscribe to The Mets Pod at Apple Podcasts, Spotify, or wherever you get your podcasts.
Today's Show:
00:00 Welcome to the show, Gary Cohen joins the pod!
00:35 Thoughts on the trade deadline
02:15 The starting rotation going forward
03:40 The Mets are hot and cold
06:10 Balancing Mauricio/Vientos/Baty playing time
07:55 What makes Carlos Mendoza a successful leader
09:15 Takeaways from David Wright's number retirement
10:35 What is different about Francisco Alvarez after return?
12:25 Pete Alonso chasing history, free agency again
14:20 Mailbag for Gary: Should the Mets hit Juan Soto in the leadoff spot?
15:20 Goodbye to Gary Cohen
15:30 A rough week, a tight playoff race
21:35 When to call up Brandon Sproat and/or Nolan McLean?
27:50 Mailbag/Down on the Farm: Carson Benge deep dive
30:25 Mailbag/Down on the Farm: With Drew Gilbert gone, does Jett Williams go full CF?
31:55 The Scoreboard: last week's recap
33:40 The Scoreboard: making this week's bets
40:05 Mailbag – Why does Joe mispronounce “platoon?”
42:05 Mailbag – Can the Marlins threaten for the NL East?
46:45 Mailbag – Fitting Mets starters into future rotations
Former Mets first baseman Ike Davis sits down with Jay Horwitz for an unforgettable Amazin' Conversations episode — diving deep into his career highs, crushing injuries, and life after baseball. Ike opens up about battling Valley Fever, his ankle injury with David Wright, playing first base during Johan Santana's historic no-hitter, and hitting big home runs for R.A. Dickey's 20-win season. He shares behind-the-scenes stories of playing with Jose Reyes and David Wright, reflects on his 2014 trade from the Mets, and reveals what he's doing now in commercial real estate. Plus — an unbelievable childhood encounter with Joe DiMaggio you have to hear.
From 'Rico Brogna' (subscribe here): It was a tough 3-game series for the Mets as they returned from the All-Star break. The retirement of David Wright's number 5 distracted the fans from losing the series to the Reds. A victory on Sunday at least allowed the Mets to avoid being swept at home. To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
On the latest episode of The Mets Pod presented by Tri-State Cadillac, Connor Rogers and Joe DeMayo look back at the week that was coming off of the All-Star break. Leading off, Connor and Joe talk about the Mets offense playing small ball, Francisco Alvarez' return to the big league club, and Brett Baty's contributions to the lineup. Then, the guys discuss the upcoming trade deadline and how president of baseball operations David Stearns plans to approach next Thursday for the Mets. Connor and Joe also share their reaction to David Wright's number retirement and go Down on the Farm to discuss potential position player call ups. They wrap the show with their scoreboard predictions and some Mailbag questions answered about Seth Lugo and potential prospects in centerfield. Be sure to subscribe to The Mets Pod at Apple Podcasts, Spotify, or wherever you get your podcasts. Today's Show: 00:00 Welcome to the show! 0:54 The Week That Was: Reds and Angels 5:44 The Bottom of the Lineup Comes Through 8:12 Francisco Alvarez returns 10:31 Trade Deadline Chatter 21:08 David Stearns trade deadline approach 25:11 David Wright Day Reaction 27:35 Down on the Farm: Position Player September Call Ups? 31:18 The Scoreboard 41:12 Mailbag: Is Seth Lugo a trade deadline option? 42:58 Mailbag: Any chance the Mets see how prospects handle centerfield before the deadline?
David Wright To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Evan, Tommy and Rosie discuss their experiences at David Wright's Number Retirement Ceremony at Citi Field this past Saturday.
Evan and Tiki talk about the WFAN Quarter Century team results focused on the Jets & Giants. They then give their thoughts on David Wright's Number Retirement Ceremony this past weekend.
On Monday's ENN, Jerry Jones responds to Micah, Bengals president Mike Brown on Shemar Stewart, and David Wright's number retirement. Learn more about your ad choices. Visit podcastchoices.com/adchoices
We heard from David Wright as he addressed the crowd when the Mets retired his number. Scottie Scheffler talked about staying calm as he won the Open Championship.
Boomer discusses Netflix's 'Quarterback', admiring Jared Goff and empathizing with Kirk Cousins, while finding Joe Burrow unique. He notes Aaron Glenn followed Dan Campbell's strategy of replacing the existing quarterback. Netflix is focusing on cruises, releasing 'Poop Cruise' and a documentary about a missing woman sold into sex trafficking. Jerry provides updates: Yankees beat Braves, MLB investigates a Braves coach's threat, Mets avoided a sweep against the Reds, and David Wright's number was retired. Scottie Scheffler calmly won the Open Championship. The hour concluded with a caller urging Gio to handle a wasp nest, prompting calls from two exterminators named Vinny.
Hour 1 Boomer, Gio, Jerry, Al & Eddie are back. Scottie Scheffler won The Open Championship; his dominance contrasts with his boring personality. Jerry's update included Scheffler's win. The Mets avoided a sweep, beating the Reds with Juan Soto scoring on an infield hit. The Yankees beat the Braves, winning the series as Aaron Judge homered. Gio saw a Joe Namath hearing aid commercial, leading to a discussion on Boomer doing a boner pill commercial. Hour 2 Boomer discusses Netflix's 'Quarterback' series, liking Jared Goff and sympathizing with Kirk Cousins, while finding Joe Burrow unique. He notes Aaron Glenn's approach mirrors Dan Campbell's in Detroit. Netflix is also targeting cruises with documentaries, including one about a missing woman sold into sex trafficking. Jerry's update covers the Yankees beating the Braves, MLB investigating a Braves coach, the Mets avoiding a sweep, and David Wright's number retirement speech. Scottie Scheffler reflected on staying calm during his Open Championship win. Finally, Gio's wasp nest issue was discussed, with two exterminators named Vinny calling in. Hour 3 Despite losing their series to the Reds, the Mets avoided a sweep. Boomer noted high-paid players underperforming, a sentiment echoed by a caller who highlighted the offense's season-long struggles, previously masked by dominant pitching. Jerry's update began with the Mets' win and then covered the Yankees taking 2-of-3 from the Braves before their Toronto series. Scottie Scheffler won the Open. The hour concluded with a discussion on the upcoming NFL season, focusing on the Jets' new GM, Head Coach, and Offensive Coordinator. Hour 4 Scottie Scheffler, dubbed "boring," won the Open Championship, prompting comparisons to Tiger Woods. Jerry's final update covered WFAN personalities' votes for NY's all-quarter-century teams. Aaron Judge tied A-Rod for sixth all-time Yankee homers. The Mets beat the Reds, with Juan Soto scoring on an infield single. Scheffler reacted to his win. The Moment of the Day: Boomer endorsing a "boner pill." The final segment revisited the WFAN quarter-century teams, including the hockey team. The building's fire alarm frequently goes off unnoticed.
Hear the best interviews of the week on WFAN. David Wright joins Sal Licata to discuss the Mets retiring his No. 5 this weekend. Plus, Sal talks Mets and more with Terry Collins; Joe Torre joins to talk All-Star Game and Yankees stories, MLB trade deadline talk with SNY's Andy Martino, Knicks talk with John Starks and an entertaining chat with comedian Andrew Dice Clay.
This week's Mets highlights include Sal Licata ripping Major League Baseball for not naming Juan Soto an All-Star, and for not naming Pete Alonso All-Star Game MVP. Plus, Boomer and Gio discuss Francisco Lindor potentially being named the next team captain, as do Tiki and Morash. Finally, Sal talks with David Wright ahead of his number retirement.
David Wright Doc To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
From 'Rico Brogna' (subscribe here): With the Mets retiring David Wright's number on Saturday, July 19th, Evan Roberts takes you through his top moments of David Wright's career. To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Tommy has a ticket to David Wright Day, but he isn't sure if he wants to go, much to Shaun's surprise.
Hour 2: Tommy has not accepted free tickets to David Wright Day quite yet. Shaun argues that he has to go or he loses his Mets fan card. That and much more.
Hour 3: Shaun and Tommy discuss the importance of David Wright to the Mets, Shaun is frustrated by the constant Yankee injuries, and much more.
Shaun gives the Top-Five #5's in his lifetime in honor of David Wright. Also, some breaking injury news on Cam Schlittler.
David Wright, the ESPYS, and Led Zeppelin To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Hour 4: Shaun says that naming Lindor Captain would take away from David Wright's day. That and much more.
Rob Has a Podcast | Survivor / Big Brother / Amazing Race - RHAP
Today, Brandon talks to David Wright about Rick Devens' time on Survivor.
