Podcasts about exadata

  • 21PODCASTS
  • 99EPISODES
  • 28mAVG DURATION
  • 1MONTHLY NEW EPISODE
  • May 6, 2025LATEST

POPULARITY

20172018201920202021202220232024


Best podcasts about exadata

Latest podcast episodes about exadata

Oracle University Podcast
Oracle GoldenGate 23ai: New Features & Product Family

Oracle University Podcast

Play Episode Listen Later May 6, 2025 17:39


In this episode, Lois Houston and Nikita Abraham continue their deep dive into Oracle GoldenGate 23ai, focusing on its evolution and the extensive features it offers. They are joined once again by Nick Wagner, who provides valuable insights into the product's journey.   Nick talks about the various iterations of Oracle GoldenGate, highlighting the significant advancements from version 12c to the latest 23ai release. The discussion then shifts to the extensive new features in 23ai, including AI-related capabilities, UI enhancements, and database function integration.   Oracle GoldenGate 23ai: Fundamentals: https://mylearn.oracle.com/ou/course/oracle-goldengate-23ai-fundamentals/145884/237273 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, Radhika Banka, and the OU Studio Team for helping us create this episode.   -----------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services.  Nikita: Hi everyone! Last week, we introduced Oracle GoldenGate and its capabilities, and also spoke about GoldenGate 23ai. In today's episode, we'll talk about the various iterations of Oracle GoldenGate since its inception. And we'll also take a look at some new features and the Oracle GoldenGate product family. 00:57 Lois: And we have Nick Wagner back with us. Nick is a Senior Director of Product Management for GoldenGate at Oracle. Hi Nick! I think the last time we had an Oracle University course was when Oracle GoldenGate 12c was out. I'm sure there's been a lot of advancements since then. Can you walk us through those? Nick: GoldenGate 12.3 introduced the microservices architecture. GoldenGate 18c introduced support for Oracle Autonomous Data Warehouse and Autonomous Transaction Processing Databases. In GoldenGate 19c, we added the ability to do cross endian remote capture for Oracle, making it easier to set up the GoldenGate OCI service to capture from environments like Solaris, Spark, and HP-UX and replicate into the Cloud. Also, GoldenGate 19c introduced a simpler process for upgrades and installation of GoldenGate where we released something called a unified build. This means that when you install GoldenGate for a particular database, you don't need to worry about the database version when you install GoldenGate. Prior to this, you would have to install a version-specific and database-specific version of GoldenGate. So this really simplified that whole process. In GoldenGate 23ai, which is where we are now, this really is a huge release.  02:16 Nikita: Yeah, we covered some of the distributed AI features and high availability environments in our last episode. But can you give us an overview of everything that's in the 23ai release? I know there's a lot to get into but maybe you could highlight just the major ones? Nick: Within the AI and streaming environments, we've got interoperability for database vector types, heterogeneous capture and apply as well. Again, this is not just replication between Oracle-to-Oracle vector or Postgres to Postgres vector, it is heterogeneous just like the rest of GoldenGate. The entire UI has been redesigned and optimized for high speed. And so we have a lot of customers that have dozens and dozens of extracts and replicats and processes running and it was taking a long time for the UI to refresh those and to show what's going on within those systems. So the UI has been optimized to be able to handle those environments much better. We now have the ability to call database functions directly from call map. And so when you do transformation with GoldenGate, we have about 50 or 60 built-in transformation routines for string conversion, arithmetic operation, date manipulation. But we never had the ability to directly call a database function. 03:28 Lois: And now we do? Nick: So now you can actually call that database function, database stored procedure, database package, return a value and that can be used for transformation within GoldenGate. We have integration with identity providers, being able to use token-based authentication and integrate in with things like Azure Active Directory and your other single sign-on for the GoldenGate product itself. Within Oracle 23ai, there's a number of new features. One of those cool features is something called lock-free reservation columns. So this allows you to have a row, a single row within a table and you can identify a column within that row that's like an inventory column. And you can have multiple different users and multiple different transactions all updating that column within that same exact row at that same time. So you no longer have row-level locking for these reservation columns. And it allows you to do things like shopping carts very easily. If I have 500 widgets to sell, I'm going to let any number of transactions come in and subtract from that inventory column. And then once it gets below a certain point, then I'll start enforcing that row-level locking. 04:43 Lois: That's really cool… Nick: The one key thing that I wanted to mention here is that because of the way that the lock-free reservations work, you can have multiple transactions open on the same row. This is only supported for Oracle to Oracle. You need to have that same lock-free reservation data type and availability on that target system if GoldenGate is going to replicate into it. 05:05 Nikita: Are there any new features related to the diagnosability and observability of GoldenGate?  Nick: We've improved the AWR reports in Oracle 23ai. There's now seven sections that are specific to Oracle GoldenGate to allow you to really go in and see exactly what the GoldenGate processes are doing and how they're behaving inside the database itself. And there's a Replication Performance Advisor package inside that database, and that's been integrated into the Web UI as well. So now you can actually get information out of the replication advisor package in Oracle directly from the UI without having to log into the database and try to run any database procedures to get it. We've also added the ability to support a per-PDB Extract.  So in the past, when GoldenGate would run on a multitenant database, a multitenant database in Oracle, all the redo data from any pluggable database gets sent to that one redo stream. And so you would have to configure GoldenGate at the container or root level and it would be able to access anything at any PDB. Now, there's better security and better performance by doing what we call per-PDB Extract. And this means that for a single pluggable database, I can have an extract that runs at that database level that's going to capture information just from that pluggable database. 06:22 Lois And what about non-Oracle environments, Nick? Nick: We've also enhanced the non-Oracle environments as well. For example, in Postgres, we've added support for precise instantiation using Postgres snapshots. This eliminates the need to handle collisions when you're doing Postgres to Postgres replication and initial instantiation. On the GoldenGate for big data side, we've renamed that product more aptly to distributed applications in analytics, which is really what it does, and we've added a whole bunch of new features here too. The ability to move data into Databricks, doing Google Pub/Sub delivery. We now have support for XAG within the GoldenGate for distributed applications and analytics. What that means is that now you can follow all of our MAA best practices for GoldenGate for Oracle, but it also works for the DAA product as well, meaning that if it's running on one node of a cluster and that node fails, it'll restart itself on another node in the cluster. We've also added the ability to deliver data to Redis, Google BigQuery, stage and merge functionality for better performance into the BigQuery product. And then we've added a completely new feature, and this is something called streaming data and apps and we're calling it AsyncAPI and CloudEvent data streaming. It's a long name, but what that means is that we now have the ability to publish changes from a GoldenGate trail file out to end users. And so this allows through the Web UI or through the REST API, you can now come into GoldenGate and through the distributed applications and analytics product, actually set up a subscription to a GoldenGate trail file. And so this allows us to push data into messaging environments, or you can simply subscribe to changes and it doesn't have to be the whole trail file, it can just be a subset. You can specify exactly which tables and you can put filters on that. You can also set up your topologies as well. So, it's a really cool feature that we've added here. 08:26 Nikita: Ok, you've given us a lot of updates about what GoldenGate can support. But can we also get some specifics? Nick: So as far as what we have, on the Oracle Database side, there's a ton of different Oracle databases we support, including the Autonomous Databases and all the different flavors of them, your Oracle Database Appliance, your Base Database Service within OCI, your of course, Standard and Enterprise Edition, as well as all the different flavors of Exadata, are all supported with GoldenGate. This is all for capture and delivery. And this is all versions as well. GoldenGate supports Oracle 23ai and below. We also have a ton of non-Oracle databases in different Cloud stores. On an non-Oracle side, we support everything from application-specific databases like FairCom DB, all the way to more advanced applications like Snowflake, which there's a vast user base for that. We also support a lot of different cloud stores and these again, are non-Oracle, nonrelational systems, or they can be relational databases. We also support a lot of big data platforms and this is part of the distributed applications and analytics side of things where you have the ability to replicate to different Apache environments, different Cloudera environments. We also support a number of open-source systems, including things like Apache Cassandra, MySQL Community Edition, a lot of different Postgres open source databases along with MariaDB. And then we have a bunch of streaming event products, NoSQL data stores, and even Oracle applications that we support. So there's absolutely a ton of different environments that GoldenGate supports. There are additional Oracle databases that we support and this includes the Oracle Metadata Service, as well as Oracle MySQL, including MySQL HeatWave. Oracle also has Oracle NoSQL Spatial and Graph and times 10 products, which again are all supported by GoldenGate. 10:23 Lois: Wow, that's a lot of information! Nick: One of the things that we didn't really cover was the different SaaS applications, which we've got like Cerner, Fusion Cloud, Hospitality, Retail, MICROS, Oracle Transportation, JD Edwards, Siebel, and on and on and on.  And again, because of the nature of GoldenGate, it's heterogeneous. Any source can talk to any target. And so it doesn't have to be, oh, I'm pulling from Oracle Fusion Cloud, that means I have to go to an Oracle Database on the target, not necessarily.  10:51 Lois: So, there's really a massive amount of flexibility built into the system.  11:00 Unlock the power of AI Vector Search with our new course and certification. Get more accurate search results, handle complex datasets easily, and supercharge your data-driven decisions. From now through May 15, 2025, we are waiving the certification exam fee (valued at $245). Visit mylearn.oracle.com to enroll. 11:26 Nikita: Welcome back! Now that we've gone through the base product, what other features or products are in the GoldenGate family itself, Nick? Nick: So we have quite a few. We've kind of touched already on GoldenGate for Oracle databases and non-Oracle databases. We also have something called GoldenGate for Mainframe, which right now is covered under the GoldenGate for non-Oracle, but there is a licensing difference there. So that's something to be aware of. We also have the OCI GoldenGate product. We are announcing and we have announced that OCI GoldenGate will also be made available as part of the Oracle Database@Azure and Oracle Database@ Google Cloud partnerships.  And then you'll be able to use that vendor's cloud credits to actually pay for the OCI GoldenGate product. One of the cool things about this is it will have full feature parity with OCI GoldenGate running in OCI. So all the same features, all the same sources and targets, all the same topologies be able to migrate data in and out of those clouds at will, just like you do with OCI GoldenGate today running in OCI.  We have Oracle GoldenGate Free.  This is a completely free edition of GoldenGate to use. It is limited on the number of platforms that it supports as far as sources and targets and the size of the database.  12:45 Lois: But it's a great way for developers to really experience GoldenGate without worrying about a license, right? What's next, Nick? Nick: We have GoldenGate for Distributed Applications and Analytics, which was formerly called GoldenGate for big data, and that allows us to do all the streaming. That's also where the GoldenGate AsyncAPI integration is done. So in order to publish the GoldenGate trail files or allow people to subscribe to them, it would be covered under the Oracle GoldenGate Distributed Applications and Analytics license. We also have OCI GoldenGate Marketplace, which allows you to run essentially the on-premises version of GoldenGate but within OCI. So a little bit more flexibility there. It also has a hub architecture. So if you need that 99.99% availability, you can get it within the OCI Marketplace environment. We have GoldenGate for Oracle Enterprise Manager Cloud Control, which used to be called Oracle Enterprise Manager. And this allows you to use Enterprise Manager Cloud Control to get all the statistics and details about GoldenGate. So all the reporting information, all the analytics, all the statistics, how fast GoldenGate is replicating, what's the lag, what's the performance of each of the processes, how much data am I sending across a network. All that's available within the plug-in. We also have Oracle GoldenGate Veridata. This is a nice utility and tool that allows you to compare two databases, whether or not GoldenGate is running between them and actually tell you, hey, these two systems are out of sync. And if they are out of sync, it actually allows you to repair the data too. 14:25 Nikita: That's really valuable…. Nick: And it does this comparison without locking the source or the target tables. The other really cool thing about Veridata is it does this while there's data in flight. So let's say that the GoldenGate lag is 15 or 20 seconds and I want to compare this table that has 10 million rows in it. The Veridata product will go out, run its comparison once. Once that comparison is done the first time, it's then going to have a list of rows that are potentially out of sync. Well, some of those rows could have been moved over or could have been modified during that 10 to 15 second window. And so the next time you run Veridata, it's actually going to go through. It's going to check just those rows that were potentially out of sync to see if they're really out of sync or not. And if it comes back and says, hey, out of those potential rows, there's two out of sync, it'll actually produce a script that allows you to resynchronize those systems and repair them. So it's a very cool product.  15:19 Nikita: What about GoldenGate Stream Analytics? I know you mentioned it in the last episode, but in the context of this discussion, can you tell us a little more about it?  Nick: This is the ability to essentially stream data from a GoldenGate trail file, and they do a real time analytics on it. And also things like geofencing or real-time series analysis of it.  15:40 Lois: Could you give us an example of this? Nick: If I'm working in tracking stock market information and stocks, it's not really that important on how much or how far down a stock goes. What's really important is how quickly did that stock rise or how quickly did that stock fall. And that's something that GoldenGate Stream Analytics product can do. Another thing that it's very valuable for is the geofencing. I can have an application on my phone and I can track where the user is based on that application and all that information goes into a database. I can then use the geofencing tool to say that, hey, if one of those users on that app gets within a certain distance of one of my brick-and-mortar stores, I can actually send them a push notification to say, hey, come on in and you can order your favorite drink just by clicking Yes, and we'll have it ready for you. And so there's a lot of things that you can do there to help upsell your customers and to get more revenue just through GoldenGate itself. And then we also have a GoldenGate Migration Utility, which allows customers to migrate from the classic architecture into the microservices architecture. 16:44 Nikita: Thanks Nick for that comprehensive overview.  Lois: In our next episode, we'll have Nick back with us to talk about commonly used terminology and the GoldenGate architecture. And if you want to learn more about what we discussed today, visit mylearn.oracle.com and take a look at the Oracle GoldenGate 23ai Fundamentals course. Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 17:10 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Oracle Database@Azure

Oracle University Podcast

Play Episode Listen Later Mar 25, 2025 15:12


The final episode of the multicloud series focuses on Oracle Database@Azure, a powerful cloud database solution. Hosts Lois Houston and Nikita Abraham, along with Senior Manager of CSS OU Cloud Delivery Samvit Mishra, discuss how this service allows customers to run Oracle databases within the Microsoft Azure data center, simplifying deployment and management. The discussion also highlights the benefits of native integration with Azure services, eliminating the need for complex networking setups.   Oracle Cloud Infrastructure Multicloud Architect Professional: https://mylearn.oracle.com/ou/course/oracle-cloud-infrastructure-multicloud-architect-professional-2025-/144474 Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X: https://x.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Kris-Ann Nansen, and the OU Studio Team for helping us create this episode.   ---------------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:25 Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! For the last two weeks, we've been talking about different aspects of multicloud. In the final episode of this three-part series, Samvit Mishra, Senior Manager of CSS OU Cloud Delivery, joins us once again to tell us about the Oracle Database@Azure service. Hi Samvit! Thanks for being here today. Samvit: Hi Niki! Hi Lois! Happy to be back. 01:01 Lois: In our last episode, we spoke about the strategic partnership between Oracle and Microsoft, and specifically discussed the Oracle Interconnect for Azure.  Nikita: Yeah, and Oracle Database@Azure is yet another addition to this partnership. What can you tell us about this service, Samvit? Samvit: The Oracle Database@Azure service, which was made generally available in 2023, runs right inside the Microsoft Azure data center and uses Azure networking. The entire Oracle Cloud Database Service infrastructure resides in the Azure data center, while it is managed by an expert Oracle Cloud Infrastructure operations team.  It provides customers simple and secure access to Oracle Cloud database services within their chosen Azure deployment region, without getting into the complexity of managing networking between the cloud vendors. It is natively integrated with various Microsoft Azure services. This provides a seamless user experience when configuring and using the different Azure services with OCI Oracle database, since much of the complexity associated with the configuration is greatly simplified.  There is no need to set up a private interconnect between Microsoft Azure and OCI because the service itself resides within the Azure data center and uses the Azure network. This is very beneficial in terms of strategic deployment because customers can experience microseconds network latency between the endpoints, while receiving a high-performance database environment.  02:42 Nikita: How do I get started with the Oracle Database@Azure service? Samvit: You begin by purchasing the subscription from Oracle and setting up your billing account. Then you provision the database, resources, and service.  With that you are ready to configure your application to connect to the database and work on the remaining deployment. As you continue using the service, you can monitor the different resource metrics using the Azure monitoring services and analyze those logs using Azure Log Analytics.  03:15 Lois: So, the adoption is pretty easy, then. What about the responsibilities? Who is responsible for what? Samvit: The Oracle Cloud operations team is entirely responsible for managing the Exadata Database Infrastructure and the VM cluster resources that are provisioned in the Microsoft Azure data center.  Oracle is responsible for maintaining the service software and infrastructure by applying updates as they are released. Any issues arising from the OCI Database Service and the resources will be addressed by Oracle Support. You have to raise a support ticket for them to investigate and provide a resolution.  And as Azure customers, you have to do rightsizing, based on your workload needs, and provision the Exadata Database Infrastructure and VM cluster in the OCI pod within the Azure data center. You have to provision the database in Exadata Database Service, apply the database and system updates, and take advantage of the cloud automation to maintain and manage the database.  You have to load data, establish the connectivity, and support development on your database. As a customer, you monitor the database and infrastructure metrics and events, and also analyze those logs using the Microsoft Azure-provided native services.  04:42 Nikita: Samvit, what sort of challenges were being faced by customers that necessitated the creation of the Oracle Database@Azure service?  Samvit: A common deployment scenario in customer environments was that a lot of critical applications, which could be packaged applications, in-house applications, or customized third-party applications, used Oracle Database as their primary database solution.  These Oracle databases were deployed in Exadata Infrastructure on-premises or even in Enterprise Server hardware. Some customers evaluated and migrated many of their packaged and other applications to Microsoft Azure compute. Since Oracle Exadata was not supported in Azure, they had to configure a hybrid deployment in order to use Oracle databases that reside in the Exadata infrastructure on-premises.  They needed to configure a dedicated and secure network between the Azure data center and their on-premises data center. This added complexity, incurred high costs, had a latency effect, and was even unreliable. There were also cases where customers migrated Oracle databases on Enterprise Server on-premises to Oracle databases hosted on Azure compute.  This did not boost efficiency to a large scale. And those were the only options available when provisioning Oracle Database in Azure because Exadata was not available earlier in Azure. 06:18 Lois: And how has that been resolved now? Samvit: With the Oracle Database@Azure service, customer requirements have been aptly met by allowing them to host their Oracle databases on Exadata infrastructure, right next to their application in the Azure data center.  Customers, while migrating their applications to Azure compute, can also migrate their Oracle databases on-premises on Exadata infrastructure directly to Exadata Database Service in Azure. And Oracle databases that are on Enterprise Server on-premises can be consolidated directly into Exadata Database Service in Azure, providing them the benefits of scalability, security, performance, and availability, all that are inherent property of OCI Oracle Exadata Database Service.  Customers can see growth in the operational efficiency, saving on the overall cost.  07:17 Nikita: Can you take us through the process of deployment?  Samvit: It's quite simple, actually. First, you deploy the Exadata Database Service that is plugged into Azure VNET. Next, you provision the required number of databases, which might be migrated as is or with a consolidated exercise.  You can use any of the Oracle database tools or utilities to do the migration or even use the Oracle Zero Downtime Migration method to automate the entire Oracle database migration. Finally, migrate your enterprise application into the Azure environment.  Establish the required network configuration to allow communication between the migrated applications and Oracle databases.  And then you are all set to publish your application that is running entirely in Azure. You can leverage other Azure services, like monitoring, log analytics, Power BI, or DevOps tools, to enhance existing or even build and deploy newer enterprise applications that are powered by OCI Oracle Database Service in the back end.  08:25 Lois: What about multi-cloud deployment scenarios where applications reside in Azure, but the Oracle databases are deployed on third-party cloud providers, either as a native solution or in computes? Samvit: These Oracle databases can be migrated to Exadata Database Service in the Oracle Database@Azure service.  There is no need for the complex cross-cloud connectivity setup between the vendors. And at the same time, you experience the lowest latency between the application and the database deployment. 09:05 Want to learn how to design stunning, responsive enterprise applications directly from your browser with minimal coding? The new Oracle APEX Developer Professional learning path and certification enables you to leverage AI-assisted development, including generative AI and Database 23ai, to build secure, scalable web and mobile applications with advanced AI-powered features. From now through May 15, 2025, we're waiving the certification exam fee (valued at $245). So, what are you waiting for? Visit mylearn.oracle.com to get started today. 09:45 Nikita: Welcome back! Samvit, what's the onboarding process like? Samvit: You have to complete the onboarding process to use the service in Microsoft Azure. But before you do that, you first have to complete the subscription process. You must have an active Microsoft Azure account subscription that will be used for subscribing and onboarding the Oracle Database@Azure service.  To subscribe to Oracle Database@Azure, you need to purchase an Oracle Database@Azure private offer from Azure Marketplace. As a customer, you will first reach out to Oracle Sales and negotiate a price for the service. Oracle will provide you with the billing account ID and contact details of the person within the organization who will be handling the service.  After this, Oracle will create a private offer in Azure Marketplace.  10:40 Lois: Sorry to interrupt you, but what's a private offer? Samvit: That's alright, Lois. Private offers are basically solutions or services created for customers by a Microsoft partner, which, in this case, is Oracle. Purchase of those private offers happens from the private offer management page of Azure Marketplace.  But there is a prerequisite. The Azure account must be enabled to make private offer purchases on the subscription from Azure Marketplace. You can refer to the Azure documentation to enable the account, if it is not enabled. You review the offer terms and accept the purchase offer, which will take you to the Create Oracle Subscription page.  You validate the subscription and other particulars and proceed with the process. After the service is deployed, the purchase status of the private offer changes to subscribed.  There are a few points to note here. Billing and payment are done via Azure, and you can use Microsoft Azure Consumption Commitment.  You can also use your on-premises licenses with the Bring Your Own License option and the Unlimited License Agreements to pay towards your service consumption. And you also receive Oracle Support rewards for every dollar spent on the service.  12:02 Nikita: OK, now that I'm subscribed, what's next? Samvit: After you complete the subscription step, Oracle Database@Azure will appear as an Azure resource, just like any other Azure service, and you can move on to onboarding. Onboarding begins with the linking of your OCI account, which will be used for provisioning and managing database resources.  The account is also used for provisioning infrastructure and software maintenance updates for the database service. You can either provide an existing OCI account or create a new one. Then you set up Identity Federation between the Azure account and the OCI tenancy.  This can authenticate login to the OCI portal using Azure credentials, which you require while performing certain operations in OCI. For example, provisioning databases, getting infrastructure and software maintenance updates, and so on. This is an optional step, but it is recommended that you complete the Federation.  The last step is to authorize users by assigning groups and roles in order to have the needed privileges to perform different operations. For example, some groups of users can manage Exadata Database Service resources in Azure, while some can manage the databases in OCI.  You can refer to OCI documentation to get detailed descriptions of roles and group names. 13:31 Lois: Right. That will ensure you assign the correct permissions to the appropriate users. Samvit: Exactly. Assigning the correct roles and permissions to individuals inside the organization is a necessary step for transacting in the marketplace and guaranteeing a smooth purchasing experience. Azure Marketplace uses Azure Role-Based Access Control to enable you to acquire solutions certified to run on Azure. Those are then going to determine the purchasing privileges within the organization.  14:03 Nikita: There's so much more we can discuss about Oracle Database@Azure, but we have to stop somewhere! Thank you so much, Samvit, for joining us over these last three episodes. Lois: Yeah, it's been great to have you, Samvit. Samvit: Thank you for having me. Nikita: Remember, we also have the Oracle Database@Google Cloud service. So, if you want to learn about that, or even if you want to dive deeper into the topics we covered today, go to mylearn.oracle.com and search for the Oracle Cloud Infrastructure Multicloud Architect Professional course.  Lois: There are a lot of demonstrations that you'll surely find useful. Well, that's all we have for today. Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 14:43 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Best of 2024: Autonomous Database on Serverless Infrastructure

Oracle University Podcast

Play Episode Listen Later Dec 17, 2024 17:25


Want to quickly provision your autonomous database? Then look no further than Oracle Autonomous Database Serverless, one of the two deployment choices offered by Oracle Autonomous Database.   Autonomous Database Serverless delegates all operational decisions to Oracle, providing you with a completely autonomous experience.   Join hosts Lois Houston and Nikita Abraham, along with Oracle Database experts, as they discuss how serverless infrastructure eliminates the need to configure any hardware or install any software because Autonomous Database handles provisioning the database, backing it up, patching and upgrading it, and growing or shrinking it for you.   Survey: https://customersurveys.oracle.com/ords/surveys/t/oracle-university-gtm/survey?k=focus-group-2-link-share-5   Oracle MyLearn: https://mylearn.oracle.com/   Oracle University Learning Community: https://education.oracle.com/ou-community   LinkedIn: https://www.linkedin.com/showcase/oracle-university/   X (formerly Twitter): https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Rajeev Grover, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------   Episode Transcript:   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. 00:26 Lois: Hello and welcome to the Oracle University Podcast! I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Team Lead: Editorial Services. Nikita: Hi everyone! We hope you've been enjoying these last few weeks as we've been revisiting our most popular episodes of the year.  Lois: Today's episode is the last one in this series and is a throwback to a conversation on Autonomous Databases on Serverless Infrastructure with three experts in the field: Hannah Nguyen, Sean Stacey, and Kay Malcolm. Hannah is a Staff Cloud Engineer, Sean is the Director of Platform Technology Solutions, and Kay is Vice President of Database Product Management. For this episode, we'll be sharing portions of our conversations with them.  01:14 Nikita: We began by asking Hannah how Oracle Cloud handles the process of provisioning an  Autonomous Database. So, let's jump right in! Hannah: The Oracle Cloud automates the process of provisioning an Autonomous Database, and it automatically provisions for you a highly scalable, highly secure, and a highly available database very simply out of the box. 01:35 Lois: Hannah, what are the components and architecture involved when provisioning an Autonomous Database in Oracle Cloud? Hannah: Provisioning the database involves very few steps. But it's important to understand the components that are part of the provisioned environment. When provisioning a database, the number of CPUs in increments of 1 for serverless, storage in increments of 1 terabyte, and backup are automatically provisioned and enabled in the database. In the background, an Oracle 19c pluggable database is being added to the container database that manages all the user's Autonomous Databases. Because this Autonomous Database runs on Exadata systems, Real Application Clusters is also provisioned in the background to support the on-demand CPU scalability of the service. This is transparent to the user and administrator of the service. But be aware it is there. 02:28 Nikita: Ok…So, what sort of flexibility does the Autonomous Database provide when it comes to managing resource usage and costs, you know… especially in terms of starting, stopping, and scaling instances? Hannah: The Autonomous Database allows you to start your instance very rapidly on demand. It also allows you to stop your instance on demand as well to conserve resources and to pause billing. Do be aware that when you do pause billing, you will not be charged for any CPU cycles because your instance will be stopped. However, you'll still be incurring charges for your monthly billing for your storage. In addition to allowing you to start and stop your instance on demand, it's also possible to scale your database instance on demand as well. All of this can be done very easily using the Database Cloud Console. 03:15 Lois: What about scaling in the Autonomous Database? Hannah: So you can scale up your OCPUs without touching your storage and scale it back down, and you can do the same with your storage. In addition to that, you can also set up autoscaling. So the database, whenever it detects the need, will automatically scale up to three times the base level number of OCPUs that you have allocated or provisioned for the Autonomous Database. 03:38 Nikita: Is autoscaling available for all tiers?  Hannah: Autoscaling is not available for an always free database, but it is enabled by default for other tiered environments. Changing the setting does not require downtime. So this can also be set dynamically. One of the advantages of autoscaling is cost because you're billed based on the average number of OCPUs consumed during an hour. 04:01 Lois: Thanks, Hannah! Now, let's bring Sean into the conversation. Hey Sean, I want to talk about moving an autonomous database resource. When or why would I need to move an autonomous database resource from one compartment to another? Sean: There may be a business requirement where you need to move an autonomous database resource, serverless resource, from one compartment to another. Perhaps, there's a different subnet that you would like to move that autonomous database to, or perhaps there's some business applications that are within or accessible or available in that other compartment that you wish to move your autonomous database to take advantage of. 04:36 Nikita: And how simple is this process of moving an autonomous database from one compartment to another? What happens to the backups during this transition? Sean: The way you can do this is simply to take an autonomous database and move it from compartment A to compartment B. And when you do so, the backups, or the automatic backups that are associated with that autonomous database, will be moved with that autonomous database as well. 05:00 Lois: Is there anything that I need to keep in mind when I'm moving an autonomous database between compartments?  Sean: A couple of things to be aware of when doing this is, first of all, you must have the appropriate privileges in that compartment in order to move that autonomous database both from the source compartment to the target compartment. In addition to that, once the autonomous database is moved to this new compartment, any policies or anything that's defined in that compartment to govern the authorization and privileges of that said user in that compartment will be applied immediately to that new autonomous database that has been moved into that new compartment. 05:38 Nikita: Sean, I want to ask you about cloning in Autonomous Database. What are the different types of clones that can be created?  Sean: It's possible to create a new Autonomous Database as a clone of an existing Autonomous Database. This can be done as a full copy of that existing Autonomous Database, or it can be done as a metadata copy, where the objects and tables are cloned, but they are empty. So there's no rows in the tables. And this clone can be taken from a live running Autonomous Database or even from a backup. So you can take a backup and clone that to a completely new database. 06:13 Lois: But why would you clone in the first place? What are the benefits of this?  Sean: When cloning or when creating this clone, it can be created in a completely new compartment from where the source Autonomous Database was originally located. So it's a nice way of moving one database to another compartment to allow developers or another community of users to have access to that environment. 06:36 Nikita: I know that along with having a full clone, you can also have a refreshable clone. Can you tell us more about that? Who is responsible for this? Sean: It's possible to create a refreshable clone from an Autonomous Database. And this is one that would be synced with that source database up to so many days. The task of keeping that refreshable clone in sync with that source database rests upon the shoulders of the administrator. The administrator is the person who is responsible for performing that sync operation. Now, actually performing the operation is very simple, it's point and click. And it's an automated process from the database console. And also be aware that refreshable clones can trail the source database or source Autonomous Database up to seven days. After that period of time, the refreshable clone, if it has not been refreshed or kept in sync with that source database, it will become a standalone, read-only copy of that original source database. 07:38 Nikita: Ok Sean, so if you had to give us the key takeaways on cloning an Autonomous Database, what would they be?  Sean: It's very easy and a lot of flexibility when it comes to cloning an Autonomous Database. We have different models that you can take from a live running database instance with zero impact on your workload or from a backup. It can be a full copy, or it can be a metadata copy, as well as a refreshable, read-only clone of a source database. 08:12 Did you know that Oracle University offers free courses on Oracle Cloud Infrastructure? You'll find training on everything from cloud computing, database, and security to artificial intelligence and machine learning, all of which is available free to subscribers. So, get going! Pick a course of your choice, get certified, join the Oracle University Learning Community, and network with your peers. If you're already an Oracle MyLearn user, go to MyLearn to begin your journey. If you have not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started.  08:50 Nikita: Welcome back! Thank you, Sean, and hi Kay! I want to ask you about events and notifications in Autonomous Database. Where do they really come in handy?  Kay: Events can be used for a variety of notifications, including admin password expiration, ADB services going down, and wallet expiration warnings. There's this service, and it's called the notifications service. It's part of OCI. And this service provides you with the ability to broadcast messages to distributed components using a publish and subscribe model. These notifications can be used to notify you when event rules or alarms are triggered or simply to directly publish a message. In addition to this, there's also something that's called a topic. This is a communication channel for sending messages to subscribers in the topic. You can manage these topics and their subscriptions really easy. It's not hard to do at all. 09:52 Lois: Kay, I want to ask you about backing up Autonomous Databases. How does Autonomous Database handle backups? Kay: Autonomous Database automatically backs up your database for you. The retention period for backups is 60 days. You can restore and recover your database to any point in time during this retention period. You can initiate recovery for your Autonomous Database by using the cloud console or an API call. Autonomous Database automatically restores and recovers your database to the point in time that you specify. In addition to a point in time recovery, we can also perform a restore from a specific backup set.  10:37 Lois: Kay, you spoke about automatic backups, but what about manual backups?  Kay: You can do manual backups using the cloud console, for example, if you want to take a backup say before a major change to make restoring and recovery faster. These manual backups are put in your cloud object storage bucket. 10:58 Nikita: Are there any special instructions that we need to follow when configuring a manual backup? Kay: The manual backup configuration tasks are a one-time operation. Once this is configured, you can go ahead, trigger your manual backup any time you wish after that. When creating the object storage bucket for the manual backups, it is really important-- so I don't want you to forget-- that the name format for the bucket and the object storage follows this naming convention. It should be backup underscore database name. And it's not the display name here when I say database name. In addition to that, the object name has to be all lowercase. So three rules. Backup underscore database name, and the specific database name is not the display name. It has to be in lowercase. Once you've created your object storage bucket to meet these rules, you then go ahead and set a database property. Default_backup_bucket. This points to the object storage URL and it's using the Swift protocol. Once you've got your object storage bucket mapped and you've created your mapping to the object storage location, you then need to go ahead and create a database credential inside your database. You may have already had this in place for other purposes, like maybe you were loading data, you were using Data Pump, et cetera. If you don't, you would need to create this specifically for your manual backups. Once you've done so, you can then go ahead and set your property to that default credential that you created. So once you follow these steps as I pointed out, you only have to do it one time. Once it's configured, you can go ahead and use it from now on for your manual backups. 13:00 Lois: Kay, the last topic I want to talk about before we let you go is Autonomous Data Guard. Can you tell us about it? Kay: Autonomous Data Guard monitors the primary database, in other words, the database that you're using right now.  13:14 Lois: So, if ADB goes down… Kay: Then the standby instance will automatically become the primary instance. There's no manual intervention required. So failover from the primary database to that standby database I mentioned, it's completely seamless and it doesn't require any additional wallets to be downloaded or any new URLs to access APEX or Oracle Machine Learning. Even Oracle REST Data Services. All the URLs and all the wallets, everything that you need to authenticate, to connect to your database, they all remain the same for you if you have to failover to your standby database. 13:58 Lois: And what happens after a failover occurs? Kay: After performing a failover, a new standby for your primary will automatically be provisioned. So in other words, in performing a failover your standby does become your new primary. Any new standby is made for that primary. I know, it's kind of interesting. So currently, the standby database is created in the same region as the primary database. For better resilience, if your database is provisioned, it would be available on AD1 or Availability Domain 1. My secondary, or my standby, would be provisioned on a different availability domain. 14:49 Nikita: But there's also the possibility of manual failover, right? What are the differences between automatic and manual failover scenarios? When would you recommend using each? Kay: So in the case of the automatic failover scenario following a disastrous situation, if the primary ADB becomes completely unavailable, the switchover button will turn to a failover button. Because remember, this is a disaster. Automatic failover is automatically triggered. There's no user action required. So if you're asleep and something happens, you're protected. There's no user action required, but automatic failover is allowed to succeed only when no data loss will occur. For manual failover scenarios in the rare case when an automatic failover is unsuccessful, the switchover button will become a failover button and the user can trigger a manual failover should they wish to do so. The system automatically recovers as much data as possible, minimizing any potential data loss. But you can see anywhere from a few seconds or minutes of data loss. Now, you should only perform a manual failover in a true disaster scenario, expecting the fact that a few minutes of potential data loss could occur, to ensure that your database is back online as soon as possible.  16:23 Lois: We hope you've enjoyed revisiting some of our most popular episodes over these past few weeks. We always appreciate your feedback and suggestions so remember to take that quick survey we've put out. You'll find it in the show notes for today's episode. Thanks a lot for your support. We're taking a break for the next two weeks and will be back with a brand-new season of the Oracle University Podcast in January. Happy holidays, everyone! Nikita: Happy holidays! Until next time, this is Nikita Abraham... Lois: And Lois Houston, signing off! 16:56 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Techzine Talks
Het 'open multi-cloud tijdperk' is begonnen volgens Oracle: wat houdt dat in?

Techzine Talks

Play Episode Listen Later Sep 16, 2024 28:39


De cloud is volop in beweging. Dat was vorige week tijdens Oracle #CloudWorld in Las Vegas ook weer heel erg duidelijk zichtbaar. Wij waren erbij en bespraken aan het einde van die week de stand van zaken met Wilfred Scholman van Oracle. Wil je weten hoe de multi-cloud er in 2024 uitziet, ook vanuit Nederlands perspectief, luister dan naar deze aflevering. Tijdens Oracle CloudWorld ging het uiteraard veel over #AI. Wat ons betreft waren alle aankondigingen en sessies daarover weliswaar interessant, maar niet het belangrijkste. Dat was zonder twijfel de aankondiging dat #Oracle en #AWS na vele jaren de strijdbijl hebben begraven. De twee bedrijven kondigden namelijk gezamenlijk Oracle Database@AWS aan. Er komt dus Oracle-hardware in datacenters van AWS te staan. De Autonomous Database en Exadata van Oracle waren al beschikbaar binnen de datacenters van Microsoft #Azure en #Google Cloud, maar nu dus ook binnen de "populairste public cloud" van allemaal, AWS. Dat zijn overigens niet onze woorden, maar die van AWS-CEO Matt Garman, vorige week op het podium tijdens Oracle CloudWorld, in een gesprek met Larry Ellison. Dat AWS, dat lange tijd vooral heeft geprobeerd om alles wat met Oracle te maken had, met chirurgische precisie uit de eigen datacenters had gehaald op het podium stond bij Oracle, was al erg bijzonder. Dat iemand in aanwezigheid van Ellison expliciet niet Oracle maar AWS als aanbieder van de populairste cloud benoemde, deed de oren nog wat harder klapperen.Open multi-cloud eraDe stap die Oracle en AWS gezamenlijk hebben gezet is eigenlijk alleen maar logisch. Allereerst omdat de Oracle Database inmiddels al in Microsoft Azure en Google Cloud beschikbaar is. Dat gebeurt niet voor de lol, dat is omdat er kennelijk voldoende klanten zijn om dit interessant te maken voor de deelnemende partijen. Er zijn ongetwijfeld ook veel AWS-klanten die de Oracle Database gebruiken.Daarnaast is het steeds duidelijker dat er relatief weinig klanten zijn die alles bij dezelfde partij afnemen of willen afnemen. Veel organisaties gebruiken meerdere clouds. Dan is het niet meer dan logisch dat die clouds beter met elkaar gaan samenwerken. Daar is de samenwerking tussen Oracle en AWS een volgend voorbeeld van. Of dit ook betekent dat er een Interconnect komt tussen Oracle en AWS (die er al is met Azure en GCP), is nog niet duidelijk. Maar dat zou een volgende stap kunnen zijn.Volgens Ellison is de aankondiging van vorige week het teken dat het 'open #multi-cloud tijdperk' echt is begonnen. Coen en Sander bespreken in een podcast opgenomen in Las Vegas tijdens CloudWorld wat dit betekent. Dit doen ze met Wilfred Scholman van Oracle. Wilfred is country lead voor Oracle Nederland, maar heeft ook een interessante Europese functie. Daar hebben we het ook zeker over in deze nieuwe aflevering van #Techzine Talks.

Audio News
ORACLE PRESENTA EXADATA EXASCALE

Audio News

Play Episode Listen Later Jul 22, 2024 7:11


Siendo una innovadora arquitectura de datos inteligente que reduce los costos de infraestructura hasta en un 95 %, Exadata Exascale de Oracle facilita a organizaciones de todos los tamaños beneficiarse de su uso. Combinando la avanzada base de datos Exadata con la flexibilidad de la nube, este nuevo servicio proporciona un rendimiento excepcional para inteligencia artificial, análisis y cargas de trabajo críticas.

Futurum Tech Podcast
Q1 Earnings and another busy week in Tech! - Infrastructure Matters, Episode 39

Futurum Tech Podcast

Play Episode Listen Later May 1, 2024 32:57


In this episode of Infrastructure Matters, hosts Steven Dickens and Camberley Bates cover the Rubrik IPO, Oracle's positioning in the cloud market, plus IBM's Storage announcements. Key topics covered:  Rubric IPO: Rubric, a data protection company, went public with a $5.6 billion valuation, emphasizing its scale-out data protection capability and its focus on cybersecurity. IBM Earnings and Acquisition: IBM reported solid earnings, with notable growth in software and hybrid cloud, and announced the acquisition of HashiCorp for $35 a share, enhancing its open-source support model. IBM Storage Announcements: IBM introduced Storage Assurance Perpetual, a program aimed at keeping customers' storage infrastructure current over an eight-year period, and unveiled enhancements to its virtualized storage product, SVC, focusing on replication and mirroring capabilities. Google Cloud Growth: Google Cloud's revenue surged 28% year-on-year, nearing $10 billion for the quarter, signaling strong sales execution and growth momentum in the enterprise market. Oracle Developments: Oracle showcased its Exadata platform's cloud connectivity and cost-effectiveness, positioning itself as a competitive player in the IaaS space. Larry Ellison's involvement highlighted the company's AI strategy and its integration with the Cerner acquisition for healthcare innovation.  

Oracle University Podcast
Everything You Need to Know to Get Certified on Oracle Autonomous Database

Oracle University Podcast

Play Episode Listen Later Jan 23, 2024 15:58


How do I get certified in Oracle Autonomous Database? What material can I use to prepare for it? What's the exam like? How long is the certification valid for?   If these questions have been keeping you up at night, then join Lois Houston and Nikita Abraham in their conversation with Senior Principal OCI Instructor Susan Jang to understand the process of getting certified and begin your learning adventure.   Oracle MyLearn: mylearn.oracle.com/ Oracle University Learning Community: education.oracle.com/ou-community LinkedIn: linkedin.com/showcase/oracle-university/ X (formerly Twitter): twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------   Episode Transcript   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:26 Lois: Hello and welcome to the Oracle University Podcast. I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Principal Technical Editor. Nikita: Hi everyone! If you've listened to us these last few weeks, you'll know we've been discussing Oracle Autonomous Database in detail. We looked at Autonomous Database on serverless and dedicated infrastructure. 00:51 Lois: That's right, Niki. Then, last week, we explored Autonomous Database tools. Today, we thought we'd wrap up our focus on Autonomous Database by talking about the training offered by Oracle University, the associated certification, how to prepare for it, what you should do next, and more. Nikita: Yeah, we'll get answers to all the big questions. And we're going to get them from Susan Jang. Sue is a Senior Principal OCI Instructor with Oracle University. She has created and delivered training in Oracle databases and Oracle Cloud Infrastructure for over 20 years. Hi Sue! Thanks for joining us today. Sue: Happy to be here! 01:29 Lois: Sue, what training does Oracle have on Autonomous Database?   Sue: Oracle University offers a professional-level course called the Oracle Autonomous Database Administration Workshop. So, if you want to learn to deploy and administer autonomous databases, this is the one for you. You'll explore the fundamentals of the autonomous databases, their features, and benefits. You'll learn about the technical architecture, the tasks that are involved in creating an autonomous database on a shared and on a dedicated Exadata infrastructure. You'll discover what is the Machine Learning, you'll discover what is APEX, which is Application Express, and SQL Developer Web, which is all deployed with the Autonomous Database. So basically everything you need to take your skills to the next level and become a proficient database administrator is in this course. 02:28 Nikita: Who can take this course, Sue?    Sue: The course is really for anyone interested in Oracle Autonomous Database, whether you're a database administrator, a cloud data management professional, or a consultant. The topics in the course include everything from the features of an Autonomous Database through provisioning, managing, and monitor of the database. Most people think that just because it is an Autonomous Database, Oracle will do everything for you, and there is nothing a DBA can do or needs to do. But that's not true.   An Oracle Autonomous Database automates the day-to-day DBA tasks, like tuning the database to ensure it is running at performance level or that the backups are done successfully. By letting the Autonomous Database perform those tasks, it gives the database administrator time to fully understand the new features of an Oracle database and figure out how to implement the features that will benefit the DBA's company. 03:30 Lois: Would a non-database administrator benefit from taking this course?   Sue: Yes, Lois. Oracle courses are designed in modules, so you can focus on the modules that meet your needs. For example, if you're a senior technical manager, you may not need to manage and monitor the Autonomous Database. But still, it's important to understand its features and architecture to know how other Oracle products integrate with the database. 03:57 Nikita: Right. Talking about the course itself, each module consists of videos that teach different concepts, right? Sue: Yes, Niki. Each video covers one topic. A group of topics, or I should say a group of related topics, makes up a module. We know your time is important to you, and your success is important to us. You don't just want to spend time taking training. You want to know that you're really understanding the concepts of what you are learning.  So to help you do this, we have skill checks at the end of most modules. You must successfully answer 80% of the questions to pass these knowledge checks. These checks are an excellent way to ensure that you're on the right track and have the understanding of each module before you move on to the next one.  04:48   Lois: That's great. And are there any other resources to help reinforce what's been learned?   Sue: I grew up with this phrase from my Mom. Education was her career. I remember hearing, “I hear and I forget. I see and I remember. I do and I understand.” It's important to us that you understand the concepts and can actually “do” or “perform” the tasks.  You'll find several demos in the different modules of the Autonomous Database Administration Workshop. These videos are where the instructor shows you how to perform the tasks so you can reinforce what you learned in the lessons. You'll find demos on provisioning an autonomous database, creating an autonomous database clone, and configuring disaster recovery, and lots more.   Oracle also has what we call LiveLabs. These are a series of hands-on tutorials with step-by-step instructions to guide you through performing the tasks. 05:49 Nikita: I love the idea of LiveLabs. You can follow instructions on how to perform administrative tasks and then practice doing that on your own. Lois: Yeah, that's fantastic. OK Sue, say I've taken the course. What do I do next?  Sue: Well, after you've taken the course, you'll want to demonstrate your expertise with a certification. Because you want to get that better job. You want to increase your earning potential. You need to take the certification called the Oracle Autonomous Database Cloud Professional. We have a couple of resources to help you along the way to ensure you succeed in securing that certification. In MyLearn, the Oracle University online learning platform, you'll see that the course, Oracle Autonomous Database Administration Workshop, falls within a learning path called Become an Oracle Autonomous Database Cloud Professional. The course is the first section of this learning path. The next section is a video describing the certification exam and how to prepare for it. The section after that is a practice exam. Now, though it doesn't have the actual questions, you'll find the exam will give you a good idea of the type of questions that will be asked in the exam.  07:10 Lois: OK, so now I've done all that, and I'm ready to validate my knowledge and expertise. Tell me more about the certification, Sue. Sue: To get the certification, you must take an online exam. The duration of the exam is 90 minutes. It's a Multiple Choice format, and there are 60 questions to the exam.  By getting this certification, you're demonstrating to the world that you have the knowledge to provision, manage, and monitor, as well as migrate workloads to the Autonomous Database, on both a shared as well as a dedicated Exadata infrastructure. You will show you have the understanding of the architect of the Autonomous Database and can successfully use the features as well as its workflow, and you are capable of using Autonomous Database tools in developing an Autonomous Database. 08:05 Nikita: Great! So what do I need to do to take the exam? Sue: We assume you've already taken the course (making sure that you're up to date with the training), that you've taken the time to study the topics in depth rather than memorizing superficial information just to pass the exam, looked at the available preparation material, and you've also taken the practice exam. I highly recommend that you have the hands-on experience or practice on an Autonomous Database before you take the certification exam. 08:38 Nikita: Hold on, Sue. You said to make sure we're up to date with the training. How do I do that? Sue: Technology is ever-changing, and at Oracle, we continually enhance our products to provide features that make them faster or more straightforward to use. So, if you're taking a course, you may find a small tag that says “New” next to a topic. That indicates that there are some new training that's been added to the course. So what I'm trying to say is if you're looking to take some certification, check the course before you register for the exam and to see if there are any “New” tags. If you find them, you can learn what's new and not have to go through the entire course again. This way, you're up to date with the training! 09:25 Nikita: Ok. Got it. Tell us more about the certification, Sue. Sue: If you're ready, search for the Become An Oracle Autonomous Database Cloud Professional learning path in MyLearn and scroll down to the Oracle Database Cloud Professional exam. Click on the “Register Now” button. You'll be taken to a page where you'll see the exam overview, the resources to help you prepare for the exam, a button to register for the exam, and things to do before your exam session. It will also describe what happens after the exam and some exam policies, like what to do if you need to reschedule your exam. When you're ready to take the exam, you can schedule the date and time according to when it's convenient for you.  10:15 Lois: What's the actual experience of taking the exam like? Sue: It's pretty straightforward. You want to prepare your system a day or two before the exam. You want to ensure you can connect successfully to the test site and that your laptop is plugged in and not running on battery. You want to make sure all other applications are closed before you perform the system test. Now, the system test is really with the test site and consists of testing your microphone, an internet speed test, and your video. You will also be asked to do a test-exam simulation. You will need to be able to download the simulation exam and answer a few simple true or false questions. Once you have successfully done that, you're ready to take the test on your laptop on the actual day of the test. Now, on the day of the test, set up your test environment. For your test environment, what it really entails is that you have an environment that you do not have anything on your desk. You cannot have a second monitor. And it's best to have a clear wall behind you so that the proctor can see there is nothing around you. And don't forget to turn off your mobile device. 11:34 Lois: Ok, I've taken the test, and I passed. Wohoo! What happens now? Sue: When you pass the exam, you will receive an email from Oracle with your results as well as a link to Oracle CertView. This is the Oracle certification candidate portal. In CertView, you can download and print your eCertificate. You can share your newly earned badge on places like Facebook, Twitter, and LinkedIn, or even email your employer and others a secure link that they can use to confirm and validate your credentials. 12:11 Nikita: Can anyone take the certification? Sue: Yes, Niki. This certification is available to all candidates, including on-premise database administrators, cloud data management professionals, and consultants. 12:24 Lois: How long is the certification valid? What happens when it expires? Sue: Certain Oracle credentials require periodic recertification for Oracle to recognize them as "active." For such credentials, you must upgrade to a current version within 12 months following the Oracle credential retirement to keep your certification active. 12:51 Are you planning to become an Oracle Certified Professional this year? Whether you're a seasoned IT pro or just starting your career, getting certified can give you a significant boost. And don't worry, we've got your back. Join us at one of our cert prep live events in the Oracle University Learning Community. You'll get insider tips from seasoned experts and learn from other professionals' experiences. Plus, once you've earned your certification, you'll become part of our exclusive forum for Oracle-certified users. So, what are you waiting for? Head over to mylearn.oracle.com and create an account to jump-start your journey towards certification today! 13:35 Nikita: Welcome back. Sue, what other training can I take after Autonomous Database?    Sue: Now that you have a strong foundation in the database, there is so much more that you can learn in Oracle. You can consider Exadata if you work on a high-performance data workload that's running mission-critical applications. Look for a learning path called Become an Exadata Service Cloud Administrator, in MyLearn, to help you with that. GoldenGate is also a good choice if you work with data that needs to be shared and replicated, both locally as well as globally. The course for this is called Oracle GoldenGate 19c: Administration/Implementation.   A hot topic in technology today is generative AI (Artificial Intelligence). You want to learn how to implement data security on different levels when it needs to be shared with large language model providers.  Perhaps venture beyond the database and learn about Oracle Cloud Infrastructure and how its components and the many cloud services work together. Just go to mylearn.oracle.com, and in the field where you see “What do you want to learn?” type in what interests you and let your learning adventure begin! 14:59 Lois: And since you brought up AI, Sue, this is the perfect time to mention that we'll be focusing on it for the next couple of weeks. We'll be speaking to some of our colleagues on topics like artificial intelligence, machine learning, deep learning, generative AI, the OCI AI portfolio and more, but we'll talk more about that next week. Nikita: Yeah, can't wait for that. Thank you so much, Sue, for giving us your time today. Sue: Thanks for having me! Lois: Until next time, this is Lois Houston… Nikita: And Nikita Abraham, signing off! 15:29 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click  Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate  and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Autonomous Database Tools

Oracle University Podcast

Play Episode Listen Later Jan 16, 2024 36:04


In this episode, hosts Lois Houston and Nikita Abraham speak with Oracle Database experts about the various tools you can use with Autonomous Database, including Oracle Application Express (APEX), Oracle Machine Learning, and more.   Oracle MyLearn: https://mylearn.oracle.com/   Oracle University Learning Community: https://education.oracle.com/ou-community   LinkedIn: https://www.linkedin.com/showcase/oracle-university/   X (formerly Twitter): https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Tamal Chatterjee, and the OU Studio Team for helping us create this episode.   ---------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started! 00:26 Lois: Hello and welcome to the Oracle University Podcast. I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Principal Technical Editor. Nikita: Hi everyone! We spent the last two episodes exploring Oracle Autonomous Database's deployment options: Serverless and Dedicated. Today, it's tool time! Lois: That's right, Niki. We'll be chatting with some of our Database experts on the tools that you can use with the Autonomous Database. We're going to hear from Patrick Wheeler, Kay Malcolm, Sangeetha Kuppuswamy, and Thea Lazarova. Nikita: First up, we have Patrick, to take us through two important tools. Patrick, let's start with Oracle Application Express. What is it and how does it help developers? 01:15 Patrick: Oracle Application Express, also known as APEX-- or perhaps APEX, we're flexible like that-- is a low-code development platform that enables you to build scalable, secure, enterprise apps with world-class features that can be deployed anywhere. Using APEX, developers can quickly develop and deploy compelling apps that solve real problems and provide immediate value. You don't need to be an expert in a vast array of technologies to deliver sophisticated solutions. Focus on solving the problem, and let APEX take care of the rest. 01:52 Lois: I love that it's so easy to use. OK, so how does Oracle APEX integrate with Oracle Database? What are the benefits of using APEX on Autonomous Database? Patrick: Oracle APEX is a fully supported, no-cost feature of Oracle Database. If you have Oracle Database, you already have Oracle APEX. You can access APEX from database actions. Oracle APEX on Autonomous Database provides a preconfigured, fully managed, and secure environment to both develop and deploy world-class applications. Oracle takes care of configuration, tuning, backups, patching, encryption, scaling, and more, leaving you free to focus on solving your business problems. APEX enables your organization to be more agile and develop solutions faster for less cost and with greater consistency. You can adapt to changing requirements with ease, and you can empower professional developers, citizen developers, and everyone else. 02:56 Nikita: So you really don't need to have a lot of specializations or be an expert to use APEX. That's so cool! Now, what are the steps involved in creating an application using APEX?  Patrick: You will be prompted to log in as the administrator at first. Then, you may create workspaces for your respective users and log in with those associated credentials. Application Express provides you with an easy-to-use, browser-based environment to load data, manage database objects, develop REST interfaces, and build applications which look and run great on both desktop and mobile devices. You can use APEX to develop a wide variety of solutions, import spreadsheets, and develop a single source of truth in minutes. Create compelling data visualizations against your existing data, deploy productivity apps to elegantly solve a business need, or build your next mission-critical data management application. There are no limits on the number of developers or end users for your applications. 04:01 Lois: Patrick, how does APEX use SQL? What role does SQL play in the development of APEX applications?  Patrick: APEX embraces SQL. Anything you can express with SQL can be easily employed in an APEX application. Application Express also enables low-code development, providing developers with powerful data management and data visualization components that deliver modern, responsive end user experiences out-of-the-box. Instead of writing code by hand, you're able to use intelligent wizards to guide you through the rapid creation of applications and components. Creating a new application from APEX App Builder is as easy as one, two, three. One, in App Builder, select a project name and appearance. Two, add pages and features to the app. Three, finalize settings, and click Create. 05:00 Nikita: OK. So, the other tool I want to ask you about is Oracle Machine Learning. What can you tell us about it, Patrick? Patrick: Oracle Machine Learning, or OML, is available with Autonomous Database. A new capability that we've introduced with Oracle Machine Learning is called Automatic Machine Learning, or AutoML. Its goal is to increase data scientist productivity while reducing overall compute time. In addition, AutoML enables non-experts to leverage machine learning by not requiring deep understanding of the algorithms and their settings. 05:37 Lois: And what are the key functions of AutoML? Patrick: AutoML consists of three main functions: Algorithm Selection, Feature Selection, and Model Tuning. With Automatic Algorithm Selection, the goal is to identify the in-database algorithms that are likely to achieve the highest model quality. Using metalearning, AutoML leverages machine learning itself to help find the best algorithm faster than with exhaustive search. With Automatic Feature Selection, the goal is to denoise data by eliminating features that don't add value to the model. By identifying the most predicted features and eliminating noise, model accuracy can often be significantly improved with a side benefit of faster model building and scoring. Automatic Model Tuning tunes algorithm hyperparameters, those parameters that determine the behavior of the algorithm, on the provided data. Auto Model Tuning can significantly improve model accuracy while avoiding manual or exhaustive search techniques, which can be costly both in terms of time and compute resources. 06:44 Lois: How does Oracle Machine Learning leverage the capabilities of Autonomous Database? Patrick: With Oracle Machine Learning, the full power of the database is accessible with the tremendous performance of parallel processing available, whether the machine learning algorithm is accessed via native database SQL or with OML4Py through Python or R.  07:07 Nikita: Patrick, talk to us about the Data Insights feature. How does it help analysts uncover hidden patterns and anomalies? Patrick: A feature I wanted to call the electromagnet, but they didn't let me. An analyst's job can often feel like looking for a needle in a haystack. So throw the switch and all that metallic stuff is going to slam up onto that electromagnet. Sure, there are going to be rusty old nails and screws and nuts and bolts, but there are going to be a few needles as well. It's far easier to pick the needles out of these few bits of metal than go rummaging around in a pile of hay, especially if you have allergies. That's more or less how our Insights tool works. Load your data, kick off a query, and grab a cup of coffee. Autonomous Database does all the hard work, scouring through this data looking for hidden patterns, anomalies, and outliers. Essentially, we run some analytic queries that predict expected values. And where the actual values differ significantly from expectation, the tool presents them here. Some of these might be uninteresting or obvious, but some are worthy of further investigation. You get this dashboard of various exceptional data patterns. Drill down on a specific gauge in this dashboard and significant deviations between actual and expected values are highlighted. 08:28 Lois: What a useful feature! Thank you, Patrick. Now, let's discuss some terms and concepts that are applicable to the Autonomous JSON Database with Kay. Hi Kay, what's the main focus of the Autonomous JSON Database? How does it support developers in building NoSQL-style applications? Kay: Autonomous Database supports the JavaScript Object Notation, also known as JSON, natively in the database. It supports applications that use the SODA API to store and retrieve JSON data or SQL queries to store and retrieve data stored in JSON-formatted data.  Oracle AJD is Oracle ATP, Autonomous Transaction Processing, but it's designed for developing NoSQL-style applications that use JSON documents. You can promote an AJD service to ATP. 09:22 Nikita: What makes the development of NoSQL-style, document-centric applications flexible on AJD?  Kay: Development of these NoSQL-style, document-centric applications is particularly flexible because the applications use schemaless data. This lets you quickly react to changing application requirements. There's no need to normalize the data into relational tables and no impediment to changing the data structure or organization at any time, in any way. A JSON document has its own internal structure, but no relation is imposed on separate JSON documents. Nikita: What does AJD do for developers? How does it actually help them? Kay: So Autonomous JSON Database, or AJD, is designed for you, the developer, to allow you to use simple document APIs and develop applications without having to know anything about SQL. That's a win. But at the same time, it does give you the ability to create highly complex SQL-based queries for reporting and analysis purposes. It has built-in binary JSON storage type, which is extremely efficient for searching and for updating. It also provides advanced indexing capabilities on the actual JSON data. It's built on Autonomous Database, so that gives you all of the self-driving capabilities we've been talking about, but you don't need a DBA to look after your database for you. You can do it all yourself. 11:00 Lois: For listeners who may not be familiar with JSON, can you tell us briefly what it is?  Kay: So I mentioned this earlier, but it's worth mentioning again. JSON stands for JavaScript Object Notation. It was originally developed as a human readable way of providing information to interchange between different programs. So a JSON document is a set of fields. Each of these fields has a value, and those values can be of various data types. We can have simple strings, we can have integers, we can even have real numbers. We can have Booleans that are true or false. We can have date strings, and we can even have the special value null. Additionally, values can be objects, and objects are effectively whole JSON documents embedded inside a document. And of course, there's no limit on the nesting. You can nest as far as you like. Finally, we can have a raise, and a raise can have a list of scalar data types or a list of objects. 12:13 Nikita: Kay, how does the concept of schema apply to JSON databases? Kay: Now, JSON documents are stored in something that we call collections. Each document may have its own schema, its own layout, to the JSON. So does this mean that JSON document databases are schemaless? Hmmm. Well, yes. But there's nothing to fear because you can always use a check constraint to enforce a schema constraint that you wish to introduce to your JSON data. Lois: Kay, what about indexing capabilities on JSON collections? Kay: You can create indexes on a JSON collection, and those indexes can be of various types, including our flexible search index, which indexes the entire content of the document within the JSON collection, without having to know anything in advance about the schema of those documents.  Lois: Thanks Kay! 13:18 AI is being used in nearly every industry—healthcare, manufacturing, retail, customer service, transportation, agriculture, you name it! And, it's only going to get more prevalent and transformational in the future. So it's no wonder that AI skills are the most sought after by employers.  We're happy to announce a new OCI AI Foundations certification and course that is available—for FREE! Want to learn about AI? Then this is the best place to start! So, get going! Head over to mylearn.oracle.com to find out more.  13:54 Nikita: Welcome back! Sangeetha, I want to bring you in to talk about Oracle Text. Now I know that Oracle Database is not only a relational store but also a document store. And you can load text and JSON assets along with your relational assets in a single database.  When I think about Oracle and databases, SQL development is what immediately comes to mind. So, can you talk a bit about the power of SQL as well as its challenges, especially in schema changes? Sangeetha: Traditionally, Oracle has been all about SQL development. And with SQL development, it's an incredibly powerful language. But it does take some advanced knowledge to make the best of it. So SQL requires you to define your schema up front. And making changes to that schema could be a little tricky and sometimes highly bureaucratic task. In contrast, JSON allows you to develop your schema as you go--the schemaless, perhaps schema-later model. By imposing less rigid requirements on the developer, it allows you to be more fluid and Agile development style. 15:09 Lois: How does Oracle Text use SQL to index, search, and analyze text and documents that are stored in the Oracle Database? Sangeetha: Oracle Text can perform linguistic analyses on documents as well as search text using a variety of strategies, including keyword searching, context queries, Boolean operations, pattern matching, mixed thematic queries, like HTML/XML session searching, and so on. It can also render search results in various formats, including unformatted text, HTML with term highlighting, and original document format. Oracle Text supports multiple languages and uses advanced relevance-ranking technology to improve search quality. Oracle Text also offers advantage features like classification, clustering, and support for information visualization metaphors. Oracle Text is now enabled automatically in Autonomous Database. It provides full-text search capabilities over text, XML, JSON content. It also could extend current applications to make better use of textual fields. It builds new applications specifically targeted at document searching. Now, all of the power of Oracle Database and a familiar development environment, rock-solid autonomous database infrastructure for your text apps, we can deal with text in many different places and many different types of text. So it is not just in the database. We can deal with data that's outside of the database as well. 17:03 Nikita: How does it handle text in various places and formats, both inside and outside the database? Sangeetha: So in the database, we can be looking a varchar2 column or LOB column or binary LOB columns if we are talking about binary documents such as PDF or Word. Outside of the database, we might have a document on the file system or out on the web with URLs pointing out to the document. If they are on the file system, then we would have a file name stored in the database table. And if they are on the web, then we should have a URL or a partial URL stored in the database. And we can then fetch the data from the locations and index it in the term documents format. We recognize many different document formats and extract the text from them automatically. So the basic forms we can deal with-- plain text, HTML, JSON, XML, and then formatted documents like Word docs, PDF documents, PowerPoint documents, and also so many different types of documents. All of those are automatically handled by the system and then processed into the format indexing. And we are not restricted by the English either here. There are various stages in the index pipeline. A document starts one, and it's taken through the different stages so until it finally reaches the index. 18:44 Lois: You mentioned the indexing pipeline. Can you take us through it? Sangeetha: So it starts with a data store. That's responsible for actually reaching the document. So once we fetch the document from the data store, we pass it on to the filter. And now the filter is responsible for processing binary documents into indexable text. So if you have a PDF, let's say a PDF document, that will go through the filter. And that will extract any images and return it into the stream of HTML text ready for indexing. Then we pass it on to the sectioner, which is responsible for identifying things like paragraphs and sentences. The output from the section is fed onto the lexer. The lexer is responsible for dividing the text into indexable words. The output of the lexer is fed into the index engine, which is responsible for laying out to the indexes on the disk. Storage, word list, and stop list are some additional inputs there. So storage tells exactly how to lay out the index on disk. Word list which has special preferences like desegmentation. And then stop is a list word that we don't want to index. So each of these stages and inputs can be customized. Oracle has something known as the extensibility framework, which originally was designed to allow people to extend capabilities of these products by adding new domain indexes. And this is what we've used to implement Oracle Text. So when kernel sees this phrase INDEXTYPE ctxsys.context, it knows to handle all of the hard work creating the index. 20:48 Nikita: Other than text indexing, Oracle Text offers additional operations, right? Can you share some examples of these operations? Sangeetha: So beyond the text index, other operations that we can do with the Oracle Text, some of which are search related. And some examples of that are these highlighting markups and snippets. Highlighting and markup are very similar. They are ways of fetching these results back with the search. And then it's marked up with highlighting within the document text. Snippet is very similar, but it's only bringing back the relevant chunks from the document that we are searching for. So rather than getting the whole document back to you, just get a few lines showing this in a context and the theme and extraction. So Oracle Text is capable of figuring out what a text is all about. We have a very large knowledge base of the English language, which will allow you to understand the concepts and the themes in the document. Then there's entity extraction, which is the ability to find out people, places, dates, times, zip codes, et cetera in the text. So this can be customized with your own user dictionary and your own user rules. 22:14 Lois: Moving on to advanced functionalities, how does Oracle Text utilize machine learning algorithms for document classification? And what are the key types of classifications? Sangeetha: The text analytics uses machine learning algorithms for document classification. We can process a large set of data documents in a very efficient manner using Oracle's own machine learning algorithms. So you can look at that as basically three different headings. First of all, there's classification. And that comes in two different types-- supervised and unsupervised. The supervised classification which means in this classification that it provides the training set, a set of documents that have already defined particular characteristics that you're looking for. And then there's unsupervised classification, which allows your system itself to figure out which documents are similar to each other. It does that by looking at features within the documents. And each of those features are represented as a dimension in a massively high dimensional feature space in documents, which are clustered together according to that nearest and nearness in the dimension in the feature space. Again, with the named entity recognition, we've already talked about that a little bit. And then finally, there is a sentiment analysis, the ability to identify whether the document is positive or negative within a given particular aspect. 23:56 Nikita: Now, for those who are already Oracle database users, how easy is it to enable text searching within applications using Oracle Text? Sangeetha: If you're already an Oracle database user, enabling text searching within your applications is quite straightforward. Oracle Text uses the same SQL language as the database. And it integrates seamlessly with your existing SQL. Oracle Text can be used from any programming language which has SQL interface, meaning just about all of them.  24:32 Lois: OK from Oracle Text, I'd like to move on to Oracle Spatial Studio. Can you tell us more about this tool? Sangeetha: Spatial Studio is a no-code, self-service application that makes it easy to access the sorts of spatial features that we've been looking at, in particular, in order to get that data prepared to use with spatial, visualizing results in maps and tables, and also doing the analysis and sharing results. Spatial Studios is encoded at no extra cost with Autonomous Database. The studio web application itself has no additional cost and it runs on the server. 25:13 Nikita: Let's talk a little more about the cost. How does the deployment of Spatial Studio work, in terms of the server it runs on?  Sangeetha: So, the server that it runs on, if it's running in the Cloud, that computing node, it would have some cost associated with it. It can also run on a free tier with a very small shape, just for evaluation and testing.  Spatial Studio is also available on the Oracle Cloud Marketplace. And there are a couple of self-paced workshops that you can access for installing and using Spatial Studio. 25:47 Lois: And how do developers access and work with Oracle Autonomous Database using Spatial Studio? Sangeetha: Oracle Spatial Studio allows you to access data in Oracle Database, including Oracle Autonomous Database. You can create connections to Oracle Autonomous Databases, and then you work with the data that's in the database. You can also see Spatial Studio to load data to Oracle Database, including Oracle Autonomous Database. So, you can load these spreadsheets in common spatial formats. And once you've loaded your data or accessed data that already exists in your Autonomous Database, if that data does not already include native geometrics, Oracle native geometric type, then you can prepare the data if it has addresses or if it has latitude and longitude coordinates as a part of the data. 26:43 Nikita: What about visualizing and analyzing spatial data using Spatial Studio? Sangeetha: Once you have the data prepared, you can easily drag and drop and start to visualize your data, style it, and look at it in different ways. And then, most importantly, you can start to ask spatial questions, do all kinds of spatial analysis, like we've talked about earlier. While Spatial Studio provides a GUI that allows you to perform those same kinds of spatial analysis. And then the results can be dropped on the map and visualized so that you can actually see the results of spatial questions that you're asking. When you've done some work, you can save your work in a project that you can return to later, and you can also publish and share the work you've done. 27:34 Lois: Thank you, Sangeetha. For the final part of our conversation today, we'll talk with Thea. Thea, thanks so much for joining us. Let's get the basics out of the way. How can data be loaded directly into Autonomous Database? Thea: Data can be loaded directly to ADB through applications such as SQL Developer, which can read data files, such as txt and xls, and load directly into tables in ADB. 27:59 Nikita: I see. And is there a better method to load data into ADB? Thea: A more efficient and preferred method for loading data into ADB is to stage the data cloud object store, preferably Oracle's, but also supported our Amazon S3 and Azure Blob Storage. Any file type can be staged in object store. Once the data is in object store, Autonomous Database can access a directly. Tools can be used to facilitate the data movement between object store and the database. 28:27 Lois: Are there specific steps or considerations when migrating a physical database to Autonomous? Thea: A physical database can simply be migrated to autonomous because database must be converted to pluggable database, upgraded to 19C, and encrypted. Additionally, any changes to an Oracle-shipped stored procedures or views must be found and reverted. All uses of container database admin privileges must be removed. And all legacy features that are not supported must be removed, such as legacy LOBs. Data Pump, expdp/impdp must be used for migrating databases versions 10.1 and above to Autonomous Database as it addresses the issues just mentioned. For online migrations, GoldenGate must be used to keep old and new database in sync. 29:15 Nikita: When you're choosing the method for migration and loading, what are the factors to keep in mind? Thea: It's important to segregate the methods by functionality and limitations of use against Autonomous Database. The considerations are as follows. Number one, how large is the database to be imported? Number two, what is the input file format? Number three, does the method support non-Oracle database sources? And number four, does the methods support using Oracle and/or third-party object store? 29:45 Lois: Now, let's move on to the tools that are available. What does the DBMS_CLOUD functionality do? Thea: The Oracle Autonomous Database has built-in functionality called DBMS_CLOUD specifically designed so the database can move data back and forth with external sources through a secure and transparent process. DBMS_CLOUD allows data movement from the Oracle object store. Data from any application or data source export to text-- .csv or JSON-- output from third-party data integration tools. DBMS_CLOUD can also access data stored on Object Storage from the other clouds, AWS S3 and Azure Blob Storage. DBMS_CLOUD does not impose any volume limit, so it's the preferred method to use. SQL*Loader can be used for loading data located on the local client file systems into Autonomous Database. There are limits around OS and client machines when using SQL*Loader. 30:49 Nikita: So then, when should I use Data Pump and SQL Developer for migration? Thea: Data Pump is the best way to migrate a full or part database into ADB, including databases from previous versions. Because Data Pump will perform the upgrade as part of the export/import process, this is the simplest way to get to ADB from any existing Oracle Database implementation. SQL Developer provides a GUI front end for using data pumps that can automate the whole export and import process from an existing database to ADB. SQL Developer also includes an import wizard that can be used to import data from several file types into ADB. A very common use of this wizard is for importing Excel files into ADW. Once a credential is created, it can be used to access a file as an external table or to ingest data from the file into a database table. DBMS_CLOUD makes it much easier to use external tables, and the organization external needed in other versions of the Oracle Database are not needed. 31:54 Lois: Thea, what about Oracle Object Store? How does it integrate with Autonomous Database, and what advantages does it offer for staging data? Thea: Oracle Object Store is directly integrated into Autonomous Database and is the best option for staging data that will be consumed by ADB. Any file type can be stored in object store, including SQL*Loader files, Excel, JSON, Parquet, and, of course, Data Pump DMP files. Flat files stored on object store can also be used as Oracle Database external tables, so they can queried directly from the database as part of a normal DML operation. Object store is a separate bin storage allocated to the Autonomous Database for database Object Storage, such as tables and indexes. That storage is part of the Exadata system Autonomous Database runs on, and it is automatically allocated and managed. Users do not have direct access to that storage. 32:50 Nikita: I know that one of the main considerations when loading and updating ADB is the network latency between the data source and the ADB. Can you tell us more about this? Thea: Many ways to measure this latency exist. One is the website cloudharmony.com, which provides many real-time metrics for connectivity between the client and Oracle Cloud Services. It's important to run these tests when determining with Oracle Cloud service location will provide the best connectivity. The Oracle Cloud Dashboard has an integrated tool that will provide real time and historic latency information between your existing location and any specified Oracle Data Center. When migrating data to Autonomous Database, table statistics are gathered automatically during direct-path load operations. If direct-path load operations are not used, such as with SQL Developer loads, the user can gather statistics manually as needed. 33:44 Lois: And finally, what can you tell us about the Data Migration Service? Thea: Database Migration Service is a fully managed service for migrating databases to ADB. It provides logical online and offline migration with minimal downtime and validates the environment before migration. We have a requirement that the source database is on Linux. And it would be interesting to see if we are going to have other use cases that we need other non-Linux operating systems. This requirement is because we are using SSH to directly execute commands on the source database. For this, we are certified on the Linux only. Target in the first release are Autonomous databases, ATP, or ADW, both serverless and dedicated. For agent environment, we require Linux operating system, and this is Linux-safe. In general, we're targeting a number of different use cases-- migrating from on-premise, third-party clouds, Oracle legacy clouds, such as Oracle Classic, or even migrating within OCI Cloud and doing that with or without direct connection. If you have any direct connection behind a firewall, we support offline migration. If you have a direct connection, we support both offline and online migration. For more information on all migration approaches are available for your particular situation, check out the Oracle Cloud Migration Advisor. 35:06 Nikita: I think we can wind up our episode with that. Thanks to all our experts for giving us their insights.  Lois: To learn more about the topics we've discussed today, visit mylearn.oracle.com and search for the Oracle Autonomous Database Administration Workshop. Remember, all of the training is free, so dive right in! Join us next week for another episode of the Oracle University Podcast. Until then, Lois Houston… Nikita: And Nikita Abraham, signing off! 35:35 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Autonomous Database on Dedicated Infrastructure

Oracle University Podcast

Play Episode Listen Later Jan 9, 2024 26:36


The Oracle Autonomous Database Dedicated deployment is a good choice for customers who want to implement a private database cloud in their own dedicated Exadata infrastructure. That dedicated infrastructure can either be in the Oracle Public Cloud or in the customer's own data center via Oracle Exadata Cloud@Customer.   In a dedicated environment, the Exadata infrastructure is entirely dedicated to the subscribing customer, isolated from other cloud tenants, with no shared processor, storage, and memory resource.   In this episode, hosts Lois Houston and Nikita Abraham speak with Oracle Database experts about how Autonomous Database Dedicated offers greater control of the software and infrastructure life cycle, customizable policies for separation of database workload, software update schedules and versioning, workload consolidation, availability policies, and much more.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X (formerly Twitter): https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Tamal Chatterjee, and the OU Studio Team for helping us create this episode.   -------------------------------------------------------   Episode Transcript:   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. 00:26 Nikita: Hello and welcome to the Oracle University Podcast. I'm Nikita Abraham, Principal Technical Editor with Oracle University, and I'm joined by Lois Houston, Director of Innovation Programs. Lois: Hi there! This is our second episode on Oracle's Autonomous Database, and today we're going to spend time discussing Autonomous Database on Dedicated Infrastructure. We'll be talking with three of our colleagues: Maria Colgan, Kamryn Vinson, and Kay Malcolm. 00:53 Nikita: Maria is a Distinguished Product Manager for Oracle Database, Kamryn is a Database Product Manager, and Kay is a Senior Director of Database Product Management.  Lois: Hi Maria! Thanks for joining us today. We know that Oracle Autonomous Database offers two deployment choices: serverless and dedicated Exadata infrastructure. We spoke about serverless infrastructure last week but for anyone who missed that episode, can you give us a quick recap of what it is? 01:22 Maria: With Autonomous Database Serverless, Oracle automates all aspects of the infrastructure and database management for you. That includes provisioning, configuring, monitoring, backing up, and tuning. You simply select what type of database you want, maybe a data warehouse, transaction processing, or a JSON document store, which region in the Oracle Public Cloud you want that database deployed, and the base compute and storage resources necessary. Oracle automatically takes care of everything else. Once provisioned, the database can be instantly scaled through our UI, our APIs, or automatically based on your workload needs. All scaling activities happen completely online while the database remains open for business. 02:11 Nikita: Ok, so now that we know what serverless is, let's move on to dedicated infrastructure. What can you tell us about it? Maria: Autonomous Database Dedicated allows customers to implement a private database cloud running on their own dedicated Exadata infrastructure. That dedicated infrastructure can be in Oracle's Public Cloud or in the customer's own data center via Oracle Exadata Cloud@Customer. It makes an ideal platform to consolidate multiple databases regardless of their workload type or their size. And it also allows you to offer database as a service within your enterprise. 02:50 Lois: What are the primary benefits of Autonomous Database Dedicated infrastructure? Maria: With the dedicated deployment option, you must first subscribe to Dedicated Exadata Cloud Infrastructure that is isolated from other tenants with no shared processors, memory, network, or storage resources. This infrastructure choice offers greater control of both the software and the infrastructure life cycle. Customers can specify their own policies for workload separation, software update schedules, and availability. One of the key benefits of an autonomous database is a lower total cost of ownership through more automation and operational delegation to Oracle. Remember it's a fully managed service. All database operations, such as backup, software updates, upgrades, OS maintenance, incident management, and health monitoring, will be automatically done for you by Oracle. Its maximum availability architecture protects you from any hardware failures and in the event of a full outage, the service will be automatically failed over to your standby site. Built-in application continuity ensures zero downtime during the standard software update or in the event of a failover.  04:09 Nikita: And how is this billed?  Maria: Autonomous Database also has true pay-per-use billing so even when autoscale is enabled, you'll only pay for those additional resources when you use them. And we make it incredibly simple to develop on this environment with managed developer add-ons like our low code development environment, APEX, and our REST data services. This means you don't need any additional development environments in order to get started with a new application. 04:40 Lois: Ok. So, it looks like the dedicated option offers more control and customization. Maria, how do we access a dedicated database over a network? Maria: The network path is through a VCN, or Virtual Cloud Network, and the subnet that's defined by the Exadata infrastructure hosting the database. By default, this subnet is defined as private, meaning, there's no public internet access to those databases. This ensures only your company can access your Exadata infrastructure and your databases. Autonomous Database Dedicated can also take advantage of network services provided by OCI, including subnets or VCN peering, as well as connections to on-prem databases through the IP secure VPN and FastConnect dedicated corporate network connections. 05:33 Maria: You can also take advantage of the Oracle Microsoft partnership that enables customers to connect their Oracle Cloud Infrastructure resources and Microsoft Azure resources through a dedicated private connection. However, for some customers, a move to the public cloud is just not possible. Perhaps it's due to industry regulations, performance concerns, or integration with legacy on-prem applications. For these types of customers, Exadata Cloud@Customer should meet their requirements for strict data sovereignty and security by delivering high-performance Exadata Cloud Services capabilities in their data center behind their own firewall. 06:16 Nikita: What are the benefits of Autonomous Database on Exadata Cloud@Customer? How's it different? Maria: Autonomous Database on Exadata Cloud@Customer provides the same service as Autonomous Database Dedicated in the public cloud. So you get the same simplicity, agility, and performance, and elasticity that you get in the cloud. But it also provides a very fast and simple transition to an autonomous cloud because you can easily migrate on-prem databases to Exadata Cloud@Customer. Once the database is migrated, any existing applications can simply reconnect to that new database and run without any application changes being needed. And the data will leave your data center, so making it a very safe way to adopt a cloud model. 07:04 Lois: So, how do we manage communication to and from the public cloud? Maria: Each Cloud@Customer rack includes two local control plane servers to manage the communication to and from the public cloud. The local control plane acts on behalf of requests from the public cloud, keeping communications consolidated and secure. Platform control plane commands are sent to the Exadata Cloud@Customer system through a dedicated WebSocket secure tunnel.  Oracle Cloud operations staff use that same tunnel to monitor the autonomous database on Exadata Cloud@Customer both for maintenance and for troubleshooting. The two remote, control plane servers installed in the Exadata Cloud@Customer rack host that secure tunnel endpoint and act as a gateway for access to the infrastructure. They also host components that orchestrate the cloud automation, aggregates and routes telemetry messages from the Exadata Cloud@Customer platform to the Oracle Support Service infrastructure. And they also host images for server patching. 08:13 Maria: The Exadata Database Server is connected to the customer-managed switches via either 10 gigabit or 25 gigabit Ethernet. Customers have access to the customer Virtual Machine, or VM, via a pair of layer 2 network connections that are implemented as Virtual Network Interface Cards, or vNICs. They're also tagged VLAN. The physical network connections are implemented for high availability in an active standby configuration. Autonomous Database on Exadata Cloud@Customer provides the best of both worlds-- all of the automation including patching, backing up, scaling, and management of a database that you get with a cloud service, but without the data ever leaving the customer's data center. 09:01 Nikita: That's interesting. And, what happens if a dedicated database loses network connectivity to the OCI control plane? Maria: In the event an autonomous database on Exadata Cloud@Customer loses network connectivity to the OCI control plane, the Autonomous Database will actually continue to be available for your applications. And operations such as backups and autoscaling will not be impacted in that loss of network connectivity. However, the management and monitoring of the Autonomous Database via the OCI console and APIs as well as access by the Oracle Cloud operations team will not be available until that network is reconnected. 09:43 Maria: The capability suspended in the case of a lost network connection include, as I said, infrastructure management-- so that's the manual scaling of an Autonomous Database via the UI or our OCI CLI, or REST APIs, as well as Terraform scripts. They won't be available. Neither will the ability for Oracle Cloud ops to access and perform maintenance activities, such as patching. Nor will we be able to monitor the Oracle infrastructure during the time where the system is not connected. 10:20 Lois: That's good to know, Maria. What about data encryption and backup options? Maria: All Oracle Autonomous Databases encrypt data at REST. Data is automatically encrypted as it's written to the storage. But this encryption is transparent to authorized users and applications because the database automatically decrypts the data when it's being read from the storage. There are several options for backing up the Autonomous Database Cloud@Customer including using a Zero Data Loss Recovery Appliance, or ZDLRA. You can back it up to locally mounted NFS storage or back it up to the Oracle Public Cloud. 10:57 Nikita: I want to ask you about the typical workflow for Autonomous Database Dedicated infrastructure. What are the main steps here? Maria: In the typical workflow, the fleet administrator role performs the following steps. They provision the Exadata infrastructure by specifying its size, availability domain, and region within the Oracle Cloud. Once the hardware has been provisioned, the fleet administrator partitions the system by provisioning clusters and container databases. Then the developers, DBAs, or anyone who needs a database can provision databases within those container databases. Billing is based on the size of the Exadata infrastructure that's provisioned. So whether that's a quarter rack, half rack, or full rack. It also depends on the number of CPUs that are being consumed. Remember, it's also possible for customers to use their existing Oracle database licenses with this service to reduce the cost. 11:53 Lois: And what Exadata infrastructure models and shapes does Autonomous Database Dedicated support? Maria: That's the X7, X8, and X8M and you can get all of those in either a quarter, half, or full Exadata rack. Currently, you can create a maximum of 12 VM clusters on an Autonomous Database Dedicated infrastructure. We also advise that you limit the number of databases you provision to meet your preferred SLA. To meet the high availability SLA, we recommend a maximum of 100 databases. To meet the extreme availability SLA, we recommend a maximum of 25 databases. 12:35 Nikita: Ok, so now that I know all this, how do I actually get started with Autonomous Database on dedicated infrastructure? Maria: You need to increase your service limit to include that Exadata infrastructure and then you need to create the fleet and DBA service roles. You also need to create the necessary network model, VM clusters, and container databases for your organization. Finally, you need to provide access to the end users who want to create and use those Autonomous databases. Autonomous Database requires a subscription to that Exadata infrastructure for a minimum of 48 hours. But once subscribed, you can test out ideas and then terminate the subscription with no ongoing costs. While subscribed, you can control where you place the resources to perhaps manage latency sensitive applications. 13:29 Maria: You can also have control over patching schedules, software versions, so you can be sure that you're testing exactly what you need to. You can also migrate databases to the Autonomous Database via our export, import capabilities via the object store or through Data Pump or Golden Gate. As with any Autonomous Database, once it's provisioned, you've got full access to both autoscaling and all our cloning capabilities.  13:57 Lois: Maria, I've heard you talk about the importance of clean role separation in managing a private cloud. Can you elaborate on that, please? Maria: A successful private cloud is set up and managed using clean role separation between the fleet administration group and the developers, or DBA groups. The fleet administration group establishes the governance constraints, including things like budgeting, capacity compliance, and SLAs, according to the business structure. The physical resources are also logically grouped to align with this business structure, and then groups of users are given self-service access to the resources within these groups. So a good example of this would be that the developers and DBA groups use self-service database resources within these constraints. 14:46 Nikita: I see. So, what exactly does a fleet administrator do? Maria: Fleet administrators allocate budget by department and are responsible for the creation, monitoring, and management of the autonomous exadata infrastructure, the autonomous exadata VM clusters, and the autonomous container databases. To perform these duties, the fleet administrators must have an Oracle Cloud account or user, and that user must have permissions to manage these resources and be permitted to use network resources that need to be specified when you create these other resources. 15:24 Nikita: And what about database administrators? Maria: Database administrators create, monitor, and manage autonomous databases. They, too, need to have an Oracle Cloud account or be an Oracle Cloud user. Now, those accounts need to have the necessary permissions in order to create and access databases. They also need to be able to access autonomous backups and have permission to access the autonomous container databases, inside which these autonomous databases will be created, and have all of the necessary permissions to be able to create those databases, as I said. While creating autonomous databases, the database administrators will define and gain access to an admin user account inside the database. It's through this account that they will actually get the necessary permissions to be able to create and control database users.  16:24 Lois: How do developers fit into the picture? Maria: Database users and developers who write applications that will use or access an autonomous database don't actually need Oracle Cloud accounts. They'll actually be given the network connectivity and authorization information they need to access those databases by the database administrators. 16:45 Lois: Maria, you mentioned the various ways to manage the lifecycle of an autonomous dedicated service. Can you tell us more about that? Maria: You can manage the lifecycle of an autonomous dedicated service through the Cloud UI, Command Line Interface, through our REST APIs, or through one of the several language SDKs. The lifecycle operations that you can manage include capacity planning and setup, the provisioning and partitioning of exadata infrastructure, the provisioning and management of databases, the scaling of CPU storage and other resources, the scheduling of updates for the infrastructure, the VMs, and the database, as well as monitoring through event notifications.  17:30 Lois: And how do policies come into play? Maria: OCI allows fine-grained control over resources through the application of policies to groups. These policies are applicable to any member of the group. For Oracle Autonomous Database on dedicated infrastructure, the resources in question are autonomous exadata infrastructure, autonomous container databases, autonomous databases, and autonomous backups.  Lois: Thanks so much, Maria. That was great information. 18:05 The Oracle University Learning Community is a great place for you to collaborate and learn with experts, peers, and practitioners. Grow your skills, inspire innovation, and celebrate your successes. The more you participate, the more recognition you can earn. All of your activities, from liking a post to answering questions and sharing with others, will help you earn badges and ranks, and be recognized within the community. If you are already an Oracle MyLearn user, go to MyLearn to join the community. You will need to log in first. If you have not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started. 18:44 Nikita: Welcome back! Hi Kamryn, thanks for joining us on the podcast. So, in an Autonomous Database environment where most DBA tasks are automated, what exactly does an application DBA do? Kamryn: While Autonomous Database automates most of the repetitive tasks that DBAs perform, the application DBA will still want to monitor and diagnose databases for applications to maintain the highest performance and the greatest security possible. Tasks the application DBA performs includes operations on databases, cloning, movement, monitoring, and creating alerts. When required, the application DBA performs low-level diagnostics for application performance and looks for insights on performance and capacity trends.  19:36 Nikita: I see. And which tools do they use for these tasks? Kamryn: There are several tools at the application DBA's disposal, including Enterprise Manager, Performance Hub, and the OCI Console. For Autonomous Dedicated, all the database operations are exposed through the console UI and available through REST API calls, including provisioning, stop/start, lifecycle operations for dedicated database types, unscheduled on-demand backups and restores, CPU scaling and storage management, providing connectivity information, including wallets, scheduling updates. 20:17 Lois: So, Kamryn, what tools can DBAs use for deeper exploration? Kamryn: For deeper exploration of the databases themselves, Autonomous Database DBAs can use SQL Developer Web, Performance Hub, and Enterprise Manager. 20:31 Nikita: Let's bring Kay into the conversation. Hi Kay! With Autonomous Database Dedicated, I've heard that customers have more control over patching. Can you tell us a little more about that? Kay: With Autonomous Database Dedicated, customers get to determine the update or patching schedule if they wish. Oracle automatically manages all patching activity, but with the ADB-Dedicated service, customers have the option of customizing the patching schedule. You can specify which month in every quarter you want, which week in that month, which day in that month, and which patching window within that day. You can also dynamically change the scheduled patching date and time for a specific database if the originally scheduled time becomes inconvenient. 21:22 Lois: That's great! So, how often are updates published, and what options do customers have when it comes to applying these updates? Kay: Every quarter, updates are published to the console, and OCI notifications are sent out. ADB-Dedicated allows for greater control over updates by allowing you to choose to apply the current update or stay with the previous version and skip to the next release. And the latest update can be applied immediately. This provides fleet administrators with the option to maintain test and production systems at different patch levels. A fleet administrator or a database admin sets up the software version policy at the Autonomous Container Database level during provisioning, although the defaults can be modified at any time for an existing Autonomous Container Database. At the bottom of the Autonomous Exadata Infrastructure provisioning screen, you will see a Configure the Automatic Maintenance section, where you should click the Modify Schedule.  22:34 Nikita: What happens if a customer doesn't customize their patching schedule? Kay: If you do not customize a schedule, it behaves like Autonomous Serverless, and Oracle will set a schedule for you. ADB-Dedicated customers get to choose the patching schedule that fits their business.  22:52 Lois: Back to you, Kamryn, I know a bit about Transparent Data Encryption, but I'm curious to learn more. Can you tell me what it does and how it helps protect data? Kamryn: Transparent Data Encryption, TDE, enables you to encrypt sensitive data that you store in tables and tablespaces. After the data is encrypted, this data is transparently decrypted for authorized users or applications when they access this data. TDE helps protect data stored on media, also called data at rest. If the storage media or data file is stolen, Oracle database uses authentication, authorization, and auditing mechanisms to secure data in the database, but not in the operating system data files where data is stored. To protect these data files, Oracle database provides TDE.  23:45 Nikita: That sounds important for data security. So, how does TDE protect data files? Kamryn: TDE encrypts sensitive data stored in data files. To prevent unauthorized decryption, TDE stores the encryption keys in a security module external to the database called a keystore. You can configure Oracle Key Vault as part of the TDE implementation. This enables you to centrally manage TDE key stores, called TDE wallets, in Oracle Key Vault in your enterprise. For example, you can upload a software keystore to Oracle Key Vault and then make the contents of this keystore available to other TDE-enabled databases. 24:28 Lois: What about Oracle Autonomous Database? How does it handle encryption? Kamryn: Oracle Autonomous Database uses always-on encryption that protects data at rest and in transit. All data stored in Oracle Cloud and network communication with Oracle Cloud is encrypted by default. Encryption cannot be turned off. By default, Oracle Autonomous Database creates and manages all the master encryption keys used to protect your data, storing them in a secure PKCS 12 keystore on the same Exadata systems where the databases reside. If your company's security policies require, Oracle Autonomous Database can instead use keys you create and manage. Customers can control key generation and rotation of the keys. 25:19 Kamryn: The Autonomous databases you create automatically use customer-managed keys because the Autonomous container database in which they are created is configured to use customer-managed keys. Thus, those users who create and manage Autonomous databases do not have to worry about configuring their databases to use customer-managed keys. 25:41 Nikita: Thank you so much, Kamryn, Kay, and Maria for taking the time to give us your insights. To learn more about provisioning Autonomous Database Dedicated resources, head over to mylearn.oracle.com and search for the Oracle Autonomous Database Administration Workshop. Lois: In our next episode, we will discuss Autonomous Database tools. Until then, this is Lois Houston… Nikita: …and Nikita Abraham signing off. 26:07 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Autonomous Database on Serverless Infrastructure

Oracle University Podcast

Play Episode Listen Later Jan 2, 2024 17:41


Want to quickly provision your autonomous database? Then look no further than Oracle Autonomous Database Serverless, one of the two deployment choices offered by Oracle Autonomous Database.   Autonomous Database Serverless delegates all operational decisions to Oracle, providing you with a completely autonomous experience.   Join hosts Lois Houston and Nikita Abraham, along with Oracle Database experts, as they discuss how serverless infrastructure eliminates the need to configure any hardware or install any software because Autonomous Database handles provisioning the database, backing it up, patching and upgrading it, and growing or shrinking it for you.   Oracle Autonomous Database Episode: https://oracleuniversitypodcast.libsyn.com/oracle-autonomous-database Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X (formerly Twitter): https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Rajeev Grover, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------   Episode Transcript:   00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. 00:26 Lois: Hello and welcome to the Oracle University Podcast. I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me is Nikita Abraham, Principal Technical Editor. Nikita: Hi everyone! Welcome back to a new season of the Oracle University Podcast. This time, our focus is going to be on Oracle Autonomous Database. We've got a jam-packed season planned with some very special guests joining us. 00:52 Lois: If you're a regular listener of the podcast, you'll remember that we'd spoken a bit about Autonomous Database last year. That was a really good introductory episode so if you missed it, you might want to check it out.  Nikita: Yeah, we'll post a link to the episode in today's show notes so you can find it easily. 01:07 Lois: Right, Niki. So, for today's episode, we wanted to focus on Autonomous Database on Serverless Infrastructure and we reached out to three experts in the field: Hannah Nguyen,  Sean Stacey, and Kay Malcolm. Hannah is an Associate Cloud Engineer, Sean, a Director of Platform Technology Solutions, and Kay, who's been on the podcast before, is Senior Director of Database Product Management. For this episode, we'll be sharing portions of our conversations with them. So, let's get started. 01:38 Nikita: Hi Hannah! How does Oracle Cloud handle the process of provisioning an Autonomous Database?   Hannah: The Oracle Cloud automates the process of provisioning an Autonomous Database, and it automatically provisions for you a highly scalable, highly secure, and a highly available database very simply out of the box. 01:56 Lois: Hannah, what are the components and architecture involved when provisioning an Autonomous Database in Oracle Cloud? Hannah: Provisioning the database involves very few steps. But it's important to understand the components that are part of the provisioned environment. When provisioning a database, the number of CPUs in increments of 1 for serverless, storage in increments of 1 terabyte, and backup are automatically provisioned and enabled in the database. In the background, an Oracle 19c pluggable database is being added to the container database that manages all the user's Autonomous Databases. Because this Autonomous Database runs on Exadata systems, Real Application Clusters is also provisioned in the background to support the on-demand CPU scalability of the service. This is transparent to the user and administrator of the service. But be aware it is there. 02:49 Nikita: Ok…So, what sort of flexibility does the Autonomous Database provide when it comes to managing resource usage and costs, you know… especially in terms of starting, stopping, and scaling instances? Hannah: The Autonomous Database allows you to start your instance very rapidly on demand. It also allows you to stop your instance on demand as well to conserve resources and to pause billing. Do be aware that when you do pause billing, you will not be charged for any CPU cycles because your instance will be stopped. However, you'll still be incurring charges for your monthly billing for your storage. In addition to allowing you to start and stop your instance on demand, it's also possible to scale your database instance on demand as well. All of this can be done very easily using the Database Cloud Console. 03:36 Lois: What about scaling in the Autonomous Database? Hannah: So you can scale up your OCPUs without touching your storage and scale it back down, and you can do the same with your storage. In addition to that, you can also set up autoscaling. So the database, whenever it detects the need, will automatically scale up to three times the base level number of OCPUs that you have allocated or provisioned for the Autonomous Database. 04:00 Nikita: Is autoscaling available for all tiers?  Hannah: Autoscaling is not available for an always free database, but it is enabled by default for other tiered environments. Changing the setting does not require downtime. So this can also be set dynamically. One of the advantages of autoscaling is cost because you're billed based on the average number of OCPUs consumed during an hour. 04:23 Lois: Thanks, Hannah! Now, let's bring Sean into the conversation. Hey Sean, I want to talk about moving an autonomous database resource. When or why would I need to move an autonomous database resource from one compartment to another? Sean: There may be a business requirement where you need to move an autonomous database resource, serverless resource, from one compartment to another. Perhaps, there's a different subnet that you would like to move that autonomous database to, or perhaps there's some business applications that are within or accessible or available in that other compartment that you wish to move your autonomous database to take advantage of. 04:58 Nikita: And how simple is this process of moving an autonomous database from one compartment to another? What happens to the backups during this transition? Sean: The way you can do this is simply to take an autonomous database and move it from compartment A to compartment B. And when you do so, the backups, or the automatic backups that are associated with that autonomous database, will be moved with that autonomous database as well. 05:21 Lois: Is there anything that I need to keep in mind when I'm moving an autonomous database between compartments?  Sean: A couple of things to be aware of when doing this is, first of all, you must have the appropriate privileges in that compartment in order to move that autonomous database both from the source compartment to the target compartment. In addition to that, once the autonomous database is moved to this new compartment, any policies or anything that's defined in that compartment to govern the authorization and privileges of that said user in that compartment will be applied immediately to that new autonomous database that has been moved into that new compartment. 05:59 Nikita: Sean, I want to ask you about cloning in Autonomous Database. What are the different types of clones that can be created?  Sean: It's possible to create a new Autonomous Database as a clone of an existing Autonomous Database. This can be done as a full copy of that existing Autonomous Database, or it can be done as a metadata copy, where the objects and tables are cloned, but they are empty. So there's no rows in the tables. And this clone can be taken from a live running Autonomous Database or even from a backup. So you can take a backup and clone that to a completely new database. 06:35 Lois: But why would you clone in the first place? What are the benefits of this?  Sean: When cloning or when creating this clone, it can be created in a completely new compartment from where the source Autonomous Database was originally located. So it's a nice way of moving one database to another compartment to allow developers or another community of users to have access to that environment. 06:58 Nikita: I know that along with having a full clone, you can also have a refreshable clone. Can you tell us more about that? Who is responsible for this? Sean: It's possible to create a refreshable clone from an Autonomous Database. And this is one that would be synced with that source database up to so many days. The task of keeping that refreshable clone in sync with that source database rests upon the shoulders of the administrator. The administrator is the person who is responsible for performing that sync operation. Now, actually performing the operation is very simple, it's point and click. And it's an automated process from the database console. And also be aware that refreshable clones can trail the source database or source Autonomous Database up to seven days. After that period of time, the refreshable clone, if it has not been refreshed or kept in sync with that source database, it will become a standalone, read-only copy of that original source database. 08:00 Nikita: Ok Sean, so if you had to give us the key takeaways on cloning an Autonomous Database, what would they be?  Sean: It's very easy and a lot of flexibility when it comes to cloning an Autonomous Database. We have different models that you can take from a live running database instance with zero impact on your workload or from a backup. It can be a full copy, or it can be a metadata copy, as well as a refreshable, read-only clone of a source database. 08:33 Did you know that Oracle University offers free courses on Oracle Cloud Infrastructure? You'll find training on everything from cloud computing, database, and security to artificial intelligence and machine learning, all of which is available free to subscribers. So, get going! Pick a course of your choice, get certified, join the Oracle University Learning Community, and network with your peers. If you are already an Oracle MyLearn user, go to MyLearn to begin your journey. If you have not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started.  09:12 Nikita: Welcome back! Thank you, Sean, and hi Kay! I want to ask you about events and notifications in Autonomous Database. Where do they really come in handy?  Kay: Events can be used for a variety of notifications, including admin password expiration, ADB services going down, and wallet expiration warnings. There's this service, and it's called the notifications service. It's part of OCI. And this service provides you with the ability to broadcast messages to distributed components using a publish and subscribe model. These notifications can be used to notify you when event rules or alarms are triggered or simply to directly publish a message. In addition to this, there's also something that's called a topic. This is a communication channel for sending messages to subscribers in the topic. You can manage these topics and their subscriptions really easy. It's not hard to do at all. 10:14 Lois: Kay, I want to ask you about backing up Autonomous Databases. How does Autonomous Database handle backups? Kay: Autonomous Database automatically backs up your database for you. The retention period for backups is 60 days. You can restore and recover your database to any point in time during this retention period. You can initiate recovery for your Autonomous Database by using the cloud console or an API call. Autonomous Database automatically restores and recovers your database to the point in time that you specify. In addition to a point in time recovery, we can also perform a restore from a specific backup set.  10:59 Lois: Kay, you spoke about automatic backups, but what about manual backups?  Kay: You can do manual backups using the cloud console, for example, if you want to take a backup say before a major change to make restoring and recovery faster. These manual backups are put in your cloud object storage bucket. 11:20 Nikita: Are there any special instructions that we need to follow when configuring a manual backup? Kay: The manual backup configuration tasks are a one-time operation. Once this is configured, you can go ahead, trigger your manual backup any time you wish after that. When creating the object storage bucket for the manual backups, it is really important-- so I don't want you to forget-- that the name format for the bucket and the object storage follows this naming convention. It should be backup underscore database name. And it's not the display name here when I say database name. 12:00 Kay: In addition to that, the object name has to be all lowercase. So three rules. Backup underscore database name, and the specific database name is not the display name. It has to be in lowercase. Once you've created your object storage bucket to meet these rules, you then go ahead and set a database property. Default_backup_bucket. This points to the object storage URL and it's using the Swift protocol. Once you've got your object storage bucket mapped and you've created your mapping to the object storage location, you then need to go ahead and create a database credential inside your database. You may have already had this in place for other purposes, like maybe you were loading data, you were using Data Pump, et cetera. If you don't, you would need to create this specifically for your manual backups. Once you've done so, you can then go ahead and set your property to that default credential that you created. So once you follow these steps as I pointed out, you only have to do it one time. Once it's configured, you can go ahead and use it from now on for your manual backups. 13:21 Lois: Kay, the last topic I want to talk about before we let you go is Autonomous Data Guard. Can you tell us about it? Kay: Autonomous Data Guard monitors the primary database, in other words, the database that you're using right now.  Lois: So, if ADB goes down… Kay: Then the standby instance will automatically become the primary instance. There's no manual intervention required. So failover from the primary database to that standby database I mentioned, it's completely seamless and it doesn't require any additional wallets to be downloaded or any new URLs to access APEX or Oracle Machine Learning. Even Oracle REST Data Services. All the URLs and all the wallets, everything that you need to authenticate, to connect to your database, they all remain the same for you if you have to failover to your standby database. 14:19 Lois: And what happens after a failover occurs? Kay: After performing a failover, a new standby for your primary will automatically be provisioned. So in other words, in performing a failover your standby does become your new primary. Any new standby is made for that primary. I know, it's kind of interesting. So currently, the standby database is created in the same region as the primary database. For better resilience, if your database is provisioned, it would be available on AD1 or Availability Domain 1. My secondary, or my standby, would be provisioned on a different availability domain. 15:10 Nikita: But there's also the possibility of manual failover, right? What are the differences between automatic and manual failover scenarios? When would you recommend using each? Kay: So in the case of the automatic failover scenario following a disastrous situation, if the primary ADB becomes completely unavailable, the switchover button will turn to a failover button. Because remember, this is a disaster. Automatic failover is automatically triggered. There's no user action required. So if you're asleep and something happens, you're protected. There's no user action required, but automatic failover is allowed to succeed only when no data loss will occur.   15:57 Nikita: For manual failover scenarios in the rare case when an automatic failover is unsuccessful, the switchover button will become a failover button and the user can trigger a manual failover should they wish to do so. The system automatically recovers as much data as possible, minimizing any potential data loss. But you can see anywhere from a few seconds or minutes of data loss. Now, you should only perform a manual failover in a true disaster scenario, expecting the fact that a few minutes of potential data loss could occur, to ensure that your database is back online as soon as possible.  16:44 Lois: Thank you so much, Kay. This conversation has been so educational for us. And thank you once again to Hannah and Sean. To learn more about Autonomous Database, head over to mylearn.oracle.com and search for the Oracle Autonomous Database Administration Workshop. Nikita: Thanks for joining us today. In our next episode, we will discuss Autonomous Database on Dedicated Infrastructure. Until then, this is Nikita Abraham… Lois: …and Lois Houston signing off. 17:12 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Best of 2023: OCI Identity and Access Management

Oracle University Podcast

Play Episode Listen Later Dec 5, 2023 13:56


Data breaches occur more often than we'd like them to. As businesses embrace remote work practices, IT resources are more at risk than ever before. Oracle Identity and Access Management (IAM) is an essential tool for protecting enterprise resources against cybersecurity threats. Join Lois Houston and Nikita Abraham, along with Rohit Rahi, as they examine IAM and the key aspects of this service, and discuss how you can control who has access to your resources.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community X (formerly Twitter): https://twitter.com/Oracle_Edu LinkedIn: https://www.linkedin.com/showcase/oracle-university/   Special thanks to Arijit Ghosh, Kiran BR, Rashmi Panda, David Wright, the OU Podcast Team, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. 00:26 Nikita: Hello and welcome to the Oracle University Podcast. I'm Nikita Abraham, Principal Technical Editor with Oracle University, and with me is Lois Houston, Director of Innovation Programs. Lois: Hi everyone. Thanks for joining us for this Best of 2023 series, where we're playing you six of our most popular episodes of the year.   00:47 Nikita: Today's episode is #3 of 6 and is a throwback to a conversation with Rohit Rahi, Vice President of CSS OU Cloud Delivery, on Identity and Access Management, which is one of OCI's top security features. So, let's get straight into it. 01:03 Rohit: IAM stands for Identity and Access Management service. It's also sometimes referred to as fine-grained access control or role-based access control service.  There are two key aspects to this service. The first one is called authentication, or also referred to as AuthN. And the second aspect is referred to as authorization or also referred to as AuthZ. Authentication has to deal with identity or who someone is, while authorization has to deal with permission or what someone is allowed to do.  01:37 Rohit: So basically what the service ensures is making sure that a person is who they claim to be. And as far as authorization is concerned, what the service does is it allows a user to be assigned one or more pre-determined roles, and each roles comes with a set of permissions. Now, there are various concepts which are part of this service or various features which are part of this service, starting with identity domains, principles, groups, dynamic groups, compartments, et cetera. Now identity domains is basically a container for your users and groups. So think about this as a construct which represents a user population in OCI and the associated configurations and security settings.  02:30 Lois: So, how does this work in practice?  Rohit: Well, what we do first is we create an identity domain, and then we create users and groups within that identity domain. And then we write policies against those groups, and policies are scoped to a tenancy, an account, or a compartment. And of course, the resources are available within a compartment. And again, compartment is kind of a logical isolation for resources. So this is how the whole service works. 03:03 Rohit: And users and the groups, authentication is done by common mechanisms like username and password, and policies is basically where you provide this role-based access control. So you put these groups in one of the pre-determined roles, and then you assign some permissions against those roles. So this is how the service works in a nutshell.  Now anything you create in the cloud, all these objects, whether it's a block storage, it's a compute instance, it's a file storage, it's a database, these are all resources. And if these things are resources, there has to be a unique identifier for these resources, else how are you going to operate on these resources? So what OCI does is it provides its own assigned identifier, which is called Oracle Cloud ID, OCID. You don't have to provide this. We do this automatically for all the resources. 04:02 Nikita: Thanks for that rundown, Rohit. Another feature of OCI is compartments, right? Can you tell us a bit about compartments? Rohit: When you open an account in OCI, you get a tenancy. That's another fancy name for an account. And we also give you a Root Compartment. So think of Root Compartment as this logical construct where you can keep all of your cloud resources. And then what you could do is, you could create your own individual compartments. And the idea is, you create these for isolation and controlling access. And you could keep a collection of related resources in specific compartments. So the network resource has-- a network compartment has network resources, and storage compartment has storage resources.  04:46 Rohit: Now, keep in mind, Root Compartment, as I said earlier, can hold all of the cloud resources. So it can be sort of a kitchen sink. You could put everything in there. But the best practice is to create dedicated compartments to isolate resources. You will see why. Let me just explain. So first thing is, each resource you create belongs to a single compartment. So you create a virtual machine, for example. It goes to Compartment A. It cannot go to Compartment B again. You have to move it from Compartment A, or delete, and re-create in Compartment B. Keep in mind, each resource belongs to a single compartment.  05:21 Rohit: Why you use compartments in the first place is for controlling access and isolation. So the way you do that is, you have the resources, let's say in this case a block storage, kept in Compartment A. You don't want those to be used by everyone. You want those to be used only by the compute admins and storage admins.  So you create those admins as users and groups, write these policies, and they can access these resources in this compartment. So it's very important. Do not put all of your resources in the Root Compartment. Create resource-specific compartments, or whichever way you want to divide your tenancies, and put resources accordingly.  06:00 Lois: Now, how do resources interact if they are in different compartments? Do they all have to be in the same compartment?  Rohit: Absolutely not! Resources in one compartment can interact with the resource in another compartment. Here, the Virtual Cloud Network is-- the compute instance uses the Virtual Cloud Network, but these are in two different compartments. So this is absolutely supported. And it keeps your design much cleaner.  Keep in mind that resources can also be moved from one compartment to another. So in this example, Compartment A had a virtual machine. We can move that from Compartment A to Compartment B. Another concept, which is very important to grasp is the compartments are global constructs, like everything in identity. So resources from multiple regions can be in the same compartment. So when you go to Phoenix, you see this compartment existing. You go to Ashburn, you see the same compartment.  06:55 Rohit: Now, you can write policies to prevent users from accessing resources in a specific region. You could do that. But keep in mind, all the compartments you create are global, and they are available in every region you have access to. Compartments can also be nested. So you have up to six levels nesting provided by compartments. You would do this again because this can mimic your current design, whether it's your organizational design or whether it's your ID hierarchy. You could create nested compartments. It just helps keep your design cleaner.  07:32 Rohit: And then, finally, you could set quotas and budgets on compartments. So you could say that, my particular compartment, you cannot create a bare metal machine. Or you cannot create an Exadata resource. So you could control it like that. And then you could also create budgets on compartments. So you could say that, if the usage in a particular compartment goes beyond $1,000, you'd get flagged, and you get notified. So you could do that. So that's compartments for you. It's a very unique feature within OCI. We believe it helps keep your tenancies much better organized. And it really supports your current ID hierarchy and design.  08:12 Boosting your professional reputation is now easier than ever. Oracle University Learning Community is a collaborative, dynamic community that gives you the power to create your own personal brand. Achieve champion levels and acquire badges. Get inducted into the Hall of Fame. Become a thought leader. If you are already an Oracle MyLearn user, go to MyLearn to join the community. You will need to log in first. If you have not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started. 08:53 Nikita: Welcome back! So Rohit, can you tell us a little bit about principals? Rohit: A principal is an IAM entity that is allowed to interact with OCI resources. There are two kinds of principals primarily in OCI. One is your users. Think about people who are logging on to your console or using your CLI or SDKs, users… human beings actually using your cloud resources. And then the resources themselves can be principals. So a good example of a resource principal is an instance principal which is actually an instance which becomes a principal, which means that it can make API calls against other OCI services like storage.  09:34 Rohit: Also, when we talk about principles we have groups. And groups are basically collection of users who have the same type of access requirements to resources. So you can have a storage admin group where you could group all the human beings who are storage administrators and so on and so forth. So let's look at some of the details, starting with authentication. Authentication is sometimes also referred to as AuthN. Authentication is basically figuring out are you who you say you are. And the easiest way to understand this is all of us deal with this on everyday basis. When you go to our website and you provide your username and password to access some of the content, you are being authenticated.  10:15 Rohit: There are other ways to do authentication. The one common for cloud is API Signing Keys. So when you are making API calls, whether you're using the SDK or the CLI, you would use the API Signing Keys which use a public private key pair to sign these APIs calls and authenticate these APIs calls. It uses an RSA key pair, with both a public key and a private key. There is also a third way to do authentication, and that's based on authentication tokens. And these are Oracle-generated token strings. And the idea here is you can authenticate third-party APIs which don't support OCI authentication model.  10:56 Lois: So, then, what are authorizations?  Rohit: So authorization deals with permissions and figuring out what permissions do you have. In OCI, authorization is done through what we call as IAM policies. And policies, think about these as human readable statements to define granular permissions. Remember, policies can be attached to a compartment or they could be attached to a tenancy. If they're attached to a tenancy, it applies to everything within that tenancy. If it's applied to a compartment, it applies to only the resources within that compartment.  11:33 Rohit: The syntax is always you have to start with an allow. Everything is denied by default, so you don't really have to write a deny statement. So you say allow group_name. A group is basically a collection of users. So you cannot write a policy on individual users, you always operate at a group level. To do something, there's a verb. On some resources, there's a resource-type and there's a location.  Location can be a tenancy. Location can be a compartment. And you can make these policies really complex with adding conditions. So just to give you an idea of what the verbs might look like. There are four levels of verb. There is a manage, there's a use, there's a read, and there's a inspect. And as you go down, these become additive.  12:17 Rohit: So manage basically means you can manage your resources, use basically means you can read but you could not do things like update and delete and so on and so forth. And you can read more on the documentation. Resource type basically can be all resources, meaning everything in your account, or it could be compute resources, database resources, whatnot, all the resources you have.  Now, you could operate at a family level, which is meaning all the entities within that resource family, or you could even go very granular. So you could say that in compute, I just want somebody to operate on the instances, but not work on the instance images. So you could actually do that.  So this is how you would write a policy.  12:58 Nikita: For more on OCI, please visit mylearn.oracle.com, create a profile if you don't already have one, and get started on our free training on OCI Foundations. Taking this training will help you advance and future-proof your career and prepare you for our OCI Foundations Associate exam. Nikita: We hope you enjoyed that conversation. Join us next week for another throwback episode. Until then, this is Nikita Abraham... Lois: And Lois Houston, signing off! 13:27 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Best of 2023: Getting Started with Oracle Cloud Infrastructure

Oracle University Podcast

Play Episode Listen Later Nov 28, 2023 13:26


Oracle's next-gen cloud platform, Oracle Cloud Infrastructure, has been helping thousands of companies and millions of users run their entire application portfolio in the cloud. Today, the demand for OCI expertise is growing rapidly. Join Lois Houston and Nikita Abraham, along with Rohit Rahi, as they peel back the layers of OCI to discover why it is one of the world's fastest-growing cloud platforms.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ X (formerly Twitter): https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, Kiran BR, Rashmi Panda, David Wright, the OU Podcast Team, and the OU Studio Team for helping us create this episode.   ------------------------------------------------------   Episode Transcript: 00:00 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. 00:26 Lois: Welcome to the Oracle University Podcast. I'm Lois Houston, Director of Innovation Programs with Oracle University, and with me today is Nikita Abraham, Principal Technical Editor. Nikita: Hi there! You're listening to our Best of 2023 series, where over the next few weeks, we'll be revisiting six of our most popular episodes of the year. 00:47 Lois: Today is episode 2 of 6, and we're throwing it back to our very first episode of the Oracle University Podcast. It was a conversation that Niki and I had with Rohit Rahi, Vice President, CSS OU Cloud Delivery. During this episode, we discussed Oracle Cloud Infrastructure's core coverage on different tiers. Nikita: But we began by asking Rohit to explain what OCI is and tell us about its key components. So, let's jump right in. 01:14 Rohit: Some of the world's largest enterprises are running their mission-critical workloads on Oracle's next generation cloud platform called Oracle Cloud Infrastructure. To keep things simple, let us break them down into seven major categories: Core Infrastructure, Database Services, Data and AI, Analytics, Governance and Administration, Developer Services, and Application Services.  But first, the foundation of any cloud platform is the global footprint of regions. We have many generally available regions in the world, along with multi-cloud support with Microsoft Azure and a differentiated hybrid offering called Dedicated Region Cloud@Customer.  01:57 Rohit: We have building blocks on top of this global footprint, the seven categories we just mentioned. At the very bottom, we have the core primitives: compute, storage, and networking. Compute services cover virtual machines, bare metal servers, containers, a managed Kubernetes service, and a managed VMWare service.  These services are primarily for performing calculations, executing logic, and running applications. Cloud storage includes disks attached to virtual machines, file storage, object storage, archive storage, and data migration services. 02:35 Lois: That's quite a wide range of storage services. So Rohit, we all know that networking plays an important role in connecting different services. These days, data is growing in size and complexity, and there is a huge demand for a scalable and secure approach to store data. In this context, can you tell us more about the services available in OCI that are related to networking, database, governance, and administration? 03:01 Rohit: Networking features let you set up software defined private networks in Oracle Cloud. OCI provides the broadest and deepest set of networking services with the highest reliability, most security features, and highest performance.  Then we have database services, we have multiple flavors of database services, both Oracle and open source. We are the only cloud that runs Autonomous Databases and multiple flavors of it, including OLTP, OLAP, and JSON.  And then you can run databases and virtual machines, bare metal servers, or even Exadata in the cloud. You can also run open source databases, such as MySQL and NoSQL in the Oracle Cloud Infrastructure.  03:45 Rohit: Data and AI Services, we have a managed Apache Spark service called Dataflow, a managed service for tracking data artifacts across OCI called Data Catalog, and a managed service for data ingestion and ETL called Data Integration.  We also have a managed data science platform for machine learning models and training. We also have a managed Apache Kafka service for event streaming use cases.  Then we have Governance and Administration services. These services include security, identity, and observability and management. We have unique features like compartments that make it operationally easier to manage large and complex environments. Security is integrated into every aspect of OCI, whether it's automatic detection or remediation, what we typically refer as Cloud Security Posture Management, robust network protection or encryption by default.  We have an integrated observability and management platform with features like logging, logging analytics, and Application Performance Management and much more.  04:55 Nikita: That's so fascinating, Rohit. And is there a service that OCI provides to ease the software development process? Rohit: We have a managed low code service called APEX, several other developer services, and a managed Terraform service called Resource Manager.  For analytics, we have a managed analytics service called Oracle Analytics Cloud that integrates with various third-party solutions.  Under Application services, we have a managed serverless offering, call functions, and API gateway and an Events Service to help you create microservices and event driven architectures.  05:35 Rohit: We have a comprehensive connected SaaS suite across your entire business, finance, human resources, supply chain, manufacturing, advertising, sales, customer service, and marketing all running on OCI.  That's a long list. And these seven categories and the services mentioned represent just a small fraction of more than 80 services currently available in OCI.  Fortunately, it is quick and easy to try out a new service using our industry-leading Free Tier account. We are the first cloud to offer a server for just a penny per core hour.  Whether you're starting with Oracle Cloud Infrastructure or migrating your entire data set into it, we can support you in your journey to the cloud.   06:28 Have an idea and want a platform to share your technical expertise? Head over to the new Oracle University Learning Community. Drive intellectual, free-flowing conversations with your peers. Listen to experts and learn new skills. If you are already an Oracle MyLearn user, go to MyLearn to join the Community. You will need to log in first. If you have not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started.  Join the conversation today! 07:04 Nikita: Welcome back! Now let's listen to Rohit explain the core constructs of OCI's physical architecture, starting with regions. Rohit: Region is a localized geographic area comprising of one or more availability domains.  Availability domains are one or more fault tolerant data centers located within a region, but connected to each other by a low latency, high bandwidth network. Fault domains is a grouping of hardware and infrastructure within an availability domain to provide anti-affinity. So think about these as logical data centers.  Today OCI has a massive geographic footprint around the world with multiple regions across the world. And we also have a multi-cloud partnership with Microsoft Azure. And we have a differentiated hybrid cloud offering called Dedicated Region Cloud@Customer.  08:02 Lois: But before we dive into the physical architecture, can you tell us…how does one actually choose a region?  Rohit: Choosing a region, you choose a region closest to your users for lowest latency and highest performance. So that's a key criteria. The second key criteria is data residency and compliance requirements. Many countries have strict data residency requirements, and you have to comply to them. And so you choose a region based on these compliance requirements.  08:31 Rohit: The third key criteria is service availability. New cloud services are made available based on regional demand at times, regulatory compliance reasons, and resource availability, and several other factors. Keep these three criteria in mind when choosing a region.  So let's look at each of these in a little bit more detail. Availability domain. Availability domains are isolated from each other, fault tolerant, and very unlikely to fail simultaneously. Because availability domains do not share physical infrastructure, such as power or cooling or the internal network, a failure that impacts one availability domain is unlikely to impact the availability of others.  A particular region has three availability domains. One availability domain has some kind of an outage, is not available. But the other two availability domains are still up and running.  09:26 Rohit: We talked about fault domains a little bit earlier. What are fault domains? Think about each availability domain has three fault domains. So think about fault domains as logical data centers within availability domain.  We have three availability domains, and each of them has three fault domains. So the idea is you put the resources in different fault domains, and they don't share a single point of hardware failure, like physical servers, physical rack, top of rack switches, a power distribution unit. You can get high availability by leveraging fault domains.  We also leverage fault domains for our own services. So in any region, resources in at most one fault domain are being actively changed at any point in time. This means that availability problems caused by change procedures are isolated at the fault domain level. And moreover, you can control the placement of your compute or database instances to fault domain at instance launch time. So you can specify which fault domain you want to use.  10:29 Nikita: So then, what's the general guidance for OCI users?  Rohit: The general guidance is we have these constructs, like fault domains and availability domains to help you avoid single points of failure. We do that on our own. So we make sure that the servers, the top of rack switch, all are redundant. So you don't have hardware failures or we try to minimize those hardware failures as much as possible. You need to do the same when you are designing your own architecture.  So let's look at an example. You have a region. You have an availability domain. And as we said, one AD has three fault domains, so you see those fault domains here.  11:08 Rohit: So first thing you do is when you create an application you create this software-defined virtual network. And then let's say it's a very simple application. You have an application tier. You have a database tier.  So first thing you could do is you could run multiple copies of your application. So you have an application tier which is replicated across fault domains. And then you have a database, which is also replicated across fault domains.  11:34 Lois: What's the benefit of this replication, Rohit?  Rohit: Well, it gives you that extra layer of redundancy. So something happens to a fault domain, your application is still up and running.  Now, to take it to the next step, you could replicate the same design in another availability domain. So you could have two copies of your application running. And you can have two copies of your database running.  11:57 Now, one thing which will come up is how do you make sure your data is synchronized between these copies? And so you could use various technologies like Oracle Data Guard to make sure that your primary and standby-- the data is kept in sync here. And so that-- you can design your application-- your architectures like these to avoid single points of failure. Even for regions where we have a single availability domain, you could still leverage fault domain construct to achieve high availability and avoid single points of failure.  12:31 Nikita: Thank you, Rohit, for taking us through OCI at a high level.  Lois: For a more detailed explanation of OCI, please visit mylearn.oracle.com, create a profile if you don't already have one, and get started on our free training on OCI Foundations.  Nikita: We hope you enjoyed that conversation. Join us next week for another throwback episode. Until then, this is Nikita Abraham... Lois: And Lois Houston, signing off! 12:57 That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Cloud Wars Live with Bob Evans
Road to Oracle CloudWorld: AI Integration, Cloud Database Focus, and Exadata Enhancements

Cloud Wars Live with Bob Evans

Play Episode Listen Later Aug 11, 2023 14:31


Road to Oracle CloudWorld | Databases with Jenny Tsai-SmithThe Big ThemesInclusive to all customers: After last year, when some on-premises customers mistakenly thought that the conference might be geared exclusively to cloud customers, Oracle is using the name "Database World at CloudWorld" to make it clear that cloud, hybrid, and on-premises Oracle Database customers are all included.AI integrated into Oracle Database: Oracle has incorporated artificial intelligence (AI) and machine learning (ML) into its database offerings. The company already supports over 30 ML algorithms within the database, enabling data processing and training without moving data outside. Graph analytics and real-time graph views are among the upcoming AI-related features.Enhancements in Exadata and developer support: Oracle's Exadata product line, in collaboration with AMD, brings enhanced performance through increased memory, processing speed, and storage. The company is committed to supporting a variety of customer configurations, acknowledging the transition to the cloud while ensuring on-premises options. Developers are also a focus, with a developer-friendly release approach, tools to increase productivity, and discussions about features like Kubernetes support and data processing enhancements. The Big Quote: " . . we're going back to those some roots, if you will, and looking at what are things that developers really care about? Maybe they're even suffering through. And how can we solve those kinds of problems for them?"

Oracle University Podcast
Maximum Availability Architecture

Oracle University Podcast

Play Episode Listen Later Aug 8, 2023 14:44


Join Lois Houston and Nikita Abraham, along with Alex Bouchereau, as they talk about Oracle Maximum Availability Architecture, which provides architecture, configuration, and lifecycle best practices for Oracle Databases.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ Twitter: https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Ranbir Singh, and the OU Studio Team for helping us create this episode.   --------------------------------------------------------   Episode Transcript:   00;00;00;00 - 00;00;39;11 Welcome to the Oracle University Podcast, the first stop on your cloud journey. During this series of informative podcasts, we'll bring you foundational training on the most popular Oracle technologies. Let's get started. Hello and welcome to the Oracle University Podcast. I'm Nikita Abraham, Principal Technical Editor with Oracle University, and I'm joined by Lois Houston, Director of Product Innovation and Go to Market Programs.   00;00;39;18 - 00;01;12;09 Hi, everyone. Last week, we discussed Oracle's Maximum Security Architecture, and today, we're moving on to Oracle Cloud Infrastructure's Maximum Availability Architecture. To take us through this, we're once again joined by Oracle Database Specialist Alex Bouchereau. Welcome, Alex. We're so happy you're becoming a regular on our podcast. So, to start, what is OCI Maximum Availability Architecture? Now, before we actually jump into the specifics, it's important to understand the problem we're trying to address.   00;01;12;11 - 00;01;38;01 And that is database downtime and data protection. We don't want any data loss and the impact of both of these types of occurrences can be significant. Now, $350K on average of costs of downtime per hour, 87 hours average amount of downtime per year is pretty significant. So, it's a very, very common occurrence. It's $10 million for a single outage, depending on how critical the application is.   00;01;38;03 - 00;02;02;28 And 91% of companies have experienced unplanned data center outages, which means this occurs fairly often. So, what can we do about this? How do we address the problem of data loss? It's important to understand a different terminology first. So, we'll start with high availability. High availability provides redundant components to go ahead and ensure your service is uninterrupted in case of a type of hardware failure.   00;02;03;01 - 00;02;24;24 So, if one server goes down, the other servers will be up. Ideally, you'll have a cluster to go ahead and provide that level of redundancy. And then we talk about scalability. Depending upon the workload, you want to ensure that you still have your performance. So, as your application becomes more popular and more end users go ahead and join it, the workload increases.   00;02;24;26 - 00;02;42;28 So, you want to ensure that the performance is not impacted at all. So, if we want to go ahead and minimize the time of our planned maintenance, which happens more often and a lot more often than unplanned outages, we need to do so in a rolling fashion. And that's where rolling upgrades, rolling patches, and all these types of features come into play.   00;02;42;29 - 00;03;10;20 Okay, so just to recap, the key terms you spoke about were high availability, which is if one server goes down, others will be up, scalability, which is even if the workload increases, performance isn't impacted, and rolling updates, which is managing planned updates seamlessly with no downtime. Great. What's next? Disaster recovery. So, we move from high availability to disaster recovery, protecting us from a complete site outage.   00;03;10;27 - 00;03;35;02 So, if the site goes down entirely, we want to have a redundant site to be able to failover to. That's where disaster recovery comes into play. And then how do we measure downtime and data loss? So, we do so with Recovery Point Objectives, or RPOs, measuring data loss and Recovery Time Objectives, or RTOs, measuring our downtime.   00;03;35;05 - 00;04;00;22 Alex, when you say measure downtime, how do we actually do that? Well, we use a technique called chaos engineering. Essentially, it's an art form at the end of the day because it's constantly evolving and changing over time. We're proactively breaking things in the system and we're testing how our failover, how our resiliency, and how our switchovers, and how everything goes ahead and works under the covers with all our different features.   00;04;00;23 - 00;04;21;28 A lot of components can suffer an outage, right? We have networks and servers, storage, and all these different components can fail. But also human error. Someone can delete a table. You could delete a bunch of rows. So, they can make a mistake on the system as well. That occurs very often. Data corruption and then, of course, power failures.   00;04;22;00 - 00;04;45;03 Godzilla could attack and take out the entire data center. Godzilla! Ha! And you want to be able to go ahead and have a disaster recovery in place. And then there's all kinds of maintenance activities that happen with application updates. You might want to reorganize the data without changing the application and the small, little optimizations. And these can all happen in isolation and or in combination with each other.   00;04;45;05 - 00;05;19;03 And so chaos engineers take all this into consideration and build out the use cases to go ahead and test the system. Do we have some best practices in place for this, then? Oracle Maximum Availability Architecture, MAA, is Oracle's best practice blueprint based on proven Oracle high availability technologies, end-to-end validation, expert recommendations, and customer experiences. The key goal of MAA is to achieve optimal high availability, data protection, and disaster recovery for Oracle customers at the lowest cost and complexity.   00;05;19;05 - 00;05;54;07 MAA consists of reference architectures for various buckets of HA service-level agreements, configuration practices, and HA lifecycle operational best practices, and are applicable for non-engineered systems, engineered systems, non-cloud, and cloud deployments. Availability of data and applications is an important element of every IT strategy. At Oracle, we've used our decades of enterprise experience to develop an all-encompassing framework that we can all call Oracle MAA, for Maximum Availability Architecture.   00;05;54;07 - 00;06;20;21 And how was Oracle's Maximum Availability Architecture developed? Oracle MAA starts with customer insights and expert recommendations. These have been collected from our huge pool of customers and community of database architects, software engineers, and database strategists. Over the years, this has helped the Oracle MAA development team gain a deep and complete understanding of various kinds of events that can affect availability.   00;06;20;24 - 00;06;48;11 Through this, they have developed an array of availability reference architectures. These reference architectures acknowledge not all data or applications require the same protection and that there are real tradeoffs in terms of cost and effort that should be considered. Whatever your availability goals may be for a database or related applications, Oracle has the product functionality and guidance to ensure you can make the right decision with full knowledge of the tradeoffs in terms of downtime, data loss, and costs.   00;06;48;11 - 00;07;04;01 These reference architectures use a wide array of our HA features, configurations, and operational practices.   00;07;04;03 - 00;07;29;04 Want to get the inside scoop on Oracle University? Head on over to the all-new Oracle University Learning Community. Attend exclusive events. Read up on the latest news. Get firsthand access to new products and stay up-to-date with upcoming certification opportunities. If you're already an Oracle MyLearn user, go to mylearn.oracle.com to join the community. You will need to log in first. If you've not yet accessed Oracle MyLearn, visit mylearn.oracle.com and create an account to get started.   00;07;29;04 - 00;07;57;19 Join the community today. Welcome back. Alex, you were telling us about how Oracle MAA or Maximum Availability Architecture has reference architectures that use a series of high availability features and configurations. But, how do these help our customers? They help our end customers achieve primarily four goals.   00;07;57;22 - 00;08;29;29 Number one, data protection, reducing data loss through flashback and absolute data protection through zero data loss recovery appliance. Number two, active replication, which allows customers to connect their applications to replicated sites in an active-active HA solution through Active Data Guard and GoldenGate. Number three, scale out, which allows customers the ability to scale compute nodes linearly through RAC, ASM, and Sharding.   00;08;30;01 - 00;08;58;19 Four, continuous availability. This allows transparent failovers of services across sites distributed locally or remote, through AC and GDS. These features and solutions allow customers to mitigate not only planned events, such as software upgrades, data schema changes, and patching, but also unplanned events, such as hardware failures and software crashes due to bugs. Finally, customers have various deployment choices on which we can deploy these HA solutions.   00;08;58;22 - 00;09;25;02 The insights, recommendations, reference architectures, features, configurations, best practices, and deployment choices combine to form a holistic blueprint, which allows customers to successfully achieve their high availability goals. What are the different technologies that come into play here? Well, we'll start with RAC. So, RAC is a clustering technology spread through different nodes across the different servers, so you don't have a single point of failure.   00;09;25;05 - 00;09;46;13 From a scalability standpoint and performance standpoint, you get a lot of benefit associated with that. You constantly add a new node whenever you want to without experiencing any downtime. So, you have that flexibility at this point. And if any type of outage occurs, all the committed transactions are going to be protected and we'll go ahead and we'll move that session over to a new service.   00;09;46;15 - 00;10;07;27 So, from that point, we want to go ahead and also protect our in-flight transactions. So, when it comes to in-flight transactions, how are we going to protect those in addition to the RAC nodes? Well, we can go ahead and do so with another piece of technology that's built into RAC, and that's the Transparent Application Continuity feature. So, this feature is going to expand the capabilities of RAC.   00;10;08;03 - 00;10;28;18 It's a feature of RAC to go ahead and protect our in-flight transactions so our application doesn't experience those transactions failing and coming back up to the layer, or even up to the end users. We want to capture those. We want to replay them. So that's what application continuity does. It allows us to go in and do that.   00;10;28;21 - 00;10;51;03 It supports a whole bunch of different technologies, from Java, .NET, PHP. You don't have to make any changes to the application. All you have to do is use the correct driver and have the connection string appropriately configured and everything else is happening in the database. What about for disaster recovery? Active Data Guard is the Oracle solution for disaster recovery.   00;10;51;05 - 00;11;29;08 It eliminates a single point of failure by providing one or more synchronized physical replicas of the production database. It uses Oracle Aware Replication to efficiently use network bandwidth and provide unique levels of data protection. It provides data availability with fast, manual, or automatic failover to standby should a primary fail and fast switch over to a standby for the purpose of minimizing planned downtime as well. An Active Data Guard standby is open, read only, while it is being synchronized, providing advanced features for data protection, availability, and protection offload.   00;11;29;08 - 00;11;50;23 We have different database services, right? We have our Oracle Database Cloud servers, we have Exadata Cloud servers, and we have Autonomous Database. Do they all have varying technologies built into them? All of them are Database Aware architecture at the end of the day. And the Oracle Database Cloud Service, you have the choice of single instance, or you can go ahead and choose between RAC as well.   00;11;50;25 - 00;12;23;25 You can use quick migration via Zero Downtime Migration, or ZDM for short. We have automated backups built in, and you can set up cross-regional or cross availability to do any DR with Active Data Guard through our control play. And we build on that with Exadata Cloud Service by going ahead and changing the foundation to Exadata, with all the rich benefits of performance, scalability, and optimizations for the Oracle Database, and all the different HA and DR technologies that run within it, to the cloud.   00;12;23;27 - 00;12;50;22 Very easy to go ahead and move from Exadata on-premise to Exadata Cloud Service. And you have choices. You can do the public cloud, or you can do Cloud@Customer or ExaCC, as we call it, to go ahead and run Exadata within your own data center--Exadata Cloud Service and your own data center. And building on top of that, we have Autonomous, which also builds on top of that Exadata infrastructure.   00;12;50;25 - 00;13;19;12 And we have two flavors of that. We have shared and we have dedicated, depending upon your requirements. And is all of this managed by Oracle? Now, at this point, everything's managed by Oracle and things like Data Guard can be configured. We call it Autonomous Data Guard in the Autonomous Database. With a simple two clicks, you can set up cross-regional or cross availability domain VR. And then everything is built, of course, from a high-available multitenant RAC infrastructure.   00;13;19;15 - 00;13;48;02 So, it's using all other technologies and optimizations that we've been talking about. Thanks, Alex, for listing out the different offerings we have. I think we can wind up for today. Any final thoughts? So high availability, disaster recovery, absolute requirements. Everybody should have it. Everybody should think of it ahead of time. We have different blueprints, different tiers of our MAA architecture that map different RTO and RPO requirements depending upon your needs.   00;13;48;04 - 00;14;12;01 And those may change over time. And finally, the business continuity we can provide with MAA is for both planned maintenance and unplanned outage events. So, it's for both. And that's a critical part to this as well. Thank you, Alex, for spending this time with us. That's it for this episode. Next week, we'll talk about managing Oracle Database with REST APIs, and ADB built-in tools.   00;14;12;04 - 00;16;57;28 Until then, this is Nikita Abraham and Lois Houston signing off. That's all for this episode of the Oracle University Podcast. If you enjoyed listening, please click Subscribe to get all the latest episodes. We'd also love it if you would take a moment to rate and review us on your podcast app. See you again on the next episode of the Oracle University Podcast.

Oracle University Podcast
Exadata Cloud Service

Oracle University Podcast

Play Episode Listen Later Jun 6, 2023 18:25


Hear Lois Houston and Nikita Abraham, along with Alex Bouchereau, talk about Exadata Cloud Service, and more specifically, Exadata Cloud Service X8M, which is the latest release of Oracle's premier Database Cloud Service.   They also take a look at how advanced cloud automation, dynamic resource scaling, and flexible subscription pricing enable customers to run database workloads faster and with lower costs.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ Twitter: https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Ranbir Singh, and the OU Studio Team for helping us create this episode.

Oracle University Podcast
Oracle Autonomous Database

Oracle University Podcast

Play Episode Listen Later May 30, 2023 12:13


What if you could significantly reduce the amount of time spent managing your database while still being confident that it is secure?   Well, you can! With Oracle Autonomous Database (ADB), you can enjoy the highest levels of performance, scalability, and availability without the complexity of operating and securing your database.   In this episode, Lois Houston and Nikita Abraham speak to William Masdon about how you can use the features of ADB to securely integrate, store, and analyze your data.   Oracle MyLearn: https://mylearn.oracle.com/ Oracle University Learning Community: https://education.oracle.com/ou-community LinkedIn: https://www.linkedin.com/showcase/oracle-university/ Twitter: https://twitter.com/Oracle_Edu   Special thanks to Arijit Ghosh, David Wright, Ranbir Singh, and the OU Studio Team for helping us create this episode.

DBAOCM Podcast
EP511 - Qual o preço de um Exadata?

DBAOCM Podcast

Play Episode Listen Later Oct 10, 2022 12:10


EP511 - Qual o preço de um Exadata?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Cloud Wars Live with Bob Evans
Road to Oracle CloudWorld 2022 Databases Preview: Exadata, 23c Release, and More with EVP Juan Loaiza

Cloud Wars Live with Bob Evans

Play Episode Listen Later Oct 3, 2022 10:48


The Big Themes:Enhancing the developer experience: This is a major focus for Oracle, and many CloudWorld sessions will focus on work that's been done to help developers use Oracle DBs, including Autonomous Database.Oracle Exadata: Along with database and cloud, Juan says that Exadata is one of three core initiatives for the event.Oracle Database 23c release: CloudWorld will feature the very first preview and discussion of Oracle's next major DB release, called 23c.The Big Quote: "We're also talking about what we're doing for distributed apps, like micro-services and events. We're building a lot of this technology directly into the Oracle database to make it dramatically simpler. We'll be talking about things we're doing with documents and document databases—that's going to be a really big deal."Want to learn more about Oracle CloudWorld 2022?Explore content, speakers, and registration options. This episode is sponsored by Oracle.

Digital Impact Radio
S8 EP22 - Robert Greene discusses Autonomous & Exadata Cloud@Customer

Digital Impact Radio

Play Episode Listen Later Sep 26, 2022 19:37


Robert Greene, Oracle VP of Product Management talks about Exadata Cloud@Customer, being able to create both Autonomous and Exadata Database VM Clusters on Exadata Cloud@Customer platforms from X7 Gen 2 to the newest generation. This provides organizations a single platform for their Modernisation Approach whether it includes a traditional approach to managing data, or what is possible with the Autonomous Database Category.

DBAOCM Podcast
EP453 - 4 principais erros de quem quer aprender Exadata

DBAOCM Podcast

Play Episode Listen Later Jun 28, 2022 11:26


EP453 - 4 principais erros de quem quer aprender Exadata   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP447 - O Plano para Certificação Exadata em 7 semanas

DBAOCM Podcast

Play Episode Listen Later Jun 15, 2022 11:48


EP447 - O Plano para Certificação Exadata em 7 semanas   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP446 - O que cai na prova de Certificação Exadata X9M (1Z0-902) ?

DBAOCM Podcast

Play Episode Listen Later Jun 13, 2022 10:16


EP446 - O que cai na prova de Certificação Exadata X9M (1Z0-902) ?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP444 - Qual a diferença do Exadata Cloud @ Customer (ExaCC)?

DBAOCM Podcast

Play Episode Listen Later Jun 7, 2022 13:24


EP444 - Qual a diferença do Exadata Cloud @ Customer (ExaCC)?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP443 - Quanto custa a Certificação Exadata X9M?

DBAOCM Podcast

Play Episode Listen Later Jun 6, 2022 12:20


EP443 - Quanto custa a Certificação Exadata X9M?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP438 - Qual a vantagem do Exadata Virtualizado?

DBAOCM Podcast

Play Episode Listen Later Jun 1, 2022 14:56


EP438 - Qual a vantagem do Exadata Virtualizado?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Google Cloud Platform Podcast
AlloyDB with Sandy Ghai and Gurmeet "GG" Goindi

Google Cloud Platform Podcast

Play Episode Listen Later May 18, 2022 47:36


AlloyDB for PostgreSQL has launched and hosts Mark Mirchandani and Gabe Weiss are here this week to talk about it with guests Sandy Ghai and Gurmeet Goindi. This fully managed, Postgres compatible database for enterprise use combines the power of Google Cloud and the best features of Postgres for superior data management. AlloyDB began years ago as a solution to help manage huge data migrations to the cloud. Customers needed a way to take advantage of the benefits of cloud, modernizing their databases as they migrated in an easy, flexible, and scalable way. Databases had to maintain performance and availability while offering enterprise customers optimal security features and more. We learn why PostgreSQL is important in the equation and how AlloyDB is built with Google scaling abilities and ML while supporting open source compatibility. We talk about data analytics workloads and how AlloyDB handles in-the-moment analytics needs. Our guests describe and compare different database offerings at Google, emphasizing the solutions that set AlloyDB apart. We chat about the types of projects each database is best suited for and how AlloyDB fits into the Google database portfolio. We hear examples of customers moving to AlloyDB and how they're using this new service. Clients have been leveraging the embedded ML features for better data management. Sandy Ghai Sandy is a product manager on GCP Databases and has been working on the AlloyDB team since the beginning. Gurmeet “GG” Goindi GG is a product manager at Google, where he focuses on databases and attends meetings. Prior to joining Google, GG led product management for Exadata at Oracle, where he also worked on databases and attended meetings. GG has had various product management, management, and engineering roles for the last 20 years in Silicon Valley, but his favorite meetings have been at Google. He holds an MBA from the University of Chicago Booth School of Business. Cool things of the week Google I/O site Introducing “Visualizing Google Cloud: 101 Illustrated References for Cloud Engineers and Architects” blog Meet the people of Google Cloud: Priyanka Vergadia, bringing Google Cloud to life in illustrations blog Working with Remote Functions docs Interview AlloyDB for PostgreSQL site AlloyDB Documentation docs AlloyDB for PostgreSQL under the hood: Intelligent, database-aware storage blog PostgreSQL site Introducing AlloyDB for PostgreSQL video Introducing AlloyDB, a PostgreSQL-compatible cloud database service video BigQuery site Spanner site CloudSQL site What's something cool you're working on? Gabe is working on some exciting content to support landing the AlloyDB launch. Hosts Mark Mirchandani and Gabe Weiss

DBAOCM Podcast
EP378 - Como alterar IP no Exadata?

DBAOCM Podcast

Play Episode Listen Later Mar 4, 2022 28:32


EP378 - Como alterar IP no Exadata?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Datascape Podcast
Episode 56 - Oracle Database, Exadata And Cloud Update

Datascape Podcast

Play Episode Listen Later Feb 8, 2022 44:43


In this episode, Warner is joined by Oracle ACE Franky Faust to discuss the latest updates and current status of the Oracle database stack including Exadata and Oracle cloud offerings.

Digital Impact Radio
S8 EP01 - Martin Lambert talks Anytime Modernisation with Exadata

Digital Impact Radio

Play Episode Listen Later Jan 18, 2022 13:32


Martin Lambert introduces the term "Anytime Modernisation", helping organisations practice both Cloud Speed and Technical Debt. This enables customers to modernize operations and reduce costs, at speed.

Digital Impact Radio
S7 EP20 - Jason Reinhardt talks Data Management Choices

Digital Impact Radio

Play Episode Listen Later Dec 7, 2021 9:48


Jason Reinhardt, Oracle Product Manager talks about the Oracle Data Management Choices, from on-premise to the many cloud variants such as Autonomous Shared and Dedicated, Database Cloud Service, Exadata Cloud Service and Exadata Cloud at Customer.

DBAOCM Podcast
EP293 - Exadata X9M-2 100G Client Network

DBAOCM Podcast

Play Episode Listen Later Oct 30, 2021 10:30


EP293 - Exadata X9M-2 100G Client Network   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP294 - Exadata X9M com menos PMEM?

DBAOCM Podcast

Play Episode Listen Later Oct 30, 2021 12:02


EP294 - Exadata X9M com menos PMEM?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP204 - Como planejar uma Expansão de Storage no Exadata

DBAOCM Podcast

Play Episode Listen Later Jun 20, 2021 16:18


EP204 - Como planejar uma Expansão de Storage no Exadata   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP170 - Quais as principais features do Exadata?

DBAOCM Podcast

Play Episode Listen Later May 11, 2021 14:14


EP170 - Quais as principais features do Exadata?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Einsen & Nullen
Engineered Systems - Die datengetriebene Zukunft

Einsen & Nullen

Play Episode Listen Later May 11, 2021 21:29


Nachdem wir nun verstehen, was sich hinter dem Begriff „Engineered Systems“ verbirgt, klärt uns unser Gast Ralf Zenses von Oracle in der heutigen Episode darüber auf, wie diese funktionieren und welche Vorteile einen schon bei der Auslieferung erwarten. Ferner gibt er uns seine Einschätzung, wie sich der Beruf des Datenbankadministrators verändern wird, welche Prozesse in diesem Zusammenhang automatisiert werden – und zeigt zudem auf, wieso bei einem typischen Datenbankensystem immer das Netzwerk das Problem darstellt.

DBAOCM Podcast
EP165 - O que é o OEDA do Exadata?

DBAOCM Podcast

Play Episode Listen Later May 6, 2021 26:36


EP165 - O que é o OEDA do Exadata?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP164 - Network Requirements for Oracle Exadata

DBAOCM Podcast

Play Episode Listen Later May 5, 2021 32:10


EP164 - Network Requirements for Oracle Exadata   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Einsen & Nullen
Engineered Systems - Keine schwarze Magie

Einsen & Nullen

Play Episode Listen Later May 4, 2021 19:06


Was hat die industrielle Vorfertigung mit der modernen IT gemeinsam? Wie wir heute erfahren, lautet die Antwort „Engineered Systems“. Unser Gast Ralf Zenses von Oracle erklärt uns, was sich hinter dem Begriff verbirgt und zeigt u. a. Vorteile in Bezug auf Aktualisierungen und das Einhalten der DSGVO auf. Ferner erklärt er, wieso ein Multi Purpose „Schweizer-Taschenmesser“ nicht immer die beste Lösung ist und wann man IT-Systeme am besten outsourcen sollte.

DBAOCM Podcast
EP162 - Oracle Exadata Database Machine Performance Features

DBAOCM Podcast

Play Episode Listen Later May 4, 2021 27:04


EP162 - Oracle Exadata Database Machine Performance Features   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP154 - Autoglass melhora performance em 50% com Reorg do Exadata

DBAOCM Podcast

Play Episode Listen Later Apr 23, 2021 48:38


EP154 - Autoglass melhora performance em 50% com Reorg do Exadata   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP153 - Posso atualizar a versão do AHF no Exadata ?

DBAOCM Podcast

Play Episode Listen Later Apr 21, 2021 11:42


EP153 - Posso atualizar a versão do AHF no Exadata ?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP127 - Como identificar melhorias em Exadata?

DBAOCM Podcast

Play Episode Listen Later Mar 17, 2021 9:20


EP127 - Como identificar melhorias em Exadata?   Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle:   https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
Mentoria #38 | Você foi nos ajudar com a implementação do Exadata X8

DBAOCM Podcast

Play Episode Listen Later Feb 1, 2021 31:00


Mentoria #38 | Você foi nos ajudar com a implementação do Exadata X8 Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Hashmap on Tap
#55 Challenges When Migrating from Oracle to Snowflake with Preetpal Singh

Hashmap on Tap

Play Episode Listen Later Jan 5, 2021 48:11


You asked, we answered. Hashmap hosts Kelly Kohlleffel and Preetpal Singh dive into how to migrate Oracle databases to Snowflake, whether it’s Exadata, RAC, or just good old Oracle RDBMS. Show Notes: Oracle Exadata to Snowflake with Hashmap Data Migrations with Hashmap Previous podcast episode on this topic: #46 Oracle Exadata to Snowflake Migrations On tap for today’s episode: Nespresso Vertuo and Italian Espresso Contact Us: https://www.hashmapinc.com/reach-out

GoldenTalks
GoldenTalks Live - Engineered Systems (ou Supercomputadores?)

GoldenTalks

Play Episode Listen Later Dec 16, 2020 162:37


Você sabe a diferença de um Engineered System e um Supercomputador? Nesta Live Técnica temos especialistas sobre Exadata, ODA e Zero Data Loss. Convidados: Franky Weber Fernando Simon Mario Barduchi Mediador: Gilson Martins #repliqueconhecimento #oracle #goldengatebr #goldengate #database #dba #bancodedados #ti #entrevista #carreira #sucesso #informatica #bigdata #oracleworld #oraclebrasil #estudesempre #tecnologia #goldentalks #live #exadata #oda #zdlra #engineeredsystems

Digital Impact Radio
Digital Impact Radio - Jason Reinhardt talks Exadata Platform Choices (Ser5/E22)

Digital Impact Radio

Play Episode Listen Later Dec 1, 2020 17:59


Jason Reinhardt, Database and Cloud Specialist talks about Oracle Exadata Platform as the best place to run Oracle Database, simplifying digital transformations, increasing database performance, and reducing costs. Jason explains how the Oracle Exadata Platform can be experienced via different deployment and consumption models, from On-Premise, At-Customer, Cloud Service to Autonomous.

DBAOCM Podcast
EP53 - Oracle X TSE: O atraso no resultado das eleições foi culpa do Exadata?

DBAOCM Podcast

Play Episode Listen Later Nov 24, 2020 39:48


EP53 - Oracle X TSE: O atraso no resultado das eleições foi culpa do Exadata? Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP46 - Preciso fazer resize do /u01 no Exadata, e agora?

DBAOCM Podcast

Play Episode Listen Later Nov 7, 2020 14:52


EP46 - Preciso fazer resize do /u01 no Exadata, e agora? Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
EP45 - Preciso fazer resize do /u01 no Exadata, e agora?

DBAOCM Podcast

Play Episode Listen Later Nov 4, 2020 14:52


EP45 - Preciso fazer resize do /u01 no Exadata, e agora? Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Hashmap on Tap
#46 Oracle Exadata to Snowflake Migrations

Hashmap on Tap

Play Episode Listen Later Oct 20, 2020 46:58


Hashmap hosts, Kelly Kohlleffel, Randy Pitcher, and Preetpal Singh discuss the nuts and bolts of the wave of Oracle Exadata cloud migrations taking place today. They share different perspectives, approaches, and risks involved in an Exadata to Snowflake migration. Show Notes: Exadata to Snowflake Migration with Hashmap On tap for today’s episode: Pique Jasmine Green Tea with Savannah Bee Company Tupelo Honey, Sant Eustachio Nespresso, and Coconut Macaroon loose leaf tea. Contact Us: https://www.hashmapinc.com/reach-out

DBAOCM Podcast
EP29 - É possível utilizar um Storage Externo no Exadata?

DBAOCM Podcast

Play Episode Listen Later Sep 26, 2020 57:20


EP29 - É possível utilizar um Storage Externo no Exadata? Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

DBAOCM Podcast
Descubra as novidades do Exadata X8M

DBAOCM Podcast

Play Episode Listen Later Sep 19, 2020 3:32


Descubra as novidades do Exadata X8M Entre no nosso canal do Telegram para receber conteúdos Exclusivos sobre Banco de dados Oracle: https://t.me/joinchat/AAAAAEb7ufK-90djaVuR4Q

Google Cloud Platform Podcast
Bare Metal Solution with James Harding and Gurmeet Goindi

Google Cloud Platform Podcast

Play Episode Listen Later Aug 26, 2020 42:16


Mark and Brian Dorsey are together again this week as we learn all about Google’s Bare Metal Solution with our guests James Harding and Gurmeet “GG” Goindi. To start the show, GG introduces us to Bare Metal Solution, explaining that it allows client projects built on specialized, often outdated software to take advantage of the benefits of a cloud environment. Using Bare Metal Solution, clients can choose to migrate all or part of their projects for a fully customized experience. We learn how Bare Metal Solution is able to support a partial or full native solution for clients and go through the steps to getting a project from completely on-prem to the cloud where latency is decreased, security is increased, and other cloud benefits can be leveraged. GG gives examples of situations where Bare Metal is a great option for clients, for instance an established company with an early 90s database that recently branched out into apps built in cloud native software. James outlines the benefits of Bare Metal Solution over other options, including real world examples of industries that have been able to modernize their offerings and adapt with the Bare Metal. GG and James wrap up the show explaining why the open source aspect of Bare Metal is so important to the evolution and flexibility of the product, and we talk about the recent developments at Bare Metal. James Harding James Harding leads the Data Management Practice for North America, with responsibility for the go-to-market strategy for all products and services data mangement. He also oversees marketing campaigns and sales field enablement. Gurmeet “GG” Goindi Gurmeet Goindi (GG) is a product manager at Google, where he focuses on databases and attends meetings. Prior to joining Google, GG led product management for Exadata at Oracle, where he also worked on databases and attended meetings. GG has had various product management, management, and engineering roles for the last 20 years in Silicon Valley, but his favorite meetings have been at Google. He holds an MBA from the University of Chicago Booth School of Business. Cool things of the week Google Cloud Next Week 7: Application Modernization site Brian’s Cloud Next Presentation: Where Should I Run My Stuff? Choosing Compute Options site Mark’s Cloud Next Presentation: What’s New in Google Cloud Cost Management site Announcing the general availability of Google Cloud Game Servers blog Interview Bare Metal Solution site Bare Metal Solution Next Presentation site Bare Metal Solution on GitHub site Oracle site Oracle Rack Cabinets site Stack Chat Segment of the Week Max talks to Deloitte about how they built their system to help groups collect and respond to COVID-19 data on our Stack Chat Segment this week! What’s something cool you’re working on? NCAA bracket predictions on QwikLabs Here’s a hint for next week’s episode! GKE Turns 5: What’s New?

GoldenTalks
GoldenTalks - Episódio 17: LIVE com Fernando Simon

GoldenTalks

Play Episode Listen Later Aug 13, 2020 129:37


No GoldenTalks de hoje temos Fernando Simon. Ele é Oracle ACE, Co-Fundador do LuxOUG, Especialista em Exadata e ZDLRA. Se quiser pagar uma cerveja ou passar uma "buxa" para o Fernando Simon, pode entrar em contato através do: LinkedIn: https://www.linkedin.com/in/fernando-simon/ Twitter: https://twitter.com/FSimonDBA E-mail: fernando.simon.br@gmail.com Blog: http://www.fernandosimon.com/blog #oracle #goldengatebr #goldengate #database #dba #bancodedados #ti #talkshow #entrevista #carreira #sucesso #informatica #sucessoprofissional #bigdata #oracleworld #oraclebrasil #estudesempre #mundoti #bi #oracleemcasa #online #tecnologia #goldentalks #live #oracleace #oracleocm

DBA Genesis Audio Experience
Why CBO is better than RBO? | #dailyDBA 38

DBA Genesis Audio Experience

Play Episode Listen Later Mar 14, 2020 20:16


#dbaChallenge: In Oracle RAC, will one node query execution check execution plan with another node? - Questions Picked-up For This Episode: ============================ What recommendations do you have when it comes to indexes? How to write better queries so that if there is any change at application end, no changes must be done at db end? Why CBO is better than RBO? What is the impact of Exadata, Cloud and SAAS products on DBA jobs? How to define which column to create index on? Bonus Question: How to identify fake dba vs experienced dba? - #dailyDBA #dbaGenesis - Your comments encourage us to produce quality content, please take a second and say ‘Hi’ in the comments and let me and my team know what you thought of the video … p.s. It would mean the world to me if you hit the subscribe button ;) - Link to full course: https://dbagenesis.com/p/oracle-virtualbox-administration Link to all DBA courses: https://dbagenesis.com/courses Link to real-time projects: https://dbagenesis.com/p/projects Link to support articles: https://support.dbagenesis.com - DBA Genesis provides all you need to build and manage effective Oracle technology learning. We designed DBA Genesis as a simple to use yet powerful online Oracle learning system for students. Each of our courses is taught by an expert instructor, and every course is available with a challenging project to push you out of your comfort zone!! DBA Genesis is currently the fastest & the most engaging learning platforms for DBAs across the globe. Take your database administration skills to next level by enrolling into your first course. - Facebook: https://www.facebook.com/dbagenesis/ Instagram: https://www.instagram.com/dbagenesis/ Twitter: https://twitter.com/DbaGenesis Website: https://dbagenesis.com/ Contact us: support@dbagenesis.com - Start your DBA Journey Today !! Become an exclusive DBA Genesis member: https://www.youtube.com/channel/UCUHHRiLeH7sO46GJBRGweag/join

DBA Genesis Audio Experience
What is Oracle Exadata? | #dailyDBA 34

DBA Genesis Audio Experience

Play Episode Listen Later Mar 12, 2020 18:42


#dbaChallenge: Why Oracle RAC has only 3 SCAN listener? - Questions Picked-up For This Episode: ============================ I feel our production database is running slow, can I increase SGA by 5 GB? What is the difference between Oracle Database and Oracle Exadata? How to load data into another database from oracle database? How can I get username of the user that inserted a record in a table? Query running slow post upgrade to 18c? Bonus Question: What's your recommendation for setting up Oracle practice lab at home! - #dailyDBA #dbaGenesis - Your comments encourage us to produce quality content, please take a second and say ‘Hi’ in the comments and let me and my team know what you thought of the video … p.s. It would mean the world to me if you hit the subscribe button ;) - Link to full course: https://dbagenesis.com/p/oracle-virtualbox-administration Link to all DBA courses: https://dbagenesis.com/courses Link to real-time projects: https://dbagenesis.com/p/projects Link to support articles: https://support.dbagenesis.com - DBA Genesis provides all you need to build and manage effective Oracle technology learning. We designed DBA Genesis as a simple to use yet powerful online Oracle learning system for students. Each of our courses is taught by an expert instructor, and every course is available with a challenging project to push you out of your comfort zone!! DBA Genesis is currently the fastest & the most engaging learning platforms for DBAs across the globe. Take your database administration skills to next level by enrolling into your first course. - Facebook: https://www.facebook.com/dbagenesis/ Instagram: https://www.instagram.com/dbagenesis/ Twitter: https://twitter.com/DbaGenesis Website: https://dbagenesis.com/ Contact us: support@dbagenesis.com - Start your DBA Journey Today !! Become an exclusive DBA Genesis member: https://www.youtube.com/channel/UCUHHRiLeH7sO46GJBRGweag/join

The AI Eye: stock news & deal tracker
The #AI Eye: Equinix (NasdaqGS: EQIX) Leveraging Oracle (NYSE: ORCL) Exadata, Oracle Database to Power Data Center and Interconnection Platf

The AI Eye: stock news & deal tracker

Play Episode Listen Later Feb 11, 2020 5:53


The #AI Eye: Equinix (NasdaqGS: EQIX) Leveraging Oracle (NYSE: ORCL) Exadata, Oracle Database to Power Data Center and Interconnection Platform

Investorideas -Trading & News
The #AI Eye: Equinix (NasdaqGS: EQIX) Leveraging Oracle (NYSE: ORCL) Exadata, Oracle Database to Power Data Center and Interconnection Platf

Investorideas -Trading & News

Play Episode Listen Later Feb 11, 2020 5:53


The #AI Eye: Equinix (NasdaqGS: EQIX) Leveraging Oracle (NYSE: ORCL) Exadata, Oracle Database to Power Data Center and Interconnection Platform

DBAOCM Podcast
EP04 - Como funcionam as redes do Exadata

DBAOCM Podcast

Play Episode Listen Later Nov 26, 2019 33:48


Neste episódio eu explico sobre como funcionam as redes do Exadata, e o que você precisa saber para solicitar os IP's de uma nova implementação de um Exadata.

DBAOCM Podcast
EP03 - Como Consolidar seus Bancos de Dados no Exadata

DBAOCM Podcast

Play Episode Listen Later Nov 21, 2019 41:35


Neste episódio eu falo um pouco sobre as opções e formas de consolidação no Exadata, e até mesmo como evitar alguns problemas comuns.

Digital Impact Radio
Digital Impact Radio - Gavin Parish Talks Exadata X8M (Ser3/E6)

Digital Impact Radio

Play Episode Listen Later Nov 19, 2019 22:12


Gavin Parish talks about Oracle Exadata Database Machine X8M announced at Oracle OpenWorld by Larry Ellison. Building on the Exadata X8 state-of-the-art hardware and software, the Exadata X8M family adopts two new cutting-edge.

DBAOCM Podcast
EP01 - O que é o Oracle Exadata?

DBAOCM Podcast

Play Episode Listen Later Nov 18, 2019 34:16


Neste episódio eu explico o que é o Exadata e quais as vantagens de ter o seu banco de dados nesta Tecnologia.

Breaking Analysis with Dave Vellante
Spending Data Shows Cloud Disrupting the Analytic Database Market

Breaking Analysis with Dave Vellante

Play Episode Listen Later Oct 18, 2019 21:34


theCUBE hosts Dave Vellante (@dvellante) shares his analysis of the analytic database market. Spending intentions data from @ETRNews (Enterprise Technology Research) suggests that the cloud is disrupting the $20B enterprise data warehouse market. Specifically, Amazon (AWS), Microsoft, Google and upstart Snowflake are poised to continue to gain share. The large installed bases of Teradata, IBM and Oracle are under attack. The database leader Oracle is investing heavily in Exadata and cloud. IBM recently killed its Netezza brand must pivot while Teradata is re-trenching. The cloud continues to disrupt and the database market is red hot.Originally published on 9/6/2019

Digital Impact Radio
Digital Impact Radio - Martin Lambert talks Oracle Exadata Platform (Ser2/Ep8)

Digital Impact Radio

Play Episode Listen Later Apr 16, 2019 8:53


Martin Lambert, Engineered Systems Lead for Australia and New Zealand talks about the Oracle Exadata Platform, and the role it plays for Organisations, whether it be On-Premise in a Data Center of their Choice, or hosted/managed as a Cloud Service in the Oracle Public Cloud.

BIASed
How to Move to the Cloud and Save Money with Oracle Exadata

BIASed

Play Episode Listen Later Mar 21, 2018 28:49


Learn how to move to the cloud and save money with Oracle Exadata.

The PeopleSoft Administrator Podcast
#30 - Exadata w/ Karl Arao

The PeopleSoft Administrator Podcast

Play Episode Listen Later May 27, 2016 39:24


Karl Arao joins us this week to talk about Exadata. Karl gives a great introduction to Exadata and explains how it can benefit PeopleSoft applications. Show Notes Introduction to Exadata @ 2:00 Indexes @ 13:30 Migrating to an Exadata @15:30 Upgrading to 12c with Exadata @ 24:00 Value of Exadata @ 29:00 Exadata and PeopleSoft @ 34:00 Patching Exadata @ 37:00 Links Patching Exadata (888828.1) Follow Karl on Twitter Karl's Wiki and Blog

DatabaseCast
DatabaseCast 63: Engineered systems

DatabaseCast

Play Episode Listen Later Dec 17, 2015 75:14


Neste episódio do DatabaseCast Mauro Pichilian (@pichiliani), Wagner Crivelini (@wcrivelini) e o convidado Rodrigo Righetti fecham com a Oracle para discutir os Engineered Systems. Neste episódio, você vai aprender a diferença entre um banco de dados na nuvem, como on premisses ou appliance, como resolver o problema colocando mais hardware, se preparar para administrar o banco, o sistema operacional e o hardware e quebrar o cofrinho de moedas para comprar um Exadata. NOVIDADE: Acessem o canal do DatabaseCast no YouTube: https://www.youtube.com/channel/UC8EUZ3gYTxJi-gr4azFJGYA Confira o preço promocional da camiseta do DatabaseCast Fluxo Matrix com tecido especial (tipo tradicional econômico) http://www.zazzle.com.br/camiseta_fluxo_matrix_t_shirt-235338811658509024 Veja a caneca Datas SQL com a sintaxe para manipulação de datas no Oracle, SQL Server, Mysql e PostgreSQL. http://www.zazzle.com.br/caneca_datassql_branca_325ml-168900583784663517 Confirma o livro "Conversando sobre banco de dados" em: http://www.amazon.com.br/Conversando-sobre-Banco-Dados-publicados-ebook/dp/B00JV3B7VI/ e http://clubedeautores.com.br/book/126042--Conversando_sobre_banco_de_dados Confira as camisetas com estampas fractais do DatabaseCast: http://www.zazzle.com.br/databasecast Não deixe de nos incentivar digitando o seu comentário no final deste artigo, mandando e-mail para databasecast@gmail.com, seguindo o nosso twitter @databasecast, vendo informações de bastidores no nosso Tumblr e curtindo a nossa página no Facebook e no Google+.

DatabaseCast
DatabaseCast 63: Engineered systems

DatabaseCast

Play Episode Listen Later Dec 17, 2015 75:14


Neste episódio do DatabaseCast Mauro Pichilian (@pichiliani), Wagner Crivelini (@wcrivelini) e o convidado Rodrigo Righetti fecham com a Oracle para discutir os Engineered Systems. Neste episódio, você vai aprender a diferença entre um banco de dados na nuvem, como on premisses ou appliance, como resolver o problema colocando mais hardware, se preparar para administrar o banco, o sistema operacional e o hardware e quebrar o cofrinho de moedas para comprar um Exadata. NOVIDADE: Acessem o canal do DatabaseCast no YouTube: https://www.youtube.com/channel/UC8EUZ3gYTxJi-gr4azFJGYA Confira o preço promocional da camiseta do DatabaseCast Fluxo Matrix com tecido especial (tipo tradicional econômico) http://www.zazzle.com.br/camiseta_fluxo_matrix_t_shirt-235338811658509024 Veja a caneca Datas SQL com a sintaxe para manipulação de datas no Oracle, SQL Server, Mysql e PostgreSQL. http://www.zazzle.com.br/caneca_datassql_branca_325ml-168900583784663517 Confirma o livro "Conversando sobre banco de dados" em: http://www.amazon.com.br/Conversando-sobre-Banco-Dados-publicados-ebook/dp/B00JV3B7VI/ e http://clubedeautores.com.br/book/126042--Conversando_sobre_banco_de_dados Confira as camisetas com estampas fractais do DatabaseCast: http://www.zazzle.com.br/databasecast Não deixe de nos incentivar digitando o seu comentário no final deste artigo, mandando e-mail para databasecast@gmail.com, seguindo o nosso twitter @databasecast, vendo informações de bastidores no nosso Tumblr e curtindo a nossa página no Facebook e no Google+.

DatabaseCast
DatabaseCast 39: Passado, presente e futuro do Oracle

DatabaseCast

Play Episode Listen Later Nov 20, 2013 73:39


Neste episódio do DatabaseCast, o podcast brasileiro sobre banco de dados, Mauro Pichiliani (@pichiliani) e Wagner Crivelini (@wcrivelini) estudam o passado, presente e futuro do Oracle com o convidado Ricardo Portilho Proni (@rportilhoproni). Neste episódio você vai saber quem poderia vestir a armadura do Homem de Ferro, quais são as funcionalidades do Oracle 8i, 9i, 10g, 11g e 12c, como trabalhar de plantão no dia dos namorados, o que é que em na documentação do Oracle e descobrir como um manual (de papel!) pode fazer toda a diferença em uma consultoria.Não deixe de nos incentivar digitando o seu comentário no final deste artigo, mandando e-mail para databasecast@gmail.com, seguindo o nosso twitter @databasecast , vendo informações de bastidores no nosso Tumblr e curtindo a nossa página no Facebook. O DatabaseCast pode ser acompanhado no iMasters

DatabaseCast
DatabaseCast 39: Passado, presente e futuro do Oracle

DatabaseCast

Play Episode Listen Later Nov 20, 2013 73:39


Neste episódio do DatabaseCast, o podcast brasileiro sobre banco de dados, Mauro Pichiliani (@pichiliani) e Wagner Crivelini (@wcrivelini) estudam o passado, presente e futuro do Oracle com o convidado Ricardo Portilho Proni (@rportilhoproni). Neste episódio você vai saber quem poderia vestir a armadura do Homem de Ferro, quais são as funcionalidades do Oracle 8i, 9i, 10g, 11g e 12c, como trabalhar de plantão no dia dos namorados, o que é que em na documentação do Oracle e descobrir como um manual (de papel!) pode fazer toda a diferença em uma consultoria.Não deixe de nos incentivar digitando o seu comentário no final deste artigo, mandando e-mail para databasecast@gmail.com, seguindo o nosso twitter @databasecast , vendo informações de bastidores no nosso Tumblr e curtindo a nossa página no Facebook. O DatabaseCast pode ser acompanhado no iMasters

Intel Chip Chat
Data Warehousing and the Oracle Exadata System – Intel® Chip Chat episode 279

Intel Chip Chat

Play Episode Listen Later Oct 11, 2013 10:09


Wrapping up our Oracle OpenWorld podcasts, Dr. Marcus Praetzas, the director of the Financial Data Warehouse at Deutsche Bank AG, discusses the challenges of data warehousing including the volume of data, the frequency of queries and performance needs. He also touches on the role of cloud computing in future data warehousing.

IOUG Blog Central » Podcasts
IOUG Podcast 30-APR-2013 Database 12c Beta Revelations at Collaborate 13

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 30, 2013


For the week of April 30th, 2013: An Interview with Bobby Curtis and James Lui: Database 12c Revelations at Collaborate A Little Insight into Oracle’s Beta Program IOUG Podcast 30-APR-2013 Database 12c Beta Revelations at Collaborate 13 Subscribe to this … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 05-OCT-2012 12c DB Revealed / Dell Buys Quest / Generic Exadata Machines

IOUG Blog Central » Podcasts

Play Episode Listen Later Oct 5, 2012


For the week of October 5th, 2012: 12c Database – The Quick Tip Quest Becomes Dell Software Exadata Machines: The Executive Summary Version IOUG Podcast 05-OCT-2012 12c DB Revealed / Dell Buys Quest / Generic Exadata Machines Subscribe to this … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 15-SEP-2012 Itanium’s Back / Java 7 Cloudy Bugs / Fusion HCM Taleo Talent

IOUG Blog Central » Podcasts

Play Episode Listen Later Sep 16, 2012


For the week of September 15th, 2012: Oracle Support for HP Itanium Temporarily Extended More Critical Bugs Found in Java 7 Java EE 7 Not Going to Be Cloud-Ready for Near Future Taleo Brings Shiny Glam to Oracle Fusion HCM … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 31-AUG-2012 Express and 11GR1 Go Bye-Bye / New GoldenGate & PostgreSQL MMR

IOUG Blog Central » Podcasts

Play Episode Listen Later Sep 1, 2012


For the week of August 31st, 2012: Did You Miss Oracle Express’s Last Exit? Just In time for Oracle 11gR1’s Support Departure New Releases: GoldenGate 11gR2 & Postgres Plus xDB Replication Server with MMR IOUG Podcast 31-AUG-2012 Express and 11GR1 … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 24-AUG-2012 Rumors of MySQL’s Doom by Oracle / Design Piracy

IOUG Blog Central » Podcasts

Play Episode Listen Later Aug 26, 2012


For the week of August 24th, 2012: Everybody’s Preparing for OpenWorld Dispelling the Rumors of MySQL’s Impending Doom On Piracy of Design IOUG Podcast 24-AUG-2012 Rumors of MySQL’s Doom by Oracle / Design Piracy Subscribe to this Podcast (RSS) or … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 17-AUG-2012 DBAs and Developers as Leaders / OpenWorld Tech Sessions

IOUG Blog Central » Podcasts

Play Episode Listen Later Aug 18, 2012


For the week of August 17th, 2012: IOUG’s DBAs & Developers as Future Leaders of IT What’s IOUG Doing This Year at OpenWorld 2012 Coming Very Soon: The New IOUG.org IOUG Podcast 17-AUG-2012 DBAs and Developers as Leaders / OpenWorld … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 10-AUG-2012 The Big Data World of the Data Scientist & The DBA

IOUG Blog Central » Podcasts

Play Episode Listen Later Aug 9, 2012


For the week of August 10th, 2012: So, Just What is a Data Scientist Big Data at Work The DBA Evolution IOUG Podcast 10-AUG-2012 The Big Data World of the Data Scientist & The DBA Subscribe to this Podcast (RSS) … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 03-AUG-2012 OIM11gR2 / OBIEE 11.1.1.6.2 / NoSQL 2.0 Beta / OpenWorld Update

IOUG Blog Central » Podcasts

Play Episode Listen Later Aug 3, 2012


For the week of August 3rd, 2012: Oracle Identity Management 11g Release 2 Now Available Oracle Business Intelligence 11.1.1.6.2 New Features Including new BI Mobile iPad App Couchbase NoSQL Server 2.0 Beta Now with JSON Support Oracle Openworld IOUG Update … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 28-JUL-2012 London Olympics: Remembering the Birth of the Internet

IOUG Blog Central » Podcasts

Play Episode Listen Later Jul 28, 2012


For the week of July 28th, 2012: IOUG Recognizes the Work of Tim Berners-Lee WWW and the Internet All Started At This Place (in France) IOUG Podcast 28-JUL-2012 London Olympics: Remembering the Birth of the Internet Subscribe to this Podcast … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 21-JUL-2012 A Disasterous Day in the Life of a Cloud DBA / Cloud GPS

IOUG Blog Central » Podcasts

Play Episode Listen Later Jul 20, 2012


For the week of July 21st, 2012: How Many Ways Can a Cloud Die (besides evaporating)? Keep Yourself from Getting Lost in the Clouds IOUG Podcast 21-JUL-2012 A Disasterous Day in the Life of a Cloud DBA / Cloud GPS … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 14-JUL-2012 DBA Ch-ch-changes / You’ve Got Big Data!

IOUG Blog Central » Podcasts

Play Episode Listen Later Jul 14, 2012


For the week of July 14th, 2012: Changing Roles of the DBA Open Availability of Questionable Technical Content Big Data (even when you don’t think you have it) Overall Systems Security Evolving Your IOUG IOUG Podcast 14-JUL-2012 DBA Ch-ch-changes / … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 06-JUL-2012 OEM Ops Center 12cU1 / Google sends $4 million bill to Larry

IOUG Blog Central » Podcasts

Play Episode Listen Later Jul 7, 2012


For the week of July 6th, 2012: OEM Ops Center 12c Update 1 Now Available Google submits $4 million bill for costs in Oracle lawsuit IOUG Podcast 06-JUL-2012 OEM Ops Center 12cU1 / Google sends $4 million bill to Larry … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 29-JUN-2012 No Flashy Oracle Support / Protecting Big Data and Your IT Dept

IOUG Blog Central » Podcasts

Play Episode Listen Later Jun 28, 2012


For the week of June 29th, 2012: Away Goes Flash(y) My Oracle Support How Do You Avoid Losing Big Data and IT (the department)? IOUG Podcast 29-JUN-2012 No Flashy Oracle Support / Protecting Big Data and Your IT Dept Subscribe … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 22-JUN-2012 Oracle’s Dying Sun / NoSQL

IOUG Blog Central » Podcasts

Play Episode Listen Later Jun 23, 2012


For the week of June 22nd, 2012: Is Oracle’s Sun Dying? What is NoSQL? Subscribe to this Podcast (RSS) or iTunes IOUG Podcast 22-JUN-2012 Oracle’s Dying Sun / NoSQL According to a recent article published by The Register, a UK-based … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 16-JUN-2012 ExaPlatinum / TOAD / Oracle’s Birthday

IOUG Blog Central » Podcasts

Play Episode Listen Later Jun 16, 2012


For the week of June 16th, 2012: Exadata Customers Get Free Platinum Support from Oracle Quest TOAD New Release Happy Birthday, Oracle! IOUG Podcast 16-JUN-2012 ExaPlatinum / TOAD / Oracle’s Birthday Subscribe to this Podcast (RSS) or iTunes Oracle has … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 08-JUN-2012 Oracle Getting Social / Dr. DBAs

IOUG Blog Central » Podcasts

Play Episode Listen Later Jun 7, 2012


For the week of June 8th, 2012: Oracle Acquires Vitrue and Collective Intellect Tom Kyte on Dr. DBAs IOUG’s Free Online Technology Training IOUG Podcast 08-JUN-2012 Oracle Getting Social / Dr. DBAs   Subscribe to this Podcast (RSS) or iTunes … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 31-MAY-2012 Ellison Speaks

IOUG Blog Central » Podcasts

Play Episode Listen Later Jun 1, 2012


For the week of May 31st, 2012: Oracle versus IBM: The Showdown of Big Red and Blue SaaSy Cloud Applications: Oracle’s Henway 2nd Hand IOUG Podcast 31-MAY-2012 Ellison Speaks Subscribe to this Podcast (RSS) or iTunes Do you remember back … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 23-MAY-2012 Desktop VM / Real-Time Data / What is Big Data

IOUG Blog Central » Podcasts

Play Episode Listen Later May 24, 2012


For the week of May 23rd, 2012: Oracle Releases New Desktop Virtualization Portfolio Updates Real-Time Data Isn’t Reality (yet) What is Big Data? IOUG Podcast 23-MAY-2012 Subscribe to this Podcast (RSS) or in iTunes Oracle has introduced new desktop virtualization … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 18-MAY-2012 Facebook Technology and Oracle

IOUG Blog Central » Podcasts

Play Episode Listen Later May 19, 2012


For the week of May 18th, 2012: What’s the Technology Behind FaceBook IOUG Podcast 18-MAY-2012 Facebook Technology and Oracle Subscribe to this Podcast (RSS) or via iTunes With this week’s largest in history IPO leading the news. IOUG takes a … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 10-MAY-2012 Big Data Top Issues / Collaborate Honors IOUG Members

IOUG Blog Central » Podcasts

Play Episode Listen Later May 10, 2012


For the week of May 10th, 2012: New Research Examines the Top Issues in Managing Big and Unstructured Data IOUG Valued Members of Our Community Honored at Collaborate Updates from the IOUG Support Council Seminar Series: IOUG Security Survey: Enterprise … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 04-MAY-2012 Security via HCM / Assessing IT Applicants

IOUG Blog Central » Podcasts

Play Episode Listen Later May 4, 2012


For the week of May 4th, 2012: IT Security begins with Human Capital Management Learning to Assess the Latest Tech Candidate How Does Collaborate Help Secure your Organization IOUG Podcast 04-MAY-2012 Subscribe to this Podcast (RSS) or in iTunes In … Continue reading →

IOUG Blog Central » Podcasts
IOUG Extras Podcast 28-APR-2012 (Collaborate Foodie Edition)

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 28, 2012


For April 28th, 2012: IOUG Podcast Extras: Consuming Collaborate Conferences IOUG Extras Podcast 28-APR-2012 (Collaborate Foodie Edition) Subscribe to Podcast (RSS) or in iTunes So we’ve rounded the end of a week of technology education at Collaborate 12, and you’re … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 26-APR-2012 Toad 11.5 / EMC DDBoost

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 26, 2012


For the week of April 26th, 2012: Quest Software Releases Toad for Oracle 11.5 EMC Announces Integration of EMC DD Boost and Oracle RMAN How You Can Contribute to this IOUG Podcast IOUG Podcast 26-APR-2012 Subscribe to Podcasts (RSS) or … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 24-APR-2012 (Collaborate Edition) Oracle Social / WC Sites / EBS R12.2

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 24, 2012


On the floor at the Social Media meet at the IOUG Booth in the Collaborate 12 Exhibit Hall on Tuesday, April 24th, 2012: Looks like Oracle is Going Social Prepare for the WebCenter Sites Tidal Wave e-Business Release 12.2 Introduces … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 23-APR-2012 (Collaborate Edition) Capt. Mark Kelly – Keynote & Courage

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 23, 2012


For Collaborate 12 on Monday, April 23rd, 2012, Our keynote today was delivered by Captain Mark Kelly, Commander of the Space Shuttle Endeavour’s Final Mission and along with his wife Congresswoman Gabrielle Giffords, co-author of the new book, Gabby: A … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 18-APR-2012 MySQL Confs / MySQL 5.6 Released / Real World Perf Tour

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 18, 2012


For the week of April 18th, 2012: Oracle Announces It’s First Sponsored MySQL Conference A New Development Milestone Release of MySQL 5.6 is now available IOUG’s Real World Performance Tour Returns to California this Month IOUG’s Plug-In to Vegas! adds … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 11-APR-2012 Bus Analytics Roadmap / Exalytics / World Record Performance

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 11, 2012


For the week of April 11th, 2012: OpenWorld Tokyo 2012: Oracle Presents Business Analytics Strategy Oracle’s Business Analytics Portfolio is Enhanced with New Certified on Exalytics BI Applications New World Record x86 Performances for Oracle on Middleware and Transactional Database … Continue reading →

IOUG Blog Central » Podcasts
IOUG Podcast 04-APR-2012 Exalytics In-Memory Machine Launch / Engineered Systems / Big Data Trends

IOUG Blog Central » Podcasts

Play Episode Listen Later Apr 6, 2012


For April 4th, 2012 up this week: The Exalytics In-Memory Machine Launched by Oracle Oracle Says The Bookings Pipeline for Engineered Systems is Strong Trending Now: Big Data – A Challenge or Opportunity? And Upcoming Collaborate 2012 Activities IOUG Podcast … Continue reading →