At the end of 2020, Oracle released 21c on its Cloud infrastructure. The on-premises version will follow later this year. As with every new Oracle version, the Data Pump utility gains new features and enhancements to existing ones.
This presentation gives an overview of the enhancements to Data Pump and Transportable Tablespaces. The following list is an excerpt of the points I will talk about:
- Simultaneous use of EXCLUDE and INCLUDE
- Parallelized import of metadata during a TTS import operation
- Checksum support for dump files
- Direct access to Oracle Cloud Object Store for exports and imports
6. Oracle 21c
• Available since 8th of December 2020 as Cloud-first release
• Autonomous Database Service (including the Always Free tier, though not in all regions)
• Database Service
• Virtual Machine (RAC, Single Instance)
• Bare Metal (Single Instance)
• Innovation release with Premier Support until June 30, 2023 (no Extended Support)
• See MOS note Release Schedule of Current Database Releases (Doc ID 742060.1)
• Oracle 21c Live Labs Workshops
• Link: https://apexapps.oracle.com/pls/apex/f?p=133:100:109140367598541::::SEARCH:21c
• No fixed release date for the on-premises version
• Linux and Windows releases are planned for the first half of 2021
• Other platforms will be released later
Use your 30-day Oracle Cloud free trial to get hands-on experience with Oracle 21c (RAC).
7. Always-free Autonomous Database 1/2
• Oracle 21c is available in the regions Ashburn (IAD), Phoenix (PHX), Frankfurt (FRA) and London (LHR)
• Can only be created in the home region of the user
• Navigate to Oracle Database > Autonomous Databases
• Direct link: https://cloud.oracle.com/db/adb
• Current workload types supported for always-free
10. Simultaneous use of EXCLUDE and INCLUDE
• Before 21c, the parameters EXCLUDE and INCLUDE were mutually exclusive
• This restriction is lifted with 21c
• When both parameters are used, the INCLUDE parameter is evaluated first
UDE-00011: parameter include is incompatible with parameter exclude
$> vi expdp_tables.par
...
SCHEMAS=HR,SH
INCLUDE=TABLE
EXCLUDE=TABLE:"IN ('COSTS', 'SALES')"
$> vi impdp_tables.par
...
SCHEMAS=HR
INCLUDE=TABLE
EXCLUDE=TABLE:"IN ('JOB_HISTORY')"
EXCLUDE=INDEX,STATISTICS
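As a sketch of how the combined filters behave, running the export parameter file above would export all tables of HR and SH except COSTS and SALES, because the INCLUDE filter is applied first and EXCLUDE then removes matches from that result set (the invocation details are illustrative):

```shell
# Illustrative only: expdp prompts for the password; DIRECTORY and
# DUMPFILE are assumed to be set in the elided part of the parfile.
expdp system parfile=expdp_tables.par
```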
11. Checksum Support 1/3
• Oracle 21c supports the generation of checksums for the dump files using CHECKSUM parameter
• Can be used to confirm validity after transfer (to/from object storage, on-premises copy)
• Data Pump writes control information into the header block of each dump file
• In 21c, this is extended by checksums for the remaining blocks
• Use CHECKSUM_ALGORITHM to choose checksum algorithm
• CRC32
• SHA256 (default)
• SHA384
• SHA512
Generating checksums for dump file set
No new item codes in DBMS_DATAPUMP.GET_DUMPFILE_INFO for the checksum support.
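A minimal parameter file sketch enabling checksum generation (schema names and file names are illustrative):

```shell
# Illustrative expdp parameter file: CHECKSUM=YES writes a checksum
# for every block of each dump file; CHECKSUM_ALGORITHM overrides
# the default algorithm (SHA256).
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=expdp_hr_sh_%U.dmp
SCHEMAS=HR,SH
CHECKSUM=YES
CHECKSUM_ALGORITHM=SHA384
```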
12. Checksum Support 2/3
• Verify a dump file using impdp and VERIFY_ONLY parameter
• ORA-39411 is raised when an invalid checksum is detected
$> impdp ... VERIFY_ONLY=YES
…
Verifying dump file checksums
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
dump file set is complete
verified checksum for dump file "/backup/dumps/expdp_hr_sh.dmp"
dump file set is consistent
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully completed at Sun Apr 18 16:20:58 2021 elapsed 0 00:00:03
ORA-39001: invalid argument value
ORA-39000: bad dump file specification
ORA-39411: header checksum error in dump file "/backup/dumps/expdp_hr_sh.dmp"
13. Checksum Support 3/3
• Use parameter VERIFY_CHECKSUM during import to validate the dump files as the first step
• If turned off, a warning is written to the screen/log
$> impdp ... VERIFY_CHECKSUM=YES
…
Verifying dump file checksums
Master table "SYSTEM"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_FULL_01": system/********@C21SFE1PDB1 parfile=impdp_21c_demo_checksum.par
Processing object type TABLE_EXPORT/TABLE/TABLE
Warning: dump file checksum verification is disabled
No warning or error is raised when the dump file does not include checksums.
14. Object Storage Integration 1/4
• Data Pump can access the Object Store during export and import operations
• Parameter DUMPFILE now supports a Uniform Resource Identifier (URI)
• Credential for the target bucket is provided by the parameter CREDENTIAL
• Use the PL/SQL package DBMS_CLOUD to create the required credential
• MOS note How To Setup And Use DBMS_CLOUD Package (Doc ID 2748362.1)
• Currently only working in Autonomous Database (even with 19c)
Only the CDB architecture is supported by DBMS_CLOUD.
DBMS_CLOUD is pre-installed, configured and maintained in Oracle Autonomous Database
Manual installation steps
DBMS_CLOUD can be manually installed and enabled for Oracle Database 19c, beginning with release 19.9.
Currently, Oracle Database 21c is not supported.
DUMPFILE=https://objectstorage.<Region>.oraclecloud.com/n/<Namespace>/b/<Bucket>/o/mydump_%u.dmp
15. Object Storage Integration 2/4
• Create credential and test access to the target bucket
SQL> BEGIN
DBMS_CLOUD.CREATE_CREDENTIAL(
credential_name => 'CLOUD_ACCESS',
username => 'christian.gohmann@trivadis.com',
password => '[Au_L8F0DRb_}lXmcM-G'
);
END;
/
SQL> SELECT COUNT(*) AS "OBJECT_COUNT"
FROM DBMS_CLOUD.LIST_OBJECTS('CLOUD_ACCESS','https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/kwq12gukapmy/b/datapump-dumps/o/');
OBJECT_COUNT
------------
1
Error ORA-20404 is raised when the URI is invalid or the user is not authorized to access the bucket.
Auth token
16. Object Storage Integration 3/4
• Perform export/import using command-line tools
$> vi expdp.par
USERID=system/manager
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/cre01gulanmy/b/datapump-dumps/o/test_%U.dmp
CREDENTIAL=CLOUD_ACCESS
SCHEMAS=SCOTT
$> expdp parfile=expdp.par
…
Master table "ADMIN"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for ADMIN.SYS_EXPORT_SCHEMA_01 is:
https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/cre01gulanmy/datapump-dumps/test_01.dmp
Job "ADMIN"."SYS_EXPORT_SCHEMA_01" successfully completed at Wed Apr 7 19:14:01 2021 elapsed 0 00:02:10
A directory object (here DATA_PUMP_DIR) is still required for the log file.
17. Object Storage Integration 4/4
• Data Pump splits the dump file(s) into 10 MiB chunks to increase the upload speed
• Use a Swift compatible tool like curl to download the dump file
• Parameter FILESIZE is supported, but each of the dump files is still split into chunks of 10 MiB
$> curl -O -X GET -u 'christian.gohmann@trivadis.com:[Au_L8F0DRb_}lXmcM-G' https://swiftobjectstorage.eu-frankfurt-1.oraclecloud.com/v1/cre01gulanmy/datapump-dumps/test_01.dmp
Auth token
18. TRANSFORM Parameter Enhancements
• New transformation INDEX_COMPRESSION_CLAUSE for Data Pump Import
• Allows adding, changing, or removing index key compression during import
• All indexes of the Data Pump import job are affected
• Supported values:
• NONE (Tablespace default will be used)
• COMPRESS n (Prefix Compression)
• COMPRESS ADVANCED LOW (Advanced Index Compression, ACO required)
• COMPRESS ADVANCED HIGH (Advanced Index Compression, ACO required)
$> impdp ... TRANSFORM = INDEX_COMPRESSION_CLAUSE:"<Clause>"
The transformation TABLE_COMPRESSION_CLAUSE has existed since Oracle 12c Release 1 to do the same at table level.
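As an illustrative sketch, an import that rebuilds all indexes with prefix compression on the first key column could look like this (the parameter file name is an assumption):

```shell
# Illustrative only: apply prefix compression on the first key column
# to every index created by this import job. The double quotes around
# the clause must be escaped on the command line.
impdp system parfile=impdp_tables.par TRANSFORM=INDEX_COMPRESSION_CLAUSE:\"COMPRESS 1\"
```

Putting the TRANSFORM parameter into the parameter file avoids the shell escaping entirely.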
19. Miscellaneous
• Support for the native JSON data type
• Tables with JSON data types are automatically excluded when VERSION is set to <= 19
• No warning or error is recorded in the log file
• Query to check if tables with JSON data type columns exist
SQL*Loader was also enhanced in 21c to support the native JSON data type for conventional and direct path loads.
SQL> CREATE TABLE json_data (
ID NUMBER,
DATA JSON
);
SQL> SELECT DISTINCT owner, table_name FROM dba_tab_cols WHERE data_type = 'JSON';
OWNER TABLE_NAME
---------- --------------------
HR JSON_DATA
21. Parallelize Metadata Operations 1/2
• Before 21c, metadata was exported/imported by only one Data Pump worker
• In 21c, all defined Data Pump workers (PARALLEL parameter) export/import metadata
• Each worker processes one object type at a time
ORA-39002: invalid operation
ORA-39047: Jobs of type TRANSPORTABLE cannot use multiple execution streams.
20-APR-21 08:28:27.212: W-1 Startup on instance 1 took 1 seconds
20-APR-21 08:28:29.079: W-2 Startup on instance 1 took 1 seconds
...
20-APR-21 08:29:02.686: W-1 Processing object type TRANSPORTABLE_EXPORT/TABLE
20-APR-21 08:29:04.601: W-2 Processing object type TRANSPORTABLE_EXPORT/CONSTRAINT/CONSTRAINT
20-APR-21 08:29:04.671: W-2 Completed 50 CONSTRAINT objects in 0 seconds
20-APR-21 08:29:11.621: W-2 Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PROCACT_INSTANCE
20-APR-21 08:29:11.650: W-2 Completed 15 PROCACT_INSTANCE objects in 0 seconds
20-APR-21 08:29:12.304: W-2 Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PROCDEPOBJ
20-APR-21 08:29:12.334: W-2 Completed 10 PROCDEPOBJ objects in 0 seconds
20-APR-21 08:29:47.203: W-1 Completed 57 TABLE objects in 0 seconds
23. Resume Jobs
• Starting with 21c, a failed Transportable Tablespace job can be restarted near the point of the failure
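A sketch of how such a restart could look using the standard Data Pump attach workflow (the job name below is hypothetical; query DBA_DATAPUMP_JOBS for the actual name):

```shell
# Illustrative only: attach to the failed job and resume it from
# the interactive prompt.
impdp system ATTACH=SYS_IMPORT_TRANSPORTABLE_01
# At the Import> prompt:
#   Import> START_JOB
#   Import> CONTINUE_CLIENT
```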
24. Further Information
Oracle Database 21c – What’s new
https://docs.oracle.com/en/database/oracle/oracle-database/21/whats-new.html
Oracle Database 21c – Utilities
https://docs.oracle.com/en/database/oracle/oracle-database/21/sutil/index.html
My Oracle Support
https://support.oracle.com