What’s new in Autonomous Database in 2023
Sandesh Rao
VP AIOps, Autonomous Database
@sandeshr
https://www.linkedin.com/in/raosandesh/
https://www.slideshare.net/SandeshRao4
July 2023
How to get started with the Autonomous
Database Free Tier
Always Free services enable developers and students to learn, build, and get hands-on
experience with Oracle Cloud for an unlimited time
Anyone can try, for an unlimited time, the full functionality of:
• Oracle Autonomous Database
• Oracle Cloud Infrastructure including:
• Compute VMs
• Block and Object Storage
• Load Balancer
Free tier
Free tier – Tech spec
2 Autonomous Databases (Autonomous Data Warehouse or Autonomous Transaction
Processing), each with 1 OCPU and 20 GB storage
2 Compute VMs, each with 1/8 OCPU and 1 GB memory
2 Block Volumes, 100 GB total, with up to 5 free backups
10 GB Object Storage, 10 GB Archive Storage, and 50,000/month API requests
1 Load Balancer, 10 Mbps bandwidth
10 TB/month Outbound Data Transfer
500 million ingestion datapoints and 1 billion datapoints for the Monitoring service
1 million Notification delivery options per month and 1,000 emails per month
Getting Started workshops for Autonomous Database
https://bit.ly/get-started-with-adb
How to Get Started with Machine Learning
Database Developer to Data Scientist Journey
Oracle Machine Learning Notebooks
Collaborative UI
• Based on Apache Zeppelin
• Supports data scientists, data analysts,
application developers, and DBAs with
SQL and Python
• Easy notebook sharing
• Scheduling, versioning, access control
Included with Autonomous Database
• Automatically provisioned and managed
• In-database algorithms and
analytics functions
• Explore and prepare, build and evaluate
models, score data, deploy solutions
Autonomous Database as a Data Science Platform
Copyright © 2021 Oracle and/or its affiliates.
Automate production and deployment of ML models
• Enhance data scientist productivity
and user experience
• Enable non-expert users to leverage ML
• Unify model deployment and monitoring
• Support model management
Features
• Minimal user input: data, target
• Model leaderboard
• Model deployment via REST
• Model monitoring
• Cognitive features for image and text
AutoML: a "code-free" user interface supporting automated end-to-end machine learning
Export Data as JSON to Object Storage
ADB now has a procedure to export the results of a query as JSON directly to an Object
Storage bucket.
The query can be an advanced query that includes joins or subqueries.
Specify the format parameter with the compression option to compress the output files.
Use DBMS_CLOUD.DELETE_OBJECT to delete the files.
BEGIN
  DBMS_CLOUD.EXPORT_DATA(
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'bucketname/filename',
    query           => 'SELECT * FROM DEPT',
    format          => JSON_OBJECT('type' VALUE 'json'));
END;
/
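The cleanup step mentioned above can be sketched as follows; the credential name matches the export example, and the object URI is a placeholder for a generated export file, not a real path:

```sql
BEGIN
  DBMS_CLOUD.DELETE_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    -- placeholder URI for one of the generated export files
    object_uri      => 'https://objectstorage.../b/bucketname/o/filename_1.json');
END;
/
```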
Partitions with external tables in Cloud
• ADB provides transparent access
over data in Object Stores
• Easily join across data sets in the
“data lake” and in-database data
sets
• Leverage in-file metadata with Avro,
ORC and Parquet to simplify
creating tables
Autonomous Database – Accessing Data In Object Stores
[Diagram: Autonomous Database accessing object stores – Oracle Object Store, Amazon S3,
Azure Blob Store, Google Cloud Store, Wasabi Cloud Store]
Note: only DBMS_CLOUD syntax is supported.
Hybrid Partitioned Tables
BEGIN
  DBMS_CLOUD.CREATE_HYBRID_PART_TABLE(
    table_name      => 'HPT1',
    credential_name => 'OBJ_STORE_CRED',
    format          => json_object('delimiter' value ',',
                                   'recorddelimiter' value 'newline',
                                   'characterset' value 'us7ascii'),
    column_list     => 'col1 number, col2 number, col3 number',
    partitioning_clause => 'partition by range (col1)
      (partition p1 values less than (1000) external location
         (''https://swiftobjectstorage.us-ashburn-1 .../file_01.txt''),
       partition p2 values less than (2000) external location
         (''https://swiftobjectstorage.us-ashburn-1 .../file_02.txt''),
       partition p3 values less than (3000))');
END;
/
External tables with partitioning specified in source files
Partitioning is a well-established technique to improve the performance and manageability of database
systems by dividing large objects into smaller partitions; any large data warehouse takes advantage of it
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_PART_TABLE(
    table_name      => 'sales_new_api',
    credential_name => 'CRED_OCI',
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/my_namespace/b/moviestream_landing/o/sales_sample/*.parquet',
    format          => '{"type":"parquet", "schema":"first", "partition_columns":[{"name":"month","type":"varchar2(100)"}]}');
END;
/
We now derive the column structure for self-describing file formats (Avro, ORC, Parquet)
with partitioned external tables, just as with nonpartitioned external tables.
If new files are added or removed in the underlying Object Store, you just run the new sync procedure
like this:
BEGIN
DBMS_CLOUD.SYNC_EXTERNAL_PART_TABLE (table_name => 'sales_new_api');
END;
/
Automatic Partitioning
Automatic partitioning in ADB analyzes the application workload and automatically applies
partitioning to tables and their indexes to improve performance or to allow better
management of large tables.
Automatic partitioning chooses from the following partition methods:
• INTERVAL AUTOMATIC: best suited for ranges of partition key values
• LIST AUTOMATIC: applies to distinct partition key values
• HASH: partitioning on the partition key's hash values
Automatic partitioning performs the following operations:
• Identify candidate tables for automatic partitioning by analyzing the workload. By
default, automatic partitioning uses the workload information collected in the
Autonomous Database for analysis.
• Evaluate partition schemes based on workload analysis, quantifying and verifying the
performance benefits:
1. Candidate empty partition schemes with synthesized statistics are created
internally and analyzed for performance.
2. The candidate scheme with the highest estimated I/O reduction is chosen as the
optimal partitioning strategy and is implemented internally to test and verify
performance.
3. If the candidate partition scheme does not improve performance, automatic
partitioning is not implemented.
• Implement the optimal partitioning strategy, if configured to do so, for the tables
analyzed by the automatic partitioning procedures.
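The workflow above maps onto the DBMS_AUTO_PARTITION package. A minimal sketch follows; the table owner and name are placeholders, and the exact arguments of the recommendation call may differ by database version:

```sql
-- Allow automatic partitioning to both recommend and implement changes
BEGIN
  DBMS_AUTO_PARTITION.CONFIGURE('AUTO_PARTITION_MODE', 'IMPLEMENT');
END;
/

-- Ask for a recommendation for a specific table (returns a recommendation ID)
SELECT DBMS_AUTO_PARTITION.RECOMMEND_PARTITION_METHOD(
         table_owner => 'ADMIN',
         table_name  => 'SALES')
FROM dual;
```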
Set Patch Level When Creating a Clone and
Retrieve Patch Details
Set Patch Level When Creating A Clone
When you provision or clone an Autonomous Database instance, you can select the patch
level that applies to upcoming patches.
There are two patch level options: Regular and Early.
The Early patch level allows testing upcoming patches one week before they are applied as
part of the regular patching program.
The console shows the patch level setting in the section headed Maintenance.
View Autonomous Database maintenance event history to see details about past maintenance events
(requires ADMIN user)
View Patch Details
SELECT * FROM DBA_CLOUD_PATCH_INFO;
SELECT * FROM DBA_CLOUD_PATCH_INFO WHERE PATCH_VERSION = 'ADBS-21.7.1.2';
OCI Identity and Access Management (IAM)
Authentication
Integration With OCI Identity and Access Management (IAM)
Authentication
OCI Identity and Access Management users can now
authenticate and authorize to ADB-Serverless.
Better security, since user access to databases is
managed centrally instead of locally in every database
Reduces zombie database user accounts
User management moves DBA tasks to the IAM
administrator
SQL*Plus users can sign into Autonomous Database
using their IAM username and IAM database password
Users can also use IAM SSO tokens with the latest
JDBC-thin and OCI-C database clients to connect to
ADB-Serverless
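As context for how this feature is switched on: IAM integration is enabled at the database level by the ADMIN user. A minimal sketch, assuming the documented DBMS_CLOUD_ADMIN call:

```sql
-- Enable OCI IAM authentication and authorization for this database
BEGIN
  DBMS_CLOUD_ADMIN.ENABLE_EXTERNAL_AUTHENTICATION(
    type => 'OCI_IAM');
END;
/
```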
Identity and Access Management (IAM) authentication - additional
features
You can now leverage a single
identifier and password to access
all your databases in OCI
OCI application integration with
Autonomous Databases is
enhanced to support application
identities, database links, and
proxy authentication to simplify
application maintenance
Improves overall security through
accountability since the IAM user
information can be collected as
part of an audit record
Load Data Using DBMS_CLOUD
Load data using DBMS_CLOUD
• For data loading from files in the Cloud
• Store your object storage credentials
• Use the procedure DBMS_CLOUD.COPY_DATA to load
data
• The source file in this example is channels.txt
[Diagram: File-01, File-02, and File-03 in an Object Store bucket]
SET DEFINE OFF
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'adwc_user@example.com',
    password        => 'password');
END;
/
Load data using DBMS_CLOUD
CREATE TABLE channels (
  channel_id    CHAR(1),
  channel_desc  VARCHAR2(20),
  channel_class VARCHAR2(20));

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/channels.txt',
    format          => json_object('delimiter' value ','));
END;
/

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'CHANNELS',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp01.dmp,https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/exp02.dmp',
    format          => json_object('type' value 'datapump'));
END;
/
Load data using DBMS_CLOUD
BEGIN
  DBMS_CLOUD.COPY_COLLECTION(
    collection_name => 'fruit',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/fruit_bucket/o/myCollection.json',
    format          => JSON_OBJECT('recorddelimiter' value '''\n'''));
END;
/

BEGIN
  DBMS_CLOUD.COPY_COLLECTION(
    collection_name => 'fruit2',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/json/o/fruit_array.json',
    format          => '{"recorddelimiter" : "0x''01''", "unpackarrays" : TRUE}');
END;
/
Load data using DBMS_CLOUD
SELECT table_name, owner_name, type, status, start_time, update_time,
       logfile_table, badfile_table
FROM   user_load_operations
WHERE  type = 'COPY';

TABLE_NAME OWNER_NAME TYPE STATUS    START_TIME          UPDATE_TIME         LOGFILE_TABLE BADFILE_TABLE
-------------------------------------------------------------------------------------------------------
FRUIT      ADMIN      COPY COMPLETED 2020-04-23 22:27:37 2020-04-23 22:27:38 ""            ""
FRUIT      ADMIN      COPY FAILED    2020-04-23 22:28:36 2020-04-23 22:28:37 COPY$2_LOG    COPY$2_BAD
SELECT credential_name, username, comments FROM all_credentials;
CREDENTIAL_NAME USERNAME COMMENTS
---------------------------–----------------------------- --------------------
ADB_TOKEN user_name@example.com {"comments":"Created via
DBMS_CLOUD.create_credential"}
DEF_CRED_NAME user_name@example.com {"comments":"Created via
DBMS_CLOUD.create_credential"}
ADB now supports SQL access to tenancy
details
When you file a service request for Autonomous Database, you need to provide the tenancy details for
your instance. Tenancy details for the instance are available on the Oracle Cloud Infrastructure console.
However, if you are connected to the database, you can now obtain these details by querying
the CLOUD_IDENTITY column of the V$PDBS view. For example:
How to get the tenancy details for your instance

SELECT cloud_identity FROM v$pdbs;

...will generate something similar to the following:

{"DATABASE_NAME"    : "DBxxxxxxxxxxxx",
 "REGION"           : "us-phoenix-1",
 "TENANT_OCID"      : "OCID1.TENANCY.REGION1..ID1",
 "DATABASE_OCID"    : "OCID1.AUTONOMOUSDATABASE.OC1.SEA.ID2",
 "COMPARTMENT_OCID" : "ocid1.tenancy.region1..ID3"}
IP Address ACLs
UI Button To Quickly Add Your IP Address To ACLs
Makes it easier to set up ACLs where you need to add the
IP address of the client to your Network Access Control
List (ACL)
The button "Add My IP Address" adds your current IP
address to the ACL entry
Removes the need to manually look up your client IP via
a third-party website or app (e.g., whatsmyip.com)
Available in both the Create Autonomous Database flow
for new ADB-S instances and the Update Network Access
flow for existing ADB-S instances
Multicloud with OCI and Azure
Low-latency connectivity between Microsoft Azure and OCI
Deploys Oracle Database on OCI, and provides metrics on Azure
Combine the full Azure catalog of AI and application services with OCI’s most powerful
database services
No charges for Oracle Interconnect for Microsoft Azure ports or data ingress/egress over
the Interconnect
Normal billing for consumption of Oracle Database services, such as Autonomous
Database
Multicloud with OCI and Azure
Oracle Database Service for Microsoft Azure (ODSA)
Automatically configures
everything required to link the
two cloud environments
Federates Azure Active
Directory identities
Azure-like UI & API
experience for provisioning
and managing Oracle
database services on OCI
Sends metrics, logs, and
events for the OCI databases
to Azure tooling for unified
telemetry and monitoring
Collaborative support model
Direct connection between
cloud vendors
<2ms latency for traffic between
OCI and Microsoft Azure
Pricing is based solely on port
capacities for OCI FastConnect
and Azure ExpressRoute Local
Circuit
No charges for inbound or
outbound bandwidth consumed
Architecture
For each database product, ODSA supports the common administration and application access
capabilities:
• Create, read, update, delete, list (CRUDL)
• Clone database
• Database backup (automatic and manual)
• Database restore (restore to existing database for now)
• Generate Azure connection string
• Display database metrics
Oracle Cloud Infrastructure Integration
Azure tools integration
Delivers OCI database metrics,
events, and logs to tools such as
Azure Application Insights,
Azure Event Grid, and Azure Log
Analytics
Enables Azure users to view OCI
databases alongside the rest of
your Azure data, for unified
telemetry and monitoring
Also creates a custom
dashboard that provides Azure
developers with Oracle
database resource details, and
connection strings for their
applications
Custom dashboard
Displays graphs for each of
the standard Oracle
database metrics for the
resource
Gives developers and
administrators a quick view
of all metrics in one place
Real Application Testing - Database Replay
Using Oracle Real Application Testing
Capture a Workload on an Autonomous Database Instance
Testing the effects of changes on existing workloads using simulated
data sets often does not accurately represent production workloads
BEGIN
  DBMS_CLOUD_ADMIN.START_WORKLOAD_CAPTURE(
    capture_name => 'CAP_TEST1',
    duration     => 60);
END;
/
Replay a workload
Replay a workload on a refreshable clone
BEGIN
DBMS_CLOUD_ADMIN.REPLAY_WORKLOAD(
capture_name => 'CAP_TEST1');
END;
/
Log in as the ADMIN user or have the EXECUTE privilege on DBMS_CLOUD_ADMIN
Replay a workload on a full clone
BEGIN
  DBMS_CLOUD_ADMIN.REPLAY_WORKLOAD(
    capture_name                => 'CAP_TEST1',
    capture_source_tenancy_ocid => 'OCID1.TENANCY.REGION1..ID1',
    capture_source_db_name      => 'ADWFINANCE');
END;
/
Oracle Machine Learning for R on
Autonomous Database
Sample of common enterprise machine learning pain points
“It takes too long to get my data or to get the ‘right’ data”
“I can’t analyze or mine all of my data – it has to be sampled”
“Putting open source models and results into production takes too long
and is ad hoc and complex”
“Our company is concerned about data security, backup and recovery”
“We need to build and score with 100s or 1000s of models fast
to meet business objectives”
Traditional analytics and data source interaction
• Access latency
• Paradigm shift: R → data access language → R
• Memory limitation – data size, in-memory processing
• Single threaded
• Issues for backup, recovery, security
• Ad hoc production deployment

[Diagram: R reads and writes flat-file extracts/exports from the data source via data
source connectivity packages and built-in tool capabilities]
Oracle Machine Learning provides tools for data science project success
OML Component availability across Oracle Autonomous Database (19c, 21c), Oracle Database
(19c, 21c), Oracle DBCS, and Oracle Exadata CS/CI/C@C:

• OML4SQL API – build ML models and score data with no data movement: ADB (ADB-S, ADB-D,
ADB C@C), Oracle Database, Oracle DBCS, Oracle Exadata CS/CI/C@C
• OML4Py API – leverage the database as a high-performance compute engine from Python
with in-database ML: ADB (ADB-S), Oracle Database, Oracle DBCS, Oracle Exadata CS/CI/C@C
• OML4R API – leverage the database as a high-performance compute engine from R with
in-database ML: ADB (ADB-S), Oracle Database, Oracle DBCS, Oracle Exadata CS/CI/C@C
• OML Notebooks – SQL, PL/SQL, Python, R, and markdown interpreters: ADB (ADB-S) only
• OML AutoML UI – no-code automated modeling interface: ADB (ADB-S) only
• OML Services – RESTful model management and deployment: ADB (ADB-S) only
• Oracle Data Miner – SQL Developer extension with a drag-and-drop interface for creating
ML methodologies: ADB (ADB-S, ADB-D, ADB C@C), Oracle Database, Oracle DBCS, Oracle
Exadata CS/CI/C@C
Oracle Machine Learning for R
• Leverage the database as an HPC environment
• Use in-database parallelized and distributed ML algorithms from a native R interface
• Store and manage user-defined functions and R objects in the database
• Integrate results into applications and dashboards via SQL or REST
• Eliminate the need to explicitly provision R engines for solution deployment
Empower data scientists with R
[Diagram: OML4R standalone client and OML Notebooks (R, Python, SQL, PL/SQL, markdown
interpreters; REST on the roadmap) connecting via R and SQL to Autonomous Database and
Oracle Database (on premises, DBCS, ExaCC/CS)]
Oracle Machine Learning for R
Transparency layer
• Leverage proxy objects so data remains in database
• Overload native functions translating functionality to SQL
• Use familiar R syntax on database data
Parallel, distributed in-database algorithms
• Scalability and performance
• Exposes in-database algorithms available from OML4SQL
Embedded execution
• Manage and invoke user-defined R functions
• Data-parallel, task-parallel, and non-parallel execution
• Use open source packages to augment functionality
Empower data scientists with R
OML4R 2.0 Algorithms on ADB
Machine Learning in-database algorithms
Supports automatic data preparation, partitioned model ensembles, integrated text mining

Classification
• Decision Tree
• Logistic Regression
• Naïve Bayes
• Neural Network
• Support Vector Machine
• Random Forest
• XGBoost (21c)

Regression
• Generalized Linear Model
• Neural Network
• Support Vector Machine
• XGBoost (21c)

Attribute Importance
• Minimum Description Length
• Random Forest

Clustering
• Hierarchical k-Means
• Orthogonal Partitioning
• Expectation Maximization
• Gaussian Mixture Models via EM

Feature Extraction
• Non-negative Matrix Factorization
• Principal Component Analysis
• Singular Value Decomposition
• Explicit Semantic Analysis

Market Basket Analysis
• Apriori – Association Rules

Anomaly Detection
• One-Class Support Vector Machine

Time Series
• Single Exponential Smoothing
• Double Exponential Smoothing
• Triple Exponential Smoothing
Conda environment creation via OML Notebooks on ADB
Conda environment usage via OML Notebooks on ADB
Example using Support Vector Machine for anomaly detection
Scalable in-database algorithms (OML4R)

# obtain proxy object
ore.sync(table = "ONTIME_S")

# build anomaly detection model
SVM.MOD <- ore.odmSVM(~ ., ONTIME_S, "anomaly.detection", outlier.rate = 0.1)

# view model object
SVM.MOD
Autonomous Database Plugin for
Microsoft Excel
Using the Excel Add-in to Query Autonomous Database
The Excel Add-in allows you to query data in the Autonomous Database directly from Excel.
Run native SQL queries, and use a wizard to query Analytic Views created by the Data
Analysis tool.
Download the Add-in
1. Log in to the web UI for the DB Actions page.
2. On the right-side navigation menu, click the link "Download Add-in for Excel."
Installation on macOS
3. Unzip the downloaded zip file.
4. Open a terminal window and navigate to the unzipped folder.
5. Ensure that Excel is not running.
6. The install.sh file may not have execute permissions; grant them and run it:
chmod 764 install.sh
./install.sh
7. Launch Excel
8. On the Insert tab on the ribbon, click the down arrow on the Add-ins / My Add-ins
option:
9. Under Developer Add-ins, you will see the Oracle Autonomous Database Add-in.
10. Click to select this Add-in
At the bottom, you see a notification that the Add-in is being loaded.
After the Add-in is loaded successfully, you will see a confirmation message.
Also, a new ribbon item, "Autonomous Database," appears.
11. Close and quit Excel.
12. Launch Excel and insert the Add-in again. (You must perform steps 6 through 9
above every time you launch Excel.)
You are now ready to connect to the Autonomous Database, run native SQL, and use the
Analytic View Query wizard.
Connecting to Autonomous Database
You are now connected to the Autonomous Database and ready to run native SQL and use
the Analytic View Query wizard.
On the Autonomous Database ribbon tab, click the About button.
This provides information about the Add-in and Autonomous Database version, which is
helpful while working with support on any problems you face with the Add-in.
Run Native SQL for analysis using Excel Pivot tables
Launch the Native SQL panel by clicking the button in the Autonomous Database ribbon.

Example SQL query:

select a.continent, a.country, b.form_factor, b.device, c.month, d.day,
       e.genre, e.customer_segment, e.sales, e.purchases
from   countries a, devices b, months c, days d, movie_sales_2020 e
where  e.country = a.country and
       e.day = d.day and
       e.month = c.month and
       e.device = b.device
order by c.month_num;

Add the query above in the text box under the "Write a query" label on the right-side panel.
Check the Pivot table checkbox.
Under Select worksheet, click the "+" icon and provide MovieSales as the name.
Click the check button, then Execute.
Two new tabs are created, viz.
MovieSales and Sheet2 (Sheet
number might vary in your case)
On Sheet2, an Excel pivot table
is created with the data fetched
from the Autonomous Database.
Set up the Pivot table options as shown on the screen below:
The data for this pivot table is
fetched from the MovieSales
worksheet.
Now you can use the native
Excel capabilities to analyze
data.
Access Amazon Redshift, Snowflake and
Other Non-Oracle Databases from Your
Autonomous Database
Step-1: Make Sure Redshift is Configured to Allow Public Access
Under the "Actions" menu of your Redshift cluster in the AWS console, select "Modify
publicly accessible setting" to make sure your Redshift cluster is publicly accessible.
Navigate to the VPC security group that is assigned to your Redshift cluster and create an
inbound rule for port 5439 from the source IP or CIDR range of your choice.
Step-2: Create a Database Link to your Redshift Instance
Create a credential object with the credentials (username and password) of the target database:
BEGIN
DBMS_CLOUD.CREATE_CREDENTIAL(
credential_name => 'REDSHIFT_CRED',
username => 'awsadmin',
password => '************');
END;
/
PL/SQL procedure successfully completed.
Create a database link to your Redshift instance. This is nearly identical to any other
database link creation, except for the gateway_params parameter, for which we pass
'awsredshift' as the database type:
BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => 'REDSHIFT_LINK',
    hostname           => 'redshift-cluster-1.******.us-west-1.redshift.amazonaws.com',
    port               => '5439',
    service_name       => 'dev',
    credential_name    => 'REDSHIFT_CRED',
    gateway_params     => JSON_OBJECT('db_type' value 'awsredshift'),
    ssl_server_cert_dn => NULL);
END;
/

PL/SQL procedure successfully completed.
Step-3: Run a Query Over the Database Link
SELECT COUNT(*) FROM SALES@REDSHIFT_LINK;
COUNT(*)
-----------
172456
Building Lakehouse on ADB
The evolution of data management and analytics
[Diagram: the data warehouse and the data lake converge into the lakehouse]
Data Lakehouses on OCI
Open & flexible: analyze any database, any application, from anywhere

[Diagram: data sources (any database, any events/sensors, data stores) flow through data
movement (Data Integration, GoldenGate) and data definition & discovery (Data Catalog:
harvest and curate) into data targets – Object Storage and relational data in Autonomous
Database – alongside Big Data Service (managed open-source data warehouse), Data Flow,
and AI services for automation, prep, and prediction; consumed by Machine Learning & Data
Science, any BI tool, and any application, from any cloud]
Typical data lake organization
Create External table for external data
Understand data in your Lake
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'adb_user@example.com',
    password        => 'password');
END;
/

BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'sales_extended_ext',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/sales_extended.parquet',
    format          => '{"type":"parquet", "schema": "first"}');
END;
/
Evolve object storage files into meaningful business entities
OCI Data Catalog
Derive technical metadata automatically. Easily apply business context.
Discover and understand entities
1. Search for data sets using a range of criteria.
2. Understand their meaning and context.
Autonomous Database is both a Data Catalog source and consumer
Data Catalog integrates with Autonomous Database
Data Catalog is the source of truth for
Object Store metadata
• Harvest object storage to derive schemas
• Manage business glossary, terms and tags
• Discover data using powerful search
Use Autonomous Database to discover
and analyze data sets
• Managed schemas and tables defined
automatically
• No management required
• Use Oracle SQL to query both metadata
and object storage data
[Diagram: Data Catalog harvests Object Storage metadata (1), syncs it to Autonomous
Database(s) (2), which then query the object storage data (3)]
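The harvest/sync/query flow can also be driven from SQL. A hedged sketch using the DBMS_DCAT package follows; the region, catalog OCID, and sync scope below are placeholders, and the exact parameter shapes may differ by version:

```sql
-- Point the database at a Data Catalog instance (placeholder region/OCID)
BEGIN
  DBMS_DCAT.SET_DATA_CATALOG_CONN(
    region     => 'us-ashburn-1',
    catalog_id => 'ocid1.datacatalog.oc1..example');
END;
/

-- Sync harvested assets into database-managed external schemas and tables
BEGIN
  DBMS_DCAT.RUN_SYNC(synced_objects => '{"asset_list": ["*"]}');
END;
/
```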
ADB DCAT integration Live Lab
Check it out:
Database actions
Want to load/link data with UI?
Gain insights from the lakehouse
One query spans storage types
• Correlate information from data lake
and data warehouse
• Access from any SQL tool or application
• Preserve your investment in tools and
skill sets
• Safeguard sensitive data using Oracle
Database advanced security policies
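To make the "one query spans storage types" point concrete, here is a hypothetical join between an external table over object storage (such as the sales_extended_ext table created earlier) and an in-database table; the customers table and its columns are placeholder names, not from the source:

```sql
-- Join data-lake data (external table) with warehouse data (regular table)
SELECT c.cust_segment,
       SUM(s.amount_sold) AS total_sales
FROM   sales_extended_ext s          -- external table over object storage
       JOIN customers c              -- regular in-database table
         ON s.cust_id = c.cust_id
GROUP  BY c.cust_segment
ORDER  BY total_sales DESC;
```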
Integration with Slack
Send messages to a Slack channel:
BEGIN
DBMS_CLOUD_NOTIFICATION.SEND_MESSAGE(
provider => 'slack',
credential_name => 'SLACK_CRED',
message => 'Alert from Autonomous Database...',
params => json_object('channel' value 'C0....08'));
END;
/
Send notifications or query results to a Slack channel from ADB
Send output from a query to a Slack channel:
BEGIN
  DBMS_CLOUD_NOTIFICATION.SEND_DATA(
    provider        => 'slack',
    credential_name => 'SLACK_CRED',
    query           => 'SELECT username, account_status, expiry_date
                        FROM account_users
                        WHERE rownum < 5',
    params          => json_object('channel' value 'C0....08',
                                   'type' value 'csv'));
END;
/
Clone your Autonomous Database across
regions from a backup
In the console, choose More Actions > Create Clone.
Using Private Endpoints for Autonomous
Database with Shared Exadata Infrastructure
Creating an Autonomous Database with a Private Endpoint
Prerequisites:
A virtual cloud network (VCN) in the region
where you want to create the Autonomous
Database. In the VCN subnet, select Default
DHCP Options (this choice sets up an
internet resolver and a VCN resolver)
At least one subnet in the VCN
At least one network security group (NSG) in
the VCN
Creating an Autonomous Database with a Private Endpoint
After the database is provisioned, you can see the networking details on the Autonomous Database Details page
Scenario 1: Connecting from Your VCN
Connecting to an Autonomous Database with a Private Endpoint
Useful if you have an application that is running inside
Oracle Cloud Infrastructure, either on a virtual machine
(VM) in the same VCN that is configured with your
database or on a VM in a different VCN
The following network diagram shows an application
running in the same VCN as the database. The
Autonomous Data Warehouse (ADW) instance has a
private endpoint in VCN A and subnet A (CIDR
10.0.2.0/24).
The NSG associated with the Autonomous Data
Warehouse instance is NSG 1. The application that
connects to the Autonomous Data Warehouse instance is
running on a VM that is in subnet B (CIDR 10.0.1.0/24).
Define security rules in NSG 1 to control ingress and egress traffic
Allow ingress traffic from the source 10.0.1.0/24 (the CIDR for subnet B, where the
application runs) on destination port 1522, and egress traffic from the ADW instance to
the destination 10.0.1.0/24
Also create a security rule to allow traffic to and from the VM.
You can use a stateful security rule for the VM, defining a rule for egress to the
destination subnet (10.0.2.0/24)
After you configure the security rules, your application can connect to the Autonomous Data Warehouse
database by using the database wallet, just as you would usually connect
Scenario 2: Connecting from Your Data Center
Connecting to an Autonomous Database with a Private Endpoint
Connect the on-premises network to the VCN
with FastConnect and then set up a dynamic
routing gateway (DRG)
Add an entry in your on-premises
host’s /etc/hosts file with the database’s private
IP address and FQDN
You can find the private IP address on the
database details page and the FQDN
in tnsnames.ora inside your wallet.
Alternatively, you can set up hybrid DNS in Oracle
Cloud Infrastructure for DNS name resolution
Traffic is also allowed to and from the database by means of two stateless security rules for the data center
CIDR range (172.16.0.0/16).
Thank you
Any Questions?
Sandesh Rao
VP AIOps Autonomous Database
@sandeshr
https://www.linkedin.com/in/raosandesh/
https://www.slideshare.net/SandeshRao4
What's new in the world of the Autonomous Database in 2023

  • 1. What’s new in Autonomous Database in 2023 Sandesh Rao VP AIOps , Autonomous Database @sandeshr https://www.linkedin.com/in/raosandesh/ https://www.slideshare.net/SandeshRao4 July 2023
  • 2. How to get started with the Autonomous Database Free Tier
  • 3. Always Free services enable developers and students to learn, build and get hands-on experience with Oracle Cloud for unlimited time Anyone can try for an unlimited time the full functionality of: • Oracle Autonomous Database • Oracle Cloud Infrastructure including: • Compute VMs • Block and Object Storage • Load Balancer Free tier
  • 4. Free tier – Tech spec 2 Autonomous Databases (Autonomous Data Warehouse or Autonomous Transaction Processing), each with 1 OCPU and 20 GB storage 2 Compute VMs, each with 1/8 OCPU and 1 GB memory 2 Block Volumes, 100 GB total, with up to 5 free backups 10 GB Object Storage, 10 GB Archive Storage, and 50,000/month API requests 1 Load Balancer, 10 Mbps bandwidth 10 TB/month Outbound Data Transfer 500 million ingestion Datapoints and 1 billion Datapoints for Monitoring Service 1 million Notification delivery options per month and 1000 emails per month
  • 5. Getting Started workshops for Autonomous Database https://bit.ly/get-started-with-adb
  • 6. How to Get Started with Machine Learning
  • 7. Database Developer to Data Scientist Journey
  • 8. Oracle Machine Learning Notebooks Collaborative UI • Based on Apache Zeppelin • Supports data scientists, data analysts, application developers, and DBAs with SQL and Python • Easy notebook sharing • Scheduling, versioning, access control Included with Autonomous Database • Automatically provisioned and managed • In-database algorithms and analytics functions • Explore and prepare, build and evaluate models, score data, deploy solutions Autonomous Database as a Data Science Platform Copyright © 2021 Oracle and/or its affiliates.
  • 9. Automate production and deployment of ML models • Enhance Data Scientist productivity and user-experience • Enable non-expert users to leverage ML • Unify model deployment and monitoring • Support model management Features • Minimal user input: data, target • Model leaderboard • Model deployment via REST • Model monitoring • Cognitive features for image and text AutoML “Code-free” user interface supporting automated end-to-end machine learning Copyright © 2020 Oracle and/or its affiliates.
  • 10. Export Data as JSON to Object Storage
  • 11. ADB now has a procedure to export a query as JSON directly to Object Storage bucket. The query can be an advanced query - includes joins or subqueries. Specify format parameter with compression option to compress the output files. Use DBMS_CLOUD.DELETE_OBJECT to delete the files BEGIN DBMS_CLOUD.EXPORT_DATA( credential_name => 'DEF_CRED_NAME', file_uri_list => ‘bucketname/filename’, query => 'SELECT * FROM DEPT’, format => JSON_OBJECT('type' value 'json')); END; / Export Data As JSON To Object Storage OVERVIEW HOW IT WORKS
  • 12. Partitions with external tables in Cloud
  • 13. • ADB provides transparent access over data in Object Stores • Easily join across data sets in the “data lake” and in-database data sets • Leverage in-file metadata with Avro, ORC and Parquet to simplify creating tables Autonomous Database – Accessing Data In Object Stores Object Stores Oracle Object Store Amazon S3 Azure Blob Store Autonomous Database Google Cloud Store Wasabi Cloud Store 37 Copyright © 2021, Oracle and/or its affiliates HOW IT WORKS
  • 14. Note only use of DBMS_CLOUD syntax is supported Hybrid Partitioned Tables BEGIN DBMS_CLOUD.CREATE_HYBRID_PART_TABLE( table_name =>'HPT1’, credential_name =>'OBJ_STORE_CRED’, format => json_object('delimiter' value ',', ‘ recorddelimiter' value 'newline', ‘ characterset' value 'us7ascii’), column_list => 'col1 number, col2 number, col3 number’ partitioning_clause => 'partition by range (col1) (partition p1 values less than (1000) external location ( 'https://swiftobjectstorage.us-ashburn-1 .../file_01.txt') , partition p2 values less than (2000) external location ( ‘https://swiftobjectstorage.us-ashburn-1 .../file_02.txt') , partition p3 values less than (3000) ) ) END;
  • 15. External tables with partitioning specified in source files Partitioning is a well-established technique to improve the performance and manageability of database systems by dividing large objects into smaller partitions; any large data warehouse takes advantage of it BEGIN DBMS_CLOUD.CREATE_EXTERNAL_PART_TABLE( TABLE_NAME => 'sales_new_api', CREDENTIAL_NAME => 'CRED_OCI', FILE_URI_LIST => 'https://objectstorage.us-ashburn- 1.oraclecloud.com/n/my_namespace/b/moviestream_landing/o/sales_sample/*.parquet', FORMAT => '{"type":"parquet", "schema": "first","partition_columns":[{"name":"month","type":"varchar2(100)"}]}' ); END; /
  • 16. External tables with partitioning specified in source files We now derive the column structure for self-describing table formats with partitioned external tables, just like with nonpartitioned external tables
  • 17. External tables with partitioning specified in source files If new files are added or removed in the underlying Object Store, you just run the new sync procedure like this: BEGIN DBMS_CLOUD.SYNC_EXTERNAL_PART_TABLE (table_name => 'sales_new_api'); END; /
  • 18. Automatic Partitioning Automatic partitioning in ADB analyzes the application workload Automatically applies partitioning to tables and their indexes to improve performance or to allow better management of large tables Automatic partitioning chooses from the following partition methods: • INTERVAL AUTOMATIC: best suited for ranges of partition key values • LIST AUTOMATIC: applies to distinct partition key values • HASH: partitioning on the partition key's hash values OVERVIEW Automatic partitioning performs the following operations: • Identify candidate tables for automatic partitioning by analyzing the workload for selected candidate tables. • By default, automatic partitioning uses the workload information collected in an Autonomous Database for analysis • Evaluate partition schemes based on workload analysis and quantification and verification of the performance benefits: 1. Candidate empty partition schemes with synthesized statistics are created internally and analyzed for performance. 2. Candidate scheme with highest estimated IO reduction is chosen as optimal partitioning strategy - internally implemented to test and verify performance 3. If candidate partition scheme does not improve performance automatic partitioning is not implemented Implement optimal partitioning strategy, if configured to do so, for the tables analyzed by the automatic partitioning procedures. HOW IT WORKS
  • 19. Set Patch Level When Creating A Clone and retrieve Patch Details
  • 20. Set Patch Level When Creating A Clone When you provision or clone an Autonomous Database instance you can select a patch level to apply upcoming patches. There are two patch level options: Regular and Early. The Early patch level allows testing upcoming patches one week before they are applied as part of the regular patching program The console shows the patch level setting with the section headed Maintenance. OVERVIEW HOW IT WORKS
  • 21. View Autonomous Database maintenance event history to see details about past maintenance events (requires ADMIN user) View Patch Details OVERVIEW
  • 22. SELECT * FROM DBA_CLOUD_PATCH_INFO; SELECT * FROM DBA_CLOUD_PATCH_INFO WHERE PATCH_VERSION = 'ADBS-21.7.1.2'; View Patch Details HOW IT WORKS
  • 23. OCI Identity and Access Management (IAM) Authentication
  • 24. Integration With OCI Identity and Access Management (IAM) Authentication OCI Identity and Access Management users can now authenticate and authorize to ADB-Serverless. Better security since user access to databases is managed centrally instead of locally in every database Reduces zombie database user accounts User management moves DBA tasks to the IAM administrator SQL*Plus users can sign into Autonomous Database using their IAM username and IAM database password Users can also use IAM SSO tokens with the latest JDBC- thin and OCI-C database clients to connect with ADB- Shared
  • 25. Identity and Access Management (IAM) authentication - additional features Can now leverage a single identifier and password to access all your databases in OCI OCI application integration with Autonomous Databases is enhanced to support application identities, database links, and proxy authentication to simplify application maintenance Improves overall security through accountability since the IAM user information can be collected as part of an audit record
  • 26. Load Data Using DBMS_CLOUD
  • 27. Load data using DBMS_CLOUD • For data loading from files in the Cloud • Store your object storage credentials • Use the procedure DBMS_CLOUD.COPY_DATA to load data • The source file in this example is channels.txt File-02 in Object Store Bucket File-03 in Object Store Bucket File-01 in Object Store Bucket SET DEFINE OFF BEGIN DBMS_CLOUD.CREATE_CREDENTIAL( credential_name => 'DEF_CRED_NAME', username => 'adwc_user@example.com', password => 'password' ); END; /
  • 28. Load data using DBMS_CLOUD CREATE TABLE CHANNELS (channel_id CHAR(1), channel_desc VARCHAR2(20), channel_class VARCHAR2(20) ); BEGIN DBMS_CLOUD.COPY_DATA( table_name =>'CHANNELS', credential_name =>'DEF_CRED_NAME’, file_uri_list => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/ bucketname/o/channels.txt', format => json_object('delimiter' value ',') ); END; BEGIN DBMS_CLOUD.COPY_DATA( table_name =>'CHANNELS', credential_name =>'DEF_CRED_NAME’, file_uri_list =>'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace- string/b/ bucketname/o/exp01.dmp, https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/ bucketname/o/exp02.dmp', format => json_object('type' value 'datapump') ); END;
  • 29. Load data using DBMS_CLOUD BEGIN DBMS_CLOUD.COPY_COLLECTION( collection_name => 'fruit', credential_name => 'DEF_CRED_NAME', file_uri_list => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/ namespace-string/b/fruit_bucket/o/myCollection.json’, format => JSON_OBJECT('recorddelimiter' value '''n''') ); END; BEGIN DBMS_CLOUD.COPY_COLLECTION( collection_name => 'fruit2', credential_name => 'DEF_CRED_NAME', file_uri_list => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/ namespace-string/b/json/o/fruit_array.json’, format => '{"recorddelimiter" : "0x''01''", "unpackarrays" : TRUE}' ); END;
  • 30. Load data using DBMS_CLOUD SELECT table_name, owner_name, type, status, start_time, update_time, logfile_table, badfile_table FROM user_load_operations WHERE type = 'COPY’; TABLE_NAME OWNER_NAME TYPE STATUS START_TIME UPDATE_TIME LOGFILE_TABLE BADFILE_TABLE ------------------------------------------------------------------------------------ FRUIT ADMIN COPY COMPLETED 2020-04-23 22:27:37 2020-04-23 22:27:38 "" "" FRUIT ADMIN COPY FAILED 2020-04-23 22:28:36 2020-04-23 22:28:37 COPY$2_LOG COPY$2_BAD SELECT credential_name, username, comments FROM all_credentials; CREDENTIAL_NAME USERNAME COMMENTS ---------------------------–----------------------------- -------------------- ADB_TOKEN user_name@example.com {"comments":"Created via DBMS_CLOUD.create_credential"} DEF_CRED_NAME user_name@example.com {"comments":"Created via DBMS_CLOUD.create_credential"}
  • 31. ADB now supports SQL access to tenancy details
  • 32. When you file a service request for Autonomous Database, you need to provide the tenancy details for your instance. Tenancy details for the instance are available on the Oracle Cloud Infrastructure console. However, if you are connected to the database, you can now obtain these details by querying the CLOUD_IDENTITY column of the V$PDBS view. For example: ...will generate something similar to the following: How to get the tenancy details for your instance SELECT cloud_identity FROM v$pdbs; {"DATABASE_NAME" : "DBxxxxxxxxxxxx", "REGION" : "us-phoenix-1", "TENANT_OCID" : "OCID1.TENANCY.REGION1..ID1", "DATABASE_OCID" : "OCID1.AUTONOMOUSDATABASE.OC1.SEA.ID2 ", "COMPARTMENT_OCID" : "ocid1.tenancy.region1..ID3"}
  • 34. UI Button To Quickly Add Your IP Address To ACLs Makes it easier to setup ACLs where you need to add the IP address of the client to your Network Access Control List (ACL) The button "Add My IP Address" adds your current IP address to the ACL entry Removes the need to manually look for your client IP via a 3rd party website or app anymore (e.g. Google, whatsmyip.com etc) Available in both the Create Autonomous Database flow for new ADB-S instances to be provisioned and the Update Network Access flow for existing ADB-S instances
  • 35. Multicloud with OCI and Azure
  • 36. Low-latency connectivity between Microsoft Azure and OCI Deploys Oracle Database on OCI, and provides metrics on Azure Combine the full Azure catalog of AI and application services with OCI’s most powerful database services No charges for Oracle Interconnect for Microsoft Azure ports or data ingress/egress over the Interconnect Billing normally for consumption of Oracle Database services, such as Autonomous Database Multicloud with OCI and Azure
  • 37. Oracle Database Service for Microsoft Azure (ODSA) Automatically configures everything required to link the two cloud environments Federates Azure active directory identities Azure like UI & API experience for provisioning and managing Oracle database services on OCI Sends metrics, logs, and events for the OCI databases to Azure tooling for unified telemetry and monitoring
  • 38. Collaborative support model. Direct connection between the cloud vendors, with under 2 ms latency for traffic between OCI and Microsoft Azure. Pricing is based solely on port capacities for OCI FastConnect and Azure ExpressRoute Local Circuit; there are no charges for inbound or outbound bandwidth consumed.
  • 40. For each database product, ODSA supports the common administration and application access capabilities: • Create, read, update, delete, list (CRUDL) • Clone database • Database backup (automatic and manual) • Database restore (restore to existing database for now) • Generate Azure connection string • Display database metrics Oracle Cloud Infrastructure Integration
  • 41. Azure tools integration Delivers OCI database metrics, events, and logs to tools such as Azure Application Insights, Azure Event Grid, and Azure Log Analytics Enables Azure users to view OCI databases alongside the rest of your Azure data, for unified telemetry and monitoring Also creates a custom dashboard that provides Azure developers with Oracle database resource details, and connection strings for their applications
  • 42. Custom dashboard: Displays graphs for each of the standard Oracle database metrics for the resource. Gives developers and administrators a quick view of all metrics in one place.
  • 43. Real Application Testing - Database Replay
  • 44. Using Oracle Real Application Testing: Capture a workload on an Autonomous Database instance. Testing the effects of changes with simulated data sets often does not accurately represent production workloads; capturing a real workload does:

BEGIN
  DBMS_CLOUD_ADMIN.START_WORKLOAD_CAPTURE(
    capture_name => 'CAP_TEST1',
    duration     => 60);
END;
/
  • 45. Replay a workload: Log in as the ADMIN user or have the EXECUTE privilege on DBMS_CLOUD_ADMIN.

Replay a workload on a refreshable clone:

BEGIN
  DBMS_CLOUD_ADMIN.REPLAY_WORKLOAD(
    capture_name => 'CAP_TEST1');
END;
/

Replay a workload on a full clone:

BEGIN
  DBMS_CLOUD_ADMIN.REPLAY_WORKLOAD(
    capture_name                => 'CAP_TEST1',
    capture_source_tenancy_ocid => 'OCID1.TENANCY.REGION1..ID1',
    capture_source_db_name      => 'ADWFINANCE');
END;
/
  • 46. Oracle Machine Learning for R on Autonomous Database
  • 47. Sample of common enterprise machine learning pain points. Copyright © 2023 Oracle and/or its affiliates. "It takes too long to get my data or to get the 'right' data" "I can't analyze or mine all of my data – it has to be sampled" "Putting open source models and results into production takes too long and is ad hoc and complex" "Our company is concerned about data security, backup and recovery" "We need to build and score with 100s or 1000s of models fast to meet business objectives"
  • 48. Traditional analytics and data source interaction. Paradigm shift: R → Data Access Language → R. Pain points: access latency; memory limitation (data size, in-memory processing); single-threaded execution; issues for backup, recovery, and security; ad hoc production deployment. [Diagram: data is extracted/exported from the data source to flat files, then read/loaded via data source connectivity packages and built-in tool read/write capabilities.]
  • 49. Oracle Machine Learning provides tools for data science project success. Availability by platform (Oracle Autonomous Database 19c/21c; Oracle Database 19c/21c; Oracle DBCS; Oracle Exadata CS/CI/C@C):
OML4SQL API (build ML models and score data with no data movement): Autonomous Database ✓ (ADB-S, ADB-D, ADB C@C); Oracle Database ✓; DBCS ✓; Exadata ✓
OML4Py API (leverage the database as a high-performance compute engine from Python with in-database ML): Autonomous Database ✓ (ADB-S); Oracle Database ✓; DBCS ✓; Exadata ✓
OML4R API (leverage the database as a high-performance compute engine from R with in-database ML): Autonomous Database ✓ (ADB-S); Oracle Database ✓; DBCS ✓; Exadata ✓
OML Notebooks (SQL, PL/SQL, Python, R, and markdown interpreters): Autonomous Database ✓ (ADB-S)
OML AutoML UI (no-code automated modeling interface): Autonomous Database ✓ (ADB-S)
OML Services (RESTful model management and deployment): Autonomous Database ✓ (ADB-S)
Oracle Data Miner (SQL Developer extension with a drag-and-drop interface for creating ML methodologies): Autonomous Database ✓ (ADB-S, ADB-D, ADB C@C); Oracle Database ✓; DBCS ✓; Exadata ✓
  • 50. Oracle Machine Learning for R: Empower data scientists with R. Leverage the database as an HPC environment. Use in-database parallelized and distributed ML algorithms from a native R interface. Store and manage user-defined functions and R objects in the database. Integrate results into applications and dashboards via SQL or REST. Eliminate the need to explicitly provision R engines for solution deployment. [Diagram: the OML4R standalone client and OML Notebooks (R, Python, SQL, PL/SQL, markdown; REST on the roadmap) connecting to Autonomous Database and Oracle Database (on premises, DBCS, ExaCC/CS).]
  • 51. Oracle Machine Learning for R: Empower data scientists with R. Transparency layer: leverage proxy objects so data remains in the database; overload native functions, translating functionality to SQL; use familiar R syntax on database data. Parallel, distributed in-database algorithms: scalability and performance; exposes the in-database algorithms available from OML4SQL. Embedded execution: manage and invoke user-defined R functions; data-parallel, task-parallel, and non-parallel execution; use open source packages to augment functionality.
  • 52. OML4R 2.0 Algorithms on ADB: machine learning in-database algorithms. Supports automatic data preparation, partitioned model ensembles, and integrated text mining.
Classification: Decision Tree; Logistic Regression; Naïve Bayes; Neural Network; Support Vector Machine; Random Forest; XGBoost (21c)
Regression: Generalized Linear Model; Neural Network; Support Vector Machine; XGBoost (21c)
Attribute Importance: Minimum Description Length; Random Forest
Clustering: Hierarchical k-Means; Orthogonal Partitioning; Expectation Maximization (Gaussian Mixture Models via EM)
Feature Extraction: Non-negative Matrix Factorization; Principal Component Analysis; Singular Value Decomposition; Explicit Semantic Analysis
Market Basket Analysis: Apriori – Association Rules
Anomaly Detection: One-Class Support Vector Machine
Time Series: Single, Double, and Triple Exponential Smoothing
  • 53. Conda environment creation via OML Notebooks on ADB
  • 54. Conda environment usage via OML Notebooks on ADB
  • 55. Scalable in-database algorithms: example using Support Vector Machine for anomaly detection with OML4R:

# obtain proxy object
ore.sync(table="ONTIME_S")
# build anomaly detection model
SVM.MOD = ore.odmSVM(~., ONTIME_S, "anomaly.detection", outlier.rate = 0.1)
# view model object
SVM.MOD
  • 56. Autonomous Database Plugin for Microsoft Excel
  • 57. Using the Excel Add-in to Query Autonomous Database: The Excel Add-in allows you to query data in the Autonomous Database directly from Excel. Run native SQL queries, and use a wizard to query Analytic Views created by the Data Analysis tool.
  • 58. Download the Add-in 1. Log in to the web UI for the DB Actions page. 2. On the right-side navigation menu, click the "Download Add-in for Excel" link.
  • 59. Installation on macOS 3. Unzip the downloaded zip file. 4. Open a terminal window and navigate to the unzipped folder. 5. Ensure that Excel is not running. 6. The install.sh file does not have execute permissions by default; grant them and run the script:

chmod 764 install.sh
./install.sh
  • 60. Installation on macOS 7. Launch Excel 8. On the Insert tab on the ribbon, click the down arrow on the Add-ins / My Add-ins option: 9. Under Developer Add-ins, you will see the Oracle Autonomous Database Add-in. 10. Click to select this Add-in
  • 61. Installation on macOS At the bottom, you see a notification that the Add-in is being loaded. After the Add-in is loaded successfully, a confirmation message appears, along with a new ribbon item, "Autonomous Database."
  • 62. Installation on macOS 11. Close and quit Excel. 12. Launch Excel and insert the Add-in again. (You must perform steps 6 through 9 above every time you launch Excel.) You are now ready to connect to the Autonomous Database, run native SQL, and use the Analytic View Query wizard.
  • 63. Connecting to Autonomous Database You are now connected to the Autonomous Database and ready to run native SQL and use the Analytic View Query wizard. On the Autonomous Database ribbon tab, click the About button. It shows the Add-in and Autonomous Database versions, which is helpful when working with support on any problems you encounter with the Add-in.
  • 64. Run Native SQL for analysis using Excel Pivot tables: Launch the Native SQL panel by clicking the button in the Autonomous Database ribbon. Add the example query below in the text box under the "Write a query" label on the right-side panel. Check the "Pivot table" checkbox. Under "Select worksheet", click the "+" icon and provide MovieSales as the name. Click the check button, then Execute.

Example SQL query:

select a.continent, a.country, b.form_factor, b.device, c.month, d.day,
       e.genre, e.customer_segment, e.sales, e.purchases
from countries a, devices b, months c, days d, movie_sales_2020 e
where e.country = a.country
  and e.day = d.day
  and e.month = c.month
  and e.device = b.device
order by c.month_num;
  • 65. Run Native SQL for analysis using Excel Pivot tables: Two new tabs are created: MovieSales and Sheet2 (the sheet number might vary in your case). On Sheet2, an Excel pivot table is created with the data fetched from the Autonomous Database.
  • 66. Run Native SQL for analysis using Excel Pivot tables: Set up the pivot table options as shown on the screen below. The data for this pivot table is fetched from the MovieSales worksheet. Now you can use native Excel capabilities to analyze the data.
  • 67. Access Amazon Redshift, Snowflake and Other Non-Oracle Databases from Your Autonomous Database
  • 68. Step-1: Make Sure Redshift is Configured to Allow Public Access: Under the "Actions" menu of your Redshift cluster in the AWS console, select "Modify publicly accessible setting" to make sure your Redshift cluster is publicly accessible.
  • 69. Step-1: Make Sure Redshift is Configured to Allow Public Access: Navigate to the VPC security group assigned to your Redshift cluster and create an inbound rule allowing traffic to port 5439 from the source IP or CIDR range of your choice.
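Before moving on to the database link, it can help to sanity-check that the address your Autonomous Database will connect from is actually covered by the inbound rule. A rough client-side sketch in Python, where the CIDR and the client address are illustrative placeholders, not values from the walkthrough:

```python
import ipaddress

# Hypothetical values: the CIDR range allowed by the VPC inbound rule,
# and the egress IP the connection to Redshift will originate from
allowed_cidr = ipaddress.ip_network("203.0.113.0/24")
client_ip = ipaddress.ip_address("203.0.113.45")
redshift_port = 5439  # default Redshift port, as used in the inbound rule

# The connection can only succeed if the source IP falls in the allowed range
assert client_ip in allowed_cidr, "client IP not covered by the inbound rule"
print(f"OK: {client_ip} may reach Redshift on port {redshift_port}")
```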
  • 70. Step-2: Create a Database Link to your Redshift Instance: Create a credential object with the credentials (username and password) of the target database:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'REDSHIFT_CRED',
    username        => 'awsadmin',
    password        => '************');
END;
/

PL/SQL procedure successfully completed.
  • 71. Step-2: Create a Database Link to your Redshift Instance: Create a database link to your Redshift instance. This is nearly identical to any other database link creation, except for the gateway_params parameter, which passes 'awsredshift' as the database type:

BEGIN
  DBMS_CLOUD_ADMIN.CREATE_DATABASE_LINK(
    db_link_name       => 'REDSHIFT_LINK',
    hostname           => 'redshift-cluster-1.******.us-west-1.redshift.amazonaws.com',
    port               => '5439',
    service_name       => 'dev',
    credential_name    => 'REDSHIFT_CRED',
    gateway_params     => JSON_OBJECT('db_type' value 'awsredshift'),
    ssl_server_cert_dn => NULL);
END;
/

PL/SQL procedure successfully completed.
  • 72. Step-3: Run a Query Over the Database Link

SELECT COUNT(*) FROM SALES@REDSHIFT_LINK;

  COUNT(*)
-----------
    172456
  • 73. Building Lakehouse on ADB DATA LAKE
  • 74. The evolution of data management and analytics: Data warehouse → Data Lake → Lakehouse
  • 75. Data Lakehouses on OCI: open and flexible, analyze any database, any application, from anywhere. [Architecture diagram: data sources (any database, any events/sensors, any application, any cloud) feed data movement services (Data Integration, GoldenGate) into data stores (Object Storage for relational data, Autonomous Database as data warehouse, Big Data Service for managed open source, Data Flow), with Data Catalog for data definition and discovery; data targets include machine learning and data science, any BI tool, and any application; AI services provide automation, prep, and prediction.]
  • 76. Typical data lake organization: Harvest and curate. Evolve object storage files into meaningful business entities.
  • 77. Understand data in your lake: create an external table for external data:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'adb_user@example.com',
    password        => 'password');
END;
/

BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'sales_extended_ext',
    credential_name => 'DEF_CRED_NAME',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/namespace-string/b/bucketname/o/sales_extended.parquet',
    format          => '{"type":"parquet", "schema": "first"}');
END;
/
  • 78. OCI Data Catalog: Evolve object storage files into meaningful business entities. Derive technical metadata automatically; easily apply business context.
  • 79. Discover and understand entities: 1. Search for data sets using a range of criteria. 2. Understand their meaning and context.
  • 80. Data Catalog integrates with Autonomous Database: Autonomous Database is both a Data Catalog source and consumer. Data Catalog is the source of truth for Object Store metadata: harvest object storage to derive schemas; manage the business glossary, terms, and tags; discover data using powerful search. Use Autonomous Database to discover and analyze data sets: managed schemas and tables are defined automatically; no management required; use Oracle SQL to query both the metadata and the object storage data. [Diagram: 1. Harvest Object Storage into Data Catalog; 2. Sync to Autonomous Database(s); 3. Query.] Check out the ADB DCAT integration Live Lab.
  • 81. Want to load or link data with a UI? Use Database Actions.
  • 82. Gain insights from the lakehouse: One query spans storage types. Correlate information from the data lake and the data warehouse; access it from any SQL tool or application; preserve your investment in tools and skill sets; safeguard sensitive data using Oracle Database advanced security policies.
  • 84. Send notifications or query results to a Slack channel from ADB: send messages to a Slack channel:

BEGIN
  DBMS_CLOUD_NOTIFICATION.SEND_MESSAGE(
    provider        => 'slack',
    credential_name => 'SLACK_CRED',
    message         => 'Alert from Autonomous Database...',
    params          => json_object('channel' value 'C0....08'));
END;
/
  • 85. Send notifications or query results to a Slack channel from ADB: send the output of a query to a Slack channel:

BEGIN
  DBMS_CLOUD_NOTIFICATION.SEND_DATA(
    provider        => 'slack',
    credential_name => 'SLACK_CRED',
    query           => 'SELECT username, account_status, expiry_date
                        FROM account_users WHERE rownum < 5',
    params          => json_object('channel' value 'C0....08',
                                   'type'    value 'csv'));
END;
/
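With 'type' value 'csv', the query result is delivered to the channel in CSV form. A rough Python sketch of what assembling such a payload looks like client-side; the rows and column values here are illustrative, not real account_users data:

```python
import csv
import io

# Illustrative rows mimicking the account_users query result
rows = [
    ("ADMIN", "OPEN", "2024-01-01"),
    ("SCOTT", "LOCKED", "2023-06-30"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["USERNAME", "ACCOUNT_STATUS", "EXPIRY_DATE"])  # header row
writer.writerows(rows)                                          # data rows

payload = buf.getvalue()
print(payload)
```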
  • 86. Clone your Autonomous Database across regions from a backup
  • 87. Clone your Autonomous Database across regions from a backup: More Actions > Create Clone
  • 88. Clone your Autonomous Database across regions from a backup
  • 89. Using Private Endpoints for Autonomous Database with Shared Exadata Infrastructure
  • 90. Creating an Autonomous Database with a Private Endpoint Prerequisites: a virtual cloud network (VCN) in the region where you want to create the Autonomous Database; at least one subnet in the VCN; at least one network security group (NSG) in the VCN. In the VCN subnet, select Default DHCP Options (this choice sets up an internet resolver and a VCN resolver).
  • 91. Creating an Autonomous Database with a Private Endpoint After the database is provisioned, you can see the networking details on the Autonomous Database Details page
  • 92. Scenario 1: Connecting from Your VCN Connecting to an Autonomous Database with a Private Endpoint Useful if you have an application that is running inside Oracle Cloud Infrastructure, either on a virtual machine (VM) in the same VCN that is configured with your database or on a VM in a different VCN The following network diagram shows an application running in the same VCN as the database. The Autonomous Data Warehouse (ADW) instance has a private endpoint in VCN A and subnet A (CIDR 10.0.2.0/24). The NSG associated with the Autonomous Data Warehouse instance is NSG 1. The application that connects to the Autonomous Data Warehouse instance is running on a VM that is in subnet B (CIDR 10.0.1.0/24).
  • 93. Scenario 1: Connecting from Your VCN Define security rules in NSG 1 to control ingress and egress traffic: allow ingress traffic from the source 10.0.1.0/24 (the CIDR for subnet B, where the application runs) on destination port 1522, and egress traffic from the ADW instance to the destination 10.0.1.0/24.
  • 94. Scenario 1: Connecting from Your VCN Also create a security rule to allow traffic to and from the VM. You can use a stateful security rule for the VM, defining a single rule for egress to the destination subnet (10.0.2.0/24); the response traffic is then allowed automatically. After you configure the security rules, your application can connect to the Autonomous Data Warehouse database by using the database wallet, just as you would usually connect.
  • 95. Scenario 2: Connecting from Your Data Center Connect the on-premises network to the VCN with FastConnect and then set up a dynamic routing gateway (DRG). Add an entry in your on-premises host's /etc/hosts file with the database's private IP address and FQDN. You can find the private IP address on the database details page and the FQDN in tnsnames.ora inside your wallet. Alternatively, you can set up hybrid DNS in Oracle Cloud Infrastructure for DNS name resolution.
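The /etc/hosts entry is simply the private IP address followed by the FQDN. A minimal Python sketch that composes it; the IP and the FQDN below are illustrative placeholders, not values from a real instance:

```python
# Hypothetical values: the private IP comes from the database details page,
# the FQDN from tnsnames.ora inside the wallet
private_ip = "10.0.2.8"
fqdn = "example.adb.us-phoenix-1.oraclecloud.com"

# This line would be appended to the on-premises host's /etc/hosts
hosts_entry = f"{private_ip} {fqdn}"
print(hosts_entry)
```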
  • 96. Scenario 2: Connecting from Your Data Center Traffic is also allowed to and from the database by means of two stateless security rules for the data center CIDR range (172.16.0.0/16).
  • 97. Thank you Any Questions? Sandesh Rao VP AIOps Autonomous Database @sandeshr https://www.linkedin.com/in/raosandesh/ https://www.slideshare.net/SandeshRao4