Update Oracle RAC 19c with the Out Of Place Method

The following steps can be used to apply the latest Database and Grid Infrastructure Release Update (RU) patches to a running RAC cluster. We use the Out of Place method: we create a new Oracle Home, apply the patches to it, and then switch over to the new home.

Currently, AutoUpgrade does not support Out of Place patching of the Grid Infrastructure home, so we patch it manually and use AutoUpgrade for the database home.

Current setup and required patches

The current configuration of my test system is as follows. The steps to set up this system are described in this post (Install Oracle RAC 19c on Linux):

  • 2 Oracle VirtualBox VMs with Oracle Linux 9.7
  • Oracle Grid Infrastructure 19c (Release Update 19.27)
  • Oracle Database 19c (Release Update 19.27)
  • 1 RAC Container Database (orcl) and a Pluggable Database (pdb1)

The goal is to install the Grid Infrastructure and Database Release Update 19.30 (RU January 2026). We will need the following files to perform the update:

Patch Description                           File Name
-----------------------------------------   ----------------------------------
Grid Infrastructure 19c (Base Release)      LINUX.X64_193000_grid_home.zip
Grid Infrastructure Release Update 19.30    p38658588_190000_Linux-x86-64.zip
Database 19c (Base Release)                 LINUX.X64_193000_db_home.zip
Database Release Update 19.30               p38632161_190000_Linux-x86-64.zip
Grid Infrastructure 19.30 (includes OCW)    p38629535_190000_Linux-x86-64.zip
OJVM 19.30                                  p38523609_190000_Linux-x86-64.zip
OPatch                                      p6880880_190000_Linux-x86-64.zip

The files will be copied to a location accessible from the VMs.
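Before starting, it is worth verifying that all seven files actually made it to the staging area. A small sanity check (the staging path is an assumption; point STAGE_DIR at your copy location):

```shell
# Check that every required zip is present in the staging area
stage=${STAGE_DIR:-/sw/oracle/staging}   # assumed staging directory, adjust as needed
missing=0
for f in LINUX.X64_193000_grid_home.zip p38658588_190000_Linux-x86-64.zip \
         LINUX.X64_193000_db_home.zip p38632161_190000_Linux-x86-64.zip \
         p38629535_190000_Linux-x86-64.zip p38523609_190000_Linux-x86-64.zip \
         p6880880_190000_Linux-x86-64.zip; do
    [ -f "$stage/$f" ] || { echo "missing: $f"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "all patch files present" || echo "copy the missing files first"
```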

Manual Out Of Place Patching of the Grid Infrastructure Home

First we will create the new Grid Home (run as root on both cluster nodes).

mkdir -p /u01/app/19.30.0/grid
chown -R grid:oinstall /u01/app/19.30.0/grid
chmod -R 775 /u01/app/19.30.0/grid
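Since the new home must exist with identical ownership and permissions on every node, the three commands can also be generated per node as a dry run. This sketch only prints the ssh commands instead of executing them (node names lin3/lin4 are this test cluster):

```shell
# Print the per-node setup commands without executing them (dry run)
new_home=/u01/app/19.30.0/grid
for node in lin3 lin4; do
  printf 'ssh root@%s "mkdir -p %s && chown -R grid:oinstall %s && chmod -R 775 %s"\n' \
      "$node" "$new_home" "$new_home" "$new_home"
done
```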

Next we extract the base release of the Grid Infrastructure software in the new directory and patch it (run as the grid owner on the first cluster node):

# copy the base release
unzip -q /sw/oracle/gi/19c_linux/LINUX.X64_193000_grid_home.zip -d /u01/app/19.30.0/grid
# update OPatch
mv /u01/app/19.30.0/grid/OPatch /u01/app/19.30.0/grid/OPatch.old
unzip -q /sw/oracle/db/oracle_patches/opatch/p6880880_190000_Linux-x86-64.zip -d /u01/app/19.30.0/grid
# run the prerequisite checks
export ORACLE_HOME=/u01/app/19.30.0/grid
export PATH=$ORACLE_HOME/bin:$ORACLE_HOME/jdk/bin:$ORACLE_HOME/OPatch:$PATH
export CV_ASSUME_DISTID=9.7
$ORACLE_HOME/gridSetup.sh -executePrereqs -silent
# the swap size error can be ignored
# extract and apply the patches
unzip -q /sw/oracle/db/oracle_patches/ora19c_lin_x64/ora_gi_19RUs/19.30/p38658588_190000_Linux-x86-64.zip -d ~
export ORACLE_BASE=/u01/app/grid
export CLUSTER_NAME=$(olsnodes -c)
export CLUSTER_NODES=$(olsnodes | tr '\n' ','| sed 's/,\s*$//')
cd $ORACLE_HOME
./gridSetup.sh -ignorePrereq -waitforcompletion -silent \
   -applyRU /home/grid/38658588/38523609 \
   -applyOneOffs /home/grid/38658588/38629535/36758186,/home/grid/38658588/38629535/38632161,/home/grid/38658588/38629535/38653268,/home/grid/38658588/38629535/38661284,/home/grid/38658588/38629535/38729293 \
   -responseFile $ORACLE_HOME/install/response/gridsetup.rsp \
   INVENTORY_LOCATION=/u01/app/oraInventory \
   ORACLE_BASE=$ORACLE_BASE \
   SELECTED_LANGUAGES=en \
   oracle.install.option=CRS_SWONLY \
   oracle.install.asm.OSDBA=asmdba \
   oracle.install.asm.OSASM=asmadmin \
   oracle.install.crs.config.ClusterConfiguration=STANDALONE \
   oracle.install.crs.config.configureAsExtendedCluster=false \
   oracle.install.crs.config.clusterName=$CLUSTER_NAME \
   oracle.install.crs.config.gpnp.configureGNS=false \
   oracle.install.crs.config.autoConfigureClusterNodeVIP=false \
   oracle.install.crs.config.clusterNodes=$CLUSTER_NODES
Sample Output:
[grid@lin3 ~]$ # copy the base release
[grid@lin3 ~]$ unzip -q /sw/oracle/gi/19c_linux/LINUX.X64_193000_grid_home.zip -d /u01/app/19.30.0/grid
[grid@lin3 ~]$ # update OPatch
[grid@lin3 ~]$ mv /u01/app/19.30.0/grid/OPatch /u01/app/19.30.0/grid/OPatch.old
[grid@lin3 ~]$ unzip -q /sw/oracle/db/oracle_patches/opatch/p6880880_190000_Linux-x86-64.zip -d /u01/app/19.30.0/grid
[grid@lin3 ~]$ # run the prerequisite checks
[grid@lin3 ~]$ export ORACLE_HOME=/u01/app/19.30.0/grid
[grid@lin3 ~]$ export PATH=$ORACLE_HOME/bin:$ORACLE_HOME/jdk/bin:$ORACLE_HOME/OPatch:$PATH
[grid@lin3 ~]$ export CV_ASSUME_DISTID=9.7
[grid@lin3 ~]$ $ORACLE_HOME/gridSetup.sh -executePrereqs -silent
Launching Oracle Grid Infrastructure Setup Wizard...

[WARNING] [INS-13014] Target environment does not meet some optional requirements.
   CAUSE: Some of the optional prerequisites are not met. See logs for details. /u01/app/oraInventory/logs/GridSetupActions2026-03-26_06-58-07AM/gridSetupActions2026-03-26_06-58-07AM.log
   ACTION: Identify the list of failed prerequisite checks from the log: /u01/app/oraInventory/logs/GridSetupActions2026-03-26_06-58-07AM/gridSetupActions2026-03-26_06-58-07AM.log. Then either from the log file or from installation manual find the appropriate configuration to meet the prerequisites and fix it manually.
[grid@lin3 ~]$ # the swap size error can be ignored
[grid@lin3 ~]$ # extract and apply the patches
[grid@lin3 ~]$ unzip -q /sw/oracle/db/oracle_patches/ora19c_lin_x64/ora_gi_19RUs/19.30/p38658588_190000_Linux-x86-64.zip -d ~
[grid@lin3 ~]$ export ORACLE_BASE=/u01/app/grid
[grid@lin3 ~]$ export CLUSTER_NAME=$(olsnodes -c)
[grid@lin3 ~]$ export CLUSTER_NODES=$(olsnodes | tr '\n' ','| sed 's/,\s*$//')
[grid@lin3 ~]$ cd $ORACLE_HOME
[grid@lin3 grid]$ ./gridSetup.sh -ignorePrereq -waitforcompletion -silent \
>    -applyRU /home/grid/38658588/38523609 \
>    -applyOneOffs /home/grid/38658588/38629535/36758186,/home/grid/38658588/38629535/38632161,/home/grid/38658588/38629535/38653268,/home/grid/38658588/38629535/38661284,/home/grid/38658588/38629535/38729293 \
>    -responseFile $ORACLE_HOME/install/response/gridsetup.rsp \
>    INVENTORY_LOCATION=/u01/app/oraInventory \
>    ORACLE_BASE=$ORACLE_BASE \
>    SELECTED_LANGUAGES=en \
>    oracle.install.option=CRS_SWONLY \
>    oracle.install.asm.OSDBA=asmdba \
>    oracle.install.asm.OSASM=asmadmin \
>    oracle.install.crs.config.ClusterConfiguration=STANDALONE \
>    oracle.install.crs.config.configureAsExtendedCluster=false \
>    oracle.install.crs.config.clusterName=$CLUSTER_NAME \
>    oracle.install.crs.config.gpnp.configureGNS=false \
>    oracle.install.crs.config.autoConfigureClusterNodeVIP=false \
>    oracle.install.crs.config.clusterNodes=$CLUSTER_NODES
Preparing the home to patch...
Applying the patch /home/grid/38658588/38523609...
Successfully applied the patch.
Applying the patch /home/grid/38658588/38629535/36758186...
Successfully applied the patch.
Applying the patch /home/grid/38658588/38629535/38632161...
Successfully applied the patch.
Applying the patch /home/grid/38658588/38629535/38653268...
Successfully applied the patch.
Applying the patch /home/grid/38658588/38629535/38661284...
Successfully applied the patch.
Applying the patch /home/grid/38658588/38629535/38729293...
Successfully applied the patch.
The log can be found at: /u01/app/oraInventory/logs/GridSetupActions2026-03-26_07-01-08AM/installerPatchActions_2026-03-26_07-01-08AM.log
Launching Oracle Grid Infrastructure Setup Wizard...

[WARNING] [INS-13014] Target environment does not meet some optional requirements.
   CAUSE: Some of the optional prerequisites are not met. See logs for details. /u01/app/oraInventory/logs/GridSetupActions2026-03-26_07-01-08AM/gridSetupActions2026-03-26_07-01-08AM.log
   ACTION: Identify the list of failed prerequisite checks from the log: /u01/app/oraInventory/logs/GridSetupActions2026-03-26_07-01-08AM/gridSetupActions2026-03-26_07-01-08AM.log. Then either from the log file or from installation manual find the appropriate configuration to meet the prerequisites and fix it manually.
The response file for this session can be found at:
 /u01/app/19.30.0/grid/install/response/grid_2026-03-26_07-01-08AM.rsp

You can find the log of this install session at:
 /u01/app/oraInventory/logs/GridSetupActions2026-03-26_07-01-08AM/gridSetupActions2026-03-26_07-01-08AM.log

As a root user, execute the following script(s):
        1. /u01/app/19.30.0/grid/root.sh

Execute /u01/app/19.30.0/grid/root.sh on the following nodes:
[lin3, lin4]


Successfully Setup Software with warning(s).
[grid@lin3 grid]$
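The olsnodes pipeline used above turns the one-hostname-per-line output into the comma-separated list that gridSetup.sh expects for clusterNodes. The transformation can be checked with plain shell, simulating the olsnodes output of this cluster:

```shell
# Simulated `olsnodes` output: one node name per line, joined with commas,
# trailing comma stripped (same pipeline as used for CLUSTER_NODES above)
nodes=$(printf 'lin3\nlin4\n' | tr '\n' ',' | sed 's/,\s*$//')
echo "$nodes"   # lin3,lin4
```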

We do not run root.sh yet, despite the instruction in the output; it will be executed node by node during the switch to the new home. Now we switch node 1 to the new grid home (this will restart the database instance on node 1). Run as root on node 1:

date
# update the env file
sed -i s/19.0.0/19.30.0/g /home/grid/grid.env
# switch node one to the new grid home
su - grid -c "/u01/app/19.30.0/grid/gridSetup.sh \
   -silent -switchGridHome \
   oracle.install.option=CRS_SWONLY \
   ORACLE_HOME=/u01/app/19.30.0/grid \
   oracle.install.crs.config.clusterNodes=`hostname|awk -F. {'print $1'}` \
   oracle.install.crs.rootconfig.executeRootScript=false"
# run root.sh as root on the first node
/u01/app/19.30.0/grid/root.sh
date
Sample Output:
[root@lin3 ~]# date
Thu Mar 26 07:40:25 AM CET 2026
[root@lin3 ~]# # update the env file
[root@lin3 ~]# sed -i s/19.0.0/19.30.0/g /home/grid/grid.env
[root@lin3 ~]# # switch node one to the new grid home
[root@lin3 ~]# su - grid -c "/u01/app/19.30.0/grid/gridSetup.sh \
>    -silent -switchGridHome \
>    oracle.install.option=CRS_SWONLY \
>    ORACLE_HOME=/u01/app/19.30.0/grid \
>    oracle.install.crs.config.clusterNodes=`hostname|awk -F. {'print $1'}` \
>    oracle.install.crs.rootconfig.executeRootScript=false"
Launching Oracle Grid Infrastructure Setup Wizard...


As a root user, execute the following script(s):
        1. /u01/app/19.30.0/grid/root.sh

Execute /u01/app/19.30.0/grid/root.sh on the following nodes:
[lin3]



You can find the log of this install session at:
 /u01/app/oraInventory/logs/UpdateNodeList2026-03-26_07-40-25AM.log
You can find the log of this install session at:
 /u01/app/oraInventory/logs/UpdateNodeList2026-03-26_07-40-25AM.log
Successfully Setup Software.
[root@lin3 ~]# # run root.sh as root on the first node
[root@lin3 ~]# /u01/app/19.30.0/grid/root.sh
Check /u01/app/19.30.0/grid/install/root_lin3.fritz.box_2026-03-26_07-40-43-041089929.log for the output of root script
[root@lin3 ~]# date
Thu Mar 26 07:46:21 AM CET 2026
[root@lin3 ~]#
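The grid.env update above is a blind in-place substitution. A slightly more defensive variant keeps a backup and verifies that the new home path actually landed in the file (demonstrated on a throwaway copy; point envfile at /home/grid/grid.env on a real node):

```shell
# Demo file standing in for /home/grid/grid.env
envfile=$(mktemp)
echo 'export ORACLE_HOME=/u01/app/19.0.0/grid' > "$envfile"

cp -p "$envfile" "$envfile.bak"              # keep a backup before rewriting
sed -i 's|19\.0\.0|19.30.0|g' "$envfile"     # same substitution as above
grep -q '19\.30\.0' "$envfile" && echo "env file updated"
```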

We now run the same commands as root on the second cluster node (this will restart the database instance on node 2).

Sample Output:
[root@lin4 ~]# date
Thu Mar 26 07:50:12 AM CET 2026
[root@lin4 ~]# # update the env file
[root@lin4 ~]# sed -i s/19.0.0/19.30.0/g /home/grid/grid.env
[root@lin4 ~]# # switch node one to the new grid home
[root@lin4 ~]# su - grid -c "/u01/app/19.30.0/grid/gridSetup.sh \
>    -silent -switchGridHome \
>    oracle.install.option=CRS_SWONLY \
>    ORACLE_HOME=/u01/app/19.30.0/grid \
>    oracle.install.crs.config.clusterNodes=`hostname|awk -F. {'print $1'}` \
>    oracle.install.crs.rootconfig.executeRootScript=false"
Launching Oracle Grid Infrastructure Setup Wizard...


As a root user, execute the following script(s):
        1. /u01/app/19.30.0/grid/root.sh

Execute /u01/app/19.30.0/grid/root.sh on the following nodes:
[lin4]



You can find the log of this install session at:
 /u01/app/oraInventory/logs/UpdateNodeList2026-03-26_07-50-12AM.log
You can find the log of this install session at:
 /u01/app/oraInventory/logs/UpdateNodeList2026-03-26_07-50-12AM.log
Successfully Setup Software.
[root@lin4 ~]# # run root.sh as root on the first node
[root@lin4 ~]# /u01/app/19.30.0/grid/root.sh
Check /u01/app/19.30.0/grid/install/root_lin4.fritz.box_2026-03-26_07-50-30-899553907.log for the output of root script
[root@lin4 ~]# date
Thu Mar 26 08:11:29 AM CET 2026
[root@lin4 ~]#

The Grid Infrastructure software is now updated from 19.27 to 19.30. Since this is a test system, we delete the old software right away. Run these commands as grid on both nodes, and run the command printed by the deinstall script as root:

# get current release of the grid software
crsctl query has releasepatch
# remove old grid home software
export ORACLE_HOME=/u01/app/19.0.0/grid
$ORACLE_HOME/deinstall/deinstall -local

Out Of Place Patching of the Database Home with AutoUpgrade

To update the Database from 19.27 to 19.30 we use AutoUpgrade. We copy the patches and create the AutoUpgrade config file as oracle on the first node:

mkdir -p /home/oracle/autoupgrade/patches
# download and copy patches
cp /sw/oracle/db/oracle_patches/opatch/p6880880_190000_Linux-x86-64.zip /home/oracle/autoupgrade/patches  # OPatch
cp /sw/oracle/db/oracle_patches/ora19c_lin_x64/ora_db_19RUs/19.30/p38632161_190000_Linux-x86-64.zip /home/oracle/autoupgrade/patches  # DB 19.30 RU
cp /sw/oracle/db/oracle_patches/ora19c_lin_x64/ora_gi_19RUs/19.30/p38629535_190000_Linux-x86-64.zip /home/oracle/autoupgrade/patches  # GI Patch (includes OCW)
cp /sw/oracle/db/oracle_patches/ora19c_lin_x64/ora_db_19RUs/19.30/p38523609_190000_Linux-x86-64.zip /home/oracle/autoupgrade/patches  # OJVM 19.30
# download and copy the base image
cp /sw/oracle/db/19c_linux/LINUX.X64_193000_db_home.zip /home/oracle/autoupgrade/patches

cat > ~/au.cfg << EOF
global.global_log_dir=/home/oracle/autoupgrade/log
global.keystore=/home/oracle/autoupgrade/keystore
patch1.sid=orcl1
patch1.restoration=NO
patch1.source_home=/u01/app/oracle/product/19.0.0/dbhome_1
patch1.target_home=/u01/app/oracle/product/%RELEASE%.%UPDATE%.0/dbhome_1
patch1.folder=/home/oracle/autoupgrade/patches
patch1.patch=RU,OJVM,OPATCH,OCW
patch1.download=NO
patch1.rac_rolling=REQUIRED
EOF
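AutoUpgrade fills in the %RELEASE% and %UPDATE% tokens in target_home from the RU being applied, so with the 19.30 RU the path resolves to the 19.30.0 home. A plain-shell illustration of the expansion (not AutoUpgrade itself):

```shell
# Resolve the target_home template the way the 19.30 RU would fill it in
template='/u01/app/oracle/product/%RELEASE%.%UPDATE%.0/dbhome_1'
release=19
update=30
home=$(echo "$template" | sed -e "s/%RELEASE%/$release/" -e "s/%UPDATE%/$update/")
echo "$home"   # /u01/app/oracle/product/19.30.0/dbhome_1
```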

Now we download the latest release of AutoUpgrade, start it in analyze mode, and check the output:

wget https://download.oracle.com/otn-pub/otn_software/autoupgrade.jar
$ORACLE_HOME/jdk/bin/java -jar ~/autoupgrade.jar -patch -config ~/au.cfg -mode analyze

Sample Output:
[oracle@lin3 ~]$ wget https://download.oracle.com/otn-pub/otn_software/autoupgrade.jar
--2026-03-26 09:01:07--  https://download.oracle.com/otn-pub/otn_software/autoupgrade.jar
Resolving download.oracle.com (download.oracle.com)... 23.58.108.117
Connecting to download.oracle.com (download.oracle.com)|23.58.108.117|:443... connected.
HTTP request sent, awaiting response... 302 Moved Temporarily
Location: https://edelivery.oracle.com/otn-pub/otn_software/autoupgrade.jar [following]
--2026-03-26 09:01:07--  https://edelivery.oracle.com/otn-pub/otn_software/autoupgrade.jar
Resolving edelivery.oracle.com (edelivery.oracle.com)... 2.17.191.76, 2a02:26f0:ab00:694::366, 2a02:26f0:ab00:68d::366
Connecting to edelivery.oracle.com (edelivery.oracle.com)|2.17.191.76|:443... connected.
HTTP request sent, awaiting response... 302 Moved Temporarily
Location: https://download.oracle.com/otn-pub/otn_software/autoupgrade.jar?AuthParam=1774512189_f7fff09153798526ab94ecd24fc02682 [following]
--2026-03-26 09:01:09--  https://download.oracle.com/otn-pub/otn_software/autoupgrade.jar?AuthParam=1774512189_f7fff09153798526ab94ecd24fc02682
Connecting to download.oracle.com (download.oracle.com)|23.58.108.117|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 6960211 (6.6M) [application/x-jar]
Saving to: ‘autoupgrade.jar’

autoupgrade.jar                                  100%[==========================================================================================================>]   6.64M  5.51MB/s    in 1.2s

2026-03-26 09:01:11 (5.51 MB/s) - ‘autoupgrade.jar’ saved [6960211/6960211]

[oracle@lin3 ~]$ $ORACLE_HOME/jdk/bin/java -jar ~/autoupgrade.jar -patch -config ~/au.cfg -mode analyze
AutoUpgrade Patching 26.2.260205 launched with default internal options
Processing config file ...
+-----------------------------------------+
| Starting AutoUpgrade Patching execution |
+-----------------------------------------+
1 CDB(s) plus 2 PDB(s) will be analyzed
Type 'help' to list console commands
patch> status

Config

        User configuration file    [/home/oracle/au.cfg]
        General logs location      [/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto]
        Mode                       [ANALYZE]
Jobs Summary

        Total databases in configuration file [3]
        Total Non-CDB being processed         [0]
        Total Containers being processed      [3]

        Jobs finished successfully            [0]
        Jobs finished/stopped                 [0]
        Jobs in progress                      [1]

Progress
        +---+---------------------------------------------------------+
        |Job|                                                 Progress|
        +---+---------------------------------------------------------+
        |100|[||||||||||||||||||||||||||                        ] 50 %|
        +---+---------------------------------------------------------+

patch> Job 100 completed
------------------- Final Summary --------------------
Number of databases            [ 1 ]

Jobs finished                  [1]
Jobs failed                    [0]

Please check the summary report at:
/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.html
/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.log
[oracle@lin3 ~]$ more /home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.log
==========================================
   AutoUpgrade Patching Summary Report
==========================================
[Date]           Thu Mar 26 09:08:11 CET 2026
[Number of Jobs] 1
==========================================
[Job ID] 100
==========================================
[DB Name]                orcl
[Version Before AutoUpgrade Patching] 19.27.0.0.0
[Version After AutoUpgrade Patching]  19.30.0.0.260120
------------------------------------------
[Stage Name]    PENDING
[Status]        SUCCESS
[Start Time]    2026-03-26 09:04:11
[Duration]      0:00:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/100/pending
------------------------------------------
[Stage Name]    PRECHECKS
[Status]        SUCCESS
[Start Time]    2026-03-26 09:04:11
[Duration]      0:04:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/100/prechecks
[Detail]        /home/oracle/autoupgrade/log/orcl1/100/prechecks/orcl_preupgrade.log
                Check passed and no manual intervention needed
------------------------------------------
[oracle@lin3 ~]$
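Instead of reading the whole report, a quick scan of status.log can confirm that no stage ended in anything other than SUCCESS (demonstrated on an inline sample; point log at the real status.log path shown above):

```shell
# Count [Status] lines in an AutoUpgrade status.log that are not SUCCESS.
# Inline sample log standing in for the real status.log.
log=$(mktemp)
printf '[Stage Name]\tPRECHECKS\n[Status]\tSUCCESS\n[Stage Name]\tINSTALL\n[Status]\tSUCCESS\n' > "$log"
bad=$(grep '^\[Status\]' "$log" | grep -vc 'SUCCESS')
if [ "$bad" -eq 0 ]; then echo "all stages SUCCESS"; else echo "$bad stage(s) need attention"; fi
```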

If all checks are successful, we can start the update as oracle on the first cluster node:

$ORACLE_HOME/jdk/bin/java -jar ~/autoupgrade.jar -patch -config ~/au.cfg -mode deploy
Sample Output:
[oracle@lin3 ~]$ $ORACLE_HOME/jdk/bin/java -jar ~/autoupgrade.jar -patch -config ~/au.cfg -mode deploy
AutoUpgrade Patching 26.2.260205 launched with default internal options
Processing config file ...
+-----------------------------------------+
| Starting AutoUpgrade Patching execution |
+-----------------------------------------+
1 CDB(s) plus 2 PDB(s) will be processed
Type 'help' to list console commands
patch> status

Config

        User configuration file    [/home/oracle/au.cfg]
        General logs location      [/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto]
        Mode                       [DEPLOY]
Jobs Summary

        Total databases in configuration file [3]
        Total Non-CDB being processed         [0]
        Total Containers being processed      [3]

        Jobs finished successfully            [0]
        Jobs finished/stopped                 [0]
        Jobs in progress                      [1]

Progress
        +---+---------------------------------------------------------+
        |Job|                                                 Progress|
        +---+---------------------------------------------------------+
        |101|[||||||||                                          ] 15 %|
        +---+---------------------------------------------------------+

patch> Job 101 completed
------------------- Final Summary --------------------
Number of databases            [ 1 ]

Jobs finished                  [1]
Jobs failed                    [0]
Jobs restored                  [0]
Jobs pending                   [0]

# Run the root.sh script as root for the following jobs:
For orcl1 in lin3 -> /u01/app/oracle/product/19.30.0/dbhome_1/root.sh
For orcl1 in lin4 -> /u01/app/oracle/product/19.30.0/dbhome_1/root.sh

Please check the summary report at:
/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.html
/home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.log
[oracle@lin3 ~]$ more /home/oracle/autoupgrade/log/cfgtoollogs/patch/auto/status/status.log
==========================================
   AutoUpgrade Patching Summary Report
==========================================
[Date]           Thu Mar 26 10:35:51 CET 2026
[Number of Jobs] 1
==========================================
[Job ID] 101
==========================================
[DB Name]                orcl
[Version Before AutoUpgrade Patching] 19.27.0.0.0
[Version After AutoUpgrade Patching]  19.30.0.0.260120
------------------------------------------
[Stage Name]    PENDING
[Status]        SUCCESS
[Start Time]    2026-03-26 09:14:06
[Duration]      0:00:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/pending
------------------------------------------
[Stage Name]    PREACTIONS
[Status]        SUCCESS
[Start Time]    2026-03-26 09:14:07
[Duration]      0:00:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/preaction
------------------------------------------
[Stage Name]    PRECHECKS
[Status]        SUCCESS
[Start Time]    2026-03-26 09:14:07
[Duration]      0:03:42
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/prechecks
[Detail]        /home/oracle/autoupgrade/log/orcl1/101/prechecks/orcl_preupgrade.log
                Check passed and no manual intervention needed
------------------------------------------
[Stage Name]    PREFIXUPS
[Status]        SUCCESS
[Start Time]    2026-03-26 09:17:50
[Duration]      0:09:42
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/prefixups
[Detail]        /home/oracle/autoupgrade/log/orcl1/101/prefixups/prefixups.html
------------------------------------------
[Stage Name]    EXTRACT
[Status]        SUCCESS
[Start Time]    2026-03-26 09:27:33
[Duration]      0:02:38
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/extract
------------------------------------------
[Stage Name]    DBTOOLS
[Status]        SUCCESS
[Start Time]    2026-03-26 09:30:12
[Duration]      0:00:02
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/dbtools
------------------------------------------
[Stage Name]    INSTALL
[Status]        SUCCESS
[Start Time]    2026-03-26 09:30:14
[Duration]      0:30:21
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/install
------------------------------------------
[Stage Name]    OPTIONS
[Status]        SUCCESS
[Start Time]    2026-03-26 10:00:36
[Duration]      0:00:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/options
------------------------------------------
[Stage Name]    ROOTSH
[Status]        SUCCESS
[Start Time]    2026-03-26 10:00:36
[Duration]      0:00:01
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/rootsh
------------------------------------------
[Stage Name]    DB_PATCHING
[Status]        SUCCESS
[Start Time]    2026-03-26 10:00:38
[Duration]      0:34:25
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/db_patching
------------------------------------------
[Stage Name]    POSTCHECKS
[Status]        SUCCESS
[Start Time]    2026-03-26 10:35:03
[Duration]      0:00:05
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/postchecks
[Detail]        /home/oracle/autoupgrade/log/orcl1/101/postchecks/orcl_postupgrade.log
                Check passed and no manual intervention needed
------------------------------------------
[Stage Name]    POSTFIXUPS
[Status]        SUCCESS
[Start Time]    2026-03-26 10:35:09
[Duration]      0:00:41
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/postfixups
[Detail]        /home/oracle/autoupgrade/log/orcl1/101/postfixups/postfixups.html
------------------------------------------
[Stage Name]    POSTACTIONS
[Status]        SUCCESS
[Start Time]    2026-03-26 10:35:50
[Duration]      0:00:00
[Log Directory] /home/oracle/autoupgrade/log/orcl1/101/postaction
------------------------------------------
[oracle@lin3 ~]$

This process takes about 1 hour and 20 minutes and restarts the database instances. Now we run root.sh as root on both nodes, as requested by the output. Once that is done, we perform some additional tasks and the update is finished (run as root on the first node):

# update env files and enable Enterprise Manager login
newsid=orcl
pnode=`su - grid -c 'olsnodes|head -1'`
snode=`su - grid -c 'olsnodes|tail -1'`
sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env
ssh -n $snode "sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env"
chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/cwallet.sso
chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/ewallet.p12
ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/cwallet.sso'
ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/ewallet.p12'
# check versions of the grid software and database software
su - grid -c "crsctl query has releasepatch"
su - oracle -c ". ora19.env; echo 'select version_full from product_component_version;'|sqlplus -S / as sysdba"
Sample Output:
[root@lin3 ~]# # update env files and enable Enterprise Manager login
[root@lin3 ~]# newsid=orcl
[root@lin3 ~]# pnode=`su - grid -c 'olsnodes|head -1'`
snode=`su - grid -c 'olsnodes|tail -1'`
sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env
ssh -n $snode "sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env"
chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/cwallet.sso
chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/ewallet.p12
ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/cwallet.sso'
ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/ewallet.p12'
# check versions of the grid software and database software
su - grid -c "crsctl query has releasepatch"
su - oracle -c ". ora19.env; echo 'select version_full from product_component_version;'|sqlplus -S / as sysdba"
[root@lin3 ~]# snode=`su - grid -c 'olsnodes|tail -1'`
[root@lin3 ~]# sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env
[root@lin3 ~]# ssh -n $snode "sed -i s/19.0.0/19.30.0/g /home/oracle/ora19.env"
[root@lin3 ~]# chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/cwallet.sso
[root@lin3 ~]# chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/$newsid/xdb_wallet/ewallet.p12
[root@lin3 ~]# ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/cwallet.sso'
[root@lin3 ~]# ssh -n $snode 'chmod 640 /u01/app/oracle/product/19.30.0/dbhome_1/admin/'$newsid'/xdb_wallet/ewallet.p12'
[root@lin3 ~]# # check versions of the grid software and database software
[root@lin3 ~]# su - grid -c "crsctl query has releasepatch"
Oracle Clusterware release patch level is [933857815] and the complete list of patches [36758186 38523609 38632161 38653268 38661284 38729293 ] have been applied on the local node. The release patch string is [19.30.0.0.0].
[root@lin3 ~]# su - oracle -c ". ora19.env; echo 'select version_full from product_component_version;'|sqlplus -S / as sysdba"

VERSION_FULL
--------------------------------------------------------------------------------
19.30.0.0.0

[root@lin3 ~]#

This concludes the update of Oracle RAC 19c from 19.27 to 19.30. If you have any questions, feel free to contact us.
