
Moving or cloning an Oracle home and database from one Windows server to another Windows server

Here we will see how to clone/move an Oracle 10g R2 (10.2.0.3.0) home and database from one Windows server to another.

First, I'll explain the situation in which this scenario arose in our environment.

For various reasons, every VM server in our company was being moved from old storage to new storage. While moving one of the servers, which hosts an Oracle 10g R2 (10.2.0.3.0) database, we hit an error, so we were unable to complete the move for that one server.

So our network team set up a new machine with the same Windows operating system on the new storage.

Now our job was to set up the same Oracle 10g R2 (10.2.0.3.0) database for our local development team on the new server. But we didn't have the Oracle 10g R2 software with us, so we raised an SR with Oracle Support.

But Oracle no longer provides the Oracle 10g R2 software, and it came back with the following reason:

"Hello Ramesh, 

Please note that, from 1st of march 2014 all the Database 10gr2 (10.2.0.1, 10.2.0.2, 10.2.0.3, 10.2.0.4 )and 11gr1(11.1.0.6) are deemed to be restricted. Hence we won’t be able to provide you same. In order to receive the media pack, you need to have active CSI for the Database License prior to 1st march 2014".

Then we decided to move the entire Oracle home and the databases from the old Windows 2008 (32-bit) server to the new Windows 2008 (32-bit) server. I searched Google for the steps to follow, but I found only the Oracle doc below, which confused me a little until I read it thoroughly.

Cloning an Oracle Database 10.2 Oracle Home:

http://docs.oracle.com/cd/B19306_01/em.102/b16227/oui7_cloning.htm#BCGEEFIF

Source environment details:
IP address: 192.168.3.100
ORACLE_HOME: E:\oracle\product\10.2.0\db_1
Datafile location: E:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\

Target environment details:
IP address: 10.168.3.15
The Oracle home and datafiles should reside on the D drive as per our server configuration.
Here I am using orcl as my SID.

Step 1:
Shut down the database and the listener on the source (i.e. old) server.
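
As a minimal sketch, the shutdown on the source server looks like this (use the shutdown mode appropriate for your environment):

lsnrctl stop
sqlplus / as sysdba
SQL> shutdown immediate;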

Step 2:
Copy the entire Oracle home folder and the database-related folders to the new server, e.g. by opening the administrative share "\\<ip address>\d$" from the Run dialog on the old server and pasting the folders there.

Step 3:
In the target (new) server, check whether Perl is installed. If it is not, download Perl for your architecture from the following link and install it on the new Windows server.

Perl download

http://www.activestate.com/activeperl/downloads
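
A quick way to check whether Perl is already installed and on the PATH:

perl -v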

Step 4:
After installing Perl, open a command prompt and change to the bin folder of the Perl installation.
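
For example, assuming ActivePerl was installed to its default location (your path may differ):

cd C:\Perl\bin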

Step 5:
We must run the clone script that is already available in the Oracle home we copied to the new server (D:\oracle\product\10.2.0\db_1\clone\bin\clone.pl).

Before running the command below, make sure you are in the bin folder of the installed Perl directory at the Windows command prompt.

perl D:\oracle\product\10.2.0\db_1\clone\bin\clone.pl ORACLE_HOME=<Path to the Oracle_Home being_cloned> ORACLE_HOME_NAME=<Oracle_Home_Name for the Oracle_Home being cloned>

The above command shows the syntax of the clone.pl script on Windows-based systems.

Note:
ORACLE_HOME is the location to which you copied the Oracle home folder.
ORACLE_HOME_NAME is a name of your choice.

In my environment, I ran the following command:

perl D:\oracle\product\10.2.0\db_1\clone\bin\clone.pl ORACLE_HOME=D:\oracle\product\10.2.0\db_1 ORACLE_HOME_NAME=oracle_10g

While the above command runs, a command window pops up showing the details of the process and, at the end, a "completed successfully" message. Your Oracle home is now registered on the new server.

You can see the complete details in the clone logs written to the Oracle inventory (oraInventory) location.
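
On Windows the central inventory is usually under C:\Program Files\Oracle\Inventory, so the clone log is typically something like the following (the exact file name and timestamp will differ):

C:\Program Files\Oracle\Inventory\logs\cloneActions<timestamp>.log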

Step 6:
In the target server, manually create the same SID as on the old server using the oradim utility.

oradim -NEW -SID orcl -STARTMODE auto
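
oradim creates a Windows service named OracleService<SID>, so you can verify that the instance service exists and is running with a quick check:

sc query OracleServiceORCL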

Step 7:
In the target server, create the password file with the orapwd utility:

orapwd file=D:\oracle\product\10.2.0\db_1\database\pwdorcl.ora password=sys entries=3 force=y


Step 8:
Change the locations mentioned in the pfile according to your environment (control file locations, dump destinations, and so on), then start the database in mount mode with the pfile and create an spfile.
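
As a sketch, assuming the SID is orcl and the edited pfile sits in the copied home's database folder (adjust the path for your environment):

set ORACLE_SID=orcl
sqlplus / as sysdba
SQL> startup mount pfile='D:\oracle\product\10.2.0\db_1\database\initorcl.ora';
SQL> create spfile from pfile='D:\oracle\product\10.2.0\db_1\database\initorcl.ora';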

After mounting the database, rename your datafiles, temp files, and redo log members to match the file locations on the new server.
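
While the database is mounted, you can list the file names the control file still points at, which makes building the RENAME statements easier:

SQL> select name from v$datafile;
SQL> select name from v$tempfile;
SQL> select member from v$logfile;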

For example:

Datafiles renaming:

ALTER DATABASE
    RENAME FILE 'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\SYSTEM01.DBF'
             TO 'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\SYSTEM01.DBF';

Temp file renaming:

ALTER DATABASE
    RENAME FILE 'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\TEMP01.DBF',
                'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\247CER_TEMP.DBF'
             TO 'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\TEMP01.DBF',
                'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ORCL\247CER_TEMP.DBF';

Redo log file renaming:

ALTER DATABASE
    RENAME FILE 'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO03.LOG',
                'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO02.LOG',
                'E:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO01.LOG'
             TO 'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO03.LOG',
                'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO02.LOG',
                'D:\ORACLE\PRODUCT\10.2.0\ORADATA\ZAIREDB\REDO01.LOG';

Finally open the database.

alter database open;

(If the datafile, control file, temp file, and redo log file paths are the same on both the source and target servers, we can open the database directly without renaming any files.)
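
One follow-up, since we stopped the listener on the source in Step 1: the copied home brings its listener.ora along under network\admin, so after updating the HOST entry there to point at the new server, you can start the listener on it:

lsnrctl start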

