Copy and Delete Dumpfiles from ASM to Local Filesystem Automatically
Hello Everyone,
So someone asks you to create a script that takes a logical backup of your database and stores it in ASM.
You decide to create an Export Data Pump script that will:
1. Take a full export of the database to +FRA
2. Take a full metadata export of the database and store it in +FRA
3. Copy the dumpfiles in +FRA to your local disk (in case the ASM instance fails)
4. Delete old dumpfiles in +FRA and in the local disk if they are past the retention period.
The script provided below will:
1. Perform the Data Pump export and store the dumpfiles in ASM.
2. Copy the dumpfiles stored in ASM to the local disk. The script reads the logfile generated by the export job: the end of every export logfile lists the dumpfiles that were generated, so we just need to extract those lines with standard Unix tools and build an ASMCMD command from them (see the sketch after this list). When running ASMCMD, just make sure you switch to the GRID environment.
3. After generating and running the ASMCMD copy (asmcmd cp) command, the dumpfiles in ASM are also stored on the local disk.
4. For the old dumpfiles in ASM, we read the export logfile generated three days earlier to build the delete script. It is the same concept as in number 2.
5. Cleaning up the dumpfiles stored on disk is easy; it's just a simple find command.
6. Housekeeping is just cleaning up the logfiles generated by this job.
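To make step 2 concrete, here is a minimal sketch, assuming the export logfile ends with the usual dumpfile listing (paths and datestamps below are illustrative):
# Tail of a typical export logfile:
#   Dump file set for SYS.SYS_EXPORT_FULL_01 is:
#     +FRA/testdb/export/full_testdb_20240101-0100_01.dmp
#     +FRA/testdb/export/full_testdb_20240101-0100_02.dmp
# Extract the dumpfile lines and turn each one into an asmcmd copy command:
grep "+FRA/testdb/export/full" full_expdp-TESTDB-20240101.log | \
  awk '{print "asmcmd cp " $1 " /data/adc_db00/dbbackup/export/testdb"}' > copydump.sh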
NOTES: Make sure to pre-create the directories and set the correct environment (DB / GRID); a setup sketch follows below.
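For reference, here is a minimal sketch of that one-time setup, assuming the directory object names used by the parfiles below (EXPDP_TESTDB for the dumpfiles in ASM, and EXPDP_TESTDB_LOG for the logfiles, which Data Pump cannot write into ASM):
# As the GRID user: create the export directory inside ASM
asmcmd mkdir +FRA/TESTDB/EXPORT
# As the DB user: create the local directories and the directory objects
mkdir -p /home/oracle/logs/datapump /data/adc_db00/dbbackup/export/testdb
sqlplus -s / as sysdba <<EOF
CREATE OR REPLACE DIRECTORY EXPDP_TESTDB AS '+FRA/TESTDB/EXPORT';
CREATE OR REPLACE DIRECTORY EXPDP_TESTDB_LOG AS '/home/oracle/logs/datapump';
EOF
# /home/oracle/oraenv and /home/oracle/gridenv are assumed to be small wrapper
# scripts that export ORACLE_SID, ORACLE_HOME and PATH for the DB and GRID homes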
If you have questions, clarifications or additional information, please feel free to comment.
Let's share and make others love Oracle, too.
Here's a simple script for that. Feel free to edit it and isolate the parts that you need:
Script Name: EXPORTJOB01.sh
# SET THE ORACLE ENVIRONMENT AND THE VARIABLES HERE:
# Source the DB environment first so ORACLE_SID is set before the variables below use it
. /home/oracle/oraenv
BASE=$(basename $0 .sh)
DATE="`date +%Y%m%d`"
DATESTAMP="`date +%Y%m%d-%H%M`"
SCRIPTDIR=/home/oracle/scripts
LOGBASE=/home/oracle/logs
LOGDIR=$LOGBASE/datapump
CLEANUPLOGS=$LOGDIR/deleted-expdplogs-${ORACLE_SID}-${DATESTAMP}.log
EXPDPDIR=+FRA/TESTDB/EXPORT
DBBACKUP=/data/adc_db00/dbbackup   # local backup base; the cleanup steps below rely on this
PARMFILEF=$SCRIPTDIR/EXPDP-FULL-TESTDB.par
PARMFILEM=$SCRIPTDIR/EXPDP-META-TESTDB.par
ExecutionLog=$LOGBASE/`basename ${0}`.${DATESTAMP}
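# From this point on, everything the script prints is captured in the execution log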
exec > $ExecutionLog 2>&1
echo Script $0
echo ==== EXPDP Job initiated at: `date` ====
echo ""
# Perform DataPump Export Job
echo "
DIRECTORY=EXPDP_TESTDB
DUMPFILE=FULL_${ORACLE_SID}_${DATESTAMP}_%U
FULL=Y
FILESIZE=10G
PARALLEL=4
COMPRESSION=ALL
LOGFILE=EXPDP_TESTDB_LOG:full_expdp-$ORACLE_SID-$DATE.log
" > $PARMFILEF
expdp "'/ as sysdba'" PARFILE=$PARMFILEF
STATUS=$?
if [[ $STATUS -ne 0 && $STATUS -ne 2 ]]; then
echo "\nERROR: $STATUS. While exporting Full Data via DataPump EXPDP \n"
exit
fi
echo "
DIRECTORY=EXPDP_TESTDB
DUMPFILE=META_${ORACLE_SID}_${DATESTAMP}_%U
FULL=Y
CONTENT=METADATA_ONLY
PARALLEL=4
LOGFILE=EXPDP_TESTDB_LOG:meta_expdp-$ORACLE_SID-$DATE.log
" > $PARMFILEM
expdp "'/ as sysdba'" PARFILE=$PARMFILEM
STATUS=$?
if [[ $STATUS -ne 0 && $STATUS -ne 2 ]]; then
echo "\nERROR: $STATUS. While exporting MetaData via DataPump EXPDP \n"
exit
fi
echo "Export...DONE"
# Perform Dumpfile Copy from ASM to Local Disk
# Extract the generated dumpfiles from the export logfile
grep "+FRA/testdb/export/full*" $LOGDIR/full_expdp-$ORACLE_SID-$DATE.log > $LOGDIR/list_dumpfiles_$ORACLE_SID.log
grep "+FRA/testdb/export/meta*" $LOGDIR/meta_expdp-$ORACLE_SID-$DATE.log >> $LOGDIR/list_dumpfiles_$ORACLE_SID.log
awk '{print "asmcmd cp"$0}' $LOGDIR/list_dumpfiles_$ORACLE_SID.log > $LOGDIR/out.1 && awk '{print $0" /data/adc_db00/dbbackup/export/testdb;"}' $LOGDIR/out.1 > $LOGDIR/copydump.sh
# Make generated script executable
chmod 755 $LOGDIR/copydump.sh
# Change to GRID environment and run the script
. /home/oracle/gridenv
sh $LOGDIR/copydump.sh > $LOGDIR/copy_dumpfiles_todisk-$DATESTAMP.log
echo "Dumpfile Copy...DONE"
# Switch back to Oracle DB environment
. /home/oracle/oraenv
# Delete ASM dumpfiles older than 3 days
# Pick the full-export logfile that has aged past the 3-day retention and use it
# to reconstruct the names of the dumpfiles that were written to ASM at that time
find $LOGDIR -name "full_expdp-testdb*.log" -mtime +3 -exec ls {} \; > $LOGDIR/clean.out1 && tail -1 $LOGDIR/clean.out1 > $LOGDIR/clean.out2
iFILE=`cat $LOGDIR/clean.out2`
> $LOGDIR/clean.sh   # start empty so the run below is a no-op when nothing qualifies
if [ -z "$iFILE" ]; then
  echo "No dumpfiles older than 3 days were found!"
else
  grep "+FRA/testdb/export/full" $iFILE > $LOGDIR/clean.list1
  awk '{print "asmcmd rm -f " $1}' $LOGDIR/clean.list1 > $LOGDIR/clean.sh
fi
# Make generated script executable
chmod 755 $LOGDIR/clean.sh
# Change to GRID environment and run the script
. /home/oracle/gridenv
sh $LOGDIR/clean.sh > $LOGDIR/purge_old_dumpfiles-$DATESTAMP.log
# Delete Disk dumpfiles older than 3 days
echo "List of cleaned-up old dumpfile files" > $CLEANUPLOGS
find $DBBACKUP/export/testdb/ -name "*testdb*.dmp" -mtime +3 -exec ls -l {} \; >> $CLEANUPLOGS
find $DBBACKUP/export/testdb/ -name "*testdb*.dmp" -mtime +3 -exec rm -f {} \;
echo "Purge Old Dumpfiles...DONE"
# Delete export-related logfiles older than 3 days (Just Housekeeping)
echo "List of cleaned-up log files" >> $CLEANUPLOGS
find $LOGBASE/ -name "EXPORTJOB01.*" -mtime +3 -exec ls -l {} \; >> $CLEANUPLOGS
find $LOGBASE/ -name "EXPORTJOB01.*" -mtime +3 -exec rm -f {} \;
find $LOGDIR/ -name "*.log" -mtime +3 -exec ls -l {} \; >> $CLEANUPLOGS
find $LOGDIR/ -name "*.log" -mtime +3 -exec rm -f {} \;
find $LOGDIR/ -name "*out.*" -mtime +3 -exec ls -l {} \; >> $CLEANUPLOGS
find $LOGDIR/ -name "*out.*" -mtime +3 -exec rm -f {} \;
# Check for export errors in the execution log
if [ `/bin/egrep -c "error|ORA-|ASMCMD-8002" ${ExecutionLog}` -gt 0 ]
then
  RC=1
  JobStatusMsg="export database failed"
else
  RC=0
  JobStatusMsg="export database successful"
fi
echo ==== ${JobStatusMsg} on `date` ====
echo ""
exit $RC
=================================================================
The script can also be downloaded via this link: DataPump Script