Sunday, December 30, 2018

DBA Bundle v4.8 | December 2018 Release

DBA Bundle V4.8 download:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0


What is new in this version:

oracle_cleanup.sh script:
- Verify that the log/trace file locations are valid before deleting log files.

dbalarm.sh script:
- Check for the existence of Restore Points when the FRA hits its threshold (a quick manual check along the same lines is sketched after this changelog).
- Avoid running the script if another session is already running it.
- Fixed a bug in the Monitoring Services section.

configuration_baseline.sh script:
- Included the routing table in the OS configuration baseline report.

Fixed minor bugs in dbdailychk.sh & asmdisks.sh scripts.
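As a side note, here is an illustrative manual check along the same lines as the new dbalarm behavior (this is not the bundle's exact query): when the FRA hits its usage threshold, guaranteed restore points are a common culprit, since they pin flashback logs inside the FRA.

-- FRA usage breakdown by file type:
SQL> select file_type, percent_space_used, percent_space_reclaimable, number_of_files
     from v$recovery_area_usage;

-- Existing restore points (guaranteed ones keep their space allocated):
SQL> select name, guarantee_flashback_database, storage_size, time
     from v$restore_point;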

In case you are new to the DBA Bundle, please visit this article, which explains the tool in detail:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Tuesday, October 23, 2018

DBA BUNDLE 4.7 released

New features of DBA Bundle 4.7

active_sessions.sh script:
- Show currently running RMAN backups, if any exist.
- Fixed a divide-by-zero bug when TOTALWORK=0 in v$session_longops.

dbalarm.sh script:
- Include OS network failure statistics (dropped packets) in the alert whenever TIMEOUT errors appear in the DB or Listener alert log.
- Added a new check for RMAN backup failures.
- Added a check for running RMAN backups when CPU utilization hits the threshold.

configuration_baseline.sh script:
- Enabled the reporting of NOLOGGING objects in the report.

To download DBA Bundle V4.7:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

This article explains the major features of the DBA BUNDLE:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Tuesday, October 16, 2018

Monitoring Failed RMAN Backup Jobs in dbalarm Shell Script

The dbalarm script is now capable of monitoring RMAN backup failures. Just make sure to set the following parameter (its value is in minutes) to the same execution interval that dbalarm.sh has in your crontab.

e.g. if you schedule the dbalarm.sh script to run every 5 minutes in the crontab, the following parameter should be set to 5:
LAST_MIN_BKP_CHK=5      # REPORT RMAN Backup FAILURE in the last N MINUTES. Should be similar to the crontab execution interval of dbalarm script. [DB]
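For illustration, a matching crontab schedule might look like the following (the script path here is just an example; adjust it to wherever you keep dbalarm.sh):

# Run dbalarm.sh every 5 minutes, matching LAST_MIN_BKP_CHK=5:
*/5 * * * * /home/oracle/dbalarm.sh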

By default, the dbalarm.sh script will monitor RMAN backup failures. If you want to change this behavior and disable this feature for any reason, just set the following parameter to N:
CHKRMANBKP=N           # Enable/Disable Checking of RMAN Backup FAILURE.      [DB]

To download the dbalarm script and read more about its features, please visit:
http://dba-tips.blogspot.ae/2014/02/database-monitoring-script-for-ora-and.html

Thanks to Rahul Malode who recommended adding this feature.

Wednesday, September 19, 2018

Bundle V4.6 Is Released

The new features of Bundle V4.6 are:

db_locks.sh script: Replaced the column used to check the wait time in seconds, from SECONDS_IN_WAIT to the more accurate WAIT_TIME_MICRO, for 11gR2 compatibility.

dbalarm.sh script:
- The BLOCKING SESSION time threshold has been changed so that the script monitors only blocking sessions where the blocking has lasted for more than 1 minute (an illustrative query is sketched after this changelog).
- From 11g onward, the BLOCKING SESSION time threshold is compared against the WAIT_TIME_MICRO column, which is more accurate.

sql_id_details.sh script: If the statement has already been analyzed by the SQL Tuning Advisor, the script will automatically show the tuning recommendations.
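For reference, an illustrative query in the spirit of the new blocking-session check might look like this (not the bundle's exact code, just a sketch using WAIT_TIME_MICRO, which exists in v$session from 11g onward):

SQL> select blocking_session, sid, serial#, event,
            round(wait_time_micro/1000000) wait_seconds
     from   gv$session
     where  blocking_session is not null
     and    wait_time_micro > 60 * 1000000   -- waiting for more than 1 minute
     order  by blocking_session;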

To download DBA Bundle V4.6:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

To visit the main article that explains all the bundle's features:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Monday, August 27, 2018

DBA Bundle Version 4.5 Is Now Available

DBA Bundle version 4.5 is now available for download.

Enhancements:
- Enhanced the display of the execution plan when running sql_id_details.sh via the sqlid alias.
- Added a report of UNRECOVERABLE DATABASE FILES that don't have a valid backup to the Daily Health Check report [dbdailychk.sh script] (an illustrative manual check is sketched below).
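For reference, unrecoverable datafiles can also be checked manually, either with RMAN's REPORT UNRECOVERABLE command or with a dictionary query along these lines (illustrative only, not necessarily the bundle's query):

RMAN> REPORT UNRECOVERABLE;

SQL> select d.file#, d.name, d.unrecoverable_time
     from   v$datafile d
     where  d.unrecoverable_time is not null
     and    d.unrecoverable_time >
            (select nvl(max(b.completion_time), d.unrecoverable_time - 1)
             from   v$backup_datafile b
             where  b.file# = d.file#);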

To download DBA Bundle version 4.5:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

To read the full article:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Tuesday, July 24, 2018

DBA Bundle 4.4

DBA Bundle version 4.4 is released. It comes with the following:

Bug fixes:
- Fixed small bugs in the following scripts:
configuration_baseline.sh
schedule_rman_image_copy_bkp.sh
schedule_rman_full_bkp.sh
dbalarm.sh

Enhancements:
- Added a new parameter to dbdailychk_html.sh & dbdailychk.sh to control the display of LAST MODIFIED OBJECTS in the Health Check report.

To download DBA Bundle v4.4:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

To read the full article about the bundle features and how to use it:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Saturday, July 7, 2018

Auditing Using LogMiner

Scenario: Suppose you want to investigate a recent incident on a database where auditing is NOT enabled, or it is enabled but did not cover the activities you are looking for. In this case, you still have the ability to dig up those activities using the LogMiner feature.

LogMiner is a free feature (no license required) available in both Standard and Enterprise Editions. It can analyze the contents of archived log files and display (SYS, non-SYS, COMMITTED, NON-COMMITTED) transactions that happened during a specific period of time. It can also be used to recover from logical corruption of one or more transactions, as it also provides the reverse (UNDO) statement of each transaction (similar to Flashback Transaction Query).

I'll start by building the test scenario: a new user will be created, and as that user I'll execute some DDLs and DMLs along with invalid statements, to see which statements LogMiner can fetch for us.

-- Login as SYSDBA and Create a new user "AA" for testing: <let's see if this CREATE USER statement by SYSDBA will get captured>
06-Jul-2018 15:16:15 SYS@orcl> create user AA identified by "aa#12345";
06-Jul-2018 15:16:30 SYS@orcl> grant connect,resource to aa;

-- Connect to the new user:
06-Jul-2018 15:16:37 SYS@orcl> conn aa/aa#12345
Connected.

06-JUL-18 AA@orcl> create table a (a number);
Table created.

-- execute a wrong insert statement: <let's see if wrong statements are getting captured>
06-JUL-18 AA@orcl> insert into a('1');
insert into a('1')
              *
ERROR at line 1:
ORA-00928: missing SELECT keyword

-- execute a committed insert transaction:
06-JUL-18 AA@orcl> insert into a values (2);
1 row created.

06-JUL-18 AA@orcl> commit;
Commit complete.

-- execute a committed delete transaction:
06-JUL-18 AA@orcl> delete from a;
1 row deleted.

06-JUL-18 AA@orcl> commit;
Commit complete.

-- Execute a committed update transaction that updates nothing:
06-JUL-18 AA@orcl> update a set a=5 where a=1;
0 rows updated.

-- Execute a truncate table statement:
06-JUL-18 AA@orcl> truncate table a;
Table truncated.

-- Create a procedure with compilation errors:
06-JUL-18 AA@orcl> create or replace procedure p as select * from a;
  2  /
Warning: Procedure created with compilation errors.

-- Execute a non-committed insert statement:
06-JUL-18 AA@orcl> insert into a values (3);
1 row created.

06-JUL-18 AA@orcl> rollback;
Rollback complete.

-- Now I'm done let's check the time at the end of this demo:
06-JUL-18 AA@orcl> !date
Fri Jul  6 15:34:51 GST 2018

And now let's use LogMiner to reveal the details of the transactions that happened during that demo by user AA.

Step 1: Turn on database-level Supplemental Logging:

-- First, check whether Supplemental Logging is already enabled. If supplemental_log_data_min shows "YES", it is already ENABLED; in that case, do NOT run the last step that deactivates it:

SQL> SELECT supplemental_log_data_min FROM v$database;
SUPPLEME
--------
NO

-- supplemental_log_data_min shows "NO", so we have to enable database-level Supplemental Logging:
SQL> ALTER DATABASE ADD SUPPLEMENTAL LOG DATA;

Step 2: Provide the date & time range of the transactions you want to view; try to keep the range as short as possible:

SQL> begin
dbms_logmnr.start_logmnr(
starttime => '06-Jul-2018 15:16:00',
endtime => '06-Jul-2018 15:36:00',
options => dbms_logmnr.dict_from_online_catalog + dbms_logmnr.continuous_mine + dbms_logmnr.print_pretty_sql + dbms_logmnr.committed_data_only
);
end;
/

Note: Specifying the "dbms_logmnr.committed_data_only" option means only committed transactions are returned. If you are interested in fetching rolled-back transactions as well, just remove "dbms_logmnr.committed_data_only" from the list of options.


Step 3: Query V$LOGMNR_CONTENTS to get the information you need:

SQL>    set linesize 170 pages 100
col TIMESTAMP for a20
col sql_redo for a40
col sql_undo for a40
col username for a20
col os_username for a20
col MACHINE_NAME for a20
select xid,to_char(timestamp,'DD-MON-YY HH24:MI:SS') TIMESTAMP,
username,OS_USERNAME,MACHINE_NAME,sql_redo,sql_undo,ROLLBACK from v$logmnr_contents
where SEG_OWNER='AA';

=> sql_redo: the statement itself.
=> sql_undo: the reverse statement that can be used to roll back the original statement.
-- You can also search by segment name "SEG_NAME='A'" or by operation "operation in ('INSERT','UPDATE','DELETE')"

XID TIMESTAMP       USERNAME  OS_USERNAME  MACHINE_NAME SQL_REDO
---------------- -------------------- --------- ------------ ------------ ----------------------------------------
SQL_UNDO
----------------------------------------
09001900E39D0000 06-JUL-18 15:16:30   UNKNOWN  UNKNOWN UNKNOWN create user aa identified by  VALUES 'S:
D4D17D1076EC7C6F5B1D68FBCB244E8789691A14
6F5209DF2A51AC579007;E3EE1448DE26764F' ;


08001E002E830000 06-JUL-18 15:16:57   UNKNOWN  UNKNOWN UNKNOWN create table a (a number);


0A002100A8510600 06-JUL-18 15:17:34   UNKNOWN  UNKNOWN UNKNOWN insert into "AA"."A"
  values
"A" = 2;
delete from "AA"."A"
 where
    "A" = 2 and
    ROWID = 'AAAjrvAAEAAAACvAAA';

09000600E09D0000 06-JUL-18 15:17:43   UNKNOWN  UNKNOWN UNKNOWN delete from "AA"."A"
  where
"A" = 2 and
ROWID = 'AAAjrvAAEAAAACvAAA';
insert into "AA"."A"
 values
    "A" = 2;

05000D00AC800000 06-JUL-18 15:18:21   UNKNOWN  UNKNOWN UNKNOWN truncate table a;


0A0007004D510600 06-JUL-18 15:18:45   UNKNOWN  UNKNOWN UNKNOWN create or replace procedure p as select
* from a;;
6 rows selected.

As you can see, LogMiner fetches all kinds of statements that made changes to the database, including DMLs and DDLs, along with statements executed with the SYSDBA privilege.
Statements that don't make any change to the database, such as SELECTs, failed SQL statements, and DML statements that didn't actually change any data, will not be shown by LogMiner.

Once you are done:

- End the LogMiner session and flush the contents of v$logmnr_contents:
SQL> execute dbms_logmnr.end_logmnr();
SQL> select * from v$logmnr_contents;
No Rows should be returned.

- Deactivate database-level Supplemental Logging:
If SUPPLEMENTAL LOG DATA was already enabled before step 1, do not execute this step; replication software like Oracle Streams and GoldenGate depends on SUPPLEMENTAL LOG DATA in order to perform replication.

SQL> ALTER DATABASE drop SUPPLEMENTAL LOG DATA;

SQL> SELECT supplemental_log_data_min,supplemental_log_data_pk,supplemental_log_data_ui,supplemental_log_data_fk,supplemental_log_data_all FROM v$database;

LogMiner cannot work as an alternative to auditing: it can only fetch the changes that happened to the database. Unlike auditing, it cannot capture details such as SELECT statements and login information, because such details are not available in the archived logs to be mined.
Moreover, LogMiner can only analyze the archived logs that are available; it cannot go further back in the past than those logs, unless you restore older archived logs from backup media and feed them into LogMiner (a sketch of doing that is shown below).
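For example, if you have restored older archived logs to disk, you could register them with LogMiner explicitly instead of relying on CONTINUOUS_MINE (the file names below are just placeholders):

SQL> exec dbms_logmnr.add_logfile(logfilename => '/backup/arch/1_1234_987654321.arc', options => dbms_logmnr.new);
SQL> exec dbms_logmnr.add_logfile(logfilename => '/backup/arch/1_1235_987654321.arc', options => dbms_logmnr.addfile);
SQL> exec dbms_logmnr.start_logmnr(options => dbms_logmnr.dict_from_online_catalog + dbms_logmnr.print_pretty_sql);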

To sum up, LogMiner is mainly used by DBAs to recover from logical data corruption, but it can also be used as a forensic tool.

References:
https://oracle-base.com/articles/8i/logminer
https://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:433619656232

Thursday, May 17, 2018

DBA Bundle v4.3 Is Now Released

Version 4.3 is now available. It's packed with the following features:

New Scripts:
goldengate_lag_mon.sh: For monitoring GoldenGate lag; more details: http://dba-tips.blogspot.ae/2018/05/linux-shell-script-to-monitor-and.html

New Enhancements:
- The gather_stats.sh script for gathering statistics can now run in the background (nohup mode).
- A Paranoid mode feature has been added to the dbalarm.sh script for reporting EXPORT/IMPORT, ALTER SYSTEM, ALTER DATABASE, instance STARTUP/SHUTDOWN, and other major DB activities. For more details on this feature: http://dba-tips.blogspot.ae/2018/05/dbalarm-script-for-monitoring-oracle-db.html
- The oracle_cleanup.sh script for cleaning up database & listener logs now gives the user the option to archive audit log files as well.
- Added more parameters to dbdailychk.sh & dbdailychk_html.sh (HTML version) to let the user have more control over the report contents (e.g. including or excluding SQL Tuning Advisor, Memory Advisor, Segment Advisor, job details, and credential-change results from the health check report).

To download DBA Bundle v4.3:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

To read the full article about the bundle features and how to use it:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Monday, May 14, 2018

Linux Shell Script To Monitor and Report GoldenGate Lag

This script monitors GoldenGate lag and reports it whenever it occurs, based on the predefined LAG threshold inside the script.
It's highly recommended to deploy this script on all (source & destination) replication servers in order to detect lag on all processes (Extract, Pump, and Replicat).

This script is not designed to monitor the replicated data inside the tables; it relies entirely on the native GoldenGate GGSCI console.

This script should be executed/scheduled by the OS user that owns the GoldenGate installation.

How it works:

First, Download the script:
https://www.dropbox.com/s/l4dqzicviuaawt6/goldengate_lag_mon.sh?dl=0

Second, Adjust the following parameters:

MAIL_LIST="youremail@yourcompany.com"
Replace the "youremail@yourcompany.com" placeholder with your e-mail address.


# ###########################################
# Mandatory Parameters To Be Set By The User:
# ###########################################
ORACLE_HOME= # ORACLE_HOME path of the database where GoldenGate is running against.
GG_HOME=           # GoldenGate Installation Home path. e.g. GG_HOME=/goldengate/gghome

Please note that ORACLE_HOME & GG_HOME must be set by YOU. If you miss setting them, the script will automatically try to guess the right values, but this will not be accurate most of the time.


# ################
# Script Settings:
# ################
# LAG THRESHOLD in minutes: [If reached an e-mail alert will be sent. Default 10 minutes]
LAG_IN_MINUTES=10

Here you define the LAG threshold in minutes (10 minutes by default); if the lag reaches this threshold, the script will send you an e-mail.


# Excluded Specific PROCESSES NAME:
# e.g. If you want to exclude two replicate processes with names REP_11 and REP_12 from being reported then add them to below parameter as shown:
# EXL_PROC_NAME="DONOTREMOVE|REP_11|REP_12"
EXL_PROC_NAME="DONOTREMOVE"

In case you want to exclude specific (Extract, Pump, or Replicat) processes, say a process you use only for testing the replication, you can add it to the above parameter as shown in the commented example. A rough sketch of the underlying lag check is shown below.
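For the curious, the core idea can be sketched roughly as follows (this is NOT the actual goldengate_lag_mon.sh code, just an illustration of parsing the GGSCI "info all" output, assuming GG_HOME is already set):

# Ask GGSCI for the status of all processes, then flag any "Lag at Chkpt" above the threshold:
LAG_IN_MINUTES=10
GG_OUT=$(echo "info all" | "$GG_HOME"/ggsci)

echo "$GG_OUT" | awk -v max="$LAG_IN_MINUTES" '
  /^(EXTRACT|REPLICAT)/ {
    split($4, t, ":")                      # "Lag at Chkpt" column, format HH:MM:SS
    lag_min = (t[1] * 60) + t[2]
    if (lag_min >= max) print $3 " lag is " $4
  }'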

DISCLAIMER: THIS SCRIPT IS DISTRIBUTED IN THE HOPE THAT IT WILL BE USEFUL BUT WITHOUT ANY WARRANTY. IT IS PROVIDED "AS IS".

GitHub version:

Monday, May 7, 2018

dbalarm Script for Monitoring Oracle DB Server is Now Packed With New Features

I've introduced a new mode called "Paranoid" mode to the dbalarm script (which monitors the Oracle database and listener logs along with locks, space, and other events). When paranoid mode is enabled, the script will monitor the following additional activities/events on the Oracle database:

- EXPORT/IMPORT activities whenever they start.
- ALTER SYSTEM commands.
- ALTER DATABASE commands.
- Instance STARTUP/SHUTDOWN events.
- Other Major DB activities.

This mode is disabled by default to keep the notifications focused on reporting errors. In order to enable this mode, you have to change the PARANOIDMODE parameter from N to Y:

# #########################
# THRESHOLDS:
# #########################
# Modify the THRESHOLDS to the value you prefer:
...

PARANOIDMODE=Y    # Paranoid mode will report more events like export/import, instance shutdown/startup. [Default Disabled]

To download and read the full list of features of dbalarm script, please visit:
http://dba-tips.blogspot.ae/2014/02/database-monitoring-script-for-ora-and.html

Thursday, May 3, 2018

Gather Statistics Shell Script Can Now Run In nohup Mode | Oracle On Linux

I've re-coded the gather statistics shell script for Oracle to run in nohup mode by default, thereby allowing the DBA to close the session without interrupting the gather-statistics operation. (A rough sketch of this pattern is shown below.)
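For anyone curious about the mechanism, a common shell pattern for making a script re-launch itself under nohup looks like the sketch below (an illustration only, not necessarily how gather_stats.sh implements it; the log path and variable name are made up):

#!/bin/bash
# If not already running under nohup, re-launch this script in the background and exit:
if [ -z "$RELAUNCHED_UNDER_NOHUP" ]; then
    export RELAUNCHED_UNDER_NOHUP=1
    nohup "$0" "$@" > /tmp/gather_stats_$$.log 2>&1 &
    echo "Gathering statistics in the background, log: /tmp/gather_stats_$$.log"
    exit 0
fi

# ... the actual statistics gathering work would continue below this point ...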

Script download link:
https://www.dropbox.com/s/fku4hf082xah1bm/gather_stats.sh?dl=0

For more details on how to use the script, please visit the original post:
http://dba-tips.blogspot.com/2014/09/script-to-ease-gathering-statistics-on.html

And here is the GitHub version:

Saturday, April 21, 2018

Script To Automatically Kill Long Running Sessions Connecting From a Specific Program | Oracle On Linux

In the previous post, I shared a script that reports long-running sessions/queries. In this post, I'll explain another simple script that kills long-running sessions connecting from specific programs, e.g. SQL Developer.

The script can be tuned to target any specific program, but I default it to target SQL Developer, as it's widely used by developers; moreover, it doesn't disconnect the running session properly when the user closes the session's window, leaving the "supposed to be closed" session's query running in the database!

Download link:
https://www.dropbox.com/s/ke01v10bu0bat84/kill_long_running_queries.sh?dl=0

How to use:

The script is designed to be scheduled in the crontab to run every, say, 5 minutes and kill specific sessions based on the killing criteria, which you define by editing the following parameters:

export ORACLE_SID=orcl
Provide the name of the instance you want to run the script against, e.g. orcl.

MAIL_LIST="youremail@yourcompany.com"
Provide your email to get a detailed report after each successful session kill.

export MODULE_NAME="'SQL Developer'"
Provide the module name (as it appears in v$session) whose sessions you are targeting. To provide more than one module, enclose each module name in single quotes and separate them with commas.
e.g. To include SQL Developer and Toad sessions in the killing criteria, provide the values to the MODULE_NAME parameter like the following:
export MODULE_NAME="'SQL Developer','Toad'"

To get an idea of the module names connecting to your database, run this statement:
select distinct module from gv$active_session_history;

export DURATION="2.5"
The DURATION parameter is the session's elapsed active time in hours; once a session reaches it, the session becomes a candidate for killing. It is set to 2.5 hours by default; of course, you have to change it to whatever fits your requirements.

export REPORT_ONLY="N"
If this parameter is set to "N", the candidate sessions are actually killed; if it's set to "Y", no kill is performed and the candidate sessions are ONLY reported to your e-mail. This parameter is helpful during the testing phase: first test the script with this parameter set to "Y" so it keeps reporting the candidate sessions (to be killed) without killing them; once you're confident in the script's accuracy, set it to "N" to start killing the candidate sessions. (An illustrative sketch of this kind of selection logic is shown below.)
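To give an idea of the kind of selection logic involved, here is an illustrative sketch (NOT the actual kill_long_running_queries.sh code): it picks the sessions of the targeted modules that have been active longer than the DURATION threshold, and each returned session would then be killed with ALTER SYSTEM KILL SESSION.

SQL> select sid, serial#, username, module,
            round(last_call_et/3600, 2) active_hours
     from   v$session
     where  status = 'ACTIVE'
     and    type   = 'USER'
     and    module in ('SQL Developer')
     and    last_call_et > 2.5 * 3600;

-- e.g. alter system kill session '<sid>,<serial#>' immediate;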

Ultimately, although I'm not keen on sharing scripts that may cause problems if they are not accurately configured, I'm sharing this one in the hope that you will use it wisely; to do so, you have to test, test, and test such scripts before you deploy them on production.
By using this script you acknowledge that you use it at your own risk.

Github version:



Shell Script To Monitor Long Running Queries In Oracle

Monitoring long-running queries is part of an early-warning mechanism that catches database performance issues before they escalate.


If you're running Oracle on Linux, then this script is for you.
Download link:
https://www.dropbox.com/s/yh6k3r7q3p876h0/report_long_runing_queries.sh?dl=0

The script is easy to use: it runs against all instances on the server and reports any session/query that has been active for more than 60 minutes (the default). You can adjust this duration by changing the following parameter under the THRESHOLDS section to whatever number of minutes you want:

EXEC_TIME_IN_MINUTES=60        # Report Sessions running longer than N minutes (default is 60 min).

Also, you can make the report generate only when the number of long active sessions has reached a specific count, by adjusting the following parameter:

LONG_RUN_SESS_COUNT=0        # The count of long-running sessions, report will not appear unless reached (0 report all long running sessions).

Lastly, you must provide your e-mail address in order to receive a detailed e-mail report of long-running queries/active sessions, by editing the template below:

MAIL_LIST="youremail@yourcompany.com"

You can schedule the script in the crontab to run every 5/10 minutes.
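For illustration, a crontab entry could look like the following (the script path is just an example):

# Run the long-running queries check every 10 minutes:
*/10 * * * * /home/oracle/report_long_runing_queries.sh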

In the upcoming post, I'll share another script to automatically kill long-running active sessions connected from specific programs like SQL Developer or Toad. Let's make the life of our beloved developers easy, as they always make our life easy :-)

Monitor Long Running Queries | Long Active Sessions Script (GitHub version):

Thursday, April 5, 2018

DBA Bundle V4.2 Is Now Available

Fixed reported bugs in the following scripts: dbdailychk.sh, dbalarm.sh, export_data.sh, report_long_runing_queries.sh.

New Features:
Added a GoldenGate log monitoring feature to the dbalarm.sh script [disabled by default].
Added new aliases:
 - raclog to open the alert logfile for Clusterware/Oracle Restart.
 - asmlert to open the alert logfile for ASM instance.

To download DBA Bundle V4.2:
https://www.dropbox.com/s/xn0tf2pfeq04koi/DBA_BUNDLE4.tar?dl=0

To visit the main article that explains all bundle features in detail:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html

Wednesday, February 28, 2018

DBA Bundle V4.1 Released

DBA Bundle V4.1 is now released with the following new scripts:

backup_ctrl_spf_AWR.sh: Backs up the controlfile (as a trace file and an RMAN backup), backs up the SPFILE, and generates an AWR report covering the full day. This script can be scheduled in the crontab to run once a day. Script options/variables must be modified to match your environment. [New starting from V4.1]

kill_long_running_queries.sh: Kills all queries running longer than 2.5 hours (can be customized inside the script) by specific modules (to be specified inside the script). This script can be scheduled in the crontab. Script options/variables MUST be modified so the killing criteria match your requirements. [New starting from V4.1]

check_standby_lag.sh: If you have a standby DB, you can use this script on the primary DB site to report any LAG that occurs between the primary and the standby DB. The variables section at the top of the script must be populated by you to match your environment, or the script will not be able to run. This script can be scheduled in the crontab to run every 5 minutes. [New starting from V4.1]
This link will show you how to use this script:
http://dba-tips.blogspot.ae/2017/11/shell-script-to-check-lag-sync-status.html

Many scripts are packed with new features, including:

RMAN_full.sh
dbalarm.sh
dbdailychk.sh
schedule_rman_full_bkp.sh
active_sessions.sh
sql_id_details.sh



SQL*Plus is now more DBA-friendly when invoked using the "sql" alias:
It will display the following info at login:
 - Fundamental info about the DB you've connected to.
 - Your current session details (OSPID, SID, SERIAL#).
 - The prompt will show the time, the instance name you're connected to, and the connected user (a rough sketch of a similar setup is shown after this list).
   This will help you track your session log accurately and minimize human mistakes.
 - The format of common columns in DBA_* & V$ views will be fitted to your display.
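For illustration, a similar prompt/session-info setup can be sketched in a login.sql / glogin.sql like the one below (this is NOT the bundle's actual code, just a rough idea using SQL*Plus predefined variables):

-- Show date, connected user, and instance in the prompt:
set sqlprompt "_DATE _USER'@'_CONNECT_IDENTIFIER> "

-- Show the current session's details at login:
select s.sid, s.serial#,
       (select p.spid from v$process p where p.addr = s.paddr) ospid
from   v$session s
where  s.sid = sys_context('USERENV', 'SID');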

To download DBA Bundle V4.1: [Feb 2018]
https://www.dropbox.com/s/k96rl0f4g39ukih/DBA_BUNDLE5.tar?dl=0

To visit the main article that explains all bundle features in detail, and to download the latest updated DBA Bundle version:
http://dba-tips.blogspot.com/2014/02/oracle-database-administration-scripts.html