We received a mail from a customer stating that:
- Heavy business activity today after 10 AM, and especially between 16:00 and 17:00 hrs, where we see 47 archive log switches and 100 GB of archive generated since 10 AM.
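As a quick sanity check, 100 GB across 47 log switches works out to roughly 2 GB per switch, which should line up with the size of the online redo logs. A minimal sketch against v$log to confirm the configured log size (standard columns; nothing here is specific to the customer's system):
-- Sanity check: 100 GB / 47 switches is roughly 2.1 GB per switch,
-- which should roughly match the online redo log size reported here.
col size_gb for 9999.99
select group#, thread#, bytes/1024/1024/1024 size_gb, members, status
from v$log;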
Using the below SQL query, you can check in which interval the most archive logs were generated in the database:
col day for a12
set lines 1000
set pages 999
col "00" for a3
col "01" for a3
col "02" for a3
col "03" for a3
col "04" for a3
col "05" for a3
col "06" for a3
col "07" for a3
col "08" for a3
col "09" for a3
col "10" for a3
col "11" for a3
col "12" for a3
col "13" for a3
col "14" for a3
col "15" for a3
col "16" for a3
col "17" for a3
col "18" for a3
col "19" for a3
col "20" for a3
col "21" for a3
col "22" for a3
col "23" for a3
SELECT
to_char(first_time,'DD-MON-YYYY') day,
to_char(sum(decode(to_char(first_time,'HH24'),'00',1,0)),'99') "00",
to_char(sum(decode(to_char(first_time,'HH24'),'01',1,0)),'99') "01",
to_char(sum(decode(to_char(first_time,'HH24'),'02',1,0)),'99') "02",
to_char(sum(decode(to_char(first_time,'HH24'),'03',1,0)),'99') "03",
to_char(sum(decode(to_char(first_time,'HH24'),'04',1,0)),'99') "04",
to_char(sum(decode(to_char(first_time,'HH24'),'05',1,0)),'99') "05",
to_char(sum(decode(to_char(first_time,'HH24'),'06',1,0)),'99') "06",
to_char(sum(decode(to_char(first_time,'HH24'),'07',1,0)),'99') "07",
to_char(sum(decode(to_char(first_time,'HH24'),'08',1,0)),'99') "08",
to_char(sum(decode(to_char(first_time,'HH24'),'09',1,0)),'99') "09",
to_char(sum(decode(to_char(first_time,'HH24'),'10',1,0)),'99') "10",
to_char(sum(decode(to_char(first_time,'HH24'),'11',1,0)),'99') "11",
to_char(sum(decode(to_char(first_time,'HH24'),'12',1,0)),'99') "12",
to_char(sum(decode(to_char(first_time,'HH24'),'13',1,0)),'99') "13",
to_char(sum(decode(to_char(first_time,'HH24'),'14',1,0)),'99') "14",
to_char(sum(decode(to_char(first_time,'HH24'),'15',1,0)),'99') "15",
to_char(sum(decode(to_char(first_time,'HH24'),'16',1,0)),'99') "16",
to_char(sum(decode(to_char(first_time,'HH24'),'17',1,0)),'99') "17",
to_char(sum(decode(to_char(first_time,'HH24'),'18',1,0)),'99') "18",
to_char(sum(decode(to_char(first_time,'HH24'),'19',1,0)),'99') "19",
to_char(sum(decode(to_char(first_time,'HH24'),'20',1,0)),'99') "20",
to_char(sum(decode(to_char(first_time,'HH24'),'21',1,0)),'99') "21",
to_char(sum(decode(to_char(first_time,'HH24'),'22',1,0)),'99') "22",
to_char(sum(decode(to_char(first_time,'HH24'),'23',1,0)),'99') "23",
count(*) Tot
from
v$log_history
WHERE first_time like '%17-NOV-14%'
GROUP by
to_char(first_time,'DD-MON-YYYY'),trunc(first_time) order by trunc(first_time);
As the customer said, 47 archive logs were generated at 4 PM, i.e. in the 16:00 to 17:00 interval.
Note: first_time like '%17-NOV-14%' works because the date format in my db is 17-NOV-14; adjust the literal to match your own NLS_DATE_FORMAT.
So our aim here is to get all the SQL statements (UPDATE, INSERT & DELETE) involved in the heavy redo generation between 16:00 and 17:00 hrs (i.e. within a particular one-hour interval).
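Before chasing individual statements, the customer's 100 GB figure can also be cross-checked directly. A hedged sketch that sums archived log volume per hour from v$archived_log (blocks * block_size gives each log's size in bytes; the date literal is only an example and assumes the same DD-MON-RR format as above):
-- Archive volume generated per hour on the day in question.
col gb for 9999.99
select to_char(first_time,'DD-MON-YYYY HH24') hr,
count(*) switches,
round(sum(blocks*block_size)/1024/1024/1024,2) gb
from v$archived_log
where trunc(first_time) = to_date('17-NOV-14','DD-MON-RR')
group by to_char(first_time,'DD-MON-YYYY HH24')
order by 1;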
One of our DBAs used the steps described in the post
http://oradba11g.blogspot.in/2014/05/sometimes-customer-would-ask-sql-query.html
and identified the two UPDATE statements that made the maximum number of block changes in the database at that time, and sent them to the customer.
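I have not reproduced the exact steps from that post here, but one common way to approach this (likely similar in spirit to the post; consult it for the real procedure) is to rank sessions by the 'db block changes' statistic and map each session to its current SQL. A minimal sketch:
-- Rank sessions by cumulative block changes (a rough proxy for redo).
-- Caveat: v$sesstat values are cumulative since session start, so this
-- is only indicative for a specific one-hour window.
select s.sid, s.username, s.sql_id, st.value block_changes
from v$sesstat st, v$statname n, v$session s
where st.statistic# = n.statistic#
and n.name = 'db block changes'
and st.sid = s.sid
and st.value > 0
order by st.value desc;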
Then we got another mail from the customer stating:
- Are we sure that these are the only two major contributors? Can you please check, because I spotted a few other heavy programs which were running but do not run this SQL.
- Can you please tell us the main reason for the archive generation between 4 and 5 PM today? We had 47 log switches in this hour.
set lines 1000
set pages 50
set long 50000
col c1 heading Begin_Interval_time format a25
col c2 heading Exec format 9999999
col SQL_TEXT for a100
break on c1
select s.begin_interval_time c1,sql.sql_id,
sql.PLAN_HASH_VALUE HASH_VALUE,
sql.executions_delta c2
from
dba_hist_sqlstat sql,
dba_hist_snapshot s,
v$sqltext st
where
s.snap_id=sql.snap_id and
sql.sql_id=st.sql_id and st.SQL_TEXT like 'DELETE%'
and
s.begin_interval_time like '%17-NOV-14 04.00.04.731 PM%' order by c2 desc;
Note:
- You can edit the columns & the begin_interval_time (i.e. the snap_id generation time) in the above query as per your requirement.
- You can change the filter to st.SQL_TEXT like 'UPDATE%' for spotting UPDATE queries.
- You can change the filter to st.SQL_TEXT like 'INSERT%' for spotting INSERT queries.
- User-executed SQL statements are usually stored in capital letters & database-generated SQL statements in small letters; hence I used the words UPDATE, DELETE & INSERT in capital letters. (A related caveat is sketched below.)
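One caveat with the query above: v$sqltext only shows statements that are still in the shared pool, so anything aged out by the time you check will be missed. A hedged variant using the AWR copy of the text, dba_hist_sqltext (the timestamp range is only an example; on RAC you would additionally join on instance_number):
-- Same idea, but using AWR-captured SQL text so statements that have
-- aged out of the shared pool are still visible.
select s.begin_interval_time c1, q.sql_id,
q.plan_hash_value hash_value,
q.executions_delta c2
from dba_hist_sqlstat q,
dba_hist_snapshot s,
dba_hist_sqltext t
where s.snap_id = q.snap_id
and q.sql_id = t.sql_id
and t.sql_text like 'DELETE%'
and s.begin_interval_time between timestamp '2014-11-17 16:00:00'
and timestamp '2014-11-17 17:00:00'
order by c2 desc;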
Here my guess is that the query executed the greatest number of times (i.e. with the highest execution count) is the biggest contributor to the heavy archive log generation.
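Execution count is only a rough proxy, though, because AWR does not record redo volume per statement. As an extra cross-check, dba_hist_seg_stat records block changes per segment between snapshots, which shows which tables and indexes were modified the most in the interval. A minimal sketch (&begin_snap and &end_snap are placeholders; look the snap_ids up in dba_hist_snapshot first):
-- Segments with the most block changes between two AWR snapshots,
-- a reasonable proxy for where the redo was generated.
select o.owner, o.object_name, o.object_type,
sum(st.db_block_changes_delta) block_changes
from dba_hist_seg_stat st, dba_objects o
where st.obj# = o.object_id
and st.snap_id between &begin_snap and &end_snap
group by o.owner, o.object_name, o.object_type
order by block_changes desc;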
Output for UPDATE:
Output for DELETE:
I then retrieved the full SQL statements for the sql_id values in the above output from the v$sqltext view (query given below):
set pages 50000
col SQL_TEXT for a75
select sql_text from v$sqltext where sql_id ='&sql_id' order by piece;
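If the cursor has already aged out of the shared pool by the time you check, the AWR copy of the text can be used instead, assuming the statement was captured in a snapshot:
-- Fallback when the statement is no longer in the shared pool.
select sql_text from dba_hist_sqltext where sql_id = '&sql_id';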