Oracle Scratchpad

November 29, 2006

Analysing Statspack (1)

Filed under: Performance,Statspack,Troubleshooting — Jonathan Lewis @ 10:54 pm GMT Nov 29,2006

[Further Reading on Statspack]

“How do you interpret Statspack data?”

If you go for a generic answer, it’s likely to be a long answer – but the first step is always the classic “Tom Kyte turnaround” … Why do you want to interpret Statspack data?

As “a starter for ten” [Ed: a reference that may be familiar only to UK viewers of the quiz show University Challenge], I’m just going to float one thought: there are three reasons for looking at Statspack data, and the reason dictates both the approach and the significance you attach to the data.

Reason 1 is simply to observe patterns and trends over time. Statspack allows you to collect data at regular intervals and, with a little careful programming, you can run trend lines through the data to answer questions like “Has the random disk read time been getting worse over the last six months?” or “Has the number of hourly executions of query X been increasing over the last 8 weeks?”
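As a sketch of what that programming might look like (the stats$ tables below are standard Statspack repository tables, but the PERFSTAT schema name and the choice of statistic are assumptions – check your own installation):

```sql
-- Hedged sketch: per-interval growth of one system statistic across
-- Statspack snapshots, suitable for feeding into a trend line.
-- Note: counters reset at instance restart, so deltas can go negative
-- across a bounce and should be filtered out.
select s.snap_time,
       st.value - lag(st.value) over (order by s.snap_id) as delta_value
from   perfstat.stats$snapshot s
       join perfstat.stats$sysstat st
            on  st.snap_id         = s.snap_id
            and st.dbid            = s.dbid
            and st.instance_number = s.instance_number
where  st.name = 'physical reads'   -- hypothetical choice of statistic
order by
       s.snap_id;
```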

Reason 2 is to answer simple questions like: “Why was the system surprisingly slow at 10:00 am this morning?” The strategy is simple – dump the Statspack report ($ORACLE_HOME/rdbms/admin/spreport.sql) or AWR report ($ORACLE_HOME/rdbms/admin/awrrpt.sql) around the problem time this morning, and do the same for the same time-period yesterday and the previous week.  Simply eyeballing the reports looking for the biggest changes is often enough to identify the critical issue.
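For reference, a minimal SQL*Plus session for producing the reports – each script prompts for the begin/end snapshot ids and a report file name (“?” is the SQL*Plus shorthand for $ORACLE_HOME):

```sql
-- Run from SQL*Plus as a suitably privileged user.
@?/rdbms/admin/spreport    -- Statspack report (prompts for snapshot range)
@?/rdbms/admin/awrrpt      -- AWR report (requires the Diagnostics Pack licence)
```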

Reason 3 is to address the standard complaint: “the system is slow”.  If that’s really the case – no one task is a terrible issue but response times always seem to be unreasonably slow – then a quick analysis of a Statspack report may give you a clue about how to use your time most cost-effectively to make a difference. The trick, of course, is to identify and cross-reference the information that highlights the two or three critical failings that are worth addressing.  I’ll be discussing this, and the difference between ‘analysing the data’ and ‘picking random numbers’, in a future article.

[Further Reading on Statspack]

Footnote: Any advice about reading Statspack reports is almost always relevant when reading AWR reports (and vice versa).

22 Comments »

  1. I find Statspack useful for reasons 1 and 2 regularly and I think it’s unfairly criticised at times. The instance-wide scope does make it difficult to prove where the problem lies and how much performance improvement can be expected when it’s eliminated. However, I think it’s a practical tool for answering ‘why the hell is this suddenly going slow’ questions which crop up all the time in production support. Comparing two reports can narrow the possibilities very quickly. It’s particularly useful for the ‘why the hell did this suddenly go slow yesterday, but it’s alright now’ questions. I don’t see how tracing could ever help you with that unless you have it switched on all the time – or you’re using ASH/AWR.

    For the relatively low resource cost I think it’s an indispensable tool to be used in conjunction with other tools.

    I’m going to post the next few good examples I come across. In fact I think I might still have one tucked away in my email at work, but they come along every couple of weeks anyway.

    Comment by Doug Burns — November 29, 2006 @ 11:20 pm GMT Nov 29,2006 | Reply

  2. Just an observation…
    Reason 1 – user doesn’t need a lot of Oracle knowledge.
    Statspack usage for reasons 2 and 3 (and possibly other uses) usually does require – be it good or bad – a certain level of Oracle skill.

    Comment by Alex Gorbachev — November 30, 2006 @ 12:03 am GMT Nov 30,2006 | Reply

  3. Alex makes a good observation. Statspack enables users with relatively little knowledge to at least collect some information, although it may be very general. Can that user then pick out the relevant information? That’s one of the flaws of Statspack in my opinion – wayyyy too much chaff.

    It can (often? – sometimes?) demonstrate where the issue is, although in my experience rarely why it’s occurring, at least not without more in-depth analysis.

    I treat it as just another tool; it has its uses but it’s not going to solve everything.

    Comment by Graham Oakes — November 30, 2006 @ 11:24 am GMT Nov 30,2006 | Reply

  4. I guess this is why Oracle improved performance tuning so much in 10g, by collecting information in the time model, metrics, ASH and AWR.

    Jaffar

    Comment by Syed Jaffar Hussain — November 30, 2006 @ 2:54 pm GMT Nov 30,2006 | Reply

  5. Graham, Jaffar: Spot on, although “chaff” is a bit unfair. Most of the data will be useful to someone at some time. However, most of the time one individual only needs a little bit of the data – and the difficulty is identifying which little bit.

    This is why ADDM came along – it takes the huge wealth of detail from the AWR report and picks out the very few critical points that are most relevant at that moment. Any attempt to create a Statspack Analyzer has to do the same – which is probably why the (freeware) world hasn’t really moved on since Anjo Kolk created the Oraperf site a few years ago.

    At some time in the future, I will be following up on all three reasons for using statspack including how, despite the general correctness of Alex’ comment, the more you know about Oracle the more you can get out of option 1.

    Comment by Jonathan Lewis — November 30, 2006 @ 8:57 pm GMT Nov 30,2006 | Reply

  6. Jonathan,

    A quick question. Oracle 10g comes with the very rich functionality of ADDM, so why is Statspack still there, and why has Oracle made only small improvements to Statspack in 10g?

    It would be good if you could also discuss the changes made in Oracle 10g Statspack.

    Jaffar

    Comment by Syed Jaffar Hussain — December 1, 2006 @ 10:39 am GMT Dec 1,2006 | Reply

  7. Jaffar, one thing to spot right away in 10g Statspack – it has wait-event histograms. They can sometimes be really useful for getting closer to the bottom of a problem; indirectly, though.

    Jonathan, indeed, you are right on reason 1. I should have been more precise: trending is a no-brainer, but analysis of the results does require Oracle knowledge – and the more you have the better. There might be a zillion reasons why a certain metric (or set of metrics) behaves in a particular way, and that’s where knowledge and experience kick in. Anyway, this is already more in the area of the 2nd and 3rd applications.

    Comment by Alex Gorbachev — December 2, 2006 @ 4:38 pm GMT Dec 2,2006 | Reply

  8. Alex,

    The other, and valid, reason could be that Statspack is a free utility, unlike ADDM, for which you pay an extra licensing fee.

    You are right. I have seen people who simply collect Statspack data but don’t know where or how to start. I guess this could be due to the fact that they don’t know what they want to achieve.

    I have come across a few crazy things in one of the busiest databases at my previous organization, where we couldn’t find any clue.

    Jaffar

    Comment by Jaffar — December 3, 2006 @ 5:59 am GMT Dec 3,2006 | Reply

  9. Jaffar,
    Statspack has less value in complex environments due to its “aggregational” nature. The more you aggregate the less you see.
    If companies don’t have experienced Oracle experts they might still want to run Statspack so that they have at least something to offer when they get into trouble and call for external help. ;-)
    On the other hand, they would probably want to call that same help-line to set up Statspack collection properly, at least.

    Comment by Alex Gorbachev — December 3, 2006 @ 11:37 pm GMT Dec 3,2006 | Reply

  10. Jaffar, I think the 10g enhancements to Statspack are probably due to the non-stop, parallel development that goes on at Oracle. The most recent changes to Statspack may have happened before AWR went live, notwithstanding the fact that AWR could be expected to make Statspack redundant.
    I suspect that there will be no further enhancement to statspack – although there may be a fix for its failure to collect the filter_predicates and access_predicates from v$sql_plan.

    Comment by Jonathan Lewis — December 4, 2006 @ 5:51 pm GMT Dec 4,2006 | Reply

  11. Can you help me, please, with writing a SQL script to generate a daily Statspack report?
    (from 0h00 to 23h45)
    Thank you in advance

    tutu

    Comment by tutu — March 10, 2009 @ 2:52 pm GMT Mar 10,2009 | Reply

  12. Mr. Lewis,

    This is a serious comment/question to which I very much hope you will reply with a serious answer.

    I have been collecting Statspack stats hourly at level 7 for about a year. I would like to keep about 3 years of these stats for analyzing trends, comparing historical data, etc. Currently I have about 2 years of Statspack data (the earlier data was level 5, and possibly not hourly).

    Do you think this is excessive and/or a bad idea and/or a waste of time?

    I would very much appreciate your comments on saving extensive statspack data for a given database.

    Thank you,

    Robert Wood.

    Comment by Robert Wood — March 14, 2009 @ 1:40 am GMT Mar 14,2009 | Reply

    • Robert,

      I can’t imagine why you would think I would assume your question wasn’t serious, or why I wouldn’t try to supply a serious answer.

      My answer, though, won’t sound entirely serious because it is the answer made famous by Douglas Adams: 42.
      (See http://jonathanlewis.wordpress.com/2008/02/28/42/ for the serious justification).

      Nigel has made some relevant comments, though, which I agree with. There is a point at which the data becomes so old that there is no point in holding it for comparison purposes. You might want to keep 13 months, for example, so that you could compare this Christmas with last Christmas – but you also have to worry about how you might handle creating reports across instance restarts.

      In principle, so long as you remember to include the full, indexed, keys of the tables in your queries, you probably can’t do much harm (in terms of performance) when querying the tables – but I suppose you might still decide to limit the damage by reducing history to one or two snapshots per day and then compacting the tables at regular intervals.

      Another alternative is to generate and archive one report per day once the data is over (say) 42 days and then delete the data.
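      A sketch of that daily-report approach (the define-variable trick is a standard way of running spreport.sql non-interactively; the schema/password and the snapshot-selection query are assumptions, not part of this discussion):

      ```sql
      -- Hypothetical daily archiving script: pick yesterday's first and
      -- last snapshots, then run spreport non-interactively via defines.
      connect perfstat/perfstat

      column bsnap new_value begin_snap
      column esnap new_value end_snap
      select min(snap_id) bsnap, max(snap_id) esnap
      from   stats$snapshot
      where  snap_time >= trunc(sysdate) - 1
      and    snap_time <  trunc(sysdate);

      define report_name = sp_&begin_snap._&end_snap..lst
      @?/rdbms/admin/spreport
      ```

      A call to sppurge (or a delete based on the same snap_id range) could then remove the snapshots once the report is archived.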

      Comment by Jonathan Lewis — March 22, 2009 @ 12:57 am GMT Mar 22,2009 | Reply

  13. Hi Robert – it used to be the case that DBAs were supposed to ignore Statspack reports that spanned instance bounces. But if your system is stable, no patches have been applied, and so on, then I suppose there may be some high-level stats you could make use of for pie-chart sort of “quick glance” reports. The detail, though, is bound to become less and less relevant for the older snapshots: data has been added/removed; fragmentation may have taken place; parameters changed, etc.

    Comment by Nigel — March 17, 2009 @ 4:47 pm GMT Mar 17,2009 | Reply

  14. Hi Nigel,

    Yes. That is the sort of thing I was thinking of.
    It is taking up a lot of space, but I am hesitant to get rid of it just because of the potential for analyzing and/or reporting on the high level stuff which could be used to show some sorts of totals or trends.

    Thanks,

    Robert.

    Comment by Robert — March 17, 2009 @ 7:29 pm GMT Mar 17,2009 | Reply

  15. @Robert:
    You can simply export the PERFSTAT schema, then clean up Statspack snapshots like this:

    SQL> connect perfstat/perfstat
    SQL> @$ORACLE_HOME/rdbms/admin/sppurge   -- purges a chosen range of snapshot ids for a given database instance

    Comment by Nigel — March 17, 2009 @ 7:40 pm GMT Mar 17,2009 | Reply

  16. […] 原文:http://jonathanlewis.wordpress.com/2006/11/29/analysing-statspack-pt1/ […]

    Pingback by 分析 Statspack 1 | xpchild — June 21, 2011 @ 5:15 am GMT Jun 21,2011 | Reply

  17. […] Analysing Statspack (1) […]

    Pingback by Viewing Figures « Oracle Scratchpad — February 3, 2012 @ 9:32 am GMT Feb 3,2012 | Reply

  18. […] Web tools to analyze Reportshttp://www.oraperf.comhttp://www.txmemsys.com/statspack-reg.htmhttp://www.spviewer.com/index.htmlspReporter 1.0.30http://www.softpedia.com/get/Internet/Servers/Database-Utils/spReporter.shtmlhttp://www.wise-oracle.comwww.statspackanalyzer.com Good Links with more Informationhttp://jonathanlewis.wordpress.com/2006/11/29/analysing-statspack-pt1/ […]

    Pingback by Analyzing Statspack Report ← oraxprt — August 21, 2013 @ 9:31 pm GMT Aug 21,2013 | Reply

  19. […] Analysing Statspack (1) […]

    Pingback by Viewing Figures | Oracle Scratchpad — October 8, 2013 @ 5:33 pm GMT Oct 8,2013 | Reply

