Here’s a question that came up on the OTN database forum recently:
I want to reorganize some tables to release some space – how can I find the tables that need to be reorganized? I have found that whenever I reorganize a table it doesn’t release space; instead the size of the table increases. Can we find the tables which will gain space by reorganization?
I’m not going to address the ‘how do we find the tables’ question here – but I thought I’d just describe how difficult the problem of re-organising tables can be by quoting something I wrote in Practical Oracle 8i nearly eight years ago.
Imagine you have some sort of order processing system where each order goes through three main stages – ‘new’, ‘in progress’, ‘completed’. A ‘new’ row requires 50 bytes of storage, an ‘in progress’ row takes up 100 bytes, and a ‘completed’ row reaches its full maturity at 150 bytes. The canny DBA quickly works out that PCTFREE should be set to 65% to allow new rows to grow to full size without migrating (which also means setting PCTUSED to 35% or less, as the total of PCTUSED and PCTFREE has to be no more than 100).
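As a back-of-envelope check on where that 65% comes from: ignoring row and block overheads, the fraction of a row’s final footprint that must be reserved for growth is (final size − insert size) / final size. The raw arithmetic lands a couple of points above the 65% quoted, which is close enough for a rule-of-thumb setting:

```python
# Row sizes from the example: inserted at 50 bytes, grows to 150 bytes.
new_size, final_size = 50, 150

# Fraction of the final row size that must be free at insert time
# so the row can grow in place without migrating (overheads ignored).
growth_pct = 100 * (final_size - new_size) / final_size
print(round(growth_pct))  # 67 – i.e. PCTFREE in the region of 65
```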
Every month, most orders that were completed at least 12 months earlier are deleted from the system, but some are retained for various reasons. Over a period of time the table becomes a little ragged because of this incomplete clear down, so the DBA decides the time has come to rebuild the table and claw back some space and efficiency.
Ask yourselves this question: what should the value of PCTFREE be (at least temporarily) as the table is being rebuilt?
Assume 95% of the rows in the table are complete, 4% are in progress, and 1% are new. For the benefit of the completed rows, the best value for PCTFREE would be zero since the bulk of the data will never be updated again, but this would be disastrous for the more recent 5% of the data which needs plenty of room to grow. Conversely if you set PCTFREE to 65% to cater for the new data, or even 35% to compromise between the new data and the ‘in progress’ data, you waste a huge amount of space in the blocks which contain only completed data.
Of course you could get your spreadsheet going and decide on 2% as a reasonable weighted average – (95 * 0 + 4 * 35 + 1 * 65)/100 ≈ 2 – and hope for the best. You could start playing complicated games with partial unloads and reloads, changing PCTFREE as you go. There are several (tedious) variations on making the best of an awkward situation.
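The weighted-average arithmetic above can be sketched in a few lines; the proportions and per-stage PCTFREE values are the ones from the example:

```python
# (percentage of rows, PCTFREE appropriate to that stage):
#   95% completed rows   -> PCTFREE 0  (no further growth expected)
#    4% in-progress rows -> PCTFREE 35 (room to grow from 100 to 150 bytes)
#    1% new rows         -> PCTFREE 65 (room to grow from 50 to 150 bytes)
mix = [(95, 0), (4, 35), (1, 65)]

weighted_pctfree = sum(pct * free for pct, free in mix) / 100
print(weighted_pctfree)  # 2.05 – the "reasonable" 2% in the text
```

Of course, a single averaged PCTFREE still leaves 95% of the blocks carrying free space they will never use, which is the point of the example.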
The book then goes on to describe how you might mess about with the minimize records_per_block option of the alter table command to prepare an empty table which would only allow you to insert the “correct” number of rows per block.
In passing, one of the answers to the original question pointed to an item on OraFaq that suggested you identify tables where rebuilding might be of some benefit by looking at dba_tables: multiply avg_row_len by num_rows, divide by the block size, then compare the result with blocks. If blocks is much larger – say 50% larger – than the space needed for the data then you may have a table worth thinking about.
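That heuristic is easy to sketch. The function below is a hypothetical illustration, not the OraFaq script itself; the statistics passed in are made-up sample values standing in for one row of dba_tables:

```python
import math

def looks_like_reorg_candidate(num_rows, avg_row_len, blocks,
                               block_size=8192, threshold=1.5):
    """Crude check: does the table occupy far more blocks than its raw
    row data would need?  Ignores row/block overheads, PCTFREE, and
    columns (longs, LOBs, etc.) that distort the arithmetic."""
    needed_blocks = math.ceil(num_rows * avg_row_len / block_size)
    return blocks > threshold * needed_blocks

# Hypothetical statistics for one table in an 8KB-block database:
# 100,000 rows of 120 bytes need ~1,465 blocks; 4,000 allocated blocks
# is well past the 50%-larger threshold.
print(looks_like_reorg_candidate(num_rows=100_000, avg_row_len=120,
                                 blocks=4000))  # True
```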
Although it makes no allowance for row and block overheads, as a pointer to trouble-spots it’s probably quite reasonable in most cases – provided your statistics are reasonably accurate, and you don’t have long, LOB, varray, nested table or UDT (user-defined type) columns, all of which may mess up the arithmetic a bit.