
I have a data cleanup procedure which deletes the same data from the card column of two tables. Both of these UPDATE statements use the same subquery to detect which rows should be updated.

 UPDATE table_1 SET card = NULL WHERE id in
               (select id from sub_table WHERE /* complex clause here */);

 UPDATE table_2 SET card = NULL WHERE id in
               (select id from sub_table WHERE /* complex clause here */);

Is using an Oracle temporary table a good solution for optimizing my code?

CREATE TEMPORARY TABLE tmp_sub_table AS
select id from sub_table WHERE /* complex clause here */;

UPDATE table_1 SET card = NULL WHERE id in (select * from tmp_sub_table);    
UPDATE table_2 SET card = NULL WHERE id in (select * from tmp_sub_table);

Should I use a local temporary table or a global temporary table?

2 Comments
  • If the data is huge in sub_table, I would suggest a Global Temporary Table. Commented Jul 25, 2017 at 8:17
  • I second going for a global temporary table (GTT), as per your second example (although I'd change select * from tmp_sub_table to select id from tmp_sub_table to avoid issues if someone adds columns to your GTT in the future). The good thing about a GTT is that you only create it once, so you don't have the overhead of creating and dropping it in your code. Commented Jul 25, 2017 at 9:15

1 Answer


Global Temporary Tables are persistent data structures: when we INSERT, the data is written to disk, and when we SELECT, it is read back from disk. That is quite a lot of disk I/O, so the saving from not running the same subquery twice must be greater than the cost of all those writes and reads.

One thing to watch out for is that GTTs are built in a temporary tablespace, so you might get contention with other long-running processes which are doing sorts, etc. It's a good idea to have a separate temporary tablespace just for GTTs, but not many DBAs do this.
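If you do go the GTT route, note that the table itself is a permanent object you create once up front; only its rows are temporary. A minimal sketch, assuming id is a NUMBER (ON COMMIT DELETE ROWS empties the table automatically at the end of each transaction, so no explicit cleanup is needed):

    -- One-time DDL, not part of the cleanup procedure:
    CREATE GLOBAL TEMPORARY TABLE tmp_sub_table (
        id NUMBER
    ) ON COMMIT DELETE ROWS;

    -- Then, inside a single transaction each run:
    INSERT INTO tmp_sub_table (id)
        SELECT id FROM sub_table WHERE /* complex clause here */;

    UPDATE table_1 SET card = NULL WHERE id IN (SELECT id FROM tmp_sub_table);
    UPDATE table_2 SET card = NULL WHERE id IN (SELECT id FROM tmp_sub_table);
    COMMIT;  -- the GTT's rows are discarded here

(Oracle only allows CREATE GLOBAL TEMPORARY TABLE ... AS SELECT with ON COMMIT PRESERVE ROWS, which is why this sketch separates the DDL from the INSERT.)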

An alternative solution would be to use a collection to store subsets of the records in memory and use bulk processing.

declare
    -- sys.odcinumberlist (note the spelling) is a VARRAY of NUMBER
    -- that ships with the database
    l_ids sys.odcinumberlist;
    cursor l_cur is
        select id from sub_table WHERE /* complex clause here */
        order by id
        ;
begin
    open l_cur;
    loop
        fetch l_cur bulk collect into l_ids limit 5000;
        exit when l_ids.count = 0;

        -- table() unnests the collection so it can be queried in SQL;
        -- MEMBER OF only works with nested table types, not VARRAYs
        update table_1
        set card = null
        where id in (select column_value from table(l_ids));

        update table_2
        set card = null
        where id in (select column_value from table(l_ids));

    end loop;
    close l_cur;
end;

"updating many rows with one update statement ... works much faster than updating separately using Looping over cursor"

That is the normal advice. But this is a bulk operation: it updates five thousand rows at a time, so it is still much faster than row-by-row processing. The size of the batch is governed by the BULK COLLECT ... LIMIT clause: you don't want to make the value too high, because the collection is held in session memory, but as you're only selecting one column, and a NUMBER at that, you can probably make it higher.
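Raising the limit is a one-line change to the fetch above; the 20000 figure here is an arbitrary illustration, not a recommendation:

    -- An Oracle NUMBER occupies at most 22 bytes, so even 20000 ids
    -- is on the order of half a megabyte of session (PGA) memory per fetch.
    fetch l_cur bulk collect into l_ids limit 20000;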

As always, tuning is a matter of benchmarking. Have you established that running this subquery twice is a high-cost operation?

select id from sub_table WHERE /* complex clause here */

If it seems too slow, you need to test the other approaches and see whether they're faster. Maybe a Global Temporary Table is faster than a bulk operation; generally memory access is faster than disk access, but you need to see which works best for you.
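One rough way to compare the candidates is to time each version from an anonymous PL/SQL block, rolling back between runs so every attempt starts from the same data. This sketch assumes the tables from the question; dbms_utility.get_time returns hundredths of a second:

    declare
        l_start pls_integer;
    begin
        l_start := dbms_utility.get_time;

        update table_1 set card = null where id in
            (select id from sub_table where /* complex clause here */);

        dbms_output.put_line('Elapsed: '
            || (dbms_utility.get_time - l_start) / 100 || ' seconds');
        rollback;  -- reset the data before timing the next approach
    end;
    /

Run each approach several times and ignore the first run, which pays the cost of a cold cache.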


3 Comments

In this case I have to update all my data in separate statements, which will worsen the performance.
Updating many rows with one update statement, "UPDATE table_1 SET card = NULL WHERE id in (select id from sub_table WHERE);", works much faster than updating separately by looping over a cursor.
@mariami APC's solution is fetching 5000 rows at a time and then doing the update to both tables based on the ids of those 5000 rows. It may or may not be as performant as the GTT solution - you should test both methods to see which works best for you.
