Oracle Database - How can we load a large number of rows into an existing indexed table?

1 - About

If you have to perform a batch/bulk data load into a data warehouse or data mart, you should bypass index maintenance to minimize the generation of redo log. The procedure is:

  1. set the indexes to the UNUSABLE state. They are not dropped, just marked as unusable
  2. set the session to skip UNUSABLE indexes (the skip_unusable_indexes parameter) and perform the bulk load with a direct-path insert (/*+ APPEND */)
  3. re-enable the indexes (for instance with ALTER INDEX ... REBUILD)
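The three steps above can be sketched as a SQL*Plus script. The table and index names (big_table, big_table_idx, source_table) are illustrative placeholders; adjust them to your schema:

```sql
-- 1. Mark the index unusable instead of dropping it
alter index big_table_idx unusable;

-- 2. Tell the session to ignore unusable indexes, then bulk load
--    with a direct-path insert to minimize redo generation
alter session set skip_unusable_indexes = true;
insert /*+ APPEND */ into big_table select * from source_table;
commit;

-- 3. Re-enable the index by rebuilding it (nologging further reduces redo)
alter index big_table_idx rebuild nologging;
```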
Why not just drop the index?
Because the CREATE INDEX command can fail, and if nobody notices, queries silently lose the index and performance degrades.
If instead the command that re-enables an unusable index fails, users running queries that need that index will get an error message, so the problem is noticed immediately.
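Either way, it is worth checking afterwards that no index was left broken. A minimal sketch of such a check against the data dictionary (this assumes you only care about indexes in the current schema):

```sql
-- List indexes in the current schema that were left unusable
select index_name, status
from   user_indexes
where  status = 'UNUSABLE';
```

For partitioned indexes, the same check can be run against user_ind_partitions.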

2 - Example

This example continues from the end of this article: Oracle Database - Why do you still have a lot of redo and archive? The Index side effect.


SQL> alter index big_table_idx unusable;

Index altered.

SQL> alter session set skip_unusable_indexes=true;

Session altered.

SQL> insert /*+ APPEND */ into big_table select * from all_objects;

66652 rows created.


Statistics
----------------------------------------------------------
       6308  recursive calls
       1545  db block gets
     187973  consistent gets
          0  physical reads
    7606756  redo size
        893  bytes sent via SQL*Net to client
        966  bytes received via SQL*Net from client
          4  SQL*Net roundtrips to/from client
       1550  sorts (memory)
          0  sorts (disk)
      66652  rows processed

SQL> alter index big_table_idx rebuild nologging;

Index altered.
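After the rebuild, you may want to verify that the index is usable again and, if your recovery or standby strategy requires it, restore its LOGGING attribute (REBUILD NOLOGGING leaves the index with the NOLOGGING attribute). A hedged sketch:

```sql
-- Check that the rebuild succeeded
select index_name, status, logging
from   user_indexes
where  index_name = 'BIG_TABLE_IDX';

-- Optionally restore logging for future maintenance operations
alter index big_table_idx logging;
```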

