Oracle Database - How can we load a large number of rows into an existing indexed table?


If you have to perform a batch or bulk data load into a data warehouse or data mart, you should bypass the indexes to minimize redo log generation. The procedure is:

  1. set the indexes to the UNUSABLE state. They are not dropped, just marked as unusable
  2. rebuild the indexes after the load to re-enable them

Why not just drop the index?
Because the CREATE INDEX command can fail and, if nobody notices, performance silently degrades.
If instead the command that rebuilds an unusable index fails, users running queries that need that index will get an error message, so the problem is detected immediately.
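As a hedged illustration of this behavior (using the big_table_idx index from the example below; ORA-01502 is the error Oracle raises when a statement touches an unusable index):

```sql
-- Sketch: mark the index unusable, as in the load procedure above
alter index big_table_idx unusable;

-- With skip_unusable_indexes = false, a DML or indexed access that needs
-- the index fails loudly, e.g.:
--   ORA-01502: index 'BIG_TABLE_IDX' or partition of such index
--   is in unusable state
alter session set skip_unusable_indexes = false;

-- With skip_unusable_indexes = true (the default since Oracle 10g),
-- the optimizer simply ignores the unusable index and the query still
-- runs, possibly with a slower plan
alter session set skip_unusable_indexes = true;
```

This is why the session below sets skip_unusable_indexes before the insert: it lets the load proceed while the index is unusable.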


This example concludes the article: Oracle Database - Why you have still a lot of redo and archive ? The Index side effect.

[email protected]>alter index big_table_idx unusable;

Index altered.

[email protected]>alter session set skip_unusable_indexes=true;

Session altered.

[email protected]>insert /*+ APPEND */ into big_table select * from all_objects;

66652 rows created.

       6308  recursive calls
       1545  db block gets
     187973  consistent gets
          0  physical reads
    7606756  redo size
        893  bytes sent via SQL*Net to client
        966  bytes received via SQL*Net from client
          4  SQL*Net roundtrips to/from client
       1550  sorts (memory)
          0  sorts (disk)
      66652  rows processed

[email protected]>alter index big_table_idx rebuild nologging;

Index altered.
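After the rebuild, it can be worth verifying that the index is VALID again and, since the rebuild was done with NOLOGGING, switching the index back to LOGGING so that subsequent changes are protected by redo. A minimal sketch (USER_INDEXES is the standard data dictionary view; the index name comes from the example above):

```sql
-- Check the index status: it shows UNUSABLE before the rebuild, VALID after
select index_name, status
from   user_indexes
where  index_name = 'BIG_TABLE_IDX';

-- Restore normal redo logging for future index maintenance
alter index big_table_idx logging;
```

Note that because the rebuild was not logged, the index cannot be recovered from redo after a media failure; it would have to be rebuilt again after a restore.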
