In a data warehouse environment (Oracle 11.2 on Red Hat) with compressed tablespaces, I tried to drop some columns in the following way:
alter table DWH.TEST_TABLE set unused column COLUMN_SIX;
alter table DWH.TEST_TABLE drop unused columns;
This took a very long time. I let it run for 24 hours (!) and noticed that the undo tablespace (80GB…) had completely filled up. Then I killed the command. No penalty, everything was back to normal.
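With hindsight, the undo growth could have been watched while the statement ran. A minimal sketch, assuming a DBA session; the join of v$transaction to v$session via taddr is standard, and the username filter is my assumption:

```sql
-- Undo blocks/records consumed by active transactions (run while the drop is executing)
SELECT s.sid,
       s.serial#,
       t.used_ublk,   -- undo blocks currently in use
       t.used_urec    -- undo records currently in use
  FROM v$transaction t
  JOIN v$session     s ON s.taddr = t.addr
 WHERE s.username = 'DWH';  -- assumed: the schema running the ALTER TABLE
```

Polling this would have shown the drop chewing through the 80GB of undo long before the tablespace filled.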
So my intention was to use the addition 'checkpoint 100000', and I came across a very old but good explanation at Tom's site here.
I read the next sentence too (it should have scared the hell out of me..):
If this statement is interrupted after a checkpoint has been applied, the table remains in an unusable state. While the table is unusable, the only operations allowed on it are DROP TABLE, TRUNCATE TABLE, and ALTER TABLE DROP COLUMNS CONTINUE. Submitting this statement while the table is in a valid state results in an error.
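For reference, the statements involved look like this. The checkpoint frequency of 100000 rows is the value used below; DBA_UNUSED_COL_TABS is the dictionary view that lists tables with columns marked unused but not yet physically dropped:

```sql
-- Resume an interrupted drop; only legal while the table is in the unusable state
ALTER TABLE DWH.TEST_TABLE DROP COLUMNS CONTINUE CHECKPOINT 100000;

-- Which tables still carry columns that are marked unused but not yet dropped?
SELECT * FROM dba_unused_col_tabs;
```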
So I typed in the command…:
alter table DWH.TEST_TABLE drop unused columns checkpoint 100000;
And after a while – several hours I believe:
ERROR at line 1:
ORA-00600: internal error code, arguments: ,
'Alter table drop columns continue' gave the same result. Completely stuck. A select on the table: no way.
Not a happy customer, by the way.
Gave this problem to Oracle with the following testcase (don't try the following at home in your production environment on 11.2, I understood):
CREATE TABLESPACE TS_TEST DATAFILE '<directory>/ts_test_data01.dbf' SIZE 1000M AUTOEXTEND ON NEXT 500M MAXSIZE 64000M
DEFAULT COMPRESSION FOR ALL OPERATIONS
EXTENT MANAGEMENT LOCAL UNIFORM SIZE 50M
SEGMENT SPACE MANAGEMENT AUTO;
create user dwh identified by incident
default tablespace ts_test
temporary tablespace temp
quota unlimited on ts_test;
grant connect to dwh;
grant resource to dwh;
grant unlimited tablespace to […]