I have a table in SQL Server 2008 R2 with close to a billion rows. I want to change the datatype of two columns from int to bigint. Running ALTER TABLE zzz ALTER COLUMN yyy twice works, but it's very slow. How can I speed the process up?
I was thinking of copying the data to another table, dropping and recreating the original, and copying the data back, possibly under the simple recovery model, or of doing it with a cursor 1000 rows at a time, but I'm not sure either would actually lead to any improvement.
Depending on what change you are making, sometimes it can be easier to take a maintenance window. During that window (where nobody should be able to change the data in the table) you can:
- drop any indexes/constraints pointing to the old column, and disable triggers
- add a new nullable column with the new data type (even if it is meant to be NOT NULL)
- update the new column in chunks of individual transactions, setting it equal to the old column's value, say 10000 rows at a time using
UPDATE TOP (10000) ... SET newcol = oldcol WHERE newcol IS NULL
with a CHECKPOINT between batches to avoid overrunning your log (see the sketch below)
- once the updates are all done, drop the old column
- rename the new column (and add a NOT NULL constraint if appropriate)
- rebuild indexes and update statistics
The key here is that it allows you to perform the update incrementally (the third step above), which you can't do in a single ALTER TABLE command.
This assumes the column is not playing a major role in data integrity - if it is involved in a bunch of foreign key relationships, there are more steps.
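For illustration, the chunked update might look something like the following sketch (dbo.BigTable, oldcol and newcol are placeholder names, and CHECKPOINT only frees log space like this under the simple recovery model):

-- Sketch only: table and column names are placeholders
WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) dbo.BigTable
    SET newcol = oldcol
    WHERE newcol IS NULL;

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy

    CHECKPOINT;   -- under SIMPLE recovery, lets the log space be reused
END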
EDIT
Also, just wondering out loud - I haven't done any testing for this, but adding it to the list: I wonder if page + row compression would help here? If you change an INT to a BIGINT, with compression in place SQL Server should still store all values as if they still fit in an INT. Again, I haven't tested whether this would make the alter faster or slower, or how much longer it would take to add compression in the first place. Just throwing it out there.
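If you wanted to test that idea, a minimal sketch might be (table and column names here are made up, and data compression on 2008 R2 requires Enterprise edition):

-- Hypothetical: rebuild with row compression first, then alter the column
ALTER TABLE dbo.BigTable REBUILD WITH (DATA_COMPRESSION = ROW);
-- With row compression, values that fit in an INT take the same space
-- whether the column is declared INT or BIGINT
ALTER TABLE dbo.BigTable ALTER COLUMN SomeId BIGINT NOT NULL;  -- assuming it was NOT NULL before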
WHERE newcol IS NULL won't work on its own if oldcol contains NULLs - those rows are never filled in, so the batched update never finishes. – Shutz
WHERE newcol IS NULL AND oldcol IS NOT NULL ... – Ermin
For two columns: WHERE newcol1 IS NULL AND newcol2 IS NULL AND (oldcol1 IS NOT NULL OR oldcol2 IS NOT NULL). Will this have a big impact on speed? – Shutz

Here is how I changed a column's datatype (int to decimal) on a huge table:
-- 1) add the new column with the new data type, allow null
ALTER TABLE dbo.[Table] ADD [Column2] DECIMAL(18,2) NULL;
-- 2) copy data from the old to the new column in batches
--    (with TOP, repeat the update until no rows are left)
WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) t SET t.[Column2] = t.[Column1]
    FROM dbo.[Table] t WHERE t.[Column2] IS NULL;
    IF @@ROWCOUNT = 0 BREAK;
END
-- 3) drop constraints (if any) on old column, then drop old column
ALTER TABLE dbo.[Table] DROP CONSTRAINT DF_Table_Column1;
ALTER TABLE dbo.[Table] DROP COLUMN [Column1];
-- 4) rename the new column to the old column's name
EXEC sys.sp_rename
@objname = N'dbo.[Table].Column2',
@newname = 'Column1',
@objtype = 'COLUMN';
-- 5) add NOT NULL constraint on the new column
ALTER TABLE dbo.[Table] ALTER COLUMN Column1 DECIMAL(18,2) NOT NULL;
ALTER TABLE dbo.[Table] ADD CONSTRAINT DF_Table_Column1 DEFAULT (1) FOR [Column1];
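Before step 3, it's worth confirming the copy actually finished; a quick sanity check using the same names (not part of the original script):

-- Should return 0 before [Column1] is dropped
SELECT COUNT(*) AS UncopiedRows
FROM dbo.[Table]
WHERE [Column2] IS NULL AND [Column1] IS NOT NULL;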
If you are using something like SQL Server Management Studio, you can go to the table in the database, right-click, select 'Design', choose the column you want to edit, set it to bigint, and hit save. This changes the whole column, but the previous values remain as they are. It's good for letting a table 'grow out of' int into bigint, but as far as I know it won't rewrite the existing data.
The Alter statement you supplied is your best bet here. – Clarissa