I have a large number of fields that are currently NTEXT.
In SQL 2005, we have run some performance tests on converting them to NVARCHAR(MAX).
If you read this article:
It explains that a simple ALTER COLUMN does not rearrange the data within the rows.
I found that with my data we actually perform very poorly in some areas if we only run the ALTER COLUMN; however, if I follow it with an UPDATE table SET column = column for all of these fields, we get a very large performance increase.
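For reference, here is a minimal sketch of that two-step conversion in T-SQL. The table and column names (dbo.MyTable, MyTextCol) are placeholders, not my real schema:

```sql
-- Quick type change; per the article above, the existing values are not
-- rearranged, so they are still stored in the old ntext layout.
ALTER TABLE dbo.MyTable ALTER COLUMN MyTextCol NVARCHAR(MAX) NULL;

-- Rewriting each value to itself forces SQL Server to store the data in
-- the new nvarchar(max) format; this is the expensive step on large tables.
UPDATE dbo.MyTable SET MyTextCol = MyTextCol;
```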
The problem I have is that there are hundreds of these columns, in databases with millions of records. A simple test (on a low-performance virtual machine) with a table containing a single NTEXT column and 7 million records took 5 hours to update.
Can anyone suggest how I can update the data in a more efficient way, to minimise downtime and locking?
EDIT: My fallback solution is to update the data in blocks over time; however, with our data this gives poor performance until all the records have been updated, and the shorter that window is the better, so I am still looking for a quicker way to do the update.
If you cannot take any downtime...
Create two new columns: the nvarchar(max) column and a processedflag INT DEFAULT 0.
Create a non-clustered index on processedflag.
You have UPDATE TOP available to you (and you can order the rows you want to update if you need to).
Set processedflag to 1 as part of the update, so that the next update only picks up rows where processedflag is still 0.
After each update you can check @@ROWCOUNT to see whether you can exit the loop.
I suggest using WAITFOR for a few seconds after each update query, to give other queries a chance to take locks on the table and to avoid overloading the disk. A rough sketch of this loop is below.
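Here is a rough sketch of that loop, using assumed names (dbo.MyTable, MyTextCol, MyTextCol_New) and an arbitrary batch size; adjust both to your own schema and hardware:

```sql
-- New nvarchar(max) column to receive the converted data, plus a flag
-- column recording which rows have already been processed.
-- (On SQL 2005, adding a NOT NULL column with a default touches every row,
-- so this step itself takes some time on a big table.)
ALTER TABLE dbo.MyTable ADD MyTextCol_New NVARCHAR(MAX) NULL;
ALTER TABLE dbo.MyTable ADD processedflag INT NOT NULL DEFAULT 0;
GO

-- Non-clustered index so each batch can find unprocessed rows cheaply.
CREATE NONCLUSTERED INDEX IX_MyTable_processedflag
    ON dbo.MyTable (processedflag);
GO

DECLARE @batchsize INT;
SET @batchsize = 5000;   -- arbitrary; tune to what your hardware copes with

WHILE 1 = 1
BEGIN
    -- Copy one batch of unprocessed rows and mark them as done.
    UPDATE TOP (@batchsize) dbo.MyTable
    SET MyTextCol_New = MyTextCol,
        processedflag = 1
    WHERE processedflag = 0;

    -- Nothing left to convert, so leave the loop.
    IF @@ROWCOUNT = 0
        BREAK;

    -- Pause so other queries can take locks and the disk can catch up.
    WAITFOR DELAY '00:00:05';
END
```

Once every row has been processed, you would presumably drop the old ntext column, rename the new column into its place with sp_rename, and then drop the flag column and its index.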