SQL: updating a large number of rows

First, let's create a table and insert some sample data. A recursive CTE is not the most efficient way to generate random data, but I like it, so I'm using it.

The way I would break up the work is with a CTE that adds ROW_NUMBER() OVER (PARTITION BY ...) on a suitable key column, then a loop whose UPDATE TOP (n) batch size is configurable. (See brentozar.com/pastetheplan for sharing execution plans.) Two questions worth asking along the way: what wait types accrue during the query, and can the SET expressions be computed in a prior SELECT so they don't have to be evaluated during the UPDATE itself?

This technique is useful because it avoids the concurrency hits that large updates incur: the smaller the batch size (the number of rows per UPDATE), the less likely the update is to block other users from accessing the data. Combined with transaction-log backups, this method can also keep your transaction-log size to a minimum.
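A minimal sketch of the setup and the batching loop described above. The table, column names, row counts, and batch size are all illustrative assumptions, not from the original post:

```sql
-- Hypothetical demo table; names are illustrative.
CREATE TABLE dbo.Orders
(
    OrderID   INT IDENTITY PRIMARY KEY,
    Amount    DECIMAL(10, 2) NOT NULL,
    Processed BIT NOT NULL DEFAULT 0
);

-- Generate 50,000 sample rows with a recursive CTE.
-- OPTION (MAXRECURSION 0) lifts the default 100-level recursion limit.
;WITH Numbers AS
(
    SELECT 1 AS n
    UNION ALL
    SELECT n + 1 FROM Numbers WHERE n < 50000
)
INSERT INTO dbo.Orders (Amount)
SELECT CAST(ABS(CHECKSUM(NEWID())) % 10000 AS DECIMAL(10, 2)) / 100
FROM Numbers
OPTION (MAXRECURSION 0);

-- Update in batches until no qualifying rows remain.
DECLARE @BatchSize INT = 5000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) dbo.Orders
    SET Processed = 1
    WHERE Processed = 0;

    IF @@ROWCOUNT = 0
        BREAK;
END;
```

Each iteration commits its own small transaction, so locks are held briefly and, under the FULL recovery model, log backups taken between batches let the log space be reused instead of growing.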

This kind of problem is very common, and it’s the main reason that deduping software exists.

This is the simplest method to run a query in small batches.

I frequently use this method in development when I want to update some records quickly without thinking much about it. An integer after GO will execute the preceding batch the specified number of times.
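For example, a quick sketch of the GO trick (the table and column names are hypothetical):

```sql
-- GO followed by an integer repeats the preceding batch;
-- here the UPDATE runs 10 times, touching up to 50,000 rows in total.
UPDATE TOP (5000) dbo.Orders
SET Processed = 1
WHERE Processed = 0;
GO 10
```

Note that GO is a batch separator recognized by client tools such as SSMS and sqlcmd, not a T-SQL statement, so this only works when run through those clients, and it repeats the batch blindly even after no rows are left to update.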

Often in my job I have to create a procedure in SQL Server that processes millions of rows, saves them into a temp (staging) table, and finally writes them into one or more tables in one or more databases.

I am not looking into alternative solutions, like SSIS.
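A minimal sketch of that staging pattern, combined with the batching idea from earlier. The procedure, table, and column names are assumptions for illustration, not from the post:

```sql
CREATE OR ALTER PROCEDURE dbo.LoadOrders
AS
BEGIN
    SET NOCOUNT ON;

    -- Stage the incoming rows in a temp table first.
    CREATE TABLE #Staging
    (
        OrderID INT PRIMARY KEY,
        Amount  DECIMAL(10, 2) NOT NULL
    );

    INSERT INTO #Staging (OrderID, Amount)
    SELECT OrderID, Amount
    FROM dbo.IncomingOrders;   -- hypothetical source table

    -- Drain the staging table into the destination in batches:
    -- each DELETE removes a chunk and, via OUTPUT ... INTO,
    -- inserts those same rows into the destination table.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000) s
        OUTPUT DELETED.OrderID, DELETED.Amount
        INTO dbo.DestinationOrders (OrderID, Amount)
        FROM #Staging AS s;

        IF @@ROWCOUNT = 0
            BREAK;
    END;
END;
```

The OUTPUT ... INTO form keeps the "move" atomic per batch without needing an explicit transaction around a separate INSERT and DELETE, though it does carry restrictions (for instance, the target of OUTPUT INTO cannot have enabled triggers or be part of a foreign-key relationship).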

