
Updating millions of records - Microsoft SQL / MS SQL Server


  1. #1

    Updating millions of records

    We are currently using SQL Server for some data analysis, and have tables
    containing > 100 million records.

    We need to run some update statements that alter the majority of these
    records, but performance is being hit by the creation of vast
    transaction logs. Is there any way of turning off logging for this kind of
    operation? I realise this kind of goes against the philosophy of having a
    transactional database to maintain integrity.

    If there is any failure in the updates, we don't need to rollback.

    Thanks

    Miles


    Miles Guest

  2. #2

    Re: Updating millions of records

    Hi Miles,

    You can't turn off logging, but you can prevent the log file from growing out
    of control in these kinds of situations with the following procedure.

    -- Kick all other users out to prevent them from performing operations
    -- that cannot be recovered to a certain point in time
    ALTER DATABASE <mydb> SET SINGLE_USER WITH ROLLBACK IMMEDIATE
    -- ## Backup transaction log
    -- Change recovery model
    ALTER DATABASE <mydb> SET RECOVERY SIMPLE
    -- ## Do the actual work
    -- Because the recovery model is now simple, the transaction log will be
    -- cleaned on a regular basis, and can be kept to a reasonable size if you
    -- execute the update statement in batches of a smallish size, say 10000 rows.
    SET ROWCOUNT 10000
    -- For example, repeat the batch until no qualifying rows are left
    WHILE 1 = 1
    BEGIN
        UPDATE large_table
        SET this_column = 10
        WHERE that_column = 1
        AND this_column <> 10
        IF @@ROWCOUNT = 0 BREAK
    END
    -- Reset the row limit
    SET ROWCOUNT 0
    -- Change recovery back
    ALTER DATABASE <mydb> SET RECOVERY FULL
    -- ## Backup the transaction log
    -- ## Backup the database
    -- Let other users back in
    ALTER DATABASE <mydb> SET MULTI_USER WITH ROLLBACK IMMEDIATE


    --
    Jacco Schalkwijk MCDBA, MCSD, MCSE
    Database Administrator
    Eurostop Ltd.


    "Miles Dennis" <co.uk> wrote in message
    news:bhvnq5$eek$1$demon.co.uk... 


    Jacco Guest

  3. #3

    Updating millions of records

    There is no way not to log the update. But there is a
    workaround to keep the transaction log from growing out of control.
    Read: http://www.experts-exchange.com/Databases/Microsoft_SQL_Server/Q_20467425.html
    Riccardo Guest

  4. #4

    Re: Updating millions of records

    To add to the other responses, you might consider using SELECT INTO to
    emulate the update and renaming the new table to the original name
    afterward. This will be minimally logged in the SIMPLE or BULK_LOGGED
    recovery model.
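
    A minimal sketch of that approach, with hypothetical table and column
    names (indexes, constraints, and permissions on the original table would
    need to be recreated on the new one):

    ```sql
    -- Build the modified copy of the table in one pass; SELECT INTO is
    -- minimally logged under the SIMPLE or BULK_LOGGED recovery model.
    SELECT key_column,
           CASE WHEN that_column = 1 THEN 10 ELSE this_column END AS this_column,
           that_column
    INTO   large_table_new
    FROM   large_table

    -- Swap the new table in under the original name
    EXEC sp_rename 'large_table', 'large_table_old'
    EXEC sp_rename 'large_table_new', 'large_table'

    -- Once verified, drop the old copy
    DROP TABLE large_table_old
    ```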

    --
    Hope this helps.

    Dan Guzman
    SQL Server MVP

    -----------------------
    SQL FAQ links (courtesy Neil Pike):

    http://www.ntfaq.com/Articles/Index.cfm?DepartmentID=800
    http://www.sqlserverfaq.com
    http://www.mssqlserver.com/faq
    -----------------------

    "Miles Dennis" <co.uk> wrote in message
    news:bhvnq5$eek$1$demon.co.uk... 
    tables 
    kind of 


    Dan Guest

  5. #5

    Re: Updating millions of records

    The only thing you have to be careful with in regard to large files is that
    the transaction log backup (not the transaction log itself) in BULK_LOGGED
    mode will include all the pages of the new table, and of course the data
    file will also grow by the size of the table. It will be less than a
    transaction log with 100 million fully logged updates, but it won't be
    negligible.


    "Dan Guzman" <net> wrote in message
    news:phx.gbl... 
    > tables 
    > kind of 
    >
    >[/ref]


    Jacco Guest


