Professional Web Applications Themes

Deleting duplicate records - FileMaker


  1. #1

    Default Re: Deleting Duplicate records

    Jon,

    You didn't supply the DDL, so I can only point to existing practices for
    removing duplicate records. Here they are:

    INF: How to Remove Duplicate Rows From a Table
    [url]http://support.microsoft.com/default.aspx?scid=kb;EN-US;q139444[/url]

    INF: Removing Duplicate Rows from an Existing Table in SQL
    [url]http://support.microsoft.com/default.aspx?scid=kb;EN-US;q70956[/url]

    Removing Duplicate Records
    [url]http://www.databasejournal.com/features/mssql/article.php/1438651[/url]

    >>>> Should I use primary keys to prevent this?
    Yes, or a UNIQUE constraint, depending on the business logic.
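A quick sketch of how a UNIQUE constraint rejects duplicate rows (illustrated here with SQLite from Python rather than SQL Server; the `customers` table and its columns are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT UNIQUE, name TEXT)")
conn.execute("INSERT INTO customers VALUES ('jon@example.com', 'Jon')")
try:
    conn.execute("INSERT INTO customers VALUES ('jon@example.com', 'Jon again')")
    loaded = True
except sqlite3.IntegrityError:
    loaded = False  # the duplicate row is rejected; the first row survives
row_count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
```

The same principle applies in SQL Server: the engine raises a constraint violation on the offending row instead of silently loading it.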

    >>>Would the job fail or just not load the dups?
    Maybe you can load it first into a temp table and then use the logic
    mentioned in the above articles before inserting into the main table.
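The staging-table approach can be sketched like this (an illustrative example using SQLite from Python; the `staging` and `main_table` names are made up, and in practice you would bulk-load the file into the staging table with BCP and run the `INSERT ... SELECT ... GROUP BY` server-side):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE main_table (id INTEGER, val TEXT);
    CREATE TABLE staging (id INTEGER, val TEXT);
""")
# The bulk-loaded file, duplicates included, goes into the staging table first.
rows = [(1, "a"), (1, "a"), (2, "b"), (2, "b"), (3, "c")]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
# Then copy only one instance of each duplicate group into the main table.
conn.execute("INSERT INTO main_table SELECT id, val FROM staging GROUP BY id, val")
main_count = conn.execute("SELECT COUNT(*) FROM main_table").fetchone()[0]
```

This way the load job never fails: duplicates are collapsed in staging before they ever touch the constrained table.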


    --
    Dinesh.
    SQL Server FAQ at
    [url]http://www.tkdinesh.com[/url]

    "Jon" <jonscottsmartneighborood.net> wrote in message
    news:055701c34660$d76633f0$a301280aphx.gbl...
    >
    > Hi,
    >
    > I have duplicate records in a table that need to be
    > removed. Any coding help to remove these records would be
    > appreciated. Also ideas on preventing would be helpful.
    > Should I use primary keys to prevent this? Sorry new to
    > SQL coding. Also if setting keys would prevent this what
    > would happen when I BCP data into the table and the file
    > contained dups? Would the job fail or just not load the dups?
    >
    > Thanks,
    >
    > Jon

    Dinesh.T.K Guest

  2. #2

    Default Deleting duplicate records

    The help file is a bit thin in its description of how to locate and delete
    "excess" duplicate records.

    Locating duplicate records is straightforward (!), but how does one go
    about finding duplicate records in excess of the first instance of a
    record? Once found, I assume a simple Delete All Records is appropriate,
    provided that only duplicate records are displayed.

    Any help or pointers would be appreciated.

    Thanks - Art

    Art Russell Guest

  3. #3

    Default Re: Deleting duplicate records

    Actually, the online help is very helpful in this respect. Read the article
    "Finding duplicate values using a self-join relationship" (a search on
    "duplicate records" will find it). It will explain precisely what you need
    to do to achieve your goal.
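For readers outside FileMaker, the self-join idea that article describes can be sketched generically (an illustrative example in Python with SQLite, not FileMaker's actual relationship mechanism; the `contacts` table and names are invented). A row is an "excess" duplicate when an earlier row shares its value:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (name TEXT)")
conn.executemany("INSERT INTO contacts VALUES (?)",
                 [("Art",), ("Bridget",), ("Art",), ("Art",)])
# Self-join: row a is an "excess" duplicate when an earlier row b shares its value.
dupes = conn.execute("""
    SELECT DISTINCT a.rowid FROM contacts a
    JOIN contacts b ON a.name = b.name AND b.rowid < a.rowid
""").fetchall()
```

Only the second and later occurrences match; the first instance of each value never joins to an earlier row, so it is left out of the duplicate set.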

    Bridget Eley

    in article BB5EF188.52A21%artrussellmindspring.com, Art Russell at
    [email]artrussellmindspring.com[/email] wrote on 13/8/03 9:33 AM:
    > The help file is a bit thin in its description of how to locate and delete
    > "excess" duplicate records.
    >
    > Locating duplicate records is straightforward (!), but how does one go
    > about finding duplicate records in excess of the first instance of a record?
    > Once found, I assume a simple Delete All Records is appropriate, provided
    > that only duplicate records are displayed.
    >
    > Any help or pointers would be appreciated.
    >
    > Thanks - Art
    >
    Bridget Eley Guest

  4. #4

    Default Re: Deleting duplicate records

    Art Russell wrote:
    > The help file is a bit thin in its description of how to locate and delete
    > "excess" duplicate records.
    >
    > Locating duplicate records is straightforward (!), but how does one go
    > about finding duplicate records in excess of the first instance of a record?
    > Once found, I assume a simple Delete All Records is appropriate, provided
    > that only duplicate records are displayed.
    >
    > Any help or pointers would be appreciated.
    >
    > Thanks - Art
    >
    In FileMaker 5 and later, see the User's Guide, chapter 10, section 13,
    titled "Example of a ScriptMaker Script". In version 4 it is chapter 5,
    section 26. This walks you through setting up and scripting a method of
    finding all duplicates. Also, if you search for "finding duplicates" in
    the knowledge base at filemaker.com, you will find two more techniques.
    This should help. Best of luck.
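The "find all duplicates, keep the first instance, delete the rest" step those techniques automate can be sketched generically (an illustrative Python/SQLite example, not a FileMaker ScriptMaker script; the `contacts` table and names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contacts (name TEXT)")
conn.executemany("INSERT INTO contacts VALUES (?)",
                 [("Art",), ("Bridget",), ("Art",), ("Art",)])
# Keep only the first instance of each name; delete every later duplicate.
conn.execute("""
    DELETE FROM contacts
    WHERE rowid NOT IN (SELECT MIN(rowid) FROM contacts GROUP BY name)
""")
remaining = [r[0] for r in conn.execute("SELECT name FROM contacts ORDER BY rowid")]
```

The `MIN(rowid)` subquery plays the role of "the first instance of a record" from Art's question: everything outside that set is an excess duplicate and is deleted.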

    Michael Myett

    Michael Myett Guest
