DTS Issue - the number of failing rows exceeds the maximum specified

  • Hi all,

    I'm attempting to run a DTS task which will copy data from one server to another.  It's a straight like-for-like copy - the row names are the same on the destination server.  This DTS task is part of a two-stage process.  There is another task that extracts data from our main database and sends it to a staging server (which has external access), and this server then sends that extract to the remote server (this is the stage that's failing). 

    The error I'm getting pops up after about 4000 rows have been transformed or copied.  It is "The number of failing rows exceeds the maximum specified.  [DBNETLIB][ConnectionWrite (send()).]  General network error."

    Nice and vague

    Has anyone experienced anything similar to this, and if so do you have any advice?  Both servers are SQL 2000 sp3a.

    Cheers,

     

    Iain

  • "General network error"

    This says it all. Basically there's been some sort of communication hiccup that's terminated the connection.

     

    --------------------
    Colt 45 - the original point and click interface

  • Phil is right. This is what you get when SQL Server has problems communicating. Communication hiccups that a file transfer or other type of RPC would sail right through tend to trip up SQL Server - it has very little tolerance for this stuff.

  • This post was very helpful to me for confirming that we actually had a networking issue and not a DTS development issue.  However, our network engineers have, so far, been unable to determine the cause of the connection hiccups that are preventing my DTS package from completing its tasks.

    In the meantime, if you know, would increasing the max error count property (located under the Option tab of each of the task properties and currently set to the default value of 0) be an effective workaround until our connection problem is solved?

    If so, is there any downside or things to watch out for when temporarily increasing the max error count until the connection issue can be resolved? 

    Also, if I do increase the max error count property, if the import is underway when there is a connection hiccup during a particular task, would increasing this max error count property cause the task to start over from the beginning, or would the task pick up where it left off when the connection was temporarily broken?

    Thanks in advance for any help!

  • Probably should have started a new thread.

    Increasing the max error count may help if the network hiccup resolves itself before the data transfer completes.

    Downsides: the transfer does not start over, and any rows that are in error are dropped.

    So maybe it's not such an effective workaround.

    You don't mention where you're pulling data from. If it's an ODBC data source you can turn on ODBC logging and trace the ODBC calls that go back and forth. Another option would be to do a network trace and track what the packets are doing.
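
    For what it's worth, ODBC tracing can be switched on from the Tracing tab of the ODBC Data Source Administrator (odbcad32.exe), or directly in the registry. A sketch of the registry route (the trace file path here is just an example, not a required location):

    ```
    Windows Registry Editor Version 5.00

    ; Per-user ODBC tracing settings, as exposed on the Tracing tab
    ; of the ODBC Data Source Administrator.
    [HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC]
    "Trace"="1"
    "TraceFile"="C:\\odbctrace.log"
    ```

    Remember to set "Trace" back to "0" afterwards - the log grows quickly and tracing slows every ODBC call on the machine.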

     

    --------------------
    Colt 45 - the original point and click interface

  • Thanks for the reply.  I suspected as much (concerning the max error count).

    Our ISP did run a network trace and monitored the network performance as I ran the DTS package.  According to them, all is well (they saw no break in the connection despite my "General Network Error" message while they were monitoring). 

    I'm stumped.  My DTS package imports data from one SQL Server db to another SQL Server db.  We've been running this package successfully for a couple of years now.  Possibly, as the data grows, we've maxed out the data we can transport within our bandwidth capacity?

    Applying this theory, I'm reducing the size of the data (i.e. number of records) being imported in any one package.  Or -- as Steve mentioned in his editorial today, I might have to resort to FedEx.

    Thanks again!

     

     

  • One other thing you can check/change. Is the server using TCP or Named Pipes? If it's using Named Pipes, change it to TCP.

     

    --------------------
    Colt 45 - the original point and click interface

  • The server is using TCP. 

    It did turn out that the size of our import finally outgrew the capacity of the line.  I rewrote the package so that significantly less data is imported at any one time, and as a result, we're now able to get the data to the server.  It's a workaround, but it gets us there.
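
    For anyone finding this later, the rewrite is conceptually something like the following sketch - pulling rows in key-range batches instead of one big transfer. The table, column, and linked-server names (dbo.SourceTable, RowID, DestServer) are placeholders, not our actual schema:

    ```sql
    -- Import in key-range batches so each chunk stays well under
    -- the size that was tripping the "General network error".
    DECLARE @start INT
    DECLARE @stop INT
    DECLARE @batch INT

    SET @batch = 50000
    SELECT @start = MIN(RowID), @stop = MAX(RowID) FROM dbo.SourceTable

    WHILE @start <= @stop
    BEGIN
        -- Four-part name assumes a linked server to the destination.
        INSERT INTO DestServer.DestDB.dbo.DestTable (RowID, Col1, Col2)
        SELECT RowID, Col1, Col2
        FROM dbo.SourceTable
        WHERE RowID BETWEEN @start AND @start + @batch - 1

        SET @start = @start + @batch
    END
    ```

    Each batch is its own statement, so a dropped connection only costs you the chunk in flight rather than the whole run.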

    Thanks again for your time, Phil.
