This worked well until I added a multicast and an OLE DB Command component to the data flow. What I wanted was to compare the values being passed to the stored procedure and, if anything had changed, copy the existing row to a history table. The OLE DB Command used the same connection manager as the main data insert, since the history and current tables were in the same database.
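For context, the stored procedure did something along these lines. This is only a sketch; the procedure, table, and column names (dbo.usp_ArchiveIfChanged, dbo.CurrentData, dbo.CurrentDataHistory) are made up for illustration, not taken from the actual package:

```sql
-- Hypothetical sketch of the kind of procedure the OLE DB Command calls:
-- if the incoming value differs from the stored one, copy the current
-- row to the history table before it gets overwritten.
CREATE PROCEDURE dbo.usp_ArchiveIfChanged
    @Id    INT,
    @Value NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    -- Archive the existing row only when the incoming value has changed.
    INSERT INTO dbo.CurrentDataHistory (Id, Value, ArchivedAt)
    SELECT c.Id, c.Value, SYSDATETIME()
    FROM dbo.CurrentData AS c
    WHERE c.Id = @Id
      AND c.Value <> @Value;
END;
```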
When I ran the package, it would process about 100,000 rows, then SSIS would begin to slow down and finally lose network connectivity to the SQL Server. (We have a separate SSIS server.) This is the error that was written to the logs:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred.
Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0"
Hresult: 0x80004005 Description: "A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0"
Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0"
Hresult: 0x80004005 Description: "Named Pipes Provider: No process is on the other end of the pipe."
I began by troubleshooting the network, looking for errors on the switch and on both servers. This yielded nothing; no errors were logged. So I turned to SSIS, figuring it was somehow the culprit, since everything had worked until I added the multicast.
I removed the multicast and the OLE DB Command components and tested again. The package processed a million rows in less than two minutes.
I added the multicast back in, without the OLE DB Command, and tested. Again it processed a million rows in less than two minutes.
I added the OLE DB Command component back in and tested. It failed at 49,000 records.
I added another connection manager, set the same properties, and pointed the OLE DB Command at the new manager. When I tested, I was surprised to see it complete successfully!
It seems that the single connection manager was not able to handle the multiple destinations simultaneously. I have no way to confirm this at the moment, but since switching to separate connection managers for the same database, I have not experienced the issue at all.
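If you want to try to observe this yourself, one option might be to watch the sessions the package holds open on the SQL Server side while it runs. This is just a generic diagnostic query against the sys.dm_exec_sessions DMV; the program_name filter is an assumption and would need to be adjusted to match however your package's connections identify themselves:

```sql
-- Watch the sessions an executing package holds open on the target server.
-- The LIKE filter is a guess; inspect program_name values first and adjust.
SELECT session_id,
       program_name,
       host_name,
       status,
       last_request_start_time
FROM sys.dm_exec_sessions
WHERE is_user_process = 1
  AND program_name LIKE '%SSIS%'  -- hypothetical filter for the package's connections
ORDER BY session_id;
```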
1 comment:
This idea worked like a charm for me. I'm loading data without staging the raw data first. On the way to my target table, I want to check for dirty incoming data and save those raw records in an error table. I ended up with one OLE DB destination for my target table and a few destinations for the error table, all using the same connection. With a smaller number of records I had no errors, but with larger numbers I'd usually error out after a few million rows.
I redesigned this so that I have two connections pointing to the same database. For the errors, I gathered them all up with a Union All before sending them to the error table. This way I have only two destinations, each with its own connection. All of my problems went away. Thanks for the tip!