January 12, 2021 at 4:42 pm
Hi,
Each morning, we run several data-driven subscriptions that generate about 10,000 reports and email them as Excel attachments. We recently resolved some issues with the subscriptions (such as bad email addresses) that were causing subscription errors. Most days, we run through these subscriptions error-free. Some days, we get a few errors due to a connection pool timeout. Basically, the report server cannot reach its own database and claims the connection pool is full. Well... it's not full. I ran some perfmon stats and found that we're maxing out at well under 200 simultaneous connections. Also, we have Named Pipes enabled...
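A DMV query along these lines (a rough sketch, not exactly what I ran) gives a similar picture of how many sessions are hitting each database on the instance:

-- Rough sketch: count current user sessions per database.
-- The report server database is "ReportServer" by default; adjust if yours differs.
SELECT
    DB_NAME(s.database_id) AS database_name,
    COUNT(*)               AS session_count
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
GROUP BY s.database_id
ORDER BY session_count DESC;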
When the error occurs, SSRS doesn't retry (super frustrating! why not retry?!), and it doesn't log anything in the ExecutionLog, so we can't tell which rows of the data-driven subscription failed in order to retry them manually. I tried increasing the MaxActiveReqForOneUser setting, and that seemed to help for a while, but the problem has resurfaced.
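For the record, since the ExecutionLog shows nothing, the only per-subscription status I've found so far is the LastStatus column in the catalog. A read-only query along these lines (a sketch against the standard but undocumented ReportServer schema) shows the "X processed of Y total; Z errors" text per subscription, though not which rows failed:

-- Rough sketch: last run status per subscription (read-only; the schema is
-- undocumented and can change between SSRS versions).
SELECT
    c.[Path]        AS report_path,
    s.[Description] AS subscription_description,
    s.LastRunTime,
    s.LastStatus
FROM ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.[Catalog]     AS c
    ON c.ItemID = s.Report_OID
ORDER BY s.LastRunTime DESC;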
The SSRS service (SQL Server 2016, SP2, recent CU) and database are running on a VM with 8 vCPUs and 20 GB of RAM, with SSD storage. The SQL Server instance is set to allow unlimited concurrent connections. As stated, both TCP/IP and Named Pipes are enabled. The remote query timeout is set to 6000 seconds (100 minutes).
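For completeness, those instance-level settings can be confirmed with sp_configure (a sketch; "user connections" = 0 means unlimited, and "remote query timeout (s)" is in seconds, so 6000 = 100 minutes):

-- Rough sketch: confirm the instance settings described above.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'user connections';          -- 0 = unlimited
EXEC sp_configure 'remote query timeout (s)';  -- seconds; 6000 = 100 minutes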
I'm really at a loss as to how to figure out why this is failing and how to prevent it from happening...
The error looks like this:
dbpolling!WindowsService_0!730!01/12/2021-08:11:32:: e ERROR: PollingFunction: Microsoft.ReportingServices.Library.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing. ---> System.Data.SqlClient.SqlException: Connection Timeout Expired. The timeout period elapsed during the post-login phase. The connection could have timed out while waiting for server to complete the login process and respond; Or it could have timed out while attempting to create multiple active connections. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=1; handshake=0; [Login] initialization=0; authentication=1; [Post-Login] complete=15993; ---> System.ComponentModel.Win32Exception: The wait operation timed out.
notification!WindowsService_0!730!01/12/2021-08:11:33:: e ERROR: Error occurred processing subscription 7ab8c91d-3972-45f6-a663-30d97f790adf: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing.
library!WindowsService_0!180!01/12/2021-08:11:33:: e ERROR: Throwing Microsoft.ReportingServices.Library.ReportServerDatabaseUnavailableException: , Microsoft.ReportingServices.Library.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing. ---> System.Data.SqlClient.SqlException: Connection Timeout Expired. The timeout period elapsed during the post-login phase. The connection could have timed out while waiting for server to complete the login process and respond; Or it could have timed out while attempting to create multiple active connections. The duration spent while attempting to connect to this server was - [Pre-Login] initialization=1; handshake=0; [Login] initialization=0; authentication=1; [Post-Login] complete=15993; ---> System.ComponentModel.Win32Exception: The wait operation timed out
--- End of inner exception stack trace ---
at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, SqlCredential credential, Object providerInfo, String newPassword, SecureString newSecurePassword, Boolean redirectedUserInstance, SqlConnectionString userConnectionOptions, SessionData reconnectSessionData, DbConnectionPool pool, String accessToken, Boolean applyTransientFaultHandling, SqlAuthenticationProviderManager sqlAuthProviderManager)
at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, DbConnectionPoolKey poolKey, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnectionPool pool, DbConnection owningObject, DbConnectionOptions options, DbConnectionPoolKey poolKey, DbConnectionOptions userOptions)
at System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
at System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionFactory.TryGetConnection(DbConnection owningConnection, TaskCompletionSource`1 retry, DbConnectionOptions userOptions, DbConnectionInternal oldConnection, DbConnectionInternal& connection)
at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnectionInternal(DbConnection outerConnection, DbConnectionFactory conne
January 15, 2021 at 4:03 pm
Can you split the workload into smaller pieces, and/or distribute the subscriptions over a longer period?
I know that's kind of a band-aid-type fix, but it may tell you whether the problem is time-related or workload-related.
You can also check whether anything else is connecting to the database - there could be a maintenance process interfering, such as the one that cleans up execution log entries.
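If it helps, something along these lines (a rough sketch using standard DMVs, run while the subscriptions are going) will show what else is active and whether anything is blocking:

-- Rough sketch: current requests, who is running them, and any blocking.
SELECT
    r.session_id,
    r.blocking_session_id,
    r.wait_type,
    r.wait_time,
    s.login_name,
    s.program_name,
    DB_NAME(r.database_id) AS database_name,
    t.[text]               AS current_sql
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s
    ON s.session_id = r.session_id
OUTER APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE s.is_user_process = 1
ORDER BY r.blocking_session_id DESC, r.wait_time DESC;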
January 15, 2021 at 5:31 pm
Thanks for your reply Parker! I appreciate it.
The business goal is to send these reports out ASAP. If we were to break the data-driven subscriptions into smaller pieces, we'd be adding technical debt / complexity in managing who gets which copy of which report - and we'd still have the same number of reports that all need to go out ASAP. The way I understand it, whether you have 1 subscription with 100 rows or 10 subscriptions with 10 rows each, every row ends up as an entry in the Notifications queue table. Without altering the built-in SSRS stored procs that process the Notifications queue table, I don't think we have any way to throttle the processing of subscriptions.
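For what it's worth, a read-only peek at the queue depth looks something like this (a sketch against the standard but undocumented ReportServer schema):

-- Rough sketch: how many notifications are queued and how old the
-- oldest one is (read-only; NOLOCK to avoid adding contention).
SELECT
    COUNT(*)                 AS queued_notifications,
    MIN(NotificationEntered) AS oldest_entry
FROM ReportServer.dbo.Notifications WITH (NOLOCK);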
As for a maintenance process or some other thing connecting while we're processing subscriptions--I don't see any sql agent jobs that do anything like that. Is it perhaps a trigger or some such? I can run a trace to find out if this is occurring, but if it is, I'm not sure how I would prevent it without modifying the built-in SSRS code.
--Andy
January 15, 2021 at 6:46 pm
I mentioned the execution log maintenance process because I've seen it referenced before - someone reported that it was putting long-lasting locks on the log table, which interfered with their operations. Like you, they did not see a job to go with it and attributed it to inaccessible SSRS code.
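If you want to check for that specifically, a query along these lines (a rough sketch; ExecutionLogStorage is the table behind the ExecutionLog views) shows object-level locks currently held in the report server database:

-- Rough sketch: object-level locks held in ReportServer right now.
-- Look for ExecutionLogStorage showing up with long-held X/IX locks.
SELECT
    l.request_session_id,
    l.request_mode,
    l.request_status,
    OBJECT_NAME(l.resource_associated_entity_id, l.resource_database_id) AS locked_object
FROM sys.dm_tran_locks AS l
WHERE l.resource_database_id = DB_ID('ReportServer')
  AND l.resource_type = 'OBJECT';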
You might try running earlier or later if you can shift the entire schedule - possibly that would get you away from any interference (if that's really what this is).
As an aside, I've learned to run nothing of my own at midnight in a shared environment - there's always someone (sometimes several someones) who thinks that's the perfect time to run a process that will make the lights dim in a three-state area... If you have good control over processing and scheduling, this shouldn't be happening to you, but otherwise it's worth thinking about.
January 15, 2021 at 7:59 pm
In our situation, we wait for external data to be refreshed, start our ETL and dataset processing as soon as it's available, and then trigger the schedules that run the subscriptions once data processing is done. The only variable is that the external data is sometimes ready slightly earlier or later in the morning, and we have no ability to change the overall timing. This isn't a shared environment; it's dedicated, and nothing else is going on other than the processing of these subscriptions.
If the error message is to be trusted, the issue isn't a locked table; it's that the SSRS web service cannot establish a connection to the SSRS database. The evidence I've gathered doesn't support the error message's assertion that the connection pool is full.
I suppose the next step will be to leave a trace running, wait for the error to recur, and then do a deep dive into what else is using the SQL Server at that time...
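Besides a trace, I may also check the connectivity ring buffer right after the next failure; a query along these lines (a rough sketch, and the XML record has to be picked apart manually) shows recent connection errors and login timeouts as the engine saw them:

-- Rough sketch: recent connectivity events recorded by the engine
-- (connection errors, login timeouts); newest first.
SELECT
    rb.[timestamp],
    CAST(rb.record AS XML) AS record_xml
FROM sys.dm_os_ring_buffers AS rb
WHERE rb.ring_buffer_type = N'RING_BUFFER_CONNECTIVITY'
ORDER BY rb.[timestamp] DESC;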