March 11, 2008 at 8:16 am
I've set up the sample reports that are available from Microsoft to provide stats on which reports have been run etc. from Reporting Services.
The link that provided this is http://technet.microsoft.com/en-us/library/ms161561.aspx
It queries the report$server database. Very useful, this. Or was! It's now broken. I've found out why it's breaking, mainly by digging around in the script task where the package crashes, and also by surfing the net and finding this
https://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=967081&SiteID=1
thread. I've tried their fix, but it still crashes, unfortunately.
The issue seems to be in populating the array indexes in the script component of the Update Parameters data flow task. I think this was written to handle modest parameter lists; however, I've recently written a multi-valued parameter report, and the string containing the parameter details has now hit the maximum length for the string. Changing the WSTR 4000 to a different size doesn't seem to have any effect, and the forum entry above doesn't hint at what the problem is, apart from the script not being able to deal with the length.
The error I now get is as follows:
at MyComponent1.ScriptMain.Input0_ProcessInputRow(Input0Buffer Row) in dts://Scripts/MyComponent1/ScriptMain:line 35
at MyComponent1.UserComponent.Input0_ProcessInput(Input0Buffer Buffer) in dts://Scripts/MyComponent1/ComponentWrapper:line 68
at MyComponent1.UserComponent.ProcessInput(Int32 InputID, PipelineBuffer Buffer) in dts://Scripts/MyComponent1/ComponentWrapper:line 25
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
Anyone have any ideas on how to fix this?
March 13, 2008 at 9:04 am
Sob... 16 views but no reply. Has no one else encountered this?
March 18, 2008 at 4:22 am
Is it worth reporting this to Microsoft? If so, how do I do that, please? I've never contacted them in this way before.
March 31, 2008 at 11:36 am
I'm working on this exact issue right now. It occurs when your parameter values are too long (i.e. > ~32,767 chars) and get truncated somewhere in the conversion from BLOB data to byte array to string. I don't believe the GetBlobData function will work here, so I'm currently working on a workaround. I'll try to post my findings here when/if I get this thing to work.
March 31, 2008 at 2:33 pm
Ok, here's the basis of my workaround.
I created a script task to replace the Update Parameters data flow task. I also wrote a custom SQL function that parses the parameter values for me, although with a script task this part could also have been done in VB.NET.
In the script, fill a DataTable (you'll need to add System.Xml as a reference) with the results of the original SELECT statement against dbo.ExecutionLogs. Then loop through the rows of the DataTable and pass the Parameters column to a parse function, which returns Name and Value columns. Once you have the log ID, name, and value, send those to another function that writes them to the dbo.ExecutionParameters table.
The concept is pretty typical of ADO.NET, it's just a shame that I had to resort to this. But even with my enormous volume of parameters, the script only took about 13 minutes to run, and it ran very smoothly.
One note: I'm sure there are better ways to do it, but I ended up putting the connection string for my SqlConnection in a variable, because I couldn't figure out how to get the connection string from a connection manager. (An OLE DB connection uses the "Provider" syntax, which throws an error from ADO.NET, and an ADO.NET connection needs Windows Authentication, which I can't provide because I'm running from a scheduled batch.)
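For what it's worth, a script task can also read a connection string straight from a connection manager via the Dts.Connections collection. This is only a sketch: "MyOleDbConn" is a placeholder for whatever your connection manager is named, and an OLE DB string still needs its "Provider" token stripped before ADO.NET will accept it.

```vbnet
'Sketch only: pull the connection string from an existing connection manager.
'"MyOleDbConn" is a placeholder name; change it to match your package.
Dim raw As String = Dts.Connections("MyOleDbConn").ConnectionString

'OLE DB strings carry a "Provider=..." token that SqlConnection rejects,
'so drop that part before handing the string to ADO.NET.
Dim parts As New List(Of String)
For Each part As String In raw.Split(";"c)
    If part.Trim().Length > 0 AndAlso _
       Not part.Trim().StartsWith("Provider", StringComparison.OrdinalIgnoreCase) Then
        parts.Add(part)
    End If
Next
Dim connString As String = String.Join(";", parts.ToArray())
```

If that works in your environment, it saves maintaining a separate connString variable; putting it in a variable, as above, is just the route that worked for me.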
Hope this helps...
Here's the code for the script, and below that is the code for the SQL function. I make no warranties for this code, but it works for me.
Imports System
Imports System.Data
Imports System.Data.SqlClient
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain

    '--Notes--
    ' - Must add System.Xml as a reference!
    ' - There are various ways to get connections/connection strings. I set this script
    '   to use a variable called "connString" that contains the connection string and
    '   is made available to this script in the task properties (ReadOnlyVariables).
    ' - This script takes the place of the "Update Parameters" data flow task. It does the
    '   exact same thing, only better!
    ' - There should now be no limit to the length of parameters in a report. Of course,
    '   the datatype of the column that is being written to could be a limitation.
    '--End Notes--

    Public Sub Main()
        Dim cnn As New SqlConnection(Dts.Variables.Item("connString").Value.ToString())
        Dim cmd As New SqlCommand
        cmd.Connection = cnn
        'Same query as the data flow task: get the raw parameter data from dbo.ExecutionLogs.
        cmd.CommandText = "Select ExecutionLogID, Parameters from dbo.ExecutionLogs " & _
            "where Parameters IS NOT NULL and DATALENGTH(Parameters) > 0"
        cmd.CommandType = CommandType.Text

        Dim da As New SqlDataAdapter
        Dim dt As New DataTable
        Try
            cnn.Open()
            da.SelectCommand = cmd
            'Fill a DataTable with the selected data so we can loop through it row by row.
            da.Fill(dt)
            cnn.Close()
        Catch
            Throw
        Finally
            If Not cnn.State = ConnectionState.Closed Then
                cnn.Close()
            End If
        End Try

        Dim cmd2 As New SqlCommand
        cmd2.Connection = cnn
        cmd2.CommandType = CommandType.Text
        'Call the custom function that returns two columns of parsed values.
        cmd2.CommandText = "Select [Name], [Value] from fnParse_Parameters(@str)"
        Try
            cnn.Open()
            Dim ExecutionLogID As Guid
            Dim name As String
            Dim value As String
            Dim dr As DataRow
            'Loop through the rows and parse out the names and values.
            If dt.Rows.Count > 0 Then
                For Each dr In dt.Rows
                    cmd2.Parameters.Clear()
                    cmd2.Parameters.AddWithValue("@str", Trim(dr.Item("Parameters").ToString()))
                    ExecutionLogID = CType(dr.Item("ExecutionLogID"), Guid)
                    Dim dr2 As SqlDataReader = cmd2.ExecuteReader()
                    While dr2.Read()
                        name = Trim(dr2.Item("Name").ToString())
                        value = Trim(dr2.Item("Value").ToString())
                        'Write the values to dbo.ExecutionParameters.
                        WriteValues(ExecutionLogID, name, value)
                    End While
                    'Close the reader before reusing the command on the next row;
                    'leaving it open raises an error on the next ExecuteReader call.
                    dr2.Close()
                Next
            End If
            cnn.Close()
        Catch
            Throw
        Finally
            If Not cnn.State = ConnectionState.Closed Then
                cnn.Close()
            End If
        End Try

        'Return Success. Don't handle any failures here; let exceptions bubble up.
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Sub WriteValues(ByVal ExecutionLogID As Guid, ByVal Name As String, ByVal Value As String)
        Dim cnn As New SqlConnection(Dts.Variables.Item("connString").Value.ToString())
        Dim cmd As New SqlCommand
        cmd.Connection = cnn
        cmd.CommandType = CommandType.Text
        cmd.CommandText = "INSERT INTO [dbo].[ExecutionParameters] ([Name],[Value],[ExecutionLogID]) " & _
            "VALUES (@Name, @Value, @ExecutionLogID)"
        cmd.Parameters.AddWithValue("@Name", Name)
        cmd.Parameters.AddWithValue("@Value", Value)
        cmd.Parameters.AddWithValue("@ExecutionLogID", ExecutionLogID)
        Try
            cnn.Open()
            cmd.ExecuteNonQuery()
            cnn.Close()
        Catch
            Throw
        Finally
            If Not cnn.State = ConnectionState.Closed Then
                cnn.Close()
            End If
        End Try
    End Sub

End Class
Here's the SQL Function:
GO
/****** Object: UserDefinedFunction [dbo].[fnParse_Parameters] Script Date: 03/31/2008 16:25:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [dbo].[fnParse_Parameters]
(@str nvarchar(MAX))
RETURNS @Values TABLE ([Name] nvarchar(MAX), [Value] nvarchar(MAX))
AS
BEGIN
    DECLARE @Delim char(1)
    SET @Delim = '&'
    DECLARE @Delim2 char(1)
    SET @Delim2 = '='
    DECLARE @chrind INT
    DECLARE @chrind2 INT
    DECLARE @Piece nvarchar(MAX)
    DECLARE @Name nvarchar(MAX)
    DECLARE @Value nvarchar(MAX)

    SET @chrind = 1
    WHILE @chrind > 0
    BEGIN
        -- Pull off the next name/value pair (pairs are separated by '&').
        SELECT @chrind = CHARINDEX(@Delim, @str)
        IF @chrind > 0
            SET @Piece = LEFT(@str, @chrind - 1)
        ELSE
            SET @Piece = @str

        -- Split the pair into name and value on '='.
        SET @chrind2 = CHARINDEX(@Delim2, @Piece)
        IF @chrind2 > 0
        BEGIN
            SET @Name = LEFT(@Piece, @chrind2 - 1)
            SET @Value = RIGHT(@Piece, LEN(@Piece) - @chrind2)
        END
        ELSE
        BEGIN
            SET @Name = @Piece
            SET @Value = @Piece
        END

        INSERT @Values([Name], [Value]) VALUES(@Name, @Value)

        -- Drop the piece just processed and continue with the remainder.
        SET @str = RIGHT(@str, LEN(@str) - @chrind)
        IF LEN(@str) = 0 BREAK
    END
    RETURN
END
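To sanity-check the function, you can run it standalone against a made-up parameter string (the values here are invented for illustration):

```sql
-- Quick test of the parser with a sample parameter string.
SELECT [Name], [Value]
FROM dbo.fnParse_Parameters(N'Region=North&Year=2008');
-- Returns two rows: (Region, North) and (Year, 2008).
```

Note that a piece with no '=' in it is inserted with the piece as both name and value, which matches how the function above handles malformed pairs.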
April 7, 2008 at 3:25 am
Thanks for this. I assume you pre-populated a user variable called connString with the connection string, but I'm not sure with what. I've tried, but with no success. If this is what you meant, what did you populate the connString variable with, please?
April 7, 2008 at 7:11 am
Yes, I created a variable called connString, and it contains the connection string. Be sure the connection string doesn't contain the "Provider" keyword, as the OLE DB connection will not take that. Take a look at http://www.connectionstrings.com for help with the connection strings.
Also, be sure that you expose the variable to the script task by adding "connString" to the ReadOnlyVariables section of the script editor. Here's how:
- Double click the script task
- Click "Script" from the menu on the left
- Find the property "ReadOnlyVariables" and type in "connString"
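As an illustration, a connString value for this setup (no "Provider" keyword) would look something like the following; the server, user, and password are placeholders, and the database is RSExecutionLog as in the sample reports:

```
Data Source=MyServer;Initial Catalog=RSExecutionLog;User ID=MyUser;Password=MyPassword;
```

If you can run under Windows Authentication instead, replace the User ID/Password pair with Integrated Security=SSPI.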
There are other ways to get that connection string, including hardcoding it in the script. I'm more of a .NET developer than an SSIS guy, so this way made the most sense to me, and was honestly the only way I could get it to work myself.
Hope that helps, but if it doesn't, what error message(s) are you getting?
April 11, 2008 at 4:21 am
Wahey!!! That's worked thanks. I rather stupidly was connecting to the report$server database rather than the RSExecutionLog database! Many thanks for this code. Much obliged.
April 11, 2008 at 6:08 am
Glad that helped.
I think that's a major limitation within SSIS. Row-by-row transformations on large amounts of data are not uncommon.