December 30, 2008 at 3:07 am
Yelena Varshal (12/29/2008)
I mostly use CSV / BCP or MS ACCESS for that.
One key point about this approach is that it's "quick and dirty" - if all you want to do is copy a few records from one database to another, and particularly if you already have query windows open to both, it's considerably quicker, easier, and I would guess in most cases less error prone to do it this way than messing about exporting to CSV, re-importing and so on. The cut and paste of the script serves as a sanity check - a preview of what it's going to do before you actually do it - which I think can be a bit more transparent than kicking off an SSIS, DTS or BCP job.
December 30, 2008 at 4:18 am
Although I agree that there is merit in this technique (as I've used it myself on previous projects), I would say this "quick and dirty" attempt is like re-inventing the wheel and making it oval! It's been done before, and done better, as a quick search on the web will show. Personally, I used the usp_generate_inserts procedure mentioned above with a few personal enhancements.
In terms of when this technique is useful, scripting an entire database to text files (schema and data) means you can:
- place the scripts in source control
- perform simple data comparison
- ease deployment (you'd be surprised how many DBAs refuse to, or are not competent to, do anything other than run SQL scripts)
I tend to script out the static metadata e.g. product_type and include the insert script with the create table script.
There are many enhancements that could be made e.g.
- Using NVARCHAR(MAX) so as not to exceed the 4000/8000 character limit
- handling binary data correctly
- why not generate a single insert instead of 1 per row e.g. INSERT ... SELECT ... UNION ALL SELECT ...
- why not make the insert re-runnable e.g. using a WHERE NOT EXISTS clause to only insert data that's not already there
- even better with SQL 2008 you could generate a MERGE query to INSERT, UPDATE or DELETE as appropriate (a sketch of these last two items appears below)
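For illustration, the generated output for those last two suggestions might look roughly like this (the product_type table and its columns are invented for the example):

-- Re-runnable insert: only add the row if its key is not already present
INSERT INTO product_type (type_code, description)
SELECT 'RETAIL', 'Retail product'
WHERE NOT EXISTS (SELECT 1 FROM product_type WHERE type_code = 'RETAIL')

-- SQL 2008 MERGE: INSERT, UPDATE or DELETE as appropriate
MERGE product_type AS tgt
USING (SELECT 'RETAIL', 'Retail product') AS src (type_code, description)
ON tgt.type_code = src.type_code
WHEN MATCHED THEN
    UPDATE SET description = src.description
WHEN NOT MATCHED BY TARGET THEN
    INSERT (type_code, description) VALUES (src.type_code, src.description)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;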
December 30, 2008 at 5:08 am
Adam Tappis (12/30/2008)
Although I agree that there is merit in this technique (as I've used it myself on previous projects), I would say this "quick and dirty" attempt is like re-inventing the wheel and making it oval!
I prefer the comparison that I show you a hammer and you say "but it's not a screwdriver!". I continue to find it useful from time to time, and will continue to use it - if you don't find it useful, well, don't use it!
This is nothing more than something I knocked up in an idle few minutes that I thought was worth sharing. I did google for something similar first, but didn't hit on the right combination of search terms. One thing I was trying to achieve was to keep things as simple as possible while getting to the intended result, so that the procedure itself would perhaps spark some ideas in people about what could be done in terms of using SQL to write SQL.
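Stripped right down, the "SQL writing SQL" idea is just this (using a hypothetical two-column product_type table):

SELECT 'INSERT INTO product_type (type_code, description) VALUES ('''
     + REPLACE(type_code, '''', '''''') + ''','''
     + REPLACE(description, '''', '''''') + ''')'
FROM product_type

Each row of the result set is itself an INSERT statement that can be pasted into a query window pointed at the target database.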
- place the scripts in source control
- perform simple data comparison
- ease deployment (you'd be surprised how many DBAs refuse to, or are not competent to, do anything other than run SQL scripts)
This really goes way beyond the intention here. I guess you could use it to script some key coding table data, and source control that, but really that's not what it is for (personally, I would tend to use something like Red Gate SQL Data Compare for that). I use this for much more "ad hoc" tasks - when setting up a SQL Compare project or an SSIS/DTS job would be overkill.
- Using NVARCHAR(MAX) so as not to exceed the 4000/8000 character limit
I still have some SQL 2000 databases it needs to work on. No reason not to change it for SQL 2005 and above ...
- handling binary data correctly
Personally, I just didn't feel the need - if it gets much beyond what can be verified visually, whether in number of records, number of fields or the data types it needs to operate on, one of the several other available solutions might be more appropriate. I guess it would be easy enough to do, though.
- why not generate a single insert instead of 1 per row e.g. INSERT ... SELECT ... UNION ALL SELECT ...
Certainly could do - but why? This is intended for a small number of records, so performance is not a particular issue. Mind you, it would considerably mitigate the string length limitations for tables with a lot of columns, I guess. I tend not to use this where there are a lot of columns though - I would prefer something like SQL Data Compare, which gives me a more digestible preview of what is going to happen in those cases.
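For reference, the single-statement form being suggested would generate something like this (values invented):

INSERT INTO product_type (type_code, description)
SELECT 'RETAIL', 'Retail product' UNION ALL
SELECT 'TRADE', 'Trade product' UNION ALL
SELECT 'SAMPLE', 'Free sample'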
- why not make the insert re-runnable e.g. using a WHERE NOT EXISTS clause to only insert data that's not already there
Presumably based on ascertaining the key fields? Again, I was staying away from the additional complexity because I didn't particularly feel the need, and also with the limitations on string length brought about by the general approach, this could itself cause some problems.
- even better with SQL 2008 you could generate a MERGE query to INSERT, UPDATE or DELETE as appropriate
See SQL 2000 comment earlier.
January 1, 2009 at 4:34 pm
Hi all,
this is a really good idea.
I have sometimes struggled with the need to get a few thousand rows from one database into a local test database, which can be done very easily with this code. Thanks for this.
But it would be nice if such functionality could live in a "tools" database, so I improved the code a bit.
Now it is possible to point the procedure at the relevant database with an additional input parameter. Hope that is an improvement someone needs.
Joe
Here is the code:
--======================================================================
-- Function to format the output value for inclusion in the VALUES list
--======================================================================
CREATE FUNCTION spVal(@str nvarchar(4000)) RETURNS nvarchar(4000) AS
BEGIN
DECLARE @res nvarchar(4000)
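-- NULL becomes the literal NULL; anything else is quoted as a Unicode string
-- with any embedded single quotes doubled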
IF @STR IS NULL
SET @res = 'NULL'
ELSE
SET @res = 'N''' + REPLACE(@str, '''', '''''') + ''''
RETURN(@res)
END
GO
--======================================================================
-- Function for the special case of formatting a money value for
-- inclusion in the VALUES list
--======================================================================
CREATE FUNCTION spMVal(@val money) RETURNS nvarchar(4000) AS
BEGIN
DECLARE @res nvarchar(4000)
IF @val IS NULL
SET @res = 'NULL'
ELSE
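-- CONVERT style 2 renders money with no thousand separators and four decimal places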
SET @res = CONVERT(varchar(20), @val, 2)
RETURN(@res)
END
GO
--======================================================================
-- Create a script for inserting data into the specified table, based
-- on the optional condition
--
-- e.g.
-- EXEC [dbo].[spScriptInsert]
--     @database = N'Test',
--     @table = N'Pricing'
--======================================================================
CREATE PROC spScriptInsert(@database sysname,
@table varchar(80),
@condition varchar(80) = '1=1')
AS
DECLARE @fields nvarchar(4000)
, @values nvarchar(4000)
, @sql nvarchar(4000)
, @fieldname nvarchar(128)
, @colorder int
, @type varchar(40)
if object_id('tempdb..##fields') is not null
DROP table ##fields
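-- A global temp table (##fields) is needed because a local #temp created inside
-- sp_executeSQL would be dropped when the dynamic batch ends (note this also
-- means two concurrent executions would collide).
-- The three-part names (database.dbo.syscolumns etc.) let the metadata query
-- run against the database named in @database.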
SET @sql = 'SELECT TOP 100 colorder, syscolumns.name, systypes.name as type
INTO ##fields
FROM ' + @database + '.dbo.syscolumns
JOIN ' + @database + '.dbo.sysobjects ON sysobjects.id = syscolumns.id
JOIN ' + @database + '.dbo.systypes ON systypes.xusertype = syscolumns.xusertype
WHERE sysobjects.name = ''' + @table + ''' and systypes.name <> ''text''
ORDER BY colorder'
EXEC sp_executeSQL @sql
DECLARE fieldscursor CURSOR FOR
SELECT [colorder], [name], [type]
FROM ##fields
ORDER BY colorder
OPEN fieldscursor
FETCH NEXT FROM fieldscursor INTO @colorder, @fieldname, @type
SET @fields = ''
SET @values = ''
WHILE @@FETCH_STATUS = 0
BEGIN
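-- Accumulate the bracketed column list and, in parallel, the expression that
-- will render each column's value in the generated VALUES clause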
SET @fields = @fields + ',[' + @fieldname + ']'
IF @type = 'money'
-- Special case for "money" type
SET @values = @values + '+'',''+dbo.spMVal([' + @fieldname + '])'
ELSE
SET @values = @values + '+'',''+dbo.spVal([' + @fieldname + '])'
FETCH NEXT FROM fieldscursor INTO @colorder, @fieldname, @type
END
CLOSE fieldscursor
DEALLOCATE fieldscursor
SET @sql = 'SELECT ''INSERT INTO ' + @table + '(' +
SUBSTRING(@fields, 2, 2000) +
') VALUES (''+ ' +
SUBSTRING(@values, 6, 1000) + '+'')'' FROM ' +
@database + '.dbo.' + @table + ' WHERE ' + @condition
EXEC sp_executeSQL @sql
GO
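To test the amended procedure (assuming, say, a copy of the Northwind sample database is present):

EXEC dbo.spScriptInsert
    @database = N'Northwind',
    @table = 'Customers',
    @condition = 'Country = ''UK'''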
January 1, 2009 at 8:05 pm
Handy bit of code.
I personally use SSMS Tools Pack's "Generate Insert Statements" function (http://www.ssmstoolspack.com/) directly within SSMS, which also works with SQL 2000; however, your version caters for the case where the generation must be scripted.
January 2, 2009 at 6:55 am
Mmm, the cursor doesn't loop over the rows in the source table, but over the fields of the source table...
Greetings
David Rodríguez
January 2, 2009 at 7:57 am
It shouldn't need to cursor through the fields either.
Below is a version I'd used in the past. I've just updated it for SQL 2005 and 2008. It's not got lots of bells and whistles, but it supports multiple tables/schemas (using wildcards in the table and schema name) and, as far as my testing goes, supports all the new data types (some additional enhancement may be required to handle sql_variant and UDTs).
It also uses a cursor, but only to iterate over all the tables that match the parameter definitions.
Oh, and it supports scripting data from views.
CREATE PROCEDURE [dbo].[usp_generate_inserts]
/***
$One-Liner
Generates an insert script for the specified object(s)
$Detail
This stored procedure will generate an INSERT script for the data contained in
a table or view. It supports wildcards for both the table name and schema parameters.
$Example
script data from [dbo].[test]
EXECUTE usp_generate_inserts 'test'
script data from all tables that begin with dim in the warehouse schema
EXECUTE usp_generate_inserts 'dim%', 'warehouse'
$Support
SQL Server 2005 and 2008
$Created By Date
=====================================================
Adam Tappis 31 Dec 2008
***/
(
@i_table_name NVARCHAR(255) = NULL
, @i_schema_name NVARCHAR(255) = 'dbo'
)
AS
DECLARE
@vsSQL NVARCHAR(MAX)
, @vsCols NVARCHAR(MAX)
, @vsTableName SYSNAME
, @maxID INT
CREATE TABLE #tmp
( id INT IDENTITY
, [--sqltext] NVARCHAR(MAX))
CREATE TABLE #tmp2
( id INT IDENTITY
, [--sqltext] NVARCHAR(MAX))
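-- The output column is deliberately named [--sqltext] so that, when the results
-- are copied out as text, the column header itself reads as a SQL comment line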
SET NOCOUNT ON
DECLARE csrTables CURSOR FOR
SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME([name])
FROM sys.objects
WHERE type IN ('U', 'V')
AND [name] LIKE ISNULL(@i_table_name, [name])
AND SCHEMA_NAME(schema_id) LIKE ISNULL(@i_schema_name, SCHEMA_NAME(schema_id))
ORDER BY [name]
OPEN csrTables
FETCH NEXT
FROM csrTables
INTO @vsTableName
WHILE @@fetch_status = 0
BEGIN
SELECT @vsSQL = ''
, @vsCols = ''
SELECT @vsSQL = @vsSQL
+ CASE
-- ignore timestamp columns
WHEN st.name IN ('timestamp') THEN ''
-- handle binary types
WHEN st.name IN ('image','binary','varbinary','geography','geometry') THEN
'ISNULL(sys.fn_varbintohexstr(CAST(' + sc.name + ' AS VARBINARY(MAX))),''NULL'')+'',''+'
-- handle GUID columns
WHEN st.name IN ('uniqueidentifier') THEN
'ISNULL('''''''' + CAST(' + sc.name + ' AS CHAR(36)) + '''''''',''NULL'')+'',''+'
-- handle date and time types
WHEN st.name LIKE '%date%' OR st.name LIKE '%time%' THEN
'ISNULL('''''''' + CONVERT(VARCHAR(50),' + sc.name + ',113) + '''''''',''NULL'')+'',''+'
-- handle string types
WHEN st.name IN ('sql_variant','varchar','char','nvarchar','nchar','sysname','xml','text','ntext','hierarchyid') THEN
'ISNULL(''' + CASE LEFT(st.name,1) WHEN 'n' THEN 'N' ELSE '' END + '''''''+'
+ 'REPLACE(CAST(' + sc.name + ' AS NVARCHAR(MAX)),'''''''','''''''''''')+'''''''',''NULL'')+'',''+'
-- numeric types
ELSE 'ISNULL(CAST(' + sc.name + ' AS VARCHAR(MAX)),''NULL'')+'',''+'
END
FROM sys.columns sc
JOIN sys.types st
ON sc.user_type_id = st.user_type_id
WHERE sc.object_id = OBJECT_ID(@vsTableName)
ORDER BY column_id
SELECT @vsCols = @vsCols + sc.name + ','
FROM sys.columns sc
JOIN sys.types st
ON sc.user_type_id = st.user_type_id
WHERE sc.object_id = OBJECT_ID(@vsTableName)
AND st.name <> 'timestamp'
ORDER BY column_id
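-- Trim the trailing "+','+" separator from the value expression (leaving it
-- ending in an empty string literal) and the trailing comma from the column list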
SELECT @vsSQL = STUFF(@vsSQL,LEN(@vsSQL) - 2, 3, '''')
, @vsCols = STUFF(@vsCols,LEN(@vsCols), 1, '')
INSERT #tmp
EXEC ('SELECT ' + @vsSQL + ' FROM ' + @vsTableName)
-- Check if any rows were present
IF (SELECT COUNT(*) FROM #tmp) > 0
BEGIN
SELECT @maxID = MAX(id)
FROM #tmp
UPDATE #tmp
SET [--sqltext] = 'SELECT ' + SUBSTRING([--sqltext],1,DATALENGTH([--sqltext])-1)
+ CASE id WHEN @maxID THEN '' ELSE CHAR(13) + CHAR(10) + ' UNION ALL' END
INSERT #tmp2
SELECT CHAR(13) + CHAR(10) + 'DELETE FROM ' + @vsTableName + CHAR(13) + CHAR(10) + 'GO' + CHAR(13) + CHAR(10)
IF (SELECT COUNT(*) FROM sys.columns WHERE object_id = OBJECT_ID(@vsTableName) AND is_identity = 1) > 0
BEGIN
INSERT #tmp2
SELECT CHAR(13) + CHAR(10) + 'SET IDENTITY_INSERT ' + @vsTableName + ' ON'
END
INSERT #tmp2
SELECT CHAR(13) + CHAR(10) + 'INSERT ' + @vsTableName + '(' + @vsCols + ')'
INSERT #tmp2
SELECT [--sqltext]
FROM #tmp
IF (SELECT COUNT(*) FROM sys.columns WHERE object_id = OBJECT_ID(@vsTableName) AND is_identity = 1) > 0
BEGIN
INSERT #tmp2
SELECT CHAR(13) + CHAR(10) + 'SET IDENTITY_INSERT ' + @vsTableName + ' OFF'
END
INSERT #tmp2
SELECT 'GO' + CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10) + 'UPDATE STATISTICS ' + @vsTableName + CHAR(13) +
CHAR(10) + 'GO'
DELETE #tmp
END
ELSE
BEGIN
INSERT #tmp2
SELECT '-- No rows in table ' + @vsTableName
END
FETCH NEXT
FROM csrTables
INTO @vsTableName
END
CLOSE csrTables
DEALLOCATE csrTables
UPDATE #tmp2
SET [--sqltext] = SUBSTRING([--sqltext], 1, CHARINDEX(',)', [--sqltext]) - 1)
+ ',NULL)'
WHERE CHARINDEX(',)', [--sqltext]) <> 0
UPDATE #tmp2
SET [--sqltext] = REPLACE([--sqltext], ',''''', ',NULL')
WHERE CHARINDEX(',''''', [--sqltext]) <> 0
UPDATE #tmp2
SET [--sqltext] = REPLACE([--sqltext], '(''''', '(NULL')
WHERE CHARINDEX('(''''', [--sqltext]) <> 0
SELECT [--sqltext]
FROM #tmp2
ORDER BY id
SET NOCOUNT OFF
GO
January 2, 2009 at 9:57 am
This seems to be addressing a different issue to the one I was trying to solve. First of all, it's pretty tied to SQL Server 2005 and above - e.g. the use of "sys.objects" rather than "sysobjects" - whereas I wanted to retain backwards compatibility, at least for the time being!
Adam Tappis (1/2/2009)
It's not got lot's of bells and whistles, but supports multiple tables/schemas (using wildcard in table and schema name)
I think this, together with the omission of a condition parameter, is the biggest indication that it's trying to do something different to what I set out to do. The article starts:
I found myself using DTS fairly frequently to transfer trivial amounts of data from one database to another - frequently fairly small volumes, such as new additions to coding tables, and restoring that one record or two that someone had accidentally deleted.
So I'm not really setting out to transfer large quantities of data from several tables - in fact, I would tend not to use this if the output were more than I could comfortably scroll through and do a quick sanity check. I've got SQL Data Compare - I'm not seeking to reinvent that particular wheel.
It shouldn't need to cursor through the fields either.
It certainly doesn't need to use a cursor to iterate through the fields, but I don't really feel any reason not to - I think it makes what it is doing reasonably clear, and performance in this context isn't really an issue, since it only runs occasionally on quite a small data set (i.e. the set of columns for the single table specified). It wasn't the original intention to iterate across several tables, particularly since in many cases a condition would be present, and the same condition might well not make sense on several different tables.