June 19, 2008 at 4:08 pm
Hi,
I want to cross join a table with the values 0 to 7. I could do this by creating a table with those values; is there any way to do this without creating a table, using a select statement such as:
select * from table1 cross join (select '0','1','2','3','4','5','6','7') table2
This doesn't work, as it creates those values in a single row. I want them in a column, so that every row in table1 will join with 0 to 7.
Thanks in advance ,
JRJ
June 19, 2008 at 4:12 pm
Try this:
select *
from table1
cross join (select '0' as N
            union all select '1' as N
            union all select '2' as N
            union all select '3' as N
            union all select '4' as N
            union all select '5' as N
            union all select '6' as N
            union all select '7' as N) as table2
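As a side note, on SQL Server 2008 or later the same derived table can be written more compactly with a table value constructor (a sketch, assuming the newer syntax is available to you):

```sql
-- Same eight-row derived table, written with a row constructor (SQL Server 2008+)
select *
from table1
cross join (values ('0'), ('1'), ('2'), ('3'),
                   ('4'), ('5'), ('6'), ('7')) as table2 (N);
```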
-- RBarryYoung, (302) 375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
June 20, 2008 at 1:09 am
Thanks very much, that works.
JRJ
June 20, 2008 at 6:21 am
Glad I could help.
-- RBarryYoung, (302) 375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
June 20, 2008 at 7:40 am
You can do that... but a lot of such things will come up in the future. I recommend that you create a Tally table and learn how to use it... here are a couple of articles to get you started... if you actually read them and do some of the examples, you will quickly discover other uses for the Tally table...
http://www.sqlservercentral.com/articles/TSQL/62867/
http://www.sqlservercentral.com/articles/T-SQL/63003/
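For reference, a minimal Tally table along the lines of those articles might look like this (a sketch only; the table name and row count are illustrative, and the articles cover the details):

```sql
-- Build a small permanent Tally table of sequential numbers (illustrative)
select top 11000
       identity(int, 1, 1) as N
into   dbo.Tally
from   master.dbo.syscolumns sc1
cross join master.dbo.syscolumns sc2;

-- A clustered primary key keeps lookups and range scans fast
alter table dbo.Tally
  add constraint PK_Tally_N primary key clustered (N);

-- The original question then becomes a simple range filter:
-- select * from table1
-- cross join (select N - 1 as N from dbo.Tally where N <= 8) as table2;
```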
--Jeff Moden
Change is inevitable... Change for the better is not.
June 20, 2008 at 8:25 am
I just want to register my agreement with Jeff. If this is truly a one-time thing and really only 7 items, then sure, go with this. But if you're going to be doing this again, or you're going to be using multiple or varying counts, then definitely switch to a Numbers or "Tally" table now. It's easy to set up and much easier to use than this trick. Much more flexible, too.
-- RBarryYoung, (302) 375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
June 20, 2008 at 8:38 am
rbarryyoung (6/20/2008)
I just want to register my agreement with Jeff.
Heh... ok, ok... I promise to never draw smiley faces on your helmet ever again 😛
--Jeff Moden
Change is inevitable... Change for the better is not.
June 20, 2008 at 8:39 am
Jeff Moden (6/20/2008)
rbarryyoung (6/20/2008)
I just want to register my agreement with Jeff.
Heh... ok, ok... I promise to never draw smiley faces on your helmet ever again 😛
I wouldn't make that promise....:hehe:
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part... unless you're my manager... or a director and above... or a really loud-spoken end-user. All right - what was my emergency again?
June 20, 2008 at 9:02 am
Matt Miller (6/20/2008)
Jeff Moden (6/20/2008)
rbarryyoung (6/20/2008)
I just want to register my agreement with Jeff.
Heh... ok, ok... I promise to never draw smiley faces on your helmet ever again 😛
I wouldn't make that promise....:hehe:
Nor would I. Uh,... wait, that's not right...
-- RBarryYoung, (302) 375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
June 20, 2008 at 4:39 pm
Thanks guys for your response, it's really useful. I am just wondering which one will be faster when I deal with some 50,000 records: the one with the temp table as in the query above, or a permanent table?
Please could you help me with the following query:
I have a table with data imported every hour from another DB using a DTS copy column task (maximum around 50,000 records); I delete all the data in my table just before I import the data each time. If I create indexes on this table, will it be an overhead for my DB, since I am deleting and rebuilding the table every hour? (I suppose if I have 3 indexes on my table, the DB has to rebuild index entries for 150,000 rows every hour or so.)
Thanks in advance
June 20, 2008 at 5:11 pm
JRJ (6/20/2008)
Thanks guys for your response, it's really useful. I am just wondering which one will be faster when I deal with some 50,000 records: the one with the temp table as in the query above, or a permanent table?
That's not a temp table, it is a "derived table", and the permanent table is at least as fast, and probably much faster.
-- RBarryYoung, (302) 375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
June 20, 2008 at 5:20 pm
JRJ (6/20/2008)
Thanks guys for your response, it's really useful. I am just wondering which one will be faster when I deal with some 50,000 records: the one with the temp table as in the query above, or a permanent table? Please could you help me with the following query:
I have a table with data imported every hour from another DB using a DTS copy column task (maximum around 50,000 records); I delete all the data in my table just before I import the data each time. If I create indexes on this table, will it be an overhead for my DB, since I am deleting and rebuilding the table every hour? (I suppose if I have 3 indexes on my table, the DB has to rebuild index entries for 150,000 rows every hour or so.)
Thanks in advance
I'm not trying to be flippant here... trying to give some good advice based on experience...
If you were truly concerned about performance, you wouldn't be using DTS to import your data and you wouldn't be using DELETE to empty the table. Instead, you would be using BULK INSERT, possibly with a BCP Format file, and you would either be using TRUNCATE to empty your staging table or you'd be building a new one in TempDB for every import. If it's taking you longer than 5 seconds to import only 50,000 rows, you are simply doing it the wrong way and may need to make a change.
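As a rough sketch of that approach (the table name, file path, and delimiters below are placeholders, not details from this thread):

```sql
-- Empty the staging table; TRUNCATE deallocates pages instead of logging each row delete
truncate table dbo.StagingTable;

-- Load the hourly extract in bulk; path and options are illustrative
bulk insert dbo.StagingTable
from 'C:\imports\hourly_extract.txt'
with (fieldterminator = ',',
      rowterminator   = '\n',
      tablock);  -- table lock allows minimally logged bulk load
```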
--Jeff Moden
Change is inevitable... Change for the better is not.
June 20, 2008 at 6:01 pm
Thanks for your reply, sorry about the wrong naming.
Regarding my second query, I am dealing with constantly changing existing data as well as new data; if I don't delete, then I have to update the existing records and insert the new ones, which is not possible with the DTS copy column task. Is it possible with BCP?
Can you illustrate further how I can use tempdb for this? These data need to be accessed by users through a web page, and the users have permissions for this DB. If I need to use tempdb, then I need to assign permissions on tempdb as well. Is that OK security-wise?
June 20, 2008 at 6:22 pm
JRJ (6/20/2008)
Thanks for your reply, sorry about the wrong naming. Regarding my second query, I am dealing with constantly changing existing data as well as new data; if I don't delete, then I have to update the existing records and insert the new ones, which is not possible with the DTS copy column task. Is it possible with BCP?
Can you illustrate further how I can use tempdb for this? These data need to be accessed by users through a web page, and the users have permissions for this DB. If I need to use tempdb, then I need to assign permissions on tempdb as well. Is that OK security-wise?
When you say "constantly changing existing data as well as new data", are you saying that the hourly file contains a complete replacement for all data whether it's existing data or new data?
--Jeff Moden
Change is inevitable... Change for the better is not.
June 21, 2008 at 8:06 am
Sorry Jeff, I went to bed after my last reply...
Yes, every import is a complete replacement of old and new data.