February 26, 2015 at 3:30 pm
I need to combine the Notes field where Number and date are the same. For example, for Number 0000000003 I need the notes to read ('CHK # 2452 FOR $122.49 REJECTED AS NSF ON 2/25/15'). The note counter is different for each row and is a combination of special characters and 0-Z; it looks like the (!) marks the start of a new Number.
CREATE TABLE [dbo].[MyTable](
[NUMBER] [varchar](10) NULL,
[HD_DATE_TIMEX] [datetime] NULL,
[TRANS_NO] [varchar](2) NULL,
[MESSAGE_COUNTER] [varchar](1) NULL,
[MESSAGE_2] [varchar](40) NULL,
[SQL_LAST_UPDATE] [datetime] NULL
) ON [PRIMARY]
INSERT INTO MyTable (Number, tDate,tNum,Note_Counter, Note, SQL_Last_Update)
VALUES ('0000000003','2015-02-25 13:18:02.000','00','!','CHK # 2452 FOR $122.49 REJECTED AS NSF ','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, tDate,tNum,Note_Counter, Note, SQL_Last_Update)
VALUES ('0000000003','2015-02-25 13:18:02.000','00','"',ON 2/25/15','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, tDate,tNum,Note_Counter, Note, SQL_Last_Update)
VALUES ('0000000009','2015-02-18 11:03:43.000','00','!','Debbie CALLED TO GO OVER THE ACCT. SHE ','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, tDate,tNum,Note_Counter, Note, SQL_Last_Update)
VALUES ('0000000009','2015-02-18 11:03:43.000','00','"','WILL BE MAILING PAYMENT THIS','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, tDate,tNum,Note_Counter, Note, SQL_Last_Update)
VALUES ('0000000009','2015-02-18 11:03:43.000','#','00','WEEK.','2015-02-25 13:18:03.000')
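To illustrate, the combined output I am after is one row per Number and date, something like:
Number      HD_DATE_TIMEX            Notes
0000000003  2015-02-25 13:18:02.000  CHK # 2452 FOR $122.49 REJECTED AS NSF ON 2/25/15
0000000009  2015-02-18 11:03:43.000  Debbie CALLED TO GO OVER THE ACCT. SHE WILL BE MAILING PAYMENT THIS WEEK.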
February 26, 2015 at 3:57 pm
One of these should do it:
http://www.sqlusa.com/bestpractices/training/scripts/commadelimitedlist/
February 27, 2015 at 7:36 am
I've tried using this and it didn't really do the job. Could you provide an example with the data I provided?
February 27, 2015 at 7:49 am
Not enough information. Show us what you tried first.
February 27, 2015 at 7:58 am
;WITH ARNOTES AS (
    SELECT * FROM [myTable0914]
    UNION ALL
    SELECT * FROM [myTable1014]
    UNION ALL
    SELECT * FROM [myTable1114]
    UNION ALL
    SELECT * FROM [myTable1214]
)
SELECT ARNOTES.AR_NUMBER, HD_DATE_TIMEX, ARNOTES.TRANS_NO, ARNOTES.MESSAGE_COUNTER,
    RTRIM(LTRIM(ARNOTES.Message_2)) AS Notes,
    DATEDIFF(day, HD_DATE_TIMEX, GETDATE()) AS daysoldIND,
    (SELECT RTRIM(LTRIM(ARNOTES.Message_2)) + ' ' AS [text()]
     FROM ARNOTES
     FOR XML PATH ('')) AS Notes,
    ARNOTES.SQL_LAST_UPDATE
FROM ARNOTES
GROUP BY ARNOTES.AR_NUMBER, HD_DATE_TIMEX, ARNOTES.TRANS_NO, ARNOTES.MESSAGE_COUNTER, ARNOTES.Message_2, ARNOTES.SQL_LAST_UPDATE
February 27, 2015 at 9:12 am
You can use STUFF and FOR XML PATH to do it. There are articles on here that show how -- I'm sure, because I've read them. I just don't remember it all...
Here's another post that shows one:
http://www.sqlservercentral.com/Forums/Topic1434207-391-1.aspx
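As a rough sketch of the pattern (against a hypothetical SomeTable with GroupCol and ValueCol columns, not your actual table):
SELECT t.GroupCol,
       STUFF((SELECT ',' + t2.ValueCol -- comma-prefixed list of values in this group
              FROM SomeTable t2
              WHERE t2.GroupCol = t.GroupCol -- correlate back to the outer row's group
              FOR XML PATH('')), 1, 1, '') AS CombinedValues -- STUFF removes the leading comma
FROM SomeTable t
GROUP BY t.GroupCol;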
February 27, 2015 at 9:23 am
Yes, it does stuff it in there, but it creates one long string. As I stated in my original post, I need to concatenate where the Number and date are the same, and where the counter begins with ! and ends before the next ! (that is one comment for that Number).
February 27, 2015 at 9:49 am
Here is the article about this topic: http://www.sqlservercentral.com/articles/comma+separated+list/71700/
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
February 27, 2015 at 9:52 am
Here is the challenge: you posted sample DDL and sample data, but they don't match up. The columns in the inserts don't exist in the table, and there are numerous broken string values in the data.
And really?? varchar(1)???
February 27, 2015 at 10:32 am
And really, yeah, varchar(1). I didn't create the table; I am forced to use it.
February 27, 2015 at 11:55 am
Well, if you want some help, we need DDL and sample data that we can use.
February 27, 2015 at 12:04 pm
The sample data does match the fields I've indicated to insert values into. MyTable1114, MyTable1214, etc. are tables that I pull the data for 30, 60, 90, and 120 days from the date run; I am not worried about those. Once one table is figured out the others won't be a problem; it's the first one I am having problems with. The data does coincide with the fields it is being inserted into; maybe some commas were forgotten, but the data is correct.
February 27, 2015 at 12:20 pm
He means that you gave us an example we can't run. Your CREATE TABLE creates one set of column names, and your inserts reference different ones. You are also missing a quote, and in one row you transposed the order of the values.
I THINK what you want for sample data is this:
INSERT INTO MyTable (Number, [HD_DATE_TIMEX],[TRANS_NO],[MESSAGE_COUNTER],[MESSAGE_2], [SQL_LAST_UPDATE])
VALUES ('0000000003','2015-02-25 13:18:02.000','00','!','CHK # 2452 FOR $122.49 REJECTED AS NSF ','2015-02-25 13:18:03.000')
INSERT INTO MyTable (Number, [HD_DATE_TIMEX],[TRANS_NO],[MESSAGE_COUNTER],[MESSAGE_2], [SQL_LAST_UPDATE])
VALUES ('0000000003','2015-02-25 13:18:02.000','00','"','ON 2/25/15','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, [HD_DATE_TIMEX],[TRANS_NO],[MESSAGE_COUNTER],[MESSAGE_2], [SQL_LAST_UPDATE])
VALUES ('0000000009','2015-02-18 11:03:43.000','00','!','Debbie CALLED TO GO OVER THE ACCT. SHE ','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, [HD_DATE_TIMEX],[TRANS_NO],[MESSAGE_COUNTER],[MESSAGE_2], [SQL_LAST_UPDATE])
VALUES ('0000000009','2015-02-18 11:03:43.000','00','"','WILL BE MAILING PAYMENT THIS','2015-02-25 13:18:03.000');
INSERT INTO MyTable (Number, [HD_DATE_TIMEX],[TRANS_NO],[MESSAGE_COUNTER],[MESSAGE_2], [SQL_LAST_UPDATE])
VALUES ('0000000009','2015-02-18 11:03:43.000','00','#','WEEK.','2015-02-25 13:18:03.000')
But please confirm. And it's good practice to test scripts you are going to post to make sure they are runnable.
I think the query you want is this:
SELECT m.Number, m.HD_DATE_TIMEX,
       STUFF((SELECT ',' + m2.Message_2
              FROM MyTable m2
              WHERE m2.Number = m.Number
                AND m2.HD_DATE_TIMEX = m.HD_DATE_TIMEX
              ORDER BY m2.MESSAGE_COUNTER
              FOR XML PATH ('')), 1, 1, '') AS MessageText
FROM MyTable m
GROUP BY Number, HD_DATE_TIMEX