Well, your suggestions are great. I had recursive function calls as my first solution during the research, but I used this solution from Hunterwood:
;with HierarchyTree ([HierarchyKey], [ParentKey], [Level], nodePath)
as
(
...
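The preview cuts the CTE off here. As a rough sketch only (dbo.Hierarchy and its columns are assumed names, not Hunterwood's actual code), a recursive CTE of this shape usually looks something like:

;with HierarchyTree ([HierarchyKey], [ParentKey], [Level], nodePath)
as
(
    -- anchor member: root rows with no parent
    select h.HierarchyKey,
           h.ParentKey,
           0 as [Level],
           cast(h.HierarchyKey as varchar(max)) as nodePath
    from dbo.Hierarchy h
    where h.ParentKey is null

    union all

    -- recursive member: join children back to the rows already produced
    select c.HierarchyKey,
           c.ParentKey,
           p.[Level] + 1,
           p.nodePath + '/' + cast(c.HierarchyKey as varchar(20))
    from dbo.Hierarchy c
    join HierarchyTree p on c.ParentKey = p.HierarchyKey
)
select HierarchyKey, ParentKey, [Level], nodePath
from HierarchyTree;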
March 16, 2010 at 6:29 pm
Good solution, but assuming the Depth/Level is infinite, it creates an uncertain...
[HierarchyKey] as [Hierarchy!2!HierarchyKey],
ParentKey as [Hierarchy!2!ParentKey],
[Level] as [Hierarchy!2!Level],...
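Those aliases follow the FOR XML EXPLICIT universal-table convention, [Element!TagLevel!Attribute], which is where the depth concern comes from: every nesting level needs its own fixed tag number, so an open-ended hierarchy depth cannot be expressed in one static query. A minimal two-level sketch, with made-up element and table names, looks like:

-- Sketch only: element, table and column names are assumed for illustration.
select  1             as Tag,
        null          as Parent,
        'Hierarchies' as [Root!1!Name],
        null          as [Hierarchy!2!HierarchyKey],
        null          as [Hierarchy!2!ParentKey],
        null          as [Hierarchy!2!Level]
union all
select  2,
        1,
        'Hierarchies',
        h.HierarchyKey,
        h.ParentKey,
        h.[Level]
from HierarchyTree h
order by [Root!1!Name], [Hierarchy!2!HierarchyKey]
for xml explicit;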
March 10, 2010 at 4:57 pm
GSquared (9/4/2008)
September 4, 2008 at 10:33 am
GSquared (9/4/2008)
Getting all the duplicates is easy. Getting the duplicates since the last duplicate...
September 4, 2008 at 9:16 am
Hmm... interesting. How about having a counter that resets to 1 when it encounters a duplicate? I'm not sure if this is applicable/achievable using partitioning/ranking.
My thought would be:
RowID   RowKey   Counter
1       5        1
2       5        1
3       4        2
4       3        3
5       2        4
6       1        1
7       1        1
Then selecting all...
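If it helps, here is one way to read that idea in T-SQL. It is only a sketch against an assumed #t(RowID, RowKey) table, not necessarily what was meant: flag every key that occurs more than once, then measure each row's distance from the most recent flagged row, so duplicates come out as 1 and the counter climbs again afterwards.

-- Sketch only: #t, RowID and RowKey are assumed names taken from the example table.
create table #t (RowID int primary key, RowKey int);
insert into #t (RowID, RowKey)
values (1,5),(2,5),(3,4),(4,3),(5,2),(6,1),(7,1);

with flagged as
(
    -- IsDup = 1 for every row whose RowKey occurs more than once
    select RowID,
           RowKey,
           case when count(*) over (partition by RowKey) > 1 then 1 else 0 end as IsDup
    from #t
)
select f.RowID,
       f.RowKey,
       -- distance from the most recent duplicate row, inclusive:
       -- duplicates give 1, then the counter climbs until the next duplicate
       f.RowID
         - coalesce((select max(f2.RowID)
                     from flagged f2
                     where f2.IsDup = 1
                       and f2.RowID <= f.RowID), 1)
         + 1 as Counter
from flagged f
order by f.RowID;

With the sample rows this returns the Counter column shown above (1, 1, 2, 3, 4, 1, 1); it assumes RowID has no gaps, so a real query would rank the rows first.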
September 4, 2008 at 7:51 am
Hmm, I have not actually dug deeper into bcp. Can we monitor the records being inserted into the table? Like errors in each field of a row, fk_constraints, pk_constraints, Null...
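For reference, bcp itself only gives row-level error reporting, not a per-field breakdown: the -e switch writes rejected rows to an error file, -m caps how many errors are tolerated before the load aborts, and check/foreign key constraints are skipped by default unless the CHECK_CONSTRAINTS hint is passed. A sketch with made-up server, table and file names:

rem Sketch only: server, database, table and file names are made up.
bcp MyDb.dbo.MyTable in C:\loads\data.txt -c -t, -S MyServer -T -m 50 -e C:\loads\data.err -h "CHECK_CONSTRAINTS"

Rows that fail are written to data.err; anything needing per-column validation is usually loaded into a staging table first and checked with T-SQL.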
August 26, 2008 at 8:36 pm
Yes, but in case it has multiple results, it will yield the first non-null column, so even col1 could be updated from row 1 of the results matching to...
August 13, 2008 at 5:04 am
Never mind, found the answer.
The SQL behaviour would be...
The first non-null field for each group of records would be updated to the source...
So with the tables above this would mean only
Table...
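A small repro of the situation being described, with made-up table names: when the UPDATE ... FROM join matches several source rows for one target row, only one of them is applied, and SQL Server does not guarantee which one.

-- Sketch only: table and column names are made up to illustrate the scenario.
create table #target (id int, col1 varchar(10));
create table #source (id int, col1 varchar(10));

insert into #target values (1, 'old');
insert into #source values (1, null);
insert into #source values (1, 'A');
insert into #source values (1, 'B');

-- Three source rows match the single target row; only one value "wins",
-- and which one depends on the plan rather than on any documented rule.
update t
set    t.col1 = s.col1
from   #target t
join   #source s on s.id = t.id;

select id, col1 from #target;   -- could show NULL, 'A' or 'B'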
August 11, 2008 at 8:12 pm