September 25, 2019 at 7:42 am
Hi
I am trying to load data from an on-premises SQL Server to a SQL Server on a VM, and I need to do it every day, so I created a trigger. The trigger inserts the data properly, but now I also need to insert the trigger ID into a column of the destination table on every run.
I tried to map the value like this, but it gives an error:
"Activity Copy Data1 failed: Please choose only one of the three property "name", "path" and "ordinal" to reference columns for "source" and "sink" under "mappings" property. "
Pipeline details below. Please suggest a fix.
{
    "name": "pipeline1",
    "properties": {
        "activities": [
            {
                "name": "Copy Data1",
                "type": "Copy",
                "dependsOn": [],
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "source": {
                        "type": "AzureSqlSource"
                    },
                    "sink": {
                        "type": "SqlServerSink"
                    },
                    "enableStaging": false,
                    "translator": {
                        "type": "TabularTranslator",
                        "mappings": [
                            {
                                "source": {
                                    "name": "Name",
                                    "type": "String"
                                },
                                "sink": {
                                    "name": "Name",
                                    "type": "String"
                                }
                            },
                            {
                                "source": {
                                    "type": "String",
                                    "name": "@pipeline().parameters.triggerIDVal"
                                },
                                "sink": {
                                    "name": "TriggerID",
                                    "type": "String"
                                }
                            }
                        ]
                    }
                },
                "inputs": [
                    {
                        "referenceName": "AzureSqlTable1",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "SqlServerSQLDEV02",
                        "type": "DatasetReference"
                    }
                ]
            }
        ],
        "parameters": {
            "triggerIDVal": {
                "type": "string"
            }
        },
        "annotations": []
    }
}
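For context, the error occurs because each "source" entry under "mappings" must reference a real source column by name, path, or ordinal; an expression such as @pipeline().parameters.triggerIDVal is not a column. One hedged alternative, assuming a Data Factory version that supports the copy activity's additionalColumns source property (the column name TriggerID matches the sink mapping in the post), is to inject the parameter as an extra source column and then map it like any other column:

```json
"source": {
    "type": "AzureSqlSource",
    "additionalColumns": [
        {
            "name": "TriggerID",
            "value": "@pipeline().parameters.triggerIDVal"
        }
    ]
}
```

With that in place, the second mapping entry can reference "TriggerID" as an ordinary source column name.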
September 26, 2019 at 8:10 am
Thanks for posting your issue and hopefully someone will answer soon.
This is an automated bump to increase visibility of your question.
December 18, 2019 at 3:48 pm
Hi
If I have correctly understood what you are trying to do, you want to pass a pipeline parameter as part of the data copy from source to destination, inserting the pipeline parameter as a new column (or even replacing an existing value) in your destination table.
If that is correct, the way to do this is as follows:
1. Define a table type in the destination database. (I defined mine with the same columns as my destination table, but essentially it needs to match the source table columns.)
2. Create a stored procedure that takes the table type and the trigger ID as parameters:
create proc MyInsertProc @MyDefinedTableTypeParameter [dbo].[MyDefinedTableType] READONLY, @TriggerID int
@MyDefinedTableTypeParameter is a variable of your table type, and [dbo].[MyDefinedTableType] is the table type object you defined in step 1. In addition, of course, you will need to define what your procedure does... i.e. insert the records into the destination table!
3. In the copy activity sink, point to the stored procedure:
3.1 Specify the table type as [dbo].[MyDefinedTableType] (or as you have named it).
3.2 Specify the table type parameter as MyDefinedTableTypeParameter (or again as per the name you defined in step 2).
3.3 Map your pipeline parameters to your proc parameters as required.
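Putting steps 1 and 2 together, here is a minimal sketch (the table and column names are hypothetical placeholders; adjust them to your own schema):

```sql
-- Step 1: a table type matching the source columns (hypothetical single column)
CREATE TYPE [dbo].[MyDefinedTableType] AS TABLE
(
    [Name] NVARCHAR(100)
);
GO

-- Step 2: a proc that inserts the rows plus the trigger ID passed from the pipeline
CREATE PROC [dbo].[MyInsertProc]
    @MyDefinedTableTypeParameter [dbo].[MyDefinedTableType] READONLY,
    @TriggerID INT
AS
BEGIN
    -- Insert every incoming row, stamping each with the trigger ID
    INSERT INTO dbo.DestinationTable ([Name], [TriggerID])
    SELECT [Name], @TriggerID
    FROM @MyDefinedTableTypeParameter;
END
```

For step 3, the copy activity sink then references the proc through its sqlWriterStoredProcedureName, sqlWriterTableType, and storedProcedureParameters properties.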
I hope that helps you out. I can see the post is from a couple of months ago, but I've only recently been through the same process.
Let me know if it's useful 🙂
MCITP: Business Intelligence Developer (2005)
July 5, 2020 at 12:51 pm
Hello,
This article on logging ADF pipeline activity covers several methods of capturing pipeline run details, including the trigger ID.
https://www.mssqltips.com/sqlservertip/6320/logging-azure-data-factory-pipeline-audit-data/
Hope this helps.