Hi Annu, this video was excellent! I was wondering, in the case of explicit mapping when copying from a hierarchical source to a hierarchical sink, is it possible to add a column in the sink which is not present in the source? For example, in your video, if you wanted to add a hardcoded column named "status" to your output JSON and fill it with the value "active", would that be possible? And another question: can you build more complex sink structures based on the input data? For example, imagine you need the 'first_name' and 'last_name' from the source to be included in the output JSON inside a JSON object named 'person_name', like 'person_name': {'first_name': 'xxx', 'last_name': 'yyy'}. Could you do that with the mapping, or would we need a data flow for more complex activities?
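For what it's worth, here is a hedged sketch of what a copy activity's mapping could look like for the nested 'person_name' case, assuming a JSON source and sink. The property names come from the question; the hardcoded "status" column is added via the source's "additional columns" feature rather than the mapping itself, and whether that combines with hierarchical mapping exactly like this is worth verifying:

```json
{
  "source": {
    "type": "JsonSource",
    "additionalColumns": [
      { "name": "status", "value": "active" }
    ]
  },
  "translator": {
    "type": "TabularTranslator",
    "mappings": [
      { "source": { "path": "$['first_name']" }, "sink": { "path": "$['person_name']['first_name']" } },
      { "source": { "path": "$['last_name']" },  "sink": { "path": "$['person_name']['last_name']" } },
      { "source": { "name": "status" },          "sink": { "path": "$['status']" } }
    ]
  }
}
```

If the mapping alone can't express the structure you need, a mapping data flow (with a derived column and hierarchy in the sink mapping) is the usual fallback.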
I have upwards of 500 columns in my source XML files, which I'm trying to copy, but it's giving me a "character exceeds limit of 8512" error. Any workaround in this case?
I am trying to copy data from one SQL table to another, but when I try to import the schema, I am not getting the option to change the data type. Please guide me to understand the reason behind this.
Really good stuff. Thanks for sharing!!
Simple and clear explanation of schema mapping !!
Thank you !!
thanks, this was very helpful!
Excellent lecture Annu
Please make more videos🦋🦋🦋
Thank you so much
Good one ✌️
Thank you so much 🙂
Thank you so much😊
Thank you for watching
Hi ma’am
Can we do this copy activity with a Delta table as the sink?
Hi Mam,
How do you join a column from another table in ADF?
24:30 What if we have multiple arrays in the source?
So how do you copy multiple CSV files without headers and map them to existing tables in Azure without the Prop_0 error?
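One hedged sketch, in case it helps: when a delimited dataset has no header row, ADF auto-names the columns Prop_0, Prop_1, and so on, so an explicit mapping can rename them to the target table's columns instead of letting the auto-generated names leak through (the sink column names below are placeholders, not from the video):

```json
{
  "translator": {
    "type": "TabularTranslator",
    "mappings": [
      { "source": { "name": "Prop_0" }, "sink": { "name": "customer_id" } },
      { "source": { "name": "Prop_1" }, "sink": { "name": "customer_name" } }
    ]
  }
}
```

For many files sharing one layout, the same mapping can sit on a single parameterized copy activity driven by a ForEach over the file list.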
My use case: if corrupt input data occurs, I want to log that data in a SQL table without using upsert, fault tolerance, enable logging, or a data flow. Is there another way? E.g., if string data comes in instead of an int, I want to skip that row, record it in a table, and not interrupt the rest of the run.
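One possible pattern, sketched under the assumption that you can land the raw data in an all-varchar staging table first (all table and column names here are illustrative, not from the video): copy everything as text, then use TRY_CAST to route rows that fail the int conversion into an error table while the clean rows load normally.

```sql
-- Assumes dbo.staging_customers was loaded with every column as varchar.

-- Rows whose "age" is not a valid int go to an error log table.
INSERT INTO dbo.error_log (raw_id, raw_age, error_reason)
SELECT id, age, 'age is not a valid int'
FROM dbo.staging_customers
WHERE age IS NOT NULL AND TRY_CAST(age AS int) IS NULL;

-- Clean rows (valid int or NULL) go to the target table.
INSERT INTO dbo.customers (id, age)
SELECT id, TRY_CAST(age AS int)
FROM dbo.staging_customers
WHERE age IS NULL OR TRY_CAST(age AS int) IS NOT NULL;
```

Both statements could run from a stored procedure or script activity after the copy, so a bad row never stops the run.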
Why not fault tolerance? It's the best and easiest way for this scenario
At 8:47, how is the table name created? We used auto-create but didn't give any table name.