1. Handle Error Rows in Data Factory Mapping Data Flows

  • Published: 11 May 2020
  • In this video, I discussed handling error rows in Data Factory mapping data flows.
    Link for the Azure Functions playlist:
    • 1. Introduction to Azu...
    Link for the Azure Basics playlist:
    • 1. What is Azure and C...
    Link for the Azure Data Factory playlist:
    • 1. Introduction to Azu...
    #Azure #ADF #AzureDataFactory

Comments • 121

  • @namangupta7111 4 years ago +3

    Thanks WafaStudies, you are making ADF easy for us by sharing videos.

  • @mohitupadhayay1439 1 year ago +6

    Hi Wafa,
    I have some interview questions I was asked. Please make some videos on these as a request:
    1. How to create a GENERIC pipeline that can be reused again?
    2. Write a single-line query to delete data from 10 tables.
    3. What is encoding in the COPY activity?
    4. What is the limitation of the LOOKUP activity?
    5. How do you validate whether your SINK data in ADF is right or wrong?

  • @vinod02791 3 years ago +10

    Hi, these real-time scenarios helped me a lot, added to my experience in Azure, and helped me get a job along with other technologies. Thank you for your videos!! :) :)

  • @mohammedshoaib1769 3 years ago +1

    This is just an awesome explanation. Covering real-time scenarios perfectly gives good insight into ADF. Thank you guys, keep up the good work. Also expecting more about Databricks.

  • @akkuhome9760 4 years ago

    Keep up the good work, mate. Good coverage of topics with hands-on experience.

  • @jaymakam9673 1 year ago +1

    This is exactly what is needed to understand ADF better. Thank you so much for the hard work _/\_.

  • @prekshajain5174 4 years ago +4

    Amazing explanation, WafaStudies, it cleared so many things up. Great work.

  • @AmitSharma-mv5xe 1 year ago

    Really appreciate your contribution, brother.

  • @vivek05117gece 1 year ago

    Hi Wafa, very well explained video. I have watched multiple videos on ADF, but after watching your videos I feel confident for the interview.

  • @kumarpolisetty3048 4 years ago +5

    Thanks WafaStudies. Please start a tutorial on Azure Databricks. Your videos are really helpful.

  • @sriharig9096 1 year ago

    Really helpful...thank you for your efforts

  • @adityachary2478 3 years ago +2

    Fantastic, bro, your explanation is awesome. I learned a lot from your channel.

  • @ArunKumar-kb7fr 4 years ago +1

    Thanks for uploading...explanation is very good..

  • @sraoarjun 9 months ago

    Excellent video !!

  • @arijitmitra8585 2 years ago

    Great content. Kudos to you

  • @dvsubbaraonerella1449 2 years ago +1

    Awesome. Thanks a lot for all the efforts and your time.

  • @meelookaru2888 3 years ago +1

    Thank you so much for your time. It's really helpful.
    Could you please add some more real-time scenarios on ADF?

  • @shubhammanjalkar8591 7 months ago

    Amazing. Thank you so much.

  • @pavankumarvarmadendukuri4665 1 year ago

    Great work brother... keep it up...

  • @soumyaprasanna6606 2 years ago +8

    Thanks for all the effort that you have put in to create the ADF videos. Can you please share the sample CSV files that you have used to explain here?

  • @ootec1 3 years ago +2

    I really loved it. Thanks for the video, buddy.

  • @NehaJain-ry9sr 3 years ago +1

    Awesome video... many lessons in one video, including common errors... Thanks Wafa, keep it up, great work.

  • @akashputti 3 years ago +2

    Thanks, it's definitely going to be helpful. Better than Udemy and Coursera courses.

  • @saivaraprasadarao6990 2 years ago +1

    Excellent explanation
    Thank you so much 😊

  • @vaibhavkumar38 3 years ago +1

    Tip: in the error table, keep the same column names as the normal table, but set the data type to nvarchar, because that data type can store any value.
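
    A small companion sketch on the data flow side, assuming the salesDate and quantity columns referenced elsewhere in this thread: if the source projection typed any columns, a Derived Column placed before the error sink can cast them back to strings so any malformed value fits the nvarchar columns (toString is a built-in mapping data flow function).

        salesDate : toString(salesDate)
        quantity  : toString(quantity)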

  • @thiagocustodio8177 4 years ago +1

    How to validate that the sales item is not null and not an empty string? I could not find a way to write an if statement. Adding one condition per validation will make the flow pretty complex when working with large datasets. Also, is it possible to redirect to the same sink?

  • @Anonymous-cj4gy 3 years ago +5

    Hi,
    If you want to check if there is an error in another column as well, then use the below condition:
    ErrorRows = isNull(toDate(salesDate, 'dd-MMM-yyyy')) || lesser(toInteger(quantity), 0) || isNull(salesItem) || isNull(country)
    Here if any one of the conditions is true then that row will go to the bad table.
    Also, remember if you use dd-MMM-YYYY instead of dd-MMM-yyyy it will give you the wrong output.
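
    A minimal sketch of the empty-string check asked about above, using the same expression language (isNull, trim, and length are built-in mapping data flow functions; salesItem is the column from the condition in this reply):

        ErrorRows = isNull(salesItem) || length(trim(salesItem)) == 0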

  • @anusha8129 1 month ago

    Thank you, sir...

  • @bhaskarnukala902 2 years ago +1

    I got one question. What if the bad row "01-------2020" doesn't show up in the data preview? It might be a decimal. Then how would the varchar data type be useful on the DB where we store the bad data?

  • @ranjansrivastava9256 6 months ago

    Appreciated!!! If you could share this concept using the Copy activity, that would be good.

  • @arun06530 3 years ago +1

    Very nice & informative video.

  • @annekrishnavinod5482 3 years ago +1

    Can we use the fault tolerance option to skip the bad rows?

  • @vijaybodkhe8379 1 year ago

    Thanks for sharing.

  • @rahulkr5694 3 years ago +1

    Nice explanation.
    I have one question: I have a CSV file with one column of length 50, but somehow some records came in the CSV file with more than 50 characters. How do I reject these records in ADF?
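
    A hedged sketch for this case: a Conditional Split expression that routes over-length rows to the error branch (myColumn is a hypothetical column name; length is a built-in mapping data flow function):

        length(myColumn) > 50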

  • @pawanreddie2162 3 years ago +1

    If we don't know which column has error rows, then how do we write the split condition?

  • @karthikram1625 4 years ago

    Can you explain how you connected those SQL queries to the linked service?

  • @ramum4684 1 year ago

    If multiple columns in the same row have errors, then how do we build the expression in the visual expression builder page?

  • @venkatpr9692 1 year ago

    Tq soooo much sir

  • @Adub333 3 years ago

    Hello... I am running into an issue where my blob storage input is returning null values where I have not transformed that particular column... Any recommendations?

  • @shalakapowar0707 2 years ago

    Hi, thanks a lot for the great video, but I have a question: how are you able to access this DB from SQL Server Management Studio, as data flow will not allow this?

  • @karthikeyana2120 3 years ago +1

    Hi, any idea how to pick the latest file from the blob storage?

  • @jesseantony1223 6 months ago

    Hello, thank you for the videos. Is there any way to get the CSV files mentioned in the video so that we can practice?

  • @tandaibhanukiran4828 2 years ago +1

    Thank you, brother. Lots of love.

  • @SantoshSingh-ki8bx 4 years ago +2

    Thanks WafaStudies. I have just one request: if you can share the data used in the tutorial, it will be very helpful to follow along with the video.

    • @ashishsinha5306 3 years ago

      WafaStudies, could you provide us with the files? It would be really helpful. Great videos, btw.

  • @UmerPKgrw 3 years ago

    I have a simple data flow copying data from on-prem SQL Server to blob storage, and it gives the following error:
    Linked service with Self-hosted Integration runtime is not supported in data flow. Please utilize the Azure IR with managed vnet using this tutorial

  • @gokulajith762 1 year ago

    Can we use the same method if we are using a self-hosted integration runtime?

  • @jayalakshmia6647 1 year ago

    Is it possible to run multiple pipelines in parallel? If yes, which activity? Please tell me.

  • @vishnum4892 2 years ago

    I got an error saying a linked service with a self-hosted IR is not supported in data flows.
    Using a self-hosted IR, it's not possible to use data flow debugging. Is there any other way?

  • @ssbeats677 2 years ago

    Hi, can you please make a video on what errors we can face, the types of errors, and how to resolve them?

  • @prasadrasal5001 3 years ago +1

    Thank you so much.

  • @user-xn5gn8nx4u 1 year ago +1

    Hi Maheer,
    I am a new learner of ADF. While doing this scenario, I came to know that a self-hosted IR cannot be used in data flows; can you clarify? I am using SQL Server as an on-prem application. The error was:
    Connection failed
    Linked service with Self-hosted Integration runtime is not supported in data flow.
    Also, can you clarify whether we need to configure the Azure DB in ADF as well, so as to get the DB table details?

  • @sravankumar1767 3 years ago +1

    Nice explanation, bro.

  • @sivareddy157 3 years ago

    What about checking the isNull function on multiple columns, bro?

  • @shubhamsonune1490 3 years ago +1

    Very helpful.

  • @paramkusamsaikiran1018 2 years ago

    Maheer, if possible, can you provide us the files you use in your lecture? It will be helpful for us for practicing.

  • @ShaziyaKhan-xx4uj 1 year ago

    Is the real-time playlist for testing?

  • @sonamkori8169 2 years ago +1

    Thank You Sir

  • @dhanyamenon1485 4 years ago +1

    Thanks for creating videos on real-time scenarios. It's really helpful.
    I have a question here: how do I validate files in an Azure Storage container, based on a validation schema for that data load already in the SQL database used for the ETL validation scenario, using the Copy data activity, and then capture the error rows and send an email notification to the user with the error row data? I want to use the Copy data activity here; I have some limitations on using mapping data flows, so I'm not able to use them. Please let me know. Thanks in advance.

    • @gokulajith762 1 year ago

      Hello, do you know how to carry out the same functionality using the Copy data activity?

  • @ambatikarthik6822 2 years ago

    Hi Maheer, can you help me out with how we can connect SQL Server to ADF? While connecting, we are facing an error.

  • @srinubathina7191 1 year ago +2

    Thank You

  • @balakrishnaganasala5811 1 year ago

    Thank you, Wafa, for the excellent videos. Could you please share the scripts you used to create the DB tables and the files you have used here?

  • @hariomrajpoot966 3 years ago

    What if the number of columns is different?

  • @vidyatechtalks9175 2 years ago

    Where did you create the tables?

  • @indhumathid1095 2 years ago

    Thank you so much for your videos. I am trying to fetch only the required records from an Excel file and load them into an Azure SQL Database using ADF. Can you give me any ideas?

    • @rajashekerreddydommata537 2 years ago

      We can add an expression that lists only the records we want, or else list out all the required records in an array and, using a ForEach activity, iterate and fetch only the required rows from the Excel sheet. Hope this will work.
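
      Alternatively, a Filter transformation in a mapping data flow can keep only the wanted rows in a single expression; a minimal sketch (the country column and the value list are hypothetical examples; in() is a built-in function):

          in(['India', 'USA'], country)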

  • @anujgupta-lc1md 4 years ago +1

    Make More and More

  • @sharadgawade9408 4 years ago +1

    Thanks for the video on a real-time scenario. If I have multiple files in my source location and one of them contains a bad record, how do I identify the file holding the bad records? Here you are passing the file name hardcoded.

    • @WafaStudies 4 years ago +3

      Great question. Let's do one thing: I will create a second video on this scenario, covering how to get file names from a folder and loop through each file. Please stay connected.

    • @sharadgawade9408 4 years ago

      @WafaStudies thank you so much.

  • @subbaraochereddy7089 3 years ago

    This works for only one column. How are you handling truncation errors and special-character file issues? Please upload the file.

  • @parthasaradireddy6793 4 months ago

    Is this useful for an ETL tester or not?

  • @diwakarnrp2092 1 year ago

    How do I download the files for practice purposes?

  • @UmerPKgrw 3 years ago

    How much will it cost to run an Azure Data Factory?

  • @harishpentela1234 3 years ago

    Hi Wafa,
    I have a doubt about loading tables from staging to the data warehouse by using data flows: if we have 25 dim tables, do I need to create 25 workflows? Or is there any dynamic way to do that?
    Thanks in advance

    • @DataWithNagar 1 year ago

      Create a single workflow template that can be parameterized. Parameters can include source table names, destination table names, transformation logic, etc.

  • @amanahmed6057 1 year ago

    Could you upload the files which you are using in every video, or make a Google Drive?

  • @annekrishnavinod5482 3 years ago

    One question, @wafastudies: here you're showing one column. For example, if I don't know which column has data not coming in properly or has null values, how do I do it?

    • @mahihi251 3 years ago

      Add one validation expression for each column.
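
      One way to sketch that with the expression language: a Derived Column builds an errorReason string by concatenating a note per failed check, and the Conditional Split then tests whether it is empty (column names and the date format follow the expression earlier in the thread; iif, concat, and length are built-in functions):

          errorReason = concat(
              iif(isNull(toDate(salesDate, 'dd-MMM-yyyy')), 'bad salesDate; ', ''),
              iif(isNull(salesItem), 'missing salesItem; ', ''),
              iif(lesser(toInteger(quantity), 0), 'bad quantity; ', '')
          )
          ErrorRows = length(errorReason) > 0

      This also covers rows where several columns are bad at once, since errorReason records every failed check for the row.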

  • @vigneshmurali1994 4 years ago

    Please do videos on Spark.

  • @sudheerkumar7100 1 month ago

    Sir, please tell us about BANKNIFTY.

  • @dasarianandchandra6890 3 years ago +1

    Bro, you're cool.

  • @Anas_ILoveMyIndia 4 years ago +1

    Hi Maheer, we have a scenario where we fetch a zip file, which is password protected, from an FTP server. How do we do the extraction and get the data into Blob storage or into a SQL DB? Please make a video. Thanks a lot!

  • @nallavellivenkatesh9479 2 years ago

    Tq

  • @pranaypandu9268 3 years ago +1

    Luv u bro

  • @aravindreddyramidi5543 1 year ago

    Please provide the CSV file. It would be helpful.

  • @valijoneshniyazov 10 months ago

    I wish the CSV files were attached here.

  • @manasmohanty5754 2 years ago +1

    Kindly provide some other real-time scenarios.

    • @WafaStudies 2 years ago +2

      Kindly check my channel's Azure Data Factory real-time scenarios playlist. There are 35+ real-time scenarios covered.

    • @mrrathore55 2 years ago

      @WafaStudies Many thanks :)

  • @raficompitative4077 1 year ago

    PDF material, please.

  • @ArrowAAA 9 months ago

    Please share the source files.

  • @hariomrajpoot966 3 years ago

    This is a very limited scenario.

  • @gopalreddy9897 3 years ago +2

    Please stop saying ok