Azure Data Factory Mapping Data Flows Tutorial | Build ETL visual way!

  • Published: 28 May 2024
  • With Azure Data Factory Mapping Data Flow, you can create fast and scalable on-demand transformations using a visual user interface. In just minutes you can leverage the power of Spark without writing a single line of code.
    In this episode I give you an introduction to what Mapping Data Flow for Data Factory is and how it can solve your day-to-day ETL challenges. In a short demo I consume data from blob storage, transform movie data, aggregate it, and save multiple outputs back to blob storage.
    Sample code and data: github.com/MarczakIO/azure4ev...
    Next steps for you after watching the video
    1. Azure Data Factory introduction video
    - • Azure Data Factory Tut...
    2. Check mapping data flow documentation
    - docs.microsoft.com/en-us/azur...
    3. Helpful tips and samples
    - github.com/kromerm/adfdataflo...
    Want to connect?
    - Blog marczak.io/
    - Twitter / marczakio
    - Facebook / marczakio
    - LinkedIn / adam-marczak
    - Site azure4everyone.com
  • Science

Comments • 304

  • @bifurcate1788
    @bifurcate1788 4 years ago +6

    Adam, I have been watching many of your videos. As someone new to Azure, I find your videos immensely valuable. Keep up your great work, really appreciate it!

  • @Haribabu-zj4hd
    @Haribabu-zj4hd 2 years ago

    You have a real talent for explaining data flows in a simple way. Thank you so much, Mr. Adam.

  • @waklop4384
    @waklop4384 4 years ago +10

    Just discovered the channel. Your material is high quality. It's excellent work. I will go watch more. Thank you Adam!

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thank you. This means a lot :)

    • @pradeeps3671
      @pradeeps3671 3 years ago

      Hello Adam, please let me know how to connect to Dynamics CRM. Please send details to pradysg@gmail.com

  • @subhodipsaha7608
    @subhodipsaha7608 3 years ago +1

    I just found your videos while searching for ADF tutorials on YouTube. The materials are fantastic and are really helping me learn. Thank you so much!!

  • @joshuaodeyemi3098
    @joshuaodeyemi3098 1 year ago

    I love you, Adam!
    I have been struggling with the expression builder in Data Flow. I can't seem to figure out how to write the code. This video just made it look less complex. I'll be devoting more time to it.

  • @gunturchilli767
    @gunturchilli767 3 years ago

    Thank you so much Adam. I was able to crack an interview with the help of your videos. I prepared notes from your explanations, and 3 hours before the interview I watched your videos again. It helped me a lot.

  • @hovardlee
    @hovardlee 2 years ago +1

    -1979 and ,12
    This is why complex logic is needed. Nice tutorial :)

  • @dimitarkrastev6085
    @dimitarkrastev6085 1 year ago

    Great video! Most videos seem to focus mostly on the advertisement material straight from Azure. At best they show you the very basic step of copying data from a file to a DB.
    This is the first video I have seen where you actually show how to do something useful with the data, close to a real-life scenario.
    Thank you.

  • @hassy9118
    @hassy9118 3 years ago

    Excellent presentation of ADF data mapping... we'd love to see more Data Factory ETL videos. Thank you Adam!

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks, feel free to check my ADF playlist.

  • @yashmeenkhanam3451
    @yashmeenkhanam3451 4 years ago +5

    Outstanding! You just made Azure easy to learn. Thank you.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Awesome, thank you!

    • @johnfromireland7551
      @johnfromireland7551 2 years ago

      ADF is just one of about 100 significant tools and services in Azure. :-(

    • @CallousCoder
      @CallousCoder 2 years ago

      Hi Adam, is it possible to create these pipelines as code as well? Or somehow create them from my actual Azure pipeline? It would be sheer insanity (but it is a Microsoft product) to require and maintain two pipelines: one Azure pipeline for CI and CD and one for ADF. I really would want the Azure pipeline to be able to fill/create the ADF pipeline. But I haven't found anything yet.

  • @abhijitk7363
    @abhijitk7363 4 years ago

    Adam, thanks for this excellent video. You explained almost every feature available in data flows. Looking forward to a video on Azure SQL DWH. I know it will be great to learn about it from you.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Glad it was helpful! I'm just waiting for the new UI to come to public preview, then the video will be done :)

  • @sameerdongare1113
    @sameerdongare1113 4 years ago

    Another awesome video. The best part of Mapping Data Flow was the optimization... where we could do partitioning.

  • @hrefname
    @hrefname 3 years ago

    Thank you so much for this. I subscribed immediately, very informative and straightforward Azure info. Will definitely recommend your channel. Keep up the great work!

  • @omarsantamaria6871
    @omarsantamaria6871 4 years ago

    Awesome video. I've seen a lot of sites & videos and they are so complicated, but all of yours are crystal clear and anyone can understand them.

  • @Lego-tech
    @Lego-tech 4 years ago

    Very crisp and clear information. I watched many videos, but Adam's content is awesome!! Thanks dear!! All the best for future good work!!

  • @alanzhao8074
    @alanzhao8074 3 years ago

    The best video about Azure Data Flows I could find. Thank you Adam!

  • @icici321
    @icici321 4 years ago

    I am new to data and ETL stuff, but your videos are really good. Excellent examples and very clear explanations, so anyone can understand. Thanks very much.

  • @subhraz
    @subhraz 2 years ago

    Very well explained and demonstrated. Really helpful for getting started with Data Flows.

  • @eramitgoswami
    @eramitgoswami 3 years ago +1

    Your way of explaining is outstanding; after watching, it feels like Azure is very easy to learn. Kindly keep sharing good videos. Thank you.

  • @BijouBakson
    @BijouBakson 2 years ago +2

    It must be very challenging to do all of this in English, I imagine, Adam! Congratulations on pushing through despite the difficulty. 🙂

  • @arulmouzhiezhilarasan8518
    @arulmouzhiezhilarasan8518 3 years ago

    Impeccable explanation of Mapping Data Flow. Thanks Adam!

  • @susanmyers1
    @susanmyers1 3 years ago

    At work I'm having to build out a data mart on my own with no training. You are literally saving my hide with your videos. THANK YOU!

  • @Raguna
    @Raguna 1 year ago

    Very good explanation of Data Flow. Thanks, Mr. Adam.

  • @sarahaamir7457
    @sarahaamir7457 3 years ago +1

    Thank you so much Adam! This was a very clear and great video, and a big help for my interview and knowledge.

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Very welcome! Thanks for stopping by :)

  • @fadiabusafat5162
    @fadiabusafat5162 2 years ago

    Nice one Adam. Cool one. Keep doing fabulous videos, fella.
    Many thanks.

  • @bharatruparel9424
    @bharatruparel9424 4 years ago

    Hello Adam, I just finished this video. Very well done indeed. Thanks and regards. Bharat

  • @avinashbasetty
    @avinashbasetty 4 years ago

    The features are very interesting. I want to try the different partitioning techniques. Thank you for sharing such amazing stuff.

  • @SIDDHANTSINGHBCE
    @SIDDHANTSINGHBCE 3 years ago +1

    These videos are great. Helping me so much! Thanks Adam

  • @soumikdas7709
    @soumikdas7709 3 years ago +1

    Your videos are very informative and practice-oriented. Keep it up.

  • @NehaJain-ry9sr
    @NehaJain-ry9sr 4 years ago

    Awesome videos Adam, your videos are a great help for learning Azure. Keep it up :)

  • @balanm8570
    @balanm8570 4 years ago +1

    As usual, another awesome video, Adam!!! Excellent. It was to the POINT!!! Keep up the good work which you have been doing for plenty of users like me. Eagerly waiting for more videos like this from you!!!
    Can you please make some videos on Azure Search...

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thank you so very much :) Azure Search is on the list, but there is so much news coming from Ignite that I might need to change the order. Let's see all the news :).

  • @aarontian5979
    @aarontian5979 2 years ago

    Your channel is totally underrated, man

  • @eddyjawed
    @eddyjawed 3 months ago

    Thank you Adam, dziękuję, this is a great tutorial.

  • @Cool2kid
    @Cool2kid 3 years ago +1

    Your video content is awesome!!! Your videos are very useful for understanding Azure concepts, especially for me, having just started my Azure journey.
    I would like one video showing how to deploy code from Dev to QA to Prod, and how to handle connection strings, parameters, etc. during deployment.
    Thanks again for the wonderful video content.

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      ADF CI/CD is definitely on the list. It's a bit of a complex topic to get right, so it might take time to prepare proper content around it. Thanks for watching and suggesting ;)

  • @rqn2274
    @rqn2274 3 years ago

    Nice video Adam. Professional as always

  • @wojciechjaniszewski9086
    @wojciechjaniszewski9086 4 years ago

    Very well done explaining the principles of mapping data flows!!!

  • @rajanarora6655
    @rajanarora6655 2 years ago

    Your videos are really great and helped me understand a lot of Azure concepts. Can you please make one using an SSIS package and show how to use it within Azure Data Factory?

  • @lowroar5127
    @lowroar5127 2 years ago

    So helpful! Thank you very much Adam!

  • @KarthikeshwarSathya
    @KarthikeshwarSathya 3 years ago +1

    This was explained very well. Thank you.

  • @chandrasekharnallam2578
    @chandrasekharnallam2578 3 years ago +1

    Excellent explanation with a simple scenario. Thank you.

  • @ngophuthanh
    @ngophuthanh 2 years ago

    Thank you, Adam. As always, you rock.

  • @549srikanth
    @549srikanth 3 years ago +1

    I would say this is the best content I've seen so far!! Thank you so much for making it Adam!
    Just wondering, is there a Ctrl+Z or Ctrl+Y command in case we make some changes in the data flow and want to restore a previous version?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Awesome, thanks! Unfortunately not, but you can use versioning in the data factory, which will allow you to revert to a previous version in case you broke something. Highly recommended. Unfortunately there are no reverts for specific actions.

    • @549srikanth
      @549srikanth 3 years ago

      @@AdamMarczakYT Excellent!! Thank you so much for your reply!

    • @johnfromireland7551
      @johnfromireland7551 2 years ago

      @@549srikanth I publish each time I create a significant new step in the pipeline, and I use data preview before moving on to the next step. Also, you can, I think, export the code version of the entire pipeline. Presumably you can then paste that into a new pipeline to resurrect your previous version.

  • @generaltalksoflife
    @generaltalksoflife 3 years ago +1

    Hi Adam, thanks for helping us learn new technologies. You are awesome 👌🏻👌🏻👌🏻👏👏.

  • @GiovanniOrlandoi7
    @GiovanniOrlandoi7 2 years ago +1

    Great video! Thanks Adam!

  • @javm7378
    @javm7378 2 years ago +1

    I really like your tutorials. I have been looking for a "table partition switching" tutorial but haven't found any good ones. Maybe you could do one for us? I am sure it'll be very popular, as there aren't any good ones out there and it is an important topic in certifications :-)

  • @shantanudeshmukh4390
    @shantanudeshmukh4390 3 years ago +1

    Wow! Fantastic explanation.

  • @valentinnica6034
    @valentinnica6034 4 years ago +1

    That was actually not so hard. Thanks man, you're awesome.

  • @skillT01
    @skillT01 1 year ago

    👍 It's amazing, a practical implementation of Data Flow.

  • @sapecyrille5487
    @sapecyrille5487 1 year ago

    Great! You are the best, Adam.

  • @grzegorzz4025
    @grzegorzz4025 3 years ago +1

    Adam, great tutorial! Kudos!

  • @biswajitsarkar5538
    @biswajitsarkar5538 4 years ago

    Thanks Adam!! Very informative video. Liked it a lot.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks, and you are most welcome! Glad to hear it.

  • @JoeandAlex
    @JoeandAlex 3 years ago +1

    Brilliant way of explaining

  • @achraferraji3403
    @achraferraji3403 2 years ago

    Amazing video, we want more parts!

  • @niteeshmittal
    @niteeshmittal 4 years ago

    Adam, your content is always easy to grasp, excellent work mate. Could you please explain how to create a pipeline which has a copy activity followed by a mapping data flow activity?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks, just drag and drop the copy activity and data flow blocks onto the pipeline and drag a line from the copy to the data flow activity.

  • @anubhav2020
    @anubhav2020 2 years ago +2

    Hello Adam, thanks a bunch for this excellent video. The tutorial was very thorough, and anyone new can easily follow. I do have a question though. I am trying to replicate a SQL query in a Data Flow, but I have had no luck so far.
    The query is as follows:
    Select ZipCode, State
    From table
    Where State in ('AZ', 'AL', 'AK', 'AR', 'CO', 'CA', 'CT'...... LIST OF 50 STATES);
    I tried using the Filter, Conditional Split and Exists transformations, but could not achieve the desired result. Being new to the cloud platform, I am having a bit of trouble.
    Might I request that you cover topics like data subsetting/filtering (WHERE and IN clauses etc.) in your tutorials.
    Appreciate your time and help in putting together these practical implementations.
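
    [Editor's note: one sketch of an answer, not from the thread and untested against this dataset. A WHERE ... IN list can usually be expressed as a single Filter transformation whose condition uses the Data Flow expression language's in() function, listing only the states shown above:]

    ```
    /* Filter transformation condition: keep rows whose State appears in the array.
       in(array, item) returns true when the item is found in the array. */
    in(['AZ', 'AL', 'AK', 'AR', 'CO', 'CA', 'CT'], State)
    ```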

  • @atiry839
    @atiry839 3 years ago +1

    Wow, I like your video. I did it today and I got a good result. Thanks for your good explanation.

  • @MrVivekc
    @MrVivekc 2 years ago +1

    Very good explanation Adam. Keep it up.

    • @AdamMarczakYT
      @AdamMarczakYT  2 years ago

      Thanks, will do!

    • @MrVivekc
      @MrVivekc 2 years ago

      @@AdamMarczakYT Adam, do we have a trial version of Azure for learning purposes?

  • @mohmmedshahrukh8450
    @mohmmedshahrukh8450 1 year ago

    The best video on Azure I have ever seen ❤❤

  • @RahulRajput_018
    @RahulRajput_018 3 years ago +1

    Thanks buddy... Great work

  • @rahulkota9793
    @rahulkota9793 3 years ago +1

    Very useful. Thank you so much.

  • @dataintelligence6262
    @dataintelligence6262 4 years ago

    Great job! Thanks for everything!

  • @nick6s
    @nick6s 2 years ago

    Excellent tutorials

  • @jayong2370
    @jayong2370 1 year ago

    Thank you Adam.

  • @yashgemini4024
    @yashgemini4024 3 years ago +1

    Appreciate your content. Thanks.

  • @abu1479
    @abu1479 3 years ago

    Adam, excellent presentation of the ADF concepts. I find all your videos really helpful for understanding ADF. One question regarding the sink dataset in a data flow: how can I create a dynamic folder in my blob storage based on the year, month and day when the data flow was triggered?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Depends on what you want to achieve. You can either set partitioning by a date column, which will split the data by date. Or, if you want to put the entire dataset in one folder named by date, then use a formatDateTime expression like formatDateTime(utcNow(), "yyyy/MM/dd") as the path.
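
      [Editor's note: for illustration only, the folder-per-day idea above can be passed into a dataset folder parameter from the pipeline with an expression like the following; the 'output/' prefix is an assumed container path:]

      ```
      @concat('output/', formatDateTime(utcNow(), 'yyyy/MM/dd'))
      ```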

  • @horatiohe
    @horatiohe 3 years ago +1

    Thanks for the great content!! You are the man :)

  • @desparadoking8209
    @desparadoking8209 3 years ago

    Thanks for the informative and detailed video Adam, 😊👌. Your content is practical. Can you make a video on how to load data from an Oracle table into Azure Data Factory? It would be helpful for the audience.

  • @isurueranga9704
    @isurueranga9704 2 years ago

    Best tutorial ever... 💪🏻💪🏻💪🏻

  • @DrDATA-ep6mg
    @DrDATA-ep6mg 3 years ago +1

    Very nice tutorial 👍

  • @chsrkn
    @chsrkn 4 years ago

    The features are very interesting, and thanks for your clear explanation. Could you explain a bit more about reusable data flows, i.e. using the same data flow for multiple tables/files, like reusable pipelines?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks for the suggestion, I will try to make a video on data flow parametrization in the future. Thanks for watching :)

  • @RC-nn1ld
    @RC-nn1ld 3 years ago

    Love these videos, so easy to understand. Do you have a video on the new XML connector?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Great, thanks! Not yet, maybe in the near future :)

  • @skybluelearner4198
    @skybluelearner4198 1 year ago

    Good explanation there.

  • @dintelu
    @dintelu 3 years ago +1

    Wow.. lucid explanation..

  • @sudarshanbhattacharjee4411
    @sudarshanbhattacharjee4411 1 year ago

    Thanks for such a good video

  • @arun06530
    @arun06530 3 years ago +1

    Nice & detailed video.

  • @harshapatankar484
    @harshapatankar484 3 years ago +1

    Amazing videos.

  • @adiky
    @adiky 3 years ago +1

    Hi Adam, that's a great tutorial, many thanks for it. I have a question: can we write the transformation functions in a different language like Python or R instead of Scala? If yes, can you please share some details on it?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +2

      Unfortunately not right now :( If you need those then use Azure Databricks instead.

    • @rubendarksun6691
      @rubendarksun6691 2 years ago

      Or a Python/R script in a Batch process, right? Databricks would be the better option if you need Spark, since it's also more expensive than Batch.

  • @nidhisharma-rb7nx
    @nidhisharma-rb7nx 2 years ago

    Adam, great video. I'm new to Data Flow and I have one doubt: I want to implement file-level checks in Data Flow but am not able to. All the transformations perform data-level checks, like Exists or Conditional Split. Is it possible to implement a file-level check, like whether a file exists in the storage account or not?

  • @paulnelson1623
    @paulnelson1623 3 years ago +1

    For anyone wondering how to make the year check (or any check) in the second step more robust, you can exchange the following expressions using the 'case' expression as used below, which says: if this expression evaluates as true, do this, else do something else.
    Worth noting here that the first expression only provides a true branch, while the second expression has both true and false branches. As per the documentation on the 'case' expression: "If the number of inputs are even, the other is defaulted to NULL for last condition."
    /* Year column expression */
    /* If the title contains a year, extract the year, else set to Null */
    case(regexMatch(title, '([0-9]{4})'),toInteger(trim(right(title, 6), '()')))
    /* title column expression*/
    /* If the title contains a year, strip the year from the title, else leave the title alone */
    case(regexMatch(title, '([0-9]{4})'),toString(left(title, length(title)-7)), title)

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks Paul :) I used as simple an example as possible for people who aren't fluent in Scala, but of course you always need to cover all possible scenarios. Sometimes I like to fail the transformation rather than continue with fallback logic, as I expect some values to be present.

    • @paulnelson1623
      @paulnelson1623 3 years ago

      @@AdamMarczakYT Of course, I just wanted to see if I could take it a step further to align more closely with what would be needed in a production data engineering scenario, and thought others may have the same idea. Thanks for the content! :)

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks, I bet people will appreciate this :)

  • @erick1992santos
    @erick1992santos 4 years ago

    Excellent content!! I just have one doubt: where can I find documentation on those Scala functions? I don't know anything about it. I've just subscribed to your channel! Thanks a lot!!

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thank you Erick! I think the list of available functions is standard for Spark. I never checked if they all match, but you can find them here: spark.apache.org/docs/2.4.4/api/sql/

  • @gursikh133
    @gursikh133 4 years ago +1

    Adam, for writing transformations do I need to learn Scala, or can I just refer to the documentation you mentioned for the Scala functions and write the transformation?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      The documentation should be enough. MDF targets simple transformations, so in most cases the documentation alone will suffice.

  • @samb9403
    @samb9403 2 years ago

    Great video.
    Question: under "New Datasets", is there a capability to drop data into Snowflake? I see S3, Redshift, etc.
    I appreciate the video and feedback!

  • @Rafian1924
    @Rafian1924 3 years ago +1

    Lovely bro!!

  • @jg5399
    @jg5399 4 years ago

    Thanks for the nice video. Do you know if there is any way to connect to Dynamics F&O or D365/CE yet to use Data Flow as my source?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Hey, thanks for watching! Not that I'm aware of. I would assume this would be a case for a Logic App to fetch the data to Blob Storage and then trigger ADF processing.
      docs.microsoft.com/en-us/azure/connectors/connectors-create-api-crmonline

  • @jayakrishna9153
    @jayakrishna9153 4 years ago

    Very good explanation.. keep it up.

  • @yashnegi9473
    @yashnegi9473 2 years ago

    The video is excellent. I want to know: what problem statement does Data Flow solve?

  • @khushbookumari-pc2xw
    @khushbookumari-pc2xw 4 years ago

    Please also explain how to use data analytics in a pipeline flow.
    This is the best content. Thank you so much

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thank you. As to your question, can you elaborate on the data analytics part? What exactly would you like to see?

    • @khushbookumari-pc2xw
      @khushbookumari-pc2xw 4 years ago +1

      Anything like how to make a function or procedure and how to execute it in a pipeline. The basic flow of a pipeline using analytics.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Hey, I do plan to have implementation videos like this in the future, although the pipeline of videos is long, so I can't promise anything right now. I added this to the list of potential topics :) Thanks!

    • @khushbookumari-pc2xw
      @khushbookumari-pc2xw 4 years ago

      Okay, thanks

  • @joeyt.1504
    @joeyt.1504 3 years ago

    Great videos, thank you! Just a question: what is the difference between a storage account and a data lake? Costs and the type of data stored?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago +1

      Hi, thanks! You might be interested in checking out my video on Data Lake ruclips.net/video/2uSkjBEwwq0/видео.html; it goes into great detail on all the differences.

  • @JoshuaDHarvey
    @JoshuaDHarvey 3 years ago

    Great video, thank you

  • @TheLastBeat2
    @TheLastBeat2 3 years ago +1

    Hi Adam, so glad I found your channel. Your videos were a big help in achieving the AZ-900 certificate. Now I am studying a lot to uplift my knowledge and get the Azure data engineer certificate. However, I have an important question! Data flows are expensive, and sometimes clients don't want to use them. Are there alternatives to achieve the same result in Azure Data Factory? Thank you very much!

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Well, you can't have the cookie and eat the cookie :) In my opinion it's not that expensive compared to other available tools.

    • @TheLastBeat2
      @TheLastBeat2 3 years ago

      @@AdamMarczakYT True! I am currently struggling with CSV files that sometimes have extra spaces after the words in the header, which then gives an error when doing a copy activity to Azure SQL Database. Do you have any idea how to make my flow a bit more flexible so that it can deal with this? It needs some trimming in the header.

    • @TheLastBeat2
      @TheLastBeat2 3 years ago

      I thought of doing a SELECT in a data flow to then change to the correct header titles, but for this I need to know where the spaces will be in the future. So that's also not flexible.

  • @tenghover
    @tenghover 2 years ago

    Would you plan to make a video introducing each of the transformation components? Thanks

  • @mustafakamal5945
    @mustafakamal5945 2 years ago +1

    Hi Adam, thanks for making these videos, very clear and concise. I have a question (sorry, not related to this video) regarding Conditional Split: can the output stream activities run in parallel?

    • @AdamMarczakYT
      @AdamMarczakYT  2 years ago +1

      They typically run in parallel, as it's Apache Spark behind the scenes.

    • @mustafakamal5945
      @mustafakamal5945 2 years ago

      @@AdamMarczakYT Thank you!

  • @thisiszico2006
    @thisiszico2006 4 years ago

    Awesome again.

  • @mikem8915
    @mikem8915 4 years ago

    Outstanding.

  • @omarsantamaria6871
    @omarsantamaria6871 4 years ago

    Hello Adam. Your video is impressive, as always, but I'm concerned about the source dataset. Question: does the Data Flow activity only work if the data sources are connected to Azure SQL?
    I tried using a previous dataset connected to the local server, but this dataset does not appear under
    Source settings / Source options / Source dataset in
    the Data Flow activity. I tried the New option and it only allows selecting Azure datasets. All other database options are disabled, so I couldn't create a dataset for SQL Server either.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Hey, mapping data flows currently support 6 data services for both source and sink.
      docs.microsoft.com/en-us/azure/data-factory/data-flow-source#supported-source-connectors-in-mapping-data-flow
      I'd check if you can trick data flows by using the Azure SQL connector to connect to an on-premises SQL Server, but I've never personally tried it.

  • @MrSARAZZ
    @MrSARAZZ 4 years ago

    Hi Adam, just watched two of your videos on Azure Data Factory, nice work. Any chance you can do one on ADF using a REST API as a data source with JSON output, then storing it in a SQL Server sink?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Great suggestion! I'll add it to the list of potential topics. :) Thanks for watching ;)

    • @gift2299
      @gift2299 4 years ago

      @@AdamMarczakYT Please, a tutorial on this would be amazing!

  • @PicaPauDiablo1
    @PicaPauDiablo1 3 years ago +1

    Adam, is there a way to preserve the filename and just have it change the extension? For instance, I'm adding a column with a datetime, but at the end I would like it to have the same file name, just as parquet. Is there a way to do that?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Use expressions :) That's what they are for.

    • @PicaPauDiablo1
      @PicaPauDiablo1 3 years ago

      @@AdamMarczakYT Sorry if it was a dumb question, I'm still new to ADF. Ignore it if it's too inane, but is the filename in the @pipeline parameters? I found one online but couldn't get it to parse.

  • @gbw5679
    @gbw5679 4 years ago

    Thanks Adam!

  • @01sanjaysinha
    @01sanjaysinha 1 year ago

    Thanks!

  • @JohnJohnson-bs4cw
    @JohnJohnson-bs4cw 3 years ago +1

    Great video. Can you use data from a REST API as a source for a Mapping Data Flow, or does the source have to be a dataset on Azure?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Here is the list of supported data sources for MDF: docs.microsoft.com/en-us/azure/data-factory/data-flow-source?WT.mc_id=AZ-MVP-5003556 . Just copy data from the REST API to Blob and then start the MDF pipeline using that blob path as a parameter.