Azure Data Factory Mapping Data Flows Tutorial | Build ETL visual way!

  • Published: Jan 29, 2025

Comments • 305

  • @atlanticoceanvoyagebird2630
    @atlanticoceanvoyagebird2630 3 months ago +3

    Hi Adam: I am 76 years old and also very new to this ETL technology, which is now clear to me. Keep developing these kinds of professional projects to be used in resume preparation.

  • @bifurcate-ai
    @bifurcate-ai 4 years ago +6

    Adam, I have been watching many of your videos. As someone new to Azure, I find your videos immensely valuable. Keep up the great work; I really appreciate it!

  • @waklop4384
    @waklop4384 5 years ago +10

    Just discovered the channel. Your material is high quality; it's excellent work. I will go watch more. Thank you Adam!

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Thank you. This means a lot :)

    • @pradeeps3671
      @pradeeps3671 3 years ago

      Hello Adam, please let me know how to connect to Dynamics CRM. Please send details to pradysg@gmail.com

  • @eramitgoswami
    @eramitgoswami 4 years ago +2

    Your way of explaining is outstanding; after watching, it feels like Azure is very easy to learn. Kindly keep sharing good videos. Thank you.

  • @hovardlee
    @hovardlee 3 years ago +1

    -1979 and ,12
    This is why complex logic is needed. Nice tutorial :)

  • @joshuaodeyemi3098
    @joshuaodeyemi3098 1 year ago

    I love you, Adam!
    I have been struggling with using expression builder in Data Flow. I can't seem to figure out how to write the code. This video just made it look less complex. I'll be devoting more time to it.

  • @notonprem
    @notonprem 5 months ago

    This is quality stuff. Good for a quick upskill especially when prepping for an interview.

  • @subhodipsaha7608
    @subhodipsaha7608 4 years ago +1

    I just found your videos while searching for ADF tutorials on YouTube. The materials are fantastic and are really helping me learn. Thank you so much!!

  • @dimitarkrastev6085
    @dimitarkrastev6085 2 years ago

    Great video! Most videos seem to focus on the advertisement material straight from Azure. At best they show you the very basic step of copying data from a file to a DB.
    This is the first video I've seen where you actually show how to do something useful with the data, close to a real-life scenario.
    Thank you.

  • @susanmyers1
    @susanmyers1 4 years ago

    At work I'm having to build out a data mart on my own with no training. You are literally saving my hide with your videos. THANK YOU!

  • @aarontian5979
    @aarontian5979 3 years ago

    Your channel is totally underrated, man

  • @Lego-tech
    @Lego-tech 4 years ago

    Very crisp and clear information. I watched many videos, but Adam's content is awesome!! Thanks, dear!! All the best for future good work!!

  • @lavanyay2767
    @lavanyay2767 7 months ago

    Very detailed workflow. I tried this and was able to understand the Data Flow process so easily. Thank you for the wonderful session.

  • @fadiabusafat5162
    @fadiabusafat5162 3 years ago

    Nice one Adam. Cool one. Keep doing fabulous videos always, fella.
    Many thanks.

  • @gunturchilli767
    @gunturchilli767 4 years ago

    Thank you so much Adam. I was able to crack an interview with the help of your videos. I prepared notes from your explanations, and 3 hours before the interview I watched your videos again; it helped me a lot.

  • @omarsantamaria6871
    @omarsantamaria6871 5 years ago

    Awesome video. I've seen a lot of sites and videos and they are so complicated, but all of yours are crystal clear and anyone can understand them.

  • @alanzhao8074
    @alanzhao8074 4 years ago

    The best video about Azure Data Flows I can find. Thank you Adam!

  • @icici321
    @icici321 5 years ago

    I am new to data and ETL stuff, but your videos are very good. Excellent examples and very clear explanations that anyone can understand. Thanks very much.

  • @yashmeenkhanam3451
    @yashmeenkhanam3451 4 years ago +5

    Outstanding! You just made Azure easy to learn. Thank you.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Awesome, thank you!

    • @johnfromireland7551
      @johnfromireland7551 3 years ago

      ADF is just one part of about 100 significant tools and actions in Azure. :-(

    • @CallousCoder
      @CallousCoder 3 years ago

      Hi Adam, is it possible to create these pipelines as code as well? Or somehow create them from my actual Azure pipeline? It would be sheerly insane (but it is a Microsoft product) to require and maintain two pipelines: one Azure pipeline for CI and CD, and one for ADF. I really would want the Azure pipeline to be able to fill/create the ADF pipeline, but I haven't found anything yet.

  • @Cool2kid
    @Cool2kid 4 years ago +1

    Your video content is awesome!!! Your video is very useful for understanding Azure concepts, especially for me, as I have just started my Azure journey.
    I would like to have one video where we can see how to deploy code from Dev to QA to Prod, and how to handle connection strings, parameters, etc. during deployment.
    Thanks again for the wonderful video content.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      ADF CI/CD is definitely on the list. It's a bit of a complex topic to get right, so it might take time to prepare proper content around it. Thanks for watching and suggesting ;)

  • @soumikdas7709
    @soumikdas7709 3 years ago +1

    Your videos are very informative and practically oriented. Keep it up.

  • @SIDDHANTSINGHBCE
    @SIDDHANTSINGHBCE 3 years ago +1

    These videos are great. Helping me so much! Thanks Adam

  • @01sanjaysinha
    @01sanjaysinha 2 years ago

    Thanks!

  • @balanm8570
    @balanm8570 5 years ago +1

    As usual, another awesome video, Adam!!! Excellent; it was to the POINT!!! Keep up the good work you have been doing for plenty of users like me. Eagerly waiting for more videos like this from you!!!
    Can you please make some videos on Azure Search...

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Thank you so very much :) Azure Search is on the list, but there is so much news coming from Ignite that I might need to change the order. Let's see all the news :).

  • @generaltalksoflife
    @generaltalksoflife 4 years ago +1

    Hi Adam, thanks for helping us learn new technologies. You are awesome 👌🏻👌🏻👌🏻👏👏.

  • @shantanudeshmukh4390
    @shantanudeshmukh4390 4 years ago +1

    Wow! Fantastic explanation.

  • @mohmmedshahrukh8450
    @mohmmedshahrukh8450 2 years ago

    Best video on Azure I have ever seen ❤❤

  • @grzegorzz4025
    @grzegorzz4025 3 years ago +1

    Adam, great tutorial! Kudos!

  • @MrVivekc
    @MrVivekc 3 years ago +1

    Very good explanation, Adam. Keep it up.

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Thanks, will do!

    • @MrVivekc
      @MrVivekc 3 years ago

      @@AdamMarczakYT Adam, do we have a trial version of Azure for learning purposes?

  • @Haribabu-zj4hd
    @Haribabu-zj4hd 3 years ago

    What a talent for explaining data flows in a simple way. Thank you so much, Mr. Adam.

  • @techBird-b2m
    @techBird-b2m 2 years ago

    👍 It's amazing, a practical implementation of Data Flow.

  • @BijouBakson
    @BijouBakson 2 years ago +2

    It must be very challenging to do all of this in English, I imagine, Adam! Congratulations on pushing through despite the difficulty. 🙂

  • @chandrasekharnallam2578
    @chandrasekharnallam2578 3 years ago +1

    Excellent explanation with a simple scenario. Thank you.

  • @sameerdongare1113
    @sameerdongare1113 5 years ago

    Another awesome video. The best part of Mapping Data Flow was the Optimization...where we could do Partitioning.

  • @Raguna
    @Raguna 2 years ago

    Very good explanation of Data Flow. Thanks, Mr. Adam.

  • @rqn2274
    @rqn2274 4 years ago

    Nice video Adam. Professional as always

  • @GiovanniOrlandoi7
    @GiovanniOrlandoi7 3 years ago +1

    Great video! Thanks Adam!

  • @hrefname
    @hrefname 4 years ago

    Thank you so much for this. I subscribed immediately, very informative and straightforward azure info. Will definitely recommend your channel. Keep up the great work!

  • @ngophuthanh
    @ngophuthanh 2 years ago

    Thank you, Adam. As always, you rock.

  • @achraferraji3403
    @achraferraji3403 2 years ago

    Amazing video; we want more parts!

  • @abhijitk7363
    @abhijitk7363 4 years ago

    Adam, thanks for this excellent video. You explained almost every feature available in data flows. Looking forward to a video on Azure SQL DWH; I know it will be great to learn about it from you.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Glad it was helpful! I'm just waiting for new UI to come to public preview then the video will be done :)

  • @subhraz
    @subhraz 2 years ago

    Very well explained and demonstrated. Really helpful to get started with Data flows.

  • @sapecyrille5487
    @sapecyrille5487 1 year ago

    Great! You are the best Adam.

  • @KarthikeshwarSathya
    @KarthikeshwarSathya 3 years ago +1

    This was explained very well. Thank you.

  • @avinashbasetty
    @avinashbasetty 4 years ago

    The features are very interesting; I want to try the different partitioning techniques. Thank you for sharing such amazing stuff.

  • @bharatruparel9424
    @bharatruparel9424 5 years ago

    Hello Adam, I just finished this video. Very well done indeed. Thanks and regards. Bharat

  • @Montreal_powerbi_connect
    @Montreal_powerbi_connect 3 years ago +1

    Wow, I like your video. I did it today and I had good results. Thanks for your good explanation.

  • @JoeandAlex
    @JoeandAlex 3 years ago +1

    A brilliant way of explaining.

  • @arulmouzhiezhilarasan8518
    @arulmouzhiezhilarasan8518 4 years ago

    Impeccable for learning about Mapping Data Flows. Thanks Adam!

  • @isurueranga9704
    @isurueranga9704 3 years ago

    best tutorial ever... 💪🏻💪🏻💪🏻

  • @sarahaamir7457
    @sarahaamir7457 4 years ago +1

    Thank you so much Adam! This was a very clear and great video, and a big help for my interview and knowledge.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Very welcome! Thanks for stopping by :)

  • @big-bang-movies
    @big-bang-movies 3 years ago

    Hi Adam, a few doubts. Please help me understand.
    1. 10:04: After running the data flow the first time, 9125 rows were populated. There is no output sink or output dataset associated with the data flow yet, so where exactly are those ingested rows being saved/populated?
    2. 15:04: After re-calculating "title" (by removing the year part), how come the previous original column (title) disappeared? The modified title column should appear in addition to the previous original column (title), right?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Hey. 1. It's the number of rows loaded. 2. If you create a new column with the same name, it will replace the old one; in this case we replaced the title column.

  • @rosszhu1660
    @rosszhu1660 4 years ago +2

    A quick question: Azure datasets seem to only support already-structured data, like CSV or JSON. What if my data source is an unstructured text file that must be transformed into CSV before being used? Is there a way to do this transformation (possibly Python code) in Data Factory?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Hey, you can call Azure Databricks, which can transform any file using Python/Scala/R, etc. But Data Factory itself can't do it.

    • @rosszhu1660
      @rosszhu1660 4 years ago

      @@AdamMarczakYT Got it. Thanks a lot! It looks like I have to learn Spark :-)
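
To make the Databricks suggestion concrete, here is a minimal Python sketch of the kind of pre-processing notebook one might call from ADF to turn unstructured text lines into CSV before Data Factory picks the file up. The log format, field names, and regex below are invented for illustration.

```python
import csv
import io
import re

# Hypothetical unstructured log lines to be turned into CSV rows.
raw_lines = [
    "2020-01-01 12:00:00 INFO user=alice action=login",
    "2020-01-01 12:05:00 WARN user=bob action=logout",
]

pattern = re.compile(
    r"(?P<ts>\S+ \S+) (?P<level>\w+) user=(?P<user>\w+) action=(?P<action>\w+)"
)

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["timestamp", "level", "user", "action"])
for line in raw_lines:
    match = pattern.match(line)
    if match:
        writer.writerow([match["ts"], match["level"], match["user"], match["action"]])

csv_text = buffer.getvalue()  # ready to land in blob storage as a .csv
```

Once a file like this is written to blob storage, a regular CSV dataset can pick it up in the data flow.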

  • @wojciechjaniszewski9086
    @wojciechjaniszewski9086 4 years ago

    Very well done explaining the principles of mapping data flows!!!

  • @valentinnica6034
    @valentinnica6034 5 years ago +1

    That was actually not so hard. Thanks man, you're awesome.

  • @549srikanth
    @549srikanth 4 years ago +1

    I would say this is the best content I've seen so far!! Thank you so much for making it, Adam!
    Just wondering, is there a Ctrl+Z or Ctrl+Y command in case we make some changes in the data flow and want to restore it to a previous version?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Awesome, thanks! Unfortunately not, but you can use versioning in the data factory, which will allow you to revert to a previous version in case you break something. Highly recommended. Unfortunately there are no reverts for specific actions.

    • @549srikanth
      @549srikanth 4 years ago

      @@AdamMarczakYT Excellent!! Thank you so much for your reply!

    • @johnfromireland7551
      @johnfromireland7551 3 years ago

      @@549srikanth I publish each time I create a significant new step in the pipeline, and I use data preview before moving on to the next step. Also, you can, I think, export the code version of the entire pipeline. Presumably you can then paste that into a new pipeline to resurrect your previous version.

  • @lowroar5127
    @lowroar5127 2 years ago

    So helpful! Thank you very much Adam!

  • @rajanarora6655
    @rajanarora6655 3 years ago

    Your videos are really great and helped me understand a lot of Azure concepts. Can you please make one using an SSIS package and show how to use it within Azure Data Factory?

  • @NehaJain-ry9sr
    @NehaJain-ry9sr 4 years ago

    Awesome videos Adam; your videos are a great help for learning Azure. Keep it up :)

  • @DrDATA-ep6mg
    @DrDATA-ep6mg 4 years ago +1

    Very nice tutorial 👍

  • @eddyjawed
    @eddyjawed 11 months ago

    Thank you Adam (Dziękuję), this is a great tutorial.

  • @tenghover
    @tenghover 3 years ago

    Do you plan to make videos introducing each of the transformation components? Thanks

  • @dwainDigital
    @dwainDigital 3 years ago

    How do you delete from the target based on data from the source? I'm really struggling to understand what to do if I have a column with a value that I want to delete in the target table. Everything seems to be geared toward altering the source data coming in.

  • @mangeshxjoshi
    @mangeshxjoshi 5 years ago +1

    Hi, can Azure Data Factory be used to replace IBM DataStage mapping transformations? IBM DataStage is an ETL tool, and Azure Data Factory is a managed data integration service in the cloud. Does Azure Data Factory support only Blob Storage, Azure Cosmos DB (SQL API), Azure Data Lake Storage, Azure SQL Data Warehouse, and Azure SQL Database? Apart from these, does Azure Data Factory connect to SAP HANA, SAP BW, or Oracle? Are there connectors for pulling data from other sources like SAP HANA, Oracle, etc.?

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Hey, in general ADF has 80+ connectors, including SAP and Oracle. You use those to copy data from those sources to blob storage, and then trigger a mapping data flow pipeline to get the data from blob storage (or data lake), transform it, and output it back to blob (or one of the supported output systems), from where ADF copies it to the designated place.

  • @dintelu
    @dintelu 4 years ago +1

    Wow, lucid explanation!

  • @niteeshmittal
    @niteeshmittal 4 years ago

    Adam, your content is always easy to grasp; excellent work, mate. Could you please explain how to create a pipeline that has a copy activity followed by a mapping data flow activity?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Thanks. Just drag and drop the copy activity and data flow blocks onto the pipeline, then drag a line from the copy activity to the data flow activity.

  • @carlossalinas8497
    @carlossalinas8497 4 years ago

    Adam, you have a gift for explaining complex things. This tutorial made my day, thanks!

  • @seb6302
    @seb6302 4 years ago +1

    I have an issue with the column 'title' not being found in the derived column, despite being able to see all the columns in the source beforehand. Very confused!

    • @seb6302
      @seb6302 4 years ago

      When attempting to aggregate, no columns are found, again despite seeing them in the source.

    • @seb6302
      @seb6302 4 years ago

      I've rebuilt the whole thing and still face the same issue. Google yields no results either. Does anyone know what I'm doing wrong?

    • @seb6302
      @seb6302 4 years ago

      Just tried again and it works! The only difference this time round was that I didn't enable data flow debug. No idea why it worked this time.

    • @seb6302
      @seb6302 4 years ago

      Also, 'Actions' no longer exists under the pipeline. Is there a new way to view the details pane? I can't seem to find one.

    • @seb6302
      @seb6302 4 years ago

      These actions can now be found if you hover over 'Name'!

  • @jagadeeshpinninti3456
    @jagadeeshpinninti3456 4 years ago +1

    Can you please explain how to connect a source dataset from Azure Data Lake Storage Gen2 tables in Azure Data Factory data flows?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      It's the same as blob storage: just create a linked service, select Azure Table Storage, and create a dataset for it. Note that this is not supported for Mapping Data Flows.

  • @mohitjoshi1361
    @mohitjoshi1361 3 years ago

    Have any of these options changed now? I am not able to see any data flow debug option to enable, and I can preview data directly in the dataset itself.

  • @PicaPauDiablo1
    @PicaPauDiablo1 4 years ago +1

    Adam, is there a way to preserve the filename and just have it change the extension? For instance, I'm adding a column with a datetime, but at the end I would like it to have the same file name, just as Parquet. Is there a way to do that?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Use expressions :) That's what they are for.

    • @PicaPauDiablo1
      @PicaPauDiablo1 4 years ago

      @@AdamMarczakYT Sorry if it was a dumb question, I'm still new to ADF. Ignore it if it's too inane, but is filename in the @pipeline parameters? I found one online but couldn't get it to parse.

  • @horatiohe
    @horatiohe 4 years ago +1

    Thanks for the great content!! You are the man :)

  • @RahulRajput_018
    @RahulRajput_018 4 years ago +1

    Thanks buddy... Great work!

  • @javm7378
    @javm7378 2 years ago +1

    I really like your tutorials. I have been looking for a "table partition switching" tutorial but haven't found any good ones. Maybe you could do one for us? I am sure it'll be very popular, as there aren't any good ones out there and it is an important topic in certifications :-)

  • @RC-nn1ld
    @RC-nn1ld 4 years ago

    Love these videos, so easy to understand. Do you have a video on the new XML connector?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      Great, thanks! Not yet, maybe in near future :)

  • @eshaandevgan312
    @eshaandevgan312 4 years ago +1

    I have a question, please help. I am not able to understand why Data Flows need to have their own datasets. Why not use the pipeline datasets? This will help me a lot. Thanks in advance.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      It can use pipeline datasets, but not all types/source systems are supported.

    • @eshaandevgan312
      @eshaandevgan312 4 years ago +1

      @@AdamMarczakYT Thanks Adam, and your videos are very nice. Keep it up.

  • @Rafian1924
    @Rafian1924 4 years ago +1

    Lovely bro!!

  • @rahulkota9793
    @rahulkota9793 4 years ago +1

    Very useful. Thank you so much.

  • @oathkeepersapphirelands
    @oathkeepersapphirelands 4 years ago +1

    How do you handle parallel execution of your pipeline when it is triggered by events, to avoid duplicates?

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago +1

      You need to do this as part of your flow design. Unfortunately some things can't be solved by tools. Thanks for watching! :)

    • @oathkeepersapphirelands
      @oathkeepersapphirelands 4 years ago

      @@AdamMarczakYT Ok maybe handle it by Run ID I guess :)

  • @kirankumarreddykkr9606
    @kirankumarreddykkr9606 1 year ago

    Can you use PySpark or SQL in the expression functions, or only Scala?

  • @Lakshmi-y4x
    @Lakshmi-y4x 6 months ago

    Thank you, very helpful tutorials

  • @yashnegi9473
    @yashnegi9473 2 years ago

    The video is excellent. I want to know: what problem statement does Data Flow solve?

  • @JohnJohnson-bs4cw
    @JohnJohnson-bs4cw 3 years ago +1

    Great video. Can you use data from a REST API as a source for a Mapping Data Flow, or does the source have to be a dataset on Azure?

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Here is the list of supported data sources for MDF: docs.microsoft.com/en-us/azure/data-factory/data-flow-source?WT.mc_id=AZ-MVP-5003556 . Just copy the data from the REST API to Blob and then start the MDF pipeline using that blob path as a parameter.

  • @samb9403
    @samb9403 2 years ago

    Great video.
    Question: Under "New Datasets", is there a capability to drop data into Snowflake? I see S3, Redshift, etc.
    I appreciate the video and feedback!

  • @mrjedrek1112
    @mrjedrek1112 4 years ago

    Hi, I have an issue and I am wondering if you could help me. I have created a similar data flow. When I run a pipeline with this data flow inside, I can see that a new file was created in my data lake. Unfortunately, this file is always empty, but when I click preview data within the data flow (in the sink tool) I can see data. Empty means it has column names but no data. The file is CSV.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Which file are you checking? Mapping Data Flow creates many files in the output to follow the partitioned model, which is HDFS compatible. Typically there is an empty file and a folder which contains the partitioned data.

  • @nick6s
    @nick6s 3 years ago

    Excellent tutorials

  • @davidakoko3308
    @davidakoko3308 4 years ago +1

    Hi Mr. Adam, how are you? I've been trying to use the add function to add two columns of numeric values, but the result is wrong.
    E.g. ADD(COLUMN_A, COLUMN_B) gives RESULT = COLUMN_AB instead of adding the values. Let's say column_a has a value of 334 and column_b has a value of 4; the result is 3344 instead of 338. Please can you help? Nice video BTW, thanks.

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Check out the concat function: docs.microsoft.com/en-us/azure/data-factory/data-flow-expression-functions#concat
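
The behaviour described in this thread is consistent with string-typed columns: on strings, addition concatenates. A minimal Python sketch of the two behaviours follows; the cast-first fix (e.g. add(toInteger(COLUMN_A), toInteger(COLUMN_B)) in the expression builder) is an assumption about the commenter's column types.

```python
# Why adding "334" and "4" can yield "3344": if the columns are read as
# strings, + (or an add over strings) concatenates instead of summing.

column_a, column_b = "334", "4"          # values typed as strings

concatenated = column_a + column_b       # "3344" (string concatenation)
summed = int(column_a) + int(column_b)   # 338    (cast to int first)
```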

  • @gursikh133
    @gursikh133 5 years ago +1

    Adam, for using transformations do I need to learn Scala, or can I just refer to the documentation you mentioned for the Scala functions and write the transformation?

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Documentation should be enough. MDF is targeting simple transformations so in most cases documentation alone will suffice.

  • @mohmedaminpatel4427
    @mohmedaminpatel4427 3 years ago

    Around 20:20 we can see there is just one partition. Does Azure automatically decide the number of partitions it needs to divide the dataset into? Also, is it done at some cost, i.e. do more partitions cost more, or is it complimentary?
    Thank you for all the tutorials; I have been binge-watching them for 3 days now and thoroughly enjoying them! Would love to see some tutorials for Synapse as well :)

    • @mohmedaminpatel4427
      @mohmedaminpatel4427 3 years ago

      Wow, I should have waited before making the comment, as you explain it later in the video itself. Thank you Adam!

    • @AdamMarczakYT
      @AdamMarczakYT  3 years ago

      Glad it helped, thanks! :)

  • @biswajitsarkar5538
    @biswajitsarkar5538 5 years ago

    Thanks Adam!! Very informative video. Liked it a lot.

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Thanks, and you are most welcome! Glad to hear it.

  • @ahmedmj8729
    @ahmedmj8729 2 years ago

    Hello Adam, I followed these steps but I have a problem: I can't find the source columns when I go to the derived column component to write an expression based on an existing column. In your video, the total columns shown in the source component = 3; for me = 0. I changed the source from CSV to a SQL table and still didn't find a solution.

  • @skybluelearner4198
    @skybluelearner4198 2 years ago

    Good explanation there.

  • @TheLastBeat2
    @TheLastBeat2 4 years ago +1

    Hi Adam, so glad I found your channel. Your videos were a big help in achieving the AZ-900 certificate. Now I am studying a lot to uplift my knowledge and get the Azure data engineer certificate. However, I have an important question! Data flows are expensive and sometimes clients don't want to use them; are there alternatives to achieve the same result in Azure Data Factory? Thank you very much!

    • @AdamMarczakYT
      @AdamMarczakYT  4 years ago

      Well you can't have the cookie and eat the cookie :) In my opinion it's not that expensive compared to other available tools.

    • @TheLastBeat2
      @TheLastBeat2 4 years ago

      @@AdamMarczakYT True! I am currently struggling with CSV files that sometimes have extra spaces after the words in the header; this then gives an error when doing a copy activity to Azure SQL Database. Do you have any idea how to make my flow a bit more flexible so that it can deal with this? It needs some trimming in the header.

    • @TheLastBeat2
      @TheLastBeat2 4 years ago

      I thought of doing a Select in a data flow to then change to the correct header titles, but for that I need to know where the spaces will be in the future, so that's also not flexible.
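
A flexible alternative to hard-coding header names is to trim every header cell before the copy activity sees the file, so it doesn't matter where the stray spaces appear. A minimal Python sketch of that pre-processing step, assuming it runs outside ADF (e.g. in a small notebook or function app); the sample data is illustrative.

```python
import csv
import io

# Hypothetical CSV whose header cells carry stray spaces.
raw = "Name , Age  ,City\nalice,30,Paris\n"

reader = csv.reader(io.StringIO(raw))
rows = list(reader)
rows[0] = [header.strip() for header in rows[0]]  # trim every header cell

out = io.StringIO()
csv.writer(out, lineterminator="\n").writerows(rows)
cleaned = out.getvalue()  # header is now "Name,Age,City"
```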

  • @abhim4nyu
    @abhim4nyu 2 years ago

    Will it work with a pipe ("|") separated value file instead of CSV?

  • @yashgemini4024
    @yashgemini4024 4 years ago +1

    Appreciate your content. Thanks.

  • @SairamPoluru
    @SairamPoluru 5 years ago +1

    I got stuck at Derived Column, since I'm not able to get the columns from the source (Movies input).

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Check if you selected the 'first row as header' checkbox in your dataset. There is also a preview data button there which you can use to check whether you set up your dataset properly. If all else fails, I often advise my colleagues to just delete everything and start over; while it sounds a bit weird, it really is sometimes faster than finding the issue, and it provides a nice learning curve. Good luck!

    • @seb6302
      @seb6302 4 years ago

      I have the same issue - did you resolve it?

  • @kevinabraham92
    @kevinabraham92 2 years ago

    Nice video.
    Just curious: can you explain toInteger(trim(right(title,6),'()')) in detail, please? Like, how does this expression execute?
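
Since the thread doesn't break the expression down, here is a rough Python analogue of how toInteger(trim(right(title,6),'()')) evaluates inside-out on a MovieLens-style title. One assumption: ADF's trim(s, '()') strips any of the listed characters from both ends, much like Python's str.strip.

```python
# Rough Python analogue of the Data Flow expression
# toInteger(trim(right(title, 6), '()')), evaluated inside-out.

def extract_year(title: str) -> int:
    right6 = title[-6:]           # right(title, 6) -> "(1995)"
    trimmed = right6.strip("()")  # trim(..., '()') -> "1995"
    return int(trimmed)           # toInteger(...)  -> 1995

year = extract_year("Toy Story (1995)")  # -> 1995
```

Note the hard-coded 6 only works because the year suffix " (1995)" always occupies the last six characters after the space.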

  • @khushbookumari-pc2xw
    @khushbookumari-pc2xw 5 years ago

    Please also explain how to use data analytics in a pipeline flow.
    This is the best content. Thank you so much.

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Thank you. As to your question, can you elaborate on the data analytics part? What exactly would you like to see?

    • @khushbookumari-pc2xw
      @khushbookumari-pc2xw 5 years ago +1

      Anything like how to make a function or procedure and how to execute it in a pipeline; the basic flow of a pipeline using analytics.

    • @AdamMarczakYT
      @AdamMarczakYT  5 years ago

      Hey, I do plan to have implementation videos like this in the future, although the pipeline of videos is long, so I can't promise anything right now. I added this to the list of potential topics :) Thanks!

    • @khushbookumari-pc2xw
      @khushbookumari-pc2xw 5 years ago

      Okay, thanks.

  • @anubhav2020
    @anubhav2020 3 years ago +2

    Hello Adam, thanks a bunch for this excellent video. The tutorial was very thorough, and anyone new can easily follow it. I do have a question though. I am trying to replicate an SQL query in the Data Flow, but I have had no luck so far.
    The query is as follows:
    Select ZipCode, State
    From table
    Where State in ('AZ', 'AL', 'AK', 'AR', 'CO', 'CA', 'CT'...... LIST OF 50 STATES);
    I tried using the Filter, Conditional Split and Exists transforms, but could not achieve the desired result. Being new to the cloud platform, I am having a bit of trouble.
    Might I request that you cover topics like data subsetting/filtering (WHERE and IN clauses, etc.) in your tutorials?
    Appreciate your time and help in putting together these practical implementations.
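
One candidate for that WHERE ... IN clause is a Filter transformation whose condition uses the in() expression function, e.g. in(['AZ','AL','AK','AR','CO','CA','CT'], State). Below is a rough Python analogue of what such a filter does to the rows; the column names, state list, and sample rows are illustrative.

```python
# Python analogue of a Filter transformation keeping only rows whose
# State is in a fixed list, like SQL's WHERE State IN (...).

states = ["AZ", "AL", "AK", "AR", "CO", "CA", "CT"]
rows = [
    {"ZipCode": "85001", "State": "AZ"},
    {"ZipCode": "96799", "State": "AS"},  # not in the list -> filtered out
]

filtered = [row for row in rows if row["State"] in states]
```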

  • @arun06530
    @arun06530 4 years ago +1

    Nice and detailed video.