How to Create Databricks Workflows (new features explained)

  • Published: Feb 7, 2025

Comments • 38

  • @youfran • 1 year ago +5

    I wish they would add the possibility of adding workflow dependencies to other workflows. As a data engineer, you need this 100% of the time.

    • @BryanCafferky • 1 year ago +1

      Not sure what you mean. Could you elaborate?

    • @youfran • 1 year ago +8

      @BryanCafferky I meant it would be immensely helpful if Databricks Workflows offered the ability to set a trigger mode based on the completion or state of other workflows, given the limit of 100 tasks per workflow.

  • @vyacheslavs5642 • 1 year ago +3

    You can use Terraform to provision your Workflows, Tasks, Clusters, Notebooks, etc. programmatically. Then Terraform scripts (*.tf, *.hcl) can be uploaded to Git and used in CI/CD as well.

    • @BryanCafferky • 1 year ago +3

      Thanks for your comment. Terraform is not open source anymore, which gives me pause about its future. OpenTofu is the new open-source Terraform fork. You can also use Python with the Databricks Python SDK, plain Python with the Databricks REST API, or the new Databricks Asset Bundles.
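
For reference, a minimal sketch of creating a workflow programmatically against the Jobs 2.1 REST API; the job name, notebook paths, and environment-variable names are placeholders, and the same spec shape could equally be fed through the Python SDK or Terraform:

```python
import json
import os
import urllib.request

# Job specification for the Jobs 2.1 "create" endpoint: two notebook
# tasks, where "load" runs only after "ingest" succeeds.
job_spec = {
    "name": "nightly-etl",
    "max_concurrent_runs": 1,
    "tasks": [
        {
            "task_key": "ingest",
            "notebook_task": {"notebook_path": "/Workspace/etl/ingest"},
        },
        {
            "task_key": "load",
            "depends_on": [{"task_key": "ingest"}],
            "notebook_task": {"notebook_path": "/Workspace/etl/load"},
        },
    ],
}

def create_job(host: str, token: str, spec: dict) -> bytes:
    """POST the spec to <host>/api/2.1/jobs/create and return the raw response."""
    req = urllib.request.Request(
        f"{host}/api/2.1/jobs/create",
        data=json.dumps(spec).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Only call the workspace when credentials are actually configured.
if os.environ.get("DATABRICKS_HOST") and os.environ.get("DATABRICKS_TOKEN"):
    print(create_job(os.environ["DATABRICKS_HOST"],
                     os.environ["DATABRICKS_TOKEN"], job_spec))
```

Because the whole workflow is a JSON document, the same file can live in Git and be applied from a CI/CD pipeline.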

    • @palanikumar4150 • 5 months ago

      Can you please share a reference for creating workflows, tasks, and running notebooks using Terraform? So far I'm using Terraform only to create the Databricks workspace and clusters.

  • @manasr3969 • 9 months ago

    Love this video. The dashboard refresh is super cool.

  • @afonso0078 • 1 year ago +1

    Thank you for sharing your knowledge! One question: is there a way to create this workflow using some type of CI/CD? For example, creating a development branch and a pull request to merge into a master branch?
    The main idea is to create the workflow in a development environment and promote it to the production environment.

    • @BryanCafferky • 1 year ago +1

      Yes, there are several ways. I am using the Databricks Python SDK from an Azure DevOps pipeline to do this. However, workflows are not stored in the repos, so you'll need to use the UI, get the JSON, and paste it into a file in your repo: learn.microsoft.com/en-us/azure/databricks/dev-tools/sdk-python You can also use the new Databricks Asset Bundles: learn.microsoft.com/en-us/azure/databricks/dev-tools/bundles/

  • @Databricks • 1 year ago +1

    Great summary!!

  • @zoji9566 • 11 months ago

    Invaluable. Thank you 🙏

  • @michasikorski6671 • 1 year ago

    I have a workflow with task A, task B, and 10 more. I would like to have widgets or parameters like A: True, B: False... that decide whether a task should be skipped or not. Is it possible? How?
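
One possible pattern, sketched below: pass one boolean-ish job parameter per task (the `run_A`-style names are made up here) and have each notebook exit early when its flag is off. The `dbutils` calls in the comments only work inside a Databricks notebook; the runnable part is just the flag-parsing logic:

```python
def should_run(params: dict, task_key: str, default: bool = True) -> bool:
    """Interpret a job parameter like run_A="false" as a skip flag for that task."""
    raw = params.get(f"run_{task_key}", str(default))
    return str(raw).strip().lower() in ("1", "true", "yes")

# Inside the notebook backing task A (Databricks only):
#   dbutils.widgets.text("run_A", "true")
#   if not should_run({"run_A": dbutils.widgets.get("run_A")}, "A"):
#       dbutils.notebook.exit("skipped")

print(should_run({"run_A": "True", "run_B": "false"}, "A"))  # True
print(should_run({"run_A": "True", "run_B": "false"}, "B"))  # False
print(should_run({}, "C"))  # True (missing flag defaults to run)
```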

  • @philipqualtier5102 • 3 months ago

    This is an excellent video. Do you have any videos where you pass a parameter into a SQL statement instead of Python code?

    • @BryanCafferky • 3 months ago

      I don't think I did a video about that yet. Good thought. Thanks.

  • @RajeshPhanindra • 1 year ago +1

    When creating a workflow, does it allow you to drag and drop tasks?

    • @BryanCafferky • 1 year ago

      No. The UI is more about selecting a task and setting its properties. The diagram then updates to reflect properties like dependencies.

  • @8aravindk • 1 year ago +1

    Hi @bryan, why are these videos still not in the playlist on your website? It's been 2 weeks since you posted them here. I'm looking under the Databricks section and can't find them. I think your website should be a first-class citizen for locating your videos as well. Cheers, and thanks for the helpful videos.

    • @BryanCafferky • 1 year ago

      Hi @8aravindk, they are in the YT playlist, and the GitBook points you to the playlist rather than listing all the videos in it. To make new videos easier to find, I added a new-videos menu to the GitBook and added these. These videos are in the YouTube Master Data Lakehouse playlist. Thanks

  • @elprofesornet8897 • 6 months ago

    Hi Bryan, excellent material :) One question, would you still stick to workflows if you have to upload/download files using SFTP or import data consuming REST APIs?

    • @BryanCafferky • 6 months ago +1

      Thanks. It depends on how the SFTP transfer happens. If some other external process drops files into ADLS via SFTP, that seems fine; a workflow could pick the file(s) up. If the SFTP needs to execute within the workflow to get the data, this blog suggests using ADF to land the data in ADLS, and then the workflow could do the rest. Notebooks can call REST APIs, so it is possible to get data that way, but whether it works in a given scenario would need to be evaluated by the data engineer.

  • @SaiKumar-ub6jo • 7 months ago +1

    Can you help with how we can create a drop-down for task parameters in a workflow?

    • @BryanCafferky • 7 months ago

      You use widgets. Docs here: learn.microsoft.com/en-us/azure/databricks/notebooks/widgets
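
A sketch of what that could look like; the widget name and choices are made up, and `dbutils` itself only exists inside a Databricks notebook, so the runnable part below is just the fallback logic:

```python
# Declared inside a Databricks notebook, this renders a drop-down whose
# value arrives as a task parameter when the notebook runs in a workflow:
#   dbutils.widgets.dropdown("env", "dev", ["dev", "qa", "prod"])
#   env = dbutils.widgets.get("env")

CHOICES = ["dev", "qa", "prod"]

def resolve_env(value: str, choices=CHOICES, default: str = "dev") -> str:
    """Fall back to a safe default when the passed parameter is not a valid choice."""
    return value if value in choices else default

print(resolve_env("qa"))       # qa
print(resolve_env("staging"))  # dev
```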

  • @datoalavista581 • 11 months ago

    Brilliant !! Thank you so much

  • @conconmc • 1 year ago +1

    Hi Bryan, wondering if you could do a video on Databricks and dbt? Would be interested in your thoughts :)

    • @BryanCafferky • 1 year ago

      I have not used dbt but from what I have seen it is very powerful. Thanks

  • @ZhangYiman-p8d • 2 months ago

    How to pass the variables from one task to the following task?

    • @BryanCafferky • 2 months ago

      See this doc: docs.databricks.com/en/jobs/parameters.html
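
Concretely, the mechanism covered there is task values plus dynamic value references; a sketch (the task keys and the `row_count` key are placeholders, and the `dbutils` calls only work inside a Databricks notebook):

```python
# The upstream task's notebook publishes a value (Databricks only):
#   dbutils.jobs.taskValues.set(key="row_count", value=42)
#
# The downstream task's notebook reads it back:
#   n = dbutils.jobs.taskValues.get(taskKey="ingest", key="row_count",
#                                   debugValue=0)
#
# Alternatively, a downstream task parameter can use a dynamic value
# reference that the Jobs service resolves at run time:
downstream_params = {"row_count": "{{tasks.ingest.values.row_count}}"}
print(downstream_params["row_count"])
```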

  • @SujeetKumarSinghlive • 1 year ago

    It helps a lot, thanks!

  • @shankhadeepghosal731 • 8 months ago

    How to use if/else branch logic?
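
Databricks Workflows supports this through an If/else condition task type; a sketch of the task list such a job might use (task keys, the `env` parameter, and notebook paths are placeholders):

```python
# Task list for a job with an "If/else condition" task. Downstream tasks
# attach to the true/false branch via depends_on[].outcome.
if_else_tasks = [
    {
        "task_key": "check_env",
        "condition_task": {
            "op": "EQUAL_TO",
            "left": "{{job.parameters.env}}",
            "right": "prod",
        },
    },
    {
        "task_key": "publish",
        "depends_on": [{"task_key": "check_env", "outcome": "true"}],
        "notebook_task": {"notebook_path": "/Workspace/etl/publish"},
    },
    {
        "task_key": "dry_run",
        "depends_on": [{"task_key": "check_env", "outcome": "false"}],
        "notebook_task": {"notebook_path": "/Workspace/etl/dry_run"},
    },
]
print([t["task_key"] for t in if_else_tasks])
```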

  • @Noobsmove • 1 year ago +1

    Agree on the limitations. For some reason a Databricks Workflow cannot contain more than 100 steps.
    Luckily there is now a new feature where a workflow can contain a new kind of step that triggers another job.
    So now you can at least subdivide your job into multiple smaller ones and then have a master job that triggers all the sub-jobs.
    But still, it would be way easier to just not have that limitation. It feels kinda artificial :/
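
The sub-job trigger described above is the `run_job_task` task type in the Jobs 2.1 spec; a sketch of a master-job spec (the name and job IDs are placeholders):

```python
# A "master" job whose tasks are run_job_task steps triggering sub-jobs
# in sequence; each sub-job can itself hold up to 100 tasks.
master_spec = {
    "name": "master-orchestrator",
    "tasks": [
        {"task_key": "stage_1", "run_job_task": {"job_id": 111}},
        {
            "task_key": "stage_2",
            "depends_on": [{"task_key": "stage_1"}],
            "run_job_task": {"job_id": 222},
        },
    ],
}
print([t["task_key"] for t in master_spec["tasks"]])
```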

  • @MariusS-h2p • 10 months ago

    19:25 That's not a feature option, that's just the category?!

  • @joshuatrampier4355 • 1 year ago

    How do you delete a task from a workflow?

    • @BryanCafferky • 1 year ago

      Click on the task in the workflow editor and click on the trash can.

  • @JMo268 • 1 year ago

    Could you dedicate a video to Unity Catalog?