Code along - build an ELT Pipeline in 1 Hour (dbt, Snowflake, Airflow)

  • Published: 30 Sep 2024

Comments • 162

  • @twickAttack
    @twickAttack 7 months ago +31

    Hey Jay, thank you for the video. I'd be happy to see you doing more ELT pipelines and focusing on your thought process (I can watch a longer format, 1-2 hours) - why you do things that way, why it is important, and whatnot; and you can throw in explainers for anything else you do and the reason behind it. I think senior DEs and others with experience do things a bit automatically, and it takes time for newbies to pick up on those skills. So, your thought process for doing things, instead of just doing the things, is priceless for anyone watching, including me. Appreciate your video, dude :)

    • @jayzern
      @jayzern  7 months ago +9

      Thank you! Will try to create more useful content

    • @miguelhermar
      @miguelhermar 6 months ago

      Completely agree 😊

  • @AakashKumarDhal
    @AakashKumarDhal 2 months ago +3

    Error solved!!!!
    For anyone facing this error:
    Runtime Error
      Database error while listing schemas in database "dbt_db"
      Database Error
        250001: Could not connect to Snowflake backend after 2 attempt(s). Aborting
    Try the second method of updating the account name for your project inside the profiles.yml file:
    account_locator-account_name
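
    For reference, the fix above refers to the account field in the ~/.dbt/profiles.yml file. A minimal sketch, with placeholder values (the locator and account name below are not real):

    ```yaml
    data_pipeline:
      outputs:
        dev:
          type: snowflake
          # If the plain account-URL value fails with error 250001, try the
          # <account_locator>-<account_name> form instead:
          account: ab12345-my_account_name
    ```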

  • @mohitupadhayay1439
    @mohitupadhayay1439 1 month ago +7

    AN ABSOLUTE GOLDMINE OF INFORMATION WHICH NO UDEMY OR RUclips TUTOR HAS PROVIDED YET!

  • @kmonish9119
    @kmonish9119 4 months ago +5

    I have been struggling with dbt and Airflow for a long time. For some reason I could not connect the dots. Having some mixture of knowledge, I landed on this tutorial and it just glued all my scattered dots together. Thanks Jayzern!!! Really appreciate the efforts :)

  • @aliceschirina8191
    @aliceschirina8191 10 hours ago

    Hello, thanks for this tutorial. At the very beginning, when trying to run the "dbt deps" command, I'm getting this error: "Encountered an error loading local configuration: dbt_cloud.yml credentials file for dbt Cloud not found. Download your credentials file from dbt Cloud to `C:\Users\a.schirina\.dbt`". I'm using the dbt command locally, and my profiles.yml in the .dbt folder is:
    data_pipeline:
      target: dev
      outputs:
        dev:
          type: snowflake
          account: jpb45436
          # User/password auth
          user: alices
          password: mypassword
          role: dbt_role
          database: dbt_db
          warehouse: dbt_wh
          schema: dbt_schema
          threads: 4
          client_session_keep_alive: False
    Does anyone know the problem?

  • @Dev_Zyoom
    @Dev_Zyoom 7 months ago +9

    honestly never knew about dbt and glad to learn it here thank you

  • @GeorgeNyamao
    @GeorgeNyamao 1 month ago +2

    Thanks @jayzern. This tutorial is awesome. I will be recommending it to folks who struggle with connecting dbt with any database engine.

  • 13 hours ago

    Thank you for the video jayzern. When I push code into Git, should I push only the dbt code, or do I need to push all of the dbt-dag code?

  • @truongnguyen813
    @truongnguyen813 4 months ago +1

    I'm stuck at the step loading the dbt data_pipeline; it does not show up in the Airflow DAG. Where could I be going wrong? Can you help?

  • @southafricangamer7174
    @southafricangamer7174 3 days ago

    So to my understanding, singular tests really check that nothing is returned by the query being tested.
    If the test passes, the query being tested returned nothing - great, your data is fine.
    If it fails, you should run that query to see exactly which rows came back.
    Confusing at first, but makes sense now.
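
    The pattern described above, as a dbt singular-test sketch: a .sql file under tests/ that selects the rows violating an expectation, so the test passes only when the query returns zero rows (the file and column names here are assumptions based on the video):

    ```sql
    -- tests/fct_orders_discount.sql (hypothetical name)
    -- Select rows that VIOLATE the rule "discounts are stored as negative amounts";
    -- dbt marks the test as passed when this query returns no rows.
    select *
    from {{ ref('fct_orders') }}
    where item_discount_amount > 0
    ```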

  • @rakshavishwakarma1811
    @rakshavishwakarma1811 5 months ago +2

    Thank you so much, it is 100% worth it and useful... expecting some more detailed videos... like prod deployment through Git and Git integration with Airflow

  • @StanleySI
    @StanleySI 4 months ago +1

    Just wondering, in a real-world scenario, where is all the raw data stored? In AWS S3?

  • @MalvinSiew
    @MalvinSiew 5 months ago +2

    Hi Jay, thanks for the video. I'm having an issue connecting to Snowflake backend at the stage you first perform 'dbt run' @ 14:50 .
    This is the error I get:
    15:17:54 Encountered an error:
    Runtime Error
    Database error while listing schemas in database "dbt_db"
    Database Error
    250001: Could not connect to Snowflake backend after 2 attempt(s). Aborting
    I've checked the profiles.yml file and all details are correct. Please help!

    • @parag2039
      @parag2039 5 months ago +1

      Facing the same issue!!! Can anyone please help? I've restarted and tried everything possible to figure it out, but failed.

    • @parag2039
      @parag2039 5 months ago

      @MalvinSiew
      I solved one of the two errors I was facing. I did not have Git installed in my system. You can simply ask AI for prompts to guide you through the installation process.

    • @vitorcavalcante8495
      @vitorcavalcante8495 4 months ago

      Had the same problem: when passing the account value with 'dbt init' I wasn't able to connect using the account URL value, only with the second option, which was the - value

    • @oreschz
      @oreschz 2 months ago

      Did you solve it? I have the same problem. What is the solution?

    • @AakashKumarDhal
      @AakashKumarDhal 2 months ago

      @@oreschz could you solve it?

  • @anikethdeshpande8336
    @anikethdeshpande8336 4 months ago +3

    I'm new to Snowflake, dbt and Airflow;
    this is an awesome tutorial, got to learn a lot.
    thank you jayzern

  • @abdullahsiddique7787
    @abdullahsiddique7787 13 days ago

    Is data engineering dead with the advent of AI? What is the future of data engineering careers, in your opinion?

  • @diaconescutiberiu7535
    @diaconescutiberiu7535 6 months ago +2

    Awesome video! I already recommended this to my entire team. Please make more like this, they are extremely helpful.
    Idea for next video: dbt for Snowflake (again) but with Data Vault 2.0 modeling. I would love to see the logic behind creating dim and fact tables, how you define the stg files for creating the hubs/satellites/links.

    • @jayzern
      @jayzern  6 months ago

      Oof yea I did consider doing a Data Vault model where we showcase how hubs, satellites and links work but didn't think ppl would be interested. Thanks for raising 👍

  • @corbanb
    @corbanb 1 month ago

    Jay! Thanks for the video and content, very cool to see. Curious why Airflow over something like FiveTran, besides the ability to self-host? Any gotchas?

    • @jayzern
      @jayzern  1 month ago

      FiveTran is not really an orchestration tool - it's really meant for the "Extract Load" part only. It's great because of the Unix philosophy, i.e. "do one thing and do it well", whereas Airflow is more of a generalist, task-based orchestrator. Another thing is that FiveTran is super expensive, unless you're working on something enterprise-y

  • @ahmednasr3811
    @ahmednasr3811 4 months ago +2

    Thanks bro for your efforts ❤

  • @tianhockwoo3025
    @tianhockwoo3025 1 month ago

    Hello, did anyone else face this error in Airflow after 32:50?
    Broken DAG: [/usr/local/airflow/dags/dbt-dag.py]
    Traceback (most recent call last):
      File "/usr/local/lib/python3.12/site-packages/cosmos/operators/base.py", line 361, in __init__
        self.full_refresh = full_refresh
      File "/usr/local/lib/python3.12/site-packages/airflow/models/baseoperator.py", line 1198, in __setattr__
        if key in self.__init_kwargs:
    AttributeError: 'DbtRunLocalOperator' object has no attribute '_BaseOperator__init_kwargs'. Did you mean: '_BaseOperator__instantiated'?
    Please send help

    • @CosmicNomad
      @CosmicNomad 1 month ago +1

      I am facing the exact same error. Please post a reply, if you were able to figure out the fix. I'll do the same if I find a solution.

    • @CosmicNomad
      @CosmicNomad 1 month ago

      Ok, so I think I was able to find the thread related to this issue. It's still open as of 8/18/2024 11pm PT:
      github.com/astronomer/astronomer-cosmos/issues/1161

  • @ThiagoSilva-vh9fy
    @ThiagoSilva-vh9fy 1 month ago

    Couldn't run int_order_items.sql because it returns a strange error. It says: "The selection criterion 'int_order_items.sql' does not match any enabled nodes". And if I run "dbt run" it says: "unexpected '.' in line 1" at 20:22

  • @christophercampo9099
    @christophercampo9099 16 days ago

    Thank you, thank you THANK YOU! This was so helpful, easy to follow and made perfect sense.

  • @saikoundinya9913
    @saikoundinya9913 3 months ago

    Dude, where did you even mention the dbt_project.yml file? In part 2 of the video, you jump directly to VS Code.
    What are the details??

  • @miguelhermar
    @miguelhermar 7 months ago +1

    Thanks Jay! Could you also upload into the Notion document the code for the dbt_dag.py file for the Airflow deployment? That's still missing 🙏🏻

    • @jayzern
      @jayzern  7 months ago +1

      Totally forgot about that, thanks for the reminder!

    • @miguelhermar
      @miguelhermar 7 months ago

      No worries, I realized you used it from the Cosmos GitHub repo, so I managed to find it there and finally was able to wire everything up and deploy it. 🤓 Thanks Jay. It's a super helpful tutorial. @@jayzern

  • @uppinder
    @uppinder 1 month ago

    26:00 item_discount_amount is supposed to be negative because the macro defined it as such. I also checked the data on snowflake and they're all negative amounts. Did I miss something?
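
    For context, a sketch of what such a macro might look like (the exact name and signature in the video may differ) - the -1 factor is what makes every stored discount negative, which is also why the singular test checks for positive values:

    ```sql
    -- macros/pricing.sql (sketch; names are assumptions)
    {% macro discounted_amount(extended_price, discount_percentage, scale=2) %}
        (-1 * {{ extended_price }} * {{ discount_percentage }})::decimal(16, {{ scale }})
    {% endmacro %}
    ```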

  • @anduamlaktadesse9284
    @anduamlaktadesse9284 5 months ago +2

    So supportive; I'm completing the project.

  • @KieuOanhNguyen-ve6yk
    @KieuOanhNguyen-ve6yk 3 months ago +2

    Thank you for your video. I'm stuck at the last step when connecting to Airflow, though I followed every step. It shows this error:
    Broken DAG: [/usr/local/airflow/dags/dbt_dag.py]
    Traceback (most recent call last):
      File "/usr/local/lib/python3.11/site-packages/cosmos/converter.py", line 211, in __init__
        project_config.validate_project()
      File "/usr/local/lib/python3.11/site-packages/cosmos/config.py", line 206, in validate_project
        raise CosmosValueError(f"Could not find {name} at {path}")
    cosmos.exceptions.CosmosValueError: Could not find dbt_project.yml at /usr/local/airflow/dags/dbt/data_pipelin/dbt_project.yml
    Do you have any idea why I get this error? Thank you in advance!

    • @George-s9s1u
      @George-s9s1u 1 month ago

      same error here.... no solution yet.

    • @CosmicNomad
      @CosmicNomad 1 month ago

      In your case I think you have a typo: a missing 'e' in /usr/local/airflow/dags/dbt/data_pipelin

    • @oyindamolavictor
      @oyindamolavictor 1 month ago

      Please, how did you fix this issue?

    • @aarthithinakaran6655
      @aarthithinakaran6655 28 days ago

      I got the same error too. Please let me know how you fixed it.

    • @George-s9s1u
      @George-s9s1u 27 days ago

      @@aarthithinakaran6655 Yes, I fixed it; everything works

  • @montoyescoful
    @montoyescoful 3 months ago

    Hi Jay. Question: Once you have created the Fact table, how does this process work if I run it again? Is it going to append new records and update the existing ones? Or is it going to drop and create the Fact table over again?

  • @dominicaleung7329
    @dominicaleung7329 12 days ago

    Thank you very much. This is a very nice and concise tutorial, exactly what I need.

  • @ozland7172
    @ozland7172 4 months ago

    Hello.. thanks for the tutorial.
    I know Airflow runs the tasks/DAGs; however, I cannot follow one thing: how do we determine the order of the action items at 35:36 within dbt (I believe it is determined on the dbt side), since we have only one DAG running in this example? I'd appreciate it if anyone replies.
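
    On the ordering question: dbt infers the run order from ref() calls inside each model, and Cosmos renders that dependency graph as Airflow tasks, so the order is indeed determined on the dbt side. A toy sketch (model names assumed):

    ```sql
    -- models/marts/int_order_items.sql (sketch)
    -- Because this model ref()s a staging model, dbt - and therefore the
    -- Cosmos-generated Airflow DAG - always runs stg_tpch_orders first.
    select *
    from {{ ref('stg_tpch_orders') }}
    ```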

  • @CosmicNomad
    @CosmicNomad 1 month ago

    This is such an amazing video @jayzern! The project taken on was not overly complex but also not barebones, and it covered a lot of important stuff! Thanks for being thoughtful and including the code-along link (otherwise some of the formatting issues would have bugged many newbies)!
    I think you should keep creating more videos, as you are a good teacher. The only suggestion I have is to maybe include a bit more explanation, which will help beginners even more! Kudos!

  • @stephenarnold2343
    @stephenarnold2343 3 months ago

    I materialized marts as tables but int_order_items, int_order_items_summary and fct_orders are created as views instead of tables. How do I convert these views to tables?
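
    A likely cause, sketched as a dbt_project.yml fragment (the folder names here are assumptions; they must match the directories on disk exactly, otherwise dbt falls back to the default view materialization):

    ```yaml
    # dbt_project.yml (sketch)
    models:
      data_pipeline:
        staging:
          +materialized: view
        marts:
          +materialized: table   # re-run `dbt run` after changing this
    ```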

  • @abderrahmanehamim6692
    @abderrahmanehamim6692 17 days ago

    Thank you very much

  • @saurabhpandey1882
    @saurabhpandey1882 3 months ago

    Thanks Jayzern! If I can be of some help for your next video, let me know!

  • @PhanNguyen_010
    @PhanNguyen_010 6 months ago +2

    thank you so much for this tutorial. hope you have more videos in the future

    • @jayzern
      @jayzern  6 months ago

      Thanks man!

  • @PhiNguyen-iz9go
    @PhiNguyen-iz9go 6 months ago +2

    This code along session (starting from scratch with environment setup, codebase structure ...) is soooooooo helpful. Hope to see more examples like this. Keep up the work my man

    • @PhiNguyen-iz9go
      @PhiNguyen-iz9go 6 months ago

      I watched the video "How you start learning Data Engineering..." and was wondering: can you do a live coding session that steps through all those aspects (from SQL, command lines... to Kafka...) in one project? I think it would help a lot...

    • @jayzern
      @jayzern  6 months ago +1

      Glad to hear it's helpful! 👍
      It's great to hear feedback on what type of live coding videos you find insightful. Will keep note of Kafka and command lines

  • @SteynGun-n2u
    @SteynGun-n2u 1 month ago

    Hi guys, kindly help me out: are Snowflake and dbt alone enough, or do I have to learn Hadoop, Spark, etc.? I have been working as a data analyst for the last year and am planning to switch to DE

  • @projectveritas93
    @projectveritas93 7 months ago +2

    Great video and explanation. We need more videos from you.

  • @DivineSam-w6m
    @DivineSam-w6m 2 months ago +1

    This video is like a gold mine for building a portfolio, especially for someone starting out as a data engineer like me!... Many thanks and kudos to you!.. Love from India

    • @adityakulkarni3798
      @adityakulkarni3798 13 days ago

      Hey, how did you use Snowflake? Did you buy it? Because it shows me that it is paid software

  • @melvin9993
    @melvin9993 3 months ago +1

    Dude this is so good :)

  • @anggipermanaharianja6122
    @anggipermanaharianja6122 1 day ago

    nice

  • @thainguyenbalamquang386
    @thainguyenbalamquang386 3 months ago +1

    Thank you, love your work

  • @nothan_nah
    @nothan_nah 7 months ago +2

    Thanks for sharing this dbt tutorial! It’s definitely super hot rn and useful to learn. 🎉

  • @DagStylez
    @DagStylez 3 months ago +1

    Excellent tutorial!!!

  • @peekknuf
    @peekknuf 7 months ago +1

    Extremely useful content, I especially liked the live googling and debugging parts

    • @jayzern
      @jayzern  7 months ago +1

      Thank you for the support! Hope other people find it useful too.

  • @AkashKandarkar
    @AkashKandarkar 3 months ago +1

    Amazingly explained 👌

  • @srikantaghosh2386
    @srikantaghosh2386 2 months ago

    At 32:21, how did you copy the dbt folders into the Airflow project?

  • @dataengineermatheusbudin7011
    @dataengineermatheusbudin7011 2 months ago

    Hey, thanks for the project tutorial. I was wondering what the best way is to deploy Airflow in a cloud environment... I see a lot of EC2 or EKS (Kubernetes). But maybe I could work with ECS + Fargate? Which deployment method would you recommend for a production scenario? (Like beyond studies, thinking about a daily job task.) Thank you mate

    • @jayzern
      @jayzern  2 months ago

      Airflow + EKS is probably the most common in the industry because of cost reasons and vertical scaling. You could use ECS + Fargate too, but Fargate is really expensive!
      I don't have any recs atm, but will try to create more examples on production DAGs next time. Check out ruclips.net/video/Xe8wYYC2gWQ/видео.html in the meantime!

  • @JohnS-er7jh
    @JohnS-er7jh 3 months ago

    Thanks very much for posting this! Definitely earned another subscriber/viewer

  • @AaronAsherRandall
    @AaronAsherRandall 2 months ago

    This is great! At what point would you need to dockerize the files though? Sorry, new to data engineering. Thank you!

    • @jayzern
      @jayzern  2 months ago

      You can Dockerize it at the beginning, or once you have a baseline model working. I've seen cases where Data engineers start with Docker, or Dockerize it halfway! I personally prefer the latter

  • @AwakenByMe
    @AwakenByMe 2 months ago

    WOW!! Thank you so much for this wonderful video. Please keep making dbt + Airflow videos.
    I have one doubt: I can see that one task in Airflow, stg_tpch_orders, has run + test in your DAG, but it is not showing up in mine.
    Have you added any tests on stg_tpch_orders but maybe missed showing it in the video?

    • @jayzern
      @jayzern  2 months ago

      Hmm it's hard to tell without looking at ur code, but there is a generic test for stg_tpch_orders that looks at the relationship between fct_orders and stg_tpch_orders. Check your generic_tests.yml file to confirm
      Thanks for the support man!
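
      For reference, a sketch of what such a generic-test entry could look like in a schema .yml file (the file, model, and column names are assumptions):

      ```yaml
      # models/marts/generic_tests.yml (sketch)
      models:
        - name: fct_orders
          columns:
            - name: order_key
              tests:
                - relationships:
                    to: ref('stg_tpch_orders')
                    field: order_key
      ```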

  • @pythonmathui3057
    @pythonmathui3057 2 months ago

    I'm struggling with the Airflow connection to Snowflake; can you make another video to elaborate on it more?

    • @jayzern
      @jayzern  2 months ago

      For sure, I didn't explain the airflow integration with snowflake as much as I wanted to

  • @maryam4071
    @maryam4071 5 months ago

    Hi, I would like to know about the singular test: we want to check for negative values in the test, so why do we use the condition as positive?

  • @sanaomar2182
    @sanaomar2182 4 months ago

    How do you know your username? jayzern? I went back to my profile but it did not work. Where can I find the name of my user?

    • @quintonflorence6492
      @quintonflorence6492 4 months ago

      In Snowflake you should see your user name in the bottom left corner. It'll be the top bolded value

  • @mohammedvahid5099
    @mohammedvahid5099 6 months ago

    Please make complete videos on dbt with a Snowflake migration project, with real-time scenario videos, bro. Thank you ❤ Nicely explained

    • @jayzern
      @jayzern  6 months ago +1

      Thank you man! Will take that into consideration

  • @CybersecYT
    @CybersecYT 5 months ago

    How can I get the project folder structure?

  • @abdelaliamghar5123
    @abdelaliamghar5123 2 months ago

    90 minutes, it's a long time

  • @CSK-Studios
    @CSK-Studios 7 months ago

    Hi Jay, good one. I am trying the same way but getting the error below:
    "1 of 1 ERROR creating view model dbt_schema.stg_tpch_line_items ................. [ERROR in 0.04s]
    06:17:33 Finished running 1 view model in 2.02s.
    06:17:33 Completed with 1 error and 0 warnings:
    06:17:33 Compilation Error in model stg_tpch_line_items (models\staging\stg_tpch_line_items.sql)
    06:17:33 'dict object' has no attribute 'type_string'
    06:17:33 > in macro generate_surrogate_key (macros\sql\generate_surrogate_key.sql)
    06:17:33 > called by macro default__generate_surrogate_key (macros\sql\generate_surrogate_key.sql)
    06:17:33 > called by model stg_tpch_line_items (models\staging\stg_tpch_line_items.sql)"

    • @jayzern
      @jayzern  7 months ago

      Try checking that your dbt_utils version is correct. There seems to be a compile-time error when calling generate_surrogate_key. The code is available on the Notion page.
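
      For reference, the dbt_utils version is pinned in packages.yml; a sketch (the version shown is an assumption - check the dbt package hub for the release matching your dbt version), then re-run dbt deps:

      ```yaml
      # packages.yml (sketch; version is an assumption)
      packages:
        - package: dbt-labs/dbt_utils
          version: 1.1.1
      ```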

    • @szhao2864
      @szhao2864 7 months ago

      I got the same error. How did you solve it?

  • @rvmnet2112
    @rvmnet2112 5 months ago

    One question here: as we have the dbt jobs feature available in dbt Cloud and it is very easy to create a job there, why do we need to use Airflow?

    • @jayzern
      @jayzern  5 months ago

      Yea, that's a great question! In theory dbt Cloud can trigger jobs too, but in practice you'd want to decouple your orchestration tool from your transformation tool for a myriad of reasons: the ability to orchestrate other tools together with dbt, avoiding vendor lock-in with dbt, many companies being comfortable with Airflow, etc. It really depends on your tech stack

  • @kenneth1691
    @kenneth1691 6 months ago

    Thank you so much for this. I've been trying to learn how to do this, and you helped me solve it.
    Do you have trainings?!

    • @jayzern
      @jayzern  6 months ago

      Thanks man! Yea I'm working on live trainings too so stay tuned 🙌

  • @hiteshmohite7677
    @hiteshmohite7677 7 months ago

    Hey, I have a small request:
    can you please make a video on how to use PySpark efficiently on a low-spec system with a huge amount of data?

    • @jayzern
      @jayzern  6 months ago

      Low compute Spark + high volumes of data is challenging but will take note. Thx for the suggestion

  • @梁喬萍
    @梁喬萍 3 months ago

    love this! thanks for sharing this tutorial, very useful

  • @prabhatgupta6415
    @prabhatgupta6415 2 months ago

    Can you tell me why we used Airflow, since dbt Cloud has a feature to schedule jobs?

    • @jayzern
      @jayzern  2 months ago

      If your company only uses dbt and no other tooling, dbt Cloud works too.
      However, in the real world it's hard to control your CRON schedule when you have many tools in your stack. An orchestrator's job is to focus on scheduling. Linux philosophy of "do one thing, do one thing well", TLDR

  • @giovannimaia9652
    @giovannimaia9652 3 months ago

    Please post more videos, your videos are awesome and very instructive

  • @aiviet5497
    @aiviet5497 2 months ago

    I need a longer video. Please make one.

  • @albertcampillo
    @albertcampillo 1 month ago

    Hi @jayzern, thanks a lot for your video, really valuable content!

  • @popalex
    @popalex 6 months ago

    Great video.
    I would love to see complex ETL pipelines.

  • @sanaomar2182
    @sanaomar2182 4 months ago

    How did he start? Did he create a worksheet? I tried it but it did not work. The very first steps, what are they?

    • @Rajdeep6452
      @Rajdeep6452 4 months ago

      yes you need to write the queries in a worksheet

  • @oludelehalleluyah6723
    @oludelehalleluyah6723 4 months ago

    I have learnt a lot from this tutorial...
    Thank you

  • @maikerodrigo4249
    @maikerodrigo4249 2 months ago

    Great tutorial, I've been learning a lot, thanks!

  • @reneporto-ai
    @reneporto-ai 4 months ago

    WOW! That is an amazing tutorial, thanks a lot.

  • @OmerNadler
    @OmerNadler 6 months ago

    Do I need to pay for Astro if I want to use this for a prod env?

  • @neosmith009
    @neosmith009 6 months ago

    Overall great; the Airflow orchestration felt a bit clunky, especially given that the source code had to be kept in the same directory.

    • @jayzern
      @jayzern  6 months ago

      Thx for the feedback 👍 ideally should wrap this in a container image, but for simplicity decided to keep it as code

    • @neosmith009
      @neosmith009 5 months ago

      @@jayzern Makes sense, any good resources on self-hosting dbt Core?

  • @i-see-right-through-you
    @i-see-right-through-you 2 months ago

    well done! great tutorial!

  • @nadhasthirundhitan
    @nadhasthirundhitan 4 months ago

    excellent video, thank you

  • @StephenRayner
    @StephenRayner 2 months ago

    You should check out Meltano

    • @jayzern
      @jayzern  2 months ago

      I've heard great things about Meltano!

  • @JoseR-ui9vn
    @JoseR-ui9vn 3 months ago

    Thanks Jayzern

  • @pavankumard5276
    @pavankumard5276 6 months ago

    Need more content like this!!! Really amazing video. Just one suggestion I would like to make: before diving into the coding part, it would be better if you could provide a real-world scenario and reference it while writing your code. Thanks

    • @jayzern
      @jayzern  6 months ago +1

      Appreciate the feedback man 🙏 will try to incorporate more real-world context before and during the live coding part, that's a great idea

    • @pavankumard5276
      @pavankumard5276 6 months ago

      @@jayzern thanks a lot, waiting for some more tutorials😃

  • @jeevankumarkondasingu34
    @jeevankumarkondasingu34 4 months ago

    Nice explanation

  • @BishalKarki-pe8hs
    @BishalKarki-pe8hs 4 months ago

    100% worth it

  • @okkwok1753
    @okkwok1753 6 months ago

    I am not sure why I cannot open the notes; can anyone help?

    • @jayzern
      @jayzern  6 months ago +2

      I double checked the link and it's working, try this
      bittersweet-mall-f00.notion.site/Code-along-build-an-ELT-Pipeline-in-1-Hour-dbt-Snowflake-Airflow-cffab118a21b40b8acd3d595a4db7c15?pvs=74
      Let me know what error you see

  • @thetrangia1091
    @thetrangia1091 7 months ago

    thank you very much

  • @kirankumar1290
    @kirankumar1290 5 months ago

    Good

  • @prasadatluri
    @prasadatluri 7 months ago

    Great video Jay

    • @RohithPatelKanchukatla
      @RohithPatelKanchukatla 6 months ago

      Hii Mr. Prasad garu, are you a data engineer too?

    • @prasadatluri
      @prasadatluri 6 months ago +1

      @@RohithPatelKanchukatla Hi there. I am a Data Scientist

  • @MiguelMadrizGr8
    @MiguelMadrizGr8 5 months ago +14

    Are we in 2024? It all looks a lot like the old century... Coding is necessary, but come on, this is another story

    • @WheresTheLambSAAAAAAAUCE
      @WheresTheLambSAAAAAAAUCE 2 months ago +1

      What do you mean? There’s very little actual coding here, to be honest. Most of it is getting the different services and databases talking to each other and exchanging information. And then automating it. There are a lot of moving parts.

    • @George-s9s1u
      @George-s9s1u 1 month ago

      what do you mean?

    • @emilst7413
      @emilst7413 25 days ago

      wdym

    • @maddriven07
      @maddriven07 24 days ago

      Yeah, it's a lot just for building tables and populating them

  • @fun2badult
    @fun2badult 6 months ago

    Can you please post more videos like this? Really appreciate it. Helps me understand dbt/Snowflake/Airflow a lot

    • @jayzern
      @jayzern  6 months ago

      Yes sir am working on future videos right now!

  • @fun2badult
    @fun2badult 6 months ago

    Doing this for the second time and for some reason dbt is only creating views and not tables FML

    • @Rajdeep6452
      @Rajdeep6452 4 months ago

      Same. It's not creating tables, just views.

    • @duvanzapata6761
      @duvanzapata6761 4 months ago

      @@Rajdeep6452 Hi, I fixed it just by running dbt run again and also checking first that the dbt_project.yml says table for the marts

    • @Bolda92
      @Bolda92 2 months ago

      @@duvanzapata6761 Verified all of that, reran dbt run, and still I have only views. Any ideas?

    • @oyindamolavictor
      @oyindamolavictor 1 month ago

      I had the same error and realized the issue came from my dbt_project.yml: I had a typo; instead of data_marts I wrote data_mart. The name must be the same as the data_marts folder you created in the folder structure