We need a 60 hrs course for DataEngineer.
U make one
Check out the data engineering zoomcamp.
😂@@dijik123
@@StarLord-571
@@StarLord-571 it's not with the quantity but with the quality
We need a full data engineering course: Python + SQL + big data (Hadoop) + Apache Spark + Apache Airflow + Apache Kafka + AWS + a project.
If you're running into an error with "exit code 1" @1:28:55, you need to update the setup from @1:08:08: set "image: postgres:9.2" for both "source_postgres" and "destination_postgres".
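For clarity, the image: lines live in docker-compose.yaml rather than the Dockerfile. A sketch of the pin (service names taken from the thread; other comments here pin postgres:15 instead, so treat the exact tag as an assumption — what matters is that both sides match your pg_dump client):

```yaml
# docker-compose.yaml (excerpt) -- pin both databases to the same tag
services:
  source_postgres:
    image: postgres:9.2
  destination_postgres:
    image: postgres:9.2
```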
I think this was a good idea, but many details went wrong. I wish he'd be more specific about the versions of the software he's using next time.
Thanks, I wasted many hours on that error. I'd appreciate it if you could expand on the reasons why this error happens. Again, thanks for your contribution.
God bless you mate !
Thanks a lot for the point. I spent a bit of time trying to solve the issue.
bro, the execution of my code is still running, (after I edited the part of the code you mentioned), but I believe in you! Thank you so much wallah you are a savior! 🔥🔥
thanks mate
Please make a bigger course that covers everything from the basics: MySQL, Python/Java/Scala, Hadoop, Spark, PySpark, and data engineering end to end on one cloud.
yes!!
Yes, please. This would be so helpful but thanks for this resource, a great start.
YES Please
Yes
Yes please
Amazing!
Looking forward to it.
Also, it would be perfect to have a more comprehensive version of this course as well. Covering all batch and stream processing tools and methods.
The best DE course I've ever seen. Most courses only stick to the theory and never show the practical part.
At 1:53:56 you may face an error because the dbt service is launched before the elt_script service has completed.
To solve it, add condition: service_completed_successfully under the depends_on clause of the dbt service, so that it always launches after elt_script completes.
Thank you so much.
This did not work for me, and I can't progress any further. Frustrating.
Thanks man, I was struggling with this for a while
depends_on:
  elt_script:
    condition: service_completed_successfully
@@wusswuzz5818 did you end up figuring out the errors? dbt creates the views for the films, film_actors, and actors tables for me, but not for the film_ratings table, because the relation "public.films" does not exist.
in the docker-compose.yaml file, for anyone wondering (like me 1 min ago)
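Put together, the dbt service entry in docker-compose.yaml might look roughly like this (other keys elided; service names assumed from the video):

```yaml
services:
  dbt:
    # ...image, volumes, etc. as in the video...
    depends_on:
      elt_script:
        condition: service_completed_successfully  # start dbt only after the load finishes
```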
Hey , currently there is no set path to becoming a data engineer. So please create a proper certification with a clear roadmap of foundations and most used cloud tech in data engineering so that we can get some structure going for those interested in this career.
Good job, you are creating visibility for airbyte in a great way, by providing an evolutionary view of the stack that gets one to eventually need it. Hope they continue to support you making content using this approach.
Thanks for the course. To all those using VMs: make sure the files are located on the VM. I tried running the Docker exercise with Docker on the VM and the files on the host (Windows) and wasted a lot of time resolving errors; I finally moved all the files to the Ubuntu VM, where things ran smoothly.
More bi and DE courses like this plz
Finally a data engineering course!
I just started and love the style. You teach fluently and focus on the important takeaways. I've seen so much bullshit that I really expected to watch you install Docker for 10 minutes and had already started skipping, but you didn't show that part, which is nice. Makes perfect sense. Someone who can't RTFM and install Docker on their own shouldn't be focusing on DE at this point anyway, imho.
Since when is Justin a data engineer? Well, I guess the constant learning is real.
I've been following him since I started coding (almost 4 years now); he's always learning and growing.
@@briabytes which channel ?
@@briabytes Please share his YT channel.
@@JREQuickPods he codes and games on his Twitch channel, and he has a YouTube channel (you can search his name) where he shares his experiences: twitch.tv/justinbchau. I watched him make some of this course a few months back on Twitch.
his twitch is in the other comment
YES THANK YOU SO MUCH keep expanding this!!!!
Please add more data engineering course like this. I really love it.
1:59:58
The reason he encountered the error is that he wrote {% generate_ratings() %}.
To avoid the error, you should write {% macro generate_ratings() %}.
Please make this a series!
(Edit)
Suggestion 1: include an overview before each section of 1) the overarching pipeline and 2) where in the pipeline we are for that section. In other words, a map or diagram of what is happening would help with conceptual understanding.
Suggestion 2: explain the code line by line conceptually. The time spent writing the code on screen could be cut and replaced by explanation. This would save time.
We would like to see more content for Data Engineering, possibly a full course.
Yes! I’ve been waiting for this.
If you get "pg_dump: error: aborting because of server version mismatch":
I found that running "apt-get update && apt-get install -y postgresql-client" actually installs version 15, while "postgres:latest" pulls version 16.
To fix this, specify your image as "postgres:15" for both source_postgres and destination_postgres in docker-compose.yaml.
I also changed the RUN command in my Dockerfile to "apt-get update && apt-get install -y postgresql-client-15" to be explicit.
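A sketch of the compose side of that change (tags are assumptions; run pg_dump --version inside the elt container to confirm which client you actually have):

```yaml
# docker-compose.yaml (excerpt)
services:
  source_postgres:
    image: postgres:15   # match the postgresql-client-15 used by pg_dump
  destination_postgres:
    image: postgres:15
```

with the corresponding Dockerfile line being RUN apt-get update && apt-get install -y postgresql-client-15.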
Thank you. I spent 30 minutes on this error.
love you mate.
This course is not for beginners. He moves forward like hell
without explaining how to install anything; he moves fast, yet the video title says it's for beginners.
A piece of advice: it doesn't make sense to read and retype the code from your top monitor. Save your and the watchers' time: just copy-paste what's there and explain it line by line.
Well you should read the description
Thank you for this! Learned a lot today. 👍🏾
Long waiting for this course
Here I am, thinking about data engineering, and then boom! YouTube shows me a data engineering course 😅
You probably googled
Right off the bat, I'd just like to make the comment: please nix the background music or make it even quieter... it's distracting.
Please make a detailed tutorial on data engineer, about 20-30 hours full end to end course please it's a request 🙌
Another one: I had issues with host.docker.internal, so I added
extra_hosts: - "host.docker.internal:host-gateway" to the docker compose entry for dbt.
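For anyone who wants the exact shape of that snippet, it could look something like this (a sketch; the dbt service name is assumed from the video):

```yaml
# docker-compose.yaml (excerpt)
services:
  dbt:
    extra_hosts:
      - "host.docker.internal:host-gateway"  # resolve the host machine from inside the container
```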
Thank you! I ran into all the missing configuration/typos Justin had. But for this one, his GitHub YAML file didn't have the line, so it was difficult to find the root cause. Thanks for sharing.
@@paolaprieto8111 heya could you share the exact code you wrote, I can't seem to get it working
Thank you for this. I like the way you teach it step-by-step and I always got lost with JOINS. Haha anyways thanks for this! Keep it up
My man...Justin!
You're a top crossfitter and I miss working out under your guidance
The Airbyte section is complicated and badly explained.
It looks to me like some part is missing from the recording, e.g. how was Airbyte started?
After this video I don't see a significant advantage to using Airbyte over the elt_script from the video example.
Yes! Thanks!
Finalllyyyy an data project
I like to see more of your content, good explanation skills
Thank you for the amazing video. Could you please make a second video covering data engineering projects?
best video on the internet
The setup for these services seems like A LOT of work, and because of that things can go wrong in production, not to mention handover for something like this... what alternatives are there for this type of ELT pipeline?
Finalllyyyy an data project. What are the learning pre-requisites for this course?.
I really need a longer video.
Error:
Database Error in model actors (models/example/actors.sql)
  relation "public.actors" does not exist
===> Solution:
In docker-compose.yaml, add the code below under the dbt service:
depends_on:
  elt_script:
    condition: service_completed_successfully
A hero, but why does it work for most people out of the box?
I did this, and dbt runs for the films, film_actors, and actors tables, but when trying to create the view for the film_ratings table it still says that relation "public.films" does not exist. I've been frustratingly stuck on this for days; I even tried rebuilding the project. Not sure what else to do — any thoughts on what I can try next?
Can you pls do a complete end to end video on AWS or GCP data engineering data ingestion, etl, analytics using pyspark
I came across another issue while running the data pipeline code: it gave me a version error between pg_dump and PostgreSQL. I changed the version from latest to 15.5 to match the pg_dump version... hope this helps in case anyone faces issues in this module.
Hero
Please make more video about Data Engineering
In the "Building a Data pipeline from Scratch" section, when I am running the containers by docker compose up, I get the following error:
elt-elt_script-1 | pg_dump: error: aborting because of server version mismatch
elt-elt_script-1 | pg_dump: detail: server version: 16.1 (Debian 16.1-1.pgdg120+1); pg_dump version: 15.5 (Debian 15.5-0+deb12u1)
I have installed the latest version of postgres on my machine i.e. 16.1. I have also removed the images and volumes and rerun the docker compose up command, still I get the above error. Can someone please help? Thanks!
I ran into this same problem. I fixed it by changing the docker-compose.yaml. Instead of image: postgres:latest under the source_postgres and destination_postgres, I wrote image: postgres:15 in both of these places. This way when the code is run, it installed the same version of PostgreSQL in both the source and destination.
@@oddlang687 It works, thank you very much. It took me two days to find a way to fix that problem.
Thanks man!
@@oddlang687 Appreciate this comment, fixed for me. Thank you!
please we need courses about big data , hadoop and sparks with practical projects
Do you need to know maths for data engineering if so what types of
ok, this is super helpful.
Thank's for the video
Please make video on performance testing using jmeter
Okay, major question. As you are adding in new technologies like Airflow in the docker file, where is your console log or terminal to let you know if there’s any syntax errors etc.? For example, with React, Django, Flutter, I always have the app running on local host and ALWAYS have that window open to see if there are any errors in the error log as I am updating files.
How do you do that with this workflow to make sure you’re not making mistakes while you’re writing code?
please come up with big data course
@freeCodeCamp please put more content on data engineering ;)
it was alright, I think adding a bit more discipline into the course would be an amazing upgrade
Thank you
thank you very much
Love it!
Thanks!
This is excellent content
Keep up the good work!
Quallity content as always
Real GEM
This channel is gold 🪙❤
I love it. But let's say I need to build an email classification system that, after classifying emails, distributes them and sends them to their classified department's server.
Would you use a task queue like Celery, or a broker like Kafka or RabbitMQ?
Am I stupid, or was the first part about Docker a bit incomprehensible?
Do full MySQL course
How do i learn all of these? What course should i take? Please help
Can you make data engineering boot camp.
Awesome!
I need a comprehensive video. Please !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!. I'm looking for that.
My boy, Chau!
Nice one
The intro and description don't match the video - there is no Spark or Kafka anywhere in the video.
You didn't solve the column level macro bug @ 2:00:03. That's not the attitude mate.
Actually, he forgot the macro keyword when he was defining the macro in the ratings_macro.sql file:
{% macro generate_ratings() %}
CASE
    WHEN user_rating >= 4.5 THEN 'Excellent'
    WHEN user_rating >= 4.0 THEN 'Good'
    WHEN user_rating >= 3.0 THEN 'Average'
    ELSE 'Poor'
END AS rating_category
{% endmacro %}
I ran into the error below trying to run my container — can you please advise me on how to resolve it? All my files are in one directory. 2024-05-03 15:34:42 Node.js v18.20.2
2024-05-03 15:34:45 node:internal/modules/cjs/loader:1143
2024-05-03 15:34:45 throw err;
2024-05-03 15:34:45 ^
2024-05-03 15:34:45
2024-05-03 15:34:45 Error: Cannot find module '/app/src/index.js'
2024-05-03 15:34:45 at Module._resolveFilename (node:internal/modules/cjs/loader:1140:15)
2024-05-03 15:34:45 at Module._load (node:internal/modules/cjs/loader:981:27)
2024-05-03 15:34:45 at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:128:12)
2024-05-03 15:34:45 at node:internal/main/run_main_module:28:49 {
2024-05-03 15:34:45 code: 'MODULE_NOT_FOUND',
2024-05-03 15:34:45 requireStack: []
2024-05-03 15:34:45 }
2024-05-03 15:34:45
2024-05-03 15:34:45 Node.js v18.20.2
Excellent .
Finally!
I had an issue with a postgres version mismatch. I changed the version of the postgres server to match:
services:
  source_postgres:
    image: postgres:15.5
and the same for destination_postgres.
Hope this helps.
thank you man! Great help
I am stuck at installing Docker; it "can't run on my PC" (Windows 10).
At 2:52:34 how was airbyte started ??
try looking into course resources > airbyte
Why is dbt using the port 5434 to communicate with the destination database? Both containers are running in the same docker network so why do we need to use the exposed port number?
It doesn't need that port. That port is published so that if you, the user, want to take a peek at the database, you can run a DB manager against it from the host; otherwise, to peek at the database you would have to exec -it into the db container and look at the tables there.
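Concretely, inside the compose network a dbt profiles.yml could target the service name and the container-internal port rather than the published one. This is only a sketch — the host, credentials, and database names below are assumptions, not the video's exact values:

```yaml
# ~/.dbt/profiles.yml (sketch)
custom_postgres:
  target: dev
  outputs:
    dev:
      type: postgres
      host: destination_postgres  # the compose service name resolves on the shared network
      port: 5432                  # container-internal port; 5434 is only the published host port
      user: postgres              # assumption -- use your own credentials
      password: postgres          # assumption
      dbname: destination_db      # assumption
      schema: public
```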
Might be great, but source code is not working at all (starts with main branch, and keeps going).
Is spark and kafka actually covered here as mentioned in the description?
nope
Here's your data model. People/Places/Pennies/Product involved in a Process.
The semi-transparent terminal is a very bad idea for a video; it unnecessarily makes the text harder to see.
i get a runtime error thrown by dbt-1:
Could not find profile named "custom_postgres"
...
I've already checked my dbt_project.yml file and my profiles file in the .dbt directory, and everything checks out. I am not sure what else to do.
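One thing worth double-checking for this error: the profile: value in dbt_project.yml has to match the top-level key in profiles.yml character for character, and dbt has to actually be reading the profiles.yml you think it is (passing --profiles-dir explicitly rules out a path issue). A minimal matching pair, with assumed names and connection details elided:

```yaml
# dbt_project.yml (excerpt)
profile: custom_postgres

# profiles.yml -- the top-level key must match the profile name above
custom_postgres:
  target: dev
  outputs:
    dev:
      type: postgres
      # ...host, port, user, password, dbname, schema as in the video...
```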
What are the learning pre-requisites for this course?
brain
This guy could barely code in JavaScript like a year ago, so not too much, I guess.
@@AndrewHuange thanx for answering my query👍
Can anyone suggest some youtube channel which provide all the topics like SQL python spark pyspark azure ???
I'm curious what tools besides Airbyte can be used? (Airbyte is the presenter's employer — FYI, this is a marketing channel for Airbyte.)
Not even on the Gartner Quadrant. And Airbyte is freemium: only 14 days of free use. And as you might have guessed, it's greed-priced, errr, usage-priced; the lowest tier (10 GB and only 4 rows to replicate per month) is $160.00. That is hugely expensive. I think I'm going to skip this advertisement.
@@markk364 you can run airbyte for free using the OSS version.
1:40:58 I can't find the schema.yml file in the course GitHub. Had to copy everything from the video :/
I ran into the same issue; it's in a different branch.
Hi Justin, I have a problem loading data into Postgres from the SQL scripts. The error I encounter is "unf_inv_name expected after this token". May I get some advice, please?
1:14:49
What happened with the macro at 1:59:56? You got an error and then just jumped to another topic :D
LOL. Chau just skipped debugging it 🤣
He actually just forgot the macro keyword. It should be {% macro generate_ratings() %} in the rating_macro.sql file
Hi! Thanks for the tutorial. Where can I find those YML files? They're not in the GitHub repo.
I mean the ones from the dbt part.
Any Pre-requisites required ?
definitely not for beginners but good .
Sir please make the audio track available
I am getting an issue with docker compose up
'dbt-1 | relation "public.actors" does not exist'
seems like dbt runs first and then the elt_script, even though I have the same docker compose file as the video shows.
fixed it by doing this on my dbt service
depends_on:
elt_script:
condition: service_completed_successfully
@@josesalazar2384 thank you for sharing this! I ran into the same problem and was looking forever for a solution. This worked great for me
thanks a lot
@@josesalazar2384 I was having the same issue, but with public.films does not exist. I went into the compose file and added the condition, but that did not solve it. I looked around, and it turns out that in sources.yml I had the table source for films set to "films.sql";
we don't need to add a .sql suffix.
Leaving this here in case someone has the same issue as me.
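In case it helps, a sources.yml sketch with plain table names — no .sql suffix (source and table names here are assumptions based on the video):

```yaml
# models/example/sources.yml (sketch)
version: 2

sources:
  - name: destination_db
    schema: public
    tables:
      - name: films        # table name only -- not "films.sql"
      - name: film_actors
      - name: actors
```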
I want to be Data Engineering Assistant 😀
Does the alpine container run the same as the ubuntu container? I presume it does, but I want to know whether it changes the scope of the project.