I've made it to the third video of the series and I'm quite enjoying it. It's the perfect balance between a quick start and just enough to get you going. Thanks for creating this. You got a new sub.
Dear Friend - Thank you so much for your kind words!
To stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
If you could share our video/channel with your friends, we would be so grateful!
Once again thank you for your encouragement.
@SleekData I'm having an issue, please help me out. You didn't show how and where you wrote the clean_data and __init__ Python files.
Broken DAG: [/opt/airflow/dags/exchange_rate_pipeline.py]
Traceback (most recent call last):
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "/opt/airflow/dags/exchange_rate_pipeline.py", line 7, in <module>
from clean_data import clean_data
ModuleNotFoundError: No module named 'clean_data'
The best Airflow course
Dear Friend -
Thank you for your kind words! We're delighted to hear that you're enjoying our content. To stay updated with our future videos, consider subscribing and sharing with your friends. Your support means a lot to us. Thanks again for your encouragement!
Every time I restart the Airflow container and access localhost:8080, I receive the error "localhost didn't send any data." The issue is only resolved when I delete all previously created data and build a new container. Is there a way to retain this data across container restarts?
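Dear Friend - the usual fix is to persist the Airflow metadata database outside the container, so restarting does not wipe it. A minimal docker-compose sketch, assuming a Postgres-backed setup (the service and volume names here are illustrative, not taken from the video):

```yaml
services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      # A named volume survives container restarts and `docker compose down`
      - postgres-db-volume:/var/lib/postgresql/data

volumes:
  postgres-db-volume:
```

With a named volume in place, stopping and recreating the containers keeps your users, connections, and DAG run history.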
Hi, I'm having the same "module not found" error.
Can you please explain the error? I have also tried
import sys
sys.path.append('/path/to/your/module')
from clean_data import clean_data
but it still gives me the same error.
Is it possible to create a custom airflow provider? Great video btw!🎉
If someone like me is stuck with the exchange-rate tasks showing "success" while nothing actually happens: remove the "end_date" param in dag = DAG(...), because it's preventing the DAG from running at all.
dag = DAG(
    'exchange_rate_etl',
    start_date=datetime(2023, 10, 1),
    # end_date=datetime(2023, 12, 31),
    schedule_interval='0 22 * * *',
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False
)
I also have a problem with sending an email, but that's not a super important part of the video, so that's fine.
Dear Friend - Thank you so much for your feedback!
To stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
If you could share our video/channel with your friends, we would be so grateful!
Once again thank you for your encouragement.
Thanks for helping out mate
Hello, I'm stuck here:
Broken DAG: [/opt/airflow/dags/exchange_rate_pipeline.py]
Traceback (most recent call last):
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "/opt/airflow/dags/exchange_rate_pipeline.py", line 7, in <module>
from clean_data import clean_data
ModuleNotFoundError: No module named 'clean_data'
@saifilicious1749 Hey, you can just add "sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'plugins'))" (with os and sys imported) before importing clean_data.
I am getting an error on the Docker site telling me that the clean_data module in exchange_rate_pipeline is not defined, even after creating the clean_data file in the plugins folder.
I have solved it. I placed the clean_data file in the same folder as exchange_rate_pipeline. Thanks.
Dear Friend - Thank you for your kind feedback! To stay updated with our future content, I kindly request you to consider subscribing and liking our videos. Your support is valuable to us. If you ever need technical support, please feel free to leave a comment on any of our videos, and we'll be sure to respond promptly or create a video addressing the issue.
I have an issue: when the exchange_rate_pipeline is included as a 2nd DAG after refreshing Airflow, it shows up in the UI, but when I trigger the DAG it doesn't start the process, and when I check the first DAG "welcome_dag", it also doesn't start. What am I missing?
Btw, I only changed airflow.cfg as you mentioned: "smtp_host = sleek-smtp" and "smtp_starttls = False"
Dude - please try following the video side by side by opening it on another device, maybe your mobile. Good luck.
Dude, it's probably because the DAG's end_date parameter is earlier than today.
Update it to today or later.
Is there a way to install packages at runtime?
Or is adding them to requirements and re-building the image required each time?
Bro - You can install them in the Docker terminal (you may need to switch to root), but that will go away once you restart Docker.
Very helpful videos! Thank you!!
Dear Friend - Thank you so much for your kind words!
To stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
If you could share our video/channel with your friends, we would be so grateful!
Once again thank you for your encouragement.
I notice there are wavy lines under "from airflow.utils.dates import days_ago" and other "from airflow ... import ..." lines. How do I remove such warnings? Thanks
Dude - pip install apache-airflow
If it's still not working, please close and reopen your editor.
Good luck, To stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
If you could share our video/channel with your friends, we would be so grateful!
How did you create the plugins folder? I don't have it by default.
Dear Friend - please refer to the line below in docker-compose; it simply mounts the entire airflow folder on your local machine as /opt/airflow in the Airflow Docker instance.
./airflow:/opt/airflow
With this, you can manually create a plugins folder in your local airflow folder, then place whatever .py files are needed (don't worry about the _*_ files created automatically) and restart Docker; everything should work as expected.
Good luck; if this was helpful, please don't forget to like and subscribe.
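To make that mount concrete, here is a sketch of the docker-compose excerpt and the folder layout it implies (the service name and folder names follow the standard Airflow convention; adjust to your own compose file):

```yaml
# docker-compose.yml (excerpt)
services:
  airflow:
    volumes:
      # Everything under ./airflow on the host appears at /opt/airflow in the container:
      #   ./airflow/dags/exchange_rate_pipeline.py -> /opt/airflow/dags/exchange_rate_pipeline.py
      #   ./airflow/plugins/clean_data.py          -> /opt/airflow/plugins/clean_data.py
      - ./airflow:/opt/airflow
```

Any .py file dropped into ./airflow/plugins on the host is immediately visible inside the container; restarting Docker then lets Airflow pick it up.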
I have the same issue, can you please help me?
While clicking on the localhost link 8080:8080, I am encountering the error: "This page isn't working. localhost didn't send any data.
ERR_EMPTY_RESPONSE." Please help!!
Dear Friend
Please double-check the Docker Compose file. You should expose port 8080 from the container to the host. You can refer to the previous video in the playlist.
Try all possibilities:
127.0.0.1:8080
localhost:8080
Also, please open the command prompt, type ipconfig, and hit enter. You will get your IPv4 Address. For example, if it is 192.168.56.1, then please try that IP address as well:
192.168.56.1:8080
Good luck, to stay updated with our future content, I kindly request you to consider subscribing and liking our videos.
On the git_repo_dag, the list_repo_tags task fails in the graph. Please give some tips to resolve it.
Dude - please post the error message as a separate comment (I don't get notifications for replies on existing comments).
I don't fully understand your tutorial because of your code document (the clean_data def). Also, when implementing git_repo_dag.py, I got an error: "airflow.exceptions.AirflowException: GitHub operator error: The conn_id `github_default` isn't defined". A short video may be good, but without enough knowledge and explanation it falls short.
Dear friend - Thanks for the feedback.
I will add the clean_data code to the code link shortly.
As for your other issue, please try creating a default GitHub connection - refer to the video below to create and understand connections:
ruclips.net/video/-fjAchpM4mc/видео.html
@SleekData Thanks for your response! Hope you will give us more awesome videos.
I'm also facing the same issue; how do I fix it?
Hi Team. Your full code link forwards me to the same YouTube video link every time.
Dude - you have to scroll down. Good luck.
sleek-data.blogspot.com/2023/10/airflow-dags-operators-tasks-providers.html
Localhost error
Dear Friend
Please double-check the Docker Compose file. You should expose port 8080 from the container to the host. You can refer to the previous video in the playlist.
Try all possibilities:
127.0.0.1:8080
localhost:8080
Also, please open the command prompt, type ipconfig, and hit enter. You will get your IPv4 Address. For example, if it is 192.168.56.1, then please try that IP address as well:
192.168.56.1:8080
Good Luck, please like and subscribe.
When I run the Dockerfile and docker-compose.yml, I get this error: => ERROR [3/4] COPY requirements.txt /tmp/requirements.txt
dockerfile:12
--------------------
10 |
11 | # Install provider packages from requirements.txt
12 | >>> COPY requirements.txt /tmp/requirements.txt
13 | RUN pip install -r /tmp/requirements.txt
--------------------
ERROR: failed to solve: failed to compute cache key: failed to calculate checksum of ref 9rhhdgnjglwa0lgbm0w5dn2kk::ls3a3ce00oaooi9d7m6269o9t: "/requirements.txt": not found 0.0s
How can I fix this issue? Thank you!
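Dear Friend - that "not found" error at the COPY step usually means requirements.txt is not inside the Docker build context, i.e. the folder you run `docker build` from (or the `context:` set in docker-compose). A sketch of the expected layout, assuming Dockerfile and requirements.txt sit side by side (the base image tag is illustrative, not necessarily the one from the video):

```dockerfile
# Expected project layout for this Dockerfile:
#   ./Dockerfile
#   ./requirements.txt
FROM apache/airflow:2.7.3
# This COPY only works if requirements.txt exists inside the build context
COPY requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt
```

Also check that the filename is spelled exactly requirements.txt and that it is not excluded by a .dockerignore file.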
Someone help me, I can't run the exchange rate pipeline:
Broken DAG: [/opt/airflow/dags/exchange_rate_pipeline.py]
Traceback (most recent call last):
File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
File "/opt/airflow/dags/exchange_rate_pipeline.py", line 7, in <module>
from clean_data import clean_data
ModuleNotFoundError: No module named 'clean_data'
Hello, did you find a solution?
Dude - The error indicates that the clean_data module was not found. This happens if the module is not on the Python path or if there is a typo in the import statement. Here are steps to troubleshoot and resolve the issue:
1. Check the module location.
2. Verify the import statement:
from subdirectory.clean_data import clean_data
or please try:
import sys
sys.path.append('/path/to/your/module')
from clean_data import clean_data
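To see the sys.path technique in isolation, here is a small self-contained sketch; the clean_data body is a placeholder for demonstration, not the function from the video:

```python
import sys
import tempfile
from pathlib import Path

# Simulate the Airflow layout: a "plugins" folder holding clean_data.py.
# The module body below is a stand-in, NOT the clean_data from the video.
plugins_dir = Path(tempfile.mkdtemp()) / "plugins"
plugins_dir.mkdir()
(plugins_dir / "clean_data.py").write_text(
    "def clean_data():\n"
    "    return 'cleaned'\n"
)

# The actual fix: put the folder that CONTAINS the module on sys.path
# before importing it. In a DAG file this would typically be
# sys.path.insert(0, '/opt/airflow/plugins').
sys.path.insert(0, str(plugins_dir))

from clean_data import clean_data  # resolves now

print(clean_data())  # -> cleaned
```

The same idea is why placing clean_data.py next to the DAG file also works: Airflow adds the dags folder itself to sys.path when parsing DAG files.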