The amount of effort and dedication you have put into this video is tremendous. Great work!!
Thank you so much 😀
If you would like to go deeper into Snowflake, you can also check out my Udemy content.
My current three courses are available at a discounted price:
www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=NEW-YEAR-2024
www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=NEW-YEAR-2024
www.udemy.com/course/snowflake-dynamic-table-masterclass-e2e-data-pipeline/?couponCode=B1E84B2CB4AA82CB95E3
This is incredible and passionate work Sir
God bless you abundantly
Most awaited video..Thank you so much Sir !!
Most welcome..
And yes, I know many of us are not fully aware of the Snowpark Python API. If you want to manage Snowflake more programmatically, you can watch my paid content (data + code available); many folks don't know the power of Snowpark, and these two courses will help you broaden your knowledge.
They are available at a discounted price for a limited time (one for JSON and one for CSV); the code can automatically create the DDL and DML and also run the COPY command.
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=DIWALI50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=DIPAWALI35
Amazing stuff!!! May I know what JSON visualizer you're using in this video, please?
Are the resources you used free, or do they require a subscription?
Sir, I have tried to build the project alongside you. Everything went fine until I got to setting primary and foreign keys in Snowflake. Please update the Medium blog. Ty.
Let me check
Awesome!! Can we get the exact files you used for this project? I tried to get them from the respective Git repo, but I found only the 2023 World Cup JSON files. Please guide me!
Hello @TechLearner-r4z, were you able to get the 8 JSON files that were loaded?
Hey, thanks for this nice hands-on project. Just wanted to ask which tool you use to visualize the JSON?
Drop me a note on my Instagram.
Sir, please provide the data files as well, so that we can build the project along with you.
Hi, I didn't get all the files from the GitLab or Medium source links. Could you help me?
What tool are you using to visualize the data as nodes?
Can we use this project on our resume?
From where can I get the cricket data that you used? Please at least provide the data, sir.
Why did you need to connect to Dbeaver to see the PK/FK relationship? Could we not do that in Snowflake?
Hi, will you do a detailed video on generative AI and LLMs with Snowflake, or can you suggest any resources for that? It would be helpful for a lot of members.
This video helps a lot, but I don't see the 8 JSON files in GitLab. Can you please provide them?
Hi,
At the 41-minute mark, how do you get the result attribute in the case statement
when info:outcome.result = 'tie' then 'Tie'
when the data only has
"outcome": {
  "winner": "Sri Lanka",
  "by": {
    "wickets": 8
  }
}
We have only the two attributes 'winner' and 'by' in the outcome object.
If you try it yourself, you will understand how it works, but your question is not super clear to me. All the code is available on my Medium page, so you can download it and try it out.
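For context on the question above: in cricsheet-style match JSON (which this data appears to follow), outcome carries a result attribute only for tied or no-result games; decided games carry winner and by instead, and Snowflake returns NULL for a missing path, so the 'tie' branch of the CASE simply never matches a decided game. A minimal Python sketch of that logic (function name illustrative, assuming the cricsheet schema):

```python
import json

# Two outcome shapes from cricsheet-style match JSON:
decided = json.loads('{"outcome": {"winner": "Sri Lanka", "by": {"wickets": 8}}}')
tied = json.loads('{"outcome": {"result": "tie"}}')

def match_result(doc):
    # Mirror the SQL CASE: a missing path behaves like NULL (None here),
    # so decided games fall through the 'tie' branch to the winner.
    outcome = doc.get("outcome", {})
    if outcome.get("result") == "tie":
        return "Tie"
    return outcome.get("winner")
```

Calling match_result(decided) gives the winner, while match_result(tied) gives "Tie", just as the two CASE branches would.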
Couldn't find the first 6 JSON files to be loaded within the landing schema.
Can anyone please help me with the files so I can proceed further with the project?
Great video
Glad you enjoyed it
and yes, I know many of us are not fully aware of snowpark Python API, if you want to manage snowflake more programatically.. you can watch my paid contents (data + code available) .. many folks don't know the power of snowpark... these 2 videos... will help you to broaden your knowledge..
These contents are available in discounted price for limited time.. (one for JSON and one for CSV).. it can automatically create DDL and DML and also run copy command...
1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=DIWALI50
2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=DIPAWALI35
What is the tool used at 1:12:24?
Found it: DBeaver.
good.
Do you provide any snowflake online trainings?
Planning to do it soon, but for now, no. Please join my Facebook group so that if I do, you will come to know about it.
Do we have to code in Snowflake?
To interact with Snowflake, you can either use standard ANSI SQL or write programs using the Python, Java, or Scala APIs.
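On the programmatic route mentioned above: Snowflake's Python connector follows the standard DB-API 2.0 cursor pattern. The sketch below uses sqlite3 as a stand-in so it runs without a Snowflake account; with Snowflake you would instead obtain the connection from snowflake.connector.connect(...) with your account credentials, and the cursor calls keep the same shape.

```python
import sqlite3

# DB-API 2.0 shape shared by snowflake.connector; sqlite3 stands in here
# so the sketch is runnable anywhere.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE matches (team TEXT, wins INTEGER)")
cur.execute("INSERT INTO matches VALUES (?, ?)", ("Sri Lanka", 1))
cur.execute("SELECT team, wins FROM matches")
rows = cur.fetchall()
conn.close()
```

The same connect/cursor/execute/fetch sequence carries over; only the connect call and the SQL dialect change when you point it at Snowflake.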
More projects like this please, using more features.
Thanks for your note.
I assume you have already seen the other end-to-end project using Snowpark:
ruclips.net/video/1jC98XQwBZw/видео.html
this is super helpful
Excellent 👌👌
Thanks a lot
Where do we get the data set?
Check the video description.
Thank you so much for the video
You are so welcome!
Please try the following link if you want access to the content.
✏ medium.com/@data-engineering-simplified/8f8e4f0fd1d0
Alternatively, you can join my Facebook group; the 200-ODI data set is already published there.
facebook.com/groups/627874916138090/?mibextid=c7yyfP
I am stuck with multilevel and different attribute json to snowflake conversion using snowpark
Unless you share more detail, it is hard to help.
You can also watch the complete JSON playlist and see if that helps.
ruclips.net/p/PLba2xJ7yxHB6ybgtaIsTKmmF2Nl2wAe2S
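On flattening multi-level JSON with varying attributes: without more detail it's hard to be specific, but the general idea — turning nested objects into dotted-path columns, which LATERAL FLATTEN does row-wise in Snowflake — can be sketched in pure Python (helper name illustrative):

```python
def flatten_json(obj, prefix=""):
    # Recursively flatten nested dicts/lists into dotted-path keys,
    # so documents with differing attributes map onto one column set.
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten_json(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten_json(value, f"{prefix}{i}."))
    else:
        flat[prefix[:-1]] = obj
    return flat

sample = {"outcome": {"winner": "Sri Lanka", "by": {"wickets": 8}}}
```

Here flatten_json(sample) yields keys like "outcome.winner" and "outcome.by.wickets"; taking the union of keys across documents gives you a stable column list even when attributes differ per file.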
Hi,
Thanks for the detailed explanation. I have a query:
After loading data from the stage into the raw table, don't we need to move or clean the files from the stage location? The COPY command only keeps 64 days of load metadata, so after 64 days, if the old files are still there, will they be reprocessed again?
Can we think of a mechanism to archive the processed files from the stage location?
I have never tried this 64-day behaviour myself, and there is a parameter to control it. But thanks for the note; I will check whether data older than 64 days gets re-loaded or not.
Thanks for the video
You are so welcome!
Please try the following link if you want access to the SQL scripts:
✏ medium.com/@data-engineering-simplified/8f8e4f0fd1d0
Alternatively, you can join my Facebook group; the 200-ODI data set is already published there.
facebook.com/groups/627874916138090/?mibextid=c7yyfP
Where are the JSON files?
Your video won't be complete unless you provide the data files for us to use, so we can build the project with you.
Can we get the sample JSON files, please?
Please try the following link:
✏ medium.com/@data-engineering-simplified/8f8e4f0fd1d0
Alternatively, you can join my Facebook group; the 200-ODI data set is already published there.
facebook.com/groups/627874916138090/?mibextid=c7yyfP
@DataEngineering Thank you so much 😊
What is the name of the visual graph program? ruclips.net/video/qDmqE89DSQQ/видео.html
Hello sir, could you share your LinkedIn ID?
For any queries, you can reach out to me via Instagram or my Facebook page.