Finally found a great Snowflake content YouTube channel ❤
You are great Sir. This is really really helpful. After going through your videos I don't find any need to buy 20-30k course. Thank you 😊
What an explanation, sir! We are able to understand and remember each and every point very easily, and we can never forget it after listening to your sessions.
Thank you
Very good, brother!! Cheers
It helped me a lot to clear the concept!!
Great efforts, sir. These are the best Snowflake videos with practicals I have seen on YouTube. Keep doing it, sir 😇
Excellent Session
Wonderful course coverage.👍💐
Thank you for sharing the info.
Hello Sir,
Thank you very much for the great content...You are an exceptional teacher.
Could you please make a video on Snowpipe Optimization?
Please it's a request🙏
Is the target table always truncate-and-load? Where can we set/see the target table properties like insert/update?
Based on your requirement and design approach, you have to decide whether it is truncate-and-load or incremental. Insert/update are not properties of a table.
Hi, if we want to load multiple files into multiple tables from the same folder, do we need to create multiple SQS queues (one SQS queue per pipe)?
Not required; pipes on the same bucket share the same notification channel, so one SQS queue is enough. Use the same stage with a different file pattern for each pipe.
Excellent bro❤
It was very clear.. Thank you so much..
Is there a similar video on Snowpipe where you have used Azure instead of AWS?
I haven't prepared that, brother.
Hi, great content! Appreciate your efforts towards putting these explanations.
I have a doubt: what if a file is loaded into S3 with the same name and the same set of data, but with some changes in the values of some records? How do we deal with this scenario? Is there a way to upsert data instead of inserting it every time with Snowpipe?
If you want to load the same file again, you can use the FORCE = TRUE option in your COPY command; watch the Copy Options video for more details. And we don't do upserts while loading data into stage tables; we do those transformations while loading data into the main target/integration/base tables.
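As a minimal sketch of that reload (the stage and file-format names are taken from the pipe example later in this thread; the filename is made up for illustration):

```sql
-- Reload a file even though load history says it was already processed.
-- FORCE = TRUE bypasses the load-metadata check, so the rows are
-- inserted again (duplicates are possible).
COPY INTO mydb.my_schema.customers
FROM @mydb.my_schema.my_ext_stage/pipe/
FILES = ('customers_2024.csv')   -- limit the reload to the one file
FILE_FORMAT = (FORMAT_NAME = 'MY_CSV_FORMAT_WITH_OUT_HEADER')
FORCE = TRUE;
```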
Hi sir, can you suggest how we can create multiple files from one file based on a lookup table? The files should be in the same format and with the same delimiter. File size is 60 MB.
Which lookup table? Where do you want to create these files? There is no clarity on the requirement.
Thanks for this valid information
Are Snowflake stage tables truncate-and-load by default?
nope
You are doing a great job. Could you please do a video on real-time issues?
Will do once all the concepts are completed.
Thank you
If we take your course on Udemy, have you explained any project in it?
no
Is there a difference between SNS and SQS? Somewhere you mentioned SNS, but in the demo you used SQS.
Amazon SNS (Simple Notification Service) and Amazon SQS (Simple Queue Service) are both messaging services offered by AWS. The main difference is the delivery model: SNS is push-based publish/subscribe, where a topic pushes each message to all its subscribers, while SQS is pull-based queuing, where consumers poll the queue and each message is processed by a single consumer.
How do you control duplicate datasets being loaded into Target tables?
Mostly, while loading data into target tables, we use MERGE queries with DISTINCT on the source table; this makes sure there are no duplicates.
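A hedged sketch of that pattern (the table and column names are assumptions for illustration, not from the video):

```sql
-- De-duplicate the stage table first, then upsert into the target,
-- so reloaded or repeated rows never produce duplicates downstream.
MERGE INTO target_customers t
USING (
    SELECT DISTINCT customer_id, name, city   -- drop exact duplicate rows
    FROM stage_customers
) s
ON t.customer_id = s.customer_id
WHEN MATCHED THEN
    UPDATE SET t.name = s.name, t.city = s.city
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, city) VALUES (s.customer_id, s.name, s.city);
```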
Hi, thanks for the detailed explanation. However, I have a question: if I have a file that's already processed, and again I get a file where the data is different but the filename remains the same, how do I handle such an issue, also keeping duplicates in mind?
If you want to reprocess the file with the same name, use the FORCE = TRUE option in the COPY command. If you are concerned about duplicates with the FORCE option, then instead of FORCE, clear the processed file from the copy history and reprocess it. But in general, in real time, the stage tables are mostly truncate-and-load, and from stage to target we use MERGE queries, so the duplicates issue won't come up.
Awesome.
Thanks sir 🎉
When I am trying to create a Snowpipe, it is giving an error: object does not exist or operation cannot be performed.
That means the Snowflake-AWS integration didn't happen correctly. Please watch the videos and retry.
Thank you bro, I did it successfully.
If there is any error while loading data into tables using Snowpipe after moving to production, how can we stop the Snowpipe?
Watch the other video, Troubleshooting Snowpipe; I have explained it there.
Ok thank you
Sir, I have data in SAS. I want to continuously ingest data from SAS to Snowflake. Does Snowpipe support it?
No, Snowpipe supports ingestion only from storage on the three clouds (Azure, AWS, and GCP).
Sir, how can we know the name of the Snowpipe while loading the data?
But why do you need that?
Hello sir, do you provide a full course on Snowflake?
My full course is available in Udemy.
My Snowflake Udemy Course:
www.udemy.com/course/snowflake-complete-course-for-clearing-interviews/?couponCode=C2271E74086DBBD928D9
Use the coupon below to get it for 499 rupees.
Coupon: C2271E74086DBBD928D9
I'm getting an error while running LIST @stage: SQL execution error, AWS assume role. How do I rectify it?
With one line of error, how can I help you? Reach me at jana.snowflake2@gmail.com.
Hi Sir. My Snowflake account is hosted on Azure. I am trying to load data from an S3 bucket to Snowflake using Snowpipe, but I am getting an error: "090040 (XX000): Pipe Notifications bind failure "Cross cloud integration is not supported for pipe creation in AZURE using a stage in AWS."". Is there any way to solve this?
Hi Sir, I am getting an error at the LIST @stage step: "Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)"
That means your Snowflake account was not properly integrated with your external storage.
Now the issue is resolved, but the data is not loaded into the table. The pipe is in running status.
Working fine now. Thanks for your video, very helpful!!
Can you explain how to create an event notification in Azure?
I have already posted the Azure integration video; please check my playlist.
Can we load multiple tables by using a single pipe?
A pipe contains a single COPY statement, so one pipe loads data into one table only. To load different files into different tables, create one pipe per table; the pipes can share the same external stage, using different folders or file patterns.
How much salary will I get at Accenture as a Snowflake data warehouse application developer with 4 years of professional experience and 2 years of Snowflake experience?
10 LPA
Does Snowflake support constraints? Please make a video on this, brother.
Snowflake lets you define constraints but doesn't enforce them as of now, except NOT NULL constraints.
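To illustrate (the table name is made up): Snowflake accepts constraint definitions on standard tables, but only NOT NULL is actually enforced.

```sql
CREATE OR REPLACE TABLE demo_orders (
    order_id  INT PRIMARY KEY,   -- declared, but NOT enforced: duplicates are allowed
    amount    NUMBER NOT NULL    -- enforced: inserting NULL fails
);

-- Succeeds despite the duplicate primary key value:
INSERT INTO demo_orders VALUES (1, 100), (1, 200);

-- Fails with a NOT NULL violation:
-- INSERT INTO demo_orders VALUES (2, NULL);
```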
how to stop the pipe?
I have given that info in the next video, 'Troubleshooting Snowpipe'.
// How to pause a pipe
ALTER PIPE mydb.pipes.employee_pipe SET PIPE_EXECUTION_PAUSED = true;
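For completeness, resuming the pipe and checking its state work the same way (reusing the pipe name from the command above):

```sql
-- Resume a paused pipe
ALTER PIPE mydb.pipes.employee_pipe SET PIPE_EXECUTION_PAUSED = false;

-- Check the pipe's current execution state (RUNNING, PAUSED, ...)
SELECT SYSTEM$PIPE_STATUS('mydb.pipes.employee_pipe');
```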
Very clear explanation, sir. In real time, suppose we have 100 different files; do we need to create 100 pipes?
Yes
Thanks for the reply, sir... It is really helpful.
@@mrjana520 Is there any other way around?
Nice explanation, but where can we see the notifications?
Retrieve the most recent notifications that were created in the past 24 hours.
select * from table(information_schema.notification_history())
Retrieve the most recent notifications that were created in the past hour and sent using the integration named MY_INTEGRATION.
select * from table(information_schema.notification_history(
START_TIME=>dateadd('hour',-1,current_timestamp()),
END_TIME=>current_timestamp(),
RESULT_LIMIT=>100,
INTEGRATION_NAME=>'MY_INTEGRATION'));
I am getting "Error assuming role: Please verify the role and external ID are configured correctly in your AWS policy."
What does it mean and how do I rectify it? Please share a tip or a link (YouTube).
community.snowflake.com/s/question/0D50Z00009UruoRSAR/troubleshooting-sql-execution-error-error-assuming-awsrole-please-verify-the-role-and-externalid-are-configured-correctly-in-your-aws-policy
I believe we need not schedule the Snowpipes.
true
@@mrjana520 When do we go for Snowpipe and when without it? Is there any specific scenario where we would go for Snowpipe?
Brother, that is the first point I explained in this video. Please ask doubts after watching and practicing.
Hi, thanks for your video. A single pipe can load data into a single table only, right?
CREATE OR REPLACE pipe mydb.my_schema.pipe_customers
AUTO_INGEST = TRUE
AS
COPY INTO mydb.my_schema.customers
FROM @mydb.my_schema.my_ext_stage/pipe/
FILE_FORMAT = (FORMAT_NAME = 'MY_CSV_FORMAT_WITH_OUT_HEADER');
Can I use the same pipe to load data into the employees table? If yes, what is the SQL query?
Thanks
One pipe can load data to one table only.
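So for the employees table you would create a second pipe. A sketch reusing the same stage (the employees table, the pipe_emp/ folder, and the file format here are assumptions):

```sql
-- A separate pipe for employees; it can share the customers pipe's stage,
-- watching a different folder so each pipe only picks up its own files.
CREATE OR REPLACE PIPE mydb.my_schema.pipe_employees
AUTO_INGEST = TRUE
AS
COPY INTO mydb.my_schema.employees
FROM @mydb.my_schema.my_ext_stage/pipe_emp/
FILE_FORMAT = (FORMAT_NAME = 'MY_CSV_FORMAT_WITH_OUT_HEADER');
```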