Snowflake - SnowPipe - Working Session

  • Published: 23 Dec 2024

Comments • 83

  • @prasadmuthyala9625
    @prasadmuthyala9625 1 month ago +1

    Finally found a great Snowflake content YouTube channel ❤

  • @factocratics
    @factocratics 2 years ago +10

    You are great, Sir. This is really, really helpful. After going through your videos I don't find any need to buy a 20-30k course. Thank you 😊

  • @dabbuguntaprasad3710
    @dabbuguntaprasad3710 1 year ago +1

    What an explanation, sir. We are able to understand and remember each and every point very easily, and we will never forget it after listening to your sessions.

  • @iAmDecemberBorn
    @iAmDecemberBorn 5 months ago +1

    Very nice, brother!! Cheers
    It helped me a lot to clear up the concept!!

  • @trendyshorts910
    @trendyshorts910 7 months ago +1

    Great efforts, sir. These are the best practical Snowflake videos I have seen on YouTube. Keep doing it, sir 😇

  • @SivaPanga-e1m
    @SivaPanga-e1m 1 year ago +1

    Excellent Session

  • @venkataramanamurthypasumar4542
    @venkataramanamurthypasumar4542 1 year ago +2

    Wonderful course coverage.👍💐

  • @sureshrecinp
    @sureshrecinp 1 year ago +1

    Thank you for sharing the info.

  • @BabaPatil-k7c
    @BabaPatil-k7c 6 months ago +1

    Hello Sir,
    Thank you very much for the great content... You are an exceptional teacher.
    Could you please make a video on Snowpipe optimization?
    Please, it's a request 🙏

  • @MrVenkatesh69
    @MrVenkatesh69 5 months ago

    Is the target table always truncate-and-load? Where can we set/see the target table properties like insert/update?

    • @mrjana520
      @mrjana520  5 months ago

      Based on your requirement and design approach, you have to decide whether it is truncate-and-load or incremental; insert/update are not properties of a table.

  • @nbaireddy
    @nbaireddy 1 year ago

    Hi, if we want to load multiple files into multiple tables from the same folder, do we need to create multiple SQS queues (one SQS queue per pipe)?

    • @mrjana520
      @mrjana520  1 year ago

      Not required; use the same pipe and the same stage with a different file pattern.
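A minimal sketch of filtering files by pattern (all object names here are hypothetical, not from the video): a pipe's COPY statement can carry a PATTERN clause so the pipe only auto-ingests matching files from the stage.

```sql
-- Hypothetical names; only files matching the regex are picked up by this pipe
CREATE OR REPLACE PIPE mydb.pipes.orders_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO mydb.public.orders
FROM @mydb.stages.my_s3_stage
PATTERN = '.*orders_.*[.]csv'
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
```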

  • @mohammedvahid5099
    @mohammedvahid5099 6 months ago +1

    Excellent bro❤

  • @NirmalaDevi-vo7he
    @NirmalaDevi-vo7he 1 year ago +1

    It was very clear.. Thank you so much..

  • @sakshamsomani
    @sakshamsomani 8 months ago

    Is there a similar video of SnowPipe where you have used Azure instead of AWS?

    • @mrjana520
      @mrjana520  8 months ago

      I haven't prepared that, brother.

  • @aayushkumar963
    @aayushkumar963 1 year ago +1

    Hi, great content! I appreciate your efforts in putting together these explanations.
    I have a doubt: what if a file is loaded into S3 with the same name and the same set of data, but with some changes in the values of some records? How do we deal with this scenario? Is there a way to upsert data instead of inserting it every time in Snowpipe?

    • @mrjana520
      @mrjana520  1 year ago +2

      If you want to load the same file again, you can use the FORCE = TRUE option in your COPY command; watch the Copy Options video for more details. And we don't do upserts while loading data into stage tables; we do those transformations while loading data into the main target/integration/base tables.
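As a sketch of the FORCE option (table and stage names are made up, not from the video): FORCE = TRUE bypasses the load-history deduplication, so an already-processed file is loaded again.

```sql
-- Reload a file Snowflake has already recorded in its load history
COPY INTO mydb.stg.employee_stg
FROM @mydb.stages.my_s3_stage/pipe/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
FORCE = TRUE;
```

Note this can duplicate rows in the stage table, which is why deduplication is pushed to the stage-to-target step.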

  • @nagaa225
    @nagaa225 5 months ago

    Hi sir, can you suggest how we can create multiple files from a file based on one lookup table? The files should also be in the same format with the same delimiter. The file size is 60 MB.

    • @mrjana520
      @mrjana520  5 months ago

      What lookup table? Where do you want to create these files? There is no clarity on the requirement.

  • @tinkubharath69
    @tinkubharath69 1 year ago +1

    Thanks for this valid information

  • @m04d10y1996
    @m04d10y1996 8 months ago

    Are Snowflake stage tables truncate-and-load by default?

  • @annasaga
    @annasaga 2 years ago +1

    You are doing a great job. Could you please do a video on real-time issues?

    • @mrjana520
      @mrjana520  2 years ago

      Will do once all the concepts are completed.

    • @annasaga
      @annasaga 2 years ago

      Thank you

  • @divityvali8454
    @divityvali8454 6 months ago

    If we take your course on Udemy, have you explained any project in it?

  • @m04d10y1996
    @m04d10y1996 8 months ago

    Is there a difference between SNS and SQS? Somewhere you mentioned SNS, but in the demo you used SQS.

    • @mrjana520
      @mrjana520  8 months ago

      Amazon SNS (Simple Notification Service) and Amazon SQS (Simple Queue Service) are both messaging services offered by AWS. The main difference is the delivery model: SNS pushes messages out to subscribers, while SQS holds messages in a queue until a consumer polls them.

  • @venkataramanamurthypasumar4542

    How do you control duplicate datasets being loaded into Target tables?

    • @mrjana520
      @mrjana520  1 year ago

      Mostly, while loading the data into target tables, we use merge queries with DISTINCT on the source table; this makes sure there will be no duplicates.
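A sketch of such a merge (all table and column names are hypothetical): deduplicate the stage rows with DISTINCT, then upsert into the target on the business key.

```sql
MERGE INTO mydb.tgt.employee AS t
USING (
    -- DISTINCT removes duplicate rows coming from the stage table
    SELECT DISTINCT emp_id, emp_name, salary
    FROM mydb.stg.employee_stg
) AS s
ON t.emp_id = s.emp_id
WHEN MATCHED THEN
    UPDATE SET t.emp_name = s.emp_name, t.salary = s.salary
WHEN NOT MATCHED THEN
    INSERT (emp_id, emp_name, salary)
    VALUES (s.emp_id, s.emp_name, s.salary);
```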

  • @deepbhattacharyya8475
    @deepbhattacharyya8475 1 year ago

    Hi, thanks for the detailed explanation. However, I have a question: if I have a file that's already processed and I get the file again where the data is different but the filename remains the same, how do I handle such an issue? Also keeping duplicates in mind.

    • @mrjana520
      @mrjana520  1 year ago

      If you want to reprocess the file with the same name, use the FORCE = TRUE option in the COPY command. If you are concerned about duplicates with the FORCE option, then instead of FORCE, clear the file you have processed from the copy history and reprocess it. But in general, in real time, the stage tables are mostly truncate-and-load, and from stage to target we use merge queries, so the duplicates issue won't come up.

  • @mohammadhaque1873
    @mohammadhaque1873 9 months ago +1

    Awesome.

  • @RajeshS-ee5by
    @RajeshS-ee5by 3 months ago

    Thanks sir 🎉

  • @SaiKumarGaddam-g3x
    @SaiKumarGaddam-g3x 6 months ago

    When I am trying to create a Snowpipe, it gives an error: object does not exist or operation cannot be performed.

    • @mrjana520
      @mrjana520  6 months ago

      That means the Snowflake-AWS integration didn't happen correctly; please watch the videos and retry.

  • @parthibans7596
    @parthibans7596 1 year ago +1

    Thank you, bro. I did it successfully.

  • @virupakshi9421
    @virupakshi9421 1 year ago

    If there is any error while loading data from files into tables using Snowpipe after moving to production, how can we stop the Snowpipe?

    • @mrjana520
      @mrjana520  1 year ago +1

      Watch the other video, Troubleshooting Snowpipe; I have explained it there.

    • @virupakshi9421
      @virupakshi9421 1 year ago

      Ok thank you

  • @GAGANKUBSAD-c9n
    @GAGANKUBSAD-c9n 1 year ago

    Sir, I have data in SAS. I want to continuously ingest data from SAS to Snowflake. Does Snowpipe support it?

    • @mrjana520
      @mrjana520  1 year ago

      No, Snowpipe supports ingestion only from the 3 clouds (Azure, AWS, and GCP).

  • @deepthikotipalli
    @deepthikotipalli 1 year ago

    Sir, how can we know the name of the Snowpipe while loading the data?

    • @mrjana520
      @mrjana520  1 year ago

      But why do you need that?

  • @PAWANYADAV-dq4zg
    @PAWANYADAV-dq4zg 1 year ago

    Hello sir, do you provide a full course on Snowflake?

    • @mrjana520
      @mrjana520  1 year ago

      My full course is available in Udemy.
      My Snowflake Udemy Course:
      www.udemy.com/course/snowflake-complete-course-for-clearing-interviews/?couponCode=C2271E74086DBBD928D9
      Use the coupon below to get it for 499 rupees.
      Coupon: C2271E74086DBBD928D9

  • @sasikanth513
    @sasikanth513 1 year ago

    I'm getting an error while using LIST @stage: SQL execution error, AWS assume role. How do I rectify it?

    • @mrjana520
      @mrjana520  1 year ago

      With one line of the error, how can I help you? Reach me at jana.snowflake2@gmail.com.

  • @kanthkk1365
    @kanthkk1365 1 year ago

    Hi sir. My Snowflake account is hosted on Azure. I am trying to load the data from an S3 bucket to Snowflake using Snowpipe, but I am getting an error: "090040 (XX000): Pipe Notifications bind failure 'Cross cloud integration is not supported for pipe creation in AZURE using a stage in AWS.'" Is there any way to solve this?

  • @jansivarun
    @jansivarun 6 months ago

    Hi sir, I am getting an error at the LIST @stage step: "Failure using stage area. Cause: [Access Denied (Status Code: 403; Error Code: AccessDenied)]"

    • @mrjana520
      @mrjana520  6 months ago

      That means your Snowflake was not properly integrated with your external storage.

    • @jansivarun
      @jansivarun 6 months ago

      Now the issue is resolved, but the data is not loaded into the table. The pipe is in RUNNING status.

    • @jansivarun
      @jansivarun 6 months ago +1

      Working fine now. Thanks for your video, very helpful!!

  • @SandeepKhandale-rq6le
    @SandeepKhandale-rq6le 7 months ago

    Can you explain how to create an event notification in Azure?

    • @mrjana520
      @mrjana520  7 months ago

      I have already posted the Azure integration video; please check my playlist.

  • @ParYucreatives
    @ParYucreatives 1 year ago

    Can we load multiple tables by using a single pipe?

    • @mrjana520
      @mrjana520  1 year ago

      Yes we can; by using a single pipe we can load different files to different tables, but only from the external stage specified in the pipe definition.

  • @pib4579
    @pib4579 1 year ago

    How much salary would a Snowflake data warehouse application developer get at Accenture with 4 years of professional experience and 2 years of Snowflake experience?

  • @manjutharak
    @manjutharak 1 year ago

    Does Snowflake support constraints? Please make a video on this, brother.

    • @mrjana520
      @mrjana520  1 year ago +1

      Snowflake doesn't enforce constraints as of now, except NOT NULL constraints.
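To illustrate (hypothetical table): Snowflake lets you declare PRIMARY KEY, UNIQUE, and FOREIGN KEY constraints for metadata purposes, but it does not enforce them; only NOT NULL is enforced.

```sql
CREATE OR REPLACE TABLE mydb.public.demo_constraints (
    id   INT PRIMARY KEY,      -- declared but not enforced: duplicate ids are allowed
    name STRING NOT NULL       -- enforced: inserting NULL here fails
);

-- Both inserts succeed despite the duplicate primary key value
INSERT INTO mydb.public.demo_constraints VALUES (1, 'a');
INSERT INTO mydb.public.demo_constraints VALUES (1, 'b');

-- This insert fails with a NULL-value constraint violation:
-- INSERT INTO mydb.public.demo_constraints VALUES (2, NULL);
```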

  • @vinodbarma2858
    @vinodbarma2858 1 year ago

    How to stop the pipe?

    • @mrjana520
      @mrjana520  1 year ago +1

      I have given that info in the next video, 'Troubleshooting Snowpipe'.
      // How to pause a pipe
      ALTER PIPE mydb.pipes.employee_pipe SET PIPE_EXECUTION_PAUSED = true;

  • @kanthreddy7396
    @kanthreddy7396 1 year ago

    Very clear explanation, sir. In real time, suppose we have 100 different files, do we need to create 100 pipes?

  • @badamsunilkumar
    @badamsunilkumar 11 months ago

    Nice explanation, but where can we see the notifications?

    • @mrjana520
      @mrjana520  11 months ago

      Retrieve the most recent notifications that were created in the past 24 hours:
      select * from table(information_schema.notification_history());

    • @mrjana520
      @mrjana520  11 months ago

      Retrieve the most recent notifications that were created in the past hour and sent using the integration named MY_INTEGRATION.
      select * from table(information_schema.notification_history(
      START_TIME=>dateadd('hour',-1,current_timestamp()),
      END_TIME=>current_timestamp(),
      RESULT_LIMIT=>100,
      INTEGRATION_NAME=>'MY_INTEGRATION'));

  • @prakashgrandi5508
    @prakashgrandi5508 1 year ago

    Error assuming role: "Please verify the role and external ID are configured correctly in your AWS policy."

    • @prakashgrandi5508
      @prakashgrandi5508 1 year ago

      What does this mean and how do I rectify it? Please share a tip or a link (YouTube).

    • @mrjana520
      @mrjana520  1 year ago

      community.snowflake.com/s/question/0D50Z00009UruoRSAR/troubleshooting-sql-execution-error-error-assuming-awsrole-please-verify-the-role-and-externalid-are-configured-correctly-in-your-aws-policy

  • @CR7LOVER576
    @CR7LOVER576 1 year ago

    I believe we need not schedule Snowpipes.

    • @mrjana520
      @mrjana520  1 year ago +1

      True.

    • @CR7LOVER576
      @CR7LOVER576 1 year ago

      @@mrjana520 When do we go for Snowpipe and when without it? Is there any specific scenario where we would go for Snowpipe?

    • @mrjana520
      @mrjana520  1 year ago

      Brother, that is the first point I explained in this video; please ask doubts after watching and practicing.

  • @mechanicalbrand444
    @mechanicalbrand444 3 months ago

    Hi, thanks for your video. A single pipe can load data into a single table only, right?
    CREATE OR REPLACE pipe mydb.my_schema.pipe_customers
    AUTO_INGEST = TRUE
    AS
    COPY INTO mydb.my_schema.customers
    FROM @mydb.my_schema.my_ext_stage/pipe/
    file_format = MY_CSV_FORMAT_WITH_OUT_HEADER;
    Can I use the same pipe to load data into an employees table? If yes, what is the SQL query?
    Thanks

    • @mrjana520
      @mrjana520  3 months ago

      One pipe can load data to one table only.
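So, following the pattern of the pipe in the question above, loading an employees table would take a second pipe. A sketch, with names mirroring the question (the employees table, stage path, and reuse of the same file format are assumptions):

```sql
-- A separate pipe for the employees table, watching its own stage path
CREATE OR REPLACE PIPE mydb.my_schema.pipe_employees
  AUTO_INGEST = TRUE
AS
COPY INTO mydb.my_schema.employees
FROM @mydb.my_schema.my_ext_stage/pipe_employees/
file_format = MY_CSV_FORMAT_WITH_OUT_HEADER;
```

Each pipe wraps exactly one COPY INTO statement, so one pipe per target table.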