Managed and External Delta Tables in Databricks | DataFrame and Spark SQL way

  • Published: 20 Jan 2025

Comments • 10

  • @anagaraj4706
    @anagaraj4706 2 months ago +2

    The audio and video are not aligned in this video. Please check.

  • @rabink.5115
    @rabink.5115 4 months ago +3

    At 5:01, you mentioned that the metadata for an external table is stored in the storage path given in the syntax. I have a doubt about that: only the data gets stored in the path mentioned; the metadata is always stored inside Databricks irrespective of table type. Correct me if I understand it wrong.

    • @santoshatyam1409
      @santoshatyam1409 4 months ago

      I think you are correct

    • @vishoonaik
      @vishoonaik 3 months ago

      Yes, the data is stored in the external location. That's why, if you drop the table, the data still exists in the external location.

  • @SunilKumar-hq6bt
    @SunilKumar-hq6bt 2 months ago +1

    The data for external tables is stored in the cloud, and the metadata is stored within Databricks.
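
    The split the commenters describe shows up directly in the table DDL. A minimal Spark SQL sketch (table names and the storage path are made up for illustration):

    ```sql
    -- Managed table: Databricks controls both metadata and data files.
    -- DROP TABLE removes the data as well as the metastore entry.
    CREATE TABLE managed_sales (
      id INT,
      amount DOUBLE
    ) USING DELTA;

    -- External table: only the LOCATION clause differs. The metastore still
    -- holds the metadata, but the Delta files live at the given cloud path.
    -- DROP TABLE removes only the metastore entry; the files remain.
    CREATE TABLE external_sales (
      id INT,
      amount DOUBLE
    ) USING DELTA
    LOCATION 'abfss://container@storageaccount.dfs.core.windows.net/external/sales';
    ```

    You can confirm which kind a table is with `DESCRIBE EXTENDED external_sales;` — the `Type` and `Location` rows show `EXTERNAL` and the storage path.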

  • @suvakantasahoo7546
    @suvakantasahoo7546 A year ago +5

    The audio and video are mismatched.

  • @JD-xd3xp
    @JD-xd3xp A year ago +4

    The audio and video are not aligned.

  • @snagendra5415
    @snagendra5415 A year ago

    Bro, if I want to delete an external table completely, how do I do it? Please tell me.

    • @IrfanMohammed-n6r
      @IrfanMohammed-n6r 5 months ago

      When it comes to external tables, dropping the table only removes its schema and metastore entry. To delete the underlying data, you need to either delete the entire folder the external table was using as its location, or delete the files inside that folder.
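
      A sketch of the two-step cleanup the reply describes (the table name and path are hypothetical):

      ```sql
      -- Step 1: drop the external table. This removes only the metastore
      -- entry; the Delta files at the table's LOCATION survive.
      DROP TABLE IF EXISTS external_sales;
      ```

      Step 2, removing the underlying files, has to happen at the storage layer rather than in SQL — for example from a Databricks notebook with `dbutils.fs.rm('<external-table-path>', recurse=True)`, or with your cloud provider's storage CLI.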

  • @surajwagh1990
    @surajwagh1990 10 months ago

    Could you please create a video on workflow jobs?