106.Databricks|Pyspark|Automation|Real Time Project:DataType Issue When Writing to Azure Synapse/SQL

  • Published: 2 Nov 2024

Comments • 15

  • @abhishekrajoriya2507 • A year ago +1

    Thanks a lot, sir, for sharing such real-time automation scenarios; this will help us build more dynamic solutions.

  • @ramswaroop1520 • A month ago

    In one of my interviews they asked, "What is the biggest/best-known drawback of Azure Synapse Analytics?" Can you please clarify?

  • @chappasiva • 2 months ago

    Hi Raja,
    Nice explanation.
    Where can we get the notebooks? Please share them.

  • @VenkatGolivi • 10 months ago +1

    Excellent, you deserve more subscribers.

  • @Ramakrishna410 • A year ago

    Hi, nice explanation. My requirement is: the source is a CSV and the target is a Delta table, so how do we fetch the column names and data types for the Delta table?
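
    The video covers Synapse/SQL targets, but for a Delta table the schema can be read the same way from a DataFrame. A minimal sketch, assuming a SparkSession named `spark` and a hypothetical table called `sales_delta` (neither comes from the video); the dict-building step is plain Python so it works on any `DataFrame.dtypes` output:

    ```python
    # Sketch: fetch column names and data types for a Delta table.
    # `DataFrame.dtypes` returns a list of (column_name, spark_type) pairs.

    def schema_as_dict(dtypes):
        """Turn the (name, type) pairs from DataFrame.dtypes into a dict."""
        return {name: dtype for name, dtype in dtypes}

    # In Databricks this would be (table name is hypothetical):
    # target_dtypes = spark.table("sales_delta").dtypes

    # For illustration, a sample of what dtypes typically looks like:
    sample_dtypes = [("order_id", "bigint"), ("amount", "double"), ("order_date", "date")]
    schema = schema_as_dict(sample_dtypes)
    # schema == {"order_id": "bigint", "amount": "double", "order_date": "date"}
    ```

    The same dict can then drive the automation shown in the video, with the Delta table's types in place of the Synapse column types.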

  • @sravankumar1767 • 8 months ago

    How can we apply the same logic to MongoDB and Cosmos DB? Can you please explain? Basically these databases support different data types, right? Can you please make a video on this one 🙏

  • @rajunaik8803 • A year ago +1

    Hi Raja, I guess we can also do type casting according to the target system before appending! Do you see any challenges with type casting?

    • @rajasdataengineering7585 • A year ago +1

      Hi Raju, data types differ between Azure DW and Spark, so I created this automation solution.
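
      The casting idea from this exchange can be sketched as a small planning step: compare the DataFrame's Spark types against the target table's types and cast only the columns that differ. This is not the video's exact solution; the column names and types below are illustrative:

      ```python
      # Sketch: plan the casts needed to match the target table before appending.

      def cast_plan(source_types, target_types):
          """Return {column: target_type} for columns whose type differs."""
          return {
              col: target_types[col]
              for col, typ in source_types.items()
              if col in target_types and target_types[col] != typ
          }

      # Illustrative schemas (not from the video):
      source = {"id": "int", "amount": "string", "created": "string"}
      target = {"id": "int", "amount": "double", "created": "timestamp"}
      plan = cast_plan(source, target)
      # plan == {"amount": "double", "created": "timestamp"}

      # Applying the plan in PySpark would look like:
      # from pyspark.sql.functions import col
      # df_cast = df.select([col(c).cast(plan.get(c, t)) for c, t in df.dtypes])
      ```

      The main challenge Raju asks about is lossy casts (e.g. string to timestamp producing nulls on bad rows), which is why checking the plan before appending is useful.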

  • @naren06938 • 8 hours ago

    I hope it is a database, not a warehouse...

  • @sravankumar1767 • 8 months ago

    I have one doubt: how can we run a select query against Azure MongoDB?

  • @suvratrai8873 • A year ago +1

    What if there are more columns in the DataFrame compared to the table?

    • @rajasdataengineering7585 • A year ago

      We need to apply a workaround of dropping the unwanted columns from the DataFrame.

    • @suvratrai8873 • A year ago

      Then maybe we can add that condition too, right? Just to cover all possible scenarios.
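
The workaround discussed in this last thread (dropping DataFrame columns the target table doesn't have) can be sketched as a pure comparison step. The names below are hypothetical, not from the video:

```python
# Sketch: keep only the DataFrame columns that exist in the target table,
# covering the "more columns in DataFrame than table" case from the thread.

def align_columns(df_columns, table_columns):
    """Columns to select (in table order), plus the extras being dropped."""
    keep = [c for c in table_columns if c in df_columns]
    extras = [c for c in df_columns if c not in table_columns]
    return keep, extras

# Illustrative column lists:
df_cols = ["id", "amount", "debug_flag"]
table_cols = ["id", "amount"]
keep, extras = align_columns(df_cols, table_cols)
# keep == ["id", "amount"], extras == ["debug_flag"]

# In PySpark the aligned DataFrame would then be:
# df_aligned = df.select(*keep)
```

Returning `extras` alongside `keep` also covers the follow-up suggestion: the caller can log or fail on unexpected extra columns instead of silently dropping them.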