Azure Content : Annu
  • 177 videos
  • 441,068 views
5. Narrow VS Wide Transformation in RDD | Pyspark | Apache Spark
This video explains the different types of transformations in Apache Spark. There are two types of transformations: narrow and wide.
#azure #dataenginnerjobs #dataengineers #azuredataengineering #pyspark #programming #coding #cloudcomputing #azureservices
295 views
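
Not taken from the video, but as a minimal PySpark sketch of the same idea (data and names are illustrative): narrow transformations like mapValues need no data movement, while wide ones like reduceByKey shuffle records across partitions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("narrow-vs-wide").getOrCreate()
sc = spark.sparkContext

rdd = sc.parallelize([("a", 1), ("b", 2), ("a", 3)], numSlices=2)

# Narrow: each output partition depends on exactly one input partition,
# so no shuffle is required.
doubled = rdd.mapValues(lambda v: v * 2)

# Wide: rows sharing a key may sit in different partitions,
# so reduceByKey triggers a shuffle across the cluster.
totals = rdd.reduceByKey(lambda a, b: a + b)

print(totals.collect())  # e.g. [('b', 2), ('a', 4)]
```
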

Videos

4. RDD operations | Transformations and actions | Pyspark
239 views · a month ago
In this video, we learnt about RDD operations - RDD transformations and RDD actions in Pyspark #azure #dataenginnerjobs #dataengineers #azuredataengineering #pyspark #programming #coding #cloudcomputing #azureservices
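
As a small hedged illustration of the split (again not from the video itself): transformations are lazy and only build the lineage, while actions trigger execution.

```python
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
nums = sc.parallelize(range(10))

# Transformations are lazy: these lines only build the execution plan.
evens = nums.filter(lambda n: n % 2 == 0)
squares = evens.map(lambda n: n * n)

# Actions run the plan and return results to the driver.
print(squares.collect())  # [0, 4, 16, 36, 64]
print(squares.count())    # 5
print(squares.take(2))    # [0, 4]
```
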
3. RDD partitioning | Repartition() vs Coalesce
463 views · 2 months ago
In this video we learnt the difference between Repartition() and Coalesce()
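
A quick PySpark sketch of the difference (partition counts are arbitrary): repartition() can grow or shrink the partition count but always performs a full shuffle, while coalesce() only shrinks it and avoids a shuffle by merging existing partitions.

```python
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
rdd = sc.parallelize(range(100), numSlices=8)

# repartition() can increase or decrease the partition count,
# but always performs a full shuffle.
more = rdd.repartition(16)
print(more.getNumPartitions())   # 16

# coalesce() only decreases the count and avoids a shuffle by
# merging existing partitions, which is cheaper.
fewer = rdd.coalesce(2)
print(fewer.getNumPartitions())  # 2
```
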
3. Use private endpoints to connect securely to Azure SQL Server
865 views · 3 months ago
In this video, we learnt how to use private endpoints to connect securely to Azure SQL Server
2. Connect to SQL Server through SQL Server Authentication
496 views · 3 months ago
In this video we learnt how to Connect to SQL Server through SQL Server Authentication #azure #azuredataengineer #azuresql #sqlserver #azureactivedirectory #azureentraid #microsoft #azuredataengineering
1. Create Azure SQL database with Microsoft Entra ID authentication
1.5K views · 3 months ago
In this video we learnt how to Create Azure SQL database with Microsoft Entra ID authentication #azure #azuredataengineer #azuresql #sqlserver #azureactivedirectory #azureentraid #microsoft #azuredataengineering
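
For anyone connecting from code rather than the portal, a hedged pyodbc sketch (server, database, and driver version are placeholders) of signing in with Entra ID interactive authentication:

```python
import pyodbc

# Placeholders: substitute your own server and database.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    # Prompts a Microsoft Entra ID (Azure AD) sign-in dialog.
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    row = conn.cursor().execute("SELECT SUSER_NAME()").fetchone()
    print(row[0])  # the signed-in Entra ID principal
```
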
38. How to find the position of nth occurrence of a character in a string using mapping dataflow
211 views · 3 months ago
In this video, we learnt how to find the position of the nth occurrence of a character in a string using mapping dataflow #azure #azuredataengineer #azuredataengineering #cloud
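
The video builds the mapping dataflow expression; purely as a plain-Python restatement of the same logic (the function name is mine):

```python
def nth_occurrence(s: str, ch: str, n: int) -> int:
    """Return the 1-based position of the nth occurrence of ch in s,
    or -1 if ch occurs fewer than n times."""
    pos = -1
    for _ in range(n):
        pos = s.find(ch, pos + 1)
        if pos == -1:
            return -1
    return pos + 1

print(nth_occurrence("abc-def-ghi", "-", 2))  # 8
```
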
34. How to format datetime to a 12-hour clock and 24-hour clock
691 views · 4 months ago
In this video, we learnt how to format a datetime to a 12-hour clock / 24-hour clock
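
The same formatting idea in plain Python, for reference: %I with %p gives the 12-hour clock, %H the 24-hour clock.

```python
from datetime import datetime

ts = datetime(2024, 7, 15, 18, 30, 0)

print(ts.strftime("%I:%M:%S %p"))  # 06:30:00 PM  (12-hour clock)
print(ts.strftime("%H:%M:%S"))     # 18:30:00     (24-hour clock)
```
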
8. Found more columns than expected | Common Errors in ADF #adf
614 views · 4 months ago
In this video we learnt how to mitigate "Error found when processing 'Csv/Tsv Format Text'. Found more columns than expected" in the Copy activity in Azure Data Factory when loading data from CSV to a SQL table. If a value contains the dataset's delimiter, it is treated as if it belonged to 2 different columns. We need to use the correct quote character in the dataset config...
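
A tiny Python illustration of why the quote character matters (sample data is made up): with correct quoting, a delimiter inside a value stays in one column instead of spilling into a second one.

```python
import csv, io

# The second field contains the delimiter. Quoted correctly,
# it parses as one column; without the quotes it would split
# into two and trip "found more columns than expected".
raw = 'id,name\n1,"Doe, John"\n'

for row in csv.reader(io.StringIO(raw), quotechar='"'):
    print(row)
# ['id', 'name']
# ['1', 'Doe, John']
```
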
2. Read Multiple JSONs from ADLS and load into SQL using Logic apps
347 views · 4 months ago
In this video we load an array of multiple JSONs from ADLS into multiple rows in SQL using Azure Logic Apps #logicapps #automation #integration #azureintegration #azure #azurelogicapps #logicapp #cloud #cloudcomputing
1. Read JSON data from ADLS and insert into SQL table using Logic apps
1.1K views · 4 months ago
In this video we learnt how to read JSON data from ADLS and load it into a SQL table using Logic Apps. Parse JSON official documentation: learn.microsoft.com/en-us/azure/logic-apps/logic-apps-perform-data-operations?tabs=consumption#parse-json-action #logicapps #automation #integration #azureintegration #azure #azurelogicapps #logicapp #cloud #cloudcomputing
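
The video works in the Logic Apps designer; purely as a hedged code-level analogue of the same flow (connection strings, container, blob name, table, and JSON fields are all placeholders):

```python
import json
import pyodbc
from azure.storage.blob import BlobClient

# Placeholders throughout: swap in your own account, container, blob, and SQL details.
blob = BlobClient.from_connection_string(
    "<adls-connection-string>", container_name="input", blob_name="data.json"
)
record = json.loads(blob.download_blob().readall())  # the "Parse JSON" step

with pyodbc.connect("<sql-connection-string>") as conn:
    conn.execute(
        "INSERT INTO dbo.People (Name, Age) VALUES (?, ?)",
        record["name"], record["age"],
    )
    conn.commit()
```
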
2. Resilient Distributed Dataset (RDD) in Pyspark
924 views · 5 months ago
In this video, we learnt about RDD in pyspark #pyspark #azuredataengineer #dataengineer #dataengineering #azuredataengineering
1. Introduction to Pyspark
993 views · 5 months ago
In this video, we learnt about Pyspark.
37. How to concatenate data of multiple rows after grouping using mapping dataflow #adf #dataflow
431 views · 7 months ago
In this video, we learnt how to concatenate data of multiple rows after grouping using mapping dataflow #adf #dataflow
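
The video uses a mapping dataflow aggregate; the equivalent idea expressed in PySpark (column names are illustrative) is collect_list plus concat_ws:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [("HR", "Asha"), ("HR", "Ravi"), ("IT", "Mia")],
    ["dept", "name"],
)

# Group by dept and join each group's names into one comma-separated string.
out = df.groupBy("dept").agg(
    F.concat_ws(", ", F.collect_list("name")).alias("names")
)
out.show()
# dept | names        (row order may vary)
# HR   | Asha, Ravi
# IT   | Mia
```
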
33. How to Break Out of ForEach Loop on Activity Failure #adf
2.3K views · 7 months ago
7. Self-Referencing Variable is not allowed #adf #datafactory
510 views · 7 months ago
6. ForEach activity does not run in parallel #adf #datafactory
1.7K views · 7 months ago
How to pause all the dedicated SQL pools in a subscription after a period of inactivity
636 views · 7 months ago
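
As a hedged sketch of the pause call itself, using the Azure management REST endpoint for Synapse dedicated SQL pools (all identifiers are placeholders, and the bearer token must be obtained from Entra ID first):

```python
import requests

# Placeholders: fill in your subscription, resource group, workspace, pool,
# and an Entra ID bearer token for https://management.azure.com.
SUB, RG, WS, POOL = "<sub-id>", "<rg>", "<workspace>", "<sqlpool>"
TOKEN = "<bearer-token>"

url = (
    f"https://management.azure.com/subscriptions/{SUB}"
    f"/resourceGroups/{RG}/providers/Microsoft.Synapse"
    f"/workspaces/{WS}/sqlPools/{POOL}/pause"
    "?api-version=2021-06-01"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
print(resp.status_code)  # 202 means the pause request was accepted
```
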
5. Execute Pipeline activity fails to pass a parameter of array type to child pipeline
1.3K views · 9 months ago
4. The specified SQL Query is not valid. The query doesn't return any data.
584 views · 9 months ago
3. The expression cannot be evaluated because property doesn't exist.
1.5K views · 9 months ago
2. Bad Request error in ADF | Message Null | Common errors in ADF
1.4K views · 9 months ago
1. Failed to Delete Linked Service | Common Errors in ADF
1.3K views · 9 months ago
26. Monitor Copy activity in ADF pipeline #adf
1.1K views · 9 months ago
25. Auto Create table option in Copy activity for SQL as sink in Azure data factory
3.7K views · 10 months ago
32. How to use the OAuth 2.0 token to access a secured resource using Azure Data Factory
1.6K views · a year ago
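
Outside ADF, the same client-credentials token request can be sketched in Python (tenant, client ID, secret, and scope are placeholders):

```python
import requests

TENANT = "<tenant-id>"

# Standard OAuth 2.0 client-credentials grant against Microsoft Entra ID.
resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "<client-id>",
        "client_secret": "<client-secret>",
        "scope": "https://management.azure.com/.default",
    },
)
token = resp.json()["access_token"]

# The token then goes into the Authorization header of the secured call,
# much like the Web activity does in ADF.
headers = {"Authorization": f"Bearer {token}"}
```
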
31. How to Query pipeline runs in ADF based on input filter conditions
2.8K views · a year ago
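
For reference, a hedged Python sketch of the Query Pipeline Runs REST call (factory coordinates and token are placeholders); the filter syntax follows the ADF REST API:

```python
import requests

SUB, RG, DF = "<sub-id>", "<rg>", "<factory-name>"
TOKEN = "<bearer-token>"

url = (
    f"https://management.azure.com/subscriptions/{SUB}"
    f"/resourceGroups/{RG}/providers/Microsoft.DataFactory"
    f"/factories/{DF}/queryPipelineRuns?api-version=2018-06-01"
)

# Only runs of one pipeline, inside a time window, that failed.
body = {
    "lastUpdatedAfter": "2024-07-01T00:00:00Z",
    "lastUpdatedBefore": "2024-07-02T00:00:00Z",
    "filters": [
        {"operand": "PipelineName", "operator": "Equals", "values": ["pipeline1"]},
        {"operand": "Status", "operator": "Equals", "values": ["Failed"]},
    ],
}

runs = requests.post(
    url, json=body, headers={"Authorization": f"Bearer {TOKEN}"}
).json()
print(runs.get("value", []))
```
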
30. Execute ADF pipeline via REST API from another ADF
2.3K views · a year ago
29. Execute ADF pipeline via REST API call
6K views · a year ago
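
And a matching hedged sketch of the Create Run call (placeholders again; pipeline parameters travel in the JSON body):

```python
import requests

SUB, RG, DF = "<sub-id>", "<rg>", "<factory-name>"
TOKEN = "<bearer-token>"

url = (
    f"https://management.azure.com/subscriptions/{SUB}"
    f"/resourceGroups/{RG}/providers/Microsoft.DataFactory"
    f"/factories/{DF}/pipelines/pipeline1/createRun?api-version=2018-06-01"
)

# Pipeline parameters, if any, are passed as the JSON body.
resp = requests.post(
    url,
    json={"myParam": "myValue"},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.json()["runId"])  # run ID to poll for status afterwards
```
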
Secure File Transfer Protocol | Copy data from on Prem to Azure storage account
2.1K views · a year ago

Comments

  • @BikashKumarPradhan-k7c
    @BikashKumarPradhan-k7c 1 day ago

    Can we not use wildcards to handle this scenario, as you explained earlier?

  • @AdenilsonFTJunior
    @AdenilsonFTJunior 2 days ago

    Sensational video! One question: why, in ForEach1, did you have to create a dataset that points to the parameterised files instead of using the result of GetMetadata1, which already returns a list with each of the names and dates?

    • @azurecontentannu6399
      @azurecontentannu6399 2 days ago

      @@AdenilsonFTJunior The one outside the ForEach points at the folder level, so we get the file names within the folder using child items. The one inside the ForEach is parameterized to process those files one by one through iteration and get the last modified date of each file.

    • @AdenilsonFTJunior
      @AdenilsonFTJunior 2 days ago

      @@azurecontentannu6399 thanks!

  • @napoleanbonaparte9225
    @napoleanbonaparte9225 3 days ago

    Thank you very much. What if it is a case of a folder of big data?

  • @madhanrommala2883
    @madhanrommala2883 5 days ago

    Great explanation! Thanks for this content 😀

  • @rajashekar4171
    @rajashekar4171 12 days ago

    Hi Annu, could you please make a video explaining the Degree of Copy Parallelism in the Copy activity? It would be really helpful! Thanks!

  • @atlanticoceanvoyagebird2630
    @atlanticoceanvoyagebird2630 13 days ago

    Beautiful and simple way of systematic teaching to the students. You are great, explaining everything clearly from your heart without hiding anything. Your teaching is greatly appreciated by Muhammad Khan from North America.

    • @azurecontentannu6399
      @azurecontentannu6399 12 days ago

      @@atlanticoceanvoyagebird2630 Thank you so much for your kind words

  • @rajashekar4171
    @rajashekar4171 13 days ago

    Wow, your content and explanations are absolutely fantastic! Really appreciate the clarity and detail you put into every video. Keep up the amazing work! 🙌 Thanks

  • @atlanticoceanvoyagebird2630
    @atlanticoceanvoyagebird2630 13 days ago

    I never understood this confusing parameterized value passing from other instructors; only your teaching methods gave clear explanations. I am learning a lot of things from you which I could not get from other sources. Muhammad Khan from North America.

  • @atlanticoceanvoyagebird2630
    @atlanticoceanvoyagebird2630 13 days ago

    Your English pronunciation is very clear, unlike other instructors whom only 5% of the audience can understand; with you, the audience can understand 100%. Keep up this great pronunciation.

  • @meghachouksey3248
    @meghachouksey3248 15 days ago

    If we want to skip files which contain strings like "fact" or "aggregated" inside the file name, like data567-fact.csv and data13144-aggregated-1456.csv, from copying, how can we do this?

  • @rajendrayegireddi3429
    @rajendrayegireddi3429 17 days ago

    Great video, mam. I have a doubt: after using a Filter activity, why don't we take a Copy activity directly to load data into the destination? Please clear my doubt.

  • @rajendrayegireddi3429
    @rajendrayegireddi3429 18 days ago

    Good explanation, Annu. I have a small question: why don't we use a Lookup activity instead of a Filter activity to filter the last updated file?

  • @rohigt5745
    @rohigt5745 19 days ago

    Awesome and inspiring, I like your analytical approach and good communication skills.

  • @rohigt5745
    @rohigt5745 19 days ago

    Excellent analytical approach in the explanation👍

  • @CodeVeda
    @CodeVeda 20 days ago

    I watched a few of your videos on ADF, and honestly, they are much better than the paid ones on Udemy😀

  • @PurviMakanee
    @PurviMakanee 23 days ago

    The filename and filepath parameters are both pointing to the file name only... I guess that is a bug in ADF. Microsoft needs to improve it.

  • @aperxmim
    @aperxmim 26 days ago

    Great explanation of the tools and features before going into the demonstration examples. A good sneak preview.

  • @Uda_dunga
    @Uda_dunga 27 days ago

    What did you do at the end? 😢

  • @SAIKUMAR03
    @SAIKUMAR03 28 days ago

    256?

  • @SAIKUMAR03
    @SAIKUMAR03 28 days ago

    4-256

  • @SAIKUMAR03
    @SAIKUMAR03 28 days ago

    Excel was also not supported as a sink. Iceberg is only supported as a sink but not a source.

  • @virbhadramule6088
    @virbhadramule6088 a month ago

    Nice, thanks for making this content.

  • @rajveerdhumal3143
    @rajveerdhumal3143 a month ago

    Please improve the audio quality and avoid background disturbance. You have a long journey ahead here; thanks a lot.

  • @rajveerdhumal3143
    @rajveerdhumal3143 a month ago

    Thanks for this content, really helpful.

  • @madhua8892
    @madhua8892 a month ago

    Good explanation and easy to understand. Soon I will follow your Azure content. Hope you will clear all doubts there.

  • @HETALVYAS-j2c
    @HETALVYAS-j2c a month ago

    Can you teach me to use Azure Data Factory to retrieve data from the Azure Log Analytics API, using the query language used by Log Analytics (KQL)? The sink will be a SQL DB. Use conditions like error/fail.

  • @shivakumarganji3914
    @shivakumarganji3914 a month ago

    12 hours

  • @mohitupadhayay1439
    @mohitupadhayay1439 a month ago

    Hi Annu, please create an end-to-end video for Teradata migration, Netezza migration, and SAP migration. That would make your channel stand out.

    • @azurecontentannu6399
      @azurecontentannu6399 a month ago

      @@mohitupadhayay1439 Reproducing scenarios like these is difficult because resources for these external sources are unavailable.

  • @u2521
    @u2521 a month ago

    I want to use Active Directory managed identity as an authentication option in MSSQL Server, in the latest v20 of MSSQL. How do I do that? I don't have any Azure resource, but all my users are from Azure AD.

  • @natarajbeelagi569
    @natarajbeelagi569 a month ago

    Can we use a SQL query with a LIKE operator in the Lookup activity? E.g., select folder names LIKE '____-__-__'

    • @azurecontentannu6399
      @azurecontentannu6399 a month ago

      @@natarajbeelagi569 Yes, of course. If your dataset is a SQL dataset, you can write any SQL code directly in the Lookup activity.

  • @GaneshDasari-s6l
    @GaneshDasari-s6l a month ago

    Thanks for the video. I have a scenario: how to copy SharePoint files to Azure Storage using ADF, parameterized country-wise.

  • @mohitupadhayay1439
    @mohitupadhayay1439 a month ago

    Informative, Annu. Please try making end-to-end project videos as well. This series, together with a project video, will help a DE understand the scale and requirements of handling things on big-scale data.

  • @rajashekerreddydommata537
    @rajashekerreddydommata537 a month ago

    Instead, we can also use a rank function inside a Window transformation and filter based on the N value being provided.

  • @anandnandan
    @anandnandan a month ago

    When there is no file at the source and the merge-files option is selected, it generates a 2 B file with no data. Any idea why this is happening?

    • @azurecontentannu6399
      @azurecontentannu6399 a month ago

      @@anandnandan When there is no file at the source, why would you do a merge?

    • @anandnandan
      @anandnandan a month ago

      @@azurecontentannu6399 My pipeline is scheduled to run twice a day, and sometimes the source team doesn't send all the files. When this happens, the Copy activity creates an empty file at the sink for missing source files. If I remove the merge option, the file isn't created at all. Is this a glitch or expected behavior for the Copy activity? I know I can add pre-checks, but I'm just curious to understand this behavior better.

    • @azurecontentannu6399
      @azurecontentannu6399 a month ago

      @@anandnandan I suspect it's expected behavior. I will let you know if I find any confirmation of this behavior. However, since you already thought about a pre-check, you can add a custom storage event trigger instead of a schedule trigger for your pipeline as the pre-check.

  • @sbalaji92
    @sbalaji92 a month ago

    @azurecontentannu6399 How do you provide values to the parameters you have created when a storage-based trigger is configured for the pipeline?

  • @lakkshhmitechchannel
    @lakkshhmitechchannel a month ago

    Hi Sis, what is the interview process for Microsoft, and how many total rounds of interviews will they conduct? Please suggest some tips.

  • @gotimukulshriya1864
    @gotimukulshriya1864 a month ago

    Hi, I also have a scenario where I have a few values of type decimal(38,18) at the source which I am copying into ADLS as a Parquet file. In this case the column is converted to string. While copying this data to a dedicated SQL pool I am facing an issue saying "cannot load string to decimal(38,18)". Do you have a workaround for this?

  • @gotimukulshriya1864
    @gotimukulshriya1864 a month ago

    Hi, I have a scenario where a single table has multiple unique keys or composite keys. How do we match all the keys with the target table using the key column?

    • @azurecontentannu6399
      @azurecontentannu6399 a month ago

      @@gotimukulshriya1864 Yes, store the keys in the form of col1, col2, col3 in a SQL table, dynamically point to this column holding the key-column list, and use the createarray function: @createarray(keycol)

  • @davidmcintyre6862
    @davidmcintyre6862 a month ago

    Hi Annu: Just discovered your channel and just wanted to say you're a brilliant presenter - your content is really helpful and clear! Thanks very much - keep up the excellent work.

  • @shelleycurrie764
    @shelleycurrie764 a month ago

    Really helpful recordings, thanks!

  • @megharaina6108
    @megharaina6108 a month ago

    Can you add ACLs to this as well?

  • @bhavindedhia3976
    @bhavindedhia3976 a month ago

    How to add source and target counts as well?

  • @truthUntold99
    @truthUntold99 a month ago

    Thank you so much, ma'am, for the video. It was very helpful. But can you please make a video or guide me on how to use a stored procedure in the pipeline to first check if the table exists, truncate it, and then load it; or, if the table does not exist, create the table and then load it?