codeLive: Processing Large Volumes of Salesforce Data with MuleSoft

  • Published: 2 Dec 2024

Comments • 6

  • @AshalathaVemula 9 months ago +1

    Is it possible to read a .txt file from SFTP and batch update the Salesforce Account object? Can you please provide a session for this as well?

  • @nupurkumari8977 7 months ago

    I think a correction is needed here for the Batch Job. To my knowledge, if we have 500,000 records with a block size of 10, then each block will contain 10 records, so dividing gives 50,000 blocks in total, each containing 10 records. Block size in a batch job defines the number of records in one block; it does not define the total number of blocks. The same applies to the other use cases (see the sketch below).
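
    A quick arithmetic sketch of the point above (plain Java for illustration only, not Mule batch configuration; the class and variable names are hypothetical): block size sets how many records each block holds, and the number of blocks then follows from the total record count.

    ```java
    public class BatchBlockSizeMath {
        public static void main(String[] args) {
            long totalRecords = 500_000; // records fed into the batch job (figure from the comment above)
            int blockSize = 10;          // "Block Size" setting: records per block, not the number of blocks

            // Ceiling division: the last block may hold fewer records than blockSize.
            long numberOfBlocks = (totalRecords + blockSize - 1) / blockSize;

            System.out.println(numberOfBlocks + " blocks of up to " + blockSize + " records each");
            // Prints: 50000 blocks of up to 10 records each
        }
    }
    ```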

  • @lakshmithokata3390 2 months ago

    Does Bulk API V2 work with JSON/XML/Java payloads?

  • @pulavarthivasavi8811 8 months ago

    Hello sir, can you give me a brief explanation of OAuth username-password authentication?

  • @soumyaranjansahoo-h5q 10 months ago

    I want to fetch 50,000,000 records from Salesforce into MuleSoft. How can I fetch them?