Thank you for this video. Helped in a critical time.
Glad it was helpful!
Can you please also elaborate on scaling this solution when working with partitioned tables? Can we refresh a certain partition of a table using this method?
Very informative, nice!
Thanks a lot, I followed the steps and it worked.
For authorization, don't we require a bearer token? It isn't used here in the Web activity.
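For context: the Power BI REST call does need an AAD access token in the Authorization header. In ADF the Web activity can authenticate with a managed identity, or you can fetch a token for a service principal yourself via the client-credentials flow. A minimal Python sketch of building that token request; the tenant/client values are placeholders:

```python
import urllib.parse
import urllib.request

def token_request(tenant_id, client_id, client_secret):
    """Build the OAuth2 client-credentials request for a Power BI API token."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # ".default" requests all app permissions already granted to the SP
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }).encode()
    return urllib.request.Request(url, data=data, method="POST")

# The JSON response carries "access_token"; send it on the refresh call as
# the header:  Authorization: Bearer <access_token>
```

Inside ADF itself you can skip this and set the Web activity's authentication to managed identity with resource `https://analysis.windows.net/powerbi/api`, provided that identity has been granted access to the workspace.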
Planning to shift from Import mode to DirectQuery (AWS RDS PostgreSQL data) to avoid refreshing datasets and to offer customers a portal with live data. Do you see any disadvantages/hiccups here?
Check the limitations of DirectQuery and also the performance of your data model. It should be optimised, otherwise you'll encounter a lot of problems.
Hi, where is the Service Principal video? I did not find it.
Here you go: What is Azure Service Principal? Why do we need it and how to create it? | Azure
ruclips.net/video/PkLrKDW9gY8/видео.html
Hi, very good video.
Is it possible to add an XMLA script with a parameter to process old partitions in a fact table?
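One way to sketch this without XMLA: the enhanced refresh variant of the Refresh Dataset In Group API accepts a request body listing specific tables/partitions, so old fact-table partitions can be targeted directly. A minimal sketch of building that body; the table and partition names are hypothetical:

```python
import json

def partition_refresh_body(table, partitions, refresh_type="full"):
    """Build an enhanced-refresh request body targeting specific partitions."""
    return {
        "type": refresh_type,
        "objects": [{"table": table, "partition": p} for p in partitions],
    }

# Hypothetical fact table with yearly partitions
body = partition_refresh_body("FactSales", ["FY2022", "FY2023"])

# POST this as JSON to:
# https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
print(json.dumps(body, indent=2))
```

Since the partition list is plain JSON, it can be parameterized in an ADF Web activity body, which makes the per-partition refresh easy to drive from pipeline parameters.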
Is it possible to run the pipeline dynamically or schedule the refresh, instead of a manual run?
Yes, it's possible, and this is what an event-driven data architecture is. As soon as the data load completes, this would run automatically.
Also, this can be parameterized.
This is very helpful, thank you.
Our datasets are fed by Dataflows. If we use the same principle but with the Refresh Dataflow API, how can we have the Refresh Dataset activity trigger only once the Dataflow refresh is complete?
Yes, you can use Dataflows in ADF pipelines.
Thank you for your question! I'm glad you found the video helpful.
To trigger a dataset refresh only after the Dataflow refresh is complete, follow these steps:
1. Trigger Dataflow Refresh: use the Refresh Dataflow API in an Azure Data Factory (ADF) Web activity:
POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/dataflows/{dataflowId}/refreshes
2. Monitor Refresh Status: poll the Dataflow refresh status using another Web activity:
GET https://api.powerbi.com/v1.0/myorg/groups/{groupId}/dataflows/{dataflowId}/refreshes
3. Conditional Execution: use the status check output to trigger the dataset refresh once the Dataflow is complete.
4. Trigger Dataset Refresh: use a Web activity to refresh the dataset:
POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes
Alternatively, you can use the Dataflow as an activity in your ADF pipeline.
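For readers scripting this outside ADF, the steps above can be sketched in Python. This is a sketch, not the exact API contract: it assumes the endpoints quoted above, a valid bearer token, and "Success"/"Failed" as the terminal status strings of a Dataflow refresh:

```python
import json
import time
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"

def newest_status(history):
    """The refresh history lists the newest transaction first."""
    return history["value"][0]["status"]

def get_refresh_history(group_id, dataflow_id, token):
    """Step 2: read the Dataflow's refresh history."""
    url = f"{API}/groups/{group_id}/dataflows/{dataflow_id}/refreshes"
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def refresh_dataset_after_dataflow(group_id, dataflow_id, dataset_id, token,
                                   poll_secs=30):
    """Steps 2-4: poll the Dataflow, then trigger the dataset refresh."""
    while True:
        status = newest_status(get_refresh_history(group_id, dataflow_id, token))
        if status == "Success":
            break
        if status == "Failed":
            raise RuntimeError("Dataflow refresh failed; skipping dataset refresh")
        time.sleep(poll_secs)  # still in progress; wait and poll again
    url = f"{API}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    req = urllib.request.Request(url, data=b"{}", method="POST",
                                 headers={"Authorization": f"Bearer {token}",
                                          "Content-Type": "application/json"})
    urllib.request.urlopen(req)
```

In ADF the same loop is usually modeled as an Until activity wrapping a GET Web activity plus a Wait activity, with an If Condition gating the final POST.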
@BIConsultingPro
Wow!! Thank you so much for such a quick and detailed response. I really appreciate your help!
Let's hope the execs appreciate getting their reports earlier in the morning too! :D
Thank you so much for this. It helped me trigger the automatic refresh of the Power BI dataset after my pipeline run. Now I want to send an automated email on whether the Power BI dataset refresh succeeded or failed. How do we do that?
Congratulations!! You can use an Azure Logic App and call it from your pipeline.
Thank you for the quick response :) However, will my Logic App be able to read the success status of the Power BI dataset refresh?
Because when I added the 'Power BI dataset refresh' activity at the end of my pipeline, it called for a dataset refresh, but then we don't know whether the refresh succeeded or failed, right?
@vidyakanmus9318 I'd suggest you try it first. If it fails, please do some research; if you don't find an answer, then write to us at connect@biconsultingpro.com with your findings, and we will help you.
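For what it's worth, the Logic App (or a Web activity just before it) can read the outcome from the dataset's refresh history endpoint. A minimal sketch, assuming the `GET .../refreshes` endpoint with its `$top` query parameter and "Completed"/"Failed" status values:

```python
import json
import urllib.request

def latest_dataset_refresh(group_id, dataset_id, token):
    """Fetch the most recent entry from the dataset's refresh history."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes?$top=1")
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"][0]

def email_subject(entry):
    """Turn a refresh-history entry into an email subject line."""
    status = entry["status"]  # "Completed", "Failed", or "Unknown" (in progress)
    if status == "Completed":
        return "Refresh succeeded"
    return f"Refresh {status.lower()}"
```

The Logic App can then branch on that status and send the success or failure email accordingly.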
Sure, thank you !! Appreciate your help :)
But how do we schedule this refresh automatically once the data load is completed? Can you please show us in a video?
Add a trigger to the pipeline.
Hi, I am getting the error {"error":{"code":"ItemNotFound","message":"Dataset is not found! Please verify datasetId is correct and user has sufficient permissions."}}. Am I missing something?
I get the same error, have you had any luck in fixing this?
How do I connect to a Power BI report and copy the data from one of its visuals to flat files?
If you just need to download data from any visual, simply click on the three dots and click Export data.
Hi, our Azure services and Power BI are in different tenants. Is it possible to refresh from ADF to Power BI if they are in different tenants?
Yes, it is possible to refresh data from Azure Data Factory (ADF) to Power BI, even if they are in different Azure tenants. Azure Data Factory supports cross-tenant data movement and data integration scenarios.
Can we use the same process with a service principal?
Yes, absolutely
Also, instead of allowing the SP to use Power BI APIs under Developer settings, can we enable "Allow SP to use read-only admin APIs" under Admin API settings?
Also, if my workspace isn't V2, won't it work? If not this way, is there any other way to refresh such a workspace from ADF?
This works on datasets but fails on dataflows.. not sure why..