This is great information. I used to get panic attacks when I heard the words "XMLA endpoint," but now I feel very confident about it. Thanks a ton for the simple explanation!
I’ve used the XMLA endpoint to migrate an on-prem SSAS model to Power BI Premium by deploying the existing Visual Studio solution to the “new server” (the Power BI workspace). In this project we also process partitions on the PBI dataset with SSIS through the XMLA endpoint. This is a game changer!
Very interesting; I tried it and the connection was apparently successful; the problem seems to be that I need to purchase "Premium capacity," which doesn't seem to be included in the trial.
@@karloisaac2597 Hi Karlo, tell me if I understood it right: you are trying to connect to an XMLA endpoint via SSIS with a trial of Power BI Premium Per User, but it doesn’t work even though the connection is valid?
@@marisolmarchegiani6235 Thanks for your answer. What I'm trying to do is to deploy an Analysis Services Tabular Project (developed in Visual Studio) into the XMLA Endpoint. I get to the point where the program seems to find the workspace in Power BI but I get an error message. I suspect that the reason is that I cannot set write support in the workspace.
I'll be honest, you just taught me something I didn't know about power bi! Great video!!
Also amazing: you can perform calculations in a PBIX, connect a dataflow to it (via the Analysis Services endpoint), and paste in DAX code. This allows you to create a "view" with DAX calculations in a PBIX and import an aggregated view into a "main" model.
The XMLA endpoint is a game changer. I am using it along with the Tabular Object Model (TOM) and C#. It is pretty exciting to see how much control we get, whether it is processing only a dimension (on-demand refresh of a dimension) or keeping a model from eviction :).
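For anyone curious what an on-demand refresh of a single dimension looks like on the wire, the command sent over the XMLA endpoint is a small TMSL (JSON) payload. This is a minimal sketch; the database and table names ("SalesModel", "DimDate") are placeholders for your own model:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "SalesModel",
        "table": "DimDate"
      }
    ]
  }
}
```

The same command can be run from an XMLA query window in SSMS, generated via TOM, or sent by any client that speaks XMLA.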
Love the Rocinante shirt! Glad to know you're an expanse geek as well! Great video, really informative!
You have a Fabric tenant named Tenant1 that contains a workspace named WS1. WS1 uses a capacity named C1 and contains a dataset named DS1.
You need to ensure read-write access to DS1 is available by using XMLA endpoint.
What should be modified first?
A. the DS1 settings
B. the WS1 settings
C. the C1 settings
D. the Tenant1 settings
Amazing! I had no idea about this capability, I just tested it. Thank you!
Thanks, Adam. Great video all around.
Worth mentioning that you can connect from Excel and read dataset tables into Excel tables (use power query option or AS (import)).
There are Excel size limits to consider but for some use cases this is a really cool feature, when you don't want a pivot table, but just a standard Excel table, and who doesn't love Excel right? ;-)
haha! LOVE it! I worked so hard on that endpoint joke. :-)
Hi Adam, thanks for sharing. I am using the XMLA endpoint to deploy changes to the model, but it still takes a lot of time to refresh the data. The whole point of using the XMLA endpoint is to avoid refreshing large datasets after small changes, is that right? Do you happen to know what might be causing the problem I am experiencing here?
Thanks for sharing. My question: if I change the XMLA endpoint in the Premium capacity settings from read-only to read/write, will I still need to make those changes in the tenant settings?
Can we use the XMLA endpoint in Azure Data Factory to get data and insert it into a data lake or database?
I like the ALM Toolkit, especially when several of us are developing on the same model.
Guys, I really appreciate your content, you are so cool!
As far as I understand, if we do a 'write' operation via XMLA, we lose the ability to download the PBIX file, just like with incremental refresh. Do you know when Microsoft will make this available, or is there any workaround? Otherwise it's a little inconvenient.
When you're using the XMLA endpoint PBIX files are exactly what you're trying to avoid...
@@joshash5944 can you explain what you mean?
@@tmjones212 What I mean is if you're using the XMLA endpoint to publish models, you develop them in 3rd party tools like Tabular Editor. So you won't be looking to download them to .pbix files after publishing for editing.
@@joshash5944, why should we choose Tabular Editor over Power BI Desktop to develop the model?
Love the T-Shirt
Hoping you can do a video teaching how to automate the refresh through the XMLA endpoint!
If you are familiar with SSIS and SQL Server Agent jobs that's one way to automate it.
That’s correct! With SSIS you just need to create a solution that connects to the XMLA endpoint and launches the JSON to refresh data (entire dataset, single table, or partition), and then create a job on a SQL Server that schedules the SSIS project. I am pretty sure you can also do this with Azure Data Factory, even though I haven’t tried it myself.
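As a rough illustration of the JSON mentioned above, a TMSL refresh command can target a single partition instead of the whole dataset. The names here ("SalesModel", "FactSales", "FactSales 2023") are hypothetical placeholders:

```json
{
  "refresh": {
    "type": "dataOnly",
    "objects": [
      {
        "database": "SalesModel",
        "table": "FactSales",
        "partition": "FactSales 2023"
      }
    ]
  }
}
```

Dropping the "partition" property refreshes the whole table; listing only the "database" refreshes the entire dataset.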
@@bladerunnerisback Thanks for the direction, I will look into that.
@@marisolmarchegiani6235 Nice feedback, I will try to do that!
This is exactly the issue I've been wrestling with for three days now! Thank you for asking the question.
Also, please let me know if you have success with this method/process!
Ha, excellent video. Please could you elaborate more on XMLA endpoint issues?
Hello, thank you a lot for your video, but I have a question: can I work with the dataset using Analysis Services tools (Visual Studio) after deploying the model to a Premium workspace?
Q: Would it be possible to copy a dataset into another database and automate that copy? Thank you.
We know that Premium Per User stops users with only a Pro licence from viewing anything created under PPU; for example, you cannot publish a Premium dataset/dataflow into a normal workspace and have Pro users view the content. But can you do the opposite? Have your primary dataset in a normal workspace, published by a Pro user; then your Premium user connects directly to the Pro dataset and publishes it into their Premium workspace. You would effectively have an XMLA endpoint on your Pro dataset without affecting the ability of your Pro users to view the report.
Turns out I found the answer to my question:
The following datasets aren't accessible by the XMLA endpoint. These datasets won't appear under the workspace in SSMS or in other tools:
Datasets based on a live connection to an Azure Analysis Services or SQL Server Analysis Services model.
Datasets based on a live connection to a Power BI dataset in another workspace. To learn more, see Intro to datasets across workspaces.
Datasets with Push data by using the REST API.
Excel workbook datasets.
It's pretty frustrating that XMLA endpoints aren't available on Pro. I can't justify 5k USD per month to replace the one Azure Analysis Services database I have, but I would love to, especially since Power BI has built-in connectivity to Delta Lake.
Does a Fabric Warehouse Dataset also have an XMLA endpoint?
Can we connect to the XMLA endpoint without admin access if we have a Premium licence?
If you made a change using the XMLA endpoint, you won't be able to download the dataset, is that right? Any thoughts?
Love the shirt!
Can you connect to a Power BI dataset / load a Power BI dataset into R using this?
Hi Patrick/Adam, can you tell me how we sum up time to show it on a cluster chart? For example, how do I sum up 1:20:00 + 2:50:00 = 04:10:00?
Would appreciate a reply here: ExecuteXMLA failed with Result. Why am I getting this error? Please help! Nothing will open.
9:30 AM CDT, not CET (that’s what the video text said)
Hi Adam, is there any way to automate this entire process?
Is it possible to see the DirectQuery being generated against the data source using SQL Server Profiler? I can see the DAX query being generated through SQL Profiler when connected to the XMLA endpoint.
Can we set timeouts across all the client tools when connecting to the XMLA endpoint, rather than setting connection timeouts in each client tool?
I want the ability to refresh my Power BI dataset as a last step of my SQL Server Stored Procedure. Can that be accomplished with xmla endpoints?
Not sure if this is helpful, but you can schedule the stored procedure execution using a SQL job and then set the scheduled refresh for the Power BI dataset some time after.
Short answer: yes. Two ways to do it: create an agent job that refreshes the dataset and call it from your SP using sp_start_job, or declare TMSL and run it against a linked server.
Maybe a dumb question: does this mean I can refresh every minute?
Nice - he is an Elite Dangerous player ;-)
Can I add a map from QGIS to power bi if it can? How?
Wow this would be great
So why is this better than using ssis to refresh cubes? I must be missing something 🤔
I also don't see how this is different than just connecting to the SSAS model itself
Awesome!!!
Dyson Sphere program and Elite FTW (well not that much after odyssey)
Where is XMLA endpoint?
I'll give you one better. What is XMLA endpoint?
I'll give you one better! Why is XMLA endpoint?
First
Hey guys. I’ve set up an XMLA endpoint to SSAS using msmdpump.dll and IIS, and then I use a Power BI gateway. This all works really well, but I find permissions quite limited. Any pointers on how to set up security so that access to the IIS site is restricted and the XMLA endpoint can only be accessed by specific users or credentials?