Very good video! I think I was the one who asked the question in the live stream. The only limitation or pain I've found so far is that the dataflows can't handle null values in the data, or they crash. I've only tested with blob storage so far, not Event Hub. Hopefully this is fixed in a later version.
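Not an official fix, just a workaround idea: scrub the nulls out before the payload ever lands in blob storage or Event Hub, so the dataflow never sees them. A minimal Python sketch, where the field names and defaults are made up for illustration:

```python
# Workaround sketch (not an official fix): replace nulls with sensible
# defaults before the rows reach blob storage / Event Hub, so the
# streaming dataflow never has to deal with them. Field names and
# default values here are purely illustrative.
import json

DEFAULTS = {"temperature": 0.0, "humidity": 0.0, "deviceId": "unknown"}

def scrub_nulls(reading: dict) -> dict:
    """Return a copy of the reading with None values replaced by defaults."""
    return {key: (DEFAULTS.get(key, "") if value is None else value)
            for key, value in reading.items()}

readings = [{"deviceId": "sensor-1", "temperature": None, "humidity": 41.2}]
payload = json.dumps([scrub_nulls(r) for r in readings])
print(payload)  # nulls are gone before the data is ingested
```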
Can anyone update this video with an alternative, e.g. Azure Stream Analytics, given the retirement of streaming datasets?
Great tutorial. The whole upstream of event hub/stream is new to me so this is great insight to understand the entire pipeline.
Mind blowing lol
Awesome stuff! Thanks, Patrick!
This is really great! Many use cases pop up already. Not really the standard PowerBI stuff though.
Can you post the PS script? Thanks!
This is excellent. Can you show us more on using Event Hub, and share the PowerShell file?
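Patrick's actual PowerShell file isn't posted here, but if you just want something to experiment with in the meantime, here's a rough Python stand-in that pushes a fake temperature reading into Event Hubs with the azure-eventhub package. The connection string and hub name are placeholders:

```python
# Rough stand-in for the video's PowerShell script (not Patrick's actual
# code): send a fake temperature reading to Azure Event Hubs every few
# seconds using the azure-eventhub package (pip install azure-eventhub).
# The connection string and hub name below are placeholders.
import json
import random
import time

from azure.eventhub import EventData, EventHubProducerClient

CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."  # placeholder
EVENTHUB_NAME = "temperature-readings"  # placeholder

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

try:
    while True:
        reading = {
            "deviceId": "sensor-1",
            "temperature": round(random.uniform(18.0, 28.0), 1),
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)
        time.sleep(5)  # one reading every five seconds
finally:
    producer.close()
```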
Thanks, it worked. Just what I needed, perfect timing!
The Art of the Possible ... Great Video Patrick
Hi Patrick,
great stuff! I was able to follow along in a PPU workspace, but I'm left with a problem: Power BI won't let me change it to DirectQuery. I have connected the hot path, but DQ is greyed out. The MS docs say the enhanced compute engine needs to be set to On for the dataflow, but that option in the service is also greyed out (just as it is for you in the video...).
Any hints on what I am doing wrong?
Awesome video Patrick, thanks a lot! It will be very nice to be able to use Power BI for IoT use cases instead of having to rely on IoT Central
Hey Patrick,
I have a Power BI report that contains one calculated column with a complex DAX expression.
Is there any way to read the Power BI report programmatically (not manually), fetch that DAX query, and translate it into a SQL query?
Then store that query in a table.
The source of the PBI report is SQL Server.
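I'm not aware of a built-in DAX-to-SQL translator, but you can at least pull the calculated column's DAX expression out programmatically and store it in a table; translating it to SQL is still a manual step. One hedged approach: save the report as a .pbit template (a zip archive whose DataModelSchema entry holds the model definition as JSON) and read the expression from there. A rough Python sketch, assuming that layout and a placeholder file name:

```python
# Rough sketch (not a DAX-to-SQL translator): extract calculated-column
# DAX expressions from a report saved as a .pbit template. A .pbit is a
# zip archive whose DataModelSchema entry is UTF-16-encoded JSON. The
# field names below reflect the usual schema layout; treat them as an
# assumption and adjust if your file differs.
import json
import zipfile

def calculated_columns(pbit_path: str):
    with zipfile.ZipFile(pbit_path) as archive:
        raw = archive.read("DataModelSchema")
    model = json.loads(raw.decode("utf-16"))["model"]
    for table in model.get("tables", []):
        for column in table.get("columns", []):
            if column.get("type") == "calculated":
                expr = column.get("expression", "")
                # multi-line expressions are stored as a list of strings
                if isinstance(expr, list):
                    expr = "\n".join(expr)
                yield table["name"], column["name"], expr

for table, column, dax in calculated_columns("MyReport.pbit"):  # placeholder path
    print(f"{table}[{column}] = {dax}")
    # storing this in SQL Server and rewriting the DAX as SQL is a
    # manual step from here
```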
Hi, this option is available, but creating a new one isn't available anymore. Any alternatives?
Hey Patrick which books are behind you?
Patrick, would this be a replacement for a Stream Analytics job? At least for the real-time streaming output. With only 7-day retention for cold data, you would still need a SA job to drop historical data into a data lake or DB, right? Great video!
Could you share the PowerShell script?
Nice shoutout to Hope! :)
Very good video Patrick, as always. I'm just wondering why the maximum period is 7 days... Is there a way to keep the data flowing indefinitely? It would be very interesting in a stock market scenario, for example.
Use Premium capacity, which allows 100 TB per workspace.
Thanks for taking the hit for us. I can't imagine the number of days it took to figure all this out. In the end, WAY TOO MUCH SETUP!!!! If a person has time to do all this, they can just go out and get the temps manually.
Nice! Goodbye Splunk dashboards!