These videos are even better than the live demos in my Azure Cert courses by Microsoft approved trainers 😁
Wow, thanks! Appreciated :)
That's a very crisp & clear video for Streaming Analytics in Azure.
Fantastic, thanks!
Adam, thank you so much for all your content. This is a fabulous e2e on IoT hubs and streaming. By far the best I have found; you explain things incredibly well. MS should be recruiting you to do all their public training content!
Wow, thanks! Glad you liked it.
Great video to show us this demo with multiple Azure service to do the IOT stream analytics! Thank you!
I am preparing for an exam and got sick of the MS learning materials; this is much better than their sources. I hope they structure all their tutorials like this for easy understanding.
Thank you :)
Really, you are amazing, Adam. Your videos are mind-blowing and very easy to understand. Even tough topics look so easy after I watch your videos, as your explanation and presentation are very clear.
I appreciate that!
You are a saviour. I have my final round of interviews and was given a similar task. Couldn't thank you enough for posting this! Thanks mate.
Hello Adam, your way of teaching is very good and your presentation enhances it even more. Whenever I need to study something related to Azure, I first check for a video on your channel. You are doing great work. Thanks
Very well demonstrated and step by step process. Keep up the good work 👍
Thank you! Cheers!
Kudos to your knowledge and presentation, and for making it available free of cost 🙏
Thanks a lot 😊
Yes, this was amazing. What we would normally take 2-3 hours to understand, we can easily pick up in 30 minutes. Nice demonstration.
Super helpful video. Excellent job Adam. Please do more advanced videos on this subject.
Regret not having used these materials for passing AZ-900. Thanks a lot for such wonderful content!
Excellent ! Thanks for helping in understanding the topics better and in a simple way.
Great to hear!
outstanding demo, Adam. Thanks for your easy teaching techniques to motivate towards technology learning.
My pleasure!
Wow, you just explained what I had been trying to understand for about 4 months!! Thanks
As always, very good presentation and explanation. Thank you Adam!
Glad you liked it! Thanks as always! :)
Thanks for the great video Adam
I always refer to your channel in my free time to explore new things.
Awesome, thank you!
This is just an amazing walk-through. Like all your content, deeply explained with practical tips. Learned a lot from this one, and it was a pleasure to watch. Thanks for sharing your knowledge.
Too Good - Thank You So Much For Posting these Awesome Videos Guru Ji - Respect from India.
Thanks Adam this video is very helpful, easier than the original one from MS. Thanks a lot
Glad it was helpful Gustavo!
Thanks a million, Adam! Very well explained. This is the best thing since sliced bread, plus Power BI at the click of a button. This is the nth time that I am watching this. Great content. Awesome. I couldn't find this explanation, simply put, anywhere else. “Great teachers are hard to find”. Grade: A++ 💥
Loved it, straight to the point. Learnt from this.
Hey Adam, your videos are really awesome and informative. Thank you for creating such resourceful videos.
My pleasure!
Great explanation and clear content. Coming from a Microsoft guy too ❤️
Much appreciated! Welcome aboard! :)
Thank you for creating such awesome and informative videos. I'm really grateful for them!
Great video. Wish it covered the accounts and roles needed to add inputs/outputs, in case the implementer is not an admin or owner of the input subscription.
This was very well explained. Thank you Adam.
Glad you enjoyed it!
Fantastic demo and explanations Adam! Kudos!
As always glad you enjoy it :)
Thanks for putting out amazing content, it's so easy to learn from your videos.
Man, you put a big smile on my face. Thanks!
This is a brilliant video for beginners.
Great to hear, thanks!
You always give us a good overview of each video's purpose. Congratulations :)
Thank you very much!
To be honest, I can understand this more easily than the Microsoft docs.
Awesome, thanks! :D
Best Azure videos ever, and they are free! Thank you a lot, my friend.
Glad you like them!
Excellent video. Thank you, Adam.
My pleasure!
Very good presentation on E2E scenarios. Thanks.
Thanks! :)
Good explanation, Adam. It helped me while I was looking for streaming analytics videos!! Thanks for making it.
My pleasure!
Very instructive video! Thank you!
This is simply great way of presenting. Thank you!
Glad you like it, thanks for stopping by :)
A very very nice tutorial. Thanks!
Thank you Adam. Again great tutorial :)
Really well explained! I'm sure this video took you a lot of time, and I thank you very much for it! It will help me with my Master's thesis.
Awesome! Thanks :)
Awesome! Explanations are clear and complete, thanks
Glad it helped!
Excellent video, friend. Thanks for sharing your knowledge. Greetings!
Thank you! Cheers!
Great end-to-end scenario, especially your IoT simulator! :D
However, I'm surprised that you didn't show the other windowing functions like session, sliding and hopping.
Thanks. It's always hard to choose which are the top features and which aren't. I mentioned windowing functions under additional features, but I wanted to show live dashboarding more.
Nice Presentation, Thank you!!
Nice explanation. I hope it still works the same now.
This was a GREAT demo!
My pleasure!
Congratulations! Great job!
Thank you so much 😀
Great video Adam! 😍 Do you have a similar video but incorporating Python? Thanks
Awesome Adam,stay blessed
Fantastic video Adam. many thanks for that.
Glad you enjoyed it
Hi Adam, can we use multiple devices with the same IoT Hub? I was facing an issue while using multiple devices in the same IoT hub. It picks up data from only one device and ignores the others.
Of course you can. It wouldn't be an IoT 'hub' if you couldn't connect multiple devices. You might have made a mistake somewhere else.
Thanks a lot, very nice presentation
This is amazing, Adam :)
Thank you! :)
Thanks for this, but it would be great if you could tell us how to deal with multiple sensors.
Well, multiple sensors just send their unique deviceId with the data, so just write the appropriate SQL queries as per your business logic.
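To make that concrete, here is a small Python sketch of the same per-device routing that a Stream Analytics `WHERE deviceId = '...'` clause performs (the device IDs and field names here are made-up examples, not from the video):

```python
# Sketch of per-device routing, mirroring what a set of
# "SELECT * INTO <output> FROM <input> WHERE deviceId = '...'"
# Stream Analytics statements does. Device IDs are hypothetical.

def route_by_device(events):
    """Group telemetry events into one bucket per deviceId."""
    outputs = {}
    for event in events:
        outputs.setdefault(event["deviceId"], []).append(event)
    return outputs

events = [
    {"deviceId": "sensor-1", "temperature": 21.5},
    {"deviceId": "sensor-2", "temperature": 19.0},
    {"deviceId": "sensor-1", "temperature": 22.1},
]

buckets = route_by_device(events)
# buckets["sensor-1"] holds 2 events, buckets["sensor-2"] holds 1
```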
Just great tutorial. Brilliant work!
Thank you! Cheers!
Very helpful. Best video I found. Thank you 🙂
You're welcome!
@@AdamMarczakYT The blob storage output asks for required fields (minimum rows, hours and minutes) not shown in your video. What values should I put there?
Great content Adam.
Very good explanation Adam, thanks a lot. How can testing be done on live Azure streaming data?
Hi there, I usually just test on some 'dev' instance, but I haven't found a good method/tool for that myself yet, as I'm not working much with streaming data. Thanks for stopping by! :)
This is very informative. I am using your videos to get familiar with different tools in Azure. Just a quick question about your first example (Blob as an output): will the data in the blob double, because you're selecting everything from iotinput and inserting it into bloboutput without any conditions?
Correct. With no aggregations/filters/selects you just make a dump of your stream data on the blob. Thanks for watching :)
Excellent video again! One question: can I connect Event Hub to SAP ECC, something like we connect Data Factory to an SAP table?
Hey Raphael, thanks! I'm not an expert on SAP ECC, but I understand that it supports AMQP, so potentially you can stream your data to Event Hubs. :)
Dude. You are fantastic
this is amazing. thank you.
Hi Adam! Great video, as always:) I have one question regarding adding new inputs and outputs. When you stop the streaming job to add a new input and output, you lose data, right? If so, one should take time to design the solution so that you do not lose data. On the other hand, there is no point in creating every output and input up front to safeguard against data loss, because of the additional cost. I was just thinking out loud here :P
Stream Analytics has the option to continue where it left off in the event hub, as long as the event hub still has those messages (configured via the retention period). :) For data loss, Event Hubs has a capture feature which by default dumps all the stream data into blob storage for long-term backup/retention/analytics.
Great demo!
Loved it!! Thanks a Lot 😊
Simply superb you rock!!
Thanks mate! :)
Hey Adam. Nice video. Could you please let me know how I can process (i.e. transform and aggregate) a huge source dataset and load it into a Synapse DB? The transformation is really complex, as there are multiple joins and transformations. The output of one table will be used as input to another, until all intermediate data is joined and loaded into the final aggregated reporting summary table.
Hi Adam, I know that Event Hub has a data retention period. Is it possible to fetch data from a time window using Stream Analytics queries?
Regards
Very useful . Excellent job
Thank you! Cheers!
Precise and excellent...
Thanks!
Nice video Adam. Good to learn Azure Stream Analytics. Is there any video for Synapse Analytics as well?
Thanks, not yet, but I have it in plans!
@@AdamMarczakYT Great. I am waiting for one !
Nice video, but I have some questions:
1. When we hit the refresh button the data is not actually refreshed, and only the first 50 messages are shown every time.
2. How about adding a second input (I have set up two Raspberry emulators that send messages) and a second output as well? The query does not work. Here is mine:

SELECT * INTO powerBi12 FROM raspinput WHERE deviceId = 'Raspberry Pi Web Client'

SELECT * INTO powerBi1 FROM raspinput WHERE deviceId = 'Raspberry Pi 2'

When I hit "Test query" only the first SELECT statement works. "raspinput" in the above query is an IoT Hub with two devices.
Thanks in advance
Excellent video well articulated
Thank you kindly!
Hello Adam, did you create two outputs with the same dataset and different table names? Did it work?
Not sure I understand, but a single source can output to multiple outputs, as shown in the video.
If the input is a JSON message, how will it be parsed/flattened by Stream Analytics when writing to blob, so it can be used in the Power BI service? In this scenario, does Stream Analytics support writing some Java code to flatten the JSON message?
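For what it's worth, Stream Analytics parses JSON input natively and supports JavaScript (not Java) user-defined functions for custom logic. Conceptually, flattening a nested message into flat column/value pairs for Power BI looks like this Python sketch (the field names are illustrative, not from the video):

```python
# Conceptual sketch of flattening a nested JSON message into flat
# column/value pairs, the shape a Power BI table expects. Stream
# Analytics itself parses JSON natively; custom logic like this would
# go in a JavaScript UDF. Field names are hypothetical.
import json

def flatten(obj, prefix=""):
    """Flatten nested dicts into a single-level dict with dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

message = json.loads('{"deviceId": "d1", "telemetry": {"temp": 21.5, "hum": 60}}')
row = flatten(message)
# row == {"deviceId": "d1", "telemetry.temp": 21.5, "telemetry.hum": 60}
```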
I have been watching a lot of your tutorials. You do a great job with your presentations. I have been looking for a specific playlist supporting DP-203 exam. I have not found one that I like yet. Are you planning on presenting one (I hope)?
Awesome video! But I have one doubt: what if someone wants to process stream data from various different IoT sensors at the same time (simultaneously)? Can a different output folder (with each sensor's stream data in Excel files) be created for each IoT sensor's stream data? Could you help me with this? Thank you so much!
Well, that's already a very specific scenario; as a data engineer you need to be able to put all the pieces together. To work with Excel files and streams, Azure Databricks might be a good choice.
Well depicted end-2-end journey! Btw your video editing is too good, what tool do you use?
Thanks! I use Visio for diagrams and Camtasia for recording :)
Thanks a lot for this video. Regarding the "select *..." query used in the video: is it fetching all the data that we already have in our blob every time, or does it fetch only the latest data?
It fetches what's currently available in the stream. Not blob.
@@AdamMarczakYT I know this; my question was whether it will fetch all data every time. Is it possible to fetch incremental data only?
You are just amazing :)
Awesome video, cheers
Hi Adam, your tutorial is absolutely amazing; it helped me finish my Hons project. But I am really struggling with something not related to this, though it is within the data. I have integrated my data from Pybytes, but all my sensor readings come in as signals, and I want to sort these readings. I have three different readings and it won't let me show them organised, so I am struggling with that bit. Is there a way to map these signals coming from Pybytes to the reading names? My three readings are Temp, Acce and Humidity.
Dear Adam, how can I ingest data into IoT Hubs or Event Hubs with an API key? I am studying CoinMarketCap data, but I could not transfer the data to Event Hubs.
Spinning up a "Stream Analytics job" per query does not seem feasible. I wish there was a product where I can have several standing queries that can be maintained via SDK. Putting multiple queries in a single query window is hard to manage.
I agree, although it is how it is.
Maybe try different streaming technology that is more flexible and matches your needs. For example maybe use Azure Databricks?
@@AdamMarczakYT I will take a look at Databricks. I wonder if Azure Synapse Analytics is an option if the storage is Cosmos DB.
At 6:12 you say to replace the string, but replace it with what?
Love You brother , what a demo
Cheers!
At 4:03, in the 2nd tumbling window, the output value should be 1, right?
Can anyone please explain to me why we are getting 2 there?
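Without the exact timestamps from the video it's hard to say for certain, but a tumbling window assigns every event to exactly one fixed-size, non-overlapping interval, so an event landing just inside the second window makes its count 2. A simplified Python sketch of tumbling-window counting (the timestamps and window size below are made-up, and Stream Analytics' exact boundary semantics differ slightly):

```python
# Simplified sketch of tumbling-window counting: each event falls into
# exactly one non-overlapping, fixed-size window based on its timestamp.
# Timestamps and window size are illustrative.
from collections import Counter

def tumbling_counts(timestamps, window_seconds):
    """Count events per tumbling window; window N covers
    [N * window_seconds, (N + 1) * window_seconds)."""
    return Counter(int(ts // window_seconds) for ts in timestamps)

# Events at 1s, 4s, 11s and 13s with a 10-second tumbling window:
counts = tumbling_counts([1, 4, 11, 13], 10)
# window 0 -> 2 events, window 1 -> 2 events
```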
Amazing content. Thanks Adam
Appreciate it!
Hi, thanks. This is an amazing video for beginners like us. One query: what needs to change in the Python program if we use a real sensor, such as an ADXL345 with a Raspberry Pi, instead of the simulator? Kindly revert soon, or let me know if anyone else can answer.
Thanks. If you use your own device, then check out the Microsoft docs for IoT Hub or Event Hub and write code for that device to send telemetry data. Here is an example in C: docs.microsoft.com/en-us/azure/iot-hub/quickstart-send-telemetry-c?WT.mc_id=AZ-MVP-5003556 but you need to check the tutorials for your own device and supported languages.
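As a rough sketch in Python (using the `azure-iot-device` SDK; the connection string, device ID and sensor-reading values below are placeholders, not from the video), sending a real sensor's reading boils down to building a JSON payload and handing it to the device client:

```python
# Hedged sketch: sending one telemetry reading to IoT Hub with the
# azure-iot-device Python SDK. The device ID and readings below are
# placeholders -- swap in your device's connection string and your
# actual sensor-reading code (e.g. an ADXL345 driver).
import json

def build_payload(device_id, reading):
    """Wrap a raw sensor reading in the JSON shape the job expects."""
    return json.dumps({"deviceId": device_id, **reading})

payload = build_payload("raspberry-1", {"temperature": 21.5, "humidity": 60})

# Actual send (requires a real device connection string):
# from azure.iot.device import IoTHubDeviceClient, Message
# client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
# client.send_message(Message(payload))
```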
Nice
Thanks
Brilliant! Subbed!
The blob storage output asks for required fields (minimum rows, hours and minutes) not shown in your video. What values should I put there?
This is so that Stream Analytics knows when to stop writing to file X, create file X+1 and start writing there, so you won't end up with petabyte-sized files. You should put whatever values you like, depending on your future needs.
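The idea behind those settings can be sketched in a few lines of Python (the row threshold here is a made-up example, not the exact Stream Analytics implementation): the writer rotates to a new file once the current one hits a limit.

```python
# Illustrative sketch of row-based file rollover, the same idea behind
# the blob output's minimum rows / maximum time settings. Thresholds
# and file names are hypothetical.

class RollingWriter:
    def __init__(self, max_rows):
        self.max_rows = max_rows
        self.file_index = 0
        self.rows_in_file = 0

    def write(self, row):
        # Start a new file once the current one reaches the row limit.
        if self.rows_in_file >= self.max_rows:
            self.file_index += 1
            self.rows_in_file = 0
        self.rows_in_file += 1
        return f"output-{self.file_index}.json"

writer = RollingWriter(max_rows=2)
files = [writer.write(r) for r in range(5)]
# Five rows with a 2-row limit land in three files:
# output-0, output-0, output-1, output-1, output-2
```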
Got it. Thank you @@AdamMarczakYT
Any video on azure anomaly detector for live streaming data on azure ?
Not yet :(
Won't your Power BI dataset explode in size after a while?
PBI streaming datasets have row limits and also only store a certain amount of data; feel free to check this article to understand the limitations: docs.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming?WT.mc_id=AZ-MVP-5003556#streaming-dataset-matrix In the case of this demo, 200,000 rows are stored: docs.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming?WT.mc_id=AZ-MVP-5003556#using-azure-stream-analytics-to-push-data
Again a nice, productive video!
How can we make Azure Blob Storage or Azure SQL Server an "input" in Azure Stream Analytics?
If you have made videos on either scenario then please share the links, or could you please make a video on that too?
Cheers
Not sure I understand the question. Just go to inputs and add a blob input. There are many technologies which output stream data to blob. Unfortunately I don't have a video on that.
@@AdamMarczakYT Your videos are good, but I am not able to get the output connection working... I mean everything is running fine, but I can't see any output in blob... the blob is empty... 😟
Hi Adam, I was following your video, and after 10:02 in your video I got an error while creating the Azure streaming job and was unable to create inputs: "Gateway did not receive a response from Microsoft.StreamAnalytics within the specified time period." Please advise.
Unfortunately I can't help out with that :( Maybe try starting the quick demo over. Maybe you missed a step, or maybe it was a temporary issue?
Hey, nice video. I have a problem though. I'm using 7 Power BI outputs, and the query and everything seems to work; the connection tests are all successful. But whatever I do, I'm not seeing the datasets in Power BI. Do you know what could be wrong? I'm sending the data from a Raspberry Pi once and then sending it with the query to the outputs. Should I be streaming data constantly from the Pi to be able to see it in Power BI? Or what else could be the problem? Thank you in advance.
Hard to say. In general, when I make a mistake I try to start over, as finding that little error usually takes longer than setting everything up from scratch. Remember that you will only get data for the time your job was running, so you need to start the job and then send events, or select an earlier processing date. Also make sure you have your own consumer group set up, so there won't be any race conditions. Thanks for watching!