Nice video! What was that anti-VS Code comment about?
Hi Steve, is there any way to allow the chart to show historical data after refreshing the website? Upon refreshing the website, the data on the chart resets as well. Thank you
Yes, but it is not trivial. You would need to store the data in a repository like S3 or DynamoDB. The easiest way is to use DynamoDB as an IoT data repository, populated by a separate Rule in IoT Core. Then you can use Lambda to fetch the data from DynamoDB via a query issued from a GET request to the Lambda function URL from your web code. If you wanted to pass query parameters from your web code for specific search criteria, you would also need API Gateway to interface with your Lambda function.
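A minimal sketch of what that fetch Lambda could look like. The table name, key names, and query-string parameters (`deviceId`, `since`) are assumptions for illustration, not anything from the video:

```python
import json


def build_query(device_id, start_ts):
    """Build a DynamoDB Query request for historical sensor readings.
    Assumes a table keyed on deviceId (partition) and ts (sort)."""
    return {
        "TableName": "SensorData",
        "KeyConditionExpression": "deviceId = :d AND ts >= :t",
        "ExpressionAttributeValues": {
            ":d": {"S": device_id},
            ":t": {"N": str(start_ts)},
        },
        "ScanIndexForward": True,  # oldest first, so the chart draws left to right
    }


def lambda_handler(event, context):
    """Hypothetical Lambda behind a function URL (or API Gateway GET).
    Reads search criteria from the request's query-string parameters."""
    params = event.get("queryStringParameters") or {}
    query = build_query(params.get("deviceId", "esp32-01"),
                        int(params.get("since", 0)))
    # In a real deployment you would run:
    #   items = boto3.client("dynamodb").query(**query)["Items"]
    # and return those items; here we echo the query for illustration.
    return {"statusCode": 200, "body": json.dumps(query)}
```

Your web code would then call this endpoint on page load and redraw the chart from the returned history instead of starting empty.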
Is there a way to send JSON data packets with specific names and under specific folders? For example, if I have 3 unique folders for 3 different users, I want to send user1's data to their folder, user2's to theirs, etc.
I also want to send two different types of data: for example, one coming from a temp sensor, the other from a soil moisture sensor. Is there a way to put and read each data packet with a name + timestamp?
User1:
TempSensor readings --> s3Bucket/user1folder/tempsensorReading+timestamp
Soil Moisture --> s3Bucket/user1folder/SoilSensorReading+timestamp
User2:
TempSensor readings --> s3Bucket/user2folder/tempsensorReading+timestamp
Soil Moisture --> s3Bucket/user2folder/SoilSensorReading+timestamp
Etc
You can send IoT data to AWS IoT Core with different topic identifiers, or have the user identify themselves in the JSON IoT payload as a key-value pair. To distribute those payloads, you would need to check that the topic or payload matches the key-value pair, either in the Rule's query statement or in a Lambda function rule action linked to IoT Core. From there the data can easily be dispatched to a folder or any other service you like.
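As a sketch of the Lambda rule-action route: the function below builds an S3 object key matching the naming scheme in the question, assuming the payload carries `user` and `sensor` fields (those field names are my assumption, not part of the video):

```python
from datetime import datetime, timezone


def object_key(user, sensor, when=None):
    """Build an S3 key like 'user1folder/TempSensorReading-<timestamp>',
    mirroring the folder/name+timestamp scheme from the question."""
    when = when or datetime.now(timezone.utc)
    ts = when.strftime("%Y%m%dT%H%M%SZ")
    return f"{user}folder/{sensor}Reading-{ts}"


def lambda_handler(event, context):
    """Hypothetical rule-action Lambda: IoT Core invokes it with the JSON
    payload, and it routes the reading to the right per-user S3 prefix."""
    key = object_key(event["user"], event["sensor"])
    # In a real deployment you would write the object:
    #   boto3.client("s3").put_object(Bucket="s3Bucket", Key=key,
    #                                 Body=json.dumps(event))
    return key
```

Alternatively, an S3 rule action in IoT Core can build a similar key directly with substitution templates in its key field, with no Lambda at all.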
Hey Steve, I've been looking for this kind of video for a long time; first of all, thank you for uploading it. Is it possible to save the same kind of data in pure JSON format? I tried DynamoDB, but the format I got was nested JSON. Can you please suggest a good AWS data storage service that stores data in pure JSON format?
Hi, DynamoDB is never going to return pure JSON without some ETL. This isn't a huge deal; just use a Lambda function to convert any service output you have into plain JSON. A more costly and complicated approach is to use an AWS Glue catalog for schema discovery and write a script to translate the data back to S3. What is your intention for your design?
Interesting
Hi Steve,
How can we send data from sensors in real time to the cloud using these AWS services?
The traditional way is to write server-side code, e.g. hosting an MQTT broker and a database application on a Linux instance in AWS EC2. However, I show you how to do it using AWS serverless services with WebSockets later in this lecture series. Propagation varies depending on your distance from the AWS region's datacenter, but I am getting about four-fifths of a second of latency from the west coast to us-east-1.
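On the device side, the publish step amounts to sending a small JSON payload to an MQTT topic on your IoT Core endpoint. A sketch of the payload shape (the field names and topic are my assumptions for illustration):

```python
import json
import time


def make_payload(device_id, temp_c):
    """Shape of the JSON payload a sensor client might publish to an
    AWS IoT Core MQTT topic. Field names are illustrative assumptions."""
    return json.dumps({
        "deviceId": device_id,
        "tempC": temp_c,
        "ts": int(time.time()),  # epoch seconds, for ordering on the chart
    })


# With a real client (e.g. the AWS IoT Device SDK or paho-mqtt) you would
# connect over TLS with your device certificate to your IoT endpoint, then:
#   client.publish("sensors/esp32-01/temp", make_payload("esp32-01", 21.5))
```

From there, an IoT Core Rule matching that topic forwards each message on to DynamoDB, S3, or a Lambda as discussed above.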