Can you make a video on credit card fraud detection and the entire transformation: capturing streaming data and historic data, uploading to S3, and prediction in SageMaker?
I learned a lot of real-time use cases of AWS services from your channel. Thanks for your efforts, and please keep going, Felix.
Glad that u found my tutorials helpful!!
One thing that would improve the content is explaining each concept the way VS Code does when you hover over something — i.e., say "string" and then give a small summary of it, because not everyone will know these things.
Glad I found you. I've been looking for some walk-thru projects. Keep em coming.
Code for producer:

const AWS = require('aws-sdk');

AWS.config.update({
    region: 'us-east-1'
});

const s3 = new AWS.S3();
const kinesis = new AWS.Kinesis();

exports.handler = async (event) => {
    console.log(JSON.stringify(event));

    // the s3 trigger event tells us which bucket and file fired this lambda
    const bucketName = event.Records[0].s3.bucket.name;
    const keyName = event.Records[0].s3.object.key;

    const params = {
        Bucket: bucketName,
        Key: keyName
    };

    // read the uploaded file from s3, then forward its contents to kinesis
    await s3.getObject(params).promise().then(async (data) => {
        const dataString = data.Body.toString();
        const payload = {
            data: dataString
        };
        await sendToKinesis(payload, keyName);
    }, error => {
        console.error(error);
    });
};

async function sendToKinesis(payload, partitionKey) {
    const params = {
        Data: JSON.stringify(payload),
        PartitionKey: partitionKey,
        StreamName: 'mydatastream'
    };

    await kinesis.putRecord(params).promise().then(response => {
        console.log(response);
    }, error => {
        console.error(error);
    });
}
Code for consumer:

exports.handler = async (event) => {
    console.log(JSON.stringify(event));

    for (const record of event.Records) {
        // kinesis delivers the payload base64-encoded
        const data = JSON.parse(Buffer.from(record.kinesis.data, 'base64').toString());
        // e.g., send emails to clients, publish the data to social media
        console.log('consumer #2', data);
    }
};
perfect man :)
This is a great tutorial. Previously I just had a basic idea about AWS stuff; with this tutorial I learned the Kinesis data streaming part very easily, which will help me with my upcoming interviews. Thanks a lot, Felix :)
You are welcome!! Best of luck with ur upcoming interview :)
thank you for writing the code step by step and explaining it, it's really helpful for beginners :)
Glad that u found it helpful :)
Hi Felix Yu, your videos are clean, precise, and very informative. Can you make a video on Kafka?
Great stuff and awesome explanation. Thoroughly enjoyed.
Glad that it helped 👍
Thank you Felix !
I'm glad that I found you; your videos really help me and many like me. My request would be AWS PySpark with big data pipelines, and AWS DynamoDB. Please consider this, and thanks in advance. Much appreciation!
Can we directly send the data from lambda functions to consumers? If yes, what are the cons?
You saved my Day! Thanks a lot Felix.
np..Glad that it helped :)
Thanks a lot for another great tutorial!!
Glad you liked it!
Hi Felix, in a real-time system, is it common to get data into S3 first and then produce to a Kinesis data stream using Lambda? Can you please make a video on the different types of producers and which one to choose? I want to know how to decide on the producer when designing a real-time system with, say, millisecond latency, and also for near real time with latency of, say, minutes to an hour. Thanks
This is great! Thanks so much, Felix! I was looking for a simple tutorial that scratches the basics of all these fragments in AWS, and also in JavaScript - so this was spot on.
I'm building a service that posts messages to a websocket handler (Lambda function), and I want to save the information coming from the websocket (basically health data from an IoT device). From there, AWS Kinesis seems like a good idea to collect all the incoming data from these devices, and then consume the data into, e.g., DynamoDB.
BTW: Are you open for freelance work with small AWS-based projects?
Thanks man!! I appreciate the offer but I don’t think I have time for freelance work atm!!
I am a great fan of your AWS videos. Kindly help me understand how one lambda calls another lambda and also updates 2 different databases — like a product is purchased by a customer and the product inventory is also updated. In Node.js please 😊
u have to like all my videos before i answer this question......just joking lol
there are 3 ways u can enable one lambda to call another lambda:
1. use a "middle man" like kinesis or sqs. lambda #1 constructs a data object and sends it to kinesis/sqs, which then triggers lambda #2. lambda #2 then extracts the data object and does whatever u want with it
2. attach an alb (application load balancer) to lambda #2 so it acts as an API, and then lambda #1 calls the alb to invoke lambda #2
3. lambda #1 calls lambda #2 directly using the invoke method. here is a reference for that (www.sqlshack.com/calling-an-aws-lambda-function-from-another-lambda-function)
i personally like the first approach the most but all 3 would work!!
for ur second question, u can just perform 2 dynamo requests (one per table). just change the dynamoTableName in the request params (e.g., github.com/felixyu9/serverless-api-tutorial-code/blob/main/index.js#L80-L83)
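For option #3, a minimal sketch of the direct invoke (aws-sdk v2, same SDK as the tutorial code; the function name and payload shape here are made-up placeholders, not from the video):

```javascript
// Build the params for Lambda's invoke call (option #3 above).
function buildInvokeParams(functionName, payload, fireAndForget = true) {
    return {
        FunctionName: functionName,                                  // placeholder name of lambda #2
        InvocationType: fireAndForget ? 'Event' : 'RequestResponse', // 'Event' = async, 'RequestResponse' = wait for the result
        Payload: JSON.stringify(payload)                             // the data object from lambda #1
    };
}

// In lambda #1 the actual call would then look roughly like:
//   const AWS = require('aws-sdk');
//   const lambda = new AWS.Lambda();
//   await lambda.invoke(buildInvokeParams('lambda-2', { orderId: 123 })).promise();
```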
@@FelixYu thanks a lot...I have successfully implemented it. Actually I was trying to implement a ticket reservation system. While booking the ticket, it also subtracts the seat from the main table and keeps the booked seat number in the users table...
Thanks again. I only have 1 month of experience in AWS.
Nicee..glad that u figured it out and good luck with the aws exploration 👍
Can you share the difference between using apache kafka and kinesis?
💯
Nice video, thanks for the detailed explanation
Glad that it’s helpful!!
very clear explanation...please try to do lambda functions in python too
Thank you!!
Thanks for your content. It is really helpful.
Nice content Felix : )
love to see more of these projects.
it helps learn a lot : )
Thank you thank you :)
good quality video, keep up with the good work!
Thank you!! :)
great demo!!
Glad that u found it helpful 👍
Thank you for this video. Very helpful 👍
Glad that it’s helpful!!
Is there a NodeJS function that connects to a remote JMS ConnectionFactory endpoint for use inside the vpc?
Hi Felix, I need a producer for clickstream data — what changes do we need in the lambda code? Please correct me if I am wrong.
awesome sir
Could you also use step functions and put it all in one?
Good work
Good stuff Felix. Keep going.
Thank you thank you :)
thank you
Glad that u found it helpful
Hi Felix, I have a question: can we modify the data received from the S3 bucket or API Gateway, and then save it to DynamoDB using a Lambda function?
yes u certainly could..for example, in the consumer 1 lambda, u can read in the data as json/strings/numbers, make changes to it to make it different (e.g., numeric calculation, string concatenation, etc.), and then save this new data to dynamodb
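A small sketch of that idea, assuming a hypothetical record shape with firstName/lastName/amount fields (not from the video):

```javascript
// Example transform in a consumer lambda: string concatenation + numeric calculation.
function transformRecord(record) {
    return {
        fullName: record.firstName + ' ' + record.lastName, // string concatenation
        totalWithTax: record.amount * 1.1                   // numeric calculation (made-up 10% tax rate)
    };
}

// The result could then be saved with the DocumentClient (aws-sdk v2, table name is a placeholder):
//   const dynamo = new AWS.DynamoDB.DocumentClient();
//   await dynamo.put({ TableName: 'my-table', Item: transformRecord(data) }).promise();
```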
Hi Felix, it is a nice and clear video. One question: can this be applied to other types of source data, such as pdf or jpg? Can I add AWS Textract in between to read the text from a document or image rather than just a text file?
yes..in 5:17 of the video, if you leave the Suffix field blank, u can upload any file types and it will trigger the producer lambda
@@FelixYu Yes, it will trigger the producer lambda. Could you show something like using Textract to convert other formats of source data into csv?
Can you do one video on Athena, Glue, and EMR
thanks!
glad that u found it helpful 👍
thanks
👍
I am getting errors like "Cannot read property '0' of undefined" and "event.Records is not iterable" — can you rectify the error?
where does CloudFront come into the picture?
Can you please make a video with Python instead of Node.js? Though the content was quite explanatory. Thanks, Felix, for the awesome content.
I can write the code in python and then push it to GitHub when I get a chance. All other setups will be the same!!
My use case is Amazon Connect data -> kinesis data stream -> delivery stream lambda -> se — can you help me with the lambda here
Hi Felix, what if I have S3 in one AWS account and Kinesis in another, and I have added a cross-account policy on the lambda as well. What should I do in the lambda to send the data to Kinesis in the other AWS account?
if u have the producer lambda in account A and wanna write data to a kinesis stream in account B, u will needa configure 2 roles.
1. in account B, create a role that has write access to the kinesis stream (e.g., PutRecord, PutRecords, etc.). let's call it kinesis-role-in-account-B.
2. in account A, create a role (let's call it lambda-role-in-account-A) and use this role to assume the kinesis-role-in-account-B created in step #1. attach this role to the lambda and it will be able to send data to kinesis in account B!!
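A rough sketch of that assume-role step, using the role names from the reply above (the ARN and session name are placeholders); only the params builder is real code here, the SDK calls are sketched in comments:

```javascript
// Params for sts.assumeRole: lambda-role-in-account-A assumes kinesis-role-in-account-B.
function buildAssumeRoleParams(roleArn) {
    return {
        RoleArn: roleArn,                         // ARN of kinesis-role-in-account-B
        RoleSessionName: 'cross-account-kinesis'  // any label for this temporary session
    };
}

// With aws-sdk v2, the flow in the producer lambda would look roughly like:
//   const sts = new AWS.STS();
//   const { Credentials } = await sts.assumeRole(
//       buildAssumeRoleParams('arn:aws:iam::<account-B-id>:role/kinesis-role-in-account-B')
//   ).promise();
//   const kinesis = new AWS.Kinesis({
//       accessKeyId: Credentials.AccessKeyId,
//       secretAccessKey: Credentials.SecretAccessKey,
//       sessionToken: Credentials.SessionToken
//   });
//   // then call kinesis.putRecord against the account-B stream as usual
```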
this is the same as streaming files from s3, right?
I am getting the error "stream/kinesis-stream because no identity-based policy allows the kinesis:PutRecord action" while posting data to the kinesis stream
Make sure u have the kinesis policy for ur iam role (1:55 of the video)
nice, hi sir, can these lambda functions be useful for video and audio files also?
i think so. u just needa find a lib that handles video files and add the logic in the lambda code
Hi Felix, love your video — is it possible to use Python code in lambda? Thanks
Yea u can write the lambda in python as well
Please advise where we can find the source code for this example
sorry i dont think i keep the code after i uploaded the video :(( try to pause the video and copy the code that way. let me know if u run into any problems!!
Felix loving your channel, great content! Can we do more with Lambda functions? They seem to be an important link to many other services in AWS. For these 2 consumers maybe send an email with 1 and save to dynamo with the other?
tyty 😄 and yessir, we can certainly do more with lambda functions!! lambda has gained a lot of popularity in recent years becuz it is lightweight, easy to set up and great for horizontal scaling.
u just reminded me that i already have videos for sending emails in lambda and saving data to a dynamo database:
send emails in lambda: ruclips.net/video/mL-4PeuAuWc/видео.html
save data to database: ruclips.net/video/Ut5CkSz6NR0/видео.html
Hello Felix, thanks for the content. Is it possible to use a JSON file (source in JSON format) from S3 to Kinesis?
yes, just change txt to json in the s3 trigger suffix 5:18 of the video and u can upload json files as the data source (e.g., test_file.json) (or just leave that field blank so it will take all file types). let me know if that works
@@FelixYu Hi Felix, thanks for the quick response. I have tried changing the suffix of the file in S3, but this does not work either.
@@kavyap3184 gotcha....lets try a few things here.
1. when u have the txt suffix set up and upload a txt file (the exact setup i have in the video), does everything work fine?
2. after u change txt to json and upload a json file, does it trigger the producer lambda? if not, it might be a permission issue.
3. if it triggers the producer lambda, lets console.log out dataString and see what it looks like (9:29 of the video)
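For step 3, a tiny sketch of what that logging/parsing could look like in the producer (pure Node; the argument is the Buffer that s3.getObject returns in data.Body):

```javascript
// Log the raw file contents and try to parse them as JSON.
function inspectBody(body) {
    const dataString = body.toString();
    console.log('dataString:', dataString);
    try {
        return JSON.parse(dataString); // succeeds if the uploaded file is valid JSON
    } catch (e) {
        console.error('uploaded file is not valid JSON:', e.message);
        return null;
    }
}
```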
@@FelixYu Thanks for the suggestion. It turns out my file is too big to process. Also, can we see the records in the AWS data stream? Because the data stream's monitoring shows no data even though I see the data in the console.
Yea if ur file is big, u needa increase the producer lambda timeout and memory....and I don’t think u can look at the data from aws console. Under monitoring u can see metrics like incoming data in bytes, get record counts, etc
I want to stream to an event bus, but I couldn't find a way yet...
Hi, is it possible to generate a real-time drill rig data stream by AWS Kinesis?
Yes, u would have to build a data producer that processes the drill rig data and then sends it to kinesis thou
@@FelixYu Thanks! Is it possible to build a data producer via AWS?
@@subhamchakraborty6822 i guess a better question is - are u able to connect aws with ur drill rig system cuz idk how u have ur drill rig system set up and where u store the data
@@FelixYu I am planning to build the drill rig via MATLAB Simulink. But I don't know it's possible to connect the system with AWS or not.
@@subhamchakraborty6822 the easiest way i can think of is that..after matlab saves the data to ur machine, u can write a script/cron job/program to upload the data to aws s3 and have that trigger a lambda function to send the data to kinesis
Can i get the same code in python pls?
I am getting an error like "Cannot read property '0' of undefined"
its prob caused by a typo....make sure it is event.Records[0]
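That error can also show up when the lambda is invoked with an event that has no Records array (e.g., a blank console test event). A small defensive sketch, pure Node:

```javascript
// Guard before reading event.Records[0] in the producer handler.
function firstS3Record(event) {
    if (!event || !Array.isArray(event.Records) || event.Records.length === 0) {
        return null; // no s3 trigger info in this event
    }
    return event.Records[0];
}
```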
Hi need the producer code can you pls help
Link for the code ?
The video was very easy to understand. Would you share your mail id? I have some queries related to the IoT service
Please upload the code in Python here Sir...
Oh nvm lol
there's an error that says "The specific log group: /aws/lambda/consumer1 does not exist in this account or region."
got "The specific log group: /aws/lambda/Producer does not exist in this account or region." errors. can't see the triggered logs
two things to check here:
1. make sure the cloudwatch console u are viewing in the same region as the lambda function
2. make sure the iam role the lambda is using has the CloudWatchLogsFullAccess policy attached to it
Consumer 1 - send to Elasticsearch