If there is bulk data in the S3 file that may take more than 15 minutes to process, how will Lambda handle such a bulk data file?
Thank you for watching my videos.
When the file is too big to process within 15 minutes, it is recommended to use Step Functions in that case. It would be an extended customisation of this solution.
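One common pattern here is to have Step Functions fan the file out across several short Lambda invocations, e.g. via a Map state. A minimal sketch of the chunk-planning step (the function name and chunk sizes are illustrative, not from the video):

```python
# Hypothetical helper: split a large CSV's row range into chunks small enough
# for a single Lambda invocation each. A Step Functions Map state could then
# run one Lambda per (start, end) pair in parallel.

def plan_chunks(total_rows: int, rows_per_invocation: int):
    """Return (start, end) row ranges, end exclusive, covering all rows."""
    return [
        (start, min(start + rows_per_invocation, total_rows))
        for start in range(0, total_rows, rows_per_invocation)
    ]

# e.g. 1,000,000 rows split into 250,000-row slices -> 4 invocations
chunks = plan_chunks(1_000_000, 250_000)
```

Each invocation then only needs to read and insert its own slice, which keeps every run well under the 15-minute limit.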
Do you have any video on real-time streaming data ingestion (S3 → Kafka → Glue Streaming → JDBC sink) and "Aurora + ElastiCache"?
Thank you for watching my videos.
Unfortunately I have not yet created a video on this.
But again, thank you for sharing this ask; I will work on it and share it soon.
Stay tuned, keep learning.
Great content!
Thank you for watching my videos.
And thank you very much for the encouraging words.
Hi, if we want to pass any number of tables in CSV file format and we don't want to create the INSERT query, is it possible to import the data? Do you have any idea or a video on that?
Thank you for watching my videos.
Did you check my video covering a similar requirement: ruclips.net/video/03isHyT8kNQ/видео.html
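For loading a CSV without hand-writing INSERT statements, one option is PostgreSQL's COPY command through psycopg2's copy_expert. This is a sketch under that assumption (the table name and file handle are placeholders, not code from the video):

```python
def build_copy_sql(tablename: str) -> str:
    # COPY ... FROM STDIN streams CSV rows straight into the table,
    # so no per-row INSERT statement has to be generated.
    return f"COPY {tablename} FROM STDIN WITH (FORMAT csv, HEADER true)"

def load_csv(connection, tablename: str, csv_file) -> None:
    # csv_file is any readable file-like object holding the CSV data,
    # e.g. the file downloaded from S3 opened in text mode.
    with connection.cursor() as cur:
        cur.copy_expert(build_copy_sql(tablename), csv_file)
    connection.commit()
```

The same two functions work for any number of tables, as long as each CSV's columns match its target table.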
Now I am getting [ERROR] OperationalError: could not connect to server: Connection timed out. on the connection = psycopg2.connect(dbname = dbname, line. I have allowed all traffic with public access to RDS, but it gets stuck after print("donwloaded successfully....").
Thank you for watching my videos.
Please check the following things on your side:
1. You have created the Lambda with the Python 3.7 runtime.
2. You have the following environment variables:
dbname = os.getenv('dbname')
host = os.getenv('host')
user = os.getenv('user')
password = os.getenv('password')
tablename = os.getenv('tablename')
3. The Lambda layer is created and attached to the Lambda.
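To separate a configuration mistake from a genuine networking problem, it can help to validate the environment variables before connecting and to cap the connection wait. This is a sketch, not the code from the video; connect_with_checks is a hypothetical helper and connect_timeout=5 is an illustrative value:

```python
import os

REQUIRED = ("dbname", "host", "user", "password", "tablename")

def read_db_config():
    """Collect the expected Lambda environment variables and list any missing."""
    cfg = {name: os.getenv(name) for name in REQUIRED}
    missing = [name for name, value in cfg.items() if not value]
    return cfg, missing

def connect_with_checks():
    # Fail fast with a clear message instead of hanging until a network timeout.
    cfg, missing = read_db_config()
    if missing:
        raise RuntimeError("Missing environment variables: " + ", ".join(missing))
    import psycopg2  # provided by the attached Lambda layer
    # connect_timeout (seconds) surfaces VPC/security-group problems quickly
    return psycopg2.connect(
        dbname=cfg["dbname"], host=cfg["host"],
        user=cfg["user"], password=cfg["password"],
        connect_timeout=5,
    )
```

If the variables are all set and the connection still times out, the issue is usually network reachability (security group inbound rules, public accessibility, or the Lambda's VPC placement) rather than the code.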
I am getting the following error: "[ERROR] Runtime.ImportModuleError: Unable to import module 'lambda_function': No module named 'psycopg2'". I think it is because of the Lambda layer. Please let me know how to resolve this error.
Thank you for watching my videos.
As mentioned in the video, could you please check the other video ruclips.net/video/JyQ9EFFR3n8/видео.html
at the 17:40 mark, where I have explained how to set up this Lambda layer.
@@cloudquicklabs Thanks. That works
You are welcome.