![Cloud Quick Labs](/img/default-banner.jpg)
- Videos: 283
- Views: 1,246,690
Cloud Quick Labs
India
Joined: Dec 9, 2011
Enabling people to learn cloud technologies (AWS/Azure/GCP) quickly by showcasing real-world cloud scenarios as labs.
I believe in learning cloud skills by experimenting. I want my subscribers and viewers to pick up cloud skills quickly!
Hence my efforts are dedicated to the wider cloud community.
I am an AWS Community Builder and an AI/ML technology enthusiast. I create and upload at least one video every fortnight.
Please subscribe and activate the bell icon for notifications.
Let's learn together and grow together!
Azure VM Restore Using Backup With Azure Recovery Service Vault Using PowerShell Commands
===================================================================
1. SUBSCRIBE FOR MORE LEARNING :
ruclips.net/channel/UCv9MUffHWyo2GgLIDLVu0KQ
===================================================================
2. CLOUD QUICK LABS - CHANNEL MEMBERSHIP FOR MORE BENEFITS :
ruclips.net/channel/UCv9MUffHWyo2GgLIDLVu0KQ/join
===================================================================
3. BUY ME A COFFEE AS A TOKEN OF APPRECIATION :
www.buymeacoffee.com/cloudquicklabs
===================================================================
In this detailed tutorial, discover how to restore your Azure Virtual Machines (VMs) using backups stored in Azure Recovery Services Vault, all through PowerShell.
Views: 164
Videos
Azure VM Backup With Azure Recovery Services Vault Using PowerShell Commands
Views: 76 · 14 hours ago
In this comprehensive tutorial, learn how to effectively back up your Azure Virtual Machines (VMs) using Azure Recovery Serv...
AWS Resource Tagging Automation | AWS Resource Explorer | Lambda | AWS Resource Groups Tagging API
Views: 365 · 14 days ago
Introduction Purpose of Tagging: The video begins by explaining the significance of tagging in AWS environments. Tags are cr...
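A tagging automation like the one described typically computes, per resource, the delta between the required tags and the tags already present before calling the Resource Groups Tagging API. A minimal sketch of that merge logic (the names `REQUIRED_TAGS` and `plan_tag_updates` are illustrative, not from the video):

```python
# Hypothetical sketch of the merge step a tagging Lambda might run before
# calling the Resource Groups Tagging API's TagResources operation.
REQUIRED_TAGS = {"Owner": "cloud-team", "Environment": "dev"}

def plan_tag_updates(existing_tags):
    """Return only the required tags that are missing or wrong on a resource."""
    return {
        key: value
        for key, value in REQUIRED_TAGS.items()
        if existing_tags.get(key) != value
    }

if __name__ == "__main__":
    current = {"Owner": "cloud-team", "Name": "demo-bucket"}
    print(plan_tag_updates(current))  # → {'Environment': 'dev'}
```

Sending only the delta keeps the API calls small and makes the automation idempotent: running it twice against an already-compliant resource is a no-op.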
Kubernetes | GitOps | AWS EKS | GitOps using Flux CD on AWS EKS | GitOps On Kubernetes Using Flux CD
Views: 307 · 21 days ago
Introduction to GitOps and Flux CD: Explains the concept of GitOps, which uses Git repositories as the source of truth for a...
AWS CDK in Python | How To Use AWS CDK in Python to Provision AWS Cloud Infrastructure Resource
Views: 231 · 28 days ago
Here's a detailed description of a RUclips video titled "AWS CDK in Python | How To Use AWS CDK in Python to Provision AWS C...
AWS EKS Cluster Upgrade From v1.28 to v1.29 Using Terraform Without Application Downtime |Kubernetes
Views: 667 · 28 days ago
The RUclips video titled "AWS EKS Cluster Upgrade From V1.28 to V1.29 Using Terraform Without Application Downtime | Kuberne...
How to Send Text SMS Notifications with Amazon SNS Using Golang | Send SMS With AWS SNS Using Go
Views: 185 · 1 month ago
Title: Amazon SNS Text Messaging Using Golang Code Description: In this tutorial, we dive into how to integrate Amazon SNS (...
ETL | AWS Glue | AWS S3 | Transformations | AWS Glue ETL Data Pipeline With Advanced Transformations
Views: 704 · 1 month ago
Title: AWS Glue ETL Data Pipeline With Advanced Transformations Introduction Opening: The video starts with an introduction ...
ETL | AWS Glue | Working with Apache Spark Using 3rd Party Library and AWS Data Catalog | PySpark
Views: 363 · 1 month ago
Welcome to our comprehensive tutorial on using AWS Glue with Apache Spark for ETL (Extract, Transform, Load) processes! In t...
ETL | AWS Glue | Spark DataFrame | Working with PySpark DataFrame in | AWS Glue Notebook Job
Views: 790 · 1 month ago
The video titled "Working with PySpark DataFrame in | AWS Glue Notebook Job" provides a comprehensive guide on loading Jupyt...
Kubernetes Persistent Volumes on AWS EKS with Karpenter | Persistent Volume Claim | K8S PV | PVC
Views: 331 · 2 months ago
The RUclips video titled "Kubernetes Persistent Volumes on AWS EKS with Karpenter | Persistent Volume Claim | K8S PV | PVC" ...
Kubernetes taints and tolerations | AWS EKS | Karpenter | Advanced Pod Scheduling in Kubernetes
Views: 246 · 2 months ago
The RUclips video titled "Kubernetes Taints and Tolerations | AWS EKS | Karpenter | Advanced Pod Scheduling in Kubernetes" p...
Kubernetes Pod Disruption Budget with Karpenter on AWS EKS | A Deep Dive into K8S PDB
Views: 402 · 2 months ago
In this comprehensive tutorial, dive into the intricate world of Kubernetes Pod Disruption Budgets (PDB) alongside the power...
Node affinity in Kubernetes | Kubernetes Node Affinity for Pod Scheduling With Karpenter On AWS EKS
Views: 282 · 2 months ago
In this comprehensive tutorial, we delve into the intricate world of Kubernetes node affinity, exploring how to leverage thi...
ETL PySpark Job | AWS Glue Spark ETL Job | Extract Transform Load from Amazon S3 to S3 Bucket
Views: 491 · 2 months ago
Introduction to ETL and PySpark: The video may begin with an introduction to the concepts of ETL and PySpark. ETL is a proce...
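The extract-transform-load flow introduced above can be sketched in plain Python with the standard library (the sample data and function name are made up for illustration, not taken from the Glue job in the video):

```python
import csv
import io

RAW = "name,age\nalice,34\nbob,19\n"  # stand-in for a CSV object read from S3

def run_etl(csv_text, min_age):
    rows = csv.DictReader(io.StringIO(csv_text))           # extract
    kept = (r for r in rows if int(r["age"]) >= min_age)   # transform: filter
    # "load" step: reshape into the target schema (here just a list of dicts)
    return [{"user": r["name"], "age": int(r["age"])} for r in kept]

print(run_etl(RAW, 21))  # → [{'user': 'alice', 'age': 34}]
```

A Glue PySpark job does the same three stages, but with DynamicFrames/DataFrames over partitioned S3 data instead of an in-memory list.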
PySpark AWS Glue ETL Job to Transform and Load data from Amazon S3 Bucket to DynamoDB | Spark ETL
Views: 997 · 3 months ago
AWS EKS Cluster Autoscaling Using Karpenter | Kubernetes Cluster Autoscaling Using Karpenter | SRE
Views: 1.5K · 3 months ago
AWS Glue Spark ETL Job to Load Data from Amazon S3 to AWS Glue Data Catalog | PySpark ETL
Views: 1.2K · 3 months ago
Complete Guide: Hosting HashiCorp Vault on AWS EKS for Kubernetes Secret Management
Views: 817 · 3 months ago
Istio Service Mesh on AWS EKS | Step by Step Guide to install Istio Service Mesh on Kubernetes
Views: 3.1K · 3 months ago
Kubernetes Service Mesh On AWS EKS Using AWS App Mesh | Step by Step Guide to Setup App Mesh On EKS
Views: 950 · 3 months ago
Kubernetes Container App on AWS ECS Fargate | DocumentDB | ElastiCache | Terraform | GitHub Actions
Views: 550 · 4 months ago
Monitoring Azure Kubernetes Service with Datadog | SLO, SLI, SRE Explained | AKS Monitoring
Views: 745 · 4 months ago
Azure Kubernetes Services Insights | AKS Container Insights Using Azure Log Analytics Workspace
Views: 666 · 4 months ago
AKS Monitoring with Azure Managed Prometheus & Grafana | Azure Monitor Workspace Integration | K8S
Views: 2.3K · 4 months ago
How to deploy ELK Stack on AWS EKS Using Terraform and GitHub Actions | ELK Stack on Kubernetes
Views: 2.1K · 4 months ago
Amazon Athena Federated query to execute SQL queries across AWS S3 and RDS PostgreSQL data sources
Views: 1.3K · 4 months ago
Amazon Athena to Query AWS DynamoDB Tables | Run SQL Query on NoSQL Amazon DynamoDB | ETL
Views: 970 · 5 months ago
Linux Commands Masterclass Tutorials: Unleash the Power of the Command Line!
Views: 315 · 5 months ago
Docker Compose to define & run multi-container Docker applications | Python Flask | Redis #docker
Views: 631 · 5 months ago
Nice simple demo
Thank you for watching my videos. Glad that it helped you.
Thank you for the detailed explanation. Very useful; it solved my problem as well.
Thank you for watching my videos. Glad that it helped you.
Hi! How can I build and deploy a Docker image to ACR and a Web App on every PR to the main/master branch? Should it be the same, just updating the GitHub Actions workflow to run on PR?
Unable to fetch CronJob logs using Fluent Bit.
Thank you for watching my videos. Check the path where this log is stored; that path should then be configured in the log scraper.
Good to see how you debugged the issue on the spot. Loved it.
Thank you for watching my videos. Glad that it helped you.
Great Explanation, Thank you dear
Thank you for watching my videos. Glad that it helped you.
@@cloudquicklabs Are these commands the same on Linux systems? I have done this before with the old version on Linux.
Hi brother, I'm able to collect data table by table, but when I try to establish a connection through the crawler it says it is unable to connect. Is it possible to add all the tables at a time?
Thank you for watching my videos. There could be multiple reasons, like below. 1. Check whether VPC endpoints for RDS exist. 2. Check whether the inbound security group has the required ports enabled. 3. Check whether credentials are correctly provided.
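For check 2 in the reply above, a quick way to verify network reachability of the database endpoint from the crawler's subnet (a generic sketch, not from the video; the host and port in the comment are placeholders):

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("mydb.example.us-east-1.rds.amazonaws.com", 3306)
```

If this returns False from the same VPC/subnet the Glue connection uses, the security group or VPC endpoint is the likely culprit rather than the crawler itself.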
Yes, I am facing the same issue as pointed out here. The role I created is not showing in the grant-access filter!
This seems to be a serious issue; it may be good to raise a GCP support case here.
informative thanks
Thank you for watching my videos. Glad that it helped you.
Hello! Thank you for this video. I was able to do everything you did in the video, but when I go to index management I don't see any indices... any advice?
I noticed that for your service resource you don't have a spec section, and the svc wasn't created.
Thank you for watching my videos. It looks like data ingestion has not started into Elasticsearch. Did you use the command file shared in the video's description?
Well done, these are good details. I missed the explanation of the client-to-service part. This is a nice job, well done.
Thank you for watching my videos. Glad that it helped you.
Getting { "message": "Forbidden" }. How can we fix it?
I am able to upload a text file via Lambda but get an error with Postman :( Please suggest.
Thank you for watching my videos. Check if your API URL is correct and your API keys are correct.
Invoking via Postman is about selecting the right options: docs.aws.amazon.com/apigateway/latest/developerguide/call-api-with-api-gateway-lambda-authorization.html
@@cloudquicklabs Thanks! But I have checked everything multiple times against your video and am still facing the issue, though I can upload the file using Lambda itself via its Test option. Any suggestion?
Do you provide any mentorship,or job assistant course ??
Thank you for watching my videos. Currently I am not doing this.
What if I send the file through multipart/form-data? How do I configure API Gateway?
Thank you for watching my videos. Here you are binary-encoding the image file and passing that value to the API via parameters. We need to make sure that the multipart/form-data stays intact.
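The binary-encoding step mentioned in the reply can be illustrated with a small round trip (a generic sketch; the function names are made up, and a real Lambda would read the encoded body from the API Gateway event):

```python
import base64

def encode_for_api(data: bytes) -> str:
    """Client side: turn raw image bytes into text-safe base64 for the API call."""
    return base64.b64encode(data).decode("ascii")

def decode_in_backend(body: str) -> bytes:
    """Backend side: recover the original bytes from the request body."""
    return base64.b64decode(body)

image_bytes = b"\x89PNG fake image payload"
assert decode_in_backend(encode_for_api(image_bytes)) == image_bytes
```

Base64 inflates the payload by roughly a third, which is one reason multipart/form-data uploads can feel faster for large files.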
The reason it got appended into the target table is that the "Matching Keys" involved all of the columns. Had it been just "industry_name_anzsic" in the matching keys, it would have updated it. Actually, I think you assumed that just the leftmost column is the matching key, which happens most of the time, as the leftmost is usually the primary-key column and we do merges and joins on it. Hence, this was an honest mistake due to old habits. Old habits die hard.
Thank you for watching my videos. It's a built-in capability of Glue that I have used. But I am happy to explore more about it.
Will this forward logs to the CloudWatch log group continuously, or will it send them at some 5- or 10-minute interval?
Yes, it will send logs continuously.
Man!! You did not show what the Parquet files' content looks like... ah!!
Thank you for watching my videos. Apologies; the Parquet file was mentioned in the video just for your reference.
what if i want to use form data to pass the image
Thank you for watching my videos. You can still do that via an interface (app interface) and convert the image back to binary and call the api here.
@@cloudquicklabs thing is that i am doing some image processing and when i send image through binary its taking extremely long time. but when i do with multipart/form-data which usual way to send files its taking nominal time. also with python conversion from multipart to normal data is giving errors i tried that too. whats your suggestion on this.
Is there any trial version of the AWS Glue software?
Thank you for watching my videos. Glue is a PaaS, so there is no trial version here, but its feature options can be explored.
Create a few more examples and post the videos. Super!
Indeed, I shall create more in this context of data engineering.
have you got a video for incremental loads?
Thank you for watching my videos. Please check ruclips.net/video/RGSKeK9xow0/видео.html
In production, how do we grant access to an external service? Is it through IAM roles or API gateways?
You can grant access to an external service via multiple options: 1. IAM users. 2. Federated access (where supported). 3. API Gateway as the application-facing entry point.
Hello, the data loaded from S3 to Redshift is in a zig-zag manner. We have the data from the source in 1, 2, 3, 4 order, but the target is 1, 4, 11 and so on. How do we get the data into Redshift in serial order?
Thank you for watching my videos. That should not be the case here; I believe only the columns could be disordered while the row data should be intact. Please watch this video again.
The content covered is good, but the frequent "you know" is kind of irritating. If possible, please avoid using "you know" again and again.
Thank you for watching my videos. Indeed, I will try my best to make the content clearer going forward.
We are trying to integrate SonarCloud with our self-hosted private GitLab instance. In this context, I would like to know whether it is possible, and if so, the steps to do it.
Thank you for watching my videos. It should be possible as long as your private network does not block the SonarCloud endpoints.
How can I get classes?
Thank you for watching my videos. I don't take classes but help through my videos; let me know if you have any topic to cover in a video.
Thank you @cloudquicklabs. Really helpful tutorial.👍
Thank you for watching my videos. Glad that it helped you.
If I access an object directly using the CloudFront URL in the browser I am able to do so, but when I try to consume the image/video on my website using an img or video tag, I get a CORS error saying no Access-Control-Allow-Origin header is present.
Thank you for watching my videos. The issue you're encountering could be due to permissions or the way the AWS S3 bucket policy is configured when accessing an S3 object directly via its URL. Did you check this?
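Since the direct URL works but the img/video tag fails, the bucket likely also needs a CORS configuration in addition to the policy check in the reply. A minimal S3 CORS rule would look something like this (the origin is a placeholder; CloudFront must also be configured to forward the Origin header so S3 returns the CORS response headers):

```json
[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["GET", "HEAD"],
    "AllowedOrigins": ["https://your-website.example"],
    "ExposeHeaders": []
  }
]
```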
25:12 Could you please say more about the Remove Duplicates option in the query (to prevent data redundancy if we run the pipeline twice)?
Thank you for watching my videos. I am making a new video on incremental data load. Please wait for the same.
This video is really amazing and next level.
Thank you for watching my videos. Glad that it helped you.
Great explanation! Thanks.
Thank you for watching my videos. Glad that it helped you.
I'm having this error: py4j.protocol.Py4JJavaError: An error occurred while calling o191.getCatalogSource. : com.amazonaws.services.glue.util.NonFatalException: Formats not supported for SparkSQL data sources. Got json at.... Does anyone know what could be happening?
Hi @cloudquicklabs, what was the use of the second crawler? Does it run only once to get the schema of Redshift into the temp database, and never run again later?
Thank you for watching my videos. It's to load the data from S3 into the AWS Glue Data Catalog and then into Amazon Redshift. Did you watch the second version of this video? ruclips.net/video/RGSKeK9xow0/видео.htmlsi=FB_1BXVQp-SnfUtq
Good explanation !
Thank you for watching my videos. Glad that it helped you.
Great video, thanks. I'd like to use the data on S3 via Athena; do you have a solution for it?
Thank you for watching my videos. Please find the link: ruclips.net/video/5G0i5uQVGIw/видео.html I have one more video on this, but using a CSV file.
That's explained really well.
Thank you for watching my videos. Glad that it helped you.
❤@@cloudquicklabs
Very good session. I have an automation SSM document which needs to be called from my Terraform code; would you help me with the procedure or documents?
Thank you for watching my videos. You can use a Terraform null_resource, something like below, and call the AWS CLI to invoke the AWS SSM document:
resource "null_resource" "example" {
  provisioner "local-exec" {
    command = "echo This command will execute whenever the configuration changes"
  }
}
Hi, thank you for the informational videos. Would you clarify a doubt I have: here the Glue crawler was created and run only once, I believe. Instead of creating a Glue crawler, is it possible to migrate data directly from the S3 source to the Redshift table? My intention in asking is that we run the Glue crawler only once, or only when we would like to see records in the Redshift table, since accessing records in Redshift is possible by querying in the query editor.
Thank you for watching my videos. We are migrating data from the source S3 to Amazon Redshift, which is the destination here. Crawlers can be scheduled or invoked on demand.
thank you ok haha
Thank you for watching my videos.
ok? yah?😁
Thank you for watching my videos.
Wonderful scenarios. Instead of just giving a brief overview, the way you explain different scenarios is amazing.
Thank you for watching my videos. Glad that it helped you.
Thank you. Need more such videos.
Thank you for watching my videos. Glad that it helped you.
@@cloudquicklabs I have a doubt about how it provisions the new VM. Is it based on the provisioner file configuration where we give instance types, or does it dynamically pick any instance type and size?
Hi Sir, thanks for making things easier. I have a doubt: where did you get the result.json and Helppowershell files from before updating the cluster? Thanks!
Thank you for watching my videos. Please check the video at the 5:23 mark; you can see that I am executing a PowerShell script whose output is stored in results.json.
Hi sir, what is the difference between an EMR cluster and an EKS cluster?
Thank you for watching my videos. EMR is Elastic MapReduce, a managed big-data processing and analytics service. EKS is Elastic Kubernetes Service, a managed Kubernetes (container orchestration) service. Both are different.
@@cloudquicklabs Okay, so can we say that EMR runs Spark and PySpark jobs, while EKS runs Python jobs?
I want to copy the data from one DynamoDB table and paste it into one Couchbase collection.
Thank you for watching my videos. We have solutions for it; I shall create a video on this soon.
Thanks bro, you saved my time.
Thank you for watching my videos. Glad that it helped you
Excellent sir, thank you so much.
Thank you for watching my videos. Glad that it helped you.
Bro, please do a video on how the interrupt queue works, with an example.
Thank you for watching my videos. Indeed, I shall create a video on this scenario soon. It's quite an interesting scenario.
Thank you for this rich video. I want to ask what type of EC2 instance you used. I have a similar project and need to identify the cost metrics and hardware requirements; what do you suggest for a similar project that hosts a simple web application using a Kubernetes cluster with Prometheus monitoring?
Thank you for watching my videos. The cost of an EKS cluster depends on multiple choices: 1. Size of the nodes. 2. Disk usage. 3. Networking resources, etc. Did you try the AWS Pricing Calculator for this? Also explore my RUclips channel to find the right videos for your use case (as there are many here).
I need to send different messages to n different users. Can we implement API Gateway triggering a Lambda, and the Lambda triggering SNS or SQS, to which we send a custom mobile number and a custom message? Please let us know if we can implement this. Thanks in advance.
Let me know when you find the answer to this.
Thank you for watching my videos. Indeed you can do it, and it's a workable solution. I will create a new video on this topic that mimics your scenario; keep watching my channel.
I will be sharing a video on this requirement soon.
@@cloudquicklabs Thank you so much. This means a lot🙏🏽