Why I often use AWS Lambda and serverless architecture

  • Published: Dec 3, 2024

Comments • 64

  • @vytasgavelis
    @vytasgavelis 7 months ago +3

    Would be keen to see a complete production setup that you are running.
    There are so many ways to deploy lambdas: SST, CDK on its own, Terraform, etc. All of these have different drawbacks, especially when it comes to the local development experience, but it is hard to find resources on the internet about what real companies are actually doing.

  • @baracka448
    @baracka448 1 year ago +6

    8:30 Lambda deployed as a Docker image supports a max of 10GB, so that would help you with the 250MB limit.

  • @TwenTV
    @TwenTV 1 year ago +4

    You can also get around the 250MB constraint by wrapping your lambda in a Docker image and building your lambda from ECR instead. It requires a bit more on the development side with versioning, but it allows up to 10GB of dependencies.

    • @WebDevCody
      @WebDevCody  1 year ago

      Any clue if that slows down cold starts?

    • @constponf7403
      @constponf7403 1 year ago

      @@WebDevCody Not too much, since AWS provides specific docker base images. I am currently deploying every lambda as a docker image, because it simplifies the deployment process.

    • @uome2k7
      @uome2k7 1 year ago +1

      Why would you want to do that? Lambdas should be short-running (there's a time limit on how long one can run for) and stateless. Anything that needs more than 250MB throws up a lot of red flags, because it sounds like it would be doing way too much. Wrapping them in a Docker image means you have to add the Docker registry lookup, download that image and all of its dependencies, and then start up that image, even if it's Docker running inside Docker.
      Being able to run a lambda inside Docker for local testing makes sense, but I think keeping it within the base restrictions should be the goal.
      Remember you are paying for compute cycles and memory space on AWS so you want to be as small and as fast as you can get.

    • @TwenTV
      @TwenTV 1 year ago

      @@uome2k7 It's 250MB for dependencies. There are many cases where you have a quick and useful lambda, but the library requirements just surpass the 250MB allowed in layers :)

    • @WebDevCody
      @WebDevCody  1 year ago +4

      @@uome2k7 there are many use cases. For example, I need a lambda to generate a PDF for my users. In order to generate a PDF, you need a bunch of binaries which use a LOT of space. Generating the PDF takes maybe 1 second, but a Chromium binary can take like 80MB by itself, not to mention all your required node_modules. I'd also say using Docker as a way to standardize your lambdas is actually very useful. I've wasted hours debugging issues that work locally but fail on Lambda because of incorrect setup of the binary paths, etc.
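
A minimal sketch of the Chromium-on-Lambda PDF approach described in the reply above, assuming puppeteer-core plus the @sparticuz/chromium package (a community-maintained Chromium build sized for Lambda); the exact setup used in the video may differ:

```ts
// Generate a PDF from HTML inside a Lambda handler.
// Assumes puppeteer-core and @sparticuz/chromium are bundled with the function.
import chromium from "@sparticuz/chromium";
import puppeteer from "puppeteer-core";

export const handler = async (event: { html: string }) => {
  // Launch the bundled Chromium with Lambda-friendly flags.
  const browser = await puppeteer.launch({
    args: chromium.args,
    executablePath: await chromium.executablePath(),
    headless: true,
  });

  try {
    const page = await browser.newPage();
    await page.setContent(event.html, { waitUntil: "networkidle0" });
    const pdf = await page.pdf({ format: "A4", printBackground: true });

    // Return the PDF base64-encoded so it can pass through API Gateway.
    return {
      statusCode: 200,
      isBase64Encoded: true,
      headers: { "Content-Type": "application/pdf" },
      body: Buffer.from(pdf).toString("base64"),
    };
  } finally {
    await browser.close();
  }
};
```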

  • @driden1987
    @driden1987 1 year ago +3

    I've been using Serverless Stack (SST) lately, it's pretty awesome

    • @male3399
      @male3399 9 months ago

      Do you use lambda with SST?

    • @driden1987
      @driden1987 9 months ago

      @@male3399 Yes

  • @INKILU
    @INKILU 1 year ago +2

    Enjoying the aws vids 👍🏼

  • @w9914420
    @w9914420 1 year ago +4

    Hi Cody, many thanks for your insights into Lambda. It would be cool to see an example of how you would use Puppeteer to generate PDFs from HTML (with CSS styling)😊

  • @B1TCH35K1LL3R
    @B1TCH35K1LL3R 1 year ago +5

    Serverless works fine when there are budget limitations, but another good approach might be to have your Express/Nest API web server as a container and deploy it using a management service such as ECS or, even better, Kubernetes. However, as I mentioned, that won't apply to every project (especially if there are budget restrictions).

  • @SeibertSwirl
    @SeibertSwirl 1 year ago +1

    Good job babe!!!! I’m finally first again! But most of all I’m so proud of you and all this work you’ve done ❤

  • @habong17359
    @habong17359 1 year ago +2

    if you get a chance, can you walk through your CI/CD for lambda functions? great video btw!

  • @rustystrings0908
    @rustystrings0908 1 year ago +2

    Can you show your set up on how you deploy this stuff to Lambda specifically? Do you use AWS SDK or are you writing shell scripts to do it?

    • @WebDevCody
      @WebDevCody  1 year ago +3

      Usually I’d use the serverless framework. I can try making a video on it if I get time

    • @chris94kennedy
      @chris94kennedy 1 year ago

      @@WebDevCody I'd also like to see that Cody. Thanks as always

  • @xorlop
    @xorlop 1 year ago +1

    I am very interested to hear your thoughts on CF Workers. You can even create them programmatically on an enterprise plan. CF Workers also have environments so you can do a test deploy, and there is built-in monitoring now, too. I have some basic experience with Lambda, but not much. The CF edge also doesn't have cold starts, I think (I could be very wrong about this!). Idk about limits... but I do know there are ways to change them to be unbound, I think.
    I only use it for work, so idk about pricing.

    • @WebDevCody
      @WebDevCody  1 year ago +1

      I've never worked with CF Workers, so if I do end up working with them I'll make a video

  • @thebowshock7729
    @thebowshock7729 3 months ago

    Regarding your PDF generation example, it depends on how you look at it. You could try to force the standard JavaScript way of doing it, which you described, and not have it be optimal, or you can ditch Puppeteer and harness the power of Lambda by writing the function in a language and runtime best suited for native PDF generation. That's another great benefit of lambdas: they are naturally microservices, and each can have its own runtime and language.

  • @corygrewohl8180
    @corygrewohl8180 1 year ago +1

    So one thing I've wondered for a while is: does lambda/serverless architecture replace a normal backend built in Express, Django, etc.? I'm beginning to work on a web portal for an app startup idea, and I'm generally just confused about what the best way to build up a back end is. In the past I've used lambda for a lot, but I'm wondering if building an Express API is better, or if companies generally use both. I guess I'm just confused on where each comes into play. Thanks for the help!

    • @WebDevCody
      @WebDevCody  1 year ago

      I typically have a single Express app that I wrap and deploy to a single lambda function. There is a library called aws-serverless-express you can use to wrap your Express app and deploy it to a lambda; it's super easy to use (a minimal sketch follows this thread). You can also use existing tools such as the serverless framework to host an Express app directly on Lambda. Honestly, deploying a Django app to Heroku or some other container host works perfectly fine, I'm just a big fan of only paying for what you use. Some people deploy a separate lambda for each endpoint, but that causes deployments to take forever.

    • @corygrewohl8180
      @corygrewohl8180 1 year ago

      @@WebDevCody oh that's actually really cool that you can wrap it and still reap the benefits of lambda. I guess I'm just wondering what the benefit of deploying it to a server is then.
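
A minimal sketch of the aws-serverless-express wrapping mentioned in this thread; the route, the use of @types/aws-lambda, and other configuration details are illustrative, not taken from the video:

```ts
// Serve an entire Express app from a single Lambda function.
// aws-serverless-express translates API Gateway events into HTTP requests.
import express from "express";
import awsServerlessExpress from "aws-serverless-express";
import type { APIGatewayProxyEvent, Context } from "aws-lambda"; // from @types/aws-lambda

const app = express();
app.use(express.json());

// Illustrative route; a real app would register all of its endpoints here.
app.get("/health", (_req, res) => {
  res.json({ ok: true });
});

// Create the proxy server once, outside the handler, so warm invocations reuse it.
const server = awsServerlessExpress.createServer(app);

export const handler = (event: APIGatewayProxyEvent, context: Context) => {
  awsServerlessExpress.proxy(server, event, context);
};
```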

  • @s1dev
    @s1dev 29 days ago

    does Go (as a language) perform better in lambda functions?

  • @johndebord7802
    @johndebord7802 1 year ago +1

    Is it best practice to just have one lambda function for your application? Or one lambda function per DynamoDB? Or multiple lambda functions for multiple REST-like requests? Kind of confused about this

    • @WebDevCody
      @WebDevCody  1 year ago +1

      I typically do one lambda for my entire API. You can deploy each lambda separately, but it'll take a while to deploy

    • @johndebord7802
      @johndebord7802 1 year ago

      @@WebDevCody I suppose having one lambda would be best because it would minimize the possibility of cold starts as well. But I'm almost sure that cold starts will be a relic of the past someday

    • @WebDevCody
      @WebDevCody  1 year ago +1

      @@johndebord7802 yeah, much higher chance stuff will be warm, but it does mean every endpoint will need to be configured exactly the same. So if one endpoint requires more memory, you need to increase it for all endpoints. If one endpoint needs a higher timeout, you need to increase it for all. It's just trade-offs

  • @Pyrospower
    @Pyrospower 1 year ago

    thanks for the interesting video!

  • @Xmasparol
    @Xmasparol 1 year ago +1

    Somehow lambda containerization is good. I get pip issues with psycopg2 in Python and layers don't work. I googled and ChatGPT didn't help, and I got to the point of calling AWS support

  • @dandogamer
    @dandogamer 1 year ago +4

    I've worked with serverless and AWS for 3 years and never had any fun developing on it. Debugging was always painful, documentation is poor, and you end up having to buy into a ton of other services just to have a functioning yet complicated system. Not to mention the developer experience sucks: issues always cropping up on AWS but not locally, or you accidentally forget a permission and have to wait 10+ mins each time for the build to upload.
    If you're a small company who needs to move fast, I would stop and consider whether it's worth slowing down your team for the sake of penny pinching.
    On the other hand, if you are a larger team, I would probably adopt a platform engineering approach to help your developers so they don't have to worry about all the intricacies.

    • @WebDevCody
      @WebDevCody  1 year ago

      I agree with everything you said 😂 aws can turn into a convoluted nightmare

  • @jatinhemnani1029
    @jatinhemnani1029 1 year ago

    we can get an API URL without API Gateway, right? Directly using the function URL

  • @digitnomad
    @digitnomad 1 year ago

    Hi Cody, when AppSync alone can do all the job, why bother with Gateway + Lambda?

  • @roach_iam
    @roach_iam 1 year ago

    Curious, what did you guys decide to do about the pdf issue?

    • @WebDevCody
      @WebDevCody  1 year ago

      We managed to get Puppeteer working on lambdas by making sure the deployed .zip had as little as possible in it. If we hit the limits again, I think we'll need to move to running Docker containers on Lambda

  • @cringelord511
    @cringelord511 1 year ago

    are they similar to Azure Functions?

  •  1 year ago

    What about when you have your lambda backend connected to a database and there is a spike in the number of users (the number of concurrent lambda executions at the same time) but your db has a maximum number of connections? How do you handle that?

    • @uome2k7
      @uome2k7 1 year ago

      you have to scale everything based on expected demand, ideally with some extra margin, but none of these should have unlimited growth allowances because nobody has unlimited pockets.

    • @WebDevCody
      @WebDevCody  1 year ago

      You'd either need to scale or configure your db to accept more concurrent connections, or you need to put your database behind a connection pooling service/gateway which limits how many connections can be made, and your lambdas invoke that via REST. Lambdas do stay warm, so you can potentially keep a database connection open between requests as long as you put the connection outside the scope of the handler, at the global code level (see the sketch after this thread). But like Joe mentioned, there is no silver bullet; you need to re-analyze your system when you hit certain levels of scale. If you choose to just use PlanetScale from the start, you'll more than likely get that scaling in the future, for a price tag

    • @yuhanna_kapali
      @yuhanna_kapali 1 year ago

      using RDS Proxy to connect the lambda to the database will help with the number of connections

    • @moodyhamoudi
      @moodyhamoudi 1 year ago

      HTTP- or WebSocket-based connection models support serverless by design. There's no reason to try to finagle a stateful connection model to work with a stateless compute platform; you'd only be band-aiding a gunshot wound, and you will hit a wall if you intend on going past ~100 concurrent users. Firestore is probably the most well-established solution in the category, but there are a few reasonable providers for both SQL and NoSQL, like AWS Aurora, PlanetScale (not actually stateless but can handle a ton of connections), DynamoDB, and MongoDB Atlas (only if you're using the new Data API), to name a few.
      Sorry for venting; this exact issue nearly ended me.
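
A minimal sketch of the connection-reuse pattern described in this thread, assuming a Postgres database and the node-postgres (pg) Pool; the table, environment variable, and pool size are illustrative:

```ts
// Keep the pool in module scope (outside the handler) so a warm Lambda
// container reuses its open connections instead of reconnecting per request.
import { Pool } from "pg";

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 2, // per-container cap; total connections grow with concurrent containers
});

export const handler = async (event: { userId: string }) => {
  // Each invocation borrows a connection from the already-open pool.
  const { rows } = await pool.query("SELECT * FROM users WHERE id = $1", [
    event.userId,
  ]);
  return { statusCode: 200, body: JSON.stringify(rows) };
};
```

Putting an RDS Proxy or similar pooling gateway in front of the database, as suggested above, is what actually caps the total connection count once many containers are warm at the same time.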

  • @Harish-rz4gv
    @Harish-rz4gv 1 year ago

    When do you start the project series??

  • @andriisukhariev
    @andriisukhariev 1 year ago

    Cool thanks

  • @michaelscofield2469
    @michaelscofield2469 1 year ago

    please make project series

  • @saman6199
    @saman6199 1 year ago

    Hey Cody, would it be possible to have a very small Express app and run it on lambda to show us how we could configure it? It would be appreciated

  • @Grahamaan27
    @Grahamaan27 11 months ago

    Ok but the competition to serverless is not EC2, but ECS or containers. Why compare against VMs???

  • @devippo
    @devippo 1 year ago

    I think there are many devs who love maintaining and fiddling with AWS.
    To be honest, I just want the app to work for the client ASAP.

  • @illiakhomenko6405
    @illiakhomenko6405 1 year ago

    Ligma-lambda is now forever in my mind😂😂😂😂

  • @rohangodha6725
    @rohangodha6725 1 year ago +1

    leaked env vars unlucky 💀

  • @freshhorizonswithjakub
    @freshhorizonswithjakub 1 year ago +1

    roll it to 69 right? I see what you did there.

    • @WebDevCody
      @WebDevCody  1 year ago

      By accident, but it works out

  • @Chris-se3nc
    @Chris-se3nc 1 year ago +2

    What a sticky mess and terrible local dev experience. I'll stick to Kubernetes. Multi-cloud out of the box.

  • @Grahamaan27
    @Grahamaan27 11 months ago

    Every benefit I heard here is also native to ECS on AWS. Logging, auto scaling, and metrics are all supported out of the box. Lambda has so much overhead and duplicated effort; if you like being inefficient but stupidly simple, I see the appeal. But if you're running a production enterprise environment, I can't imagine you wouldn't want to save money and execution time by implementing autoscaled container services.