Deploying Machine Learning Models with mlflow and Amazon SageMaker

  • Published: 16 Nov 2024

Comments • 50

  • @mohajeramir
    @mohajeramir 11 months ago

    This man is awesome. Is it possible to enable autoscaling from the mlflow command when creating the endpoint?

    • @juliensimonfr
      @juliensimonfr 9 months ago

      Lol. According to mlflow.org/docs/latest/python_api/mlflow.sagemaker.html, no autoscaling, but you could enable it later using the SageMaker SDK by updating the endpoint configuration. Check the SageMaker docs for details.
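
      A minimal sketch of what enabling it afterwards could look like, using the Application Auto Scaling API through boto3 (the endpoint and variant names below are placeholders, not something from the video):

        import boto3

        aas = boto3.client("application-autoscaling")

        # Adjust to your own endpoint and production variant names.
        resource_id = "endpoint/my-mlflow-endpoint/variant/AllTraffic"

        aas.register_scalable_target(
            ServiceNamespace="sagemaker",
            ResourceId=resource_id,
            ScalableDimension="sagemaker:variant:DesiredInstanceCount",
            MinCapacity=1,
            MaxCapacity=4,
        )

        # Track invocations per instance so instances are added/removed automatically.
        aas.put_scaling_policy(
            PolicyName="invocations-target-tracking",
            ServiceNamespace="sagemaker",
            ResourceId=resource_id,
            ScalableDimension="sagemaker:variant:DesiredInstanceCount",
            PolicyType="TargetTrackingScaling",
            TargetTrackingScalingPolicyConfiguration={
                "TargetValue": 100.0,
                "PredefinedMetricSpecification": {
                    "PredefinedMetricType": "SageMakerVariantInvocationsPerInstance"
                },
            },
        )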

  • @1nspa
    @1nspa 3 years ago +1

    Hi, thanks for this great video. I successfully ran the model locally and it worked, but when trying to deploy it to SageMaker I get this error: "botocore.exceptions.NoCredentialsError: Unable to locate credentials". Can you please help me?

    • @1nspa
      @1nspa 3 years ago +2

      Never mind, I was able to resolve the issue. If you are a beginner like me and ran into a similar problem, here is how I solved it:
      1. Download the AWS CLI on your machine
      2. Set up your credentials with aws configure (see this link for details: docs.aws.amazon.com/cli/latest/userguide/cli-configure-quickstart.html)
      3. Authenticate Docker with ECR (see docs.aws.amazon.com/AmazonECR/latest/userguide/registry_auth.html)
      4. Then deploy the model as described in the video above
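
      A quick sanity check that the credentials from step 2 are actually being picked up, using boto3 (purely illustrative, not part of the video):

        import boto3

        # Raises NoCredentialsError if `aws configure` has not been set up correctly.
        identity = boto3.client("sts").get_caller_identity()
        print(identity["Account"], identity["Arn"])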

  • @nielshoogeveen3767
    @nielshoogeveen3767 3 years ago

    I am getting the following error when building and pushing the image:
    File "/Users/niels/Desktop/Projects/MLflow_optuna/venv/lib/python3.9/site-packages/botocore/utils.py", line 1058, in validate_region_name
    raise InvalidRegionError(region_name=region_name)
    botocore.exceptions.InvalidRegionError: Provided region_name '\' doesn't match a supported format.
    What did I do wrong?

  • @laguitte
    @laguitte 4 years ago

    This shows really well how to BYO model to Sagemaker.
    On the other hand, I'm trying to figure out the best way to get a pre-baked model out of Sagemaker (e.g. Obj2Vec or BlazingText) and log it in MLflow or deploy it locally with MLflow (that could be helpful to deploy in a kubernetes cluster for example).
    I'm a bit confused between:
    option 1 = using the Sagemaker python SDK in local mode OR
    option 2 = taking the model artifacts from S3 somehow and packaging the model myself OR
    option 3 = setting up a Sagemaker endpoint to evaluate the model and then tear it down.
    Right now I use option 3, but I have the feeling it's not the best way to do this. Do you see what I mean, and how one could get a similar local experience for, say, an Obj2Vec model?

    • @juliensimonfr
      @juliensimonfr 4 years ago +1

      Hi Louis, models trained with built-in algos are MXNet models (except XGBoost, of course). If you want to use MLflow, you would have to import MXNet in the script, load the pretrained model, and save it again (just as if you had trained it yourself). MLflow should then be able to deploy it. Give it a try and let me know :)
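
      For anyone attempting this, one generic way to do the re-save step is a custom pyfunc wrapper around the MXNet model; the file names and input layout below are assumptions that depend on what the algorithm's model.tar.gz actually contains:

        import mlflow.pyfunc

        class MXNetWrapper(mlflow.pyfunc.PythonModel):
            def load_context(self, context):
                from mxnet.gluon import SymbolBlock
                # Load the symbol/params files extracted from the SageMaker model artifact.
                self.net = SymbolBlock.imports(
                    context.artifacts["symbol"], ["data"], context.artifacts["params"])

            def predict(self, context, model_input):
                import mxnet as mx
                # model_input arrives as a pandas DataFrame when served by MLflow.
                return self.net(mx.nd.array(model_input.values)).asnumpy()

        mlflow.pyfunc.log_model(
            artifact_path="model",
            python_model=MXNetWrapper(),
            artifacts={"symbol": "model-symbol.json", "params": "model-0000.params"},
        )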

  • @MrWizomar
    @MrWizomar 2 years ago +1

    Thanks for the video!

  • @logicboardrepairservices2048
    @logicboardrepairservices2048 4 years ago +1

    Great video Julien
    Thanks!

  • @99ansh
    @99ansh 3 years ago +1

    Hey! Is there any way to modify the response from the /invocations endpoint? I want the model to return not only the predicted label but also the prediction probability, like sklearn's predict_proba.

    • @money_wins_controls
      @money_wins_controls 3 years ago

      Hi, can you please tell me whether you found any solution for this?
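
      One possible approach (not shown in the video): log a small pyfunc wrapper around the fitted sklearn model instead of the raw model, so the /invocations response contains whatever the wrapper's predict() returns. `clf` below stands for your fitted classifier:

        import pandas as pd
        import mlflow.pyfunc

        class ProbaWrapper(mlflow.pyfunc.PythonModel):
            def __init__(self, model):
                self.model = model

            def predict(self, context, model_input):
                # Return the per-class probabilities plus the predicted label.
                proba = self.model.predict_proba(model_input)
                out = pd.DataFrame(proba, columns=[str(c) for c in self.model.classes_])
                out["label"] = self.model.predict(model_input)
                return out

        mlflow.pyfunc.log_model("model", python_model=ProbaWrapper(clf))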

  • @chandrakumar6575
    @chandrakumar6575 3 years ago

    1) Do we have to write a Dockerfile for it? No, right? "mlflow sagemaker build-and-push-container" should build the Docker image and push it to the associated AWS ECR registry, right?
    2) I am working on an Amazon WorkSpace which has Amazon Linux 2 installed. When I run the "mlflow sagemaker build-and-push-container" command, it tries to use Ubuntu repositories to install the required dependencies. As my OS is Amazon Linux (a CentOS-like distribution) and uses yum, it fails. Can you please suggest how to fix this?

    • @juliensimonfr
      @juliensimonfr 3 years ago

      1) Correct, no need to write it.
      2) Not sure, did you check the mlflow doc?

    • @chandrakumar6575
      @chandrakumar6575 3 years ago +1

      @@juliensimonfr thank you for the response.
      I got it to work with the commands below, in this order:
      1. mlflow sagemaker build-docker
      2. mlflow sagemaker build-and-push-container
      3. mlflow sagemaker deploy

    • @juliensimonfr
      @juliensimonfr 3 years ago

      @@chandrakumar6575 Great!

  • @rasmustoivanen2709
    @rasmustoivanen2709 4 years ago

    This is a really good video, but I am hungry for more :)
    Julien, could you make an example with MLflow + Hyperopt + logging metrics across multiple epochs?
    Or if one already exists, could you point me to it?
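
    In the meantime, a rough sketch of that combination: Hyperopt drives the search while each trial logs its metric per epoch via the step argument (train_one_epoch is a stand-in for your own training loop, and the search space is just an example):

      import mlflow
      from hyperopt import fmin, tpe, hp

      def objective(params):
          with mlflow.start_run(nested=True):
              mlflow.log_params(params)
              for epoch in range(10):
                  val_loss = train_one_epoch(params, epoch)  # your training code here
                  mlflow.log_metric("val_loss", val_loss, step=epoch)
          return val_loss

      with mlflow.start_run():
          best = fmin(fn=objective,
                      space={"lr": hp.loguniform("lr", -6, -1)},
                      algo=tpe.suggest,
                      max_evals=20)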

  • @jonmaurer9231
    @jonmaurer9231 4 years ago

    Great video! Is there a way to train a model as a Sagemaker Training Job using mlflow?

    • @juliensimonfr
      @juliensimonfr 4 years ago +2

      Hi Jon, not that I know of. mlflow.sagemaker seems to be for deployment only. Take a look at github.com/Kenza-AI/sagify, it could do the trick :)

  • @narijami
    @narijami 4 years ago

    Hello. I have a question about the Docker image. Why, when I make changes to the Docker image and build it under a new name, are the changes not picked up? For example, I want to change the version of one of the packages. I use: docker build . -t new-image, but I see that the new version of the package is not installed.

    • @juliensimonfr
      @juliensimonfr 4 years ago

      You mean locally or on AWS?

    • @narijami
      @narijami 4 years ago

      @@juliensimonfr I solved it: I just removed one line and then I could install the packages. Thanks

  • @kaustubhwani1647
    @kaustubhwani1647 4 years ago

    Great video!!!
    I am actually new to AWS, trying to deploy a locally trained image classification model to SageMaker. Can you point me to any article or video?

    • @juliensimonfr
      @juliensimonfr 4 years ago

      plenty of info here: docs.aws.amazon.com/sagemaker/latest/dg/your-algorithms.html

  • @aryanmehta8607
    @aryanmehta8607 3 years ago

    Is there any list of endpoints like /invocations? Where can I find them?

    • @juliensimonfr
      @juliensimonfr 3 years ago

      You can see your endpoints in the SageMaker console, and list them with the list-endpoints SageMaker API.
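
      For example, with boto3 (the region is just an example):

        import boto3

        # List all SageMaker endpoints in the region and show their status.
        sm = boto3.client("sagemaker", region_name="eu-west-1")
        for ep in sm.list_endpoints()["Endpoints"]:
            print(ep["EndpointName"], ep["EndpointStatus"])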

    • @aryanmehta8607
      @aryanmehta8607 3 years ago

      @@juliensimonfr Okay, thanks :) How did you create the /invocations endpoint? How can I modify its functionality?

  • @gavin8535
    @gavin8535 4 years ago

    Do we need to stop the model we just deployed in SageMaker, since we may be charged even when it's not being used?

    • @juliensimonfr
      @juliensimonfr 4 years ago +1

      Sure, always delete your endpoints when you're done.
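
      For reference, deleting an endpoint with boto3 looks roughly like this (names are placeholders, and the endpoint config name may differ from the endpoint name):

        import boto3

        sm = boto3.client("sagemaker")
        # Deleting the endpoint stops billing for the instances behind it.
        sm.delete_endpoint(EndpointName="my-mlflow-endpoint")
        sm.delete_endpoint_config(EndpointConfigName="my-mlflow-endpoint")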

    • @gavin8535
      @gavin8535 4 years ago +1

      @@juliensimonfr So, if I want to keep using it, I'll keep getting charged.

  • @SaiKrishna-go7qt
    @SaiKrishna-go7qt 3 years ago

    When I try to deploy the model locally using SageMaker (just as instructed in the video), I get the following error. I checked Docker Hub and there is no image called "mlflow-pyfunc". Did you build this image before running the code, @Julien Simon?
    Unable to find image 'mlflow-pyfunc:latest' locally
    docker: Error response from daemon: pull access denied for mlflow-pyfunc, repository does not exist or may require 'docker login': denied: requested access to the resource is denied.
    See 'docker run --help'.
    Thank you

    • @juliensimonfr
      @juliensimonfr 3 years ago +1

      Did you build the sagemaker container? mlflow sagemaker build-and-push-container

    • @SaiKrishna-go7qt
      @SaiKrishna-go7qt 3 years ago

      @@juliensimonfr Ooh, I didn't do that as I wanted to deploy locally first, but I guess I should, because that builds the image which in turn is used locally.
      Thank you 🙂

  • @sharathvarmatirumalaraju2251
    @sharathvarmatirumalaraju2251 3 years ago

    great help

  • @_XY_
    @_XY_ 2 years ago

    👏👏

  • @anilvure9009
    @anilvure9009 4 years ago

    I have been hitting this error while running the "mlflow sagemaker run-local -m $MODEL_PATH -p $LOCAL_PORT" command:
    Unable to find image 'mlflow-pyfunc:latest' locally
    docker: Error response from daemon: pull access denied for mlflow-pyfunc, repository does not exist or may require 'docker login': denied: requested access to the resource is denied.

    • @juliensimonfr
      @juliensimonfr 4 years ago

      Did you build the container?

    • @wardsworld
      @wardsworld 4 years ago

      @@juliensimonfr Same error here actually. Which image should I build? ... `mlflow sagemaker run-local -m $MODEL_PATH -p $LOCAL_PORT` throws the error mentioned by Anil.

    • @juliensimonfr
      @juliensimonfr 4 years ago

      @@wardsworld 'mlflow sagemaker build-and-push-container'. See www.mlflow.org/docs/latest/python_api/mlflow.sagemaker.html

    • @chandrakumar6575
      @chandrakumar6575 3 years ago

      @@wardsworld Were you able to fix it? If yes, please let me know what steps you had to follow.