Inference ML with C++ and ONNX Runtime

  • Published: Jan 6, 2025

Comments • 42

  • @Daniel152315
    @Daniel152315 2 years ago +2

    Wow! This is a big help, thanks so much!

  • @MrIlvis
    @MrIlvis 1 year ago +2

    Next, it would be nice to have an example in Java, because there is no working example of ONNX Runtime in Java. Thanks for the C++ example! :)

  • @xiangliu-qv2mp
    @xiangliu-qv2mp 2 years ago

    this is a big help for a primer

  • @noone-dc4uh
    @noone-dc4uh 1 year ago

    On line 85, how did you get the variable inputNames?
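
For anyone with the same question: in recent versions of the ONNX Runtime C++ API, the input names come from the session itself. A minimal sketch, assuming a recent onnxruntime_cxx_api.h; the helper name getInputNames is mine, not necessarily what line 85 of the video shows:

```cpp
#include <onnxruntime_cxx_api.h>

#include <string>
#include <vector>

// Collect the model's input names from the session. Copy each name into
// a std::string, because GetInputNameAllocated returns an
// Ort::AllocatedStringPtr whose buffer is freed when it goes out of scope.
std::vector<std::string> getInputNames(Ort::Session& session) {
    Ort::AllocatorWithDefaultOptions allocator;
    std::vector<std::string> names;
    for (size_t i = 0; i < session.GetInputCount(); ++i)
        names.emplace_back(session.GetInputNameAllocated(i, allocator).get());
    return names;
}
```

Session::Run() takes arrays of const char*, so pass e.g. names[0].c_str() from the returned strings.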

  • @MetaHD1
    @MetaHD1 1 year ago

    Why are the confidence scores in this example not in the range 0-1? I have also followed the same process for my custom CNN model, and the scores are not in the range of 0 to 1.

    • @TPK-ry3ze
      @TPK-ry3ze 1 year ago

      Is there no sigmoid layer or softmax layer included in your ONNX model's output?
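
If the exported graph ends at the logits (no softmax or sigmoid node, as asked above), the scores can be normalized on the C++ side. A minimal standalone sketch, not code from the sample project:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Convert raw logits to probabilities in [0, 1] that sum to 1.
// Subtracting the max logit first keeps std::exp from overflowing.
std::vector<float> softmax(const std::vector<float>& logits) {
    float maxLogit = *std::max_element(logits.begin(), logits.end());
    std::vector<float> probs(logits.size());
    float sum = 0.0f;
    for (size_t i = 0; i < logits.size(); ++i) {
        probs[i] = std::exp(logits[i] - maxLogit);
        sum += probs[i];
    }
    for (float& p : probs) p /= sum;
    return probs;
}
```

Alternatively, re-export the model with a softmax node appended so the ONNX output is already a probability.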

  • @AagneyWest
    @AagneyWest 2 years ago +1

    Hi, the function GetOutputName() seems deprecated; GetOutputNameAllocated() is the new function, but it returns a pointer. Help?

    • @AagneyWest
      @AagneyWest 2 years ago +1

      Got it 🤝 If anybody has any doubts, feel free to ping me.

    • @savasatarzade1086
      @savasatarzade1086 2 years ago +1

      @@AagneyWest Hi, can you help me with this? I have the same problem and GetOutputName() is not working.

    • @AagneyWest
      @AagneyWest 2 years ago

      @@savasatarzade1086 Yeah sure, I will get back to you in 12 hours. I am in the middle of something.

    • @AagneyWest
      @AagneyWest 2 years ago +1

      @@savasatarzade1086 I can share the code if you comment your email or any other contact.

    • @ONNXRuntime
      @ONNXRuntime  2 years ago +2

      Thanks for pointing out this issue! I have updated the sample project to fix this!
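
To expand on the fix discussed above: GetOutputNameAllocated returns an Ort::AllocatedStringPtr, a smart pointer that owns and frees the name buffer, so copy the name into a std::string before the pointer goes out of scope. A minimal sketch against a recent onnxruntime_cxx_api.h (getOutputNames is my own helper name, not necessarily the sample's):

```cpp
#include <onnxruntime_cxx_api.h>

#include <string>
#include <vector>

// Collect the model's output names. emplace_back copies the char buffer
// into a std::string before the AllocatedStringPtr temporary is destroyed,
// so the names stay valid after this function returns.
std::vector<std::string> getOutputNames(Ort::Session& session) {
    Ort::AllocatorWithDefaultOptions allocator;
    std::vector<std::string> names;
    for (size_t i = 0; i < session.GetOutputCount(); ++i)
        names.emplace_back(session.GetOutputNameAllocated(i, allocator).get());
    return names;
}
```

Holding only the raw .get() pointer past the AllocatedStringPtr's lifetime is the dangling-pointer trap behind the original question.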

  • @mikailgee1466
    @mikailgee1466 2 years ago

    STILL trying to figure out where "results" was defined and assigned. Should you not be trying to print out the contents of "outputTensor" instead after a session "Run"?

    • @ONNXRuntime
      @ONNXRuntime  2 years ago +1

      Hey Mikail! "results" is defined on line 61:
      std::array<float, 1000> results;
      It is pointed to by the `outputTensor` on line 66:
      auto outputTensor = Ort::Value::CreateTensor<float>(memory_info, results.data(), results.size(), outputShape.data(), outputShape.size());
      Here is more on how "CreateTensor" works in our C++ docs: onnxruntime.ai/docs/api/c/struct_ort_1_1_value.html#ab19ff01babfb77660ed022c4057046ef

  • @krinodagamer6313
    @krinodagamer6313 1 year ago

    I created my model as a .pkl file, and I wrote a Python script to convert it to ONNX... I'm trying to get it loaded in the Visual Studio project but I can't. Should I just retrain the model in ONNX all around?

    • @ONNXRuntime
      @ONNXRuntime  1 year ago +1

      What framework did you use to train the model? You should export to ONNX format from your training framework.

  • @yanhuacui
    @yanhuacui 1 year ago

    I'm also trying to enable CUDA for GPU usage. I ran your example. When it calls OrtStatus* onnx_status = g_ort->SessionOptionsAppendExecutionProvider_CUDA(session_options, &o); it returns the error "CUDA is not available". I use Windows 10, Visual Studio 2022. My GPU is an NVIDIA RTX A2000. I installed the latest CUDA Toolkit V12.1, and my cuDNN is a version compatible with CUDA. Environment variables have been set properly. The version of onnxruntime is the latest, onnxruntime-win-x64-gpu-1.14.1. Could you advise what I should do? Thanks in advance.

    • @ONNXRuntime
      @ONNXRuntime  1 year ago

      1. Can you confirm you completed all the install steps, including updating the drivers for your card?
      2. ORT 1.14 is built with CUDA version 11.6. Can you install 11.6 and see if that resolves the issue?
      Install CUDA for Windows steps: docs.nvidia.com/deeplearning/cudnn/install-guide/index.html#install-windows

    • @yanhuacui
      @yanhuacui 1 year ago

      @@ONNXRuntime Thanks very much. Q1: yes. Q2: it succeeds after installing CUDA Toolkit V11.6 and downgrading onnxruntime to onnxruntime-win-x64-gpu-1.13.1.
      If I use CUDA Toolkit V11.6 with onnxruntime-win-x64-gpu-1.14.1, it throws an exception when calling SessionOptionsAppendExecutionProvider_CUDA. It says "Unhandled exception at 0x00007FFD3AAC286E (ucrtbase.dll) in fns_candy_style_transfer.exe: Fatal program exit requested."
      Does that mean this is a compatibility issue? Can't I use a CUDA Toolkit above V11.6? Why does it throw an exception with the latest onnxruntime?

    • @ONNXRuntime
      @ONNXRuntime  1 year ago +2

      I tested ORT 1.14 with CUDA 11.6 and it works. I updated the sample project for the new version of ORT. Can you try grabbing the most recent updates to the sample project and see if you get the same error?
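
For later readers hitting the same mismatch: each ONNX Runtime GPU release is built against a specific CUDA/cuDNN pairing, so the installed Toolkit must match that release's requirements. A hedged sketch of enabling the CUDA provider through the C++ API, equivalent to the C-API call quoted above ("model.onnx" is a placeholder path, not a file from the sample):

```cpp
#include <onnxruntime_cxx_api.h>

int main() {
    Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "cuda-demo");
    Ort::SessionOptions sessionOptions;

    // Request the CUDA execution provider on GPU 0. This throws (the C API
    // instead returns an OrtStatus) if the GPU package is missing or the
    // CUDA/cuDNN DLLs don't match the version ORT was built against.
    OrtCUDAProviderOptions cudaOptions{};
    cudaOptions.device_id = 0;
    sessionOptions.AppendExecutionProvider_CUDA(cudaOptions);

    // ORT_TSTR produces the wide-character path Windows builds expect.
    Ort::Session session(env, ORT_TSTR("model.onnx"), sessionOptions);
}
```

Appending the provider must happen before the Ort::Session is constructed; otherwise the session silently falls back to CPU.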

  • @XXXXXDarkLord
    @XXXXXDarkLord 2 years ago +3

    I followed the video exactly and I got stuck at 1:05 with the error "cannot open source file "opencv2/highgui/highgui.hpp"". I have installed the NuGet packages and added the header file "imgproc.hpp" to the include paths in the project properties. The file "highgui.hpp" was not even in the packages. Also, "imgproc.hpp" is not found even after adding the path. I also downloaded the full example from GitHub, and there too the OpenCV header files are not found. Do you have any ideas, or does the example in the video not work anymore? (Visual Studio 2019)

    • @FereshtehJourney
      @FereshtehJourney 2 years ago +1

      I have the same problem too! I am running Visual Studio 2019 and I installed the packages through NuGet.

    • @ONNXRuntime
      @ONNXRuntime  2 years ago +4

      I was able to recreate this issue. It is a problem with the OpenCV NuGet package I was using, which previously worked fine. I updated to this NuGet package and it resolved the issue: www.nuget.org/packages/opencv4.2/
      I will update the sample project to reflect this change. Thanks for letting me know!!

  • @JirongYi
    @JirongYi 1 year ago

    Does not work for me. I got tons of errors when I followed this tutorial. Not sure if it's because my machine is on Win11 with the Community edition of Visual Studio.

    • @ONNXRuntime
      @ONNXRuntime  1 year ago

      This project was built on Win11 with Visual Studio. Can you post the error?

    • @JirongYi
      @JirongYi 1 year ago

      @@ONNXRuntime Thanks for your timely reply. Definitely appreciate any help from you. I fixed some of the errors, which were due to system configuration. There are still some left. One of them is "The ordinal 3 could not be located in the dynamic link library C:\Users\yijir\Downloads\cpp-onnxruntime-resnet-console-app-main\x64\Debug\OnnxRuntimeResNet.exe". Hope to get help from you. Thanks

    • @JirongYi
      @JirongYi 1 year ago +2

      Quick update. I just fixed the bug and got the same results as you did. Thanks for creating this great tutorial. Below is what I got from the console.
      Hwllo 1: golden_retriever 11.9425
      2: Tibetan_terrier 10.291
      3: otterhound 8.63227
      4: Tibetan_mastiff 8.61618
      5: kuvasz 8.55493
      C:\Users\yijir\Downloads\cpp-onnxruntime-resnet-console-app-main\x64\Debug\OnnxRuntimeResNet.exe (process 16600) exited with code 0.
      To automatically close the console when debugging stops, enable Tools->Options->Debugging->Automatically close the console when debugging stops.
      Press any key to close this window . . .

    • @megistone
      @megistone 1 year ago

      @@JirongYi Can you please tell how you fixed the ordinal 3 error?
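
The console listing above is just the label/score pairs sorted by score. For readers reimplementing that step, a small standalone sketch of the ranking (my own helper, not the sample's exact code):

```cpp
#include <algorithm>
#include <string>
#include <utility>
#include <vector>

// Return the k highest-scoring (label, score) pairs in descending order,
// i.e. the "1: golden_retriever 11.9425" style ranking the sample prints.
// partial_sort orders only the first k elements, which is all we need.
std::vector<std::pair<std::string, float>> topK(
        std::vector<std::pair<std::string, float>> scored, size_t k) {
    k = std::min(k, scored.size());
    std::partial_sort(scored.begin(), scored.begin() + k, scored.end(),
                      [](const auto& a, const auto& b) { return a.second > b.second; });
    scored.resize(k);
    return scored;
}
```

The scores here are raw logits, matching the >1 values in the console output above; apply a softmax first if probabilities are wanted.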

  • @vibhudalal901
    @vibhudalal901 2 years ago

    This doesn't seem to work on VS 2022. Can anyone else confirm?

    • @ONNXRuntime
      @ONNXRuntime  2 years ago

      Did you include .Net Core 3.1 in the VS2022 installation?

    • @vibhudalal901
      @vibhudalal901 2 years ago

      @@ONNXRuntime Yes

    • @Navhkrin
      @Navhkrin 2 years ago +1

      @@ONNXRuntime For some ultra weird reason #include <onnxruntime_cxx_api.h> does not work in VS2022, but #include "onnxruntime_cxx_api.h" works completely fine.

    • @ONNXRuntime
      @ONNXRuntime  2 years ago +1

      I updated the sample in GitHub to VS2022 and it works with both <> and "" includes. Please grab the new source and let me know if you still have an issue! :)

    • @krinodagamer6313
      @krinodagamer6313 1 year ago

      I've run into that issue too. Which version do you have?

  • @pythonp7217
    @pythonp7217 9 days ago

    Outdated