What is Mean Average Precision (mAP)?

  • Published: 22 Oct 2024

Comments • 24

  • @shukkkursabzaliev1730 · 1 year ago · +2

    Thank you guys for all the hard work you do, and for making it available to all of us for free!

  • @sonnyson0723 · 3 months ago

    Thanks all for the instruction.

  • @Anton_Sh. · 7 months ago

    7:10
    The IoU is not just the amount of overlap between the two boxes; it's "Intersection over Union", i.e. the area of overlap divided by the area of union. It's a proportion, whereas the intersection alone is the overlap value (a quick sketch follows below).
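
A minimal Python sketch of that ratio, assuming axis-aligned boxes given as (x1, y1, x2, y2); the function name and example boxes are illustrative only:

```python
def iou(box_a, box_b):
    """Intersection over Union for two axis-aligned boxes (x1, y1, x2, y2)."""
    # Intersection rectangle
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    # Union = area_a + area_b - intersection
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```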

  • @brunozana · 2 months ago

    Best video on this topic

  • @XiaoZhao-d4j · 5 months ago

    AP (for a single class) is calculated at a fixed IoU, right? Because a P-R point depends on two factors, confidence and IoU. When computing the P-R curve, only the confidence threshold is varied (the IoU is fixed).

  • @EvangelosKarajan · 2 months ago · +1

    Great content!

  • @yuganshgoyal6348 · 4 years ago · +26

    1. The F1 score is the harmonic mean of precision and recall, not simply the result of their multiplication (see the formula sketch below this comment).
    2. At 9:20 you didn't really clarify things. So what is mAP:
    a. is it the average of AP at different IoU thresholds for a single class,
    b. or the average of AP across different classes? But then what happens to the AP at different IoUs?
    Overall it is informative, but it would be better if you clarified things a bit more.
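
A minimal sketch of the harmonic-mean point made above; the function name and example numbers are made up for illustration:

```python
def f1_score(precision, recall):
    """F1 is the harmonic mean of precision and recall, not their product."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.8, 0.5))  # ≈ 0.615, whereas the plain product would give 0.4
```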

    • @ben6 · 4 years ago · +1

      I found the same thing in their blog post. It doesn't actually answer the question in the video's title.

    • @alejandromarceloproiettian5079 · 4 years ago · +12

      AP is calculated at a single IoU threshold, as the mean of the precisions achieved at each recall level (i.e. at different detection confidence thresholds).
      AP is calculated per class, and mAP (mean average precision) is the mean of those per-class APs.
      Both AP and mAP depend on the selected IoU threshold and are therefore named after it (mAP50, mAP75, etc.). A code sketch follows this reply.
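
A rough Python sketch of that description: per-class AP at one fixed IoU threshold, then mAP as the mean over classes. It uses simple all-point interpolation of the precision-recall curve; real evaluators (e.g. COCO, Pascal VOC) differ in interpolation details, and the example detections below are made up:

```python
import numpy as np

def average_precision(tp_flags, num_gt):
    """AP for one class at a fixed IoU threshold.

    tp_flags: detections sorted by descending confidence; 1 = true positive,
    0 = false positive (decided by matching against ground truth at that IoU).
    num_gt: number of ground-truth boxes for this class.
    """
    tp_flags = np.asarray(tp_flags)
    tp = np.cumsum(tp_flags)
    fp = np.cumsum(1 - tp_flags)
    recall = tp / num_gt
    precision = tp / (tp + fp)
    # Precision envelope: make the curve monotonically non-increasing
    precision = np.maximum.accumulate(precision[::-1])[::-1]
    # Area under the precision-recall curve (all-point interpolation)
    recall = np.concatenate(([0.0], recall))
    return float(np.sum(np.diff(recall) * precision))

# mAP50: per-class APs computed at IoU 0.50, then averaged across classes
ap_per_class = [average_precision([1, 1, 0, 1, 0], num_gt=4),  # class A
                average_precision([1, 0, 1], num_gt=2)]         # class B
map50 = sum(ap_per_class) / len(ap_per_class)
print(round(map50, 3))
```

Averaging the same per-class APs over several IoU thresholds (0.50 to 0.95 in COCO) gives the stricter mAP50-95 style metric.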

    • @ankitmagan · 3 years ago

      @@alejandromarceloproiettian5079 You mention different detection thresholds. Is this the confidence value that the model outputs?

    • @VinayVerma982 · 3 years ago · +1

      @@ankitmagan The confidence value (confidence score) is the probability that an object is present in a particular anchor box; it mostly comes from the classifier.
      The IoU, on the other hand, is the overlap/union ratio between a predicted bounding box and the ground-truth (actual) bounding box in the labelled dataset. We can calculate mAP when we have a labelled test dataset: we predict boxes and measure how precisely the generated boxes match the ground-truth boxes (a matching sketch follows below).
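
For completeness, a hedged sketch of how detections might be labelled TP/FP against ground truth before feeding an AP routine; the function name and the greedy one-to-one matching shown here are a common simplification, not any specific framework's implementation:

```python
import numpy as np

def label_detections(iou_matrix, confidences, iou_thresh=0.5):
    """Greedy TP/FP labelling for one image and one class.

    iou_matrix[i, j]: IoU between prediction i and ground-truth box j.
    Each ground-truth box can be matched at most once; a prediction with no
    unmatched ground truth above the threshold counts as a false positive.
    """
    iou_matrix = np.asarray(iou_matrix, dtype=float)
    order = np.argsort(-np.asarray(confidences))  # highest confidence first
    matched_gt = set()
    tp_flags = []
    for i in order:
        best_j, best_iou = -1, 0.0
        for j in range(iou_matrix.shape[1]):
            if j not in matched_gt and iou_matrix[i, j] > best_iou:
                best_j, best_iou = j, iou_matrix[i, j]
        if best_iou >= iou_thresh:
            matched_gt.add(best_j)
            tp_flags.append(1)
        else:
            tp_flags.append(0)
    return tp_flags  # sorted by confidence, ready for an AP routine
```

The returned flags are exactly the kind of input the AP sketch above expects.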

  • @diogenesia376 · 1 year ago · +1

    Thank you very much

    • @Roboflow · 1 year ago · +1

      You're welcome 🙏🏻

  • @robertmigliara7827 · 1 year ago

    Nice work. Thanks!

  • @durarara911 · 2 years ago · +2

    Amazingly explained!

    • @Roboflow · 2 years ago

      Glad it was helpful!

  • @Maciek17PL · 2 years ago · +1

    What is that plot with confidence on the y-axis at 4:18? It's super confusing.

  • @nitinbommi1867 · 2 years ago

    Can I get the link to the paper that introduced mAP?

  • @kokebdese4787 · 3 years ago

    Can I get the code to calculate them?

    • @legohistory · 2 years ago · +1

      Use TensorFlow for that.

    • @abbasalsiweedi9019 · 1 year ago

      @@legohistory Isn't it calculated directly inside the Google Colab algorithm folders?

    • @legohistory · 1 year ago

      @@abbasalsiweedi9019 I do not understand. What do you mean?