Autoencoder Forest for Anomaly Detection from IoT Time Series | SP Group

  • Published: 18 Nov 2024

Comments • 10

  • @jmvanlith 4 years ago +3

    Great idea to cluster on time!

    • @MrProzaki 4 years ago

      Yep, agree. Just watched it and I can't wait to test that!

  • @MichalMonday 2 years ago +1

    Hello, are there any publications about this method?

  • @erminkevric4921 2 years ago

    How is the specific autoencoder selected in the end, when the test data is passed in?

  • @tthaz 4 years ago

    Excellent talk. Wondered how you label your data in the first place.

  • @markus-sagen 4 years ago

    Great talk

  •  4 years ago +1

    Don't you effectively mask your training data to exclude the linear example? It would be interesting to see how the single encoder looks if you run the same masking on the input before training it.

    • @YiqunHu 2 years ago

      The reason for applying multiple autoencoders to different shifted windows of the training data is that even the same repeating pattern becomes a different pattern when you look at it from a different starting point. With a single autoencoder, that one model needs a lot more representational capacity, and it becomes hard to trade off complexity against generalization capability. A rough sketch of the multi-encoder setup is shown below.
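
      A minimal sketch of the idea in this reply, not the speaker's implementation: sliding windows are grouped by their phase within an assumed repeating cycle, one small autoencoder is trained per group, and a test window is scored by its reconstruction error under the best-matching autoencoder. The window size, number of phase groups, cycle length, and the scikit-learn MLP standing in for a real autoencoder are all illustrative assumptions.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      period, window, n_groups = 48, 24, 4   # assumed: daily cycle at 30-min resolution

      # Synthetic "normal" IoT signal: a repeating daily pattern plus noise.
      t = np.arange(period * 60)
      signal = np.sin(2 * np.pi * t / period) + 0.05 * rng.standard_normal(t.size)

      # Slide a window over the signal and bucket windows by phase (offset within the cycle).
      def windows_by_phase(x):
          groups = {g: [] for g in range(n_groups)}
          for start in range(len(x) - window):
              groups[(start % period) * n_groups // period].append(x[start:start + window])
          return {g: np.array(v) for g, v in groups.items()}

      # One small autoencoder per phase group: an MLP trained to reproduce its own input.
      forest = {}
      for g, X in windows_by_phase(signal).items():
          ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
          ae.fit(X, X)
          forest[g] = ae

      def anomaly_score(w):
          """Reconstruction error under the best-matching autoencoder in the forest."""
          w = w.reshape(1, -1)
          return min(float(np.mean((ae.predict(w) - w) ** 2)) for ae in forest.values())

      # A window from the learned pattern scores low; a linear ramp (never seen) scores high.
      print("normal window score: ", anomaly_score(signal[:window]))
      print("linear ramp score:   ", anomaly_score(np.linspace(-1.0, 1.0, window)))

      In this sketch an anomaly is flagged when even the minimum reconstruction error across the whole forest is high; how the talk itself selects the autoencoder at test time is exactly the open question asked above.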

  • @najmesouri2088 4 years ago +2

    Excellent. Can we see the code?

  • @abhalla 4 years ago +1

    Very good talk