Convolutional Differentiable Logic Gate Networks - NeurIPS Oral - difflogic

  • Published: 18 Dec 2024

Comments •

  • @sword_of_the_morn · 3 days ago · +1

    Came here after I saw a picture of the poster from NeurIPS. I wish more deep learning researchers made videos explaining their papers in brief. Great use of Manim.

  • @boulabiar · 1 month ago · +13

    Impressive.
    Some ideas look so natural and so useful that we ask ourselves why we haven't tried them before.
    Congrats!

  • @sam.scrolls · 1 month ago · +5

    Logic gates are an interesting thing for a model to learn. And the inference time measured in nanoseconds is insane!!

  • @I_am_who_I_am_who_I_am · 12 days ago · +2

    We are working on the same topic, except that I'm treating the gates as superpositions rather than differentiable parts. Congrats on beating me to it!

  • @physiologic187 · 1 month ago · +4

    Wow! This is what real research is about. What a cool idea!! Keep up the great work.

  • @robharwood3538 · 12 days ago

    Just awesome! I've been wondering about how logic and neural networks could be combined for a long time, and this seems exactly the kind of thing I was hoping was possible! Amazing work, guys!

  • @shehrozeshahzad4363 · 1 month ago · +1

    Mind-blowingly amazing, and it reduces computation too 👏!

  • @AmanSharma-ug6sr · 1 month ago · +1

    Amazing work!

  • @patrickl5290 · 13 days ago

    Damn, this is impressive. Where did you get the idea?

  • @phaZZi6461 · 1 month ago · +2

    this is extremely cool

  • @emmanuelbalogun6757 · 1 month ago

    I love this. Thought about this a while ago, great work!

  • @chassemyland · 1 month ago

    This idea seems like it could be a genius one.

  • @SuperJg007 · 18 days ago

    this is very interesting. cheers 🎉

  • @ravipratapmishra7013 · 16 days ago

    Nice. Just want to know: what pushed you guys to use gates?

  • @sedthh · 23 days ago

    Wow, the implications here for running embedded models with such a speedup are amazing!
    Can you elaborate on how you create logic gates for real-valued inputs? After passing through any of the weighted functions, the inputs will no longer be binary, right?
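
The question above is about how real-valued activations stay compatible with logic gates. In the difflogic papers, each two-input gate is relaxed to a real-valued, probabilistic form on [0, 1], and every neuron learns a softmax mixture over the 16 possible gates; a convex combination of values in [0, 1] stays in [0, 1], so layers can be stacked. Below is a minimal, hypothetical PyTorch sketch of that idea; the class and parameter names are illustrative, not the difflogic library's API.

```python
import torch
import torch.nn as nn

class SoftLogicGate(nn.Module):
    """One differentiable logic neuron: a softmax mixture over the 16
    two-input Boolean functions, each relaxed to a real-valued form.
    Inputs a, b are expected in [0, 1]; the output stays in [0, 1]."""

    def __init__(self):
        super().__init__()
        self.w = nn.Parameter(torch.randn(16))  # one logit per gate type

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        # Probabilistic relaxations of all 16 two-input gates.
        gates = torch.stack([
            torch.zeros_like(a),      # FALSE
            a * b,                    # AND
            a - a * b,                # A AND NOT B
            a,                        # A
            b - a * b,                # NOT A AND B
            b,                        # B
            a + b - 2 * a * b,        # XOR
            a + b - a * b,            # OR
            1 - (a + b - a * b),      # NOR
            1 - (a + b - 2 * a * b),  # XNOR
            1 - b,                    # NOT B
            1 - b + a * b,            # A OR NOT B
            1 - a,                    # NOT A
            1 - a + a * b,            # NOT A OR B
            1 - a * b,                # NAND
            torch.ones_like(a),       # TRUE
        ], dim=-1)
        # Convex combination: the softmax weights sum to 1, so the result is
        # again in [0, 1] and can feed the next layer of soft gates.
        return (gates * torch.softmax(self.w, dim=-1)).sum(dim=-1)
```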

  • @סרטוניםבעמ · 1 month ago · +5

    Could you please share the manim code for this video? Thank you!

  • @mprone · 29 days ago · +3

    Are DLGNs interpretable (or more interpretable than) traditional neural networks? Afaik, the field of digital design is "very well understood" and HW designers have been using synthesis tools for 40+ years to map HDL code (i.e., operations) to real logic circuits. I am wondering if any of this could be reused to understand what a logic gate (or a group of them) at a given layer is doing to perform the task -- something that is arguably not really possible in a traditional DNN.

    • @I_am_who_I_am_who_I_am · 12 days ago

      Their differentiable gates can be completely replaced by others during backpropagation, which makes this unsuitable for hardware unless you re-route the prior layer to different paths whose behavior we know a priori. Thus, in hardware you need to create a lot of redundancy, which is not a bad thing; the whole brain is redundant, starting from the fact that it has two separately functioning hemispheres. The idea is amazing. I'm working on something similar, but the gates are in probabilistic superposition, so they don't have to be replaced with others but rather turned partially on and off. Certainly, I have a lot of inherent redundancy inside my model. And since each gate is a single CPU operation, this is extremely fast.
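
On the interpretability/hardware point in this thread: in the difflogic setup, a trained network is discretized by keeping, for each neuron, only its highest-weighted gate, after which the whole model is an ordinary Boolean circuit that standard synthesis tools could in principle consume. A toy illustration, continuing the hypothetical SoftLogicGate sketch above (again with illustrative names, not the library's API):

```python
# Names for the 16 relaxed gates, in the same order as in SoftLogicGate.forward.
GATE_NAMES = [
    "FALSE", "AND", "A_AND_NOT_B", "A", "NOT_A_AND_B", "B", "XOR", "OR",
    "NOR", "XNOR", "NOT_B", "A_OR_NOT_B", "NOT_A", "NOT_A_OR_B", "NAND", "TRUE",
]

def harden(gate: SoftLogicGate) -> str:
    """Collapse a trained soft gate to its single most probable discrete gate.
    Applying this to every neuron turns the network into a fixed Boolean circuit."""
    return GATE_NAMES[int(gate.w.argmax())]
```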

  • @sampadchowdhury6583 · 18 days ago

    I have been very interested in this topic and have been trying to get my hands on it; however, I have a question. Is this scalable? Can we increase the number of inputs per gate beyond 2?

  • @Justusv.Hodenberg · 1 month ago · +1

    Interesting!