GPUs, from Simulation to Encryption (with Agnès Leroy)

  • Published: 16 Nov 2024

Comments • 25

  • @johanmaasing
    @johanmaasing 22 days ago +1

    This conversation was a delight to listen to.

  • @RodrigoStuchi
    @RodrigoStuchi 1 month ago +1

    It was such an amazing conversation that I felt as if an hour passed in the blink of an eye

  • @kompila
    @kompila 1 month ago +9

    It is a new era of GPU computation. That is why content like this is crucial for our improvement.
    Thanks Chris and Agnès.

    • @colinmaharaj50
      @colinmaharaj50 1 month ago

      Yes, but this has been available since the advent of CUDA back in 2008. Before that, it was called GPGPU.

  • @onedarkcoder
    @onedarkcoder 1 month ago +2

    Great interview! During my undergrad, I worked on fluid simulation for my final year project. Later, I transitioned into programming at a bank. Listening to her talk about Maple software, CUDA programming, and SPH techniques brought back so many memories; I could understand it all. Definitely a nostalgic moment!

  • @mananabanana
    @mananabanana 1 month ago +7

    As someone who used to do computational ecology and struggled with parallelizing seemingly sequential problems, I really enjoyed this episode. Thanks Kris and Agnes!

  • @randxalthor
    @randxalthor 1 month ago +3

    Fantastic interview. Agnes is in rarefied air up there, dealing with CFD and then transitioning to FHE. Thanks for hosting her and asking such excellent questions!

  • @paxdriver
    @paxdriver 1 month ago +2

    Thank you soooo much!!! This was my favourite episode ever, Kris. Agnès did so well!!

  • @paxdriver
    @paxdriver 1 month ago +2

    Thanks so much, Agnès!! You explained it more expertly than anyone else online (apologies for my Canadian second-language French 😅)

  • @mattanimation
    @mattanimation 1 month ago +5

    Very cool discussion. You should reach out to the devs making the polynom app about post-quantum cryptography; that would be another interesting discussion as well. I think it's Jeff Phillips at Code Siren.

  •  1 month ago +1

    Great interview. Really curious about that Rusty future of the GPU

  • @budiardjo6610
    @budiardjo6610 1 month ago

    wow, learning a lot of things.

  • @AGeekTragedy
    @AGeekTragedy 1 month ago

    Having the parallelized algorithms from NVidia's Thrust library working over an abstraction like a Rust iterator would be really cool. Dibs on the name thRust.
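[Editor's note: a minimal sketch of what a Thrust-style transform-reduce could look like over a Rust slice, using only the standard library. The function name is made up for illustration; in practice a crate like rayon already provides `par_iter()` for exactly this pattern.]

```rust
// A toy Thrust-style parallel transform-reduce in plain Rust:
// each thread squares and sums its own chunk of the input, and the
// partial sums are combined on the calling thread.
fn parallel_sum_of_squares(data: &[u64], n_threads: usize) -> u64 {
    let chunk_len = ((data.len() + n_threads.max(1) - 1) / n_threads.max(1)).max(1);
    std::thread::scope(|scope| {
        // "transform-reduce": spawn one worker per chunk
        let handles: Vec<_> = data
            .chunks(chunk_len)
            .map(|chunk| scope.spawn(move || chunk.iter().map(|x| x * x).sum::<u64>()))
            .collect();
        // combine the partial results
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (0..10).collect();
    // 0² + 1² + … + 9² = 285
    assert_eq!(parallel_sum_of_squares(&data, 4), 285);
    println!("sum of squares = {}", parallel_sum_of_squares(&data, 4));
}
```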

  • @0LoneTech
    @0LoneTech 1 month ago

    I totally agree on preferring Rust over C++, but it isn't a magical silver bullet. What you want isn't just CUDA in Rust, but the best pieces of Halide, Chapel and Futhark. Chapel has a strong concept of domain subdivisions and distributed computing, Halide has algorithm rearrangements, Futhark has a less noisy language with some strong library concepts like commutative reductions and tooling that can autotune for your data. You'd also want a reasonably integrated proof system, as in Idris 2.
    The core thing that Chapel and Halide bring is the ability to separate your operational algorithm from your machine optimizations. E.g. if you chunk something for optimization, the overall operation is still the same. Futhark does some of that too, but only profile guided. Some fields approach this by separately writing formal proofs that two implementations are equivalent instead, but it's a much smoother process if you can maintain that as you write, like Idris attempts.
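[Editor's note: the algorithm/schedule separation the comment describes can be sketched in miniature in plain Rust. One operation (sum of squares), two execution strategies; chunking for cache or parallelism must not change the result. Halide expresses this in its own DSL, so this is only an analogy, and both function names are hypothetical.]

```rust
// One *algorithm*, two *schedules*. If the chunked variant ever
// disagreed with the naive one, the "optimization" changed the meaning.
fn sum_sq_naive(data: &[u64]) -> u64 {
    data.iter().map(|x| x * x).sum()
}

fn sum_sq_chunked(data: &[u64], chunk: usize) -> u64 {
    // "schedule": process fixed-size tiles, then combine partial sums
    data.chunks(chunk.max(1))
        .map(|tile| tile.iter().map(|x| x * x).sum::<u64>())
        .sum()
}

fn main() {
    let data: Vec<u64> = (0..100).collect();
    // the rearranged schedule computes exactly the same value
    assert_eq!(sum_sq_naive(&data), sum_sq_chunked(&data, 8));
}
```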

  • @robertlawson4295
    @robertlawson4295 1 month ago

    Does it sound like Kris has never heard of Nvidia CUDA programming?

    • @DeveloperVoices
      @DeveloperVoices  1 month ago +9

      I've heard of it, and knew it was for GPU programming, but nothing beyond that. I've never dived into it. 🙂
      (Actually, I think I did make a failed attempt to install it on my laptop once, which makes me think of the Zig episode we did on reliable cross-platform builds. 😁)

  • @johnridout6540
    @johnridout6540 1 month ago

    Operations on encrypted data - what dark magic is this?
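[Editor's note: the "dark magic" can be demystified with a toy that is emphatically not real FHE. With a modular one-time pad, adding two ciphertexts adds the hidden plaintexts; real schemes like the lattice-based TFHE discussed in the episode are far more involved (and this toy is insecure, since it reuses the key). The constants and names below are invented for illustration only.]

```rust
// Toy additive homomorphism: a server can add ciphertexts without
// ever learning the plaintexts; the key holder decrypts the sum.
const Q: u64 = 1_000_000_007; // arbitrary toy modulus

fn encrypt(m: u64, key: u64) -> u64 {
    (m + key) % Q
}

fn decrypt(c: u64, key_total: u64) -> u64 {
    // subtract however many copies of the key the ciphertext accumulated
    (c + Q - key_total % Q) % Q
}

fn main() {
    let key = 123_456_789;
    let (m1, m2) = (20u64, 22u64);
    // the party doing this addition never sees m1 or m2
    let c_sum = (encrypt(m1, key) + encrypt(m2, key)) % Q;
    // two ciphertexts were summed, so two copies of the key come off
    assert_eq!(decrypt(c_sum, 2 * key), m1 + m2); // 42
}
```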

  • @morgomi
    @morgomi 1 month ago

    Shaders are nice, but CUDA is ugly.