- Videos: 5
- Views: 111,792
Wang Jui-Hsien
Joined May 5, 2014
[SIGGRAPH 2019] KleinPAT: Optimal Mode Conflation For Time-Domain Acoustic Transfer Precomputation
This video goes with our SIGGRAPH 2019 technical paper on precomputed rigid-body sound models. Our KleinPAT system is over 4,000× faster than state-of-the-art methods, and can build realistic rigid-body sound models in minutes; these models can then be used in real-time virtual environments (games, VR/AR/MR, what have you) to generate realistic, synchronized sounds on the fly. See below for a link to our runtime demo:
ruclips.net/video/E-du5ihBDWs/видео.html
For more details about the paper, please see our project website:
graphics.stanford.edu/projects/kleinpat
Paper abstract:
We propose a new modal sound synthesis method that rapidly estimates all acoustic transfer field...
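For readers unfamiliar with the technique the abstract names: modal sound synthesis represents an object's sound as a sum of exponentially damped sinusoids, one per vibration mode. The sketch below illustrates only that general idea; the mode frequencies, dampings, and gains are made-up illustrative values, not taken from the paper, and it omits the acoustic transfer fields that KleinPAT actually precomputes.

```python
import numpy as np

SR = 44100                          # sample rate (Hz)
t = np.arange(int(0.5 * SR)) / SR   # 0.5 s of audio

# Each mode: (frequency in Hz, damping in 1/s, gain).
# Hypothetical values standing in for a real object's modes.
modes = [(440.0, 8.0, 1.0), (1230.0, 15.0, 0.6), (2790.0, 30.0, 0.3)]

# An impact excites every mode; the resulting sound is a
# sum of damped sinusoids, one term per mode.
audio = sum(g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
            for f, d, g in modes)
audio /= np.max(np.abs(audio))      # normalize to [-1, 1]
```

A real system like KleinPAT additionally precomputes how each mode radiates into the surrounding air (the acoustic transfer), which is the expensive step the paper accelerates.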
Views: 2,684
Videos
[SIGGRAPH 2019] KleinPAT runtime demo
Views: 1.2K · 5 years ago
This is a demo showcasing the expressive rigid-body sound models built by our KleinPAT system. This runtime demo is written in C++, and runs in real time on my 2015 Mac laptop. The sound is generated on the fly (no pre-recording!), and is suitable for any interactive virtual environment: games, VR/AR/MR, what have you. Check out the following link for the paper video and more technical details a...
[SIGGRAPH 2018] [Highlights] Toward Wave-based Sound Synthesis for Computer Animation
Views: 12K · 6 years ago
This is a highlight version of our 2018 SIGGRAPH paper. For more technical details and comparison see the following video: ruclips.net/video/su6z9snjU-U/видео.html Project webpage: graphics.stanford.edu/projects/wavesolver/
[SIGGRAPH 2018] Toward Wave-based Sound Synthesis for Computer Animation
Views: 93K · 6 years ago
NEW 08/02/2018: We have a highlight version of the video. See here: ruclips.net/video/5I8KCTuDBek/видео.html graphics.stanford.edu/projects/wavesolver/ Jui-Hsien Wang, Ante Qu, Timothy R. Langlois, and Doug L. James. 2018. Toward Wave-based Sound Synthesis for Computer Animation. ACM Trans. Graph. 37, 4, Article 109 (August 2018), 16 pages. doi.org/10.1145/3197517.3201318 Abstract: We explore a...
[SIGGRAPH2017] Bounce Maps: An Improved Restitution Model for Real-Time Rigid-Body Impact
Views: 3.3K · 7 years ago
Jui-Hsien Wang, Rajsekhar Setaluri, Doug L. James, and Dinesh K. Pai. 2017. Bounce Maps: An Improved Restitution Model for Real-Time Rigid-Body Impact. ACM Trans. Graph. 36, 4, Article 150 (July 2017), 12 pages. doi.org/10.1145/3072959.3073634 graphics.stanford.edu/projects/bouncemap/ Abstract: We present a novel method to enrich standard rigid-body impact models with a spatially varying coeffi...
😄
Relatable
Is this open source for anyone to use? Can I have a link to download or more information?
Wow. Excellent work.
I can imagine a time when video games will have real-time sound simulations. This would require a lot of computing, but would save companies a lot of money cuz they save on sound recording and stuff… Maybe idk
THIS WILL BE REVOLUTIONARY FOR MUSIC PRODUCTION. I am imagining a VST synthesizer where you can model sounds from real-life objects!! Or create virtual objects in a 3D space, then generate sounds produced by hitting them, blowing on them, etc.
Could even lead eventually to an accurate guitar emulator. Guitar sample instruments don't come close to the real thing, but this could change that
WTF, you guys created a world following our physics inside a computer. If only someday computers get really, really fast
🅱️
And after all of this, we have among us
1:31
This is absolutely fucking groundbreaking.
now simulate two drums and a cymbal falling from a cliff
Is this realtime program very gpu/CPU intensive? Just wondering whether or not it's hard for computers and stuff to render out the sounds
Why the hell is this not available for sound designers/visual artists/musicians to use?!!!! The World Needs It.
I noticed that the fan blocks the voices and maybe reflects them, but doesn't seem to "chop" the breath carrying the voice like it would in this dimension.
I'm currently making a physics game in VR and I'm working on the sounds. Is there a way to download this somewhere? Porting this to the Unity asset store would be invaluable for lots of developers like myself. Cool project!
DAMM
IS THIS THE AUDIO VERSION OF RAY TRACING???!
How do I get this? Seriously xD
2:59 *_B_*
🅱️
I knew this was possible, but it probably takes too many resources to be useful nowadays. However, it's good to know that the algo already exists.
man these 3d research papers are the most surreal shit ive ever seen
This kind of stuff is really cool, it gets even crazier when you think about it in the context of things like video game application.
this is like realtime ray tracing, but for sounds.
acoustic and asthetic
Was the cut-off mid sentence at the end intentional?
The metal sheet and bowl were great. Cymbals not at all.
load in a xylophone model and play a melody on it ::)
3:33 When the plunger went in front of the trumpet the first time (just to show the animation) my brain was automatically changing the sound I heard... When I watched the clip again without looking at the screen this time, I heard it as it should sound. very interesting
Simulating compression waves in a virtual space to generate real-world organic sounds? Incredible.
This is just perfection. Why is this not implemented in games? Or is it?
we are about to enter the matrix folks
i wish i can ABCD in a barrel
a *B* c (d)
Amazing!
well the cymbal was kinda disappointing.
2:30 Arch-ae-ol-o-gists they like bones, and Ancient civilizations arch-ae-ol-o-gists (And one of them's gay)
4:34 inspiration for psycho pass by Xavier wulf
For anyone unfamiliar with Latin or how academic papers are written, "et al." means "and others." So it is researcher Langlois and others being cited, implying that more than one person was deeply involved in or helped write the paper.
Thanks. I was wondering about that for a moment
This is amazing....
This is extraordinary...!!!
Now I totally believe we could be living inside a simulation
The legos sound kinda soft
These videos are cool and all and I'm amazed by the work that's gone in to all of these techniques but this is the first time a siggraph demonstration has really made me question my grasp on reality.
"A, [megaphone] B, C, [deep pot] D"
Archeologists they like bones and ancient civilisations, archaeologists!
My man. World Doctors is hilarious.
@@HarmoniChris I just noticed the character model in the video hahah.
3:33 Is that a cat on the reflection on trumpet?
Wow, this technology is awesome. I can see that in 20 years it'll be normal to use it in video games and interactive video-game movies.
Can't wait for this to be implemented into gaming. But I will wait.
5 years later still nothing
We need this shit in games
Oh my god, this is amazing!