I loved this piece years ago and YouTube decided today was a good day to remind me of it. I had forgotten about it, and now I remember how much I enjoy it. Congratulations!
Thank you so much!
The program is interactive via a foot pedal that performs different actions depending on how far the piece has progressed (adding to the texture, reducing the texture, recording and playing back the flute, etc). Many sections of the electronic part are generated indefinitely until the pedal is pressed again, so the player does not have to count measures precisely.
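In case it helps to see how that kind of pedal logic can be organized, here is a minimal SuperCollider sketch (not the actual code from the piece): it assumes a foot pedal that sends MIDI control change 64, and the three cue functions are placeholders standing in for real sections.

// minimal sketch of a pedal-driven cue list (not the code from the piece)
// assumes a foot pedal sending MIDI CC 64; the cue contents are placeholders
(
s.waitForBoot({
	MIDIClient.init;
	MIDIIn.connectAll;

	~cues = [
		{ ~drone = { |freq = 60| SinOsc.ar(freq, 0, 0.1) ! 2 }.play },  // cue 1: start a simple texture
		{ ~drone.set(\freq, 90) },                                      // cue 2: alter it
		{ ~drone.release(5) }                                           // cue 3: fade it out over 5 seconds
	];
	~index = 0;

	MIDIdef.cc(\pedal, { |val|
		if((val > 0) and: { ~index < ~cues.size }, {
			~cues[~index].value;       // run the next cue
			~index = ~index + 1;
		});
	}, ccNum: 64);
});
)

Each pedal press runs the next cue and leaves whatever is sounding to continue on its own until a later cue changes it, which is the same general idea as sections that keep generating until the pedal advances the piece.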
Absolutely wonderful
Thanks so much Mark!
Finale 2011. The electronic graphics were created with the Shape Tool (within the Expression Designer) and lots of patience.
Amazing!
This is next-level listening. Beautiful.
We are fortunate to have discovered this work. It is, in this layperson's opinion, a singular composition and performance that may well stand the test of time. Again, thank you for this and the excellent performance. -Ufi
Sorry, Beethoven endures the test of time. If this is art in any way, shape or form, you could have fooled me. Art is about beauty. This is pure gimmickry.
@Ron239 Beethoven endures no more than this; he is just a representative of one particular strand of aesthetics. You can't make him a universal guideline for art. His music is great, however, and so is this. It's simply about something very different, and of course it's beautiful; beauty doesn't need major chords and stuff.
breathtaking
The best example I've found of the use of SuperCollider in electroacoustic composition, very inspiring!!
Amazing! Brilliant use of tech and musicianship.
I like this! Convincing interaction and partnership between the flute and electronics, and I like how it transitions into that very sentimental, almost cinematic section near the end and back out. A transition that is unexpected but still makes sense.
Thank you! “Unexpected but still makes sense” is exactly what I was going for.
Incredible!!! This is awesome!!
I loved it!! I wish to play this piece some day.
This is so cool. The tablature for the CPU IS CRAZY! Almost like some abstract art in itself.
fascinating !!
Congratulations, this is awesome
What I've heard of the Fractus series is awesome! I really like your style. It's such a breath of fresh air from everything else I've heard!
+Felix Pena Thank you!
Stunning, mesmerizing work. Highly inspiring for me, as a graduate level student who is seeking this kind of electroacoustic performance/aesthetic... I'm curious to know more about this piece, how it's prepared, what kind of control the performer has over the electroacoustic elements, etc. Gorgeous work.
What a cool tune!!
I thought I'd heard some wild things in my time, but this takes the cake. I'll have to get a carrying case for my superconductor reactor next time I go on a gig.
Wonderful
Wow, what an interesting piece! I had ideas about writing a piece for cello and live electronics and stumbled upon your channel while researching technical solutions. Do you maybe have a resource covering your approach to interaction with the performer and the programming of different actions for the foot pedal in a piece like this?
Thanks! I don't think I have a video that directly addresses this topic, but I wrote chapters 9, 10 and 11 of my book hoping they'd provide insight in situations like yours.
Hi Eli, thanks for this wonderful piece and for sharing your inspiration and knowledge. One question: what is your approach to composing this kind of piece? Do you first create the instrumental part (flute in this case) and then the SuperCollider part, or do you work in sections and complete both parts together? Best regards
Cooool!
Loved it! Truly beautiful.
this is amazing! thanks for sharing
This is so beautiful!
MINDBLOWING
Really enjoying this. I listened to it a few years ago and it's still inspiring.
damn! love it! great work.
I hope you perform this!
Absolutely amazing!
Congratulations on your work, this was inspiring. Great music and a great performance; I'm happy I got to hear it. Thank you for sharing it!
love it!!
Love it! Quite an eclectic piece you created here... I liked it!
Wow, awesome!
Awesome!
Amazing work, Eli. Whatever you're doing, keep doing it!
Mostly one performance. Kenzie and I recorded large sections two or three times each and then chose our favorites.
great!
Brilliant
I wish I knew how they programmed the sheet music to play out in sequence
Thank you for the answer.
Maybe my second question wasn't clear enough:
I'm also interested in the notation software you used to edit the electronic part of the score. What software do you use?
good
The piece is very interesting, and very well performed. Good job!
Thanks for sharing this.
Does the program interact in real time with the instrumentalist, or does the algorithmic part run on its own while the player plays in time?
The score is very nice, in particular the CPU part. Which editor do you use?
(I'm just discovering SuperCollider)
OK, ... I was searching for a magic SuperCollider plugin that would edit the score automatically from the real-time code ... but if the solution is "a lot of patience" ...
What did you use as your interactive element? WiiMote or TouchOSC?
Neither-- a MIDI foot controller is the only physical means of human-computer interaction. The piece is designed to sound interactive, and the performer can make certain choices in response to the computer-generated sound, but the piece is inherently linear and follows a loose but predetermined trajectory.
I am really interested in using supercollider in my latest composition but I simply do not have enough technical knowledge about the software. My problem is I don't know how to make the code so that the program interacts with instruments through a microphone live. I am also interested in whether it would be possible to connect a VST instrument to interact live. Is there anywhere you could please point me to to get me started? I do have knowledge of program coding in many languages. By the way, I really enjoy your Fractus series.
Perhaps a simple example or a tutorial?
Learning SC takes time. If it were simple enough that anyone could pick it up in a few hours, there would be little need for this video tutorial series, the SC book, and the dozen or so PDF tutorials floating around on the web.
SC cannot interface with VST plugins, as far as I know.
SoundIn.ar is the simplest UGen for pulling a live mic signal into SC for further processing. From there, it is simply a matter of designing SynthDefs to create the desired effects, and creating the desired nodes on the server at desired times.
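For example, here is a minimal sketch of that idea (the comb delay is just an arbitrary choice of effect, not something from the piece):

// live mic input routed through a simple effect
(
SynthDef(\liveComb, { arg in = 0, out = 0, amp = 1;
	var sig;
	sig = SoundIn.ar(in);                      // pull the live mic signal into SC
	sig = CombL.ar(sig, 0.5, 0.25, 3, amp);    // process it: quarter-second echoes with a 3-second decay
	Out.ar(out, sig ! 2);
}).add;
)

x = Synth(\liveComb);   // create the node when this effect should begin
x.free;                 // remove it when the section is over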
Coding a larger, more generalized performance structure is a difficult topic to address in a tutorial, principally because there are so many fundamentally different approaches, each dependent on the type of music being composed, and each has unique strengths and weaknesses. I would be happy to demonstrate my own approaches (and likely will do so at some point), but I think it is far more valuable for neophytes to dive in on their own, make their own mistakes, and eventually discover an approach which works well for them, specifically.
Thank you for all of your help. Basically, I want to create a composition where a microphone picks up part of a sound (like the harmonics of a piano), and those sounds are amplified and transformed both live and 'stored' for an appropriate time. I have found some code for streaming live sound from the microphone:
// route the live mic signal to a private stereo bus, then read it back with reverb
s = Server.local;
s.boot;
b = Bus.audio(s, 2);
(
SynthDef(\MicThru, { arg out;
	var mic;
	mic = SoundIn.ar([0, 1]);    // stereo live input from hardware inputs 0 and 1
	Out.ar(out, mic);
}).play(s, [\out, b]);
)
(
SynthDef(\MicVerb, { arg out = 0, in;
	var rev;
	rev = FreeVerb.ar(In.ar(in, 2));    // read both channels of the bus, add reverb
	Out.ar(out, rev);
}).play(s, [\out, 0, \in, b], addAction: \addToTail);    // tail placement so it reads after the mic synth writes
)
Also, for storing in a buffer:
s = Server.local;
s.boot;
b = Buffer.alloc(s, 44100 * 5.0, 2);    // 5-second stereo buffer (assumes 44.1 kHz)
// record
(
SynthDef(\MicRecordBuf, { arg bufnum = 0;
	var mic;
	mic = SoundIn.ar([0, 1]);
	RecordBuf.ar(mic, bufnum, doneAction: 2, loop: 0);    // record once through the buffer, then free the synth
}).play(s, [\bufnum, b]);
)
// play the recording
(
SynthDef(\MicPlayBuf, { arg out = 0, bufnum = 0;
	var playbuf;
	playbuf = PlayBuf.ar(2, bufnum, doneAction: 2);    // play the buffer once, then free the synth
	Out.ar(out, playbuf);
}).play(s, [\out, 0, \bufnum, b]);
)
The next stage is to work out how to modify the code for my own purposes, e.g. press a button to activate the record code, then press another button to play the recording. Any suggestions, please?
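One possible direction (a minimal sketch, not a definitive answer): assuming the \MicRecordBuf and \MicPlayBuf definitions above have already been sent to the server by evaluating those blocks once, and assuming a MIDI controller whose two buttons send note numbers 60 and 62, something like this would trigger them on demand:

// hypothetical button mapping: note 60 starts a recording, note 62 plays it back
(
MIDIClient.init;
MIDIIn.connectAll;

MIDIdef.noteOn(\buttons, { |vel, num|
	if(num == 60, { Synth(\MicRecordBuf, [\bufnum, b]) });          // record 5 seconds into buffer b
	if(num == 62, { Synth(\MicPlayBuf, [\out, 0, \bufnum, b]) });   // play buffer b back once
});
)

A GUI Button or the computer keyboard could stand in for the MIDI controller; the idea is the same either way: the record and playback SynthDefs sit on the server, and each trigger simply creates a new node.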
I did this with Serum but I lost it ;(
Flute City Stories
Pretty sure almost no one can play this unless they get A LOT of practice