Kasper T. Toeplitz



__ BassComputer __ Compositions & Composers __ Instruments __

It is difficult to explain exactly what the BassComputer is, but it is at least 2 things in one:

1/ an aesthetic idea, or how to make music - or what music should be - in the 21st century; at least in the beginning of this century. The way I see it, it has a lot to do with the notion of noise/pitch (or pitched noises) - the unknown frontier between the 2. Also the notation - once you decide to produce music which is made of masses and the relations between them, made of noises, made of granularity, traditional notation is not of great use. I, however, still believe that writing music (as opposed to "just" producing sounds) is a great, great strength. What else - the computer, of course, as a machine which helps us to produce the sounds we want to hear, but also helps us to _think_ the music, to structure our ideas. Not quite an instrument - a little more, a little less.

2/ it is also the instrument - a hybrid of an electric bass and of the computer seen as a real-time instrument. It is much more than plugging a bass into a computer and using the computer's DSP as effects. It is not only about _transforming_ the sound produced by the bass (instrument) and its components/parameters (pitch, duration, timbre): it is about an instrument with 2 possible ways in - the bass strings or the computer's keyboard/trackpad/generators. The idea here is to get away from the usual organological limitations - infinite ambitus and infinite polyphony (and infinite polyphony of timbres) - at least in theory.


Then I found this on the web: some guy's blog, describing my work - and the concerts at the GRM, in March 2006, when I played Eliane Radigue's ELEMENTAL II and my own Lärmesmitte on my BassComputer. He totally understood what I am after, why and how the bass, etc. The best introduction to this part of my work. Thanks, Bastien (the blog is

Kasper is just amazing, he has broken so many walls: you can’t tell if he is an academic composer (he has won many prizes) or a noise musician when you hear his compositions or a punk rocker when you see his look. That’s something I really like in Kasper, he takes good stuff from different places. He was actually a bass player before, then he gave up playing bass to focus on computers but now he has mixed them with a totally revamped bass, his instrument is just beautiful. He plays his instrument without any of the rituals attached to the traditional “rock” instrument, something I really appreciate. He played twice during the festival, the first time to present “Larmesmitte”, his own composition for solo BassComputer and the second time he played Elemental II, a composition by Eliane Radigue. Most of his work in these pieces was done through playing with a kind of electronic wind.


Apart from my own compositions for this instrument, I also ask some composers - whose work I feel is great, of course, but also when I have a feeling my playing could add a little "more" to their music - to write pieces for me, or for the BassComputer. The "obligations" are always more or less the same:

___I ask for "long" pieces - that is, more or less one hour, the duration of a CD, or an evening; also because I am fed up with the usual 13-minute pieces;
___They don't have to worry about the instrument - the idea is to play the music, not an instrument;
___For each piece I also write the corresponding computer program - so far that means a MaxMSP patch, but it could be another program - Reaktor, CSound, SuperCollider... It's just that Max feels easier to me;
___Most of the time the idea is to play all of the music in real time - no sound files, no samples... That is, unless the composer really feels it is a big part of his work/style (such was the case with Phill Niblock);
___the rest is a question of discussion between the composer and me.


The "critic's choice" of the Chicago Reader - Friday, March 24, 2006 - on March 26th I played BassComputer there, at LAMPO, a very good concert - and a very long one, almost 3 hours of solo music:

Clad in black leather and sporting a Mohawk, Kasper T. Toeplitz strikes a badass pose onstage, and his music backs it up.
At his first Chicago concert in 2001, the Warsaw-born, Paris-based composer, bassist, and computer musician premiered "Yam Almost May," a throbbing yet lyrical drone piece by Phill Niblock composed from samples of Toeplitz's bass, and the low frequencies seemed to hit listeners' guts as hard as their ears.
For last year's Capture (released on Toeplitz's label, ROSA, an acronym for "Recordings of Sleaze Art") he captured the movements of three dancers via webcam and electronically translated them into sound, generating a slowly accumulating onslaught of flickering high tones and ear-scouring hiss.
For this show [March 25] he'll perform the U.S. premiere of Elemental II by Eliane Radigue, a French septuagenarian who's an unlikely collaborator for Toeplitz. Her detailed, gradually evolving electronic works evoked a meditative state even before she converted to Buddhism three decades ago, and she had never written a piece for another musician to play live. But Toeplitz successfully melds their disparate sound worlds on his recording of the piece. He first weaves a surface of coarse, frayed fabric, then pulls out intricate patterns of radiant, glassy pitches; about a half hour into the 50-minute performance a steady stream of wavering, ascendant sweeps, sounding more like bowed violins than bass guitar, escapes like jets of steam from deep within the earth. Once the pressure's relieved, the piece resolves into distant groans, as if Toeplitz had made a field recording of shifting tectonic plates. Toeplitz will also play three of his own compositions. ------ Bill Meyer



coming soon..........

But in the meantime, at least a picture of my main bass, custom-built by Philippe Dubreuille. It's a beautiful instrument.... 5 strings, + 4 resonant strings... and many more. The pickups were made by Benedetti, and the MIDI implementation by MESI.

Philippe Dubreuille is..................... here
MESI (Marc Sirguy) is .................... here
Benedetti (Nicolas Mercadal) is .. here

More about instruments is here.
And all the "computer" part of it is done mainly with MaxMSP, even if sometimes it can be other software such as Reaktor, CSound... and various plug-ins.


Compositions & Composers

The first real piece for BassComputer - all the experiments were defined with this one. A one-hour-long exploration of the possibilities, of some of the interactions between the 2 elements. I played it for some years, but never recorded it - at least not a "real" recording: I do, however, have 2 live recordings, one from Bordeaux (during a "bass festival" organised by Erik Baron) and one made in Chicago, during my first visit to Lampo. The really early age of BassComputer. I might come back to it someday, but.... it is a little bit primitive. Then again, you have to start somewhere, don't you??
The "Demonology" of the title is because of the writer Kathy Acker - because of one of her books. But I also met Kathy, in San Francisco, and tried to make an opera with her - actually I traveled to SF only to meet her and convince her to do something together.
DEMONOLOGY #11 is 50 minutes long.
Phill Niblock  

The first collaboration with another composer - at least with the BassComputer. This one uses prerecorded material - samples of myself playing bass, with little bits of pitch shift. This technique (instrument + samples of the same instrument) is so much part of Phill's aesthetic that it was difficult to say NO. I chose, however, to control the spatialisation and all the prerecorded sounds myself (as opposed to having them played by the sound guy). It is a piece I have played many times, live. It was released on Touch records - and the recording was made at CCMix.
The piece itself uses bowing and e-bowing - typical of Niblock's fascination with long, held tones. At some moments it is even bowed AND e-bowed at the same time!! Pitch-wise it's C and D#. Now, the idea for the next piece with Phill is to convince him, and to develop a system (in MaxMSP most certainly, but it could be in LIVE) where there would be NO pre-recorded parts and all the superpositions and pitch-shifts would be done in real time.
YAM ALMOST MAY is 28 minutes long.
YAM ALMOST MAY is released on "Phill Niblock : Touch Food", Touch TO:59 (TOUCH is here)
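As a rough illustration of that real-time idea - this is not the actual Max patch, just a Python/NumPy sketch, and the detune amounts in cents are invented for the example - superposing slightly pitch-shifted copies of the same material could look like this:

```python
import numpy as np

def pitch_shift(signal, ratio):
    """Crude pitch shift by resampling (this also changes duration;
    a real-time version would use granular overlap-add instead)."""
    n_out = int(len(signal) / ratio)
    read_pos = np.arange(n_out) * ratio           # fractional read positions
    return np.interp(read_pos, np.arange(len(signal)), signal)

def detuned_stack(signal, cents=(0, 8, -11, 23)):
    """Superpose slightly detuned copies of the same material,
    in the spirit of Niblock's close-pitch layering."""
    layers = [pitch_shift(signal, 2 ** (c / 1200)) for c in cents]
    n = min(len(layer) for layer in layers)       # trim to the shortest copy
    return sum(layer[:n] for layer in layers) / len(layers)

sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 65.4 * t)   # roughly a low C, a bass-like fundamental
cluster = detuned_stack(tone)          # beating, slowly shifting drone cluster
```

The close detunings produce the slow beating characteristic of this kind of layering; a real-time patch would do the same thing per audio block instead of over whole buffers.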

This composition was never meant to be played live - from the beginning it was to be released on a CD - and since it is pretty short (by my usual standards) I don't imagine playing it live - it would mean playing many shorter pieces, which I don't think is such a good idea. But all the usual "tricks" of the BassComputer are present in PURR #2: the use of the bass and also of sounds generated in the computer, in real time, quite an important polyphony (at least one you could never have on a solo bass), the use of a very large ambitus, and wide dynamics - from the very loud beginning it goes to the calm ending part. And also the use of granular synthesis, in real time!!!
The piece was commissioned by the San Francisco MOMA, for their program about French avant-garde music. The curator was Laurent Dailleau.
PURR #2 is 7:13 minutes long.
PURR #2 is released on "33RPM _ Ten Hours of Sound from France" , 23five/SFM 903 (23five is here)
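For readers curious what "granular synthesis in real time" means in practice, here is a minimal offline sketch in Python/NumPy - not the actual PURR #2 patch, and the grain sizes and counts are arbitrary choices for the example. Short windowed grains are read from random positions in a source buffer and scattered onto an output buffer:

```python
import numpy as np

def granulate(source, n_grains=200, grain_ms=40, sr=44100, seed=0):
    """Offline sketch of a granulator: short Hann-windowed grains are
    read from random positions in the source buffer and scattered
    onto an output buffer of the same length."""
    rng = np.random.default_rng(seed)
    glen = int(sr * grain_ms / 1000)
    window = np.hanning(glen)
    out = np.zeros(len(source))
    for _ in range(n_grains):
        read = rng.integers(0, len(source) - glen)   # grain read position
        write = rng.integers(0, len(out) - glen)     # grain write position
        out[write:write + glen] += source[read:read + glen] * window
    return out / np.max(np.abs(out))                 # normalize the cloud

sr = 44100
t = np.arange(2 * sr) / sr
bass = np.sin(2 * np.pi * 55.0 * t)   # stand-in for a recorded bass note
cloud = granulate(bass, sr=sr)
```

A real-time version does the same per audio block, with the grain read positions following the live input instead of a fixed buffer.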
Eliane Radigue  

It took me a very, very long time to convince Eliane to write a piece for me. She had never before written anything for a "live" musician, or at least never written any music without sounds coming from a synthesiser, without prerecorded sounds. (She actually did write "Geelriandre" for ARP synth and Gérard Fremy playing piano, but as she told me, it was a tape piece first, with some live playing on top of it.) So it took me more than 2 years. Then she finally accepted.
There is no score as such - there is a drawing and a "plan", but this is no score. The main thing here was discussing with her, having her come to rehearsals, working and talking. There is no indication of pitches to be played, not much about durations either. A lot about the meaning of it all - which suits me very well, since this is what a music score should be all about, these days - saying that you want "a low pass filter, oscillating" is not enough, and it basically does not say anything... nothing about the music itself. But the "why", defining the goal, is a way in, for sure.
The MaxMSP patch for this piece is actually a 5-part patch - following the idea of the piece having 5 different parts, or states. Nothing really spectacular here (Max-wise), but watching what I do, one cannot guess the kind of resulting sounds - for instance in the "water" part, I am playing long e-bowed lines - which are ring-modulated by small "clouds" of sine generators, whose frequencies are constantly moving - so playing a held note produces a moving result. This is more or less what the whole BassComputer thing is about.
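The "water" technique just described - a held line ring-modulated by a drifting cloud of sines - can be sketched outside Max as well. This is a Python/NumPy illustration with invented frequencies and drift rates, not the actual Elemental II patch:

```python
import numpy as np

sr = 44100
t = np.arange(int(2.0 * sr)) / sr

# stand-in for a long e-bowed bass line: a steady low E
held = np.sin(2 * np.pi * 41.2 * t)

# a small "cloud" of sine generators whose frequencies drift slowly
rng = np.random.default_rng(1)
cloud = np.zeros_like(t)
for base in (220.0, 233.0, 247.0):                       # invented frequencies
    wander = 10.0 * np.sin(2 * np.pi * rng.uniform(0.05, 0.3) * t)
    phase = 2 * np.pi * np.cumsum(base + wander) / sr    # integrate the freq
    cloud += np.sin(phase)
cloud /= 3.0

# ring modulation is plain multiplication: even a perfectly held
# note comes out constantly moving
water = held * cloud
```

Because the modulator frequencies never sit still, the sum and difference tones of the ring modulation keep shifting, which is why a static held note produces a moving result.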
Elemental II is also the piece which made me start the r.o.s.a. label (for Recordings Of Sleaze Art) - I had had the idea for a long time, but it is always more efficient to do things for someone else than for yourself - you feel a kind of moral obligation.
Elemental II was first played during a CCMix festival, in Paris - since then I have played it many times - and still do.
ELEMENTAL II is 50 minutes long.
ELEMENTAL II is released on "ELIANE RADIGUE ELEMENTAL II", r.o.s.a._01 (r.o.s.a. is here)

Static, the piece, was born from a cello piece I wrote for Ulrich Maiss. The cello piece - which Ulrich actually played on electric bass-cello - is named "Cello_Titan" and is written for cello and a real-time Max patch. "Cello_Titan" is actually one of the very few - if not the only one (probably the only one) - of my pieces which uses a sample (or sound) from the "real world". Usually I am very much against samples, let alone sound files of field recordings, "real" sounds, and so on - in my music, at least. But I found, on the internet, a recording of Titan, the moon of Saturn. And it felt just natural to include it in a piece written for Ulrich Maiss and his "über-cello"!!
Then I decided to adapt this piece for BassComputer. Since it is not a cello, and mainly since I changed quite a lot of things - Static is two times longer than "Cello_Titan" - I decided to change the name. The structure of the piece is still evolving, and I am thinking about getting rid of the pre-recorded part and, in the "real" BassComputer tradition, playing it all in real time, pure generation. What is kept from its "cello" origins, and what will stay, being really part of STATIC, is the fact that this piece is only bowed - just a classical bow (either a cello or a double-bass bow).
Even if I consider STATIC to still be a "work in progress", whose structure might still evolve, I have already played it live quite a few times - and usually it is a piece that the audience likes a lot. For some unknown reason (the bow and its romanticism??) it is a piece that women - of all ages - seem to like very much. Which is a good reason to keep playing it.
It is also the first piece which was conceived, from the beginning, on the new bass that Philippe Dubreuille built for me - and which is now becoming the "symbol" of BassComputer. Of course I now play all of the BassComputer concerts on this bass, but the previous pieces were conceived (and played) on other instruments.
STATIC, as of mid-2006, is 30 minutes long.
Lärmesmitte - for "center of the noise" - might be perceived as a good explanation of what BassComputer is about - or at least part of what it is about. Usually, in pieces written for an instrument and live electronics, the instrument makes the notes - that is, the pitches and their durations, and also, to some extent, the timbre - and those elements are then manipulated, transformed, by the electronics. You can modify the timbre (think of filters, wah-wahs), you can change the duration (you can make it longer, mainly - it is much more difficult to make it shorter than what the musician plays - and the beginning of the note is always given by the player, on his instrument), and if you can change the pitch it is relative to the given pitch, the one played - which you transpose, pitch-shift, harmonize.... Add to this some delays and reverb - but basically you are doing (in a much smarter and more precise way) what has been done for years with guitar pedals and effects.
In Lärmesmitte I did just the opposite - on the bass, what I play is, basically, just a wall of noise - not noise as in guitar chords, but noise as in white noise; many, many notes, scratches, noises. An indistinct wall of sonorities, over an ambitus as big and as full as possible. Then, using the computer (pre-programmed, of course), I _sculpt_ in this noise, to create out of it pitches, movements, timbres. Different layers of sound, very thick sounds, of varying "shapes" and densities. Up to some point (which one??) the input could have been something else (anything else??) - actually, when writing the patch I was using a white noise generator as input and not the bass. The bass only gives the physicality, here. Of course, if I don't play, there will be no sound.... The score, the movements of the different layers, their ambituses were written first, away from the sound, away from the bass or from MaxMSP. Written in silence, just as when I compose orchestral music, on paper.
Lärmesmitte was commissioned by the GRM - and they wanted a short piece; short for me. So actually what exists now is what I consider "part 1" of the piece - part 2 will be written by the end of 2006.
Lärmesmitte (part 1) is 30 minutes long.
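The "sculpting" direction described above - pitches carved out of a wall of noise, rather than noise added to pitches - can be illustrated with a few very narrow resonant filters. A minimal Python/NumPy sketch follows; nothing here comes from the actual Lärmesmitte patch, and the frequencies and filter settings are invented:

```python
import numpy as np

def resonator(x, freq, sr=44100, r=0.999):
    """A very narrow two-pole resonant filter: broadband noise in,
    a quasi-pitch out. This is the 'sculpting' direction."""
    w = 2 * np.pi * freq / sr
    a1, a2 = -2 * r * np.cos(w), r * r
    y = np.zeros_like(x)
    for n in range(len(x)):
        # y[-1] and y[-2] are still zero when first referenced
        y[n] = x[n] - a1 * y[n - 1] - a2 * y[n - 2]
    return y / np.max(np.abs(y))

sr = 44100
rng = np.random.default_rng(2)
wall = rng.standard_normal(sr)     # stand-in for the played wall of noise

# carve three pitched layers out of the same indistinct wall
layers = [resonator(wall, f, sr) for f in (110.0, 165.0, 220.0)]
sculpted = sum(layers) / len(layers)
```

The closer the pole radius `r` gets to 1, the narrower the band that survives - which is also why, as the text says, the input could almost have been anything broadband.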
Dror Feiler  

For OUSIA, Dror Feiler chose not to write a score (or anything which could be similar to a score) but to send me a sound file (which I _think_ was made with ProTools or similar software, all the sounds being created in the computer - I know he uses Reaktor, so this might be one of the sound-creation tools) and then asked me to play _this_. Not to play on top of the sound file (which anyhow he knew I did not want, as I want to stay away from sound files and samples) but to play the music I was hearing on the sound file. Seems pretty easy (or not so easy) when you hear some traditional music - by traditional I mean music using the "classical" form - with a melody, or at least a "voice", with chords, with a clear structure. OUSIA is pure noise - a blast of pure noise, centered rather in the medium to medium-high frequencies. To play this could have meant anything - exactly what do you hear when hearing this kind of noise? Is it about the exact frequencies, the chords (or any name you can put on an ensemble of different frequencies played at the same time)? Just try to play a Merzbow CD on your guitar.
It took me a long time just to find out WHAT to do. I did a spectral analysis of the sound file (in AudioSculpt), calculated the "implied" chords. Did a "stylistic" analysis - moment by moment - here a filter sweep, there a high Larsen-like screech... hum. This had nothing to do with the piece itself. Finally I decided that to play OUSIA was to recreate its global structure, its time structure and its energy. The global structure I defined has 5 parts (even if it has a tendency to sound like an uninterrupted flow). The time structure was exactly that of the sound file (during the process of working on this piece I had the feeling one of the parts was too long - I made a proposal to the composer for a shorter version - which he accepted).
And then play, creating the different layers, playing no fewer than 3 multiple wah-wahs designed in MaxMSP (each of them being a bank of 5 wahs) with Max-made distortions - and some GuitarRig in there as well... The sound file actually plays, but is not heard - I use it as a "carrier" for ring modulation. This is the most physical piece I play; I cannot play it more than twice a day...
OUSIA was commissioned by Art Zoyd.
OUSIA is 38 minutes long.
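The two techniques mentioned above - a bank of swept resonant ("wah") filters, and a sound file used as an unheard ring-modulation carrier - can be roughly sketched in Python/NumPy. Everything here (filter count, center frequencies, sweep rates, the chirp carrier) is an invented stand-in for the actual MaxMSP banks:

```python
import numpy as np

def wah_bank(x, centers, sweep_hz=0.5, depth=0.3, sr=8000, r=0.99):
    """A bank of resonant band-pass filters whose center frequencies
    sweep with slow LFOs - a rough stand-in for one of the 5-wah banks."""
    t = np.arange(len(x)) / sr
    out = np.zeros_like(x)
    for i, f0 in enumerate(centers):
        freq = f0 * (1 + depth * np.sin(2 * np.pi * sweep_hz * t + i))
        w = 2 * np.pi * freq / sr
        a1, a2 = -2 * r * np.cos(w), r * r       # per-sample coefficients
        y = np.zeros_like(x)
        for n in range(2, len(x)):
            y[n] = x[n] - a1[n] * y[n - 1] - a2 * y[n - 2]
        out += y
    return out / np.max(np.abs(out))

sr = 8000                                  # low rate keeps the sketch fast
rng = np.random.default_rng(3)
played = rng.standard_normal(sr)           # stand-in for the bass input

# the "unheard" sound file: it never reaches the output directly,
# it only ring-modulates what the filters produce
carrier = np.sin(2 * np.pi * np.cumsum(np.linspace(100.0, 400.0, sr)) / sr)

wahed = wah_bank(played, centers=(200, 300, 450, 700, 1000), sr=sr)
result = wahed * carrier
```

The carrier shapes the spectrum of the result without ever being audible on its own, which is the point of using the original sound file this way.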

Next, I hope to be able to commission a BassComputer piece from Jean-Claude Eloy - it would be a pleasure to play one of his compositions.... Other composers I wish could write a piece for this project include John Duncan and Francisco Lopez. This is the preliminary idea about a piece Francisco would compose:
My sound work is based on a phenomenological approach to the sonic matter of the "real world". I spend my life doing what is usually called "field recordings". A bad term for someone like me, not interested at all in representation or referentiality, but rather in the physicality and spirituality of the essence of sound. I often say that my instrument is the world itself: for me, it is the best sound generator. But that "world", that "reality", is of course an elusive substance and, in any case, it's in fact the actual, concrete sonic matter that directs my work.

The real world is not only a source of material, but - more importantly - a source of inspiration and learning. My conceptions about pace, virtual space, temporality, texture, and many other features in the music I create have been shaped by many years of intense listening and recording activities across the globe. This is my musical education and training, and this is precisely what I am interested in exploring in my composition for BassComputer: not a representation or an imitation, but the enactment of those unique features of the world (so dramatically different from music composition standards) in a sound creation that will explore the full potential of the BassComputer, not as an instrument, but as a virtual sonic world generator. For this composition, as for most of my sound work, I am particularly interested in the generation of an immersive virtual sonic world that will stand by itself as a self-contained experience.

Then I hope to convince La Monte Young to let me play one of his pieces.

And of course some more of my own compositions.