Reflecting on the first research-related performances at HYPER JAPAN 2014

1. Void
2. Time
3. Lights and Nights (ex. Light Colour Train Sea)


MacBook Pro Retina 15 inch (Mid 2012)

Ableton Live 9
Max 6
JackOSX 0.90

MOTU Track 16
Microphone - Audio Technica AE5400
Projector - Acer H6510BD

iPad mini Retina 1)
iPad mini 1)
iPhone 5 1)

A large table
A microphone stand
A pair of long cables from MOTU Track 16 to Speakers
A pair of speakers
AmazonBasics 7 Port USB 3.0 Hub with 12V/3A power adapter


Problem 1

Even though my main sound controllers were two iPads and an iPhone, they became completely unusable. The ad hoc Wi-Fi connection between the MacBook Pro and the iOS devices was slow and unstable, probably because of network congestion. This would happen again if I performed somewhere with many people.

Temporary Solution for the performance

I gave up on the iOS devices and reassigned their controls to the knobs and faders of the nanoKONTROL2s and to the laptop (sadly, this inevitably led to the 'is he checking his emails?' posture).

Solution for future performances

I will connect the iOS devices to the laptop with cables (a MIDI interface will be required between them, e.g. an iConnect MIDI 4).
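Whatever the transport, the control data itself stays the same: a MIDI Control Change is just three bytes. A minimal Python sketch of how such a message is built (the controller number 20 is a hypothetical assignment, not one from my set):

```python
# Minimal sketch: building a raw MIDI Control Change (CC) message,
# as it would travel over a wired MIDI interface instead of Wi-Fi.
# The controller number used below (20) is hypothetical.

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Encode a 3-byte MIDI CC message: status byte, controller, value."""
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("out-of-range MIDI data")
    status = 0xB0 | channel  # 0xB0 = Control Change on channel 1
    return bytes([status, controller, value])

msg = control_change(channel=0, controller=20, value=64)
print(msg.hex())  # b01440
```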

Problem 2

A bottle of water leaked in my backpack.

Solution for future performances

Drinks should never be kept in the same place as electronic devices.

Problem 3

Because the iPads were unusable, I kept touching the laptop to launch audio clips. In that position, my actions (clicking) were hidden by the lid of the laptop.

Temporary Solution for the performance

Nothing could be done.

Solution for future performances

Ideally, I should not touch the laptop during a performance (it makes the audience wonder 'what is he doing behind the laptop...?'). A quick fix is to use the iPads, but that is still essentially the same as using a laptop. By sending MIDI information when I touch the iPads, and/or by tracking my hand movements above them with a Leap Motion or a webcam, I could use the iPads in a more innovative way.
The same principle can be applied to hardware MIDI controllers.
A Leap Motion could also be used on its own to enhance the bodily expression of the performance.
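For the tracking idea, the core step is just scaling: a hand coordinate from the tracker has to be normalised into the 0-127 MIDI range. A sketch of that mapping (the tracking API itself is not shown, and the coordinate range is an assumed calibration, not a real device specification):

```python
# Sketch: map a tracked hand x-position (from a Leap Motion or webcam;
# the tracking API itself is not shown) onto a 0-127 MIDI CC value.
# The coordinate range below is an assumed calibration.

def position_to_cc(x: float, x_min: float = -200.0, x_max: float = 200.0) -> int:
    """Clamp x to [x_min, x_max] and scale linearly to 0..127."""
    x = max(x_min, min(x_max, x))
    return round((x - x_min) / (x_max - x_min) * 127)

print(position_to_cc(0.0))   # centre of range -> 64
print(position_to_cc(-500))  # off-range values are clamped -> 0
```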

Problem 4

The venue lighting was too bright to convey the visual side of the performance to the audience satisfactorily. This was known beforehand.

Temporary Solution for the performance

Nothing could be done.

Solution for future performances

If possible, I should avoid that kind of environment.
If it cannot be avoided, and if I have some budget, I can rent a very bright, powerful projector. As this type of projector tends to be heavy, I might need staff or a car for transport.
If I have to perform with the Acer H6510BD in a bright environment, I should choose pieces that contain colours other than black and white. Monochrome visuals are very difficult to see in a bright environment.

Problem 5

The LIVID Base was unstable and occasionally sent unwanted MIDI messages.

Temporary Solution for the performance

I used the laptop and the KORG nanoKONTROL2s instead.

Solution for future performances

The LIVID Base is not reliable enough for future performances. The AKAI APC40 MKII could be a good substitute.

Problem 6

I have reached the limit of the laptop's processing power for this performance.

Temporary Solution for the performance

The resolution of the visuals was reduced.

Solution for future performances

Perhaps the laptop can be replaced with a current MacBook Pro, but I am not sure how much difference that would make.

Problem 7

When launching audio clips in Ableton Live, the actual starting point of the audio can be quantised (e.g. no quantisation, beat quantisation, one-bar (or longer) quantisation).
To leave some room for improvisation, I set the quantisation of some clips to a 16th note. With that setting, however, I have to be very strict about the timing of my launch action. On the 25th, I launched several clips at the wrong timing, and the music (it was Time) sounded horrible.
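The timing problem can be made concrete: under quantisation, the launch is delayed to the next quantisation boundary after the button press, so a slightly late press slips audibly. A sketch of that arithmetic (the example beat positions are assumed, not taken from the set; this is not Live's API, just the underlying calculation):

```python
import math

# Sketch: when a clip actually starts under launch quantisation.
# The launch waits for the next quantisation boundary after the press.
# Example beat values below are assumed for illustration.

def launch_time(press_beat: float, quantise_beats: float) -> float:
    """Return the beat at which the clip starts: the next multiple
    of quantise_beats at or after press_beat."""
    return math.ceil(press_beat / quantise_beats) * quantise_beats

# 16th-note quantisation = 0.25 beats: a press at beat 4.10 starts at 4.25
# (a 0.15-beat slip); with 1-bar (4-beat) quantisation it waits until beat 8.
print(launch_time(4.10, 0.25))  # 4.25
print(launch_time(4.10, 4.0))   # 8.0
```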

Temporary Solution for the performance

I changed the quantisation setting of all clips to one bar and played it safe.

Solution for future performances

With a reliable hardware launch controller, and with intense practice and rehearsal, I could explore more challenging and creative quantisation settings.

Idea 1

The action of adjusting a knob or a fader can be visualised.
I will start by creating simple (not too distracting) and temporary (disappearing within a few seconds) visual effects for each knob and fader.
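The "temporary" part boils down to a fade-out envelope driven by the time since the knob was last turned. A plain-Python sketch of that envelope (in practice this would run per frame in the visual engine; the two-second fade length is an assumption):

```python
# Sketch: a temporary visual for a knob turn that fades out over a few
# seconds. In practice this would run per frame in the visual engine;
# here it is plain Python for illustration.

FADE_SECONDS = 2.0  # assumed fade-out length

def effect_opacity(seconds_since_turn: float) -> float:
    """Full opacity at the moment of the turn, linear fade to 0."""
    remaining = 1.0 - seconds_since_turn / FADE_SECONDS
    return max(0.0, remaining)

print(effect_opacity(0.0))  # 1.0
print(effect_opacity(1.0))  # 0.5
print(effect_opacity(3.0))  # 0.0 -- the effect has disappeared
```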

Idea 2


Idea 3

The moment of launching an audio clip should be visualised as well. I need to use LiveControl 2 on the iPad or the AKAI APC40 MKII to capture the MIDI information of launch actions. This cannot be done with touchAble, because the OSC messages touchAble handles cannot be accessed by the user.
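Once the launch actions arrive as MIDI, detecting them is a matter of filtering Note On messages from the controller's clip grid. A sketch of that filter (the note-number range below is an assumption for illustration, not the APC40's actual pad mapping):

```python
# Sketch: detect clip-launch presses from raw MIDI bytes so they can
# trigger a visual. Grid controllers such as the APC40 send a Note On
# per pad; the note-number range below is assumed, not the APC40's
# actual mapping.

GRID_NOTES = range(0, 40)  # hypothetical pad note numbers

def is_clip_launch(msg: bytes) -> bool:
    """True for a Note On (velocity > 0) whose note is on the clip grid."""
    if len(msg) != 3:
        return False
    status, note, velocity = msg
    return (status & 0xF0) == 0x90 and note in GRID_NOTES and velocity > 0

print(is_clip_launch(bytes([0x90, 5, 100])))  # True
print(is_clip_launch(bytes([0x80, 5, 100])))  # False (Note Off)
```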

Idea 4

A portable MIDI keyboard could expand what I can do on stage.
The Akai MPK mini Mk2 would be my first choice if 25 keys are enough.

Idea 5

The performance would feel more emotional and more alive if I could control drum loops and rhythm sequences more dynamically.

1. Void

2. Time

3. Lights and Nights

Notes

1) Because of the technical problems described in Problem 1, they were not used during the performances.