<documents>
<document>
<name>nime2001_003.pdf</name>
<abstract>
This paper will present observations on the design, artistic,
and human factors of creating digital music controllers.
Specific projects will be presented, and a set of design
principles will be supported from those examples.
</abstract>
<keywords> Musical control, artistic interfaces. </keywords>
</document>
<document>
<name>nime2001_007.pdf</name>
<abstract>
Over the last four years, we have developed a series of
lectures, labs and project assignments aimed at introducing
enough technology so that students from a mix of
disciplines can design and build innovative interface
devices.
</abstract>
<keywords> Input devices, music controllers, CHI technology, courses. </keywords>
</document>
<document>
<name>nime2001_011.pdf</name>
<abstract>
In this paper we describe our efforts towards the
development of live performance computer-based musical
instrumentation. Our design criteria include initial ease
of use coupled with a long term potential for virtuosity,
minimal and low variance latency, and clear and simple
strategies for programming the relationship between
gesture and musical result. We present custom
controllers and unique adaptations of standard gestural
interfaces, a programmable connectivity processor, a
communications protocol called Open Sound Control
(OSC), and a variety of metaphors for musical control.
We further describe applications of our technology to a
variety of real musical performances and directions for
future research.
</abstract>
</document>
<document>
<name>nime2001_015.pdf</name>
<abstract>
This paper reviews the existing literature on input device
evaluation and design in human-computer interaction (HCI)
and discusses possible applications of this knowledge to
the design and evaluation of new interfaces for musical
expression. Specifically, a set of musical tasks is suggested
to allow the evaluation of different existing controllers.
</abstract>
<keywords> Input device design, gestural control, interactive systems </keywords>
</document>
<document>
<name>nime2001_019.pdf</name>
<abstract>
This paper presents the interface developments and music of the duo
"interface," formed by Curtis Bahn and Dan Trueman. We describe
gestural instrument design, interactive performance interfaces for
improvisational music, spherical speakers (multi-channel,
outward-radiating geodesic speaker arrays) and Sensor-Speaker-Arrays
(SenSAs: combinations of various sensor devices with spherical
speaker arrays). We discuss the concept, design and construction of
these systems, and give examples from several new published CDs of
work by Bahn and Trueman.
</abstract>
<keywords> Interactive Music Performance, Gestural Interface, Sonic Display, Sensor/Speaker Array, SenSA. </keywords>
</document>
<document>
<name>nime2001_024.pdf</name>
<keywords> multidimensionality, control, resonance, pitch tracking </keywords>
</document>
<document>
<name>nime2001_027.pdf</name>
<abstract>
The Accordiatron is a new MIDI controller for real-time
performance based on the paradigm of a conventional
squeeze box or concertina. It translates the gestures of a
performer to the standard communication protocol of
MIDI, allowing for flexible mappings of performance data
to sonic parameters. When used in conjunction with a real-
time signal processing environment, the Accordiatron
becomes an expressive, versatile musical instrument. A
combination of sensory outputs providing both discrete and
continuous data gives the subtle expressiveness and control
necessary for interactive music.
</abstract>
<keywords> MIDI controllers, computer music, interactive music, electronic musical instruments, musical instrument design, human computer interface </keywords>
</document>
<document>
<name>nime2001_030.pdf</name>
<abstract>
The technologies behind passive resonant magnetically-
coupled tags are introduced and their application as a
musical controller is illustrated for solo or group
performances, interactive installations, and music toys.
</abstract>
<keywords> RFID, resonant tags, EAS tags, musical controller, tangible interface </keywords>
</document>
<document>
<name>nime2001_034.pdf</name>
<abstract>
In this paper, we introduce our research challenges for creating new
musical instruments using everyday-life media with intimate
interfaces, such as the self-body, clothes, water and stuffed toys.
Various sensor technologies including image processing and general
touch sensitive devices are employed to exploit these interaction
media. The focus of our effort is to provide user-friendly and
enjoyable experiences for new music and sound performances.
Multi-modality of musical instruments is explored in each attempt.
The degree of controllability in the performance and the richness of
expressions are also discussed for each installation.
</abstract>
<keywords> New interface, music controller, dance, image processing, water interface, stuffed toy </keywords>
</document>
<document>
<name>nime2001_038.pdf</name>
<abstract>
The MATRIX (Multipurpose Array of Tactile Rods for
Interactive eXpression) is a new musical interface for
amateurs and professionals alike. It gives users a 3-
dimensional tangible interface to control music using their
hands, and can be used in conjunction with a traditional
musical instrument and a microphone, or as a stand-alone
gestural input device. The surface of the MATRIX acts as a
real-time interface that can manipulate the parameters of a
synthesis engine or effect algorithm in response to a
performer's expressive gestures. One example is to have the
rods of the MATRIX control the individual grains of a
granular synthesizer, thereby "sonically sculpting" the
microstructure of a sound. In this way, the MATRIX
provides an intuitive method of manipulating sound with a
very high level of real-time control.
</abstract>
<keywords> Musical controller, tangible interface, real-time expression, audio synthesis, effects algorithms, signal processing, 3-D interface, sculptable surface </keywords>
</document>
<document>
<name>nime2001_051.pdf</name>
<abstract>
This paper reviews a number of projects that explore building
electronic musical toys, interfaces and objects designed to be used
and enjoyed by anybody but in particular those who do not see
themselves as naturally musical. In reflecting on the strengths of
these projects, interesting directions for similar work in the future
are considered.
</abstract>
<keywords> Play, exploration, sound mapping, engaging content, sound design. </keywords>
</document>
<document>
<name>nime2002_001.pdf</name>
<abstract>
In this paper we describe the digital emulation of an optical
photosonic instrument. First we briefly describe the optical
instrument which is the basis of this emulation. Then we give a
musical description of the instrument implementation and its musical
use, and we conclude with the "duo" possibility of such an emulation.
</abstract>
<keywords> Photosonic synthesis, digital emulation, Max-Msp, gestural devices. </keywords>
</document>
<document>
<name>nime2002_005.pdf</name>
<abstract>
In this paper we give a short overview of some of the systems we have
been developing as an independent company over the last years. We
focus especially on our latest experiments in developing wireless
gestural systems using the camera as an interactive tool to generate
2D and 3D visuals and music.
</abstract>
</document>
<document>
<name>nime2002_010.pdf</name>
<abstract>
This paper describes the design and development of several musical
instruments and MIDI controllers built by David Bernard (as part of
The Sound Surgery project: www.thesoundsurgery.co.uk) and used in
club performances around Glasgow during 1995-2002. It argues that
changing technologies and copyright are shifting our understanding of
music from "live art" to "recorded medium" whilst blurring the
boundaries between sound and visual production.
</abstract>
<keywords> Live electronic music, experimental instruments, MIDI controllers, audio-visual synchronisation, copyright, SKINS digital hand drum. </keywords>
</document>
<document>
<name>nime2002_012.pdf</name>
<abstract>
This paper discusses the Jam-O-Drum multi-player musical controller
and its adaptation into a gaming controller interface known as the
Jam-O-Whirl. The Jam-O-World project positioned these two controller
devices in a dedicated projection environment that enabled novice
players to participate in immersive musical gaming experiences.
Players' actions, detected via embedded sensors in an integrated
tabletop surface, control game play, real-time computer graphics and
musical interaction. Jam-O-World requires physical and social
interaction as well as collaboration among players.
</abstract>
<keywords> Collaboration, computer graphics, embedded sensors, gaming controller, immersive musical gaming experiences, musical controller, multi-player, novice, social interaction. </keywords>
</document>
<document>
<name>nime2002_038.pdf</name>
<abstract>
Mapping, which describes the way a performer's controls are connected
to sound variables, is a useful concept when applied to the structure
of electronic instruments modelled after traditional acoustic
instruments. But mapping is a less useful concept when applied to the
structure of complex and interactive instruments in which algorithms
generate control information.
This paper relates the functioning and benefits of different types of
electronic instruments to the structural principles on which they are
based. Structural models of various instruments will be discussed and
musical examples played.
</abstract>
<keywords> mapping, fly-by-wire, algorithmic network, interactivity, instrument, deterministic, indeterministic </keywords>
</document>
<document>
<name>nime2002_043.pdf</name>
<abstract>
This paper describes a virtual musical instrument based on the
scanned synthesis technique and implemented in Max-Msp. The device is
composed of a computer and three gesture sensors. The timbre of the
produced sound is rich and changing. The instrument proposes an
intuitive and expressive control of the sound thanks to a complex
mapping between gesture and sound.
</abstract>
</document>
<document>
<name>nime2002_050.pdf</name>
<abstract>
In this paper we describe three new music controllers, each designed
to be played by two players. As the intimacy between two people
increases, so does their ability to anticipate and predict the
other's actions. We hypothesize that this intimacy between two people
can be used as a basis for new controllers for musical expression.
Looking at ways people communicate non-verbally, we are developing
three new instruments based on different communication channels. The
Tooka is a hollow tube with a pressure sensor and buttons for each
player. Players place opposite ends in their mouths and modulate the
pressure in the tube with their tongues and lungs, controlling sound.
Coordinated button presses control the music as well. The Pushka, yet
to be built, is a semi-rigid rod with strain gauges and position
sensors to track the rod's position. Each player holds opposite ends
of the rod and manipulates it together. Bend, end point position,
velocity, acceleration, and torque are mapped to musical parameters.
The Pullka, yet to be built, is simply a string attached at both ends
with two bridges. Tension is measured with strain gauges. Players
manipulate the string tension at each end together to modulate sound.
We are looking at different musical mappings appropriate for two
players.
</abstract>
<keywords> Two person musical instruments, intimacy, human-human communication, cooperative music, passive haptic interface </keywords>
</document>
<document>
<name>nime2002_056.pdf</name>
<abstract>
The Cardboard Box Garden (CBG) originated from a dissatisfaction with
current computer technology as it is presented to children. This
paper briefly reviews the process involved in the creation of this
installation, from motivation through to design and subsequent
implementation and user experience with the CBG. Through the
augmentation of an everyday artefact, namely the standard cardboard
box, a simple yet powerful interactive environment was created that
has achieved its goal of stirring children's imagination, judging
from the experience of our users.
</abstract>
<keywords> Education, play, augmented reality, pervasive computing, disappearing computer, assembly, cardboard box </keywords>
</document>
<document>
<name>nime2002_059.pdf</name>
<abstract>
Research and musical creation with gestural-oriented interfaces have
recently seen a renewal of interest and activity at Ircam [1][2]. In
the course of several musical projects, undertaken by young composers
attending the one-year Course in Composition and Computer Music or by
guest artists, the Ircam Education and Creation departments have
proposed various solutions for gesture-controlled sound synthesis and
processing. In this article, we describe the technical aspects of
AtoMIC Pro, an Analog to MIDI converter proposed as a re-usable
solution for digitizing several sensors in different contexts such as
interactive sound installations or virtual instruments.
The main direction of our research, and of this project in
particular, is to create tools that can be fully integrated into an
artistic project as a real part of the composition and performance
processes.
</abstract>
<keywords> Gestural controller, Sensor, MIDI, Music. </keywords>
</document>
<document>
<name>nime2002_065.pdf</name>
<abstract>
We explore the role that metaphor plays in developing expressive
devices by examining the MetaMuse system. MetaMuse is a prop-based
system that uses the metaphor of rainfall to make the process of
granular synthesis understandable. We discuss MetaMuse within a
framework we call "transparency" that can be used as a predictor of
the expressivity of musical devices. Metaphor depends on a
literature, or cultural basis, which forms the basis for making
transparent device mappings. In this context we evaluate the effect
of metaphor in the MetaMuse system.
</abstract>
<keywords> Expressive interface, transparency, metaphor, prop-based controller, granular synthesis. </keywords>
</document>
<document>
<name>nime2002_071.pdf</name>
<abstract>
The use of free gesture in making music has usually been confined to
instruments that use direct mappings between movement and sound
space. Here we demonstrate the use of categories of gesture as the
basis of musical learning and performance collaboration. These are
used in a system that reinterprets the approach to learning through
performance that is found in many musical cultures and discussed here
through the example of Kpelle music.
</abstract>
<keywords> Collaboration, Performance, Metaphor, Gesture </keywords>
</document>
<document>
<name>nime2002_073.pdf</name>
<abstract>
This paper presents a novel coupling of haptics technology and music,
introducing the notion of tactile composition or aesthetic
composition for the sense of touch. A system that facilitates the
composition and perception of intricate, musically structured
spatio-temporal patterns of vibration on the surface of the body is
described. An initial test of the system in a performance context is
discussed. The fundamental building blocks of a compositional
language for touch are considered.
</abstract>
</document>
<document>
<name>nime2002_080.pdf</name>
<abstract>
The Circular Optical Object Locator is a collaborative and
cooperative music-making device. It uses an inexpensive digital video
camera to observe a rotating platter. Opaque objects placed on the
platter are detected by the camera during rotation. The locations of
the objects passing under the camera are used to generate music.
</abstract>
<keywords> Input devices, music controllers, collaborative, real-time score manipulation. </keywords>
</document>
<document>
<name>nime2002_082.pdf</name>
<abstract>
We have created a new electronic musical instrument, referred to as
the Termenova (Russian for "daughter of Theremin"), that combines a
free-gesture capacitive-sensing device with an optical sensing system
that detects the reflection of a hand when it intersects a beam of an
array of red lasers. The laser beams, which are made visible by a
thin layer of theatrical mist, provide visual feedback and guidance
to the performer to alleviate the difficulties of using a non-contact
interface, as well as adding an interesting component for the
audience to observe. The system uses capacitive sensing to detect the
proximity of the player's hands; this distance is mapped to pitch,
volume, or other continuous effect. The laser guide positions are
calibrated before play with position-controlled servo motors
interfaced to a main controller board; the location of each beam
corresponds to the position where the performer should move his or
her hand to achieve a pre-specified pitch and/or effect. The optical
system senses the distance of the player's hands from the source of
each laser beam, providing an additional dimension of musical
control.
</abstract>
<keywords> Theremin, gesture interface, capacitive sensing, laser harp, optical proximity sensing, servo control, musical controller </keywords>
</document>
<document>
<name>nime2002_094.pdf</name>
<keywords> musical controller, Tactex, tactile interface, tuning systems </keywords>
</document>
<document>
<name>nime2002_101.pdf</name>
<abstract>
We are interested in exhibiting our programs at your
demo section at the conference. We believe that the sub-
ject of your conference is precisely what we are experi-
menting with in our musical software.
</abstract>
<keywords> Further info on our website: http://www.ixi-software.net. </keywords>
</document>
<document>
<name>nime2002_102.pdf</name>
<abstract>
In this paper we present Afasia, an interactive multimedia
performance based on Homer's Odyssey [2]. Afasia is a one-man digital
theater play in which a lone performer fitted with a sensor-suit
conducts, like Homer, the whole show by himself, controlling 2D
animations and DVD video and conducting the music mechanically
performed by a robot quartet. After contextualizing the piece, all of
its technical elements, starting with the hardware input and output
components, are described. A special emphasis is given to the
interactivity strategies and the subsequent software design. Since
its first version premiered in Barcelona in 1998, Afasia has been
performed in many European and American countries and has received
several international awards.
</abstract>
<keywords> Multimedia interaction, musical robots, real-time musical systems. </keywords>
</document>
<document>
<name>nime2002_108.pdf</name>
<abstract>
This paper describes the design of an electronic Tabla controller.
The E-Tabla controls both sound and graphics simultaneously. It
allows for a variety of traditional Tabla strokes and new performance
techniques. Graphical feedback allows for artistic display and
pedagogical feedback.
</abstract>
<keywords> Electronic Tabla, Indian Drum Controller, Physical Models, Graphical Feedback </keywords>
</document>
<document>
<name>nime2002_113.pdf</name>
<abstract>
In this paper, we describe a computer-based solo musical instrument
for live performance. We have adapted a Wacom graphic tablet equipped
with a stylus transducer and a game joystick to use them as a solo
expressive instrument. We have used a formant-synthesis model that
can produce a vowel-like singing voice. This instrument allows
multidimensional expressive fundamental frequency control and vowel
articulation. The fundamental frequency angular control used here
allows different mapping adjustments that correspond to different
melodic styles.
</abstract>
<keywords> Bi-manual, off-the-shelf input devices, fundamental frequency control, sound color navigation, mapping. </keywords>
</document>
<document>
<name>nime2002_118.pdf</name>
<abstract>
This paper introduces a subtle interface, which evolved from the
design of an alternative gestural controller in the development of a
performance interface. The conceptual idea used is based on that of
the traditional Bodhran instrument, an Irish frame drum. The design
process was user-centered and involved professional Bodhran players;
through prototyping and user-testing the resulting Vodhran emerged.
</abstract>
<keywords> Virtual instrument, sound modeling, gesture, user- centered design </keywords>
</document>
<document>
<name>nime2002_120.pdf</name>
<abstract>
Here we present 2Hearts, a music system controlled by the heartbeats
of two people. As the players speak and touch, 2Hearts extracts
meaningful variables from their heartbeat signals. These variables
are mapped to musical parameters, conveying the changing patterns of
tension and relaxation in the players' relationship. We describe the
motivation for creating 2Hearts, observations from the prototypes
that have been built, and principles learnt in the ongoing
development process.
</abstract>
<keywords> Heart Rate, Biosensor, Interactive Music, Non-Verbal Communication, Affective Computing, Ambient Display </keywords>
</document>
<document>
<name>nime2002_126.pdf</name>
<keywords> Gesture, weight distribution, effort, expression, intent, movement, 3D sensing pressure, force, sensor, resolution, control device, sound, music, input. </keywords>
</document>
<document>
<name>nime2002_131.pdf</name>
<abstract>
This paper briefly describes a number of performance interfaces under
the broad theme of Interactive Gesture Music (IGM). After a short
introduction, this paper discusses the main components of a
Trans-Domain Mapping (TDM) framework, and presents various prototypes
developed under this framework, to translate meaningful activities
from one creative domain onto another, to provide real-time control
of musical events with physical movements.
</abstract>
<keywords> Gesture, Motion, Interactive, Performance, Music. </keywords>
</document>
<document>
<name>nime2002_137.pdf</name>
<abstract>
The design of a virtual keyboard, capable of reproducing the tactile
feedback of several musical instruments, is reported. The key is
driven by a direct-drive motor, which allows friction-free operation.
The force to be generated by the motor is calculated in real time by
a dynamic simulator, which contains the model of the mechanism's
components and constraints. Each model is tuned on the basis of
measurements performed on the real system. So far, grand piano
action, harpsichord and Hammond organ have been implemented
successfully on the system presented here.
</abstract>
<keywords> Virtual mechanisms, dynamic simulation </keywords>
</document>
<document>
<name>nime2002_143.pdf</name>
<abstract>
Interactivity has become a major consideration in the
development of a contemporary art practice that engages
with the proliferation of computer based technologies.
</abstract>
</document>
<document>
<name>nime2002_145.pdf</name>
<abstract>
Passive RF tagging can provide an attractive medium for development
of free-gesture musical interfaces. This was initially explored in
our Musical Trinkets installation, which used magnetically-coupled
resonant LC circuits to identify and track the position of multiple
objects in real-time. Manipulation of these objects in free space
over a read coil triggered simple musical interactions. Musical
Navigatrics builds upon this success with new, more sensitive and
stable sensing, multi-dimensional response, and vastly more intricate
musical mappings that enable full musical exploration of free space
through the dynamic use and control of arpeggiation and effects. The
addition of basic sequencing abilities also allows for the building
of complex, layered musical interactions in a uniquely easy and
intuitive manner.
</abstract>
<keywords> passive tag, position tracking, music sequencer interface </keywords>
</document>
<document>
<name>nime2002_148.pdf</name>
<abstract>
We present Audiopad, an interface for musical performance that aims
to combine the modularity of knob-based controllers with the
expressive character of multidimensional tracking interfaces. The
performer's manipulations of physical pucks on a tabletop control a
real-time synthesis process. The pucks are embedded with LC tags that
the system tracks in two dimensions with a series of specially shaped
antennae. The system projects graphical information on and around the
pucks to give the performer sophisticated control over the synthesis
process.
</abstract>
<keywords> RF tagging, MIDI, tangible interfaces, musical controllers, object tracking </keywords>
</document>
<document>
<name>nime2002_156.pdf</name>
<abstract>
In this paper, we develop the concept of "composed instruments". We
will look at this idea from two perspectives: the design of computer
systems in the context of live performed music, and musicological
considerations. A historical context is developed. Examples will be
drawn from recent compositions. Finally, basic concepts from computer
science will be examined for their relationship to this concept.
</abstract>
<keywords> Instruments, musicology, composed instrument, Theremin, Martenot, interaction, streams, MAX. </keywords>
</document>
<document>
<name>nime2002_161.pdf</name>
<abstract>
The cicada uses a rapid sequence of buckling ribs to initiate and
sustain vibrations in its tymbal plate (the primary mechanical
resonator in the cicada's sound production system). The tymbalimba, a
music controller based on this same mechanism, has a row of 4 convex
aluminum ribs (as on the cicada's tymbal) arranged much like the keys
on a kalimba. Each rib is spring loaded and capable of snapping down
into a V-shape (a motion referred to as buckling) under the downward
force of the user's finger. The energy generated by the buckling
motion is measured by an accelerometer located under each rib and
used as the input to a physical model.
</abstract>
<keywords> Bioacoustics, Physical Modeling, Controllers, Cicada, Buckling mechanism. </keywords>
</document>
<document>
<name>nime2002_167.pdf</name>
<abstract>
This paper describes the hardware and the software of a
computer-based doppler-sonar system for movement detection. The
design is focused on simplicity and low-cost do-it-yourself
construction.
</abstract>
<keywords> sonar </keywords>
</document>
<document>
<name>nime2002_171.pdf</name>
<abstract>
This paper describes a technique of multimodal,
multichannel control of electronic musical devices using
two control methodologies, the Electromyogram (EMG)
and relative position sensing. Requirements for the
application of multimodal interaction theory in the musical
domain are discussed. We introduce the concept of
bidirectional complementarity to characterize the
relationship between the component sensing technologies.
Each control can be used independently, but together they
are mutually complementary. This reveals a fundamental
difference from orthogonal systems. The creation of a
concert piece based on this system is given as an example.
</abstract>
<keywords> Human Computer Interaction, Musical Controllers, Electromyogram, Position Sensing, Sensor Instruments </keywords>
</document>
<document>
<name>nime2002_177.pdf</name>
<abstract>
Active force-feedback holds the potential for precise and rapid
controls. A high performance device can be built from a surplus disk
drive and controlled from an inexpensive microcontroller. Our new
design, The Plank, has only one axis of force-feedback with limited
range of motion. It is being used to explore methods of feeling and
directly manipulating sound waves and spectra suitable for live
performance of computer music.
</abstract>
<keywords> Haptics, music controllers, scanned synthesis. </keywords>
</document>
<document>
<name>nime2002_181.pdf</name>
<abstract>
Here we propose a novel musical controller which acquires
imaging data of the tongue with a two-dimensional medical
ultrasound scanner. A computer vision algorithm extracts
from the image a discrete tongue shape to control, in
real-time, a musical synthesizer and musical effects. We
evaluate the mapping space between tongue shape and
controller parameters and its expressive characteristics.
</abstract>
<keywords> Tongue model, ultrasound, real-time, music synthesis, speech interface </keywords>
</document>
<document>
<name>nime2002_186.pdf</name>
<abstract>
The Beatbugs are hand-held percussive instruments that
allow the creation, manipulation, and sharing of rhythmic
motifs through a simple interface. When multiple
Beatbugs are connected in a network, players can form
large-scale collaborative compositions by interdependently
sharing and developing each other's motifs. Each
Beatbug player can enter a motif that is then sent through
a stochastic computerized "Nerve Center" to other players
in the network. Receiving players can decide whether
to develop the motif further (by continuously manipulating
pitch, timbre, and rhythmic elements using two bend
sensor antennae) or to keep it in their personal instrument
(by entering and sending their own new motifs to
the group). The tension between the system's stochastic
routing scheme and the players' improvised real-time
decisions leads to an interdependent, dynamic, and
constantly evolving musical experience. A musical composition
entitled "Nerve" was written for the system by author
Gil Weinberg. It was premiered in February 2002
as part of Tod Machover's Toy Symphony [1] in a concert
with the Deutsches Symphonie Orchester Berlin,
conducted by Kent Nagano. The paper concludes with a
short evaluative discussion of the concert and the week-long
workshops that led to it.
</abstract>
<keywords> Interdependent Musical Networks, group playing, percussive controllers. </keywords>
</document>
<document>
<name>nime2002_192.pdf</name>
<abstract>
In this demonstration we will show a variety of
computer-based musical instruments designed for live
performance. Our design criteria include initial ease of use
coupled with a long-term potential for virtuosity, minimal
and low-variance latency, and clear and simple
strategies for programming the relationship between
gesture and musical result. We present custom controllers
and unique adaptations of standard gestural interfaces,
a programmable connectivity processor, a
communications protocol called Open Sound Control (OSC),
and a variety of metaphors for musical control.
</abstract>
<keywords> Expressive control, mapping gestures to acoustic results, metaphors for musical control, Tactex, Buchla Thunder, digitizing tablets. </keywords>
</document>
<document>
<name>nime2002_195.pdf</name>
<abstract>
The Mutha Rubboard is a musical controller based on the
rubboard, washboard or frottoir metaphor commonly used
in the Zydeco music genre of South Louisiana. It is not only
a metamorphosis of a traditional instrument, but a modern
bridge of exploration into a rich musical heritage. It uses
capacitive and piezo sensing technology to output MIDI and
raw audio data.
This new controller reads the key placement in two parallel
planes by using radio capacitive sensing circuitry, expanding
greatly on the standard corrugated metal playing surface.
The percussive output normally associated with the
rubboard is captured through piezo contact sensors mounted
directly on the keys (the playing implements). Additionally,
mode functionality is controlled by discrete switching on
the keys.
This new instrument is meant to be easily played both by
experienced players and by those new to the rubboard. It lends
itself to an expressive freedom by placing the control surface
on the chest and allowing the hands to move uninhibited
about it, or by playing it in the usual way, preserving its
musical heritage.
</abstract>
<keywords> MIDI controllers, computer music, Zydeco music, interactive music, electronic musical instrument, human computer interface, Louisiana heritage, physical modeling, bowl resonators. </keywords>
</document>
<document>
<name>nime2002_199.pdf</name>
<abstract>
Falling Up is an evening-length performance incorporating
dance and theatre with movement-controlled audio/video
playback and processing. The solo show is a collaboration
between Cindy Cummings (performance) and Todd Winkler
(sound, video), first performed at the Dublin Fringe Festival,
2001. Each thematic section of the work shows a different type
of interactive relationship between movement, video and
sound. This demonstration explains the various technical
configurations and aesthetic thinking behind aspects of the work.
</abstract>
<keywords> Dance, Video processing, Movement sensor, VNS, Very Nervous System </keywords>
</document>
<document>
<name>nime2002_201.pdf</name>
<keywords> Hyperbow, Hyperviolin, Hyperinstrument, violin, bow, position sensor, accelerometer, strain sensor </keywords>
</document>
<document>
<name>nime2003_003.pdf</name>
<abstract>
In this paper we present a design for the EpipE, a new
expressive electronic music controller based on the Irish
Uilleann Pipes, a 7-note polyphonic reeded woodwind. The
core of this proposed controller design is a continuous
electronic tonehole-sensing arrangement, equally applicable
to other woodwind interfaces like those of the flute, recorder or
Japanese shakuhachi. The controller will initially be used to
drive a physically-based synthesis model, with the eventual
goal being the development of a mapping layer allowing the
EpipE interface to operate as a MIDI-like controller of arbitrary
synthesis models.
</abstract>
<keywords> Controllers, continuous woodwind tonehole sensor, uilleann pipes, Irish bagpipe, physical modelling, double reed, conical bore, tonehole. </keywords>
</document>
<document>
<name>nime2003_015.pdf</name>
<keywords> MIDI Controller, Wind Controller, Breath Control, Human Computer Interaction. </keywords>
</document>
<document>
<name>nime2003_019.pdf</name>
<abstract>
The STRIMIDILATOR is an instrument that uses the deviation
and the vibration of strings as MIDI controllers. This
method of control gives the user direct tactile force feedback
and allows for subtle control. The development of the
instrument and its different functions are described.
</abstract>
<keywords> MIDI controllers, tactile force feedback, strings. </keywords>
</document>
<document>
<name>nime2003_024.pdf</name>
<abstract>
Over the past year the instructors of the Human Computer
Interaction courses at CCRMA have undertaken a technology
shift to a much more powerful teaching platform. We
describe the technical features of the new Atmel AVR based
platform, contrasting it with the Parallax BASIC Stamp
platform used in the past. The successes and failures of
the new platform are considered, and some student project
success stories are described.
</abstract>
</document>
<document>
<name>nime2003_030.pdf</name>
<abstract>
The Disc Jockey (DJ) software system Mixxx is presented.
Mixxx makes it possible to conduct studies of new interaction
techniques in connection with the DJ situation, through its
open design, easy integration of new software modules,
and MIDI connection to external controllers. To gain a better
understanding of working practices, and to aid the design
process of new interfaces, interviews with two contemporary
musicians and DJs are presented. In collaboration with these
musicians, several novel prototypes for DJ interaction
have been developed. Finally, implementation details
of Mixxx are described.
</abstract>
</document>
<document>
<name>nime2003_036.pdf</name>
</document>
<document>
<name>nime2003_054.pdf</name>
<abstract>
In this paper, we examine the use of spatial layouts of musical
material for live performance control. Emphasis is given to
software tools that provide for the simple and intuitive
geometric organization of sound material, sound processing
parameters, and higher-level musical structures.
</abstract>
</document>
<document>
<name>nime2003_070.pdf</name>
<abstract>
This paper first introduces two previous software-based music
instruments designed by the author, and analyses the crucial
importance of the visual feedback introduced by their
interfaces. A quick taxonomy and analysis of the visual
components in current trends of interactive music software is
then proposed, before introducing the reacTable*, a new
project that is currently under development. The reacTable* is
a collaborative music instrument, aimed at both novices and
advanced musicians, which employs computer vision and
tangible interface technologies, and pushes further the
aforementioned visual feedback interface ideas and techniques.
</abstract>
</document>
<document>
<name>nime2003_077.pdf</name>
<abstract>
A handheld electronic musical instrument, named the
Bento-Box, was developed. The motivation was to develop an
instrument which one can easily carry around and play in
moments of free time, for example when riding public
transportation or during short breaks at work. The device was
designed to enable quick learning by having various scales
programmed for different styles of music, and also to be
expressive by having hand-controlled timbral effects which
can be manipulated while playing. Design analysis and
iteration led to a compact and ergonomic device. This paper
focuses on the ergonomic design process of the hardware.
</abstract>
<keywords> MIDI controller, electronic musical instrument, musical instrument design, ergonomics, playability, human computer interface. </keywords>
</document>
<document>
<name>nime2003_083.pdf</name>
<keywords> Chemical music, Applied chemistry, Battery Controller. </keywords>
</document>
<document>
<name>nime2003_087.pdf</name>
<abstract>
This report details work on the interdisciplinary media
project TGarden. The authors discuss the challenges
encountered while developing a responsive musical
environment for the general public involving wearable,
sensor-integrated clothing as the central interface and input
device. The project's dramaturgical and
technical/implementation backgrounds are detailed to
provide a framework for the creation of a responsive hardware
and software system that reinforces a tangible relationship
between the participant's improvised movement and musical
response. Finally, the authors take into consideration testing
scenarios gathered from public prototypes in two European
locales in 2001 to evaluate user experience of the system.
</abstract>
<keywords> Gesture, interaction, embodied action, enaction, physical model, responsive environment, interactive musical systems, affordance, interface, phenomenology, energy, kinetics, time constant, induced ballistics, wearable computing, accelerometer, audience participation, dynamical system, dynamic compliance, effort, wearable instrument, augmented physicality. </keywords>
</document>
<document>
<name>nime2003_091.pdf</name>
<abstract>
We present a sensor-doll interface as a musical outlet for
personal expression. A doll serves the dual role of being both
an expressive agent and a playmate by allowing solo and
accompanied performance. An internal computer and sensor
system allow the doll to receive input from the user and its
surroundings, and then respond accordingly with musical
feedback. Sets of musical timbres and melodies may be
changed by presenting the doll with a series of themed cloth
hats, each suggesting a different style of play. The doll may
perform by itself and play a number of melodies, or it may
collaborate with the user when its limbs are squeezed or bent.
Shared play is further encouraged by a basic set of aural tones
mimicking conversation.
</abstract>
<keywords> Musical improvisation, toy interface agent, sensor doll, context awareness. </keywords>
</document>
<document>
<name>nime2003_095.pdf</name>
</document>
<document>
<name>nime2003_109.pdf</name>
<abstract>
In the project Sonic City, we have developed a system that
enables users to create electronic music in real time by walking
through and interacting with the urban environment. We
explore the use of public space and everyday behaviours for
creative purposes, in particular the city as an interface and
mobility as an interaction model for electronic music making.
A multi-disciplinary design process resulted in the
implementation of a wearable, context-aware prototype. The
system produces music by retrieving information about
context and user action and mapping it to real-time processing
of urban sounds. Potentials, constraints, and implications of
this type of music creation are discussed.
</abstract>
</document>
<document>
<name>nime2003_116.pdf</name>
<abstract>
The role of the face and mouth in speech production as well as
non-verbal communication suggests the use of facial action to
control musical sound. Here we document work on the
Mouthesizer, a system which uses a headworn miniature
camera and computer vision algorithm to extract shape
parameters from the mouth opening and output these as MIDI
control changes. We report our experience with various
gesture-to-sound mappings and musical applications, and
describe a live performance which used the Mouthesizer
interface.
</abstract>
<keywords> Video-based interface; mouth controller; alternative input devices. </keywords>
</document>
<document>
<name>nime2003_122.pdf</name>
<keywords> Alternate controller, gesture, microphone technique, vocal performance, performance interface, electronic music. </keywords>
</document>
<document>
<name>nime2003_129.pdf</name>
<abstract>
We explore a variety of design criteria applicable to the
creation of collaborative interfaces for musical experience. The
main factor common to the design of most collaborative
interfaces for novices is that musical control is highly
restricted, which makes it possible to easily learn and
participate in the collective experience. Balancing this
trade-off is a key concern for designers, as this happens at the
expense of providing an upward path to virtuosity with the
interface. We attempt to identify design considerations
exemplified by a sampling of recent collaborative devices
primarily oriented toward novice interplay. It is our intention
to provide a non-technical overview of design issues inherent
in configuring multiplayer experiences, particularly for
entry-level players.
</abstract>
<keywords> Design, collaborative interface, musical experience, multiplayer, novice, musical control. </keywords>
</document>
<document>
<name>nime2003_135.pdf</name>