From: ALAS, Thu, 13 Dec 2001 18:18:02 +0100 (CET)
Subject: [Nettime-bold] TRANSDANCE REPORT
TRANSDANCE REPORT - "e-phos 01", Athens' festival of digital culture ("phos" is Greek for light)

Apologies for cross-posting.

Here you will find the final report of the research lab on body, motion and technology, "TRANSDANCE", produced and hosted by the festival "e-phos 2001" in Athens, 23-31 May 2001. For more info and photos see www.filmart.gr. For those who are interested: enjoy.

--------------------------------------------------------------------------------

"TRANSDANCE"
Research Lab on Body, Motion and Technology
Organised and hosted by the festival "e-phos 2001"
23-31 May 2001, Athens, Greece
By Scott deLahunta (UK/NL)
Description:

The TRANSDANCE research laboratory was conceived and organised by Yiannis Skourogiannis of ALAS as a part of "e-phos 2001", the 3rd International Festival of Digital Culture, held from 23 May to 2 June in Athens. "e-phos 2001" was devoted entirely to BODY KINESIS and BODY ANAMORPHOSIS and included a wide range of activities such as a telematic dance performance, a multimedia theatre performance, a live electronic music festival, a video games festival, a festival of documentaries on art, an SM fashion show, lectures, and new media exhibitions. TRANSDANCE was advertised on the
website http://www.filmart.gr 
as a 'dance and technology' research lab on 'body, movement, 
technology'. The dates of the research lab were 23-31 May, 2001, the precise 
location was in two warehouses located behind IME (Foundation for the Hellenic 
World) at 254 Pireos str., Athens, Greece. The lab was structured as a 
research project for professional artists with established practices. This means 
there was no separation between 'students' and 'teachers', and all learning took 
place in the context of peer to peer exchange. The international selection of 
invitees came from a diverse range of artistic backgrounds: electronic music, 
the visual and theatre arts, dance and performance art, interactive/ digital 
media and net art. They were: Sophia Lycouris (UK); Jenny Marketou (USA); John 
McCormick (AU); Konstantinos Moschos (GR); Alexandros Psychoulis (GR); 
Konstantinos Rigos (GR); Yacov Sharir (USA); Christian Ziegler (DE). My role was 
described as research or process advisor for the project. The production 
coordinator was Maria Softsi, mariasof@compulink.gr. 
   Summary: 
 The TRANSDANCE (always uppercase) 
research laboratory explored a variety of interfaces between the physical and 
virtual worlds. While taking the theme of 'dance and technology' as a starting 
point, TRANSDANCE supported a wider range of conceptions of the physical body or 
bodies, from the trained to the everyday, the social and the collective. It 
focussed on the virtual space as a networked space that can function as a 
performance space, a shared, creative, social and playful space. Through 
exploring interference and mapping processes, the participants worked towards 
realising the transformative possibilities inherent in emerging technologies. 
The lab has given rise to three extended projects (an animation project, a telematic project and a documentary). Hopefully the following report, presented as a set of open conceptual tools and methodologies, will help disseminate the results of the research to the wider community, where further artistic investigation needs to continue to inform technological developments in these areas.

The conditions for research:

Before TRANSDANCE, I had
participated in four research projects of varying scale involving digital media, 
electronic networks, live performance and choreography (Migratory Bodies, 
Chichester College of Higher Education [UK], Summer 1998; Digital Theatre 
Experimentarium, Aarhus University [Denmark], Winter/ Spring 1999; Hot Wired 
Live Art, Bergen Electronic Arts [Norway], Winter 2000; Cellbytes, Institute for 
Studies in the Arts [Phoenix, AZ], Summer 2000). These projects each brought 
together a range of creative expertise, e.g. choreographers, dramaturges, 
composers, writers, digital media artists, programmers, scripters, graphic 
designers, video/ filmmakers, telematic and installation artists, etc. They have involved a variety of technologies, from basic audio, video and graphics editing to interactive systems (sensors/ triggers), mobile technologies and high end motion capture systems. Each project has involved building a new, or using an existing, electronic data network to a) facilitate the sharing of materials and b) support real-time performance interaction. As one might expect, the research
agendas and conditions for these projects have varied widely, depending on the 
mix of organisers, participants, cultural/ institutional contexts, funding and 
resources available, physical location, preparation work, etc. The aims and 
objectives of each project have not always been very explicit, partly because of 
the difficulty of knowing precisely what these can be beforehand. Usually, some area of technology research to be coordinated with an exploration of live performance forms is articulated (as was done for TRANSDANCE). Often, some
general cultural themes having to do with the transformation of the physical 
world confronted with emerging technologies are taken as a starting point for 
content exploration. The collaborative nature of these events is sometimes made explicit and treated as an object of analysis during the working process, and at other times not. In all of these projects, an effort was made to present something at the end of the event in order to give public access to the work that was done. Other forms of public dissemination of research outcomes have included project-related videos, CD-ROMs, websites and journal articles. Each of the projects mentioned
above was a rich and productive environment for learning and exchange, but 
amongst these TRANSDANCE provided an unprecedented mixture of technical 
expertise and facilities, diversity of artistic approaches and the space and 
time to do some very focussed and specific research work.

The conditions for TRANSDANCE:

The organisation of the TRANSDANCE
research laboratory followed a series of lectures on digital and interactive 
dance organised for the Festival of Dance of Kalamata in July 2000 by Yiannis 
Skourogiannis and the ALAS team. His e-mail of 4 September 2000 to me outlined 
the initial concept for the TRANSDANCE May 2001 event as follows: "... the 
invited artists will be provided the necessary means to work towards a completed 
event or concept that will use either the physical space, or the virtual space, 
or the combination of both." The preparations over the next 
several months were mostly left to Yiannis until we had a confirmed list of 
participants. Following this, I took on a greater role as process advisor for 
TRANSDANCE, which involved making regular contact with the participants and
organisers via an electronic mail list (yahoogroups.com), identifying what 
resources would be made available and what sort of research everyone would be 
interested in pursuing (for a short list of the hardware/ software that was 
available see below). From these discussions, two main research areas were 
specified: 1) to set up web streaming with possible influence from viewers/ an online audience; 2) real time 3-D environments. There was also an interest in exploring some scenographic/ installation possibilities in the physical space, but due to various circumstances (e.g. the Vicon system took up much of the space) it was decided to place less emphasis on this area.

"Web streaming"
refers to the use of technologies such as Real Player (http://www.real.com/) and QuickTime that are able to compress and deliver audio/ video to the desktop via what is referred to as a 'live' stream. It is a popular technology for broadcasting over the internet; the
player software for viewing the streams is available for free and often comes 
bundled with browsers such as Microsoft's Internet Explorer. The lab 
participants were interested in going beyond the broadcast model and exploring 
the interactive possibilities of using live streaming with the involvement of an 
audience. Although we had on hand the StreamGenie, Pinnacle's portable system for live, multi-camera webcasting (http://www.pinnaclesys.com), it proved difficult to explore this area in depth, as this would have required additional resources such as an online server and more technical expertise to support artistic experimentation in the streaming medium. (For some
artistic work already done using the possibilities of streaming media please see 
John McCormick's site http://www.companyinspace.com/home 
and Jenny Marketou's Smellbytes site http://smellbytes.banff.org/) We did have the technology and 
expertise to move forward in the second research area: real time 3-D 
environments. For this, we had the unusual good fortune of being able to work closely, and for almost the entire laboratory, with high end Motion Capture technologies. Briefly, Motion Capture refers to the computer hardware and software that makes it possible to record a digital 3-D representation of moving bodies. Recording sessions involve placing markers or sensors at strategic positions on the body to provide the basic information for the computer software. The expense of these systems, which includes the cost of the equipment as well as the expertise to run it, is quite high, with developments being driven primarily by industries such as medicine, the military, entertainment and advertising that have the necessary capital. These costs make
it difficult to pursue investigative artistic work. For some insight into recent 
uses of Motion Capture technologies in the field of dance go to http://www.arts.uci.edu/lnaugle/html/mcs/. We were informed quite early on 
that there would be a "state of the art" Vicon Real Time (http://www.vicon.com) Motion Capture system brought over from the United Kingdom and installed for us to work with, including technical support. It is my understanding that this was arranged as an
exchange with the Athens based AMY Digital Video company (http://www.amy.gr/amydv). AMY provided the 
technical facilities and support for the lab and had access to the Vicon system 
for the purpose of marketing and demonstration. The system installed for 
TRANSDANCE used twelve high resolution infra-red cameras to capture the positions of 20-plus reflective markers placed on the performer. To this, John McCormick
was able to add another Motion Capture system, an electro-mechanical suit often 
referred to as an "exoskeleton" made by Analogus / Meta Motion (http://www.metamotion.com/) and called the 
"Gypsy". This system is able to sense, capture and process the motion data in 
the suit itself. Both of these systems would be able to drive an animated 
character in real time through Kaydara's FilmBox Motion Capture software (http://www.kaydara.com/). With these systems, one is able to 
move in the motion capture suits (either wearing Vicon's marker suit or the 
Gypsy exoskeleton - or both at the same time) and simultaneously drive a three 
dimensional animation in the digital space of the computer. From a commercial broadcast industry perspective, this is often referred to as Performance Animation, meaning real time animations can be used in the context of live media events - familiar examples are a weather announcer on the local television station giving up-to-date forecasts in some animated form, or live actors in remote locations being combined as animated characters sharing the same scene. From a dancer's perspective, these systems make it possible to watch one's own movement in real time from any angle, from directly below to directly above, and, despite the encumbrances of the respective body suits, this has as yet unexplored possibilities as a movement visualization system for the dancer.
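As a rough illustration of what such a visualization involves (a minimal sketch in Python with hypothetical marker data, not the FilmBox implementation), the view from an arbitrary angle can be produced by rotating the captured marker coordinates before flattening them to screen coordinates:

    import math

    # One frame of hypothetical motion capture data: marker name -> (x, y, z) in metres.
    frame = {
        "head": (0.0, 1.7, 0.1),
        "left_wrist": (-0.4, 1.1, 0.3),
        "right_ankle": (0.2, 0.1, -0.1),
    }

    def view_from(markers, azimuth_deg, elevation_deg):
        """Rotate the markers so the viewer looks from the given angles, then drop
        the depth axis to obtain 2-D screen coordinates."""
        az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
        screen = {}
        for name, (x, y, z) in markers.items():
            # spin around the vertical (y) axis
            x1 = x * math.cos(az) + z * math.sin(az)
            z1 = -x * math.sin(az) + z * math.cos(az)
            # tilt around the horizontal (x) axis to look from above or below
            y1 = y * math.cos(el) - z1 * math.sin(el)
            screen[name] = (x1, y1)
        return screen

    print(view_from(frame, azimuth_deg=45, elevation_deg=90))    # seen from directly above
    print(view_from(frame, azimuth_deg=0, elevation_deg=-90))    # seen from directly below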
Exploring real time interaction in 3-D environments evolved into a primary research trajectory of the TRANSDANCE
laboratory. We were able to demonstrate in the final presentation a scenario 
that involved Jenny Marketou performing everyday domestic actions (e.g. cleaning 
the space, etc.) wearing the exoskeleton while sharing the same digital/ virtual 
space with a pre-recorded animation of one of the other participants. Jenny's 
wrist movements were mapped to the position of the other animation in space 
(vertical and axis orientation) so that, as she performed her simple everyday tasks, the audience could see on the screen the outcomes of her actions in this shared virtual space. This demonstration built a representational bridge between
a prosaic set of activities and a highly technologised, non-everyday virtual 
space. Jenny was also able to interact in the physical space with audience members, making this connection between physical and virtual spaces more explicit. This was by no means a finished artistic work, but it exemplified how a research laboratory can produce an effective working demonstration of the artistic possibilities of a set of technologies. Out of this research, plans are underway to organise a larger-scale telematic performance event linking three or four Greek islands in the Aegean, using some of these technologies and advancing some of the explorations made at TRANSDANCE.
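To give a concrete sense of the kind of mapping used in the demonstration described above (a sketch only; the marker names, ranges and scaling are hypothetical rather than the actual setup), the wrist coordinates arriving from the exoskeleton can be rescaled each frame into the vertical position and axis rotation of the pre-recorded character:

    def rescale(value, in_min, in_max, out_min, out_max):
        """Linearly rescale a value from one range to another, clamped to the output range."""
        t = (value - in_min) / float(in_max - in_min)
        return max(out_min, min(out_max, out_min + t * (out_max - out_min)))

    def map_wrist_to_character(wrist_x, wrist_y):
        """Map the performer's wrist position (hypothetical ranges, in metres) onto the
        pre-recorded character: its height in the virtual space and its rotation
        around its own vertical axis."""
        height = rescale(wrist_y, 0.5, 2.0, 0.0, 3.0)        # metres above the virtual floor
        rotation = rescale(wrist_x, -1.0, 1.0, 0.0, 360.0)   # degrees around the vertical axis
        return height, rotation

    # Called once per captured frame, e.g. while a table is being wiped at wrist height:
    print(map_wrist_to_character(wrist_x=0.25, wrist_y=1.1))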
Working at the level of the data: interference/ mapping/ systems

In his useful survey of the field of electronic, communication, video and computer art, Art of the Electronic Age (1993), Frank Popper writes: "Although digital processing is
more than a mere improvement in the treatment of the image, and although 
computer editing may dramatically change the traditional concepts of 
image-making, the main breakthrough in this area takes place in the synthetic 
generation of the image. Being a virtual image produced by mathematical 
formulae, the video image, unlike the traditional pictorial image, can only be 
considered as a proof of the model it simulates, not as a copy of a pre-existing 
object or model in the real world. Moreover, a three-dimensional synthesis 
enables the artist to intervene not only on the image, but inside the image. 
Image has become architecture, a space to visit, to explore in various ways. 
Editing, often highly sophisticated, has been replaced by a scenographic 
concept." pp. 76-77 A long quote, but it sums up a 
fundamental difference between the images we are accustomed to seeing on 
television and in the movies, which are rendered as two dimensional fixed 
entities, and the possibilities for developing digital artistic practices that 
expand on the new possibilities inherent in the production and manipulation of 
digital objects (images, sounds, texts, graphics, etc.). We can find the same 
concepts covered by other writers on new media, for example, Lev Manovich's 
recently published (MIT Press 2001) The Language of New Media in which Manovich 
attempts to develop useful terminology for the analysis and understanding of the 
processes and products of digital media. He describes a set of five "principles 
of new media" and one of these in particular, the principle of "Numeric 
Representation", outlines 
the underlying structures of digital, programmable media in ways that support 
Popper's proposal that the digital artist can intervene not only on the image, 
but inside the image. This ability to work with the numeric properties of a digital image or sound means that, in artistic terms, the basic material of the new media/ digital artist is not necessarily the image or sound itself, which is essentially a representation or manifestation of underlying numeric representations or mathematical formulae (although this view does not take the needs of an audience/ viewers into account). Essentially these underlying numeric representations can be broken
down further and used to represent a variety of "surface" media. Surface media 
refers here to the image or sound, text or graphics that are the generally 
accepted new media means for communicating and producing meaning for the 
viewers/ users. Generally speaking, today's average computer user/ consumer does 
not grasp the underlying numerical systems that lie at the heart of computation. 
However, for an experimental (non-traditional) artist working with new media, it is normally not sufficient simply to manipulate the surface media, as this does not allow for an interrogation of the basic materials or principles of the digital media - as defined by both Popper and Manovich.
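A toy illustration of what working at the level of the data can mean in practice (my own sketch, not something built at TRANSDANCE): the very same sequence of numbers can surface either as an image or as a sound, depending only on how it is interpreted:

    import math
    import struct
    import wave

    # One underlying numeric representation: 8000 values between 0 and 255.
    data = [int(127 + 127 * math.sin(i / 20.0)) for i in range(8000)]

    # Surface 1: read the numbers as 80 rows of 100 greyscale pixels (plain PGM image).
    with open("surface_image.pgm", "w") as img:
        img.write("P2\n100 80\n255\n")
        img.write("\n".join(" ".join(str(v) for v in data[row * 100:(row + 1) * 100])
                            for row in range(80)))

    # Surface 2: read the very same numbers as one second of 8 kHz, 8-bit audio.
    snd = wave.open("surface_sound.wav", "wb")
    snd.setnchannels(1)
    snd.setsampwidth(1)            # one byte per sample, matching the 0-255 range
    snd.setframerate(8000)
    snd.writeframes(struct.pack("8000B", *data))
    snd.close()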
For TRANSDANCE, interference became the operative metaphor for working with the technologies available to us - many of which mainly target the user/ professional/ specialist who prefers to work in the more traditional sense of manipulating the surface representations of the media. To explain a bit
further, the StreamGenie system (mentioned in detail above) and DPS Velocity (a broadcast television video editing system, http://www.dps.com) were two hardware/ software combinations we had access to that are designed as increasingly miniaturized and transportable broadcast studios. Their dozens of editing features are designed to produce endless graphical variations and combinations of image, sound and graphics. However, such systems are generally built to support an industry that is not in a position to interrogate or practice modes of interference in the images, sounds and graphics that it needs to produce in seemingly never-ending new (re)combinations for the consumer marketplace.

This is what is significant about
organising an artistic research laboratory such as TRANSDANCE. David Chalkidis, 
from the commercially oriented AMY, summed it up for me in a short discussion we 
had about their support for the project by saying that the technology is 
developing so fast that those producing and selling for the market and the 
consumer do not have the time to keep up with and explore how best to use these 
new tools. For David, this is the role the artist can play, and he and his brother Alex are committed to trying to put these new media tools into the hands of artists to explore. I think I speak here for all of the artists who participated in the project when I say that AMY's support for the laboratory (including the Vicon Motion Capture support team of David Lowe and Tim Doubleday) was exemplary, beyond anything any of us had experienced before in similar research situations.

We wanted to interfere with the
digital images, sounds, etc. by getting at the core of the digital media to the 
level of the data, and we explored the possibilities in three or four different 
scenarios. One of these was with the Motion Capture system in which normally 
three streams of information per marker or sensor are received by the computer 
to drive the animations. These three streams correspond roughly to the X, Y and Z coordinates of the Cartesian coordinate system, the culturally accepted mapping of physical space we still rely on today - despite the fact that Descartes devised this coordinate system almost 400 years ago. Another of our research aims was to
try and map one of these data streams across the network to 
drive sounds being synthesized in Kostas Moschos' computer. This would link the 
movement of someone wearing one of the Motion Capture suits (Vicon or 
Exoskeleton) to the sound synthesis patches Kostas had programmed in MAX. There 
would be too much data if one were to take all the coordinate information from 
one marker, so this would require being able to strip out the data stream of one 
of the coordinates and send it over the network to Kostas' computer. In the end, we were unable to accomplish this mapping in the time allotted due to constraints in the Kaydara FilmBox software, at the time our only means of accessing the real time motion data streams in the first place. Although we failed at the task, discoveries were made in the process that may enable a faster resolution of the problem in the future.
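A minimal sketch of what this mapping would have involved, assuming (hypothetically) that the motion data could be read frame by frame and that the patch on Kostas' computer was listening for simple UDP packets - neither of which we managed through FilmBox at the time:

    import socket

    SOUND_MACHINE = ("192.168.0.20", 9000)   # hypothetical address/port for Kostas' computer
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def send_one_coordinate(marker_frame):
        """Strip out a single coordinate - here the vertical (Y) value of one wrist
        marker - and send just that number across the network as a small text packet."""
        x, y, z = marker_frame["right_wrist"]
        sock.sendto(str(y).encode("ascii"), SOUND_MACHINE)

    # Called for every frame arriving from the motion capture system, for example:
    send_one_coordinate({"right_wrist": (0.42, 1.13, -0.07)})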
Working for several days to solve a technical problem may seem at odds with an artistic process, in particular when
the problem is not solved. If indeed we had accomplished this mapping of the Motion Capture data to the sound, the question could still have been raised: so what do we do with this capability now that we have it? This question needs framing from different perspectives. Firstly, solving the technical problem of
linking motion capture to sound using these particular systems is a step forward 
in that it gets the software and hardware to do something it was not designed to 
do. It interrogates or interferes with the software/ hardware system as an agent 
for the marketplace and opens up other options for thinking creatively about 
technology research and development. This is what might be described as solving 
a technical problem within an aesthetic framework. The resulting solution can be 
shared as a technical tool amongst a larger range of practitioners, enabling 
them to experiment with the results in other artistic contexts. Shared or disseminated as an open methodology (similar in concept to 'open source'), the
technical solutions find a manifestation in material form 
elsewhere.

As mentioned above, we were successful with another mapping process: linking the movements of Jenny Marketou to another virtual character in the 3-D space. In
addition, data streams were extracted from another process using NATO.0+55 
modular, a software programme that facilitates cross media synthesis, and sent 
to Kostas Moschos, as will be described in more detail below.

Interference and Mapping may describe two forms of artistic process, but the
diversity of artistic practice represented by the TRANSDANCE participants 
inspired the formation (or appropriation) of a conceptual tool I found quite 
useful as a pragmatic way of framing the interrelationships between 
participants, technologies and processes. This was to loosely employ the concept 
of self-generating systems across the wide range of these 
interrelationships. Thinking in systems can be rather easily applied to a technology, e.g. a network that may be an open or a closed system. A closed network system might refer to a setup with input and output and perhaps one or two machines on it, with no access to a wider network. Such a 'closed system' network can enable the prototyping of certain artistic concepts more easily than an open network, for example. Once set up, such a system can be regarded as stable for the purposes of an intensive collaborative research process.

I am interested in applying this
concept of 'systems' more broadly to further enable generative working 
conditions and cross practice fertilizations in the circumstances of a research 
laboratory such as TRANSDANCE. (While this conception was not employed 
explicitly during TRANSDANCE, several participants contributed to its formation, 
in particular Christian Ziegler.) The blurring of boundaries around various
traditional forms of artistic practices appears superficially to disable 
convention and enable experimentation and perhaps emergent art forms. This has 
always seemed an overly simplistic view to me when applied generally across all 
circumstances as it so often is under the heading of the 'interdisciplinary'. 
There seems an even greater need these days to be able to apply a self-referential system to arts practices of all kinds in order to re-enable the interpenetration of practices and the potential for emergent, unexpected phenomena. This should be on a
contingency basis, a flexible and workable set of protocols that can be applied 
to the situation as necessary and enable relocation and migration of certain 
aspects of practice between various systems more easily. In TRANSDANCE, for example, we had choreographers, digital artists, visual artists, net artists, performance artists and electronic musicians. Each of these categories implies a self-referential system in the form of historical and philosophical continuities, of communities and cultural production networks, that provides a sense of coherence to any one of these categories of arts practice. 'Categories' might be an alternative term to use, but it does not appeal as much as the notion of 'systems'. Taken more broadly, systems might be seen as social and cultural, and indeed the concept has been applied to both biological and social systems by theorists working from the General Systems Theory developed in the 1950s.
However, it is beyond the scope of this report to go into further detail. I
share it here as a conceptual tool I found useful in these circumstances, and I 
may return to its application in the future. 
Parallel Projects: nato/ wearables/ choreograph-animation/ documentation

As this report indicates, the primary research aim of the workshop was to explore the possibilities of real time Motion Capture systems in shared 3-D environments. The sharing of
this data occurred over a high speed Ethernet (a closed system), but the Motion Capture X, Y and Z vector data itself is a relatively small data stream (compared to the full 3-D animation) and could potentially be used to drive an animation in real time on another server across the Internet. This may be explored further in another research laboratory.
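To indicate roughly how small this stream is (back-of-the-envelope figures, not measurements from the lab), assume around 20 markers, three 32-bit values each, sent 60 times per second:

    markers = 20           # approximate number of reflective markers on the performer
    coords_per_marker = 3  # X, Y and Z
    bytes_per_value = 4    # one 32-bit floating point number per coordinate
    frames_per_second = 60

    bytes_per_second = markers * coords_per_marker * bytes_per_value * frames_per_second
    print(bytes_per_second, "bytes/s, i.e. about", bytes_per_second * 8 // 1000, "kbit/s")
    # Roughly 14 kB/s (about 115 kbit/s) - modest even for a 2001 internet connection,
    # whereas full rendered 3-D animation frames would be orders of magnitude larger.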
Other research objectives were pursued in parallel to the primary research into real time 3-D environments,
e.g. Christian Ziegler migrated an existing performance software tool written in 
Director's Lingo script called SCANNED (http://www.movingimages.de/scan.htm) 
to NATO.0+55 modular (a digital cross-media synthesizer). Christian's piece 
SCANNED uses a software performance tool that plays a video image in the 
background and is able to stop the image playing one horizontal or vertical line 
of pixels at a time. These horizontal or vertical lines can be triggered as 
single lines or sequentially moving across the screen from side to side or up 
and down. Whatever image is playing behind the scan appears to be frozen in 
time. By migrating this concept to NATO, Chris has enabled new interactive 
possibilities for SCANNED as NATO comprises a set of Quicktime externals 
building on and interfacing with MAX in the same manner as MSP so that MIDI and 
numerical data can be used to control any NATO function. This will open up Chris's SCANNED system to other systems. He has migrated an existing, aesthetically coherent work from one platform to another that will offer more possibilities for transformation.
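The principle behind SCANNED can be sketched in a few lines (a schematic re-implementation of the effect as described, not Christian's Lingo or NATO code): each displayed frame is the live video with one or more previously 'frozen' lines of pixels held in place:

    # Schematic version of the SCANNED effect: freezing single vertical lines of a moving image.
    # A frame is represented here as a list of pixel rows; a real version would use video frames.

    frozen = {}   # column index -> the line of pixels captured when that column was triggered

    def trigger_column(frame, col):
        """Freeze one vertical line of pixels from the current frame."""
        frozen[col] = [row[col] for row in frame]

    def render(frame):
        """Return the frame to display: the live image with all frozen columns overlaid,
        so whatever was playing behind them appears stopped in time."""
        out = [list(row) for row in frame]
        for col, line in frozen.items():
            for r, value in enumerate(line):
                out[r][col] = value
        return out

    # Sweep the freeze across the image one column per frame, from side to side:
    width, height = 8, 4
    for t in range(width):
        live = [[(t + r + c) % 256 for c in range(width)] for r in range(height)]  # stand-in video
        trigger_column(live, t)
        display = render(live)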
NATO.0+55 modular has many features, usually referred to as 'patches' because of the way it interfaces with MAX. The Difference plugin and Quick Draw were two used during the final presentation of the research laboratory - each set to analyze motion from a video source in different ways and output this data to sound and image.
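Frame differencing of this kind - comparing successive video frames and turning the amount of change into a control value for sound or image - can be sketched generically as follows (an illustration of the technique, not the actual NATO patch):

    def motion_amount(previous_frame, current_frame):
        """Return a value between 0.0 and 1.0 describing how much the image changed
        between two frames, by summing absolute differences of greyscale pixels."""
        total = 0
        count = 0
        for prev_row, cur_row in zip(previous_frame, current_frame):
            for p, c in zip(prev_row, cur_row):
                total += abs(c - p)
                count += 1
        return total / float(count * 255)

    # The resulting value can then drive a sound parameter, e.g. amplitude:
    frame_a = [[10, 10, 10], [10, 10, 10]]
    frame_b = [[10, 60, 10], [10, 10, 200]]
    amplitude = motion_amount(frame_a, frame_b)
    print(round(amplitude, 3))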
Chris's research was of a very practical nature and involved many hours "inside the machine" studying and
problem solving. At the same time, a conceptual project was evolving with the 
emergence of the notion of the everyday user's body interfacing with the virtual 
space. This conceptual project was founded on the presence of three technology 
systems offering to provide an interface between physical and virtual space that 
would use the whole body instead of just the fingers. Two of these systems have already been mentioned: the Vicon Real Time and the Gypsy Exoskeleton motion capture systems. A third system was available: the wearable computer that choreographer/ dancer Yacov Sharir had brought with him from the University of Texas at Austin.

The wearable computer is clearly
something we are inching closer to day by day as computing science and 
engineering research laboratories focus on a future in which wearable computers 
are assimilated into our world. The wearable is already embraced by mobile workers from telephone repair to Federal Express, by the fashion industry both as a cultural statement and as a means of collective communication, and in the fields of leisure and exercise, where the wearable can monitor vital signs such as heart and respiratory rate (see the Lifeshirt: http://www.lifeshirt.com/). The concept of the wearable
computer has penetrated live performance in the field of electronic music and to 
a lesser extent in the fields of theatre and dance. One example of this is Marcel.li Antunez Roca's AFASIA, which was performed at the "e-phos 2001" Festival (http://www.filmart.gr). In this performance, Marcel.li wears an exoskeleton that allows him to interact with and control sound, multimedia images, video and robots.

In the dance field it is more common to find artists working with interactive motion sensor or motion capture systems.
This has partially to do with the emphasis on unrestricted motion in dance. 
Generally, the 'wearable computer' introduces some motion constraints on the body, apparently rendering it less than ideal for the dancer/
performer. However, in Athens, partially due to the presence of the wearable and 
the nature of the motion that can be performed in it, we were able to engage in 
questioning the assumptions regarding full body motion that usually come bundled 
with the concept of choreography and dance. Yacov's wearable has been designed 
with the intention of being able to wirelessly control live performance 
material. However, the world of wearable computing suggests less the specialist functions of an artist and more the sort of technological systems we may, in some not too distant future, be integrating into our daily moment-to-moment existence (as mentioned above). Yacov's wearable consists of a small computer mounted in a heat-insulated vest along the surface of his body, with a small keyboard strapped to his wrist and a tiny head-mounted video display. The system wirelessly transmits data to a server, enabling Yacov to control and manipulate media in real time in a live performance. Some of this
data includes signals from EEG and EKG electrodes that he can place on his body 
during performances. While the conditions weren't right for us to experiment 
extensively with the data we might have received from this technological system, 
the presence of Yacov's wearable at TRANSDANCE helped to open up some of the 
conceptual terrain we explored in the laboratory.

****************************************************
 Two further parallel projects 
evolved during the laboratory. For one of these, approximately 20 minutes of high quality motion capture data was recorded using the Vicon Real Time system, capturing choreographer/ dancer Konstantinos Rigos improvising several short segments of varied movement material. This motion capture data was turned over to Rigos and a professional MAYA animator, Spyros Frigas, to collaborate on the making of a short animated film to be realised at some point in the future.

Final mention in this report goes
to the documentary project begun by interactive installation artist Alexandros 
Psychoulis during TRANSDANCE. Alexandros observed and filmed the laboratory and 
interviewed all the participants. He edited together two short clips, from the first and second halves of the lab, that proved invaluable when shown to the public to help them understand the process of the research. These short clips were
constructed to be shown in the context of the laboratory and with some 
explanation. Alexandros and Yiannis Skourogiannis are in the process of raising 
funds to make a more thorough documentary to be shown to the public. This 
subsequent documentary, when completed, will be an important additional means of 
disseminating the objectives and outcomes of the research process of 
TRANSDANCE.

Scott deLahunta
Writing Research Associates, NL
Sarphatipark 26-3, 1072 PB Amsterdam, NL
mobile: +44 (0)797 741 2060 [messages too]
fax: +44 (0)845 334 2931
email: sdela@ahk.nl
http://huizen.dds.nl/~sdela/main.html

Scott deLahunta BIO:

Began in the arts as a dancer and
choreographer. Since 1992, as a partner of Writing Research Associates (WRA), he 
has organised several international workshop/ symposia projects in the field of 
performance including recently the third session of Conversations on 
Choreography at the Institute for Choreography and Dance, Cork, Ireland. From 
February-May 1999, Mr. deLahunta was a guest professor with the Department of 
Dramaturgy, Aarhus University, Denmark where he was also co-organiser of the 
Digital Theatre Experimentarium, a project investigating the relationship 
between motion capture, animation and live performance. He is frequently invited 
to facilitate workshops, give presentations and contribute to publications on 
the overlap between dance and new media technologies. In Autumn 2001, the WRA 
initiative *Software for Dancers* will conduct the first in a series of research 
labs/ thinktanks looking to develop new software tools for performance 
artists. 
 "e-phos 2001'' artistic director: Yiannis Skourogiannis 57 Archimidous GR-11636 
Athens tel:00301-7520064-5 fax:00301-7520064 |