DAILY ART NEWS @ ARTINFO
Are Flagan's review "Sign of the net.art times" (published without
translation, since the book itself will probably not be
translated :)
In his influential book The Language of New Media, Lev Manovich
prominently listed "transcoding" among the founding principles of new
media. Discussing the digital practices and operations arising to merit the
debated shift into "new," he singled out the ability of numerically
encoded media objects to translate or transform themselves, with unprecedented
ease and according to hitherto unfamiliar properties and coordinates. Coupled
with the widespread computerization of all media (still and moving images,
sounds, texts, etc.), this technology-driven metamorphosis moreover influences
attendant cultural categories and concepts, as Manovich succinctly notes: "Because
new media is created on computers, distributed via computers, and stored and
archived on computers, the logic of a computer can be expected to significantly
influence the traditional cultural logic of media; that is, we may expect that
the computer layer will affect the cultural layer." [1] Although the
transcoding concept has received its due share of attention since the book's
publication last year, frequently being quoted as the prime example of "old"
cataclysms, the associated grammar of principles has largely ignored many common,
more pragmatic, uses and applications of the term. At its computing root,
transcoding obviously regulates and facilitates the play of presence and absence
through math and logic; thereby making its operations active across a vast yet
proprietary field, ranging from the foundations of western metaphysics to the
latest electronic switches. Considered broadly, then, along with its profound
dispersal, which significantly returns to the consolidating principles deployed,
the impending gravity of computer transcoding is consequently, and not only
epistemologically speaking, immense. To avoid the neighboring black hole of
sweeping generalizations compiled in rounded nutshells, this brief essay will
attempt to theorize some aspects of this pervasive impact through specific and
prominent trends in contemporary net.art.
To once more narrow the focus on these preoccupations, one can in retrospect
appreciate that even the earliest net.art controversies over unauthorized
mirroring were less about repeating the simulacra of postmodernism, which had
already been exhaustively explored through the medium of photography in the
preceding decade, than about revisiting questions of authenticity and
authority through the added momentum of transcoding. The act of mirroring, seen
here as always in a differentiated yet fulfilling presence, in the 1999 actions
of 0100101110101101.org not only cloned the destined-for-stardom site
jodi.org byte by byte under another domain name but also downloaded and offered
a subversively altered version of Art.Teleportacia, the first art gallery for
the Web. Negotiating these mirror(ing) phases obviously cast a long backward
glance at postmodern questions of replication and reproduction, but it also
recognized that the cumulative ability to transfer, transport, translate and
transform, all subsumed and made available under transcoding, had leveled the
playing field for a rather predictable set of artistic games to begin anew in a
pioneering context. If we leap three giant net years ahead to the present, an
attentive look at some recent entries in the net.art catalog will draw
attention to a subsequent and related strategy that has become increasingly
popular among dedicated practitioners. A striking number of current works
literally employ and repeat what one may term an expansive approach to the
transcoding principle: they collect and/or generate structured data through
various, often rather novel, forms of input and then output this in a scrambled
appearance, regularly on rather abstract terms and generally according to very
simple rules.
To better illustrate this rapidly overflowing genre, three projects may suffice:
Taxi Art, [2] produced by SAS Design in London, uses the GPS tracking of London
taxis, which is already done for booking reasons, to offer visitors to the site
a series of choices for an online artwork drawn by the humdrum path of taxis on
the streets. First pick your minimalist and formalist preference for aesthetics
that largely resemble pie charts or graphs in the form of lines or circles, then
watch the drivers negotiate the traffic to render your masterpiece. The result:
a GPS doodle of urban corridors that, from a cartographic point of view, would
probably require that you immediately hail a cab to get around without getting
lost. Another recent example is Goodworld by Lew Baldwin, which can be found on
the Whitney Museum's lofty artport site. [3] Here you pick any URL and let the
site transform your location into colorful blobs for images, where the color
field is an aggregate of dominant RGB values in the original, and emotive smiley
faces for text. An almost analogous gig for music is the developing WebPlayer
[4] by Pete Everett, which currently prepares the stage for a filtering of a
URL into soft, luscious sounds transcoded from the ASCII values of the hypertext,
sans recurring code brackets. Somewhat unexpectedly (unless you first read the
process notes, which pay homage to how mathematically inspired composers turned
repetitive numbers--base note sequences--into sweet music), the result resonates
more like naturalistic jingles from the oceans than past sounds sampled from
data and voiced by tinny 386 processors to strike a distinctive digital note.
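None of the three projects publishes its actual mapping rules, so the following sketch only illustrates the general gesture they share: one and the same byte stream is folded once into an aggregate color field (in the spirit of Goodworld) and once into a run of pitches (in the spirit of WebPlayer). Every function name and mapping rule here is an invented assumption for demonstration, not any project's code.

```python
# Hypothetical transcodings of a text's ASCII byte values: the same data
# output once as color, once as sound. All mappings are illustrative.

def dominant_color(text: str) -> tuple[int, int, int]:
    """Fold the byte values of a text into one aggregate RGB triple."""
    data = text.encode("ascii", errors="ignore")
    # Interleave the bytes into three channels, then average each channel.
    channels = [data[i::3] for i in range(3)]
    return tuple(sum(c) // len(c) if c else 0 for c in channels)

def to_frequencies(text: str, base_hz: float = 220.0) -> list[float]:
    """Map each ASCII value to a pitch: semitone steps above a base tone."""
    return [round(base_hz * 2 ** ((b % 24) / 12), 1)
            for b in text.encode("ascii", errors="ignore")]

if __name__ == "__main__":
    page = "<html><body>Mac</body></html>"
    print(dominant_color(page))   # an aggregate color field for the page
    print(to_frequencies("Mac"))  # the same letters rendered as pitches
```

The point of the sketch is simply that nothing in the byte stream privileges one output over the other; the choice of channel is entirely the mapping's.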
This net can easily be cast much broader and wider in all directions to catch
numerous projects that indulge in the type of transcoding alluded to. But to
save the impressions formulated thus far, we can discern the repeated
predilection toward taking ordered stacks of data and reshuffling the packets:
GPS traces in longitude and latitude turn into coordinated strokes, graphical RGB
values coalesce in bland color fields and HTTP rocks on through the speakers,
all according to Taxi Art, Goodworld and WebPlayer respectively. The reason
all this reverse-engineered data mining and logical-mathematical magic can
unfold is of course due to the common binary denominators of all data: 0 and 1.
Translated into the bitplane through binary notation, a decimal value of, let's
say, 97 will read as the series of 0s and 1s 1100001. But this string of 97,
reinterpreted through ASCII code, is in fact the "a" in the fact just presented and
represented (given that this essay does indeed appear as ASCII). And the 97 may
of course also be attributed, and reassigned, to a medium dark pixel value in an
image or the pitch of a programmed tone. Consider, then, that this 97 already
circulates around the Internet in many wrappings, from the corner of a company
logo via the central "a" in every wording of Mac to a frequency in an
embedded sound object, and you get the basic picture (or word or sound) of the
Esperanto-styled computing these projects are practicing and pointing to. Within
this mind-blowing conundrum of the computer medium lies the rationale why these
types of projects are both incessantly compelling and instantly mundane: on one
hand, since we are indeed talking binaries here, their claims to isolate the
multifarious behavior of data bits to their own limited operations subdue the
potential madness of an arbitrary bit architecture and thereby grounds protocols
in an oppositional, highly reasonable context. But, on the other hand, the
projects themselves reveal these operations to always already be active and
working away within this selfsame structure. It is not insignificant in this
regard that most net.art transcoding endeavors appear to indulge in rather
semantically poor output at the front end. In the three works discussed, we get
abstract shapes and patterns along with base sensory information scattered in
HTML grids and mellow MP3 music submerged in atmospheric harmonies. This choice,
and it is crucially a choice on the scripter's/programmer's part, basically
attempts to move away from the widely conversant computer literacy promoted by
transcoding, which implies the successive application of established protocols,
toward the linguistic plight of translation as transformation. The flexible
exchange rate of bits remains the modus operandi, but the currency of the data
outlet fluctuates in value--from ordered to scattered, meaningful to meaningless
and so on. Given the identically encoded binary
origin here, this treatment signals a distinctly asymmetrical rupture in
prevailing systems of representation and signification, making interconnected
expressions appear equal despite very obvious differences.
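The "97" walked through above can be replayed in a few lines. The grayscale and tonal readings below are assumed conventions (an 8-bit pixel intensity and a MIDI note number), chosen for illustration rather than drawn from any of the works discussed.

```python
# The same decimal value, 97, reinterpreted under different encodings.
# Grayscale and MIDI readings are assumed conventions for illustration.

value = 97

bits = format(value, "08b")            # its bitplane notation
char = chr(value)                      # ASCII: the letter "a"
gray = value / 255                     # as an 8-bit grayscale intensity
hz = 440.0 * 2 ** ((value - 69) / 12)  # as a MIDI note number (A4 = 69)

print(bits)             # 01100001
print(char)             # a
print(round(gray, 2))   # 0.38 -- a medium-dark pixel
print(round(hz, 1))     # 2217.5 -- roughly a C#7
```

Nothing in the eight bits themselves decides among these readings; only the protocol applied at the moment of output does.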
To better appreciate this fascinating move, a tangential and cursory shift into
semiology is desirable to avoid sidelining the fact that computing has, or even
is, a cultural history. Traditionally posited as a science of signs, which are
defined broadly without substance or limits, semiology operates with a
tripartite structure of sign, signifier and signified to systematically
elucidate the processes whereby any form of representation appears meaningful.
Although this premise originally looked at all sights and sounds that may, in
some form, solicit or elicit communication, it gradually turned toward the
primary intelligibility of language to study the enunciating relations. At its
core, however, and this is the crucial reference to our present concerns,
semiology was conceived as a system that, as Roland Barthes has tellingly
remarked, pursued a euphoric dream of scientificity. By first positing a model
that hypothetically supersedes language through signification, this operative
system is able to predict and precede the moment of enunciation, rendering its
inevitable emergence, in semiological jargon, a transcendental signified. In
very simple terms, one could say that the system reveals something through the
operations of the model, and it appears natural when it successfully hides this
fact. A short, chronological list covering how this science has developed, and
implying how semiology is more broadly understood in this context, may include
Charles Sanders Peirce, Ferdinand de Saussure, Roland Barthes and Jacques Derrida,
but this narrow trail of contributions to the discipline branches out just about
everywhere, for example into the psychologism furthered by Jacques Lacan, or,
for those more familiar with photographic theory, the psychosemiology of Victor
Burgin. Only roughly sketching this particular context serves to drastically
shorthand the above scenarios for how the sign, signifier and signified interact,
what roles they respectively serve within the prescribed signifying chains, and
even how or by what each entity and each link is constituted (every author
mentioned gradually gets entangled in solving questions raised by their own
arguments). But the contested point of acquiring a locus for logos, attached to
these conjectural contortions, is of course far from trivial and essentially
perpetuates the debate. The important legacy of immediate use here is that the
presupposed division of sign, signifier and signified has prevailed along with a
preponderance toward scientificity; it is of direct relevance to how the concept
of transcoding is built into computer logic, and accordingly understood and
practiced within new media theory and net.art.
Having acknowledged, in a roundabout yet very economical way, that the
distinction between signifier and signified is problematic at the root (as it
relies on the unity of the sign to make the concept present in and of itself
through, and despite, this opposition), let us turn briefly to a quote from
an interview with Jacques Derrida conducted by Julia Kristeva before returning
to a more comprehensive discussion of computer transcoding. Speaking of the
opposition between signifier and signified, Derrida notes:
That this opposition of difference cannot be radical or absolute does not
prevent it from functioning, and even from being indispensable within certain
limits--very wide limits. For example, no translation would be possible without
it. In effect, the theme of a transcendental signified took shape within the
horizon of an absolutely pure, transparent and unequivocal translatability. In
the limits to which it is possible, or at least appears possible, translation
practices the difference between signifier and signified. But if this difference
is never pure, no more so is translation, and for the notion of translation we
would have to substitute a notion of transformation: a regulated transformation
of one language by another, one text by another. [5]
Translation, to playfully paraphrase the same again in other words, implies the
seamless movement of pure signifieds across languages and texts (platforms and
formats) that the signifying apparatus itself supposedly leaves untouched. It
denies any precarious intertextuality, invoking a chain of substitutions, in
favor of an original that effectively surpasses any and all transformation.
The popular new media concept of transcoding does indeed speak of a limitless
and highly effective translatability. Coupled with the associated premise of
numerical representation, it proposes that the application of protocols to
numbers has conjured up a science that programs closure into every transaction,
every translation, and every transposition of what presents itself, in each
transmuted instance, as the transcendental identity of the signifier/signified
in a sign. There is an unprecedented equivocality at play here, one that
operates in the dark passages of hardware and comes to light through software,
and which is consequently instrumental in separating itself (and its objects)
from the elucidating passage of the signifying operations. Translation,
practiced as the aforementioned difference between signifier and signified,
consequently succumbs to a science of logical-mathematical notation. As such, it
signals the practical apotheosis of semiology, which has precisely been
conceived of as a systematized science of signs to break the metaphysical bounds.
Hence the longstanding semiotic project--founded and resolved upon the
tripartite sign, signifier, signified--reaches a certain "organic"
totality through computerized transcoding, bringing the necessary presupposition
of an a priori, an innocent and independent writing before the letter, to
communication.
What is not yet accounted for in this view (although it is of course there
through the founding signifier/signified opposition) is the move that previously
brought out the psycho prefix and applied it to semiology. The signified,
although attributed to the signifying chain that revolves around the elusive
conglomerate of a sign, may instead be part of a general psychology, a scenario
of mind over matter seeking a uniform social body with a cohesive psychology to
ground the sign in a detached collectivity. This position, explored indirectly
by Barthes through the gathering concept of myth and more directly by Burgin in
his reliance on Freudian psychoanalysis, should of course not be discounted with
regard to affective, as a counterpoint to effective, data. The very human back
and front end--the self-fulfilling cycle--of transcoding will of course always
be subject to the same semantic mysteries as any pre-digital entity when it
comes to these instructive semiotic structures. The key point, however, is that
the appearance, the coming into being, of the signifier/signified opposition
through transcoding hinges on the murky fusion of zeroes and ones: the base
metaphysical counterpoints that now crucially couple through a machine and not
mental conjunction. Although this latter digression is ripe with the usual
analogies of mind and machine, the link between semiology and psychology when it
comes to computer operations essentially broadens the usual turns of the logical
circuit by further implicating a range of associated discourses in the central
transcoding principle. Effectively, this is where the user figure comes into
play, but that's an interesting biography that remains to be written.
Despite the documented and discussed ability of transcoding to transform,
witnessed in the listed net.art works and noted via Derrida, it appears that the
representational claims to metonymy rather than analogy actually conjure up
directly translatable aspects that perceptively and conceptually manage to fully
survive this revolution. In Taxi Art, does the work not indicate a blinking
orange, signaling left or right, at every turn of the colorful geometric drafts?
Does Baldwin's Goodworld not bring an inebriated textual smile to blurry color
vision only through comparison with the clearly aliased input URL? Do you not
descend into soundscapes of corresponding hypertext when WebPlayer embarks on
its heavily transmuted aural voyage? Isolating such experiences, sensorially as
well as conceptually, makes for a far more complicated analysis of transcoding. The
effect produced and described is doubly stunning: on one hand
logical-mathematical notation refuses to confirm the, for lack of a better word,
theology of transcoding as the virgin passage of translation; on the other, it
retains an empirical contingency of unprecedented representational and
signifying power. It may very well contest the formalism of equivalence by
logically and mathematically scrambling the bits beyond recognition (in a
classical representational sense), but the overriding yet obscure science of
this operation, the alchemic feat of numbers and logic, brings an overwhelming
empirical closure to the experience, a strangely distorted yet comforting sense
of déjà vu. What sunders then ultimately unites; numbers break apart but finally
add up. The checksum of all this is that each and every one of these projects,
and they only comprise three exemplary instances of an overwhelming trend,
believes in the divine translatability of transcoding to the extent that complex
semantic devices are readily and purposefully sacrificed for an applied
metaphysics of the excruciatingly simple, reflected in Euclidean cartography (Taxi
Art), typographic emoticons that recall Platonic pure form (Goodworld) and the
omnipresent Muzak of the deep network (WebPlayer). This reductive approach to
the semiotic question obviously echoes the overwhelming progress of
logical-mathematical notation, and it does not in actuality query the unity of
the signifying division, or rather the universal scientificity of the process
that now brings it to bear so fancifully and persuasively. On the contrary, the
troublesome collaboration between applied science and metaphysics that always
promotes an omniscient empiricism has reached its apotheosis in transcoding, and
this is indeed the sign of our net.art times.
NOTES:
[1] Lev Manovich, The Language of New Media, Cambridge, MA: MIT Press, 2001,
p. 46.
[2] http://www.radiotaxis.net
[3] http://www.artport.whitney.org/gatepages/artists/baldwin/index.html
[4] http://www.twofivesix.co.uk/snd/index.html
[5] Jacques Derrida, Positions, Chicago, IL: The University of Chicago
Press, 1981, p. 20.