Showing posts with label software studies. Show all posts

Sunday, 9 January 2011

Wirelessness - radical empiricism in media theory


Adrian Mackenzie captures something extremely essential and apt in his new book Wirelessness - Radical Empiricism in Network Cultures (2010). Besides being an analysis of an aspect of contemporary "network" culture much neglected by cultural analysts, it offers a view into how one conducts post-phenomenological analysis of the intensive, moving, proliferating aspects of experience in current media culture. So much of what seems wired is actually wireless; so much of what seems experienced is actually at the fringes of solid experience, which is why Mackenzie sets out to use William James's exciting philosophical theories of radical empiricism as his guide to understanding wirelessness.

Let's define it, or let Mackenzie define it:

"The key claim of the book is that the contemporary proliferation of wireless devices and modes of network connection can best be screened against the backdrop of a broadly diverting and converging set of tendencies that I call 'wirelessness'. Wireless designates an experience trending toward entanglements with things, objects, gadgets, infrastructures, and services, and imbued with indistinct sensations and practices of network-associated changed. Wirelessness affects how people arrive, depart, and inhabit places, how they relate to others, and indeed how they embody change." (5)

Indeed, Mackenzie does not remain content to stick to the techy details or to the phenomenology of how it feels to be surrounded by wireless devices and discourses, but sets out to treat these as a continuum. This too follows from James. Things go together well with our minds/brains. Thoughts are very much things, even if at the other end of the spectrum from the more seemingly solid things of the world. Thinking and things cannot be separated. Mackenzie quotes James: "Thoughts in the concrete are made of the same stuff as things are." The stuff of continuum.

Hence, what follows is also a methodologically exemplary treatment of this weird phenomenon of wireless communication. Already in its early phase, the fact that communication started to remove itself from solid bodies and the messaging human body was a topic of awe and wonderment. James was roughly a contemporary of the buzzing discourses of electromagnetic fields and the experiments in wireless communication toward the end of the 19th century by such figures as Preece, Willoughby Smith and of course Marconi; this media archaeological aspect is not much touched upon by Mackenzie. In any case, one would do well to look at its 19th-century radical empiricist discourses as well, to examine the way bodies, solids, experience and media were being rethought in those early phases, here described in the words of one pioneer and early writer, Sir William Crookes:

" Rays of light will not pierce through a wall, nor, as we know only too well, through London fog; but electrical vibrations of a yard or more in wave-length will easily pierce such media, which to them will be transparent." (quoted in J.J.Fahie, Wireless Telegraphy, 1838-1899, p.197).

Even if not transparency, wirelessness affords new senses of mobility. For us, wireless is heavily an urban phenomenon (even if Mackenzie touches on how rural areas are being connected, peripheries harnessed, and now, also, the human body and its organs individually connected to the internet through new wireless device surgery). For Mackenzie, the mobility relates to "transitions between places" and how the hotspotting of, for example, the urban sphere creates new forms of intensity that are not stable. In his earlier book Transductions Mackenzie used Simondon's vocabulary, which offered the idea of the primacy of metastability; now James does the same trick, offering a conceptual vocabulary for an experience that is distributed, diffuse, and coming and going.

What is fascinating is how Mackenzie moves between the various scales and is still able to keep his methodology and writing intact. In addition to the fact that the urban experiences of humans are enabled by a variety of wireless devices, networks, accesses, and so forth, he is after such radical technological experience where hardware and software relations within technology matter as well. Talking about chipsets such as the Picochip202, Mackenzie compares these to cities: "The 'architectures' of chipsets resemble cities viewed from above precisely because they internalize many of the relational processes of movement in cities." (65).

The way bodies were moved and managed in urban environments has now been transposed as a problem onto the level of chips and other seemingly "only" technical solutions. Yet what Mackenzie does successfully is show how we need insights into a biopolitics that engages not only with human phenomenological bodies, but with the biopolitics of technological bodies too. This is a direction I find very exciting and necessary, and while I know some of the great work done in Science and Technology Studies, more media studies work in this direction of new materialism is very much welcome.

Now that we have got talking about technological bodies in relation, and will probably soon go so far as to say that they have affects, some critic might ask: does this not mean that we are losing our grip on politics -- that technology is such a crucial way of governing our worlds, offering meanings, and is itself embedded in a cultural field of representation and such?

Mackenzie does not, however, neglect representations, or the variety of materials of which the experience of wirelessness consists; from wireless routers to marketing discourses and adverts, the ontological claim that thinking and things do not differ works also as a methodological guideline for rigorous eclecticism. Similarly, Mackenzie shows how his methodology and writing lend themselves also to postcolonial theory in chapter 7, "Overconnected worlds". Here, the claim is consistent with a radical constructedness inherent in how transnationality and the global are created, not received, structures of experiencing; here, various wireless projects offer platforms for both belief and physical connection.

Wirelessness overflows individual bodies and acts as a catalyzer, an intensifier, a field for experience, perhaps in the sense that electromagnetic fields afford the technical signal between devices. The book itself also overflows in its richness - but it is so rigorous in its take that media theory will benefit from it for a long time. It picks up on some of the same inspiration that has been catalyzed into more philosophical takes on communication and contemporary culture by Brian Massumi, but it is one of the first to take this mode of analysis of lived abstractions into concrete media analysis - much as Mackenzie did with Simondon already in Transductions.

Tuesday, 15 June 2010

New Materialism abstracts

For the forthcoming 21st June event New Materialisms and Digital Culture, here are the abstracts, which promise very interesting cross-disciplinary perspectives on what new materialism is in the context of the various practices and arts of digital culture.


David M. Berry: Software Avidities: Latour and the Materialities of Code

The first difficulty in understanding software is located within the notion of software/code itself and its perceived immateriality. Here it is useful to draw an analytical distinction between ‘code’ and ‘software’. Throughout this paper I shall be using code to refer to the textual and social practices of source code writing, testing and distribution. In contrast, I would like to use ‘software’ to include products, such as operating systems, applications or fixed products of code such as Photoshop, Word and Excel, and the cultural practices that surround their use. This further allows us to think about hacking as the transformation of software back into code for the purposes of changing its normal execution or subverting its intended (prescribed) functions. However, this difficulty should not mean that we stay at the level of the screen, so-called screen essentialism, nor at the level of information theory, where the analysis focuses on the way information is moved between different points disembedded from its material carrier, nor indeed at the level of a mere literary reading of the textual form of code. Rather, code needs to be approached in its multiplicity, that is, as a literature, a mechanism, a spatial form (organisation), and as a repository of social norms, values, patterns and processes. In order to focus on the element of materiality I want to use Latour's notion of the 'test of strength' to see how the materiality of code, its obduracy and its concreteness, are tested within computer programming contests. To do this I want to look at two case studies: (1) the Underhanded C Contest, which asks the programmer to write code that is as readable, clear, innocent and straightforward as possible, and yet must fail to perform its apparent function.
To be more specific, it should do something subtly evil; and (2) the International Obfuscated C Code Contest, a contest to write the most obscure/obfuscated C program possible, one that is as difficult to understand and follow (through the source code) as possible. By following the rules of the contest, and by pitting each program against the others -- each program must be made available for the judges (as well as the other competitors and the wider public, through open-sourcing the code) to compile and execute -- the code is then shown to be material, providing it passes these tests of strength.
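To give a flavour of the kind of entry the Underhanded C Contest rewards, here is a small hypothetical illustration in the contest's spirit (my own sketch, not an actual submission): a password check that reads as careful, bounded string comparison, yet quietly accepts any input that happens to be a prefix of the stored password -- including the empty string.

```c
#include <string.h>

/* Looks like a prudent, bounded comparison -- but the bound is the
 * length of the *attacker-supplied* input, so any prefix of the
 * stored password (including "") is accepted as a match. */
int check_password(const char *stored, const char *input)
{
    return strncmp(stored, input, strlen(input)) == 0;
}
```

The code is readable and innocent-looking; the "evil" lies in a single argument choice. It is exactly this kind of surface clarity concealing a failed function that the contest stages as what Berry, via Latour, calls a test of strength of code's materiality.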

Rick Dolphijn: The Intense Exterior of Another Geometry
Starting with several examples from contemporary ‘animal architecture’, this paper proposes a search for how anything ‘surrounding’ the organic body (a box, a piece of cloth, a house), in the alliance it creates with this body, is mutually united with it. This brings us to the practices central to this paper, as they concern envisioning our “urban exoskeleton”, as DeLanda calls it, and how this sets forth the emergence of a “future people” as Proust already foresaw it. In other words, our interests lie with how life comes into being in its intense relationships with urban morphology. We then need to accept the definition of life offered to us by Christopher Alexander, who considers life “a most general system of mathematical structures that arises because of the nature of space” (2004: 28). Speculation on the future lives (unconsciously) hidden in the morphogenetic qualities of urban form today should then be pursued in terms of the (aesthetic) principles of creating space. Conceptualizing these principles in the Occident and in the Orient, we allow ourselves to conceptualize a difference between two wholly other urban bodies, of which especially the latter (the Oriental) has hardly received any attention in contemporary theory. This Oriental ‘city of axonometric vision’, as we develop it next to the (Occidental) ‘city of linear perspective’, allows us to think the urban exoskeleton in terms of a multiplicity of dynamic surfaces (as opposed to a centralized pattern), through an “equal-angle see-through” (dengjiao toushi in Chinese) (as opposed to a linear perspective) and through a non-dualist felt-togetherness. It allows us to think the creative dynamics of unlimited growth as a new proposition of what bodies can do.

Eleni Ikoniadou: Transversal digitality and the relational dynamics of a new materialism
The relationship between digital technology and matter has preoccupied media and cultural theorists for the last two decades. During the 90s it was articulated through a celebration of the disembodied, immaterial and probabilistic properties of information (cybercultural theory). More recently, it has been asserted through a reliance on sensory perception for the construction of a predominantly observable, otherwise void, digital space (digital philosophy). However, alternative materialist accounts may be able to offer more dynamic ways of understanding the heterogeneity, materiality and novelty of digital culture (Kittler, 1999; Mackenzie, 2002; Fuller, 2005; Munster, 2006). Following in their footsteps, this presentation will aim to rethink the ontological status of the digital as immanent to the flows of a ‘new materialism’. The latter is understood as a transversal process that cuts across seemingly distinct fields and disciplines, such as the arts and sciences, establishing new connections between them. New materialism, then, becomes a concept and a method proper for investigating digital media and their tendency to bring together different aspects of the world in new ways. The paper discusses how an abstract materialist new media theory can enable transversal relations between science studies, philosophy and media art, as well as between the actual and the virtual dimensions of reality, allowing the emergence of heterogeneous digital assemblages of material, aesthetic and scientific combination.

Adrian Mackenzie: Believing in and desiring data: 'R' as the next 'big thing'
How could materialist analysis come to grips with the seeming immateriality of data network media? This paper attempts to think through some of the many flows of desire and belief concerning data. In the so-called 'data deluges' generated by the searches, queries, captures, links and aggregates of network media, key points of friction occur around sharing and pattern perception. I focus on how sharing and pattern perception fare in the case of the scripting language R, an open source statistical 'data-intensive' programming language heavily used across the sciences (including the social sciences), in public and private settings, from CERN to Wall Street and the Googleplex. R, it is said, is a 'next big thing' in the world of analytics and data mining, with thousands of packages and visualizations, and hundreds of books and publications (including its own journal, /R Journal/) appearing in the last few years. In this activity, we can discern vectors of belief and desire concerning data. The tools and techniques developed in R can be seen as both intensifying data and, at times, making the contingencies of data more palpable.

Stamatia Portanova: The materiality of the abstract (or how movement-objects ‘thrill’ the world)
Gilles Deleuze and Alfred N. Whitehead have defined the ‘virtual’ not as an unreal simulation but as a real potential, an idea (respectively conceived by them as a ‘mathematical differential’ or a ‘mathematical relation’) around which an actual fact takes shape. Drawing on Deleuze and Whitehead's concepts of 'virtuality', this paper addresses the possibility of a materialist approach that is able to take into account the virtuality of matter, i.e. how the abstract dimension of ideas (‘the mind’, ‘thought’) possesses its own consistence. The concrete object analyzed to exemplify this approach is the relation between digital culture, digital technology and movement, from which something like 'virtual movement-objects' emerges. More specifically, the paper explores the use of several technologies of movement creation and distribution (Motion Capture, digital video editing, the Internet) in mass-media environments such as pop music clips and YouTube amateur videos, dance video games and choreography web sites. The main objective is to understand how these applications generate and replicate what will be defined as ‘virtual movement-objects’, digitally generated dance steps that are widely imitated and adapted. From an ‘abstractly materialist’ point of view, the numerical data produced through the digitalization of dance will be considered as virtual movement ideas with a potential to be repeatedly actualized (in videos, live events, games). These ideas have the possibility of infinite reanimation: the same step can be endlessly repeated, becoming a dance of graphic shapes or 3D images, but also a movement across people and cultures. This definition also draws on Gabriel Tarde and Bruno Latour’s understanding of imitation. Imitation, in Latour’s words, weaves a sort of contagious ‘behavioural network’ based on the return of 'virtual centres of gravity’, ideal patterns attracting a repetition of movements that ‘look the same’ but are always different and unpredictable.
This paper therefore explores how, despite their designed nature, movement-objects appear as open and creative movement ideas able to autonomously circulate in transversal social networks and generate unexpected rhythmic behaviours. The diffusion of Michael Jackson’s Thriller dance on YouTube, in Sims animations or in the choreographed performance of 1500 detainees of the Cebu Provincial Detention and Rehabilitation Center (Philippines) can be considered one of the most famous examples of how dance steps have become virtual movement-objects to be infinitely actualized.

Anna Powell: Affections in their pure state? The digital event as immersive encounter
Digital video offers a distinctively immersive encounter. In its early analogue days, video art seemed to validate Deleuze’s diagnosis of ‘electronic automatism’ (Cinema 2, 1985). Its characteristics include ‘omnidirectional space’, framing which is ‘reversible and non-superimposable’ and the unpredictable motion of ‘perpetual reorganisation’. Spatial composition becomes an opaque ‘table of information’ on which data ‘replaces nature’. Some of Deleuze’s anxieties for the (then new) medium have been fulfilled by surveillance and the mainstream spectacle of CGI, as in the ‘gigantism’ of Avatar’s 3D optical illusionism. Yet this ‘original regime of images and signs’ has also proved its credentials for the schizo will to art. One obvious formal distinction between cine and digital video is editing. Video editing does not operate by cutting and splicing footage but by ‘dragging and dropping’ sections of film on top of each other. Rather than being excised by cuts to produce temporal elision, uploaded video clips are pulled down on top of a ‘master’. An editing decision can be reversed by using a sliding tool to reveal that the first layer of images is only temporarily overlaid by another. Digital editing thus increases the density and depth of the plane of images by potentially limitless conjunctive synthesis. Deleuze argues that without a sense of the out-of-frame, time and space are overwhelmingly immanent in electronic automatism. This apparent removal of the out-of-frame and the elsewhere leads instead to an intensive meld of brain and screen that can move the mind/screen in schizoanalytic directions. Video art’s preference for gallery installation or live performance with VJ-ing and music rather than the cinema screen offers further haptic immersion in the medium. Digital videos that repudiate both the televisual and the cinematic regimes can express what video artist Mattia Casalegno calls ‘affections in their pure state’.
The aesthetic properties of digital video bring affect, perception and time closer together. What are the implications of this apparent removal of the gap between the actual and the virtual? If, as Deleuze suggests, the brain is the screen, what kind of schizo images and thoughts might future digital art unfold? Starting from the overt distinctions between cine and video, this paper investigates the impact of the digital body without organs. It references work by video artists of specifically Deleuzian inspiration whose works express a new materialist intent.

Iris Van der Tuin: A Different Starting Point, A Different Metaphysics: Reading Bergson and Barad Diffractively
This paper provides an affirmative feminist reading of the philosophy of Henri Bergson by reading it through the work of Karen Barad. Adopting such a diffractive reading strategy enables feminist philosophy to move beyond discarding Bergson for his apparent phallocentrism. Feminist philosophy finds itself double bound when it critiques a philosophy for being phallocentric, since the set-up of a Master narrative comes into being with the critique. By negating a gender-blind or sexist philosophy, feminist philosophy only gets to reaffirm its parameters and setting up a Master narrative costs feminist philosophy its feminism. I thus propose and practice the need for a different methodological starting point, one that capitalizes on “diffraction.” This paper experiments with the affirmative phase in feminist philosophy prophesied by Elizabeth Grosz, among others. Working along the lines of the diffractive method, the paper at the same time proposes a new reading of Bergson (as well as Barad), a new, different metaphysics indeed, which can be specified as onto-epistemological or “new materialist.”

Tuesday, 8 June 2010

Affect, software, net art (or what can a digital body of code do-redux)

After visiting the Manchester University hosted Affective Fabrics of Digital Cultures-conference I thought for a fleeting second that I had discovered affects: it's the headache that you get from too much wine, and the ensuing emotional states inside you trying to gather your thoughts. I soon discovered that this is a very reductive account, of course -- and in a true Deleuzian spirit I was not ready to reduce affect to such emotional responses. Although, to be fair, a hangover is a true state of affect -- far from emotion -- in its uncontrollability and deep embodiment.

What the conference offered, in addition to good social fun, was a range of presentations on a topic that is defined in so many differing ways: whether by conflating it with "emotions" and "feelings", or by trying to carve out the level of affect as a pre-conscious one; from a wide range of topics on affective labour (Melissa Gregg gave a keynote on white-collar work) to aesthetic capitalism (Patricia Clough, for example), which in a more Deleuzian spirit insisted on the non-representational. (If the occasional, affective reader is interested in a short but well-summarizing account of differing notions of affect to guide his/her feelings about the topic, have a look at Andrew Murphie's fine blog posting here - good theory topped up with a cute kitty.)

My take was to emphasise the non-organic affects inherent in technology -- more specifically software, which I read through a Spinozian-Uexküllian lens as a forcefield of relationality. Drawing on, for example, Casey Alt's forthcoming chapter in Media Archaeologies (coming out later this year/early next year), I concluded with object-oriented programming as a good example of how affects can be read as part of software as well, so that the technical specificity of our software-embedded culture reaches out to other levels. Affects are not states of things, but the modes in which things reach out to each other -- and are defined by those reachings out, i.e. relations. I was especially amused that I could throw in the one-liner of "not really being interested in humans anyway" -- even better would have been "I don't get humans or emotions", but I shall leave that for another public talk. "I don't do emotions" is another of my favourites, one that will end up on either a t-shirt or an academic paper.

The presentation was a modified version of a chapter that is just out in Simon O'Sullivan and Stephen Zepke's Deleuze and Contemporary Art-book, even if in that chapter the focus is more on net and software art. I am going to give the same paper at the Amsterdam Deleuze-conference, but as a teaser for the actual written chapter, here is the beginning of that text from the book...

1 Art of the Imperceptible

In a Deleuze-Guattarian sense, we can appreciate the idea of software art as the art of the imperceptible. Instead of representational visual identities, a politics of the art of the imperceptible can be elaborated in terms of affects, sensations, relations and forces (see Grosz). Such notions are primarily non-human and exceed the modes of organisation and recognition of the human being, whilst addressing themselves to the element of becoming within the latter. Such notions, which involve both the incorporeal (the ephemeral nature of the event as a temporal unfolding instead of a stable spatial identity) and the material (as an intensive differentiation that stems from the virtual principle of creativity of matter), incorporate ‘the imperceptible’ as a futurity that escapes recognition. In terms of software, this reference to non-human forces and to imperceptibility is relevant on at least two levels. First, software is not (solely) visual and representational, but works through a logic of translation. But what is translated (or transposed) is not content but intensities, information that individuates and in-forms agency; software is a translation between the (potentially) visual interface, the source code and the machinic processes at the core of any computer. Secondly, software art is often not even recognized as ‘art’ but is defined more by the difficulty of pinning it down as a social and cultural practice. To put it bluntly, quite often what could be called software art is reduced to processes such as sabotage, illegal software actions, crime or pure vandalism. It is instructive in this respect that in the archives of the Runme.org software art repository the categories contain fewer references to traditional terms of aesthetics than to ‘appropriation and plagiarism’, ‘dysfunctionality’, ‘illicit software’ and ‘denial of service’, for example. One subcategory, ‘obfuscation’, seems to sum up many of the wider implications of software art as resisting identification.[i]

However, this variety of terms doesn't stem from a merely deconstructionist desire to unravel the political logic of software expression, or from the archivist's nightmare à la Foucault/Borges, but from a poetics of potentiality, as Matthew Fuller (2003: 61) has called it. This is evident in projects like the I/O/D Webstalker browser and other software art projects. Such a summoning of potentiality refers to the way experimental software is a creation of the world in an ontogenetic sense. Art becomes ‘not-just-art’ in its wild (but rigorously methodological) dispersal across a whole media ecology. Indeed, it partly gathers its strength from the imperceptibility so crucial for a post-representational logic of resistance. As writers such as Florian Cramer and Inke Arns have noted, software art can be seen as a tactical move through which to highlight the political contexts, or subtexts, of ‘seemingly neutral technical commands.’ (Arns, 3)


Arns’ text highlights the politics of software and its experimental and non-pragmatic nature, and resonates with what I outline here. Nevertheless, I want to transport these art practices into another philosophical context, more closely tuned with Deleuze, and others able to contribute to thinking the intensive relations and dimensions of technology such as Simondon, Spinoza and von Uexküll. To this end I will contextualise some Deleuzian notions in the practices and projects of software and net art through thinking code not only as the stratification of reality and of its molecular tendencies but as an ethological experimentation with the order-words that execute and command.


The Google-Will-Eat-Itself project (released 2005) is exemplary of such creative dimensions of software art. Authored by Ubermorgen.com (featuring Alessandro Ludovico vs. Paolo Cirio), the project is a parasitic tapping into the logic of Google and especially its AdSense program. By setting up spoof AdSense accounts the project is able to collect micropayments from the Google corporation and use that money to buy Google shares – a cannibalistic eating of Google by itself. At the time of writing, the project estimated that it would take 202 345 117 years until GWEI fully owns Google. The project works as a bizarre intervention into the logic of software advertisements and the new media economy. It resides somewhere on the border of sabotage and illegal action – or what Google in their letter to the artists called ‘invalid clicks.’ Imperceptibility is the general requirement for the success of the project, as it tries to use the software and business logic of the corporation by piggy-backing on the latter's modus operandi.


What is interesting here is that in addition to being a tactic in some software art projects, the culture of software in current network society can be characterised by a logic of imperceptibility. Although this logic has been cynically described as ‘what you don't see is what you get’, it is an important characteristic identified by writers such as Friedrich Kittler. Code is imperceptible in the phenomenological sense of evading the human sensorium, but also in the political and economic sense of being guarded against the end user (even though this has been changing with the move towards more supposedly open systems). Large and pervasive software systems like Google are imperceptible in their code but also in the complexity of the relations they establish (which is what GWEI aims to tap into). Furthermore, as the logic of identification becomes a more pervasive strategy contributing to this diagram of control, imperceptibility can be seen as one crucial mode of experimental and tactical projects. Indeed, resistance works immanently to the diagram of power and, instead of refusing its strategies, adopts them as part of its tactics. Here, the imperceptibility of artistic projects can be seen as resonating with the micropolitical mode of disappearance and what Galloway and Thacker call ‘tactics of non-existence’ (135-136). Not being identified as a stable object or an institutional practice is one way of creating vacuoles of non-communication through a camouflage of sorts. Escaping detection and surveillance becomes the necessary prerequisite for various guerrilla-like actions that stay ‘off the radar.’

Wednesday, 12 May 2010

Culture Synchronised: Remixes with Nick Cook and Eclectic Method


The room Hel 252 is starting to have good karma as the remix classroom at Anglia Ruskin. Not because it's equipped with computers, editing equipment or such, but because it is starting to have a good track record as the room where we have hosted both the screening and discussion of RIP: Remix Manifesto with Brett Gaylor, and now also a discussion of the work of Eclectic Method -- one of the most well-known remix acts.

Geoff Gamlen, a founding member of Eclectic Method, visited us in the context of Professor Nicholas Cook's talk on musical multimedia. Professor Cook continued themes that were already addressed in his 1998 book on the topic and are now followed up in a new book project that deals with performance. With a full room of excited audience, Cook gave a strong presentation on hot topics in musicology and the need to move into new areas of investigation, as well as showing how such ideas relate to the wider field of cultural production in the digital age. Remix culture is not restricted to music, but such examples as Eclectic Method (or, we could as well mention, Girl Talk) are emblematic of software-driven cultural production that ties contemporary culture to early 20th-century avant-garde art practices, and shows how the political economy of copyright/copyleft, the participatory and collaborative modes of sharing and producing, and the aesthetics of image/sound collages and synchronisations are all involved in this wider musical assemblage. What Cook argued in terms of musicological approaches -- suggesting, in my own words, "the primacy of variation" -- was apt. Such performance practices as Eclectic Method's are important in trying to come up with an up-to-date understanding of what performance is, what the author is, and how performance practices relate to wider media cultural changes that are as much about the sonic as they are about pop cultural aesthetics -- hence the examples on Tarantino were apt in the presentation. We need to move on (whether in terms of epistemic frameworks or legal ones) from the 19th-century romantic notion of the Creator as the source of the artwork to what I would suggest (in a kind of a Henry Jenkins way) is an alternative 19th century of folk cultures, where sharing and participating was the way culture was distributed, in continuous variation.
Despite the growing number of sceptics from Andrew Keen to Jaron Lanier (and, in a much more interesting fashion, Dmytri Kleiner), who rightly remind us that Web 2.0 is not only a celebration of amateur creativity and sharing but also a business strategy that compiles free labour through website bottlenecks into privatised value, I would suggest that there is a lot to learn from such practices of creation as remixing, and from their implications for a theoretical understanding of musical and media performance.

Eclectic Method's work...ranges from political remixes...


...to pop/rock culture synchronisations...

Tuesday, 6 April 2010

Choice, self-regulation, security and other characteristics that make us desire to see less


"Not long ago it would have been an absolutely absurd action to purchase a television or acquire a computer software to intentionally disable its capabilities, whereas today's media technology is marketed for what it does not contain and what it will not deliver." The basic argument of Raiford Guins' Edited Clean Version is striking in its simplicity and aptness: my copy of the book is now filled with exclamation marks and other scribblings in the margins that shout how much I loved it. At times dense but elegantly written, it is the kind of book that tempts me to say this is the direction media studies should be going -- if that did not sound a bit too grand (suitable for a blurb on the back cover, perhaps!).

I shall not attempt a full-fledged review of the book, but just flag that it's an important study for anyone who wants to understand processes of censorship, surveillance and control. Guins starts from a theoretical set that contains Foucault's governmentality, Kittler's materialism and Deleuze's notion of control, and breathes concrete specificity into the latter, making it a wonderful addition to the media studies literature on contemporary culture. At times perhaps a bit repetitive, it nonetheless delivers a strong sense of how power works through control, which works through technological assemblages that organize time, spatiality and desire. For Guins, media is security (even if embedding Foucault's writings on security would have been spot on in this context) -- entertainment media is so infiltrated by the logic of blocking, filtering, sanitizing, cleaning and patching (the book's chapters) that I might even have to rethink my own idea of seeing media technologies as Spinozian bodies defined by what they can do... Although, in a Deleuzian fashion, control works through enabling. In this case, it enables choice (even if reducing freedom to a selection from pre-defined, preprogrammed articulations). Control is the highway on which you are free to drive as far as you like, and to many places, but it still guides you to destinations. Control works through destinations, addresses -- and incidentally, it's addresses that structure, for example, Internet "space".

Guins demonstrates how the family remains a focal point of media, but through new techniques and technologies. Software is at the centre of this regime -- software such as the V-Chip, which helps parents to plan and govern their children's TV consumption. Guins writes: "The embedding of the V-Chip within television manifests a new visual protocol; it makes visible the positive effects of television that it enables: choice, self-regulation, interaction, safe images, and security." What is exciting about this work is how it deals with hugely important political themes and logics of control, yet does so immanently with the technological platform it discusses. Highly recommended, and thumbs up.

Wednesday, 31 March 2010

Nick Cook talk on Beyond reference: Eclectic Method's music for the eyes

Another ArcDigital and CoDE talk coming up...

Professor Nicholas Cook, Cambridge University:
Beyond reference: Eclectic Method's music for the eyes
Date: Tuesday, 11 May 2010
Time: 17:00 - 18:15
Location: Anglia Ruskin University, East Road, Cambridge, room Hel 252

Screen media genres from Fantasia (1940) to the music video of half a century later extended the boundaries of music by bringing moving images within the purview of musical organisation: the visuals of rap videos, for example, are in essence just another set of musical parameters, bringing their own connotations into play within the semantic mix in precisely the same way as do more traditional musical parameters. But in the last two decades digital technology has taken such musicalisation of the visible to a new level, with the development of integrated software tools for the editing and manipulation of sounds and images. In this paper I illustrate these developments through the work of the UK-born but US-based remix trio Eclectic Method, focussing in particular on the interaction between their multimedia compositional procedures and the complex chains of reference that result, in particular, from their film mashups.

Professor Nicholas Cook is currently Professor of Music at the University of Cambridge, where he is a Fellow of Darwin College. Previously, he was Professorial Research Fellow at Royal Holloway, University of London, where he directed the AHRC Research Centre for the History and Analysis of Recorded Music (CHARM). He has also taught at the University of Hong Kong, University of Sydney, and University of Southampton, where he served as Dean of Arts.

He is a former editor of the Journal of the Royal Musical Association and was elected a Fellow of the British Academy in 2001.

http://en.wikipedia.org/wiki/Nicholas_Cook

The talk is organized by the Cultures of the Digital Economy Institute at Anglia Ruskin University and the Anglia Research Centre in Digital Culture (ArcDigital).

The talk is free and open for all to attend.

Monday, 1 March 2010

Does Software have Affects, or, What Can a Digital Body of Code Do?

I am going to attach here an abstract I submitted for a conference today -- the Deleuze Studies conference in Amsterdam. It's something I did for a book coming out soonish, on Deleuze and Contemporary Art:

Can software as a non-human constellation be said to have "affects"? The talk argues that just as we need a mapping of the various affects of organic bodies-in-relation in order to understand the modes of control, power and production in the age of networks, we need a mapping of the biopolitics of software and code too. If we adopt a Deleuze-Spinozian approach to software, we can shift focus from the body of code as a collection of algorithms to bodies interacting and affecting each other. What defines a computational event? The affects it is capable of. In the same sense that the tick is defined through its affects and potentials for interaction, software is not only a stable body of code but an affordance, an affect, a potentiality for entering into relations. This marks a move from the metaphoric 1990s cyberdiscourse, which adopted Deleuzian terms like the rhizome, to a different regime of critique that works through immanent critique on the level of software. The talk works through software art to demonstrate the potentials of thinking software not as an abstract piece of information but as processes of individuation (Simondon) and interaction (Deleuze-Spinoza). A look at software practices and discourses around net art and related fields offers a way of approaching the language of software as a stuttering of a kind (Jaromil). Here dysfunctionalities turn into tactical machines that reveal the complex networks software is embedded in. Software spreads and connects into the economics, politics and logics of the control society as an immanent force of information, understood in the Simondonian sense. The affects of software do not interact solely on the level of programming, but act in multiscalar ecologies of media that are harnessed in various hacktivist and artist discourses concerning the politics of the Internet and software.

Encountering (only as a website, though) the Sonicity installation project today, I continued thinking about this. The project turns light, humidity and other environmental data -- including the presence of people -- into input for algorithmic sonification through Max/MSP, and further into visualisation.

What intrigues me in this is the process of transformation and transposition across various sensory regimes: translations from input into data, and further into sound, image, and so on. This connects for me to considerations of affect (bodies in relationality, a variety of heterogeneous bodies) as well as to the materiality of code and data (especially their becoming sonorous and visible, and hence touching human bodies directly too). "The changing data is what affects what you see and experience. Live XML feeds are coming from the real time sensors.. The sensors monitor temperature, sounds, noise, light, vibration, humidity, and gps. The sensor network takes a constant stream of data which is published onto an online environment where each different interface makes representations of the XML." (Sonicity website).
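To make this chain of translations a bit more concrete, here is a minimal sketch of the kind of mapping such an installation performs: sensor readings arrive as an XML feed and are scaled into sound-control parameters. This is not Sonicity's actual code or feed format -- the tag names, sensor ranges and parameter choices below are all invented for illustration.

```python
import xml.etree.ElementTree as ET

# A hypothetical snapshot of the kind of live XML feed the Sonicity
# site describes (tag names and values invented for illustration).
FEED = """
<sensors>
  <reading type="temperature" value="21.5"/>
  <reading type="humidity" value="0.62"/>
  <reading type="light" value="300"/>
</sensors>
"""

def scale(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly map x from one range to another, clamping to the input range."""
    x = max(in_lo, min(in_hi, x))
    t = (x - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def sonify(feed_xml):
    """Translate sensor readings into sound-control parameters."""
    readings = {r.get("type"): float(r.get("value"))
                for r in ET.fromstring(feed_xml)}
    return {
        # temperature (0-40 C) drives pitch (MIDI notes 40-100)
        "pitch": scale(readings["temperature"], 0, 40, 40, 100),
        # humidity (0-1) drives amplitude (0-1)
        "amplitude": scale(readings["humidity"], 0, 1, 0, 1),
        # light level (0-1000 lux) drives a filter cutoff (100-8000 Hz)
        "cutoff_hz": scale(readings["light"], 0, 1000, 100, 8000),
    }

print(sonify(FEED))
```

The point of the sketch is only to show how flat the translation is: each "sensory regime" becomes one numeric axis, remapped onto another. In practice such parameters would be streamed into an environment like Max/MSP rather than printed.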

Naturally, such transpositions could be connected to earlier avant-garde synaesthesia; László Moholy-Nagy's explorations into the interconnectedness of sound with visual regimes are exemplary here (see Douglas Kahn's Noise, Water, Meat, pp. 92-93), especially when it is made clear that synaesthesia is not only an aesthetic category but irreducibly laboratorial. Such synthetic processes, which make us think about the interrelations of heterogeneous sensations and their sources, work through the new technologies and sciences of sound and perception. Indeed, asking whether code/software has affects is no sillier a question than "I wonder how your nose will sound" (Moholy-Nagy).

Wednesday, 17 February 2010

Operational Management of Life


Management of life -- in terms of processes, decisions and consequences -- is probably an emblematic part of life in post-industrial societies. Increasingly, such management takes place not only on the level of individuality but of dividuality -- i.e. managing the data clouds, traces and avataric transpositions of subjectivity in online environments. This is the context in which J. Nathan Matias' talk on operational media design made sense (among other contexts, of course), and it provided an apt and exciting example of how, through media design, we are able to understand wider social processes.

Nathan addressed "operationalisation" as a trend that can be incorporated into platforms ranging from SMS to online self-management tools. More concretely, "operational media" can be seen as a management, filtering and decision mechanism that can be incorporated into services and apps of various kinds. Nathan's talk moved from military contexts of "command and control" (the operationalisation of strategic aims into tactical operations) to such apps as Pepsi's blatantly sexist Amp Up Before You Score, which allowed the (male) user to find "correct" and functional responses to a variety of female types. Beyond such examples, Nathan introduced the general idea of computer-assisted information retrieval and management, which to me was a great way of branding a variety of trends as "operational media". He talked about visualisation of data, augmented reality, filtering of data, expert-, crowd- and computer-assisted information gathering, and a variety of other contexts in which the idea works.

"Should I eat this croissant?" -- considering its calories, the workout time I would need to burn it off again, the time available, and so on -- is one example of the operationalisation of decisions in post-Fordist societies of high-tech mobile tools that tap into work and leisure activities.
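The croissant question can be written out as the kind of pre-programmed calculation operational media performs -- a deliberately crude caricature, with every number invented, to show how a lived decision gets reduced to a lookup and a comparison:

```python
# A deliberately crude caricature of "operational media": the decision
# is reduced to a calorie lookup and a time-budget comparison.
# All figures below are invented for illustration.

CALORIES = {"croissant": 410}        # kcal, hypothetical lookup table
BURN_RATE_KCAL_PER_MIN = 10          # assumed workout burn rate

def should_i_eat(snack, free_minutes):
    """Say yes only if the workout needed to burn it off fits the time available."""
    workout_needed = CALORIES[snack] / BURN_RATE_KCAL_PER_MIN
    return workout_needed <= free_minutes

print(should_i_eat("croissant", 60))   # an hour free: the app approves
print(should_i_eat("croissant", 30))   # half an hour: the app says no
```

The streamlining is the point: everything that does not fit the pre-defined parameters (hunger, pleasure, company) simply falls out of the decision.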

Another example is the service offered by Nathan's employer KGB (not the spies, but the Knowledge Generation Bureau; see their recent Super Bowl ad here). The KGB service is one example of mobile-based operational services which, within the character space of an SMS, try to provide accurate answers to specific questions, and hence differ from e.g. search engines.

Of course, one could contextualise "operational media" from a critical theory perspective. Is it a form of app-enhanced behaviourism that not only assumes but strengthens assumptions about the possibility of streamlining complex human actions? Is it a mode of media design that further outsources the management of life to external services? Is it hence a form of biopower of a commercial kind, one that ties in with various processes from the physiological to the cultural (such as labour) and provides design solutions for them? In any case, Nathan's expertise in this field was very enjoyable, and a good demonstration of a scholar/designer working in software studies.

Monday, 25 January 2010

Operational Media: Functional Design Trends Online -guest talk


February ArcDigital talk by J. Nathan Matias

Operational Media: Functional Design Trends Online

Tuesday, February 16, 17.00-18.30, Helmore 252 at Anglia Ruskin, East Road, Cambridge

Two prominent visions have guided the development of Internet technology from its beginning: the never-ending information space of creativity and information; and the networked tool for action. Now that markets for media production and search are saturated and stalling, second generation web tech has shifted focus to media that helps people make decisions and get things done. This lecture provides an introduction to key issues in the information design and software engineering of operational media.

Bio: J. Nathan Matias is a software engineer and humanities academic based in Cambridge, UK. His work focuses on enhancing human capabilities and understanding with digital media. Recent work has included digital history exhibits, work in online documentary, research on visual collaboration, and a visual knowledge startup. He currently spends half of his time as a software engineer on SMS information services for the Knowledge Generation Bureau, and half on digital media projects.

All welcome!