Feb 27, 2013
Ideology of dreams
Let's imagine a movie which would begin with "the end." Then — closing titles on the screen, an emotional soundtrack in the background. Banal shots of people in a cinema theatre, close shots of faces en face in a pale light, then scenes of the audience leaving their comfy chairs, slowly moving towards the exit, dispersing in the street. And only then would the usual movie begin — like a continuous, naturally choreographed movement of the characters (the "screen-people") — moving away from the screen. Movement with respect to the screen. The screen as a technology of an impossible transition.
Speaking of transitions — Inception has an interesting episode: Cobb (DiCaprio) and Ariadne (Ellen Page) are talking at a table in a street cafe. Ariadne is told that she is dreaming and wakes up in shock, but they immediately resume the conversation, as if the sensations of waking up — or, actually, the whole phenomenon of waking up — were some sort of inevitable (but not necessary) artifact of human life, almost at its mundane, physi(ologi)cal level. Jumping to and from the dream here is a question of technology, a technically solvable problem. This technology of transition functions as something essential, but not important (just as, during immersion in the screen, the surrounding physicality loses its importance, although we still need it). The economy is clear: without the technology of transition there would be no access to dreams, but the transition must be ignored and repressed (controlled, ergonomic, effective), otherwise there would be no chance to smuggle consciousness into the dream. And on top of that, things need to be kept under control. And that's a hell of a job.
So, there must be some driving story, a thread which ties and connects the alternating experiences. The thread which holds the realities together. Every watchable Hollywood movie is familiarly ideological: jobs are being done, people are being killed, some action above and below the law, some sex (or the absence of sex, which The X-Files elevated into a state of the art), and at the end of the day peace returns to the institution (family, state, country, government). The status quo and punto stabile are re-established.
Even amidst all of that mess, during the hardcore transitions between states of dreaming and being awake, the protagonist almost never loses his sense of orientation — both identity-wise and reality-wise. In the world of Inception, understandings such as "my family," "my country," "a job," "value," "money," "world," "real" — all of them remain unchanged, unquestioned, and unquestionable. The protagonist shows no hesitation in identifying them. All the constructs of political and social realities — the framework, the base — remain intact.
————
I wonder how often we dream about politics. Do you dream about topics such as, say, the Middle East crisis? Or NATO? Do you retain political stories in your dreams? Let's make it a rhetorical question.
Feb 26, 2013
Hacking Times
It's about the time, and the way we feel about it.
One of the interesting ways to reflect on how "time" is embodied in technologies is to imagine what it would mean to experience this embodiment. The very moment we encounter the word "time" as an utterance, text, or thought, it opens up a whole totality of meanings and possibilities: it can be the date on a wall calendar (which is found — more often than not — on the screen), a clock (which is found — more often than not — on the screen), an old postcard (which is f. . .), etc. It must have — and it most often does — some particular meaning, one of countless others, even if it's just a romantic generalization: "oh, time... ."
What are we counting when we are "counting time"? That is a good question. What does this imaginary "tick" refer to? It's not the mechanical click of a clock. It's barely a sound anymore. The vibrations are not detectable, because it's a signal in a circuitry. It has no commanding physical presence, unlike the relics of an era which has just passed away: an old mechanical clock, ticking, banging and squeaking the hours out 24/7, a bell tower in the middle of a town, a wrist watch. The sceptre * of time has been confidently delegated to an electric signal in a "digital world."
* Peculiarly enough, sceptre is a misspelling away from spectre. The latter could serve as an interesting substitute in another discussion; maybe I'll return to it in another post about ghosts.
Now let's see what would happen (within the strict confines of an imaginary experiment, of course) if the time (i.e., the official, scientific Time) in a particular social context were changed without people being aware of it. Even if it's a relatively slight change. Imagine that you wake up one morning and every medium confirms a bad feeling that it is... yesterday. Let's say yesterday was Friday. You expect to wake up into a lazy Saturday morning, but instead you're awakened by an alarm clock. It's 7:30 am. You can't believe your eyes and check the date on the iPhone. "It cannot be!" — you get out of bed to debunk this glitch, but the TV, the Internet, even the old electronic alarm clock seem to have conspired against you: it's Friday, again. You finally lose your sense of righteousness when you realize that you're rushing to work, like everyone else, like you normally would on a Friday morning. Soon you find out that everyone around shares the same feeling of a "glitch." Even the pictures you really took yesterday have the proper dates assigned to them — it seems that time (again, the official, scientific, mediated time as a universal numerical representation, the "symbolic" time), together with the whole digital time grid, has suddenly taken a 24-hour step back. Nonetheless, you don't protest too much, because there is no evidence to prove your point. You just have a feeling about it. Or maybe you protest and get hysterical — it doesn't matter. What matters here is that our imaginary time shift occurs on top of, or in the background of... something. This background already has many names (and counting): reality, the Real, materiality, subjectivity, the World, nature, the physical. That "something" is always being referred to and coordinated with. There is this peculiar link between universal time and "something."
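The internal consistency of such a shift can be sketched in a few lines of code (a toy illustration with made-up records and dates, not a claim about any real system): if every digital timestamp is moved back by exactly 24 hours, the ordering of and intervals between records are untouched, so no digital trace can testify against the new grid — only the absolute labels change.

```python
from datetime import datetime, timedelta

# Toy model of the thought experiment: every digital record (alarm clock,
# photo date, TV schedule) carries a timestamp, and the whole "time grid"
# is shifted back by exactly 24 hours.
SHIFT = timedelta(hours=-24)

records = [
    ("alarm clock", datetime(2013, 2, 23, 7, 30)),          # a Saturday morning
    ("photo taken yesterday", datetime(2013, 2, 22, 18, 5)),
    ("tv schedule entry", datetime(2013, 2, 23, 9, 0)),
]

shifted = [(name, t + SHIFT) for name, t in records]

# Ordering and intervals between any two records survive the shift,
# so the shifted grid is internally consistent.
intervals = [b - a for (_, a), (_, b) in zip(records, records[1:])]
shifted_intervals = [b - a for (_, a), (_, b) in zip(shifted, shifted[1:])]
assert intervals == shifted_intervals

# Only the absolute labels change: Saturday has become Friday, again.
print(records[0][1].strftime("%A"), "->", shifted[0][1].strftime("%A"))
```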
It is possible to shatter this link not by questioning whether the real events are "real," but by wondering whether the real events are still relevant. We're not just counting something when we mean "time," we're counting on something which does that instead of us. There are no everyday observations of nature to rely on; these observations are carried out by groups of specialists and scientists, who count on scientific methods. For someone who is far from this particular scientific area, universal time is something given as trusted knowledge. And like all knowledge, it — in its digital form — is, technically, "hackable." It can be altered, even if that is a highly difficult and complex venture. At least, within a certain scale. Here I caught myself thinking about The Truman Show, only in my version Truman is living in the real world with... the wrong time. Say, he would normally read this kind of newspaper every morning:
1853, 19th century. And it's ok. Until the day the poor chap discovers that it is actually 2013 he's living in: same world, different dates. If my imaginary scenario were a Hollywood movie, it would end with the protagonist returning to normality and accepting the right time, the right sexual partner, the right thinking, etc. But it is not a Hollywood movie, it's my line of thought, and I'm ending it with a promise to return and take care of further developments.
Feb 25, 2013
Media archeology of enjoyment
The Genesys system, developed by Dr. Ron Baecker at MIT's Lincoln Laboratory in 1967-69, was one of the world's first interactive systems for real-time animation. Genesys utilizes interactive animation that exploits the potentialities of direct graphical interaction. Because each picture in the animation sequence drives the algorithms that generate the next one (the so-called "dynamic display"), the process is referred to as "picture-driven animation".
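The "picture-driven" principle — each picture feeding the algorithm that generates the next one — can be sketched as follows. This is a minimal toy in Python, not Genesys itself: the picture representation (a list of points) and the generating rule (a simple translation) are made up purely for illustration.

```python
# Toy sketch of "picture-driven animation": a picture is a list of points,
# and each frame is generated algorithmically from the previous frame.

def next_picture(points, dx=1, dy=0):
    # A made-up generating rule: translate every point of the current picture.
    return [(x + dx, y + dy) for (x, y) in points]

def animate(first_picture, n_frames):
    sequence = [first_picture]
    for _ in range(n_frames - 1):
        # The last picture drives the generation of the next one.
        sequence.append(next_picture(sequence[-1]))
    return sequence

# A hand-drawn "picture": three points sketched on the display.
sketch = [(0, 0), (1, 2), (2, 0)]
film = animate(sketch, 4)
print(film[-1])  # the fourth frame, derived from the third
```

The point of the design is that the animator never specifies frames one by one; the drawing itself, plus a rule, drives the whole sequence.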
It would be hard to imagine a contemporary media user enjoying interaction with the same graphic interface and experiencing the same joyful immersion as the young illustrator in the movie did. The realization that you can control imagery in such a new way must have triggered the feeling that, suddenly, a completely new space for exploration had opened up right in front of you.
From the times of Charles Babbage, explorations in the fields of automated computing and cybernetics were carried out mostly by professionals, *
[* In contemporary techno-cultural lingo, the analogue of "professional" is "geek." It is interesting to note that the connotations of "geek," over the period from the 60s until the "noughties," followed a peculiar dynamic: from negative (sociopathic, asexual, "smart-ass," "programmer") to positive (social, sexual, successful, "CEO").] at least until the 1990s, when the "realistic" graphic interface became popular — i.e., became a norm. The representations on the computer screen became "real enough" to be enjoyed by everyone, not just programmers.
As we know, this realistic turn was met with an explosion of enjoyment, in direct proportion to global computer sales. A radical change in the worldwide economy of enjoyment, and a general facelift for the interfaces: keyboard, drawing pad and e-pen, monitor. Yet they remained intact, without any changes on a fundamental level. Obviously, there must have been reasons for that, and the "enjoyment" factor is not to be missed here. The screen — and not the "margins" around the screen: monitor, input/output devices — received most of the attention. The joy of watching, seeing, looking, gazing, peering, staring, glancing, glaring, peeping. It would be only natural to say that the "picture-driven animation" was, indeed, also an "enjoyment-driven animation" (and "progress").
There was no reason to change the concept of the rectangular computer screen as a "window" metaphor, or to change the concept of a drawing pad as a metaphor of "paper," or the concept of an e-pen as a "pen." The factors of familiarity, convenience, and cognitive accessibility can still guarantee their dominance.
If the first stage (mode? level?) of enjoyment, in the 50s-70s, was "the joy of recognition," then the second stage — the one which still lasts (more or less) — could be named "the joy of immersion." It is hard to ignore that both of them suggest a progression to "the joy of acting." The joy of acting after the new media have been recognized, and after they have been embedded in our being, in our Dasein. Acting under the influence of the new media, with the Heideggerian "you are what you do" in mind.
If there is a direction to the movement of enjoyment, then it is the ideology of representational realism which determines it. The "joy of acting" must be tied to the realism of robotic action. That is a good topic which requires a separate, more elaborate discussion, and I'll put it aside for now.
Returning to my initial surprise: what is actually surprising to see (or, rather, to witness again, to remember) is how enjoyable the early "computer graphic systems" were in their time. Computer-generated visualizations were highly abstract and symbolic (those dotted lines even bear some resemblance to aboriginal art), reduced to a still-recognizable minimum, bound to an aesthetics of visual economy. And in its aesthetic totality, the imagery has a certain raw quality, in the sense of the raw imagery of the cinema avant-garde of the 1900s-30s...
Jan 27, 2013
In Defense of Incalculable: Big Data and the Un-dead Theorist
Quick comment written after reading Ian Steadman's 'Big data, language and the death of the theorist' in Wired UK [http://www.wired.co.uk/news/archive/2013-01/25/big-data-end-of-theory].

A theorist can (and will) die, indeed; that is quite a probable event. S/he can commit suicide as a result of a nervous breakdown, exhaustion, or loss of faith in something essentially humane. A theorist can also die in an accident caused by some crucial miscalculation. And death can be a calculable event as well — when it actually happens. But until then (when?) there is also a potential chance to make some incalculable reflections about this calculability.
It is the theorist who makes a resolution to understand The Data itself instead of competing with data-crunching supercomputers. This concept of calculable realities, which is being propagated and exploited by the dominant contemporary techno-scientific discourse, was (and still is) tormenting many theorists. Wittgenstein was forced to make a paradigmatic U-turn after publishing his 'Tractatus Logico-Philosophicus' and renounce the idea of calculability. Gödel used the language of calculation itself — mathematics — while trying to prove that the result of a correct calculation can be wrong, and thus arrived at his famous theorem.
And there is another aspect of this 'big data crunching ideology': it encloses the discourse within itself by remaining mute about the philosophical problems surrounding some of the fundamental constitutive elements of a techno-scientific reality: 'thinking', 'intelligence', 'reality', 'self', 'representation', etc. Here the premise of incalculability, or 'outside calculability' — i.e., also, 'outside-the-text' — functions not as a reference to some self-sustained Being outside-the-text per se, but rather points to something which can be identified as a potentiality, a human 'spirit'.
To defend the incalculable would also mean to defend the faculty of reflection. The 'death of a theorist' — or, rather, the termination of reflection by rendering it obsolete as a faculty — would signal that the text is finally able to reflect upon itself, that it has become conscious of itself.
Oct 15, 2012
They Say, Jump!
There is a strange and firmly established norm of fusing the willingness to test the boundaries of technology (and of the dominant technological ideology) with the willingness to risk one's own life. Even with Fearless Felix (or any other subject) dead as a result of this test, the boundaries of technology would still have been tested successfully. Where the personal test ends, the test conducted by the Other begins. And bearing in mind that during all these four hours the major part of Baumgartner's performance was orchestrated by the 'command center', the most significant manifestation of his personal will (his personal test) was the actual jump, this tiny moment of choice. But there is a chance that this choice was also a false choice: you have the right to choose from any of the available options, on the condition that you will make the right choice. So, what is the right choice?
And there he was, trapped in the totally ideological and totally mediated spectacle.
Quick comment on a quick comment of a friend, philosopher Ivan Flores Arancibia.
Aug 23, 2012
Between Possibility and Necessity: Get Online with God
| Photo: Geiste Kincinaityte (2012) |
About 1450 European intellectuals began to become aware of technological progress not as a project (. . . this came in the late thirteenth century) but as an historic and happy fact, when Giovanni Tortelli, a humanist at the papal court, composed an essay listing, and rejoicing over, new inventions unknown to the ancients. . . . It was axiomatic that man was serving God by serving himself in the technological mastery of nature. Because medieval men believed this, they devoted themselves in great numbers and with enthusiasm to the process of invention. (White, 199)

Considering the further historical trajectories of Western thought and praxis, this insight obviously resonates with contemporary post-religious discourse. It is servitude to the (Western) "values of democracy" which is now spontaneously associated with techno-scientific progress (e.g., the idea of the democratically positive transparency of hi-tech-savvy societies vs the fundamentalist opacity of "developing" societies; think of the Egyptian revolution and the Arab Spring events).
What deserves our critical attention here is the "self-explanatory" and seamless reversal of this modified Medieval equation — i.e., the false logic of the notion that every technological achievement is an inevitably valuable contribution to democratization. The problem here is the actual lack of an ethical dimension in technology proper, which this equation requires in order to be possible in the first place. The "self-explanatory" presence of an ethical (and thus democratically correct) justification in contemporary post-industrial technology can be paralleled with the "self-explanatory" presence of the Holy Ghost manifesting itself in the technology of the Middle Ages.
The distortion becomes apparent when the two sides of the equation are attributed to the categories of necessity (of technology) and possibility (of God, or of the secular Good). In both cases of the justification of technological servitude, possibility is allowed to function as a (mode of) necessity.
To be continued
Aug 15, 2012
Heidegger and Television
It is still a bit surprising that Heidegger, a major figure in the philosophy of technology, didn't leave any systematic reflections on radio and television, even though he was obviously familiar with both of these technologies from the pre-WW2 era. The fact that mid-20th-century media remained outside Heidegger's theoretical scope might indicate the presence of a "modernist fundamentalism" (and it is the same modernism which found a threatening dimension in the technologies of the post-Industrial revolution). It's quite peculiar that at the end of his life he became interested in television after all... but only as a medium for broadcasting soccer games.
Aug 14, 2012
The Moment
Before trying to answer this question, let us allow the possibility of an idea of "technological time."
This text, for me as its author, is a way (or one might be tempted to say "a practice") to trace the existence of this time and, at the same time, to acknowledge the ability to reflect on it and withdraw from it.
To b.c.
Jul 10, 2012
Birth of Language
Did the ability to talk evolve gradually, or did it appear suddenly, as a result of some genetic mutation? Steven Pinker, or Noam Chomsky? Is language an evolutionary error?
We don't know why we are able to ask a question about our ability.
