It had been a long time since I’d felt surprised by stories. I wondered where the woman who had devoted her life to their study and teaching had gone. I taught writing. It was a good job.
I thought a lot about Milton’s line from his sonnet “On His Blindness”: “They also serve who only stand and wait.” I was waiting.
[And standing in the grocery line. And making babies. And teaching full time.]
But still waiting.
Hypertextual narrative was my way into elit. Clicking through bits of text felt familiar, though of course the narrative fragmentation sets it apart from paper novels. I scribbled on scraps of paper to trace how the lexias interrelated. I liked the puzzle.
But the transformative moment for me as a reader of elit is actually device specific. I had read Twelve Blue once before, on a laptop: outside in the dark, warm summer air buffeting my arms. But I found myself distracted.
A year later, in daytime, my body curled around an iPad like the letter Q, I pulled up Twelve Blue and started to read. The sun was bright. The high-gloss black screen reflected my face onto the surface of Twelve Blue. I was watching myself read.
It was uncanny. That my face should become another legible surface in Joyce’s rivers, his ocean: it’s thematically appropriate. I kept reading, jarred by the shocked looks on my face as I reacted to the story. Heimlich and unheimlich: home but not home.
I noticed that touch became more than navigation; it was also a way to engage the characters. When I was empathizing, I found myself soothing the screen or, at other moments, brushing characters away. Video and sound are thought to be more immersive, but hypertext on the iPad afforded a rich sensory experience. The iPad’s screen resolution, its pixel density, renders Joyce’s blues sumptuous.
Reading on the iPad, touch slid my senses into alignment with my intellectual appraisal. The story made sense without mnemonic aids. I was reading as I had when a child: enraptured.
I take that device consciousness with me as I read other elit. Erik Loyer’s Strange Rain app for iOS is one of my favorite pieces precisely because text, animation, sound, and touch vie to compel me. Games urge us to move quickly, to “level up.” Narrative enjoins us to slow down and feel. That tension between the drive to complete and the drive to linger fosters, in Strange Rain, a synesthetic experience. The game itself approximates synesthesia when, after making it through many screens of the protagonist Alphonse’s observations and worries about his ailing sister, the gamer taps the screen quickly with two fingers: the screen suddenly telescopes, as if catapulting the gamer into it. It feels like 2.5D, but I don’t know that for sure. A jet passes from one edge of the screen to the other. Animated frames explode in jewel tones, and the music rises to a crescendo as your touch, not words, asks Alphonse: will you go inside from the rain? Text responds yes or no as the music strikes an ominous note or chord (depending on which of the three scores you’re running).
Brian Stefans also thinks Loyer’s work is synesthetic, but he finds Loyer’s synesthesia more orderly than I do: “sound, image, and interactivity are choreographed into a unified, harmonized experience.” I can understand why Stefans sees unity and harmony in Strange Rain: that is a designer’s perspective. The work is meticulously choreographed, touch and music synchronized so granularly you have the feeling each note has been planned. That’s an awesome achievement when you also consider the sonic/animation overlay of the rain splattering. (If we are vetting for plausibility, eyeglasses are the only explanation for the central conceit of the game: Alphonse tells us he’s getting soaked, but we’re also hearing and watching the rain splatter on a surface that sounds like a skylight.) Stefans, himself a designer and poet, appreciates that Loyer can orchestrate many elements in a design.
It’s funny that quasi-neoclassical “unities” result in an experience that shatters me. This game is marketed in Apple’s App Store as “relaxing,” an app that in its Wordless and Whispers modes might lull you to sleep.
Device specificity–a variation on Katherine Hayles’s field-defining discussion of medium specificity in Writing Machines (2002)–gives us some new parameters to knock against. To what extent does touch as an “interface free” navigation elide its role in story composition or concept? What will it mean for story if touch is no longer “invisible,” but factored in as a potential source of narrative–what we might call a touch vernacular? Does touch vitalize interactivity in hypertext (for anyone other than me)? I’m eager to read and play with Judy Malloy’s forthcoming iPad adaptation of its name was Penelope. “The underlying structure and words are basically the same,” Malloy noted in a recent correspondence. “However, I did edit (not substantially) some of the text and notes, and the look and feel is different.”
Michael Joyce said in this video short (May 2011) glossing Agnes Martin’s painting “The Harvest”: “It occurs to me that I’ve always written about space in one way or another since many hours of my adolescent years were spent looking out from a screen door window upon summers’ nights in south Buffalo and left longing for a wider world.”
I long to look into the screen and fall behind it. To understand interface as well as I understand story.
Image credits: Milton Photo: Michael Nagle for The New York Times; http://nyti.ms/seVsbB; Twelve Blue: http://bit.ly/vuU5XW; Strange Rain: http://bit.ly/e6nPNN