[READINGS]  [^^Inter-Active]  [^^^TERMS]

Lunenfeld

[Lunenfeld PDF] BEGIN BLOCK QUOTE The 1500s' shift from manuscript to print was pushed in large measure by the populace's demand for copies of the living word of God [see map][Note 1]. The 1900s' escape from the Gutenberg Galaxy into the realms of the non-linear, the hyper-textual, and the multi-mediated was driven by something far humbler -- the memorandum. The generation of ever more paper in an information economy produced a demand for computer systems to create and store documents. These systems became smaller and less expensive with each passing business quarter until they reached the point when users moved the machines out of their offices, on the road, and into their homes. [2] The proliferation of word-processing systems and screen-based reading environments like the Internet has engendered a radical re-orientation in the way that people write and read, and hence think. Rather than having to re-write every text from start to finish, the contemporary writer/reader enters a text at any point and amends it, with all the other elements shuffling themselves into a new order -- a fluidity the term non-linearity used to describe. This shift has taken place not just in the sheltered laboratories of academia and industry but also among the majority of those people who write in the information economy. [3] No longer stationary on the page, the word -- once digitised -- is a-float in a universe of poly-valent databases. [4] Reading becomes less a matter of following than a process of extracting. Search on: Victory Garden by Stuart Moulthrop The user enters the database like a miner after precious metals. The search may take un-expected turns, but extraction is the paramount concern. The pressing need for a system of extraction encouraged the next shift away from the stable universe of the book, a shift that took full advantage of the computer's ability to link disparate bits of data instantaneously, regardless of their origin. 
[5] The word "hypertext" was coined in the 1960's by visionary systems designer Ted Nelson, who defines it as "non-sequential writing -- text that branches and allows choices to the reader, best read at an inter-active screen". [LOCAL NOTE 1] The inter-activity of the most sophisticated hypertexts allows users to choose their own paths through materials contained in the computer or in any electronic database to which it is connected. [Providing you have a PAID logon name ;) ] As a technology, it is the most sophisticated manifestation of the computer's impact on writing and reading. At its best, then, the medium of hypertext opens up the static book to non-linear exploration, exegesis [like this!], and of course, extraction. There has been an explosion of literary critical writing about hypertext [LOCAL NOTE 2]. Once exposed to electronic languages' open-ended, multi-user, multi-creator [author] documents, theorists (ever resourceful) have noted similarities to post-structuralist notions of the production of meaning. Clearly, digital env's complicate questions of authorship, as noted in Chapter 3, "Real-Time Theory". They also seem to offer a privileged space to explore theorist Roland Barthes's valorisation of "writerly" textuality [[Barthes] -- again!], where-in the reader does not encounter a work whose meaning is fixed, but rather (re)-writes the text thru the process of reading. The "writerly" is opposed to the "readerly" qualities of classical fiction, where-in the art object is static and the hierarchy of the creator and consumer is rigidly maintained. [LOCAL NOTE 3] In the early 1990's, there was a surge of interest in the possibilities and accomplishments of hypertextual fictions in the popular media. On the same Sunday morning, the book sections of both the Los Angeles Times and the New York Times had front-page reviews of hyper-fictions. [LOCAL NOTE 4] 
The New York Times's was actually the second cover piece the paper ran by novelist Robert Coover on emerging hyper-fictions. Coover included a long review of Stuart Moulthrop's "Victory Garden" [LOCAL NOTE 5], short reviews of ten other hyper-fictions (occasionally scathing, a refreshing change for a field where breathless praise for the new is the norm), a theoretical over-view of hypertext, and a resource guide for ordering the works reviewed. In other words, the NYT Book Review, the most powerful critical organ of the publishing establishment, took hypertext seriously. [most powerful in New York? in the world? in all of the universe?] One of the most evocative hypertexts published in the 1990's was "Agrippa: A Book of the Dead". Agrippa was a collaborative project among book publisher Kevin Begos, artist Dennis Ashbaugh, and author William Gibson, best known as the author of the previously mentioned "Neuromancer", the most influential cyber-punk SF novel. Agrippa, however, is something quite distinct. Described as "a black box recovered from some un-specified disaster", Agrippa opens to reveal charred-edged pages, covered with repeated letter patterns: "AATAT/TACGA/GTTTG" [ooohhhh, wooowww!!!] [LOCAL NOTE 6] After a moment, the realisation comes that these are not merely couplets of concrete poetry, [7] that in fact [fact? what fact?] they are the signifiers of the genetic code, sequences of DNA. [6] The pages of DNA codes are inter-mingled with Ashbaugh's engravings of subjects ranging from guns to telephones. Embedded within Agrippa's back cover is a computer disk that contains the text of Gibson's poem. "The sweet hot reek / Of the electric saw / Biting into decades" closes one stanza. What is un-usual is not simply that the text is designed to be read only on the screen -- many hypertexts are written to be read this way -- but rather that Gibson's work is meant to be read once and once only. 
The floppy disk is programmed to destroy the text as soon as it is read. [8] The poem itself is about family and memory, which are usually considered to be elements of our lives that endure. Agrippa plays with temporalities; the past, present, and future implode as an integral part of experiencing the work. That the material is intended to be read once and only once, and then to deteriorate [actually simply cease to exist; deteriorate would mean that the text becomes "mangled" or "damaged"], is in itself the deftest of hyper-aesthetic gestures -- "biting into decades" indeed. [this is all rather naive if you ask me; visual artists/musicians/actors/dancers have been doing this sort of stuff for decades -- ironic semi-pun intended] This kind of disk-based hyper-fiction, no matter how [it is] packaged, did not emerge as a marketable commodity within the constraints of the publishing industry. [That much IS certain!] Its spirit, however -- and many of its forms -- moved gleefully onto the World Wide Web. "Avant-pop" hyper-author Mark Amerika [is that a *real* name?] is perhaps the best-known author of sustained hyper-fictions on the net. Amerika describes "Grammatron" [www.Grammatron] as "a public-domain hypermedia narrative environment", complete with "Grammatron magic cookies" that connect users to pages determined in part by which links they previously followed. [LOCAL NOTE 7] That these magic cookies might indeed be reading the reader is a new twist on McLuhan's observation that "schizophrenia may be a necessary consequence of literacy". [LOCAL NOTE 8]
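[A quick sketch of how such path-dependent "magic cookies" *might* work -- purely my own invention to make the idea concrete, not Amerika's actual code; all the page names and routing rules below are made up:]

```python
# Sketch: resolve where a hyperlink goes, conditioned on the pages the
# reader has already visited (the "cookie" is just their path history).
# Hypothetical page names and rules -- invented for illustration only.

def resolve_link(link, history):
    """Pick a target page for `link`, given the list of pages already read."""
    # Path-dependent routing table: (link, required prior page) -> target.
    # The (link, None) entry is the fallback when no prior page matches.
    rules = {
        ("abe", "garden"): "abe-after-garden",
        ("abe", None): "abe-default",
    }
    for prior in reversed(history):      # most recently visited page wins
        if (link, prior) in rules:
            return rules[(link, prior)]
    return rules.get((link, None), link)  # default target, or the link itself

print(resolve_link("abe", ["intro", "garden"]))  # -> abe-after-garden
print(resolve_link("abe", ["intro"]))            # -> abe-default
```

[Note the twist: the same link text lands different readers on different pages -- the cookie really is "reading the reader".]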

The Rebirth of Text

The development of hypertext came at the close of a century that had been a great one for the printing business, but ironically, an awful one for the culture of literacy. For 100 years, commercial and governmental bureaucracies generated a terrifying tower of paper at the same time that print was losing its primacy as the source for information, education, and entertainment. Audio-visual mass media, especially cinema and television (those bastard children of the photograph and radio), have poisoned the environment for text. [LOCAL NOTE 9] It is not simply that audiences are seduced away from typographic culture by the moving image; it is that it is almost impossible to read text within linear audio-visual media like film and TV. One of the defining qualities of printed text is that readers can skip around, return to previously read passages, linger or push on -- in other words, set their own pace. Chapter 3, "Real-Time Theory", noted the ways in which the Web can push media on the Internet, but it is important to remember that film and TV are the original push media: Their forward movement is un-controllable and it is impossible to refer back to that which has come before. [9] [LOCAL NOTE 10] It is this linear dynamic that accounts for the central importance of over-determination [???] in the dominant narrative forms of [---????] entertainment media. Plot points, character names, vital props, important locations, all must be constantly re-iterated if they are to make their required impact on the spectator. [10] Commercial film and TV are thus quintessentially over-determined media. Like everything else in film and TV, text in linear dynamic media spews out at a predetermined and un-controllable rate, and can neither be referred to nor reversed. [LOCAL NOTE 11] As well, film and TV subject text to specific technological abuse. 
Though the resolution of 35mm film is higher than electronic displays, the very vastness of the cinematic screen challenges our pre-conceptions about how type and text should be displayed: A movie is a billboard, not a page. [11] If the over-whelming size of the screen is a problem in the cinema, television offers different limitations. NTSC video (the North American standard) is a terrible medium for all but the largest fonts. [LOCAL NOTE 12] As commercial videotext providers learned in the disastrous experiments of the 1970's, people do not like to read from their television screens. [well, yes, most bourgeoisie don't!] NTSC is an inter-laced video system, meaning that only alternating lines are refreshed by the scanning gun, contributing to American TV's over-all blurriness, which in turn leads to eyestrain and headaches when-ever text is present. [LOCAL NOTE 13] [hmm, couldn't a FONT be designed specifically to work with this limitation?] The computer, on the other hand, solves both of the major problems presented by cinematic and television technologies. The computer monitor's scale is obviously more intimate than the screen in a movie theatre. Ergonomically, a computer workstation offers a more amenable distance for reading than the typical living-room layout of a couch placed far from the television. [LOCAL NOTE 14] In addition, computers use much higher resolution non-interlaced screens, which offer vastly better legibility. Beyond these tech differences, the higher-order possibilities of non-linear access, hyper-textual linking, and inter-activity that distinguish digital media can combine to offer the kinds of temporal control we expect from print rather than audio-visual media. [12] In other words, the user of an inter-active entertainment has the opportunity to go back, to linger, or to speed ahead, just as with a printed magazine or novel. 
Dynamic, yet free to escape from the constraints of over-determination, digital media are open to text and subtle typographic treatments. Alpha-numeric text has risen from its own ashes, a digital phoenix taking flight on monitors, across networks, and in the realm of virtual space.

The Technics of Text

It is not simply that computers are technically suited to revive typographic culture; users, for decades now, have been conditioned to view computers first and foremost as machines to create, store, and manipulate alpha-numeric text. From early word-processing systems, like those offered by the Wang Corporation, to spread-sheet programs like Lotus 123 that made the PC ubiquitous within the business economy, and the PostScript typographic printing technology that powered the desktop publishing explosion, users have come to expect text to be a major component of digital environments. [LOCAL NOTE 15] Even as processing speed improved enough to make computer-driven multi-media a marketable commodity, users continued to demand some sort of textual interface. When, for example, digital publishers re-purpose film and TV properties, the first thing they tend to do is to add textual supplements. The Voyager Company did just this with its 1993 QuickTime version of the Beatles film "A Hard Day's Night" (Richard Lester; 1964). The disc hyper-linked the audio-visual materials with texts including the original script and an essay by the critic Bruce Elder on the band, their music, and the movie. Just because the tech capacity exists does not mean that text will re-emerge in a form that transcends logos and info-bytes. Too easily glossed over in all the excitement are the questions that non-linear authoring and use raise about the creation of textual and hyper-textual meaning. Examine the temporality of text: The action of reading is always linear [not necessarily]; meaning is formed by stringing words together one after another in sequence [ditto comment]. Yet, in the future/present, the computer allows non-linearity in the way that authors present materials and users extract info. The constant play between inter-linked nodes of info transforms our conceptions of rhetoric; we can no longer know where a proposition will come in relation to the other propositions. 
Our situation is some-what akin to that facing the originators of [quantum mechanics]. ... The most we can know of a micro-particle, then, is its partially defined state -- its contribution to an irresolvable ensemble. This is quite different from the ability to pin down the exact location of a particle in the Cartesian grid at place x, y, and z and at a time t. In a manner, we can no longer count on the physical unity of the book and can-not precisely determine the position of the proposition within a hyper-text system. We simply accept its position as a probability and make do with that level of un-certainty. [13]
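[To make the "no fixed position of the proposition" point concrete, here is a toy sketch of a hypertext as nodes and links -- my own invention, with made-up node names, not anything from Lunenfeld or Moulthrop. The same lexia end up in a different linear order for each reader's path:]

```python
# Sketch: a hypertext is a graph of lexia; the "position" of a node is
# fixed only by the path one particular reader chooses to follow.
# Node names are hypothetical.
links = {
    "start":  ["memo", "garden"],
    "memo":   ["garden", "end"],
    "garden": ["end"],
    "end":    [],
}

def read(path_choices):
    """Follow one sequence of link choices from 'start'; return that reader's order."""
    node, order = "start", ["start"]
    for choice in path_choices:
        assert choice in links[node], "no such link: %s -> %s" % (node, choice)
        node = choice
        order.append(node)
    return order

print(read(["memo", "garden", "end"]))  # one reader's linear order
print(read(["garden", "end"]))          # another reader never sees "memo"
```

[So "where does 'garden' come?" has no answer outside a given traversal -- exactly the un-certainty the quantum analogy is after.]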

Extracting the Nano-Thought

The author can assume no prior knowledge on the reader's part because hyper-text allows the reader to enter, exit, and augment the work at any point or time.
[14] One strategy that hyper-authors have developed is the repetition of key topics through-out the linked nodes. [LOCAL NOTE 18] To laud the use of these small textual units, or lexia as Barthes [--- again with the Barthes!!!] coined the term, as base reading units is to acknowledge a condition of non-linear production and reception -- the difficulty of pre-structuring complex arguments of extended length. [15] But are all ideas, metaphors and images then to be processed down to their smallest units, the nano-thought, and repeated ad nauseam throughout digital databases? In this analogy, the nano-thought represents "information", the raw data of science or the un-digested facts, factoids of the essay -- or even fictional -- form. This is not to say that intriguing ideas can not be generated by sifting through nano-thoughts, just that a regime of nano-thinking to the exclusion of other conceptual practices is probably going to lead to an impoverished discourse. [16] It is hard enough to evaluate hyper-textual systems for "knowledge", much less the even more quick-silver "wisdom". If we accept that hyper-textualised, database-driven culture will perforce encourage the proliferation of nano-thoughts, the next issue becomes ensuring that this new form can be used with precision, and towards those in-effable goals lauded above. Rhetoric is the study of language as the art of persuasion, and its ancient lexicon can be mined for tools to address the nano-thought. Two terms in particular, multum in parvo ["much in little"; Loquorisne Lingua Latine?] and mise-en-abyme ["placed into the abyss" -- a work nested within itself], offer insights into how to ensure that hyper-textual systems do not completely atomise discourse. Susan Stewart notes that a "reduction in dimensions does not produce a corresponding reduction in significance". [LOCAL NOTE 19] [17] Collapsing the Oxford English Dictionary from 24 volumes to two, for example, and then to a single CD-ROM, does not affect the dictionary's content. 
Precisely how, though, does one collapse discourse without completely losing its meaning, much less its significance? Next: NOTES.

Notes

[1] As i recall, the Gutenberg bible was *not* a financial success. Since it was still costly to produce books (even using the non-movable type process of the day), not many people could read. {Back to the TEXT}

[2] This had in fact already occurred with the Apple II and other pre-PC microcomputers; note esp the fine efforts by Osborne -- who also published the first two books on micro-processor programming. I recall seeing several portable KIM computers that were carried about in a standard briefcase. {Back to the TEXT}

[3] This is not quite accurate; i think there are several distinct categories of "reading": a) Linear (in the traditional sense), which is what we are doing with this paper by Lunenfeld. b) Entry via search engine. This is the same thing as "entering" a book via the index. Of course, it's a bit more accurate in that you are (hopefully) linked directly to the desired area. A major problem is of course the proliferation of INDEX references in the various search engines -- which are a great confusion over the traditional book-index -- albeit, almost certainly more complete than almost any book-index could ever have been. c) The random-by-design path. This is the current topic of the Story Lab study group this semester. Also, for the most part the order doesn't really change that much. For the most part *most* of the plethora of text still follows the old water-fall (encyclopedia) or inverted-pyramid (newspaper) methods. {Back to the TEXT}

[4] Huh???? {Back to the TEXT}

[5] And hence the current mess: If you google on *any* topic, you get less and less authority, and more and more blurble; eg, blogs, fan fic, etc. Also, note that the search methods are *not* all that different from the traditional use of an encyclopedia -- as you are searching for "He Tzu" you come across the "Hermetic Society" of poets under the direction of Eugenio Montale -- which turns out to be *much* more important than a rather obscure Korean artist. 
Also the idea that books are "stable" must surely ignore the concept of abstracts, indexes, and concordances, etc. {Back to the TEXT}

[6] It is important to recall that the film "GATTACA"'s title is derived from the base-order in genetics that makes cloning possible. This sequence triggers the genetic code to start copying at that point. Refer to the superbly readable (and always enjoyable) book "The Cartoon Guide to Genetics", Larry Gonick. (You'll be glad you did!) He also wrote "The Cartoon Guide to American History", history of the world, etc, etc. (tips towel 3 times to Larry; once to the man, once to his works, and once to all who seek the way of peace). {Back to the TEXT}

[7] Hunh? They are hardly poetry at all, but blocks of letters. Even outside of the "context" of the recovered "black box" (assume we don't know what that is), the letters either form part of a CODE (cypher), or they could be abbreviations, computer commands, or some sort of library system of classification. In a long-ago published episode of "Archie Comics", the gang comes across a rubble heap, and on concrete blocks they lay out the following text: KING ZONOPAR, and begin digging excitedly. As it happens, Jughead happens along and re-arranges the blocks to form: NO PARKING ZON (the "e" apparently missing). Note that unless we "know" (or speak) the language of genetic science, then we have *no* way of decoding the cryptic message. This goes back (heavily) to the SETI and CETI concepts (now where are my notes....) [Complexity of SETI/CETI] debated and annotated. (Don't you just *love* it when you find that puzzle piece with the lovely dark-grey krinkles and *know* (that moment!) that it's the missing piece of the Fjords!!!?) {Back to the TEXT}

[8] Actually, i'm not exactly in awe of this "hypertext happening", since ephemeral performance art has been around since the 1960's (and of course much earlier versions of Dada-ist activities, etc). 
Also, from a *political* POV, this is no different than during the Watergate and Irangate days, when one person was reading memos and then, as they were finished (or it was deemed time), some-one else shredded the memos. Obvious examples are: John Cage, Joseph Beuys, and Barbara Kruger. {Back to the TEXT}

[9] I'm not exactly sure what Lunenfeld means here. It may refer to the term "push TECHNOLOGY" -- LOOK UP. Certainly we can refer back to them in discussions, and of course the use of TV, FILM, RADIO, and even PRINTED TEXT in all of the arts is v. doable. (Pre-recorded segments, mirrors that reflect from off stage, or off-canvas, etc -- all are "do-able".) Hmmm {Back to the TEXT}

[10] I don't think that this is the case in *any* medium. Certainly the avant-garde doesn't *have* to have this form; eg, the early experimental Russian and some current French film. Indeed, we can imagine a "film" where, depending upon where you came in on it, it might have a happy or a sad ending; eg, the ending of "The Little Prince", by Antoine de Saint-Exupery. Certainly, i would agree that for the most part the current swath of "hit blockbusters" are *completely* formulaic and tired. {Back to the TEXT}

[11] Indeed, as almost any DESIGN artist will tell you: "Many fonts are possible; few are ever used", or as John Cage put it (concerning patterns: A, B, A, B, A, B, C ... (rule of 3)): "Many [musical] patterns are possible; few are ever tried". {Back to the TEXT}

[12] Note that with the new DVD menuing features (as well as of course TiVo -- which potentially could have the same SCRIPTING abilities avail on internet pages, Java, etc, etc), the menuing systems *do* give the user as much control (potentially) as with a book. 
A wish list of things for DVD/TiVo/next-generation:
- book marks (fav sections; please don't fold the video page corners down ;)
- margin notes (text/audio/import file, reference to other screens/segments, HREF to other DVDs, the inter-net, etc)
- creating notes at the front of the "book" (ie, vid, aud, etc) where we can create our own indexes
- extract (photocopy as metaphor for vid/aud extract to computer notebook)

Also, there's no reason that the previously developed technology of the interactive video disk shouldn't be extended to new media, DVDs, vid files, etc. Thus i shouldn't have to learn "final cut pro" to create (say) a text file, and then you click on THIS and up comes a clip-in/clip-out from (say) the h2g2, then i have the same segment with (say) my voice-over, and then back to the aud/txt/vid where i say, "Indeed, director Garth Jennings had this to say about the scene", and then cut in the director's commentary, etc. Also, to use the "inverted pyramid" (newspaper) method, we could create a QUICKIE version of a movie/documentary/etc that has only certain selected segments, and then we insert (of course they finally get out of the labyrinth) -- the only way to keep people from nixing Jar Jar Binks is to make him *essential* to the story. hmmm, it's indeed a brave new (media) world out there!! {Back to the TEXT}

[13] Indeed, this goes to the heart of the ambiguity problem itself (as well as the "classification problem" and of course the *continual* problem of "false distinction", etc). Now if the text is supposed to be factual, then this is a grave concern: Cause and effect are the basis for most human thought, and it matters a great deal whether we are dealing with modus ponens (if a then b; a; therefore b) or with the principle of explosion (if a, which is false, then anything follows). 
And of course this goes back to the matter of degrees: if the *intent* or the *body* of some information is essentially correct, the problem arises of nit-picking it to death; ie, muddying the waters, creating false distinctions, and of course any/all of the logical fallacies. As regarding entertainment and/or absurdist, etc, literature, then such uncertainties may be a happy element of chance; or not -- was it the lady or the tiger? So, is Schroedinger's cat alive, or should we get out the tin foil and an old shoe box? Did the pilot draw a strap for the muzzle on the sheep or not -- all of these things the stars know but do not reveal to the reactionary mind that does not see with their heart, if indeed they have one at all. {Back to the TEXT}

[14] This is the same as the "communication problem" (SETI/CETI, or otherwise). If i am allowed to "pre-screen" what you do or do not know, then i can *adjust* the presentation. On the other hand, the user may be the best one to determine the "level of ignorance" that they wish to operate at. A useful thing would be to allow the user to raise/lower the level of detail/background info/etc that is presented; eg, pop-up info bubbles, automatic dictionary level (say, words above grade 10, etc). {Back to the TEXT}

[15] This is where the "bingo card" concept comes in handy, in providing the user with an open-ended (eg, menu-driven) system that doesn't have "guided tasks" (what do you want to do: create a letter, email someone, etc). The bingo card is a check-list of things that MUST be determined before the user can actually execute/finish the task. There are two ways to do this (and in combos as well): The "what you will need" list; eg, "Turbo Tax" tells you the various docs you will need before you even start, thus saving the *extreme* frustration of getting half way in and realising that you don't have an employer's STREET address, etc. And checking over what you have entered (again, Turbo Tax does this well). 
The bingo card goes like this: We allow the user to freely navigate/set-up/specify/turn options on and off/program/etc; then, when they press DO IT, the bingo card (for the given task that the user wants done) is checked. If something critical is missing, they are then led to that part of the command menu tree and *must* proceed thru the prompting dialog (or simply (as an expert) fill in the required blanks). When all of the *required* items on the bingo card are completed, the user can be prompted to DO IT and then they can do so. Note: The user may have several tasks in mind, and thus have been concentrating on a given thing, and over-looked a "trivial" (to them) item. This system also means that *extra* info that is entered (eg, from a profile) doesn't necessarily actuate an activity to DO SOMETHING by the system. {Back to the TEXT}

[16] This is just another instance of "too much of a good thing". See for example "The Cyberiad" by Stanislaw Lem, wherein the inventors create a Maxwell's Demon of the Second Kind. [Ref here] {Back to the TEXT}

[17] Yes, but it can lead to loss of meaning/content/reference/etc. For example, reducing chemical equations might be good, as long as you don't lose a *generic* name in the index. (Murphy's law!) The more information that is present, the more likely that a search to re-build a broken link will be successful. And "obviously" if a thing *is* important (significant), then reducing it to just a blip does not change that. But since its "footprint" is reduced, it might more easily be over-looked or even ignored or erased. Also, technically "reducing" the OED to 2 vols is *not* reducing the "dimensions" of the "thing"; it is simply re-scaling it. 
I'm not sure what "dimensionality" a dictionary has, maybe 2.713 (certainly not 3.0, and obviously with cross-references, i would guess that it is greater than 2.0 (maybe by only a little, say DIM 2.137 ;)). {Back to the TEXT}

[18] {Back to the TEXT} [19] {Back to the TEXT} [20] {Back to the TEXT} [21] {Back to the TEXT}

Next: eof.
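[Addendum re note [15]: the bingo-card check is simple enough to sketch in a few lines -- my own toy version, with invented task and field names, not any real Turbo Tax internals:]

```python
# Sketch of the "bingo card": each task declares its required items;
# DO IT only proceeds when every required item has been filled in.
# Task and field names below are hypothetical.

REQUIRED = {
    "file-taxes":  ["ssn", "employer_street_address", "w2_total"],
    "send-letter": ["recipient", "body"],
}

def bingo_check(task, entered):
    """Return the still-missing required items; an empty list means: ready to DO IT."""
    return [item for item in REQUIRED[task]
            if item not in entered or entered[item] in (None, "")]

entered = {"ssn": "000-00-0000", "w2_total": "12345"}
print(bingo_check("file-taxes", entered))
# -> ['employer_street_address']  : lead the user to that dialog, *then* DO IT
```

[Note that extra entered info (eg, from a profile) is simply ignored by the check, so it never triggers an unwanted DO SOMETHING -- exactly the property wanted above.]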