Saturday, February 28, 2009

Writing in the twenty-first century

This may sound strange coming from someone who is a technical writer, but the greatest thing about the English language is its ability to be ambiguous. I can't speak for other languages, such as Chinese, because I am not fluent in any other language. Although it may seem like I'm speaking a foreign language in this blog, I'll try to be as "unambiguous" as possible.

Let me start by saying that I fell in love with English because of its ambiguity. For example: if you read a novel, the author may give very good descriptions of setting, character, and action, but it's up to your imagination to supply the images the text represents. The protagonist develops as a picture in your mind, and you feel a connection because that image is one you created. But what if you had pictured someone like Mel Gibson, and then the author supplied an actual picture of his or her vision, and it looked more like Rodney Dangerfield? That creates a disconnect, because it would be hard to believe that someone who looks like Dangerfield could be a great lover of young women, tote a shotgun, and save the world from all the evil people who look like fashion models--pictures supplied by the author. Unless the novel is meant to be a comedy, the supplied pictures ruin it. I'm no longer left with the intimacy that the text could provide through my own imagination.

So what's my point? I find that writing with technology swings like a pendulum between extremes. On one hand, we have graphics, sound, animation, and video feeds that leave very little room for an intimate imagination. On the other hand, we have email, text messaging, Twitter, etc., that allow us to be so ambiguous that we can only guess at the tone, context, and spirit of the message the author intended. With the latter, if our imagination runs awry, we can fall into a serious misinterpretation that the author never intended.

Throughout the history of the written word, a certain amount of ambiguity has been built in. That ambiguity is what makes poets so interesting, and it can lead scholars to a lifetime of interpretation based on their own values and what they bring to the text. But this does not necessarily mean that they are right. A well-crafted poem will often explicitly defy meaning on anything but an individual level. I can imagine Walt Whitman, with his words "what I assume, you shall assume," conveying his message on Facebook complete with pictures and diagrams of exactly what he is talking about. It would leave little to the imagination, and good old Walt would soon find himself listed as a "cyber pervert." Think about all of the creative texts you have ever read, and then think about how ruinous it would be if the author had supplied graphics and sound. Those authors allow us the creative discretion to formulate mental images and interpretations on our own.

There is something to be said for ambiguity: it often creates a greater understanding of ourselves and the world around us than anything that can be explicitly shown. The kind of ambiguity I'm talking about cannot even start to take shape within 140 characters. Imagine how chaotic the world would be if we talked in acronyms or limited every sentence we spoke to 140 characters. What kind of discourse can we have when we are so transparent that we include pictures of our thoughts, or so constrained that we are limited to talking in acronyms?

I see the usefulness of writing with computers, but I'm sorry; I am a traditionalist. We can gain a lot by learning how to use technology in a writing classroom, but I don't think we will ever have a canon of writers like we have had in the past; at least not in the sense that we view the canon of great literary works now. Being "wired" is making us either too vague or too transparent. How we find a middle ground in the sea of technology to create the next great canon is the key. My only hope is that the next great canon does not contain LOL, OMG, or clips from YouTube.

Monday, February 16, 2009

eBay and Me: The Things We Learn That We Didn't Mean to Learn

This is an anecdote about how much we learn by doing things on the computer that we did not intend to learn. Let me digress here just a little and tell you that every time I use the word anecdote I'm reminded of the Ron White joke that goes: "If I knew the difference between anecdote and antidote, my best friend, Billy, would still be alive. Billy was bitten by a poisonous snake and I thought the best thing for him was to use an anecdote. I was reading sections out of Reader's Digest, and Billy, as he was dying, kept yelling, 'Read faster! Read faster!'"

So my anecdote is this: before I returned to school--for the third time--I was an antique dealer for 15 years. My early years in the antiques business were in the mid-'80s. I had colleagues in the business--people I bought from, sold to, or traded with--who told me about the "bazillions" of dollars they were making on a site called eBay. They told me that rich buyers on the West and East Coasts were buying up antiques at prices that "us hicks" here in the Ozarks would find ridiculous. It piqued my interest.

The first time I tried to sell anything on eBay, I learned that there was a learning curve--not a minor curve, but a MAJOR one--just to post an item. In those days there were no Web authoring tools. If you wanted to post an item on eBay, you started from their template, but everything else was pure HTML coding, written by hand. I have to give eBay credit: they had an outstanding tutorial on how to code a posting and make it look attractive. But it still had to be coded the hard way, and if you missed a single keystroke, such as misplacing an opening or closing tag (those of you who have written code know what I'm talking about), nothing would appear right. Pictures, text, and headings had to be placed without the benefit of WYSIWYG, or "what you see is what you get." It was strictly HTML code.
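
Just to give a flavor of what that hand-coding looked like, here is a minimal sketch of the kind of markup a seller might have typed out. The layout, wording, and file names are my own invention for illustration, not eBay's actual template:

    <!-- Hypothetical hand-coded listing; not eBay's actual template -->
    <h2 align="center">Antique Oak Rocking Chair, circa 1900</h2>
    <table border="0" cellpadding="4">
      <tr>
        <td><img src="rocker.jpg" alt="Oak rocking chair" width="300"></td>
        <td>
          <p><b>Condition:</b> Original finish, minor wear on the arms.</p>
          <p><b>Shipping:</b> Buyer pays actual freight from the Ozarks.</p>
        </td>
      </tr>
    </table>
    <!-- Forget the closing </table> or a </td> and the listing renders wrong -->

Miss one of those closing tags and the rest of your page could end up crammed inside the table, which is exactly the kind of surprise the later WYSIWYG editors spared us.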

It's amazing to think that eBay survived at all. An entire nation was jumping on the eBay wagon and, at the same time, learning how to write HTML. People who had never owned a computer before were jumping into the deep end of the pool and coding HTML! Me included.

Coding is a lot like learning Spanish. If you're immersed in it every day, it comes naturally to you. You can learn the basics, just enough to get by, but the minute you no longer need it, it slips from your mind and is replaced by other data. My coding skills have started to erode, and I wish they wouldn't.

As I prepare to give a presentation on PBWiki, I am reminded that we learn from the software we use and, at the same time, are limited by the functions it presents to us. The nice thing about the old eBay environment was that it taught me something useful and, at the same time, taught me that there were no limits to what I could do. With the new Web authoring tools, we are strictly limited to what the tools' authors have decided we can do. They may be easier to fly, but the sky is no longer the limit.

Saturday, February 14, 2009

Selber, rhetoric, and the way we use computers

Okay, I'm running a little behind on my blogging. It's not that I have procrastinated, but somehow life got in the way and I had more on my plate than I could handle. It's a long story and one I won't bore you with, but it's a perfect transition to rhetorical theories in technology.

Selber's ideas about the "human interface" with computers (chapter 4, "Rhetorical Literacy") seem to me to coincide with what we do every day of our lives. Although Selber uses the term "Human-Computer Interaction," when you think about it, it's what we do with our friends, colleagues, and the people we deal with every day. We speak to people, and every word we use is persuasive in some way or another. What is the alternative? We are deemed fakes or liars if we fail. Should it be any different for electronic media? Absolutely not. I know it sounds radical, but technology and people now seem to meld together so seamlessly that they are difficult to compartmentalize in our minds.

To give you a good example, I was interrupted while writing this blog--somewhere after the line in the first paragraph that says "It's a long story..."--by some friends I hadn't seen in a while. I couldn't be rude and say, "Oh, by the way, you are interrupting my attempt to catch up on my schoolwork. I don't have time to socialize right now." Instead, I put my best face forward and was a hospitable host. But there is always an icon in my brain that says, "finish and save, finish and save." The icon can't be turned off. My friends started looking like Web sites that take too long to download. All I wanted to do was Refresh or Delete. So here I am, hours later, trying to piece together thoughts that probably don't make any sense at all, but somehow they did before I was interrupted.

The point I'm trying to make is that we somehow forget which world we are living in. We can no longer ignore that we have a foot in the digital world and a foot in the real world, and keeping a clear delineation between the two can be difficult. If you think I'm overstating my case, let me give you a quote from Sherry Turkle, an MIT professor who, like Selber, teaches the psychology behind technology. (By the way, I would have liked to post this as a PDF, but that's much too difficult with Blogger.)

"We live in a culture of simulation. Our games, our economic and political systems, and the ways architects design buildings, chemists evisage molecules, and surgeons perform operations all use simulation technology. In 10 years the degree to which simulations are ebedded in every area of life will have increased exponentially. We need to develop a new form of media literacy: readership for the culture of simulation.

"We come to written text with habits of readership based on centuries of civilization. At the very least, we have learned to begin with the journalist's traditional questions: who, what, when, where, why, and how. Who wrote these, what is their message, why were they written, and how are they situated in time and place, politically and socially? A central project for higher education during the next 10 years should be creating programs in information technology literacy, with the goal of teaching students to interrogate simulations in much the same spirit, challenging their built-in assumptions" (885-86).

Turkle's argument is that we live in such a simulated world that computers only recognize success or failure. Success or failure is determined by the programmer who wrote the program, and the gray areas where discourse might occur are omitted. Selber's argument runs along the same lines, except that he sees technology as promoting discourse instead of discouraging it. My opinion is that it depends on the program you are using. Teaching students the difference is the key.

I guess the point I'm trying to make with this blog is this: Turkle is exactly right when she suggests we question the software we are using and what its function is. There is more to it than we explicitly see. What is implicitly being sold? If we can't answer these questions for ourselves and our students, then we need to hit Refresh or Delete. We are still humans, and there is no getting around that. If I could Delete my friends when they show up at unexpected times, Delete my problems when they crop up, and Refresh my memory after all the hoopla, life would be grand, but it just doesn't work that way, does it?

Turkle, Sherry. "How Computers Change the Way We Think." The McGraw Hill Reader. Ed.
Gilbert H. Muller. New York: McGraw Hill, 2008. 881-886.