Wednesday 13 November 2013

week 9 follow-up

Our music this morning was a synthesis of salsa and jazz, in the form of Tito Puente's reinterpretation of a tricky jazz standard we heard just before reading week: "Take Five," written by Paul Desmond and made famous by the Dave Brubeck Quartet. Tito's energy, enthusiasm, and mastery of polyrhythms are a lesson to us all... And, speaking of referential meaning, he was a guest on The Simpsons many years ago.

Lecture slides are available here and in the usual place on BB.

We looked at a few examples that aren't in the lecture slides, including an example of coding from Prof. Hartel's article from earlier in the course: Hartel, J. (2010). Managing documents at home for serious leisure: A case study of the hobby of gourmet cooking. Journal of Documentation, 66(6), 847-874. [http://go.utlib.ca/cat/7723987].

The anthropological technique of thick description, which Prof. Hartel mentioned in her lecture on ethnography, is generally attributed to Clifford Geertz. See his chapter on the topic in The Interpretation of Cultures (1973).

Here is the James Bond clip we looked at: http://www.youtube.com/watch?v=kCNb5QDtc18. I also mentioned the Bechdel Test as an example of a very basic quantitative interpretive method. You can find an example of a popular application of this method here: http://bechdeltest.com/. Try applying it to the next film or TV episode you watch, and let's see what informal results we get in the next class. The Bond canon probably doesn't hold up too well against the Bechdel Test -- though I'm pretty sure there's a scene in From Russia with Love that might pass -- but what's more important is that the test's value derives from the best tradition of quantitative methods: it prompts us to notice the things that might otherwise escape our attention, especially phenomena that try to fly under our cognitive, social, and ideological radar. Some of the most important insights in feminist research have come from simply running the numbers in relation to gender, especially salaries. Quantitative approaches to interpretation may be a blunt instrument, in the sense that they tend to deal poorly with context, nuance, and ambiguity, but sometimes a blunt instrument is the right tool for the job -- especially if there are barriers in the way.
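For the curious, part of the test's appeal is how mechanical it is: two named women, who talk to each other, about something other than a man. The three criteria can be sketched in a few lines of Python -- the character names, scenes, and topics below are entirely invented for illustration, not drawn from any actual film:

```python
# A minimal sketch of the Bechdel Test's three criteria.
# Scene data here is hypothetical, made up purely to show the logic.

def passes_bechdel(scenes, named_women):
    """True if any scene features two named women talking to each
    other about something other than a man."""
    for scene in scenes:
        women_in_conversation = set(scene["speakers"]) & named_women
        if len(women_in_conversation) >= 2 and scene["topic"] != "a man":
            return True
    return False

named_women = {"Tatiana", "Rosa"}
scenes = [
    {"speakers": ["Bond", "Tatiana"], "topic": "a man"},
    {"speakers": ["Tatiana", "Rosa"], "topic": "travel plans"},
]
print(passes_bechdel(scenes, named_women))  # True
```

Of course, real scoring is messier -- deciding what a conversation is "about" is exactly the kind of contextual judgment that quantitative methods flatten, which is part of the point made above.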

Regarding the peril and promise of quantification in interpretation, I also mentioned The Guardian's questionable application of the Flesch-Kincaid reading-level test to State of the Union addresses. The Economist's critique (which contains links to other critiques) makes some good points about the too-easy reduction of texts to data, as well as the social value that is sometimes uncritically bestowed on certain methods. A more thoughtful application of text analysis to State of the Union addresses may be found here: bellm.org/blog/2013/02/10/tracing-the-changing-state-of-the-union-with-text-analysis/; and especially here: stateoftheunion.onetwothree.net/. In the latter case I recommend reading the essay that accompanies the analysis, which we looked at briefly in class. It's biased, unobjective, and selective in its evidence, but it also makes no claims to be otherwise (hence the author's deliberate use of the term essay), and thereby avoids The Guardian's mistake of assuming that data speaks for itself. I also recommend using the text-zoom feature on your browser -- probably ctrl- or command- plus or minus -- to block out the site's bright red background, or just wear sunglasses...
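It helps to see just how thin the measurement is. The Flesch-Kincaid grade level is a single formula: 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. Here is a rough sketch in Python; note that the syllable counter is a crude vowel-group heuristic (real syllabification needs a pronunciation dictionary), which is itself a nice illustration of how much approximation hides inside a tidy-looking number:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels. Good enough to show the
    # formula's shape, not good enough for serious measurement.
    vowel_groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(vowel_groups))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

print(round(flesch_kincaid_grade(
    "Quantitative approaches to interpretation may be a blunt "
    "instrument, but sometimes a blunt instrument is the right "
    "tool for the job."), 1))
```

Notice that the formula sees only sentence length and syllable counts -- nothing about argument, audience, or rhetoric -- which is exactly why treating its output as a verdict on a speech's sophistication is a mistake.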

If the study of texts and artifacts from an information perspective interests you, I'd recommend some further reading in the form of the Latour and Winner articles I referenced in this week's blogging question. I'd also recommend two pieces of very recent work (which will almost certainly show up in my Future of the Book course next term). Actually, I just realized that one of them isn't published yet, but it should appear in the next issue of the Journal of the American Society for Information Science and Technology (JASIST): Bonnie Mak's "Archaeology of a Digitization," which gives a brilliantly nuanced reading of the digitization project Early English Books Online. A similar article that has already been published is Whitney Anne Trettien's "A Deep History of Electronic Textuality: The Case of English Reprints Jhon Milton Areopagitica" in the latest issue of Digital Humanities Quarterly. Both are great examples of researchers who discover and unfold the stories that artifacts can tell about how they were made. Matt Kirschenbaum has a very interesting article in the same issue of DHQ, and I'd encourage you to explore the issue if you're interested in the topic. I also mentioned my own study of e-book and print versions of a recent Giller Prize novel, The Sentimentalists, which was very much inspired by the kind of work represented here. As I mentioned in class, this topic is an example of traditional information research extending into new frontiers -- not just studying new materials, but also using new methods and theoretical influences -- and the result is that there are a lot of opportunities here for junior researchers, especially those with eclectic iSchool backgrounds.

PS: Last James Bond reference of the course, I promise: http://www.youtube.com/watch?v=M8oibBJTEpc
