47 Matching Annotations
  1. May 2016
    1. James Joyce’s “Nausicaa” episode from Ulysses by focusing on the modals “could” and “would,” the causal conjunctives “so” and “because,”

      While this is certainly useful for doing an extremely close reading, I definitely would not have the patience to visualize this kind of data. Talk about a lesson in perseverance.

    2. As “cultural production”—“for its political role, its exposure of the state of a given society” (9)

      Does anyone remember when Atlas Shrugged became a hot political topic a few years back? I think it was in the 2012 election-- I remember CNN had a headline that read "Who is Ayn Rand?" You can tell there was at least one English major in the editing department.

    3. Gertrude Stein. In

      I feel like Stein is a perfect writer to use data visualization on, as so much of her work revolves around repetition. For example, I would be interested to see how many times Stein says the word "patriarchal" versus "poetry" in Patriarchal Poetry.
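
      For what it's worth, that count wouldn't take much patience at all. Below is a minimal sketch in Python (my own illustration, not anything from the reading), assuming a plain-text copy of the poem saved locally under the hypothetical filename patriarchal_poetry.txt.

      ```python
      # Count occurrences of chosen words in a plain-text file.
      # Assumes a local copy saved as patriarchal_poetry.txt (hypothetical path).
      import re
      from collections import Counter

      def word_counts(path, words):
          with open(path, encoding="utf-8") as f:
              text = f.read().lower()
          tokens = re.findall(r"[a-z']+", text)  # crude tokenizer: letters and apostrophes
          counts = Counter(tokens)
          return {w: counts[w.lower()] for w in words}

      if __name__ == "__main__":
          print(word_counts("patriarchal_poetry.txt", ["patriarchal", "poetry"]))
      ```

      The same few lines would handle the Joyce example above, too-- just swap in "could," "would," "so," and "because" as the word list.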

    1. I remember during football season the New York Times posted this ridiculous page on its back cover with a data visualization of all of the ways that the New York Jets could clinch the playoffs. In each alternate timeline certain teams would win and lose and drift further into alternate timelines. It seemed as if the data visualization itself was commenting on the unpredictability of playoff sports, or perhaps, the dismal chances of the NY Jets (who, ultimately, did not make the playoffs).

    2. challenge is to understand how the information visualization creates an argument and then make use of the graphical format whose features serve your purpose.

      This is interesting, as in this case, the platform for the information becomes information (an argument) within itself.

    1. In the PC versions of the series, gameplay was typically thought of as a toy—something for players to pick up and enjoy for however long they want with no clear end in sight. (Lewis and Boulding)

      It's for this exact reason that I spent my childhood pouring hundreds of hours into The Sims. The imagination it leaves you with is unmistakable.

    2. Just as the game had no official name, it also had no marketing (in fact, was unavailable for purchase), and no official beginning. Or, to put it another way, it began when and how people began to play it. For many it began with the second A.I. trailer, in which “Jeanine Salla” is credited as “Sentient machine therapist” (Hon). Players’ web searches for these terms revealed the beginnings of a trail that threaded through texts, images, and movies across the internet—as well as phone calls, faxes, US Postal Service deliveries, bathroom walls, and live event

      This is very reminiscent of the alternate reality game for the 2008 film "Cloverfield." I remember running home from school to uncover fake websites for fake brands and dig through enigmatic forums full of archaeologists such as myself.

    1. In some of these cases, we've invented devices that perform the actions, solutions that represent definitive answers for a particular problem, be it increasing the amplitude of a signal, removing impurities from a liquid, or increasing moisture in a room.

      Thankfully, in video games we're ultimately seeing shifts towards ambulatory games-- games without "leader boards, points, badges, etc". This is aided by the development of indie games.

    2. Most recently, Serious Games have offered another, more general attempt to expand games' scope.

      I always disliked the rhetoric of "serious games" within the gaming community because I feel like it missed the point of games to begin with. Take a game like Donkey Kong that's entirely presented in 8-bit, literally takes its narrative from King Kong, and contains three stages. However, this game is (in many instances, for me) a lot more enjoyable than repetitive FPS franchises that keep releasing the same game (cough Call of Duty cough).

    1. Salem Witch Trials,

      I feel that this is especially poignant as IVANHOE inevitably leads to "contextualization" no matter which way it's utilized. When I played Billy Budd I was unable to see past the realm of my role (Melville) even after the game concluded. Contextualizing Melville's biography proved not only interesting and fruitful but legitimately altered my real-life reading of Billy Budd.

    2. agree to collaborate in thinking about how

      Perhaps it would be more interesting if the participants didn't agree to collaborate in reimagining the text? Although such a game may not be as academically fruitful, it could offer more insight into how one originally thinks about a text, and could track how the different viewpoints ebb and flow, perhaps changing one another, over the course of the game.

    3. all interpretation is misinterpretation

      Sounds more like Harold Gloom, am I right or am I right?

      crickets

    1. Gaming. These artifacts represent the most literal meaning of play. The pedagogical value of games and simulations has long been known, but it was James Paul Gee’s What Video Games Have to Teach Us about Learning and Literacy that raised awareness about gaming’s potential as a pedagogical tool. The artifacts in this category either treat games as objects of study in their own right alongside literature, or provide examples of games as pedagogical tools.

      Although I'm far from a self-described "gamer," I've had several experiences where gaming helped me learn throughout my youth. I did play strictly educational games (in my opinion, both "Typing of the Dead" and "Math Blaster" were great fun), but I feel even "non-educational" video games had a pedagogical impact on me. A good example of this is the N64 game "Paper Mario," which was (even by today's standards) a tough RPG. I think it took me about forty to forty-five hours as a kid to beat the game. Recently I looked back at the structure of the game (through some YouTube videos) and I was astounded that I was able to play it as a child-- you are often given complicated directives and objectives, and are forced to creatively solve a multitude of platforming puzzles. Also, you had to choose very wisely the ways in which you upgraded your character. Choosing the proper sidekick, attack moves, defense moves, and items to keep in your repertoire was a vital part of success in the game. In fact, the beginning of the game was reminiscent of "Mistakes over Success." The game takes the phrase "Every Game Over is the start of a new game" literally-- in Paper Mario you are immediately forced to face Bowser at the beginning of the game, and it's impossible to win. The entirety of the rest of the game is you, Mario, training to take another stab at Bowser and rescue Peach.

    1. Although not directly related to this piece, I think it would be interesting to see what the author thinks about the emergence of "two-screen" experiences. The author talks about "interactivity," which is related. When Microsoft was showing off the Xbox One before its release, they showed that you could plug your cable box into your Xbox to enhance your experience-- I believe the demo reel showed player statistics on the bottom of the screen and a live Twitter feed of a conversation about the game in the margins. In this regard, the game's marginalia becomes a fluid part of the game. This is true even without being on the same screen; for example, when watching the Mets play you can easily find pitch-by-pitch reactions by searching for the pitcher's name or an associated hashtag. During a recent game the commentators were speculating whether or not Matt Harvey was pitching so poorly due to the viral buzz about his poor outings lately. Although people have been critiquing athletes for centuries, the accessibility and ubiquity of social media can surely broaden that criticism and, in turn, affect Harvey's performance. In this way there's an "interactivity" tangible on several levels.

    1. Bridle argues that in a world in which we’ll no longer own books as discrete physical objects, the only really meaningful thing we’ll own will be the reading experience itself.

      This is a truly radical idea that I can get behind. Not only does this observation change the way we can view annotations or marginalia, it changes the way we view the experience of art itself. Especially within the context of the twenty-first century, where a new profit model for any art form is constantly being made the new industry standard, it's important to attribute value to the art and not the medium. Simon & Garfunkel's music, for example, contains the value-- not the 180-gram vinyl, the Tidal subscription, or the eight-track.

    2. Infinite Jest

      Doesn't Infinite Jest already have three hundred pages of footnotes? DFW is on track to give Joyce a run for his money.

    3. According to the marginalia scholar H. J. Jackson, the golden age of marginalia lasted from roughly 1700 to 1820. The practice, back then, was surprisingly social — people would mark up books for one another as gifts, or give pointedly annotated novels to potential lovers

      This is incredibly interesting to me, as once I annotate a book I consider it "unborrowable" unless I have a close friendship/intimacy with the borrower. Also, annotating something that you know others are gonna read makes your annotations more faux-intellectual and less utilitarian. (Oh my god, am I doing that right now? This is getting far too meta for my taste.)

    4. As John Dickerson recently put it on Slate, describing his attempt to annotate books on an iPad: “It’s like eating candy through a wrapper.”

      I find Hypothes.is largely enjoyable; however, I find it hard to stomach lengthy readings on a screen. But the author and John Dickerson make a great point here: writing on PDFs is a complete nightmare.

    5. Writing in them is the closest I come to regular meditation; marginalia is — no exaggeration — possibly the most pleasurable thing I do on a daily basis

      This catharsis reminds me of an anecdote a classmate in a modernism class once shared with me. She said that her mother was an editor for a publishing company and decided one day to pick up a copy of William Faulkner's "The Sound and the Fury." She was apparently so distraught by Faulkner's flagrant disregard of grammar, spelling, and syntax in Benjy's chapter that she reorganized the entirety of the first two narratives with her own marginalia. Of course, following the expansive traditions of New Criticism (and Barthes), this text would be viewed in academia as a completely different text.

    1. Barthes' point on "filiation" is certainly an interesting one, given that many contemporary academics believe that most texts written about past periods are inevitably going to be about the period in which they're written. For example, it's highly improbable that "The Crucible" acted as a platform for Arthur Miller to talk about the witch trials and not the HUAC hearings happening in the United States. Although, perhaps, Barthes is arguing that the text goes beyond this interpretation. Reading "The Crucible," the reader is allowed to view it in the vein of McCarthyism, yet acknowledge the witches and every other reading. It's apparent that Barthes doesn't want the modern reader to say that a text is "about something" or that there's one end-all, be-all way to read a text.

    1. and move toward formulating their own distinctive voice

      This brings up an interesting point: when does marginalia become its own text? Many (including myself) would argue that marginalia has always had a vitality of its own; however, if readers are urged to bring their own "distinct voice" to their annotations, does marginalia only become an autonomous text if the reader goes against the initial point of the text? Surely there must somewhere be a line between an echo chamber and an autonomous text, but where is it?

    2. As Fitzpatrick has pointed out, the visibility of this annotative action is both a gift and a problem

      Unknowingly, I opted to turn my Hypothes.is annotations off right before I read this line. Fitzpatrick and the author of this piece both have great points; however, annotations can be distracting on the page. While it would be interesting to see whether numerous people/scholars find the same passages noteworthy, the result is often aesthetically displeasing. Although aesthetics are (rightfully) unimportant in most open-source educational software, there has to be a firm middle ground between endlessly scrolling to the end to see annotations and being overwhelmed on the page. I think Greg in our class made a good compromise and remedied this problem through his final project for #Allred399.

    1. An instructor’s comment can also prompt students to consider a particular passage in the larger context of the work or in the context of ideas.

      I feel like this is super telling, as I, annotating this piece, found myself looking to other annotations for points of interest. Although this is definitely useful, as it's beneficial to know where others found merit, it could also discourage readers from uncovering their own interesting passages. Although I'm no master of pedagogy, wouldn't it be more interesting to have every student annotate a single passage they see as important, and compare the findings in class?

    2. Annotating The Elements of Style

      This is unconstructive, but that Comic Sans really threw me for a loop.

    1. Tomorrow's readers will immerse themselves in their favorite books, not self-consciously as I did for this experiment, but based on deeper needs

      The key word here is needs: "whatever is useful for us shall be beautiful." Utilitarianism is the way to go in this battle.

    2. That's the worst accusation: that I am not a serious reader. Not guilty! I love books as much as anybody. But I love reading more.

      I think worrying about whether others think you're a serious reader or not is kind of silly. The idea of "poser academics" is funny because on paper academia is the polar opposite of "cool," which is the only reason someone would be a poser. I can just see the Vice article now: "Are Academics the New Hipsters?". Actually, I should stop here before the folks at Vice get any ideas.

    3. Kindle2

      Reading about the Kindle at this point is like reading about the Motorola RAZR or the T-Mobile Sidekick. On paper, it's so recent, but it feels so, so far away.

    4. The iPhone is a Kindle killer.

      I don't understand how ANYONE can primarily read on a screen. Don't get me wrong, I know I'm the outlier in this equation, but reading on an iPhone is infuriating to me. I'll eventually adapt-- I'm not nostalgic about the feel of "old pages" or anything like that; print is just easier for me. Print is not dead-- yet.

    1. It's interesting to learn how/why Edison developed the technology necessary for audio recording-- I feel that this course has a recurring theme in which the main use of an invention is never what it was intended to be. Pretty much any time a new technology is invented, someone figures out how to capitalize on it or completely perverts it. I highly doubt Steve Jobs, struggling over the initial concepts of the iPhone, imagined me mindlessly swiping through OkCupid on the bus.

  2. www.mlajournals.org.proxy.wexler.hunter.cuny.edu
    1. When the community finds such language in supposedly nonliterary sources—Amazon.com reviews, Web forums, Goodreads, blog book events, and library resources online—professionals of the “reading class” are quick to dismiss the activity as wrong reading, a phenomenon we might call the Biennial Harry Potter Backlash

      I think the growing interest in "found poetry" represents how these sources are (slowly) being acknowledged as forms of writing. Recently I saw a (comical) tweet that took the photos out of a Buzzfeed list and claimed it worked as a "strangely affecting" poem. I've done similar art projects in the past: going through Craigslist "Missed Connections" pages and making compilations of sincere, straightforward, wistful feelings of regret. "Content" is no longer just king; content is everything. This annotation is content. The tradition continues.

    2. how good

      I feel like the tone of this piece is vaguely condescending toward those not enrolled at Penn State. The author seems surprised that non-academics were able to enjoy a novel in a case where (most likely pretentious) college kids weren't able to enjoy it. I wish the author had a little more faith in the quality control of non-academic readers.

    3. too cerebral

      I don't want to be friends with anyone who criticizes a novel for being "too cerebral."

    1. In the future, we might expect digital humanities researchers to adapt such off-the-shelf social-computing technologies or innovate new ones to allow for other ways to experience and communicate—that is, to read, interpret, and perform—primary literature. Perhaps custom-designed reading, interpreting, and performing applications will be created by the robust creative and scholarly community of the Electronic Literature Organization to make literature not just what Noah Wardrip-Fruin calls “playable media” but, specifically, socially playable media akin to two of the interactive social modeling examples that Wardrip-Fruin studies in depth in his Expressive Processing: The Sims, a computer game, and Façade, an interactive computer drama. Or perhaps Ivanhoe, the game of interactive, role-playing literary interpretation created by Jerome McGann, Johanna Drucker, and others, will set the mold for socially computable literary experience.

      This is especially poignant given the developments and huge advancements made in the field of VR. Ivanhoe may not appeal to the casual reader; however, being able to visually "play" a novel might change the average consumer's mind.

    2. And it is there in the epic of all the social-news, shared-bookmark, or similar sites that build a portrait of collective life from constantly reshuffled excerpts, links, and tags from that life akin to Homeric formulae. Above all, as a literature professor, I recognize that—viral YouTube videos aside—the vast preponderance of Web 2.0 is an up-close and personal experience of language.

      What disallows viral YouTube videos from this equation? Although their virality may make them a less "personal" experience, they put people on the same wavelength and bring them together. Especially as the years progress, the internet has its own distinct language. Everyone I know who isn't connected to the internet for a significant portion of the time finds no merit in "dat boi," which I personally find hysterical. Call it a decline in humor standards, but I was born well after someone wrote a hit Christmas song about "an Italian Christmas donkey."

    3. “the use of technology in networked communication systems by communities of people for one or more goals,” even if that goal is as seemingly unfocused as building the community itself and one’s identity in it.

      While this is most certainly true, it's essential to note the importance of the "Internet boom" through all of this-- these mediums and startups were able to gain prominence because of the online marketplace that was simultaneously being built around them. Thinking of my parents in the mid-1990s, when we bought our first personal computer, I can't imagine them being very focused on building an online community.

    1. Second, of course, there is the long association between computers and composition, almost as long and just as rich in its lineage

      Everyone scoffs at the English major until they're forced to cohesively utilize the written word. This is a great point by Kirschenbaum: English departments are probably the only group of people that would know what to do with the various tools that came to fruition in the early days of computing.

    2. Digital humanities has also, I would propose, lately been galvanized by a group of younger (or not so young) graduate students, faculty members (both tenure line and contingent), and other academic professionals who now wield the label “digital humanities” instrumentally amid an increasingly monstrous institutional terrain defined by declining public support for higher education, rising tuitions, shrinking endowments, the proliferation of distance education and the for-profit university, and underlying it all the conversion of full-time, tenure-track academic labor to a part-time adjunct workforce

      The need for more expansive digital humanities programs in schools is so apparent. The fact that I personally know four people that have the term "social media" in their job title shows the sprawling nature of the twenty-first century job market. However, these jobs are usually scoffed at by baby boomers and other forms of narcissists (too low a blow?) because disciplines like digital humanities don't have enough visibility in the public sphere.

    3. Tweeting has rapidly become an integral part of the conference scene, with a subset of attendees on Twitter providing real-time running commentary through a common “tag” (#mla09, for example), which allows everyone who follows it to tune in to the conversation.

      The origin of the hashtag is so interesting because of how quickly its initial purpose became hijacked by corporate entities and comedic communities. Of course, I say this as someone who uses hashtags almost exclusively for the sake of irony. I feel that the use of hashtags, as well as memes for that matter, has ultimately closed discourse in areas where it should be opened up. For example, there's no real political value in searching #NeverTrump or #ImWithHer or #BernieOrBust-- you're either gonna stumble upon an echo chamber or trolls just looking for a fight.

    1. author revises and as editors, printers, and other "collaborators" make their own changes to a manuscript

      This is a fantastic point-- "collaborators" are often invisible to the casual reader. Listening to a record passively, one thinks of the songwriter, not the session players, producer, mixing engineer, mastering engineer, A&R person, etc. There's significant ebb and flow in the creative process of anything; in fact, it's a new concept (brought to life by the combination of "DIY" culture and the accessibility of the internet) that one person can oversee the conception and creation of something from start to finish.

    2. Melville responds on the level of diction, syntax, image

      True; however, isn't it acknowledged that all art is derivative and/or intertextual? Melville's musings on his predecessors and contemporaries are noteworthy and interesting, but he's by no means an outlier.

    3. but all the passages that are incorporated freely or in a modified way … are also marked in this copy

      "Incorporated freely" is a fantastic euphemism for stolen. However, this answers my previous question about the source of Melville's passages that were "incorporated," and really works as an adhesive to blend Melville's marginalia and Beale's work together.

    4. The whereabouts of 285 titles have been tracked, which means that more than 700 could still be extant somewhere, waiting for scholars to find them.

      Unrelated: This is reminiscent of Jack White's "secret" vinyl records that he hid in couches during his time as an upholsterer in Detroit. People are still coming across these records as their furniture falls apart.

    5. But he has recovered the next best thing — the notes Melville made in his copy of a critical source for Moby-Dick: Thomas Beale's 1839 book, The Natural History of the Sperm Whale.

      I wonder (I'm sure a cursory Google search could answer this) if this is the same book that Melville "borrowed" from verbatim. Surely such a discovery would irrefutably link the two texts closer together, no?

  3. Mar 2016
    1. Creative people are beginning to exploit interactive and multi-media capability into digital books.

      I think the term "exploit" is kind of loaded here-- surely we should not shame creative people for utilizing new avenues?

    2. The members of the generation that grew up playing Game Boys and telling time on their cellphones will have absolutely no problem reading from a small screen.

      My experience doesn't bear this out. Growing up I was always playing my Game Boy, whether at home, after school, or in the car. However, to this day, I have trouble reading long-form texts on a screen. My distaste for reading novels on screens doesn't stem from some misplaced fear of technology; I would love to be able to read that way, but I just can't concentrate.

    3. I think it's an interesting idea that an audiobook can happen without you. Although I strongly disagree with the colloquial view that listening is an inherently passive act, there's no denying that an audiobook can leave you behind if you get distracted. It requires a certain discipline in the same way that watching Netflix at night does. Fall asleep during an episode of Arrested Development and you'll have to feverishly thumb through multiple episodes to find your place and what you last saw. The audiobook requires you to be completely present, even if you're multitasking.

  4. Feb 2016
    1. Computationally assisted text analysis, we realize, is a way to experiment with literature to bring out, among other features, its latent social network and that of the characters in its imaginative worlds. In the last analysis, after all, a concordance represents how even disjunct speakers share a sense of a word and so conjoin in a discursive structure that images a larger social structure.

      This is an interesting concept-- implementing these types of tools breathes fresh air into stale, canonical texts. Over time the criticism and the text become so seamlessly intertwined that it's hard to postulate where the author ends and the discourse begins. That being said, using scientific tools to analyze and find patterns in classical texts can bring a new understanding of previously unknown trends and patterns in literary history.
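
      To make the idea of a concordance concrete, here is a minimal keyword-in-context (KWIC) sketch in Python-- my own illustration rather than the tool the authors describe-- assuming the text under study has been saved locally under the hypothetical filename text.txt. It prints every occurrence of a chosen word with a few words of context on either side, which is the raw material for seeing how "disjunct speakers share a sense of a word."

      ```python
      # Minimal keyword-in-context (KWIC) concordance.
      # Assumes the text under study is saved locally as text.txt (hypothetical path).
      import re

      def kwic(path, keyword, window=5):
          with open(path, encoding="utf-8") as f:
              tokens = re.findall(r"[A-Za-z']+", f.read())
          keyword = keyword.lower()
          lines = []
          for i, tok in enumerate(tokens):
              if tok.lower() == keyword:
                  left = " ".join(tokens[max(0, i - window):i])
                  right = " ".join(tokens[i + 1:i + 1 + window])
                  lines.append(f"{left:>40}  {tok}  {right}")
          return lines

      if __name__ == "__main__":
          # The keyword is illustrative; any word of interest works.
          for line in kwic("text.txt", "whale"):
              print(line)
      ```

      Grouping the output by speaker or chapter would be the obvious next step toward the "latent social network" the passage mentions.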