regulated
summarize Fanon's work from here for theory background
I think this looks great and it should make a great template for further inquiries from other sources in the future! I agree with Prof Kleinman about the year time frame; that's where the rest of the project starts. Another idea we could consider would be to try to look at time frames that correspond with Ben Schmidt's humanities crisis time frames. However...like you say, that's a lot of local news transcripts. The tricky thing is that we don't know how much usable data we will get from one year vs. 10 years...but we know that we will get a lot of raw data to sort through when they send us big chunks of time. Processing transcript data for our purposes may prove pretty time consuming, so we want to consider that. I'm not sure, as you note, what they would be able to give us, so maybe it's good to ask for more and accept less...?
Anyway, I think it's brilliant. Thank you, Leo!
For all the interesting things in Drout's (and Hitotsubashi's and Scavera's) article, it does not, ultimately, stray too far from the central idea of Tolkien's "impression of depth," which all the other evidence is marshaled to support. The tools are used to confirm a hypothesis, which is not too different from the way close reading as a tool is employed.
I had this thought as well. Maybe I missed something, but I couldn't pinpoint in this case where the digital tool was able to uncover something traditional humanities scholarship could not. I was convinced by the argument long before they got to Lexomics, and I kept waiting for the digital method to give me something more. I suppose that digital methods needn't always be the main event and can sometimes be used simply to back up claims, and it's a great showcase of what this tool can do. It seems like they already knew what they would find (it would have been interesting if they had found something different though...then we've got a party), and I didn't really find the digital evidence to be more compelling than the traditional evidence.
Obviously, the author had a change of heart after working more closely with Digital Humanities. I think this is what happens to most literary scholars when they delve into DH. First, they have an aversion to it because of the technical barriers, and then they see that the secret they sought in the humanities might be better detailed by using Digital Humanities.
This is a good point, Ray. I do hope that Marche's experience working with digital humanists and DH technologies has softened him to the field, or at least helped him understand its possibilities better, even if I do think that what he was interested in doing (writing a better story) is not the best implementation of DH methods.
The critic can learn and gather enough information in order to support an argument, regardless of the fact that this information, as Marche puts it, is “fragmented.” We work with what we have, and this applies to both the digital and non-digital aspects of work.
This is so spot on. Marche seems to be laboring under the idea that data in any discipline is to be treated as some kind of sterile, perfect whole picture, when this is not the case in digital humanities or in any field that regularly interacts with this thing called data (as selisker points out). Data is perfectly comfortable being fragmented. And not only that, but the idea that "turning literature into data" is some kind of irreversible, transformative (and degenerative) process, like turning gold into iron, seems to be weighing heavily on Marche's mind, but he needn't worry. A digital humanist is not going to forget that the data they are working with is the subject of humanistic study.