- Sep 2023
-
delong.typepad.com
-
How to Use a Dictionary
Surely this won't include the idea of John McPhee's use of a dictionary in Draft #4?
-
-
-
In 1807, he started writing a dictionary, which he called, boldly, An American Dictionary of the English Language. He wanted it to be comprehensive, authoritative. Think of that: a man sits down, aiming to capture his language whole.
Johnson's dictionary is much like what this article describes, too.
Perhaps we need more dictionaries with singular voices rather than dictionaries made by committee?
-
John McPhee — one of the great American writers of nonfiction, almost peerless as a prose stylist — once wrote an essay for the New Yorker about his process called “Draft #4.” He explains that for him, draft #4 is the draft after the painstaking labor of creation is done, when all that’s left is to punch up the language, to replace shopworn words and phrases with stuff that sings.
I quite like the idea of this Draft #4 concept.
-
- Feb 2022
-
materchristi.libguides.com
-
When you read widely, your brain is exposed to different ways in which a sentence or paragraph is written. There are patterns in the use of nouns, pronouns, verbs and other parts of speech; there are patterns in syntax and in sentence variation; and there are patterns in sound devices, such as alliteration and assonance. You can annotate these with different symbols or colors, and develop understanding as patterns emerge, and style emerges from patterns. To read like a writer, you need to annotate like one, too.
I haven't seen much in the area of annotating directly as a means of learning to write. This is related to the idea of taking notes to create content for a zettelkasten, but the focus of such a collection would instead be on developing a writing style.
Similar to boxing the boring words (see Draft #4; http://jsomers.net/blog/dictionary), one should edit with an eye toward the overall style of a particular piece.
Annotating structures and patterns in books is an interesting exercise for evaluating an author's style as a means of potentially subsuming, modifying, or learning other styles. A rough computational version of the exercise is sketched below.
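To make the exercise concrete, here is a rough Python sketch (standard library only) of annotating for patterns in code rather than in the margins: it tallies sentence-length variation, repeated sentence openings, and a crude form of alliteration across a passage. The sample text is a placeholder for whatever author you are studying.

```python
# A sketch of "annotating like a writer" in code: surface the patterns behind
# a style (sentence variation, openings, sound devices) instead of marking
# them with symbols or colors. The sample text is a stand-in.
import re
from collections import Counter

text = (
    "The sea slept. Silver sails slid south under a sullen sky. "
    "He wrote. He waited. He wondered what the water wanted."
)

sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
lengths = [len(re.findall(r"[A-Za-z']+", s)) for s in sentences]
openings = Counter(s.split()[0] for s in sentences)

# Sound device: adjacent words sharing an initial letter (crude alliteration).
alliteration = Counter()
for s in sentences:
    initials = [w[0].lower() for w in re.findall(r"[A-Za-z']+", s)]
    for a, b in zip(initials, initials[1:]):
        if a == b:
            alliteration[a] += 1

print("Sentence lengths (variation):", lengths)
print("Repeated sentence openings:", openings.most_common(3))
print("Alliterative initials:", alliteration.most_common(3))
```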
-
-
Local file
-
We need to get our thoughts on paper first and improve them there, where we can look at them. Especially complex ideas are difficult to turn into a linear text in the head alone. If we try to please the critical reader instantly, our workflow would come to a standstill. We tend to call extremely slow writers, who always try to write as if for print, perfectionists. Even though it sounds like praise for extreme professionalism, it is not: A real professional would wait until it was time for proofreading, so he or she can focus on one thing at a time. While proofreading requires more focused attention, finding the right words during writing requires much more floating attention.
Proofreading while rewriting, structuring, or doing the thinking or creative parts of writing is a form of bikeshedding. It is easy to focus on the small and picayune fixes when writing, but this distracts from the more important parts of the work which really need one's attention to be successful.
Get your ideas down on paper first and save the proofreading for the end. Switching contexts from thinking and creativity to spelling, fine points of grammar, and typography is taxing, a form of multi-tasking.
Link: Draft #4 and using Webster's 1913 dictionary for choosing better words and phrasing as a discrete step within the rewrite.
Linked to above: Are there other dictionaries, thesauruses, books of quotations, or individual commonplace books and waste books that can serve as resources for finding better words, phrases, or phrasing when writing? Imagine searching through Thoreau's commonplace book for interesting turns of phrase. Naturally, searching through one's own commonplace book is a great place to start, if you're saving those sorts of things, especially from fiction. (A sketch of such a search follows below.)
Link this to Robin Sloan's AI talk and the use of artificial intelligence and corpora of literature to generate writing.
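As a toy version of the search imagined above, here is a minimal Python sketch, assuming your commonplace book lives as Markdown files in a folder; the "commonplace/" directory and the target word are hypothetical stand-ins. It returns every sentence in which you have already used a given word, so past phrasing is on hand before you reach for a thesaurus.

```python
# A minimal sketch of mining a commonplace book for phrasing. Assumes notes
# are plain-text Markdown files under notes_dir; adjust the glob to taste.
import re
from pathlib import Path

def find_phrasings(notes_dir: str, word: str):
    """Yield (filename, sentence) pairs where the sentence uses `word`."""
    pattern = re.compile(rf"\b{re.escape(word)}\b", re.IGNORECASE)
    for path in Path(notes_dir).rglob("*.md"):
        text = path.read_text(encoding="utf-8", errors="ignore")
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            if pattern.search(sentence):
                yield path.name, sentence.strip()

# Hypothetical usage: how have I used "luminous" before?
for name, sentence in find_phrasings("commonplace/", "luminous"):
    print(f"{name}: {sentence}")
```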
-
- Jan 2022
-
vimeo.com
-
from: Eyeo Conference 2017
Description
Robin Sloan at Eyeo 2017 | Writing with the Machine | Language models built with recurrent neural networks are advancing the state of the art on what feels like a weekly basis; off-the-shelf code is capable of astonishing mimicry and composition. What happens, though, when we take those models off the command line and put them into an interactive writing environment? In this talk Robin presents demos of several tools, including one presented here for the first time. He discusses motivations and process, shares some technical tips, proposes a course for the future — and along the way, writes at least one short story together with the audience: all of us, and the machine.
Notes
Robin created a corpus from issues of If Magazine and Galaxy Magazine on the Internet Archive and trained a language model on it to use as a writing tool. He talks about using a few other models for generating text as well.
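Sloan's demos use recurrent neural networks (see the Karpathy reference below); as a stand-in small enough to read here, the following Python sketch builds a word-level Markov chain over a plain-text corpus and completes a prompt from it. It is not his tool, but it enacts the same loop: type a phrase, get a machine continuation drawn from the corpus, accept it or riff on it. "corpus.txt" is an assumed placeholder for whatever corpus you assemble.

```python
# A word-level Markov chain as a stand-in for Sloan's RNN writing tool:
# feed in your sentence so far, get back a completion sampled from the
# corpus. "corpus.txt" is any plain-text corpus (Sloan used scanned issues
# of If and Galaxy from the Internet Archive).
import random
from collections import defaultdict

def build_model(text: str, order: int = 2):
    """Map each `order`-word prefix to the words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        model[tuple(words[i : i + order])].append(words[i + order])
    return model

def complete(model, prompt: str, n_words: int = 20, order: int = 2):
    """Extend `prompt` by repeatedly sampling a successor word."""
    out = prompt.split()
    for _ in range(n_words):
        successors = model.get(tuple(out[-order:]))
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = open("corpus.txt", encoding="utf-8").read()
model = build_model(corpus)
print(complete(model, "the ship drifted"))
```

Swapping the Markov chain for an RNN changes the quality and reach of the suggestions, not the shape of the interaction.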
Some of the idea here is reminiscent of the way John McPhee used Webster's 1913 dictionary for finding words (or le mot juste) for his work, as tangentially suggested in "Draft #4" in The New Yorker (2013-04-22).
Cross reference: https://hypothes.is/a/t2a9_pTQEeuNSDf16lq3qw and https://hypothes.is/a/vUG82pTOEeu6Z99lBsrRrg from https://jsomers.net/blog/dictionary
Croatian a cappella singing: klapa https://www.youtube.com/watch?v=sciwtWcfdH4
Writing using the adjacent possible.
Corpus building as an art [~37:00]
Forgetting what one trained one's model on and then seeing the unexpected come out of it. This is similar to Luhmann's use of the zettelkasten as a serendipitous writing partner.
Open questions
How might we use information theory to do this more easily?
What does a person or machine's "hand" look like in the long term with these tools?
Can we use corpus linguistics in reverse for this?
What sources would you use to train your model?
References:
- Andrej Karpathy. 2015. "The Unreasonable Effectiveness of Recurrent Neural Networks." Blog post.
- Samuel R. Bowman, Luke Vilnis, Oriol Vinyals, et al. 2015. "Generating Sentences from a Continuous Space." arXiv:1511.06349
- Stanislau Semeniuta, Aliaksei Severyn, and Erhardt Barth. 2017. "A Hybrid Convolutional Variational Autoencoder for Text Generation." arXiv:1702.02390
- Soroush Mehri, et al. 2017. "SampleRNN: An Unconditional End-to-End Neural Audio Generation Model." arXiv:1612.07837 (applies neural networks to sound and sound production)
-