  1. Aug 2022
    1. in Oberon (as in Python, but very much not as in C or Pascal) a character is a string of length 1

      nope

    1. software collapse

      Is this not just a new name for what was already described in the 1960s as the "software crisis"?

    1. I can't possibly keep updating software to deal with new JavaScript versions

      There's a fundamental misunderstanding (alternatively, misdirection) about what the source of breakage is. JS is not an SDK, and neither is the Web platform. Whatever worked in the old "JavaScript version" will work in the new one.

    1. A complete description on implementing even the simple IMGUI would take too much space. I invite you instead to watch Casey’s video lecture mirrored on my web site, and look at the sample code I’ve posted. I’d like to thank Casey Muratori, Jonathan Blow, and Atman Binstock for feedback on this article, some of which you can read on Casey’s web site forum.
    2. a “global variable” which identifies which widget is active
    3. Instead, we simply describe them from scratch every frame, which doesn’t actually result in more code. It’s more like we run the “create” code every frame.

      Now imagine, as a performance optimization, a lib where you can write in IMGUI style, so you write the same amount of code, but behind the scenes, the lib is managing things, so it's actually free to do all the retained mode heavy lifting.

      We just invented declarative programming.
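
      Here's a minimal sketch of that idea (all the names and the keying-by-call-order scheme are my assumptions, not anything from the article): the caller writes immediate-mode code that runs every frame, while the lib retains DOM nodes behind the scenes and only touches what changed.

      const state = { count: 0 };
      const retained = []; // widget cache, keyed by call order
      let cursor = 0;
      let clicked = null;

      // Reuse the node at the cursor if it exists; create it otherwise.
      function widget(tag, text) {
        let el = retained[cursor];
        if (!el) {
          el = document.body.appendChild(document.createElement(tag));
          retained[cursor] = el;
        }
        if (el.textContent !== text) el.textContent = text; // touch only on change
        cursor++;
        return el;
      }

      function button(text) {
        const el = widget("button", text);
        el.onclick = () => { clicked = text; render(); };
        const hit = clicked === text;
        if (hit) clicked = null;
        return hit;
      }

      // The "create" code runs every frame, but no nodes are recreated.
      function render() {
        cursor = 0;
        widget("p", "count: " + state.count);
        if (button("increment")) { state.count++; render(); }
      }

      render();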

    4. This is less efficient than the previous approach, but it’s less likely to have bugs. If we have CPU speed to spare, we can get rid of all the bugs.

      Interesting remarks, given the Muratori school's takes on the profligate misuse of computing resources by modern developers.

    5. Game Developer. September 2005.


    1. This describes one of the most pleasing hacks I've ever come across. I just now tracked it down and added it to my bookmarks. (Not sure why it wasn't already there.)

      You could also conceive of going one step further. When your app (it doesn't actually have to be a game, though admittedly it's much easier for you if it is) is compiled with tweak.h, that gives it the power to paint the source file on the screen—so you don't actually have to switch over to your text editor to save it, etc. Suppose you want to provide custom inputs like Bret Victor-style sliders for numeric values. You could edit it in your text editor, or you could derp around with it in-app. Tweaking the value in-app should of course both update it wrt the app runtime and still write the file to disk, so if live reloading is turned on in your text editor, whatever changes you make inside the live process image get synced out.

  2. scattered-thoughts.net
    1. I like to organize all the state in a program so that it's reachable from some root pointer. This makes it easy for the reader to understand all the resources allocated by the program by just following the tree of types. It also makes it easy to find any piece of state from within a debugger, or to write little debugging helpers that eg check global invariants or print queue lengths.
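
      A tiny illustration of the discipline in JS terms (the shape of the tree here is made up for the example):

      // All program state hangs off one root; nothing hides in stray globals.
      const root = {
        config: { retries: 3 },
        connections: [], // every open resource is reachable from here
        queues: { incoming: [], outgoing: [] },
      };

      // Little debugging helpers can then just walk the tree:
      function queueLengths(r) {
        return Object.fromEntries(
          Object.entries(r.queues).map(([name, q]) => [name, q.length]));
      }

      console.log(queueLengths(root)); // { incoming: 0, outgoing: 0 }
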
    1. I've gotten to where "if I can't make sense of it and be productive in it with just vim, grep, and find, your code is too complex".

      See Too DRY - The Grep Test.

      But even allowing for grep is too lax, in my view. It's too high a cost. If I've got some file open and am looking at some Identifier X, I should be able to both deterministically and easily figure out exactly which file X lives in.

      Violating this is what sucks about C's text-pasting anti-module system, and it's annoying that Go's "package" system ends up causing similar problems. I shouldn't have to have a whole-project view. I should be able to just follow my nose.
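
      To illustrate the "follow your nose" property (the file names here are hypothetical):

      // ES modules: the use site names the exact file the identifier lives in.
      import { parseConfig } from "./config/parse.js"; // <- defined there, full stop

      console.log(parseConfig("app.toml"));

      // Contrast: in a Go package split across files, a bare identifier like
      // parseConfig may be declared in any .go file in the same directory, so
      // resolving it by hand requires a whole-package view.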

    1. Maybe that’s why Wordpress has always struggled so much with building any kind of network effect

      I think it is the beneficiary of a network effect—it's just driven by a professional class, and not end users.

    1. and free of globals

      Ah! This remark highlights a fundamental difference in understanding between two camps, which I have been (painfully) aware of, but the source of this confusion has eluded me until only just right now. (Really, this is a source of frustration going back years.)

      In one camp, the advice "don't use global variables" is a way of attacking a bunch of things endemic to their use, most notably unnecessary coupling to spooky state. In another camp, "no global variables" is understood to mean literally that and taken no further—you can have as much spookiness as you like; so long as the value is not directly accessible (visible) from, say, another piece of code appearing at the top-level ("global") context, as with the way i is bound to the activation record in this example while remaining inaccessible outside the scope of getGetNext, you're good.

      That is, there are two aspects to variables: visibility and extent, and the first interpretation seeks to avoid the negative effects on both dimensions, while the second is satisfied by narrowly prohibiting direct visibility across boundaries.

      I find the latter interpretation bizarre and completely at odds with the spirit of the exhortation for avoiding globals in the first place.

      (What's worse is that the second interpretation usually goes hand in hand with the practice of making extensive use of closures, which, because closures are propped up as being closely associated with functions, leads people to regrettably refer to this style as functional programming. This is a grave error—and, to repeat, totally at odds with the spirit of the thing.)
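
      A hypothetical reconstruction of the getGetNext example alluded to above (the details are assumed): i is invisible from the top-level context, which satisfies the second camp, yet it is long-lived, shared, mutable state; exactly the spookiness the first camp is attacking.

      // No "global variable" by the second camp's lights: i is not visible
      // at the top level...
      function getGetNext() {
        let i = 0;
        return function getNext() {
          return i++; // ...yet i has indefinite extent: one shared, mutable cell
        };
      }

      const next = getGetNext();
      next(); // 0
      next(); // 1; every call observes state mutated by the previous one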

    1. I don't like bloat and the JavaScript fuelled Win98-isation of the web. Therefore I tend to write very spartan webpages. This one for example is generated from a lightweight markup format that I created, and uses only the P and A elements.
      • The Web has suffered bloating
      • This page, therefore, is simple:
      • All that's involved here is a de novo language that was created in such a way that this page has to be generated from a separate, canonical source file written in that format

      This is a silly set of statements to make in series.

      Just make the published form of this document the canonical representation—treat it as suitable for reading and editing alike (since it is).

    2. Typography in HTML is awful

      The mistake being made here is similar to the one that typically precedes the reminder that there is no such thing as a fast programming language—only a given implementation can be called fast.

    1. I avoided using languages that I don't know how to bootstrap like node.js

      There's a weird (read: "weirdly obvious") category error here. NodeJS is not a language. (This wouldn't be so notable if the comment didn't go on to say "The key point is writing to an interface and not an implementation.")

      The puzzle piece that fits the shape of the hole here is "JS". JS is the language, NodeJS is one of its implementations—and chubot knew both of these things already, so it's odd that it was expressed this way. Plus, there's a lot more diversity of JS implementations than exists for e.g. Python...

    2. IMO avoiding things like CommonMark and Python will just make your site worse right now

      I don't see how avoiding Python "will just make your site worse"—whether right now, or ever. Is there supposed to be something inevitable about Python?

    1. Its account of how Heft made his flag closely resembled the standard story, but instead of any assertion that it became the basis for the official design, it merely said that it was “considered Lancaster’s first.”

      I'm having trouble parsing this.

    1. I basically think of it as an "executable README". A README or blog post often contains shell commands. So I just copy those into a script.

      Suppose that instead of shell commands, the snippets were JS, and the README, rather than being named README.markdown or README.txt, were actually named e.g. README.txt.htm. It wouldn't be basically like an executable README—it would actually be executable. You could double click it to open, and then read through it and use it to actually do the stuff that the README is documenting (like build the project in question).
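
      A sketch of what that could look like (the file layout and what "running" means are assumptions on my part; readable as text, executable when opened in a browser):

      <!-- README.txt.htm -->
      <pre>
      To build this project:
        step 1: fetch dependencies
        step 2: compile
      (Opened in a browser, the button below will perform these steps for you.)
      </pre>
      <button id="build">Do the steps above</button>
      <script type="module">
        document.getElementById("build").onclick = async () => {
          // Hand off to the project's own in-browser tooling, whatever
          // that is for the project in question.
          const { build } = await import("./tools/build.js"); // hypothetical
          await build();
        };
      </script>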

    1. they're called objects, and everybody has them

      Even most ostensible FP practitioners who swear they don't.

  3. gbracha.blogspot.com
    1. Static variables are bad for re-entrancy. Code that accesses such state is not re-entrant. It is all too easy to produce such code. Case in point: javac. Originally conceived as a batch compiler, javac had to undergo extensive reconstructive surgery to make it suitable for use in IDEs. A major problem was that one could not create multiple instances of the compiler to be used by different parts of an IDE, because javac had significant static state.

      Walter Bright described similar issues with the Digital Mars compiler in a recent NWCPP talk "Adding Modules to C in 10 Lines of Code", albeit Walter's compiler sounds like it was far easier to fix.

      It's funny that this happens with compilers. Wirth's Oberon compiler (module ORP) also depends on static state. But compilers seem to me a natural example of something you'd want to allow multiple instances of—it's not even among the list of things I'd expect people to make excuses for (cf https://kentonshouse.com/singletons).

    1. There are no static variables and no initialization
    2. Since the original Pascal was implemented with a one-pass compiler, the language believes strongly in declaration before use.  In particular, procedures and functions must be declared (body and all) before they are used.  The result is that a typical Pascal program reads from the bottom up - all the procedures and functions are displayed before any of the code that calls them, at all levels.  This is essentially opposite to the order in which the functions are designed and used.

      Worth noting that almost every C program is impaired by a similar limitation in the C language (despite the disclaimer that follows this passage about the use of the preprocessor), and many programmers' thought processes suffer because of it—they give no consideration to the presentability of code (even when working in languages that are not affected by this limitation!)

    1. JavaScript Tips

      This is a 404.

    2. see {{bug(442099, “bug”, 19)}}

      That's bug 442099, comment 19, i.e. trying to substantiate the opinionated claim that "an operator at the front" is better for human factors.

    3. In JavaScript, == is preferred to ===.
    1. And second it was not immediately visible in a program text, where an imported identifier was declared.

      The problem of multiple import forms plagues ECMAScript today. Additionally, the lack of "immediate" visibility of "where an imported identifier was declared" continues to plague Golang, to an extent even worse than described here for Modula-2, because at least when you're iteratively jumping through a Modula-2 file with naive text search, you'll eventually reach the import statement, even if that involves wrapping around after reaching the bottom; it's explicit at some point. The Golang team, on the other hand, having not been sufficiently wounded by the problems of C, designed the language with magical name resolution, making it tedious for a human to carry out the process that the compiler uses.

    2. identifiers may now appear that are neither locally declared, nor qualified by a module name, nor visible in an import list; an entirely undesirable situation in a structured language
    3. As an aside, we note that in object-oriented languages the concept of data type is merged with the module concept and is called a class.


    1. I've been perusing codetriage.com/ this morning and most projects are essentially for the benefit of other developers.

      This is the dirty secret of the neo-OSS era—that it's mostly devops shovelware (to a degree that makes it look like Sturgeon was being conservative)

    1. The effort got him accolades and commit access to the Rails repo.

      But having commit access and having the ability to fiddle with bugs are two orthogonal sets of privileges...

    2. The creator of CodeTriage, Richard Schneeman, was surprised to learn one day that the Ruby on Rails core team (of about 7 or so people) were responsible for handling ALL the issues opened on the Rails GitHub repo. At the time, there were about 700 or so issues. These maintainers are hugely valuable to the project due to their depth of knowledge, however keeping up with issues was taking a huge amount of time. Would you rather these highly specialized maintainers spend their time developing new features and actually fix bugs, or would you want them to spend their days responding to hundreds of issues? Asking for simple questions like "what version of Rails are you using?", and "can you provide an example app?", and "is this fixed on master?”. While asking these questions might only take 5 or 10 minutes of time, the sheer volume of issues per maintainer was unreasonable. This was highlighted by the herculean efforts of another developer Steve Klabnik, who went through every single issue and responded to all of them in a marathon session spanning multiple days. The effort got him accolades and commit access to the Rails repo. While he deserves the praise, the efforts were ultimately unsustainable.

      Surprise: going all in on GitHub—including abandoning traditional practices governing bugtrackers in favor of GitHub's anemic project management tools—has a negative impact.

    1. "Why have a locked wiki when you can instead just post static Web pages?"

      What even is a locked wiki insofar as the ways it differs from traditional (pre-wiki) content publishing pipelines? Where's the wiki part of it?

    1. Wirth pithily makes some of the same points in his HOPL III paper on the history of Modula-2 and Oberon

    1. Related: much of functional programming is not even functional. Closures end up doing a lot of heavy lifting, despite being at odds with what FP claims to be.

    1. I feel those implications are very intellectually liberating and are far more cognitively organic than the print paradigm which currently shapes our intellectual mindset

      There's a lot to be gained by reflecting on pre-Web paradigms and hewing to similar approaches—electing to be bound by the constraints. Print is a discipline, and the practices surrounding the print-based publishing industry comprise a form of technology themselves.

      Related: linearity is a virtue.

    1. Other versions which are available are:

      Another PDF from CERN, but this one is what looks like a PDF of the original as a first-class digital document, i.e., not a scan of a paper copy: https://cds.cern.ch/record/369245/files/dd-89-001.pdf

  4. Jul 2022
    1. TI’s effective monopoly power hurts these kids as much as it has chilled the development of better graphing calculators with superior feature sets from other manufacturers.

      I'm not sure the argument of chilling effect holds up in light of the high price. A high-priced incumbent should be easier to dislodge than one priced reasonably.

    1. if you’re a beginner you can use Replit which allows you to program through your browser without installing anything on your machine
    1. publishers and advertisers deserve full control of their audio experiences

      gross

    1. We never got there. We never distributed the source code to a working web browser, more importantly, to the web browser that people were actually using. We didn't release the source code to the most-previous-release of Netscape Navigator: instead, we released what we had at the time, which had a number of incomplete features, and lots and lots of bugs.
    2. People only really contribute when they get something out of it. When someone is first beginning to contribute, they especially need to see some kind of payback, some kind of positive reinforcement, right away. For example, if someone were running a web browser, then stopped, added a simple new command to the source, recompiled, and had that same web browser plus their addition, they would be motivated to do this again, and possibly to tackle even larger projects.
    1. readable

      Readable how? I think the better approach over what we do now—where we run source code through what are essentially compilers for making websites and then treat the output like object files, i.e. opaque blobs that are unsuitable for anything other than either (a) executing/experiencing or (b) publishing to others—would be to pursue non-destructive compilation. After the Markdown is compiled (if there is any compilation step at all), you don't have to keep the original sources around. The tooling should be sufficiently advanced to work with the publishable form as input, too, and not demand only Markdown.

      In instances where the Markdown files are kept around because the spartan experience of opening a plain text file, almost entirely unadorned by formatting concerns, is the preferred way to get things done, the tooling should be able to derive the original Markdown (or a good enough rendition) from the output itself. HTML is rich enough to encode the necessary structure for this on the first Markdown-to-HTML pass. Systems that implement this kind of thing could be said to support a sort of "reversible Markdown", making the publishable form a suitable source of truth—a treatment that is right now reserved only for the originals that are collected and kept off-site.

      Make the writing and editing experience more like Word or Google Docs and less like LaTeX.
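
      As a gesture at what "reversible" could mean, here's a toy derivation of Markdown back out of published HTML (assuming, for the sketch, a document of only headings, paragraphs, and links):

      function htmlToMarkdown(rootElement) {
        const out = [];
        for (const el of rootElement.children) {
          switch (el.tagName) {
            case "H1": out.push("# " + el.textContent); break;
            case "H2": out.push("## " + el.textContent); break;
            case "P": // re-encode links as [text](href)
              out.push([...el.childNodes].map((node) =>
                node.nodeName === "A"
                  ? `[${node.textContent}](${node.getAttribute("href")})`
                  : node.textContent
              ).join(""));
              break;
          }
        }
        return out.join("\n\n");
      }

      console.log(htmlToMarkdown(document.body));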

    2. A CMS for hosting, editing and maintaining markdown files AND a hosting service for publishing these as blogs.

      Another solution, in two steps:

      1. make your SOPs executable
      2. give them a promotion; make them first-class content (they should live on the site that you're publishing—although not necessarily front-and-center—not hidden away in the README of some ghost repo)

      See also: A New Publishing Discipline.

    3. triplicate
    4. Slab
    5. I think Mozilla would be a great steward for a project like this.

      From a former Mozillian's perspective, Mozilla would be a terrible steward for this. They would be a terrible steward now, and they'd have been a terrible steward in 2018.

    6. Here’s what I’d like to see:

      Sounds like write.as.

    7. getting set up requires a github account and “pushing” commits every time I write a post
    8. But starting, hosting and maintaining your own blog is still too hard.
    1. Finally, there is the nuclear option: the move the authors label the “hollow man”. With this, one does not even try to refute the substance of an argument; instead, one claims that whatever one’s opponent has to say in favour of proposition P is in reality a smokescreen for their real (and terrible) reasons for holding P. One can label one’s opponent a “racist”, a “bigot” or a “fascist”, and suggest that an audience is safest by ignoring the arguments altogether, for fear of being duped into bigotry itself.

      Did this require a new term ("hollow man")? Isn't "ad hominem" sufficient?

    1. Since they are already using the Node toolchain for the front-end, developers from this track only needed to stretch a bit more to become “full-stack” engineers.

      Think about the irony of this.

    2. The current trend of using arrow functions everywhere is absolutely killing me
    3. in my mind, this had led to a generation of engineers that write poor JavaScript because now they are reliant purely on the tooling to provide productivity instead of using the tooling to improve productivity and using plain old good practices for organizing code, naming things well, encapsulating logic, and so on.
    4. Unlike .NET, JavaScript does not have a rich set of base class libraries. It has been dependent on the community to fill that gap by writing open source projects and publishing shared packages to NPM.

      What always gets me when people bring this up is that none of those repeating it ever seem to have thought of shamelessly copying the superior libraries they have in mind, rather than buying in to the purportedly lower quality system.

    1. I also don't think you know what you're talking about when you talk about code readability. By and large, the vast majority of opinions on code readability, when you break them down, are based on a personal stance about preferences in writing
    1. In order to keep your service online, you are required to keep a positive account credit balance. If your account balance drops low, our system will automatically send multiple warning emails. If despite that, you still fail to recharge your account, the system will automatically suspend your account and all your pull zones. Any data in your storage zones will also be deleted after a few days without a backup. Therefore, always make sure to keep your account in good standing.

      Should be able to separate storage balance and credits for servicing traffic.

    1. It's also hard to share this workflow with someone non-technical. I have to setup and maintain the correct environment on their machine
    1. Not saying I agree or disagree with this, but the existence of a class system in tech jobs is the OP’s central point.

      I'm continually surprised at how often something gets posted and HN fails to understand even the very basic points of a piece of writing, even when they're very clearly made, as they were here. PragmaticPulp's top comment (and the fact that it is the top comment) is completely mystifying, for example.

    1. Yes, it’s making it easier than ever to write code collaboratively in the browser with zero configuration and setup. That’s amazing! I’m a HUGE believer in this mission.

      Until those things go away.

      A case study: DuckDuckHack used Codio, which "worked" until DDG decided to call it a wrap on accepting outside contributions. DDG stopped paying for Codio, and because of that, there was no longer an easy way to replicate the development environment—the DuckDuckHack repos remained available (still do), but you can't pop over into Codio and play around with it. Furthermore, because Codio had been functioning as a sort of crutch to paper over the shortcomings in the onboarding/startup process for DuckDuckHack, there was never any pressure to make sure that contributors could easily get up and running without access to a Codio-based development environment.

      It's interesting that, no matter how many times cloud-based Web IDEs have been attempted and failed to displace traditional, local development, people keep getting suckered into it, despite the history of observable downsides.

      What's also interesting is the conflation of two things:

      1. software that works by treating the Web browser as a ubiquitous, reliable interpreter (in a way that neither /usr/local/bin/node nor /usr/bin/python3 are reliably ubiquitous)—NB: and running locally, just like Node or Python (or go build or make run or...)—and

      2. the idea that development toolchains aiming for "zero configuration and setup" should defer to and depend upon the continued operation of third-party servers

      That is, even though the Web browser is an attractive target for its consistency (in behavior and availability), most Web IDE advocates aren't actually leveraging its benefits—they still end up targeting (e.g.) /usr/local/bin/node and /usr/bin/python3—except the executables in question are expected to run on some server(s) instead of the contributor's own machine. These browser-based IDEs aren't so browser-based after all, since they're just shelling out to some non-browser process (over RPC over HTTP). The "World Wide Wruntime" is relegated to merely interpreting the code for a thin client that handles its half of the transactions to/from said remote processes, which end up handling the bulk of the computing (even if that computing isn't heavyweight and/or the client code on its own is full of bloat, owing to the modern trends in Web design).

      It's sort of crazy how common it is to encounter this "mental slippery slope": "We can lean on the Web browser, since it's available everywhere!" → "That involves offloading it to the cloud (because that's how you 'do' stuff for the browser, right?)".

      So: want to see an actual boom in collaborative development spurred by zero-configuration dev environments? The prescription is straightforward: make all these tools truly run in the browser. The experience we should all be shooting for resembles something like this:

      Step 1: clone the repo
      Step 2: double click README.html
      Step 3: you're off to the races—because project upstream has given you all the tools you need to nurture your desire to contribute

      You can also watch this space for more examples of the need for an alternative take on actually achieving the promise of increased collaboration through friction-free (or at least friction-reduced) development:

      • https://hypothes.is/search?q=%22the+repo+is+the+IDE%22
      • https://hypothes.is/search?q=%22builds+and+burdens%22

    1. Old clients, bosses, colleagues?

      This presupposes something; is this step #1 or not? (In other words, "Where did those people come from?")

      I see lots of handwaving on this part—which is silly, because for the people who are the main audience for pieces like this, this is the part that's most interesting. It's easy to see their frustration.

      The thing I really admired about Chris Lattner in an interview I listened to during lockdown was the fact that (one of?) the first things he mentioned about success was the role that luck played in his life. There's a tacit admission there that, although he's Chris Lattner and a really smart guy and clearly deserving of his success because of the facts from our now-foregone past that serve as proof, if you reset and tried to "replay" it all, then things might not actually work out like they did the first time. It's a tacit admission that replicability is not guaranteed.

    1. This polemic identifies CS as the culprit. That seems empirically wrong. As stated, it's "not a prerequisite for most programming" even in theory, and in practice, there are mountains of GitHub programmers, at least, who don't have CS backgrounds. Non-CS folks probably account for most of the "frontend"/"full stack" development today. This has exacerbated the Tower of Babel, not improved it.

      HCI is CS—and that's what we should focus on. A fair bit of emphasis on engineering is due, too: being able to look at a problem and ask, "What should it take?" and conversely, "What isn't required here (contra cultural imperatives)?"

    2. Open source development has succeeded mostly at solving the problems of expert programmers.
    3. It is long past time to return to designing tools not just for rock stars at Google but the vast majority of programmers and laypeople with simple small-scale problems.
    4. Unlike every other technology, software doesn’t wear out.
    1. these things could have or would have doomed the earlier phases

      This is what people refer to when they say something was "ahead of its time".

    1. can be used in a TACE compatible way

      Wait, what? At this point, I'm not sure these are accurately being scored against the given rubric.

    2. to write HTML you need to think about CSS
    3. it is TACE compatible

      Huh? Again—is it? I don't think I've seen anyone put Bootstrap to serious use and not use the preprocessed version.

    4. these concerns also exist when using vanilla JS

      These concerns also exist if you're using frameworks.

      I also see people regard libraries like DOMPurify as magic. There is no magic. Even using these libraries, you have to know what you're doing; you have to make sure DOMPurify is configured correctly in order to carry out your intent. You can be just as vulnerable using a library, if you're not paying attention, as you are when not using one.

    5. it falls within TACE

      Does it? Most jQuery is served minified. Even if not, the "full fat" version is not especially readable.

      There are bad reasons to despise jQuery (e.g. because it's old). But there are good reasons, too (because it's bloated and just not very good).

    6. actual programming language
    7. modern web development has become selfish
    8. focusing on the developers and making sure the developers can quickly output projects at the expense of the end users
    1. @6:15

      An engraving (as shown here) was published in at least one place with the caption "The Battle of Omdurman: The Defence of the Khalifa's Black Flag".

      The watercolor(?) of this appeared in the 1898 September 24 print of The Graphic, on page 406. It is captioned "The Battle of Omdurman: The Fight for the Khalifa's Standard". It is signed "C. Hentschel", but attributed "Drawn by J. Gulich, R.I."

    1. I actually do it all the time myself - in my own private git source tree. I send git patches to Junio by email, and I don't publish my own git tree any more

      Git as He Is Poke(d)

    1. the straw man fallacy

      I've come around to preferring the term "strawchild".

      • It de-genders the term (important for some people)
      • It evokes the imagery of the kind of loser* who is only willing to engage in battle with children and/or is perhaps prone to striking them
      • It conveniently sidesteps the cliche/fatigue associated with invocations of the term "strawman"

      * Is this aspect of "strawchild" an instance of failure to elevate the other (i.e. steelman/starman them)? Yes.

    1. instead of inlining the images, the image URL’s (and captions) are read from a .yaml file. The URL of the yaml file is passed as an argument when loading the page. The .yaml file as well as the images should be publicly served.
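
      A sketch of the scheme being described (the query parameter name and the shape of the entries are my guesses; jsyaml is the browser build of the js-yaml library):

      <script src="https://cdn.jsdelivr.net/npm/js-yaml@4/dist/js-yaml.min.js"></script>
      <script>
        // e.g. page.html?gallery=https://example.com/images.yaml,
        // where the file is a YAML list of {url, caption} entries
        const src = new URLSearchParams(location.search).get("gallery");
        fetch(src)
          .then((response) => response.text())
          .then((text) => {
            for (const entry of jsyaml.load(text)) {
              const figure = document.createElement("figure");
              const img = document.createElement("img");
              img.src = entry.url;
              const caption = document.createElement("figcaption");
              caption.textContent = entry.caption;
              figure.append(img, caption);
              document.body.append(figure);
            }
          });
      </script>
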
    1. The language is academic, which has contributed to the confusion around the topic, but it is clear enough that most developers should be able to understand it.

      This I disagree with. Even Fielding's "... must be hypertext driven" rant (which is great and in my bookmarks) is sabotaged by the curse of knowledge. If you know what REST is (and how most "REST" isn't REST, including the things that try to stand out as doing REST right and still just doing it wrong, but with nuance), then "REST APIs must[...]" makes sense. If you don't already get it, though, then it's nigh impenetrable—funnily enough, you need an a priori understanding of REST to be able to understand these attempts to explain REST and what Fielding is trying to communicate about REST not requiring *a priori* knowledge!

    2. Let's call this style of API pseduoREST or JSON-RPC.

      What the re-education around REST needs is a catchy label for what people call REST that works well as a light pejorative. Two-Bit History gave it a shot, coining the ad hoc acronym "FIOH", but it doesn't have the desired properties.

    1. OpenBooks is a hub for Neocities community projects.

      Why the name, though?

    1. The trouble with redefining "REST" to mean "not REST" is that the first step in learning known techniques to solve a problem is learning the terminology that people use to explain the techniques. If you think you know the terminology, but you have the wrong definition in your mind, you will not be able to understand the explanations, and you will not be able to figure out why you can't understand them, until you finally figure out that the definition you learned was wrong.
    1. @54:25:

      There's this debate about whether those games are truly interactive, because there's no way to actually transcend the intent of the designer (other than with, like, glitches or that sort of thing)

      They're talking about video games—"authored experience[s]"—but it's relevant to other types of games (boardgames, like chess), too.

      cf Carse:

      Finite players play within boundaries; infinite players play with boundaries

    2. something something declarative effects

    1. @2:10

      They didn't publish the code. They published the algorithm. And they prided themselves on—the computer scientists at the time—of describing the algorithm, not GitHubbing the code. These days we don't—we GitHub the code. You want the algorithm? Here's the code.

      This is not always reliable. There are some non-highly-mathematical things for which you'd prefer to have the algorithm explained rather than slog through the code, which is probably adulterated with hacks for e.g. platform gotchas, etc.

      There is a better way, though, which is to publish a high-level description of the workings as runnable code that you can simulate in a Web browser. Too many people have misconceptions about the stability of the Web browser as a platform for simulations, however. We need to work on this.

    1. So that’s when I came across Hypothesis. At the time, I was like, “Hey, this is cool,” but I also resonated with the company — like the web ethos being open-source. Being embedded into this distributed web of conversation was super cool to me, but at the time that wasn’t important for the Sidekick product. And then, when we pivoted Sidekick into a chatbot for youth apprenticeships, we didn’t need anything like annotation anymore, but also what happened is I basically reorg’d myself out of a job. So I was kind of looking around, but just casually looking because I didn’t have any end date, and then a recruiter reached out, and I was like, “I know these people,” and I really liked the company.

      This reads like a (fairly raw) transcript of a spoken-word interview. Hard to read.

    1. Gates befriended Ballmer at Harvard

      I'm surprised that this is the exemplar, and not the Allen–Gates relationship.

    1. It uses the jsdoc syntax, and strives to document all the tools and members available to front-end developers. To generate documentation, you'll also need the jsdoc utility available via npm. See the disclaimer above about the node package manager, of course. # npm install jsdoc -g

      This approaches it backwards. Consider a TypeScript-free codebase where the type annotations live in the documentation—which isn't generated. Your "compiler" then is really just a program verifier. Given the JS program text and the availability of docs, it takes the fusion as input and runs the verifier on that. No source-to-source compilation or mangling. The code that runs is the code you write—not some intermediate file output by tsc.
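
      For what it's worth, this verifier-only arrangement already exists: TypeScript will check plain .js files whose annotations live in JSDoc comments, and with --noEmit it compiles nothing; it only verifies. A small example:

      // @ts-check
      /**
       * @param {string} name
       * @param {number} [count]
       * @returns {string}
       */
      function greet(name, count = 1) {
        return `hello, ${name}! `.repeat(count);
      }

      greet("world"); // ok
      greet(42);      // flagged by the verifier; the .js file is never rewritten

      Checked with something like `tsc --allowJs --checkJs --noEmit greet.js`; the code that runs remains exactly the code above.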

    1. Another key idea here is to separate meaning from tactics. E.g. the meaning of sorting is much simpler and more compact than the dozens of most useful sorting algorithms, each one of which uses different strategies and tactics to achieve the same goal. If the “meanings” of a program could be given in a way that the system could run them as programs, then a very large part of the difficulties of program design would be solved in a very compact fashion. The resulting “meaning code” would constitute a debuggable, runnable specification that allows practical testing. If we can then annotate the meanings with optimizations and keep them separate, then we have also created a much more controllable practical system.
    2. This opens the possibility of doing a design much better than Squeak's, both fundamentally and at the user-level, to create a model of an entire personal computer system that is extremely compact (under 20,000 lines of code)

      See: Oberon

    3. This is critical since many optimizations are accomplished by violating (hopefully safely) module boundaries; it is disastrous to incorporate optimizations into the main body of code. The separation allows the optimizations to be checked against the meanings.

      See also the discussion (in the comments) about optimization-after-the-fact in http://akkartik.name/post/mu-2019-2

    4. the Squeak system, which was derived from PARC Smalltalk, includes its own operating system, GUI, development tools, graphics, sound, Internet sockets, and many applications including Etoys and Croquet, yet is only about 230,000 lines of code. The executables are about 2.8MB and only half of this code is generally used.
    1. @52:20

      We know it will happen, barring some radical change in human psychology, because that is what we're living with now. Everyone is walking around with a smartphone in their pocket that not even the president of the United States could have gotten his hands on in, you know, the year 2000. It's pure science fiction. And yet now it's just a basic necessity of life [...] we reset to the new level, and again, we keep comparing ourselves to others

      There's a submarine counterargument to the overall point here (and the last half of the sentence quoted here), and it lies in the words "necessity" and "new level". (I realize that when it was spoken, "necessity" was chosen for effect and meant as a slight exaggeration, but it's not as exaggerated as it would need to be in order to erase the force behind the counterargument.)

      In other discussions like these, you can often find people bringing up the argument that Keynes's remark about 15-hour work weeks wasn't wrong, provided that you're willing to accept the standards of living that existed at the time when Keynes was saying it. But that's not exactly true, because it doesn't really ever come down to a true choice of deciding whether you'd like to opt in or not.

      You could take the argument about smartphones and make the same one by swapping out automobiles instead. The problem is that even if you did desire to opt out of the higher standards, the difficulty lies in the fact that the society that exists around you will re-orient itself such that access to a car is a baked-in requirement—because it's taken as a given in others' own lives, it gets absorbed into their baseline of what affordances they expect to be available to people who are not them ("new level"). This continual creation of new requirements ("necessities") is the other culprit in the pair that never gets talked about in these conversations. Everyone focuses on the personal happiness and satisfaction component wrt comparison to others.

    1. It makes it really hard often to reason about the impact of this kind of work, because there are no easy metrics. One of the takeaways that I take from it is that making tools easy to use, fast to use, and pleasant to use is really powerful. It’s really powerful in ways that are hard to predict until you’ve done it, and so you should just take it as axiomatic that it’s worth a little bit more time than your organization otherwise would spend investing in tool quality, because people will change how they relate to those tools.They’ll find new ways to use it. They’ll use them more often. It often leads to this productivity flywheel in somewhat non-obvious ways.

      Surprise! The point of technology is that it's supposed to make things easier. Why not make sure it's easy to make things easy while you're at it?

    2. When you’re building developer tools, if the officially supported developer environment doesn’t work for people in some way, they build their own approach.
    3. it shows that any time that you make something easier or harder to do, either because it’s faster or slower, or just because you’ve reduced the number of steps, or you’ve made the steps more annoying, or you’ve added cognitive overhead, then people react by changing how they use your tool.
    1. a lot of the time i get a lot of questions these days

      @3:07:14:

      Blow: A lot of time I get a lot of questions these days by people who are asking, "How do you do a lot of work?", or, "How do you get started?" and all that. And very often these questions are themselves a procrastination, right? It's like, "Obviously, I'm in the state where I can't do a lot of work right now. So I need somebody to give me the answer before I can." And actually the secret is you sit down and decide to do it. That's all it is, right?

      Jaimungal: Seinfeld is like that. That's his famous advice to comics, to comedians.

      Blow: Mmm. Yeah. I mean—

      Jaimungal: Comedians always want to know what's the secret. He says, "Just work. Stop talking about it."

      Blow: Yeah. [...] Because that's an exc— it's like, "Oh, someday— I have permission to not actually do this work until someday somebody bestows upon me the magical[...] baton[...]"

    2. i mean i have a whole speech about that

      @03:06:54:

      Blow: I mean I have a whole speech about that that I can link you to as well.

      Should that be necessary? "Links" (URLs) are just a mechanical way to follow a citation to the source. So to "link you" to it is as easy as giving it a name and then saying that name. In this case, the names are URLs. Naming things is said to be hard, but it's (probably) not as hard as advertised. It turns out that the hard part is getting people to actually do it.

    1. maybe people today are more complacent about emergencies because they think someone will come along and save the day.

      The actual phrasing here (@39:03):

      maybe people today are more complacent about emergencies because at some implicit level they see there's more division of labor and they think someone will come along and save the day

      (No idea what's up with the transcript here. It omits some crucial wording/context.)

    1. @18:08:

      The interesting thing is when you read, say, extremely foreign fiction—so fiction that's very alien to you in terms of mental models—the author may assume you understand the mental models, but you may not. So for example Japanese comic books—the couple of times I've tried to read them, they just feel so bizarre to me—the sort of conventions for indicating, you know, emotions and actions and so forth. They're just so unintuitive to me that the world[...] that should be in the background and implicit and I should just be able to reference it like an operating system, it sort of becomes a little too visible for me to read the fiction seamlessly.

    1. @47:58 audible gasps at the mention of standards

    2. @37:52:

      Again, when I was traveling around in 2004 giving talks, the predominant attitude I had was a misconception about Darwinian processes that—most people I talked to thought that we must be at the best place that we could be because look at how many millions of people are participating in this thing. And if you know about Darwinian processes, they aren't optimizers at all. They're "satisficers"—a term made up by one of the most famous professors ever to be at Carnegie—and satisficing is not the same as designing something great.

    1. @18:09:

      So the real question is not whether we can improve on this but "what is the actual level of improvement?", which is tantamount to asking, "how complex is the actual problems we're trying to solve compared to the complexity we're creating by just bumbling around?"

    1. I guess my hesitation in answering your question is that I hate essentialism. It’s the same way that I hate it when people say women are better leaders because we are more empathetic. The problem with essentialism is, the moment you pay yourself a compliment based on gender, caste, religion, color of your skin — whatever — country of your origin — if you’re going to accept one generalization is true, then you’re going to have to suck up the generalizations and the caricatures that aren’t so flattering.
    1. One thing that I didn't hear considered is to kill modelessness by allowing selection anytime. If you want to select something, then just draw the shape that you want using whatever tool you already have, and then invoke some action that transforms it after the fact.

      This falls in line with the principle of preferring to make it easy to undo something rather than making it difficult to do in the first place.

      The relevant analogy is that if you were in your office in dialogue with someone and you had a printout of your WIP, then in order to communicate your intent, the most likely course of action would be to mark up the work in its current state using whatever tool is handy, explain what it is you want to happen, pass it to your partner in dialogue, and then trust that they'll return to the source and carry out your desire.

    1. The thing that bugs me when I listen to the Muse podcast—it's something that's present here along with the episode with gklitt—is that there's this overarching suggestion that the solution to this is elusive or that there are platform constraints (especially re the Web) that keep any of these things from being made. But lots of what gets talked about here is possible today, it's just that no one's doing it, because the software development practices that have captured the attention of e.g. GitHub and Programmer Twitter value things that go against the grain of these desires. This is especially obvious in the parts that mention dealing with files. You could write your Web app to do that. So go do it! Even where problems exist, like with mobile OSes (esp. iOS), there're things like remoteStorage. Think remoteStorage sucks? Fine! Go embrace and extend it and make it work. It's not actually a technical problem at this point.

    2. @18:52:

      I wanna also dig a little more into the kind of... dynamism, ease-of-making-changes thing, because I think there's actually two ways to look at the ease of making changes when you solve a problem with software. One way is to make software sufficiently sophisticated so that you can swap any arbitrary part out and you can keep making changes. The other is to make the software so simple that it's easy to rewrite and you can just rewrite it when the constraints change.

    3. @14:18:

      So, for example, if you want to make a very basic static site: well, okay, now you need the static site generator, and now you need a library system, you need a package manager, you need a way to install the package manager, you need a way to check for security vulnerabilities in all the packages, you need a web server, you need a place to run the app. It's a whole thing, right?

    1. It made sense when JS was doing small simple things in HTML - it doesn’t make much sense anymore

      No, it still makes sense.

      Insisting that everyone use === everywhere, on the other hand, makes as much sense as disallowing method declarations that accept a parameter that implements an interface (in favor of formal parameters that insist on a certain class or a derivative), or injecting a bunch of instanceof checks for no good reason...
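
      For the record, the one place == clearly pulls its weight (a small illustration):

      function setLabel(label) {
        if (label == null) { // matches null and undefined, and nothing else
          label = "(untitled)";
        }
        return label;
      }

      setLabel(undefined); // "(untitled)"
      setLabel(null);      // "(untitled)"
      setLabel("");        // ""; other falsy values are not coerced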

    2. I have rarely encountered a good reason to use == in JS. Most of the time, or you are relying on it, you are probably doing something wrong.
    3. Easier to just add a new operator that does things the right way and keep the original == operator as is. That way people can transition on their own time

      Stupid myth.

    1. But I later realized writing is many things, one of which is the finished article you’re reading now. Mainly though, it’s a tool for thinking things through.

      I've mentioned this elsewhere, but I'm skeptical of this popularly recurring take that says writing is thinking, or that thinking without writing really isn't thinking. If writing helps you think, it's better for you to know that than the alternative. But thinking is thinking, and writing is writing.

      I worry with all the insistence around this view of writing as a precondition to real thinking that people who are thinking at or near capacity without writing will believe they're somehow missing something and waste a lot of cycles in frustration as they attempt to write and find that it doesn't do anything for them thoughtwise that they weren't already getting before.

    2. Think about the sad essay we all used to write for your (insert language here) class: back then you didn’t have permission to generate original ideas.

      I'm not sure that's the correct diagnosis.

      Alternative take: you were not, at that point in your life, equipped to understand that you could be generating new ideas and that you should walk away from that writing course with an appreciation for writing as a vehicle for what you'd like to accomplish with a given subject/format. It's fine that you didn't—many people don't—and your instructors, institution, parents, community, etc. probably could have done a better job at communicating this to you, but it was there, and it was the point all along.

    1. The type for an item is given as the value of an itemtype attribute on the same element as the itemscope attribute.

      The presence of an itemtype attribute should imply the existence of an item, and thus make the presence of itemscope optional—itemscope should be required only when there is no explicit itemtype declared.
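
      For illustration, the form the spec requires today versus the form this would permit (the schema.org type is just an example):

      <!-- As specified: itemscope must accompany itemtype. -->
      <article itemscope itemtype="https://schema.org/BlogPosting">
        <h1 itemprop="headline">On Microdata</h1>
      </article>

      <!-- Under the suggestion above, itemtype alone would imply the item. -->
      <article itemtype="https://schema.org/BlogPosting">
        <h1 itemprop="headline">On Microdata</h1>
      </article>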

    2. It's important to note that there is no relationship between the microdata and the content of the document where the microdata is marked up.

      Wait, really? That's unfortunate. It seems plainly useful, in e.g. the datetime example, to be able to correlate the text content of the element bearing the datetime property with the datetime value.

    1. There is no inherent virtue in insisting that paths continue to use backslash even though we're well past the days of CP/M and DOS. There are good reasons not to use it, however.

      I do recognize that it provides a source of the type of obscurantism that Windows users take delight in. At this point, though, backslash-as-path-separator is a liability, and application authors should work to eradicate it from every UI surface that users come in contact with. Microsoft themselves should scrub their own apps so that e.g. even Windows Explorer favors the common solidus in any displayed path name.

      Consider this to be a straw proposal for an "are we slash yet?" movement.

    1. Respect your seniors. Love your juniors.

      Suppose these were inverted.

    2. Deny oneself in order to follow Christ.

      What does this mean?

    1. // NB: Since line terminators can be the multibyte CRLF sequence, care // must be taken to ensure we work for calls where `tokenPosition` is some // start minus 1, where that "start" is some line start itself.

      I think this satisfies the threshold of "minimum viable publication". So write this up and reference it here.

      Full impl.:

      getLineStart(tokenPosition, anteTerminators = null) {
        // Guard against positions beyond the region scanned so far (except the
        // special case of the very end of input).
        if (tokenPosition > this._edge && tokenPosition != this.length) {
          throw new Error("random access too far out"); // XXX
        }
      
        // NB: Since line terminators can be the multibyte CRLF sequence, care
        // must be taken to ensure we work for calls where `tokenPosition` is some
        // start minus 1, where that "start" is some line start itself.
        // Scan terminators from last to first; the nearest one that ends at or
        // before `tokenPosition` marks the start of the line containing it.
        for (let i = this._lineTerminators.length - 1; i >= 0; --i) {
          let current = this._lineTerminators[i];
          if (tokenPosition >= current.position + current.content.length) {
            // Optionally report every terminator preceding the current line.
            if (anteTerminators) {
              anteTerminators.push(...this._lineTerminators.slice(0, i));
            }
            return current.position + current.content.length;
          }
        }
      
        // No terminator precedes the position: it's on the first line.
        return 0;
      }
      
      

      (Inlined for posterity, since this comes from an uncommitted working directory.)

    1. To make a page on MySpace, all it took was text in a textbox.The text could be words or code.Anyone could read the words and see the code.
    1. Free as in ...? Points out that the freedoms afforded by foss software to the average computer user are effectively the same as with proprietary software, because it's too difficult to even find the source and build it, let alone make any changes. Advocates that foss developers should not think only about the things that users are not legally prevented from doing, but about what things they are realistically empowered and supported in doing.
    1. I have 35 MB of node_modules, but after webpack walks the module hierarchy and tree-shakes out all module exports that aren't reachable, I'm left with a couple hundred kilobytes of code in the final product.

      This directly contradicts the earlier claim that irreducible complexity is the culprit behind the size of the node_modules directory.

      35 MB → "a couple hundred kilobytes"? Clear evidence of not just reducibility but a case of actual reduction...

    1. The early phase of technology often occurs in a take-it-or-leave-it atmosphere. Users are involved and have a feeling of control that gives them the impression that they are entirely free to accept or reject a particular technology and its products. But when a technology, together with the supporting infrastructures, becomes institutionalized, users often become captive supporters of both the technology and the infrastructures.

      the illusion of preference-revealing actions

    1. Here is how I produce invoices and contracts for consulting: Open an old invoice/contract in firefox. Use the inspector to change the values. Hit 'save as new file'.
    2. In terms of this analogy, a lot of objections to end-user programming sound to me like arguing that Home Depot is a waste of time because their customers will never be able to build their own skyscrapers. And then on the other side are the people arguing that people will be able to build their own skyscrapers and it will change the world. I just think it would be nice if people had the tools to put up their own shelves if they wanted to.
    3. none of that saves me any time in the long run
    4. It took me an hour to rewrite my ui code and two days to get it to compile. The clojurescript version I started with miscompiles rum. Older clojurescript versions worked with debug builds but failed with optimizations enabled, claiming that cljs.react was not defined despite it being listed in rum's dependencies. I eventually ended up with a combination of versions where compiling using cljs.build.api works but passing the same arguments at the command line doesn't.
    1. My Fortran professor gave me a low mark because I used a lookup table for octal to binary conversion, instead of using division and modulo.
    1. resulting HTML

      Imagine if this were just the format that the source document itself used...

    2. General processes and common best practices learned from other Web Content Management projects do not apply and are in many ways obstructions to move quickly and get the best out of Helix.
    1. I recently started building a website that lives at wesleyac.com, and one of the things that made me procrastinate for years on putting it up was not being sure if I was ready to commit to it. I solved that conundrum with a page outlining my thoughts on its stability and permanence:

      It's worth introspecting on why any given person might hesitate to feel that they can commit. This almost always comes down to "maintainability"—websites are, like many computer-based endeavors, thought of as projects that have to be maintained. This is a failure of the native Web formats to appreciably make inroads as a viable alternative to traditional document formats like PDF and Word's .doc/.docx (or even the ODF black sheep). Many people involved with Web tech have difficulty themselves conceptualizing Web documents in these terms, which is unfortunate.

      If you can be confident that you can, today, bang out something in LibreOffice, optionally export to PDF, and then dump the result at a stable URL, then you should feel similarly confident about HTML. Too many people have mental guardrails preventing them from grappling with the relevant tech in this way.

    2. if I died today, thoughts.page would probably only last until my credit card expires and DigitalOcean shuts down my servers

      I've noted elsewhere that NearlyFreeSpeech.Net has a billing system where anyone can deposit funds for a hosted account. That still leaves the matter of dealing with breakage, but for static sites, it should work more or less as if on autopilot. In theory, the author could die and the content would remain accessible for decades (or so long as fans have the account ID and are willing to continue to add funds to it), assuming the original registrant is also hosting their domain there and has auto-renewal turned on.

    3. Trying to keep websites around forever is struggling against the nature of the web.

      As above, I think this is more a consequence of struggling against the nature of the specific publishing pipelines that most people opt for. Many (most) Web-focused tech stacks are not well-suited to fulfill the original vision of the Web, but people select them as the foundation for their sites, anyway.

    1. it's very easy to measure how many github back and forths people have

      Bad example. The way most GitHub-adjacent subjects are handled and the overheads involved are already evidence that most people are not interested in operational efficiency, let alone measuring it to figure out how to do it better.

    2. computation, it's the most important cost, right: how much does it cost to execute this thing on the end user's computer

      It's very hard to take this seriously from someone whose main endeavor is making video games. Essentially every CPU cycle ever spent running a game was superfluous.

    3. even a slight bit of being bad meant you couldn't get the thing to work at all

      dubious

    4. based on whether the software was any good

      nebulous

    5. This is as good an example as any of why I'm not a fan of Casey Muratori.

      I'm 25% of the way through this video (a "lecture")—10+ whole minutes—and he hasn't said anything insightful or of any substance whatsoever. He certainly communicates that he has strong opinions, and expresses them (I guess?) in a very emphatic way, but holy shit, dude. What is your point? Say something that makes sense. Hell, just say anything at all.

    1. the idea of a thoughts page was originated by maren, who made a script for generating thoughts pages. thoughts.page is a way of lowering the barrier of entry to putting thoughts on the internet for people who don't want to or don't know how to set up a script to do it.

      Good use case for the application of the principles in A New Publishing Discipline.

    1. Primary program modules

      This is sort of a failing of the code-as-content thing that we're going for here. Take a page from GPE.

  5. Jun 2022
    1. Okay, so the original source seems to be Proteus (A Journal of Ideas). ~~Specifically, vol. 3, iss. 1.~~ (Thanks to Nikos Katsikis by way of Neil Brenner for helping track this down.)

    2. Also available here under the alternate title "Livingry" from BFI: https://www.bfi.org/about-fuller/big-ideas/livingry/

    1. A story: when I wanted to meet with a really busy friend of mine in SF, I first sent him 2 twitter DMs, then 2 emails, and then 3 text messages, letting him know that I will keep sending one text a day, until an email from him finally landed in my inbox letting me know that he would love to get lunch.

      This whole piece is filled with this, but this story in particular comes across strongly as "I'm happy to impose my habits upon you." It's obnoxious.

    1. There’s not much implementations can do, and it’s up to the debugger to be smarter about it.

      This is fatalistic thinking.

      Here's what both implementations and debuggers can do together:

      1. implementations can document these things
      2. debuggers can read the documentation and act accordingly

    1. It’s certainly possible that you become so framework-driven going down this path that your wider problem-solving skills suffer.

      hammers, nails

    1. other people’s toolchains are absolutely inscrutable from the outside. Even getting started is touchy. Last month, I had to install a package manager to install a package manager.
    1. This is a great, ancient browser feature, but as developers we often engineer it away.

      Right. Leading to such questions as, "If you're putting all this effort in just to get things wrong and ending up with something that's worse than what you get for free—worse than just doing nothing at all—then what is your actual contribution?"

    1. What they didn’t consider is that Google had a crack team of experts monitoring every possible problem with SPAs, right down to esoteric topics like memory leaks.

      I've had conversations where I had to walk other people through why garbage collection doesn't mean that memory use is automatically a solved problem—that you still have to be conscious of how your application uses memory and esp. of the ownership graph. They were fully under the illusion that steady growth of memory was just a non-issue, an impossibility in the world of garbage collectors.
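
      A minimal sketch of the kind of thing I mean (hypothetical names, not from that conversation): nothing below leaks in the C sense, yet the process's memory grows forever, because every buffer stays reachable from a long-lived root.

      ```typescript
      // A GC frees only what's unreachable. This module-level cache keeps every
      // response reachable for the life of the process, so memory grows without
      // bound even though no allocation is ever "lost" in the C sense.
      const responseCache = new Map<string, Uint8Array>();

      async function fetchCached(url: string): Promise<Uint8Array> {
        const hit = responseCache.get(url);
        if (hit !== undefined) return hit;
        const res = await fetch(url);
        const body = new Uint8Array(await res.arrayBuffer());
        responseCache.set(url, body); // the cache now pins this buffer indefinitely
        return body;
      }
      ```

      The fix is an ownership decision (evict entries, or hold them weakly), which is exactly the kind of thinking they assumed the collector had made obsolete.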

    2. the MPA sites gave immediate feedback to the user when clicking – showing a loading indicator in the browser chrome

      not just immediate, but standard

    3. web dev culture war

      What the linked piece is not: an analysis of web devs' self-interested perspective on fairness, economic balance, and the implicit societal mandate for their skillset/work product (and the associated subsidies they benefit from and fight to maintain).

    4. Want to animate navigations between pages? You can’t (yet). Want to avoid the flash of white? You can’t, until Chrome fixes it (and it’s not perfect yet). Want to avoid re-rendering the whole page, when there’s only a small subset that actually needs to change? You can’t; it’s a “full page refresh.”

      an impedance mismatch between what the Web is (infrastructure for building information services on the reference desk model: request a document, and the librarian comes back with it) and what many Web developers want to be (traditional app developers—specifically, self-styled product designers with near 100% autonomy and creative control over the "experience")—and, therefore, what they want the Web browser to be (the vehicle that makes that possible, with as little effort as possible on the part of the designer–developer)

    1. First thing I noticed is that I spent a bunch of time writing tests that I later deleted. I would have been better off writing the whole thing up-front and just doing end-to-end tests.

      need for cheaper throwaway tests

    1. the expected lifespan of even very successful SaaS companies is typically much shorter than the lifespan of personal data

      A strength of boring tech that relies on the traditional folders-of-files approach, incl. e.g. the byproducts of using office suites.

    2. I suspect because most software is optimized for industrial use, not personal use. For industrial uses the operations overhead is not a big deal compared to the development and operational efficiency gained by breaking things up into communicating services. But for personal uses the overwhelming priority is reducing complexity so that nothing fails.
    3. preventing the build from bitrotting would probably require a full-time maintainer in the long run
    4. data can be exported from airtable, but the logic and UI can't
    5. when doing independent research I've typically fallen into the habit of creating unbounded projects with no internal structure. I don't think this has been good for my sanity.
    1. Yesterday evening (London, UK time) something amazing happened which can best be described in a single picture:

      That doesn't describe anything. It's a grid of GitHub avatars. What am I supposed to be seeing?

    1. you get so used to the way things are you don't think of the obvious next step and you know that can be so frustrating
    2. this can't possibly work because if it worked somebody in the last 40 years would have done it
    3. as long as the compiler doesn't use a bunch of global variables to store the compiler state, you can use it to run another instance of itself

      In other words, singletons are harmful.
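
      A sketch of the shape that implies (hypothetical names, TypeScript for consistency with the other sketches here): with all state hanging off an instance, two compilers can coexist in one process, which is exactly what self-hosting needs.

      ```typescript
      // All state lives on the instance; nothing here is a module-level singleton.
      class Compiler {
        private symbols = new Map<string, number>();

        compile(source: string): string {
          // Trivial stand-in for a real pipeline: intern each token into this
          // instance's own symbol table and emit one line per symbol.
          for (const tok of source.split(/\s+/).filter(Boolean)) {
            if (!this.symbols.has(tok)) this.symbols.set(tok, this.symbols.size);
          }
          return [...this.symbols].map(([tok, id]) => `sym ${id} = ${tok}`).join("\n");
        }
      }

      // Two independent instances. With globals, stage1's symbol table would
      // bleed into stage2's, and the compiler couldn't safely compile itself.
      const stage1 = new Compiler();
      const stage2 = new Compiler();
      console.log(stage1.compile("let x = x + y"));
      console.log(stage2.compile("fn main() {}"));
      ```
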

    1. In saner bugtrackers, e.g. Bugzilla, the community is empowered to step in and triage bugs. GitHub infamously chose to go for a "simpler" system with fewer frills, pomp, and circumstance. Perversely, this has the opposite of the intended effect. The net result is that for the community to have these powers, the project owner has to explicitly grant them to individual users, which is a lot more heavyhanded and ceremonial than how it works on bugzilla.mozilla.org.

      I'd have no problem, for example, stepping in and enforcing these things if it weren't such a chore to go through the ceremony of getting approval to police this kind of stuff. GitHub's lackluster approach to user privacy, of course, doesn't help.

    1. Signal, which is damn good crypto work, announced MobileCoin support, and I stopped donating, bummed.

      Signal trades on some other stuff of dubious* merit, like the "guarantees" of SGX, and does other user-hostile stuff: requiring a PIN, doing user data backups without permission, locking out third-party clients... (What's worse is that the latter is excused as being the only way to reliably enable the undesirable "enhancements").

      * Even calling it merely "dubious" here is pretty generous.

    1. I've come to much the same conclusion, I think: to wit, most people secretly enjoy their problems and suffering.

      See also, from Brooke Allen's "How to hire good people instead of nice people" https://brookeallen.com/2015/01/14/how-to-hire-good-people-instead-of-nice-people/:

      I won’t get between you and your dreams. If you have a dream, I need to know what it is so we can figure out if this job gets you closer. If you don’t have a dream then that’s fine, as long as you really want one and you’re not addicted to wishing and complaining.

  6. buckyworld.files.wordpress.com buckyworld.files.wordpress.com
    1. enormous an ounce of energy;

      I can't parse this. Ravasio says any typo is probably her fault. The best I can come up with is "an enormous amount of energy", which doesn't make sense as a typo, but does sort of sound the same.

    1. Rephrasing Brian Smith: Some thing is on the Web such that if the Web itself was destroyed, that thing would also be destroyed. If not, it's not fully on the Web. If someone destroyed the Web, this would not damage me if I were being denoted by a URI, but my homepage at that URI would be up in smoke if that's what people were using to refer to me by. I am not on the Web in a strong sense, but my homepage sure is.

      I don't think this is a good definition. The example, at least, is a bad one. That resource could still exist (the same way a .docx in your Documents directory would still exist even if the file host you'd uploaded a copy to went down)—it just wouldn't be resolvable by URL.

    2. In theory, through content negotiation a news website could communicate with your browser and determine where you live and then serve your local news. This rather simple example shows that the relationship between resource and representation can not be one-to-one.

      I don't think this is a good example. I'd call it bad, even. It's self-defeating.

    3. This page is an excellent example of HTML being an adequate substitute for traditional office formats.

    1. > If I understand your critique, it's this: "How dare you critique their use of Ra? You have no standing! You have no right!" Which is basically an ad hominem attack that doesn't address any of the substance of my complaint.Sorry, no, making up your own caricature of what I said isn't an effective way of responding to it.

      Yeah, why has this become so normalized? It's gotten to the point where people will respond to something by posting nothing but an attempted false attribution—rewording the other person in the most convenient, pithy, hackneyed, and strawmannish way, then putting quotes around it—while drowning in plaudits from those who already agree, often for reasons no better than shameless tribal affiliation.

      The basic precondition to summarizing the other's position in order to refute it is that the other side actually agrees that it's an accurate summary of their position. If you don't have that, then you don't have anything.