  1. Jan 2023
    1. One convenience feature is that if you paste the Apple Podcasts directory listing instead of the feed URL, I’ll look up the feed URL from the listing and treat it as a redirect.

      Some thoughts:

      • This is indeed good UX.

      • What's not good UX—and which I discovered/realized this week—is that there seems to be no easy/straightforward way to map a podcasts.apple.com podcast page to either its feed URL or to the original publisher/podcast's Web site. In reality, this would be as trivial as embedding a <link>.

      Additionally, there's a missed opportunity in the podcasting space to make judicious use of the Link header—every piece of media served as part of a podcast should include in the HTTP response a link back to the canonical feed URL and/or the original site! (And they should have CORS enabled, too, while they're at it.) Why isn't this already a thing? Answer: because it's a trivial detail. Podcasters could do this, but "what's the point?", they'd say—almost no one is going to attach a network inspector to the requests and check whether these headers are being sent, solely for the sake of steadfast adherence to hypermedia ideals. Worth noting that this is the exact opposite of Jobs's principle of carrying good craftsmanship throughout, e.g., a chest of drawers or a cabinet, even in the parts no one will see, in order to "sleep well at night". Maybe this could be used to shame Apple?
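
      Concretely, here's the kind of check almost no one runs. A quick sketch (the enclosure URL and rel values are invented; assumes a devtools console or module scope for top-level await):

      // Sketch: HEAD the enclosure and look for a (hoped-for) Link header.
      const enclosureUrl = "https://example.com/episodes/42.mp3"; // hypothetical
      const response = await fetch(enclosureUrl, { method: "HEAD" });

      // Cross-origin, this only works if the publisher enabled CORS, and the
      // header is only readable if exposed via Access-Control-Expose-Headers.
      console.log(response.headers.get("Link"));
      // Ideal value, something like:
      //   <https://example.com/feed.xml>; rel="alternate"; type="application/rss+xml",
      //   <https://example.com/>; rel="canonical"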

    1. this is misplaced outrage on the author's part, since Apple has never produced RSS feeds

      Another casualty of the Copenhagen interpretation of ethics (or close enough, at least)?

      https://hn.algolia.com/?query=copenhagen%20interpretation%20of%20ethics%20strikes%20again&type=comment

    1. Premium feeds are rehosted by Apple and it's huge PITA because we have ad-supported public feeds and ad-free premium feeds and need to build them twice.

      The author here makes it sound like they have to reach out and grab content stream chunks, stitch them together with their own hands, and then plonk them down on the assembly line for 14 hours a day or something.

      It's a program. You write a program that does the building.

    1. There is no predetermined correlation between this import path and the file system, and the imported module doesn’t have to know anything about the import path used in an importing module.

      This is not a good approach. It's the opposite of what you want. Module resolution remains easy for computers (because of their speed), but tedious for humans.

      As a writer, maybe there's some benefit to having no correlation. But as a reader trying to understand a foreign codebase (especially one asking, in the moment, "Where is this thing defined? Where can I read the source code?" while jumping through procedure definitions), not being able to trivially ascertain which file a given thing lives in is unnecessary friction. Better to offload a tiny bit of work onto the author, who knows the codebase (or at least their own immediate intention) well, than to stymie the progress of dozens/hundreds of readers trying to work things out.

    2. it quickly became clear that this approach would reach its limits as soon as several people contributed modules

      Worth noting that for many of Rochus's own projects (esp. ones related to Oberon/Oberon+), it's just a bunch of .cpp and .h files in one directory...

    1. igal needs Perl to run and it also relies on a few other programs that come standard with most Linux distributions.
    1. Hyperdocuments may be submitted to a library-like service (an administratively established AUGMENT Journal) that catalogs them, and provides a permanent, linkable address and guaranteed as-published content retrieval. This Journal system handles version and access control, provides notifications of supercessions and generally manages open-ended document collections.

      Imagine an arxiv-like depository that dealt with native hypertext, rather than TeX/PDF.

      Food for thought: PWP as a prereq? What about RASH+SingleFileZ?

    2. Meta-level referencing (addresses on links themselves) enables knowledge workers to comment upon links and otherwise reference them.
    3. Individual application subsystems (graphical editors, program language editors, spreadsheets) work with knowledge products, but do not “own” hyperdocuments in the sense of being responsible for their storage

      The opposite of current Web app norms (contra desktop).

    1. The overriding class Shape has added a slot, color. Since Shape is the superclass of all other classes in ShapeLibrary, they all inherit the new slot.

      This is the one thing so far where the procedural syntactic mechanism isn't doing obvious heavy lifting.

    2. The slot definition of List fills the role of an import statement, as do those of Error and Point.

      ... at some expense to ergonomics.

      It's odd that they didn't introduce special syntax for this. They could have even used import to denote these things...

    3. The factory method is somewhat similar to a traditional constructor. However, it has a significant advantage: its usage is indistinguishable from an ordinary method invocation. This allows us to substitute factory objects for classes (or one class for another) without modifying instance creation code. Instance creation is always performed via a late bound procedural interface.

      The class semantics for new in ES6 really bungled this in an awful way.
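
      A minimal sketch of the asymmetry (the names here are made up): a factory method is an ordinary message send, so any object can stand in; a `new` call site bakes in the requirement that the thing be a constructor.

      class Point {
        constructor(x, y) { this.x = x; this.y = y; }
        static create(x, y) { return new Point(x, y); } // factory method
      }

      const fakePointFactory = {
        create(x, y) { return { x, y }; }                // plain object stand-in
      };

      // Late-bound factory: either object can be substituted without touching
      // the instance-creation code.
      function make(factory) { return factory.create(1, 2); }
      make(Point);            // ok
      make(fakePointFactory); // ok, call site unchanged

      // `new` is not an ordinary invocation, so the substitution breaks:
      new Point(1, 2);               // ok
      // new fakePointFactory(1, 2); // TypeError: not a constructor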

    4. Newspeak programs enjoy the property of representation independence

      Is that mechanism or culture?

    5. All names are late bound

      Not all names, I think. Local identifiers, for example...?


    1. Patch based systems are idiotic, that's RCS, that is decades old technology that we know sucks (I've had a cocktail, it's 5pm, so salt away). Do you understand the difference between pass by reference and pass by value?

      Larry makes a similar analogy (pass by value vs pass by reference) to my argument about why patches are actually better at the collaboration phase: pull requests are fragile links. Transmission of patch contents is robust; a patch is not a reference to an external system, which amounts to a soft promise that someone will service a request for the content when it comes. A patch is just the proposed change itself.

    1. Literate programming worked beautifully until we got to a stage where we wanted to refactor the program. The program structure was easy to change, but it implied a radical change to the structure of the book. There was no way we could spend a great deal of time on restructuring the book so we ended up with writing appendices and appendices to appendices that explained what we had done. The final book became unreadable and only fit for the dustbin. The lesson was that the textbook metaphor is not applicable to program development. A textbook is written on a stable and well known subject while a program is under constant evolution. We abandoned literate programming as being too rigid for practical programming. Even if we got it right the first time, it would have failed in the subsequent maintenance phases of the program’s life cycle.


    1. How do we package software in ways that maximize its reusability while minimizing the level of skill required to achieve reuse?

      Is that really the ultimate, most worthy goal? It seems that "minimizing the level of skill required[...]" is used as a proxy here for what we're really after—minimizing the total cost of producing the thing we want. Neither the minimization of skilled use nor reuse should be held as a priori goals.

    1. if you are running a software business and you aren't at, like, Google-tier scale, just throw it all in a monorepo

      The irony* of this comment is that Google and Google engineers are famously some of the most well-known users/proponents of monorepos.

      * not actual irony; just the faux irony—irony pyrite, or "fool's irony", if you like

    1. I would argue that it’s simply more fun to engage with the digital world in a read-write way, to see a problem and actually consider it fixable by tweaking from the outside

      He doesn't exactly say it here, but many others making the same observation pair it with the suggestion that this is because of some intrinsic property of the digital medium. If you think about it, that isn't true. Consider paper: people tend to be (and feel) more empowered to tweak it for their own use, so long as they own the copy. Digital artifacts seem more hands-off, despite their potential, because the powers involved are reserved for wizards, largely thanks to the milieu the wizards themselves have cultivated to benefit their own livelihood first rather than to empower the ordinary computer user.

    2. Software should be a malleable medium, where anyone can edit their tools to better fit their personal needs. The laws of physics aren’t relevant here; all we need is to find ways to architect systems in such a way that they can be tweaked at runtime, and give everyone the tools to do so.

      It's clear that gklitt is referring to the ability of extensions to augment the browser, but:

      • it's not clear that he has applied the same thought process to the extension itself (which is also software, after all)

      • the conception of in-browser content as software tooling is likely a large reason why the perspective he endorses here is not more widespread—that content is fundamentally a copy of a particular work, in the parlance of US copyright law (which isn't terribly domain-appropriate here so much as its terminology is useful)

    3. a platform with tremendous potential, but somewhat disorganized and neglected under current management

      This has almost always been the case—at least as far back as 10+ years ago with addons.mozilla.org, too.

    4. CSS classes

      NB: there's no such thing as a "CSS class". They're just classes—which you may use to address things using CSS's selector language, since it was conveniently (and wisely) designed from the beginning to incorporate first-class* support for them.

      * no pun intended
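
      A tiny illustration of the preceding point (element and class name invented): the class lives on the element, and CSS's selector language is just one of its consumers.

      // The class attribute belongs to the element, not to CSS.
      const el = document.createElement("section");
      el.classList.add("annotation");
      document.body.append(el);

      // CSS's selector language is one consumer of it...
      document.querySelector("section.annotation") === el;   // true

      // ...and scripts, microformats parsers, etc. are others.
      el.classList.contains("annotation");                   // true
      document.getElementsByClassName("annotation").length;  // 1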

    5. it’s getting harder to engineer browser extensions well as web frontends become compiled artifacts that are ever further removed from their original source code
    6. because it’s building on an unofficial, reverse-engineered foundation, there are no guarantees at all about when things might change underneath

      This is an unfortunate reality about the conventions followed by programmers building applications with Web-based interfaces: no one honors the tradition of the paper-based forms that their digital counterparts are supposed to mimic; they're all building TOSS-style APIs (and calling that REST) instead of actual, TURN-style REST interfaces.

    1. too much focus on the ‘indie’ (building complicated self-hosted everything-machines) and not enough on the ‘web’
    1. a special/reserved GET param could be used in order to specify the version hash of the specific instance of the resource you want
      • MementoWeb
      • WebDAV
    2. we have one of the most powerful languages for manipulating everything in the browser (ecmascript/javascript) at our disposal, except for manipulating the browser itself! Some browsers are trying to address this (e.g. http://conkeror.org/ -- emacs styled tiled windows in feature branch!) and I will be supporting them in whatever ways I can. What we need is the bash/emacs/vim of browsers -- e.g. coding changes to your browser (emacs style) without requiring recompiling and building.

      That was what pre-WebExtensions Firefox was. Mozilla Corp killed it.

      See Yegge's remarks on The Pinocchio Problem:

      The very best plug-in systems are powerful enough to build the entire application in its own plug-in system. This has been the core philosophy behind both Emacs and Eclipse. There's a minimal bootstrap layer, which as we will see functions as the system's hardware, and the rest of the system, to the greatest extent possible (as dictated by performance, usually), is written in the extension language.

      Firefox has a plugin system. It's a real piece of crap, but it has one, and one thing you'll quickly discover if you build a plug-in system is that there will always be a few crazed programmers who learn to use it and push it to its limits. This may fool you into thinking you have a good plug-in system, but in reality it has to be both easy to use and possible to use without rebooting the system; Firefox breaks both of these cardinal rules, so it's in an unstable state: either it'll get fixed, or something better will come along and everyone will switch to that.

      Something better didn't come along, but people switched anyway—because they more or less had to, since Mozilla abandoned what they were switching from.

    1. Sciter. Used for rendering the UI of apps. There's no browser using Sciter to display websites, and the engine is Closed source.

      Worth noting that c-smile, the creator of Sciter, put out an offer during COVID lockdowns to make Sciter open source if someone would fund it for $100k. That funding never came through.

    1. https://michaelkarpeles.com/math.html
    2. https://michaelkarpeles.com/essays/philosophy/what-the-browser-is-missing.html
    3. My central goal is to further Paul Otlet, et al's, vision and head toward an amalgamous World Wide Web (a Universal Knowledge Repository) freed of arbitrary, discrete "document" boundaries.

      My central goal is a universal knowledge repository freed of discrete "document" boundaries

    1. Readers must learn specific reflective strategies. “What questions should I be asking? How should I summarize what I’m reading?” Readers must run their own feedback loops. “Did I understand that? Should I re-read it? Consult another text?”

      I generally don't have to do that when reading except when reading books or academic papers. This suggests that there's not really anything wrong with the form of the book, but rather its content (or the stylistic presentation of that content, really).

      I've said it a bunch: the biggest barrier to the accessibility of academic articles specifically is the almost intolerable writing style that almost every non-writer adopts when trying to write something to the standards for acceptance in a journal. Every journal article written for joyless robots should be accompanied by a blog post (or several) on the author's own Web site that says all the same things but is written for actual human beings.

    2. Readers can’t just read the words. They have to really think about them. Maybe take some notes. Discuss with others. Write an essay in response. Like a lecture, a book is a warmup for the thinking that happens later.

      What if, when you bought a book, it came with access to a self-administered comprehension test? Could this even solve the paying-for-things-limits-access-to-content problem? The idea would be to make the thing itself free (ebooks, at least), while your dead-tree copy comes with access to a 20-minute interactive testing experience (in a vendor-neutral, futureproof format like HTML and inline JS—not necessarily a Web-based learning portal that could disappear at any moment).

    1. I saw this tech talk by Luis Von Ahn (co-creator of recaptcha) and learned about the idea of harnessing human computation

      Consider: a version of the game 20 Questions that helps build up a knowledge base that can be relied upon for Mek's aforementioned Michael Jackson-style answers.

    2. How did it work? GNUAsk (the aspirational, mostly unreleased search engine UI) relied on hundreds of bots, running as daemons, and listening in on conversations within AOL AIM, IRC, Skype, and Yahoo public chat rooms and recording all the textual conversations.
    1. and developers are required to have the Ruby runtime in their environment, which isn’t ideal.
    2. In one of our early conversations with developers working on CLIs outside of Shopify, oclif came up as an excellent framework of tools and APIs to build CLIs in Node. For instance, it was born from Heroku’s CLI to support the development of other CLIs. After we decided on Node, we looked at oclif’s feature set more thoroughly, built small prototypes, and decided to build the Node CLI on their APIs, conventions, and ecosystem. In hindsight, it was an excellent idea.

      Minority(?) viewpoint: oclif-based command-line apps (if they're anything like Heroku's, at least) follow conventions that are alien and make them undesirable.

    3. There’s a caveat that we’re aware of—while Hydrogen and App developers only require one runtime (Node), Theme developers need two now: Ruby and Node.

      Well, you could write standards-compliant JS... Then people could run it on the runtime everyone already has installed, instead of needing to download Node.

    4. Of all the programming languages that are used at Shopify, Ruby is the one that most developers are familiar with, followed by Node, Go, and Rust.

      Node is not a programming language.

    1. And my mom is getting older now and I wish I had all the comments, posts, and photos from the past 14 years to look back on and reminisce. Can’t do that now.

      This reminds me of when, during the height of the iPod era, someone I know was gifted* a non-Apple music player and some iTunes gift cards—their first device for their first music purchases not delivered on physical media. They created an iTunes account, bought a bunch of music on the Music Store, and then set about trying to get it onto their non-Apple device, coming to me when their own attempts to make it work weren't going well. I explained how Apple had (at the time) made iTunes Music Store purchases incompatible with non-Apple devices. Their response was baffling to me:

      Rather than rightly getting pissed at Apple for this state of affairs, they did the opposite—they expressed their disdain about the non-Apple MP3 player they were given** and resolved to get it exchanged for credit so they could buy a (pricier, of course) Apple device that would "work". That is, they felt the direct, non-hypothetical effects of Apple's incompatibility ploy, and then still took exactly the wrong approach by caving despite how transparently nefarious it all was.

      Returning to this piece: imagine if all that stuff hadn't been locked up in the social media silo. Imagine if all those "comments, posts, and photos from the past 14 years" hadn't been unwisely handed over for someone else to keep out of reach unless you assimilated. Imagine just having it delivered directly to your own inbox.

      * NB: not by me

      ** NB: not as a consequence of mimetic desire for the trendiest device; they were perfectly happy with the generic player before they understood the playback problem

    2. It’s not feasible to constantly be texting and phone calling Paula from 10th grade geometry, etc.

      This was initially confusing. What makes texting infeasible, but doing it through Facebook is feasible? I realized upon reaching the end of the next paragraph: "I cant make a new Facebook in 2023 and add all these old friends. Literally psychotic behavior."

      When this person talks about "keeping up", they don't mean "interacting". They mean non-interactively keeping tabs on people they once knew but that they don't really have an ongoing relationship with.

    1. It's interesting how few comments are engaging with the substance of the piece. Commenters are encountering for the first time the idea that Rikard is providing commentary on—giving students their own big-kid Web site, an idea that "belongs" to the "Domain of One's Own" effort—and expressing enthusiasm for it here as comments nominally about this piece, which is really intended as a specific, reflective/critical response to the overall idea and does not pretend to present that idea as a novel suggestion for the first time...

    1. References to "the World Wide Wruntime" are a play on words; the phrase means "someone's Web browser". Viz. this extremely salient annotation: https://hypothes.is/a/i0jxaMvMEey_Elv_PlyzGg

    1. the patriotic or religious bumper-stickers

      College graduates in 2005 could understand what this meant. I'm skeptical that college graduates in 2023 can really grok this allusion, even if it were explained.

      See also:

      this previous comment thread with a minority detractor view on Idiocracy [...] argues it’s a little more dated to it’s specific Bush-era cultural milieu than everyone remembers

      https://news.ycombinator.com/item?id=29738799

      E.g.:

      [Idiocracy's] "you talk faggy" [...] sadly was common in real life during the mid-00s [...] but would be completely taboo now

      https://news.ycombinator.com/item?id=18489573

    2. how annoying and rude it is that people are talking loudly on cell phones in the middle of the line. And look at how deeply and personally unfair this is

      That's actually not (just) seemingly "personally unfair"—it's collectively unfair. The folks responsible for these things serve as the better example of self-centeredness...

    3. Because my natural default setting is the certainty that situations like this are really all about me. About MY hungriness and MY fatigue and MY desire to just get home, and it’s going to seem for all the world like everybody else is just in my way.

      The fact that we're not talking about a child here, and that it was considered normal for a 43-year-old man in 2005 to have this as his default setting, perhaps explains quite a lot about the evident high skew of self-centeredness in folks who are now in their sixties and seventies.

      I didn't notice this in 2005, but maybe I wasn't paying close enough attention.

    4. clichés

      thought-terminating ones, even

    5. there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships

      Very sophomoric argument, and it's hard not to point out the irony between this claim and everything preceding it wrt self-assuredness.

      Is it impossible for there to exist people to whom this description doesn't apply, or is it merely annoying and inconvenient to consider the possibility that they might?

    6. none of this is likely, but it’s also not impossible
    7. Everyone here has done this, of course. But it hasn’t yet been part of you graduates’ actual life routine, day after week after month after year.
    8. By way of example, let’s say it’s an average adult day, and you get up in the morning, go to your challenging, white-collar, college-graduate job, and you work hard for eight or ten hours, and at the end of the day you’re tired and somewhat stressed and all you want is to go home and have a good supper and maybe unwind for an hour, and then hit the sack early because, of course, you have to get up the next day and do it all again. But then you remember there’s no food at home. You haven’t had time to shop this week because of your challenging job, and so now after work you have to get in your car and drive to the supermarket. It’s the end of the work day and the traffic is apt to be: very bad. So getting to the store takes way longer than it should, and when you finally get there, the supermarket is very crowded, because of course it’s the time of day when all the other people with jobs also try to squeeze in some grocery shopping. And the store is hideously lit and infused with soul-killing muzak or corporate pop and it’s pretty much the last place you want to be but you can’t just get in and quickly out; you have to wander all over the huge, over-lit store’s confusing aisles to find the stuff you want and you have to manoeuvre your junky cart through all these other tired, hurried people with carts (et cetera, et cetera, cutting stuff out because this is a long ceremony) and eventually you get all your supper supplies, except now it turns out there aren’t enough check-out lanes open even though it’s the end-of-the-day rush. So the checkout line is incredibly long, which is stupid and infuriating. But you can’t take your frustration out on the frantic lady working the register, who is overworked at a job whose daily tedium and meaninglessness surpasses the imagination of any of us here at a prestigious college. But anyway, you finally get to the checkout line’s front, and you pay for your food, and you get told to “Have a nice day” in a voice that is the absolute voice of death. Then you have to take your creepy, flimsy, plastic bags of groceries in your cart with the one crazy wheel that pulls maddeningly to the left, all the way out through the crowded, bumpy, littery parking lot, and then you have to drive all the way home through slow, heavy, SUV-intensive, rush-hour traffic, et cetera et cetera. Everyone here has done this, of course. But it hasn’t yet been part of you graduates’ actual life routine, day after week after month after year.
    9. how to keep from going through your comfortable, prosperous, respectable adult life dead, unconscious, a slave to your head and to your natural default setting of being uniquely, completely, imperially alone day in and day out

      "All of humanity's problems stem from man's inability to sit quietly in a room alone" —Blaise Pascal

    10. It is not the least bit coincidental that adults who commit suicide with firearms almost always shoot themselves in: the head. They shoot the terrible master.
    1. For better or worse, people will continue to run things to inspect the results manually—before grumbling about having to duplicate the effort when they make the actual test. The ergonomics are too tempting, even when they're an obvious false economy.

      What about a test-writing assistant that let you just copy and paste your terminal session into an input field/text file which the assistant would then process and transform into a test?
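
      A rough sketch of what the core of such an assistant might look like (everything here is invented: the "$ command"/output session format, and the run/test/assert names in the emitted code):

      // Turn a pasted shell session into test-case source text.
      // Lines starting with "$ " are commands; everything up to the next
      // command is treated as the expected output.
      function sessionToTests(session) {
        const cases = [];
        let current = null;
        for (const line of session.split("\n")) {
          if (line.startsWith("$ ")) {
            if (current) cases.push(current);
            current = { command: line.slice(2), expected: [] };
          } else if (current) {
            current.expected.push(line);
          }
        }
        if (current) cases.push(current);

        return cases.map(({ command, expected }) =>
          `test(${JSON.stringify(command)}, () => {\n` +
          `  assert.equal(run(${JSON.stringify(command)}), ` +
          `${JSON.stringify(expected.join("\n"))});\n` +
          `});`
        ).join("\n\n");
      }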

    1. My first startup, Emu, was a messaging app. My second startup, Miter, was not a messaging app but sometimes acted suspiciously like one. Am I obsessed with communication software? Well…maybe. I’m fascinated by the intersection of people, communities, and technology

      and, apparently, business—which definitely explains the author's overall position, their specific recommendations, and the blind spot here: the failure to mention the intersection of business with their interest in messengers.

    2. anything is better than SMS

      Happy to hear this is the author's position, at least, because delta.chat and/or something like it is really the only reasonable way forward.

      (This isn't to say the current experience with delta.chat doesn't have problems itself. I'm not even using it, in fact.)

    3. perpetuates SMS’s dependence on the mobile carrier and device

      This isn't true: where does the "and device" part come in?

    4. you can message anyone, anywhere, without thinking about it too much

      But we have already heard plenty of evidence about why this isn't true...

    5. Your Mom has an iPhone so she loved it. Your brother has an Android, so he saw a blue-green blob with a couple tan blobs in the middle.

      "I'll blame Apple" is both an acceptable and reasonable response to this.

    6. typing indicators, read receipts, stickers, the ability to edit and delete messages.

      Yeah, I don't want any of those. It's not that I'm merely unimpressed—I am actively opposed to several of them for good reasons.

    7. Messages get lost.

      The only reason why I "switched" to Signal ~5 years ago was that it became clear some of my messages weren't coming/going through.

      When I switched to Signal, the experience was even worse. Someone would send a message or attempt a voice call, but Signal would not reliably notify me that this had happened. I'd open the app to find notifications for things that should've been delivered hours/days earlier.

      This had nothing to do with my app/phone settings. Signal did deliver some notifications; it just did so unreliably. Eventually, I switched back to SMS, in part because I was baffled by how the experience with Signal could be so much worse, and in part because of a bunch of dodgy decisions by the Signal team (which were actually the main catalyst for the switch back, despite the deliverability problems).

    8. might lose all your messages if you switch phones

      As a point of fact, this has nothing to do with SMS per se....

  2. thecomputersciencebook.com
    1. That’s pretty much it

      The lack of emphasis on the Web's original design motivation (as an analog for someone sitting at the reference desk of, e.g., a corporate library, fielding your requests for materials) is something that should be corrected.

    1. The usefulness of JSON is that while both systems still need to agree on a custom protocol, it gives you an implementation for half of that custom protocol - ubiquitous libraries to parse and generate the format, so the application needs only to handle the semantics of a particular field.

      To be clear: when PeterisP says parse the format, they really mean lex the format (and do some minimal checks concerning e.g. balanced parentheses). To "handle the semantics of a particular field" is a parsing concern.
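
      For example (a sketch; the payload and field rules are invented): JSON.parse hands back a generic tree, and all of the actual parsing concern, i.e. what each field means and whether it's valid, is still on you.

      const text = '{"amount": "12.50", "currency": "USD"}';

      // The "ubiquitous library" part: tokenizing and tree-building.
      const data = JSON.parse(text);

      // The part the two systems still have to agree on: field semantics.
      if (typeof data.currency !== "string" || data.currency.length !== 3) {
        throw new Error("bad currency");
      }
      const amount = Number(data.amount);
      if (!Number.isFinite(amount) || amount < 0) {
        throw new Error("bad amount");
      }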

    1. We rebuilt Cloudflare's developer documentation - here's what we learned

      This post is one massive derp. (Or rather, the problem solved by the changes documented here is... The post itself would instead be best described as one massive "duh".)

      Spoiler alert: anti-wikis with heavyweight PR-based workflows that ignore how much friction this entails for user contributions don't beget many user contributions! (And it also sucks for the people getting paid to write and edit the content, too.)

    1. If the differentiator is the ease of putting a new application out into the world, that value prop is competing against the value prop of e.g. today’s full-stack web frameworks, right?
    1. Considerations

      What about chained dotted access? foo.bar.baz is probably okay as bar.baz @ (the Foo) (or even @the Foo), but probably not if it takes the form bar.baz from the Foo. (It just doesn't look reasonable to me.)

      Alternatively, what about @bar.baz for the Foo?

    2. this the

      should be a "to" here

    3. constructs involving a lone ASCII single quote can make the job of the parser more difficult, when single quote is already significant within the language (such as for denoting character or string literals)

      NB: not actually that much harder. In fact, my prescription today would probably be to omit the trailing s and allow only a bare single quote, which altogether would be incredibly easy to parse. (Omitting the s would also solve the it-doesn't-look-contrived-enough problem.)

    1. it’s ambiguous whether x-y is the expression x minus y or the invocation of the x-y function. Seems like a bad tradeoff, though. How often do you use -, and how often do you write multiword functions?
    2. In Lua you can write raw, multiline strings with [[]]: [[ Alice said "Bob said 'hi'". ]]

      This is indeed very good (for the reasons stated here).

    1. a lot of people think they understand the pictures but they don't

      See also: REST (compare: Fielding-style vs Valley-style)

    2. we need to reunite model language and programming languages this was the great vision of Simula of beta and Delta L o Delta was not designed to

      We need to reunite model language and programming languages. This was the great vision of Simula[...] We need to stop believing that we can document programs by some well-written code or "clean code". Clean code is great for small programs. Systems need more than comments and a few diagrams—systems need the voice of the designer in them with multimedia, but they also need more expressive paradigms for putting these in our programs.

    1. Although this episode is listed on Lex's own podcast page https://lexfridman.com/podcast/, it isn't actually available in the podcast RSS feed.

      I guessed the URL.

      And even though the latest episode right now is titled #350, there are only "300 episodes" listed for the show on the Apple Podcasts page https://podcasts.apple.com/us/podcast/lex-fridman-podcast/id1434243584

    1. how important is the concrete syntax of their language in contrast to

      how important is the concrete syntax of their language in contrast to the abstract concepts behind them? What I mean to say is: can a somewhat awkward concrete syntax be an obstacle when it comes to acceptance?

    1. global.getProcessControl = new ServiceProcurement(page)

      This can be migrated to a utility method (for ServiceProcurement); viz:

      static initialize(slotted, page, key) {
        const { OverrideFailure } = ServiceProcurement;

        // Refuse to proceed unless the page's global slot currently holds
        // the very component being overridden.
        let override = new ServiceProcurement(page, key);
        if (!(slotted.name in override.global) ||
            override.global[slotted.name] != slotted) {
          throw new OverrideFailure(slotted, page, override);
        }

        // Swap the override into the global slot.
        override.global[slotted.name] = override;
      }
      

      (Alternatively, omit the override failure checking?)

    2. ExportProcessControl

      This can (should) be parameterized—not just here, but in the procurement constructor.

    1. with Xcode you can't resume the download if it fails
    2. The above (image) is the opcodes that it had.

      Hardly small or simple. Huge departure from Oberon...

    3. It is less and less the case now, but for a while you could inspect websites to see how it was put together.

      It should not go unmentioned that a big reason why this is the case is that the types of folks in the presenter's social circles are hostile to this and traffic in dubious maxims about how it somehow has to be this way...

    1. The product of this is data that are more nearly accurate than could be secured with the distractions and many variables of shop conditions.
    2. Success in handling the human element, like success in handling the materials element, depends upon knowledge of the element itself and knowledge as to how it can best be handled.
    3. Laboratory practice has taught that while the immediate results are important, the standardization of the method is more important, since the unexpected ultimate results, sometimes called by-products, are often by far the most valuable outcome of the work.
    4. Scientific management is simply management that is based upon actual measurement

      and yet Moneyball took almost a century, and the human-level processes behind semiconductor fabrication remain astoundingly inefficient (doubly ironic given the task at hand...)

  3. Dec 2022
    1. as a developer, writing against the Win32 APIs allows your software to run on over 90 percent of the computers in the world

      (Something else that has changed in the intervening years; most computers in the world—or a plurality of them, at least—are now running Android, not Windows, but Win32 is useless on Android. It's no help on iOS, either.)

    2. web apps are just so damned easy to use

      Despite the number of times it's been submitted over the years (most recently two months ago), this post has received very little commentary on HN. I think it suffers for its choice of title. This snippet is a better candidate, I think.

    3. Microsoft totally fucked up when they took aim at Netscape. It wasn’t Netscape that was a threat to Windows as an application platform, it was the web itself.

      I dunno about this assessment...

      They knew, and they tried.

      They just eventually stopped trying, because they beat Netscape and Sun.

    4. Web apps are easier to deploy.

      Well...

      Depends on how you measure it.

    5. its overall look-and-feel is far inferior to that of a real desktop mail client

      From a 2022 perspective, things are largely thought of as the opposite. People, developers and users alike, on the whole seem to prefer the free-form canvas that enables non-traditional (otherwise referred to as "non-native") UIs—even if developers hate the front-end tech stack.

      Not saying I share the sentiment—just observing that the folks who prefer colorful, do-what-you-want UIs like those found among mobile and Web apps tend to outnumber the SerenityOS fans.

    6. This isn’t about being “Mac-like” — it applies equally to Windows and open source desktop platforms.

      Yeah, but they're basically cribbing the paradigm introduced by the Lisa and popularized by the Mac in 1984. (At least that used to be the case—before Chrome introduced the hamburger menu and everyone else followed suit, Microsoft's attempt to change things up with the ribbon notwithstanding.)

    7. They don’t have menu bars or keyboard shortcuts.

      Two things which typical users have been shown not to care about. (I understood the shortcuts thing around this time. It took me, like, 5–10 years after the date that this post was published to realize the thing about menu bars, too. And I'm not even a Mac user.)

    1. It’s fairly clear now that the current catalog process is too heavyweight. I hope we can move to a lighter workflow in the future that feels more like editing a wiki.
    1. web standards are so inscrutably complex and fast-moving that building and maintaining a new browser from scratch requires the resources of a medium-sized nation-state

      no

    1. technology-driven development

      another term for resume-driven development

    2. designers are fickle beasts, and for all their feel-good bloviation about psychology and user experience, most are actually just operating on a combination of trend and whimsy

      the attitude of software designers that gripped the early 2010s described succinctly

    3. you should use this idea to guide your app’s architecture and your class design too. Start from the problem, then work through solving that problem to building your application.
    1. Bun is written in Zig, a low-level programming language with manual memory management.

      Very awkwardly stated.

    1. The battle for convivial software in this sense appears similar to other modern struggles, such as the battle to avert climate disaster. Relying on local, individual rationality alone is a losing game: humans lack the collective consciousness that collective rationality would imply, and much human activity happens as the default result of ‘normal behaviour’. To shift this means to shift what is normal. Local incentives will play their part, but social doctrines, whether relatively transactional notions such as intergenerational contract, or quasi-spiritual notions of our evolved bond with the planet, also seem essential if there is to be hope of steering humans away from collective destruction.
    2. Consider how many web applications contain their own embedded ‘rich text’ editing widget. If linking were truly at the heart of the web’s design, a user (not just a developer) could supply their own preferred editor easily, but such a feat is almost always impossible. A convivial system should not contain multitudes; it should permit linking to them.
    1. It feels weird to say this in 2020, when the idea was presented as fait accompli in 1997, but an enabling open source software movement would operate more like a bazaar than a cathedral. There wouldn’t be an “upstream”, there would be different people who all had the version of the software that worked best for them. It would be easy to evaluate, compare, combine and modify versions, so that the version you end up with is the one that works best for you, too.
    2. I just give them the thing, and they use it.
    3. Currently modes of software development, including free and open source software, are predicated on the division of society into three classes: “developers” who make software, “the business” who sponsor software making, and “users” who do whatever it is they do. An enabling free software movement would erase these distinctions, because it would give the ability (not merely the freedom) to study and change the software to anyone who wanted or needed it.
    1. Since all reading at that time occurred out loud rather than inside one’s head, the study rooms were a modern librarian’s nightmare

      The modern library is the quiet reader's nightmare; I've been to many noisy libraries over the last decade.

    1. aren’t really pens at all, in fact, but tributes to pens

      Nice turn of phrase.

      See also: mop handles that flex and bend like rubber when subjected to the downward pressure that is typical during mopping.

    2. the cute yellow mittens my wife picked up at Target which unraveled the second time she wore them

      I've said it before: we focused too much on the dream of a 3D printer in every home when we should have focused on personal electric looms.

    3. For the remains of the Pyrex casserole that shattered when I removed it from the oven, strewing the floor with blade-like shards, some so tiny I probably won’t find them for another couple of months, and only when they lodge in my bare feet.

      Would it be possible to adulterate glassware to glow underneath a blacklight?

    1. People seem to think it's the browser's job to block ads, but my perspective is that if a business owner wants to make their business repulsive, the only sensible response is to stop using the business. Somehow once technology is involved to abstract what's happening, people start talking about how it's their right to unilaterally renegotiate the transaction. Or for another analogy that will likely make you upset: "I hate how this store charges $10 for a banana, so I am just going to pay $2 and take the banana anyway".

      terrible analogy is terrible—and I say this as someone who doesn't even fall in line with the general anti-copyright sentiment that is prevalent on sites like HN and Reddit

    1. programs with type errors must still be specified to have a well-defined semantics

      Use this to explain why Bernhardt's JS wat (or, really, folks' gut reaction to what they're seeing) is misleading.
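
      For instance, a few of the usual "wat" exhibits; every result below has exactly one answer mandated by ECMA-262's coercion rules. Surprising, yes, but nothing like undefined behavior.

      [] + [];      // ""                 (both arrays coerce to "" via toString)
      [] + {};      // "[object Object]"
      1 + "2";      // "12"               (+ prefers string concatenation)
      1 - "2";      // -1                 (- forces numeric coercion)
      NaN === NaN;  // false              (as specified, per IEEE 754)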

    1. Fielding’s dissertation (titled “Architectural Styles and the Design of Network-based Software Architectures”) is not about how to build APIs on top of HTTP but rather about HTTP itself.

      I'm a big fan of how REST is explained in jcrites's comment as part of the discussion that arose on HN in response to this piece: https://news.ycombinator.com/item?id=23672561

    2. Many more people know that Fielding’s dissertation is where REST came from than have read the dissertation

      in other words, widely referenced but rarely read

    1. "By definition" is not an argument.

      Yes it is, dumbass.

      Also:

      I am not sure why you think that having an obscure URI format will somehow give you a secure call (whatever that means). Identifiers are public information.

      Fielding. 2008. https://roy.gbiv.com/untangled/2008/rest-apis-must-be-hypertext-driven#comment-806

    1. non-concrete ideas are very hard to falsify

      Maybe this is just a regional thing, but something I really began to notice several years ago is that (a) what Jamie is saying is true, but (b) it's evident that people actually love the unfalsifiability aspect of non-concrete claims—it's safe refuge that they actively seek out.

      Two opposing examples from real life that particularly stuck out:

      • "Slow down. [You're going too fast.]"

      • "[...] since you always take so long"

      (These were two different instances/contexts with, I think, a year+ in between them; it wasn't the contrast between them that made me notice the phenomenon. Rather, I recognized each one at the time as being part of this phenomenon. They just serve as good examples because of how easily they could be made concrete—and therefore falsified in light of the facts: "without defining exactly what 'too fast' means, what is an acceptable speed?", "without defining what it means to take too long, what is an acceptable amount of time to take?"—both arising from wholly disingenuous complaints that were forms of externalized gut reactions rather than anything that would hold up under scrutiny...)

    1. Building programs by reusing generic components will seem strange if you think of programming as the act of assembling the raw statements and expressions of a programming language. The integrated circuit seemed just as strange to designers who built circuits from discrete electronic components. What is truly revolutionary about object-oriented programming is that it helps programmers reuse existing code, just as the silicon chip helps circuit builders reuse the work of chip designers.

      Oh man, this metaphor really fell apart and, if anything, works against itself.

      "If integrated circuits are superior to discrete components, why exactly are we supposed to be recreating the folly of reaching for reusable components in creating software?"

    1. The only difference is that standard data representations (XML schemas) eliminate the need for custom parsers

      They don't, though. Things like JSON, XML, etc. mean you don't need to write a lexer--not that you don't need to write a parser. People land themselves in all sorts of confused thoughts over this (smart people, even).

    1. From the 1976 edition of Naur, Randell, and Buxton's "Software Engineering: Concepts and Techniques":

      Get some intelligent ignoramus to read through your documentation and try the system; he will find many "holes" where essential information has been omitted. Unfortunately intelligent people don't stay ignorant too long, so ignorance becomes a rather precious resource. Suitable late entrants to the project are sometimes useful here.

      Burkinshaw on correctness and debugging. p. 162. (Only in the 1976 edition.)

    2. Good case for S4/BYFOB+bookmarklets.

    1. as forking Electron to make Min wouldn't make any sense, and the replier knew this, reading it to mean that seems like a mistake to me

      Right. If there are two ways to take a statement, one which is absurd because it's inconsistent with actual fact and another which isn't, it's a bad idea to make an argument that hinges on it being the former—that would mean you're insisting on an absurdity.

    1. A world in which I wouldn’t feel like my app is inadequate because I’m not using the [insert name] technology.

      they keep coming

    2. A world where I wouldn’t need dozens of third-party linters, type checkers and configurations just to make sure that my code is correct

      another you-problem

    3. unified module system and I wouldn’t need to worry about file name extensions

      There is. It was standardized in ES6 aka ES2015. 2015! Direct your ire in the appropriate direction, i.e., at your vendor—and, ultimately, at yourself regarding your own lack of foresight.

    4. I just can’t stop dreaming about a perfect world where I could go back to any of my old JavaScript projects with an ease of mind and knowing that everything just works. A perfect world where my program is going to stand the test of time.

      That's a you-problem. The pieces are there—the language is stable, and there's a ludicrously backwards compatible World Wide Wruntime that you can trust to be around—it's on you if it fails.

    5. perhaps we should take a few steps back before making decisions to reflect on our long-term vision and to recognize that how every change, even the tiny ones, could affect a whole group of users

      The truest thing I've read in this post so far.

    6. being developed on their own to push the boundaries

      I don't think so. I go back and look at what boundaries were being pushed in historical projects, like some of the stuff that Tolmasky was doing, and I see way more than today.

      The last 10 years are pretty stagnant, in comparison. React sucks up a lot of resources to ultimately do little more than reinvent the wheel (poorly). Same with the adjacent tools mentioned here.

    7. JavaScript has been evolving and growing so fast it’s unlike any other tech I’ve seen.

      This statement belies some ignorance of history, I think, or some form of intellectual dishonesty.

      JS is objectively and demonstrably slower on the uptake than what happened with Java.

    8. we mean it as a whole and how it gets used by the user, not just the language specifications alone

      Well, "as a whole", JS includes plenty that has nothing to do with the Node-affiliated silliness; it does get used by people in ways that don't involve any of this stuff. So use it that way, too, if you have a problem with the way you're using it now (and you should have a problem with it—so stop doing that!)

    9. The language is responsible for providing the context and the environment in which things happen and get shaped.

      No. Well, yes, this is true, strictly speaking. But, again, this is a true-in-some-sense-that-isn't-relevant-here sort of way.

      Parents are not responsible for the crimes that their children grow up and commit.

      The NodeJS community is responsible for the NodeJS community's problems. And people outside the NodeJS community who choose to align themselves with the NodeJS community are responsible for their choices.

    10. JavaScript is evolving too rapidly

      JS is evolving too rapidly—the ECMAScript group is putting too much in the language too often—but that's not the problem described here.

      As noted in the comments on HN[1], the complaints here are not about JS, but about NPMJS/NodeJS specifically and how the NPM approach to engineering fails.

      1. https://news.ycombinator.com/item?id=33965914
    11. Six months passes and while you had almost forgotten about your little project, you now have got some new ideas that could make it even better. The project also has a few open issues and feature requests that you can take care of. So you come back to it. “Let’s start”, you whispered to yourself with excitement. You run npm install in your terminal like an innocent man and switch to your browser to scroll on Twitter while you are waiting for the dependencies to be installed. Moments later you return to your terminal and see… an error!
    1. @15:40

      His point is that a lot of software design has failed to be even as good as print design--that software design really has focused so much on interaction, we kind of treat the computer as this machine that we need to manipulate and less about displaying information--for people to make decisions, to come to conclusions, to learn something--and that by focusing so much on interaction design, by focusing so much on the computer as a mechanical thing, we've really made bad software.

      And so he wants to say part of the problem here is that the people who design can't make these context-sensitive magic ink--right? It's like: print design but now it knows something about your context. And so designers really aren't able to make these rich information things that are dynamic but not interactive. And so you could really kind of call this "Interaction Considered Harmful", and the idea is that our software should only be interactive when it has to be and really our software should be context-aware, good, print-like displays of information.

    1. Bellheads believed in "smart" networks. Netheads believed in what David Isenberg called "The Stupid Network," a "dumb pipe" whose only job was to let some people send signals to other people
    1. This brings interesting questions back up like what happens to your online "presence" after you die (for lack of a better turn of phrase)?

      Aaron Swartz famously left instructions, predating (by years, IIRC) the decision that ended his life, for how unpublished and in-progress works should be licensed and who should become stewards/executors of the personal infrastructure he managed.

      The chrisseaton.com landing page has three social networking CTAs ("Email me", etc.). Eventually, the chrisseaton.com domain will lapse, I imagine, and the registrar or someone else will snap it up to squat it, as is their wont. And while in theory chrisseaton.github.io will retain all the same potential it had last week for much longer, no one will be able to effect any changes in the absence of an overseer empowered to act.

    1. In our way of delivering orders we emphasise explaining the context two levels up. I may tell my soldiers to raid a compound, but I would also tell them that the reason for this is to create a distraction so that the Colonel can divert the enemy away from a bridge, and that the reason the Brigadier wants the Colonel to divert the enemy is so that the bridge is easier to cross. Not only do the soldiers then know why it’s important to raid the compound (so that others can cross the bridge), but they know that if for some reason they can’t raid the compound, creating any other diversion or distraction will do in a pinch, and if they can’t do that they can still try to do something to make it easier to cross the bridge. It lets everyone adapt to change as it happens without additional instruction if they aren’t able to get in touch with me. Again I think tech could possibly learn from that.

      def

    1. By Brad J. Cox, December 06, 2004

      NB: the footnote at the end indicates that this was originally published in Byte Magazine (October 1990). By a reasonable guess, the 2004 date here is when this online copy was published to drdobbs.com?

    1. McIlroy envisioned a world where we would be constructing software by picking components off a shelf, and snapping them together like Legos. The hard work would be building the right blocks, and then it would be easy to snap them together.

      See also: Brad Cox on software ICs

    2. Briefly, Taylorism has two central tenets:Measurement: associate metrics with all aspects of work.The separation of thinking and doing: An educated class of managers measures and plans all the work, and is responsible for the overall process, while a class of laborers carries out the implementation of those plans.

      I find it difficult to reconcile these two tenets with the claim that "Taylorism is so deeply ingrained in every aspect of not just modern commerce but life in general".

      Many (most?) places—even big engineering orgs like Samsung—are failing on the first principle alone and are doing a lot of wasteful, gut-driven operational stuff.

    1. we believe that in the long term thekey to the reuse of software is to reuse analysis and design; not code

      cf akkartik

    1. A paper doesn’t even contain all the information and data required to reproduce a result. Because if it did, it would be the size of a book.
    1. The migration would not be complete without calling out that I was unable to build the Mastodon code base on our new primary Puma HTTP server.
    1. If you write an algorithm in a straightforward way in Node, you can expect it to run about as fast as if you write it in a vectorized way using Numpy, or twenty times as fast as if you write it in a straightforward way in CPython.
    2. it also has always had a binding problem with this

      No, it doesn't. The "binding problem with this" is a problem that (some) people have. JS binds it the right way.
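
      For reference, a minimal sketch of the rules in play (the object here is invented): `this` is bound by how the call is made, and you can pin it explicitly when that's what you want.

      const counter = {
        n: 0,
        bump() { return ++this.n; }
      };

      counter.bump();            // 1, receiver is `counter`, as written

      const detached = counter.bump;
      // detached();             // no receiver; TypeError in strict mode

      detached.call(counter);    // 2, receiver supplied explicitly
      const bound = counter.bump.bind(counter);
      bound();                   // 3, receiver fixed up front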

    3. Also those error messages are super confusing, which I guess is one way JS has always been worse than Python.

      That's Node, not "JS".

    4. Everywhere you could use var you can now use let or const

      No, not everywhere.

      You start replacing var with let and things can be observed to break (e.g. multiple declarations, no declaration before use...)
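
      Two concrete breakages from a mechanical var-to-let swap (minimal sketch):

      // 1. Redeclaration: legal with var, a SyntaxError with let.
      var x = 1;
      var x = 2;         // fine
      // let y = 1;
      // let y = 2;      // SyntaxError: Identifier 'y' has already been declared

      // 2. Use before declaration: var is hoisted as undefined; let sits in
      //    the temporal dead zone until its declaration runs.
      console.log(a);    // undefined
      var a = 3;
      // console.log(b); // ReferenceError: Cannot access 'b' before initialization
      // let b = 4;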

    1. I’m rather surprised that a separate issue did not come up. That is, many schools make students sign something saying that any code they create as a student has the copyright automatically assigned to the school.

      This is not nearly as common as the highlighted comments suggest—at least not for undergrads at public universities in the US. It is a thing, though, with grad students (for obvious reasons).

    1. Lacking a better name, I will just call them “Wirth-the-RISC”

      I call it "RISC-W" (esp. since Wirth is not actually pronounced like "worth", which is the backbone of this attempted pun).

      Prior art: https://en.wikipedia.org/wiki/ALGOL_W

    1. On the one hand we have our technical toolbox full but on the other, we cannot use these tools effectively because a proper infrastructure is absent.
    1. Multiplayer stategy games.

      Today, you tend to hear "web app" as a term used e.g. to contrast SPAs from classic, form-based apps. But the term was already being used then, when (barring the use of plugins, which weren't really "of" the Web) only the form-based ones were viable because SPAs and similar Web-native RIAs weren't yet an option.

      Should we refer to those (the classic Web apps) as "turn-based apps" to emphasize the paradigm in play (i.e. REST)?

    2. Hickson elaborates in his blog
    1. If I had to guess, I'd assume that this was done here to save some keystrokes by not having to pass around all arguments again.

      One of the most obnoxious motivators for turning out code that ends up sucking.

      When trying to save keystrokes, prefer keyboard macros over code macros that get checked in as source code.

  4. Nov 2022
    1. But. I somehow changed-ish laptop, and you know the problem where you change laptop and suddenly you lose access to a few things? Yes. That's one of my problems. This site was using nini to statically generate pages and interconnect them with backlinks, and it's great. But I figured I'll keep it to simple html pages that don't need any compilation for now, so that when I change this laptop, I'll still be able to publish without having to install anything.
    1. I encounter a lot more smug static weenies than smug dynamic weenies, so I defend dynamic typing out of spite. There have been a few cases where I was surrounded by SDWs and I immediately flipped around to being pro-static. The industry is moving towards static typing and I like being weird and contrarian.

      See also:

      In the Smalltalk community, I always had the reputation as the guy who was trying to make Smalltalk more static so it was more applicable for industrial applications. In the Microsoft environment, which is a very static world, you know, I go around pushing people to make everything more dynamic.

      — Allen Wirfs-Brock, in an interview with Jon Udell https://hypothes.is/a/YH0Mwp0yEeu-Ybt6B3i1Ew

    1. The creators of Scrivener have taken a process that formerly had to be done manually by writers, and built a system of cues that make it easy and natural.
    1. endless stories of people exporting Euro VI emissions compliant cars to Africa that then detonate on the local fuel

      can't parse

    1. They demand less catching up time, less file-switching and ultimately give you the very bit of information you’re looking for anyway when dealing with them.

      déformation professionnelle

    2. unused classes

      again: unmatched class selectors

      additionally, it's not the fact that they are unmatched (or that they are class selectors specifically) that's the problem—it's the fact that there are a lot of them

      the entire choice of focusing on classes and class selectors here is basically a red herring

    3. Harder to know what class does

      where "does" here means "how the author recommends it should be styled", which is not actually the purpose of the class attribute

    4. From a file size standpoint, you shouldn’t worry about repeated class names in the HTML. That’s what Gzip is for.

      you suddenly find it worthwhile to acknowledge the existence of gzip?

      what happened? https://hypothes.is/a/Bg1yFm4vEe2_WDuDjmBVaA

    5. do what’s best for the industry, the projects, and the users

      "in that order", apparently

    6. keeping class names not too specific: instead of .margin-bottom-8, you can use a more abstract name like .margin-bottom-xxs

      just torpedoed your entire argument, dude

    7. reuse

      "reuse" is not the point of classes

      semantics are the point

    8. thinking you should be able to understand an entire project only by reading the source code is pure fantasy

      lol wut

      vacillating wildly here

    9. unused classes

      class selectors are not classes

      "unused" or "unmatched class selectors" is what they're really talking about

    10. .color-red

      you mean color-red; there is no .color-red class on the div

    11. Unnecessarily verbose, heavier file size

      contradicted later

    12. “Favor composition over inheritance”. This piece of wisdom from Design Patterns, one of the most influential software engineering books, is the foundation of utility-first CSS. It also shares many principles with functional programming: immutability, composability, predictability, and avoidance of side-effects. The goal behind all those fancy terms is to write code that’s easier to maintain and to scale.

      déformation professionnelle

    1. As with many front-end techniques, the approach gets a lot of criticism for being different from what people are used to.

      Example of focusing on your weakest critics and choosing not to confront the strong arguments against your position.

    1. I want to add software to offer to inline HN links in this format when people include links to past threads in their comments.

      You could pretty easily do this with a bookmarklet. (Call it "citehn", pronounced "citation".)
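
      A rough, untested sketch (assuming the public Algolia HN API at hn.algolia.com/api/v1/items allows cross-origin reads, which it appears to): find item links on the page and append their titles. Minify onto one line behind a javascript: prefix to make it a bookmarklet.

      (async () => {
        const links = document.querySelectorAll(
          'a[href^="https://news.ycombinator.com/item?id="]');
        for (const a of links) {
          const id = new URL(a.href).searchParams.get("id");
          const res = await fetch(`https://hn.algolia.com/api/v1/items/${id}`);
          const item = await res.json();
          if (item && item.title) {            // comments come back untitled
            a.insertAdjacentText("afterend", ` ("${item.title}")`);
          }
        }
      })();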

    1. logic that depends upon fixed format/fixed position fields

      So don't do that.

      The class attribute had been in HTML for years by the time this post was written—and (contrary to any belief otherwise) was meant for this purpose, and not something contributed by the CSS group for use with CSS selectors.
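
      For example (markup and class name invented): key the logic off the author-declared class rather than the position, and it survives reordering and inserted columns.

      // Brittle: depends on fixed position; breaks if a column is added.
      const fragileTotal = document
        .querySelector("table tr:nth-child(3) td:nth-child(4)");

      // Robust: depends on the semantics the author declared via class.
      const total = document.querySelector("table .order-total");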