  1. Jan 2023
    1. The overriding class Shape has added a slot, color. Since Shape is the superclass of all other classes in ShapeLibrary, they all inherit the new slot.

      This is the one thing so far where the procedural syntactic mechanism isn't doing obvious heavy lifting.

    2. The slot definition of List fills the role of an import statement, as do those of Error and Point.

      ... at some expense to ergonomics.

      It's odd that they didn't introduce special syntax for this. They could have even used import to denote these things...

    3. The factory method is somewhat similar to a traditional constructor. However, it has a significant advantage: its usage is indistinguishable from an ordinary method invocation. This allows us to substitute factory objects for classes (or one class for another) without modifying instance creation code. Instance creation is always performed via a late bound procedural interface.

      The class semantics for new in ES6 really bungled this in an awful way.


    1. Patch based systems are idiotic, that's RCS, that is decades old technology that we know sucks (I've had a cocktail, it's 5pm, so salt away). Do you understand the difference between pass by reference and pass by value?

      Larry makes a similar analogy (pass by value vs pass by reference) to my argument about why patches are actually better at the collaboration phase: pull requests are fragile links. Transmission of patch contents is robust; patches aren't references to external systems, which amount only to a soft promise that you will service a request for the content when it comes. A patch is just the proposed change itself.

    1. Literate programming worked beautifully until we got to a stage where we wanted to refactor the program. The program structure was easy to change, but it implied a radical change to the structure of the book. There was no way we could spend a great deal of time on restructuring the book, so we ended up with writing appendices and appendices to appendices that explained what we had done. The final book became unreadable and only fit for the dustbin. The lesson was that the textbook metaphor is not applicable to program development. A textbook is written on a stable and well known subject while a program is under constant evolution. We abandoned literate programming as being too rigid for practical programming. Even if we got it right the first time, it would have failed in the subsequent maintenance phases of the program's life cycle.


    1. How do we package software in ways that maximize its reusability while minimizing the level of skill required to achieve reuse?

      Is that really the ultimate, most worthy goal? It seems that "minimizing the level of skill required[...]" is used as a proxy here for what we're really after—minimizing the total cost of producing the thing we want. Neither the minimization of skilled use nor reuse should be held as a priori goals.

    1. if you are running a software business and you aren't at, like, Google-tier scale, just throw it all in a monorepo

      The irony* of this comment is that Google and Google engineers are famously some of the most well-known users/proponents of monorepos.

      * not actual irony; just the faux irony—irony pyrite, or "fool's irony", if you like

    1. I would argue that it’s simply more fun to engage with the digital world in a read-write way, to see a problem and actually consider it fixable by tweaking from the outside

      He doesn't exactly say it here, but many others making the same observations will pair it with the suggestion that this is because of some intrinsic property of the digital medium. If you think about it, that isn't true. Consider paper: people tend to feel more empowered to tweak it for their own use (so long as they own the copy). Digital artifacts seem more hands-off, despite their potential, because the powers involved are reserved for wizards, largely thanks to the milieu that the wizards themselves have cultivated to benefit their own interests and livelihoods first, rather than to empower the ordinary computer user.

    2. Software should be a malleable medium, where anyone can edit their tools to better fit their personal needs. The laws of physics aren’t relevant here; all we need is to find ways to architect systems in such a way that they can be tweaked at runtime, and give everyone the tools to do so.

      It's clear that gklitt is referring to the ability of extensions to augment the browser, but:

      * it's not clear that he has applied the same thought process to the extension itself (which is also software, after all)
      * the conception of in-browser content as software tooling is likely a large reason why the perspective he endorses here is not more widespread—that content is fundamentally a copy of a particular work, in the parlance of US copyright law (which isn't terribly domain-appropriate here so much as its terminology is useful)

    3. CSS classes

      NB: there's no such thing as a "CSS class". They're just classes—which you may use to address things using CSS's selector language, since it was conveniently (and wisely) designed from the beginning to incorporate first-class* support for them.

      * no pun intended

    4. because it’s building on an unofficial, reverse-engineered foundation, there are no guarantees at all about when things might change underneath

      This is an unfortunate reality about the conventions followed by programmers building applications with Web-based interfaces: no one honors the tradition of the paper-based forms that their digital counterparts are supposed to mimic; they're all building TOSS-style APIs (and calling that REST) instead of actual, TURN-style REST interfaces.

    1. we have one of the most powerful languages for manipulating everything in the browser (ecmascript/javascript) at our disposal, except for manipulating the browser itself! Some browsers are trying to address this (e.g. http://conkeror.org/ -- emacs styled tiled windows in feature branch!) and I will be supporting them in whatever ways I can. What we need is the bash/emacs/vim of browsers -- e.g. coding changes to your browser (emacs style) without requiring recompiling and building.

      That was what pre-WebExtensions Firefox was. Mozilla Corp killed it.

      See Yegge's remarks on The Pinocchio Problem:

      The very best plug-in systems are powerful enough to build the entire application in its own plug-in system. This has been the core philosophy behind both Emacs and Eclipse. There's a minimal bootstrap layer, which as we will see functions as the system's hardware, and the rest of the system, to the greatest extent possible (as dictated by performance, usually), is written in the extension language.

      Firefox has a plugin system. It's a real piece of crap, but it has one, and one thing you'll quickly discover if you build a plug-in system is that there will always be a few crazed programmers who learn to use it and push it to its limits. This may fool you into thinking you have a good plug-in system, but in reality it has to be both easy to use and possible to use without rebooting the system; Firefox breaks both of these cardinal rules, so it's in an unstable state: either it'll get fixed, or something better will come along and everyone will switch to that.

      Something better didn't come along, but people switched anyway—because they more or less had to, since Mozilla abandoned what they were switching from.

    1. Sciter. Used for rendering the UI of apps. There's no browser using Sciter to display websites, and the engine is Closed source.

      Worth noting that c-smile, the creator of Sciter, put out an offer during COVID lockdowns to make Sciter open source if someone would fund it for $100k. That funding never came through.

    1. My central goal is to further Paul Otlet, et al's, vision and head toward an amalgamous World Wide Web (a Universal Knowledge Repository) freed of arbitrary, discrete "document" boundaries.

      My central goal is a universal knowledge repository freed of discrete "document" boundaries

    1. Readers must learn specific reflective strategies. “What questions should I be asking? How should I summarize what I’m reading?” Readers must run their own feedback loops. “Did I understand that? Should I re-read it? Consult another text?”

      I generally don't have to do that when reading except when reading books or academic papers. This suggests that there's not really anything wrong with the form of the book, but rather its content (or the stylistic presentation of that content, really).

      I've said it a bunch: the biggest barrier to accessibility of academic articles specifically is the almost intolerable writing style that almost every non-writer adopts when trying to write something to the standards for acceptance in a journal. Every journal article written for joyless robots should be accompanied by a blog post (or several of them) on the author's own Web site that says all the same things but is written for actual human beings.

    2. Readers can’t just read the words. They have to really think about them. Maybe take some notes. Discuss with others. Write an essay in response. Like a lecture, a book is a warmup for the thinking that happens later.

      What if, when you bought a book, it included access to a self-administered test for comprehension? Could this even solve the paying-for-things-limits-access-to-content problem? The idea would be to make the thing free (ebooks, at least), but your dead tree copy comes with access to a 20-minute interactive testing experience (in a vendor-neutral, futureproof format like HTML and inline JS—not necessarily a Web-based learning portal that could disappear at any moment).

    1. I saw this tech talk by Luis Von Ahn (co-creator of recaptcha) and learned about the idea of harnessing human computation

      Consider: a version of the game 20 Questions that helps build up a knowledge base that can be relied upon for Mek's aforementioned Michael Jackson-style answers.

    1. In one of our early conversations with developers working on CLIs outside of Shopify, oclif came up as an excellent framework of tools and APIs to build CLIs in Node. For instance, it was born from Heroku’s CLI to support the development of other CLIs. After we decided on Node, we looked at oclif’s feature set more thoroughly, built small prototypes, and decided to build the Node CLI on their APIs, conventions, and ecosystem. In hindsight, it was an excellent idea.

      Minority(?) viewpoint: oclif-based command-line apps (if they're anything like Heroku's, at least) follow conventions that are alien and make them undesirable.

    1. And my mom is getting older now and I wish I had all the comments, posts, and photos from the past 14 years to look back on and reminisce. Can’t do that now.

      This reminds me of when, during the height of the iPod era, someone I know was gifted* a non-Apple music player and some iTunes gift cards—their first device for their first music purchases not delivered on physical media. They created an iTunes account, bought a bunch of music on the Music Store, and then set about trying to get it onto their non-Apple device, coming to me when their own attempts to make it work weren't going well. I explained how Apple had (at the time) made iTunes Music Store purchases incompatible with non-Apple devices. Their response was baffling to me:

      Rather than rightly getting pissed at Apple for this state of affairs, they did the opposite—they expressed their disdain about the non-Apple MP3 player they were given** and resolved to get it exchanged for credit so they could buy a (pricier, of course) Apple device that would "work". That is, they felt the direct, non-hypothetical effects of Apple's incompatibility ploy, and then still took exactly the wrong approach by caving despite how transparently nefarious it all was.

      Returning to this piece: imagine if all that stuff hadn't been locked up in the social media silo. Imagine if all those "comments, posts, and photos from the past 14 years" hadn't been unwisely handed over for someone else to keep out of reach unless you assimilated. Imagine just having it delivered directly to your own inbox.

      * NB: not by me

      ** NB: not as a consequence of mimetic desire for the trendiest device; they were perfectly happy with the generic player before they understood the playback problem

    2. It’s not feasible to constantly be texting and phone calling Paula from 10th grade geometry, etc.

      This was initially confusing. What makes texting infeasible, but doing it through Facebook is feasible? I realized upon reaching the end of the next paragraph: "I cant make a new Facebook in 2023 and add all these old friends. Literally psychotic behavior."

      When this person talks about "keeping up", they don't mean "interacting". They mean non-interactively keeping tabs on people they once knew but that they don't really have an ongoing relationship with.

    1. It's interesting how few comments are engaging with the substance of the piece. Commenters are encountering for the first time the idea that Rikard is providing a commentary on—giving students their own big-kid Web site, an idea that "belongs" to the "Domain of One's Own" effort—and expressing enthusiasm for it here as comments nominally about this piece, which is really intended as a specific, reflective/critical response to the overall idea and does not pretend to present that idea as a novel suggestion for the first time...

    1. the patriotic or religious bumper-stickers

      College graduates in 2005 could understand what this meant. I'm skeptical that college graduates in 2023 can really grok this allusion, even if it were explained.

      See also:

      this previous comment thread with a minority detractor view on Idiocracy [...] argues it’s a little more dated to it’s specific Bush-era cultural milieu than everyone remembers

      https://news.ycombinator.com/item?id=29738799

      E.g.:

      [Idiocracy's] "you talk faggy" [...] sadly was common in real life during the mid-00s [...] but would be completely taboo now

      https://news.ycombinator.com/item?id=18489573

    2. how annoying and rude it is that people are talking loudly on cell phones in the middle of the line. And look at how deeply and personally unfair this is

      That's actually not (just) seemingly "personally unfair"—it's collectively unfair. The folks responsible for these things serve as the better example of self-centeredness...

    3. Because my natural default setting is the certainty that situations like this are really all about me. About MY hungriness and MY fatigue and MY desire to just get home, and it’s going to seem for all the world like everybody else is just in my way.

      The fact that we're not talking about a child here but that it was considered normal for a 43-year-old man in 2005 to have this as his default setting perhaps explains quite a lot about the evident high skew of self-centeredness in folks who are now in their sixties and seventies.

      I didn't notice this in 2005, but maybe I wasn't paying close enough attention.

    4. there is actually no such thing as atheism. There is no such thing as not worshipping. Everybody worships

      Very sophomoric argument, and it's hard not to point out the irony between this claim and everything preceding it wrt self-assuredness.

      Is it impossible for there to exist people to whom this description doesn't apply, or is it merely annoying and inconvenient to consider the possibility that they might?

    5. By way of example, let’s say it’s an average adult day, and you get up in the morning, go to your challenging, white-collar, college-graduate job, and you work hard for eight or ten hours, and at the end of the day you’re tired and somewhat stressed and all you want is to go home and have a good supper and maybe unwind for an hour, and then hit the sack early because, of course, you have to get up the next day and do it all again. But then you remember there’s no food at home. You haven’t had time to shop this week because of your challenging job, and so now after work you have to get in your car and drive to the supermarket. It’s the end of the work day and the traffic is apt to be: very bad. So getting to the store takes way longer than it should, and when you finally get there, the supermarket is very crowded, because of course it’s the time of day when all the other people with jobs also try to squeeze in some grocery shopping. And the store is hideously lit and infused with soul-killing muzak or corporate pop and it’s pretty much the last place you want to be but you can’t just get in and quickly out; you have to wander all over the huge, over-lit store’s confusing aisles to find the stuff you want and you have to manoeuvre your junky cart through all these other tired, hurried people with carts (et cetera, et cetera, cutting stuff out because this is a long ceremony) and eventually you get all your supper supplies, except now it turns out there aren’t enough check-out lanes open even though it’s the end-of-the-day rush. So the checkout line is incredibly long, which is stupid and infuriating. But you can’t take your frustration out on the frantic lady working the register, who is overworked at a job whose daily tedium and meaninglessness surpasses the imagination of any of us here at a prestigious college. 
But anyway, you finally get to the checkout line’s front, and you pay for your food, and you get told to “Have a nice day” in a voice that is the absolute voice of death. Then you have to take your creepy, flimsy, plastic bags of groceries in your cart with the one crazy wheel that pulls maddeningly to the left, all the way out through the crowded, bumpy, littery parking lot, and then you have to drive all the way home through slow, heavy, SUV-intensive, rush-hour traffic, et cetera et cetera. Everyone here has done this, of course. But it hasn’t yet been part of you graduates’ actual life routine, day after week after month after year.
    6. how to keep from going through your comfortable, prosperous, respectable adult life dead, unconscious, a slave to your head and to your natural default setting of being uniquely, completely, imperially alone day in and day out

      "All of humanity's problems stem from man's inability to sit quietly in a room alone" —Blaise Pascal

    1. For better or worse, people will continue to run things to inspect the results manually—before grumbling about having to duplicate the effort when they make the actual test. The ergonomics are too tempting, even when they're an obvious false economy.

      What about a test-writing assistant that let you just copy and paste your terminal session into an input field/text file which the assistant would then process and transform into a test?
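      A rough sketch of what such an assistant's core could look like, assuming the simplest possible transcript convention: lines starting with `$ ` are commands, and everything up to the next command is that command's expected output. Every name here (`transcriptToTest`, the emitted `assertOutput` helper) is hypothetical.

      ```javascript
      // Hypothetical sketch: turn a pasted terminal session into a crude test
      // script, one assertion per captured command.
      function transcriptToTest(transcript) {
        const cases = [];
        let current = null;
        for (const line of transcript.split("\n")) {
          if (line.startsWith("$ ")) {
            // A new command begins; subsequent lines are its expected output.
            current = { command: line.slice(2), output: [] };
            cases.push(current);
          } else if (current && line.trim() !== "") {
            current.output.push(line);
          }
        }
        return cases
          .map((c) => `assertOutput(${JSON.stringify(c.command)}, ${JSON.stringify(c.output.join("\n"))});`)
          .join("\n");
      }

      const session = [
        "$ echo hello",
        "hello",
        "$ wc -c greeting.txt",
        "6 greeting.txt",
      ].join("\n");

      const generated = transcriptToTest(session);
      // generated is a two-line script of assertOutput(...) calls
      ```

      The interesting design question is the one this sketch dodges: deciding which parts of the captured output are stable enough to assert on (timestamps and hashes usually aren't).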

    1. My first startup, Emu, was a messaging app. My second startup, Miter, was not a messaging app but sometimes acted suspiciously like one. Am I obsessed with communication software? Well…maybe. I’m fascinated by the intersection of people, communities, and technology

      and, apparently, business—which explains the author's overall position, specific recommendations, and this blind spot (failing to mention the intersection of business with their interest in messengers).

    2. anything is better than SMS

      Happy to hear this is the author's position, at least, because delta.chat and/or something like it is really the only reasonable way forward.

      (This isn't to say the current experience with delta.chat doesn't have problems itself. I'm not even using it, in fact.)

    3. Your Mom has an iPhone so she loved it. Your brother has an Android, so he saw a blue-green blob with a couple tan blobs in the middle.

      "I'll blame Apple" is both an acceptable and reasonable response to this.

    4. typing indicators, read receipts, stickers, the ability to edit and delete messages.

      Yeah, I don't want any of those. It's not that I'm merely unimpressed—I am actively opposed to several of them for good reasons.

    5. Messages get lost.

      The only reason why I "switched" to Signal ~5 years ago was that it became clear that some of my messages weren't going through.

      When I switched to Signal, the experience was even worse. Someone would send the message or attempt a voice call, but Signal would not reliably notify me that this happened. I'd open the app to find notifications for things that should've been delivered hours/days earlier.

      This had nothing to do with my app/phone settings. Signal did deliver some notifications, but it would do so unreliably. Eventually, I switched back to SMS in part because I was baffled by how the experience with Signal could be so much worse—as well as a bunch of dodgy decisions by the Signal team (which was actually the main catalyst for the switch back, despite the deliverability problems).

  2. thecomputersciencebook.com
    1. That’s pretty much it

      The lack of emphasis on the original design motivations for the Web as an analog for someone sitting at the reference desk in e.g. a corporate library who will field your request for materials is something that should be corrected.

    1. The usefulness of JSON is that while both systems still need to agree on a custom protocol, it gives you an implementation for half of that custom protocol - ubiquitous libraries to parse and generate the format, so the application needs only to handle the semantics of a particular field.

      To be clear: when PeterisP says "parse the format", they really mean lex the format (and do some minimal checks concerning e.g. balanced brackets and braces). To "handle the semantics of a particular field" is a parsing concern.
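      A minimal sketch of the distinction, in JavaScript (my own illustration; `parseUser` and the field shapes are invented for the example): `JSON.parse` does the lexing-and-well-formedness half, and the application still has to do the real parsing of each field's semantics.

      ```javascript
      const raw = '{"user": {"id": "42", "roles": ["admin"]}}';

      // The library half: a generic tree, with no domain knowledge at all.
      const tree = JSON.parse(raw);

      // The application half: the actual parser, turning the generic tree
      // into a checked, meaningful value.
      function parseUser(node) {
        if (typeof node !== "object" || node === null) throw new Error("expected object");
        const id = Number(node.id); // the wire format carries "42" as a string
        if (!Number.isInteger(id)) throw new Error("id must be an integer");
        if (!Array.isArray(node.roles)) throw new Error("roles must be an array");
        return { id, roles: node.roles.map(String) };
      }

      const user = parseUser(tree.user);
      // user.id === 42, user.roles[0] === "admin"
      ```

      `JSON.parse` would have accepted `{"id": "forty-two"}` just as happily; only the second half catches that.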

    1. We rebuilt Cloudflare's developer documentation - here's what we learned

      This post is one massive derp. (Or rather, the problem solved by the changes documented here is... The post itself would instead be best described as one massive "duh".)

      Spoiler alert: anti-wikis with heavyweight PR-based workflows that ignore how much friction this entails for user contributions don't beget many user contributions! (And it also sucks for the people getting paid to write and edit the content, too.)

    1. constructs involving a lone ASCII single quote can make the job of the parser more difficult, when single quote is already significant within the language (such as for denoting character or string literals)

      NB: not actually that much harder. In fact, my prescription today would probably be to omit the trailing s and allow only a bare single quote, which altogether would be incredibly easy to parse. (Omitting the s would also solve the it-doesn't-look-contrived-enough problem.)
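      To illustrate the claim (my own sketch, not anything from the source): if only a bare trailing apostrophe is allowed in names, a tokenizer needs a single one-character check to decide whether a quote belongs to the preceding identifier or opens a literal.

      ```javascript
      // Toy tokenizer: a quote directly after identifier characters is part
      // of the name; anywhere else it opens a string literal.
      function tokenize(src) {
        const tokens = [];
        let i = 0;
        while (i < src.length) {
          const ch = src[i];
          if (/\s/.test(ch)) { i++; continue; }
          if (/[A-Za-z_]/.test(ch)) {
            let j = i;
            while (j < src.length && /[A-Za-z0-9_]/.test(src[j])) j++;
            if (src[j] === "'") j++; // bare possessive quote joins the name
            tokens.push({ kind: "name", text: src.slice(i, j) });
            i = j;
          } else if (ch === "'") {
            const close = src.indexOf("'", i + 1);
            tokens.push({ kind: "string", text: src.slice(i + 1, close) });
            i = close + 1;
          } else {
            tokens.push({ kind: "punct", text: ch });
            i++;
          }
        }
        return tokens;
      }

      const toks = tokenize("shape' color = 'red'");
      // → name "shape'", name "color", punct "=", string "red"
      ```

      Allowing a trailing `'s` instead would force the tokenizer to look further ahead (is the `s` part of the marker or the start of the next word?), which is exactly the difficulty the highlight describes.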

    1. we need to reunite model language and programming languages this was the great vision of Simula of beta and Delta L o Delta was not designed to

      We need to reunite model language and programming languages. This was the great vision of Simula[...] We need to stop believing that we can document programs by some well-written code or "clean code". Clean code is great for small programs. Systems need more than comments and a few diagrams—systems need the voice of the designer in them with multimedia, but they also need more expressive paradigms for putting these in our programs.

    1. global.getProcessControl = new ServiceProcurement(page)

      This can be migrated to a utility method (for ServiceProcurement); viz:

      // Verify that the slot currently holds the object we expect to override
      // before installing the new ServiceProcurement instance in its place.
      static initialize(slotted, page, key) {
        const { OverrideFailure } = ServiceProcurement;

        const override = new ServiceProcurement(page, key);
        if (!(slotted.name in override.global) ||
            override.global[slotted.name] !== slotted) {
          throw new OverrideFailure(slotted, page, override);
        }

        override.global[slotted.name] = override;
      }
      

      (Alternatively, omit the override failure checking?)

    1. It is less and less the case now, but for a while you could inspect websites to see how it was put together.

      It should not go unmentioned that a big reason why this is the case is that the types of folks in the presenter's social circles are hostile to this and traffic in dubious maxims about how it somehow has to be this way...

    1. Scientific management is simply management that is based upon actual measurement

      and yet Moneyball took almost a century, and the human-level processes behind semiconductor fabrication remain astoundingly inefficient (doubly ironic given the task at hand...)

  3. Dec 2022
    1. as a developer, writing against the Win32 APIs allows your software to run on over 90 percent of the computers in the world

      (Something else that has changed in the intervening years; most computers in the world—or a plurality of them, at least—are now running Android, not Windows, but Win32 is useless on Android. It's no help on iOS, either.)

    2. web apps are just so damned easy to use

      Despite the number of times it's been submitted over the years (most recently two months ago), this post has received very little commentary on HN. I think it suffers for its choice of title. This snippet is a better candidate, I think.

    3. Microsoft totally fucked up when they took aim at Netscape. It wasn’t Netscape that was a threat to Windows as an application platform, it was the web itself.

      I dunno about this assessment...

      They knew, and they tried.

      They just eventually stopped trying, because they beat Netscape and Sun.

    4. its overall look-and-feel is far inferior to that of a real desktop mail client

      From a 2022 perspective, things are largely thought of as the opposite. Developers and users alike, on the whole, seem to prefer the free-form canvas that enables non-traditional (otherwise referred to as "non-native") UIs—even if developers hate the front-end tech stack.

      Not saying I share the sentiment—just observing that the folks who prefer colorful, do-what-you-want UIs like those found among mobile and Web apps tend to outnumber the SerenityOS fans.

    5. This isn’t about being “Mac-like” — it applies equally to Windows and open source desktop platforms.

      Yeah, but they're basically cribbing the paradigm introduced by the Lisa and popularized by the Mac in 1984. (At least that used to be the case—before Chrome introduced the hamburger menu and everyone else followed suit, Microsoft's attempt to change things up with the ribbon notwithstanding.)

    6. They don’t have menu bars or keyboard shortcuts.

      Two things which typical users have been shown not to care about. (I understood the shortcuts thing around this time. It took me, like, 5–10 years after the date that this post was published to realize the thing about menu bars, too. And I'm not even a Mac user.)

    1. designers are fickle beasts, and for all their feel-good bloviation about psychology and user experience, most are actually just operating on a combination of trend and whimsy

      the attitude of software designers that gripped the early 2010s described succinctly

    1. The battle for convivial software in this sense appears similar to other modern struggles, such as the battle to avert climate disaster. Relying on local, individual rationality alone is a losing game: humans lack the collective consciousness that collective rationality would imply, and much human activity happens as the default result of 'normal behaviour'. To shift this means to shift what is normal. Local incentives will play their part, but social doctrines, whether relatively transactional notions such as intergenerational contract, or quasi-spiritual notions of our evolved bond with the planet, also seem essential if there is to be hope of steering humans away from collective destruction.

      Quoted here: https://malleable.systems/catalog/

    2. Consider how many web applications contain their own embedded 'rich text' editing widget. If linking were truly at the heart of the web's design, a user (not just a developer) could supply their own preferred editor easily, but such a feat is almost always impossible. A convivial system should not contain multitudes; it should permit linking to them.

      Quoted here: https://malleable.systems/catalog/

    1. It feels weird to say this in 2020, when the idea was presented as fait accompli in 1997, but an enabling open source software movement would operate more like a bazaar than a cathedral. There wouldn’t be an “upstream”, there would be different people who all had the version of the software that worked best for them. It would be easy to evaluate, compare, combine and modify versions, so that the version you end up with is the one that works best for you, too.
    2. Currently modes of software development, including free and open source software, are predicated on the division of society into three classes: “developers” who make software, “the business” who sponsor software making, and “users” who do whatever it is they do. An enabling free software movement would erase these distinctions, because it would give the ability (not merely the freedom) to study and change the software to anyone who wanted or needed it.
    1. Since all reading at that time occurred out loud rather than inside one’s head, the study rooms were a modern librarian’s nightmare

      The modern library is the quiet reader's nightmare; I've been to many noisy libraries over the last decade.

    1. aren’t really pens at all, in fact, but tributes to pens

      Nice turn of phrase.

      See also: mop handles that flex and bend like rubber when subjected to the downward pressure that is typical during mopping.

    2. the cute yellow mittens my wife picked up at Target which unraveled the second time she wore them

      I've said it before: we focused too much on the dream of a 3D printer in every home when we should have focused on personal electric looms.

    3. For the remains of the Pyrex casserole that shattered when I removed it from the oven, strewing the floor with blade-like shards, some so tiny I probably won’t find them for another couple of months, and only when they lodge in my bare feet.

      Would it be possible to adulterate glassware to glow underneath a blacklight?

    1. People seem to think it's the browser's job to block ads, but my perspective is that if a business owner wants to make their business repulsive, the only sensible response is to stop using the business. Somehow once technology is involved to abstract what's happening, people start talking about how it's their right to unilaterally renegotiate the transaction. Or for another analogy that will likely make you upset: "I hate how this store charges $10 for a banana, so I am just going to pay $2 and take the banana anyway".

      terrible analogy is terrible—and I say this as someone who doesn't even fall in line with the general anti-copyright sentiment that is prevalent on sites like HN and Reddit

    1. non-concrete ideas are very hard to falsify

      Maybe this is just a regional thing, but something I really began to notice several years ago is that (a) what Jamie is saying is true, but (b) it's evident that people actually love the unfalsifiability aspect of non-concrete claims—it's safe refuge that they actively seek out.

      Two opposing examples from real life that particularly stuck out:

      * "Slow down. [You're going too fast.]"
      * "[...] since you always take so long"

      (These were two different instances/contexts with, I think, a year+ in between them; it wasn't the contrast between them that made me notice the phenomenon. Rather, I recognized each one at the time as being part of this phenomenon. They just serve as good examples because of how easily they could be made concrete—and therefore falsified in light of the facts: "without defining exactly what 'too fast' means, what is an acceptable speed?", "without defining what it means to take too long, what is an acceptable amount of time to take?"—both arising from wholly disingenuous complaints that were forms of externalized gut reactions rather than anything that would hold up under scrutiny...)

    1. Building programs by reusing generic components will seem strange if you think of programming as the act of assembling the raw statements and expressions of a programming language. The integrated circuit seemed just as strange to designers who built circuits from discrete electronic components. What is truly revolutionary about object-oriented programming is that it helps programmers reuse existing code, just as the silicon chip helps circuit builders reuse the work of chip designers.

      Oh man, this metaphor really fell apart and, if anything, works against itself.

      "If integrated circuits are superior to discrete components, why exactly are we supposed to be recreating the folly of reaching for reusable components in creating software?"

    1. The only difference is that standard data representations (XML schemas) eliminate the need for custom parsers

      They don't, though. Things like JSON, XML, etc. mean you don't need to write a lexer--not that you don't need to write a parser. People land themselves in all sorts of confused thoughts over this (smart people, even).
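      To make the lexer/parser distinction concrete, here's a minimal sketch in JavaScript (the `parsePoint` name and the "point" shape are invented for illustration): `JSON.parse` does the tokenizing and the generic-grammar work, but mapping the resulting tree onto your own structure, and rejecting well-formed JSON that doesn't fit it, is still a parser you write yourself.

      ```javascript
      // JSON.parse does the lexing and the generic-grammar work for you.
      // What's left (mapping the generic tree onto your domain's structure)
      // is the part a schema format doesn't eliminate.
      function parsePoint(text) {
        const data = JSON.parse(text); // tokens -> generic tree: handled
        // The "custom parser" part: is this tree actually a point?
        if (typeof data !== "object" || data === null ||
            typeof data.x !== "number" || typeof data.y !== "number") {
          throw new Error("not a point");
        }
        return { x: data.x, y: data.y };
      }

      console.log(parsePoint('{"x": 1, "y": 2}')); // { x: 1, y: 2 }

      try {
        parsePoint('{"x": "one"}'); // well-formed JSON, wrong structure
      } catch (e) {
        console.log(e.message); // "not a point"
      }
      ```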

    1. From the 1976 edition of Naur, Randell, and Buxton's "Software Engineering: Concepts and Techniques":

      Get some intelligent ignoramus to read through your documentation and try the system; he will find many "holes" where essential information has been omitted. Unfortunately intelligent people don't stay ignorant too long, so ignorance becomes a rather precious resource. Suitable late entrants to the project are sometimes useful here.

      Burkinshaw on correctness and debugging. p. 162. (Only in the 1976 edition.)

    1. as forking Electron to make Min wouldn't make any sense, and the replier knew this, reading it to mean that seems like a mistake to me

      Right. If there are two ways to take a statement, one which is absurd because it's inconsistent with actual fact and another which isn't, it's a bad idea to make an argument that hinges on it being the former—that would mean you're insisting on an absurdity.

    1. unified module system and I wouldn’t need to worry about file name extensions

      There is. It was standardized in ES6 aka ES2015. 2015! Direct your ire in the appropriate direction, i.e., at your vendor—and, ultimately, at yourself regarding your own lack of foresight.

    2. I just can’t stop dreaming about a perfect world where I could go back to any of my old JavaScript projects with an ease of mind and knowing that everything just works. A perfect world where my program is going to stand the test of time.

      That's a you-problem. The pieces are there—the language is stable, and there's a ludicrously backwards compatible World Wide Wruntime that you can trust to be around—it's on you if it fails.

    3. perhaps we should take a few steps back before making decisions to reflect on our long-term vision and to recognize that how every change, even the tiny ones, could affect a whole group of users

      The truest thing I've read in this post so far.

    4. being developed on their own to push the boundaries

      I don't think so. I go back and look at what boundaries were being pushed in historical projects like some of the stuff that Tolmasky was doing and I see way more than today.

      The last 10 years are pretty stagnant, in comparison. React sucks up a lot of resources to ultimately do little more than reinvent the wheel (poorly). Same with the adjacent tools mentioned here.

    5. JavaScript has been evolving and growing so fast it’s unlike any other tech I’ve seen.

      This statement belies some ignorance of history, I think, or some form of intellectual dishonesty.

      JS is objectively and demonstrably slower on the uptake than what happened with Java.

    6. we mean it as a whole and how it gets used by the user, not just the language specifications alone

      Well, "as a whole", JS includes non-Node-affiliated silliness; it does get used by people in ways that don't involve any of this stuff. So use it that way, too, if you have a problem with the way you're using it now (and you should have a problem with it—so stop doing that!)

    7. The language is responsible for providing the context and the environment in which things happen and get shaped.

      No. Well, yes, this is true, strictly speaking. But, again, this is a true-in-some-sense-that-isn't-relevant-here sort of way.

      Parents are not responsible for the crimes that their children grow up and commit.

      The NodeJS community is responsible for the NodeJS community's problems. And people outside the NodeJS community who choose to align themselves with the NodeJS community are responsible for their choices.

    8. Six months passes and while you had almost forgotten about your little project, you now have got some new ideas that could make it even better. The project also has a few open issues and feature requests that you can take care of. So you come back to it. “Let’s start”, you whispered to yourself with excitement. You run npm install in your terminal like an innocent man and switch to your browser to scroll on Twitter while you are waiting for the dependencies to be installed. Moments later you return to your terminal and see… an error!
    1. @15:40

      His point is that a lot of software design has failed to be even as good as print design--that software design really has focused so much on interaction, we kind of treat the computer as this machine that we need to manipulate and less about displaying information--for people to make decisions, to come to conclusions, to learn something--and that by focusing so much on interaction design, by focusing so much on the computer as a mechanical thing, we've really made bad software.

      And so he wants to say part of the problem here is that the people who design can't make these context-sensitive magic ink--right? It's like: print design but now it knows something about your context. And so designers really aren't able to make these rich information things that are dynamic but not interactive. And so you could really kind of call this "Interaction Considered Harmful", and the idea is that our software should only be interactive when it has to be and really our software should be context-aware, good, print-like displays of information.

    1. This brings interesting questions back up like what happens to your online "presence" after you die (for lack of a better turn of phrase)?

      Aaron Swartz famously left instructions predating (by years IIRC) the decision that ended his life for the way that unpublished and in-progress works should be licensed and who should become stewards/executors for the personal infrastructure he managed.

      The chrisseaton.com landing page has three social networking CTAs ("Email me", etc.) Eventually, the chrisseaton.com domain will lapse, I imagine, and the registrar or someone else will snap it up to squat it, as is their wont. And while in theory chrisseaton.github.io will retain all the same potential it had last week for much longer, no one will be able to effect any changes in the absence of an overseer empowered to act.

    1. In our way of delivering orders we emphasise explaining the context two levels up. I may tell my soldiers to raid a compound, but I would also tell them that the reason for this is to create a distraction so that the Colonel can divert the enemy away from a bridge, and that the reason the Brigadier wants the Colonel to divert the enemy is so that the bridge is easier to cross. Not only do the soldiers then know why it’s important to raid the compound (so that others can cross the bridge), but they know that if for some reason they can’t raid the compound, creating any other diversion or distraction will do in a pinch, and if they can’t do that they can still try to do something to make it easier to cross the bridge. It lets everyone adapt to change as it happens without additional instruction if they aren’t able to get in touch with me. Again I think tech could possibly learn from that.

      def

    1. McIlroy envisioned a world where we would be constructing software by picking components off a shelf, and snapping them together like Legos. The hard work would be building the right blocks, and then it would be easy to snap them together.

      See also: Brad Cox on software ICs

    2. Briefly, Taylorism has two central tenets: Measurement: associate metrics with all aspects of work. The separation of thinking and doing: An educated class of managers measures and plans all the work, and is responsible for the overall process, while a class of laborers carries out the implementation of those plans.

      I find it difficult to reconcile these two tenets with the claim that "Taylorism is so deeply ingrained in every aspect of not just modern commerce but life in general".

      Many (most?) places—even big engineering orgs like Samsung—are failing on the first principle alone and are doing a lot of wasteful, gut-driven operational stuff.

    1. Everywhere you could use var you can now use let or const

      No, not everywhere.

      You start replacing var with let and things can be observed to break (e.g. multiple declarations, no declaration before use...)
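      A quick sketch of two such breakages (plain JavaScript; function names are mine): redeclaration, which `var` permits and `let` rejects at parse time, and use-before-declaration, where `var` hoists to `undefined` while a `let` binding throws.

      ```javascript
      // 1. Redeclaration: legal with var, a SyntaxError if both become `let`.
      var x = 1;
      var x = 2; // fine today; `let x = 1; let x = 2;` won't even parse

      // 2. Use before declaration: var hoists (value undefined), while a
      //    `let` binding throws a ReferenceError in its temporal dead zone.
      function hoisted() {
        const before = typeof y; // "undefined": y is hoisted
        var y = 3;
        return before;
      }

      function tdz() {
        try {
          z; // reading z before its `let` below throws
        } catch (e) {
          return e.name;
        }
        let z = 3;
        return "no error";
      }

      console.log(hoisted()); // "undefined"
      console.log(tdz());     // "ReferenceError"
      ```

      The redeclaration case is the sneakier one: it's common in old scripts that were concatenated or pasted together, and it only surfaces once the swap is made.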

    1. I’m rather surprised that a separate issue did not come up. That is, many schools make students sign something saying that any code they create as a student has the copyright automatically assigned to the school.

      This is not nearly as common as the highlighted comments suggest—at least not for undergrads at public universities in the US. It is a thing, though, with grad students (for obvious reasons).

    1. Multiplayer stategy games.

      Today, you tend to hear "web app" as a term used e.g. to contrast SPAs from classic, form-based apps. But the term was already being used then, when (barring the use of plugins, which weren't really "of" the Web) only the form-based ones were viable because SPAs and similar Web-native RIAs weren't yet an option.

      Should we refer to those (the classic Web apps) as "turn-based apps" to emphasize the paradigm in play (i.e. REST)?

    1. If I had to guess, I'd assume that this was done here to save some keystrokes by not having to pass around all arguments again.

      One of the most obnoxious motivators for turning out code that ends up sucking.

      When trying to save keystrokes, prefer keyboard macros over code macros that get checked in as source code.

  4. Nov 2022
    1. But. I somehow changed-ish laptop, and you know the problem where you change laptop and suddenly you lose access to a few things? Yes. That's one of my problems. This site was using nini to statically generate pages and interconnect them with backlinks, and it's great. But I figured I'll keep it to simple html pages that don't need any compilation for now, so that when I change this laptop, I'll still be able to publish without having to install anything.
    1. I encounter a lot more smug static weenies than smug dynamic weenies, so I defend dynamic typing out of spite. There have been a few cases where I was surrounded by SDWs and I immediately flipped around to being pro-static. The industry is moving towards static typing and I like being weird and contrarian.

      See also:

      In the Smalltalk community, I always had the reputation as the guy who was trying to make Smalltalk more static so it was more applicable for industrial applications. In the Microsoft environment, which is a very static world, you know, I go around pushing people to make everything more dynamic.

      — Allen Wirfs-Brock, in an interview with Jon Udell https://hypothes.is/a/YH0Mwp0yEeu-Ybt6B3i1Ew

    1. They demand less catching up time, less file-switching and ultimately give you the very bit of information you’re looking for anyway when dealing with them.

      déformation professionnelle

    2. unused classes

      again: unmatched class selectors

      additionally, it's not the fact that they are unmatched (or that they are class selectors specifically) that's the problem—it's that there are a lot of them

      the entire choice of focusing on classes and class selectors here is basically a red herring

    3. “Favor composition over inheritance”. This piece of wisdom from Design Patterns, one of the most influential software engineering books, is the foundation of utility-first CSS. It also shares many principles with functional programming: immutability, composability, predictability, and avoidance of side-effects. The goal behind all those fancy terms is to write code that’s easier to maintain and to scale.

      déformation professionnelle

    1. I want to add software to offer to inline HN links in this format when people include links to past threads in their comments.

      You could pretty easily do this with a bookmarklet. (Call it "citehn", pronounced "citation".)
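      A rough sketch of what that could look like; everything here (the `hnItemId` helper, the inline `[HN <id>]` format) is invented for illustration, not an existing tool.

      ```javascript
      // Core of a hypothetical "citehn" bookmarklet: pull the item id out
      // of any HN thread URL, then decorate matching links in the page.
      function hnItemId(href) {
        const m = href.match(/news\.ycombinator\.com\/item\?id=(\d+)/);
        return m ? m[1] : null;
      }

      // DOM half (browser-only): append the id to each HN link's text.
      // Wrapped as javascript:(function(){ ... })(); for the address bar.
      function citehn(doc) {
        for (const a of doc.querySelectorAll("a[href]")) {
          const id = hnItemId(a.href);
          if (id) a.textContent += ` [HN ${id}]`;
        }
      }

      console.log(hnItemId("https://news.ycombinator.com/item?id=33247660")); // "33247660"
      ```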

    1. logic that depends upon fixed format/fixed position fields

      So don't do that.

      The class attribute had been in HTML for years by the time this post was written—and (contrary to any belief otherwise) was meant for this purpose, and not something contributed by the CSS group for use with CSS selectors.

    1. The idea that a billion of us can keep dumping fresh content into our account for free and that none of this content seems to be ever lost, is honestly quite bizarre even if we take it for granted.

      You can change the framing and gain new insights.

      In the mastodon.technology shutdown post linked, the author describes a situation where the workload exceeds the capacity of an ordinary person, even a motivated one. (There's an argument to be made that this makes for someone who isn't merely an ordinary person—but that strengthens the point I'm about to make, instead of weakening it.)

      How do we fix this problem? In other words, how do we ensure that the workload of an "instance" remains within the realm of feasibility for an ordinary person?

      Answer: making it the responsibility of each person. A single volunteer admin should not be responsible for hundreds, thousands, or more other people. Getting each person to shoulder their own personal load is far more tractable.

      What's absent, currently, is the means for each person to do so on their own in a way that we can realistically expect. That can be worked on. Software like Mastodon can be improved upon—necessarily drastically so—and infrastructure configuration can be improved too, to the point that it doesn't even feel like infrastructure configuration.

      For people with very large spheres of influence, like Aral Balkan who recently disclosed that he's spending ~600 EUR per year for his instance, they can seek help, mining from the resources that are at their disposal that are a consequence of their wide reach. For ordinary people with up to a few hundred followers, they won't need to be exposed to this.

      As for the sentiment behind the remarks about "dumping fresh content into our account for free", recognize that the fresh content has value, and there are ways to subsidize the resource use by parties with an interest in being able to capture some of that value for themselves. When you post a widely shared piece to your blog, then Google for example benefits from this, whether you're using Google Ads or not. It's the mere fact that there's something on the Web worth looking at that makes this beneficial.

    1. That's a whole different topic. Mastodon isn't built for single-user instances.

      That's the entire topic, my guy!

      "We should be optimising Mastodon so it incentivises more serve[r]s with fewer people." is the very premise of the conversation!

      Mastodon "push[ing] the direction of the protocol or make it harder to cultivate an ecosystem of smaller ones."? "it needs to be easier to start smaller ones"? Are you just not paying attention to the conversations you're responding to?

      Reminds me of:

      What fascinated me was that, with every single issue we discussed, we went around in a similar circle — and Kurt didn’t seem to see any problem with this, just so long as the number of 2SAT clauses that he had to resolve to get a contradiction was large enough.

      https://scottaaronson.blog/?p=232

    1. @stephen@social.stephenfry.com

      This is where it starts getting ridiculous.

      First, rather than social.stephenfry.com, stephenfry.com should be sufficient. Look at email. I can set my MX records to point wherever I want. I don't actually have to have a server with A records to field the email traffic.

      Secondly, the @stephen part is superfluous, too! This is something where Mastodon et al had years (decades!) of hindsight to take care of this, and they still messed it up.
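      The protocol layer does have the MX-like indirection the email comparison asks for: WebFinger (RFC 7033) is the fediverse's lookup step, and the domain in the handle only has to answer one well-known query; the account's real home server is whatever the response names. A sketch of the discovery-URL construction (the handle is from the page above; the helper name is mine):

      ```javascript
      // Build the WebFinger discovery URL (RFC 7033) for a fediverse handle.
      // The handle's domain only needs to serve (or redirect) this one
      // endpoint; the actual server can live anywhere the JSON response
      // points, which is the MX-record-style delegation email already has.
      function webfingerURL(handle) {
        const [user, domain] = handle.replace(/^@/, "").split("@");
        return `https://${domain}/.well-known/webfinger` +
               `?resource=${encodeURIComponent(`acct:${user}@${domain}`)}`;
      }

      console.log(webfingerURL("@stephen@social.stephenfry.com"));
      // https://social.stephenfry.com/.well-known/webfinger?resource=acct%3Astephen%40social.stephenfry.com
      ```

      So the machinery for `stephenfry.com` alone to suffice is there in the spec; what's missing is software that makes serving that one delegating endpoint from an apex domain trivial.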