4 Matching Annotations
  1. Nov 2020
    1. The app that comes closest to a luxury service that I can think of is Superhuman, which charges its users $30 a month for an email client (which you could also get for free by just using Gmail). But there’s a difference to other software products: Superhuman has signal distribution built in. Every time you send an email via Superhuman, your recipient will notice a little “Sent via Superhuman” in your signature.

      Superhuman is the closest thing Julian Lehr can think of to a luxury software product. One reason might be that Superhuman has some signalling built in: it adds a little "Sent via Superhuman" to your signature.

  2. Aug 2020
    1. With a strong enough NLP engine behind the command line interface, the possibilities become endless: Add that New York Times article to your Pocket queue or send it directly to your Kindle to read it later. Re-assign Jira tickets directly from Superhuman or send them to your to-do list. Pay invoices or send money to a friend.

      Julian Lehr offers an interesting idea. If you can process emails directly, without needing to open them, and if you can do so through a text-based user interface powered by an NLP engine, you've got something very powerful on your hands.

      This is especially interesting because, with the advent of GPT-3, this is actually becoming closer to a reality.
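
      The idea above can be sketched as a tiny command router: free-text commands go in, actions come out. This is a minimal illustration only; a real system would use an LLM or trained intent classifier rather than regexes, and all action names below are hypothetical, not Superhuman's actual API.

      ```python
      import re

      # Toy stand-in for the NLP engine Lehr describes: regex patterns
      # map natural-language commands to hypothetical action identifiers.
      ACTIONS = [
          (re.compile(r"add .* to (my )?pocket", re.I), "pocket.save"),
          (re.compile(r"send .* to (my )?kindle", re.I), "kindle.send"),
          (re.compile(r"re-?assign .* ticket", re.I), "jira.reassign"),
          (re.compile(r"pay .* invoice|send money", re.I), "payments.transfer"),
      ]

      def route(command: str) -> str:
          """Return the action id matching a free-text command."""
          for pattern, action in ACTIONS:
              if pattern.search(command):
                  return action
          return "unknown"

      print(route("Add that NYT article to my Pocket queue"))  # pocket.save
      print(route("Re-assign the Jira ticket to Ana"))         # jira.reassign
      ```

      The point of the sketch is the shape of the interface, not the matching logic: once commands are routed to actions, the email itself never needs to be opened.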

  3. Feb 2019
    1. ily to be distinguished in society, by the soundness of their understanding and the superi­ority of their faculties above the rest of mankind. The ascendant, which they acquire, gives a

      "the superiority of their faculties" makes them sound super-human, while the taste is said to be "distinguished in society." I'm sensing some tension here in how society and nature interact.

  4. Sep 2018
    1. Another approach to confinement is to build rules into the mind of the created superhuman entity (for example, Asimov's Laws of Robotics). I think that any rules strict enough to be effective would also produce a device whose ability was clearly inferior to the unfettered versions (so human competition would favor the development of the more dangerous models).

      The author points out that human competition, which thus far has driven the exponential advancement of technology, would push developers toward the "unfettered versions." While I agree that this would likely be the case, I think the author may be underestimating how much of ourselves would end up in the superhuman: aspects of humanity that were never intended to be part of it, but that end up there inherently because of who is programming it.