- Dec 2022
-
www.getrevue.co
-
I think the problem with protocol layers is that they are not really obvious to the user. The closest we get now is when we enter "https://www.some_domain" into our web browser, or when setting up a new router.
I feel like a more explicit expression of norms and norm-signaling is required -- a social contract for how we use the internet, and a way to belong.
-
If you want proof of this, get out of the US and European bubble of the bitcoin price fluctuations and learn how real people are using it for censorship resistance in Africa and Central/South America.
It would be great if he had linked to a more obvious example here. I don't think his point is quite as relevant in the US, where the concept of "free speech" is different, and my first concern is the social contract at home (in the US).
Here was my top search result: https://www.coindesk.com/markets/2021/07/22/bitcoins-censorship-resistance-was-a-step-change-in-history/
-
It can only be done through ranking and relevance algorithms, the more localized the better.
The synopsis here:
* Artificial intelligence selects our content for us.
* You can't "open up the AI". To constrain AI, you have to be explicit about what you want to code into it, and we "aren't there yet". We are just beginning to set up "computational contracts". Reference: blockchains.
What can be done now: Why not create a marketplace of algorithms that people choose for themselves? Third parties. This shouldn't inherently detract from the current content-selection business (see the sketch after this list).
* "And here’s another important thing: right now there’s no consistent market pressure on the final details of how content is selected for users, not least because users aren’t the final customers. (Indeed, pretty much the only pressure right now comes from PR eruptions and incidents.) But if the ecosystem changes, and there are third parties whose sole purpose is to serve users, and to deliver the final content they want, then there’ll start to be real market forces that drive innovation—and potentially add more value."
* "That has to come from outside—from humans defining goals."
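The "marketplace of algorithms" idea is easier to see with a toy interface. Below is a minimal sketch, assuming a hypothetical setup in which the platform hands raw candidate posts to a ranker the user has chosen; every name here (Post, Ranker, chronological, local_first, MARKETPLACE, build_feed) is my own illustration, not any real platform API.

```python
# Hypothetical sketch of a "marketplace of algorithms": the platform exposes
# raw candidate posts, and the user picks a third-party ranker to order them.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Post:
    author: str
    text: str
    timestamp: float
    distance_km: float  # how "local" the post is to the reader


# A ranker is just a function: it takes candidate posts and returns them ordered.
Ranker = Callable[[List[Post]], List[Post]]


def chronological(posts: List[Post]) -> List[Post]:
    """Newest first -- the simplest possible third-party algorithm."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)


def local_first(posts: List[Post]) -> List[Post]:
    """'The more localized the better': nearby posts rank higher."""
    return sorted(posts, key=lambda p: p.distance_km)


# The "marketplace": rankers are registered by third parties and chosen by users,
# not imposed by the platform.
MARKETPLACE: Dict[str, Ranker] = {
    "chronological": chronological,
    "local-first": local_first,
}


def build_feed(candidates: List[Post], chosen_algorithm: str) -> List[Post]:
    return MARKETPLACE[chosen_algorithm](candidates)
```

Usage would be a single call like build_feed(posts, "local-first"); the point of the design is that the choice of algorithm lives with the user, so market pressure can act on the rankers themselves.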
-
Content takedowns and suspensions should not be possible.
I agree with this, but there should be a way to filter out conversations we no longer want to be a part of. For instance, if I don't want to hear about Donald Trump any more, I'd like to be able to filter him out, no matter which side of the political spectrum I sit on. I don't want him to have my attention. I'd like to be able to ignore some efforts to intrude on my attention. I can do this with some ad-filtering in my browser right now, but it's hard to do with other categories of information.
I can't "unread" the headlines that news sites, social media feeds, or my search engine stick in front of me right now.
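A toy version of that kind of attention filter, working like an ad blocker but for topics or people: this is a minimal sketch under my own assumptions (feed items as simple dictionaries with headline and body fields; the filter_feed name is illustrative, not any real browser or feed API).

```python
# A minimal sketch of a client-side attention filter: drop any feed item that
# mentions a term the reader has muted, before it ever reaches their eyes.
from typing import Iterable, List


def filter_feed(items: Iterable[dict], muted_terms: List[str]) -> List[dict]:
    """Keep only feed items whose headline and body avoid every muted term."""
    muted = [t.lower() for t in muted_terms]
    kept = []
    for item in items:
        text = f"{item.get('headline', '')} {item.get('body', '')}".lower()
        if not any(term in text for term in muted):
            kept.append(item)
    return kept


# Example: the reader mutes a name across headlines, feeds, and search results.
feed = [
    {"headline": "Markets rally on jobs report", "body": "..."},
    {"headline": "Donald Trump holds rally", "body": "..."},
]
print(filter_feed(feed, ["donald trump"]))  # only the markets story survives
```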
-
I no longer had hope of achieving any of it as a public company with no defense mechanisms (lack of dual-class shares being a key one). I planned my exit at that moment knowing I was no longer right for the company.
Company structure is important. B Corps and DAOs seem worthwhile when it comes to thinking about community.
-
Not sure how deep this goes. I think humans with expertise should be part of any algorithm. I don't like the idea of purely machine-based algorithms that divide thought into filter bubbles or tribal groups. That is a problem.
-