- Jul 2024
-
reallifemag.com
-
In 1996, technology historian Jennifer S. Light compared the talk of “cyberoptimists” about virtual communities to city planners’ earlier optimistic predictions about shopping malls. As the automobile colonized U.S. cities in the 1950s, planners promised that malls would be enclosed public spaces to replace Main Streets. But as Light pointed out, the transition to suburban malls brought new inequities of access and limited the space’s functions to those that served commercial interests.
A comparison of the privatisation of urban development and the corporatisation of internet development
-
While it’s technically true that no one company dominates the internet today, the cloud services, undersea cables, and other infrastructures that power it are increasingly concentrated in a small group of telecommunications conglomerates and the owners of the web’s dominant platforms: Amazon, Google, Microsoft, and Facebook. But looking back at the internet of the 1990s, the power of private companies was already apparent. Even though the internet’s infrastructure wasn’t fully privatized until 1995, online interactions were already being shaped by commercial pressures.
Corporate power in 1995
-
The conventional wisdom about networks suggests that their politics can be reduced to how centralized they are: A centralized network is designed for control, while a decentralized or distributed network is democratic. Early champions of the internet assumed both that its structure made it decentralized and that its decentralization would protect it from monopolization. In 1999, Tim Berners-Lee, the inventor of the World Wide Web, wrote that the internet “is so huge that there’s no way any one company can dominate it.”
What they sold to us…
-
- Apr 2024
-
www.wired.com
-
Distrust of existing tech institutions and separatist urges will always be part and parcel of decentralized social media
-
We can, and should, always strive to build better, more accessible, and more inclusive technology. But decentralizing the web into walled-off silos seems unlikely to accomplish this goal.
-
These platforms have shortcomings on the practical level, as well. If a user needs a different app for everything they want to do online, we’re looking at a massive increase in app fatigue, the exhaustion that comes when users must download and engage with more platforms to have a semblance of an online presence and community. While the platforms offered by Meta and Alphabet are certainly not without issue, it is hard to deny the convenience of their established existence, which makes it possible to communicate, be entertained, shop, and more all in the same place. By contrast, users of decentralized platforms will need to download a slew of apps for everything they want to do online, because these features will no longer all exist in one place.
-
-
www.robinsloan.com
-
The term Web3 plays on “Web 2.0”, popularized in the 2000s to describe a new generation of websites and web platforms. As a philosophy, Web 2.0’s success was incomplete, to say the least: there was a whole thick strand of ambition around the exchange of data in modular, permissive ways between platforms, which basically died — or was killed. With that in mind, I think Web3 is a fine term for this new set of ideas, because it will certainly play out the same way: influencing the direction of the internet, but incompletely; unpredictably.
-
The money thing confounds evaluation; it’s like trying to look at a star next to the sun. The same was true for the World Wide Web in the year 2000, of course; and, if that’s the analogy, what do we make of it?
-
A large fraction of Web3’s magnetism comes from the value of the underlying cryptocurrencies. Therefore, a good diagnostic question to ask might be: would you still be curious about Web3 if those currencies were worthless, in dollar terms? For some people, the answer is “yes, absolutely”, because they find the foundational puzzles so compelling. For others, if they’re honest, the answer is “nnnot reallyyy”.
-
Many Web3 boosters see themselves as disruptors, but “tokenize all the things” is nothing if not an obedient continuation of “market-ize all the things”, the campaign started in the 1970s, hugely successful, ongoing. In a way, the World Wide Web was the rupture — “Where … is the money?”—which Web 2.0 smoothed over and Web3 now attempts to seal totally.
-
-
damaged.bleu255.com
-
Open Source Appropriate Technology (OSAT)
-
-
www.e-flux.com
-
But stepping back even further, one can only see this imagined software as an enhancement to Latour’s larger model of interplay in his actor-network theory, a theory that does not need software or special equipment to exist. The activity in a spatial environment is not reliant on the digital environment. It may be enhanced by a code/text-based software, but a spatial software or protocol can be any platform that establishes variables for space as information
-
Compared to the relative trickle of space made by special practitioners, these technologies produce a fire hose blast. The most radical changes to the globalizing world are being written in the protocols or softwares of infrastructural space.
-
On the one hand, Alexander expands the repertoire of design to include activity. But on the other, he quickly codifies and taxonomizes that activity. He mimics the object of his own critique by reforming the artificial with a “natural” corrective: instead of the tree, the semi-lattice becomes the placeholder. Despite his attempt to incorporate active form and information, Alexander only creates another immobilized form.
-
In his 1965 article “The City is not a Tree,” he critiqued what he deemed to be the infrastructural or organizational template of many settlements and cities. A “tree,” in Alexander’s parlance, is a branching structure in which sets are either completely disconnected from one another or entirely contained within one set without overlapping sets. The branches do not grow together but emanate separately from a single trunk.
-
We are not accustomed to the idea that non-human, inanimate objects possess agency and activity, just as we are not accustomed to the idea that they can carry information unless they are endowed with code/text-based information technologies. While accepting that a technology like mobile telephony has become the world’s largest shared platform for information exchange, we are perhaps less accustomed to the idea of space as a technology or medium of information—undeclared information that is not parsed as text or code. Indeed, the more ubiquitous code/text-based information devices become, the harder it is to see spatial technologies and networks that are independent of the digital. Few would look at a concrete highway system or an electrical grid and perceive agency in their static arrangement. Agency might only be ascribed to the moving cars or the electrical current. Spaces and urban arrangements are usually treated as collections of objects or volumes, not as actors. Yet the organization itself is active. It is doing something, and changes in the organization constitute information. Even so, the idea that information is carried in activity, or what we might call active form, must still struggle against many powerful habits of mind.
-
This capacity seems to finally fulfill the dream of artists and architects of the mid- to late twentieth century, among them Jack Burnham, Cedric Price, Archigram, and Christopher Alexander, who experimented with a cybernetic apparatus for modeling space.
-
-
medium.com
-
Power comes from organized groups. If you want people to have power, then you want to help them connect with others and teach them how to carry out effective advocacy together. That’s hard.
-
It’s not a technology problem. It’s not something that a slick website solves. Building power is a social, societal, institutional challenge.
-
-
www.goodreads.com
-
Technics and civilization as a whole are the result of human choices and aptitudes and strivings….The machine itself makes no demands and holds out no promises: it is the human spirit that makes demands and keeps promises. In order to reconquer the machine and subdue it to human purposes, one must first understand it and assimilate it. So far, we have embraced the machine without fully understanding it, or, like the weaker romantics, we have rejected the machine without first seeing how much of it we could intelligently assimilate.” ― Lewis Mumford, Technics and Civilization
-
-
kelsienabben.substack.com
-
The “DAO Model Law” guide by COALA researchers outlines 11 technical and governance requirements DAOs must meet for legal recognition as an entity, including:
1. Deployed on a blockchain,
2. Provide a unique public address for others to review its operations,
3. Open-source software code,
4. Get the code audited,
5. Have at least one interface for laypeople to read critical information on DAO smart contracts and tokens,
6. Have by-laws that are comprehensible to laypeople,
7. Have governance that is technically decentralized (i.e., not controlled by a single party),
8. Have at least one member at any given time,
9. Have a specific way for people to contact the DAO,
10. Have a binding internal dispute resolution mechanism for participants,
11. Have an external dispute resolution mechanism to resolve disputes with third parties (e.g., service providers).
These factors and considerations constitute a legal basis for conceptualizing DAOs.
-
Some obvious but useful DAO design or analysis questions are:
1. What is being decentralized? (technically, economically, or politically)
2. Who or what is being made autonomous, and from whom or what?
3. What is being automated?
4. What is being organized?
From here, subjective goals, beliefs, and values can be articulated to determine design choices.
-
-
subconscious.substack.com
-
URLs + HTTP + HTML = web
URLs + HTTP + RSS = podcasts
URLs + HTTP + JSON = REST APIs
URLs + P2P + HTML = Web3
Very interesting diagram
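A minimal sketch (in TypeScript, with a placeholder URL) of what the diagram compresses: the first three rows share the same URL + HTTP substrate and differ only in the payload format, which a client can request via content negotiation; the fourth row swaps the transport itself.

```ts
// Same URL, same HTTP; only the requested representation changes.
const url = "https://example.com/resource"; // placeholder

async function probe(accept: string): Promise<void> {
  const res = await fetch(url, { headers: { Accept: accept } });
  console.log(`${accept} -> ${res.headers.get("content-type")}`);
}

await probe("text/html");           // URLs + HTTP + HTML = web
await probe("application/rss+xml"); // URLs + HTTP + RSS  = podcasts
await probe("application/json");    // URLs + HTTP + JSON = REST APIs

// URLs + P2P + HTML = Web3: the scheme changes (e.g., ipfs://...) and
// resolution happens over a peer-to-peer network instead of HTTP, so
// plain fetch() no longer applies.
```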
-
-
nonalignedtech.net
-
NATM Principles
Data extraction is illegitimate. There is nothing normal about using data to control us or target us for commercial or political purposes.
Speed cannot circumvent ethics. Civil society, not governments or businesses, should be the first and last authority in deciding if a technology should exist.
Saying “No” is a practical solution. A clean and quick break from extractive technologies is not possible, but a collective statement of refusal is an important first step.
Other platforms are possible. Alternatives exist. We need to promote the use of non-extractive digital media platforms and support their implementation by our member communities.
One voice, multiple solutions. While our communities might have different goals and agendas, we need to act as a global coalition to enforce the regulation, taxation and eventual elimination of extractive technologies.
-
-
consonante.org
-
The departments with the greatest news silence in Colombia are those that have suffered most from the harshness of the armed conflict. Chocó is one example: its figure for forced confinement is the highest, and only seven of its 30 municipalities have media outlets producing local news.
How to bring forward the voices and stories of people from places that have been marginalised?
-
-
paul.kinlan.me
-
I think it covers a lot of good points:
Secure - All domains are sandboxed from each other and sites are sandboxed away from the user's machine. The user can go to any site and know they are safe.
Linkable - You can point to any page or piece of content just by sharing a URL.
Indexable - Because you can link to anything, if public it can be discovered by any person or machine that can index it, making it universally discoverable to everyone.
Composable - Iframes and JavaScript allow us to quickly compose and embed new sites, apps and services just by dropping in some JS and hooking things together.
Ephemeral - There is nothing to install; you go to the page and interact with it, and when you leave it stops taking up resources.
SLICE.
SLICE
-
This is awesome, but is it possible to build a site that is truly 'local-only'? You would need to provide some guarantees that data couldn't be exfiltrated out of the browser, right?
Local-only website possible?
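One hedged, partial answer: a server can send a Content-Security-Policy that forbids all network egress from the page, which approximates "local-only" but is not a hard guarantee (the server still saw the original request, and navigation away from the page is not blocked). A minimal sketch using Node's built-in http module:

```ts
import { createServer } from "node:http";

// Serve a page whose CSP blocks subresources, fetch/XHR/WebSockets, and
// form submissions, so script on the page has no easy network channel
// for exfiltrating data.
createServer((_req, res) => {
  res.writeHead(200, {
    "Content-Type": "text/html",
    // default-src 'none' blocks subresources by default; script-src 'self'
    // still allows the page's own scripts; connect-src 'none' blocks
    // fetch/XHR/WebSocket; form-action 'none' blocks form posts.
    "Content-Security-Policy":
      "default-src 'none'; script-src 'self'; connect-src 'none'; form-action 'none'",
  });
  res.end("<h1>Local-only-ish page</h1>");
}).listen(8080);
```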
-
-
terremoto.mx
-
In the face of genocide there are pacts of silence, technologies of silencing, regimes of silencing. In the lukewarmness, omission, and foolishness of editorializing the genocide as “a bilateral conflict,” “a war of historical complexity,” “a ceasefire on both sides,” “all lives matter,” or “peace and stability in the Middle East,” there is a manifest act of complicity with the perpetrators of the genocide. The mandates of silence by institutional, corporate, and state consensus are a public gesture of complicity with one of the most ominous contemporary webs of state terrorism, violence, dispossession, and the politics of death. And by this I do not refer only to the geopolitical chess game played by the Biden administration in its political and moral support for the genocide, along with the United Kingdom, France, Egypt, Spain, Canada, Lebanon, Australia, etc. Nor to the miserable performative exercises of the cautionary rhetoric of the United Nations Security Council, nor to the political lobbies, nor to the new de facto powers articulated by GAFAM, MAGMA, FAAMG, or Wall Street, Hollywood, and the United States real estate industry, nor to all the remaining conglomerates of the political economy of extractivism whose relationship with Zionism we know all too well. I refer, rather, to the regime of silence of all those global multimedia corporations, conglomerates of cultural and artistic institutions, museums, nonprofit organizations, publishing projects, and Global North biennials that promote “liberal multiculturalism,” intersectional agendas, tacky environmentalism, performative anti-racism, the metanarrative of the “Global South,” the plethora of postcolonial discourses, the 2030 agenda, cultural democracy, and an “unrestricted commitment to human rights” and social justice, in which, in many cases, boards, councils, and funders are made up of plutocrats sympathetic to Zionism or the State of Israel. This time their pact of silencing is bathed in blood.
the United Nations Security Council, nor to the political lobbies, nor to the new de facto powers articulated by GAFAM, MAGMA, FAAMG, or Wall Street, Hollywood, and the United States real estate industry, nor to all the remaining conglomerates of the political economy of extractivism whose relationship with Zionism we know all too well
-
-
philipsheldrake.com
-
That's ALL the work because all the work is sociotechnology, no decentralized tech stands in isolation, and we will have failed in the socio contexts. Specifically, instead of being sensible and meaningful (i.e. taking full advantage of our deep and beautiful capacities for perception and for making sense, making meaning), the interface between 'the digital' and the sentience and mind of the human species is collapsed down to the mindless, contextless, individuating and regimenting programmatic structure of the digital side of the equation. We can avoid this. Not with a solution, as Nora Bateson (2016) points out ...
That's ALL the work because all the work is sociotechnology, no decentralized tech stands in isolation, and we will have failed in the socio contexts.
-
The promise of increased knowledge is that it might help us to solve the problems we face. But the problem with problem-solving is the idea that a solution is an endpoint. There are no endpoints in complex systems, only tendrils that diffuse and reorganize situations ...
Design solutionism answered!
-
Other concepts of primary interest include agency, freedom, power, behaviour, identity, reputation, culture, democracy, rights, justice, and jurisprudence, many of which pepper the slidestacks and homepages of Web3 projects
True, how to approach this from design?
-
However, Valdis Krebs conveyed to me in conversation that he's never mapped a social network that looks anything remotely like Baran's archetypal distributed topology. Post-Baran then, distributed has achieved currency in describing techniques for coordinating digital computation, whereas decentralization is the clarion call one hears whenever contemplating human systems with the desire to avoid those concentrations of power (see Schneider 2019). For example, we talk in terms of decentralized organising (DAO) and distributed ledgers.
Distributed does not exist in the wild
-
A system is decentralized when it doesn’t have central components through which information flows, thereby avoiding concentrations of control and power. E.F. Schumacher considered decentralization allied with freedom and one of “the truths revealed by nature’s living processes.” (Small is Beautiful, 1973)
A system is decentralized when it doesn’t have central components through which information flows, thereby avoiding concentrations of control and power.
-
Web3 is a subset of the decentralized web (dweb) describing the relatively recent application of novel cryptographic techniques to the challenges of distributed computing. Distributed ledger technology is a subset of Web3, of which blockchain is one form. For me, information-centric networking (e.g. IPFS, Hypercore, Maidsafe, and Swarm) and so-called self-sovereign identity (SSI) also come under the Web3 banner.
Explaining the decentralized web first, then Web3, then distributed ledger technology as a subset
-
-
philipsheldrake.com
-
When only states are sovereign, the individual rendered stateless has no rights. This is unacceptable and led to the Universal Declaration of Human Rights in 1948 and more recently to the inclusion under the Sustainable Development Goals of the target to ensure everyone has a legal identity by 2030.
Everybody should have a legal identity by 2030.
-
In other words, identity ≠ identification ≠ ID. Any and every conversation in which participants recognize this distinction is massively more productive. And there's another phrase …
identity ≠ identification ≠ ID.
-
Identity experts are semantic pedants; their professional focus is distinction, after all. I adopt Jonathan Donner's recommendation to distinguish the terms "identity", "identification", and "ID". Identity "implies a kind of multidimensional social location of an individual relative to other people and institutions around him or her." Identification is a claims verification process, and ID is an artifact, traditionally tangible, that "supports a claim or signals that identification might be possible."
Different terms for Identity
-
Well not quite. Consider the Barnes Paradox whereby individuals transgress their own privacy preferences just to get to the stuff a click away. We've all done it. Moreover, each of us is just one of very many agents in the constant emergent reformation of societal structures; in other words, many if not most of your fellow citizens would have to exercise similar discipline. Still confident?
Sacrificing one's data for the features a click away
-
-
firstmonday.org
-
The choice between Silicon Valley gadget-making (which amounts to planned obsolescence on a stratigraphic scale, since new geological strata currently produced are made out of the plastic and exotic metals excreted by discarded digital devices) and radical technological abstinence is misguided. Instead, to borrow an expression from Donna Haraway, we might have to learn to “stay with the trouble,” meaning that not only ourselves but our world has been hybridized with technology (Haraway, 2016). Just as we may not have the choice of leaving our decaying planet to settle on Mars or any other fanciful destination desired by Elon Musk, neither may we have any other choice than to understand and seize control of the knowledge graphs and algorithms which are part and parcel of the global infrastructure.

Global (centralized) answers provided by oligopolies follow patterns which do not disclose adequate conditions for the necessary reorganization required if we are to collectively survive the coming hardships, notwithstanding the heroic posturing of Elon Musk: after all, Musk’s master plan for 3,000 years is simply to escape the planet and extract whatever is needed to achieve this goal beforehand! To escape the practical problems posed by the present due to the possibility of a “solar catastrophe,” as posited by Lyotard in his later years, is escapism. Instead, the intellectual reserves of philosophy should be aimed directly at the problems of political and ecological sustainability inherent in our infrastructure, including the infrastructure of knowledge given by the Web (Lyotard, 1988).

For example, given the global scope of climate change and the need for better scientific data collection on carbon emissions by individuals, it is more likely that a decentralized Web in the hands of an empowered population will be crucial for the future. The future of the Web will then involve a radical practical re-design of machine learning, data centers and knowledge representation in order for knowledge to become truly decentralized. The vision of the Semantic Web should not be caught up in arguments over logical frameworks, but focus on the elements of what it would take to empower people with knowledge: not just data, but kinds of thinking and infrastructure. The future will then be more contentious than just opening up datasets.

The decentralization of knowledge is a political struggle for power over knowledge in the context of an ecological crisis; it is a re-appropriation of what Tony Fry calls “future-making” so as to multiply the ongoing experiments out of which answers (in the plural) may eventually emerge and scale. That is the crux of decentralized knowledge: it must fit local conditions and globally scale whenever needed at the same time.

In order to rescue the potential of these technologies, we should rescue their potentials given by philosophy. Both the Semantic Web and Carnap foresaw a future where all of the world’s knowledge could be self-organized without a “master plan” but still ultimately strive to be communicated. Carnap’s tolerant or decentralized vision of multiple (and possibly incommensurable) languages being developed to aid in large-scale distributed cognition can be extended outside of the confines of the logic and model theory of the traditional Semantic Web, in order to encompass the opening and sharing of machine learning algorithms, thus providing new frontiers for knowledge engineering itself.
Re-read with care
-
What we do with other beings which face a common threat, those which contribute to futuring as well as those who contribute to defuturing (Fry, 2009), remains to be seen; however, we can imagine that data centers under popular control can be decentralized, and ultimately made more ecologically sound. It should be remembered that the choice between letting go of or embracing digital technology may not even be a fair one. For, if predictions are true, lack of affordable access to oil and resources will drastically reduce the possibility of maintaining, repairing and adding new infrastructures within just a few decades.
Defuturing
-
Decentralization must mean seizing back control not only of data, but of algorithms and data centers from centralization, which is a political task for the future. This political task has been latent in philosophy for decades, as articulated by Lyotard (1979) in The postmodern condition: “Give the public free access to the memory and data banks.”
Decentralization
-
Tim Berners-Lee has, via the Semantic Web, long advocated for open data. It is now obvious that open data is necessary but not sufficient for the development of autonomy. The ability to think algorithmically and program is useless in terms of the decentralization of knowledge unless the proper infrastructure is provided. Decentralized open data and even open versions of the knowledge graph like DBpedia are rather meaningless in terms of human knowledge if only a small minority controls the data centers and machine learning algorithms needed in order to make sense of the data.
Open Data is not Enough
-
In the era of Diderot’s Encyclopédie, knowledge was bound to the function of every tool: An axe for cutting, looms for weaving and even dyes for wig making all featured prominently in the Encyclopédie. In the transition to the Web as a universal space of information, the truly necessary tool is the universal abstract machine, the Turing Machine that executes any computable algorithm. As through education and literacy our ancestors learned how to autonomously extend their physical capabilities with modern tools and learned how to autonomously organize in a larger complex social fabric than simple face-to-face meetings, through programming humans can learn how to communicate with the machines necessary to autonomously understand and control the complex technological world we have inherited. In this regard, programming is not simply the learning of a particular programming language, from Lisp to HTML and JavaScript. What is necessary is for the generalized skillset of scientific, logical and algorithmic thinking that underlies programming to be spread throughout the population. This does not mean it should in any way supersede our previous languages and modes of thinking, just as writing did not absorb non-verbal tool-use and the visual arts, but that it is necessary in order to maintain autonomy in the era of the Internet. Rather than a valence of description of the world or a technique for controlling the world, it would be far better to think of algorithmic thinking as yet another capacity that can be developed and nurtured in future generations due to its own limited yet powerful capacity: A meta-language for controlling the general abstract machines — computers — that currently form the emerging global infrastructure of much of our inhabitation of the planet. Without the ability and freedom to navigate through these programs, autonomy would be lost.
Cite this in PhD
-
There is now widespread concern that this vast power may be abused, and there is spreading among the general population a fear of these companies and a distrust of the Internet (Morozov, 2014). Can the Web return to being a tool of empowerment?
Abuse of power on the internet. The internet as a tool for empowerment.
-
In their radical re-interpretation that blended together Heidegger and cybernetics, the human was to become part of a new kind of distributed cognitive system that continually learned from its mistakes. Technology aimed for ever smoother, and eventually invisible, integration with the human. In other words, Winograd and Flores had laid the metaphysical foundations for Google.
The Cybersyn experiment was transferred to Google
-
The key to the new Heideggerian metaphysical foundations for design was Heidegger’s concept of Zuhandenheit, of “ready-to-hand”, where the goal was to have the computational apparatus become completely transparent — invisible — to the human. If there was to be some kind of technical breakdown, the technological apparatus was to become reshaped based on human feedback with the ultimate goal of re-establishing its own self-organization that continuously improved in the face of the messiness of the world.
Design is always good at making things invisible...
-
While the Semantic Web imagined a vast web of knowledge representations structuring the world’s data, what had actually ended up happening was that a few companies had literally developed copies of the entire world’s data, and by cleverly applying algorithms to this unstructured data, they were able to extract immense amounts of both knowledge and wealth by predicting patterns in everything from user buying habits to results of elections.
From the Semantic Web to Google's Web
-
As the adage in machine learning circles goes, “There’s no data like more data.” Yet storing and processing data did not come without costs. With the amount of data on the Web skyrocketing into the millions of terabytes, what ended up mattering for the future of the Web was the ability to handle data that was larger than could be fit on a single machine, which in turn required large distributed — but centralized — data centers to handle. In other words: “big data.”
This was the main reason the client-server paradigm continued
-
This proprietary database was put in place to connect the vast variety of heterogeneous knowledge spread throughout Google’s various online services. At the same time, other companies such as Yahoo!, Microsoft and even Apple started creating their own competing proprietary knowledge graphs. The use of these knowledge graphs started becoming increasingly common in new products. Behind Apple’s Siri’s knowledge of the world lies the formal knowledge engineering of a spin-off of Stanford Research Institute (SRI) that formed the foundation for Apple’s knowledge graph.
Proprietary knowledge graphs: how information is linked can be privatized, so even open-source data can be captured.
-
In 2011, Google’s plans for the Semantic Web became clear: Led by Guha, Google had created a massive knowledge representation framework called “schema.org” to unite the fragmented structured data present on the Web (Guha, et al., 2016). Using the considerable clout of Google, other search engines such as Yahoo!, Microsoft and Yandex joined the effort so that every search engine could consume the same kinds of structured data, and Web page authors would know which logical terms to use in order to add knowledge to a Web page. Although not an open standard and controlled informally by a small group of search engines, schema.org finally made structured data take off, so that soon up to 10 percent of the Web was using structured data. Even the Facebook “Like” button began embedding data using RDFa, making structured data ubiquitous on the Web.
Schema.org - an interesting example of private companies coming together around a standard.
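For illustration, a sketch (all values are placeholders) of what this standardization enables: a page describes itself in schema.org terms via JSON-LD, and any of the participating search engines can consume the same structured data.

```ts
// Build a schema.org description of an article as JSON-LD.
const articleData = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "The Decentralization of Knowledge", // placeholder values
  author: { "@type": "Person", name: "Jane Doe" },
  datePublished: "2024-07-01",
};

// Embedded in the page, this is the markup search engines crawl:
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(articleData)}</script>`;
console.log(jsonLdTag);
```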
-
With the easy-to-use language of HTML, the ubiquity of TCP/IP that connected computers all over the globe and the well-understood domain name system for buying names, anyone could easily set-up their own Web site to share knowledge about any subject of their choosing, and thus the Web soon took off as the first truly decentralized system for global knowledge sharing. The Web’s decentralized nature, which allowed anyone to contribute and link to anyone else, made it a “permission-less” platform for knowledge. The decentralized innovation also applied to the core functionality of the Web as developers added new tags, such as the image tag by Netscape, and a constant stream of innovation has characterized the Web ever since its inception. Of course, it helped that CERN was committed to providing the core technology for free and the permission-less innovation was managed by a consensus-run global standards process for HTML, HTTP and URIs at the Internet Engineering Task Force (IETF) and Berners-Lee’s own World Wide Web Consortium (W3C). Still, the Web was not completely decentralized, as the domain name system itself, on which URIs depend, was centralized and requires the licensing of domain names — although once one has bought a single domain name one may host many different Web sites. As regards the decentralization of knowledge, the Web was viewed not as the end, but the beginning: Berners-Lee and others began hoping that eventually it would evolve into a truly universal information space for the sharing of knowledge that went beyond hypertext.
happy beginnings...
-
A dreaded 404 error is always possible since no central authority preemptively checks URIs, payloads, continuity of service or even deliver authorization to “mint” them (provided one is in control of a domain name).
Decentralization also means accepting that nodes can disappear, which links to the right to be forgotten. Interesting in contrast to centralization, where an authority could fine you or force you to remain "present".
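A small sketch of the flip side: with no central registry, link rot is only discovered at dereference time, by whoever happens to follow the link (placeholder URL).

```ts
// Probe a URL the way a reader's browser would; no authority checked it for us.
const res = await fetch("https://example.org/maybe-gone", { method: "HEAD" });
if (res.status === 404) {
  console.log("dreaded 404: the name still resolves, but the payload is gone");
}
```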
-
Luckily, although his paper describing the “World Wide Web” was rejected for the ACM Hypertext conference in December, 1991 in San Antonio, Texas, Tim Berners-Lee decided to go there and give a demonstration. On his way, he stopped at universities and gave demonstrations of how to set up a Web site and “link” using hypertext from one Web site to another. As Gopher and WAIS fell into decline due to the uncertainty around licensing and commercialization, the World Wide Web started to take-off. Both taking key ideas from the concept of hypertext invented by Ted Nelson’s Xanadu and earlier systems such as Engelbart’s NLS (oNLine System) as well as departing from them, the Web at first seemed rather underwhelming. However, it succeeded because it was both easy-to-use and decentralized.
Huh, so his paper got rejected? Nice to know.
-
Thinking Machines Inc. stopped allowing WAIS to be used for free, and Brewster Kahle and Harry Morris set up WAIS Inc. to sell the software, which was promptly bought by the commercial Internet service AOL. Likewise, the University of Minnesota decided to start charging licensing fees for the Gopher codebase created by its developers. At the very moment when there was rising interest in the Internet as a potential platform for discovering knowledge by the general public, it seemed as if the first generation of software would put this knowledge behind a paywall
Exit to Community?
-
So, instead of idling while waiting for the next program or human interaction, in moments nearly imperceptible to the human eye, it would share its time among multiple humans. Inspired by the spread of time-sharing, the question facing computer scientists was how could computational resources be shared not only throughout time, but throughout space?
Socializing the means of computation
-
Thus, the advent of the Enlightenment led not to a massive decentralization of knowledge but to a re-centralization of knowledge in the hands of a bureaucratic elite, who maintain their power at least in part through their control over knowledge (Rushkoff, 2010). Yet this control could be naturalized, as the time and effort that could be put into the reading and training required to join the “knowledge class” did not seem to scale. To put it crudely, if one wanted access to specific knowledge up until even the 1980s, one would have had to go to Oxford to gain access to the Bodleian library — a task that was simply impossible for the knowledge-starved masses of the earth, who were thus stuck in the proletarian positions of taking orders from the knowledge elite.
Knowledge economies
-
and the university system (which was one of the few institutions to survive the transition from feudalism into capitalism post-Enlightenment), who controlled knowledge in the form of explicit training and certification. Knowledge itself is a prime reason for control: If someone doesn’t know how to do something or how something works, it seems intuitively obvious that they should be put under the control of someone who possesses the knowledge that is proper to the task at hand.
Control and centralization nowadays reside in the university
-
Yet as regards humans and their social institutions, centralized control over a fellow human being was seen as biologically natural within the institution of slavery, when bodies were reduced to mere tools in a larger process. However, if one assumes that humans are at least epistemically equal, i.e., that all humans have at least the potential to be a member of a community of self-directed knowing subjects (Lynch, 2016), then one can state as the goal of knowledge representation that it should enable humans to strive to be autonomous.
The goal of knowledge representation is to strive for human autonomy.
-
Autonomy can then be defined as the use of one’s own cognitive resources to create and share one’s own representations based on an independent judgment in terms of trust. In some distributed systems, the loss of autonomy may be a reasonable design choice, necessary in order to gain increased powers of co-ordination. After all, one does not want soldiers taking decisions in a battlefield autonomously, or an SQL database deciding on its own what someone’s taxes should be through purely internal random number generation. Yet as regards humans and their social institutions, centralized control over a fellow human being was seen as biologically natural within the institution of slavery, when bodies were reduced to mere tools in a larger process. However, if one assumes that humans are at least epistemically equal, i.e., that all humans have at least the potential to be a member of a community of self-directed knowing subjects (Lynch, 2016), then one can state as the goal of knowledge representation that it should enable humans to strive to be autonomous. If human intelligence is dependent on representations, the ability to navigate and create these representations becomes not just a matter of engineering and education, but of utmost political importance.
Interesting view on the relationship between autonomy, knowledge and decentralization/centralization
-
In a centralized system, an authority is in control of another entity, resulting in a loss of autonomy for the controlled entity.
Centralization by definition means the loss of autonomy. In what scenarios is centralization a pharmakon?
-
What is decentralization? In technical terms, a distributed system is defined by Lamport (1978) as a system with multiple components whose behavior is coordinated by passing messages. Many systems are distributed and in general for a system to be successful there has to be trust between its various components so that if the components are involved in some joint task, each can be trusted to play its role. Examples of technically distributed systems include everything from search engines, where multiple servers work together to find and retrieve data that may be spread out across multiple machines, to the traditional banking system where a single payment on a credit card involves co-operative interactions between the computers of a merchant and a bank. Whereas in distributed systems the components are generally trusted, there is no single trusted authority in a decentralized system, and so components have to co-ordinate and negotiate trust separately (Troncoso, et al., in press).
In this definition, a decentralized system is one that must negotiate trust, whereas a distributed system takes trust as already established.
That's why the internet is decentralized.
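One way to make that trust distinction concrete (my illustration, not the authors'): a component in a trusted distributed system can accept a peer's message as-is, while a decentralized component must establish trust per message, for example by verifying a signature against the sender's public key.

```ts
import { generateKeyPairSync, sign, verify } from "node:crypto";

// The sender signs its message with a private key.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");
const message = Buffer.from("ledger entry: Alice pays Bob 5");
const signature = sign(null, message, privateKey);

// A distributed component (trusted peers) would just process `message`.
// A decentralized component negotiates trust for each message it receives:
const trusted = verify(null, message, publicKey, signature);
console.log(trusted ? "accept" : "reject"); // "accept"
```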
-
Such a philosophy of decentralization is a much-needed foundational orientation that can prevent future engineering and economic innovation from following the all-too-easy path of centralization. More importantly, it even points at this historical juncture to the role the Web can play in transcending our current era of global political and ecological crisis by renewing the project of the decentralization of knowledge, a project at the heart of philosophy from Socrates to the present day.
A philosophy of decentralization? Centralization is not really "all too easy": it requires people, and gathering that force is usually tied to incentives.
-
The increased centralization of the Web is inarguable, as only two companies — Facebook and Google — control more than half the flow of traffic throughout the Web in 2016. However, such centralization is not predestined nor the result of a conspiracy; a more sound argument for the centralization of the Web is that centralization is fundamentally structural to any maturing industry. All maturing capitalist industries eventually become oligopolies, so that the centralization of the Internet into a few increasingly feudal fiefdoms simply shows the Web cannot escape the same pattern of classical pre-Internet telecommunications and automobile industries (Wu, 2011). However, we will argue that the Web is not just another industry, but possesses a special epistemic import as the latest incarnation of a larger progressive philosophical project of the decentralization of knowledge, a philosophical project to ultimately advance human autonomy.
Is the fate of every mature industry to be centralized? This poses the question of decentralization as a phase in a system's maturity.
-
-
decentpatterns.com
-
-
Decent Patterns is a design project supporting practitioners in decentralization through interface, content, and service design. Our main resource is a library of tried-and-tested design patterns, along with a glossary of terms and a research report detailing the needs and gaps we see in the current ecosystem.
-
-
www.e-ir.info
-
For over five centuries, the idea of modernity spread throughout the world, imposing Western culture and technology as the ultimate pinnacle of humankind (Escobar 2014). This has resulted in the historical destruction and continued devaluation of different Indigenous cultures and pluriversal forms of seeing and understanding the world (Brand et al. 2021).
Here the criticism is about accepting alterity
-
-
hypha.coop
-
Too many self-defined Web3 projects – at least those that involve tokenomics – have been harmful. They’re either scams, burn a grotesque amount of energy (particularly those based on Proof-of-Work), or require tons of hardware usage (leading to immense waste).
Web3 is going great?
-