- Nov 2024
-
www.linkedin.com
-
The point is that this is a collective problem that can only be solved collectively. And clearly there is no collective; even worse.
for - post comment - LinkedIn - polarization - Trump 2024 win - lack of collective - adjacency - Deep Humanity - deep time, species-wide singularity - conservatism vs progressiveness - progress - political polarization - progress trap
adjacency between
- Trump 2024 win
- Deep Humanity
- anthropocene as deep time, species-wide singularity
- progress traps reaching a climax
- conservatism vs progressiveness

adjacency relationship
- This fits into a Deep Humanity explanation:
    - We are moving through a deep time, species-wide singularity in which once-isolated pockets of cultural seeking and interpretative systems for explaining reality have been rapidly mashed up via:
        - communication technology and
        - transportation technology
    - It is a singularity in which two forces are battling each other:
        - a conservative force that values old, traditional cultural values and norms, and
        - a progressive force that values future possibilities
    - There are different cultural flavors of this. Whether it is:
        - political polarization that pits authoritarian vs democratic ideologies, or
        - climate change that pits traditional fossil fuel systems vs new renewable energy systems,
      the way we've always done things is in conflict with new ways of doing things that emerge through natural human evolutionary change - progress
    - In fact, we can view the deep time, species-wide singularity now happening across all fields in the anthropocene as a predictable progress trap arising from progress itself
-
- Mar 2024
-
www.youtube.com
-
www.linkedin.com
-
AI will still need us for a while... that is, until the Singularity arrives, when we will be able to upload our brains into computers.
progress trap - singularity
-
- Aug 2023
-
metodo-rivista.eu
Tags
Annotators
URL
-
- Jul 2023
-
-
Over the next 15 to 20 years this is going to develop a computer that is much smarter than all of us. We call that moment singularity.
- Singularity
- will happen within the next few decades
- Singularity
-
-
kortina.nyc
-
finite time singularity
-
finite time singularity
- when the mathematical solution to the growth equation becomes infinitely large at some finite time
-
comment
- this is also salient for the accumulation of unresolved progress traps
- the Anthropocene can perhaps be viewed as the occurrence of finite-time singularities due to unresolved problems arising from progress traps that innovation is too slow to solve
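The quoted definition can be made concrete with a toy calculation. A minimal sketch of West's superexponential growth equation dN/dt = c·N^β with β > 1, whose closed-form solution diverges at a finite time t_c; the parameter values below are illustrative, not from the book:

```python
# Finite-time singularity sketch: dN/dt = c * N**beta with beta > 1
# has the closed-form solution
#   N(t) = (N0**(1 - beta) - (beta - 1) * c * t) ** (1 / (1 - beta)),
# which becomes infinitely large at a finite critical time t_c.

def t_critical(n0, c, beta):
    """Time at which the closed-form solution diverges."""
    return n0 ** (1 - beta) / ((beta - 1) * c)

def n_closed(t, n0, c, beta):
    """Closed-form solution, valid only for t < t_critical."""
    return (n0 ** (1 - beta) - (beta - 1) * c * t) ** (1 / (1 - beta))

n0, c, beta = 1.0, 1.0, 2.0          # simplest case: dN/dt = N**2, N(0) = 1
tc = t_critical(n0, c, beta)          # t_c = 1.0 for these values
for t in (0.0, 0.5, 0.9, 0.99):
    print(t, n_closed(t, n0, c, beta))   # growth explodes as t -> t_c
```

For β = 2 this reduces to N(t) = 1/(1 − t): finite growth at every moment before t_c, yet unbounded at t_c itself, which is the mathematical shape of the "innovation must outrun the blowup" argument.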
-
-
-
Title
- West // Scale: The Universal Laws of Growth, Innovation, Sustainability, and the Pace of Life in Organisms…
-
Comment
- good excerpts from the book
-
-
- Aug 2021
-
su.org
-
an American company that offers executive educational programs, a business incubator and innovation consultancy service.[1][2] It is not an accredited university and does not provide traditional university qualifications.
Worth keeping in mind
Tags
Annotators
URL
-
- Sep 2018
-
www.mnemotext.com
-
Good has captured the essence of the runaway, but he does not pursue its most disturbing consequences. Any intelligent machine of the sort he describes would not be humankind's "tool" – any more than humans are the tools of rabbits, robins, or chimpanzees.
If humanity were to create an ultra-intelligent computer, then humans would be far surpassed. This part of the passage says that, because our minds would be so simple compared to those of the machines at the point of the singularity, we would be little more than rabbits. This idea is astounding: our mental capabilities are vastly superior to those of rabbits and other animals, and the thought that we could be so easily and so greatly surpassed is probably terrifying to many people. This is probably what makes the singularity such an abstract idea.
-
And it's very likely that IA is a much easier road to the achievement of superhumanity than pure AI. In humans, the hardest development problems have already been solved. Building up from within ourselves ought to be easier than figuring out what we really are and then building machines that are all of that.
The authors of the text propose a radically different approach to the inevitable "singularity" event: the research and development of IA, or Intelligence Amplification, which means developing computers in symbiosis with humans. They note that IA could be easier to develop than pure AI algorithms, since humanity would first have to probe its own true weaknesses and strengths, and could then build an IA system that covers those weaknesses. This could keep an AI from outgrowing us on its own, potentially delaying the point at which we reach the singularity.
-
The maximum possible effectiveness of a software system increases in direct proportion to the log of the effectiveness (i.e., speed, bandwidth, memory capacity) of the underlying hardware.
Simply stated, there will always be something restricting what technologies can do. Thus far in human technological advancement there has not been a single hardware platform that can support beyond-human software. As the quote states, the 'mind' of a piece of software is limited by the effectiveness of the underlying hardware, and by the time humans are able to invent something that could effectively contain a beyond-human brain, there would be countermeasures in place to reduce the risk of an AI taking over the human race. The resource cost would also discourage such an experiment from being funded, since it would be expensive to pay researchers to create compatible parts and programmers to develop something resembling a human mind but more advanced. Programming is another problem: humans do not fully understand the human mind, so there is very little chance that some programmer accidentally writes a line of code that lets an AI extend beyond what a human can comprehend. The idea of a technological singularity remains a theory, and this single quote suggests that the technological singularity is far from achievable.
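The quoted relationship is easy to tabulate. A minimal numeric sketch of the claim that maximum software effectiveness scales as the log of hardware effectiveness; the proportionality constant `k` is a hypothetical scale factor introduced only for illustration:

```python
import math

# Sketch of the quoted claim: if maximum software effectiveness grows
# only as the log of hardware effectiveness, each doubling of hardware
# buys the same fixed increment of software capability.
# The constant k is a hypothetical scale factor, not from the source.

def max_software_effectiveness(hardware_effectiveness, k=1.0):
    return k * math.log2(hardware_effectiveness)

# A million-fold hardware improvement yields only ~20 units of
# capability on this scale: strongly diminishing returns.
for hw in (1, 2, 4, 1024, 1_000_000):
    print(hw, round(max_software_effectiveness(hw), 2))
```

Under this assumption, exponential hardware progress translates into merely linear software progress, which is the annotation's point about a hard brake on runaway AI.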
-
-
www.mnemotext.com
-
Into the Wormhole
This scene, as with many others, represents a crucial point in the movie. I think it also connects to what was discussed on the first day of class in relation to singularity. One student suggested that singularity in an astronomical context is a black hole imploding on itself. In a linguistics context, we described how singularity shares connections with being alone, unique, and individual. In the "Into the Wormhole" scene in 2001: A Space Odyssey, all of humankind is erased except for David. Advancements in technology had reached their peak to the point where David literally enters a black hole or wormhole, in which he lives the rest of his life alone and passes away quietly. This signifies mankind imploding on itself and being reborn. His passage signifies the rebirth of humankind and presents the idea of the cycle of life, where technology is nonexistent and life begins again. There is a sense of reverse chronology in the movie, as the ending scene is continued at the beginning of the movie, where the monkeys demonstrate their journey toward intelligence once more.
-
-
www.mnemotext.com
-
Technology is therefore no mere means. Technology is a way of revealing. If we give heed to this, then another whole realm for the essence of technology will open itself up to us. It is the realm of revealing, i.e., of truth.
Technology opening itself to us gives us the essence of technology, with which we, as a species, can conquer the mysteries of the natural laws around us. As for the many who fear the day of the technological singularity, they should not worry: as long as that piece of information is unlocked for researchers to tinker with, there will always be a countermeasure to such an event. Knowing the truth of the machine allows the loopholes of the machine to be exploited and neutralized.
-
-
www.mnemotext.com
-
The concept of "extropy" was used to encapsulate the core values and goals of transhumanism. Intended not as a technical term opposed to entropy but instead as a metaphor, extropy was defined as "the extent of a living or organizational system's intelligence, functional order, vitality, and capacity and drive for improvement."
It's interesting that the author emphasizes that extropy is intended not as an opposition to entropy but as a metaphor. However, would extropy not be the opposite of entropy metaphorically as well? Scientific definition aside, entropy is the universe's tendency toward chaos in every sense of the word: constant expansion and disorder. Extropy is the universe's tendency toward the idea of a 'singularity' (discussed in "The Technological Singularity"), so essentially the exact opposite? The universe's tendency to follow "intelligence, functional order", etc. toward a single point of the "posthuman", where we've gone beyond human capability?
-
- Dec 2017
-
journals.plos.org
Tags
Annotators
URL
-
- Feb 2016
-
www.e-flux.com
-
Everyday interactions replay the Turing Test over and over. Is there a person behind this machine, and if so, how much? In time, the answer will matter less, and the postulation of human (or even carbon-based life) as the threshold measure of intelligence and as the qualifying gauge of a political ethics may seem like tasteless vestigial racism, replaced by less anthropocentric frames of reference.
That's beautiful. I only hope the transition isn't jarring and the rate of expansion for compassion matches or exceeds that of cognition.
-