the
law of accelerating returns [i]
by anna
observer
take
one look around
societies
that learn
technology
abounds... with accelerating returns
_________________________________
·
when Superintelligence (sometimes imagined as “the Singularity”) occurs, don’t bet on the optimistic lie (that It will be available to all). optimists say things like:
a) “It will cure cancer… and diabetes.”
b) “It will be under our control.” (by the way… who is “…our”?)
c) “It won’t be used for power, control (e.g. military or police purposes).” honestly now… what are the chances of that?
d) “we will develop It with our own moral code.”
‘d’ seems inevitable… so the moral code of a human-male-ego will have God-like power.
·
once we conclude that a very few men-machines (or maybe the first man-machine) will experience Superintelligence, he/they will be God-like.
·
humanity, then (and musings like this thoem), will immediately be reduced to the insignificance of mosquitoes. at that point, it will be up to the Superintelligent to decide if humanity survives.
the decision to allow humanity to survive might not even be a decision, as the Superintelligent will have much bigger/better things to do. one might hope (if one hopes to survive) that he/she might be selected as low-level “entertainment” to some lower-level General Intelligence.
…question: “how interested are you… in the lives (social, sexual, societal, etc.) of mosquitoes?”
_____________________________
[i] martino, j. (3.14-1.2023). the law of accelerating returns. book 114: untitled. © 2023 by joal martino.
*as of this writing, the only thing that lets me sleep at night is (1) I’ve been wrong in the past, and (2) when ASI happens, we won’t even know it. (yes. that’s two things).