2015-08-03

#robotics beware offensive autonomous weapons

8.1: news.cs/robotics/beware offensive autonomous weapons:
8.3: summary:
. we are getting close to robots
capable enough to serve as soldiers;
but scientists warn we should avoid
such a robotics arms race.
. supercomputers will be used to design
robotic soldiers we don't fully understand.
. robots designed for mutually assured destruction
could cause the extinction of the human race
much more effectively than nuclear weapons.

theatlantic 2014:
. scientists are warning about
serious threats to human life in the near future.
you've heard a lot about climate change,
and many people don't care;
but how many times have you heard warnings of
superintelligent computers?
. the whole point of pouring money into them
is to put them in charge of critical responsibilities,
like deciding who dies on the battlefield.

"machines with superhuman intelligence
could repeatedly improve their design even further,"
triggering a singularity:
a very rapid rise in the rate of tech evolution.
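The quoted feedback loop can be sketched as a toy model (function names and all parameters here are hypothetical, not from the source): a system whose rate of improvement grows with its own capability crosses any fixed threshold in far fewer design generations than a system improving at a constant rate.

```python
# Toy model of recursive self-improvement vs. constant-rate improvement.
# All numbers are illustrative assumptions, not measurements.

def generations_selfimproving(capability=1.0, gain=0.1, threshold=1e6):
    """Generations needed when better systems improve themselves faster."""
    n = 0
    while capability < threshold:
        # the improvement factor grows with current capability
        capability *= 1.0 + gain * capability
        n += 1
    return n

def generations_fixed(capability=1.0, rate=0.1, threshold=1e6):
    """Generations needed at a constant 10% improvement per generation."""
    n = 0
    while capability < threshold:
        capability *= 1.0 + rate
        n += 1
    return n
```

With these toy defaults, the self-improving loop crosses the threshold in well under twenty generations, while the constant-rate loop takes well over a hundred; that widening gap is the intuition behind a "very rapid rise in the rate of tech evolution."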

. robots will be a superior alien presence
arriving in a few decades,
and our political response has been
"Experts are surely doing everything possible
to ensure the best outcome, right?"
but, if you look at industrialization,
we gave capitalists the benefit of the doubt;
they thoroughly abused their responsibilities,
and only then did we pour on the regulation.
. if we do the same for robots,
they could do serious damage,
working for the next anti-christ
programmed to kill everybody but the master.

. here are some organizations
that are worried about our future:
Cambridge Centre for the Study of Existential Risk;
the Future of Humanity Institute;
the Machine Intelligence Research Institute;
the Future of Life Institute.

kurzweilai 2015:
More than 1,000 AI and robotics researchers
signed and published an open letter
calling for a ban on
offensive autonomous weapons.

persuade humans to prepare:

The Future of Life Institute has a video:
. their speaker, MIT prof Max Tegmark,
believes that although our gloomy cosmology
tells us the universe eventually dies,
that death is a very long time away,
whereas the planet dies much sooner,
so we need to think about the threats
that would prevent us from colonizing space.
. we need to spend more resources
educating people about the need to prepare;
the scientists need to be more like the capitalists,
using ads and social media to persuade the masses.

. he starts off assuming the big bang theory,
and people with that sort of imagination
also conclude the universe eventually dies
the same way it was created.
. but he assures us we have plenty of time:
it took millions of years to evolve us,
and that is just 1/10^57 of the time
that our universe can support life.

. he thinks the universe has a definite size,
inferred from the big bang
and from the microwave background radiation;
but given the time it takes to travel,
the size of the universe is practically unlimited.

. so we have this enormous future of opportunity
if we can spend more money on learning to
survive and colonize space.

. he's sure there are no extraterrestrials
within our own galaxy:
they would have been here already,
since they would spread in order to
consume other species (like humans, of course).

. assuming aliens have not found us,
it could mean that
the creation and evolution of life is very rare,
or that once life has become intelligent
something prevents it from space exploration.

. but his argument against aliens is very weak;
first, there are hints we are visited,
and even if we weren't, it's a big place:
aliens could be spreading without reaching here yet.

. how are the aliens extracting resources
without being seen?!
Snowden says some think aliens live in the ocean.
. if there is evidence of alien contact,
it may be hidden by the government.
(see the notable list of believers).

1.14: web.tech/Kurzweil`Transcendent Man.film:
Transcendent Man is the documentary film
that introduces the life and ideas of
Ray Kurzweil, the renowned futurist.
He proposes that the Law of Accelerating Returns
—the exponential increase in the
growth of information technology—
will result in a "singularity", a point where
humanity and machines will merge,
allowing one to transcend biological mortality.
At the end of the film, Kurzweil states,
"if I was asked if god exists, I would say not yet."

8.3: conclusion:
. if you're sure that god would not allow
machines to wipe out all of humanity,
consider the fact that Earthlings
are not the only source of humanity.
