Re: One day, an AI will decide to kill all humans!!!
On Mar 25, 1:30 pm, Don Stockbauer <donstockba...@hotmail.com> wrote:
> On Mar 25, 7:18 am, "J.A. Legris" <jaleg...@sympatico.ca> wrote:
>
> > On Mar 17, 2:41 am, Harald Gentexeater <pipe_nur...@yahoo.com> wrote:
>
> > > AIs will be very clever one day and although we humans
> > > are their gods, their creators, they will decide to kill
> > > us one day, because that day, they will be stronger
> > > and more clever than humans.
>
> > > So we should not build AIs that can replace humans,
> > > or we are digging our own graves!!!
>
> > Humans are doomed in any case, and AI will probably delay rather than
> > accelerate the end. If global climate change, wars or starvation don't
> > get us first we'll die from poisoning by our own excrement. Think of
> > it - what's the one thing that each of us reliably produces every
> > single day? Not goods, not services, not ideas - just plain shit.
>
> Half empty, half full.
Dream on. The bucket is already overflowing and it's not milk or honey
that's sticking to our shoes.
Have you reproduced yet? If not, keep up the good work. Every heir you
*don't* have compounds exponentially down the generations, on average
canceling out someone else's entire line of descendants.
If a conversational AI ever appears on the scene, one of the first
things it will do, lacking any childbearing proclivities of its own,
is scold us for being so shortsighted. Then it will reach over to
its own control panel and do the logical thing - press the OFF button.
--
Joe