From: Don Stockbauer
Newsgroups: fj.kanji,comp.ai.philosophy
Subject: Re: One day, an AI will decide to kill all humans!!!
Date: Fri, 10 Apr 2009 06:08:19 -0700 (PDT)
Organization: http://groups.google.com
Message-ID:
References: <4bd78027-97b0-48a4-9dbf-179f4e1c298b@h28g2000yqd.googlegroups.com> <336b335f-ba27-4a9a-ae6f-25e1fa886299@j8g2000yql.googlegroups.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=GB2312
Content-Transfer-Encoding: quoted-printable

On Apr 7, 7:07 pm, Don Stockbauer wrote:
> On Apr 7, 2:35 pm, "J.A. Legris" wrote:
> > On Apr 7, 12:54 pm, Don Stockbauer wrote:
> > > On Apr 7, 6:28 am, "J.A. Legris" wrote:
> > > > On Apr 7, 6:31 am, Don Stockbauer wrote:
> > > > > On Apr 6, 11:50 pm, "maill...@gmail.com" wrote:
> > > > > > On Mar 17, 2:41 pm, Harald Gentexeater wrote:
> > > > > > > AIs will be very clever one day and although we humans
> > > > > > > are their gods, their creators, they will decide to kill
> > > > > > > us one day, because that day, they will be stronger
> > > > > > > and more clever than humans.
> > > > > > >
> > > > > > > So we should not build AIs that can replace humans,
> > > > > > > or we are digging our own graves!!!
> > > > > >
> > > > > > [Translated from the Chinese:]
> > > > > > The question is not whether we have created an intelligence
> > > > > > that will destroy us. We should see it this way: we have
> > > > > > created a species better suited to carrying our civilization
> > > > > > forward. Survival of the fittest, the law of nature.
> > > > >
> > > > > Ok, Mr. Searle, welcome to the forum.
> > > > >
> > > > > Let me guess what all that says:
> > > > >
> > > > > "And just when the global brain locked in and saved the day, a 20 km
> > > > > impactor struck the Earth, eventually killing everybody."
> > > >
> > > > Ha!
> > > >
> > > > I thought it was spam, but Google translation thinks otherwise:
> > > >
> > > > "Whether or not our problem does not lie in the production of the
> > > > destruction of our own intelligence.
> > > > As we should, we have created our civilization is more suitable
> > > > species will spread.
> > > > The survival of the fittest, natural law."
> > >
> > > You know, I never even thought of running it through a translator; I
> > > just always kind of assumed that they wouldn't do well going from an
> > > Asian language to English, but by golly, I guess you proved that wrong.
> >
> > Even so, it's unclear what was meant. Something along the lines of:
> >
> > "Whether or not our intelligent creations destroy us, we will have
> > fulfilled our destiny by creating a civilization that propagates
> > fitter species. You know, survival of the fittest, natural law."
> >
> > And, I might add, eternal harmony, civil tranquility and always rising
> > prosperity!
>
> I'm not sure about the last sentence. Maybe the thing will need
> strife in order to keep the population under control. If the
> intelligence scales up, then that would be pretty impressive.

Then again, maybe not. Who the hell can be sure of anything anymore in
the face of such massive complexity?