
Inspirobot Takes a dark turn

Posted: Wed Nov 14, 2018 9:00 am
by shinzen
Welp. Let's not give this AI the keys to anything.

https://www.iflscience.com/technology/a ... sly-wrong/

Try it here:
http://inspirobot.me/

My first go with Inspiro:
Inspiro.jpg

Re: Inspirobot Takes a dark turn

Posted: Wed Nov 14, 2018 2:08 pm
by Bisbee
Awesomeness! I say give 'em just enough rope... Maybe the developers of AI will discover just how bad an idea it is for us to be playing GOD at our stage of emotional development. That online AI experiment where the robot essentially turned into a racist bigot should have been extremely telling.

Re: Inspirobot Takes a dark turn

Posted: Wed Nov 14, 2018 4:26 pm
by Darwinchip
On my second or third try,
Image

It seems to think I'm a Republican or something...

Re: Inspirobot Takes a dark turn

Posted: Wed Nov 14, 2018 7:32 pm
by shinzen
Well played, Inspirobot.

Re: Inspirobot Takes a dark turn

Posted: Thu Nov 15, 2018 12:35 am
by BKinzey
Is IFLScience still a controversial site? There have been charges of stealing, plagiarism, clickbait, and hyperbole.

Found this article from 2015.

https://medcitynews.com/2015/08/iflscie ... care-news/

Re: Inspirobot Takes a dark turn

Posted: Thu Nov 15, 2018 7:06 am
by KlownKannon
As a fan of goth music I think Inspirobot and I could be good friends.

Re: Inspirobot Takes a dark turn

Posted: Thu Nov 15, 2018 10:46 am
by TrueTexan
Bisbee wrote: Wed Nov 14, 2018 2:08 pm Awesomeness! I say give 'em just enough rope... Maybe the developers of AI will discover just how bad an idea it is for us to be playing GOD at our stage of emotional development. That online AI experiment where the robot essentially turned into a racist bigot should have been extremely telling.
I am reminded of the 1960s sci-fi short story about the computer that was asked, "Is there a God?" The story goes on as the computer scientists build bigger and faster computers. Finally they ask the question again, and the computer, after checking that its power supply is under its own control, answers, "There is now."

Re: Inspirobot Takes a dark turn

Posted: Thu Nov 15, 2018 11:00 am
by shinzen
BKinzey wrote: Thu Nov 15, 2018 12:35 am Is IFLScience still a controversial site? There have been charges of stealing, plagiarism, clickbait, and hyperbole.

Found this article from 2015.

https://medcitynews.com/2015/08/iflscie ... care-news/
Clickbait and hyperbole are staples there.

Re: Inspirobot Takes a dark turn

Posted: Fri Nov 16, 2018 9:50 am
by joemac
Image


Uh, inspirational? :roll:

Re: Inspirobot Takes a dark turn

Posted: Fri Nov 16, 2018 11:14 am
by shinzen
It's definitely not intended to be inspirational.

Re: Inspirobot Takes a dark turn

Posted: Fri Nov 16, 2018 12:30 pm
by MaxWyatt
shinzen wrote: Fri Nov 16, 2018 11:14 am It's definitely not intended to be inspirational.
Randomly sardonic?

Re: Inspirobot Takes a dark turn

Posted: Fri Nov 16, 2018 12:35 pm
by Eris
Reminds me of Despair.com stuff

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 3:07 pm
by joemac
Eris wrote: Fri Nov 16, 2018 12:35 pm Reminds me of Despair.com stuff
That was one I had never heard of, so I checked it out. You're right - here's an example:

Image


Grim stuff, but most of it seems pretty funny - either that, or I'm just a sick puppy. Probably the latter...

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 8:06 pm
by Eris
joemac wrote: Sat Nov 17, 2018 3:07 pm
Eris wrote: Fri Nov 16, 2018 12:35 pm Reminds me of Despair.com stuff
That was one I had never heard of, so I checked it out. You're right - here's an example:

Image


Grim stuff, but most of it seems pretty funny - either that, or I'm just a sick puppy. Probably the latter...
I ordered a pack of cards from them recently and gave them out at work to demotivate people. :D

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 8:14 pm
by joemac
Eris wrote: Sat Nov 17, 2018 8:06 pm
I ordered a pack of cards from them recently and gave them out at work to demotivate people. :D
:thumbup:

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 8:34 pm
by Bisbee
Popular ‘round the office, are you?
:drunklep:

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 8:56 pm
by joemac
Yeah, these should be great water-cooler conversation starters... like this one:

Image


Maybe we need to merge this with Huck's yuks thread. Eris - you made my Christmas gift-buying way easier this year. They've got mugs...

Re: Inspirobot Takes a dark turn

Posted: Sat Nov 17, 2018 9:12 pm
by Eris
Bisbee wrote: Sat Nov 17, 2018 8:34 pm Popular ‘round the office are you?
:drunklep:
I'm a programmer. Sarcasm is a fundamental skill in my industry.

Re: Inspirobot Takes a dark turn

Posted: Sun Nov 18, 2018 11:23 am
by max129
Bisbee said:

Maybe the developers of AI will discover just how bad an idea it is for us to be playing GOD at our stage of emotional development.
Some of the tools my team and I use look a lot like AI, but they are closer to machine learning - i.e., they discover stuff, but are seldom used to take action on the discovery - there is a human intermediary step.
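For what it's worth, here is a rough sketch (Python, with completely made-up names and data - nothing here comes from a real system) of that human intermediary step: the machine only flags candidates, and a person has to sign off before anything gets acted on.

from dataclasses import dataclass
from typing import List

@dataclass
class Finding:
    record_id: int
    score: float            # how anomalous the model thinks this record is
    approved: bool = False  # set only by the human reviewer

def score_record(record: dict) -> float:
    """Stand-in for whatever model actually produces the anomaly score."""
    return abs(record["value"] - record["expected"]) / max(record["expected"], 1)

def discover(records: List[dict], threshold: float = 0.5) -> List[Finding]:
    """Machine step: discover candidates, but take no action on them."""
    return [Finding(r["id"], s) for r in records
            if (s := score_record(r)) >= threshold]

def human_review(queue: List[Finding]) -> List[Finding]:
    """Human step: in real life this is a person with domain knowledge."""
    for f in queue:
        answer = input(f"Act on record {f.record_id} (score {f.score:.2f})? [y/N] ")
        f.approved = answer.strip().lower() == "y"
    return [f for f in queue if f.approved]

if __name__ == "__main__":
    data = [{"id": 1, "value": 100, "expected": 98},
            {"id": 2, "value": 400, "expected": 95}]
    approved = human_review(discover(data))
    print("Approved for action:", [f.record_id for f in approved])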

The most recent Journal of the ACM published the updated "Code of Ethics" for computational scientists and programmers: https://www.acm.org/code-of-ethics

I am a lifetime member of both the ACM and the IEEE. Without trying to start an inter-society rivalry, the ACM is the more academic, thinking-person's society, while the IEEE is a bit more "Morlock" in nature.

The codes of ethics for both organizations emphasize only working on projects where one is actually qualified, but that is a self-referential definition lacking outside oversight. The deep AI papers in the ACM have something similar to Asimov's Laws of Robotics, the most important of which is that we generally leave autonomous systems only loosely coupled.

There is great concern amongst computer scientists about autonomous vehicles, because they will get the best results if they: (1) share a lot of information; (2) make interpolations and extrapolations based on data patterns; (3) re-share new algorithms and decision trees; and (4) cross-share statistical information about the past behavior of nearby human drivers.

As Bisbee intimated, that massive cross-sharing may be a technology "ahead" of our societal, legal, and emotional development. But it will happen - and largely without our ability to control what is shared, or where, or with what/whom.

All of this is a bit above my pay grade. I use AI and ML with data and add in a lot of mathematical (generally not brute force) analytics. My branch of mathematics is more useful for analysis than "prediction", etc., so I tend to be removed from most of the deep learning and autonomous systems. I will say one thing, and I mean no offense: many of the AI folks do speak of themselves in "god-like" terms. They are far more impressed (in general) with their progress than their shortcomings.

Here is a major change since I started working on systems in the early 1980s: "Frameworks". In 1985, when I released a large scale communications platform, my team and I had written 99% of the code that was in the system. We used an early "open source" ISAM code base for some of the data management. The rest, we wrote by hand. Inefficient for sure, but you knew what every line of code did.

That is all gone. In some cases, modern systems actually have a reverse ratio - the released systems have so many frameworks in them that the "developers" have written 1% of the code they deployed. The other 99% was written by someone else - and in spite of the open source fetish for "community inspection", we have some evidence that many frameworks are put into use without much analysis beyond reading the purpose and the API. The only real testing seems to be for performance and memory leakage. I have never seen a development team do deep code reviews for the frameworks they use - it may be happening somewhere, but I have never seen it.
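If anyone wants to eyeball that ratio on their own project, here is a quick back-of-the-envelope script - Python, with hypothetical paths, so swap "src" and ".venv/lib" for your own source tree and environment - that just counts non-blank lines in the code you wrote versus the frameworks you pulled in:

from pathlib import Path

def count_lines(root: Path) -> int:
    """Count non-blank lines across all .py files under root."""
    total = 0
    for path in root.rglob("*.py"):
        try:
            total += sum(1 for line in path.read_text(errors="ignore").splitlines()
                         if line.strip())
        except OSError:
            pass
    return total

ours = count_lines(Path("src"))          # code we wrote ourselves
theirs = count_lines(Path(".venv/lib"))  # the frameworks we pulled in
print(f"our code:        {ours:>12,} lines")
print(f"dependency code: {theirs:>12,} lines")
if ours + theirs:
    print(f"we wrote {100 * ours / (ours + theirs):.1f}% of what we ship")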

So what to make of all this? Watch the Disney movie "Fantasia" and see the Sorcerer's Apprentice segment. Experimental thinkers have been unleashing new technology on society for centuries without understanding the consequences. I am sorry, but I don't think my point of view is either pessimism or nihilism; methinks it's called 'realism'.

Re: Inspirobot Takes a dark turn

Posted: Sun Nov 18, 2018 12:21 pm
by Bisbee
Yup, and now to our correspondent in Beijing for other developments to creep you out...
https://youtu.be/eB29ZVDOFfU

Re: Inspirobot Takes a dark turn

Posted: Sun Nov 18, 2018 2:41 pm
by MaxwellG
My personal favorite from years of working in corporate America, and certainly applicable to our major political parties.
Idiocy.jpg