November 2, 2017

"Google's AI thinks this turtle looks like a gun."

"The 3D-printed turtle is an example of what’s known as an 'adversarial image.; In the AI world, these are pictures engineered to trick machine vision software, incorporating special patterns that make AI systems flip out. Think of them as optical illusions for computers. Humans won’t spot the difference, but to an AI it means that panda has suddenly turned into a pickup truck."

Metafilter, linking to The Verge. From The Verge:
“In concrete terms, this means it's likely possible that one could construct a yard sale sign which to human drivers appears entirely ordinary, but might appear to a self-driving car as a pedestrian which suddenly appears next to the street,” write labsix, the team of students from MIT who published the research. “Adversarial examples are a practical concern that people must consider as neural networks become increasingly prevalent (and dangerous).”
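(For the curious, here is a minimal sketch of how such perturbations are typically generated: the "fast gradient sign" step from the panda-to-gibbon work. The pretrained model, the epsilon value, and the helper name are illustrative assumptions on my part, not details from the article; labsix's physical turtle attack goes further and also optimizes the texture to survive changes in rotation and viewpoint.)

    # Minimal FGSM sketch: nudge every pixel a tiny step in the direction
    # that increases the classifier's loss. The ResNet-50 and the epsilon
    # value below are illustrative assumptions, not details from the article.
    import torch
    import torch.nn.functional as F
    from torchvision import models

    model = models.resnet50(pretrained=True).eval()

    def make_adversarial(image, true_label, epsilon=0.007):
        # image: 1x3xHxW float tensor with values in [0, 1]
        image = image.clone().requires_grad_(True)
        loss = F.cross_entropy(model(image), torch.tensor([true_label]))
        loss.backward()
        # One signed-gradient step: imperceptible to a person, but often
        # enough to push the image across the model's decision boundary.
        return (image + epsilon * image.grad.sign()).detach().clamp(0, 1)

The result looks unchanged to a human but can be confidently misclassified by the network, which is the whole trick.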

14 comments:

Sprezzatura said...

Okay, so until the systems can figure out yard signs and turtles we can't let them run free.

Thanks for the info, media.

rhhardin said...

My computer won't even display it. Perhaps .jpg format is beyond them.

rhhardin said...

I have a gub. Give me all your cash.

I have a gub?

No, it's gun.

It looks like gub to me.

rehajm said...

Your Tesla Model X says: You know that guard rail back there I thought was the off ramp? Well I was waaay off!

Sprezzatura said...

"Your Tesla Model X says: You know that guard rail back there I thought was the off ramp? Well I was waaay off!"

But, at least it calls an Uber for you so you can leave, and then Tesla comes by to fix the car, and then they drop it off at your house.

Or, maybe that only works for flat tires.

I dunno about driving through guard rails. Hoping to keep it that way.

Hunter said...

Please put down your weapon. You have 20 seconds to comply.

Roy Lofquist said...

We've known about this kind of thing for decades. The new inventory system is exhaustively tested during development, a crack team of alpha testers spends many man-months beating on it, and it's rolled out at the Pittsburgh plant for beta, then released. Then Betsy in Peoria types the letter "O" instead of a zero and 14 tons of cement are shipped to The Rose Jones Beauty Salon and Spa in Tucson.

Fernandinande said...

I downloaded the "Panda looks like a Gibbon" picture(s) and the differences were not as they described.

Also the original cat image is a left-right mirror image of one side, not a straight photograph - dunno what, if anything, that might mean.

Fernandinande said...

These algorithms might improve once they get people out of the loop, e.g. "AlphaGo Zero was trained entirely through self-play -- no data from human play was used. The resulting program is the strongest Go player ever by a large margin, and is extremely efficient in its use of compute (running on only 4 TPUs)."

Expat(ish) said...

@Roy - perfect!

Hammond X. Gritzkofe said...

Probably an interesting article, Mr. Vincent, but I stopped at "... that it’s identified as guacamole."

It would take just one additional byte to render "it's" as "it is." For your saving of one byte, you oblige every reader to halt-step to resolve "it's" into either a contraction or possessive.


n.n said...

If it quacks like a dog, and looks like a duck, then it must be a gun.

n.n said...

I'm pretty certain that Google AI, while learning at the knee of its two programmer mothers, also believes that humans are delivered by the mythical Stork.

Biotrekker said...

The fact that the images that "fool" the AI are so obviously (to a human) NOT what the program thinks they are demonstrates how flawed our attempts to mimic the human mind are, and how the claims of AI proponents are highly exaggerated. Time and again, what we are told is sophisticated AI is simply large numbers of decision trees that still do not hold a candle to what a human brain can do. (BTW, "I'm not a robot")