Smart speaker algorithm says what!!!

Box Brownie

Spotted this on the news feed


What next, giving suicide advice? :banghead:

It should not be the user who has to find such appalling 'advice'.

One way and another, I've played with software for quite a bit of my life. I'm totally mystified what sort of error could cause that.

The only thing I can think of (other than deliberate action for "exposure") is that Amazon must be using some natural-language ML kit to trawl TikTok looking for things identified as challenges. If that's true, this is one of the most benign failures they could have had. (FWIW, I believe it's almost impossible to short out a modern UK plug in this way, but please don't try to prove me wrong....)
 
I think you are likely correct. Modern 3-pin plugs (for at least the past 20 years, I think?) have shrouded L & N pins; AFAIK, if the plug is pulled out far enough for the pins to be exposed, they are no longer in contact with the socket supply :thinking:
 
Anyone who's daft enough to follow that kind of advice should NOT be in command of any electronic device, IMO! :LOL:
Though in the case of the story it was a young child asking the smart speaker... As for whether there might, or should, be parental controls in place to stop such dangerous advice, that is a whole different discussion ;)
 

Correct.

But still.......

Next thing you know, it'll be listening to your conversations...... ;)
 
As you say, it's all but impossible to do in the UK (although I have a couple of very old 13A plugs with solid brass pins that would make it possible!) but IIRC the child in the story is in the USA where the pins are pretty easy to get at.
 
Also only 110 V, so not likely to be fatal... shocking all the same.
 
U.S. 110 V is different to UK building-site 110 V, which is 55 V either side of earth, so 110 V in the U.S. is more likely to be harmful. But surely a penny across the pins would trip a breaker (or blow a fuse on older systems) pretty quickly.
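To spell out the arithmetic behind that point (Python used just for the sums; the 110 V figures are the nominal values mentioned above, not exact supply voltages):

```python
# UK building-site transformers are centre-tapped to earth:
# 110 V between the two lines, but only 55 V from either line to earth.
uk_site_line_to_line = 110                          # volts
uk_site_line_to_earth = uk_site_line_to_line / 2    # centre tap -> 55 V

# US domestic supply is nominally ~110 V line to neutral/earth,
# so a shock to earth sees the full 110 V rather than 55 V.
us_line_to_earth = 110

print(uk_site_line_to_earth)  # 55.0
print(us_line_to_earth)       # 110
```

That halved line-to-earth voltage is the whole point of the centre-tapped site transformer: the worst-case shock to earth is 55 V, not 110 V.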
 
Actually, that’s not really the point. When I read the story some days ago, it said Alexa was trawling the internet for ‘challenges’ and that Amazon had banned that one, but it didn’t say Amazon had changed the overall practice. If it’s continuing, who knows what daft/lethal challenges it will find :(
 
Well, I guess it's because "machine learning" is really just glorified pattern matching with no intelligence whatsoever. There is no policy for Amazon to change; they probably don't even really know how their ML actually recognises "challenges". All they will do is put this one on an exclude list. Frankly, things like this example are fairly trivial because they are sensational and easily spotted; what is a lot more pernicious is doing things like recommending dieting information to anorexics, or suicide information to depressed people.
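Purely to illustrate that "exclude list" idea (everything here is hypothetical, a sketch of the general pattern rather than anything we know about Amazon's actual system), a naive post-hoc filter bolted onto a recommender might look like:

```python
# Hypothetical blocklist, curated by hand after each incident.
BLOCKLIST = {"penny challenge"}

def filter_suggestions(ranked_suggestions):
    """Drop any suggestion containing a blocked phrase (case-insensitive)."""
    return [s for s in ranked_suggestions
            if not any(bad in s.lower() for bad in BLOCKLIST)]

print(filter_suggestions([
    "Try the penny challenge!",   # blocked
    "Ice bucket challenge",       # allowed
]))
# -> ['Ice bucket challenge']
```

The sketch shows exactly the weakness described above: the list only ever grows reactively, one banned phrase at a time, while the underlying recommender carries on unchanged.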
 