The invention of AI ‘gaydar’ could be the start of something much worse

Windparadox

The invention of AI ‘gaydar’ could be the start of something much worse - "Researchers claim they can spot gay people from a photo, but critics say we’re revisiting pseudoscience"

While the article does a good job of explaining the technical reasons this kind of programming is flawed, it fails to address the larger issue here. What comes next? Can AI photo-identification be used to look for other human characteristics, such as guilt or innocence? Whether a person is a criminal? Will the police use it? Consider that such AI programs are in their infancy right now, yet all are ultimately geared toward identifying traits in a person that the end user finds profitable and/or beneficial to their own cause.

Civil liberties are at stake here. Living in an online public environment has its risks.


 

I don't think anything's next.
Usually one can spot someone who's gay by their looks.
Not always, but a lot of the time.
But who cares anymore if you're gay?
 
With the increasing use of cameras and DNA evidence (except for identical twins), there is no need for such AI tech.
 
Be that as it may, if you read the article, you'd know that such technology is currently being developed nonetheless. Obviously some monied interests are willing to invest tens of millions of dollars in it.
 

Stepping a toe into giving machines 'precog' status, and opening the door again to eugenics and Minority Report.

Why can't people just go to the movies? They always have to be fucking with something.
 

Probably for better data mining. Trump won the presidency almost solely through data mining.
 
