This new AI can tell if you're gay or straight from a photo


#1

Lots to think about…

Worried about employers/governments (and parents?) abusing this as/when it becomes public domain.

Another step towards shutting up the “choose to be gay” idiots?

Pretty limited study… no bi or trans people, not to mention it seems to be entirely white (!), but I guess more sophisticated ones will follow.

Thoughts?


#2

Why would you even think of making this? God knows how much time and effort it requires.

There are some right arseholes out there.


#3

I have to say, I think this is terrible in many ways, and I'm desperate to try to find reasons why it might not work.

I wonder if the fact that the pictures are from a dating website makes it 'easier' for the AI,
and I would have thought that because these are photographs, they may have been hasty in drawing conclusions about what the actual physical structure of faces could be a sign of…

Maybe I’m clutching at straws

Not particularly comforting if it heralds the return of physiognomy and phrenology though!


#4

Probably just looked at the bit where it said “looking for”


#5

The authors say in the article that they did it because the technology is already available and they want to raise the issue now so that controls can be put in place to regulate it properly. (Most likely they came up with this excuse afterwards and just wanted to publish a paper that was going to get lots of attention).


#6

Usually the answer to this is simply “because we can”.

I'm not just uncomfortable with this from the point of view of outing people who don't want to make their sexuality public; a 9% error rate is also pretty high for something that could have huge ramifications for individuals' lives.


#7

How is this not total bollocks? (Didn't read the article.)


#8

Probably not all that much time or effort. This kind of machine learning is fairly rote nowadays. Show it pictures and tell it whether the person in the picture is straight/gay.

Doesn’t justify it obviously but I don’t think there’s anything “clever” about this.

EDIT: not sure if that’s appropriate use of “rote”.
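To illustrate how rote the "show it pictures, tell it the label" setup described above really is, here is a minimal, purely hypothetical sketch: a stock binary classifier trained on synthetic stand-in features. Nothing here uses real photos, real labels, or the study's actual pipeline; the data is random noise for demonstration only.

```python
# Hypothetical sketch only: synthetic stand-ins for the supervised
# setup described above, not the study's actual method or data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n_photos, n_features = 200, 64

# Stand-ins: in a real pipeline these would be features extracted from
# face photos (e.g. by a pretrained network) and labels scraped from
# dating profiles. Here both are random.
X = rng.normal(size=(n_photos, n_features))
y = rng.integers(0, 2, size=n_photos)

# The "training" is a single library call; no bespoke cleverness needed.
clf = LogisticRegression(max_iter=1000).fit(X, y)
predictions = clf.predict(X)  # one 0/1 guess per photo
```

The point is that once the labelled pictures exist, the modelling step is a few lines of off-the-shelf code.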


#9

I bet it can’t.

Unless it’s a photo of you having sex with another man/woman.


#10

People often think I’m gay, predominantly because of my mannerisms and my total incompetence with ladykind.

I wonder if a photo picks up on that.


#11

Obviously there are loads of scary implications of the use of this technology, but I think the science behind it could be positive. Presumably if it can find markers in the face that correlate, that supports there being an underlying biological basis of sexuality, and it will be harder for people to say it is not natural. (Obviously there shouldn’t need to be an underlying biological cause for people to accept it; it being people’s choice should be enough.)


#12

I hope the fact that it’s white people only in their shitty experiment might also be a clue that they aren’t tracking what they think they are, that their results can’t be generalised in any meaningful way, etc.


#13

Completely unethical, obviously.

The numbers mean nothing without comparable data either. Not that they’d be of importance in this instance, but I imagine AI could be used with some success to check all sorts of other personal information.


#14

Just hope they don’t develop something to tell if you’re a complete twat!


#15

I’ve never really understood how incompetence with women translates into ‘he must be slaying the D’, but you do hear it a lot from people.


#16

This. It’s surely that there are trends in the kind of profile pictures that straight and gay people choose, in a forum where they are open about their sexuality.


#17

Obviously it’s never going to work 100%.

But it’s still scary to think it exists at all and will only get better. Can you imagine employers filtering through a thousand CVs, deciding they don’t want any candidates who look a bit gay? It doesn’t matter whether the software is entirely accurate or not.


#18

It’s very obvious why this piece of AI can determine gay/straight from an image, and I’m surprised the Guardian didn’t pick up on this:

“consistent with the association between baseball caps and masculinity in American culture (Skerski, 2011), heterosexual men and lesbians tended to wear baseball caps”

Mystery solved.


#19

Well, surely you can just get around that by not applying for jobs where you’re asked for a photograph? Like, 99% of them.


#20

For me that’s completely unethical as well, btw. I’m not completely sure it exists, but if it does, it’s not on. If a potential employer asked me for a photograph before or during the interview stage, I’d ask them why it was necessary, and almost certainly end my interest in the role.