this post was submitted on 05 Jun 2024
596 points (100.0% liked)

LGBTQ+


Via LeopardsAteMyFace on Reddit

App covered by Them and Pink News

[–] smiletolerantly@awful.systems 140 points 2 years ago (4 children)

This is hilarious, but also: how could anyone develop such a tool and not at least test it out on their own images? Someone with a public persona no less! Boggles my mind.

[–] schnokobaer@feddit.de 67 points 2 years ago (3 children)

I mean, I bet they did, but you can't test it with all the photos ever taken of you. Someone probably tried dozens of photos to get this result. Which, to be clear, I admire.

[–] youngalfred@lemm.ee 54 points 2 years ago (1 children)

It looks like it's the photo from her tweet

[–] RogueBanana@lemmy.zip 1 points 2 years ago

I would assume it was taken just for this tweet so all the testing would have been completed by that time.

[–] 30p87@feddit.de 38 points 2 years ago (1 children)

Well, in this case it's the photo she posted herself to announce the app.

[–] Revonult@lemmy.world 15 points 2 years ago* (last edited 2 years ago)

They did, they know it doesn't work, but they are already too far down in the money hole. Gotta grift and bullshit and spread bigotry until they make back the money.

Edit: Some words

[–] jaybone@lemmy.world 1 points 2 years ago

Does she have a public persona? Who is this person?

[–] Maggoty@lemmy.world 102 points 2 years ago* (last edited 2 years ago) (4 children)

Heat emissions.

From a picture.

Hold on there's a ringing in my ears... Yeah yeah that's my bullshit alarm going off.

[–] JackbyDev@programming.dev 33 points 2 years ago

The heat emission is the smoke they're trying to blow up my ass.

[–] iAvicenna@lemmy.world 11 points 2 years ago (1 children)

as in how hot the person in the picture is?

[–] Maggoty@lemmy.world 3 points 2 years ago

Lmao can't be hotter than the CEO!

[–] BluesF@lemmy.world 8 points 2 years ago

Could be analysing 4-band imagery with an NIR layer. But that usually comes from satellite imagery, which would make identifying gender challenging. I'd struggle with just a grainy image of the top of someone's head, even if I knew how warm it was.

Well, Big Shaq told us all that Man's not hot.

[–] KISSmyOSFeddit@lemmy.world 77 points 2 years ago (2 children)

Maybe her transphobia is just an attempt to pass better?

[–] ThePyroPython@lemmy.world 28 points 2 years ago

Ah, straight out of the Dictatorship Survivor's Guide for Middle Management playbook.

[–] lesbian_seagull@lemm.ee 5 points 2 years ago

I kinda want to use it to see if I pass as a cis woman with short hair, but then again is my ego really prepared to be misgendered by some shitass app!?

Also, too, fuck TERFS 🖕

[–] nick@midwest.social 77 points 2 years ago (1 children)

So because she didn’t check herself, you might say she wrecked herself.

[–] lesbian_seagull@lemm.ee 9 points 2 years ago

☜(゚ヮ゚☜)

[–] breadsmasher@lemmy.world 54 points 2 years ago

well if it is 99.85% accurate, maybe she is self-hating and hiding

[–] ch00f@lemmy.world 46 points 2 years ago* (last edited 2 years ago) (2 children)

What’s the play here? Does she not know that people upload highly inaccurate or blatantly fake photos to dating sites all the time?

What problem does this solve?

[–] Diplomjodler3@lemmy.world 53 points 2 years ago

The problem that right wing fuckwits always need somebody to hate and discriminate against.

[–] alyth@lemmy.world 10 points 2 years ago* (last edited 2 years ago)

The problem it solves is that she needs plenty of money with little effort and morals are not a limiting factor. And what Diplomjodler3 said.

[–] smiletolerantly@awful.systems 40 points 2 years ago (2 children)

OK, I'm starting to have doubts that this is legit. Looks like OP (or OOP, idk) just found a classifier which misclassified that image. Nothing I'm seeing indicates that it's the classifier used for her stupid app.

[–] Cybrpwca@beehaw.org 28 points 2 years ago (1 children)

I fed it a pre-HRT pic and got "Woman, 56% confident". Lol. I guess it's kind of affirming to think a machine could see the real me back then?

[–] lud@lemm.ee 9 points 2 years ago (1 children)

I tried the same pic that was used above and it said "Woman, 95% confident"

[–] BananaOnionJuice@lemmy.dbzer0.com 1 points 2 years ago* (last edited 2 years ago)

Did you use the larger cropped picture? Otherwise, I was thinking: what if the AI was actually saying "I'm 97% sure that it's a man facing away from the camera"?

[–] ASeriesOfPoorChoices@lemmy.world 14 points 2 years ago

that's a solid disclaimer.

[–] qjkxbmwvz@startrek.website 34 points 2 years ago* (last edited 2 years ago)

According to the screenshot, it doesn't even call her a trans woman, it calls her a man. Presumably because man and woman are the only options in her little TERF world.

[–] rimu@piefed.social 28 points 2 years ago

The AI probably saw that massive boner in her pants and got confused.

[–] BestBouclettes@jlai.lu 27 points 2 years ago

They can always tell!

[–] nifty@lemmy.world 27 points 2 years ago (1 children)

As funny as it is, I don’t think people should be uploading their images to this app. Maybe it’s hilariously wrong because it’s trying to data mine?

[–] Maggoty@lemmy.world 3 points 2 years ago

Yup, if you give information to a company, it's now theirs. The old adage about being the product if you're not paying no longer applies. Now you are the product even if you're paying.

[–] ClockworkOtter@lemmy.world 19 points 2 years ago* (last edited 2 years ago) (2 children)

I wonder if the AI is detecting that the photo is taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?

[–] Tyoda@lemm.ee 17 points 2 years ago (1 children)

It's possible to manipulate an image in a way that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.

Like this helpful graphic I found

Or... edit the HTML...
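
For anyone wondering what that kind of graphic usually shows, here's a minimal sketch of the adversarial-perturbation idea (FGSM-style), assuming a pretrained torchvision ResNet-18; the model choice, the epsilon value, and the fgsm_perturb helper are just illustrative, not anything this app actually uses.

```python
# Minimal FGSM-style sketch: nudge every pixel slightly in the direction that
# increases the model's loss. The perturbed image looks identical to a human,
# but the classifier's output can flip. Model and epsilon are arbitrary choices.
import torch
import torch.nn.functional as F
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm_perturb(image: torch.Tensor, label: torch.Tensor, epsilon: float = 0.003) -> torch.Tensor:
    image = image.clone().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)   # how wrong the model is on the true label
    loss.backward()
    # Step each pixel by +/- epsilon along the sign of the loss gradient.
    return (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

# Usage: image is a (1, 3, 224, 224) tensor in [0, 1], label a (1,) class index.
# adversarial = fgsm_perturb(image, label)
```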

[–] Alexstarfire@lemmy.world 6 points 2 years ago

You think someone would do that? Just go on the internet and lie?

[–] drcobaltjedi@programming.dev 16 points 2 years ago

Yeah, this is a valid point. Whether that's exactly what happened here I don't know, but a lot of people don't realize how many weird biases can appear in the training data.

Like that AI trained to detect if a mole was cancerous or not. A lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.

I could easily see something stupid, like the angle the picture was taken from, being something the AI erroneously assumed was useful for determining biological sex in this case.
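
A toy sketch of that ruler effect (nothing to do with this app's actual model; the features and numbers here are made up): a spurious feature tracks the label in the training data but not in the wild, and the classifier leans on it.

```python
# Toy spurious-correlation example: the "ruler" feature lines up with the label
# in training, so the model relies on it; when that correlation disappears at
# deployment time, accuracy drops sharply.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
y = rng.integers(0, 2, n)                  # ground truth: "cancerous" yes/no
real_signal = y + rng.normal(0, 2.0, n)    # weak genuine feature
ruler = y + rng.normal(0, 0.1, n)          # spurious feature: rulers appear in the cancer photos

clf = LogisticRegression().fit(np.column_stack([real_signal, ruler]), y)

# In the wild, rulers show up regardless of the label.
ruler_wild = rng.normal(0.5, 0.1, n)
print("train-like accuracy:", clf.score(np.column_stack([real_signal, ruler]), y))
print("deployment accuracy:", clf.score(np.column_stack([real_signal, ruler_wild]), y))
```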

[–] EmilyIsTrans@lemmy.blahaj.zone 14 points 2 years ago

Lol it was 85% confident I was "female"

[–] Oszilloraptor@feddit.de 11 points 2 years ago* (last edited 2 years ago) (1 children)

Half of my images get around 50% confidence that I'm a man.

All other images lead to 98% confidence in me being a woman.

All of my test pics were taken pre-HRT, and while my style makes it obvious that I don't identify as masculine (or am GNC + gay), my face (unfortunately) screams pre-HRT AMAB.

That AI is totally shit

[–] Maggoty@lemmy.world 8 points 2 years ago

They said it includes heat emissions. Which is impossible without an IR camera that everyone totally has. I love it when bullshit tech charlatans tell on themselves.

[–] bluewing@lemm.ee 11 points 2 years ago

We are all Trans Women on this blessed day.

Who uses their real picture in a dating app anyways?

/s

[–] Taleya@aussie.zone 4 points 2 years ago

Ah bless, they've automated it now.