  1. #1 Looper

    New AI can guess whether you're gay or straight from a photograph

    Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research that suggests machines can have significantly better “gaydar” than humans.



    The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.

    The machine intelligence tested in the research, which was published in the Journal of Personality and Social Psychology and first reported in the Economist, was based on a sample of more than 35,000 facial images that men and women publicly posted on a US dating website. The researchers, Michal Kosinski and Yilun Wang, extracted features from the images using “deep neural networks”, meaning a sophisticated mathematical system that learns to analyze visuals based on a large dataset.
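    For readers curious what "extracted features using deep neural networks" means in practice, below is a minimal sketch of that kind of pipeline. The backbone (an off-the-shelf ResNet-18 from torchvision) and the logistic-regression classifier are illustrative stand-ins, not the authors' actual models.

    Code:
    # Minimal sketch of the kind of pipeline described above: a pretrained
    # deep network turns each face photo into a feature vector, and a
    # simple classifier is trained on those vectors. The backbone
    # (ResNet-18) and classifier (logistic regression) are illustrative
    # assumptions, not the study's exact setup.
    import torch
    from torchvision import models, transforms
    from PIL import Image
    from sklearn.linear_model import LogisticRegression

    # Pretrained network with its classification head removed, so the
    # final pooled activations act as a generic 512-dim image embedding.
    backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    backbone.fc = torch.nn.Identity()
    backbone.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def embed(path):
        """Return a feature vector for one face image."""
        img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
        with torch.no_grad():
            return backbone(img).squeeze(0)

    # Hypothetical labelled data: image_paths and labels would come from
    # the kind of dataset the article describes.
    # features = torch.stack([embed(p) for p in image_paths]).numpy()
    # clf = LogisticRegression(max_iter=1000).fit(features, labels)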

    The research found that gay men and women tended to have “gender-atypical” features, expressions and “grooming styles”, essentially meaning gay men appeared more feminine and vice versa. The data also identified certain trends, including that gay men had narrower jaws, longer noses and larger foreheads than straight men, and that gay women had larger jaws and smaller foreheads compared to straight women.

    Human judges performed much worse than the algorithm, accurately identifying orientation only 61% of the time for men and 54% for women. When the software reviewed five images per person, it was even more successful – 91% of the time with men and 83% with women. Broadly, that means “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”, the authors wrote.

    The paper suggested that the findings provide “strong support” for the theory that sexual orientation stems from exposure to certain hormones before birth, meaning people are born gay and being queer is not a choice. The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.

    While the findings have clear limits when it comes to gender and sexuality – people of color were not included in the study, and there was no consideration of transgender or bisexual people – the implications for artificial intelligence (AI) are vast and alarming. With billions of facial images of people stored on social media sites and in government databases, the researchers suggested that public data could be used to detect people’s sexual orientation without their consent.

    It’s easy to imagine spouses using the technology on partners they suspect are closeted, or teenagers using the algorithm on themselves or their peers. More frighteningly, governments that continue to prosecute LGBT people could hypothetically use the technology to out and target populations. That means building this kind of software and publicizing it is itself controversial given concerns that it could encourage harmful applications.

    But the authors argued that the technology already exists, and its capabilities are important to expose so that governments and companies can proactively consider privacy risks and the need for safeguards and regulations.

    “It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes,” said Nick Rule, an associate professor of psychology at the University of Toronto, who has published research on the science of gaydar. “If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad.”

    Rule argued it was still important to develop and test this technology: “What the authors have done here is to make a very bold statement about how powerful this can be. Now we know that we need protections.”

    Kosinski was not available for an interview, according to a Stanford spokesperson. The professor is known for his work with Cambridge University on psychometric profiling, including using Facebook data to draw conclusions about personality. Donald Trump’s campaign and Brexit supporters deployed similar tools to target voters, raising concerns about the expanding use of personal data in elections.

    In the Stanford study, the authors also noted that artificial intelligence could be used to explore links between facial features and a range of other phenomena, such as political views, psychological conditions or personality.

    This type of research further raises concerns about the potential for scenarios like the science-fiction movie Minority Report, in which people can be arrested based solely on the prediction that they will commit a crime.

    “AI can tell you anything about anyone with enough data,” said Brian Brackeen, CEO of Kairos, a face recognition company. “The question is as a society, do we want to know?”

    Brackeen, who said the Stanford data on sexual orientation was “startlingly correct”, said there needs to be an increased focus on privacy and tools to prevent the misuse of machine learning as it becomes more widespread and advanced.

    Rule speculated about AI being used to actively discriminate against people based on a machine’s interpretation of their faces: “We should all be collectively concerned.”

    https://www.theguardian.com/technolo...m-a-photograph

  2. #2 stroller
    Quote Originally Posted by Looper
    gay men appeared more feminine
    Of course.


  3. #3 wasabi
    Artificial intelligence can now spot homosexuals. Whatever next?

  4. #4 David48atTD
    Standing @ the junction of Asoke and Sukhumvit, not far from Cowboy, waiting to cross.

    It was the wee hours.

    Two Thais came up from behind

    One placed 'her' hand on my shoulder, the other, on my ass.

    Which one was the LadyBoy?

    The ass grabber, of course.
    ... and she wasn't appreciating my gluteus maximus, more like making a drunken grab for my wallet

    Loops, fun OP BTW
    Someone is sitting in the shade today because someone planted a tree a long time ago ...


  5. #5 chassamui
    So ..... Arabs ....... big noses ......... gay proclivities, rumor, .... or not

  6. #6 Maanaam
    Quote Originally Posted by Looper
    The machine’s lower success rate for women also could support the notion that female sexual orientation is more fluid.
    Some of us knew this already after participating in threesomes.


    Quote Originally Posted by Looper
    In the Stanford study, the authors also noted that artificial intelligence could be used to explore links between facial features and a range of other phenomena, such as political views, psychological conditions or personality.
    Job interviews... you didn't get the job because our analysis says you're a Trumpster. Sounds fair enough, but there are two sides to that coin.


    Quote Originally Posted by chassamui
    So ..... Arabs ....... big noses ......... gay proclivities, rumor, .... or not
    Semites would be a better description.
    Never heard that rumour.

  7. #7 HuangLao
    What's the practical purpose for such a study?


  8. #8 stroller
    Quote Originally Posted by Looper
    artificial intelligence could be used to explore links between facial features and a range of other phenomena, such as political views, psychological conditions or personality
    Quote Originally Posted by HuangLao
    What's the practical purpose for such a study?
    Visa application fast-tracking.

    "Sorry Sir, there's a 69% chance that you're a commie and 100% that you're a scat-enthusiast according to facial analysis - only 20% are acceptable for the non-imm."

  9. #9 chassamui
    How does it cope with feminine transgenders? Does it need sight of hands, feet and throats?


    Quote Originally Posted by Maanaam
    Never heard that rumour.
    Arab men allegedly like young boys. Is that not gay?

  10. #10 Looper
    Row over AI that 'identifies gay faces'

    A facial recognition experiment that claims to be able to distinguish between gay and heterosexual people has sparked a row between its creators and two leading LGBT rights groups.



    The Stanford University study claims its software recognises facial features relating to sexual orientation that are not perceived by human observers.

    The work has been accused of being "dangerous" and "junk science".

    But the scientists involved say these are "knee-jerk" reactions.

    Details of the peer-reviewed project are due to be published in the Journal of Personality and Social Psychology.
    Narrow jaws

    For their study, the researchers trained an algorithm using the photos of more than 14,000 white Americans taken from a dating website.

    They used between one and five of each person's pictures and took people's sexuality as self-reported on the dating site.

    The researchers said the resulting software appeared to be able to distinguish between gay and heterosexual men and women.

    In one test, when the algorithm was presented with two photos where one picture was definitely of a gay man and the other heterosexual, it was able to determine which was which 81% of the time.

    With women, the figure was 71%.
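    Worth spelling out what that pairwise test measures: shown one gay and one straight photo, the model "wins" if it scores the gay face higher, so the 81% figure is the fraction of pairs it ranks correctly, which is the same quantity as the area under the ROC curve. A small sketch with made-up scores, just to illustrate the calculation:

    Code:
    from itertools import product

    # Hypothetical classifier scores (higher = "more likely gay"); real
    # values would come from a trained model, these are made up.
    gay_scores = [0.9, 0.7, 0.65, 0.4]
    straight_scores = [0.8, 0.5, 0.3, 0.2]

    # Over every (gay, straight) pair, how often does the model score the
    # gay photo higher? Ties count as half a win. This equals the AUC.
    wins = sum((g > s) + 0.5 * (g == s)
               for g, s in product(gay_scores, straight_scores))
    pairwise_accuracy = wins / (len(gay_scores) * len(straight_scores))
    print(f"pairwise accuracy (= AUC): {pairwise_accuracy:.2f}")  # 0.75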

    "Gay faces tended to be gender atypical," the researchers said. "Gay men had narrower jaws and longer noses, while lesbians had larger jaws."

    But their software did not perform as well in other situations, including a test in which it was given photos of 70 gay men and 930 heterosexual men.

    When asked to pick the 100 men "most likely to be gay", it missed 23 of them; in other words, only 47 of the 100 men it flagged were actually gay.
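    That selection test amounts to measuring precision at a realistic base rate (70 gay men is 7% of the 1,000-man pool). A quick back-of-the-envelope check of the figures reported above:

    Code:
    # Check of the selection test described above: 70 gay men in a pool
    # of 1,000; the algorithm picks the 100 "most likely to be gay" and
    # misses 23 of the 70.
    total_gay = 70
    picked = 100
    missed = 23

    true_positives = total_gay - missed        # gay men inside the top 100
    false_positives = picked - true_positives  # straight men wrongly flagged
    precision = true_positives / picked

    print(f"correctly flagged: {true_positives}")   # 47
    print(f"wrongly flagged:   {false_positives}")  # 53
    print(f"precision: {precision:.0%}")            # 47%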

    In its summary of the study, the Economist - which was first to report the research - pointed to several "limitations" including a concentration on white Americans and the use of dating site pictures, which were "likely to be particularly revealing of sexual orientation".
    'Reckless findings'

    On Friday, two US-based LGBT-focused civil rights groups issued a joint press release attacking the study in harsh terms.

    "This research isn't science or news, but it's a description of beauty standards on dating sites that ignores huge segments of the LGBTQ (lesbian, gay, bisexual, transgender and queer/questioning) community, including people of colour, transgender people, older individuals, and other LGBTQ people who don't want to post photos on dating sites," said Jim Halloran, chief digital officer of Glaad, a media-monitoring body.

    "These reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous."
    [Image caption: Campaigners raised concerns about what would happen if surveillance tech tried to make use of the study]

    The Human Rights Campaign added that it had warned the university of its concerns months ago.

    "Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world - and this case, millions of people's lives - worse and less safe than before," said its director of research, Ashland Johnson.

    The two researchers involved - Prof Michal Kosinski and Yilun Wang - have since responded in turn, accusing their critics of "premature judgement".

    "Our findings could be wrong... however, scientific findings can only be debunked by scientific data and replication, not by well-meaning lawyers and communication officers lacking scientific training," they wrote.

    "However, if our results are correct, Glaad and HRC representatives' knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organisations strive to advocate."
    'Treat cautiously'

    Previous research that linked facial features to personality traits has come unstuck when follow-up studies failed to replicate the findings. This includes the claim that a face's shape could be linked to aggression.

    One independent expert, who spoke to the BBC, said he had further concerns about the claim that the software involved in the latest study picked up on "subtle" features shaped by hormones the subjects had been exposed to in the womb.

    "These 'subtle' differences could be a consequence of gay and straight people choosing to portray themselves in systematically different ways, rather than differences in facial appearance itself," said Prof Benedict Jones, who runs the Face Research Lab at the University of Glasgow.

    It was also important, he said, for the technical details of the analysis algorithm to be published to see if they stood up to informed criticism.

    "New discoveries need to be treated cautiously until the wider scientific community - and public - have had an opportunity to assess and digest their strengths and weaknesses," he said.

    Row over AI that 'identifies gay faces' - BBC News

  11. #11 wasabi
    When I was young and beautiful, too pretty to be a boy, my divine looks would have fooled the computer into misreading my straight ways.

  12. #12 bobo746
    Anyone got a photo of Blue? The machine would go off its head.
