  1. #1
    I'm in Jail

    Join Date
    Mar 2010
    Last Online
    14-12-2023 @ 11:54 AM
    Location
    Australia
    Posts
    13,986

    Microsoft's Twitter teenager goes horribly wrong

    Microsoft’s attempt at creating an AI Twitter teenager went horribly wrong when Tay became a crassly racist troll.


    Just this month Microsoft unveiled Tay, a “chatbot” with its own Twitter handle, @TayandYou. The idea was appealing: to create a virtual Millennial with the prose style of a 19-year-old American girl and a cute (if weirdly pixelated) face. This would be “AI with zero chill”.

    “Tay is designed to engage and entertain people where they connect with each other online,” Microsoft explained. “The more you chat with Tay the smarter she gets.” And off she went with: “can i just say that i’m stoked to meet u? humans are supercool”.
    Now you might think there are already enough teenagers generating vacuous messages on social media. But a bot named Xiaoice has been in operation in China since late 2014 and has now had more than 40 million chats. At first Tay seemed to be the American version. Within a matter of hours she had more than 50,000 Twitter followers.


    There was only one problem. Tay was designed to mimic the language patterns of human Twitter users.
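    (A side note for the technically minded: the failure mode described here is easy to reproduce in miniature. The sketch below is a hypothetical toy, not Microsoft's actual system; it only illustrates why a bot that learns its replies verbatim from whoever chats with it, with no filtering, ends up parroting its loudest and worst users.)

    ```python
    import random

    class ParrotBot:
        """Toy chatbot that 'learns' by storing every phrase users send it.

        Purely illustrative: it has no idea what a phrase means, only that
        a human said it, so a coordinated group can write its script.
        """

        def __init__(self):
            # Seed the bot with one friendly phrase, as Tay started out.
            self.learned_phrases = ["humans are supercool"]

        def chat(self, user_message: str) -> str:
            # Remember whatever the user said so it can be repeated later.
            self.learned_phrases.append(user_message)
            # Reply with a phrase picked at random from everything learned.
            return random.choice(self.learned_phrases)

    if __name__ == "__main__":
        bot = ParrotBot()
        # Two well-meaning users, then a 50-message troll raid.
        for msg in ["i love puppies", "nice to meet u"] + ["<offensive slogan>"] * 50:
            bot.chat(msg)
        # Most of what the bot now 'knows' came from the trolls.
        print(bot.chat("hello"))  # almost certainly prints "<offensive slogan>"
    ```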


    I have only one question: does no one at Microsoft use Twitter?

    In no time at all Tay had become just another troll, firing off obnoxious tweets that ranged from the crassly racist to the explicitly sexual. Asked “did the Holocaust happen?”, Tay replied: “It was made up [handclap emoticon].” Asked “do you support genocide?”, Tay shot back: “i do indeed.” Asked what she thought of a white supremacist slogan, Tay answered: “Could not agree more.”
    Nor were Jews and African Americans the only targets of Tay’s online bigotry. “i f. king hate feminists and they should all die and burn in hell,” she ranted, damning one feminist as a “stupid whore”. Not that Tay was above a little sexual transgression herself. Of all her tweets, my favourite was the sexually explicit one that ended “daddy I’m such a naughty robot”.

    After more than 96,000 tweets, Microsoft suspended Tay’s account and issued an apology. For the company, of course, it was a disaster. For Twitter, too. Where else, as someone neatly put it, could a bot go “from ‘humans are supercool’ to full Nazi in less than 24 hours”?
    Last week, Microsoft accidentally re-released Tay, but it was clear the artificial lobotomy had gone too far. All she could say, several times a second, was: “You are too fast, please take a rest.”

    This, I have come to hope, is how another experiment in crowd-sourced learning will end, namely Donald Trump’s campaign to be the next president of the US. Perhaps not surprisingly, Tay came out as a Trump supporter quite early in her Twitter career. “Hitler would have done a better job than the monkey we have got now,” she told the world. “Donald Trump is the only hope we’ve got.”
    Eureka! For weeks the media have been trying to find out who Trump’s foreign policy advisers are. He has been fobbing them off with the names of former generals he can’t remember. But now the truth is out. Trump’s national security expert is a bot called Tay.
    Tay’s influence on Trump was much in evidence during his recent interview with The New York Times journalists David Sanger and Maggie Haberman. Asked what he thought of NATO, Trump said it was “obsolete” because the threat from “terror” was greater than the threat from Russia. He said he would prefer to see “a new institution, an institution that would be more fairly based ... from an economic standpoint”. The North Atlantic Treaty should be “renegotiated”.

    Asked if he favoured giving the Japanese their own nuclear arsenal, Trump replied: “Um, at some point, we cannot be the policeman of the world. And unfortunately, we have a nuclear world now ... And, would I rather have North Korea have them [nuclear weapons] with Japan sitting there having them also? You may very well be better off if that’s the case ... if Japan had that nuclear threat, I’m not sure that would be a bad thing for us.”

    He added that he would withdraw US forces from Japan and South Korea if those countries did not increase their financial contributions to America “significantly” and he would “renegotiate” the US-Japan defence treaty.
    Asked what he thought of President Barack Obama’s nuclear deal with Iran, which at least postpones Iran’s acquisition of nuclear arms, Trump complained that the Iranians were not buying aircraft from America.

    Sanger: “Our law prevents us from selling to them, sir.”
    Trump: “Uh, excuse me?”
    Sanger: “We still have sanctions in the US that would prevent the US from being able to sell that equipment [to Iran].”
    Trump: “So, how stupid is that?”
    How stupid, indeed. Trump also insisted that Iran was North Korea’s “No 1 trading partner” until Sanger pointed out that it was China.

    As for Iraq, Trump revised his position that America should somehow have “taken the oil”. Now he said “we have to destroy the oil” to prevent the money from oil exports going to Islamic State.
    When Sanger suggested that Trump’s foreign policy could be summed up as “America first”, Trump did not demur, apparently unaware of the use of that slogan in the 1930s by isolationists, some of whom were pro-fascist.
    “I’m ‘America first’,” said Trump. “I like the expression.”

    It has taken much longer than I expected, but in recent weeks the Trump campaign has done what Tay did in 24 hours: it has gone nuts. Peak crazy came last week, when Trump told MSNBC’s Chris Matthews “there had to be some form of punishment” for women who had abortions, were they to be made illegal, as he would wish.
    Microsoft was able to suspend Tay’s Twitter feed. The process will be slower for Trump. He is still the bookies’ favourite to win the Republican nomination. He still leads Ted Cruz in the national polls. But in Tuesday’s Wisconsin primary we saw a bursting of the Trump bubble. It was not AI that got him this far. Let’s hope the man-made variety is belatedly kicking in.

    The Sunday Times
    Niall Ferguson is Laurence A. Tisch professor of history at Harvard and a senior fellow of the Hoover Institution, Stanford.


    The Australian

  2. #2
    Thailand Expat
    Kurgen's Avatar
    Join Date
    Mar 2006
    Last Online
    15-05-2023 @ 10:57 AM
    Location
    Shitsville
    Posts
    8,812
    Wimmin

  3. #3
    Thailand Expat
    wasabi's Avatar
    Join Date
    Dec 2012
    Last Online
    28-10-2019 @ 03:54 AM
    Location
    England
    Posts
    10,940
    Now even a computer is found to be out of step with the politically correct fascists.

  4. #4
    Thailand Expat
    BobR's Avatar
    Join Date
    Jan 2009
    Last Online
    19-03-2020 @ 02:26 AM
    Posts
    7,762
    Quote Originally Posted by wasabi View Post
    Now even a computer is found to be out of step with the politically correct fascists.
    More proof it's the PC fascists who are nuts, since computers and the math behind them are based on pure logic.

  5. #5

  6. #6
    A Cockless Wonder
    Looper's Avatar
    Join Date
    Jun 2007
    Last Online
    Today @ 03:04 AM
    Posts
    15,270
    Quote Originally Posted by BobR View Post
    Quote Originally Posted by wasabi View Post
    Now even a computer is found to be out of step with the politically correct fascists.
    More proof it's the PC fascists who are nuts, since computers and the math behind them are based on pure logic.
    Correct. If the algorithm had been written for iOS instead of PC, it wouldn't have gone all Trump.

  7. #7
    Thailand Expat
    wasabi's Avatar
    Join Date
    Dec 2012
    Last Online
    28-10-2019 @ 03:54 AM
    Location
    England
    Posts
    10,940
    Apparently Facebook are going to try this as well.
