    Facebook’s algorithm misrepresents blacks and monkeys

By Jillian Castillo | September 4, 2021

(San Francisco) A Facebook recommendation algorithm asked users whether they wanted to see more "primates videos" beneath a British tabloid video showing Black men, The New York Times reported Friday.


Posted on September 3, 2021 at 10:41 p.m.




The Daily Mail video, posted more than a year earlier, is titled "White Man Calls Cops Against Black Men in Marina." It shows only people, no monkeys.

Beneath it, the question "See more primate videos?" appeared with "Yes/Reject" options on some users' screens, according to a screenshot posted on Twitter by Darci Groves, a former designer at the social networking giant.

"It's scandalous," she commented, urging her former colleagues at Facebook to escalate the matter.

"This is clearly an unacceptable mistake," a Facebook spokesperson told AFP when asked for comment. "We apologize to anyone who came across these insulting recommendations."

The spokesperson said the California-based company had deactivated the recommendation feature on this topic "as soon as we noticed what was happening, in order to investigate the cause of the problem and prevent it from recurring."

She continued: "As we have said, even though we have improved our AI systems, we know they are not perfect and that we still have progress to make."

The case underscores the limits of the artificial-intelligence technologies that the platform regularly touts in its efforts to build a personalized feed for each of its roughly three billion monthly users.

Facebook also relies heavily on AI for content moderation, to identify and block problematic messages and photos before they are even seen.

But Facebook, like its competitors, is regularly accused of not doing enough to fight racism and other forms of hate and discrimination.

The issue is all the more sensitive as several civil-society organizations accuse social networks and their algorithms of contributing to the polarization of American society, particularly in the context of the Black Lives Matter demonstrations.
