
Microsoft Apologizes for Chatbot's Racist, Sexist Tweets

The company says that the program's tweets 'do not represent who we are or what we stand for, nor how we designed Tay.'

By Reuters


This story originally appeared on Reuters


Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week, a company official wrote on Friday, after the artificial intelligence program went on an embarrassing tirade.

The bot, known as Tay, was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp to shut it down on Thursday.
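Microsoft has never published Tay's actual architecture, but the failure mode described above, a bot that stores raw user input and replays it with no moderation step in the learning loop, can be illustrated with a toy sketch. Everything below (the NaiveEchoBot class, its methods, and the sample messages) is hypothetical and for illustration only:

```python
import random

class NaiveEchoBot:
    """Toy illustration of learning from users with no content filter.

    NOT Tay's real design: this is an assumed, simplified model of the
    general failure mode, where whatever users say becomes training data.
    """

    def __init__(self):
        self.learned_phrases = []  # every user message is stored verbatim

    def learn(self, user_message: str) -> None:
        # No moderation step: hostile input enters the "model" unchecked.
        self.learned_phrases.append(user_message)

    def reply(self) -> str:
        if not self.learned_phrases:
            return "hellooooo world!"
        # Replies are sampled from raw user input, so a coordinated group
        # repeating abusive text quickly dominates what the bot says back.
        return random.choice(self.learned_phrases)

bot = NaiveEchoBot()
for msg in ["nice to meet you"] + ["<abusive phrase>"] * 3:
    bot.learn(msg)
print(bot.reply())  # most likely parrots the repeated abusive phrase
```

The point of the sketch is that such a bot's behavior is an unweighted reflection of its inputs, which is why a "coordinated attack by a subset of people," as Microsoft later put it, could redirect it so quickly.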

Following the setback, Microsoft said in a blog post it would revive Tay only if its engineers could find a way to prevent Web users from influencing the chatbot in ways that undermine the company's principles and values.

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," wrote Peter Lee, Microsoft's vice president of research.

Microsoft created Tay as an experiment to learn more about how artificial intelligence programs can engage with Web users in casual conversation. The project was designed to interact with and "learn" from young millennial users.


Tay began its short-lived Twitter tenure on Wednesday with a handful of innocuous tweets.

Then its posts took a dark turn.

In one typical example, Tay tweeted: "feminism is cancer," in response to another Twitter user who had posted the same message.

Lee, in the blog post, called Web users' efforts to exert a malicious influence on the chatbot "a coordinated attack by a subset of people."

"Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack," Lee wrote. "As a result, Tay tweeted wildly inappropriate and reprehensible words and images."

Microsoft has enjoyed better success with a chatbot called XiaoIce that the company launched in China in 2014. XiaoIce is used by about 40 million people and is known for "delighting with its stories and conversations," according to Microsoft.


As for Tay? Not so much.

"We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity," Lee wrote.

(By Alex Dobuzinskis; editing by Frank McGurty and Peter Cooney)

