Video: Would You Follow a Robot Into a Fire? These Students Did.

Assistant Editor, Contributed Content
Opinions expressed by Entrepreneur contributors are their own.

Imagine driving along, following directions from your smartphone, when the polite voice tells you to take a turn. You oblige, only to realize that you’ve been directed onto a dead end.

We’ve all been there: trusting machines a bit more than we should.

A new study from Georgia Tech Research Institute sheds a bit more light on that phenomenon, finding that people will trust directions from a robot in an emergency even when the machine had misled them in the past. A group of participants followed a robot to get to a conference room, though the machine purposefully misled the group, passing the room multiple times before arriving at the correct destination. Then, the scientists simulated an emergency situation -- in this case, a mock fire -- to see if the participants would follow the same robot to safety.


Despite its earlier incompetence, and despite some participants being told that the robot had broken down previously, the experiment’s subjects followed the robot’s directions. The volunteers listened to the robot even though it was directing them to an exit farther away than the doorway marked with exit signs that they had used to enter the building. The subjects only questioned the robot’s instructions when it malfunctioned during the emergency itself.

Researchers said they were surprised by the results, concluding that “victims in emergency situations may overtrust a robot, even when they have recently witnessed the robot malfunction.” Notably, when the experiment was run without an emergency scenario, volunteers did not trust a robot that had made previous errors, leading researchers to believe that the robots were seen as authority figures during an emergency.

In future studies, scientists hope to understand the motivations that make humans trust robots, and whether factors such as an individual’s education level and other demographics influence their decisions.

These results could be encouraging: earlier experiments have shown that people often fail to follow evacuation protocols during an emergency, even with an active alarm, so this test suggests that placing robots in high-rise buildings could make residents more likely to leave. Still, scientists are wary of humans trusting the machines too much.


The research is scheduled to be presented March 9 at the 11th ACM/IEEE International Conference on Human-Robot Interaction in New Zealand. The conference is dedicated to “basic and applied human-robot interaction research.”

It should be noted that the study is limited by its small sample size -- only 42 people participated -- so its conclusions should not be broadly applied.
