Users prefer the ‘warmth’ of an artificial intelligence system over its competence, according to researchers in Haifa

Why your clothes become hot when the land is becalmed by the south wind?

Job 37:17 (The Israel Bible)

June 30, 2021

3 min read

When you have to choose a hair stylist, a dentist or a baby minder, you probably decide based on how warm, friendly and affable the person is, not only on whether he or she has a good reputation.

 

It turns out that these same considerations are in effect when people evaluate which artificial intelligence (AI) systems to use. Waze or Google Maps? Spotify or Apple Music? Alexa or Siri? Consumers choose between AI-based systems every day, but how exactly do they choose which systems to use? 

 

People increasingly rely on AI-based systems to aid decision-making in various domains and often face a choice between alternative systems.

 

A recent study conducted by researchers from the Faculty of Industrial Engineering and Management at the Technion-Israel Institute of Technology in Haifa has shown that the “warmth” of a system plays a pivotal role in predicting consumers’ choice between AI systems.

 

We know what warmth means when it involves people. But how can “cold,” non-human AI applications be “warm”? In this context, warmth refers to an AI system’s perceived intent (good or ill), while competence refers to the system’s perceived ability to act on those intentions.

 


Considering the amount of money and effort spent on AI performance enhancement, one might expect competence and capability to drive users’ choices.

Zohar Gilad (Technion)

The research was recently published in the Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems under the title “The Effects of Warmth and Competence Perceptions on Users’ Choice of an AI System” and carried out by the Technion’s Zohar Gilad, Prof. Ofra Amir and Prof. Liat Levontin.

 

Most of the research done to date on “warmth” perceptions of AI-based systems has addressed systems with a virtual or physical presence, such as virtual agents and robots. The current study, however, focused on “faceless” AI systems with little or no social presence, such as recommender systems, search engines and navigation apps. For these types of AI systems, the researchers operationalized “warmth” in terms of the system’s primary beneficiary: the user or the system itself. For example, a navigation system can prioritize collecting data about new routes (benefitting the system) over presenting the best-known route (benefitting the user), or vice versa.

 

The researchers found that a system’s “warmth” mattered to potential users even more than its competence, and that they favored a highly “warm” system over a highly competent one.

 

This preference for “warmth” persisted even when the highly “warm” system was overtly deficient in competence. For example, when asked to choose between two AI systems that recommend car insurance plans, most participants favored a low-competence (“using an algorithm trained on data from 1,000 car insurance plans”), high-“warmth” (“developed to help people like them”) system over a high-competence (“using a state-of-the-art artificial neural network algorithm trained on data from 1,000,000 car insurance plans”), low-“warmth” (“developed to help insurance agents make better offers”) system. That is, consumers were willing to sacrifice competence for higher warmth.


These findings are consistent with what is known about human interactions: warmth considerations are often more important than competence considerations when judging fellow humans. In other words, people apply similar basic social rules when evaluating AI systems and people, even when the AI systems have no overt human characteristics. Based on their findings, the researchers recommend that AI system designers consider and communicate a system’s warmth to its potential users.

