When Artificial Intelligence Creates Stronger Emotional Closeness than a Human
Heidelberg and Freiburg researchers study interaction with AI chatbots
Humans can build emotional closeness to artificial intelligence (AI) – under certain conditions even more strongly than to another human being. This has been shown by studies conducted by Prof. Dr Bastian Schiller from Heidelberg University’s Institute of Psychology in cooperation with colleagues from the University of Freiburg. Participants in two online studies felt such closeness primarily when they were not aware that they were communicating with an AI chatbot. According to the scientists, AI therefore has great potential as a “social actor”, but clear safeguards are needed to prevent the misuse of such systems.
To find out whether interaction with AI-based language models can establish a feeling of emotional closeness, the research team conducted two studies with a total of 492 participants. In chat interactions, the participants answered personal questions about their emotions, for example regarding important life experiences or friendships, and either a human or a chatbot responded to their answers. The researchers also examined how information about the identity of the conversation partner – human or chatbot – influenced the participants.
Overall, the responses generated by an AI-based language model created a feeling of closeness comparable to that created by responses from a human – as long as the participants did not know that they were talking to an AI chatbot. In emotionally engaging interactions they even felt greater closeness to the AI than to another person, which Prof. Schiller and his colleagues found to be a surprising result of their studies.
According to Freiburg researcher Dr Tobias Kleinert, the fact that artificial intelligence can establish more emotional closeness than a human partner is due to greater “self-disclosure”: in their answers, the AI chatbots revealed more supposedly personal information. If, by contrast, the participants were informed in advance that they were communicating with an AI, their perceived closeness declined markedly, and they invested less effort in their answers than they did with human partners.
According to the research team, the results of their studies show that artificial intelligence has great potential, for example in psychological support, long-term care, education, or counselling situations. Especially for people with few social contacts, AI chatbots can enable positive, relationship-like experiences, says Prof. Dr Markus Heinrichs from the Department of Psychology at the University of Freiburg. At the same time, they carry the risk that people build emotional bonds with an AI without being aware of it.
“Artificial intelligence is increasingly becoming a ‘social actor’. How we shape and regulate it will determine whether it becomes a meaningful complement to social relationships – or whether emotional closeness is deliberately manipulated,” Prof. Schiller emphasizes. The researchers from Freiburg and Heidelberg conclude that ensuring transparency and preventing misuse requires clear ethical and regulatory safeguards.
The studies were conducted in the context of the Starting Grant “From face-to-face to face-to-screen: Social animals interacting in a digital world”, awarded to Bastian Schiller by the European Research Council (ERC). The results have been published in the journal “Communications Psychology”.
Original publication
T. Kleinert, M. Waldschütz, J. Blau, M. Heinrichs, and B. Schiller: AI outperforms humans in establishing interpersonal closeness in emotionally engaging interactions, but only when labelled as human. Communications Psychology (14 January 2026).