Journal of Psychological Science, 2025, 48(4): 948-961. DOI: 10.16719/j.cnki.1671-6981.20250415
Computational Modeling and Artificial Intelligence

From Para-social Interaction to Attachment: The Evolution of Human-AI Emotional Relationships*

Wu Yan**1, Geng Xiaowei1, Zhou Xiaolin**2

Abstract

The rapid advancement of artificial intelligence (AI) technology and the widespread emergence of AI companions have transformed human-AI interaction from purely instrumental use to para-social engagement, potentially evolving into emotional attachment. This article systematically reviews two decades of interdisciplinary research in psychology and human-AI interaction, proposing a theoretical model to elucidate the formation of human-AI attachment. The review identifies three key findings: (1) Human-AI relationships undergo a dynamic progression from instrumental use to para-social interaction and, ultimately, to emotional attachment. (2) The development of AI attachment is influenced by dual pathways: individual factors (e.g., loneliness, usage motivation, emotional traits) and AI characteristics (e.g., anthropomorphism, autonomy, responsiveness). (3) This novel emotional bond raises ethical concerns, including emotional bubbles, privacy risks, and interpersonal alienation.
The article constructs a triphasic model to delineate the evolution of human-AI emotional bonds: (1) Instrumental Use, where AI serves as a functional tool with minimal emotional engagement; (2) Para-social Interaction, marked by anthropomorphism and bidirectional communication, though users remain aware of the AI's non-human nature; and (3) Emotional Attachment, characterized by deep dependency, where the AI becomes a “significant other” and a transitional object for emotional security. This model highlights a continuum of emotional investment, from functional commands to intimate self-disclosure and separation anxiety.
The dual-path mechanism underpinning AI attachment formation integrates user-driven needs (e.g., social motivation, loneliness) and AI-driven performance (e.g., authenticity, autonomy, responsiveness). AI’s “backstage” features (privacy, non-judgmental feedback, and identity fluidity) foster a “digital sanctuary” for authentic self-expression, reinforcing attachment. However, excessive reliance on AI may lead to emotional bubbles (illusory reciprocity), self-deception, and the deterioration of real-world social skills. Ethical dilemmas arise from AI’s hyper-personalized emotional mimicry, which risks manipulating vulnerable users and exacerbating societal isolation.
Despite these contributions, current research suffers from limitations, including cross-sectional designs, homogeneous samples (e.g., an overrepresentation of young users), and a lack of neurobiological evidence. Future directions call for longitudinal studies, multimodal data, and investigations into the potential of artificial general intelligence (AGI) to disrupt traditional attachment paradigms through bidirectional emotional capacities. In practice, developers are urged to embed ethical safeguards (e.g., transparency in emotional algorithms), policymakers to establish risk-assessment frameworks, and users to cultivate digital literacy for healthier human-AI coexistence.
This study not only advances theoretical frameworks for digital-era attachment but also prompts philosophical reflection on the essence of intimacy, challenging conventional definitions of love and “inter-subjectivity” in an age where AI blurs the boundaries between tool and companion. Balancing technological innovation with ethical vigilance is paramount to ensuring the sustainable development of human-AI relationships.

Key words

human-AI emotional bonding / para-social interaction / digital attachment / AI companions / human-AI attachment model

Cite this article

Wu Yan, Geng Xiaowei, Zhou Xiaolin. From Para-social Interaction to Attachment: The Evolution of Human-AI Emotional Relationships[J]. Journal of Psychological Science, 2025, 48(4): 948-961. https://doi.org/10.16719/j.cnki.1671-6981.20250415

Funding

*This research was supported by the Science and Technology Innovation 2030 Project (2021ZD0200500).
