An Approach of AI as a Practice of Escapism


Waralak Vongdoiwang Siricharoen

Abstract

This paper examines the phenomenon of AI-mediated escapism by clarifying its scope, risks, and implications for design. The primary objective is to conceptualize how artificial intelligence reshapes traditional forms of escapism through personalization, immersive interaction, and adaptive feedback. Using a conceptual synthesis approach, this work integrates insights from human-computer interaction, psychology, and media studies. The analysis highlights three key findings: 1) AI intensifies escapism by enhancing interactivity, empathy simulation, and agency; 2) the spectrum of AI-mediated escapism ranges from therapeutic and time-bound uses to problematic patterns associated with misinformation, avoidance, and excessive immersion; and 3) ethical design principles, such as transparency, time-awareness, and emotional boundary safeguards, are essential to mitigate risks. The implications extend to HCI, UX design, and policy, suggesting that responsible frameworks can balance the benefits of AI-driven escapism with its potential harms. Limitations include the conceptual and narrative scope of this review, which does not provide empirical validation. Future research should apply empirical methods to test and refine the proposed framework. Overall, this study provides a structured understanding of AI-mediated escapism and offers design guidelines for creating safer, more ethical interactive systems.

Article Details

How to Cite
Siricharoen, W. V. (2026). An Approach of AI as a Practice of Escapism. INTERNATIONAL SCIENTIFIC JOURNAL OF ENGINEERING AND TECHNOLOGY (ISJET), 10(1), 42–49. Retrieved from https://ph02.tci-thaijo.org/index.php/isjet/article/view/259349
Section
Academic Article
