The Effect of Involvement and Severity on Acceptance of Artificial Intelligence Judgment

  • Doh, Eun Yeong (Department of Industrial Psychology, KwangWoon University) ;
  • Lee, Guk-Hee (Division of General Studies, Kyonggi University) ;
  • Jung, Ji Eun (Department of Forensic Psychology, Kyonggi University)
  • Received : 2021.10.12
  • Accepted : 2021.10.12
  • Published : 2021.12.31

Abstract

With the development of artificial intelligence (AI), the jobs of many human experts are under threat, and the legal profession is no exception. This study investigated whether AI can actually replace humans in the legal profession, particularly judges who render final verdicts. From the perspective of uniqueness neglect, we examined the effects of involvement and severity on acceptance of judgments made by an AI judge (Experiment 1) and an AI jury (Experiment 2). Involvement was manipulated by whether the person sentenced for a crime was a family member (mother, father) or a stranger, and severity was manipulated by the extent of the damage, the perceived seriousness of the crime, and the number of charges. Experiment 1 revealed an interaction between involvement and severity. Specifically, when involvement was low, acceptance of the AI judge was higher for high-severity (vs. low-severity) cases; conversely, when involvement was high, acceptance was higher for low-severity (vs. high-severity) cases. Experiment 2 replicated this interaction: when involvement was low, participants allocated more AI jurors for high-severity (vs. low-severity) cases, whereas when involvement was high, they allocated more AI jurors for low-severity (vs. high-severity) cases. This study is significant as the first experimental study in Korea on AI legal judgment and for the outlook it offers on the jobs of legal professionals.

