References
- 과학기술정보통신부. (2021). 사람이 중심이 되는 인공지능을 위한 신뢰할 수 있는 인공지능 실현 전략[안]. Retrieved March 12, 2022, from https://www.korea.kr/common/download.do?fileId=195009613&tblKey=GMN
- 권영준, 남재현, 조민정. (2011). 개인신용평가에서의 비금융정보의 경제적 효과. 한국경제연구, 29(2), 81-107.
- 금융위원회. (2020). '21.1.1일부터는 신용점수로 자신의 신용을 확인하세요. Retrieved March 12, 2022, from http://www.fsc.go.kr:8300/v/p42S1u6Twh2
- 금융위원회. (2021). 금융분야 AI 가이드라인 및 주요 검토 필요사항. Retrieved March 5, 2022, from http://www.fsc.go.kr:8300/v/pq8TQUQFZSY
- 금융위원회. (2021). 코로나 이후 시대의 디지털 대전환을 선도하기 위해 금융분야 인공지능(AI)을 활성화하겠습니다. Retrieved April 6, 2022, from http://www.fsc.go.kr:8300/v/pbVFpTRt0h5
- 김지웅, 허준, 김장일. (2013). 빅데이터의 금융기관 활용 사례. The Magazine of the IEIE, 40(8), 49-54.
- 소순주, 안성진. (2021). 인공지능 윤리원칙 분류 모형 및 구성요소에 관한 연구. 컴퓨터교육학회 논문지, 24(6), 119-132.
- 안소영. (2021, September 16). 대안신용평가 시대온다... 전통 금융사들 빅데이터 센터 세워 대응해야. ChosunBiz. https://biz.chosun.com/stock/finance/2021/09/16/7MJQZ34FLFD7RHSFY7D3TR342A/
- 양희태, 최병삼, 이제영, 장훈, 백서인, 김단비. (2018). 인공지능 기술 전망과 혁신정책 방향 - 국가 인공지능 R&D 정책 개선방안을 중심으로-. Retrieved March 2, 2022, from https://www.nkis.re.kr:4445/researchReport_view.do?otpId=OTP_0000000000002198#none
- 엄하늘, 김재성, 최상옥. (2020). 머신러닝 기반 기업부도위험 예측모델 검증 및 정책적 제언: 스태킹 앙상블 모델을 통한 개선을 중심으로. 지능정보연구, 26(2), 105-129. https://doi.org/10.13088/JIIS.2020.26.2.105
- 이정선, 서보밀, 권영옥. (2021). 인공지능이 의사결정에 미치는 영향에 관한 연구: 인간과 인공지능의 협업 및 의사결정자의 성격 특성을 중심으로. 지능정보연구, 27(3), 231-252. https://doi.org/10.13088/JIIS.2021.27.3.231
- 이창효. (2000). 집단의사결정론. 서울: 세종출판사.
- 장용석, 김형준, 문정욱, 문아람, 김정언, 이시직, 양기문, 황선영, 변순용, 선지원, 이청호, 김봉제. (2020). 윤리적 인공지능을 위한 국가정책 수립. 정책연구, 2020(7), 1-235.
- 전종헌. (2020, June 4). SNS활동 신용평가해 대출했더니 '상환율 95%'...비금융정보 주목. 매일경제. https://www.mk.co.kr/news/economy/view/2020/06/573304/
- 최성민. (2020). 개인 신용평가모형과 설명력 이슈. 2020 한국경영정보학회 추계학술대회, 한국과학기술회관, 서울.
- 황용석, 정재선, 황현정, 김형준. (2021). 알고리즘 추천 시스템의 공정성 확보를 위한 시론적 연구. 방송통신연구, 169-206.
- KDB 미래전략연구소. (2021). '금융분야 AI 가이드라인' 및 금융권의 대응. Retrieved February 26, 2022, from https://eiec.kdi.re.kr/policy/domesticView.do?ac=0000159143
- Baeza-Yates, R. (2018). Bias on the web. Communications of the ACM, 61(6), 54-61. https://doi.org/10.1145/3209581
- Bagdasaryan, E., Poursaeed, O., & Shmatikov, V. (2019). Differential privacy has disparate impact on model accuracy. 33rd Conference on Neural Information Processing Systems, Vancouver, Canada.
- Bank for International Settlements (BIS). (2019). Big tech in finance: opportunities and risks. Retrieved December 26, 2021, from https://www.bis.org/publ/arpdf/ar2019e3.pdf
- Bolukbasi, T., Chang, K. W., Zou, J. Y., Saligrama, V., & Kalai, A. T. (2016). Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. 30th Conference on Neural Information Processing Systems, Barcelona, Spain.
- Bruckner, M. A. (2018). Regulating fintech lending. Banking & Financial Services Policy Report, 37(6).
- Buolamwini, J., & Gebru, T. (2018, February). Gender shades: Intersectional accuracy disparities in commercial gender classification. ACM Conference on Fairness, Accountability and Transparency, New York, USA.
- Cornacchia, G., Narducci, F., & Ragone, A. (2021, September). A general model for fair and explainable recommendation in the loan domain. Joint Proceedings KaRS 2021 and ComplexRec 2021, Amsterdam, Netherlands.
- European Commission. (2018). Communication: Artificial Intelligence for Europe. Retrieved January 13, 2022, from https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52018DC0237&from=EN
- European Commission. (2019). Ethics guidelines for trustworthy AI, Report. Retrieved January 13, 2022, from https://www.aepd.es/sites/default/files/2019-12/ai-ethics-guidelines.pdf
- Financial Stability Board (FSB). (2017). Artificial Intelligence and Machine Learning in Financial Services: Market Developments and Financial Stability Implications. Retrieved March 23, 2022, from https://www.fsb.org/wp-content/uploads/P011117.pdf
- Fuster, A., Goldsmith-Pinkham, P., Ramadorai, T., & Walther, A. (2022). Predictably unequal? The effects of machine learning on credit markets. The Journal of Finance, 77(1), 5-47. https://doi.org/10.1111/jofi.13090
- Hardt, M., Price, E., & Srebro, N. (2016, December). Equality of opportunity in supervised learning. 30th Conference on Neural Information Processing Systems, Barcelona, Spain.
- International Committee on Credit Reporting (ICCR). (2018). Guidance Note: Use of Alternative Data to Enhance Credit Reporting to Enable Access to Digital Financial Services by Individuals and SMEs Operating in the Informal Economy. Retrieved January 24, 2022, from https://www.gpfi.org/sites/gpfi/files/documents/Use_of_Alternative_Data_to_Enhance_Credit_Reporting_to_Enable_Access_to_Digital_Financial_Services_ICCR.pdf
- König-Kersting, C., Pollmann, M., Potters, J., & Trautmann, S. T. (2021). Good decision vs. good results: Outcome bias in the evaluation of financial agents. Theory and Decision, 90(1), 31-61. https://doi.org/10.1007/s11238-020-09773-1
- Kozodoi, N., Jacob, J., & Lessmann, S. (2022). Fairness in credit scoring: Assessment, implementation and profit implications. European Journal of Operational Research, 297(3), 1083-1094. https://doi.org/10.1016/j.ejor.2021.06.023
- Lepri, B., Oliver, N., Letouzé, E., Pentland, A., & Vinck, P. (2018). Fair, transparent, and accountable algorithmic decision-making processes. Philosophy & Technology, 31(4), 611-627. https://doi.org/10.1007/s13347-017-0279-x
- Li, J., & Chignell, M. (2022). FMEA-AI: AI fairness impact assessment using failure mode and effects analysis. AI and Ethics, 1-14.
- McCalman, L., Steinberg, D., Abuhamad, G., Brunet, M. E., Williamson, R. C., & Zemel, R. (2022). Assessing AI Fairness in Finance. Computer, 55(1), 94-97.
- Mehrabi, N., Morstatter, F., Saxena, N., Lerman, K., & Galstyan, A. (2021). A survey on bias and fairness in machine learning. ACM Computing Surveys (CSUR), 54(6), 1-35.
- Microsoft. (2020). Fairlearn: A toolkit for assessing and improving fairness in AI. Retrieved April 2, 2022, from https://www.microsoft.com/en-us/research/uploads/prod/2020/05/Fairlearn_WhitePaper-2020-09-22.pdf
- Monetary Authority of Singapore (MAS). (2020). FEAT Fairness Principles Assessment Case Studies. Retrieved April 25, 2022, from https://www.mas.gov.sg/-/media/MAS/News/Media-Releases/2021/Veritas-Document-2-FEAT-Fairness-Principles-Assessment-Case-Studies.pdf
- Ntoutsi, E., Fafalios, P., Gadiraju, U., Iosifidis, V., Nejdl, W., Vidal, M. E., Ruggieri, S., Turini, F., Papadopoulos, S., Krasanakis, E., Kompatsiaris, I., Kinder-Kurlanda, K., Wagner, C., Karimi, F., Fernandez, M., Alani, H., Berendt, B., Kruegel, T., Heinze, C., Broelemann, K., Kasneci, G., Tiropanis, T., & Staab, S. (2020). Bias in data-driven artificial intelligence systems-An introductory survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3).
- Petrasic, K., Saul, B., Greig, J., & Bornfreund, M. (2017). Algorithms and bias: What lenders need to know. Retrieved April 25, 2022, from https://www.lexology.com/library/detail.aspx?g=c806d996-45c5-4c87-9d8a-a5cce3f8b5ff
- Political & Economic Research Council (PERC). (2006). Give credit where credit is due: Increasing access to affordable mainstream credit using alternative data. Retrieved January 3, 2022, from https://www.brookings.edu/wp-content/uploads/2016/06/20061218_givecredit.pdf
- Political & Economic Research Council (PERC). (2009). New to Credit from Alternative Data. Retrieved May 3, 2022, from https://www.perc.net/wp-content/uploads/2013/09/New_to_Credit_from_Alternative_Data_0.pdf
- Saaty, T. L. (1990). How to make a decision: The analytic hierarchy process. European Journal of Operational Research, 48(1), 9-26. https://doi.org/10.1016/0377-2217(90)90057-I
- Saaty, T. L. (1994). Highlights and critical points in the theory and application of the analytic hierarchy process. European Journal of Operational Research, 74(3), 426-447. https://doi.org/10.1016/0377-2217(94)90222-4
- Suresh, H., & Guttag, J. (2021). A framework for understanding sources of harm throughout the machine learning life cycle. In Equity and Access in Algorithms, Mechanisms, and Optimization, 1-9.
- Vigdor, N. (2019, November 10). Apple Card Investigated After Gender Discrimination Complaints. The New York Times. https://www.nytimes.com/2019/11/10/business/Apple-credit-card-investigation.html
- World Bank Group. (2018). Financial Consumer Protection and New Forms of Data Processing Beyond Credit Reporting. Retrieved April 23, 2022, from https://openknowledge.worldbank.org/bitstream/handle/10986/31009/132035-WP-FCP-New-Forms-of-Data-Processing.pdf?sequence=1&isAllowed=y
- World Bank Group. (2019). Credit Scoring Approaches Guidelines. Retrieved December 14, 2021, from https://thedocs.worldbank.org/en/doc/935891585869698451-0130022020/original/CREDITSCORINGAPPROACHESGUIDELINESFINALWEB.pdf
- World Bank and CGAP. (2018). Data Protection and Privacy for Alternative Data. Retrieved May 17, 2022, from https://www.gpfi.org/sites/gpfi/files/documents/Data_Protection_and_Privacy_for_Alternative_Data_WBG.pdf
- ISEAS-Yusof Ishak Institute. (2021). The Prospects and Dangers of Algorithmic Credit Scoring in Vietnam: Regulating a Legal Blindspot. Retrieved January 12, 2022, from https://think-asia.org/bitstream/handle/11540/13169/ISEASEWP2021-1Lainez.pdf?sequence=1