http://dx.doi.org/10.14352/jkaie.2022.26.4.273

Learning Method of Data Bias employing MachineLearningforKids: Case of AI Baseball Umpire  

Kim, Hyo-eun (Dept. of Humanities, Hanbat National University)
Publication Information
Journal of The Korean Association of Information Education, v.26, no.4, 2022, pp. 273-284
Abstract
The goal of this paper is to propose using machine learning platforms in education to train learners to recognize data bias. Learners can cultivate the ability to recognize bias when they deal with AI data and systems, helping them prevent the damage that data bias can cause. Specifically, this paper presents a method of data-bias education using MachineLearningforKids, focusing on the case of an AI baseball umpire. Learners proceed through the steps of selecting a specific topic, reviewing prior research, inputting biased/unbiased data into a machine learning platform, composing test data, comparing the machine learning results, and presenting implications. Learners come to understand that bias in AI data should be minimized and that data collection and selection have an impact on society. This learning method is significant in that it promotes problem-based self-directed learning, can be combined with coding education, and connects humanities and social topics with artificial intelligence literacy.
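The comparison step described above can also be simulated outside the platform. The following minimal Python sketch is an illustration of the paper's idea, not code from the paper: the one-dimensional pitch model, the labeling rules, and all numbers are invented assumptions. It trains the same simple classifier on an unbiased and a biased set of umpire calls and shows how the learned strike-zone boundary shifts with the training labels:

```python
# Hypothetical sketch of the paper's biased-vs-unbiased comparison.
# A pitch is reduced to one coordinate x in [0, 1]; the true zone
# edge is 0.5 (x < 0.5 is a strike). All values are illustrative.

def train_threshold(samples):
    """Learn a 1-D decision threshold as the midpoint of the class means."""
    strikes = [x for x, label in samples if label == "strike"]
    balls = [x for x, label in samples if label == "ball"]
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(strikes) + mean(balls)) / 2

pitches = [i / 100 for i in range(100)]

# Unbiased training labels follow the true zone edge.
unbiased = [(x, "strike" if x < 0.5 else "ball") for x in pitches]

# Biased training labels: borderline strikes (0.4 <= x < 0.5) are
# called balls, e.g. an umpire squeezing the zone against one team.
biased = [(x, "strike" if x < 0.4 else "ball") for x in pitches]

t_unbiased = train_threshold(unbiased)
t_biased = train_threshold(biased)

print(f"unbiased threshold: {t_unbiased:.3f}")  # near the true edge 0.5
print(f"biased threshold:   {t_biased:.3f}")    # shifted below 0.5
```

Learners can vary the biased labeling rule and the test pitches to observe, as in the paper's comparison step, how data collection and selection change the model's judgments.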
Keywords
MachineLearningforKids; artificial intelligence; data bias; machine learning; baseball umpire; self-directed learning
Citations & Related Records
Times Cited By KSCI: 2
References
1 "U.S. Open without referee, AI referee fills in", The Chosun Ilbo, Korea, 2020.09.01. https://www.chosun.com/sports/2020/09/01/J2KCIOURXRBWJARQB7DRHHZXPQ
2 Joint ministries (2021). Reliable artificial intelligence realization strategy, May 13. Artificial Intelligence-based Policy Division, Ministry of Science and ICT. https://www.korea.kr/common/download.do?fileId=195009613&tblKey=GMN
3 "KBO to pilot 'robot umpire' for ball-strike calls in the second division this year", Yonhap News, Korea, 2021.06.29. https://www.yna.co.kr/view/AKR20210629159600007
4 Teachable Machine, https://teachablemachine.withgoogle.com
5 DeVries, T., Misra, I., Wang, C., & van der Maaten, L. (2019). Does object recognition work for everyone? Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, 52-59.
6 Skeem, J. L., & Lowenkamp, C. T. (2016). Risk, race, and recidivism: Predictive bias and disparate impact. Criminology, 54(4), 680-712.
7 Kim, Hyo-eun (2021). Fairness criteria and mitigation of AI bias. Korean Journal of Psychology: General, 40(4), 459-485.
8 Barocas, S., Hardt, M., & Narayanan, A. (2017). Fairness in machine learning. NIPS tutorial, 1. https://arxiv.org/ct?url=https%3A%2F%2Fdx.doi.org%2F10.1007%2F978-3-030-43883-8_7&v=31e44ca0hn
9 "International Gymnastics Federation's '3D technology and AI referee' World Cup competition", Yonhap News, Korea, 2018.11.21. https://www.yna.co.kr/view/AKR20181121057600007
10 Parsons, C. A., Sulaeman, J., Yates, M. C., & Hamermesh, D. S. (2011). Strike three: Discrimination, incentives, and evaluation. American Economic Review, 101(4), 1410-1435.
11 Machine Learning for Kids, https://machinelearningforkids.co.uk
12 Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK. https://dl.acm.org/doi/10.5555/2029079
13 Prates, M. O., Avelar, P. H., & Lamb, L. C. (2020). Assessing gender bias in machine translation: A case study with Google Translate. Neural Computing and Applications, 32(10), 6363-6381.
14 Tan, S., Caruana, R., Hooker, G., & Lou, Y. (2017). Detecting bias in black-box models using transparent model distillation. arXiv preprint arXiv:1710.06169.
15 Bigman, Y. E., Yam, K. C., Marciano, D., Reynolds, S. J., & Gray, K. (2021). Threat of racial and economic inequality increases preference for algorithm decision-making. Computers in Human Behavior, 122, 106859.