
An Empirical Study on Group Fairness Metrics of Judicial Data
Author(s) -
Yanjun Li,
Huan Huang,
Xinwei Guo,
Yuyu Yuan
Publication year - 2021
Publication title -
IEEE Access
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.587
H-Index - 127
ISSN - 2169-3536
DOI - 10.1109/access.2021.3122443
Subject(s) - aerospace , bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , engineered materials, dielectrics and plasmas , engineering profession , fields, waves and electromagnetics , general topics for engineers , geoscience , nuclear engineering , photonics and electrooptics , power, energy and industry applications , robotics and control systems , signal processing and analysis , transportation
Group fairness means that different groups have an equal probability of being assigned a given predicted outcome. It is an important fairness definition and is conducive to maintaining social harmony and stability. Fairness is a vital issue when an artificial intelligence software system is used to make judicial decisions. Either the data or the algorithm alone may lead to unfair results, and determining the fairness of the dataset is a prerequisite for studying the fairness of algorithms. This paper focuses on the dataset and studies group fairness from both micro and macro views. We propose a framework to determine the sensitive attributes of a dataset and metrics to measure the degree of fairness of those sensitive attributes. We conducted experiments and statistical analysis on judicial data to better demonstrate the framework and the metric approach. The framework and metrics can also be applied to datasets from other domains, providing persuasive evidence for the effectiveness and applicability of algorithmic fairness research, and opening up a new way to study the fairness of datasets.
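To illustrate the notion of group fairness described in the abstract (equal probability of a given predicted outcome across groups), the sketch below computes a generic demographic-parity-style gap on a labeled dataset. This is only a minimal, assumed example; it does not reproduce the framework or the metrics proposed in the paper, and the column names ("gender", "pred") and function names are hypothetical.

from collections import defaultdict

def positive_rate_by_group(records, sensitive_attr, predicted_label):
    # For each group defined by the sensitive attribute, compute the share
    # of records that received a positive prediction (label == 1).
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for row in records:
        group = row[sensitive_attr]
        counts[group][0] += int(row[predicted_label] == 1)
        counts[group][1] += 1
    return {g: pos / total for g, (pos, total) in counts.items()}

def demographic_parity_gap(records, sensitive_attr, predicted_label):
    # Maximum difference in positive-prediction rates across groups;
    # 0 means the groups are treated with exact parity on this metric.
    rates = positive_rate_by_group(records, sensitive_attr, predicted_label)
    return max(rates.values()) - min(rates.values())

# Toy usage with a hypothetical sensitive attribute and prediction column:
data = [
    {"gender": "F", "pred": 1}, {"gender": "F", "pred": 0},
    {"gender": "M", "pred": 1}, {"gender": "M", "pred": 1},
]
print(demographic_parity_gap(data, "gender", "pred"))  # prints 0.5

A gap close to 0 would indicate that the groups receive positive predictions at similar rates; the paper's own framework additionally addresses how to identify which attributes should be treated as sensitive before any such metric is applied.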