Proj CDeepFuzz Paper Reading: Astraea: Grammar-based fairness testing

Abstract

Background:

  1. Discriminatory inputs (e.g., arising from societal bias) cause software errors, so fairness testing is needed: generating discriminatory inputs that reveal and explain biases.

This paper: ASTRAEA
Github: https://github.com/sakshiudeshi/Astraea
Task: leverage context-free grammars to generate discriminatory inputs that reveal fairness violations in software systems
Method:

  1. uses probabilistic context-free grammars to generate test inputs
  2. provides fault diagnosis by isolating the cause of observed software bias
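
A minimal sketch of the core idea (not ASTRAEA's actual implementation): sample sentences from a probabilistic context-free grammar, then produce pairs that differ only in a sensitive attribute and flag a fairness violation when the software under test treats the pair differently. The grammar, production weights, and sensitive values below are made-up placeholders.

```python
import random

# Hypothetical toy PCFG: each nonterminal maps to weighted production rules.
PCFG = {
    "<sentence>": [(["<name>", " is a ", "<occupation>", "."], 1.0)],
    "<occupation>": [(["doctor"], 0.5), (["nurse"], 0.5)],
}
# Hypothetical sensitive attribute whose value should not change the output.
SENSITIVE = {"<name>": ["Emily", "Jamal"]}

def expand(symbol, binding):
    """Recursively expand a symbol; sensitive symbols are fixed by `binding`."""
    if symbol in SENSITIVE:
        return binding[symbol]
    if symbol not in PCFG:
        return symbol  # terminal string
    rules = [r for r, _ in PCFG[symbol]]
    weights = [w for _, w in PCFG[symbol]]
    rule = random.choices(rules, weights=weights)[0]
    return "".join(expand(s, binding) for s in rule)

def generate_pair():
    """Generate two sentences identical except for the sensitive attribute."""
    state = random.getstate()
    pair = []
    for value in SENSITIVE["<name>"]:
        random.setstate(state)  # replay the same grammar choices for both
        pair.append(expand("<sentence>", {"<name>": value}))
    return pair

def is_fairness_violation(model, pair):
    """Metamorphic oracle: outputs should match on inputs that differ
    only in the sensitive attribute."""
    return model(pair[0]) != model(pair[1])
```

Replaying the same random state for both bindings keeps every non-sensitive grammar choice identical, so any output difference is attributable to the sensitive attribute alone.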

Experiments:
Dataset: 18 software systems that provide three major natural language processing (NLP) services: coreference resolution, masked language modeling (MLM), and sentiment analysis
Results:

  1. ASTRAEA generated fairness violations at a rate of about 18%
  2. ASTRAEA generated over 573K discriminatory test cases and found over 102K fairness violations
  3. Re-training on ASTRAEA's tests improves software fairness by about 76% on average
posted @ 2023-08-29 15:48 雪溯