Meta recently launched a new AI tool called FACET to identify race and gender bias in computer vision systems, addressing the systematic bias that many current computer vision models show against women and people of colour.
The FACET benchmark is built from 32,000 images containing 50,000 people, annotated with attributes such as perceived gender presentation and skin tone, and can be used to evaluate computer vision models across a wide range of features.
FACET can be used to answer complex questions about model behaviour, such as whether a model recognizes skateboarders equally well across perceived gender and across light and dark skin tones.
Meta used FACET to evaluate its own DINOv2 and SEERv2 models, as well as OpenCLIP, an open-source implementation of OpenAI’s CLIP. Overall, OpenCLIP outperformed the other models on gender-related attributes, while DINOv2 was a better judge of age and skin colour.
The open-sourcing of FACET will let researchers run similar benchmarks to understand bias in their own models, and to monitor the impact of mitigations adopted to address fairness concerns.
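To illustrate the kind of benchmarking such a dataset enables, here is a minimal sketch of a per-group disparity check: compute a model's recall separately for each annotated demographic group and report the largest gap. This is a hypothetical example for illustration only; the function names and the toy data are assumptions, not FACET's actual API or results.

```python
from collections import defaultdict

def per_group_recall(samples):
    """samples: iterable of (group, predicted, actual) tuples for one class.

    Recall is computed per group over the actual-positive examples only.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in samples:
        if actual:  # only ground-truth positives count toward recall
            totals[group] += 1
            hits[group] += int(predicted)
    return {g: hits[g] / totals[g] for g in totals}

def recall_gap(recalls):
    """Largest difference in recall between any two groups."""
    vals = list(recalls.values())
    return max(vals) - min(vals)

# Toy, invented data: did the model detect a "skateboarder" (predicted)
# where one was truly present (actual), grouped by annotated skin tone?
samples = [
    ("light", True, True),
    ("light", False, True),
    ("dark", True, True),
    ("dark", True, True),
]
recalls = per_group_recall(samples)  # {"light": 0.5, "dark": 1.0}
gap = recall_gap(recalls)            # 0.5
```

Tracking a single gap number like this over time is one simple way to monitor whether a fairness mitigation is actually narrowing the disparity.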