Meta believes that the intersection of technology and civil rights is an emerging space that requires more attention from the industry, especially given the rise in digital discrimination and bias. As questions continue to be asked about technology’s potential effects on its users, Meta has called for more research.
“I want to explain more about our approach to understanding how people from marginalized communities experience Meta technologies. Some people have said that their opportunities are limited or that they’re having a different experience than others, but we don’t have the data to fully understand what may be happening and why. We can’t address what we can’t measure, so establishing a more accurate measurement framework is vital to create more inclusive products, policies and operations across the company,” says Roy L. Austin, Jr., VP and Deputy General Counsel, Civil Rights.
Meta cannot do this work alone, so it is consulting with the civil rights community, privacy experts, academics, regulators and other organizations about the best way to measure these differences in people’s experiences. After exploring the options for measurement, these experts have reaffirmed that any work Meta does should take into account privacy, security and transparency.
To kick things off, Meta plans to introduce a framework for studying its platforms and identifying opportunities to increase fairness when it comes to race in the United States. Meta says, “We plan to augment that approach with two methodologies that will produce more accurate insights. We will do this in a way that allows important measurement while honoring people’s privacy:
- Bayesian Improved Surname Geocoding: For this methodology, we’re exploring a widely used method to estimate and aggregate racial distribution based on ZIP code and surname, informed by publicly available US Census statistics. With the privacy enhancements we’ve added, this method will help us analyze the data using aggregate results.
- Off-platform surveys using secure multiparty computation: This methodology is still under development and will use innovative, privacy-enhancing technologies to help us learn potential differences in people’s experiences in the US. In collaboration with partner institutions, we will conduct an opt-in off-platform survey and run measurements while employing privacy-preserving methods.”
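To make the first methodology concrete, Bayesian Improved Surname Geocoding combines a surname-based race distribution with a geography-based one via Bayes’ rule. The sketch below is a simplified illustration, not Meta’s implementation: every probability table holds made-up placeholder numbers, whereas a real system would load the public US Census surname list and ZIP-code demographics.

```python
# Illustrative sketch of Bayesian Improved Surname Geocoding (BISG).
# All probability tables are made-up placeholders, NOT real Census
# statistics; a real implementation would use the public US Census
# surname file and ZIP-code tabulations instead.

RACES = ["white", "black", "hispanic", "asian", "other"]

# P(race | surname) -- placeholder values for two example surnames.
P_RACE_GIVEN_SURNAME = {
    "garcia":  [0.05, 0.01, 0.90, 0.02, 0.02],
    "johnson": [0.60, 0.33, 0.03, 0.01, 0.03],
}

# P(race | ZIP code) -- placeholder values for two example ZIP codes.
P_RACE_GIVEN_ZIP = {
    "10001": [0.50, 0.15, 0.20, 0.12, 0.03],
    "33125": [0.15, 0.05, 0.75, 0.02, 0.03],
}

# P(race) -- placeholder national marginals, needed so the geography
# signal is not double-counted when combining the two distributions.
P_RACE = [0.60, 0.13, 0.18, 0.06, 0.03]

def bisg(surname: str, zip_code: str) -> dict:
    """P(race | surname, zip) is proportional to
    P(race | surname) * P(race | zip) / P(race), then normalized."""
    s = P_RACE_GIVEN_SURNAME[surname.lower()]
    g = P_RACE_GIVEN_ZIP[zip_code]
    unnorm = [si * gi / ri for si, gi, ri in zip(s, g, P_RACE)]
    total = sum(unnorm)
    return {race: u / total for race, u in zip(RACES, unnorm)}

# Estimate the distribution for one (surname, ZIP) pair; in practice
# only aggregates over many such estimates would ever be analyzed.
probs = bisg("Garcia", "33125")
```

Note that BISG yields a probability distribution per record, not an individual label; the aggregation Meta describes would sum these distributions so that analysis happens only on group-level estimates.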
Learn more about these methodologies in Meta’s technical paper.
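As a rough illustration of the second methodology, additive secret sharing is one common building block of secure multiparty computation: each respondent splits an answer into random shares held by non-colluding servers, so no single server sees any individual response, yet the servers’ partial sums combine into the true total. This is a minimal sketch under those assumptions, not Meta’s actual protocol.

```python
# Minimal sketch of additive secret sharing for a private survey tally.
# Assumes non-colluding servers; this is NOT Meta's actual protocol.
import secrets

MOD = 2**61 - 1  # large prime modulus for share arithmetic

def share(value: int, n_parties: int) -> list:
    """Split `value` into n random shares that sum to value mod MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# Each respondent's answer (e.g., 1 = reported a worse experience).
responses = [1, 0, 1, 1, 0]
n_servers = 3
server_inputs = [[] for _ in range(n_servers)]
for r in responses:
    for server, sh in zip(server_inputs, share(r, n_servers)):
        server.append(sh)  # each server sees only its random shares

# Each server sums its shares locally; individual answers stay hidden.
partial_sums = [sum(s) % MOD for s in server_inputs]

# Only the combined total is revealed: it equals sum(responses).
aggregate = sum(partial_sums) % MOD
```

Because each server holds only uniformly random shares, any single server’s view is statistically independent of the individual answers; only the final combined sum carries information.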
While this initial work focuses on race in the US, it will help Meta lay the groundwork for addressing concerns from other marginalized communities around the world. As civil rights expert Laura Murphy stated in the audit, Meta has a “responsibility to ensure that the algorithms and machine learning models that can have important impacts on billions of people do not have unfair or adverse consequences.” The journey won’t be easy, but Meta says it remains committed to this work and will continue to be transparent about its efforts.