Meta explains its AI with "system cards," starting with the Instagram feed

Everyone wants Facebook and Instagram to explain “the algorithm”.

But their parent company, Meta, doesn't use just one algorithmic system, machine learning model, or technology to categorize, moderate, or tag Facebook and Instagram posts. A variety of AI and non-AI tools are in play, sometimes working alongside human moderators or decision makers. This means that some standard approaches to explaining AI don't necessarily work for Meta's ever-elusive algorithms.

That's why the company said today that it's taking a new approach to explaining its algorithmic systems to people. Meta's responsible AI team has developed "system cards" and a related prototype tool it claims "has the potential to provide insight into the underlying architecture of an AI system and help better explain how these systems operate." A pilot system card is meant to show how the technologies that determine Instagram feed ranking work, stating: "When you open or refresh the Instagram app, the feed system ranks the posts you haven't yet seen from the accounts you follow, based on how likely you are to be interested in each post."

System cards could apply to other algorithmic processes, such as those the company uses to translate languages, detect fashion items in images, flag harmful content, or recognize speech. The system card concept was developed as a hybrid of other, more commonly used approaches to explaining AI, including model cards, which detail how models were built and how they are intended to be used, and datasheets for datasets, a framework for presenting information about the data used to train machine learning models.

But don't expect Meta to give away the keys to its prized (but also maligned) algorithms. The company made a point in its system cards announcement of saying that the approach may not be able to shed light on how very complex systems work. It also said it might withhold information from system cards that could pose a security risk or expose systems in ways that would let users reverse engineer them.