How AI affects what you see on Facebook and Instagram | Meta

Billions of people use Facebook and Instagram every day to share life’s ups and downs, to connect with people who share their interests, and to discover content they enjoy. To make everyone’s experience on our apps unique and personalized to them, we use artificial intelligence systems to decide what content they see, informed by the choices they make.

I wrote earlier about the relationship between you and the algorithms Meta uses to shape what you see on Facebook and Instagram, and challenged the myth that algorithms leave people powerless over the content they see. In that piece, I wrote that we needed to be more frank about how this relationship works and give you more control over what you see.

Today, we’re building on that commitment by being more transparent about many of the AI systems that incorporate your feedback to rank content across Facebook and Instagram. These systems make it more likely that the posts you see are relevant and interesting to you. We’re also clarifying how you can better control what you see in our apps, testing new controls, and making others more accessible. And we’re providing more detailed information to experts so they can better understand and analyze our systems.

This is part of a wider ethos of openness, transparency and accountability. With the rapid advances taking place with powerful technologies like generative AI, it’s understandable that people are excited about the possibilities and concerned about the risks. We believe the best way to address these concerns is with openness. In general, we believe that as these technologies are developed, companies should be more open about how their systems work and collaborate openly across industry, government and civil society to help ensure they are developed responsibly. This starts by giving you more information and control over the content you see.

How AI predictions influence recommendations

Our AI systems predict how valuable a piece of content might be to you, so we can show it to you sooner. For example, sharing a post is often an indicator that you found it interesting, so predicting that you’ll share a post is one factor our systems consider. As you can imagine, no single prediction is a perfect measure of a post’s value to you, so we use a variety of predictions in combination to get as close to the right content as possible, including some based on your behavior and some based on user feedback gathered through surveys.
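To make that combination step concrete, here is a minimal sketch of how several predictions could be blended into a single relevance score. The prediction names, weights, and simple weighted sum are illustrative assumptions, not Meta’s actual ranking model.

```python
# A minimal sketch of combining several per-post predictions into one
# relevance score. The prediction names, weights, and weighted sum below are
# illustrative assumptions, not Meta's actual ranking model.

def relevance_score(predictions: dict[str, float], weights: dict[str, float]) -> float:
    """Blend per-post predictions (each a probability in [0, 1]) into one score."""
    return sum(weights.get(name, 0.0) * p for name, p in predictions.items())

# Hypothetical predicted probabilities for one post and one viewer.
post_predictions = {
    "p_share": 0.08,              # will the viewer share this post?
    "p_comment": 0.12,            # will the viewer comment on it?
    "p_survey_worthwhile": 0.35,  # survey-based "worth your time" estimate
}
illustrative_weights = {"p_share": 2.0, "p_comment": 1.5, "p_survey_worthwhile": 3.0}

print(relevance_score(post_predictions, illustrative_weights))  # 1.39 with these numbers
```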

We want to be more open about how this works. One model of transparency that Meta has been developing and championing for some time is the publication of system cards, which give people insight into how our systems work in a way that is accessible to those without deep technical knowledge. Today we are releasing 22 system cards for Facebook and Instagram. They provide insight into how our AI systems rank content, some of the predictions each system makes to determine which content may be most relevant to you, as well as the controls you can use to personalize your experience. They cover Feed, Stories, Reels, and other surfaces where people go to find content from the accounts or people they follow. The system cards also cover AI systems that recommend unconnected content from people, groups, or accounts you don’t follow. You can find a more detailed explanation of the AI behind our content recommendations here.

To provide an extra level of detail beyond what’s published in the system cards, we’ve shared the types of inputs, known as signals, as well as the predictive models those signals inform, that help determine what content you’ll find most relevant from your network on Facebook. The signal categories released represent the vast majority of signals currently used in Facebook Feed ranking for this content. You can find these signals and predictions in the Transparency Center, along with how often they tend to be used in the overall ranking process.
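As a rough illustration of how signals can inform a predictive model, the sketch below turns a handful of hypothetical signal values into a single probability with a logistic function. The signal names, coefficients, and model form are assumptions made for illustration; the actual signals and predictions are the ones documented in the Transparency Center.

```python
# A rough sketch of signals informing a predictive model. The signal names,
# coefficients, and logistic form are assumptions for illustration only.

import math

def predict_from_signals(signals: dict[str, float],
                         coefficients: dict[str, float],
                         bias: float = -2.0) -> float:
    """Map raw signal values to a probability with a simple logistic model."""
    z = bias + sum(coefficients.get(name, 0.0) * value for name, value in signals.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical signals about a post from someone in your network.
example_signals = {
    "recent_interactions_with_poster": 4.0,  # how often you engaged with them lately
    "post_age_hours": 6.0,                   # how long ago the post was made
    "poster_is_close_connection": 1.0,       # illustrative binary signal
}
example_coefficients = {
    "recent_interactions_with_poster": 0.3,
    "post_age_hours": -0.05,
    "poster_is_close_connection": 1.2,
}

print(predict_from_signals(example_signals, example_coefficients))
```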

We also use signals to help identify harmful content, which we remove as soon as we become aware of it, as well as to help reduce the distribution of other types of problematic or low-quality content in line with our Content Distribution Guidelines. We include some examples of the signals we use to do this. But there’s a limit to what we can safely reveal. While we want to be transparent about how we try to keep harmful content off people’s feeds, we also need to be careful not to disclose signals that could make it easier for people to bypass our defenses.
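The reduced distribution described above can be pictured as a score adjustment: when a classifier is confident that content is low quality or borderline, its ranking score is scaled down rather than the content being removed outright. The threshold and demotion factor below are invented for illustration; as noted above, the real signals are intentionally not fully disclosed.

```python
# A hedged sketch of reducing distribution for likely low-quality content by
# scaling its ranking score down. The threshold and demotion factor are
# invented for illustration; the real signals are not fully disclosed.

def demoted_score(relevance: float,
                  low_quality_probability: float,
                  threshold: float = 0.8,
                  demotion_factor: float = 0.5) -> float:
    """Scale a post's ranking score down when it is likely low-quality."""
    if low_quality_probability >= threshold:
        return relevance * demotion_factor
    return relevance

print(demoted_score(relevance=1.4, low_quality_probability=0.9))  # demoted to 0.7
print(demoted_score(relevance=1.4, low_quality_probability=0.2))  # unchanged, 1.4
```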

Of course, not everyone will find this information just because we post it on our website. That’s why we make it possible to see, right in our apps, details about why our systems predicted that a piece of content would be relevant to you, and the types of activities and inputs that might have led to that prediction. We are expanding our Why Am I Seeing This? feature to Instagram Reels and the Explore tab, as well as Facebook Reels, in the coming weeks, having previously launched it for some Feed content and all ads on both Facebook and Instagram. You’ll be able to click on an individual reel to see more information about how your previous activity might have influenced the machine learning models that shape and deliver the reels you see.
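One way to picture this kind of explanation is as an attribution step: rank the activities that contributed most to a prediction and surface the top few. The activity names and contribution scores below are hypothetical; the real feature generates explanations from your actual activity and Meta’s own models.

```python
# A simplified, hypothetical sketch of "Why am I seeing this?"-style output:
# surface the activities that contributed most to a recommendation.

def top_reasons(contributions: dict[str, float], k: int = 3) -> list[str]:
    """Return up to k activities with the largest positive contribution."""
    ranked = sorted(contributions.items(), key=lambda item: item[1], reverse=True)
    return [name for name, score in ranked[:k] if score > 0]

# Hypothetical contribution scores for one recommended reel.
activity_contributions = {
    "you watched similar reels recently": 0.42,
    "you follow accounts that post similar content": 0.31,
    "people with similar interests engaged with this reel": 0.18,
    "this reel is popular in your region": -0.02,
}

print(top_reasons(activity_contributions))
```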

Expanding tools to customize your experience

Using the tools available, you can shape your experience on our apps so that you see more of the content you want and less of the content you don’t. To make this easier, we’ve created centralized places on Facebook and Instagram where you can adjust the controls that affect the content you see on each app. You can visit Feed Preferences on Facebook and the Suggested Content controls on Instagram via the three-dot menu on relevant posts, as well as through Settings.

On Instagram, we’re testing a new feature that lets you indicate that you’re interested in a recommended reel in the Reels tab, so we can show you more of what you like. The Not Interested feature has been available since 2021. You can learn more about how to influence what you see on Instagram here.

To help you personalize your experience and the content you see, we also offer the Show More, Show Less feature on Facebook, which is available on all posts in Feed, Video and Reels via the three-dot menu. We’re working on ways to make this feature more prominent. And if you don’t want an algorithmically ranked feed, or just want to see what your feed would look like without one, you can use the Feeds tab on Facebook or select Following on Instagram to switch to a chronological feed. You can also add people to your favorites list on both Facebook and Instagram so you always see content from your favorite accounts.

Providing better tools for researchers

We also believe that an open approach to research and innovation, especially when it comes to transformative AI technologies, is better than leaving know-how in the hands of a small number of large tech companies. That’s why we’ve released over 1,000 AI models, libraries, and datasets for researchers over the past decade, so they can take advantage of our computing power and pursue research in an open and secure way. It is our ambition to continue being transparent as we make more AI models openly available in the future.

In the coming weeks, we’ll begin rolling out a new suite of tools for researchers: the Meta Content Library and API. The Library includes data from public posts, Pages, groups, and events on Facebook. For Instagram, it will include public posts and data from creator and business accounts. Library data can be searched, browsed, and filtered in a graphical user interface or through a programmatic API. Researchers at qualified research and academic institutions pursuing scientific or public-interest research will be able to request access to these tools through partners with deep expertise in secure data sharing for research, starting with the University of Michigan’s Inter-university Consortium for Political and Social Research (ICPSR). These tools will provide the most comprehensive access to publicly available content on Facebook and Instagram of any research tool we’ve built to date, and will also help us meet new data-sharing and transparency compliance obligations.
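To show roughly what programmatic access to such a library might look like, here is a hypothetical sketch of a search call. The endpoint URL, parameters, and authentication scheme are placeholders invented for illustration, not the real Meta Content Library API, which is documented for approved researchers through Meta and its data-sharing partners.

```python
# A hypothetical sketch of querying a content library over HTTP. The endpoint,
# parameters, and auth scheme are placeholders, NOT the real Meta Content
# Library API, which is documented for approved researchers only.

import requests

def search_public_posts(query: str, access_token: str) -> dict:
    """Illustrative request shape for searching public posts (placeholder API)."""
    response = requests.get(
        "https://example.invalid/content-library/search",  # placeholder URL
        params={"q": query, "platform": "facebook"},
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```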

We hope that by introducing these products to researchers early in the development process, we can receive constructive feedback to ensure we build the best possible tools to meet their needs.

