If you correctly report a piece of fake news, your score goes up, and your reports are given more weight in the future. If you get it wrong or you’re trolling, your score goes down. But at the same time, they’ve implied that this isn’t the only thing that contributes to your score, so even if this seems reasonable, there may be more to the story.

The User Trustworthiness Score

Each user score lies between zero and one, and though it’s attached to many Facebook profiles, the score may be limited to users who have reported fake news. Though Facebook has not released the full criteria used to assign the number to people, the piece that we do know about depends completely on these reports. If you’re wondering where exactly the “report” button is, it lives in the menu you reach by clicking the three dots in the top-right corner of a post.

Near the bottom of the list you’ll see the option to “Give feedback on this post,” which allows you to tell Facebook about whatever you think is wrong with the post.

Selecting the “false news” option and clicking “Send” will report the post to Facebook, and you’ll be presented with a set of options, like unfollowing the person who posted the fake news, messaging them about it, or even unfriending them.

After the report is sent, it may be reviewed by an algorithm and/or a human fact-checker (this part of the process isn’t 100% clear either). Either way, if Facebook agrees that what you reported was fake news, you’ll earn yourself a bump up in your score. If not, your score goes down.
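Facebook has not published its actual formula, but the mechanics described above — a score bounded between zero and one that ticks up for confirmed reports and down for rejected ones — can be illustrated with a toy sketch. Everything here (the function name, the step size, the clamping) is a hypothetical assumption for illustration, not Facebook’s real implementation:

```python
# Hypothetical illustration only: Facebook has not disclosed its scoring
# formula. This sketch assumes a simple fixed-step adjustment, clamped to
# the stated [0, 1] range.

def update_score(score: float, report_confirmed: bool,
                 step: float = 0.05) -> float:
    """Nudge a reporter's trustworthiness score up for a confirmed
    report, down for a rejected one, keeping it within [0, 1]."""
    delta = step if report_confirmed else -step
    return min(1.0, max(0.0, score + delta))

# A user starts at a middling score and files three reports:
score = 0.5
for confirmed in (True, True, False):
    score = update_score(score, confirmed)
print(round(score, 2))  # two confirmed reports, one rejected
```

The clamping matters: under a scheme like this, a user already at the top of the range gains nothing from further confirmed reports, and one at the bottom can’t sink any lower.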

It’s not a centralized user score

Given Facebook’s penchant for collecting massive amounts of data on its users, it would be fair to assume that they’re using all that information as part of your reliability score. It would probably be quite effective to use data from profiles to help combat fake news, but they claim that they’re not. According to their statement, it’s just a few pieces of data, mainly from reports, which aren’t personally identifiable.

How does my score affect me?

Unless you like earning invisible Internet points and feeling like you’re making more of a difference in the battle against fake news on Facebook, your score doesn’t really affect you as a user. It seems that no matter how many fake reports you make, your account will remain open without repercussions. Conversely, no matter how many correct reports you make, you’ll probably never receive so much as a thank-you note.

How this affects you as a private denizen of the Internet may be a little different. If the system were purely opt-in — affecting only users who choose to report fake news — it would seem fairly benign, but it’s more likely that there are score factors you have less control over. The fact that Facebook has this score hints that it could easily be doing a lot more with lots of other data, from targeted ads to more advanced profiling. A Facebook account is about as close as many people come to a centralized online identity, and having it dissected, regardless of intent, isn’t a comforting thought.

Why can’t I see it?

Every time someone gets a score, it’s human nature to want to know what it is, but Facebook is keeping our reliability scores under wraps. From a security standpoint, it makes sense, since if the criteria were publicly available, it would make gaming the system a lot easier for bad actors. There is also likely a public relations motivation, since the decisions made by their algorithms and fact-checkers would likely not be controversy-free, and users might respond negatively to receiving low scores, whether they deserved them or not.

Transparency vs security

Some people believe that Facebook’s scoring doesn’t go far enough — if anything, Facebook should be doing more to combat fake news, enlisting its users in the effort to an even greater degree. Many others, though, are still feeling the reverberations of Cambridge Analytica and are rightfully sensitive about how their data is being used. The lack of transparency about the score as a whole is also not comforting, though Facebook does claim that it’s not drawing on your other data. Either way, more visible efforts at transparency concerning user data would be a nice move on Facebook’s part.