Protecting people’s information is the most important thing we do at Facebook. What happened with Cambridge Analytica was a breach of Facebook’s trust. More importantly, it was a breach of the trust people place in Facebook to protect their data when they share it.
As Mark Zuckerberg explained in his post, we are announcing some important steps for the future of our platform. These steps involve taking action on potential past abuse and putting stronger protections in place to prevent future abuse.
People use Facebook to connect with friends and others using all kinds of apps. Facebook’s platform helped make apps social — so your calendar could show your friends’ birthdays, for instance. To do this, we allowed people to log into apps and share who their friends were and some information about them.
As people used the Facebook platform in new ways, we strengthened the rules. We required developers to get people’s permission before accessing the data needed to run their apps – for instance, a photo-sharing app has to get your specific permission to access your photos. Over the years we’ve introduced more guardrails, including in 2014, when we began reviewing apps that request certain data before they could launch and introduced more granular controls for people to decide what information to share with apps. These actions would prevent any app like Aleksandr Kogan’s from being able to access so much data today.
Even with these changes, we’ve seen abuse of our platform and the misuse of people’s data, and we know we need to do more. We have a responsibility to everyone who uses Facebook to make sure their privacy is protected. That’s why we’re making changes to prevent abuse. We’re going to set a higher standard for how developers build on Facebook, what people should expect from them, and, most importantly, from us. We will:
Review our platform. We will investigate all apps that had access to large amounts of information before we changed our platform in 2014 to reduce data access, and we will conduct a full audit of any app with suspicious activity. If we find developers that misused personally identifiable information, we will ban them from our platform.
Tell people about data misuse. We will tell people affected by apps that have misused their data. This includes building a way for people to know if their data might have been accessed via “thisisyourdigitallife.” Moving forward, if we remove an app for misusing data, we will tell everyone who used it.
Turn off access for unused apps. If someone hasn’t used an app within the last three months, we will turn off the app’s access to their information.
Restrict Facebook Login data. We are changing Facebook Login so that, in the next version, the data an app can request without app review will be limited to name, profile photo and email address. Requesting any other data will require our approval; a rough sketch of what this could look like for developers follows this list.
Encourage people to manage the apps they use. We already show people which apps their accounts are connected to and let them control what data they’ve permitted those apps to use. Going forward, we’re going to make these choices more prominent and easier to manage.
Reward people who find vulnerabilities. In the coming weeks we will expand Facebook’s bug bounty program so that people can also report to us if they find misuses of data by app developers.
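To make the Facebook Login change above concrete, here is a minimal sketch, in Python, of the kind of Graph API request an unreviewed app would be limited to under the new policy. The endpoint, field names and helper function are illustrative assumptions based on the publicly documented Graph API, not a description of Facebook’s internal implementation.

    import requests

    # Assumption for illustration: after the Login change, an app that has not
    # passed app review can only request basic profile fields such as these,
    # corresponding to "name, profile photo and email address" above.
    BASIC_FIELDS = "name,picture,email"

    def fetch_basic_profile(user_access_token: str) -> dict:
        """Read only the basic profile fields available without app review."""
        resp = requests.get(
            "https://graph.facebook.com/me",
            params={"fields": BASIC_FIELDS, "access_token": user_access_token},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    # Any richer permission (for example user_photos or user_friends) would now
    # require Facebook's approval before the app could request it at login.

Under this sketch, a developer who needs anything beyond the basic fields would have to submit the app for review before those additional permissions could even be requested.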
There’s more work to do, and we’ll be sharing details in the coming weeks about additional steps we’re taking to put people more in control of their data. Some of these updates were already in the works, and some are related to new data protection laws coming into effect in the EU. This week’s events have accelerated our efforts, and these changes will be the first of many we plan to roll out to protect people’s information and make our platform safer.
Source: Facebook