Facebook has supplied phone companies with customers' private data without their knowledge or consent, and even helped those companies use Facebook behavior to evaluate users' creditworthiness, documents reportedly show.
The social network supplied data on location, interests, and friend groupings to phone carriers and manufacturers without users' permission – data that went far beyond mere technical specs. Users' activity on Facebook, Instagram and even Messenger was fair game for data-mining, and the platform encouraged and even assisted over 100 global telecoms to use customers' data for purposes including evaluating their creditworthiness, according to documents seen by the Intercept, which suggest the program is still going on.
Facebook data scientists working on the "Actionable Insights" program developed an algorithm that determined creditworthiness from users' online behavior, allowing a client to exclude customers with poor credit histories from future promotions, according to the document, which presented the case as an example of what clients could achieve through the program. Replicated across the platform through a targeting mechanism called "lookalike audiences," which groups together users who share attributes, such an algorithm could allow Actionable Insights clients to negatively "profile" users, denying them services for failing to meet metrics they never knew existed, based on behavior they never knew was being surveilled.
Actionable Insights was announced in August, at about the same time Facebook's secretive and possibly illegal data-sharing partnerships with other tech companies were being exposed, and while Facebook was insisting such nonconsensual data-sharing was wholly in the company's past. Like the "trusted partnerships" program, Actionable Insights is ostensibly free, allowing Facebook CEO Mark Zuckerberg to continue to claim that Facebook doesn't "sell users' data." But access came with the understanding that companies would purchase Facebook ads, now expertly targeted thanks to the user data they could access.
The program serves up information on demographics, location, personal interests, and "friend homophily," meaning how similar a user is to their friend groups, in addition to functional data like WiFi use, cell networks and device information. According to the leaked document, the data is "aggregated and anonymized." But while a Facebook spokesperson told the Intercept that location data collection stopped at the zip code level, any phone with location services turned on pinpoints its owner's whereabouts quite precisely, and researchers have demonstrated that a record of a person's movements over the course of a month can reliably identify that person no matter how "anonymized" the data is.
Speaking to the Intercept, Ashkan Soltani, one of the Federal Trade Commission (FTC) employees who drew up the 2011 "consent decree" forcing Facebook to get permission from users before sharing their data, likens the secretive behavioral algorithms used to denote "good credit" or "bad credit" to redlining, the illegal practice of denying home loans to entire demographic groups. Given that studies have already shown Facebook's ad delivery algorithms to be racially biased, and that the company has paid out at least $5 million to settle multiple lawsuits over discrimination in employment, credit and housing ads, the addition of an unaccountable behavioral metric is ripe for abuse.