Debt Slavery And Remote Kill Switches: You Can’t Escape Data Surveillance In America

    by SARAH JEONG

    In America, surveillance has always played an outsized role in the relationship between creditors and debtors. In the 19th century, credit bureaus pioneered mass-surveillance techniques. Today the American debtor faces remote kill switches in their devices, GPS tracking on their leased cars, and surreptitious webcam recordings from their rent-to-own laptops. And where our buying and borrowing habits were once tracked by shopkeepers, our computers score our creditworthiness without us knowing.

    The most egregious privacy violations have been punished either by the Federal Trade Commission, or answered with massive class-action lawsuits. But surveillance, tracking, and data collection continue to proliferate. The law has not yet met the challenge of protecting consumers. The capabilities of today’s technology might be unprecedented, but the quandary is an old one. The way our financial data gets collected and used today is reminiscent of the state of affairs that led to the Fair Credit Reporting Act of 1970.

    The American experience of mass surveillance started with commercial credit bureaus in the 1800s, which employed spies across the country to maintain dossiers of the personal habits of “men of trade”—where they went to church, who they slept with, whether they drank or gambled. Consumer credit reporting agencies followed, after the Civil War, expanding these practices beyond men of trade, to anyone who wanted to buy anything. The information collected by these bureaus was used by retailers, public utilities, landlords, and even law enforcement.

    Because the American credit reporting system relies on both good and bad reports of creditworthiness, a consumer must have some kind of credit—not just the absence of bad credit. (In some countries, the lack of a credit report can establish good credit). “The American system, on the other hand, relies on total surveillance,” writes Chris Jay Hoofnagle in his primer on privacy law and the Federal Trade Commission.

    From the end of the Civil War to the mid-20th century, the breadth and detail of information collected by the reporting agencies only increased. Control over the access to that information, however, did not seem to keep up. “People do not realize, for example, that their own credit files are accessible to virtually anyone who understands the workings of credit bureaus and has a few dollars to spend on a report,” said one study in 1969. And those credit reports contained personal information ranging from the deeply prejudicial to the utterly inane.

    The reports were compiled using information from retailers, from the public record (court records, newspaper clippings), and from interviews with friends and neighbors. In 1972, a Senate aide testified before a committee about the type of information that was collected by the automobile insurance industry: “If they, in any way, have some deviant behavior characteristics, they wear pink shirts, or have long hair and a mustache, they read Karl Marx … They can look in your library and see what books you read, what magazines you subscribe to…”

    It was no wonder, then, that Congress enacted the Fair Credit Reporting Act in 1970, to be enforced by the Federal Trade Commission and by private litigants through a civil cause of action. According to Hoofnagle, FCRA is “America’s first federal consumer information privacy law and one of the first information privacy laws in the world.”

    Today FCRA is still at the forefront of fighting privacy violations—but the transition to the new frontier of digital privacy has not been a smooth one. In the 70s, FCRA looked like a white knight charging at the dragon of credit reporting. Today, it sometimes more closely resembles Don Quixote tilting at windmills.

    FCRA, after all, assumes that the problem is inaccurate information. (And in all fairness, many Americans have low credit due to false information on their reports). It also imposes duties on credit reporting agencies to keep information from being carelessly disclosed. But people don’t object to spying on the grounds that the secret dossier about them might be full of errors. They object to spying because it’s spying.

    An FCRA case still pending at the Supreme Court illustrates this tension. In Spokeo v. Robins, Thomas Robins initiated a class-action lawsuit against “people search engine” Spokeo, because it gave out information “indicating that he has more education and professional experience than he actually has, that he is married (although in fact he is not), and that he is better situated financially than he really is.” There are probably a lot of people who think Spokeo is creepy and should be regulated. The number of people who think Spokeo is creepy and should be regulated because it made Thomas Robins look better off than he actually is, is likely vanishingly small.

    It’s not always clear whether data brokers like Spokeo are consumer reporting agencies for the purposes of FCRA. The FTC certainly thought so in 2012, pointing to how Spokeo marketed its information to “human resources professionals, job recruiters, and others as an employment screening tool.” The Spokeo website now displays a disclaimer that specifically says its information “cannot be used for FCRA purposes,” but a disclaimer by itself can’t ward away class action lawsuits. To the attorneys for Thomas Robins, Spokeo is a consumer reporting agency in everything but name.

    According to a Supreme Court brief written by Spokeo’s lawyers, the service “aggregates publicly available information regarding individuals from phone books, social networks, marketing surveys, real estate listings, business websites, and other sources into a database that is searchable via the Internet using an individual’s name, and displays the results of searches in an easy-to-read format.”

    Others see the service differently. “Meet the Stalkers” is the headline for one article about data brokers. “Did my rapist find me on Spokeo?” asks an anonymous poster on Jezebel.

    Data brokers in general pose a threat to victims of domestic violence and targets of stalking, enough that the National Network to End Domestic Violence offers a guide to data brokers. The problem isn’t new. In the early 2000s, the estate of Amy Lynn Boyer sued online investigation service Docusearch, after it provided information on Boyer’s whereabouts to her stalker. (The stalker shot Boyer and then shot himself).

    Whether or not specific data brokers today are covered by FCRA, they are a modern day echo of the bureaus that gave rise to FCRA in the first place. The FTC has successfully gone after data brokers under FCRA for millions of dollars in judgments. It’s a far cry from the “wrist-slap” penalties in many electronic privacy cases the agency takes on. In the Detective Mode case—where rent-to-own stores turned on webcams on laptops and took photos of the renters without their consent—the settlement essentially banned the stores from spying on their customers. In comparison, extracting a combined $1.6 million from Equifax and other companies for selling and buying lists of consumers who were late on mortgage payments looks like a big deal, until you realize Equifax alone reports $2.66 billion in revenue per year.

    No wonder the FTC doesn’t seem to have had much of a chilling effect on the collection or sale of data about people.

    Where the agency’s limitations stop the FTC from going any further, the private sector has stepped in, producing a rip-roaring industry of class-action lawsuits litigated by attorneys that Silicon Valley views as “leech[es] tarted up as freedom fighter[s].”

    The stated issue in Spokeo is whether a plaintiff can sue under laws like FCRA without showing actual harm. But the subtext is whether or not the Supreme Court will continue to tolerate these kinds of lawsuits. Liberal justices like Kagan and Sotomayor might say yes; justices like the late Antonin Scalia, who openly despised the class action plaintiffs’ bar, would say no.

    The effect of letting someone sue without showing harm is obvious: It makes it really easy to sue. If you see the world in terms of privacy-invading behemoths that need to be slapped down by buccaneering class action attorneys, then that’s a good thing. But if you see a pack of over-litigious lawyers squeezing settlements out of Silicon Valley as the real villains, then of course it’s a bad thing.

    Yet when it comes to privacy invasion, it’s difficult to show real life injury. LinkedIn leaks your password to the world. You change all your passwords, and so nothing happens. So what? Tracking links monitor your web activity. Someone at the other end collects that information, files it away, and never does anything with it. So what? Having to show harm, or not having to show harm, can make or break an entire genre of lawsuits.

    But is this type of litigation actually doing any good? It’s a controversial question, for which there is no easy answer. Squint at these lawsuits from one angle, and you’ll see a privatized regulatory system that keeps data monsters in check. Squint at them from another angle, and all you see is a flock of vultures in suits, picking away at deep-pocketed tech companies.

    We peer into a future where our software spies on us, our data defines us, and our hardware reinforces existing power imbalances. What’s slowing it down are a federal agency whose remedies often feel unsatisfactory, and a cohort of attorneys whose motivations are of course capitalistic. And while it’s tempting to simply call for more federal intervention, paternalistic impulses sometimes harm the most vulnerable among us.

    Regardless, the status quo is not enough. Before FCRA, the credit reporting industry was limited mostly by the technologies it had access to. The bureaus could compile dossiers and accumulate lifetimes of information, but they couldn’t run algorithms on the entirety of this data. And in 1969, the “systematic collection” of information about people from newspapers and court records was “expensive and time-consuming,” whereas today data about people can be scraped automatically from the internet.

    Combine this with the new forms of financial control exercised over subprime borrowers via the cars and devices they buy, and creditors have at their fingertips a web of technological control over debtors. If left unchecked, the future of financial surveillance looks dark. If FCRA trimmed back the worst excesses of the last century of financial surveillance, perhaps it’s time to look to something similar, before it’s too late.
