ER 11: On Abortion, Digital Footprints, and Roombas
Plus, some concrete steps to protect yourself when fundamental rights are on the chopping block.
Welcome back to the Ethical Reckoner. It’s a long one today, but I hope the payoff will be worth it.
First, though, some personal news: I’ll be starting a PhD in Law, Science, and Technology at the University of Bologna this fall. I’m sad to be leaving my current company, which is filled with truly wonderful people, but thrilled for this new adventure. Please reach out if you happen to be in my new neck of the woods!
Data protection: it’s a good thing. We all know that we should use strong, unique passwords (bonus points for a password manager*), not tell anyone our mother’s maiden name, etc. This will ostensibly protect against our data being compromised by hackers. We also vaguely know that the social media platforms we use collect a lot of data on us. But in general, we don’t really care, because we don’t think that we’re much of a target, or we appreciate the convenience that having good content and ad recommendations provides, or because we just have other things to worry about. I’m here to tell you that you should dedicate at least a little bit of headspace to this, especially if you’re in America, because things are changing. Yes, it’s bad when your credit card information gets leaked in a data breach, and we should all be concerned about the massive data collection by platforms for highly targeted advertising and political messaging that infringes on our autonomy.** However, as our ability to exercise fundamental rights is increasingly thrown into question, it’s possible that more freedoms that we now take for granted will vanish and even be criminalized, and that our digital footprints will aid in the targeting of individuals and organizations. While I’ll be talking a lot about abortion in this piece and the aftermath of the Supreme Court overturning the constitutional right to abortion in America, these same arguments could apply to any right anywhere. I’ll be breaking down three avenues that we should be concerned about, and discuss some possible future threats. Buckle up.
Platforms
A 17-year-old girl in Nebraska, Celeste, and her mother, Jessica, have been charged with mishandling human remains, concealing a death, and false reporting after using abortion pills to end Celeste’s pregnancy past Nebraska’s 20-week ban. Jessica is also charged with performing an abortion without a medical license and performing an abortion after 20 weeks of pregnancy. Celeste is being charged as an adult. The case came about after investigators heard “concerns” that Celeste and her mother had buried a stillborn baby. When the detective asked Celeste, who was due in July, what date the pregnancy ended, she checked Facebook Messenger. Rather than assuming that she had texted a friend or relative about the stillbirth, the detective chose to subpoena Facebook in June—prior to the Dobbs v. Jackson Women’s Health Organization decision, but after the draft opinion leaked—for her and her mother’s messages. They found messages discussing obtaining and taking abortion pills and charged Jessica with the two abortion-related charges as well.
The only evidence the prosecutors had that an abortion had occurred was Celeste and Jessica’s Facebook messages, and they had to go searching for them. As part of the ongoing war against the right to choose, law enforcement has been investigating miscarriages, and the overturning of Roe v. Wade has only added fuel to the fire.
Facebook claimed that “The warrants did not mention abortion at all. Court documents indicate that police were at that time investigating the alleged illegal burning and burial of a stillborn infant.” The police obviously didn’t need the Facebook messages to determine that the remains had been improperly disposed of; Celeste and Jessica volunteered that they had buried the remains after the stillbirth. Facebook wouldn’t have known about the other evidence, but they could have read between the lines, because much of the time, a stillbirth investigation is actually about abortion. Even if they had determined it was about abortion, they probably still would have handed over the data (despite calls for Facebook to protect abortion-related data and their employees’ access to abortion). Defying a subpoena can result in a fine and/or being held in contempt of court, and that’s not a sustainable legal strategy since they are indisputably in possession of the data. Google announced that they would delete location data related to fertility centers and abortion clinics, which gives them an easy out if subpoenaed, but Facebook can’t selectively delete messages related to abortion. Users cannot rely on platforms standing up on their behalf. When platforms will not protect us, we have to protect ourselves. This means that users have to make their data inaccessible to law enforcement, which often also means making it inaccessible to platforms.
For messages, the best way to do this is by using end-to-end encryption, which makes it so that even platforms can’t view messages; only participants can. Shortly after the Nebraska charges were announced, Facebook announced that they would be expanding end-to-end encryption as a default option, but the full rollout isn’t scheduled till 2023. Platforms that support end-to-end encryption by default are Apple’s iMessage (though SMS messages aren’t encrypted, and Apple can access messages backed up to iCloud), Meta’s WhatsApp (although recipients can report messages and send unencrypted versions to Facebook,*** and Meta does turn over message metadata**** to law enforcement), and Signal (which helpfully also encrypts metadata).
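The property that end-to-end encryption guarantees can be illustrated with a toy sketch. Real apps use the Signal protocol (Double Ratchet key exchange), not this; the Python below is just a one-time pad showing the core idea: a platform that only ever relays ciphertext has nothing useful to hand over.

```python
import secrets

def xor(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so this one function both encrypts and decrypts.
    return bytes(k ^ d for k, d in zip(key, data))

message = b"appointment moved to 3pm"
# The key exists only on the two participants' devices, never on the server.
key = secrets.token_bytes(len(message))

ciphertext = xor(key, message)          # the only thing the platform sees or stores
assert ciphertext != message            # without the key, a subpoena yields noise
assert xor(key, ciphertext) == message  # the recipient recovers the plaintext
```

If the platform is subpoenaed, all it can produce is `ciphertext`—plus metadata like sender, recipient, and timestamps, which is why Signal’s metadata encryption matters too.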
For other data, like location and interaction, though, it’s harder. Platforms own the data they collect on users, and there’s no guarantee that any of it is encrypted. The whole point of them gathering this data is to use it to serve you ads and make behavioral predictions, so it’s likely not. In the first half of 2021, Google got 50,907 requests for user information from government agencies in the US—a nearly 30% increase over the same period in 2020—and provided some data in 82% of cases. Facebook got 63,657 requests in that same period and provided data for 89% of them. Some of these could be “geofence” warrants, which ask for data about anyone in a certain area at a certain time, or “keyword” warrants, which request information about any user searching for a specific term. Looking back at the Nebraska case, suppose Celeste had used an encrypted messaging app, but visited a clinic, found she couldn’t get an abortion, then searched online for abortion pills. Under Google’s new sensitive-location policy, law enforcement wouldn’t be able to request information on everyone who visited an abortion clinic, but Celeste could have been caught up in a keyword warrant for anyone who searched for “abortion pill” or “misoprostol.” Or, the police could have requested her specific search history after they began investigating her stillbirth, just as they did with her Facebook messages (a woman was indicted in 2018 after a subpoena revealed she had searched for and purchased misoprostol). Even if Celeste had used an “incognito” mode on her browser, her ISP would still have access to the websites visited. If Celeste had visited one of the thousands of crisis pregnancy center websites that send marketing data to Facebook (in violation of Facebook’s own policy), even if not logged into Facebook, the platform could feasibly have linked that search to her profile and disclosed it to law enforcement. 
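To make the geofence idea concrete, here’s a minimal sketch (names, coordinates, and times all invented) of the query shape such a warrant compels a data holder to run: every user with a location ping inside a bounding box during a time window, innocent bystanders included.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Ping:
    user_id: str
    lat: float
    lon: float
    ts: datetime

def geofence_sweep(pings, lat_lo, lat_hi, lon_lo, lon_hi, start, end):
    # Return every user ID with at least one ping inside the box during
    # the window -- the warrant doesn't know or care why they were there.
    return {p.user_id for p in pings
            if lat_lo <= p.lat <= lat_hi
            and lon_lo <= p.lon <= lon_hi
            and start <= p.ts <= end}

pings = [
    Ping("patient",   41.25, -96.01, datetime(2022, 6, 1, 14, 5)),
    Ping("passerby",  41.25, -96.01, datetime(2022, 6, 1, 14, 20)),
    Ping("elsewhere", 40.00, -95.00, datetime(2022, 6, 1, 14, 10)),
]
hits = geofence_sweep(pings, 41.24, 41.26, -96.02, -96.00,
                      datetime(2022, 6, 1, 14, 0), datetime(2022, 6, 1, 15, 0))
print(sorted(hits))  # → ['passerby', 'patient']
```

A keyword warrant has the same dragnet shape, just filtering on search terms instead of coordinates.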
If you asked Alexa for help, or if your smart speaker just thought it heard its wake word, those recordings are kept by the platform and could be accessed.
So, to keep your data safe from platforms, you not only need to use an encrypted messaging app, but also avoid any searches that could possibly be linked to any of your user accounts anywhere, and leave your devices at home when visiting sensitive locations. Even deleting your accounts isn’t a solution, as platforms may retain large quantities of data even after account deletion. And even more good news: It’s not just social media platforms that will obtain and share your data.
Third parties
After the Supreme Court overturned Roe v. Wade, Twitter filled with calls for people to delete their period tracker apps. Part of this is because of the possibility of wide-sweeping subpoenas, potentially for “any users who apparently became pregnant in a given time period,” which is a valid concern.*****
However, period trackers aren’t the only apps that threaten individual privacy. Location data is regularly collected by seemingly innocuous apps and passed to data brokers, who then sell it on to advertisers, retailers, and even hedge funds. According to a NYT investigation, popular apps like WeatherBug, The Weather Channel, DC Metro and Bus app, and theScore send exact latitude and longitude of the device to data brokers, and often do not clearly communicate their policies to users. Though the data is anonymized, the NYT was able to match it with public records to identify individuals, tracking them to work, home, hospitals, gyms, hikes, and exes’ houses. Employees or clients with access to the raw data could also identify individuals. So, while Google won’t keep data on what health clinics you’ve visited, your local news app could be gathering and selling that data, and law enforcement could use a geofence warrant to obtain information from any company with that data. Furthermore, cell tower data can be subpoenaed to obtain location information. The only solutions are to identify and delete such apps, which is difficult and requires either digging into code or legalese, or to leave devices at home.
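The NYT-style re-identification is disturbingly simple to sketch. In a supposedly anonymous location trace, the most frequent overnight coordinate is a strong guess at “home,” which property records can then tie to a name. A toy version in Python (all data invented):

```python
from collections import Counter
from datetime import datetime

def likely_home(trace):
    """trace: list of (lat, lon, timestamp) from an 'anonymized' device ID.
    The most frequent rounded coordinate seen overnight is probably home."""
    overnight = [(round(lat, 3), round(lon, 3))
                 for lat, lon, ts in trace
                 if ts.hour >= 22 or ts.hour < 6]
    return Counter(overnight).most_common(1)[0][0]

trace = [
    (40.7128, -74.0060, datetime(2022, 8, 1, 23, 15)),  # overnight ping
    (40.7128, -74.0060, datetime(2022, 8, 2, 2, 40)),   # overnight ping
    (40.7580, -73.9855, datetime(2022, 8, 2, 10, 0)),   # daytime ping (work)
]
print(likely_home(trace))  # → (40.713, -74.006)
```

Swap “overnight” for business hours and the same three lines find your workplace; no name or account needed.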
Hackers
This is the most theoretical avenue of concern, and hopefully the one you need to worry about the least. However, there’s definitely a possibility that hackers will target people they suspect of violating the law for blackmail purposes, obtaining incriminating data and threatening to send it to employers or law enforcement, as happened to a women's health clinic in Australia in 2018 (although this targeted the organization, and the threatened consequences were stigmatization, not imprisonment). This is especially alarming considering the rise in vigilante laws that allow random individuals to sue someone for providing abortion care or aiding someone in obtaining an abortion, with fifteen states potentially preparing laws inspired by Texas’s SB8. Law enforcement has also been known to purchase access to hacked data through services dedicated to “criminal locating.” Many of the measures that prevent platforms and third parties from accessing your data will also serve to protect it from hackers.
Future Avenues to Worry About
Big Tech consolidation is also a threat; if Amazon’s planned acquisitions of One Medical and iRobot (maker of the Roomba******) go through, they’ll own your health data and your home layout, and they’ll have your palm print from Whole Foods. Meta has control over your Facebook, Instagram, and WhatsApp messages. Nest knows when you’re moving around; your garage opener knows when you leave your house. And this third-party smart home data is shared with Amazon. Furthermore, as they expand into VR, these companies will have access to extremely sensitive biometric data. While individual pieces of data may seem innocuous—so what if I drove away at 10:30pm on a Thursday? So what if there was a crib in one room and now there isn’t?—when combined, they can be incredible invasions of privacy.
In their article “A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI,” Sandra Wachter and Brent Mittelstadt argue that data collection organizations will be able to make more and more sophisticated inferences about people, which could be hugely invasive. The famous story about Target inferring that a teenage girl was pregnant before her father found out and sending her maternity ads is the perfect example. Platforms can guess things about you that you may not have wanted them to know—for instance, I never told TikTok my sexual orientation, but within half an hour of making an account I was getting served queer content left and right. The making and use of these inferences are not covered under existing data protection laws in the US (not that we have many to begin with) or EU. This could open the door to a dragnet-like system where investigators—or, even more terrifyingly, individual bounty hunters—use publicly available data, keyword or geofence warrants, or data brokers to infer whether an individual has recently obtained an abortion. Would this be admissible as probable cause for a more detailed search warrant of messages? What about a machine learning model that predicts if an individual has recently obtained an abortion? I don’t want to find out.*******
Though I’ve been talking about abortion a lot in this piece, ultimately, it’s not just about abortion. Rights and freedoms are under attack worldwide, and the overturning of Roe v. Wade is just the latest salvo. Today it’s abortion in America, but tomorrow it could be being Muslim in India, or talking about queerness in Hungary (or race in Florida). The more freedoms are yanked away, the more we have to worry about our digital footprints being used to persecute individuals for trying to exercise their rights, and now is the time to realize that there is a problem and that we can take steps to keep ourselves safe while fighting for the restoration of those rights. The lobster doesn’t realize the water is getting hotter until it’s boiling, but it can at least make sure that its smart thermostat isn’t contributing to the situation.
Below is a non-comprehensive list of recommendations for digital safety. I sincerely hope that everything I’ve written here is overblown and that no one reading this will ever be targeted for exercising what was once considered an inalienable right. But two years ago the country that champions “liberty and justice for all” wasn’t overturning the right to choose, or putting contraception and gay rights in the crosshairs. No one thinks they’ll be under suspicion until they are, and by then it’s too late. Things change quickly. We can be ready.
Recommendations:
Enable end-to-end encryption where possible, or move to a messaging app that is encrypted by default.
Use incognito/private mode on a browser to do sensitive searches, and don’t log in to any site while doing it. (Also consider using alternative search engines like DuckDuckGo, which has more privacy protections.) Use a VPN as well to ensure that your internet service provider can’t access your browsing data either. Or use the Tor browser, which encrypts traffic and bounces it to servers across the globe. Definitely don’t use the in-app browser in any social media app (but especially TikTok).
Do delete your period tracker.
Don’t bring any devices to sensitive locations. Especially don’t bring your Tile, AirTag, or other tracking device.
Unplug—don’t just turn off—your smart home devices. Don’t get a Ring or other smart home camera.********
Pay for health supplies in cash. Definitely don’t use your palm print at Whole Foods.
Assume that law enforcement will be able to access your health records, regardless of whether Amazon has them from One Medical.
If you have an iPhone, enable “Ask App Not to Track,” but don’t rely on it.
Turn on two-factor authentication wherever possible, and use strong passwords. Ideally get a password manager, too.
* Confession: I don’t. I know, I know.
** This is the idea of surveillance capitalism. The term was coined by Shoshana Zuboff and refers to the idea that companies mine behavioral data to predict behavior for commercial gain, but in the process modify our behavior to make us easier to predict.
*** This has been used to target individuals and groups for extra scrutiny and bans.
**** This is information like who you’re messaging, how many messages you exchanged, when they were sent, etc.
***** I want to note that while extant laws criminalizing abortion mostly target abortion providers and those who help someone obtain an abortion, there is nothing preventing them from expanding statutes to cover patients (like Oklahoma, Nevada, and South Carolina already have). Furthermore, fetal “personhood” laws have led to charges for pregnant people who confided to their doctor about having a drink or taking half a Valium, and there have been countless investigations of miscarriages that attempt to pin them on the pregnant person’s actions and indict them under dubious charges.
****** Several articles have cautioned that Amazon’s acquisition of iRobot, maker of the Roomba, is concerning from a privacy perspective, but haven’t really articulated why. The surveillance capitalism aspects of Amazon being able to use the contents of your house to sell you stuff are concerning, but my other worry is about the inferences that the company can make about you when they know not only what you buy but how you live—what things you own, what kind of people are in your house, and how things change. The “crib-no crib” example is an obvious one, but it could also potentially analyze where you go in your house and make inferences off of that. Sleeping more? Going to the bathroom more? Your Roomba might know.
******* There’s also the important question of who will be targeted. No law is universally enforced, and abortion-related cases in particular may be difficult to try for reasons of evidence and PR. Will bounty hunters target people without access to good legal counsel? Will hackers exploit those without the resources to use VPNs and other safety measures? What new dangers will befall those in more conservative areas?
******** For the sake of sanity if not safety, don’t watch the new Ring Nation, which is essentially a ploy to normalize surveillance devices.
Thumbnail generated by DALL-E 2 with the prompt “the digital footprint of a Roomba, abstract painting”.
Thanks for reading.
Check out the archive of past Ethical Reckoner issues here. If you haven’t already, click the big “subscribe” button to get the Ethical Reckoner, and help the ER grow by spamming all your friends with the “share” button.
Any suggestions or comments? Let me know on Twitter @EmmieHine.