They know where you live

How we can better protect users by considering the impact of our technical and product decisions on their privacy—and safety.
Part of Issue 7, October 2018 · Security

I know women who cannot vote.

They want to vote, very much so. And they should be able to. They’re white, non-disabled, American-born citizens: They don’t have to worry about being disenfranchised by strict voter ID laws or inaccessible booths. Nothing should be stopping them from getting to the polls.

But in their states, all that’s required to find someone’s voter registration record is a name and a birthday. Such simple information, available within most social circles, is the only thing standing between abusive exes, stalkers, internet mobs, and these women’s front doors. So they don’t vote.

The Help America Vote Act of 2002 required, among other things, a “centralized, interactive computerized statewide voter registration list defined, maintained, and administered at the State level.” This database was intended to limit public abuse by requiring voters’ personal information for access. And in some cases, it does help. However, the personal information required was left up to individual state governments, and it varies widely. Some states require a social security number, others just a birthday. Now this fractured system intended to improve voting integrity has, in fact, made it easier than ever to find out where someone lives, effectively silencing some voters as a result.

The combination of sensitive information and short-sighted design is a dangerous one, and it’s a pattern seen at nearly every tech giant in the world. Technology is increasingly used to collect and distribute location data, from home addresses to real-time GPS coordinates, with pinpoint accuracy. Typically marketed as a feature, such as a more personalized user experience, this trend puts people at risk of physical harm because the technologists behind it are unaware of, or indifferent to, that impact. And it’s only getting worse.

What harm can a person do with your address? There are the obvious answers: find you, stalk you, harass you, hurt you, kill you. Mail you a bomb. Send a SWAT team to your house and shoot you with an officer’s gun. Use any of these possibilities as a threat to blackmail and silence you.

Give someone more resources, however, and their opportunities grow. Check your address against real estate listings and they can guess your income. Compare it with census data and they can guess your race. Plot it on a district map and they can find your child’s school. Give them the internet, and they can do all of this in 30 minutes.

Now, give them access to your devices, your GPS data. They can figure out your routine. They can see where you shop, where you work, where your friends live. They can compare your route to transit maps and figure out whether you have a car. They can see how frequently you leave home, if at all, and narrow in on your physical and mental health. They can see when you’re not at home and rob you.

Our location is arguably our most vulnerable data, yet we give it away every day to sites, apps, devices, and the companies behind them. And these companies aren’t interested in changing that. In the Information Age, data is king. The more data a company has, the more attractive it is to potential partners, buyers, and investors. And while companies can claim good-faith reasons for gathering such data, intent doesn’t prevent the possibility of abuse.

Most social media apps provide intimate location tracking, with varying levels of control. Slack lets you change your time zone but not remove it entirely. Facebook and Twitter have location tracking disabled by default, but enabling it pinpoints your location and broadcasts it with every post. Facebook Messenger’s location data is so accurate that an app called Marauder’s Map allowed users to stalk people through it. Instagram, now owned by Facebook, also has location tracking disabled by default, but it actively encourages users to enable it to add cute location tags to photos. Even without location tracking activated, Instagram still stores geotags that were added manually.

And, of course, there are entire apps centered around location. Strava, for instance, lets users upload personal exercise information, including running routes. This is meant as a helpful aid for private use, but in 2017 the company released those routes as a consolidated public map, inadvertently exposing sensitive military bases. Google Maps, which boasts over one billion users, consolidates location tracking data into a decidedly creepy feature called Timeline that lets users see exactly where they were on a certain date. Google admitted in August 2018 that it continues to store tracking data even when tracking is disabled; actually disabling location tracking requires multiple unintuitive steps that can render Android phones essentially useless for other app use.

Then there are the more egregious cases, where the danger moves from ephemeral hacker spaces to right next door. Ford released an app that allows users to track the location of any qualifying vehicle, even when parked, as long as they have the vehicle identification number. Amazon used to allow people who purchased items from a user’s wish list to view that user’s address in full. It no longer provides full addresses, but it still shares the destination city and state with purchasers and exposes full addresses to customer service reps and third-party sellers. Uber’s infamous God View allegedly allowed employees to track the live locations of celebrities, politicians, and ex-girlfriends. And conferences increasingly use smart badges to share live attendee location information with conference organizers and booth staff. Some conferences also expose this information to other attendees. Any of these systems could be exploited by stalkers or domestic abusers to track and terrorize their victims.

None of this is new. Search “[app name of choice] location tracking” and dozens of articles from security professionals and journalists will crop up. Location tracking is expected. It’s a trope, a joke. Users have all but given in to the notion that the price for existing in modern society is their right to general privacy, which can also mean their right to safety. And the people in charge, the leaders and decision makers of tech, have largely decided against changing this. But where does that leave those of us who don’t want to merely exist, but thrive? Where does that leave society’s most vulnerable members?

“Is this you?” my coworker asks, showing me a photo. I see typed words on newsprint and think it’s a Help Wanted ad.

ALICE G. dig yr scene. the devops talk gets me hot. we work in the same building. but i’m too shy to say hello. happy v-day!

It’s an anonymous valentine in the Portland Mercury, a city newspaper with an estimated circulation of 175,000. And yes, it’s me. Of course it’s me.

I grab a copy of my own and scan through the love letters. Most are signed or make the sender obvious through the message. Mine doesn’t. I don’t know who wrote it. All I know is that someone knows my location and has decided to use this information. Their intention could be a brief flirtation, or they could decide to escalate. I don’t know if they’re a coworker being coy, an office worker who recognized me, or someone trying to hide how they actually got my location. I don’t know much of anything, except that I could be in danger.

I wasn’t a stranger to online safety issues when this valentine hit the stands. I was intimately familiar with the price of being a woman with opinions, having watched numerous internet contacts and real-life friends be threatened and effectively chased from their homes by online mobs. I knew to disable photo metadata in my apps and not broadcast where I was having dinner until I had already left. I used my office’s address for all community mail. I disabled GPS tracking on my phone.

Still, I couldn’t escape notice. I had a distinctive haircut, half shaved with a bright blue streak. Men in the tech industry regularly approached me in lobbies, restaurants, and on the street. People tweeted about spotting me around town. It was never a given that I could run errands without being recognized—which meant people could all too easily figure out my routine.

The valentine was less a sudden shock and more a crystallization of suspicions I had nursed for some time. Over the next three months, I executed plans I had already penciled in: I changed jobs, took a Twitter break, and shaved my head.

That period of disconnect—from my job, from my community, from myself—was a freedom I hadn’t known I needed. I met friends for brunch without worrying about being spotted. I walked into a meeting where I was expected, and still people thought I was a stranger. It was a time of recovery and peace.

But I couldn’t stay an island forever. Social media, for all its potential dangers, was still a source of knowledge, laughter, and kinship. I wanted to be a part of it and to share parts of myself with it.

So, bit by bit, I returned to the online world. My routine is mostly the same as before, although I’ve added some extra security steps. I scrub through data collection sites and request that any personal information about myself or my family be removed. The scraping refreshes frequently, so I’m never done. My friends agree to a strict social media policy when they visit me, and they don’t share or post certain information at certain times. I use volunteered addresses for community and work-related mail (if you think you have my address, you likely don’t). I encourage people to vote, but I don’t know if I can. I try to relax.

Am I being paranoid? Is all this worrying really necessary, or am I flattering myself? These are very fair questions to ask. It’s exhausting, and my approach is certainly toward the extreme end of the spectrum. It’s likely I’ll never be in any real danger. On the other hand, my follower count climbs every day, and I’ve been identified on at least one alt-right website. Meanwhile, hate groups and others with malicious intent are using technology to their advantage against ill-prepared systems every day. Doxxing, which can easily lead to harassment, stalking, and worse, is all too common, and it remains a complex legal gray area. Last year, a Wichita man was killed after a gamer dispute turned into a swatting incident; the police had no system for accurately tracing calls, and the department hadn’t been trained to recognize swatting.

Can we risk waiting for society to catch up? Do we get a choice?

Every day, headlines fall like hammers, pinning us down. Tech giants we rely on for photo swapping, daily news, and job hunts are selling our information. In the U.S., there are sprawling investigations into election hacking and reports of government agencies tracking and tearing families apart. A data breach at the consumer credit reporting agency Equifax leaked the personal information of nearly every adult in America, and the firm responded by pivoting to selling them identity protection. Technology is a hydra with heads burrowed into every aspect of our lives, so pervasive that the loss of our privacy can feel inevitable. But if we buy into that nihilism, if we’re afraid to be uncomfortable, the temptation to look away will win.

We all have a choice, and some of us have power. If you’re reading this, you likely work in the tech industry. You may be an engineer, a designer, a manager, or someone higher up the food chain who silences small talk when you enter a room. You have a voice in spaces many of your users do not. You can fight to protect them.

How do you fight for your users? First, remember that they are human. Remember that all of your technical problems are human problems, because humans are the reason technology exists. We tend to think we know our users, but really, they could be anyone—so aim to serve the most vulnerable members of society. What product changes would better include people of color? What information might a trans person not want you to have? How can you help keep a sex worker safe? What features would improve the experience of a person with a disability? What protections can you offer survivors of domestic abuse? Seek out people who are different from you on social media and listen to them—many people are already talking about their experiences, and they will tell you what they need from you and the technology you build. Hire people from underserved and underrepresented communities. The more diverse the perspectives you have at the table, the less likely you’ll be to ignore the needs—or endanger the lives—of your most vulnerable users.

Designing secure systems is a noble effort, but security is never absolute. Partnerships happen. Acquisitions happen. Leaks happen. Many circumstances are, at least in part, out of your control, and can place your users’ data in the hands of people you never intended to use it. The only truly secure data is data you do not have.

So push back. Ask why you need to collect that data from those users. Ask why you need to store it. Say no if you don’t agree with the answer. Evaluate your company’s data retention policies and who can access what. Hash things. Separate data so it’s difficult to piece it back together without your designated systems. It’s likely you aren’t the only person at your company with these concerns, but you might be the first to bring them up. If you are a white, cisgender man, leverage your position to help others—like it or not, your objections may hold more sway, and expressing them may be less risky for you than it would be for your less privileged coworkers.
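
To make that concrete, here is a minimal sketch in Python of what “hash things” can look like in practice. It assumes you keep a secret key in a separate system (a vault or key management service) and store only a keyed hash, a pseudonym, in your records, so a leaked table on its own reveals nothing. The names and fields here are illustrative, not a prescription.

```python
import hashlib
import hmac
import os

# The key lives in a secrets store, never alongside the hashed records.
PSEUDONYM_KEY = os.environ["PSEUDONYM_KEY"].encode()

def pseudonymize(value: str) -> str:
    """Return a keyed hash of a sensitive value, such as an email address.

    A plain unsalted hash of an email can be reversed by brute force;
    an HMAC can't be guessed or verified without the secret key.
    """
    return hmac.new(PSEUDONYM_KEY, value.strip().lower().encode(),
                    hashlib.sha256).hexdigest()

# Store the pseudonym instead of the address itself. Your systems can
# still join records on it, but the raw data simply isn't there to leak.
record = {"user": pseudonymize("reader@example.com"), "city": "Portland"}
```

The design choice matters more than the specific library: by splitting the key from the records, you need both systems to recover an identity, and the most dangerous data never sits in one place.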

The more tech industry workers speak up, the more others will join them. In recent months, employees at Google and Microsoft have made waves across the industry with petitions and appeals to media that shed light on unethical company policies. And they are winning.

You are not only a tech worker, you are a member of society, and there are countless people around you who may not own an iPad but who are hurt by technology’s presence every day. Teach your friends and family how to disable location tracking on their phones. See if your local library holds classes on basic internet security and volunteer to teach them. Donate to the political campaigns of people who care about privacy and voter protections. At least some people in power are listening, which is why Europe now has the General Data Protection Regulation (GDPR). And some people in power still need pressure to listen, which is why so many companies have stopped service in Europe rather than comply with the law.

Much like climate change, we can’t reverse the damage that has already been done, but we can stop—or at least slow—its progress. We can use our power, our access, and our knowledge to give people back their privacy. We can put the door back on its hinges and change the locks.

I don’t have a lock (yet), but I do have a thin door, and my world is smaller and more peaceful for it. Strange tech men at restaurants have been replaced with books and good chai. I even have plants, enough that I had to shop around for apps to help me care for them. The top choice was free and had a pleasing green color scheme, helpful guides, and a dialogue box requiring my location for climate-specific care. I deleted it and moved on.

If you are experiencing online abuse or harassment, organizations including techsafety.org, iheartmob.org, and onlinesos.org have resources and toolkits that can help.

About the author

Alice Goldfuss enjoys building some systems and dismantling others.

@alicegoldfuss
