Loose lips sink ships: a phrase that, it turns out, applies to endpoint security, too. In late 2017, fitness tracker data shared by active duty military members effectively sank military intelligence security. Security researchers and journalists found that almost anyone could piece together sensitive location data on military bases and patrol routes via Strava, a “social network for athletes.” More than 1 billion activities from users who opted in to connect their fitness tracker or smartwatch data are displayed on a giant global heat map under anonymous screen names.
Whatever user data protection measures Strava put in place, they were not enough. According to Wired, Australian National University student Nathan Ruser broke the news on Twitter, noting that “Strava user activities potentially related to US military forward operating bases in Afghanistan, Turkish military patrols in Syria,” and other military-related activities could be viewed online.
“Strava released their global heatmap. 13 trillion GPS points from their users (turning off data sharing is an option). It looks very pretty, but not amazing for Op-Sec. US Bases are clearly identifiable and mappable.”
Nathan Ruser (@Nrg8000), January 27, 2018
Strava security concerns continue to spiral
In the days following Ruser’s discovery, academics and security researchers dug deeper into the global heat map data released by Strava and surfaced even more troubling findings. Researcher Alec Muffett, for example, linked Strava data to suspected CIA “black sites” in Djibouti.
Perhaps most concerning, researcher Paul Dietrich publicly claimed that the data trails Strava collects and displays could be used to piece together an individual’s identity. Dietrich also said that public data scraped from Strava could be used to track a French soldier from an overseas deployment all the way back home.
Your private data might not be so private
The claim that Strava’s data could expose your identity isn’t far-fetched, given the efficacy of today’s big data algorithms. MIT researchers were able to uniquely reidentify 90 percent of individuals in a database of 30 days’ worth of credit card records using just four randomly chosen data points per person.
Another study, from Columbia University, found that for most mobile users (which is to say, the majority of the US adult population), location metadata from just two apps is all that’s needed to figure out someone’s identity. Researchers have applied similar big data methods in downright worrisome ways, including one study that successfully reidentified supposedly anonymous health data from patient records.
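To see why so few points suffice, consider a toy sketch of the reidentification idea behind those studies. All of the data below is invented: 200 fake people, each observed at about 30 (place, hour) points. Revealing a handful of a target’s points usually narrows the candidate pool to one person.

```python
# Toy reidentification sketch with synthetic data (no real datasets involved).
import random

random.seed(7)

# Fake "metadata" table: 200 people, each seen at ~30 (place, hour) points.
people = {
    pid: {(random.randrange(50), random.randrange(24)) for _ in range(30)}
    for pid in range(200)
}

def candidates(points):
    """Return the set of people whose traces contain every observed point."""
    return {pid for pid, trace in people.items() if points <= trace}

# Pick a target and reveal four of their points, echoing the MIT study's setup.
target = 42
known = set(random.sample(sorted(people[target]), 4))
matches = candidates(known)
print(len(matches))  # four points typically narrow 200 people to one or a few
```

The target is always in the candidate set, and with 1,200 possible (place, hour) cells, the odds that anyone else shares all four revealed points are vanishingly small. That is the core intuition: sparse, high-dimensional traces are almost always unique.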
Strava’s response to the big scandal
There were, understandably, a lot of concerned researchers and citizens paying attention to the Strava news. Regardless of your political stance, no one wants anyone’s personal safety jeopardized because they opted into a fitness-themed social media network in hopes of improving their daily running routine.
Strava quickly issued a statement:
“Our global heat map . . . excludes activities that have been marked as private and user-defined privacy zones. We are committed to helping people better understand our settings to give them control over what they share.”
In addition, the company shared a 2017 blog post of user privacy tips and said it was “committed to working with military and government officials to address sensitive areas that might appear.” In essence, Strava’s response amounts to: it’s the users’ responsibility to protect their data, not the app’s. Whether that’s the right call is a matter of opinion.
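For context, the “user-defined privacy zones” Strava mentions are circular regions around a saved location; activity points inside the circle are withheld from public view. A minimal sketch of that kind of filter follows. The function names, radius, and coordinates are my own assumptions for illustration, not Strava’s actual implementation:

```python
# Hedged sketch of a privacy-zone filter: drop GPS points within a radius
# of a user-defined location before publishing. Illustrative only.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def publishable(track, zone_center, zone_radius_m=500):
    """Keep only the points that fall outside the privacy zone."""
    return [p for p in track if haversine_m(*p, *zone_center) > zone_radius_m]

home = (48.8584, 2.2945)                         # hypothetical saved address
track = [(48.8584, 2.2946), (48.8700, 2.3000)]   # ~7 m and ~1.4 km from home
print(publishable(track, home))  # only the distant point survives: [(48.87, 2.3)]
```

The catch, as the heat map incident showed, is that a filter like this only protects users who know the setting exists and turn it on.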
Shifts in the data protection conversation
A few months ago, the general public suffered from a raging case of breach fatigue. “Oh, another major brand lost my personal data? Well, it hasn’t hurt me yet.” If there’s one upside of the Strava scandal, it’s that people started talking about data privacy again.
In “The Cryptographers’ Panel,” which opened the RSA Conference 2018, Moxie Marlinspike of Signal explained how attitudes toward social media have shifted in recent months. “The utopian narratives of [social media] connecting the world and organizing information are coming to an end,” he said.
The convergence of people and data privacy is way too complex of a topic to tackle here, but Strava’s highly publicized stumble demonstrates how that convergence is evolving. Whether or not you agree with the company’s response, this incident should inspire you to take a fresh look at how your company approaches endpoint security and data privacy.
Who should own data privacy?
In The New York Times, Zeynep Tufekci argued that ordinary people armed with mobile devices can unwittingly become dangerous. She called for public regulation to protect people who can’t effectively protect themselves:
“The privacy of data cannot be managed person-by-person through a system of individualized informed consent. Data privacy is not like a consumer good, where you click ‘I accept’ and all is well. Data privacy is more like air quality or safe drinking water.”
Given that Strava users were accidentally leaking military intelligence, she may be right. Can anyone reasonably expect the average smartphone or fitness tracker user to make the right decisions about their data? Or does that responsibility fall on the shoulders of their employer or the app’s creator? It’s a complicated subject, but it’s a conversation worth having.
Don’t make data privacy harder than it needs to be
Strava’s data debacle is not the first example of endpoint security going awry because of user confusion, and it’s unlikely to be the last or the least dangerous. The lesson to take away is that data privacy shouldn’t be this hard. If your users don’t know how to protect their data, you may need to rethink your security user experience (UX).
When it comes to your apps and mobile experiences, don’t treat security and ease of use as a trade-off in design. Single sign-on, fraud detection methods, and trust-based authentication can guide your users toward more secure behavior without clunky UX. Some gentle prodding could have gone a long way toward giving Strava users the chance to protect their personal data, and it can also help your users protect your company’s data, alongside their own.
Just as importantly, recognize the risks in the endpoints you’re not worried about—like fitness trackers, business printers, and other seemingly innocuous IoT devices. By adopting devices that come with built-in security features, you can minimize the likelihood your data will end up somewhere you don’t want it to.
The story of Strava reveals just how easy it is for data to leak in unexpected ways in 2018. Let this news serve as a reminder to always be on guard, whether you’re sharing or collecting data. While you can’t always control whether secret military training sites get revealed on Twitter, you can control how your own organization handles data. With a little security UX smarts and a lot of caution, you can make sure your company’s name is never said alongside Strava’s.