What started out as a novelty quickly spun into something ominous. In November 2017, Strava, a San Francisco-based fitness-app start-up, released a heatmap depicting the activities of its users across the world. At first glance there was nothing untoward about this: the map highlighted popular running routes in major cities, or the occasional dedicated exerciser's route in a rural area. But, as with most things released on the internet, the map was carefully scrutinized. In January 2018, a 20-year-old student tweeted that he had figured out that the map revealed the locations and activities of soldiers on US military bases. The heatmap, it turned out, not only revealed the location of such bases but may also indicate patrol routes in Afghanistan and Syria. The inadvertent release of such data could pose serious security risks. As you would expect, US military and intelligence planners are now assessing how much compromised information this leak has made available to enemy forces. Some are even considering a complete ban on personal cell phones.
At this point, the news has become inundated with data leaks: Equifax, Yahoo, Deloitte, and, back in 2015, Ashley Madison were all major news at one point. What makes the Strava case different is that it wasn't a breach. Strava was simply sharing data it had collected from users and publishing it online. As writer Amelia Tait pointed out in an article, "If a small start-up has this much data, and can cause this much danger, what of Google, Facebook and Apple?"
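To make the risk concrete, here is a minimal sketch of how aggregating individual GPS pings into a heatmap can expose a frequented location even when no user is identified. The coordinates and counts below are entirely made up for illustration; they are not Strava's data or method.

```python
from collections import Counter

def heatmap(points, cell=0.01):
    """Bin (lat, lon) points into a grid of cell-degree squares
    and count how many pings land in each square."""
    counts = Counter()
    for lat, lon in points:
        counts[(round(lat / cell) * cell, round(lon / cell) * cell)] += 1
    return counts

# Hypothetical, anonymized pings: a few one-off runs, plus daily
# laps around a single compound.
pings = [(34.51, 69.18)] * 50                               # repeated activity at one spot
pings += [(34.60, 69.30), (34.70, 69.10), (34.40, 69.25)]   # scattered one-off runs

hottest_cell, count = heatmap(pings).most_common(1)[0]
print(hottest_cell, count)  # the brightest cell points straight at the compound
```

No names, no accounts: just counts per grid square. Yet the brightest cell on the map is exactly the place the anonymized users visit every day, which is the mechanism by which the heatmap gave away bases and patrol routes.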
The issue at hand is not just about cybersecurity and the need for companies to actively protect the data they collect; it's also about privacy and users consenting to share their data with countless applications. Almost everyone does it. How often have you agreed to let an app, whether it's Uber, Google Maps, or a fitness app, track your location? It's not obvious to the user that they're agreeing to being constantly monitored, but often that's exactly the case. It's just a click, right? How much harm can it really cause? The free apps that make our lives so much easier are not actually free. In exchange, we give them access to our world: our call data, location, contacts, media storage, and more, information the app often doesn't need in order to run, and which companies can anonymize and sell for advertising purposes. Agreeing to hand over data about ourselves has become second nature.
And what the Strava case emphasizes is how ubiquitous this behavior has become. We no longer think about our privacy when it comes to what we do on our smartphones. Even in areas where secrecy is paramount and staff were likely versed in privacy and security, data was still collected on them and published without their knowledge, potentially creating security risks, simply because they wanted to track their steps or their exercise regimen.
End users can certainly read the Terms and Conditions and End User Agreements more carefully, or even just turn off location-sharing settings on their smartphones. This, however, is not always straightforward, and it is different for each app. While Strava had an opt-out option that let users keep their location private, it was not easy to find. And Strava is not alone: countless companies publish terms and conditions so dense and dull that readers never get through them. In a world of BYOD, it is on users to understand the impact of their decision when they select "I agree", and most users will not care. It is up to organizations to decide how secure they want their information to be, which is why some government agencies and military planners are considering banning personal devices. This is exactly the question every enterprise executive team and board faces daily: what do we allow, and how far do we go to protect our sensitive information?
This is certainly an issue of privacy and security, but it is also a cultural shift, as more and more apps are downloaded to make personal and business tasks more efficient. Here are a few quick tips to keep in mind as you roll out apps in your business.
- Understand how a vendor uses your data, plans to use your data, or could use your data. This is vital to protecting your security and privacy.
- As boring as it might be, always read the terms and conditions before agreeing to use a specific app for your business (or personal use).
- If you don’t need to share information with an app or its vendor, don’t. You should own your data.
- Turn off tracking for apps that record your location when you’re not using them. Consider the true benefit of location data for each app and decide whether it is even necessary for the app to work. Based on the information an app collects and the vendor’s use policies, determine whether it can be allowed in your environment when personal phones are in use.
- When it comes to communication, be wary of what information you send over insecure channels like SMS texting. Have policies in place for what can and cannot be sent over insecure channels, but understand that humans are human. If you are dealing with sensitive data, set your employees up for success and provide secure texting for communications.
- Use secure messaging platforms like Vaporstream that do more than just encrypt messages and make them disappear. Protect your data and ensure it doesn’t get inadvertently (or deliberately) shared or leaked.
- Be prepared for change and new innovation.
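The permission-review tips above can even be turned into a simple, repeatable audit. Here is a hypothetical sketch: given an inventory of apps and the permissions they request, flag any app that asks for location without an approved business need. The app names, permissions, and allowlist are all invented for illustration.

```python
# Hypothetical app inventory audit: flag apps that request location
# but are not on the list of apps with an approved business need.
ALLOWED_LOCATION = {"fleet-tracker"}  # made-up allowlist

inventory = [
    ("fleet-tracker", {"location", "contacts"}),  # approved: needs location
    ("step-counter",  {"location"}),              # requests location, no approval
    ("notes",         {"storage"}),               # no location requested
]

def flag_location_overreach(apps, allowed=ALLOWED_LOCATION):
    """Return the names of apps requesting location without approval."""
    return [name for name, perms in apps
            if "location" in perms and name not in allowed]

print(flag_location_overreach(inventory))  # ['step-counter']
```

The same pattern extends to contacts, call data, or media storage: keep an explicit allowlist per sensitive permission, and anything outside it becomes a question for the vendor rather than a default "I agree."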
Contributor: The Vaporstream Team