With the cold weather comes cold season, and being stuck in bed with a cold is even less fun if you don't have any hot tea to drink. That's what happened to Geoffrey last week, and it got us thinking about the value of backups - even imperfect ones. We're (obviously) fans of good-quality loose leaf tea, and we try to keep an ample supply on hand in sealed containers. However, when we run out of loose leaf tea, we're glad to have some packaged, mass-produced tea bags in our pantries - even after a long time, they still make a drinkable cuppa. In the same way, backups of important data and records don't have to be quite as nice to use as your day-to-day email or files, as long as you have a way to get to them when something goes wrong.
If someone forwarded this to you, you can sign up yourself at https://looseleafsecurity.com/newsletter.
Tip of the week
Two of our stories this week are about losing access to Google account data, so we figured we'd take a look at how to keep it backed up. It's common to think that our accounts in the cloud are backed up because they're in the cloud, which is technically true - you won't lose your Gmail account to a failed hard disk, because Google keeps their own backups - but that doesn't cover the risk of losing access to your account. (We've also covered this risk before in our episode "Using a password manager effectively," in the context of whether to rely on a "Log in with Google" or "Log in with Facebook" button instead of using a password manager.)
The most complete option here is Google Takeout: if you visit https://takeout.google.com, you can choose to download everything in your Google account, from mail to files in Google Drive to calendars to "data shared for research." The formats will generally be broadly compatible but possibly inconvenient. For instance, docs will be in Microsoft Office's docx format, but email will be in the ancient "mbox" format - a single giant file containing all your email. Basically any email app can import an mbox file, but it's almost certainly not a format you want to work with directly. Also, there's no way to automate or schedule a Google Takeout download, so it's probably the sort of thing you want to do once or twice and treat as a worst-case backup.
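If you're curious what's in a Takeout mbox file without importing it into an email app, Python's standard-library mailbox module can parse it. Here's a small sketch - the function name and the idea of summarizing a few headers are ours, and you'd substitute the actual path of the mbox file inside your Takeout archive:

```python
import mailbox

def summarize_mbox(mbox_path):
    """Return a (date, sender, subject) tuple for each message in an mbox file."""
    box = mailbox.mbox(mbox_path)
    try:
        # mailbox.mbox iterates over every message in the single giant file.
        return [(msg["Date"], msg["From"], msg["Subject"]) for msg in box]
    finally:
        box.close()
```

For a real Takeout export you might print these summaries, or filter by sender, to find the one message you need without loading the whole archive into a mail client.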
The other option is to synchronize your cloud account's data to your computer. For Google Drive, you can download the Backup and Sync application and configure it to sync the folders you care about. For Gmail, you can set up a regular desktop email client, such as Thunderbird or Apple's Mail.app, to connect to your Google account and store recent email on your local computer. However, be careful with this approach - if you delete a document or an email, the deletion will also sync and you'll probably lose access to it. If you get locked out of your Google account, the sync process might also remove your local copies: you'll need to disconnect from the internet before looking for your files, which is tricky if these applications are running in the background. (It's possible to restore your computer from an older backup while it's offline, but that's a complicated, last-resort approach, because it runs the risk of losing your more recent files.)
A better option for important records in your email or files in your cloud storage is to save a copy entirely outside your cloud account - download that email or file into a folder on your computer. Then, of course, make sure you're regularly backing up that folder.
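As a sketch of that last step, here's a minimal Python routine that zips a folder of saved records into a dated archive you could then copy to your usual backup destination. The folder names and the "records-" naming scheme are hypothetical, and a real backup routine would also want to verify and rotate old archives:

```python
import shutil
from datetime import date
from pathlib import Path

def archive_records(records_dir, backup_dir):
    """Zip records_dir into backup_dir as e.g. records-2019-11-21.zip."""
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    # make_archive appends the .zip suffix and returns the full archive path.
    stem = Path(backup_dir) / f"records-{date.today().isoformat()}"
    return shutil.make_archive(str(stem), "zip", records_dir)
```

Running this from a scheduled task (cron, launchd, Task Scheduler) would give you the regular backups we're suggesting, with no dependence on your cloud account.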
In the news
Driving away with your files: Google has been in the process of deprecating the Google Drive Android API over the last year, and on December 6, just a couple of weeks from today, that API will be fully shut down. Some Android apps let their users connect the app to their Google account so the app can use this API to back up the user's app data to their Google Drive. The API shutdown on December 6 means users will no longer be able to restore their app data from those Google Drive app backups, and as Google's deprecation guide also mentions that "App data support is likely to be removed from Drive in the future," you can't depend on finding another way to access these backups.
Several Android apps still suggest backing up app data through Google Drive. (It's possible that they've switched over to another API to store regular files on Google Drive, but because the deprecated API is specifically geared towards supporting app backups, we wouldn't count on it.) One such app is WhatsApp: their help page suggests using this method to transfer chat history to a new Android phone, and the WhatsApp Android app still allows users to configure Google Drive backups in settings. We'd recommend against using Google Drive backups for WhatsApp, even if it turns out it's not affected by the shutdown: WhatsApp chats are protected by end-to-end encryption on your phone, but the backups are not encrypted.
An alternative to depending on Google Drive for your backups is to make your own, ideally encrypted, local backups. If you want, you can also upload those to a cloud backup service of your choosing. Unfortunately, Android takes a more piecemeal approach to backups than iOS, so Android phones don't provide simple, complete encrypted backups to your computer like iTunes does for iPhones. WhatsApp has instructions for copying your Android phone's local backups of WhatsApp data to your computer and restoring from those backups at a later time, and if you use another app that currently backs up to Google Drive, it's worth looking into how to create local backups of that app data, too.
Don't forget to like, subscribe, and save a copy of your entire Google account: Markiplier, a popular YouTuber, recently ran an interactive story on a livestream where he crowdsourced decisions from his viewers - they could reply with one emoji to vote one way and another emoji to vote the other. Several viewers, expressing enthusiasm, "spammed" the livestream chat with messages full of a certain emoji. YouTube's systems interpreted this behavior as actual spam and responded by disabling not just their YouTube accounts but their entire Google accounts, including Gmail and Google Drive. Many of these users filed appeals through Google's process, which is supposed to involve manual review, and were denied. After Markiplier released an impassioned video decrying YouTube's actions, YouTube reinstated many of the accounts, and he released a followup video saying things were being sorted out with the involvement of a senior exec at YouTube. It's not clear what would have happened if a less famous livestreamer without high-level contacts at YouTube had been involved.
Hold the phone, Twitter fixed their two-factor signup process: Twitter updated their two-factor authentication so it no longer requires tying a phone number to your account. This is particularly good news since Twitter was previously using phone numbers provided for two-factor for ad targeting. If you haven't added a second factor to your Twitter account yet, you can now enroll without providing a phone number, and if you already have two-factor on your account and only use authenticator apps or a security key, you can simply remove your phone number in the Account section of Settings. (If you're using SMS-based two-factor for Twitter, your account will still need your phone number, but as we discuss in our episode "Two-factor authentication and account recovery," you can improve your account security by switching to authenticator apps and security keys, which work even when your phone is out of range and aren't vulnerable to SIM-jacking attacks.)
Two-factor account info disclosed in a breach: A recent news report about a breach of 2.2 million accounts caught our eye for one reason: part of the breached data included two-factor authentication keys. The standard for authenticating with six-digit two-factor codes, such as with an authenticator app on your phone, involves both you and the service computing a cryptographic function of the current time and a shared secret (usually delivered as the QR code you scanned when you signed up); when you log in, the service just checks whether your code matches its code. That means the shared secret can be stolen in a breach just as easily as any other information - while these codes are still more secure than an unchanging password and harder to intercept than a text message, they're not foolproof. In a particularly infamous breach a few years back, industry giant RSA Security had to replace their customers' physical authentication tokens (specialized code-generating hardware with a 6-digit LED display) after a master file of secrets was stolen. Security keys, our favorite form of two-factor authentication, don't have this risk - your key creates a cryptographic signature of a message generated by the website, and the website can only verify that signature, not produce it, so nothing else can impersonate your security key - and we use them wherever we can.
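To see why those shared secrets are so sensitive, here's a minimal sketch of the standard TOTP computation (RFC 6238) in Python - anyone who holds the base32-encoded secret, whether your authenticator app, the service's database, or someone with a copy of a breach, can generate the same six-digit codes:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32.upper())
    # The moving factor is the number of 30-second intervals since the epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes based on the last nibble of the digest.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's published test secret ("12345678901234567890" in base32)
# at time 59 yields the documented code 287082.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))  # 287082
```

Since both sides run exactly this computation from the same stored secret, protecting that secret matters as much as protecting a password - which is what makes its appearance in breach data so worrying.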
It's for your own good: Uber will soon be piloting an option in Brazil and Mexico to let either passengers or drivers record audio of their ride in an attempt to better respond to "numerous and increasing reports" of violent crime in Uber rides in Latin America, as their IPO prospectus put it. The Washington Post reports that they hope to roll this out to the US as well. The feature allows either party to initiate a recording, which will be stored encrypted and optionally sent to Uber if the user reports an incident - users won't be able to play back the recordings themselves. While the feature may help Uber's ability to respond to safety issues, it brings its own concerns: it's not clear how the feature interacts with US privacy laws, since several states require "two-party consent" for recordings. More generally, we're not excited about another company surveilling us, especially without clearly limiting how these recordings will be used and how long they'll be stored. (As a reminder, Uber has already shown some pretty bad judgement surrounding how they use your data.) The Post says they previously found that Uber's Special Investigations Unit exists primarily to shield the company from liability accusations (hm, sounds a lot like HR), and their IPO prospectus worried that criminal incidents would mean "our business and financial results could be adversely affected" - so their first motivation may not actually be your safety.
Better luck with 6G: The new 5G cellular standard is supposed to fix the security flaws in older standards like 4G that let Stingrays and similar devices spoof a cellular tower, causing your phone to send all of its traffic through the fake tower instead of through your actual cell phone provider. A team of researchers from Purdue University and the University of Iowa discovered a handful of security flaws in 5G by translating the 5G specification documents into a computerized model and testing it exhaustively. These bugs include allowing an attacker to track you, send fake emergency alert messages, and even cause your phone to downgrade from 5G back to 4G. The GSM Association doesn't seem to think the vulnerabilities merit changes, so this is yet another reason not to trust SMS or even phone calls to be secure.
Facial recognition in the West Bank: Israeli facial-recognition startup AnyVision has technology that can track an individual as they move across a city and through the fields of view of multiple surveillance cameras. NBC News reported recently that their technology backs a secret surveillance project in the West Bank and that they were the company that anonymously won Israel's national defense prize for using "large amounts of data" to prevent terrorism. AnyVision has previously confirmed that they're behind the facial recognition at Jerusalem border checkpoints for Palestinians (Israelis go through a separate checkpoint without facial recognition), which itself raised worries about data tracking, but they insist they're not running surveillance beyond the checkpoints. Microsoft's venture capital arm invested in AnyVision earlier this year, under a deal that required them to hold to six ethical principles, including "lawful surveillance." As a result of these reports and pressure from activists, Microsoft has hired former US attorney general Eric Holder to lead an investigation into just what AnyVision's technology is powering.
Facial recognition on the Tube: Mashable recently posted a video demonstration of how facial recognition in subway stations might make your commute easier, and people promptly questioned whether making surveillance and fare enforcement easier is actually an improvement. As part of the NYC subway's ongoing fare-enforcement crackdown, individual cameras have been installed on every turnstile at one major downtown station purely for enforcement, not for speeding up boarding, so we don't share Mashable's naive optimism here.
Oh, you wanted to turn off that location tracking: Two US senators, one from each major party, have written a letter to Facebook pushing them on the meaning of their location privacy options on mobile devices. While Facebook guides you to Android and iOS's controls for sharing "precise" location information (usually via GPS), they can still track your approximate location from your IP address and from actions like checking into events. The senators point out that if the Facebook app is updating in the background, the company effectively has constant tracking of your IP address. Facebook has a post encouraging users to use the OS setting to disable location tracking, but if they're still getting your location by other means (and perhaps using it for advertising), the senators argue, that advice is misleading.
One Ring to surveil them all: Ring, the video doorbell startup Amazon acquired last year, now has at least 630 partnerships with law enforcement agencies, an increase of over 50% from the 400 partnerships they had in June. Ring has not been forthcoming with the general public about the specifics of these "partnerships," and Massachusetts Senator Ed Markey pressed Amazon for more information. Amazon's responses indicate they've abdicated responsibility for the surveillance footage captured by Ring cameras, placing it instead on consumers and police departments. Ring has no policies limiting how long law enforcement can hold onto footage shared with them, leaving it solely up to local laws to place checks on police departments, and law enforcement can share Ring data with other parties however they like. This is particularly troubling as Amazon maintains Ring users must share their footage when police request it.
Amazon's response to Senator Markey included another bizarre deflection of responsibility: Ring says it's their users' job not to point the cameras at children. Of course, it's nearly impossible to use a Ring camera as intended without capturing footage of children, and Amazon admits they are doing nothing to prevent the surveillance of children: "Ring has no way to know or verify that a child has come within range of a device." Amazon even released a promotional video featuring footage of children on Halloween captured by Ring doorbells, which suggests they don't really care whether their users are surveilling children. We at Loose Leaf Security don't believe the bulk of the privacy burden should fall on consumers, and we're especially alarmed by how Ring's intentional obfuscation of what happens with their product's surveillance footage makes it near-impossible for consumers to know what they're opting into.
Bypass camera app permissions with this one weird trick: Security researchers at Checkmarx discovered an interesting vulnerability in how camera permissions work on Android. While an app needs permission to access the camera directly, it's possible to send a message to the camera app instructing it to take a photo - for instance, starting a timed photo with a three-second countdown. The camera app usually saves to the SD card (devices without an actual SD card still usually provide an area of disk that acts like the traditional Android SD card), and the standard storage permission provides access to the entire SD card. Thus, an app that only has the storage permission can instruct the camera app to take a photo, then check back a few seconds later and see the photo that was taken - and also see your location, if your camera app is set to geotag your photos. The attack isn't stealthy at all, since it actually switches your screen over to the camera app, but it could still be effective if you leave your phone face down.
Google and Samsung have both addressed this vulnerability in an update to the camera app, and this sort of attack is a good reason to apply updates even to software, like your camera app, that you might not think needs any changes. However, the researchers' post implies that phones from other manufacturers might still have the same bug.
Even without the camera app bug, there's another issue this story highlights: the standard storage permission on Android allows an app to access (and modify) anything else stored by any other app using the storage permission, which on most Android phones includes photos from the camera app. Apps designed for Android 10 (the latest version) can use a new feature, scoped storage, to limit what they access, but Android still allows the older option of full access to storage, and at least for now, the permission dialog doesn't distinguish between scoped storage and full storage. If an app from a lesser-known maker is asking for external storage permissions for no clear reason (such as a video game that only needs to store high scores), it may be worth treating it with suspicion.
Permission to download apps freely, sir? US soldiers in the 504th Military Intelligence Brigade noticed that their unit's new Android app asked for a huge number of permissions, including location data and full storage access. The app was developed by an outside company based in the US with a branch in India. The soldiers, who work with top-secret information and don't tell people outside their family what they do, worried that sharing information with the app might blow their cover. One even asked about it on the /r/army forum on Reddit; while the original post was deleted, the long discussion that followed is still online. The brigade originally told soldiers it was "mandatory" to download the app on their personal phones, then later changed it to "highly encouraged," and the app is now gone from the app stores.
You were supposed to destroy the ads, not join them: Malwarebytes Labs found a particularly evil "ad blocker" on Android that just adds more ads and also makes itself almost impossible to remove. The app requests permission to "Allow display over other apps," which it uses to draw more ads on your screen. It also sets itself up as a VPN, which allows it to watch and modify all your network traffic. This is apparently a common technique for mobile ad blockers because mobile platforms often don't have other good ways to remove ads from your browsing, but it's also a dangerous permission for the same reason. Worse, the app installs itself with a blank icon and a blank name, making it pretty difficult to find and uninstall. Our takeaway: once again, suspicious permissions are suspicious.
What we're reading
Who does the data enrich? Troy Hunt has a look at the recent massive exposure of personal data from People Data Labs. People Data Labs is a "data enrichment" company, which evidently means they're in the business of aggregating data about people from public or purchasable sources and reselling it. One of their customers - we don't know who - stored over a billion records on a publicly accessible database server, where they were recently discovered. This wasn't a breach of People Data Labs themselves, nor of Oxydata, another company in the same business whose data was also on this server: both of them intentionally sold the data and expected the customer to keep it secure. That expectation didn't really work out, though, and there's not much that can be done now that the data is public.
Both Troy Hunt and the researchers who noticed the exposed database raised the question of whether these "data enrichment" companies should be able to sell off their responsibility to keep data secure. We're also somewhat frustrated that it's hard to find out that you're in one of these databases - and often even that such a company exists - until you get a notification from a site like Troy Hunt's HaveIBeenPwned.com. His post discusses how this has become common in data exposures he's seen lately: "The recurring theme I'm finding with exposed data of this nature is increasing outrage that the data aggregator obtained and used personal information in a fashion the owner of the data (i.e. me) didn't consent to. It's not about how public the data might be through the channels people choose to publish it, rather it's about the use of the data outside its intended context."
Rachel Thomas on surveillance and society: We've expressed our concerns about both government and private surveillance efforts, including our worries about the Ring doorbell and its surveillance network above, and Rachel Thomas, director of USF's Center for Applied Data Ethics, has written a detailed article about how surveillance affects society. (If her name looks familiar, it could be because we linked to her Twitter thread about machine learning uncovering unspecified variables in our last newsletter.) The article is targeted at "people who are not that concerned about surveillance, or who feel that the positives outweigh the risks," but provides a detailed reference on the impact of surveillance for those of us who are already troubled by it.
That's all we've got for this week! If there's a story you'd like us to cover, send us an email at firstname.lastname@example.org. We'll see you next week with a slightly shorter newsletter than usual - we wish our US readers a happy and secure Thanksgiving!
-Liz & Geoffrey