Loose Leaf Security Weekly, Issue 26

We're sorry this week's newsletter is a bit late - the two of us haven't been able to meet up in person because we've been practicing "social distancing" to help contain the novel coronavirus, and we're still adjusting to working remotely and staying at home. It's not as straightforward as the old way of doing things, but it's an important step to take to limit the potential impact of the disease. In a sense, it's the human version of sandboxing, which limits the spread of computer viruses and other malware. While it would be easier if mobile apps and web pages could just directly access each other's files, cookies, and other data, sandboxing requires you to explicitly share anything you want to pass between apps, which makes it harder for malware to move directly from one app to another. As with humans, containing the spread of a virus also makes it a lot easier to recover from an outbreak - it's easier to deal with a single infected or compromised app than to recover from every app on your device being infected at once.

If someone forwarded this to you, you can sign up yourself at https://looseleafsecurity.com/newsletter.

Tip of the week

If your life is going anything like ours, you've probably just started using a bunch of new videoconferencing apps at home. Our episode "Covering your webcams" explains both why and how to rig up some sort of removable cover for your laptop and phone cameras, just in case one of those new apps starts sending video before you realize it. We're both partial to washi tape, but there are lots of easy options you can cobble together from what you already have at home, such as a folded index card.

In the news

I am once again asking for you to take software updates: Last week, Microsoft released an out-of-cycle update to Windows 10 to fix a pretty bad security flaw. The bug is in the compression routines in SMB, the protocol used for Windows file sharing, and affects both servers and clients. Microsoft originally planned to release the fix as part of their monthly "Patch Tuesday" cycle, but removed it from the list - unfortunately, they'd made the draft list of patches available to various vendors who ended up publicizing the existence of the bug. The bug also gained attention because it would have been particularly convenient for a worm that copies itself to new computers: the recent WannaCry and NotPetya attacks both took advantage of a similar bug in SMB to spread to hundreds of thousands of computers. As a result of the visibility, Microsoft ended up releasing the patch on Thursday - so make sure you take software updates whenever Windows prompts you.
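
For the technically curious: the flaw, tracked as CVE-2020-0796 and nicknamed "SMBGhost," is reportedly an integer overflow in the code that decompresses incoming SMB messages - two attacker-controlled sizes from the compression header get added together before a buffer is allocated. Here's a minimal Python sketch of that bug class (our illustration, with made-up names - not Microsoft's actual code):

    # Sketch of the integer-overflow-before-allocation bug class behind
    # CVE-2020-0796; names and details here are simplified for illustration.
    BITS = 32  # the addition happens in a fixed-width C integer

    def buffer_size(original_size: int, offset: int) -> int:
        """How many bytes to allocate for the decompressed message."""
        # In C, this addition silently wraps around at 2**32...
        return (original_size + offset) % (2 ** BITS)

    # ...so a malicious header can make the allocation tiny while the
    # decompressor later writes far more data into it (a heap overflow).
    print(hex(buffer_size(0xFFFFFFFF, 0x41)))  # prints 0x40, not 0x100000040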

Gone in 60 milliseconds: A group of researchers from Belgium and the UK discovered an easy way to clone the electronic part of Toyota, Hyundai, and Kia car keys. All of the affected cars use a chip from Texas Instruments with a special cryptographic algorithm - and all of them misuse the algorithm in ways that make it too weak. While the TI chip supports an 80-bit secret key, affected Kia and Hyundai key fobs only use 24 of the 80 bits, making it possible for an attacker to just try every possibility. Toyota, meanwhile, sets their secret key to the serial number of the key fob, which is announced to anyone who scans the fob. The researchers also found a similar flaw in the Tesla Model S, which has already been fixed via a firmware update, but none of the other manufacturers have expressed an intention to patch the flaw in existing cars. Car thieves will still need to physically break into the car, but they can use the cloned key to get past the car's immobilizer, letting them hot-wire the ignition.
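
To put those key sizes in perspective, here's some quick arithmetic in Python - the guess rate is our own assumption for illustration, not a figure from the researchers:

    # Comparing a full 80-bit key to the 24 bits the fobs actually use.
    GUESSES_PER_SECOND = 10_000_000  # assumed attacker speed, for illustration

    for bits in (24, 80):
        keys = 2 ** bits
        seconds = keys / GUESSES_PER_SECOND
        print(f"{bits}-bit key: {keys:.3e} keys, {seconds:.3e} seconds to try them all")

    # 24 bits: about 16.8 million keys - under two seconds at this rate.
    # 80 bits: about 1.2e24 keys - billions of years. Key length only
    # protects you if you actually use all of it.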

Bicycling is not a crime: NBC News has a story about an Android user who found himself swept up in a "geofence warrant" served on Google. One of his bicycle trips took him past a house where a burglary was taking place, and local police ordered Google to turn over information about the owners of Android devices that had passed through an area (a "geofence") surrounding the house. Google's policy is to notify users that they're subject to a government data request whenever it's legally allowed to, so they sent him an email, and after some digging, he found the case and hired a lawyer to talk with the police department. He'd been recording his rides with an exercise-tracking app, and his own records showed that on the day of the burglary, he'd looped past that house three times. His lawyer argued the warrant was unconstitutionally broad, and the police department eventually said he was no longer a suspect, so he dropped the case.

Insensitive Tower: Sensor Tower is a tech startup that advertises "Mobile App Store Marketing Intelligence," which means they'll tell other companies with mobile apps how those apps are doing in comparison to competitors. There are a few ways they could get this data, including carefully watching the Apple and Android app stores, or perhaps secretly running ad blocker and VPN apps of their own that monitor user activity. A BuzzFeed News investigation found that the company chose the latter route and has built at least 20 iOS and Android apps since 2015. These apps prompt users to install a custom root certificate controlled by Sensor Tower, which gives the company the ability to subvert all HTTPS security on the phone and monitor traffic sent by both websites and mobile apps. There's no indication in the apps that they're linked to Sensor Tower or even that they collect data at all. The company admitted that they made the apps, insisting that they only stored aggregated information. We've urged caution with VPN apps in the past, because even when they're working as intended, a VPN by design sends all your traffic to the VPN company. It's very important to only install a VPN that you trust with your data - which generally requires knowing what company made the VPN and what their incentives are - and it's also important not to install a root certificate from anyone you don't completely trust.
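
To see why installing an extra root certificate is so dangerous, it helps to remember how a TLS client decides whether a server is genuine. This Python sketch shows ordinary certificate validation - it's a simplified illustration, not Sensor Tower's code, and the commented-out filename is hypothetical:

    import socket
    import ssl

    # A TLS client accepts ANY certificate that chains to a root in its
    # trust store - it can't distinguish a legitimate certificate authority
    # from an interception CA the user was tricked into installing.
    context = ssl.create_default_context()  # loads the system's trusted roots
    # Installing a custom root is effectively the same as:
    # context.load_verify_locations("interception-root.pem")  # hypothetical

    with socket.create_connection(("example.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="example.com") as tls:
            # The handshake fails only if the chain doesn't reach a trusted
            # root; with an extra root installed, a proxy can mint "valid"
            # certificates for any site and read the decrypted traffic.
            print("verified as:", tls.getpeercert()["subject"])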

Facebook tracked me and all I got was this lousy t-shirt: Last month we discussed Facebook's new "Off-Facebook Activity" tool, which allows you to see the data Facebook has about you that they gather from other companies and websites. Usually, this data is reported back to Facebook by "pixels," cross-site ads, and other trackers, so ad-blocking and tracker-blocking tools should prevent Facebook from knowing what you do on the rest of the web. Terence Eden, who uses such tools and expected to see no "off-Facebook activity," was surprised to see exactly one entry on the list: "LAN TIM 2" had reported some activity on Christmas Eve. Some searching turned up another privacy-conscious Facebook user who noticed two entries: one stay booked through Airbnb (understandable) and one "LAN TIM 2" from early December. "Off-Facebook activity" includes offline activity as well, because some merchants send purchase data directly to Facebook to help their own advertising efforts - but neither of them recalled buying anything from a company with that name. After asking a friend at Facebook, Eden found the answer: "Lan Tim 2" appears to be a white-label T-shirt maker, and both of them had purchased T-shirts from an online store in December.
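
For context on how a purchase shows up in that list at all: a tracking "pixel" is just a tiny request to the ad platform's servers with the event details packed into its URL, roughly like the hypothetical sketch below (the endpoint and parameter names are invented, not Facebook's real API):

    from urllib.parse import urlencode

    # Hypothetical sketch of a browser-side tracking "pixel": the checkout
    # page loads a 1x1 image whose URL smuggles the purchase event along.
    event = {
        "event": "Purchase",
        "value": "19.99",
        "currency": "USD",
        "vendor": "lan-tim-2",  # the white-label supplier, not the storefront
    }
    print(f"https://tracker.example/pixel.gif?{urlencode(event)}")

    # Tracker blockers work by refusing to load URLs on known tracker
    # domains - but merchants can also upload the same events straight from
    # their own servers ("offline activity"), which no browser tool can block.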

Everybody gets to sign in with Apple: A recent change to the rules of the Apple App Store requires apps with third-party login options, like Facebook Login or Sign in with Twitter, to also support Apple's own third-party login system, Sign in with Apple. The change only affects apps that use a third-party system simply as an account system; apps that actually use the data in your third-party accounts are exempt from the rule. We've previously discussed Sign in with Apple in the news segment of our episode "Password managers: how they should work and when they didn't." Our feeling was that Apple has better incentives to keep your account available and less incentive to see your data than a social media site would, but even so, we prefer making an account with each app or site whenever possible. Password managers make this about as easy as using a third-party sign-in button, and keeping accounts fully separate ensures that nothing that goes wrong with one account can affect the rest.

Careless Whisper: Anonymous social networking app Whisper left a database of user data and messages accessible on the internet without authentication. While some of the data was intended to be displayed through the app, the database included specific locations and other details about users that weren't intended to be public. The open database could also be downloaded and analyzed in bulk, raising the risk of correlating the pseudonymous data with other datasets.

The sheep on the right and the Voatz on the left: Security consultancy Trail of Bits has released the report from their security assessment of mobile voting system Voatz. We've previously covered Voatz as the platform used by West Virginia's mobile voting pilot and the subject of a security analysis at MIT that found some really odd things. Trail of Bits worked with Voatz and had access to most of their source code, which let them analyze the system's security in detail. While their summary says the code "is written intelligibly and with a clear understanding of software engineering principles," they also call it "unusually complex" compared to similar voting apps they've seen and "the product of years of fast-paced development," with some validation features "reimplemented across the codebase, often erroneously." They identified 79 issues in the code and in the design, of which Voatz was able to fully address eight before publication, but they also caution that the complexity of the Voatz system "leads us to believe that other vulnerabilities are latent."

Your location is out there: In February, we talked about Venntel, a small company that collects and resells location data aggregated from sources like mobile apps and websites. Protocol has uncovered another such product, Locate X. As we noted before, because Venntel and Locate X don't get location data directly from cell phone carriers, it's legal under current U.S. law for them to sell that aggregated location data - including to government agencies like U.S. Customs and Border Protection. One particularly interesting detail is the section of Locate X's terms of use stating that their data may not be "cited in any court/investigation-related document," and as ACLU lawyer Nathan Wessler notes, "these secrecy provisions prevent the courts from providing oversight" over Locate X's product.

The virus isn't the only thing spreading from China to other countries: VICE reports that Iran's Ministry of Health sent Iranian citizens a link to download an app called AC19 that claimed to "determine if you or your loved ones have been infected with the coronavirus." Unsurprisingly, the app couldn't tell its users whether or not they'd caught the novel coronavirus, but it has been used to gather lots of personal information as well as real-time location data. The vast majority of Iranians use phones running Android, and older versions of Android grant all of an app's requested permissions at install time, so on those phones AC19 automatically started tracking the user's location. Iranians on more up-to-date versions of Android weren't necessarily much better off: the location permission prompt is displayed by the operating system in the system language, which may well have been English - the default language for Android and one few Iranians understand - so many users didn't even know AC19 was requesting access to their real-time location data. (We were a little surprised at first to hear that users don't change their device's default language to one they understand. The article notes two things that help explain why: apps in Persian are provided through a separate app store known as Cafe Bazaar, and the Iranian government controls the country's cell providers.) As VICE notes, the Iranian government has a history of surveillance, but it's particularly disappointing to watch them weaponize Iranians' understandable fear of the current global pandemic to tighten their watch.

What we're reading

Think of the children: Back in December, we covered a US Senate hearing over "lawful access," the requested ability for law enforcement to see into end-to-end-encrypted conversations or access data on encrypted smartphones. Democratic Sen. Dianne Feinstein and Republican Sen. Lindsey Graham demanded that big tech companies like Facebook either voluntarily find a way to provide access to law enforcement, or Congress would make them provide access. Those senators, along with colleagues from both parties, have now started following through on that threat with the "EARN IT Act," which is nominally about a somewhat different problem: the risk of child abuse images being sent on end-to-end-encrypted services without platforms being able to detect that material. Their bill would remove the historic immunity in Section 230 of the Communications Decency Act, which shields providers from liability for communications sent through their services (and thus frees them from any obligation to monitor those communications). Instead, companies would have to "earn" that immunity by complying with "best practices" - which haven't been defined yet; a committee created by the proposed law would determine them. One possible outcome is that true end-to-end encryption would no longer be allowed.

Riana Pfefferkorn of Stanford's Center for Internet and Society has a detailed blog post about the bill, arguing that it provides no actual benefit to the cause of finding child sexual abuse material and prosecuting people who trade in it, but allows the new committee to make end-to-end encryption infeasible. Matthew Green, a cryptography professor at Johns Hopkins, also has a blog post about why the bill would ban end-to-end encryption in practice: because there is no known way to detect child abuse imagery in an end-to-end-encrypted conversation while keeping the encryption intact, the new committee would likely declare that "best practices" require weakening the encryption. He also argues that the bill marks a change from previous legal debates about end-to-end encryption that he's participated in - he no longer sees lawmakers as acting in good faith, and he sees this as an attempt to use a popular-sounding cause (keeping children safe from online predators) to pass a law that is actually about something else entirely. In The Verge, Casey Newton draws a comparison with FOSTA/SESTA, laws passed in 2018 that reduced Section 230 protections with the stated goal of preventing online sex trafficking. By preventing sex workers from communicating freely online, those laws seem in practice to have put sex workers in more danger, giving them less information about clients and sometimes driving them towards traffickers.

Fashion changes, but surveillance data endures: We've talked about designs intended to trick facial recognition algorithms and automated license plate readers before, but a recent article in the New Yorker called "Dressing for the Surveillance Age" reminded us that adversarial fashion often causes you to stick out like a sore thumb to human eyes. In addition to a small roundup of examples of adversarial fashion, the piece discusses the history of adversarial imaging - a field created "more or less by accident."

See you next week...

We'd recommend just covering your webcam, but there's now a new privacy tool that automatically deletes humans from your webcam's footage.

That wraps it up for this week - thanks so much for subscribing to our newsletter! If there's a story you'd like us to cover, send us an email at looseleafsecurity@looseleafsecurity.com. See y'all next week!

-Liz & Geoffrey