Loose Leaf Security Weekly, Issue 6

Welcome back to Loose Leaf Security's weekly newsletter! This week, we're introducing a new section for a tip of the week, either something we learned recently and want to share with you or just a classic that we may have mentioned briefly in a previous episode (like this one!).

If someone forwarded this to you, you can sign up yourself at https://looseleafsecurity.com/newsletter.

Tip of the week

The best way to avoid losing access to your two-factor authentication is to have multiple second factors available. If you're using authenticator apps, one of the stronger methods, it's a good idea to configure an authenticator app on both your current phone and an old phone that you don't carry with you all the time - that way, if you break or lose your phone, you won't lose access to your second factor and need to find a backup code.

Lots of accounts only let you set up an authenticator app once, but you can typically still enroll two apps on different phones: when the account shows you its QR code, scan it with both your current phone and your backup phone. Both apps receive the same secret, so they'll generate the same codes.
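If you're curious why that works, here's a minimal sketch of the TOTP algorithm (RFC 6238) that most authenticator apps implement, assuming the common defaults of a 30-second period and 6-digit codes; the secret shown is a made-up example, not one from a real account.

```python
# Minimal TOTP sketch (RFC 6238): two devices that scanned the same QR code
# share the same base32 secret, so they compute identical codes.
import base64, hashlib, hmac, struct, time

def totp(base32_secret: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(base32_secret, casefold=True)
    counter = int(time.time()) // period          # same clock -> same counter
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# "JBSWY3DPEHPK3PXP" is an example secret, standing in for the one in the QR code.
print(totp("JBSWY3DPEHPK3PXP"))  # both phones print the same 6-digit code
```

The only inputs are the shared secret and the current time, which is why any number of devices holding that secret stay in sync.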

P.S. This is also really handy for setting up authenticator apps for accounts you share with friends or collaborators. In fact, we have some accounts set up this way for Loose Leaf Security!

In the news

Twitter ad targeting via 2FA: Twitter announced that they had accidentally allowed phone numbers provided for account recovery or two-factor authentication to be used for ad targeting. If this sounds familiar, it's because Facebook did the same thing. Twitter's handling of phone numbers makes this worse: you still need to give Twitter your phone number to set up two-factor authentication, even if you turn SMS-based two-factor authentication off after the initial configuration, and unfortunately, removing the phone number from your account disables two-factor authentication entirely, even if you aren't using SMS codes ([1], [2]). Furthermore, a phone number added for security reasons shows up in the "Account" section of settings, which leads us to believe it still isn't being used solely for security purposes. We believe two-factor authentication generally shouldn't depend on a phone number at all, but especially in light of this mishandling, Twitter needs to truly untangle phone numbers used for security from the rest of your account and from advertising. For now, we're keeping our phone numbers on our Twitter accounts because we believe the added security from two-factor authentication is very important, but we hope Twitter fixes this soon.

Google password checkup: If you're using Google's password manager - the one that's built into Chrome - you may want to check out their new "checkup" feature which identifies which of your passwords have been in a known breach, are duplicates of other passwords, or could be considered weak. For reasons we've discussed in our episode "Using a password manager effectively," we prefer password managers that aren't tied to a specific browser or operating system, but if you're currently using Chrome's built-in password manager, it's worth taking a look and updating any breached, duplicated, or weak passwords it finds.

Chrome blocking mixed content: What happens when a secure HTTPS web page loads images, videos, or even scripts over an insecure HTTP connection? Is it still secure? Historically, web browsers have continued to show the little lock icon on these websites, which isn't quite accurate. An attacker can modify the images you see, which could be anywhere from mildly amusing to actively harmful if, say, a modified image shows a fake phone number for your bank's security department. Worse, insecure scripts can modify the page entirely, including the content loaded from secure sources.

Chrome announced that in the coming months, they'll be getting significantly more restrictive about such "mixed content," where parts of a web page come from secure connections and parts come from insecure connections. Extensions like HTTPS Everywhere have approached this problem by trying to make secure connections even when the web page used an insecure http:// URL, and Chrome will follow that approach to "upgrade" insecure connections - but if the "upgrade" doesn't work, the insecure content will be blocked. We think this is a major step toward a web that's secure by default, and the logical conclusion of what HTTPS Everywhere set out to accomplish.
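To make the "upgrade or block" idea concrete, here's a rough sketch in Python of how a browser-like loader might handle an http:// subresource on an https:// page; it's purely illustrative and not Chrome's actual implementation.

```python
# Rough sketch of "auto-upgrade or block" for mixed content: a subresource
# referenced over http:// on an https:// page is first retried over https://,
# and blocked if the secure version isn't available. Not Chrome's real logic.
from urllib.parse import urlsplit, urlunsplit
import urllib.request

def load_subresource(page_url: str, resource_url: str):
    page_secure = urlsplit(page_url).scheme == "https"
    parts = urlsplit(resource_url)
    if page_secure and parts.scheme == "http":
        upgraded = urlunsplit(("https",) + tuple(parts)[1:])  # try the secure URL
        try:
            with urllib.request.urlopen(upgraded) as resp:
                return resp.read()
        except OSError:
            print(f"Blocked mixed content: {resource_url}")
            return None                                       # no insecure fallback
    with urllib.request.urlopen(resource_url) as resp:
        return resp.read()
```

The key point is the last step: when the secure version can't be fetched, the content is dropped rather than silently loaded over HTTP.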

Samsung's confusing Android updates: Android Police reports that Samsung has made some changes to its list of which devices get security updates on monthly and quarterly schedules, moving some to the faster channel and some to the slower channel. If you've got a Samsung Galaxy device, check out Samsung's list and see where your device falls. It seems that the more expensive flagship phones have a better chance of staying on the monthly update cycle, so unfortunately, you might want to prefer those if you're in the market for a new phone and you like Samsung. (Still, even monthly updates seem slow to us.)

Russian hacking group fingerprinting HTTPS connections: ZDNet reports that a Russian hacking group appears to be modifying Chrome and Firefox browsers to fingerprint traffic, as identified by a report from cybersecurity firm Kaspersky. What's notable is how they accomplished the fingerprinting - the SSL/TLS protocol used for HTTPS connections starts with the browser sending a random number to the server, and the malware works by sending meaningful data as that "random" number. Since this stage of the connection sets up the encryption, the traffic isn't encrypted yet, so an attacker on the network can follow a single computer around and track what it connects to. It's a reminder that it's not enough for a protocol to be "encrypted" or "secure" - the software on both sides has to be secure, too, and malicious or infected software can send traffic that looks just as encrypted as anything else while secretly undermining your privacy.
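Here's a short sketch of why that works: the TLS client random is 32 bytes sent in the clear at the very start of the handshake, so anything a tampered client hides in it is readable by any passive observer. The tagging scheme below is hypothetical, just to illustrate the idea, not the actual malware's technique.

```python
# Sketch: the TLS "client random" is 32 cleartext bytes in the ClientHello.
# A tampered client can embed a stable identifier there, and anyone on the
# network path can read it back without any keys. Hypothetical scheme only.
import os

def honest_client_random() -> bytes:
    return os.urandom(32)                      # what a normal TLS client sends

def tampered_client_random(machine_id: bytes) -> bytes:
    tag = machine_id[:8].ljust(8, b"\x00")     # hide an 8-byte ID up front
    return tag + os.urandom(24)                # the rest still looks random

def observer_reads_tag(client_random: bytes) -> bytes:
    # The ClientHello isn't encrypted, so recovering the tag needs no keys.
    return client_random[:8]

hello_random = tampered_client_random(b"device42")
print(observer_reads_tag(hello_random))        # b'device42'
```

Because the tampered value is statistically hard to distinguish from genuine randomness, only someone who knows the scheme can spot the tracking - which is exactly what makes it useful to the attacker.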

NYPD's use of Cellebrite: Medium's technology publication OneZero has a report about New York City's use of technology from Cellebrite, a company that apparently has a device that can break the encryption of up-to-date iPhones. Cellebrite made some news earlier this year when they announced sales of this device to law enforcement departments, but OneZero reports that the NYPD had this device last year. Previously, at least as far as anyone knew, Cellebrite required devices to be sent to their facilities, and the Cellebrite facility closest to NYC was in New Jersey. A public defender who specializes in digital forensics suspected that the Manhattan District Attorney had broken into his client's iPhone 6S by sending it to New Jersey, which would have required a court order from judges in both states, but the district attorney replied that the NYPD had broken into the phone within their own offices, using Cellebrite's device.

US government DNA testing detained immigrants: The US Department of Homeland Security said the Department of Justice is working on a regulation that would allow thorough DNA testing in immigration detention facilities and that the data collected would be entered into the FBI's DNA database. This testing is mostly targeted at immigrants, despite data showing that immigrants commit crimes at a lower rate than citizens. (US citizens have also been mistakenly booked into these facilities.) We at Loose Leaf Security are always concerned when the government gets access to more personal data, particularly when it's as sensitive as DNA - which, as we discussed in last week's newsletter, affects more than just the individual, since it can be used to track and reveal information about extended family members as well - and this is no exception. However, we are particularly appalled that this DNA is being collected and used for crime-related purposes that just aren't backed up by the crime data we have - because once the data is collected and incorporated into the database, it's generally technically difficult to remove, not to mention how politically difficult it can be to decide to remove existing data rather than just stop collecting it going forward.

What we're reading

Decreased privacy isn't inevitable: Rose Eveleth calls out tech companies for pretending that increased surveillance and decreased privacy, both inside and outside our homes, are inevitable when they aren't. We at Loose Leaf Security would, of course, prefer that tech companies prioritize consumer privacy instead of pushing boundaries just because they can, but while it shouldn't be our responsibility, it is unfortunately true that consumers also play a role in normalizing privacy-stripping technology. Eveleth's article discusses the societal forces pushing tech companies to convince themselves and the general public that we can't stop them from eroding our privacy.

Architecture as a metaphor for privacy: We're probably not going to get to London to explore the spaces Hyphen Labs installed at the Tate Modern to illustrate how private different types of online interactions are, but a comfortable living room placed in a public space does feel right to us for most group interactions. When you're posting to semi-public or friends-limited social media, it really is a lot like hanging out in a really comfortable living room furnished by a particularly nosy landlord.

That's all for this week - thanks so much for subscribing to our newsletter! If there's a story you'd like us to cover, send us an email at looseleafsecurity@looseleafsecurity.com. See y'all next week!

-Liz & Geoffrey