Liz and Geoffrey are back with a look at physical computer security - just how much trouble could someone cause if they got access to your laptop for a few minutes? - and what sorts of problems disk encryption can and cannot solve. Also, security issues at popular social media services cause trouble for 90 million Facebook users and every Google+ user.
You can now find and subscribe to Loose Leaf Security on Spotify!
Timeline
- 1:18 - Security news: Facebook "View As" breach and report that phone numbers used for two-factor authentication are used for ad targeting
- 7:38 - Security news: Google+ is shutting down after it was reported that a bug could have been exposing data months ago
- 9:36 - Security news: Chrome extensions security changes
- 11:37 - Security news: Windows Update that accidentally deleted files
- 13:00 - Security news: Super Micro rumors
- 14:40 - Why physical security matters & possible attacks when someone has physical access to your computers
- 17:30 - Locking your screen
- 18:18 - Screw tampering & glitter nail polish
- 18:49 - How to minimize the risk a computer has been tampered with before you even get it & BIOS/firmware passwords
- 25:03 - How to deal with hard drives you don't use anymore
- 29:28 - Disk encryption
Show notes & further reading
Tamper detection with glitter nail polish
At the 2013 Chaos Communications Congress, Eric Michaud and Ryan Lackey presented a talk entitled "Thwarting Evil Maid Attacks," where they propose the use of glitter or pearlescent nail polish for low-cost and reliable tamper detection. The pattern made by the nail polish is a "physically unclonable function" - the result of a physical process that is hard to repeat in exactly the same way. By comparing the pattern with an old photo, you can tell whether someone has attempted to open up the device and re-apply the nail polish.
You can watch their talk on YouTube. WIRED has a writeup of the talk, as does Motherboard.
Cold boot attacks
If a computer with an encrypted disk is booted up, it generally has the decryption keys for the disk somewhere in memory so that it can work with the contents of the disk. In 2008, a team of researchers based at Princeton University found two ways to extract that data. First, if you reboot a computer without removing power, the memory chips remain powered on and generally retain their contents. Second, you can apply a cold substance to the RAM chips and then remove them from the machine, and they'll retain their data for minutes because the cold temperature slows down the rate at which the memory cells lose their contents. Their preferred cold material is the liquid that comes out of an upside-down can of compressed air (typically used for dusting electronics equipment) - there are some nifty photos in their academic paper.
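To give a sense of how key recovery from a captured memory image works, here's a minimal, hypothetical Python sketch: it scans a raw RAM dump (a made-up placeholder file called memory.img) for 16-byte windows with near-maximal entropy, which is one rough way to narrow down candidate AES keys. The researchers' actual tools go further and verify full AES key schedules, so treat this only as an illustration of the idea.

```python
# A simplified, hypothetical sketch of one step in cold boot key recovery:
# comb through a raw memory dump for 16-byte windows that look "random
# enough" to be candidate AES-128 keys. Real tools verify AES key
# schedules; this only illustrates scanning captured RAM.
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte in `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def candidate_keys(dump_path: str, window: int = 16, threshold: float = 3.5):
    """Yield (offset, hex bytes) for high-entropy regions of the dump."""
    with open(dump_path, "rb") as f:
        image = f.read()
    for offset in range(0, len(image) - window, 4):  # 4-byte aligned scan
        chunk = image[offset:offset + window]
        if shannon_entropy(chunk) >= threshold:
            yield offset, chunk.hex()

if __name__ == "__main__":
    # "memory.img" is a placeholder path for a captured RAM image.
    for off, key in candidate_keys("memory.img"):
        print(f"possible key material at 0x{off:08x}: {key}")
```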
Hard drive disposal
If you're not careful when disposing of hard drives, you might expose your personal data. If your disk is encrypted with a strong key and you have no reason to believe someone else has the key, you can safely get rid of your hard drive because an attacker can't get to your data without the decryption key. If your hard drive isn't encrypted, an attacker with physical access can boot it up from another drive and get access, bypassing your operating system's password.
Just deleting all your sensitive data isn't enough to keep it out of the wrong hands: when computers delete files, they typically just free the space they lived on to be used again without zeroing or overwriting the bits that made up those files. An attacker or an "undelete" tool can look through space marked as free on your disk and try to reconstruct files you deleted.
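As a rough illustration of how an "undelete" or file-carving tool works, here's a hypothetical Python sketch that scans a raw disk image (a placeholder file called disk.img) for JPEG start and end markers and writes out whatever it finds, deleted or not. Real recovery tools are much smarter about fragmentation and file formats, but the principle is the same: the bytes are still there until something overwrites them.

```python
# A rough sketch of signature-based file carving: deleted files are only
# marked as free, so their bytes can often still be found on the raw disk.
# "disk.img" is a placeholder path for a raw image of the drive.

JPEG_START = b"\xff\xd8\xff"   # JPEG file header
JPEG_END = b"\xff\xd9"         # JPEG end-of-image marker

def carve_jpegs(image_path: str, out_prefix: str = "recovered") -> int:
    with open(image_path, "rb") as f:
        data = f.read()
    count = 0
    pos = 0
    while True:
        start = data.find(JPEG_START, pos)
        if start == -1:
            break
        end = data.find(JPEG_END, start)
        if end == -1:
            break
        with open(f"{out_prefix}_{count}.jpg", "wb") as out:
            out.write(data[start:end + 2])
        count += 1
        pos = end + 2
    return count

if __name__ == "__main__":
    print(f"carved {carve_jpegs('disk.img')} possible JPEGs")
```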
Hard drives with spinning magnetic platters
For hard drives with spinning magnetic platters, you'll want to make sure you overwrite your data, including any space marked as free that may have files you deleted. There's a widespread belief that overwriting your modern spinning hard drive's binary data with a bunch of zeros isn't good enough to erase all physical traces of the data, but some researchers tried to reconstruct hard drives with a magnetic force microscope in 2008 with little success. The actual research paper is behind a paywall, but there's a good summary of its contents written by the first author. To be on the safe side, use the disk wiping tools built into your operating system, which will overwrite your disk with zeros and ones a couple of times, so you can be sure the data is unrecoverable.
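For a feel of what a multi-pass overwrite actually does, here's an illustrative Python sketch that overwrites a file in place with zeros, then ones, then random bytes. It's a toy - the path is a made-up placeholder, and real wiping tools also handle write caching, whole devices, and verification - so for an actual disk, stick with the tools built into your operating system.

```python
# An illustrative multi-pass overwrite of a single file, the same idea
# behind built-in disk wiping tools. This is a toy sketch, not a
# replacement for your OS's wiper.
import os

CHUNK = 1024 * 1024  # write 1 MiB at a time

def overwrite(path: str, passes=(b"\x00", b"\xff", None)):
    size = os.path.getsize(path)
    for fill in passes:
        with open(path, "r+b") as f:
            written = 0
            while written < size:
                n = min(CHUNK, size - written)
                block = os.urandom(n) if fill is None else fill * n
                f.write(block)
                written += n
            f.flush()
            os.fsync(f.fileno())  # push this pass out to the device

if __name__ == "__main__":
    overwrite("old-secrets.bin")  # placeholder path for illustration
```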
Solid state drives (SSDs)
Solid state drives (SSDs) are built out of flash memory chips, each of which has a limited lifespan. SSDs come with a few more chips than their actual capacity, and they avoid wearing out a single chip prematurely by rotating spare chips in and out. When a flash memory chip gets close to the end of its life, it instructs the drive to stop writing to it - but whatever data is on the chip at the time is on it forever. Since deleting data just instructs the SSD controller to consider the chip unused (to avoid wearing out the chip by overwriting the contents), if a flash memory chip reaches the end of life, any "deleted" data is preserved forever for an attacker to possibly access.
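If it helps to see that logic concretely, here's a toy Python model - not any real SSD's firmware - of a flash translation layer: writes always go to a fresh physical block for wear leveling, "deleting" just drops the logical mapping, and a block retired for wear keeps whatever was last written to it.

```python
# A toy model of why "deleted" data can survive on an SSD: the controller
# remaps writes to spare blocks, a delete/TRIM only marks a block unused,
# and a worn-out block still holds whatever was last written to it.

class ToyFTL:
    def __init__(self, num_blocks: int):
        self.flash = [b""] * num_blocks          # physical flash blocks
        self.retired = set()                     # worn-out blocks
        self.mapping = {}                        # logical addr -> physical block
        self.free = list(range(num_blocks))      # spare/free physical blocks

    def write(self, logical: int, data: bytes):
        phys = self.free.pop(0)                  # wear leveling: use a fresh block
        self.flash[phys] = data
        old = self.mapping.get(logical)
        self.mapping[logical] = phys
        if old is not None and old not in self.retired:
            self.free.append(old)                # old copy lingers until reused

    def trim(self, logical: int):
        # "Deleting" just forgets the mapping; the bytes stay in flash.
        phys = self.mapping.pop(logical)
        if phys not in self.retired:
            self.free.append(phys)

    def retire(self, phys: int):
        # Simulate a block reaching end of life: it's never written again.
        self.retired.add(phys)
        if phys in self.free:
            self.free.remove(phys)

ftl = ToyFTL(num_blocks=4)
ftl.write(0, b"tax return 2017")
ftl.retire(ftl.mapping[0])   # the block wears out with the data on it
ftl.trim(0)                  # the user "deletes" the file
print(ftl.flash)             # the retired block still contains the plaintext
```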
A team of researchers at the University of California, San Diego built a device to directly access flash memory chips, bypassing an SSD controller, which lets them read "deleted" data. Their device cost $1000 to build, but they think the cost can be lowered to $200. Forensics firm Gillware also posted a story about recovering wedding photos from a formatted SD card, by soldering some connectors to the middle of the card.
Unless you have access to extremely expensive disk shredders, the only way to ensure you won't expose data you delete from an SSD is to enable full-disk encryption BEFORE you put any of your data on it. If you have full-disk encryption on your SSD and a section of the disk goes bad, the contents on that section will still be preserved forever, but they won't be accessible without your strong decryption key. If you have already put your data onto an SSD, encryption will not protect you because segments of the disk may have already died with your data on it, so the safest thing to do is to hold onto that drive forever. (Unfortunately, one of the top search engine results for disposing of SSDs incorrectly suggests that you can encrypt the drive after it has had unencrypted data on it. We cannot emphasize enough how important it is to encrypt your SSD BEFORE you put data on it if you ever want to get rid of it - replacing the data with an encrypted version works much like erasing it, and carries the same risk that an old flash memory chip will still hold unencrypted data.)
Full-disk encryption and TPMs
The easiest and most standard way to encrypt a disk is called "full-disk encryption", although more precisely it refers to encrypting an entire filesystem or partition (e.g., everything that shows up under a single drive letter). Full-disk encryption generally works by creating a single cryptographic key to encrypt the entire disk, and then in turn encrypting that key with a password. The reason for the two encryption steps is that it allows changing (or temporarily removing) the password without having to rewrite the contents of the entire disk, but it does mean that everything on the disk is protected by a single key that doesn't change.
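Here's a minimal Python sketch of that two-step design, using the third-party cryptography package. It isn't BitLocker's or FileVault's actual on-disk format - just the general shape: a random volume key encrypts the data, and only the wrapped copy of that key changes when the password does.

```python
# A minimal sketch of password-wrapped full-disk encryption keys
# (not any real product's format). Requires the `cryptography` package.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def derive_kek(password: str, salt: bytes) -> bytes:
    """Turn the user's password into a key-encryption key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return kdf.derive(password.encode())

def wrap_volume_key(volume_key: bytes, password: str) -> dict:
    salt, nonce = os.urandom(16), os.urandom(12)
    kek = derive_kek(password, salt)
    return {"salt": salt, "nonce": nonce,
            "wrapped": AESGCM(kek).encrypt(nonce, volume_key, None)}

def unwrap_volume_key(header: dict, password: str) -> bytes:
    kek = derive_kek(password, header["salt"])
    return AESGCM(kek).decrypt(header["nonce"], header["wrapped"], None)

# The disk's contents are encrypted once, under volume_key; changing the
# password only replaces this small header, never the bulk data.
volume_key = AESGCM.generate_key(bit_length=256)
header = wrap_volume_key(volume_key, "correct horse battery staple")
header = wrap_volume_key(unwrap_volume_key(header, "correct horse battery staple"),
                         "new stronger passphrase")   # re-wraps the same key
```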
Windows has a built-in full-disk encryption mechanism called BitLocker, and macOS's mechanism is called FileVault.
In general, "full-disk" encryption leaves another small unencrypted partition on the disk, with enough software to load the encryption drivers. The contents of this partition are not sensitive (it's approximately the same for everyone), but if an attacker tampers with the partition, they can effectively break through the encryption routine the next time someone logs in. Disk encryption schemes that also use a TPM or other security chip allow verifying the software that's booted, which protects against this type of attack. If the initial software does not have an expected signature, the security chip will not give up its portion of the disk encryption key.
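To make the "sealing" idea concrete, here's a deliberately simplified Python simulation - not the real TPM command interface - of a chip that only releases a disk key when the hash of the boot software matches the measurement the key was sealed against.

```python
# A toy simulation of sealing a disk key to a boot measurement.
# A real TPM uses PCRs and its own command set; this just models the
# behavior: wrong boot code means no key.
import hashlib, hmac, os

class ToySecurityChip:
    def __init__(self):
        self._secret = os.urandom(32)   # never leaves the chip

    def seal(self, key: bytes, boot_code: bytes) -> dict:
        measurement = hashlib.sha256(boot_code).digest()
        mask = hmac.new(self._secret, measurement, hashlib.sha256).digest()
        return {"measurement": measurement,
                "blob": bytes(a ^ b for a, b in zip(key, mask))}

    def unseal(self, sealed: dict, boot_code: bytes) -> bytes:
        measurement = hashlib.sha256(boot_code).digest()
        if measurement != sealed["measurement"]:
            raise PermissionError("boot code changed; refusing to release key")
        mask = hmac.new(self._secret, measurement, hashlib.sha256).digest()
        return bytes(a ^ b for a, b in zip(sealed["blob"], mask))

chip = ToySecurityChip()
disk_key = os.urandom(32)
sealed = chip.seal(disk_key, b"original bootloader")
assert chip.unseal(sealed, b"original bootloader") == disk_key
try:
    chip.unseal(sealed, b"tampered bootloader")
except PermissionError as e:
    print("unseal refused:", e)
```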
Another approach to encryption is file-based encryption: this has the advantage of allowing encrypting some files and not others, but it also tends to leak more metadata about the encrypted files (such as the directory structure and sizes). Chromebooks use file-based encryption, along with a signed (but not encrypted) operating system. The Chromebook's TPM verifies the entire OS and protects the user encryption keys, yielding approximately equivalent security. You can see more details in Google's design documents for Chrome OS security.
In the news
If you're using Facebook logins to access other sites, it's a good idea to switch them to having their own logins (with unique passwords in your password manager). Software developer Robb Lewis has a writeup of how to change your Spotify account (in the past, Spotify only allowed login with Facebook, so many early accounts use Facebook logins). There should be similar ways to convert other accounts.
Google reported on an internal security effort they call Project Strobe, which identified systemic security-related improvements to make in Android and in third-party Gmail applications - and also recommended the shutdown of Google+. The same day, The Wall Street Journal reported on a security issue at Google+ and found internal memos fearing that public disclosure would lead to reputational and regulatory problems for Google. Several news sites have additional reporting, including The New York Times and The Verge.
Transcript
Liz Denys (LD): Geoffrey, you left your laptop on my dining table last night.
Geoffrey Thomas (GT): Oh, sorry.
LD: You know, if this keeps happening, I might change your desktop background.
GT: I mean, I can just change it back...
LD: No, because I'd definitely put a program on your laptop that would change it nightly.
GT: But you wouldn't.
LD: Yeah, yeah, I know. I'm far too lawful good to mess with other people's computers.
GT: Unfortunately, not everyone who gets physical access to your devices is going to be your friend or generally an honest person like Liz.
LD: That's why today's episode, the first in our series on securing your laptop and desktop computers, focuses on physical security. We'll cover what sorts of threats you face if an attacker gets physical access to your computer, and the tools you have, including disk encryption, that can help keep you safe.
Intro music plays.
LD: Hello and welcome to Loose Leaf Security! I'm Liz Denys,
GT: and I'm Geoffrey Thomas, and we're your hosts.
LD: Loose Leaf Security is a show about making good computer security practices easy for everyone. We believe you don't need to be a software engineer or security professional to understand how to keep your devices and data safe.
GT: In every episode, we tackle a typical security concern or walk you through a recent incident.
Intro music fades out.
GT: Lots of security news in the last month. Let's start by discussing what's been going on at Facebook. On September 28, Facebook disclosed that an authentication bug potentially affected 50 million accounts.
LD: According to Facebook, this was a pretty bad vulnerability with the "View As" feature, which lets you see what your Facebook profile looks like to other people. It's a well-intentioned privacy feature: if I post something that I only want my family to see, I might click View As Geoffrey to make sure Geoffrey can't see it.
GT: But there was a bug in the View As feature and in their video uploader, where Liz would not just see what Liz's own page looks like to me, but also get access to upload a video as me. And the video uploader, in turn, had permissions not just to upload videos, but to get an access token with all the powers of the Facebook mobile app.
LD: The effect of this is that an attacker could get an access token that would let them pretend to be the Facebook app, which would give them access to everything in your account, bypassing the login completely.
GT: Facebook preemptively disabled the access tokens of 90 million accounts - 50 million where someone had used View As and the video uploader, and another 40 million where View As had been used recently, as a security precaution. Those users had to log back into Facebook, so if you found yourself logged out recently, that's probably what happened.
LD: They could have disclosed this a lot better. I opened Facebook one morning and saw I was logged out and thought "that's odd". Then, a couple hours later, I saw something about the breach in the news, and it's never great to hear about a breach that might have affected you from the news instead of the website or service itself.
GT: Eventually they reduced their estimate of the attack to 30 million accounts, of which about half just had contact info accessed, and half had additional details like hometown, relationship status, and recent searches taken. The attackers apparently started with some accounts of their own, and kept using the View As bug to take over larger and larger groups of accounts.
LD: They still haven't publicly announced who the attackers were or what the purpose was - they say they're keeping quiet at the FBI's request, but European and Japanese regulators are also investigating the breach.
GT: So what can you do about this breach? Although the attack bypassed logins completely, which means that even strong two-factor authentication methods wouldn't have helped, there is one practical step you can take: don't rely on Facebook logins to third-party sites.
LD: Many sites let you click a "Log In With Facebook" button so you don't have to make a separate account. It's a convenience, but it also ties your security to the security of your Facebook account. Anyone with access to your Facebook account - like the attackers in this breach - can also authenticate as you to any site you're using that Facebook login for.
GT: This breach shows that even major websites with large security teams like Facebook have serious vulnerabilities on occasion, so it's worth not putting all your eggs in one basket. The idea behind "Log In With Facebook" is to make logins easier by having to remember and type fewer passwords, but a more secure and just as easy solution is to use a proper password manager and have its extension enter passwords into sites for you.
LD: You might wonder why keeping all your passwords in a password manager is safer than logging in via another site. While it's true that either Facebook or your password manager could get breached, an attacker would still need to figure out how to read your encrypted passwords if it was from your password manager, and without your master password that only you know, this is not a trivial task. We discuss the value of having unique passwords for every site and password managers in our episode "Securing your online account passwords."
GT: Facebook says that they don't have evidence these attackers actually tried to use third-party logins for any of the accounts they compromised, but it's still a wake-up call.
LD: Also, when you use one site's credentials to log into another like with "Log In With Facebook", you're tying all of your accounts to the policies of that one website you're using. If you're logging into Spotify from Facebook and your Facebook account gets suspended for whatever reason, you might not be able to get into your Spotify account anymore or access any of your cool playlists.
GT: In other Facebook News, Gizmodo reported on some uncomfortable behavior in Facebook's ad targeting system. Advertisers can give Facebook a list of phone numbers to show ads to, for instance, to target people already in their customer database. If Facebook has your phone number just for use with two-factor auth text messages or for security alerts, even if it's not on your profile, advertisers can still target your account with that phone number.
LD: In February they got a lot of heat for sending notifications to a phone number you use for two-factor authentication, including copying replies to your Facebook page. They later said this was a bug, but it shows that there's no real internal separation between security phone numbers and the rest of your profile.
GT: Since May you've been able to sign up for two-factor auth on Facebook without providing a text message number at all. That's probably the best way to sign up for it - especially because text-message based two-factor authentication is basically the least secure option.
LD: Facebook also creates "shadow profiles" for everyone, including people not on Facebook, by collecting data other users upload to Facebook. There's no way to access that "shadow profile" information associated with your account, but they still allow advertisers to target that information.
GT: For instance, if I use the Facebook app to upload my contacts to Facebook, and I have Liz's home phone number in there, Facebook will now store that number with Liz's profile, whether or not Liz actually uploaded that number.
LD: So if an advertiser starts targeting landline phones in my area - or even just my number - I'll see an ad on my computer as if it was targeted to me. And I can't even see that they're targeting me with this data, because it's in Geoffrey's address book, not mine.
GT: They make an argument that they're actually protecting my privacy because it's my address book, which I kind of see but, well, it's Liz's information too. Anyway, I'm personally extremely paranoid, and I don't actually use my phone's contacts feature at all - I just remember people's phone numbers like in the old days.
LD: A much more reasonable way of doing this is to just not grant contact access to apps that don't totally need it. Unfortunately, you probably have a lot of Facebook friends, and at least one of them is probably using Facebook Messenger for text messaging. There's no real reason you'd need to roll your texts into Messenger, but it is a good front for Facebook to get that contacts data.
GT: Facebook Messenger doesn't fundamentally make your text messages more secure, so there's no reason to grant that access to get them all in one place.
LD: If you want to increase your text message security, the current gold standard is Signal, which is independent and doesn't want to tie your messages with some larger social media service.
GT: Although honestly I prefer having Signal and everything else as separate apps, and just using my phone's built-in SMS app for SMS only, because then it's clear to me what's encrypted and what's not.
LD: Another social media network announced a security vulnerability so serious that it's shutting down entirely. The Wall Street Journal reported last week that Google+ exposed private data of hundreds of thousands of users as Google announced that they would be sunsetting the consumer parts of Google+.
GT: Specifically, an interface for use by third-party developers could be used to access profile information - name, email, gender, age, etc. - even for users who had marked that data private. Google says they discovered the weakness internally and fixed it immediately, and although they keep limited logs for privacy reasons, they had no evidence within those logs that anyone was abusing this.
LD: The Journal also claimed that Google internally discussed whether to announce the fix in March when they discovered it, and they decided against it, worried about their reputation and whether regulators would get involved.
GT: Honestly, I think what they did was fine, and there isn't really a scandal here. I think it's a little misleading the Journal says the data was "exposed" - unlike the Facebook case, where they found unusual activity and investigated, Google just discovered the design flaw themselves and fixed it, and there's no evidence anyone else had found it. Everyone has security bugs and fixes them on a regular basis. I mean, Windows Update fixes security bugs all the time, and nobody says that Microsoft should be sending breach notifications to every Windows user.
LD: But Microsoft does notify Windows users, in practice. You know when your computer is being restarted for a security update. They tell you exactly what is being patched and why.
GT: Mm yeah, that's true, there is a little more accountability there. You know they're patching things regularly, but also that they're probably not patching major stuff all the time after release. For a website you have no idea what they're doing.
LD: Websites don't really need to alert people, but maybe they should have a public blog of all the things they've fixed and other changes they've made.
GT: Yeah, that's not a bad idea. Desktop software is basically forced to do that, and we've gotten used to it. I suspect whoever does this first for web software is going to start out looking a little bit bad, but people will get used to it like we did for desktop software, and maybe the Google+ issues are enough to make people realize that this is needed and worth doing.
LD: Google Chrome is making some changes to extensions to improve security. First, in the next release of Chrome, they're giving users finer control over which webpages extensions get to work on.
GT: Currently, an extension can choose to limit which sites it has access to, but if you want to install the extension, you have to accept whatever list the extension chose. Maybe you have an extension for showing easter eggs for a particular webcomic, and it says it only wants to modify that web comic's site - that's great! But something like your password manager or an unwanted content blocker wants access to all sites, and until the new Chrome version comes out, you have no choice but to grant it access to all sites or just not install it at all.
LD: In the next version of Chrome, you'll be able to restrict extensions' access to a custom list of sites you choose and also be able to configure extensions you use to require you to click on them to grant them access to the current page.
GT: The other changes also seek to improve security but are more behind the scenes. Extensions that request broad access to read or modify lots of sites or all sites will be undergoing more scrutiny. So if that extension to display a particular web comic's easter eggs requested access to all sites, Chrome might no longer allow it.
LD: Chrome is also requiring that extensions not use obfuscated code, both because it makes review more difficult and because they've found that about 70% of malicious extensions used it.
GT: Software authors who want to keep their precise techniques and algorithms secret tend to use code obfuscators, which make it harder to reverse-engineer what the code is doing. If you're selling proprietary data-processing software to businesses in other countries or something, maybe this makes sense. But if you're providing an extension that runs in individual people's web browsers and has complete access to their websites, this isn't particularly reasonable; people should be able to know exactly what their extension is doing.
LD: Next year, they'll also be requiring that all Chrome Web Store developers have two-factor authentication on their accounts so it's harder for hackers to access a developer's account and ship malicious code.
GT: And if you've yet to set up two-factor, you should follow Chrome's lead and enable that right now. To learn more, check out our episode "Two-factor authentication and account recovery." A recent Windows 10 update accidentally deleted files.
LD: Yikes.
GT: Yeah, this is a pretty bad bug, but it's fixed now, so if you were holding off on taking Windows updates, you don't need to any longer.
LD: Not only is this a nasty bug, but also a rather unfortunate one. Automatic updates are generally regarded as the first line of defense to secure your devices, and issues like this are extremely rare.
GT: I can only think of one similar case to this: back in 2011, some unofficial tools for installing graphics drivers on Linux had a typo, and they would uninstall everything instead of just the old version of the drivers. I know some people were worried about the iOS 10.3 upgrade, which literally changed the filesystem format on existing phones, but that went extremely smoothly because they spent a lot of time thoroughly testing it. So seeing this sort of bug in a routine update from Microsoft is shocking because it's so uncommon - but on the bright side, it is very uncommon.
LD: Microsoft's tech support will help you try to undelete your files if you were affected by this. Their recommendation is to use the computer as little as possible, which makes it more likely that you can undelete those files.
GT: Undeleting is a tricky process that isn't guaranteed to work, and we'll actually touch briefly on how it's even possible later in this episode. However, a better option is to restore from backups, if you have them.
LD: Backups are invaluable when dealing with many types of security threats, and they're actually the topic for our next episode. So, uh, Bloomberg published a, hmm, really big if true piece about Chinese spies secretly embedding rice-sized chips onto the motherboards of servers assembled by Super Micro, who then sent them to major tech companies.
GT: Did this even happen?
LD: Probably not - Super Micro, Apple, Amazon, and even US intelligence officials have denied this story.
GT: Hmm. But… could it have happened?
LD: Oh, definitely. I'd imagine people would have noticed it sooner, especially if it was trying to infiltrate that many different companies. But yes, theoretically, a small chip on a motherboard would have the access needed to do some really nasty things.
GT: It's such a weird way to go about this attack, though, it certainly would have been a lot easier just to subtly modify the firmware to inject malicious code in.
LD: And possibly code to say it's been reinstalled when it hasn't, in case whoever gets your product is paranoid enough to do that.
GT: So, where'd this story come from?
LD: There's speculation that someone gave Bloomberg this story with the intent to feed the flames in US-China relations. Most hardware does come from China, and this would severely undermine trust in Chinese hardware.
GT: Yeah, and detecting physical attacks is pretty difficult.
LD: Yep, and while most of us aren't likely to be personally targeted by a foreign power, a lot of the same sorts of physical attacks on our desktop computers and laptops could happen to us, and our main segment today is all about securing your personal computers from physical attacks.
Interlude music plays.
LD: So why is physical security so important for your laptops and desktop computers? In a sense, it's obvious - an attacker has the device in their hands - but there's a bunch of details of what this means exactly.
GT: For instance, they can copy out the data from an unencrypted disk, or if the computer is still running - even if it's just suspended but not powered off - they can copy out the data from RAM.
LD: They can probably boot up the computer from another drive, like an installation CD or USB drive, and reset any passwords on the computers.
GT: With enough time and equipment, they can add in malicious hardware. For instance, they can add a keylogger, a small hardware device that captures everything you type, including your passwords, and reports it to the attacker for later use.
LD: Computers have traditionally been built in ways that make it easy for people who can get inside the computer to gain access. There's often a switch or button inside the computer you can use to reset a firmware password, for instance.
GT: Law number three from Microsoft's old ten laws of security reads, "If a bad guy has unrestricted physical access to your computer, it's not your computer anymore."
LD: In this episode we're primarily going to be talking about laptops and desktops, because we've talked about phone security in detail recently. But these concepts apply broadly to phones, tablets, and basically any electronic device you have.
GT: Phones are often designed to be more resistant to physical attacks, at least - they're not designed to be opened easily by the general public, the way desktops and many laptops are. And they often don't have reset buttons because of that.
LD: On the other hand, computers are designed to physically interface with your keyboard, your tablet, external hard drives, thumb drives, and so on. Generally speaking, they're also designed to potentially be opened up and taken apart pretty easily so that you could upgrade the memory or hard drive, though this has become less and less the norm with non-enterprise-grade laptops.
GT: But just like you can open up your computer to upgrade your hard drive to a larger one, an attacker could take your computer and open it and swap out the hard drive - they could replace your hard drive with their own, with a malicious login screen that steals your password as soon as you type it, and if it's unencrypted, they can also just attach it to their own computer and read the data.
LD: An attacker doesn't even necessarily need to take your drive to copy your data - they could just boot from another drive, like a USB stick, to bypass your operating system's password, access your hard drive, and then copy all the files.
GT: An attacker could also open up your computer and add in a hardware keylogger or some other bit of malicious hardware. It doesn't have to be as stealthy as the tiny chip Bloomberg allegedly found - very few of us recognize the insides of our computers well enough to notice something new.
LD: Or they could turn off some layer of security - physical access generally allows someone to do firmware resets or disable Secure Boot, a security feature that detects whether you're booting the intended operating system by checking that it's signed with a cryptographic key known to the vendor.
GT: And it's not always easy to tell whether someone messed with your computer. If it was off, nothing's stopping them from turning it on, booting it off their USB drive, doing nefarious deeds, and turning it right back off.
LD: If it's on and has a locked screen, it's a little bit protected against casual attacks - you might get suspicious if it's been powered down when you get back. Anything involving booting it into a new OS or tampering with the insides generally requires shutting it down.
GT: Then again, you might just think it crashed, shrug, and turn it back on.
LD: And there are a few attacks that are easier when a computer's turned on. For instance, some high-speed connections like Thunderbolt allow direct access to the computer's memory, and there's even a technique involving spraying freezing liquid on a RAM chip so you can remove it without losing its data.
GT: It requires opening up the computer, so it's not really the sort of attack someone can do at a coffee shop while you're putting more milk in your tea, but if you're leaving your computer at a coworking space overnight with just a cable lock to prevent theft, maybe you should worry about these sorts of things.
LD: You might be able to narrow down whether or not someone opened up your computer if you had a difficult to replicate marker over the screws. Glitter nail polish actually works pretty well for this, since the glitter scattering pattern is essentially random. Apply it over the screws, wait for it to dry, and take pictures of how the glitter scattered. If someone tampers with the screws and tries to replace the polish, they'll end up with a different glitter scatter. Nail polish isn't perfect though - sometimes, it falls off, and it can be tricky to apply the right thickness of coat so that it doesn't just peel off in one piece.
GT: All of this helps make sure you're not any less secure than when you first got your computer, but it's also possible that your device gets compromised before it gets into your hands, sort of like what Bloomberg said was happening with Super Micro.
LD: There's even more power granted to an attacker who gets your computer before you do - they can set passwords for the BIOS or firmware setup, and if they're particularly sneaky, they can make it so you can't even reset or reflash the BIOS.
GT: There are sometimes also security chips that can only be set up once, or have some concept of an initial owner. Several business-class laptops have theft-protection firmware that reports the location of the computer, and you can't easily disable it once it's been enabled - otherwise a thief could just do that. But if you're not the first person who got to the computer, the firmware thinks the attacker is the legitimate owner and you are the thief - and just reports your location to them.
LD: Once someone has gotten full access to a computer, it's very hard to figure out what they changed, if they're trying to hide that from you.
GT: A common example of this is a laptop you get from your workplace. They often have legitimate reasons to monitor everything that's being done on the device, so they can make sure nobody is attacking them or running off with confidential data. And they probably want to do that in a way that's very hard to turn off.
LD: It might be fine to log into an instant messaging account or something on your work computer - but if it's an IM account that's associated with other things like Facebook with social media or Google with your email, calendar, etc., that's a lot of access that you're effectively giving them. They're also likely to want remote access for totally reasonable purposes, like installing updates or letting the help desk look at problems.
GT: Hopefully your employer isn't out to get you, but it's probably in your best interest to be a little cautious, especially because employment relationships can turn sour in unexpected ways. My current employer does very extensive web monitoring - and tells us that up front - so a bunch of folks, including myself, have a personal laptop or tablet on our desk, on the guest wifi, which we use for things like personal email and streaming music.
LD: Another potential route for tampering is someone getting a shipment of your new computer before you get to it.
GT: There's a pretty amazing photo from the Snowden leaks of NSA employees carefully removing tape from a box of networking equipment so they can install an "implant" and then give the box back to the shipping company.
LD: I'd hope most of our listeners are not NSA targets - if you are, you probably want more specific help than you can get from a more broadly targeted advice podcast - but maybe if your packages are sitting in the common area in your apartment complex, or if you don't trust your housemates or whoever accepts the packages for your building, it's something to worry about.
GT: The safest way to get a new computer is to just walk into a retail store, without ordering anything in advance, and pick up something off the shelf, or even ask for something from their storeroom, without giving your name. It might be a little bit excessively paranoid - for most people this just isn't a concern at all. But it's how I've been buying computers for the last few years, because it's not too much out of my way to go to a store.
LD: Speaking of buying a new computer from the store, the default Windows installations used to be really bad, with all sorts of stuff preinstalled that might be awful for your security.
GT: There was one a few years ago that basically disabled all SSL checking by accident. Microsoft has been pushing to clean this up with Windows 10.
LD: On the other hand, consumer versions of Windows itself send more data to Microsoft than you might be comfortable with.
GT: We'll talk about this a little more later in this series, when we cover software issues and malware in detail.
LD: When you get a new computer, it's probably a good idea to set a firmware password so nobody else can reconfigure your firmware settings.
GT: The system's firmware - often called the BIOS, after the name of the firmware on the original IBM PC - controls the boot process, including letting you select which drive to boot from, enabling and disabling ports, and so forth.
LD: If you set a firmware password, it's much harder for someone to boot your computer up from another drive or reflash the firmware with a malicious version.
GT: If you've never booted up your computer from a CD drive or a USB stick, you might be wondering what the legitimate use of this is. If something goes wrong with your OS, you might need to reinstall it or fix it by booting it up from an install disk or a recovery disk.
LD: So, if you set up your BIOS or your firmware to require a password to boot from another drive, you'll need to remember the password in case something does go wrong.
GT: Hm, I don't have a firmware password but I should probably do that. There's little downside as long as I don't forget it.
LD: Just put it in your password manager!
GT: Oh right - I'll generally have my phone with me, so this sounds better than not having a firmware password at all.
LD: Back at the beginning of this segment, we listed a bunch of different attacks that become possible when someone gets physical access to your device, and many of them involved getting the contents of your hard drive.
GT: As a reminder, anyone can remove your disk and plug it into another computer, and removing a hard drive is often easier than disassembling the entire computer.
LD: And even with a firmware password, there's usually a switch or button inside to reset it, so an attacker could open up your computer, press that button, and then just boot it up off an external drive. Or sometimes, your firmware settings are stored in CMOS memory, an older technology that needs power from a small battery to stay set, and that battery could just run out.
GT: Either of these provides a way around your login password. If your files aren't encrypted, they're just sitting there on the disk, and it doesn't matter how strong your login password is, because it only matters when your own OS is running. It's useful for the screen saver, but that's about it.
LD: It's like the old saying that locks keep honest people honest. Just like you can get past a locked gate by climbing over it or removing the hinges, if your files aren't encrypted, an attacker who's got a little bit of time and maybe a screwdriver can skip the login screen and get to your files.
GT: Most computers, by default, aren't encrypting your hard drive, but disk encryption is incredibly useful for having peace of mind with lost or stolen laptops or hard drives.
LD: That's actually one of the most common threats. For most of us, nobody's out to get us specifically, but if we get a laptop stolen or misplaced, maybe when traveling, we want to make sure our personal information is safe. A thief might just take the opportunity to steal the identity of whoever used the laptop, without having any sort of plans for it in advance.
GT: And they can get into a lot of stuff: any financial or tax records you might have, any logged-in accounts like email or bank accounts, maybe photos of family members, and so forth.
LD: If you lose a laptop with an encrypted disk, it's still a bummer, but you're just out the cost of replacing the hardware. You know that what's on there isn't accessible.
GT: Another very similar risk is when you intentionally get rid of an old laptop or an old hard drive.
LD: Whenever you get a new computer, it's tempting to get that set up and forget about the old one, but you should make sure you do something responsible with the old hard drive, especially if you're giving the machine away. You might not want whoever gets it next to have all your data.
GT: If your hard drive is encrypted with a strong key, you don't need to delete your data because someone without the key can't get to it in the first place. You should still wipe it to be safe, but having it encrypted gives you another level of protection.
LD: The risks of leaving your encrypted data on a drive are that your password gets guessed or the encryption gets broken, but encryption algorithms have been remarkably solid recently, as long as you use a good encryption key. Most disk encryption software bases the key on your login password by default, so it becomes much more important to make sure you have a strong login password when you're using disk encryption.
GT: So what happens if you try to erase files on an unencrypted disk? Is there a way to reliably clear your data?
LD: If you have a solid state drive, or SSD for short, it's built out of a bunch of flash memory chips, each of which has a limited lifespan. So if you "erase" a large block of data, the SSD does that more efficiently by just marking it as unused but not zeroing out that data, until something comes along later and wants to write to it. And if the chip with the "erased" data goes bad before it can be reused, it might never get overwritten. That means your data might be exposed forever.
GT: Probably the safest thing to do is never get rid of an SSD or a USB drive that has had sensitive data on it. If it's only got vacation photos or something, it's fine, but if you used it as your regular hard disk, or you had tax documents on it or something, there isn't a way to be confident that the data is gone.
LD: There are some services for businesses that do SSD destruction, but for personal use, the simplest answer is unfortunately just to hold on to it forever.
GT: For a hard disk drive with spinning magnetic platters, it's commonly believed that overwriting data with zeros may not completely erase all physical traces of the data. It's sort of like turning off a light - when you turn off a light that's been on, it's still a little bit warm, it might glow dimly for a while after the power's cut, and the light itself shows signs of use. While your computer will still read a zero for something that's been deleted, someone with specialized tools could potentially look at the physical magnetic field to guess which bits were originally zero and which bits were ones that were overwritten with zeros later. This is why some businesses will wipe disks with really strong magnets, but they're not really practical for home use.
LD: But the whole idea that you can recover data off spinning disks by reading the physical magnetic field seems to be speculation - it's possible for old-school floppy disks, where the magnetic tracks may not line up every time you write to it, but for modern hard disks nobody has been able to meaningfully recover data.
GT: In 2008, some researchers tried this with a magnetic force microscope [the actual research paper is behind a paywall], and concluded that on a fresh drive that was used once and then overwritten, they could recover a couple of bytes. For anything more heavily used, they couldn't do better than random guessing.
LD: The disk wiping tools built into Windows 10, macOS, and other operating systems can overwrite your disk a couple of times, which makes the data unrecoverable.
GT: But also when we talk about how to erase data, that doesn't necessarily correspond to what happens when your computer's filesystem deletes a file. Typically, they'll just mark the space as free without overwriting it with zeros. If you remember Microsoft's botched Windows update that deleted files, they were recommending using an undelete tool, and this is actually how undelete tools work - they look through unallocated areas of your drive for things that look like parts of files and try to assemble them back into the original files.
LD: Sometimes, directories or folders are also deleted in the same way, so it might be possible to patch together large swaths of deleted data and their association to nearby files, too.
GT: On an unencrypted disk with the right software, this is just about as easy to read as data you haven't deleted.
LD: On an encrypted disk, it's not any harder to access than files you haven't deleted. So if someone doesn't have your encryption key, this keeps your files safe. But if they do have access to the decrypted drive - maybe they guessed your password or they got access to your computer while it was running - then the deleted files are still potentially easy to get to. We'll talk more about disk encryption after a quick break.
Interlude music plays.
GT: So we've talked a bunch about why disk encryption is so useful. What's the best way to enable it securely?
LD: Definitely the best way is to set it up when you're setting up your computer, so you don't have to worry about unencrypted data still being on the drive.
GT: But any time is a good time to start.
LD: There are a couple of third-party disk encryption tools, but we're fans of the ones that are built into popular operating systems - BitLocker for Windows and FileVault for macOS.
GT: If you're using Linux, there's probably something integrated with your Linux distro that you can use. For people with Chromebooks, Chrome OS actually has disk encryption built in as a default feature - you actually can't set up Chrome OS and not use encryption.
LD: BitLocker will prompt you to choose a password to use for encryption, and FileVault will just use your regular login password to encrypt the drive. In both cases, you want to make sure that the password you're using is strong because you're entrusting your security directly to that password. And make sure that it's not a password that you're using in an online service, because in case that gets breached, an attacker might be able to associate that with you and then with your hard drive's encryption.
GT: The setup process will also generate a recovery key, which you can store either on a USB drive or in the cloud, if you trust Microsoft or Apple's cloud services. This key is the only way back into your disk if you forget your password, so make sure to keep it safe.
LD: There's another way of using full-disk encryption with additional security, by tying the key to the physical hardware of your computer. It requires a special security chip, which isn't in all computers, so you'll have to check if your laptop or desktop supports it. It's similar to the security chips in Apple and Google smartphones, which are also used for encrypting data and keeping it safe against physical attacks.
GT: One of the risks with regular disk encryption is that someone can steal your disk, mess with the boot-up code in the operating system to install a keylogger or virus or something, and put it back. Then when you get prompted for your strong password, you'll type it in and the attacker's code will be watching.
LD: This is a different sort of threat than just losing your laptop - it requires someone to have access to your laptop and then give it back to you without you noticing. The security industry has started calling this the "Evil Maid Attack" - if you leave your laptop in a hotel room while you're out for dinner, the housekeeping staff or someone who bribes them can get to your room, break into your laptop, and then put it back before you notice.
GT: So there are hardware security chips that store part or all of your disk encryption key in a physically tamper-resistant package, and make sure that the OS being booted up hasn't changed in an unexpected way. If someone messed with the OS or swapped out the drive, it won't give up the encryption key.
LD: And if someone put the drive into another computer, without the chip, there's no way to get the key at all.
GT: It isn't 100% hack-proof - the sort of highly-skilled attack Bloomberg suggested would be able to defeat it by inserting a new chip between the security chip and the rest of the system and lying to the security chip about what's going on. But that requires serious electronics skills, not just installing some malware on the disk.
LD: On PCs, this chip is called a Trusted Platform Module, or TPM for short. Microsoft has been requiring it on recent machines with Windows 10 pre-installed, but on older computers, you'll generally only find it on business-class laptops and desktops. BitLocker has a "transparent operation mode" which puts the key entirely on the TPM, so you don't need a separate BitLocker password. But you can also set an additional PIN for more security.
GT: All Chromebooks, from the original 2010 model onwards, have TPMs and use it as part of the Chrome OS security design.
LD: The most recent MacBook Pros have a T2 chip, which they use for disk encryption by default. You can also put FileVault on top for additional protection.
GT: So the flip side of making it harder for an attacker to get to your files is that all of these options make it harder for you to get to the files, too, if something goes wrong.
LD: Apple specifically said you should back up all data on laptops with the T2 chip, in case that chip gets damaged.
GT: Also, if you're using full-disk encryption without a TPM or similar chip and you suspect someone has tampered with your hard disk, you shouldn't use the disk at all. You don't know that you're typing in your encryption password somewhere secure. So in many cases, the safest thing to do is to reinstall your computer and restore from backup.
LD: And if you're at high risk of attack and you notice the pattern of glitter nail polish has changed, you probably shouldn't trust that laptop at all any more.
GT: The other way that backups help is by making you confident that when anything goes wrong, you can always start from scratch. If you don't trust your backups, you're going to be tempted to keep using your possibly compromised existing laptop just so you can keep doing your work.
LD: We'll be talking about backups - different approaches to backups, where to store them, and how to make sure that they're secure and available - in our next episode.
GT: Tune in in two weeks as we continue our series on securing your laptop and desktop computers by looking at backups.
Outro music plays.
LD: Loose Leaf Security is produced by me, Liz Denys.
GT: Our theme music, arranged by Liz, is based on excerpts of "Venus: The Bringer of Peace" from Gustav Holst's original two piano arrangement of The Planets.
LD: For a transcript of this show and links for further reading about topics covered in this episode, head on over to looseleafsecurity.com. You can also follow us on Twitter, Instagram, and Facebook at @LooseLeafSecure.
GT: If you want to support the show, we'd really appreciate it if you could head to iTunes and leave us a nice review or just tell your friends about the podcast. Those simple actions can really help us.
Outro music fades out.