I hate separating hackers based on morality.

I have given a few talks recently to non-hacker audiences. In so doing, I learned that even at its most basic level, the idea of what hacking is, is kind of lost on "normal people." The "WannaCry" malware couldn't have better illustrated the things I was trying to teach.

It's not that normies aren't capable of understanding, it's that they have been given the wrong information by the government, the media, and popular culture for years. There is this fairly lame idea of hackers following a sort of monochromatic gradient matching that of the Old West: the good guys wear white hats, the bad guys wear black hats, and there is a spectrum of moralities in between. There are legitimate ethics that guide hackers, they just aren't the kinds that you hear about in movies and on TV:

  1. The Sharing Imperative – Hacking is a gift economy. You get tools, knowledge and code for free, so you have to share what you have learned to keep growing the pool.
  2. The Hands-On Imperative – Just like “real” science, you have to learn by doing. Take things apart, break them even, and learn how they work. Use that knowledge to create interesting things.
  3. The Community Imperative – Communities (geographic, philosophical, etc.) are how it gets done. Crews, clubs, chat rooms, hackerspaces, conferences, and email lists are all places for n00bs to ask questions and get flamed, and for l33ts to hold court.

Monochromatic Morality
The typical whitehat is a security researcher, penetration tester, or security consultant who only hacks the computers and networks that they have permission to hack. This can either be a lab environment built for research, a client who has retained security services, or an employer who has granted express permission. Whitehats then disclose their findings. This disclosure may be for the benefit of a client or an employer, or it may be to benefit the public. The key differentiator is that the whitehat gets permission and then shares their discovery for the benefit of others.

The typical blackhat is generally considered to be a criminal. They hack systems that do not belong to them and then do not disclose their findings. The exploits that they discover are hoarded and stockpiled for their benefit alone. The key differentiator is that blackhats do not seek permission, they do not disclose their findings, and they hack for personal benefit.

The gray areas have to do with the degree to which a hacker has permission, discloses their findings, and how they profit from their activities. Whitehats have “real” jobs and share everything, blackhats don’t have jobs and therefore hack for money. A typical grayhat might hack systems that don’t belong to them but then anonymously share their findings, or they might develop their exploits in a lab, but then sell those exploits rather than disclosing them.

In my professional life, I routinely employ hacking tools for the benefit of my employer, whether it’s scanning networks to troubleshoot problems, or cracking passwords to help users who have lost access to their computers. In previous jobs, I have exfiltrated research data from one network to another at the request of the data’s owner. While I don’t always have my employer’s explicit permission to do what I do, they hired me to fix problems for their users, so I do what it takes. The things that I learn, I then share and teach to others, whether that’s talks at conferences or Cinci2600 meetings, or posts on this blog. I have no idea where that falls in the white/gray spectrum.

Chromatic Pragmatism
Instead of black and white, I prefer to look at hacking from a red vs. blue perspective. Regardless of your moral compass (or that of your employer), you are either on the offensive end, which is the red team, or the defensive end, which is the blue team.

Teams are better terms to think in because hacking is a social activity. You may or may not be physically alone, but you are always learning from others. You read docs and code, you try stuff, you get stuck, you look up answers and ultimately ask someone for help. The idea of hackers as introverted smart kids living in their mom’s basements isn’t nearly as accurate as TV would have you believe.

Regardless of the reason why you are hacking a computer or a network, you are either the attacker or the defender. You are either probing defenses looking for a way in, or you are hardening defenses to keep others out. You can further divide these activities into application vs. network security, but at that point the discussion is more about tools.

Thinking about hacking in terms of offense and defense takes away all the politics, business, and patriotism of your red and blue teams. If you are a red teamer, backed by your country's military, you might be doing black hat stuff for a "good" cause. You might be a blue teamer working for an organized crime syndicate, doing white hat stuff for "bad" people. You might be a whistleblower or a journalist, exposing bad acts by a government.

WannaCry: with the good comes the bad, with the bad comes the good
The WannaCry debacle is interesting because of its timing, its origin, its disclosure, and its impact.

Its timing is interesting because nation-state political hacking made up like half of all discussions around the Presidential election. Its origin is interesting because the tools in the leaked sample appear to come from the NSA. The leak comes from a group known as the "Shadow Brokers," who said they would auction the rest for a large sum of money. The disclosure is interesting because the first release is a free sample to prove the quality of the goods they intend to auction.

The zero-day exploit exposed by the leaked tools was then used to implement a large-scale ransomware attack that severely affected systems in Europe and the UK. A researcher located a kill switch in the ransomware (a call out to an unregistered domain) and, by registering that domain, stopped the attack dead in its tracks. There are lots of theories about this strange turn of events, but my personal theory is that the ransomware campaign was a warning shot. Possibly to prove out a concept, possibly to urge everyone to patch against the vulnerability.

The idea that NSA tools were compromised and disclosed by a criminal organization turns the whole black hat/white hat thing on its head. The NSA was hoarding exploits and not disclosing them, which is a total black hat move. The Shadow Brokers exposed the tools, prompting a widespread campaign to fix a number of vulnerabilities, which is a total white hat move. So you have a government agency, a "good guy," doing bad things, and a criminal organization, a "bad guy," doing white hat things.

If you want to talk about the specifics of the hack, the NSA's blue team didn't do its job, and the Shadow Brokers' red team ate their lunch. The blue team's principal asset was a server where attacks were either launched or controlled. This server was the red team's target. It's a pretty epic win for the red team, because the NSA is a very advanced hacking group, possibly the best in the world.

Windows Hyper-V Manager is Stupid

I spend many hours at work in the middle of the night. Sometimes I work on my own things by connecting to my gear at home. I call this telecommuting in reverse. In order to facilitate my reverse telecommute, I use a couple of machines: one Linux box I call Hub, for OpenVPN, SSH, and NeoRouter, and one Windows machine I call Portal, for TeamViewer, Remote Desktop, and running my DNS host's Windows-only dynamic DNS client. Hub died, so I figured I would run the two machines on one box via XenServer or VirtualBox. It turns out that the hardware for Portal doesn't do Linux very well, so I decided to take a run at virtualization with Hyper-V. Hyper-V Server 2012 R2 lets you evaluate the product indefinitely, so I thought that would be a good place to start.

After downloading the ISO, which is hard to locate on the MS TechNet site, I burned it to disc, wiped Portal, loaded Hyper-V Server, and configured a static IP for it. This isn't a high-end box; it's a dual-core AMD with 8GB of RAM. It's fine for using Windows 7 as a springboard to get into my home network. I just want to spin up a couple of low-end Linux boxes and a Windows machine. The sconfig.cmd tool is fine for the basics of setting up the box, but since I am not much of a PowerShell guy, I wanted to use the Hyper-V Manager on another workstation. I was trying to do this without having to pirate anything, and it turned out to be a complete waste of time.

Hyper-V Manager and the Hyper-V Server that it can manage are basically a matched set. You can use the manager on Windows 7 to connect to Hyper-V on Server 2008 and earlier. You can't really use Win7 or Win10 to manage 2012 R2. So I basically have to either pirate Server 2008, pirate Win8.1, or pirate Server 2016. Or I can just use ProHVM, a third-party tool from a Swedish company that seems to have been invented specifically because Hyper-V Manager is the worst.

Even with ProHVM, it’s not all champagne and roses. Accessing the console of a VM causes wonky keyboard performance. This is mildly frustrating, so I recommend using a mouse as much as possible for configuration of a VM. The only real showstopper is logging in to a Linux box with no GUI. Having only 50% of your keystrokes register makes logging into the console completely impossible because you don’t see the *** to let you know which character you are on.

My workaround for Debian VMs is to not set a root password, which forces Debian to disable root in favor of sudo, like Ubuntu. Then you set a very short password for your user account (like 12345, same as the combination to my luggage) and make certain that you set up an SSH server during setup. Then you can SSH to the box and use the ‘passwd’ command to reset the password to something more secure. Then you can configure SSH keys for your logins.
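In practice that last bit looks something like this (the address and username here are just placeholders for your own VM):

ssh steve@192.168.1.50   # log in with the short throwaway password
passwd                   # swap it for something long and random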

So if you find yourself in a situation where you need to do virtualization on Windows, and you are deeply invested in the idea of using 2012 R2, don't bother with Hyper-V Manager. Instead, download ProHVM, and then use ProHVM as little as possible. It's free for non-commercial use, and you can build new VMs and do all that stuff that you *should* be able to use Hyper-V Manager for.

My guide to setting up SSH keys with Putty

I have become a kind of fan of Cloud At Cost. Their one-time-fee servers and easy build process are great for spinning up test machines. I would hardly recommend running anything that I would consider "production" or mission critical on a Cloud At Cost VM, but it is a cheap, quick, and simple way to spin up boxes to play with until you are ready for more expensive/permanent hosting (like with Digital Ocean or Amazon). Spinning up a new box means securing SSH. So here is my guide.

The major problem with a hosted server of any kind is drive-by scans. There are folks out there who scan huge swaths of the Internet looking for vulnerable machines. There are two basic varieties: scanning a single host for all vulnerabilities, and scanning a large number of hosts for a specific vulnerability. A plain box should really only be running SSH, so that is the security focus of this post. There should also be a firewall running that rejects connections on all ports except the services you absolutely need.
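For what it's worth, a minimal firewall along those lines could look something like this on Debian with ufw (ufw isn't installed by default, and this sketch isn't part of the original recipe, so adapt it to your own setup):

apt-get install ufw
ufw default deny incoming    # drop everything inbound by default
ufw default allow outgoing
ufw allow 22/tcp             # leave SSH open so you don't lock yourself out
ufw enable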

It should be noted that your security measures don't necessarily have to be top notch; your box just has to be less convenient than the next host on the scanners' lists. It's not hard to scan a large subnet and find hosts to hammer on. Drive-by scans are a numbers game; it's all about the low-hanging fruit. With C@C, it's a question of timing. You have to get onto the box and lock it down quickly. Maybe I'm just being paranoid, but I have had boxes that I didn't log in to right after spinning them up, and I have seen very high CPU utilization on them when they aren't really running anything, which leads me to believe that the host has been compromised. Also, beware that the web-based stats can be wildly inaccurate.

This guide will only lock down SSH. If you are running a web server, this guide will not lock down the web server. If you are running Asterisk, this guide will not lock down Asterisk. All this guide will do is shore up a couple of vulnerabilities with SSH. I recommend running these steps *BEFORE* installing anything on your VM.

My use case for Cloud At Cost is something like this: there are times when I need a box that is easier to get to than hosting a box on my home network, but doesn't really justify the monthly cost of running a server on Digital Ocean or Amazon. I spend a lot of time working all night inside a very restrictive corporate network, so it's hard to get access to my stuff at home, especially since TeamViewer is compromised. C@C is cheap and easy, which probably means it's a playground for scammers and other bad actors. This means it's a good idea to lock down your box before you do anything useful with it.

You can get started with C@C for around $35, but if you follow them closely, you can catch some of their discount deals and get a very low-end developer box for around $10. I took advantage of a few of these promotions and now I have a bucket of resources at my disposal for all of my tinkering needs. Also, if your box starts to misbehave (loads of network traffic, high CPU utilization, etc.), it's probably compromised, so just torch it and build a new one.

Getting Started

You can learn about the basics of the Cloud At Cost panel here; the info will be useful later on.

Once you have signed up with C@C, bought some resources, and fired up your Linux VM, it’s time to do some housekeeping. I prefer Debian, and it’s what I am using in this guide, but it doesn’t really matter what you choose.

As soon as the box is up, log in with SSH, using the root password given in the information button. I use putty*, because most of my time in front of a computer is spent working or gaming, so I use Windows a lot. I know it upsets a lot of folks to hear that, but hey, those folks can feel secure in knowing that their “Unix Beards” are mightier than mine.

The very first thing that I do is change the root password. Make it 30 or more random characters. You shouldn't actually need to type it in after this point, but keep it somewhere encrypted just in case. I also comment out the non-US repo that C@C Debian machines are still pointed to in sources.list:

passwd
nano /etc/apt/sources.list

Just locate the line that begins with "deb http://non-us.debian.org" and put a # in front of it. On a C@C Debian 8 box, it should be the first line.

With that pesky non-US entry removed, you are clear to update your packages:
apt-get update
apt-get upgrade

I also run these commands from the Nerd Vittles blog to make sure the password doesn’t revert to the Cloud At Cost root password:

sed -i '/exit 0/d' /etc/rc.local          # strip the trailing 'exit 0' so we can append to rc.local
killall plymouthd                         # kill plymouth now...
echo killall plymouthd >> /etc/rc.local   # ...and on every boot
rm -f /etc/rc3.d/S97*                     # remove the C@C startup script (the thing that reverts the password)
echo "exit 0" >> /etc/rc.local            # put the 'exit 0' back at the end

I don’t know if they are strictly necessary, but the dudes at Nerd Vittles recommend it, and they spend waaaay more time doing this stuff than I do, so there you have it.

After that, it’s time to install fail2ban, and then create a non-root user:

apt-get install fail2ban
adduser steve

Hopefully, in a few minutes fail2ban will be made superfluous by our additional security measures. In the meantime it will stop brute force attempts. Some of my hacker buddies change the default port for SSH to throw off drive-by scans, but the restrictive corporate network I mentioned before doesn't like arbitrary ports, so that's a hard no in this case.

Enable Sudo for a Non-Root User

To start implementing our security measures, we will install sudo, add ‘steve’ (our non-root user) to the sudo group, and then make sure steve has the right permissions in the sudoers file:
apt-get install sudo
adduser steve sudo
nano /etc/sudoers

At this point the /etc/sudoers file should open in the Nano text editor. I know I should be using vi, but I am too busy #YOLOing to do that Unix Beard crap. 🙂

Press ‘ctrl+w’ to open the search box, and type ‘%sudo’ to find the permissions line.
Press ‘ctrl+k’ to cut the ‘%sudo ALL=(ALL:ALL) ALL’ line, and then ‘ctrl+u, ctrl+u’ (hold ctrl and press ‘u’ twice) to paste the line in twice.
Edit the second line to read 'steve ALL=(ALL:ALL) ALL', then press 'ctrl+x' to exit, 'y' to save, and enter to confirm the filename.
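If all went well, the relevant chunk of /etc/sudoers should end up looking something like this:

%sudo   ALL=(ALL:ALL) ALL
steve   ALL=(ALL:ALL) ALL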

Setting up sudo is important because we are going to disable root logins here in a minute, but first we are going to set up SSH Keys for logins and then disable clear text logins. SSH does use clear text passwords, but it passes them through an encrypted tunnel. This means that while your password isn’t likely to be sniffed, it could be guessed or brute forced. Using SSH keys means you have to have the right private key to match with a public key on the server. But before we can do any of that, we need to test the new non-root account by logging in with it.

Once you are logged in as steve, test sudo:
sudo whoami

Which should return ‘root’.

Securing SSH with Asymmetric Keys

Once the non-root account is working and sudo-ing, we can proceed to lock down SSH with public+private key pairs. I will explain how to do this with putty for Windows, but it’s actually way easier to do this with Unix.
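(For the record, the Unix version of this whole section is roughly two commands, assuming an OpenSSH client; the username and hostname below are placeholders:)

ssh-keygen -t rsa -b 4096                    # generate a 4096-bit key pair under ~/.ssh
ssh-copy-id steve@testbox.stevesblog.com     # append the public key to the server's authorized_keys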

The first step is to make sure you have puttygen.exe handy. Download it and launch it, change the bits for your keys to 4096 (in the lower right corner), then click the 'Generate' button.

[screenshot: puttygen1]
Wiggle the mouse around for a bit, and in a minute or so you will see your public key, with a key comment and blanks for your passphrase. You don’t have to change the comment, or enter a passphrase, but I recommend it. I like to change the comment to match the username and server (‘steve@stevesblog.com’ in the screenshot below), since I have lots of different keys. The passphrase keeps things safe in case your private key file falls into enemy hands.**

[screenshot: puttygen2]

At this point, you may be tempted to use the same passphrase for your private key as you use for your non-root user account. This is a bad idea, because your non-root password is now basically your root password. Do yourself a favor and use two completely different passwords.

Next, click 'Save private key' and save the resulting .ppk file in a safe location, but don't close the puttygen window just yet. If you use multiple computers, putty will let you re-use your private key file between Windows machines, if that's what you're into. OpenSSH on Linux will not read a puttygen .ppk file directly (based on that one time I tried it and it didn't work for me), although puttygen can export a key in OpenSSH format if you need one. So just keep that in mind.

Also, it’s no big deal to have multiple private/public key pairs on the same server. You can use a different pair for each client computer, which is probably safer and more convenient than using a shared key pair. If you lose access to a client machine for whatever reason, you can just delete the public key off of the server and that machine won’t be able to connect to your server.

Leave your puttygen window up and switch back to your putty/SSH window. Create a .ssh folder and an authorized_keys file inside it to store your public keys:
mkdir ~/.ssh
nano ~/.ssh/authorized_keys

Paste the Public Key text from the top of the puttygen window onto a single line in the file. This will be a Very Large Line Of Text(tm) (VLLOT). The VLLOT should begin with 'ssh-rsa' and end with 'rsa-key-yyyymmdd', where yyyymmdd is the date you created the key. Sometimes the key comment (steve@stevesblog.com in the example below) is the last bit of text instead. I haven't quite nailed down why that is, presumably an order of operations thing. Anyway, be sure that the VLLOT begins with ssh-rsa, or you didn't grab all the text in the public key.
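For reference, a pasted key line ends up looking roughly like this (the key material is truncated here, and your comment will differ):

ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB...lots more base64...== steve@stevesblog.com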

Save and close the file (‘ctrl+x’ and then ‘enter’) and then set the permissions for the file:
chmod 600 ~/.ssh/authorized_keys

Now exit your ssh session, and reopen putty. You need to set the IP address of your server as the hostname. I prefer this to host names because DNS can’t always be trusted. Give your session a useful name.

[screenshot: putty_2]

Under ‘Connection -> Data’ add the username for your non-root account. In this example, I named my account ‘steve’.

[screenshot: putty3]

Under ‘Connection -> SSH -> Auth’ browse to the safe place you saved your private key. You pasted your public key onto the server, and you have your private key stored on your computer. You will want to keep the private key file safe because if you lose it you have to set up a new pair while logged in at the console, which is a total pain. I keep mine in Dropbox, but I keep them secured with a passphrase.**

[screenshot: putty4]

Now go back to Session and save your session profile. Henceforth you can connect simply by double clicking ‘steve’s server’ under ‘Saved Sessions’.

Now it’s time to test your new key pair. Just double click ‘steve’s server’ and you should be prompted for the passphrase that you set for your private key. Once you enter it, you should be logged in to the server as user ‘steve’. If you were able to log in using your key, you are all set to move on. You are now free to close PuttyGen.

If The Server Rejects Your Key

It’s most likely that you didn’t paste the public key correctly. This is why we left the PuttyGen window open. 🙂

Log in with your non-root username and password (‘steve’ in this example) and open your ~/.ssh/authorized_keys file in nano again:
nano ~/.ssh/authorized_keys

In the PuttyGen window, make sure that you scroll to the top of the public key text. It should begin with ‘ssh-rsa’. Now click and drag down to the end of the public key text, then right click and select ‘copy’.

In the Putty window, with your authorized_keys file open in nano, delete the incomplete key and paste the complete text of the public key as a single VLLOT.

Save and exit nano, then exit your SSH session and try again.

Also make sure that you changed the permissions of the authorized_keys file:
chmod 600 ~/.ssh/authorized_keys

If your key is still being rejected, generate a new public and private key by clicking the ‘Generate’ button and starting the whole key process over again.

Disable Root and Cleartext Logins

Once your keypair is working (and you are able to log in with it), it's time to eliminate root logins and cleartext logins. Some folks will tell you that root logins are fine with SSH because passwords don't get sent in the clear. While that's true, 'root' is still the one username that is guaranteed to be on every Unix-based machine, so if you are going to brute force an account, this is the one to focus your efforts on. Disabling root logins and clear text logins is all done in the sshd_config file:
sudo nano /etc/ssh/sshd_config

Press ‘ctrl+w’ and search for the word ‘root’. You are looking for this entry:
# Authentication:
LoginGraceTime 120
PermitRootLogin yes
StrictModes yes

Change 'PermitRootLogin yes' to 'PermitRootLogin no' (uncomment the line first if necessary).

Then press ‘ctrl+w’ and search for the words ‘clear text’. You are looking for this entry:
# Change to no to disable tunnelled clear text passwords
#PasswordAuthentication yes

Change '#PasswordAuthentication yes' to 'PasswordAuthentication no' (uncomment the line and change 'yes' to 'no').
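When you're done, the two relevant lines should read:

PermitRootLogin no
PasswordAuthentication no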

Once these changes are made, DO NOT LOG OFF OF YOUR SSH SESSION. Once these changes are implemented, it will be hard to log back in to undo anything if you make a mistake. You should have tested and succeeded with your ssh-key based login because we are about to restart the ssh daemon and prevent clear text logins:
sudo systemctl restart ssh
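Incidentally, if you want to sanity-check the config before (or after) the restart, 'sshd -t' parses sshd_config and reports syntax errors without touching the running daemon. This isn't part of the original recipe, just a small safety net:

sudo sshd -t    # prints nothing if the config is valid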

To test ssh logins, connect to the IP of your server with putty using the ‘Default Settings’ profile. Your login attempt should fail because only people with private keys are allowed to the party:

[screenshot: putty_failed]

At this point you are far from being hack-proof, but you are a bit more locked down than you were before, and there are always more convenient targets out there 🙂

Hardening web servers is another story, which really isn’t my bag to be honest. There’s a reason that I host my blogs with Google or WordPress 🙂

* Protip: put your putty.exe file in ‘c:\windows\system32’ so you can run putty from the command line or the run line. If you want to be a real hard rock, rename putty.exe to ssh.exe. Did you know putty accepts commandline args? It does, so you can do awesome Unixy shit from the command line like type ‘ssh steve@testbox.stevesblog.com’ to connect to a remote host. It still pops up your connection in the putty window, but it keeps your hands on the keyboard. 🙂

** Another Protip: not setting a passphrase is handy for automating ssh connections, especially if you want to move files back and forth with 'scp' or mess with tunneling via local and remote ports. I haven't found a decent scp command line app for Windows, other than the Unix utils in Cygwin.
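From a Unix box (or Cygwin), the kind of thing I mean looks roughly like this; the hostname is the same example one from above, and the port numbers are made up:

scp backup.tar.gz steve@testbox.stevesblog.com:~/        # push a file up to the server
ssh -L 8080:localhost:80 steve@testbox.stevesblog.com    # forward local port 8080 to port 80 on the server
ssh -R 2222:localhost:22 steve@testbox.stevesblog.com    # publish your local SSH on the server's port 2222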

My .screenrc

I am a huge fan of screen. It’s indispensable for working on a Unix host via SSH. It lets me have multiple terminals (screens) up at a time. There are dudes that use screen to split their terminals into multiple views, like a tiling window manager, but for the command line.

My needs are not nearly as sophisticated, since I mostly use putty to connect to Linux servers from Windows.

I use 4 special keys:
F9 to detach from the screen session. This leaves your session running in the background. I mostly use this to idle in IRC. Once detached from your session you can view your active screen sessions by typing:
screen -ls

Which will return something like this:

user@localhost:~$ screen -ls
There is a screen on:
2030.pts-0.localhost (05/25/2016 06:45:51 PM) (Detached)
1 Socket in /var/run/screen/S-user.

To reconnect to a detached screen session, type
screen -r 2030.pts-0.localhost

If the session is attached somewhere else, use the -D option along with -r:
screen -D -r 2030.pts-0.localhost

This will disconnect the screen session that’s in use, log off the SSH session that initiated it, and then reattach the active SSH session to the screen session.

F10 to open a new terminal in screen.

This option lets you have multiple terminals in the same SSH session. This is handy for having a full screen app (like irssi) in one term, and one or more additional terms for running other commands. To close a terminal, type
exit

F11 and F12 to switch terminals
When you have multiple terminals open, you can navigate between them: F11 selects the terminal to the left (previous) and F12 selects the terminal to the right (next).

The File
To use this file, simply paste the contents below into a file called .screenrc in your home directory. So here it is, the .screenrc that I have been using for years:


startup_message off

# Window list at the bottom.
# I got the long line of vars from https://bbs.archlinux.org/viewtopic.php?pid=423481#p423481
hardstatus alwayslastline
hardstatus string "%{.kW}%-w%{.W}%n %t%{-}%{=b kw}%?%+w%? %=%c %d/%m/%Y" #B&W & date&time

# From Stephen Shirley
# Don't block command output if the terminal stops responding
# (like if the ssh connection times out for example).
nonblock on

# Allow editors etc. to restore display on exit
# rather than leaving existing text in place
altscreen on

# bind F9 to detach screen session (to background)
bindkey -k k9 detach

# bind F10 to create a new screen
bindkey -k k; screen

# Bind F11 and F12 (NOT F1 and F2) to previous and next screen window
bindkey -k F1 prev
bindkey -k F2 next

The Drama With My New Laptop: the High Cost of Saving $350 (part 3)

This post contains a lot of profanity. Like a shitload.

When we last left our heroes, I had finally managed to encrypt my SSD, and after running clonezilla probably a hundred times to back up and restore the drive after fucking it up, I decided to try and simplify the backup process.

Part of the hassle was the fact that I had removed the optical drive and installed the original mechanical drive into that bay. This meant booting from an external DVD drive or from a USB stick in order to do the backups. I was also using GParted a lot, which meant a second CD-ROM disc or thumb drive. Thankfully I was using an i-Odd external hard drive to do this, but it still meant plugging something in so that I could copy files to an internal hard drive. Backing up has to be convenient or backups simply won't happen.

My first thought was to install Linux on an external drive. This would give me the option of using the drive on different computers. Maybe it's possible, but I never got it to go. I wiped an external drive a couple of times. I used to use Sardu Linux, but it was not that reliable, and the project seldom kept pace with new versions of live CDs. Also, the primary developer started putting spammy spyware in the installer at one point.

After a lot of formatting and re-partitioning, this time on my secondary backup drive, I decided to go with a simpler approach and just put the Clonezilla live install on a small partition on the backup drive. This hadn't worked on my USB external drive, but I wanted to try it with the internal, based on this document. Basically, I created an 800MB FAT32 partition and extracted the zip to that partition. I used the rest of the disk for a large NTFS partition. I skipped all the GRUB stuff, and I just use the alternate boot menu to boot from the other drive when I want to do my backups. I then set the FAT32 partition to be hidden so it won't show up in Windows. It would have been great to have a small Linux install for times when I am in a hurry and I don't want to decrypt my Windows drive, but this will do fine for now.
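The partitioning and extraction part looked roughly like this from a Linux live environment; the device name is an assumption (check lsblk first), and making the partition actually bootable is covered in the document I followed:

parted /dev/sdb mklabel msdos                      # fresh MBR partition table on the backup drive
parted /dev/sdb mkpart primary fat32 1MiB 801MiB   # small partition for Clonezilla live
parted /dev/sdb mkpart primary ntfs 801MiB 100%    # the rest holds the backup images
mkfs.vfat -F 32 /dev/sdb1
mkfs.ntfs -Q /dev/sdb2
mount /dev/sdb1 /mnt
unzip clonezilla-live-*.zip -d /mnt                # dump the Clonezilla live zip onto the FAT32 partition
umount /mnt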

holy shit! i got it working!

The Drama With My New Laptop: the High Cost of Saving $350 (part 2)

This post contains a lot of profanity. Like a shitload.

When we last left our heroes, I had finally gotten Windows working on an SSD after trying a bunch of things, then basically giving up and reinstalling everything. Now that the SSD was working, the time had come to encrypt the SSD.

I am a fan of block crypto. I encrypt lots of things, not because I am worried about the government seizing my gear (well, not *that* worried) but because gadgets get lost and stolen. I lost my mobile phone a couple of years ago, and if I hadn’t encrypted it, it would have been nerve wracking worrying about what someone might do with the data that’s on it. So rather than worry about what is or isn’t protected, I just encrypt the whole drive. Full drive encryption is important because Physical Access is Total Access. I have rescued untold amounts of data for others from their crashed or otherwise misbehaving hard drives by removing them and plugging them into a different computer. I don’t normally encrypt the drives on my gaming rigs because if the FBI or whomever needs my Goat Simulator game saves that badly, they are welcome to them. This was a special case because it’s a gaming laptop. My rule is that if it leaves the house, it has to be encrypted.

Modern computers use UEFI to "securely" boot the operating system. I guess this is a security measure to prevent someone from booting your laptop from a CD and stealing all your shit, but since this laptop doesn't have a Trusted Platform Module, Secure Boot doesn't protect you from someone plugging your drive into another computer and stealing all your shit. I think it's more trouble than it's worth. If you have to ask Windows for permission to boot off a CD, it's just going to stop the user from doing what he or she wants; it will not stop Proper Villainy(tm).

My favorite disk encryption tool, TrueCrypt, vanished under mysterious circumstances. I won’t get into the conspiracy theories behind its demise, but I have decided to keep encrypting my drive, and that leads me to the next chapter of this saga, where I get punished for using the basic version of Windows.

Part 2 – Solid State Drama’s Revenge

I prefer to run Windows on laptops because of all the bullshit proprietary hardware that goes into them. I am probably showing my age here, but there was a time when hardware support in Linux was spotty. I have swapped out Intel WiFi cards for Atheros cards in laptops to make sure I can do packet injection, but I now have a dedicated Kali laptop for that sort of thing. For my daily driver/EDC laptop, life is just easier with Windows. I know that fucking with Linux makes a lot of dudes feel superior, and they probably are. For me, I prefer to use Linux for specific tasks (i.e. Kali and Clonezilla) or for servers. With that being said, I am not such a Windows fanboy that I care about the differences between Windows versions. My personal laptop won't be joining an Active Directory domain, so I just go with whatever version came with my laptop, which I replaced with whatever version MS let me download when I migrated to the SSD.

This path of least resistance philosophy led me to entertain thoughts of using BitLocker to encrypt my hard drive, only I am not running Windows 8.1 Professional or Enterprise, so I guess that BitLocker isn’t included with my version. There is no fucking way that I’m forking over $150 for a new version of Windows after working so hard to save $200 on the RAM and SSD. No TrueCrypt? Fine. No BitLocker? Whatever. I don’t give a fuck. I’ll just use a fork of TrueCrypt called VeraCrypt. Well, VeraCrypt’s boot loader doesn’t play nicely with UEFI and GPT partitions. It only works on MBR disks. feelsbadman.jpg

So after days of messing with various tools to get Windows working on my SSD, and then enduring the hassle of setting up Windows all over again, and waiting on my Steam library to download again, I am faced with yet another hard disk challenge: converting my GPT partitioned drive to MBR without deleting anything. Honestly, now that Steam is in the Debian repos, I am sorely tempted to make my next gaming rig run Linux.

I tried a bunch of things and ended up using the pirated AOMEI tool to do the conversion, and it worked, sort of. The drive booted, and VeraCrypt didn't bitch about GPT anymore. However, when I went to back up the drive one last time before encrypting it, I discovered that AOMEI half-assed the conversion. According to Clonezilla, my drive had some remnant of the GPT boot stuff left on it that I had to fix with the Linux version of fdisk for GPT, a.k.a. gdisk. I have screwed up plenty of working partitions with fdisk, so I was nervous to say the least. Also, the magical zap option that I needed was buried in the "expert" menu section (AKA Here There Be Dragons!), which added to the danger. Clonezilla said to run gdisk -z, but -z isn't a valid option from the command line.
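For anyone following along, the interactive sequence I ended up running was roughly this (assuming the drive is /dev/sda; the whole point of the expert menu is that you can wreck a disk here, so tread carefully):

gdisk /dev/sda
# x  -> extra functionality (experts only) menu
# z  -> zap (destroy) GPT data structures and exit
# it asks to confirm wiping the GPT (yes), then whether to blank the MBR -- say no,
#   or you'll nuke the partition table you just converted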

I read this tutorial to figure out what had to be done, and in the end I just closed my eyes, clenched up my butt cheeks, and hit enter. I got it working, and thankfully I had already made plenty of backups, just in case. Speaking of backups, I should find a way to make running Clonezilla easier…

Update 8/16 – A few months ago, I tried migrating to Win10, but it was a shitshow. I just pirated Win10 Pro (thanks to KMSPico portable, JFGI) and used BitLocker without a TPM. This was less stressful since I set up easy bare metal backups in Part 3.

Stay tuned for the thrilling conclusion in Part 3 – Making Backups Easy to do is Hard 🙂

The Drama With My New Laptop: the High Cost of Saving $350 (part 1)

This post contains a lot of profanity. Like a shitload.

I bought a new laptop a month ago, which for me is like moving to a new apartment. Getting it set up the way that I want it has been a total pain in the ass. Mostly because I have decided to save money by implementing key features myself, but also because the relentless march of progress in the PC market has left me behind. This was an uncharacteristic purchase for me, but I wanted a powerful laptop that I could write, code, play games, and run multiple VMs on. In short, I violated my first rule of personal computing, which is to use dedicated computers for specific tasks.

The goals were:

  1. Be made mostly of aluminum – my laptops tend to have case or hinge problems before they have actual hardware problems, although when they do have hardware problems, it’s almost always the hard drive.
  2. Be ready for anything – have 16GB of RAM, an SSD, USB 3.0, and a high-end GPU
  3. Have a big screen and full size keyboard – this is replacing a full-sized laptop
  4. Have ample storage – I also bought a caddy to go into the CDROM bay to house a second hard drive
  5. Be encrypted – I normally don’t keep important things on laptops, or gaming rigs, but this is my main computer now
  6. Be backed up regularly – I am not usually a stickler for backups because I use several computers. But with this machine, I want to be able to do a full disk image fairly easily

I have built enterprise servers in less time than I have spent tweaking this fucking laptop. I have more or less achieved all of my goals at the considerable expense of my time and possibly my sanity. There are three major sources of my discontent. The first is that copying a Windows install to a smaller drive is wildly difficult, and Asus makes the process even more so. The second is that modern versions of Windows are not very friendly with the block crypto tools that I trust. The third is that, because I decided to remove the optical drive, I wanted to dual-boot Windows with my favorite cloning tool, Clonezilla.

Part 1 – Solid State Drama
I went with the Asus N550jx because it is a mostly aluminum mid-range gaming laptop with a big screen, full-size keyboard with keypad, and a touch screen. I can sort of take or leave touchscreens on laptops, but my wife is a fan. I like for her and me to have the same model of laptop. That way, when she runs into problems, I am already very familiar with the hardware and software she is using. The N550jx comes in two models: one with 8GB of RAM and a 1TB mechanical HDD, and one with 16GB of RAM and a 240GB SSD. Both models have the same processor, GPU, screen, and case, and I was able to price another 8GB of RAM and a 250GB SSD for almost half the price of the difference between the two models, for a savings of roughly $200. It was a mistake... er, brilliant idea!

Getting the upgrades installed was a series of misadventures. The first obstacle was that, for no good goddamn reason, Asus decided to use #5 Torx screws on the chassis. #5 Torx bits? On a 6lb laptop? Who does that? I have plenty of star bit screwdrivers from working on Compaq computers back in the Dark Ages, but no #5's. So I did what any red-blooded All-American Man would do: first, I went on the Internet and complained, and then I ordered yet another set of screwdriver bits from Amazon.

With the SSD and RAM in place, it was time to get the OS off the mechanical drive and onto the SSD. In the past, moving an install of Windows was simply a matter of shrinking partitions with GParted and cloning them with Clonezilla. With the Asus N550jx and Windows 8.1, there is a bunch of bullshit associated with hidden restore partitions with weird flags and whatnot. It is this bullshit that thwarted my countless attempts to migrate the partitions correctly. I even used pirated copies of notable commercial disk cloning tools like Norton Ghost and AOMEI with little success. After a few days of trial and error, I ended up just doing a clean install of Win8 on the SSD. Fortunately, Microsoft lets you create your own install media from an activated Windows system, and Asus is kind enough to make drivers and utilities available on their website for download. So after much installing of software, I had a working OS on the SSD.

All of this trial and error is why I am a huge fan of bare metal backups. I have used all manner of tools and other nonsense to back up Windows and/or data, and the only thing that is truly reliable is dumping the entire drive to an image file on a separate drive. Copying data always leads to missed files, and snapshots and restore points become corrupted especially when malware is involved. Rolling an infected PC back to a restore point is the fastest way to get rid of malware, so most crackers wipe out your restore points as part of the exploit process. Because of this, I don’t really care about recovery partitions, or restore points, or any of that other bullshit. If my laptop eats itself, I just want to roll it back to where it was just before the last time I tried to do something stupid to it. I understand that your typical consumer isn’t familiar with imaging hard drives, and that is why those other tools exist, but for me it’s Clonezilla or GTFO.

Stay tuned for Part 2: Solid State Drama’s Revenge 🙂