Arch Setup

I’ve always loved the concept of Arch Linux, with its nothing-by-default setup and intentional lack of user-friendly tools, but installation issues have kept me from using it enough to really get familiar with it.

This time around I got it working perfectly, so I decided I’d write a little guide on what I did. This is mostly just a reference for me, but it should prove useful to anyone trying out Arch for the first time. This guide assumes familiarity with Linux basics like sudo, fstab, and GParted, and you should probably read Arch’s Beginner’s Guide.

Getting Started

Start by booting the live CD.

Partition setup

Note: this guide only covers the setup for an MBR partition table, as booting from GPT requires more system-specific setup.

If you prefer a more graphical tool, you can use the GParted live CD to configure your partitions, then skip to the “Install base system” step. Note that the current version of the GParted live CD won’t boot properly on VirtualBox without EFI enabled.

Locate the disk you want to install your system to with lsblk, then start cfdisk with that device.

cfdisk /dev/sda

In cfdisk, create a new partition table, then the partitions you would like, writing the changes when you’re finished.
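If you’d rather script the partitioning instead of using cfdisk interactively, parted can create an equivalent layout non-interactively. This is just a sketch, assuming a single root partition plus swap on /dev/sda; adjust the device name and sizes for your disk.

```shell
# Sketch: MBR table with a root partition and a swap partition on /dev/sda.
# Destructive! Double-check the device name with lsblk first.
parted --script /dev/sda \
  mklabel msdos \
  mkpart primary ext4 1MiB 90% \
  mkpart primary linux-swap 90% 100%
```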

Next, create a filesystem on your new system partition, repeating for each partition you need to format.

mkfs.ext4 /dev/sda1

If a swap partition was created, activate it:

mkswap /dev/sda2
swapon /dev/sda2

Install base system

Mount partition

Start the installation by mounting your system partition (and any other non-swap partitions you created) under /mnt.

mkdir -p /mnt
mount /dev/sda1 /mnt

Set up base packages and fstab

Move your preferred mirror to the top of the list:

vim /etc/pacman.d/mirrorlist

Install base packages and generate new fstab:

pacstrap -i /mnt base
genfstab -U -p /mnt >> /mnt/etc/fstab

Chroot into the new installation:

arch-chroot /mnt

Configure language

sed -i "s/#en_US.UTF-8/en_US.UTF-8/g" /etc/locale.gen
locale-gen # Generate the locales enabled above
echo LANG=en_US.UTF-8 > /etc/locale.conf
. /etc/locale.conf

Configure timezone

ln -sf /usr/share/zoneinfo/America/Denver /etc/localtime
hwclock --systohc --utc

Configure network

Run ip link to list your network interfaces, then enable DHCP on the one you want to use (substituting its name for eth0 below):

ip link
systemctl enable dhcpcd@eth0

Configure wireless (optional)

pacman -S wireless_tools wpa_supplicant wpa_actiond dialog
systemctl enable net-auto-wireless
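If you just need to get online once to finish the install, you can also drive wpa_supplicant by hand. A rough sketch, where wlan0, “MySSID”, and “MyPassphrase” are placeholders to substitute with your own interface name and credentials:

```shell
# Generate a network block from the SSID and passphrase, connect, and request a lease.
# wlan0, "MySSID", and "MyPassphrase" are placeholders — substitute your own.
wpa_passphrase "MySSID" "MyPassphrase" > /etc/wpa_supplicant/wlan0.conf
wpa_supplicant -B -i wlan0 -c /etc/wpa_supplicant/wlan0.conf
dhcpcd wlan0
```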

Configure package manager

Open /etc/pacman.conf and check that the [core], [extra], and [community] lines are uncommented. If you’re on a 64-bit system (you should be), optionally uncomment the [multilib] lines for 32-bit compatibility.

After updating your pacman config, refresh the repository list:

pacman -Sy

Create a user

passwd # Set root password
useradd -m -g users -G wheel,storage,power -s /bin/bash alan # Create 'alan'
passwd alan # Set password for alan

Configure sudo

pacman -S sudo # Install sudo

Uncomment the %wheel line to allow your new user to use sudo:

EDITOR=nano visudo
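For reference, the line to uncomment in the sudoers file looks like this (remove only the leading #, keeping the %):

```
## Uncomment to allow members of group wheel to execute any command
%wheel ALL=(ALL) ALL
```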

Install bootloader

pacman -S grub-bios
grub-install --target=i386-pc --recheck /dev/sda
cp /usr/share/locale/en\@quot/LC_MESSAGES/grub.mo /boot/grub/locale/en.mo
grub-mkconfig -o /boot/grub/grub.cfg

Finish installation

exit # Leave the chroot
umount /mnt
reboot

Desktop setup

# Xorg
pacman -S xorg-server xorg-xinit xorg-server-utils \
  xorg-twm xorg-xclock xterm

# Mesa (3D acceleration)
pacman -S mesa

# Drivers (only one needed)
pacman -S xf86-video-vesa # Vesa (general, works almost always)
pacman -S nvidia lib32-nvidia-utils # Nvidia

Desktop environment

Xfce4 + lightdm

pacman -S xfce4 xfce4-goodies lightdm lightdm-gtk-greeter
systemctl enable lightdm # Enable lightdm


Gnome + gdm

pacman -S gnome # Install Gnome
systemctl enable gdm # Enable gdm

KDE Plasma 5

pacman -S plasma kde-applications
systemctl enable sddm

After installing your preferred DE, reboot, and your system should be ready to go! If you decide to switch DEs, make sure you disable the display manager before uninstalling, otherwise you’ll have to manually remove the display-manager.service symlink from /etc/systemd/system.
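As an example, a switch from Xfce/lightdm to KDE/sddm might look like this, using the package names from above (a sketch, not a full cleanup):

```shell
systemctl disable lightdm          # Disable the old display manager first
pacman -Rs xfce4 xfce4-goodies     # Remove the old DE and its unneeded deps
pacman -S plasma kde-applications  # Install the new DE
systemctl enable sddm              # Enable its display manager
```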

If you’re running Arch in VirtualBox, you’ll want to install the guest additions with pacman -S virtualbox-guest-utils and enable them with systemctl enable vboxservice.

For a basic overview of pacman and the Arch User Repository (AUR), see the Arch wiki and this gist.

Surface Pro 4

So I bought a Surface Pro 4 a little while ago. I change between loving it and hating it almost daily.

It has pretty impressive hardware, apart from the “cheap” $899-$1199 models only having 4 GB of RAM and a small, slow 128 GB SSD. The 128 GB Samsung NVMe SSD is painfully slow by modern standards, usually managing sequential write speeds of only 80-100 MB/s, on par with a 7200 RPM hard drive. On the plus side, the little Core m3 is an impressive CPU. Despite the low clock speed and two cores, it runs the OS and most software beautifully, even at the display’s high native resolution.

An extra annoyance is that the keyboard cover (which is amazing, by the way) costs an extra $130 and is fairly essential to getting the most out of the Surface Pro 4. The on-screen keyboard isn’t bad, but it’s hardly usable for real work, especially in my field, where I need quick access to arrow keys (of which the Windows 10 keyboard only has left and right, no up and down) and special symbols. It’s worth buying the keyboard, but it really feels like it should be included with a tablet this expensive, especially when Microsoft brands it as a laptop replacement.

The capacitive touch screen is absolutely perfect, and the included upgraded Surface Pen is a breeze to use; my only complaint is that the initial pressure required to register a touch is higher than feels natural. When drawing, this can be a bit of an issue, since very light strokes often aren’t registered, while a Wacom tablet picks up the same light strokes perfectly. I’ve only drawn a bit on it so far, though that was actually my original reason for buying it. I’ve been particularly impressed by the pen vs. finger detection, which lets you rest your hand on the screen while writing or drawing without registering accidental touches, while still letting you use your fingers on the UI. The Core m3 is fairly responsive even when drawing at very high resolutions in Photoshop, and it runs OneNote’s pressure-sensitive drawing with no noticeable pen delay, making sketching and handwriting feel fantastic.

Sadly, Windows 10 is still not really stable. Everyone keeps telling me they haven’t had any issues with it. Maybe I’m just incredibly unlucky, but I’ve got five devices running Windows 10, and every one of them has some serious usability issue with the OS. Most noticeably on the Surface Pro 4, the lock screen sometimes gets stuck and won’t respond to touch or keyboard input, and more annoyingly, the device fails to wake from sleep about one in ten times the power button is pressed.

Overall, I’m not sure whether I’d recommend the Surface Pro 4 to anyone. I absolutely love it a lot of the time, but the quality issues I’ve experienced feel like something you’d have on a $200 tablet, not a $900 one (plus the $130 for the keyboard cover). I look forward to seeing what Microsoft does with the future Surface line. There’s a long way to go before it’s perfect (or even worth the price, probably), but so far this thing is awesome.

Update (Dec 4):

The latest Windows updates included a new display driver that somehow prevented my Surface Pen from working, even crashing OneNote on startup when the pen was off. Installing updates for Windows Defender somehow fixed this; I’m not sure why. I’m seriously considering an iPad Pro, which isn’t really what I want at all.

Social Network Performance

I’ve been working on a somewhat unique social network lately, and I wanted to see how it matched up with the big ones. Here’s a simple breakdown of the HTTP requests on each site:


Facebook

  • 308 requests, 6.1 MB
  • 36 CSS files, 232 KB
  • 85 JS files, 1.7 MB
  • 161 images, 507 KB
  • 0 webfonts
  • 2 IFRAMEs


Twitter

  • 65 requests, 2.7 MB
  • 6 CSS files, 127 KB
  • 4 JS files, 440 KB
  • 45 images, 2.0 MB
  • 1 webfont, 23.9 KB (Rosetta Icons)
  • 1 IFRAME


Google+

  • 224 requests, 4.8 MB
  • 13 CSS files, 79.9 KB
  • 23 JS files, 1.4 MB
  • 84 images, 1.7 MB
  • 6 webfonts, 42.6 KB (Various weights of Roboto)
  • 13 IFRAMEs

All of these were loaded from Chrome 45 on a desktop PC, with uBlock and Privacy Badger enabled (because really, everyone should have both installed). Twitter is definitely the smallest, with a very fast perceived loading speed thanks to its small number of CSS files. All of the networks delay loading images until the rest of the interface is in place. I was surprised to see Google+ using webfonts for content, but since they’re the same fonts used across many Google sites, chances are you already have Roboto cached from a previous pageload.

For comparison, here’s my current test site’s requests:

  • 20 requests, 779 KB
  • 1 CSS file, 21.5 KB
  • 1 JS file, 45.9 KB
  • 14 images, 577 KB
  • 3 webfonts, 132 KB (higher than I would like)

Apart from the webfonts, this loads very, very quickly. The webfont slowness was what prompted me to look into what the other networks were doing, and it was good to see that other sites weren’t requiring nearly as much webfont data. In WebKit browsers on a slow connection, webfonts this large can prevent the content from showing for a good 5-6 seconds, which is definitely not usable. The largest of my webfonts is, surprisingly, my custom icon font, which I need to optimize before production; I’ll likely remove many of the icons from the set, since I don’t use most of them.
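One way to shrink an icon font is to subset it down to just the glyphs the site actually uses. A sketch using pyftsubset from the fonttools package, where the file names and codepoint range are placeholder examples:

```shell
# Keep only the icon glyphs actually referenced.
# icons.ttf and the U+E600-E60F range are hypothetical examples.
pyftsubset icons.ttf \
  --unicodes="U+E600-E60F" \
  --flavor=woff \
  --output-file=icons.subset.woff
```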

The only conclusion I can draw from this is that Twitter is the only company who knows what they’re doing. Google+ has good cached load times, which is probably fine since most of their assets are cached from other Google pages anyway, but Facebook needing 1.7 MB of JavaScript is just scary.

Pidgin with Google Apps

I love using Pidgin with our chat server at work. It’s a really nice, clean IM client (apart from the account management, which is a mess). Setting it up with Google Apps is slightly trickier.

  1. If you use 2-Factor Authentication (never a bad idea), you’ll need to generate an app password before you can use the account with Pidgin or other generic mail/chat clients.
  2. Pick Other for the app name, and name it “Pidgin” or whatever else you want. Copy the password it generates for you.

In Pidgin’s Buddy List window, go to Accounts > Manage Accounts (Ctrl+A), click Add…, and enter the following details in the Basic tab:

  • Username: the part of your email address before the @ (e.g. “alan”)
  • Domain: the domain name of your email address (the part after the @)
  • Password: your account password, unless you use 2-factor authentication, in which case use the app password generated in the first step.

In the Advanced tab, set “Connect server” to

That’s it! You should be able to use Pidgin with your Google Apps account now.

MySQL SSL on Ubuntu 12.04

Ubuntu 12.04’s bundled libssl is incompatible with the MySQL version it provides. This isn’t a guide to fixing that, but a warning not to try: just use 14.04 LTS.

If you did set up SSL on a 12.04 server, you’ll likely run into issues exporting your database. If you get errors like these, just disable SSL.

root@db:/# innobackupex --user=root --password=MyPass /home/backup/rep-transfer
150821 13:54:35  innobackupex: Connecting to MySQL server with DSN 'dbi:mysql:;mysql_read_default_group=xtrabackup' as 'root'  (using password: YES).
innobackupex: Error: Failed to connect to MySQL server: DBI connect(';mysql_read_default_group=xtrabackup','root',...) failed: SSL connection error: error:00000001:lib(0):func(0):reason(1) at /usr/bin/innobackupex line 2949

root@db:/# mydumper --database=db_name --user=root --password=MyPass --host=localhost
** (mydumper:28499): CRITICAL **: Error connecting to database: SSL connection error: error:00000001:lib(0):func(0):reason(1)

Commenting out the ssl-ca, ssl-cert, ssl-key, and ssl-cipher lines in /etc/mysql/my.cnf and restarting the service with sudo service mysql restart should allow you to export again.
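The relevant section of /etc/mysql/my.cnf ends up looking something like this; the certificate paths are examples, and yours may differ:

```
[mysqld]
# SSL disabled to work around the libssl incompatibility on 12.04:
#ssl-ca     = /etc/mysql/cacert.pem
#ssl-cert   = /etc/mysql/server-cert.pem
#ssl-key    = /etc/mysql/server-key.pem
#ssl-cipher = DHE-RSA-AES256-SHA
```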