Gaming PC parts list

A friend of mine just ordered parts for a PC, using the Logical Increments lists. Here’s what he came up with.


| Title | Price | Quantity | Has |
|---|---|---|---|
| Cooler Master HAF X full tower case (windowed side panel, USB 3.0) | $159.99 | 1 | 0 |
| Intel Core i7-4790K processor (BX80646I74790K) | $339.99 | 2 | 1 |
| Crucial MX100 256GB 2.5″ SATA SSD, 7mm with 9.5mm adapter (CT256MX100SSD1) | $109.99 | 2 | 1 |
| MSI Z97-GD65 GAMING ATX LGA 1150 motherboard | $175.99 | 2 | 1 |
| Noctua NH-U14S 140mm CPU cooler | $69.99 | 2 | 1 |
| Seagate Barracuda 1TB 3.5″ SATA 6Gb/s HDD, 64MB cache (ST1000DM003) | $52.92 | 2 | 1 |
| Arctic Silver 5 thermal compound, 3.5g | $7.70 | 1 | 0 |
| EVGA GeForce GTX 760 ACX 2GB GDDR5 (02G-P4-2763-KR) | $239.99 | 2 | 1 |
| EVGA SuperNOVA 750 G2 80 Plus Gold 750W PSU (220-G2-0750-XR) | $112.99 | 1 | 0 |
| G.SKILL Ares 16GB (2×8GB) DDR3 1600 desktop memory (F3-1600C10D-16GAO) | $149.99 | 1 | 0 |

Remote file system bad display (MacOS sucks)

Finder’s access to remote filesystems will sometimes show stale data. I’m copying a 2.5GB file to an SMB share (Linux Samba). While copying I look at a folder, then click the little disclosure arrow to open a subfolder in the window. It shows no contents; it must be an empty folder! So I go to delete it, and then Finder pauses, saying “deleting 5 items”. Ha ha, joke’s on me. I cancel the delete and try again. This time I wait 3–5 seconds for the folder contents to list, and poof, there are the files now. At least I cancelled the delete in time.

I’m not complaining that the remote filesystem is slow, particularly when a big copy is happening. No. I’m complaining that Finder is showing incorrect information.


Some quick ad campaign stats

I ran some simple ads to promote Logs of Lag, my League of Legends netlog parser. I’m having a hard time getting attention and thought it’d help to get a bit of paid traffic. I invested, oh, $20 in all of this, trying to bottom feed some traffic. Results after a week or two of trying:

Google Adwords

0.18% CTR. 4468 impressions, 8 clicks, $2.73

[Screenshot: Google AdWords campaign stats, 2014-06-26]

Very bad results, particularly on the search ads. Google’s minimum bid for a relevant keyword like [league of legends lag] seems to be at least $1, and apparently Google is happier showing no ads than showing cheap ads. 4339 of the impressions were via the Display Network. Near as I can tell AdWords gives me no way to see where those ads actually displayed. Bummer.

Bing Ads

0.59% CTR. 848 impressions, 5 clicks, $0.17

Straight-up clone of the Google ad campaign. But Bing Ads is quite happy to show ads for $0.10 CPC, so the good keywords I wanted got some impressions and clicks. No display ads at all; not sure if I wasn’t eligible or there was just no inventory.

Twitter Ads

Too early to tell. But my ad is a poor match for Twitter. All you can promote is a tweet, and I didn’t want to write a special ad tweet for my website. Also hard for me to imagine a Twitter user will be inspired to try my tool by a Twitter ad, particularly on a mobile device. I think Twitter Ads may be better suited for brand campaigns.

Reddit Ads

0.94% CTR, 14030 impressions, 132 clicks, $10.

[Screenshot: Reddit ads campaign stats, 2014-06-26]


By far the most successful of my ads. In part because I could target it so specifically to /r/leagueoflegends, a big community. Also Reddit ads are cheap. It’s a pretty interesting ad product really, particularly the way ads become stories that get Reddit votes, comments, etc. I’m surprised it’s not better known. Reddit has a blanket $0.75 CPM rate which seems fairly cheap to me, particularly given the value of Reddit’s audience. The ad tool sure is primitive though. I’m left wondering why Reddit doesn’t pursue an ads business more aggressively.
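That $0.75 CPM squares with the numbers above. A quick sanity check of the CPM math (using awk just for the decimal arithmetic) comes out close to the $10 I actually paid:

```shell
# Sanity check: cost ≈ (impressions / 1000) × CPM
awk 'BEGIN { printf "%.2f\n", 14030 / 1000 * 0.75 }'   # prints 10.52
```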


Small ad campaigns are cheap. OTOH I didn’t set up any way to measure actual conversion. What I want is to get my tool in people’s minds so they remember to try it a month from now, and I’m not about to do the necessary work for such a small side project of mine. I don’t regret the $12.90 I spent though. I’d definitely go back to Reddit ads to get attention for something there, particularly if the organic way of getting Reddit attention doesn’t work out.


Directory symlinks (MacOS sucks)

Try this on a Mac:

```
cd /tmp
ln -s ~/Docum<press TAB>
ln -s ~/Documents/ .
ln: ./: File exists
```



Yes, MacOS “ln” is too stupid to understand that if a name has a trailing slash you probably meant to symlink the directory with that name, not the implicit “Documents/.” name. And bash completion is just helpfully appending slashes, because 99% of the time you want them when naming directories. Of course this works fine on Linux; another gift of MacOS’s crusty BSD userland.
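A workaround (a minimal sketch, assuming a POSIX shell; the scratch path is made up for the demo) is to strip the trailing slash with parameter expansion before handing the name to ln, and to name the link explicitly so the “.” ambiguity never comes up:

```shell
# Strip the trailing slash with ${src%/} so BSD ln sees a plain directory name.
# /tmp/lnslash-demo is a hypothetical scratch directory for this example.
rm -rf /tmp/lnslash-demo
mkdir -p /tmp/lnslash-demo/Documents
cd /tmp/lnslash-demo
src="Documents/"             # the form tab-completion hands you
ln -s "${src%/}" docs-link   # "${src%/}" expands to "Documents"
readlink docs-link           # prints: Documents
```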

(To be fair the with/without trailing slash thing has always been ambiguous in Unix. rsync even assigns different semantics to the two names, confusing me every single time.)


Steam (MacOS sucks)

The Steam client on MacOS has always been terrible. I mean it’s badly designed as a product, but then also the implementation is terrible. Basically it’s 90% just showing a web page, but that’s always been slow and cranky and broken.

But today takes the cake, because running Steam has brought my whole system to its knees. Beachball cursors, 10 second delays switching apps, 20 second delays launching apps. Total trash. And as bad as Steam is, the real blame falls on MacOS for freezing because one app is badly behaved. I don’t see anything obviously wrong in top or Activity Monitor. Is the graphical shell single-threaded or something? Steam wasn’t even doing much, just sitting there showing me a web page and downloading an update in the background.

To be fair, I think part of the problem was memory. When I killed Steam and relaunched it, a bunch of memory freed up and things got better. So maybe that’s a Steam bug after all. No Unix kernel handles memory contention well.



ToS/DSCP byte in IP packets

Update: be sure to read comment #3 below from Dave Täht. Apparently the Bufferbloat project has made a lot of progress on good router queuing and there’s a lot for me to learn about, including fq_codel and CeroWRT.

An interesting thing in the docs for Transmission, the BitTorrent client:

> peer-socket-tos: String (default = “default”). Set the Type-Of-Service (TOS) parameter for outgoing TCP packets. Possible values are “default”, “lowcost”, “throughput”, “lowdelay” and “reliability”. The value “lowcost” is recommended if you’re using a smart router, and shouldn’t harm in any case.

The Type-of-Service byte is the second byte in an IP header. It is not widely implemented, and has a tangled history. Is it useful?

The top six bits are for DSCP, Differentiated Services (aka DiffServ). This lets an IP packet (or the program sending it) request a certain class of routing service. There are various things to request: low jitter, best-effort delivery, and a couple of schemes for defining priority classes. I don’t know how well supported these bits are; the Wikipedia entry makes it sound like they’re not entirely ignored. Linux iptables has rules for setting DSCP classes. The Bufferbloat wiki has some discouraging notes on implementation. Windows 7 claims to support it, but looks wonky. Someone found a Comcast bug in their DSCP handling that was harming download speeds.

The bottom two bits are used for ECN, a way for a router to signal to a TCP stack that it should slow down without actually dropping packets. I think ECN is not practically useful on today’s Internet; too many clients don’t implement it, and too many routers don’t use it (or any form of queue management). But I could be wrong. /proc/sys/net/ipv4/tcp_ecn is 0 (not enabled) on my Tomato/Shibby router, even though I’ve turned QoS on.
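For reference, the byte’s layout is easy to check with a little shell arithmetic: DSCP occupies the high six bits and ECN the low two, so the well-known EF (Expedited Forwarding) class, DSCP 46, encodes as the classic ToS byte 0xB8:

```shell
# ToS byte = (DSCP << 2) | ECN: DSCP in the high six bits, ECN in the low two.
dscp=46   # EF (Expedited Forwarding), a well-known DSCP class
ecn=0     # not ECN-capable
tos=$(( (dscp << 2) | ecn ))
printf '0x%02X\n' "$tos"   # prints 0xB8
```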

My conclusion from a quick read is that these bits are not usefully deployed; too much stuff doesn’t support them. Too bad, because they’re exactly what a home network router needs to cooperatively share a link. So much TCP/IP QoS work was abandoned because it doesn’t survive hostile peering relationships. But I really do trust and control my home router, and I wish it could do more to manage bandwidth.

Logs of Lag ready to go

My Logs of Lag site is ready to launch. It works well, a clean and simple way to drop a League of Legends logfile onto a web page and see some statistics.

Not sure how to properly “launch” this yet. Success would be some visibility on the League of Legends subreddit. I’ll probably post it on Monday, but whether it gets upvotes is a crapshoot decided in the first 15 minutes. I don’t expect an amazing response; a tool like this is really more useful hanging around, available for months or years, for when people need it.

I’m pretty happy with how the final project came together. I was worried for a long time about the visual style. I still don’t think it’s great, but adding a basic background texture and switching the fonts away from Helvetica/Arial made a big difference in making it feel “real”. And I don’t want to do the over-produced, glitzy graphics thing that’s popular on most gaming sites; I find it a useless distraction.

Ideas for future features:

  • Highlight “bad” data. I.e., if the loss/second is greater than some threshold, show it in a big angry red warning font. I want to collect some logfiles from users to see what’s typical before building a classifier.
  • Parse broken logs. In particular, the logfile gets wonky if the game disconnects; it would be nice to parse that correctly.
  • An analyzer for a whole directory of files, presenting summary statistics for gameplay across many games. It’d be useful to characterize lag across many games as well. Totally feasible given all the work is client-side, so it could be fast. A histogram showing the distribution of median ping and median packet loss/s would be a good start.

And ideas for the server:

  • Replace the Python CGI with a real server. I have a Node version, but I’m not sure I like Node. Porting the CGI to Gunicorn would suffice for scaling.
  • Or: remove the server entirely, and have the client-side JavaScript upload straight to S3.
  • Or: commit to a real server, and add logins and user data management.