noUiSlider, a nice Javascript control

Just wanted to give a shout out to noUiSlider, a nice Javascript library for building slider controls. We just used it to add a time-range slider control to Wanderings, my little location tracker. Now you can restrict what data is shown to specific date ranges. Our control looks a lot like this demo.

There’s a zillion slider controls implemented in Javascript. Most of the well known ones are 5+ years old and require jQuery. There’s some fancy new ones that require React. What I like about noUiSlider is that it’s standalone and doesn’t rely on any other frameworks. It’s not tiny but it’s pretty small, about 90k uncompressed or 25k gzipped.

noUiSlider is also remarkably professional. It’s quite flexible; you can have any number of handles, linked or not. It’s straightforward to restyle. It gives off a nice collection of events when the user interacts with it. It works thoughtfully on touchscreens. And it’s all well documented! Nice library.
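The basic API is pleasantly small. Here’s a minimal sketch of a two-handle range slider that reports changes; the element id and the 0–100 range are made up for illustration, this isn’t our actual Wanderings code.

var el = document.getElementById('date-range');   // hypothetical element
noUiSlider.create(el, {
    start: [25, 75],                 // two linked handles
    connect: true,                   // draw the bar between them
    range: { min: 0, max: 100 }
});
el.noUiSlider.on('update', function (values, handle) {
    console.log('handle ' + handle + ' is at ' + values[handle]);
});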

The one thing I haven’t figured out how to do in my app is add a nice UI for “filter to one month” or “one year”. It’d be easy enough to add buttons, but then the UI would be cluttered; I want something less obtrusive. For now we have a bunch of hotkeys defined on the main window (YQMWD for ranges, also HL to pan back and forth) but that’s pretty mysterious.

Edit WSL Linux files from Windows with SFTP

I just hit on a way to edit files in WSL that in retrospect seems really obvious. I’m wondering if there’s some reason it’s a bad idea. It seems to work!

The problem I’m trying to solve is editing files in the Linux partition of WSL from Windows, i.e. editing $HOME/.bashrc with Notepad. We’re told to absolutely never change the Linux filesystem’s files from Windows because things can break.

So my solution is to run the sshd daemon under WSL, then connect to my Linux files over SFTP from a Windows app. I’m using SFTP Net Drive to mount my Linux files as a Windows drive; there are other options. Then I can use any Windows program to edit the files on that drive, like Notepad or Sublime Text.
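For reference, the WSL side is just standard Ubuntu sshd setup. This is a sketch from memory, not a tested recipe; the port is an arbitrary choice, and a fresh WSL install may need its host keys regenerated.

sudo apt install openssh-server
sudo ssh-keygen -A                # regenerate any missing host keys
sudo nano /etc/ssh/sshd_config    # e.g. Port 2222, PasswordAuthentication yes
sudo service ssh start

Then point SFTP Net Drive (or any SFTP client) at localhost on whatever port you picked.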

All the actual file I/O is happening under WSL thanks to SFTP, so it should be safe. SFTP imposes some overhead but it’s not a big deal for just editing files. One nice thing is SFTP seems to do a pretty good job at preserving Unix file permissions even when editing from Windows.

Update: I’m now doing this all the time, with the Linux partition mounted as W: in Windows. It’s great! It makes a big difference having easy access to the files over there.

(originally posted to Reddit)

New release of Wanderings, in which I learn to love Webpack

Brad and I pushed our first new release of Wanderings since the big release last January. No user visible changes, but we upgraded all the versions of third party packages and build tools we were using.

The real story is that I had to get over being whiny about webpack and npm. Right after we finished 1.0 Brad stepped in and reorganized our code to use fancy web tools. Previously we just had a big ol’ wanderings.js file we imported on every page, along with a bunch of third party minified redistributables included one after the other in the HTML file. You know, the old school way. Brad redid everything nicely in webpack and npm so that third party modules were fetched from a repository and bundled together. Our code was refactored and then combined into the bundle. All with nice frontend scripts like “test” and “deploy” and “start development server”.

Unfortunately I don’t know anything about these fancy newfangled tools; I live or die by press-F5-in-the-browser. And it was kind of a big hump to get over: learning the tools and actually using them. I have a categorical problem any time someone tries to inject a “compile” or “link” phase into what should be simple interpreted code. It always slows me down. npm is very hard to love, too.

But webpack solves a lot of problems too. Managing and distributing the right versions of things. Bundling them so they distribute efficiently. Minifying the code for obfuscation + network savings. Keeping it possible to still run devel versions with source maps, etc., for easy debugging.

The fancy devel server we get from webpack is pretty nice too. It seems to automagically arrange for things to recompile and reload when they change. I had a problem with this at first because our devel bundle is like 4.5 megabytes and I was running my devel server remotely, on another continent. That’s a bad idea; once I got the devel server running locally on my laptop it was much better. (We might want to split our bundle into two pieces though, a big one of third party code that seldom changes and a small one of our own code. Just to keep reload overhead down).
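For what it’s worth, webpack can do that split itself. Here’s a minimal sketch, assuming webpack 4’s optimization.splitChunks option (older versions used CommonsChunkPlugin instead); this isn’t our actual config.

// webpack.config.js (sketch)
module.exports = {
    // ...entry, output, loaders as before...
    optimization: {
        splitChunks: {
            cacheGroups: {
                vendors: {
                    test: /[\\/]node_modules[\\/]/,  // all third party code
                    name: 'vendors',                 // goes in its own bundle
                    chunks: 'all'
                }
            }
        }
    }
};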

I’m still grumpy about some things. I’ve lost all the global variables I used to poke at in the console to inspect the state of the running system. And I’m not sure I can get the Firefox debugger working at all; the source maps or something aren’t working. Chrome seems better though.

I still hate imposing any overhead in the “change code, reload, run” loop. But all this build stuff solves real problems in the Javascript ecosystem so I think it’s best to play along.

PC gamepads: DirectInput vs XInput

I bought a crappy cheap game controller to use for a few weeks and ran into the total compatibility nightmare which is Windows game controllers.

Long story short, there are two completely separate APIs: DirectInput and XInput. Old controllers (like my cheap PS3 knockoff) use DirectInput. Starting around 2005 Microsoft introduced XInput as a new API with better support for advanced controllers. It had drawbacks and some controller makers were slow to adopt it. Around 2011 Microsoft deprecated DirectInput, so everyone’s supposed to use XInput now. Despite that you can still buy new controllers that do DirectInput. Some controllers have a hardware switch to toggle between the two modes.

What’s dumb is that not all games still support DirectInput. (Looking at you, Dark Souls.) Worse, Microsoft’s drivers don’t provide a compatibility shim so that a DirectInput device can be used via the XInput API. So you have to use something third party if you have an old DirectInput controller and want it to work with XInput. There are three options I found:

x360ce is a thing that emulates an Xbox 360 controller: it looks like an XInput driver to the game but takes its input from DirectInput devices. I’m not sure it really is a driver; the hack includes its own XInput DLL you have to drop into some games’ folders to fake them out. I didn’t try this, but it’s the usual old school gamer recommendation for solving this problem. It seems very flexible and hacker friendly. There are newer binary builds on the GitHub page.

ScpToolkit is similar to x360ce, but it’s aimed at Playstation controllers. I did actually install this and found it didn’t work, and uninstalling was spooky. I ended up using Windows’ System Restore to remove it.

Steam Big Picture mode also includes an XInput emulator that seems to work well. I can’t find official docs for it and it’s confusing, but the Internet is full of badly written guides and videos on how to use it. This is what I ended up using. Setup is a bit fiddly; for a while I was telling it the controller was “Generic” and it was sort of working, but what I really wanted was “Playstation”. I’m still not clear if that means it’s emulating a Playstation controller, or treating this cheap controller I bought as if it were a Playstation controller. Or both. Whatever, it seems to work.

mosh behind NAT

I wanted to give mosh a try to see if the fancier remote shell program would work around this crazy Berlin Internet that terminates idle connections in < 5 minutes.

Turns out mosh doesn’t work if the server you’re connecting to is behind NAT. It relies not only on connecting to TCP:22 for the ssh, but also on a second server running on UDP:60001 or nearby to do its magic. Worse, the mosh server doesn’t seem to have any support for NAT traversal. The advice in the docs is “set up static port forwards for these 1000 ports”. I appreciate the optimism about a pure 1990s Internet, but in the modern environment that’s ridiculous.

The solution I found is this 2012 shell script, a wrapper for mosh-server that uses miniupnpc to request the port forward first. Seems to work for me with my Ubiquiti router. You arrange for it to be invoked by putting the wrapper script in your PATH on the server, then running “mosh --server=mosh-server-upnp” on the client you’re connecting from.
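The gist of the wrapper is something like the sketch below. This is my paraphrase of the idea, not the actual 2012 script; the port number and upnpc flags are assumptions, so read the real script before trusting it.

#!/bin/sh
# mosh-server-upnp (sketch): ask the router for a UDP port forward
# via UPnP, then hand off to the real mosh-server on that port.
PORT=60001
IP=$(hostname -I | awk '{print $1}')    # this machine's LAN address
upnpc -e mosh -a "$IP" $PORT $PORT UDP  # request the forward
exec mosh-server "$@" -p $PORT          # client passes "new ..." itself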

There’s also this recent Ruby script which might be a better choice, but I couldn’t be bothered to look up how to install Ruby scripts.


Kiwix: Wikipedia offline

Just a shout-out to Kiwix, the software plus database for reading Wikipedia offline. It’s incredibly helpful to have on your cell phone when you’re traveling and don’t have a data plan. Also nice just to have lightning-fast browsing of Wikipedia. They now have clients for iOS, Android, Windows, MacOS, and Linux.

You have to download the data separately. Their ZIM format seems well thought out: compressed tight and general purpose enough that you can get other document collections like Stack Overflow posts, TED talks, etc. English Wikipedia is an enormous 35GB, 80GB if you get images too. There are various Wikipedia subsets but no “top articles” option that I can see. (Be careful not to download a Simple English version by accident.)

The most interesting thing to me is that Kiwix is now partnered with the Wikimedia Foundation. A nice cash infusion, but it also gives Kiwix some legitimacy. I’m hoping that gives a second wind to the editorial project of producing a Wikipedia subset that’s smaller but well selected enough to be sufficient.

There are some other offline Wikipedia readers on the Apple app store. I used Wiki Offline for years but the company that produced that seems to have disappeared (again). There’s also Minipedia which does have a nice small subset you can download. I think all of these alternatives found it hard to produce regular updates with the most recent Wikipedia content. Kiwix seems to be doing quarterly releases.


x265 transcoding

I continue to re-encode video I get into x265. It’s much smaller, which helps both with storage and bandwidth. And it looks fine to my tin eye.

My script has evolved over time. This is what I do now:

ffmpeg -i "$1" -map 0 -c copy -c:v libx265 "${1%.*} x265.mkv"

In theory this leaves all the audio and subtitle tracks alone (-map 0 keeps every stream, -c copy passes them through unchanged) and re-encodes just the video in x265 with default settings, since the later -c:v libx265 overrides the copy for video.

I was in a hurry today and so got fancy with resizing the video to 360p and using a faster / less accurate encoding setting.

ffmpeg -i "$1" -map 0 -c copy -vf scale=-1:360 -c:v libx265 -preset faster "${1%.*} x265.mkv"

Quality definitely suffers for this, but I can encode at 3-4x playback speed on my CPU this way.
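If you want a middle ground between size and quality, the knob to turn is CRF (ffmpeg’s libx265 defaults to 28; lower means better quality and bigger files). Something like this, though the right value is a matter of taste:

ffmpeg -i "$1" -map 0 -c copy -c:v libx265 -crf 23 "${1%.*} x265.mkv"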