NTFS filesystem corruption in practice

I unplugged a USB drive without unmounting it first and got burned for my troubles. Windows was able to repair the disk and, I think, recover all my files, but I had to do it manually.

I use a hard drive to sync files between two Windows desktops with FreeFileSync. I now just unplug the drive whenever I want to move it; Windows can’t ever unmount it cleanly, it always says the drive is in use, maybe by the backup subsystem? Anyway, I just pull the plug.

So I went to sync the files from the drive to the other disk and it acted wonky. A few files disappeared during the sync, and some other files I expected to exist didn’t. Windows prompted me via a notification to run a check or repair, but clicking that did nothing. Manually right-clicking the disk and doing Tools / Check and repair did work, though. And afterwards the files I was expecting to find came back. At least I think so; I don’t have a complete audit.

I’m confused that Windows NT is willing to mount a USB drive that wasn’t unmounted cleanly without immediately recommending a check and repair. I’m also a bit confused about the pattern of which files went missing; it wasn’t just the newest stuff.
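
For the record, the GUI repair has command-line equivalents. A minimal sketch in Python driving the standard Windows tools; run it from an elevated prompt, and “E:” is a placeholder drive letter:

    import subprocess

    # NTFS keeps a per-volume "dirty" bit recording an unclean unmount;
    # querying it shows whether Windows thinks the volume needs repair.
    subprocess.run(["fsutil", "dirty", "query", "E:"], check=True)

    # Command-line version of the Tools / Check and repair GUI; /f fixes
    # errors. chkdsk exits with code 1 when it found and fixed errors,
    # so don't treat a nonzero exit as failure here.
    result = subprocess.run(["chkdsk", "E:", "/f"])
    print("chkdsk exit code:", result.returncode)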


Sony PlayStation: how to fail at cloud sync

First world problems: I got a second PlayStation console, the sweet God of War edition. Yay for me, no more carrying a PS4 back and forth between houses! Only now I have to worry about moving my save game data around, because cloud sync on PlayStation is so awful.

Sony added auto-upload back in 2015 as part of the paid subscription. However, they never added auto-download. So even when everything is working perfectly you still have to manually download save game data every time you switch devices.

Only auto-upload doesn’t work very well. It seems to only upload when you put the device in rest mode; if you go straight from playing a game to shutting down, the files aren’t auto-uploaded. So you can’t trust it and have to upload manually. But manually uploading is also a hassle: you have to shut the game down entirely to upload. (Which raises the question of how it’s uploading in rest mode, when the game is still in memory, albeit suspended.)

Even with fully manual management there are problems. There’s some confusion about file origin; both of my PS4s complain that the file didn’t originate on their system when I try to download. Also, you can’t be logged in to PlayStation with the same account on two consoles at the same time. And there’s some confusing thing about which console is the “primary” for your account, which seems to have to do with licensing rights. Only maybe a secondary console also doesn’t auto-upload?

Real two-way automatic sync is a little tricky; if the user modifies files on both systems simultaneously and then syncs both, you have a conflict. No gamer wants to have to “git merge” two timelines. But 99% of the time a file is only modified in one place, so “always sync the latest version” is fine as long as you detect conflicts. Hell, do a little light locking and warn the user if they’re trying to play while another session is active. To be safe, a good sync system should also keep a bit of history for the files and some way to recover an older version. Sony of course does none of this, treating storage space as if it were at an enormous premium. (It was a big deal when they moved from 1GB of storage to 10GB for users who are paying $10/month.)
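
To make that concrete, here’s a minimal sketch of “latest version wins” with conflict detection. It’s an illustration of the scheme, not Sony’s (or anyone’s) actual implementation; the state file is my invention and both paths are assumed to exist:

    import json
    import os
    import shutil

    STATE_FILE = "last_sync.json"  # mtimes recorded at the previous sync

    def sync(path_a, path_b):
        """Two-way sync of one save file: the newest copy wins, but if
        both sides changed since the last sync, flag a conflict instead
        of silently clobbering one timeline."""
        state = {}
        if os.path.exists(STATE_FILE):
            with open(STATE_FILE) as f:
                state = json.load(f)

        last_synced = state.get(path_a)  # None on the very first sync
        mtime_a = os.path.getmtime(path_a)
        mtime_b = os.path.getmtime(path_b)

        if last_synced is not None and mtime_a > last_synced and mtime_b > last_synced:
            raise RuntimeError(f"conflict: {path_a} and {path_b} both changed")

        # "Always sync the latest version": copy the newer file over the older.
        src, dst = (path_a, path_b) if mtime_a >= mtime_b else (path_b, path_a)
        shutil.copy2(src, dst)  # copy2 preserves the modification time

        state[path_a] = os.path.getmtime(path_a)
        with open(STATE_FILE, "w") as f:
            json.dump(state, f)

The trick is recording modification times at the last successful sync; without that record you can’t tell “changed on one side” apart from “changed on both”.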

I gather Xbox Live’s version of cloud syncing works better. Steam’s cloud sync (Steam Cloud) mostly works just great too, although I did have it bug out on me once.


2017 Audi A3 map update

Wanted to update the maps in my car’s navigation system. To Audi’s credit they provide twice-yearly updates via a variety of methods, some free. (Sadly they don’t seem to have any way to update the MMI firmware, though.) The map updates are badly documented, so here are the details.

First, the easy options that cost money. You can pay the dealer to do it. Or you can pay for the Audi Connect service (the cellular data service for your car) and updates come over the air.

Now the free manual option. Discussed here; also see this video. It’s not hard or scary; the only real nuisance is their Java download program. Detailed walkthrough:

Downloading update

  1. Log in to myAudi.
  2. Click the “Audi connect services” button. I had a lot of trouble with their SSO redirects, maybe related to my ad blockers and privacy stuff.
  3. On the Audi Connect website go to Services / Map Update and select “Complete Package”, then click the “Prepare Package” button.
  4. Download the map update! Ha ha, just kidding. Download a tiny JNLP file for Java Web Start to run a download program that Audi requires you to use.
  5. Run the JNLP program. I finally managed by enabling it in the Java control panel (Security / Enable Java content for browser and Web Start applications) and then running “javaws foo.jnlp” from a command line. Perhaps there’s an easier way.
  6. Wait for the download. The full US maps were about 11GB.
  7. The download program creates four items: two small files (one named metainfo2.txt, one named randomhexstring.md5) and two directories with the data (Mib1 and Mib2). Apparently all four are necessary for an update.
  8. Copy all four to the root directory of a FAT32 filesystem on an SD card; see the sketch after this list. (A USB drive will work as well.)
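
A sketch of that copy step in Python, assuming the downloader’s output is in the current directory and the SD card is mounted at E:\ (the drive letter is a placeholder, and the glob stands in for the .md5 file’s random hex name):

    import glob
    import shutil

    DEST = "E:/"  # root of the FAT32 SD card

    # The two small files: metainfo2.txt plus the randomly named .md5.
    for f in ["metainfo2.txt"] + glob.glob("*.md5"):
        shutil.copy(f, DEST)

    # The two data directories.
    for d in ["Mib1", "Mib2"]:
        shutil.copytree(d, DEST + d)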

Installing update

  1. Go to the car and put the SD card in one of the slots for the MMI. (I used SD2).
  2. Turn the MMI on. (I do this by turning the radio on, not the ignition).
  3. Go to Settings / System Maintenance / Version info to check the versions of things already on your system. This shows the MMI software version (0694 for me) and some basic navigation database info; you have to drill into the Navigation database submenu to see all the databases you have. My car had about twenty 2016/2017 databases for North America, plus a single 2017/2018 one for USA, Region 4 (California, where I live). That was an OTA update I got right after buying the car.
  4. Go to Audi Connect and select “Software update”. Select the source you plugged in (SD2 for me) and it begins. It takes a while, about 30 minutes for me.
  5. Leave the car and close the door. The lights will turn off but the MMI will keep running with the screen lit, still updating. Eventually the screen will turn off and it seems to keep updating. Hopefully none of this drains the battery too much. The update is automatic and requires no extra input.
  6. Once it finishes you may have to click once to acknowledge the update.
  7. Go back to Version info to check the version numbers. Now all my car’s maps are labelled “2018”. Success!

Of course I’m curious about the contents of the update files but couldn’t find much info. Mib1 has a directory named “Eggnog” and Mib2 has “Truffles”; internal product names? Mib1’s data files have a “.psf” extension and Linux guesses some (but not all) are FoxPro FPT files, which seems plausible. Mib2 is mostly compressed data I couldn’t figure out, with some small SQLite tables without much of interest in them. Probably all proprietary data, of course, and if they’re smart it’s encrypted too, to prevent casual snoops like me from lifting all the data out.
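
That snooping was roughly the following, a sketch run from the root of the downloaded update; file(1) is what produced the FoxPro guesses:

    import glob
    import subprocess

    # Ask file(1) to guess the type of each Mib1 data file.
    for path in sorted(glob.glob("Mib1/**/*.psf", recursive=True)):
        subprocess.run(["file", path])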

Shame Audi doesn’t also let you upgrade the MMI firmware. Mine has a bug where it’s limited to indexing 10,000 music files. Admittedly that’s a lot of music, but it’s a nuisance if you just want to copy all the music you own into the car. Static array sizes: not even once!

Windows 10 fail, safe mode, reboot

My Windows 10 box failed, hard. Well, maybe soft, who knows? It would boot and log me in as a user, then the system’s graphical shell would hang, little spinny cursor. One window was visible on screen: Razer Central. That’s some nonsense to turn off the stupid light on my fancy gaming mouse, and it’s notoriously bad software, so maybe it caused the whole system to hang? But even after killing it (Task Manager worked) the system was still hung. File Explorer seemed to be spinning at 100% of a single CPU core; that’s not a good sign.

Anyway it took me a couple of hours to get the system back. How to boot into Safe Mode or access other recovery tools on Windows 10 is not at all clear; these docs finally helped me. I tried restoring to a System Restore Point first, and that didn’t help; the system was just as broken as before. Fortunately I’d made a full system backup the day before (using the old deprecated Windows 7-era tool) and restoring that fixed it.

Not sure what really was wrong. The computer had been off for about two weeks, then failed after it rebooted itself to apply some Windows 10 system updates. There were also some Razer updates to install, and given that Razer’s window was on screen when the rest of the system was broken, and given their reputation for broken software, I’m inclined to blame it. This isn’t even the Razer laptop; this is just a normal Windows machine with a Razer mouse, and I’d installed the software to, like I said, turn off the stupid light on the mouse. Anyway, now I’ve excised that software, turned off a lot of other third-party services that want to launch at boot (screw you, Adobe updater), and will reboot with my fingers crossed.

Update: a few days later I finally worked up the courage to try rebooting. And it worked. I’m 90% sure the Razer update had some interactive step that failed and blocked the rest of the upgrade boot.

x264 / x265 sizes compared

Doing a little ugly, fast-and-loose codec comparison. I took a pirate scene MKV file in 720p with 5.1 audio, an hour-long video that was (no doubt) lovingly encoded in H.264 with one audio track and several subtitle tracks. I then transcoded it to H.265 and H.264 using ffmpeg with default settings (the exact commands are sketched below). That also transcoded the audio from 640kbps AAC to 276kbps Vorbis. Anyway, here’s what I got for file sizes:

Original file

  • 1800MB, 720p, 51 minutes
  • 1580MB video, 240MB audio
  • h264 (High)

x265

  • 170MB, or roughly 10% of the size
  • 80MB video, 90MB audio
  • hevc (Main), encoded with “ffmpeg -c:v libx265”, default settings.
  • Encoded at about 1.3x playback speed (software)

x264

  • 400MB, or roughly 22% of the size
  • 290MB video, 90MB audio
  • h264 (High), encoded with “ffmpeg -c:v libx264”, default settings.
  • Encoded at about 3x playback speed (software)
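
For the record, the transcodes were roughly the following, sketched in Python with placeholder file names. With no audio flags, ffmpeg re-encodes the audio to the MKV muxer’s default codec, which is how the 640kbps AAC became 276kbps Vorbis:

    import subprocess

    SRC = "source.mkv"  # placeholder name for the original 1800MB file

    # Default settings only, as described above.
    subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx265", "out-x265.mkv"], check=True)
    subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx264", "out-x264.mkv"], check=True)

(The defaults hide a quality knob: libx264 defaults to CRF 23 and libx265 to CRF 28; lower CRF means bigger, better-looking files.)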

The x265 encoding is remarkable. The video is literally smaller than the audio! And the resulting file is 10% of the source. The video segment is 5% of the size of the source and the audio is 38%.

The x264 re-encoding is also notable, in that I’ve squished it to 22% of the source with nominally the same codec. That’s a pretty good savings, and it suggests the source file was encoded at very high quality / high bitrate to make paying customers happy.
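
Some back-of-the-envelope bitrate arithmetic from the sizes above (51 minutes ≈ 3060 seconds) backs that up:

    # Video bitrate in Mbit/s = megabytes * 8 / 3060 seconds.
    for name, megabytes in [("source h264", 1580), ("x265", 80), ("x264", 290)]:
        print(f"{name}: {megabytes * 8 / 3060:.2f} Mbit/s")
    # source h264: 4.13 Mbit/s -- quite high for 720p
    # x265: 0.21 Mbit/s
    # x264: 0.76 Mbit/s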

The resulting x265 file looks fine on my TV. You can see some posterization in subtle dark areas, and maybe the sound isn’t as clear. But I’m blessed with a tin ear so those degradations don’t bug me much. No doubt there’s a sliding scale here between bandwidth and quality that a proper codec-testing setup could explore. Me, I’m just a dumb bear using ffmpeg default settings and pretty happy with the result.

This experiment was all motivated by this post and discussion comparing AV1 to x264 and VP9. AV1 is the new hotness in codecs, although as with all these things there are years of controversy, patents, and hardware support to settle before adoption.

Queering the Heatmap

Over on Metafilter I learned about the project Queering the Map. I like it! It’s a simple project that lets people share their LGBT experiences on a map. I like the anonymous confessional aspect of it, and also the visibility and power of LGBT people sharing our experiences.

I liked it so much I made my own visualization: Queering the Heatmap. I took a static snapshot of their textual data and put it into Mapbox GL JS. Mostly I wanted a heatmap of comment density. I was also curious whether it’d be faster than the Google Maps site; it definitely is. It just took a couple of hours, thanks in large part to being able to crib map code from my Wanderings project.
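
The data munging was just reshaping the snapshot into GeoJSON points, which is what Mapbox GL JS heatmap layers consume. A sketch; the input field names are my placeholders, not Queering the Map’s actual schema:

    import json

    # Hypothetical snapshot shape: a list of {"lat": ..., "lng": ..., "text": ...}.
    with open("snapshot.json") as f:
        records = json.load(f)

    features = [
        {
            "type": "Feature",
            # GeoJSON wants [longitude, latitude], in that order.
            "geometry": {"type": "Point", "coordinates": [r["lng"], r["lat"]]},
            "properties": {"text": r["text"]},
        }
        for r in records
    ]

    with open("points.geojson", "w") as f:
        json.dump({"type": "FeatureCollection", "features": features}, f)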

Doubt I’ll do anything more with it, at least without permission and cooperation from the folks in Montreal who run Queering the Map. I think without the “add point to the map” feature it’s not really that exciting, and of course I shouldn’t do that without permission from them. Particularly since they had to deal with a troll attack recently.

Screenshot below. Not sure the heatmap setup is quite right, but I got tired of tinkering with gradients.

Edit: this blog post is so self-deprecating! Lol. I like this little project, I’m proud of being able to make something pretty like this in a couple of hours. Wouldn’t mind doing this as a more real thing if the folks running the project like it.

screenshot-montreal.jpg

Centroid street addresses considered harmful

Had a precise example of a mapping error in Sydney with my Airbnb, located at 3/239 Victoria St in Darlinghurst, Sydney, NSW, Australia.

Screenshot_2.png

Great location in the middle of a fun central neighborhood. The front door is on the east side of the building, on Victoria St itself. But the map pin is in the middle of the building, the centroid of the rectangle. And that means a lot of map software guesses the address is just a bit closer to the west side of the building, to that little Hayden Ln, which is a back alley you can’t really drive in and isn’t accessible from the apartment.

This caused real problems with Uber. Their routing software would sometimes try to send the driver up to the back alley. Drivers are smart enough not to do that, but following the purple line put them on Liverpool St, where they can’t turn left onto Victoria St to drive to the actual door. This detail mattered because one of us was injured and couldn’t walk the half block down to Liverpool St. Also the routing was unstable; sometimes it directed drivers to the east side on Victoria St and then would flip mid-drive to the west side on Hayden Ln. Which changes the whole route by half a kilometer, because Victoria St is one way.

We have the official address point data from the Australian government on OpenAddresses. It geocodes the address as 151.2212399,-33.8779683, which is just a hair further southwest than the Google Maps pin. That’s the official correct location for the address, but it does not describe the ground truth. Bing Maps has more or less the same wrong point as Google; so does Apple. OpenStreetMap doesn’t have the street number at all and guesses the wrong segment of the road.

The underlying problem here is that the database has the polygon for the building but not the exact point of the front door. So it guesses a point by filling in the centroid of the polygon, which is kinda close but not close enough. A better heuristic might be “center of the polyline that faces the matching street”. That’s also going to be wrong sometimes, but less often.
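
Here’s a sketch of the two heuristics in Python with shapely, using a made-up rectangle and street rather than the real Victoria St geometry. The point on the building outline nearest the matching street is a crude stand-in for “center of the facade that faces the street”, but it at least lands the pin on the correct side of the block:

    from shapely.geometry import LineString, Polygon
    from shapely.ops import nearest_points

    # Made-up geometry: a rectangular building with the street along
    # its east side.
    building = Polygon([(0, 0), (4, 0), (4, 6), (0, 6)])
    street = LineString([(6, -2), (6, 8)])

    centroid = building.centroid  # where the geocoders put the pin: POINT (2 3)

    # Better guess: the point on the building outline closest to the
    # matching street, i.e. somewhere on the east wall (x == 4).
    facade_guess, _ = nearest_points(building.exterior, street)

    print("centroid:", centroid)
    print("facade guess:", facade_guess)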

One extra wrinkle: our proper address was “3/239 Victoria St Darlinghurst”. It’s not enough to say Sydney; you have to name the actual suburb, Darlinghurst, because there are multiple Victoria Streets. And the 3/ is an essential part of the address, naming the apartment unit number. Various official forms won’t accept just “239 Victoria St”.

PS: Gelato Messina is excellent.

Update: a friend at Mapbox tells me they do something similar to my heuristic. It’s implemented in this library and is in the process of being added to their geocoder. He describes the algorithm as “drawing a bisecting line from the address point to the closest point on the street line”. I imagine the details are more complicated.

Update 2: a second example, 2324 Sacramento St, SF, CA. The map pin is in the middle of the building, which caused our Deaf Uber driver to go to the “closest” point, that stupid alley on Clay St. There’s a lovely driveway and drop-off location right at the street frontage on Sacramento that we could not get the driver to go to, between communication difficulties and map failures.