My Wanderings app has a problem: when it’s running it sometimes induces iOS to consume a lot of data in the Time & Location service. Like 30MB a day. Because the data usage isn’t attributed to our own app but rather to iOS itself, it’s hard to figure out what’s going on. I ended up setting up a second iPhone with a prepaid account just to have a phone running nothing but Wanderings, to try to figure out what is happening.
It’s still tricky to monitor cellular data usage on an iPhone. The Settings / Cellular page does give info, but you have to remember to look at it regularly to record data. Not sure there’s any good way to automate that, but I’ve tried a couple of apps aimed at helping people monitor their cellular contract usage.
MyDataManager is the most promising so far. It has a graph of hourly data usage. What’s promising is that it sets up some sort of VPN config on the phone that it uses for monitoring. They promise your traffic isn’t actually routed through a network VPN, so my guess is it’s a local-to-iPhone thing that gives them more monitoring ability. I’ve just set it up. Update: not impressed with the app. It only measures when you launch the app, so there’s no graph of history. And it seems to consume a lot of data itself.
MobiStats is what I tried first. It also has an hourly graph but near as I can tell the data is only populated when you launch the app, not regularly in the background. That may be an iOS limitation, dunno.
Both apps also have a feature to record where you are using data, not just when, using location services. Neat idea! But because I’m testing iOS Location Services myself I don’t want to muddy the water turning that on.
Update: neither app tracked usefully; both only recorded new data when I ran them in the foreground. A little surprised that was the case with MyDataManager, as I’d hoped the VPN would give it more logging ability. Also that app used a bunch of data on its own.
I tried to copy some new music I bought to my iPhone. In 2018 this is apparently impossible if you want to use iTunes / Apple Music on the phone to play it.
Well it is possible, if you happen to be sitting in front of the One True Computer that has been managing your iPhone sync all along. You can just add some new music and press the sync button again and it works. But if you have the temerity to have a second computer and try to copy music from that computer via iTunes sync, it will fail. Because iTunes will first insist you delete all the music on your phone that’s not present on this second computer. Apple doesn’t deign to let you keep your music. I’m not sure if this is because they assume you are a criminal or if they are just incompetent at multi-way sync.
There used to be a variety of hacks to work around this. For a while you could clone the library key and fool iTunes into thinking two computers were the same; this no longer works. Some advice has you clicking “manage music manually” in iTunes; I think this sort of helps, but still only if you have access to the full original music library on the second computer. Finally there’s a bunch of third party programs that claim to be file explorers for iTunes that let you copy in music in an unsanctioned / flexible way. I tried several of these for Windows and literally none of them worked. Also none seem really free; they want to charge $40 for the ability to copy a few files. (I fear I may have gotten Computer Herpes installing all this sketchy software.)
Anyway, the only solution I’ve really found is VLC Mobile. It’s back on the App Store, or maybe it never left. And it will play files you copy over via the iTunes File Sharing tab. You won’t be able to play them in the usual Apple Music app, but you will be able to play them. I also fear VLC Mobile doesn’t support CarPlay, another Apple lock-in system, but I haven’t tried it.
While I’m here; playing video files has the same library problem. For that I use Infuse, a third party movie player which is excellent and has its own file store.
(The music in question is Autechre NTS Sessions: 8 hours of glorious bleepy bloopy glitches from Bleep / Warp. Good stuff.)
I unplugged a USB drive without unmounting it first and got burned for my trouble. Windows was able to repair the disk and I think recover all my files, but I had to do it manually.
I use a hard drive to sync files between two Windows desktops with FreeFileSync. These days I just unplug the drive when I’m done with it; Windows can’t unmount it properly, it always says it’s in use, maybe by the backup subsystem? Anyway, I just pull the plug.
So I went to sync the files from the drive to the other disk and it acted wonky. A few files disappeared during the sync. Some other files I expected to exist didn’t. Windows prompted me via a notification to run a check or repair, but clicking it did nothing. Manually right-clicking the disk and doing Tools / Check and Repair did work though. And afterwards the files I was expecting came back. At least I think so; I don’t have a complete audit.
I’m confused that Windows NT is willing to mount a USB drive that wasn’t unmounted cleanly without immediately recommending a check & repair. Also a bit confused about the pattern of which files went missing; it wasn’t just the newest stuff.
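For the complete audit I didn’t do, here’s roughly what I’d script: walk both directory trees and report anything present on one side but not the other. This is a hypothetical helper, not anything FreeFileSync provides; it only compares filenames, not contents or timestamps.

```python
import os

def relative_files(root):
    """Set of file paths under root, relative to root."""
    found = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            found.add(os.path.relpath(full, root))
    return found

def audit(source, mirror):
    """Files present in source but not mirror, and vice versa."""
    src, dst = relative_files(source), relative_files(mirror)
    return sorted(src - dst), sorted(dst - src)
```

Running `audit("E:/sync", "D:/sync")` after a repair would at least tell you whether anything is still missing on either side.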
First world problems: I got a second PlayStation console, the sweet God of War edition. Yay for me, no more carrying a PS4 back and forth between houses! Only now I have to worry about moving my save game data around, because cloud sync on PlayStation is so awful.
Sony added auto-upload back in 2015 as part of the paid subscription. However, they never added auto-download. So even when everything is working perfectly you still have to manually download save game data every time you switch devices.
Only auto-upload doesn’t work very well. It seems to only upload when you put the console in rest mode. If you go straight from playing a game to shutting down, the files aren’t auto-uploaded. So you can’t trust it and have to upload manually. But manually uploading is also a hassle: you have to shut the game down entirely to upload. (Which raises the question of how it’s uploading in rest mode when the game is still in memory, albeit suspended.)
Even with fully manual management there are problems. There’s some confusion about file origin: both of my PS4s complain that the file didn’t originate on their system when I try to download. You also can’t be logged in to PlayStation on the same account on two consoles at the same time. And there’s some confusing notion of which console is your “primary” for your account, which seems to have to do with licensing rights. Only maybe a secondary console also doesn’t auto-upload?
Real two-way automatic sync is a little tricky: if the user modifies files on both systems simultaneously and then syncs both, you have a conflict. No gamer wants to have to “git merge” two timelines. But 99% of the time a file is only modified in one place, so “always sync the latest version” is fine as long as you detect conflicts. Hell, do a little light locking and warn the user if they’re trying to play while another session is active. To be safe, a good sync system should also keep a bit of history for the files and some way to recover an older version. Sony of course doesn’t, treating storage space as if it comes at some enormous premium. (It was a big deal when they moved from 1GB of storage to 10GB for users who are paying $10/month.)
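The “always sync the latest version, but detect conflicts” policy is simple enough to sketch. This is a toy illustration, not Sony’s actual logic; the one assumption is that you record a timestamp at the last successful sync so you can tell which sides have changed since.

```python
def decide(local_mtime, remote_mtime, last_synced):
    """Pick a sync action from modification times.

    last_synced is the timestamp recorded at the previous
    successful sync. A side is 'changed' if it was modified
    after that point; if both changed, that's the git-merge
    case nobody wants, so flag it instead of clobbering.
    """
    local_changed = local_mtime is not None and local_mtime > last_synced
    remote_changed = remote_mtime is not None and remote_mtime > last_synced
    if local_changed and remote_changed:
        return "conflict"
    if local_changed:
        return "upload"
    if remote_changed:
        return "download"
    return "in-sync"
```

With that in place, auto-download on login is just `decide(...) == "download"`, and a conflict prompts the user once instead of silently losing a timeline.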
I gather Xbox Live’s version of cloud syncing works better. Steam Cloud mostly works just great too, although I did have it bug out on me once.
Wanted to update the maps in my car’s navigation system. To Audi’s credit they provide twice-yearly upgrades via a variety of methods, some free. (Sadly they don’t seem to have any way to update the MMI firmware, though.) The map updates are badly documented, so here are the details.
First, the easy options that cost money. You can pay the dealer to do it. Or you can pay for the Audi Connect service (the cellular data for your car) and updates come over the air.
Now the free manual option. Discussed here, also see this video. It’s not hard or scary, the only real nuisance is their Java download program. Detailed walkthrough:
- Log in to myAudi.
- Click the “Audi connect services” button. I had a lot of trouble with their SSO redirects, maybe related to my ad blockers and privacy stuff.
- On the Audi Connect website go to Services / Map Update and select “Complete Package”, then click the “Prepare Package” button.
- Download the map update! Ha ha just kidding. Download a tiny JNLP file for Java Web Start to run a download program that Audi requires you use.
- Run the JNLP program. I finally managed by enabling it in the Java control panel (Security / Enable Java content for browser and Web Start applications) and then running “javaws foo.jnlp” from a command line. Perhaps there’s an easier way.
- Wait for the download. The full US maps were about 11GB.
- The download program creates four things. Two small files: one named metainfo2.txt and one named randomhexstring.md5. And two directories with the data: Mib1 and Mib2. Apparently all four are necessary for an update.
- Copy all four to the root directory of a FAT32 filesystem on an SD card. (A USB drive will work as well.)
- Go to the car and put the SD card in one of the slots for the MMI. (I used SD2).
- Turn the MMI on. (I do this by turning the radio on, not the ignition).
- Go to Settings / System Maintenance / Version info to check the versions of things on your system already. This shows the MMI software version (0694 for me) and some basic navigation database info; you have to navigate to the Navigation database submenu to see all the databases you have. My car had about 20 2016/2017 databases for North America, plus a single 2017/2018 for USA, Region 4 (California, where I live). That was an OTA update I got right after buying the car.
- Go to Audi Connect and select “Software update”. Select the source you plugged in (SD2 for me) and it begins. It takes a while, about 30 minutes for me.
- Leave the car and close the door. The lights will turn off but the MMI will keep running with the screen lit, still updating. Eventually the screen turns off and it seems to keep updating. Hopefully none of this drains the battery too much. The update is automatic and requires no extra input.
- Once it finishes you may have to enter a single click to acknowledge the update.
- Go back to Version info to check the version numbers. Now all my car’s maps are labelled “2018”. Success!
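One footnote on that randomhexstring.md5 file: I don’t know its exact format, but assuming it holds a standard hex digest, you could sanity-check an 11GB download yourself before schlepping the SD card out to the car. A hedged sketch (the chunked reading is just so you don’t load the whole file into memory):

```python
import hashlib

def md5_of(path, chunk=1 << 20):
    """Hex MD5 digest of a (possibly large) file, read in 1MB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()
```

Compare `md5_of("some_map_file")` against the digest in the .md5 file; whether Audi’s checksum covers one file or the whole package is an open question.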
Of course I’m curious about the contents of the update files but couldn’t find a lot of info. Mib1 has a directory named “Eggnog” and Mib2 has “Truffles”; internal product names? Mib1’s datafiles have a “.psf” extension and Linux’s file command guesses some (but not all) are FoxPro FPT files, which seems plausible. Mib2 is mostly compressed data I couldn’t figure out, with some small SQLite tables without much of interest in them. Probably all proprietary data, of course, and if they’re smart it’s encrypted too, to prevent casual snoops like me from lifting all the data out.
Shame Audi doesn’t also let you upgrade MMI firmware. Mine has a bug where it’s limited to indexing 10,000 music files. Admittedly that’s a lot of music, but it’s a nuisance if you just want to copy all the music you own into the car. Static array sizes: not even once!
My Windows 10 box failed, hard. Well, maybe soft, who knows? It would boot and log me in as a user, then the graphical shell would hang: little spinny cursor. One window was visible on screen, Razer Central. That’s some nonsense to turn off the stupid light on my fancy gaming mouse, and it’s notoriously bad software, so maybe it caused the whole system to hang? But even after killing it (Task Manager worked) the system was still hung. File Explorer seemed to be spinning at 100% of a single CPU core; that’s not a good sign.
Anyway, it took me a couple of hours to get the system back. Booting into Safe Mode or accessing other recovery tools on Windows 10 is not at all obvious; these docs finally helped me. I tried restoring to a System Restore Point first and that didn’t help; the system was just as broken as before. Fortunately I’d made a full system backup the day before (using the old deprecated Windows 7–style tool) and restoring that fixed it.
Not sure what was really wrong. The computer had been off for about two weeks, then failed after it rebooted itself to apply some Windows 10 system updates. There were also some Razer updates to install, and given that its window was on screen when the rest of the system was broken, plus its reputation for broken software, I’m inclined to blame it. This isn’t even the Razer laptop; this is just a normal Windows machine that had a Razer mouse, and I’d installed the software to, like I said, turn off the stupid light on the mouse. Anyway, now I’ve excised that software, turned off a lot of other third party services that want to launch at boot (screw you, Adobe updater), and will reboot with my fingers crossed.
Update: a few days later I finally worked up the courage to try rebooting. And it worked. I’m 90% sure that the Razer update had some interactive thing it had to do that failed and blocked the rest of the upgrade boot.
Doing a little fast-and-loose codec comparison. I took a pirate scene MKV file in 720p with 5.1 audio, an hour-long video that was (no doubt) lovingly encoded in H.264 with one audio track and several subtitle tracks. I then transcoded it to H.265 and to H.264 using ffmpeg with default settings. This also transcoded the audio from 640kbps AAC to 276kbps Vorbis. Anyway, here’s what I got for file sizes:
- Source: 1800MB, 720p, 51 minutes
  - 1580MB video, 240MB audio
  - h264 (High)
- H.265: 170MB, or roughly 10% the size
  - 80MB video, 90MB audio
  - hevc (Main), encoded with “ffmpeg -c:v libx265”, default settings
  - Encoded at about 1.3x playback speed (software)
- H.264: 400MB, or roughly 22% the size
  - 290MB video, 90MB audio
  - h264 (High), encoded with “ffmpeg -c:v libx264”, default settings
  - Encoded at about 3x playback speed (software)
The x265 encoding is remarkable. The video is literally smaller than the audio! And the resulting file is 10% of the source. The video segment is 5% of the size of the source and the audio is 38%.
The x264 re-encoding is also notable, in that I’ve squished it to 22% of the source with nominally the same codec. That’s a pretty good savings and indicative that the source of the file was probably encoded with very high quality / high bitrate to make paying customers happy.
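Those percentages are just arithmetic on the sizes listed above, easy to re-derive:

```python
# Sizes in MB, taken from the list above.
source_total, h265_total, h264_total = 1800, 170, 400
source_video, source_audio = 1580, 240
h265_video, h265_audio = 80, 90

# Whole-file ratios.
print(f"h265: {100 * h265_total / source_total:.0f}%")   # ~9-10%
print(f"h264: {100 * h264_total / source_total:.0f}%")   # ~22%

# Per-stream ratios for the h265 file.
print(f"video: {100 * h265_video / source_video:.0f}%")  # ~5%
print(f"audio: {100 * h265_audio / source_audio:.0f}%")  # ~38%
```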
The resulting x265 file looks fine on my TV. You can see some posterization in subtle dark areas, and maybe the sound isn’t as clear. But I’m blessed with a tin ear so those degradations don’t bug me much. No doubt there’s a sliding scale here between bandwidth and quality that a proper codec testing setup could explore. Me, I’m just a dumb bear using ffmpeg default settings and pretty happy with the result.
This post was all motivated by this post and discussion about comparing AV1 to x264 and VP9. AV1 is the new hotness in codecs, although as with all these things there’s years of controversy and patents and hardware support to settle before adoption.