deleting duplicate files

I have a script that downloads a JSON API response once a minute. The data only changes every ~30 minutes or so, so most of the downloads are duplicates. And I carelessly left it running for 10+ days, so now it’s a whole lot of tiny, useless duplicate files. Easy solution:

fdupes -dN .

Apparently this keeps the oldest file in each set: -d deletes duplicates, -N (“noprompt”) preserves the first file listed and removes the rest without asking, and for files created one at a time like these, the first file fdupes lists is generally the oldest. Newer fdupes releases let you make that ordering explicit with -o time.
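
If you want to see what that amounts to by hand, here’s a rough bash equivalent: hash every file, keep the oldest copy of each hash, delete the rest. Just a sketch — it assumes bash 4+, GNU coreutils, a flat directory, and filenames without whitespace. (fdupes also does a byte-by-byte comparison after hashing; this skips that.)

    #!/usr/bin/env bash
    # Walk files oldest-first, remember each content hash, and delete
    # any later file whose hash we've already seen.
    declare -A seen
    for f in $(ls -tr); do
        [ -f "$f" ] || continue
        h=$(md5sum "$f" | cut -d' ' -f1)
        if [ -n "${seen[$h]}" ]; then
            rm -- "$f"          # newer duplicate: delete
        else
            seen[$h]=1          # oldest file with this content: keep
        fi
    done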
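
And for next time, the duplicates could be skipped at download time instead of cleaned up afterwards. A minimal poller sketch — the URL is a placeholder, and it assumes curl and cmp are available:

    #!/usr/bin/env bash
    # Poll once a minute, but only keep a snapshot if it differs from
    # the previous one. The URL below is a placeholder.
    url='https://example.com/api.json'
    prev=''
    while true; do
        f="$(date +%Y%m%d-%H%M%S).json"
        if curl -fsS "$url" -o "$f"; then
            if [ -n "$prev" ] && cmp -s "$prev" "$f"; then
                rm -- "$f"      # same as the last snapshot: drop it
            else
                prev=$f         # new content: keep it
            fi
        else
            rm -f -- "$f"       # failed fetch: discard any partial file
        fi
        sleep 60
    done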