After a Google Photos takeout, any file taken with your own device should contain the correct time. But photos sent by other people, created with Lightroom, or very old files might have a stripped date field. For those, Google Photos usually records the date the file was uploaded. You want to keep at least that date from Google, otherwise all files you import into Synology Photos will clump together at the date of indexing.
That's also what most people run into, but they assume Google stripped the date from their files during the Takeout. Most files should be fine though, depending on your specific photo collection of course.
My current takeout is 100GB and I didn't have that much space left locally, while I have plenty of disk space on my Synology.
In addition, the Synology DSM File Downloader wasn't able to download the temporary links from Google (or Google prevented it for some technical reason).
I therefore created the takeout, stored it on my (free) Drive with a temporary Google One subscription to get 100GB of space. I then synced the Drive to my Synology and got the two 50GB tgz files.
Extract the archives and run the following exiftool commands. Then copy everything into the Photos folder the Synology Photos app created before.
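For that final copy step (after the exiftool commands below have been run) something like this works; the paths are assumptions, since Synology Photos usually indexes the Photos folder inside your user's home, so adjust the user name, volume and takeout location to your setup:
# copy the extracted takeout into the folder Synology Photos indexes
# (source and target paths are assumptions, adjust to your setup)
rsync -a --progress ~/Takeout/Google\ Photos/ /volume1/homes/<your-user>/Photos/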
The indexing will then happen. I ignored the Google JSON files at first (as I thought all my files were good), but there were too many files from other people or from the past, so I did all steps again and overwrote all files. I guess the index will pick them up due to the slightly changed file size (changed EXIF header).
The exiftool commands will read PhotoTakenTimeTimestamp from the JSON (which is guaranteed to exist for every file from Google Photos) and write it into DateTimeOriginal, which is then read by Synology Photos during the index.
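To see what a sidecar actually contains before changing anything, point exiftool at the JSON directly. A quick sanity check (it uses the `exif` alias defined further down and an example file name):
# print the raw Unix timestamp Google stored in the sidecar
exif -PhotoTakenTimeTimestamp some-file.jpg.json
# convert that epoch value into a readable date (GNU date syntax)
date -d @1577836800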
- Install Perl on Synology by adding the app. `perl -v` should work then.
- Download the latest ExifTool release from https://exiftool.org/ (today: https://exiftool.org/Image-ExifTool-12.29.tar.gz)
wget https://exiftool.org/Image-ExifTool-12.29.tar.gz
# unpack and use it
gzip -dc Image-ExifTool-12.29.tar.gz | tar -xf -
cd Image-ExifTool-12.29
# Check the version (optional)
perl exiftool -ver
via https://qrys.ch/using-the-exiftool-on-a-synology-nas/
Prepare some temporary aliases.
alias exif='perl ~/Image-ExifTool-12.29/exiftool'
# info about a single file
alias infoDate='exif -T -DateTimeOriginal'
# update a single file as a test (hence `%F.json`: file name + extension, without the directory)
alias updateSingle='exif -tagsfromfile "%F.json" "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s -overwrite_original'
# usage.
# output the date from a given file, so you can check whether it's missing and verify after the update
infoDate some-file.jpg
# will look for some-file.jpg.json to fetch the `PhotoTakenTimeTimestamp` and copy it into the exif field `DateTimeOriginal`
updateSingle some-file.jpg
# actual update of an entire folder (hence the %d/%F placeholder)
# --ext excludes an extension (json) while -ext includes extensions (here: any)
exif -r -d %s -tagsfromfile "%d/%F.json" "-DateTimeOriginal<PhotoTakenTimeTimestamp" -ext "*" -overwrite_original -progress --ext json <folder-name>
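To find out which files still lack a date after the batch run, an exiftool condition can list them; a sketch using the same alias and folder placeholder as above:
# list all files (recursively) that still have no DateTimeOriginal
exif -r -if 'not $DateTimeOriginal' -filename -T --ext json <folder-name>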
That worked pretty well. There are still some files that were not updated because exiftool did not find their JSON files:
The duplicate identifier `(1)` is at the wrong position. Expected: IMG_123_0005(1).jpg.json, but instead Google Photos created IMG_123_0005.jpg(1).json, while the image file is correctly named IMG_123_0005(1).jpg. Either find an exiftool solution to handle the different format with -tagsfromfile, or use a custom regex to rename the JSON files in the filesystem and update the files again with exiftool (a rename sketch follows below).
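A minimal sketch of that rename approach, assuming bash and that the misplaced counter always sits between the image extension and .json; the folder placeholder is the same as above, and the pattern is an assumption about your takeout, so test on a copy first:
# move the "(n)" counter from after the extension to before it, e.g.
# IMG_123_0005.jpg(1).json -> IMG_123_0005(1).jpg.json
find <folder-name> -type f -name '*(*).json' | while read -r f; do
  dir=$(dirname "$f")
  base=$(basename "$f" .json)              # e.g. IMG_123_0005.jpg(1)
  if [[ "$base" =~ ^(.*)\.([A-Za-z0-9]+)\(([0-9]+)\)$ ]]; then
    # BASH_REMATCH: 1 = name, 2 = extension, 3 = duplicate counter
    mv -n "$f" "$dir/${BASH_REMATCH[1]}(${BASH_REMATCH[3]}).${BASH_REMATCH[2]}.json"
  fi
done
# afterwards rerun the folder-wide exif command from above for the renamed sidecars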