
I have a folder Google Photos. This is from a google photos takeout. The way the takeout works is there are multiple folders with date names

➜  ~/Google Photos => tree |head -n 20     
.
├── 1979-12-31
│   ├── icon-24x24.png
│   ├── icon-24x24.png.json
│   ├── icon_local_color.png
│   ├── icon_local_color.png.json
│   ├── metadata.json
│   ├── viewer-14.png
│   ├── viewer-14.png.json
│   ├── viewer-28.png
│   └── viewer-28.png.json
├── 2001-07-24
│   ├── heic0109a.jpg
│   ├── heic0109a.jpg.json
│   └── metadata.json
├── 2003-01-06
│   ├── ASPdotNET_logo.jpg
│   ├── ASPdotNET_logo.jpg.json
│   ├── darkBlue_GRAD.jpg
│   ├── darkBlue_GRAD.jpg.json

I would like to move all the files which do not end in .json to a new folder called ./all_photos. The issue is that I may have two files with the same name in different folders.

For example, in the folder 1979-12-31 I may have a file called a.jpg, and in the folder 2001-07-24 I may also have another file called a.jpg. Solutions I have looked at either overwrite or skip files when there is a name clash. Instead, I would like to rename the files as shown: 1979-12-31_a.jpg and 2001-07-24_a.jpg. This ensures no two files share a name and therefore nothing is overwritten.

I am on a Linux server and I imagine the best way to do this would be through the Linux find command, but I am unsure of the exact commands, so I would like a helping hand.
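For reference, selecting the non-JSON files is a single find filter. The sketch below first recreates a tiny sample of the takeout layout (the sample files are illustrative, taken from the tree above) and then lists everything that does not end in .json; note the quotes, since the folder name contains a space:

```shell
# Recreate a tiny sample of the takeout layout
mkdir -p "Google Photos/2001-07-24"
touch "Google Photos/2001-07-24/heic0109a.jpg" \
      "Google Photos/2001-07-24/heic0109a.jpg.json" \
      "Google Photos/2001-07-24/metadata.json"

# List every file under the takeout that does not end in .json
find "Google Photos" -type f ! -name '*.json'
```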

D S
  • 11

1 Answer


A script I wrote to solve the problem. Probably not the most efficient way, but if there is a more efficient way please let me know. It takes one parameter, the folder to which you would like to apply the script; for the example above I would run ./script.sh Google\ Photos. It works for two levels of subfolders.

#!/bin/bash
shopt -s extglob nullglob     # extglob enables !(*.json); nullglob skips empty matches
for x in "$1"/*/              # loop over subfolders; quoting handles spaces in "$1"
do
    d=${x%/}; d=${d##*/}      # subfolder name, e.g. 1979-12-31
    for y in "$x"!(*.json)    # every entry in the subfolder not ending in .json
    do
        [ -f "$y" ] || continue   # skip anything that is not a regular file
        a=${y##*/}                # bare filename
        mv -n "$y" "$1/${d}_$a"   # -n: never overwrite an existing file
    done
done
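Since the question mentions find: here is a sketch of a find-based variant (the function name flatten_photos is mine, and it assumes GNU find and coreutils on Linux). It handles any folder depth and spaces in names, and collects everything into an all_photos folder inside the source folder, prefixing each file with its immediate parent folder's name:

```shell
flatten_photos() {
    local src=$1
    mkdir -p "$src/all_photos"
    # Find regular files below the top level that do not end in .json,
    # skipping anything already moved into all_photos.
    find "$src" -mindepth 2 -type f ! -name '*.json' \
         ! -path "$src/all_photos/*" -print0 |
    while IFS= read -r -d '' f; do
        local parent base
        parent=$(basename "$(dirname "$f")")   # e.g. 1979-12-31
        base=$(basename "$f")                  # e.g. a.jpg
        mv -n "$f" "$src/all_photos/${parent}_${base}"   # -n: never overwrite
    done
}
```

Usage would be `flatten_photos "Google Photos"`. The -print0 / read -d '' pairing keeps filenames with spaces intact, and mv -n refuses to clobber should two prefixed names still collide.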