You can access the logs directly from the Drive Client DB files.
To do so, first install sqlite3 on your Windows machine:
winget install SQLite.SQLite
Then, run the following command from the CLI:
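As a minimal sketch of the sqlite3 CLI calls involved (the actual Drive Client DB path and table names vary by version and installation, so a throwaway database stands in for them here):

```shell
# Stand-in DB: point DB at the real Drive Client DB file instead
# (somewhere under the SynologyDrive data folder; path varies by install).
DB=$(mktemp /tmp/drive-db-XXXXXX.sqlite)
sqlite3 "$DB" "CREATE TABLE logs(ts TEXT, msg TEXT);
               INSERT INTO logs VALUES ('2024-01-01 10:00:00', 'example entry');"
sqlite3 "$DB" ".tables"                          # list the tables in the DB
sqlite3 "$DB" "SELECT msg FROM logs LIMIT 20;"   # dump log rows
rm -f "$DB"
```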
To get a list of the files in the home folder of the user to whom the session id (_ssid) belongs:
https://IP:5001/webapi/entry.cgi?api=SYNO.SynologyDrive.Files&method=list&version=2&path=/mydrive&_ssid=XXX
If you need a list of the files in a team folder instead, first get the team folders list:
https://IP:5001/webapi/entry.cgi?api=SYNO.SynologyDrive.TeamFolders&method=list&version=1&_ssid=XXX
And then list the files of the team folder you want:
https://IP:5001/webapi/entry.cgi?api=SYNO.SynologyDrive.Files&method=list&version=2&path="id:XXX"&_ssid=XXX
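Putting the calls together from a shell, with placeholder values for the NAS address, session id, and team-folder id (replace them with your own):

```shell
# Hypothetical values: substitute your NAS address, a valid session id,
# and a team-folder id taken from the TeamFolders list call.
NAS="192.168.1.10"
SID="XXX"
FOLDER_ID="795719640889204739"

LIST_URL="https://$NAS:5001/webapi/entry.cgi?api=SYNO.SynologyDrive.Files&method=list&version=2&path=\"id:$FOLDER_ID\"&_ssid=$SID"
echo "$LIST_URL"
# curl -sk "$LIST_URL" | jq .   # uncomment to query a real NAS (-k skips cert checks)
```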
We can list the Drive team folders with their respective IDs:
root@NAS-BACKUP:~# synowebapi --exec api=SYNO.SynologyDrive.TeamFolders method=list version=1 | jq '.data.items[] | "ID: \(.file_id) / Name: \(.name)"'
"ID: 795719640889204739 / Name: photo"
"ID: 781590850416189441 / Name: Share update"
"ID: 766719452698944079 / Name: Test Andrea 2"
Then, we can set the version rotation count (here rotate_cnt=5) for every enabled share:
synowebapi --exec api=SYNO.SynologyDrive.Share method=list version=1 action=list sort_direction=ASC sort_by=share_name \
  | jq -r '.data.items[] | "\(.share_name) \(.share_enable)"' \
  | grep true | sed 's/ true//g' > /tmp/enabled_shares
while read -r share; do
  synowebapi --exec api=SYNO.SynologyDrive.Share method=set version=1 share="[{\"share_name\": \"$share\", \"rotate_cnt\": 5}]"
done < /tmp/enabled_shares
#!/bin/bash
####
# Usage
# ./increment_subdir_mtime.sh "path/to/parent/folder" increment_in_seconds:INT
# eg:
# ./increment_subdir_mtime.sh "/volume1/DATA/" 5
####
TARGET_PATH=$(realpath "$1")
INCREMENT="$2"
# Shift the mtime of every subdirectory forward by INCREMENT seconds
find "$TARGET_PATH" -mindepth 1 -type d | while read -r dir; do
    current=$(stat -c %Y "$dir")
    touch -d "@$((current + INCREMENT))" "$dir"
done
VOLUME='volume1';
SHARED_FOLDER='Data'; # can also specify the subfolder
synowebapi --exec api=SYNO.ActiveBackupOffice365.Portal.Restore.AllLog method=list sort_by=start_time sort_direction=DESC offset=0 version=1 | jq .data > /$VOLUME/$SHARED_FOLDER/ABM_restore_logs.json
synowebapi --exec api=SYNO.ActiveBackupOffice365.Portal.Export.AllLog method=list sort_by=start_time sort_direction=DESC offset=0 version=1 | jq .data > /$VOLUME/$SHARED_FOLDER/ABM_export_logs.json
const sleep = async (time) => {
    return new Promise(resolve => setTimeout(resolve, time));
}

(async () => {
    let counter = 1;
    while (true) {
        console.log(`==== Exporting page ${counter} ====`);
        await sleep(1000);
        console.log(' Getting check...');
#!/bin/sh
TASK_ID=11;
REQ_VERSIONS=10;
SUCC_VERSIONS=0;
# Use a temporary file to store the JSON output
tmpfile=$(mktemp)
synowebapi --exec api=SYNO.ActiveBackup.Task method=list load_verify_status=true load_versions=true filter='{"task_id": '$TASK_ID', "data_formats": [1,4]}' version=1 | jq .data.tasks[0].versions > "$tmpfile"
#!/bin/sh
HELP='false';
DSM_MAJOR=0;
DSM_MINOR=0;
CONF_PATH='';
while getopts ":ho:n:" flag
do
    case "${flag}" in
        h) HELP='true';;
#!/bin/bash
TASK_ID=3;
TASK_STATUS=$(synowebapi --exec api=SYNO.Backup.Task method=status version=1 task_id="$TASK_ID" additional='["last_bkp_result"]' | jq .data.last_bkp_result);
if [[ "$TASK_STATUS" == \"suspend\" ]]; then
    echo "[$(date)] Backup is suspended" >> /volume1/hb_auto_resume.log;
    echo "[$(date)] Resuming backup..." >> /volume1/hb_auto_resume.log;
    synowebapi --exec api=SYNO.Backup.Task method=resume task_id="$TASK_ID" version=1 >> /volume1/hb_auto_resume.log;
else
    echo "[$(date)] Backup is not suspended." >> /volume1/hb_auto_resume.log;
fi