#!/bin/bash
function usage() {
    (
        echo "usage: $0 <auth-token> <group-address> <mbox-dir>"
        echo "To generate an auth token go to https://developers.google.com/oauthplayground/ and get an access token for Google Groups migration"
    ) >&2
    exit 5
}
AUTH_TOKEN="$1"
shift
GROUP="$1"
shift
MBOX_DIR="$1"
shift
[ -z "$AUTH_TOKEN" -o -z "$GROUP" -o -z "$MBOX_DIR" ] && usage
for file in $MBOX_DIR/*; do
    curl -H"Authorization: Bearer $AUTH_TOKEN" -H'Content-Type: message/rfc822' -X POST \
        "https://www.googleapis.com/upload/groups/v1/groups/$GROUP/archive?uploadType=media" --data-binary @${file} || exit 1
done
This script works a treat; I've used it for two of my G Suite migrations to date.
I'm getting 408 timeout responses now and again. Do you think the API is throttling things? I wonder whether a project-based token, instead of a temporary one, might speed things up? I'm no expert at the API game, but I wanted to thank you for this tiny yet invaluable script. I'm writing a Reddit post to document my woeful experience, which became much less woeful when I found this script. Thank you!
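Intermittent 408s are often just server-side request timeouts, and a client-side retry with exponential backoff usually smooths them out. A sketch of such a wrapper (the `retry_with_backoff` helper name is my own, not part of the original script):

```shell
#!/bin/bash
# retry_with_backoff: run a command up to N times, doubling the sleep
# between attempts. Returns 0 on the first success, 1 if all attempts fail.
retry_with_backoff() {
    local max="$1"; shift
    local delay=1 attempt=1
    while true; do
        "$@" && return 0
        [ "$attempt" -ge "$max" ] && return 1
        sleep "$delay"
        delay=$((delay * 2))
        attempt=$((attempt + 1))
    done
}

# Example use inside the loop (illustrative, mirroring the script's curl call):
# retry_with_backoff 5 curl --fail -H "Authorization: Bearer $AUTH_TOKEN" ...
```

Newer curl versions also have built-in `--retry` flags, which may be simpler if your curl supports them.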
I've upgraded it so that even if it fails halfway, you can resume the import. All emails to be imported must be direct children of the specified folder; sub-folders are not included, because the script creates a "successful" sub-folder, moves successfully imported messages into it, and leaves the failed ones in place. Errors are shown on the console.
This worked marvellously for me across thousands of emails, even when the auth token expired mid-run.
#!/bin/bash
function usage() {
    (
        echo "usage: $0 <auth-token> <group-address> <mbox-dir>"
        echo "To generate an auth token go to https://developers.google.com/oauthplayground/ and get an access token for Google Groups migration"
    ) >&2
    exit 5
}
AUTH_TOKEN="$1"
shift
GROUP="$1"
shift
MBOX_DIR="$1"
shift
[ -z "$AUTH_TOKEN" -o -z "$GROUP" -o -z "$MBOX_DIR" ] && usage
# Successfully imported messages are moved here, so a re-run only retries failures.
SUCCESS="$MBOX_DIR/successful"
mkdir -p "$SUCCESS"
success_count=0
failure_count=0
for file in "$MBOX_DIR"/*.eml; do
    # --fail makes curl exit non-zero on HTTP errors, so failed uploads stay in place.
    if curl --fail -H "Authorization: Bearer $AUTH_TOKEN" -H 'Content-Type: message/rfc822' -X POST \
        "https://www.googleapis.com/upload/groups/v1/groups/$GROUP/archive?uploadType=media" --data-binary "@$file"; then
        mv "$file" "$SUCCESS"
        success_count=$((success_count + 1))
    else
        failure_count=$((failure_count + 1))
    fi
done
echo "Done. $success_count successfully imported, $failure_count failed."
Recommend changing `--data-binary @${file}` to `--data-binary @"${file}"` to fix issues with filenames containing spaces.
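Agreed; unquoted expansions word-split on whitespace, so a filename with a space becomes two separate arguments. A quick illustration of the difference:

```shell
#!/bin/bash
# Demonstrates why "$file" must be quoted: unquoted expansion splits on spaces.
file="my mail.eml"
args_unquoted=( @${file} )    # splits into two words: "@my" and "mail.eml"
args_quoted=( @"${file}" )    # stays one word: "@my mail.eml"
echo "${#args_unquoted[@]} vs ${#args_quoted[@]}"
```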
Hi there,
Thanks for this script. I've exported the mbox files correctly (apparently), but when I use your script to import them, I only get this error message:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "invalid",
        "message": "Unable to parse the raw message"
      }
    ],
    "code": 400,
    "message": "Unable to parse the raw message"
  }
}
Is this easy to solve?
Any thoughts?
Thanks in advance!
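One common cause of this 400 is posting an entire multi-message mbox file in a single request: the archive endpoint expects one RFC 822 message per upload. If that's the case, splitting the mbox into per-message files first should help. A sketch (assumes GNU csplit; the `split_mbox` function name and file names are my own):

```shell
#!/bin/bash
# split_mbox: split one mbox file (arg 1) into per-message .eml files in a
# directory (arg 2). In mbox format each message begins with a "From " line,
# so we start a new output file at every such line.
split_mbox() {
    local mbox="$1" outdir="$2"
    mkdir -p "$outdir"
    # GNU csplit: -s quiet, -z drop the empty piece before the first match,
    # -f/-b name the pieces msg0000.eml, msg0001.eml, ...
    csplit -s -z -f "$outdir/msg" -b '%04d.eml' "$mbox" '/^From /' '{*}'
}

# Example use: split_mbox exported.mbox ./messages
```

The resulting directory can then be fed to the import script above.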