#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'

LOG_GROUP_NAME=""
LOG_STREAM_NAME=""
REGION=""
OUTPUT_FILE="$(date +"%Y%m%d").log"

# First page of events (no token yet)
result=$(aws logs get-log-events \
  --start-from-head \
  --log-group-name "${LOG_GROUP_NAME}" \
  --log-stream-name "${LOG_STREAM_NAME}" \
  --region "${REGION}")
echo "${result}" | jq -r '.events[].message' >> "${OUTPUT_FILE}"
nextToken=$(echo "${result}" | jq -r .nextForwardToken)

# Follow the forward token until a page comes back with no events
while [ -n "${nextToken}" ]; do
  echo "${nextToken}"
  result=$(aws logs get-log-events \
    --start-from-head \
    --log-group-name "${LOG_GROUP_NAME}" \
    --log-stream-name "${LOG_STREAM_NAME}" \
    --region "${REGION}" \
    --next-token "${nextToken}")
  if [[ $(echo "${result}" | jq '.events == []') == "true" ]]; then
    echo "response with empty events found -> exiting."
    exit 0
  fi
  echo "${result}" | jq -r '.events[].message' >> "${OUTPUT_FILE}"
  nextToken=$(echo "${result}" | jq -r .nextForwardToken)
done
@gholker thanks for this.
@ravikdasari
jq is a command-line utility for parsing JSON.
Save this as download-aws-logs.sh.
Edit the variables at the top with the log group/stream and region you want to download:
LOG_GROUP_NAME=""
LOG_STREAM_NAME=""
REGION=""
Run chmod +x ./download-aws-logs.sh to make it executable.
Run ./download-aws-logs.sh to run it.
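For context, the two jq filters the script relies on can be tried against a hand-written sample of the get-log-events response (the JSON below is made up for illustration):

```shell
#!/usr/bin/env bash
set -euo pipefail

# A made-up get-log-events style response
sample='{"events":[{"message":"first line"},{"message":"second line"}],"nextForwardToken":"f/0123"}'

# .events[].message prints each log message on its own line
echo "${sample}" | jq -r '.events[].message'
# -> first line
# -> second line

# .nextForwardToken extracts the pagination token
echo "${sample}" | jq -r '.nextForwardToken'
# -> f/0123
```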
Thank you @dylanjsa
@gholker thanks for the source. I tried it, but it is failing on line 16 with
"jq: command not found"
@610hf you will need to install jq: https://jqlang.github.io/jq/
Arch: yay -Sy jq
Debian: apt install jq
How to use this code inside a while loop when we execute commands like describe-task of ECS? Can anyone please help?
@vesubramanian you can try something like this:
#!/usr/bin/env bash
set -euo pipefail

# Example: fetch a list of task ARNs
task_arns=$(aws ecs list-tasks --cluster your-cluster-name --service-name your-service-name --query 'taskArns[]' --output text)

# Loop through each task ARN
for task_arn in ${task_arns}; do
  # Get log group and stream names from the task description
  LOG_GROUP_NAME=$(aws ecs describe-tasks --cluster your-cluster-name --tasks "${task_arn}" --query 'tasks[0].overrides.containerOverrides[0].environment[?name==`LOG_GROUP_NAME`].value' --output text)
  LOG_STREAM_NAME=$(aws ecs describe-tasks --cluster your-cluster-name --tasks "${task_arn}" --query 'tasks[0].overrides.containerOverrides[0].environment[?name==`LOG_STREAM_NAME`].value' --output text)

  # Call the script with these names
  ./download-aws-logs.sh "${LOG_GROUP_NAME}" "${LOG_STREAM_NAME}"
done
You can modify download-aws-logs.sh to take LOG_GROUP_NAME and LOG_STREAM_NAME as arguments.
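A minimal sketch of that modification, assuming the script body is wrapped in a function so the group, stream, and region arrive as positional arguments (the function name and the us-east-1 default are assumptions, not part of the original script):

```shell
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'

# Hypothetical parameterized version of download-aws-logs.sh:
# the hard-coded variables become function arguments.
download_logs() {
  local log_group="${1:?usage: download_logs <log-group> <log-stream> [region]}"
  local log_stream="${2:?usage: download_logs <log-group> <log-stream> [region]}"
  local region="${3:-us-east-1}"   # assumed default region
  local output_file
  output_file="$(date +"%Y%m%d").log"

  local result nextToken
  result=$(aws logs get-log-events \
    --start-from-head \
    --log-group-name "${log_group}" \
    --log-stream-name "${log_stream}" \
    --region "${region}")
  echo "${result}" | jq -r '.events[].message' >> "${output_file}"
  nextToken=$(echo "${result}" | jq -r .nextForwardToken)
  # ... the original while loop over --next-token goes here, unchanged ...
}

# Forward the script's positional arguments into the function
if [ "$#" -ge 2 ]; then
  download_logs "$@"
fi
```

With this in place, the ECS loop above can invoke `./download-aws-logs.sh "${LOG_GROUP_NAME}" "${LOG_STREAM_NAME}"` directly.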
@dylanjsa Thank you for your response. Let me explain a few more details. I am deploying a database using an ECS task definition (both RDS and MSSQL). There is a single task. However, during task execution the logs are not populated in real time or at regular intervals: there are no log messages for a while, then a surge, then nothing for a few seconds, then another surge, and so on. This makes downloading a challenge, since the next token fetches nothing. Also, I am running the script from a shell-script step in a GitHub Action.
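One way to tolerate that bursty delivery is to keep polling instead of exiting on the first empty page: get-log-events returns the same nextForwardToken once the end of the stream has been reached, so an unchanged token can be treated as "no new data yet" and retried after a short sleep. A sketch of that idea (the function name, the 30-poll idle limit, and the 5-second interval are all made-up tuning knobs, not anything from the original script):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sketch: tail-style polling loop. Instead of exiting when a page has no
# events, retry while the forward token stays the same, and only give up
# after max_idle_polls consecutive empty polls.
tail_log_stream() {
  local log_group="$1" log_stream="$2" region="$3" output_file="$4"
  local max_idle_polls=30   # hypothetical: empty polls before giving up
  local poll_seconds=5      # hypothetical: wait between empty polls
  local idle=0 nextToken="" result newToken

  while [ "${idle}" -lt "${max_idle_polls}" ]; do
    if [ -z "${nextToken}" ]; then
      result=$(aws logs get-log-events --start-from-head \
        --log-group-name "${log_group}" --log-stream-name "${log_stream}" \
        --region "${region}")
    else
      result=$(aws logs get-log-events --start-from-head \
        --log-group-name "${log_group}" --log-stream-name "${log_stream}" \
        --region "${region}" --next-token "${nextToken}")
    fi
    newToken=$(echo "${result}" | jq -r .nextForwardToken)
    if [ "${newToken}" = "${nextToken}" ]; then
      # Same token as last time: end of stream for now, wait for the next burst
      idle=$((idle + 1))
      sleep "${poll_seconds}"
    else
      idle=0
      echo "${result}" | jq -r '.events[].message' >> "${output_file}"
      nextToken="${newToken}"
    fi
  done
}
```

In a GitHub Action step you would call `tail_log_stream` after starting the task and let the idle limit end the step once the task has stopped logging.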
How do I use this script? And what is that jq in the code?