
Instruction Document for Google Cloud Logging Script

Overview

This document provides instructions on how to use the provided Bash script to retrieve logs from
Google Cloud Logging and store them in a CSV file. The script requires a Google Cloud project
ID, a service account key file, and a bucket name. It authenticates with Google Cloud, sets the
project, and retrieves logs based on specified filters.

Prerequisites

● You need to have the Google Cloud SDK installed on your machine.
● Ensure you have permissions to access the Google Cloud project and the logging
service.
● A service account key file (in JSON format) should be created and downloaded from the
Google Cloud Console.
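Before running the script, the environment can be sanity-checked with a short sketch like the one below. The helper name `is_service_account_key` is illustrative and not part of the original script; it only verifies that the key file exists and looks like a service-account JSON.

```shell
#!/bin/bash
# Illustrative pre-flight checks (not part of the original script).

# Hypothetical helper: returns 0 if the file exists and contains the
# "type": "service_account" marker found in downloaded key files.
is_service_account_key() {
  [ -f "$1" ] && grep -q '"type": "service_account"' "$1"
}

# Check that the gcloud CLI is on PATH.
if command -v gcloud >/dev/null 2>&1; then
  echo "gcloud found"
else
  echo "gcloud not found; install the Google Cloud SDK first"
fi

# Demonstrate the key-file check against a throwaway JSON file.
key=$(mktemp)
printf '{"type": "service_account", "project_id": "demo"}\n' > "$key"
if is_service_account_key "$key"; then
  echo "key file looks valid"
fi
rm -f "$key"
```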

Script Breakdown

Script Code

#!/bin/bash

# Load the user's bash profile (if it exists) so any required
# environment variables are available
bash_file='/home/GGAUDIT/.bash_profile'
if [ -f "$bash_file" ]; then . "$bash_file"; fi

# Set project ID, key file, and bucket from script arguments
PROJECT_ID=$1
KEY_FILE=$2
BUCKET=$3

# Validate required parameters
if [ -z "${PROJECT_ID}" ] || [ -z "${KEY_FILE}" ]; then
  echo "ProjectID or Service Account JSON is not provided"
  exit 1
fi

# Authenticate with Google Cloud using service account
echo "Authenticating with Google Cloud using service account from ${KEY_FILE}..."
gcloud auth activate-service-account --key-file="$KEY_FILE" || { echo "Authentication failed. Exiting."; exit 1; }

# Set the Google Cloud project


gcloud config set project "$PROJECT_ID"

# Read logs and store in records variable
records=$(gcloud logging read \
  --project="${PROJECT_ID}" \
  --bucket="${BUCKET}" \
  --location=global \
  --view=_AllLogs \
  "resource.labels.project_id=\"${PROJECT_ID}\" AND timestamp<=\"2024-01-31T23:59:59Z\" AND timestamp>=\"2024-01-01T00:00:00Z\" AND protoPayload.authenticationInfo.principalEmail!~\"gserviceaccount.com\"" \
  --format="csv(insertId, logName, operation.first, operation.id, operation.last, operation.producer, protoPayload.authenticationInfo.principalEmail, protoPayload.authenticationInfo.principalSubject, protoPayload.metadata.instance_group_manager_id,<all_attributes>)" \
  --limit=1000000)

# Check if records were found
if [ -z "$records" ]; then
  echo "No records found."
else
  # Write records to a CSV file
  echo "$records" > "${PROJECT_ID}_logs.csv"
  echo "Records written to ${PROJECT_ID}_logs.csv"
fi

Explanation of the Script

1. Load User Bash Profile:
○ The script starts by loading the user's .bash_profile to ensure any necessary
environment variables are set.
2. Set Variables:
○ PROJECT_ID, KEY_FILE, and BUCKET are set from the script's command-line
arguments.
3. Validate Parameters:
○ The script checks if the PROJECT_ID or KEY_FILE variables are empty. If either
is missing, it prints an error message and exits.
4. Authenticate with Google Cloud:
○ The script authenticates with Google Cloud using the service account key file
specified by KEY_FILE. If authentication fails, it exits with an error message.
5. Set Google Cloud Project:
○ The script sets the active project in the gcloud configuration using the gcloud
config set project command.
6. Fetch Logs:
○ The script retrieves logs using gcloud logging read, applying filters to
capture logs for the specified project and time frame. It formats the output as
CSV, including specific fields. The limit of 1,000,000 records is set to ensure a
large dataset can be retrieved.
7. Check for Records:
○ If no records are found, it prints a message indicating this. If records are found, it
writes them to a CSV file named ${PROJECT_ID}_logs.csv.
8. Output Message:
○ The script prints a success message indicating that the records have been written
to the CSV file.
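The filter in step 6 hard-codes the January 2024 window. As a sketch, the same filter string could be assembled for an arbitrary time range; `build_filter` below is a hypothetical helper, not part of the original script.

```shell
#!/bin/bash
# Hypothetical helper: build the Cloud Logging filter used in step 6
# for an arbitrary project and time window (assumption: same fields
# and ordering as the original script).
build_filter() {
  local project="$1" start="$2" end="$3"
  printf 'resource.labels.project_id="%s" AND timestamp<="%s" AND timestamp>="%s" AND protoPayload.authenticationInfo.principalEmail!~"gserviceaccount.com"' \
    "$project" "$end" "$start"
}

# Example: reproduce the January 2024 window from the script.
filter=$(build_filter "my-project" "2024-01-01T00:00:00Z" "2024-01-31T23:59:59Z")
echo "$filter"
```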

How to Run the Script

1. Make the Script Executable:

○ chmod +x download_logs.sh

2. Execute the Script:

○ Run the script by providing the required arguments:

./download_logs.sh <PROJECT_ID> <KEY_FILE> <BUCKET>

○ Replace <PROJECT_ID> with your Google Cloud project ID, <KEY_FILE> with
the path to your service account key file, and <BUCKET> with your log sink bucket
name.
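After a run completes, a quick sanity check can count the data rows in the generated CSV (subtracting one for the header line the csv format emits). This is an optional sketch; the `demo-project_logs.csv` filename below is a stand-in for the script's real `${PROJECT_ID}_logs.csv` output.

```shell
#!/bin/bash
# Optional sanity check: count data rows in the generated CSV.
csv="demo-project_logs.csv"

# Simulate a small output file for illustration (header + 2 rows).
printf 'insertId,logName\nid1,projects/demo/logs/activity\nid2,projects/demo/logs/activity\n' > "$csv"

# Subtract 1 for the header line.
rows=$(( $(wc -l < "$csv") - 1 ))
echo "Data rows in $csv: $rows"   # → Data rows in demo-project_logs.csv: 2
rm -f "$csv"
```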
