This document describes how to query and view logs that are stored in log buckets that are upgraded to use Log Analytics. You can query logs in these buckets by using SQL, which lets you filter and aggregate your logs. For information about the Log Analytics capabilities of Cloud Logging, see Log Analytics overview.
When you upgrade a log bucket to use Log Analytics, you retain access to the Logs Explorer. You can continue to troubleshoot issues and view individual log entries in these buckets by using the Logs Explorer.
About running queries on the Log Analytics page
When you want to identify patterns in your log data or create charts from that data, we recommend that you use the Log Analytics page. This interface requires that your log bucket has been upgraded to use Log Analytics.
Log Analytics queries run on the default Log Analytics service, unless you configure this page to use reserved BigQuery slots. When the default service is used, your queries share a limited pool of default slots with other queries. As a result, if the default slots in your Google Cloud project are occupied by other queries, then your query might be delayed. You can improve the performance of your queries by running them on reserved BigQuery slots.
If you want to do any of the following, then create a linked BigQuery dataset, which lets the BigQuery engine read log entry datasets.
- Query log entry data from the Log Analytics page by using BigQuery reserved slots.
- Query log entry data from another service like the BigQuery Studio page or Looker Studio.
- Join log entry data with other BigQuery datasets.
If you run your queries on reserved BigQuery slots, then your queries are subject to capacity compute pricing. If you query your data by using a service other than Log Analytics, then your queries may be subject to other charges based on that service. See the pricing page for the service that you are using.
Before you begin
- To get the permissions that you need to use Log Analytics, ask your administrator to grant you the following IAM roles on your log buckets or log views:
  - To query the _Required and _Default log buckets: Logs Viewer (roles/logging.viewer).
  - To query all log views in a project: Logs View Accessor (roles/logging.viewAccessor).
  - To query logs in a specific log view: Create an IAM policy for the log view, or restrict the Logs View Accessor (roles/logging.viewAccessor) role to a certain log view. For more information, see Control access to a log view.
- To get the permissions that you need to create and query linked datasets, ask your administrator to grant you the following IAM roles on the project that stores the log bucket:
  - (Optional) To create and view linked datasets:
    - Logs Configuration Writer (roles/logging.configWriter)
    - Log Link Accessor (roles/logging.linkViewer)
  - (Optional) To run queries on linked datasets by using reserved BigQuery slots:
    - BigQuery User (roles/bigquery.user)
    - BigQuery Job User (roles/bigquery.jobUser)
  - (Optional) To view linked datasets in BigQuery Studio, grant all roles mentioned in this step, and the following role: BigQuery Data Viewer (roles/bigquery.dataViewer).
- For the log views that you want to query, go to the Logs Storage page and verify that the log buckets that store those log views are upgraded to use Log Analytics. If necessary, upgrade the log bucket.
- Optional: If you want to query your log data by using a BigQuery dataset, for example, if you plan to use the BigQuery Studio page, then create a linked BigQuery dataset.
- Optional: If you want to query your log data from the Log Analytics page by using reserved BigQuery slots, then do the following. This feature is in Public Preview.
  - Create a linked BigQuery dataset.
  - Create a reservation with dedicated slots and then create reservation assignments.
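The reservation and assignment steps above can also be performed with the bq command-line tool. The following is a sketch only: the project ID my-project, the reservation name my_reservation, the location, and the slot count are placeholder assumptions, and the exact flags can vary by bq version.

```
# Sketch: create a 100-slot reservation (names, location, and slot count are placeholders).
bq mk --project_id=my-project --location=US \
    --reservation --slots=100 my_reservation

# Assign the project's query jobs to that reservation.
bq mk --project_id=my-project --location=US \
    --reservation_assignment \
    --reservation_id=my-project:US.my_reservation \
    --job_type=QUERY \
    --assignee_type=PROJECT \
    --assignee_id=my-project
```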
Query a log view
When you are troubleshooting a problem, you might want to count the log entries whose fields match a pattern, or compute the average latency of HTTP requests. You can perform these actions by running a SQL query on a log view.
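For example, a query like the following counts log entries by severity. This is a sketch: the table path is a placeholder for your log view, and the log_name filter is illustrative.

```sql
-- Count matching log entries by severity over the selected time range.
-- `project_ID.region.bucket_ID.view_ID` is a placeholder for your log view.
SELECT
  severity,
  COUNT(*) AS entry_count
FROM
  `project_ID.region.bucket_ID.view_ID`
WHERE
  log_name LIKE '%requests%'
GROUP BY
  severity
ORDER BY
  entry_count DESC
```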
To issue a SQL query to a log view, do the following:
- In the Google Cloud console, go to the Log Analytics page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
In the Log views list, find the view, and then select Query. The Query pane is populated with a default query, which includes the table name for the log view that is queried. This name has the format project_ID.region.bucket_ID.view_ID.
You can also enter a query in the Query pane, or edit a displayed query.
To set the time range of your query, use the time-range selector or add a WHERE clause that specifies the timestamp field. We recommend that you use the time-range selector to specify the time range.
If you specify a timestamp in your query, then that timestamp overrides the selected time range and the time-range selector is disabled. To use the time-range selector, remove timestamp expressions from the WHERE clause in your query.
Select your query engine.
To run your query on your default Log Analytics service, ensure that the Run query button is displayed in the toolbar. If this button isn't displayed, then do the following:
In the toolbar, click settings Settings.
Click Log Analytics (default).
The Run query button appears on the toolbar.
To run your query on the reserved BigQuery slots in your Google Cloud project, do the following:
In the toolbar, click settings Settings.
Click BigQuery.
The Run query button changes to Run on BigQuery.
If the Run on BigQuery button is displayed but disabled, then a log view referenced by your query doesn't have a linked dataset. To run your query on your BigQuery slot reservations, create a linked dataset on your log view.
Run your query.
The query is executed and the result of the query is shown in the Results tab.
You can use the toolbar options to format your query, clear the query, and open the BigQuery SQL reference documentation.
Display the schema of a view
The schema of a log view defines its structure and the data type
for each field. This information is important to you because it determines
how you construct your queries. For example, suppose you want to compute the
average latency of HTTP requests. You need to know how to access the latency
field and whether it is stored as an integer like 100
or stored as a
string like "100"
. When the latency data is stored as a string, the query
must cast the value to a numeric value before computing an average.
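For example, if a latency value is stored as a string inside a JSON payload, the query can cast it before averaging. This is a sketch: the field name latency and the table path are placeholder assumptions.

```sql
-- Average a string-encoded latency field by casting it to a number first.
-- `json_payload.latency` and the table path are placeholders.
SELECT
  AVG(SAFE_CAST(JSON_VALUE(json_payload.latency) AS FLOAT64)) AS avg_latency
FROM
  `project_ID.region.bucket_ID.view_ID`
```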
When the data type of a column is JSON, the schema doesn't list the fields
available for that column. For example, a log entry can have a
field with the name of json_payload
. When a log bucket is upgraded to use
Log Analytics, that field is mapped to a column with a data type of JSON.
The schema doesn't indicate the child fields of the column. That is, you
can't use the schema to determine if json_payload.url
is a valid reference.
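One way to explore a JSON column is to query a candidate child field directly, because references to JSON fields that don't exist return NULL instead of causing an error. A sketch, with json_payload.url as a placeholder field:

```sql
-- JSON field access returns NULL when the child field doesn't exist,
-- so this query shows whether `json_payload.url` is populated.
SELECT
  JSON_VALUE(json_payload.url) AS url
FROM
  `project_ID.region.bucket_ID.view_ID`
WHERE
  json_payload.url IS NOT NULL
LIMIT 10
```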
To identify the schema for a log view, do the following:
- In the Google Cloud console, go to the Log Analytics page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
In the Log views list, find the log view, and then select the name of the log view.
The schema is displayed. You can use the Filter field to locate specific fields. You can't modify the schema.
Save a query
All queries that you run are automatically saved for 30 days and are accessible by selecting the Recent tab on the Log Analytics page. You can search, view, run, and share the queries that are listed on the Recent tab.
If you want to keep a query available for future use, annotate it with information that is useful to you, or let teammates view and run your query, then save the query. You can search and sort your saved queries by their name, their description, and their visibility label. You can also edit and delete these queries. Queries that you save are retained until you delete them.
You can save up to 10,000 queries per Google Cloud project.
Console
To save a query, do the following:
- In the Google Cloud console, go to the Log Analytics page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
Populate the Query pane with a query.
You can populate the Query pane by entering a new query, by selecting a query from the Recent tab, or by selecting a query from the Saved tab.
When the query in the Query pane is valid, the Save button is enabled.
Click Save and complete the Name and Description fields. The values you set for these fields are shown on the Saved tab.
Optional: To let everyone with access to the Log Analytics page for the Google Cloud project view and run your saved query, enable the Share with project toggle.
By default, this toggle is disabled and the visibility is restricted to you.
Click Save query.
Optional: To view, sort, and run saved queries that are visible to you, select the Saved tab.
You can sort and filter your saved queries by their name, description, and visibility label. You can also filter by the contents of the query.
You can edit and delete queries that you created by using options on the Saved tab:
To edit a query, click more_vert More Options and select Edit. You can modify the values for the Name and Description fields; however, the query itself can't be modified.
To delete a saved query, click more_vert More Options and select Delete.
API
To save a query by using the Logging API, use the
savedQueries.create
method. For more information about this method, its
parameters, and the response data, see the reference page for
savedQueries.create
.
You can execute the savedQueries.create
method by using the
APIs Explorer widget on the method's reference page. For
Log Analytics queries, you must specify the opsAnalyticsQuery
field. The
following example illustrates a sample request body, which contains an
instance of SavedQuery
:
{
  "parent": "projects/my-project/locations/global",
  "saved_query": {
    "ops_analytics_query": {
      "sql_query_text": "SELECT timestamp, log_name, severity, json_payload, resource, labels FROM `TABLE_NAME_OF_LOG_VIEW` WHERE timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR) ORDER BY timestamp ASC LIMIT 100"
    },
    "visibility": "PRIVATE"
  }
}
Share a query
Console
When troubleshooting a problem, or when you see anomalous results, you might want to share a query and its results with a teammate. When you are viewing query results on the Log Analytics page, you can copy a URL that, when opened, displays the query you ran and its results.
To share a query and results with a teammate, do the following:
- In the Google Cloud console, go to the Log Analytics page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
Populate the Query pane with a query and then click Run query.
You can populate the Query pane by entering a new query, by selecting a query from the Recent tab, or by selecting a query from the Saved tab.
Click link Share link.
Send the link to your teammate.
When your teammate opens the link, the Log Analytics page is opened. This page displays the query that you ran and the results of the query.
To open the URL successfully, your teammate's Identity and Access Management role on the Google Cloud project must include the permissions required to view Logging pages.
API
You can use the Logging API to create a shared query. Use the savedQueries.create method and specify a value of SHARED in the visibility field.
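The request body has the same shape as for a private saved query, differing only in the visibility field. A sketch, with placeholder project and query values:

```
{
  "parent": "projects/my-project/locations/global",
  "saved_query": {
    "ops_analytics_query": {
      "sql_query_text": "SELECT timestamp, severity FROM `TABLE_NAME_OF_LOG_VIEW` LIMIT 100"
    },
    "visibility": "SHARED"
  }
}
```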
Save query results to a dashboard
After you run a query, you can select how to view your query results, and whether you want to save the query and its results to a custom dashboard. Custom dashboards let you display and organize information that is useful to you by using a variety of widget types.
By default, your query results are presented as a table. That is, on the Results tab, Table is selected. With these settings, if you go to the toolbar and select Save to dashboard, then a dialog opens and you are prompted to enter a descriptive name for the table and a dashboard name. When you complete these steps, the named dashboard is edited and a table widget is added. The table widget displays the result of your query.
If you want to view trends in your data, then in the Results tab, select either Chart or Both. These options let you display the result of a query on a chart or with an indicator such as a gauge or scorecard. After you create a chart, you can save it to a custom dashboard by selecting Save to dashboard. For information about configuring charts, see Chart query results with Log Analytics.
You can change the configuration of a table or chart on a custom dashboard. To make changes, open the dashboard and then edit the widget. For example, you can edit the query or change how the data is displayed. For more information, see Modify a widget's configuration.
View and run recent or saved queries
To view or re-run a query, select the Recent tab on the Log Analytics page and find the query:
- To run the query, click Run.
- To view the query, use the options in the more_vert More Options menu.
To view, edit, or run a saved query, select the Saved tab on the Log Analytics page and find the query:
- To run the query, click Run.
- To edit, view, or delete the query, use the options in the more_vert More Options menu.
Run queries from BigQuery
When you have a log bucket that uses Log Analytics and linked datasets, you can view and query your linked datasets by using the BigQuery Studio page. With this configuration, you can analyze your datasets using commands, workflows, and datasets available only in BigQuery Studio.
To query a linked dataset by using the BigQuery Studio page, do the following:
- In the Google Cloud console, go to the Log Analytics page:
If you use the search bar to find this page, then select the result whose subheading is Logging.
In the Log views list, find the log view, and then select Query. The Query pane is populated with a default query.
You can also enter a query in the Query pane, or edit a displayed query.
In the toolbar, click Open in BigQuery.
The BigQuery Studio page opens. The FROM statement of the query is modified to specify the path to the log view on the linked dataset by using the BigQuery table path syntax.
You can also edit the displayed query.
Click Run query.
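The rewritten FROM clause references the linked dataset rather than the Log Analytics view path. The following sketch assumes a project named my-project and a linked dataset named my_linked_dataset that exposes the bucket's _AllLogs view; substitute your own names.

```sql
-- Log Analytics path:  `project_ID.region.bucket_ID.view_ID`
-- Linked-dataset path: `project_ID.dataset_ID.view_ID`
SELECT timestamp, severity, log_name
FROM `my-project.my_linked_dataset._AllLogs`
WHERE severity = 'ERROR'
LIMIT 50
```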
What's next
- Sample queries
- Create a log bucket and upgrade it to use Log Analytics
- Upgrade a bucket to use Log Analytics
- Create a linked dataset