Exporting with the Logs Viewer

If you are not familiar with exporting logs in Logging,
see Overview of Logs Export. In summary, you export logs
by creating one or more sinks that include a logs filter and an
export destination. As Stackdriver Logging receives new log entries,
they are compared against each sink. If a log entry matches a sink's filter,
then a copy of the log entry is written to the export destination.

The search-filter box above the table lets you filter your sinks by text
search or by the sink properties Sink Name, Destination, and
Writer Identity. For example, searching on Destination:bigquery shows
only the sinks that export to BigQuery. You can concatenate other sink
properties to the search with OR (AND is the default).

In addition, clicking on any of the column names lets you sort data in ascending
or descending order. At the bottom of the table, you can also select the number
of rows that you wish to display.

Creating sinks

To create an export sink, click the Create Export button at the top of the
Logs Exports page. You can also access this button at the top of the Logs
Viewer page.

If your account lacks permission to create exports for the project, this button
is disabled. See the Before you begin section above for more
information.


To create a sink, fill in the Edit Export panel as follows:

(filter): Enter an advanced logs filter. You don't need
quotation marks around the filter and you can use multiple lines. The
initial filter is determined by the log entries being displayed when you
click Create Export.

Whenever you edit the filter, press Submit Filter, found under the
filter box, to display the matched log entries. Click the
Jump to newest logs icon at the top of the page to fetch the most
recent logs.
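As a sketch, an advanced logs filter such as the following matches error-level entries from Compute Engine instances; the resource type and severity shown are hypothetical example values, not defaults. You can paste the filter into the filter box, or preview the matching entries from the command line with gcloud logging read:

```shell
# Preview the log entries an advanced logs filter would match.
# The resource type and severity below are hypothetical example values.
gcloud logging read \
  'resource.type="gce_instance" AND severity>=ERROR' \
  --limit=10 \
  --format=json
```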

If you wish to use the
basic viewing interface to select the
logs, use the ▾ menu at the right side of the
search-filter box.

A custom export destination must still be in Cloud Storage,
BigQuery, or
Cloud Pub/Sub, but it allows you to send logs to a destination in a
different project.

Sink Destination:

Cloud Storage: Select or create the particular bucket to receive the
exported logs.

Cloud Pub/Sub: Select or create the particular topic to receive the
exported logs.

BigQuery: Select or create the particular dataset to receive the
exported logs.

Custom Destination: Enter the Cloud Storage,
Cloud Pub/Sub, or
BigQuery
destination as a string, possibly referring to a different project. For
information on destination formatting, see Sink properties.
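As a sketch of the destination formats described in Sink properties, the three destination types are written as follows; the bucket, project, dataset, and topic names are hypothetical examples:

```shell
# Destination string formats for custom export destinations.
# The bucket, project, dataset, and topic names are hypothetical examples.
GCS_DEST="storage.googleapis.com/my-logs-bucket"
BQ_DEST="bigquery.googleapis.com/projects/my-project/datasets/my_dataset"
PUBSUB_DEST="pubsub.googleapis.com/projects/my-project/topics/my-topic"
```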

Click Update Sink to create the sink.
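If you prefer the command line, a sink can also be created with gcloud logging sinks create; the sink name, bucket, and filter below are hypothetical examples:

```shell
# Create a sink that exports error-level entries to a Cloud Storage bucket.
# The sink name, bucket name, and filter are hypothetical examples.
gcloud logging sinks create my-error-sink \
  storage.googleapis.com/my-logs-bucket \
  --log-filter='severity>=ERROR'
```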

As part of creating the sink, Logging attempts to grant the
sink's writer identity permission to write to your destination. If you are
exporting to a destination in a project other than the one owning your logs,
then an administrator of the new destination must grant permission. You
should send the administrator the sink's writer identity, which
is listed with the sink in the Exports page.

New log entries that match your sink will start being exported. Log entries
going to BigQuery or Cloud Pub/Sub are streamed to those
export destinations
immediately. Log entries going to Cloud Storage are batched and sent
out approximately every hour. For more information, see
Using exported logs.

If Logging encounters errors when trying to export logs to your
export destination, the errors appear in your project's Activity Stream.
Select Activity at the top of your project's home page in Google Cloud Platform Console.
To diagnose common errors, see Troubleshooting.

Updating sinks

To update a sink, select the Edit sink command in the More menu to
the right of the sink's name. You can change the sink's filter and
export destination.
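The same change can be sketched from the command line with gcloud logging sinks update; the sink name, dataset path, and filter are hypothetical examples:

```shell
# Point an existing sink at a new destination and change its filter.
# The sink name, dataset path, and filter are hypothetical examples.
gcloud logging sinks update my-error-sink \
  bigquery.googleapis.com/projects/my-project/datasets/my_dataset \
  --log-filter='severity>=CRITICAL'
```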

Deleting sinks

To delete a sink, select the sink in the Exports page and press Delete at
the top of the page. Alternatively, select Delete sink from the More
menu to the right of the sink's name.
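From the command line, the equivalent is gcloud logging sinks delete; the sink name is a hypothetical example. Deleting a sink stops future exports but does not remove data already written to the destination:

```shell
# Delete a sink by name. Data already exported to the destination
# is not removed. The sink name is a hypothetical example.
gcloud logging sinks delete my-error-sink
```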

Destination permissions

This section describes how you can grant Logging
permission to write exported logs to your sink's export destination.

When you create a sink, Logging creates a new service account for
the sink, called a unique writer identity. You cannot manage this service
account directly as it is owned and managed by Stackdriver Logging. The service
account is deleted if the sink gets deleted.

Your export destination must permit this service account to write log entries.
To set up this permission, follow these steps:

Create the new sink in the GCP Console, the gcloud logging
command-line interface, or the Logging API.

If you created your sink in the GCP Console and you have
Owner access to the
destination, then Stackdriver Logging should have set up the
necessary permissions on your behalf. If it did so, you are done. If not,
continue.

Obtain the sink's writer identity—an email address—from the
new sink:

If you are using the GCP Console, you can see the writer
identity in
the sink listing on the Exports page.

If you are using the Logging API, you can get the writer
identity from the LogSink object.

If you are using gcloud logging, you can see the writer identities
when you list your sinks.
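As a sketch, an individual sink's writer identity can also be read with gcloud logging sinks describe; the sink name is a hypothetical example:

```shell
# Show a sink's configuration, including its writerIdentity field.
# The sink name is a hypothetical example.
gcloud logging sinks describe my-error-sink
```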

If you have Owner access to the destination, then add the
service account to the destination in the following way:

For Cloud Storage destinations, add the sink's writer identity
to your
bucket and give it the
Storage Object Creator
role.

For BigQuery destinations, add the sink's writer identity
to your
dataset and give it the BigQuery Data Editor role.

For Cloud Pub/Sub, add the sink's writer identity to your topic
and give
it the Pub/Sub Publisher
role.

This completes the authorization.
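The Cloud Storage and Cloud Pub/Sub grants above can be sketched from the command line as follows; the bucket and topic names are hypothetical, and [WRITER_IDENTITY] stands for the sink's writer identity email address:

```shell
# Grant the sink's writer identity access to the export destination.
# Replace [WRITER_IDENTITY], the bucket, and the topic with your own values.

# Cloud Storage: give the writer identity the Storage Object Creator role.
gsutil iam ch \
  serviceAccount:[WRITER_IDENTITY]:roles/storage.objectCreator \
  gs://my-logs-bucket

# Cloud Pub/Sub: give the writer identity the Pub/Sub Publisher role.
gcloud pubsub topics add-iam-policy-binding my-topic \
  --member='serviceAccount:[WRITER_IDENTITY]' \
  --role='roles/pubsub.publisher'
```

BigQuery dataset access is granted through dataset sharing (for example, in the BigQuery UI), which has no single-command equivalent here.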

If you do not have Owner access to the export destination, send the
writer identity's service account name to someone who does. That person
should then follow the instructions in the previous step to add the
writer identity to the export destination.

Authorization delays

If a sink tries to export a log entry but does not have the needed permission
to the export destination, it will report an error and skip the log entry.
This will continue until the permission is granted, at which time the sink
begins exporting new log entries.

There is an unavoidable delay between creating the sink and using the sink's new
service account to authorize writing to the export destination. You can safely
ignore any error messages from the sink during this delay.

Advantages and limitations

The GCP Console has the following advantages over using the
Logging API:

The GCP Console shows all of your sinks in one place.

The GCP Console shows you which log entries are matched by
your sink filter before you create a sink.

The GCP Console can create and authorize export destinations
for your sinks.

However, the GCP Console can only create or view sinks in projects.
To create sinks in organizations, folders, or billing accounts, see
Aggregated Exports.

Troubleshooting

This section lists some possible errors and unexpected results, and explains
what to do about them.

Errors from sinks appear in the Activity Stream for the project or other
resource where the sink was created. See the
Activity Stream in the resource's home page in GCP Console.

General problems

Problem: Your new log entries are exported, but your older log entries
are not.

Cause: Logging only exports log entries that are received after the
export has been set up.

Solution: Use the
entries.list
API method to retrieve your older log entries, and then write them to
the export destination using the destination service's API. You
can only retrieve log entries that have not reached their expiration
date in Stackdriver Logging. For more information,
see Logs Retention
Limits.
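As a sketch of the retrieval half of this workaround, older entries can be fetched with gcloud logging read, which wraps the entries.list method; the timestamp and output file are hypothetical examples:

```shell
# Retrieve log entries received before the sink was created, so they
# can be written to the destination with that service's own API.
# The timestamp and output file name are hypothetical examples.
gcloud logging read \
  'timestamp < "2019-01-01T00:00:00Z"' \
  --limit=1000 \
  --format=json > old-entries.json
```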

Errors exporting to Cloud Storage

The following are the most common errors when you configure
Logging to export logs to Cloud Storage:

Error: Permissions on bucket [YOUR_BUCKET] do not allow the logs
group to create new objects.

Cause: The sink's writer identity does not have enough permissions to the
bucket.

Solution: Grant the sink's writer identity the Storage Object Creator
role on the bucket, as described in Destination permissions.

Errors exporting logs to BigQuery

The following are the most common errors when you configure
Logging to export logs to BigQuery:

Cause: You might have an error in your sink's destination, or
someone might have deleted the dataset.

Solution: Either re-create the dataset
or update the export sink to use a different dataset.

Error: Logs streamed to table [YOUR_TABLE] in dataset [YOUR_DATASET]
do not match the table schema.

Cause: You are trying to export logs that are incompatible with the current
table's schema.

Solution: Make sure that your log entries match the table's schema.
Common issues include sending log entries with mismatched data types;
for example, one of the fields in the log entry is an integer,
while a corresponding column in the schema has a string type.
The activity stream contains a link to one of the invalid log entries.
After you fix the source of the error, you can rename your current table
and let Logging create the table again.

Error: Per-table streaming insert quota has been exceeded for table
[YOUR_TABLE] in dataset [YOUR_DATASET].

Cause: Your sink is streaming log entries to the table faster than
BigQuery's per-table streaming insert quota allows.

Solution: Decrease the amount of log data your sink generates. You can update
your sink's filter to match fewer log entries or use the
sample()
function.

Error: Logs streamed to partitioned table [YOUR_TABLE] are outside
the permitted time boundaries.

Cause: BigQuery does not accept logs that are too far in the past or
future.

Solution: Logs outside the permitted time boundaries cannot be exported
with sinks. You can export those logs to Cloud Storage and use a
BigQuery load job
instead. See the
BigQuery documentation for further instructions.

Error: Logs cannot be streamed to dataset [YOUR_DATASET] because that
operation is prohibited by an organization policy.

Cause: An organization policy exists that prevents writes to the selected
dataset. See the
documentation for more details on organization policies.

Solution: Modify your export sink to use a compliant dataset.

Errors exporting logs to Cloud Pub/Sub

The following are the most common errors when you configure
Logging to export logs to Cloud Pub/Sub:

Error: [ACCOUNT] needs edit permission on [PROJECT] to publish to [TOPIC]

Cause: The sink's writer identity does not have enough permissions to the
topic.

Solution: Grant the sink's writer identity the Pub/Sub Publisher role
on the topic, as described in Destination permissions.