# Drafting a trail specification
You can create and edit trails via the CLI using YAML specifications. To create a new trail, base your specification on the `get` command output for a similar trail. To edit an existing trail, start from the `get` command output for that trail.
> **Tip**
>
> To create and manage multiple trails, use Terraform.
## Drafting a specification for a trail
- Get information about the trail you want to edit or use as the basis for a new one:

  ```bash
  yc audit-trails trail get <trail_name_or_ID>
  ```

  Result:

  ```yaml
  id: cnpmhbf7gsq3********
  folder_id: b1g681qpemb4********
  created_at: "2026-02-13T20:32:52.357Z"
  updated_at: "2026-02-13T20:32:52.357Z"
  name: create-me
  destination:
    object_storage:
      bucket_id: recreate-trails
  service_account_id: ajelprpohp7r********
  status: ACTIVE
  cloud_id: b1gia87mbaom********
  filtering_policy:
    management_events_filter:
      resource_scopes:
        - id: b1g681qpemb4********
          type: resource-manager.folder
    data_events_filters:
      - service: compute
        resource_scopes:
          - id: b1g681qpemb4********
            type: resource-manager.folder
  ```
- Save the information you got into a file, e.g., `my-trail-spec.yaml`.
- When editing a trail, rename the `id` field to `trail_id`.
- Delete these fields:

  - `folder_id` (only when editing a trail)
  - `created_at`
  - `updated_at`
  - `status`
  - `cloud_id`
- Optionally, delete obsolete sections, if any:

  - `filter`
  - `path_filter`
  - `event_filter`
- Edit the relevant trail parameters.
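As an illustration, after the renaming and deletion steps above, a specification for updating the sample trail from the `get` output would look like this (all IDs are the masked values from that output):

```yaml
trail_id: cnpmhbf7gsq3********
name: create-me
destination:
  object_storage:
    bucket_id: recreate-trails
service_account_id: ajelprpohp7r********
filtering_policy:
  management_events_filter:
    resource_scopes:
      - id: b1g681qpemb4********
        type: resource-manager.folder
  data_events_filters:
    - service: compute
      resource_scopes:
        - id: b1g681qpemb4********
          type: resource-manager.folder
```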
### Description of trail parameters
```yaml
name: <trail_name>
folder_id: <folder_ID>
destination:
  # Only one destination must be specified:
  # object_storage, cloud_logging, data_stream, or eventrouter.
  # Settings for all destinations are provided for illustration purposes.
  object_storage:
    bucket_id: <bucket_name>
    object_prefix: <prefix_for_objects>
  cloud_logging:
    log_group_id: <log_group_ID>
  data_stream:
    stream_name: <YDS_name>
    database_id: <YDS_database_ID>
    codec: <event_compression_method>
  eventrouter:
    eventrouter_connector_id: <bus_connector_ID>
service_account_id: <service_account_ID>
filtering_policy:
  management_events_filter:
    resource_scopes:
      - id: <cloud_or_folder_organization_ID>
        type: <type>
  data_events_filters:
    - service: <service_name>
      resource_scopes:
        - id: <cloud_or_folder_organization_ID>
          type: <type>
      # You can specify either included_events or excluded_events,
      # or skip both parameters to collect all service events.
      # Both parameters are provided for illustration purposes.
      included_events:
        event_types:
          - <these_events_will_be_collected>
      excluded_events:
        event_types:
          - <these_events_will_not_be_collected>
```

Where:
- `name`: Trail name. It must be unique within the folder.
- `folder_id`: ID of the folder the trail will reside in.
- `destination`: Settings of the destination the audit logs will be uploaded to.

  > **Warning**
  >
  > Destination settings are mutually exclusive: specify exactly one destination per trail.
  - `object_storage`: Uploading logs to a Yandex Object Storage bucket:

    - `bucket_id`: Bucket name. You can request the bucket name with the list of buckets in the folder (the default folder is used):

      ```bash
      yc storage bucket list
      ```

    - `object_prefix`: Prefix that will be assigned to the objects with audit logs in the bucket. This optional parameter is used in the full name of the audit log file.

      > **Note**
      >
      > Use a prefix to store audit logs and third-party data in the same bucket. Do not use the same prefix for logs and other bucket objects, as this may cause logs and third-party objects to overwrite each other.
  - `cloud_logging`: Uploading logs to a Yandex Cloud Logging log group. Specify the log group ID in the `log_group_id` parameter. You can request the ID with the list of log groups in the folder.
  - `data_stream`: Uploading logs to a data stream in Yandex Data Streams:

    - `stream_name`: Stream name. You can request it with the list of data streams in the folder.
    - `database_id`: ID of the YDB database used by Data Streams. You can request it with the list of YDB databases in the folder.

  - `eventrouter`: Uploading logs via a bus connector. Specify the connector ID in the `eventrouter_connector_id` parameter.
- `service_account_id`: ID of the service account used to upload the audit logs to the destination.
- `filtering_policy`: Settings of the filtering policy that determines which events to collect and include in the audit logs. The policy consists of filters pertaining to different levels of events.

  > **Warning**
  >
  > You must configure at least one filter for the policy; otherwise, you will not be able to create a trail.

  Available filters:
  - `management_events_filter`: Management event filter. Specify the log collection scope in the `resource_scopes` parameter:

    - `id`: Organization, cloud, or folder ID.
    - `type`: Scope type matching the specified ID:

      - `organization-manager.organization`: Organization.
      - `resource-manager.cloud`: Cloud.
      - `resource-manager.folder`: Folder.

    You can combine several scopes belonging to the same organization in one `resource_scopes` parameter. For example, you can collect logs from one entire cloud and only from particular folders in another cloud:

    ```yaml
    resource_scopes:
      # Collecting logs from all of cloud 1
      - id: <ID_of_cloud_1>
        type: resource-manager.cloud
      # Collecting logs from folder 1 of cloud 2
      - id: <folder_1_ID>
        type: resource-manager.folder
      # Collecting logs from folder 2 of cloud 2
      - id: <folder_2_ID>
        type: resource-manager.folder
    ```

    The service account permissions must allow collecting logs from the specified scopes.
  - `data_events_filters`: Data event filters. You can configure several filters of this type, one filter per service. A filter for one service has the following structure:

    - `service`: Service name. You can get it from the data event reference.
    - `resource_scopes`: Places to collect data events from. You can configure this parameter the same way as the management event filter.
    - `*_events`: Data event filters:

      - `included_events.event_types`: Collect only the specified events.
      - `excluded_events.event_types`: Collect all events other than the specified ones.

      You can get the list of events from the data event reference.

      > **Warning**
      >
      > The `included_events` and `excluded_events` filters are mutually exclusive, so only one of them should be set up. If neither filter is set up, all events will be collected.
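To illustrate the filter semantics above, here is a sketch of a policy that collects all management events in a folder and the data events of one service except certain event types. The service name and event type are placeholders; take real values from the data event reference:

```yaml
filtering_policy:
  # Collect all management events in the folder
  management_events_filter:
    resource_scopes:
      - id: <folder_ID>
        type: resource-manager.folder
  # Collect one service's data events, excluding some event types
  data_events_filters:
    - service: <service_name>        # placeholder; see the data event reference
      resource_scopes:
        - id: <folder_ID>
          type: resource-manager.folder
      excluded_events:
        event_types:
          - <excluded_event_type>    # placeholder; see the data event reference
```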
- Save the changes to the file.
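Putting the parameters together, a minimal specification for creating a new trail that uploads logs to a Data Streams stream might look like this. All names and IDs are placeholders, and the optional `codec` and `object_prefix` parameters are omitted:

```yaml
name: my-new-trail
folder_id: <folder_ID>
destination:
  data_stream:
    stream_name: <YDS_name>
    database_id: <YDS_database_ID>
service_account_id: <service_account_ID>
filtering_policy:
  # At least one filter is required to create a trail
  management_events_filter:
    resource_scopes:
      - id: <folder_ID>
        type: resource-manager.folder
```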