Managed Service for Apache Kafka® API, REST: Connector.Resume
Resumes an Apache Kafka® connector.
HTTP request
POST https://mdb.api.cloud.yandex.net/managed-kafka/v1/clusters/{clusterId}/connectors/resume/{connectorName}
Path parameters

| Field | Description |
| --- | --- |
| clusterId | string. Required field. ID of the Apache Kafka® cluster to resume the connector in. To get this ID, make a ClusterService.List request. |
| connectorName | string. Required field. Name of the Apache Kafka® connector to resume. To get this name, make a ConnectorService.List request. |
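As an illustrative sketch only, a call to this endpoint from Python might look as follows. It assumes the `requests` library and authentication with an IAM token in the `Authorization: Bearer` header, as in other Yandex Cloud REST APIs; the cluster ID and connector name are placeholders.

```python
import requests

# Placeholders: substitute a real cluster ID, connector name, and IAM token.
CLUSTER_ID = "c9q0example0000000000"   # hypothetical cluster ID
CONNECTOR_NAME = "my-connector"        # hypothetical connector name
IAM_TOKEN = "<IAM token>"

url = (
    "https://mdb.api.cloud.yandex.net/managed-kafka/v1/"
    f"clusters/{CLUSTER_ID}/connectors/resume/{CONNECTOR_NAME}"
)

# The Resume call takes no request body; both parameters are passed in the path.
resp = requests.post(url, headers={"Authorization": f"Bearer {IAM_TOKEN}"})
resp.raise_for_status()

operation = resp.json()  # an Operation resource, described below
print(operation["id"], operation["done"])
```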
Response
HTTP Code: 200 - OK
{
  "id": "string",
  "description": "string",
  "createdAt": "string",
  "createdBy": "string",
  "modifiedAt": "string",
  "done": "boolean",
  "metadata": {
    "clusterId": "string",
    "connectorName": "string"
  },
  // Includes only one of the fields `error`, `response`
  "error": {
    "code": "integer",
    "message": "string",
    "details": [
      "object"
    ]
  },
  "response": {
    "name": "string",
    "tasksMax": "string",
    "properties": "string",
    "health": "string",
    "status": "string",
    "clusterId": "string",
    // Includes only one of the fields `connectorConfigMirrormaker`, `connectorConfigS3Sink`
    "connectorConfigMirrormaker": {
      "sourceCluster": {
        "alias": "string",
        // Includes only one of the fields `thisCluster`, `externalCluster`
        "thisCluster": "object",
        "externalCluster": {
          "bootstrapServers": "string",
          "saslUsername": "string",
          "saslMechanism": "string",
          "securityProtocol": "string"
        }
        // end of the list of possible fields
      },
      "targetCluster": {
        "alias": "string",
        // Includes only one of the fields `thisCluster`, `externalCluster`
        "thisCluster": "object",
        "externalCluster": {
          "bootstrapServers": "string",
          "saslUsername": "string",
          "saslMechanism": "string",
          "securityProtocol": "string"
        }
        // end of the list of possible fields
      },
      "topics": "string",
      "replicationFactor": "string"
    },
    "connectorConfigS3Sink": {
      "topics": "string",
      "fileCompressionType": "string",
      "fileMaxRecords": "string",
      "s3Connection": {
        "bucketName": "string",
        // Includes only one of the fields `externalS3`
        "externalS3": {
          "accessKeyId": "string",
          "endpoint": "string",
          "region": "string"
        }
        // end of the list of possible fields
      }
    }
    // end of the list of possible fields
  }
  // end of the list of possible fields
}
An Operation resource. For more information, see Operation.
| Field | Description |
| --- | --- |
| id | string. ID of the operation. |
| description | string. Description of the operation. 0-256 characters long. |
| createdAt | string (date-time). Creation timestamp. String in RFC3339 text format. To work with values in this field, use the APIs described in the Protocol Buffers reference. |
| createdBy | string. ID of the user or service account who initiated the operation. |
| modifiedAt | string (date-time). The time when the Operation resource was last modified. String in RFC3339 text format. To work with values in this field, use the APIs described in the Protocol Buffers reference. |
| done | boolean. If the value is `false`, the operation is still in progress. If `true`, the operation is completed, and either `error` or `response` is available. |
| metadata | ResumeConnectorMetadata. Service-specific metadata associated with the operation. |
| error | Status. The error result of the operation in case of failure or cancellation. Includes only one of the fields `error`, `response`. The operation result. |
| response | Connector. The normal response of the operation in case of success. Includes only one of the fields `error`, `response`. The operation result. |
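A minimal sketch of interpreting the returned Operation on the client side, continuing the request example above (it assumes the response has already been parsed into a dict named `operation`):

```python
# `operation` is the parsed Operation resource returned by the Resume call.
if not operation["done"]:
    # Still in progress: poll the operation until `done` becomes true.
    print("operation", operation["id"], "is still running")
elif "error" in operation:
    # Failure or cancellation: only one of `error` / `response` is present.
    err = operation["error"]
    raise RuntimeError(f"resume failed: code={err['code']}, message={err['message']}")
else:
    # Success: `response` holds the Connector resource.
    connector = operation["response"]
    print("connector", connector["name"], "status:", connector["status"])
```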
ResumeConnectorMetadata
| Field | Description |
| --- | --- |
| clusterId | string. ID of the Apache Kafka® cluster the connector is being resumed in. |
| connectorName | string. Required field. Name of the Apache Kafka® connector that is being resumed. |
Status
The error result of the operation in case of failure or cancellation.
| Field | Description |
| --- | --- |
| code | integer (int32). Error code. An enum value of google.rpc.Code. |
| message | string. An error message. |
| details[] | object. A list of messages that carry the error details. |
Connector
| Field | Description |
| --- | --- |
| name | string. Name of the connector. |
| tasksMax | string (int64). Maximum number of connector tasks. Default value is the number of brokers. |
| properties | string. A set of properties passed to Managed Service for Apache Kafka® with the connector configuration. |
| health | enum (Health). Connector health. |
| status | enum (Status). Current status of the connector. |
| clusterId | string. ID of the Apache Kafka® cluster that the connector belongs to. |
| connectorConfigMirrormaker | ConnectorConfigMirrorMaker. Configuration of the MirrorMaker connector. Includes only one of the fields `connectorConfigMirrormaker`, `connectorConfigS3Sink`. Additional settings for the connector. |
| connectorConfigS3Sink | ConnectorConfigS3Sink. Configuration of the S3-Sink connector. Includes only one of the fields `connectorConfigMirrormaker`, `connectorConfigS3Sink`. Additional settings for the connector. |
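Since only one of the two config fields is present on a given Connector, client code can branch on whichever key exists. A hedged sketch over the parsed Connector dict (the function and variable names are illustrative, not part of the API):

```python
def describe_connector(connector: dict) -> str:
    # Exactly one of the two config fields is present in the Connector resource.
    if "connectorConfigMirrormaker" in connector:
        mm = connector["connectorConfigMirrormaker"]
        return f"MirrorMaker connector replicating topics: {mm['topics']}"
    if "connectorConfigS3Sink" in connector:
        s3 = connector["connectorConfigS3Sink"]
        return f"S3-Sink connector exporting topics: {s3['topics']}"
    return "connector with unknown configuration type"
```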
ConnectorConfigMirrorMaker
| Field | Description |
| --- | --- |
| sourceCluster | ClusterConnection. Source cluster connection configuration. |
| targetCluster | ClusterConnection. Target cluster connection configuration. |
| topics | string. List of Kafka topics, separated by ','. |
| replicationFactor | string (int64). Replication factor for automatically created topics. |
ClusterConnection
| Field | Description |
| --- | --- |
| alias | string. Alias of the cluster connection configuration. |
| thisCluster | object. Connection configuration of the cluster the connector belongs to. As all credentials are already known, leave this parameter empty. Includes only one of the fields `thisCluster`, `externalCluster`. Type of connection to the Apache Kafka® cluster. |
| externalCluster | ExternalClusterConnection. Configuration of connection to an external cluster with all the necessary credentials. Includes only one of the fields `thisCluster`, `externalCluster`. Type of connection to the Apache Kafka® cluster. |
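As an illustration of the `thisCluster` / `externalCluster` choice, a MirrorMaker source pointing at an external cluster and a target pointing at the current cluster might be expressed like this (placeholder host names and credentials; a sketch only, not a prescribed configuration):

```python
# Source: an external cluster, so connection details and credentials are spelled out.
source_cluster = {
    "alias": "source",
    "externalCluster": {
        "bootstrapServers": "broker1.example.com:9091,broker2.example.com:9091",  # placeholders
        "saslUsername": "mirror-user",       # placeholder
        "saslMechanism": "SCRAM-SHA-512",    # example SASL mechanism
        "securityProtocol": "SASL_SSL",      # example security protocol
    },
}

# Target: the cluster the connector belongs to, so `thisCluster` is left empty.
target_cluster = {
    "alias": "target",
    "thisCluster": {},
}
```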
ExternalClusterConnection
| Field | Description |
| --- | --- |
| bootstrapServers | string. List of bootstrap servers of the cluster, separated by ','. |
| saslUsername | string. SASL username to use for connection to the cluster. |
| saslMechanism | string. SASL mechanism to use for connection to the cluster. |
| securityProtocol | string. Security protocol to use for connection to the cluster. |
ConnectorConfigS3Sink
An Apache Kafka® S3-Sink connector resource.
| Field | Description |
| --- | --- |
| topics | string. List of Kafka topics, separated by ','. |
| fileCompressionType | string. The compression type used for files put to S3 storage. |
| fileMaxRecords | string (int64). Maximum number of records per file. |
| s3Connection | S3Connection. Credentials for connecting to S3 storage. |
S3Connection
Resource for S3Connection: settings for connecting to AWS-compatible S3 storage that serves as the source or target of Kafka S3 connectors. Yandex Cloud Object Storage is AWS-compatible.
| Field | Description |
| --- | --- |
| bucketName | string. |
| externalS3 | ExternalS3Storage. Includes only one of the fields `externalS3`. |
ExternalS3Storage
| Field | Description |
| --- | --- |
| accessKeyId | string. |
| endpoint | string. |
| region | string. Default is 'us-east-1'. |