airbyte 0.8.1-beta published on Saturday, Mar 29, 2025 by airbytehq

airbyte.DestinationDatabricks


DestinationDatabricks Resource

Example Usage

package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.airbyte.DestinationDatabricks;
import com.pulumi.airbyte.DestinationDatabricksArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        var myDestinationDatabricks = new DestinationDatabricks("myDestinationDatabricks", DestinationDatabricksArgs.builder()
            .configuration(DestinationDatabricksConfigurationArgs.builder()
                .acceptTerms(false)
                .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
                    .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                        .personalAccessToken("...my_personal_access_token...")
                        .build())
                    .build())
                .database("...my_database...")
                .hostname("abc-12345678-wxyz.cloud.databricks.com")
                .httpPath("sql/1.0/warehouses/0000-1111111-abcd90")
                .port("443")
                .purgeStagingData(false)
                .rawSchemaOverride("...my_raw_schema_override...")
                .schema("default")
                .build())
            .definitionId("fb6a88f5-a304-46f5-ab8b-4280a6d91f99")
            .workspaceId("2615758c-c904-459e-9fd6-c8a55cba9327")
            .build());

    }
}
resources:
  myDestinationDatabricks:
    type: airbyte:DestinationDatabricks
    properties:
      configuration:
        accept_terms: false
        authentication:
          personalAccessToken:
            personalAccessToken: '...my_personal_access_token...'
        database: '...my_database...'
        hostname: abc-12345678-wxyz.cloud.databricks.com
        http_path: sql/1.0/warehouses/0000-1111111-abcd90
        port: '443'
        purge_staging_data: false
        raw_schema_override: '...my_raw_schema_override...'
        schema: default
      definitionId: fb6a88f5-a304-46f5-ab8b-4280a6d91f99
      workspaceId: 2615758c-c904-459e-9fd6-c8a55cba9327

Create DestinationDatabricks Resource

Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

Constructor syntax

new DestinationDatabricks(name: string, args: DestinationDatabricksArgs, opts?: CustomResourceOptions);
@overload
def DestinationDatabricks(resource_name: str,
                          args: DestinationDatabricksArgs,
                          opts: Optional[ResourceOptions] = None)

@overload
def DestinationDatabricks(resource_name: str,
                          opts: Optional[ResourceOptions] = None,
                          configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
                          workspace_id: Optional[str] = None,
                          definition_id: Optional[str] = None,
                          name: Optional[str] = None)
func NewDestinationDatabricks(ctx *Context, name string, args DestinationDatabricksArgs, opts ...ResourceOption) (*DestinationDatabricks, error)
public DestinationDatabricks(string name, DestinationDatabricksArgs args, CustomResourceOptions? opts = null)
public DestinationDatabricks(String name, DestinationDatabricksArgs args)
public DestinationDatabricks(String name, DestinationDatabricksArgs args, CustomResourceOptions options)
type: airbyte:DestinationDatabricks
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.

Parameters

name This property is required. string
The unique name of the resource.
args This property is required. DestinationDatabricksArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
resource_name This property is required. str
The unique name of the resource.
args This property is required. DestinationDatabricksArgs
The arguments to resource properties.
opts ResourceOptions
Bag of options to control resource's behavior.
ctx Context
Context object for the current deployment.
name This property is required. string
The unique name of the resource.
args This property is required. DestinationDatabricksArgs
The arguments to resource properties.
opts ResourceOption
Bag of options to control resource's behavior.
name This property is required. string
The unique name of the resource.
args This property is required. DestinationDatabricksArgs
The arguments to resource properties.
opts CustomResourceOptions
Bag of options to control resource's behavior.
name This property is required. String
The unique name of the resource.
args This property is required. DestinationDatabricksArgs
The arguments to resource properties.
options CustomResourceOptions
Bag of options to control resource's behavior.

Constructor example

The following reference example uses placeholder values for all input properties.

var destinationDatabricksResource = new Airbyte.DestinationDatabricks("destinationDatabricksResource", new()
{
    Configuration = new Airbyte.Inputs.DestinationDatabricksConfigurationArgs
    {
        Authentication = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationArgs
        {
            OAuth2Recommended = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs
            {
                ClientId = "string",
                Secret = "string",
            },
            PersonalAccessToken = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs
            {
                PersonalAccessToken = "string",
            },
        },
        Database = "string",
        Hostname = "string",
        HttpPath = "string",
        AcceptTerms = false,
        Port = "string",
        PurgeStagingData = false,
        RawSchemaOverride = "string",
        Schema = "string",
    },
    WorkspaceId = "string",
    DefinitionId = "string",
    Name = "string",
});
example, err := airbyte.NewDestinationDatabricks(ctx, "destinationDatabricksResource", &airbyte.DestinationDatabricksArgs{
	Configuration: &airbyte.DestinationDatabricksConfigurationArgs{
		Authentication: &airbyte.DestinationDatabricksConfigurationAuthenticationArgs{
			OAuth2Recommended: &airbyte.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs{
				ClientId: pulumi.String("string"),
				Secret:   pulumi.String("string"),
			},
			PersonalAccessToken: &airbyte.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs{
				PersonalAccessToken: pulumi.String("string"),
			},
		},
		Database:          pulumi.String("string"),
		Hostname:          pulumi.String("string"),
		HttpPath:          pulumi.String("string"),
		AcceptTerms:       pulumi.Bool(false),
		Port:              pulumi.String("string"),
		PurgeStagingData:  pulumi.Bool(false),
		RawSchemaOverride: pulumi.String("string"),
		Schema:            pulumi.String("string"),
	},
	WorkspaceId:  pulumi.String("string"),
	DefinitionId: pulumi.String("string"),
	Name:         pulumi.String("string"),
})
var destinationDatabricksResource = new DestinationDatabricks("destinationDatabricksResource", DestinationDatabricksArgs.builder()
    .configuration(DestinationDatabricksConfigurationArgs.builder()
        .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
            .oAuth2Recommended(DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs.builder()
                .clientId("string")
                .secret("string")
                .build())
            .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                .personalAccessToken("string")
                .build())
            .build())
        .database("string")
        .hostname("string")
        .httpPath("string")
        .acceptTerms(false)
        .port("string")
        .purgeStagingData(false)
        .rawSchemaOverride("string")
        .schema("string")
        .build())
    .workspaceId("string")
    .definitionId("string")
    .name("string")
    .build());
destination_databricks_resource = airbyte.DestinationDatabricks("destinationDatabricksResource",
    configuration={
        "authentication": {
            "o_auth2_recommended": {
                "client_id": "string",
                "secret": "string",
            },
            "personal_access_token": {
                "personal_access_token": "string",
            },
        },
        "database": "string",
        "hostname": "string",
        "http_path": "string",
        "accept_terms": False,
        "port": "string",
        "purge_staging_data": False,
        "raw_schema_override": "string",
        "schema": "string",
    },
    workspace_id="string",
    definition_id="string",
    name="string")
const destinationDatabricksResource = new airbyte.DestinationDatabricks("destinationDatabricksResource", {
    configuration: {
        authentication: {
            oAuth2Recommended: {
                clientId: "string",
                secret: "string",
            },
            personalAccessToken: {
                personalAccessToken: "string",
            },
        },
        database: "string",
        hostname: "string",
        httpPath: "string",
        acceptTerms: false,
        port: "string",
        purgeStagingData: false,
        rawSchemaOverride: "string",
        schema: "string",
    },
    workspaceId: "string",
    definitionId: "string",
    name: "string",
});
type: airbyte:DestinationDatabricks
properties:
    configuration:
        acceptTerms: false
        authentication:
            oAuth2Recommended:
                clientId: string
                secret: string
            personalAccessToken:
                personalAccessToken: string
        database: string
        hostname: string
        httpPath: string
        port: string
        purgeStagingData: false
        rawSchemaOverride: string
        schema: string
    definitionId: string
    name: string
    workspaceId: string

DestinationDatabricks Resource Properties

To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

Inputs

In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

The DestinationDatabricks resource accepts the following input properties:

Configuration This property is required. DestinationDatabricksConfiguration
WorkspaceId This property is required. string
DefinitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
Name string
Name of the destination e.g. dev-mysql-instance.
Configuration This property is required. DestinationDatabricksConfigurationArgs
WorkspaceId This property is required. string
DefinitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
Name string
Name of the destination e.g. dev-mysql-instance.
configuration This property is required. DestinationDatabricksConfiguration
workspaceId This property is required. String
definitionId String
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
name String
Name of the destination e.g. dev-mysql-instance.
configuration This property is required. DestinationDatabricksConfiguration
workspaceId This property is required. string
definitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
name string
Name of the destination e.g. dev-mysql-instance.
configuration This property is required. DestinationDatabricksConfigurationArgs
workspace_id This property is required. str
definition_id str
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
name str
Name of the destination e.g. dev-mysql-instance.
configuration This property is required. Property Map
workspaceId This property is required. String
definitionId String
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
name String
Name of the destination e.g. dev-mysql-instance.

Outputs

All input properties are implicitly available as output properties. Additionally, the DestinationDatabricks resource produces the following output properties:

CreatedAt double
DestinationId string
DestinationType string
Id string
The provider-assigned unique ID for this managed resource.
ResourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
CreatedAt float64
DestinationId string
DestinationType string
Id string
The provider-assigned unique ID for this managed resource.
ResourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
createdAt Double
destinationId String
destinationType String
id String
The provider-assigned unique ID for this managed resource.
resourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
createdAt number
destinationId string
destinationType string
id string
The provider-assigned unique ID for this managed resource.
resourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
created_at float
destination_id str
destination_type str
id str
The provider-assigned unique ID for this managed resource.
resource_allocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
createdAt Number
destinationId String
destinationType String
id String
The provider-assigned unique ID for this managed resource.
resourceAllocation Property Map
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.

Look up Existing DestinationDatabricks Resource

Get an existing DestinationDatabricks resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.

public static get(name: string, id: Input<ID>, state?: DestinationDatabricksState, opts?: CustomResourceOptions): DestinationDatabricks
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
        created_at: Optional[float] = None,
        definition_id: Optional[str] = None,
        destination_id: Optional[str] = None,
        destination_type: Optional[str] = None,
        name: Optional[str] = None,
        resource_allocation: Optional[DestinationDatabricksResourceAllocationArgs] = None,
        workspace_id: Optional[str] = None) -> DestinationDatabricks
func GetDestinationDatabricks(ctx *Context, name string, id IDInput, state *DestinationDatabricksState, opts ...ResourceOption) (*DestinationDatabricks, error)
public static DestinationDatabricks Get(string name, Input<string> id, DestinationDatabricksState? state, CustomResourceOptions? opts = null)
public static DestinationDatabricks get(String name, Output<String> id, DestinationDatabricksState state, CustomResourceOptions options)
resources:
  _:
    type: airbyte:DestinationDatabricks
    get:
      id: ${id}
name This property is required.
The unique name of the resulting resource.
id This property is required.
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
resource_name This property is required.
The unique name of the resulting resource.
id This property is required.
The unique provider ID of the resource to lookup.
name This property is required.
The unique name of the resulting resource.
id This property is required.
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name This property is required.
The unique name of the resulting resource.
id This property is required.
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
name This property is required.
The unique name of the resulting resource.
id This property is required.
The unique provider ID of the resource to lookup.
state
Any extra arguments used during the lookup.
opts
A bag of options that control this resource's behavior.
The following state arguments are supported:
Configuration DestinationDatabricksConfiguration
CreatedAt double
DefinitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
DestinationId string
DestinationType string
Name string
Name of the destination e.g. dev-mysql-instance.
ResourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
WorkspaceId string
Configuration DestinationDatabricksConfigurationArgs
CreatedAt float64
DefinitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
DestinationId string
DestinationType string
Name string
Name of the destination e.g. dev-mysql-instance.
ResourceAllocation DestinationDatabricksResourceAllocationArgs
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
WorkspaceId string
configuration DestinationDatabricksConfiguration
createdAt Double
definitionId String
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
destinationId String
destinationType String
name String
Name of the destination e.g. dev-mysql-instance.
resourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
workspaceId String
configuration DestinationDatabricksConfiguration
createdAt number
definitionId string
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
destinationId string
destinationType string
name string
Name of the destination e.g. dev-mysql-instance.
resourceAllocation DestinationDatabricksResourceAllocation
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
workspaceId string
configuration DestinationDatabricksConfigurationArgs
created_at float
definition_id str
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
destination_id str
destination_type str
name str
Name of the destination e.g. dev-mysql-instance.
resource_allocation DestinationDatabricksResourceAllocationArgs
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
workspace_id str
configuration Property Map
createdAt Number
definitionId String
The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
destinationId String
destinationType String
name String
Name of the destination e.g. dev-mysql-instance.
resourceAllocation Property Map
Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
workspaceId String

Supporting Types

DestinationDatabricksConfiguration
, DestinationDatabricksConfigurationArgs

Authentication This property is required. DestinationDatabricksConfigurationAuthentication
Authentication mechanism for staging files and running queries.
Database This property is required. string
The name of the Unity Catalog for the database.
Hostname This property is required. string
Databricks Cluster Server Hostname.
HttpPath This property is required. string
Databricks Cluster HTTP Path.
AcceptTerms bool
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
Port string
Databricks Cluster Port. Default: "443"
PurgeStagingData bool
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
RawSchemaOverride string
The schema to write raw tables into. Default: "airbyte_internal"
Schema string
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
Authentication This property is required. DestinationDatabricksConfigurationAuthentication
Authentication mechanism for staging files and running queries.
Database This property is required. string
The name of the Unity Catalog for the database.
Hostname This property is required. string
Databricks Cluster Server Hostname.
HttpPath This property is required. string
Databricks Cluster HTTP Path.
AcceptTerms bool
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
Port string
Databricks Cluster Port. Default: "443"
PurgeStagingData bool
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
RawSchemaOverride string
The schema to write raw tables into. Default: "airbyte_internal"
Schema string
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
authentication This property is required. DestinationDatabricksConfigurationAuthentication
Authentication mechanism for staging files and running queries.
database This property is required. String
The name of the Unity Catalog for the database.
hostname This property is required. String
Databricks Cluster Server Hostname.
httpPath This property is required. String
Databricks Cluster HTTP Path.
acceptTerms Boolean
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
port String
Databricks Cluster Port. Default: "443"
purgeStagingData Boolean
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
rawSchemaOverride String
The schema to write raw tables into. Default: "airbyte_internal"
schema String
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
authentication This property is required. DestinationDatabricksConfigurationAuthentication
Authentication mechanism for staging files and running queries.
database This property is required. string
The name of the Unity Catalog for the database.
hostname This property is required. string
Databricks Cluster Server Hostname.
httpPath This property is required. string
Databricks Cluster HTTP Path.
acceptTerms boolean
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
port string
Databricks Cluster Port. Default: "443"
purgeStagingData boolean
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
rawSchemaOverride string
The schema to write raw tables into. Default: "airbyte_internal"
schema string
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
authentication This property is required. DestinationDatabricksConfigurationAuthentication
Authentication mechanism for staging files and running queries.
database This property is required. str
The name of the Unity Catalog for the database.
hostname This property is required. str
Databricks Cluster Server Hostname.
http_path This property is required. str
Databricks Cluster HTTP Path.
accept_terms bool
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
port str
Databricks Cluster Port. Default: "443"
purge_staging_data bool
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
raw_schema_override str
The schema to write raw tables into. Default: "airbyte_internal"
schema str
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
authentication This property is required. Property Map
Authentication mechanism for staging files and running queries.
database This property is required. String
The name of the Unity Catalog for the database.
hostname This property is required. String
Databricks Cluster Server Hostname.
httpPath This property is required. String
Databricks Cluster HTTP Path.
acceptTerms Boolean
You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
port String
Databricks Cluster Port. Default: "443"
purgeStagingData Boolean
Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
rawSchemaOverride String
The schema to write raw tables into. Default: "airbyte_internal"
schema String
The default schema tables are written to. If not specified, "default" will be used. Default: "default"
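As a rough illustration of the field contract above, here is a plain-Python sketch that checks the required keys and fills in the documented defaults. REQUIRED, DEFAULTS, and apply_defaults are illustrative names, not part of the provider SDK, and the raw-schema default is an assumption (this page's rendering appears to strip underscores):

```python
# Illustrative only: mirrors the documented required fields and defaults of
# DestinationDatabricksConfiguration; not a provider API.
REQUIRED = {"authentication", "database", "hostname", "http_path"}
DEFAULTS = {
    "accept_terms": False,       # must be set True to accept the JDBC driver terms
    "port": "443",               # documented as a string, not an integer
    "purge_staging_data": True,  # switch to False only for debugging
    "raw_schema_override": "airbyte_internal",  # assumed default
    "schema": "default",
}

def apply_defaults(config: dict) -> dict:
    """Validate required keys, then merge the documented defaults underneath."""
    missing = REQUIRED - config.keys()
    if missing:
        raise ValueError(f"missing required configuration keys: {sorted(missing)}")
    return {**DEFAULTS, **config}

cfg = apply_defaults({
    "authentication": {"personal_access_token": {"personal_access_token": "..."}},
    "database": "my_unity_catalog",
    "hostname": "abc-12345678-wxyz.cloud.databricks.com",
    "http_path": "sql/1.0/warehouses/0000-1111111-abcd90",
})
```

The merge order means explicit values always win over defaults, matching how optional provider inputs behave.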

DestinationDatabricksConfigurationAuthentication
, DestinationDatabricksConfigurationAuthenticationArgs

DestinationDatabricksConfigurationAuthenticationOAuth2Recommended
, DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs

ClientId This property is required. string
Secret This property is required. string
ClientId This property is required. string
Secret This property is required. string
clientId This property is required. String
secret This property is required. String
clientId This property is required. string
secret This property is required. string
client_id This property is required. str
secret This property is required. str
clientId This property is required. String
secret This property is required. String

DestinationDatabricksConfigurationAuthenticationPersonalAccessToken
, DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs

PersonalAccessToken This property is required. string
PersonalAccessToken This property is required. string
personalAccessToken This property is required. String
personalAccessToken This property is required. string
personal_access_token This property is required. str
personalAccessToken This property is required. String

DestinationDatabricksResourceAllocation
, DestinationDatabricksResourceAllocationArgs

Default DestinationDatabricksResourceAllocationDefault
Optional resource requirements to run workers (blank for unbounded allocations).
JobSpecifics List<DestinationDatabricksResourceAllocationJobSpecific>
Default DestinationDatabricksResourceAllocationDefault
Optional resource requirements to run workers (blank for unbounded allocations).
JobSpecifics []DestinationDatabricksResourceAllocationJobSpecific
default DestinationDatabricksResourceAllocationDefault
Optional resource requirements to run workers (blank for unbounded allocations).
jobSpecifics DestinationDatabricksResourceAllocationJobSpecific[]
default Property Map
Optional resource requirements to run workers (blank for unbounded allocations).
jobSpecifics List<Property Map>

DestinationDatabricksResourceAllocationDefault
, DestinationDatabricksResourceAllocationDefaultArgs

DestinationDatabricksResourceAllocationJobSpecific
, DestinationDatabricksResourceAllocationJobSpecificArgs

JobType string
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
ResourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
Optional resource requirements to run workers (blank for unbounded allocations).
JobType string
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
ResourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
Optional resource requirements to run workers (blank for unbounded allocations).
jobType String
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
resourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
Optional resource requirements to run workers (blank for unbounded allocations).
jobType string
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
resourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
Optional resource requirements to run workers (blank for unbounded allocations).
job_type str
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
resource_requirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
Optional resource requirements to run workers (blank for unbounded allocations).
jobType String
Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
resourceRequirements Property Map
Optional resource requirements to run workers (blank for unbounded allocations).

DestinationDatabricksResourceAllocationJobSpecificResourceRequirements
, DestinationDatabricksResourceAllocationJobSpecificResourceRequirementsArgs

Import

$ pulumi import airbyte:index/destinationDatabricks:DestinationDatabricks my_airbyte_destination_databricks ""

To learn more about importing existing cloud resources, see Importing resources.

Package Details

Repository
airbyte airbytehq/terraform-provider-airbyte
License
Notes
This Pulumi package is based on the airbyte Terraform Provider.