Management APIs
Destinations API

Create Destination

Create a Destination to upload Batch Result Sets to on completion.

Destinations are created by making an HTTP POST to the /destinations endpoint. POST parameters can be either supplied as x-www-form-urlencoded parameters or by POST'ing a JSON object.



Example

POST /destinations

To create a new Amazon S3 Destination the request would be as shown below:

Verifying your Destination: Whenever you create a new enabled Destination, enable a previously disabled Destination, or update an existing Destination's credentials, SerpWow will upload and then delete a small test file to the Destination to verify connectivity. If this connectivity test fails, the create/update operation itself fails.

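The example request itself did not survive in this copy. As a sketch, the Python below builds (but does not send) the POST for a new Amazon S3 Destination, assuming the api.serpwow.com base URL; all credential values are placeholders.

```python
import json
import urllib.request

API_KEY = "demo"  # your SerpWow API key (placeholder)

# Body for a hypothetical Amazon S3 Destination; credentials are placeholders.
body = {
    "name": "My S3 Destination",
    "enabled": True,
    "type": "s3",
    "s3_access_key_id": "YOUR_ACCESS_KEY_ID",
    "s3_secret_access_key": "YOUR_SECRET_ACCESS_KEY",
    "s3_bucket_name": "my-bucket",
}

# api_key goes on the querystring; the other parameters go in the JSON body.
req = urllib.request.Request(
    url=f"https://api.serpwow.com/destinations?api_key={API_KEY}",
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(req)  # uncomment to actually send the request
print(req.get_method(), req.full_url)
```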

SerpWow responds with JSON confirming the Destination has been successfully created.


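The response body itself was not preserved in this copy. An illustrative shape (the field names below are assumptions, not the documented schema) might look like:

```json
{
  "request_info": { "success": true },
  "destination": {
    "id": "demo_destination_id",
    "name": "My S3 Destination",
    "enabled": true,
    "type": "s3",
    "s3_bucket_name": "my-bucket"
  }
}
```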

Destination Parameters

POST /destinations

The following parameters can be used when creating a Destination. The api_key parameter should be supplied as a querystring parameter on the request URL.

Parameters other than api_key should be supplied in the request body, in either x-www-form-urlencoded or JSON format.
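As a sketch, here is the same set of parameters encoded both ways (the names and values are illustrative):

```python
import json
from urllib.parse import urlencode

params = {"name": "My S3 Destination", "enabled": "true", "type": "s3"}

# Option 1: x-www-form-urlencoded request body
form_body = urlencode(params)

# Option 2: JSON request body (send with Content-Type: application/json)
json_body = json.dumps(params)

print(form_body)  # name=My+S3+Destination&enabled=true&type=s3
print(json_body)
```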

Parameter

Required

Description

api_key

required

The API key for your account. It should be supplied on the request querystring, i.e. api_key=demo

name

required

The name of the Destination.

enabled

required

Determines whether the Destination is enabled or not. Disabled Destinations do not have Batch Result Sets uploaded to them, even if the Destination is set on a Batch.

Parameter Values

  true - Destination is enabled

  false - Destination is disabled

type

required

Determines the type of Destination.

Parameter Values

  s3 - Specifies that this Destination is for an Amazon S3 bucket.

  gcs - Specifies that this Destination is for a Google Cloud Storage bucket.

  azure - Specifies that this Destination is for Microsoft Azure Blob Storage.

  oss - Specifies that this Destination is for an Alibaba Cloud OSS bucket.

  s3compatible - Specifies that this Destination is for an S3-compatible object storage provider. Note that when type=s3compatible an S3-compatible endpoint must be supplied in the s3_endpoint parameter.

s3_access_key_id

required

Applies only when type=s3 or type=s3compatible

The Amazon S3 Access Key ID. We strongly recommend you set up credentials specifically for SerpWow to use.

s3_secret_access_key

required

Applies only when type=s3 or type=s3compatible

The Amazon S3 Secret Access Key. We strongly recommend you set up credentials specifically for SerpWow to use.

s3_bucket_name

required

Applies only when type=s3 or type=s3compatible

The Amazon S3 bucket name to use.

s3_path_prefix

optional

Applies only when type=s3 or type=s3compatible

The Amazon S3 path prefix to prepend to Batch Result Set files when they are uploaded to your bucket.

The text tokens %BATCH_ID%, %BATCH_NAME%, %RESULT_SET_ID% and %DATE% will be replaced with the Batch ID, Batch name, Result Set ID and UTC date (in the form YYYY_MM_DD) respectively when the files are uploaded.
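SerpWow performs this substitution server-side when files are uploaded; the sketch below illustrates the documented token expansion client-side (the helper name and example values are illustrative, not part of the API).

```python
from datetime import datetime, timezone

def expand_prefix(prefix: str, batch_id: str, batch_name: str,
                  result_set_id: str) -> str:
    """Illustrates the documented token expansion; SerpWow performs
    this substitution server-side when uploading Result Set files."""
    date = datetime.now(timezone.utc).strftime("%Y_%m_%d")  # UTC, YYYY_MM_DD
    return (prefix
            .replace("%BATCH_ID%", batch_id)
            .replace("%BATCH_NAME%", batch_name)
            .replace("%RESULT_SET_ID%", result_set_id)
            .replace("%DATE%", date))

print(expand_prefix("results/%BATCH_NAME%/%DATE%/%RESULT_SET_ID%",
                    "B123", "daily_rank_check", "42"))
```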

s3_endpoint

required

Applies only when type=s3compatible

The endpoint to use when connecting to S3-compatible storage (i.e. when type=s3compatible), for example https://ams3.digitaloceanspaces.com

s3_region

optional

Applies only when type=s3compatible

The region to use when connecting to S3-compatible storage (i.e. when type=s3compatible), for example eu-west-1, us-east-1, auto, etc.

gcs_access_key

required

Applies only when type=gcs

The interoperable storage access key to use when connecting to Google Cloud Storage. We strongly recommend you set up credentials specifically for SerpWow to use.

gcs_secret_key

required

Applies only when type=gcs

The interoperable storage secret key used when connecting to Google Cloud Storage. We strongly recommend you set up credentials specifically for SerpWow to use.

gcs_bucket_name

required

Applies only when type=gcs

The Google Cloud Storage bucket name to use.

gcs_path_prefix

optional

Applies only when type=gcs

The Google Cloud Storage path prefix to prepend to Batch Result Set files when they are uploaded to your bucket.

The text tokens %BATCH_ID%, %BATCH_NAME%, %RESULT_SET_ID% and %DATE% will be replaced with the Batch ID, Batch name, Result Set ID and UTC date (in the form YYYY_MM_DD) respectively when the files are uploaded.

azure_account_name

required

Applies only when type=azure

The storage account name to use when connecting to Microsoft Azure Blob Storage. We strongly recommend you set up credentials specifically for SerpWow to use.

azure_account_key

required

Applies only when type=azure

The account key used when connecting to Microsoft Azure Blob Storage. We strongly recommend you set up credentials specifically for SerpWow to use.

azure_container_name

required

Applies only when type=azure

The Microsoft Azure Blob Storage container name to use.

azure_path_prefix

optional

Applies only when type=azure

The Microsoft Azure Blob Storage path prefix to prepend to Batch Result Set files when they are uploaded to your bucket.

The text tokens %BATCH_ID%, %BATCH_NAME%, %RESULT_SET_ID% and %DATE% will be replaced with the Batch ID, Batch name, Result Set ID and UTC date (in the form YYYY_MM_DD) respectively when the files are uploaded.

oss_access_key

required

Applies only when type=oss

The RAM user access key to use when connecting to Alibaba Cloud OSS. We highly recommend creating a new Alibaba Cloud RAM user specifically for SerpWow to use. You can do so in your Alibaba Cloud console.

oss_secret_key

required

Applies only when type=oss

The RAM user secret key used when connecting to Alibaba Cloud OSS. We highly recommend creating a new Alibaba Cloud RAM user specifically for SerpWow to use. You can do so in your Alibaba Cloud console.

oss_bucket_name

required

Applies only when type=oss

The Alibaba Cloud OSS bucket name to use.

oss_region_id

required

Applies only when type=oss

The Alibaba Cloud OSS Region ID to use when connecting to your bucket.

oss-cn-hangzhou - China (Hangzhou)

oss-cn-shanghai - China (Shanghai)

oss-cn-qingdao - China (Qingdao)

oss-cn-beijing - China (Beijing)

oss-cn-zhangjiakou - China (Zhangjiakou)

oss-cn-huhehaote - China (Hohhot)

oss-cn-wulanchabu - China (Ulanqab)

oss-cn-shenzhen - China (Shenzhen)

oss-cn-heyuan - China (Heyuan)

oss-cn-guangzhou - China (Guangzhou)

oss-cn-chengdu - China (Chengdu)

oss-cn-hongkong - China (Hong Kong)

oss-us-west-1 - US (Silicon Valley)

oss-us-east-1 - US (Virginia)

oss-ap-southeast-1 - Singapore

oss-ap-southeast-2 - Australia (Sydney)

oss-ap-southeast-3 - Malaysia (Kuala Lumpur)

oss-ap-southeast-5 - Indonesia (Jakarta)

oss-ap-northeast-1 - Japan (Tokyo)

oss-ap-south-1 - India (Mumbai)

oss-eu-central-1 - Germany (Frankfurt)

oss-eu-west-1 - UK (London)

oss-me-east-1 - UAE (Dubai)

oss_path_prefix

optional

Applies only when type=oss

The Alibaba Cloud OSS path prefix to prepend to Batch Result Set files when they are uploaded to your bucket.

The text tokens %BATCH_ID%, %BATCH_NAME%, %RESULT_SET_ID% and %DATE% will be replaced with the Batch ID, Batch name, Result Set ID and UTC date (in the form YYYY_MM_DD) respectively when the files are uploaded.



Next Steps: List Destinations, Update Destination

Updated 14 Aug 2024