Crawler
Endpoints
- https://api.eu.cloud.talend.com: URL for the AWS Europe region
- https://api.ap.cloud.talend.com: URL for the AWS Asia Pacific region
- https://api.us.cloud.talend.com: URL for the AWS United States East region
- https://api.au.cloud.talend.com: URL for the AWS Australia region
- https://api.us-west.cloud.talend.com: URL for the Azure United States West region
Security scheme
This scheme can be referenced across the API
BearerAuthentication
Name | Description |
---|---|
Format | Bearer <TOKEN> |
Name | Description | Type | Attributes and examples |
---|---|---|---|
Authorization | The authorization token (PAT, SAT or JWT) | string Required | Bearer 5eiL8JGlRse48puLamFyXGvp9U4aB2eMqlBZPtDNG5MjSGZpdWlvjvioVbUvEsz3 |
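As an illustration, here is a minimal sketch of supplying the bearer token with Python's requests library; the region URL and token value are placeholders you must replace with your own.
```python
import requests

# Placeholders (assumptions): pick your region endpoint and use your own
# personal access token (PAT), service account token (SAT) or JWT.
BASE_URL = "https://api.us.cloud.talend.com"
TOKEN = "<YOUR_TOKEN>"

# Every request carries the same Authorization header in the "Bearer <TOKEN>" format.
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

resp = requests.get(f"{BASE_URL}/connections/crawlers", headers=HEADERS)
resp.raise_for_status()
print(resp.json()["total"], "crawlers found")
```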
Retrieve all the crawlers of a tenant
GET /connections/crawlers
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
limit | Not used | integer Optional | INT32 |
offset | Not used | integer Optional | INT32 |
talendVersion | Default version of the API | string Optional | |
includeDeleted | If true, also returns the crawlers that have been deleted | boolean Optional | |
connectionId | Use this option to retrieve the crawler linked to a connection | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
The response payload contains the list of returned crawlers. A crawler contains three parts:
- the sharing set, which indicates with whom the generated datasets are shared
- the status of the crawler
- the datasets created by the crawler
{
"data": [
{
"id": "59451bf0-a81a-11eb-bcbc-0242ac130002",
"connectionId": "d54a8f03-7906-4930-a7cc-4eb90e968f89",
"name": "Crawler1",
"description": "Description du crawler 1",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "NotStarted"
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Table1",
"Table2",
"Table3",
"View1"
]
},
{
"id": "3a45cb46-a81a-11eb-bcbc-0242ac130002",
"connectionId": "165ea830-e003-11eb-ba80-0242ac130004",
"name": "Crawler2",
"description": "Description du crawler 2",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "RetrievingProperties",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516"
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
},
{
"id": "108fb1c2-a81a-11eb-bcbc-0242ac130002",
"connectionId": "1db0db6c-e003-11eb-ba80-0242ac130004",
"name": "Crawler3",
"description": "Description du crawler 3",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "PropertiesRetrievalFailed",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"failure": "cannot generate dataset properties"
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
},
{
"id": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"connectionId": "2695204e-e003-11eb-ba80-0242ac130004",
"name": "Crawler4",
"description": "Description du crawler 4",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "CreatingDatasets",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516"
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
},
{
"id": "7c7f7872-a81a-11eb-bcbc-0242ac130002",
"connectionId": "2e46ed68-e003-11eb-ba80-0242ac130004",
"name": "Crawler5",
"description": "Description du crawler 5",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "Finished",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"runFinishedAt": "2021-01-08T15:41:29.263Z"
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
],
"offset": 0,
"limit": 0,
"total": 5
}
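The sketch below is a non-authoritative example of this call: it lists the crawlers of a tenant and uses the optional query parameters to filter on one connection; the region URL, token and connection ID are placeholders.
```python
import requests

BASE_URL = "https://api.us.cloud.talend.com"          # assumption: US region
HEADERS = {"Authorization": "Bearer <YOUR_TOKEN>"}     # placeholder token

# Optional query parameters: restrict the list to one connection and
# include crawlers that have been deleted.
params = {
    "connectionId": "d54a8f03-7906-4930-a7cc-4eb90e968f89",  # placeholder ID
    "includeDeleted": "true",
}

resp = requests.get(f"{BASE_URL}/connections/crawlers", headers=HEADERS, params=params)
resp.raise_for_status()

for crawler in resp.json().get("data", []):
    print(crawler["id"], crawler["name"], crawler["status"]["runStatus"])
```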
Create a new crawler
POST /connections/crawlers
At this time, a crawler can only be created on a JDBC connection. You can only have one active crawler per connection; an active crawler is a crawler that has not been deleted. When the user runs the crawler, datasets are created from the tables and views of the JDBC connection.
Known limitations:
- Max objects limit: we recommend selecting fewer than 1000 tables/views. Beyond this limit, you may encounter issues when launching the run endpoint.
- Max datasets limit: the maximum number of datasets a user can have is 1500. Beyond this limit, you may encounter timeouts when calling the dataset endpoint that lists them all for a user. Consequently, when configuring a crawler, make sure that you will not exceed this limit after the crawler has run.
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
{
"connectionId": "d54a8f03-7906-4930-a7cc-4eb90e968f89",
"name": "Crawler - JDBC",
"selectedDatasets": [
"accounts",
"orders",
"items"
],
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
]
}
Response
Status 201
{
"id": "ac6e2117-fbb5-442a-bb02-cefabbf04516"
}
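As a hedged illustration of this call, the following sketch posts the request body shown above and reads the crawler ID from the 201 response; the region URL, token, connection ID and SCIM ID are placeholders.
```python
import requests

BASE_URL = "https://api.us.cloud.talend.com"          # assumption: US region
HEADERS = {"Authorization": "Bearer <YOUR_TOKEN>"}     # placeholder token

# Body mirrors the CreateCrawlerRequest example above (placeholder IDs).
payload = {
    "connectionId": "d54a8f03-7906-4930-a7cc-4eb90e968f89",
    "name": "Crawler - JDBC",
    "selectedDatasets": ["accounts", "orders", "items"],
    "sharings": [
        {
            "scimType": "user",
            "scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
            "level": "OWNER",
        }
    ],
}

resp = requests.post(f"{BASE_URL}/connections/crawlers", headers=HEADERS, json=payload)
resp.raise_for_status()                 # expects 201 Created
crawler_id = resp.json()["id"]
print("Created crawler", crawler_id)
```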
Update the tables and views selection of an existing crawler
PUT /connections/crawlers/{crawlerId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
Update the name and description of a crawler
PATCH /connections/crawlers/{crawlerId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
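For instance, a minimal sketch of renaming a crawler with a PatchCrawlerRequest body (both fields are optional); the IDs and values are placeholders.
```python
import requests

BASE_URL = "https://api.us.cloud.talend.com"          # assumption: US region
HEADERS = {"Authorization": "Bearer <YOUR_TOKEN>"}     # placeholder token
CRAWLER_ID = "ac6e2117-fbb5-442a-bb02-cefabbf04516"    # placeholder crawler ID

# PatchCrawlerRequest: send only the fields you want to change.
patch = {
    "name": "Crawler - JDBC (renamed)",
    "description": "Nightly crawl of the sales schema",
}

resp = requests.patch(f"{BASE_URL}/connections/crawlers/{CRAWLER_ID}", headers=HEADERS, json=patch)
resp.raise_for_status()                 # expects 200 OK
```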
Delete a crawler
DELETE /connections/crawlers/{crawlerId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 204
Get a crawler by its ID
GET /connections/crawlers/{crawlerId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
includeDeleted | If true, also returns the crawlers that have been deleted | boolean Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
The payload contains all the crawler data:
- name and description
- selected tables/views
- sharing set
{
"id": "59451bf0-a81a-11eb-bcbc-0242ac130002",
"connectionId": "d54a8f03-7906-4930-a7cc-4eb90e968f89",
"name": "Crawler1",
"description": "Description du crawler 1",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "NotStarted",
"nbDatasetsToCrawl": 0,
"nbDatasetsFinished": 0,
"nbDatasetsCreated": 0,
"nbDatasetsFailed": 0,
"nbSamplesFailed": 0
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
{
"id": "3a45cb46-a81a-11eb-bcbc-0242ac130002",
"connectionId": "165ea830-e003-11eb-ba80-0242ac130004",
"name": "Crawler2",
"description": "Description du crawler 2",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "RetrievingProperties",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"nbDatasetsToCrawl": 0,
"nbDatasetsFinished": 0,
"nbDatasetsCreated": 0,
"nbDatasetsFailed": 0,
"nbSamplesFailed": 0
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
{
"id": "108fb1c2-a81a-11eb-bcbc-0242ac130002",
"connectionId": "1db0db6c-e003-11eb-ba80-0242ac130004",
"name": "Crawler3",
"description": "Description du crawler 3",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "PropertiesRetrievalFailed",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"failure": "cannot generate dataset properties",
"nbDatasetsToCrawl": 0,
"nbDatasetsFinished": 0,
"nbDatasetsCreated": 0,
"nbDatasetsFailed": 0,
"nbSamplesFailed": 0
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
{
"id": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"connectionId": "2695204e-e003-11eb-ba80-0242ac130004",
"name": "Crawler4",
"description": "Description du crawler 4",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "CreatingDatasets",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"nbDatasetsToCrawl": 0,
"nbDatasetsFinished": 0,
"nbDatasetsCreated": 0,
"nbDatasetsFailed": 0,
"nbSamplesFailed": 0
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
{
"id": "7c7f7872-a81a-11eb-bcbc-0242ac130002",
"connectionId": "2e46ed68-e003-11eb-ba80-0242ac130004",
"name": "Crawler5",
"description": "Description du crawler 5",
"sharings": [
{
"scimType": "user",
"scimId": "b8a78dcb-65b4-4823-ad76-88720fc6309e",
"level": "OWNER"
},
{
"scimType": "group",
"scimId": "877f89dc-709b-4ef1-8d0e-a851f67a065a",
"level": "READER"
},
{
"scimType": "user",
"scimId": "bd4c7ae4-a1df-4702-845e-11946fa07d85",
"level": "WRITER"
}
],
"status": {
"runStatus": "Finished",
"runStartedAt": "2021-01-08T15:41:29.263Z",
"runBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"runFinishedAt": "2021-01-08T15:41:29.263Z",
"nbDatasetsToCrawl": 0,
"nbDatasetsFinished": 0,
"nbDatasetsCreated": 0,
"nbDatasetsFailed": 0,
"nbSamplesFailed": 0
},
"createdAt": "2021-01-08T15:41:29.263Z",
"createdBy": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"crawledDatasets": [
"Dataset 1",
"Dataset 2",
"Dataset 3",
"Dataset 4"
]
}
Run a crawler
POST /connections/crawlers/{crawlerId}/run
This endpoint allows you to start the crawler. When this endpoint is called, the crawler relies on its configuration to retrieve all the selected tables and views and turn them into datasets. Once the datasets are created, the crawler also retrieves their samples.
You can launch the crawler as many times as you want. Running a crawler once creates the datasets; running it again only refreshes the samples of the existing datasets.
Known limitations:
- Max objects limit: we recommend selecting fewer than 1000 tables/views. Beyond this limit, you may encounter issues when launching the run endpoint.
- Max datasets limit: the maximum number of datasets a user can have is 1500. Beyond this limit, you may encounter timeouts when calling the dataset endpoint that lists them all for a user. Consequently, when configuring a crawler, make sure that you will not exceed this limit after the crawler has run.
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 202
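As a rough sketch (not an official client), the snippet below starts a run and then polls the crawler by its ID until the run reaches what we assume are terminal statuses (Finished, Ended, PropertiesRetrievalFailed); the crawler ID is a placeholder and the polling interval is arbitrary.
```python
import time
import requests

BASE_URL = "https://api.us.cloud.talend.com"          # assumption: US region
HEADERS = {"Authorization": "Bearer <YOUR_TOKEN>"}     # placeholder token
CRAWLER_ID = "ac6e2117-fbb5-442a-bb02-cefabbf04516"    # placeholder crawler ID

# Start the run; the API answers 202 Accepted and processes asynchronously.
run = requests.post(f"{BASE_URL}/connections/crawlers/{CRAWLER_ID}/run", headers=HEADERS)
run.raise_for_status()

# Poll the crawler until its run status reaches a terminal state.
while True:
    resp = requests.get(f"{BASE_URL}/connections/crawlers/{CRAWLER_ID}", headers=HEADERS)
    resp.raise_for_status()
    status = resp.json()["status"]
    print(status["runStatus"], "-", status.get("nbDatasetsCreated", 0), "datasets created")
    # Assumption: Finished, Ended and PropertiesRetrievalFailed are the terminal statuses.
    if status["runStatus"] in ("Finished", "Ended", "PropertiesRetrievalFailed"):
        break
    time.sleep(60)   # runs can take hours; poll sparingly
```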
End a crawler while it is running
POST /connections/crawlers/{crawlerId}/end
This endpoint allows you to stop a crawler while it is running. After launching a crawler, the run can take up to a few hours to complete, depending on the number of objects you selected. You may want to stop the run for many reasons, for instance if you notice that the crawler was created on the wrong connection.
Stopping a crawler does not cancel it: the datasets that have already been created are not deleted. If you want to clean them up, you can use the faceted search to retrieve the datasets created by the crawler and delete them.
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 202
Get the error log file
GET /connections/crawlers/{crawlerId}/errors.log
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
Name | Description | Type | Attributes and examples |
---|---|---|---|
Content-Disposition | | string Required | |
Retrieve the result of the last scan
GET /connections/scan/{connectionId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
connectionId | The technical talend ID of the connection | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 200
{
"id": "ac6e2117-fbb5-442a-bb02-cefabbf04516",
"lastScan": "2021-01-08T15:41:29.263Z",
"results": [
{
"displayName": "view1",
"technicalName": "view1",
"metadata": {
"type": "VIEW"
}
},
{
"displayName": "table1",
"technicalName": "table1",
"metadata": {
"type": "TABLE"
}
}
]
}
Scan a JDBC connection
POST /connections/scan/{connectionId}
Request
Name | Description | Type | Attributes and examples |
---|---|---|---|
connectionId | The technical talend ID of the connection | string Required | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talendVersion | Default version of the API | string Optional | |
Name | Description | Type | Attributes and examples |
---|---|---|---|
talend-version | Default version of the API | string Optional | |
Response
Status 204
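To illustrate the two scan endpoints together, here is a non-authoritative sketch that triggers a scan of a JDBC connection and then reads back the result of the last scan; the connection ID is a placeholder, and a freshly triggered scan may need some time before its results are refreshed.
```python
import requests

BASE_URL = "https://api.us.cloud.talend.com"              # assumption: US region
HEADERS = {"Authorization": "Bearer <YOUR_TOKEN>"}         # placeholder token
CONNECTION_ID = "d54a8f03-7906-4930-a7cc-4eb90e968f89"     # placeholder connection ID

# Trigger a scan of the JDBC connection (204 No Content on success).
requests.post(f"{BASE_URL}/connections/scan/{CONNECTION_ID}", headers=HEADERS).raise_for_status()

# Retrieve the result of the last scan: the tables and views found on the connection.
scan = requests.get(f"{BASE_URL}/connections/scan/{CONNECTION_ID}", headers=HEADERS).json()
print("Last scan:", scan.get("lastScan"))
for item in scan.get("results", []):
    print(item["metadata"]["type"], item["technicalName"])
```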
CreateCrawlerRequest
Name | Description | Type | Attributes and examples |
---|---|---|---|
connectionId | The technical talend ID of the connection | string Required | |
name | Name of the crawler | string Required | |
description | Description of the crawler | string Optional | |
selectedDatasets | | array of string Optional | |
sharings | | array of Sharing Optional | |
Sharing
Name | Description | Type | Attributes and examples |
---|---|---|---|
scimType | Indicates whether the sharing targets a USER or a GROUP | string Required | |
scimId | The SCIM ID of the USER or GROUP | string Required | |
level | The level of the sharing | string Required | |
NotAuthenticated
Name | Description | Type | Attributes and examples |
---|---|---|---|
message | string Required | ||
cause | string Optional |
AlreadyExist
Name | Description | Type | Attributes and examples |
---|---|---|---|
connectionId | The technical talend ID of the connection | string Required | |
i18nMsg | string Optional |
ServerError
Name | Description | Type | Attributes and examples |
---|---|---|---|
message | string Required | ||
cause | string Optional |
CreateCrawlerResponse
Name | Description | Type | Attributes and examples |
---|---|---|---|
id | The technical talend ID of the crawler that has been created | string Required |
NotFound
Name | Description | Type | Attributes and examples |
---|---|---|---|
entityId | The technical talend ID of the entity | string Required | |
entityType | The talend type of the entity | EntityType Required | |
i18nMsg | string Optional |
EntityType
CrawlerModel
CrawlerComplete
Name | Description | Type | Attributes and examples |
---|---|---|---|
id | The technical talend ID of the crawler | string Required | |
connectionId | The technical talend ID of the connection | string Required | |
name | Name of the crawler | string Required | |
description | Description of the crawler | string Optional | |
sharings | | array of Sharing Optional | |
status | The status of the crawler | Status Required | |
createdAt | The date when the crawler has been created | datetime Required | RFC3339 |
createdBy | Technical ID of the talend user | string Required | |
crawledDatasets | | array of string Optional | |
updateAt | The date when the crawler has been updated | datetime Optional | RFC3339 |
updatedBy | Technical ID of the talend user | string Optional | |
deletedAt | The date when the crawler has been deleted | datetime Optional | RFC3339 |
deletedBy | Technical ID of the talend user | string Optional | |
Status
Name | Description | Type | Attributes and examples |
---|---|---|---|
runStatus | The running status of the crawler | RunStatus Required | |
runStartedAt | The date when the run has started | datetime Optional | RFC3339 |
runBy | Technical ID of the talend user | string Optional | |
runFinishedAt | The date when the run has finished | datetime Optional | RFC3339 |
failure | | string Optional | |
nbDatasetsToCrawl | Number of datasets to retrieve | integer Required | INT32 |
nbDatasetsFinished | Number of datasets already retrieved | integer Required | INT32 |
nbDatasetsCreated | Number of datasets that have been created | integer Required | INT32 |
nbDatasetsFailed | Number of datasets that have not been created | integer Required | INT32 |
nbSamplesFailed | Number of samples that have not been created | integer Required | INT32 |
RunStatus
CreatingDatasets
Ended
Finished
NotStarted
PropertiesRetrievalFailed
RetrievingProperties
CrawlerLight
Name | Description | Type | Attributes and examples |
---|---|---|---|
id | The technical talend ID of the crawler | string Required | |
name | The name of the crawler | string Required | |
connectionId | The technical talend ID of the connection | string Required | |
runStatus | The status of the crawler | RunStatus Required | |
deletedAt | The date when the crawler has been deleted | datetime Optional | RFC3339 |
AlreadyRunning
Name | Description | Type | Attributes and examples |
---|---|---|---|
crawlerId | The technical talend ID of the crawler | string Required | |
i18nMsg | string Optional |
PaginatedResources_CrawlerModel
Name | Description | Type | Attributes and examples |
---|---|---|---|
data | | array of CrawlerModel Optional | |
offset | Pagination offset | integer Required | INT32 |
limit | Pagination limit | integer Required | INT32 |
total | Total number of crawlers | integer Required | INT32 |
UpdateCrawlerRequest
Name | Description | Type | Attributes and examples |
---|---|---|---|
name | New name of the crawler | string Required | |
description | New description of the crawler | string Optional | |
selectedDatasets | | array of string Optional | |
PatchCrawlerRequest
Name | Description | Type | Attributes and examples |
---|---|---|---|
name | New name of the crawler | string Optional | |
description | New description of the crawler | string Optional |
PaginatedResources_CrawledDataset
Name | Description | Type | Attributes and examples |
---|---|---|---|
data | | array of CrawledDataset Optional | |
offset | Pagination offset | integer Required | INT32 |
limit | Pagination limit | integer Required | INT32 |
total | Total number of crawled datasets | integer Required | INT32 |
CrawledDataset
Name | Description | Type | Attributes and examples |
---|---|---|---|
id | The technical ID of the object selected by the crawler | string Required | |
crawlerId | The technical talend ID of the crawler | string Required | |
datasetId | The technical ID of the dataset generated by the crawler | string Optional | |
displayName | Display name of the table or view retrieved by the crawler | string Required | |
technicalName | Technical name of the table or view retrieved by the crawler | string Required | |
metadata | Indicates whether it is a TABLE or a VIEW | Metadata_infos Required | |
exportStatus | Contains the status of the dataset in the crawler context | ExportStatus Required | |
failure | Indicates whether a failure was encountered while crawling this dataset | string Optional | |
lastUpdate | Date of the last time this dataset has been refreshed | datetime Required | RFC3339 |
Metadata_infos
ExportStatus
ConnectionScan
Name | Description | Type | Attributes and examples |
---|---|---|---|
id | The technical talend ID of the connection | string Required | |
lastScan | Date of the last time this connection has been scanned | datetime Required | RFC3339 |
results | | array of ScannedDataset Optional | |
ScannedDataset
Name | Description | Type | Attributes and examples |
---|---|---|---|
displayName | Display name of the table or view found on the connection | string Required | |
technicalName | Technical name of the table or view found on the connection | string Required | |
metadata | Indicates whether this is a TABLE or a VIEW | Metadata_infos Required | |