Personalize / Client / list_batch_inference_jobs
list_batch_inference_jobs
- Personalize.Client.list_batch_inference_jobs(**kwargs)
- Gets a list of the batch inference jobs that have been created from a solution version. - See also: AWS API Documentation

- Request Syntax

```python
response = client.list_batch_inference_jobs(
    solutionVersionArn='string',
    nextToken='string',
    maxResults=123
)
```

- Parameters:
- solutionVersionArn (string) – The Amazon Resource Name (ARN) of the solution version from which the batch inference jobs were created. 
- nextToken (string) – The token to request the next page of results. 
- maxResults (integer) – The maximum number of batch inference job results to return in each page. The default value is 100. 
 
- Return type:
- dict 
- Returns:
- Response Syntax

```python
{
    'batchInferenceJobs': [
        {
            'batchInferenceJobArn': 'string',
            'jobName': 'string',
            'status': 'string',
            'creationDateTime': datetime(2015, 1, 1),
            'lastUpdatedDateTime': datetime(2015, 1, 1),
            'failureReason': 'string',
            'solutionVersionArn': 'string',
            'batchInferenceJobMode': 'BATCH_INFERENCE'|'THEME_GENERATION'
        },
    ],
    'nextToken': 'string'
}
```

- Response Structure
- (dict) –
- batchInferenceJobs (list) – A list containing information on each job that is returned.
- (dict) – A truncated version of the BatchInferenceJob. The ListBatchInferenceJobs operation returns a list of batch inference job summaries.
- batchInferenceJobArn (string) – The Amazon Resource Name (ARN) of the batch inference job.
- jobName (string) – The name of the batch inference job.
- status (string) – The status of the batch inference job. The status is one of the following values:
  - PENDING
  - IN PROGRESS
  - ACTIVE
  - CREATE FAILED
- creationDateTime (datetime) – The time at which the batch inference job was created.
- lastUpdatedDateTime (datetime) – The time at which the batch inference job was last updated.
- failureReason (string) – If the batch inference job failed, the reason for the failure.
- solutionVersionArn (string) – The ARN of the solution version used by the batch inference job.
- batchInferenceJobMode (string) – The job’s mode.
 
 
- nextToken (string) – The token to use to retrieve the next page of results. The value is `null` when there are no more results to return.
 
 
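To show how the fields above fit together, here is a minimal sketch of reading a response shaped like the Response Syntax. All ARNs, job names, timestamps, and statuses are made-up sample values for illustration, not output from a real account:

```python
from datetime import datetime

# Hypothetical response shaped like the Response Syntax above;
# every value here is an invented example.
response = {
    'batchInferenceJobs': [
        {
            'batchInferenceJobArn': 'arn:aws:personalize:us-east-1:123456789012:batch-inference-job/job-a',
            'jobName': 'job-a',
            'status': 'ACTIVE',
            'creationDateTime': datetime(2015, 1, 1),
            'lastUpdatedDateTime': datetime(2015, 1, 1),
            'failureReason': '',
            'solutionVersionArn': 'arn:aws:personalize:us-east-1:123456789012:solution/my-solution/1',
            'batchInferenceJobMode': 'BATCH_INFERENCE',
        },
        {
            'batchInferenceJobArn': 'arn:aws:personalize:us-east-1:123456789012:batch-inference-job/job-b',
            'jobName': 'job-b',
            'status': 'CREATE FAILED',
            'creationDateTime': datetime(2015, 1, 2),
            'lastUpdatedDateTime': datetime(2015, 1, 2),
            'failureReason': 'Input data could not be read.',
            'solutionVersionArn': 'arn:aws:personalize:us-east-1:123456789012:solution/my-solution/1',
            'batchInferenceJobMode': 'BATCH_INFERENCE',
        },
    ],
    'nextToken': 'string',
}

# Collect the names of jobs that completed successfully (status ACTIVE).
active_jobs = [job['jobName']
               for job in response['batchInferenceJobs']
               if job['status'] == 'ACTIVE']
print(active_jobs)  # → ['job-a']
```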
- Exceptions
- Personalize.Client.exceptions.InvalidInputException
- Personalize.Client.exceptions.InvalidNextTokenException
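Because results are paginated via `nextToken`, a common pattern is to loop until no token is returned. The sketch below is not part of the API reference: the helper name `list_all_batch_inference_jobs` is ours, and in real use `client` would be `boto3.client('personalize')`:

```python
def list_all_batch_inference_jobs(client, solution_version_arn, page_size=100):
    """Collect batch inference job summaries across all pages.

    `client` is anything exposing list_batch_inference_jobs with the
    request/response shape documented above, e.g. boto3.client('personalize').
    """
    jobs = []
    kwargs = {'solutionVersionArn': solution_version_arn,
              'maxResults': page_size}
    while True:
        response = client.list_batch_inference_jobs(**kwargs)
        jobs.extend(response.get('batchInferenceJobs', []))
        token = response.get('nextToken')
        if not token:  # no nextToken means this was the last page
            break
        kwargs['nextToken'] = token
    return jobs
```

In practice you would also catch `Personalize.Client.exceptions.InvalidNextTokenException` if tokens might be stale; boto3 additionally offers paginators for most `list_*` operations, which accomplish the same loop for you.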