create_prompt
- AgentsforBedrock.Client.create_prompt(**kwargs)
Creates a prompt in your prompt library that you can add to a flow. For more information, see Prompt management in Amazon Bedrock, Create a prompt using Prompt management and Prompt flows in Amazon Bedrock in the Amazon Bedrock User Guide.
See also: AWS API Documentation
Request Syntax
response = client.create_prompt(
    clientToken='string',
    customerEncryptionKeyArn='string',
    defaultVariant='string',
    description='string',
    name='string',
    tags={
        'string': 'string'
    },
    variants=[
        {
            'additionalModelRequestFields': {...}|[...]|123|123.4|'string'|True|None,
            'inferenceConfiguration': {
                'text': {
                    'maxTokens': 123,
                    'stopSequences': [
                        'string',
                    ],
                    'temperature': ...,
                    'topP': ...
                }
            },
            'metadata': [
                {
                    'key': 'string',
                    'value': 'string'
                },
            ],
            'modelId': 'string',
            'name': 'string',
            'templateConfiguration': {
                'text': {
                    'inputVariables': [
                        {
                            'name': 'string'
                        },
                    ],
                    'text': 'string'
                }
            },
            'templateType': 'TEXT'
        },
    ]
)
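For example, a minimal call that creates a prompt with one TEXT variant might look like the following sketch. The client name, prompt and variant names, tags, model ID, and the {{inputText}} placeholder syntax are illustrative assumptions, not values taken from this reference.

import boto3

# A minimal sketch, assuming the Agents for Amazon Bedrock build-time client
# ('bedrock-agent') and placeholder names, model ID, and template text.
client = boto3.client('bedrock-agent', region_name='us-east-1')

response = client.create_prompt(
    name='summarizer-prompt',                  # hypothetical prompt name
    description='Summarizes a block of text.',
    defaultVariant='variantOne',               # must match a variant name below
    tags={'project': 'docs-demo'},             # hypothetical tag
    variants=[
        {
            'name': 'variantOne',
            'templateType': 'TEXT',
            'templateConfiguration': {
                'text': {
                    # The {{inputText}} placeholder syntax is assumed here.
                    'text': 'Summarize the following text: {{inputText}}',
                    'inputVariables': [{'name': 'inputText'}],
                }
            },
            'modelId': 'anthropic.claude-3-haiku-20240307-v1:0',  # placeholder model ID
            'inferenceConfiguration': {
                'text': {'maxTokens': 512, 'temperature': 0.2, 'topP': 0.9}
            },
            'metadata': [{'key': 'owner', 'value': 'docs-team'}],  # hypothetical metadata tag
        }
    ],
)
print(response['id'], response['version'])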
- Parameters:
clientToken (string) –
A unique, case-sensitive identifier to ensure that the API request completes no more than one time. If this token matches a previous request, Amazon Bedrock ignores the request, but does not return an error. For more information, see Ensuring idempotency.
This field is autopopulated if not provided. (An idempotent-retry sketch using this token appears after the parameter list.)
customerEncryptionKeyArn (string) – The Amazon Resource Name (ARN) of the KMS key to encrypt the prompt.
defaultVariant (string) – The name of the default variant for the prompt. This value must match the name field in the relevant PromptVariant object.
description (string) – A description for the prompt.
name (string) –
[REQUIRED]
A name for the prompt.
tags (dict) –
Any tags that you want to attach to the prompt. For more information, see Tagging resources in Amazon Bedrock.
(string) –
(string) –
variants (list) –
A list of objects, each containing details about a variant of the prompt.
(dict) –
Contains details about a variant of the prompt.
additionalModelRequestFields (document) –
Contains model-specific inference configurations that aren’t in the inferenceConfiguration field. To see model-specific inference parameters, see Inference request parameters and response fields for foundation models.
inferenceConfiguration (dict) –
Contains inference configurations for the prompt variant.
Note
This is a Tagged Union structure. Only one of the following top level keys can be set: text.
text (dict) –
Contains inference configurations for a text prompt.
maxTokens (integer) –
The maximum number of tokens to return in the response.
stopSequences (list) –
A list of strings that define sequences after which the model will stop generating.
(string) –
temperature (float) –
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
topP (float) –
The percentage of most-likely candidates that the model considers for the next token.
metadata (list) –
An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant. For more information, see Create a prompt using Prompt management.
(dict) –
Contains a key-value pair that defines a metadata tag and value to attach to a prompt variant. For more information, see Create a prompt using Prompt management.
key (string) – [REQUIRED]
The key of a metadata tag for a prompt variant.
value (string) – [REQUIRED]
The value of a metadata tag for a prompt variant.
modelId (string) –
The unique identifier of the model or inference profile with which to run inference on the prompt.
name (string) – [REQUIRED]
The name of the prompt variant.
templateConfiguration (dict) – [REQUIRED]
Contains configurations for the prompt template.
Note
This is a Tagged Union structure. Only one of the following top level keys can be set: text.
text (dict) –
Contains configurations for the text in a message for a prompt.
inputVariables (list) –
An array of the variables in the prompt template.
(dict) –
Contains information about a variable in the prompt.
name (string) –
The name of the variable.
text (string) – [REQUIRED]
The message for the prompt.
templateType (string) – [REQUIRED]
The type of prompt template to use.
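As noted in the clientToken parameter above, supplying your own token lets you retry a failed request without creating a duplicate prompt. The sketch below shows one way to do that; the prompt and variant definitions reuse the same illustrative placeholders as the earlier example.

import uuid

import boto3

client = boto3.client('bedrock-agent')  # assumed service name

# Reusing one clientToken across retries makes the request idempotent: if the
# first attempt already created the prompt, Amazon Bedrock ignores the
# duplicate request instead of returning an error.
request = dict(
    clientToken=str(uuid.uuid4()),
    name='summarizer-prompt',            # hypothetical name
    defaultVariant='variantOne',
    variants=[{
        'name': 'variantOne',
        'templateType': 'TEXT',
        'templateConfiguration': {
            'text': {
                'text': 'Summarize the following text: {{inputText}}',
                'inputVariables': [{'name': 'inputText'}],
            }
        },
    }],
)

try:
    response = client.create_prompt(**request)
except client.exceptions.ThrottlingException:
    # Retrying with the identical token cannot create a duplicate prompt.
    response = client.create_prompt(**request)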
- Return type:
dict
- Returns:
Response Syntax
{
    'arn': 'string',
    'createdAt': datetime(2015, 1, 1),
    'customerEncryptionKeyArn': 'string',
    'defaultVariant': 'string',
    'description': 'string',
    'id': 'string',
    'name': 'string',
    'updatedAt': datetime(2015, 1, 1),
    'variants': [
        {
            'additionalModelRequestFields': {...}|[...]|123|123.4|'string'|True|None,
            'inferenceConfiguration': {
                'text': {
                    'maxTokens': 123,
                    'stopSequences': [
                        'string',
                    ],
                    'temperature': ...,
                    'topP': ...
                }
            },
            'metadata': [
                {
                    'key': 'string',
                    'value': 'string'
                },
            ],
            'modelId': 'string',
            'name': 'string',
            'templateConfiguration': {
                'text': {
                    'inputVariables': [
                        {
                            'name': 'string'
                        },
                    ],
                    'text': 'string'
                }
            },
            'templateType': 'TEXT'
        },
    ],
    'version': 'string'
}
Response Structure
(dict) –
arn (string) –
The Amazon Resource Name (ARN) of the prompt.
createdAt (datetime) –
The time at which the prompt was created.
customerEncryptionKeyArn (string) –
The Amazon Resource Name (ARN) of the KMS key that you encrypted the prompt with.
defaultVariant (string) –
The name of the default variant for your prompt.
description (string) –
The description of the prompt.
id (string) –
The unique identifier of the prompt.
name (string) –
The name of the prompt.
updatedAt (datetime) –
The time at which the prompt was last updated.
variants (list) –
A list of objects, each containing details about a variant of the prompt.
(dict) –
Contains details about a variant of the prompt.
additionalModelRequestFields (document) –
Contains model-specific inference configurations that aren’t in the inferenceConfiguration field. To see model-specific inference parameters, see Inference request parameters and response fields for foundation models.
inferenceConfiguration (dict) –
Contains inference configurations for the prompt variant.
Note
This is a Tagged Union structure. Only one of the following top level keys will be set: text. If a client receives an unknown member it will set SDK_UNKNOWN_MEMBER as the top level key, which maps to the name or tag of the unknown member. The structure of SDK_UNKNOWN_MEMBER is as follows: 'SDK_UNKNOWN_MEMBER': {'name': 'UnknownMemberName'}
text (dict) –
Contains inference configurations for a text prompt.
maxTokens (integer) –
The maximum number of tokens to return in the response.
stopSequences (list) –
A list of strings that define sequences after which the model will stop generating.
(string) –
temperature (float) –
Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.
topP (float) –
The percentage of most-likely candidates that the model considers for the next token.
metadata (list) –
An array of objects, each containing a key-value pair that defines a metadata tag and value to attach to a prompt variant. For more information, see Create a prompt using Prompt management.
(dict) –
Contains a key-value pair that defines a metadata tag and value to attach to a prompt variant. For more information, see Create a prompt using Prompt management.
key (string) –
The key of a metadata tag for a prompt variant.
value (string) –
The value of a metadata tag for a prompt variant.
modelId (string) –
The unique identifier of the model or inference profile with which to run inference on the prompt.
name (string) –
The name of the prompt variant.
templateConfiguration (dict) –
Contains configurations for the prompt template.
Note
This is a Tagged Union structure. Only one of the following top level keys will be set: text. If a client receives an unknown member it will set SDK_UNKNOWN_MEMBER as the top level key, which maps to the name or tag of the unknown member. The structure of SDK_UNKNOWN_MEMBER is as follows: 'SDK_UNKNOWN_MEMBER': {'name': 'UnknownMemberName'}
text (dict) –
Contains configurations for the text in a message for a prompt.
inputVariables (list) –
An array of the variables in the prompt template.
(dict) –
Contains information about a variable in the prompt.
name (string) –
The name of the variable.
text (string) –
The message for the prompt.
templateType (string) –
The type of prompt template to use.
version (string) –
The version of the prompt. When you create a prompt, the version created is the DRAFT version.
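A short sketch of consuming this response, including a defensive check for the SDK_UNKNOWN_MEMBER key described in the tagged-union notes above; response is the dict returned by create_prompt in the earlier examples.

prompt_arn = response['arn']
prompt_id = response['id']
prompt_version = response['version']        # 'DRAFT' for a newly created prompt

for variant in response.get('variants', []):
    template = variant['templateConfiguration']
    if 'SDK_UNKNOWN_MEMBER' in template:
        # The service returned a template type this SDK version does not model.
        print('Unknown template member:', template['SDK_UNKNOWN_MEMBER']['name'])
    elif 'text' in template:
        print(variant['name'], '->', template['text']['text'])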
Exceptions
AgentsforBedrock.Client.exceptions.ThrottlingException
AgentsforBedrock.Client.exceptions.AccessDeniedException
AgentsforBedrock.Client.exceptions.ValidationException
AgentsforBedrock.Client.exceptions.InternalServerException
AgentsforBedrock.Client.exceptions.ConflictException
AgentsforBedrock.Client.exceptions.ServiceQuotaExceededException
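These exceptions are modeled on the client, so they can be caught directly from client.exceptions or, more generically, via botocore's ClientError. A hedged sketch, reusing the hypothetical prompt name from the earlier examples:

from botocore.exceptions import ClientError

try:
    response = client.create_prompt(name='summarizer-prompt')  # hypothetical name
except client.exceptions.ConflictException:
    # The request conflicts with the current state of a resource.
    raise
except client.exceptions.ServiceQuotaExceededException:
    # The account has reached a quota for prompts.
    raise
except ClientError as err:
    # Covers the remaining modeled errors (ThrottlingException, AccessDeniedException,
    # ValidationException, InternalServerException).
    print(err.response['Error']['Code'], err.response['Error']['Message'])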