
Send Prompt
Sends a prompt to the LLM and generates a completion.
Deepseek Action
The Deepseek integration in Botize allows you to automate response generation using advanced language models. With this tool, you can send customized prompts and receive accurate completions, optimizing your workflows and enhancing task management efficiency.
Easily configure parameters such as the model to use, temperature, and maximum number of tokens, tailoring content generation to your specific needs. Additionally, you can customize tags and output variables for seamless integration with your existing systems.
Customization Options
Configurable fields you can adjust in your automation
- Rules: run this step only if the following rules are met
- System message
- New user message
- Model
- Temperature
- Max Tokens to use
- Remove quotation marks
- Customize tags
- Rename output variables
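For orientation, the fields above correspond to the parameters of a chat completion request. Below is a minimal sketch in Python, assuming DeepSeek's OpenAI-compatible /chat/completions endpoint; the API key, model name, and prompt text are illustrative placeholders, and in practice Botize fills these values in for you from the step configuration.

```python
import requests

API_KEY = "YOUR_DEEPSEEK_API_KEY"  # placeholder; Botize manages credentials for you
URL = "https://api.deepseek.com/chat/completions"  # assumed OpenAI-compatible endpoint

payload = {
    "model": "deepseek-chat",   # "Model" field
    "temperature": 0.7,         # "Temperature" field
    "max_tokens": 256,          # "Max Tokens to use" field
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},   # "System message"
        {"role": "user", "content": "Write a one-line status update."},  # "New user message"
    ],
}

response = requests.post(
    URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
data = response.json()  # fields of this response become the output tags described below
```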
Information provided
When executed, this operation delivers the following data, which can be used in the same automatic task.
Tags
- Text ({{content}}): The predicted completion. e.g. "This is a test"
- Finish reason ({{finish_reason}}): The reason the generated text terminated. e.g. "length"
- Prompt tokens ({{prompt_tokens}}): Number of tokens in the prompt. 1 token ~= 4 characters in English.
- Completion tokens ({{completion_tokens}}): Number of tokens in the completion. 1 token ~= 4 characters in English.
- Total tokens ({{total_tokens}}): Prompt tokens + Completion tokens.
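To show how these tags map onto the response of an OpenAI-compatible chat completion call, here is a short sketch; the sample_response dictionary and its values are made up purely for illustration.

```python
# Illustrative response shaped like an OpenAI-compatible chat completion result;
# the values below are invented for the example.
sample_response = {
    "choices": [
        {"message": {"content": "This is a test"}, "finish_reason": "length"}
    ],
    "usage": {"prompt_tokens": 12, "completion_tokens": 4, "total_tokens": 16},
}

choice = sample_response["choices"][0]
usage = sample_response["usage"]

content = choice["message"]["content"]          # {{content}}: the predicted completion
finish_reason = choice["finish_reason"]         # {{finish_reason}}: e.g. "stop" or "length"
prompt_tokens = usage["prompt_tokens"]          # {{prompt_tokens}}
completion_tokens = usage["completion_tokens"]  # {{completion_tokens}}
total_tokens = usage["total_tokens"]            # {{total_tokens}}

# Total tokens is simply the sum of prompt and completion tokens.
assert total_tokens == prompt_tokens + completion_tokens
```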