
Send Prompt
Sends a prompt to the LLM and generates a completion.
DeepSeek Action
Customization Options
Configurable fields you can adjust in your automation
- Rules
- Run this step only if the following rules are met
- System message
- New user message
- Model
- Temperature
- Max Tokens to use
- Remove quotation marks
- Customize tags
- Rename output variables
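The options above map onto a standard chat-completion request. As a minimal sketch (assuming DeepSeek's OpenAI-compatible chat format; the function name and default values here are illustrative, not part of the product), the payload could be assembled like this:

```python
# Sketch: build a chat-completion request body from the step's
# configurable fields. Defaults and values are illustrative only.
def build_request(system_message, user_message,
                  model="deepseek-chat", temperature=0.7, max_tokens=256):
    """Assemble the JSON payload for a chat-completion call."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_message},  # System message
            {"role": "user", "content": user_message},      # New user message
        ],
        "temperature": temperature,  # lower = more deterministic output
        "max_tokens": max_tokens,    # upper bound on completion length
    }

payload = build_request("You are a helpful assistant.", "Say hello.")
```

The resulting dictionary is what would be serialized as JSON and sent to the completion endpoint.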
Information provided
When executed, this operation returns the following data, which can be used later in the same automated task.
Tags
- Text
{{content}}
Predicted completion.
E.g. "This is a test"
- Finish reason
{{finish_reason}}
Reason why the generated text terminated.
E.g. "length"
- Prompt tokens
{{prompt_tokens}}
Number of tokens in the prompt. 1 token ~= 4 chars in English.
- Completion tokens
{{completion_tokens}}
Number of tokens in the completion. 1 token ~= 4 chars in English.
- Total tokens
{{total_tokens}}
Prompt tokens + completion tokens.
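The tags above come from the fields of the completion response. A sketch of the mapping, using a hand-written example response in the OpenAI-compatible shape DeepSeek follows (the concrete values are invented for illustration):

```python
# Sketch: extract the output tags from a chat-completion response.
# `response` is a hand-written example, not a real API result.
response = {
    "choices": [
        {"message": {"content": "This is a test"}, "finish_reason": "length"}
    ],
    "usage": {"prompt_tokens": 9, "completion_tokens": 5, "total_tokens": 14},
}

choice = response["choices"][0]
tags = {
    "content": choice["message"]["content"],              # {{content}}
    "finish_reason": choice["finish_reason"],             # {{finish_reason}}
    "prompt_tokens": response["usage"]["prompt_tokens"],  # {{prompt_tokens}}
    "completion_tokens": response["usage"]["completion_tokens"],
    "total_tokens": response["usage"]["total_tokens"],    # {{total_tokens}}
}
```

Note that `total_tokens` is simply `prompt_tokens + completion_tokens`, which is why the three usage tags always stay consistent with each other.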