docs/api/llm/inference.md (+1 −1)
```diff
@@ -58,4 +58,4 @@ The codebolt.llm.inference function allows you to send an inference request to a
 question (string): This parameter represents the input question or prompt you want to send to the LLM for inference.
-llmRole (string): This parameter specifies the role or type of Large Language Model (LLM) you want to use for inference. The role determines which variant of the LLM is selected for processing the input question and generating the response. LLMs role can be optional.
+llmRole (string): This parameter specifies the role or type of Large Language Model (LLM) you want to use for inference. The role determines which variant of the LLM is selected for processing the input question and generating the response. LLMs role can be
```
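As a rough illustration of the parameters described in the diff, a call might look like the sketch below. The real `codebolt` package and the exact shape of `codebolt.llm.inference` (argument order, return value) are assumptions here, so a local stub stands in for the library to keep the example self-contained.

```javascript
// Hypothetical stand-in for the codebolt client; the actual package's
// API surface may differ from this sketch.
const codebolt = {
  llm: {
    // inference(question, llmRole): llmRole is optional per the docs,
    // so the stub falls back to a default role when it is omitted.
    inference: async (question, llmRole = "default") => ({
      role: llmRole,
      answer: `stubbed response to: ${question}`,
    }),
  },
};

async function main() {
  // Send a prompt, selecting an LLM variant via the optional llmRole.
  const result = await codebolt.llm.inference(
    "Summarize the release notes",
    "assistant"
  );
  console.log(result.answer);

  // Omitting llmRole uses the default role.
  const fallback = await codebolt.llm.inference("Hello");
  console.log(fallback.role);
}

main();
```

The sketch only shows how the two documented parameters, `question` and the optional `llmRole`, would flow into the call; it makes no claim about the real response schema.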