LLM
class max.entrypoints.llm.LLM(pipeline_config)

A high-level interface for interacting with LLMs.

Parameters:

- pipeline_config (PipelineConfig)
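
As a rough construction sketch (the PipelineConfig import path and the model_path argument shown here are assumptions for illustration, not taken from this page):

from max.entrypoints.llm import LLM
from max.pipelines import PipelineConfig  # import path assumed

# Hypothetical model identifier; substitute any model your MAX install supports.
config = PipelineConfig(model_path="modularai/Llama-3.1-8B-Instruct-GGUF")

# The constructor takes only the pipeline configuration.
llm = LLM(config)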
generate()
generate(prompts, max_new_tokens=100, use_tqdm=True)
Generates text completions for the given prompts.

Parameters:

- prompts – The input prompt or prompts to generate completions for.
- max_new_tokens – The maximum number of new tokens to generate per prompt. Defaults to 100.
- use_tqdm – Whether to display a progress bar during generation. Defaults to True.

Returns:

A list of generated text completions corresponding to each input prompt.

Raises:

- ValueError – If prompts is empty or contains invalid data.
- RuntimeError – If the model fails to generate completions.

Return type:

list[str]
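
A minimal calling sketch, continuing with the hypothetical llm instance constructed above; the arguments follow the signature shown in this section:

prompts = [
    "In the beginning, there was",
    "The three primary colors are",
]

# Generate up to 50 new tokens per prompt; use_tqdm=True displays a progress bar.
responses = llm.generate(prompts, max_new_tokens=50, use_tqdm=True)

# One completion is returned per input prompt, in order.
for prompt, completion in zip(prompts, responses):
    print(f"{prompt!r} -> {completion!r}")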