Hey all, this is from the 2025.1 Release notes:
SuiteScript Generative AI API: New Methods in the N/llm Module
Two new methods are available in the N/llm module:
• llm.evaluatePrompt(options) — Takes the ID of an existing prompt and values for variables used in the prompt and returns the response from the large language model (LLM). You can use this method to evaluate a prompt by providing values for any variables that the prompt uses. The resulting prompt is sent to the LLM, and this method returns the LLM response, similar to the llm.generateText(options) method. When unlimited usage mode is used, this method accepts the OCI configuration parameters. For more information, see the help topic Using Your Own OCI Configuration for SuiteScript Generative AI APIs.
• llm.evaluatePrompt.promise(options) — Provides an asynchronous version of the llm.evaluatePrompt(options) method.
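Based on the description above, a call might look roughly like the following sketch. This is not from the release notes: the prompt ID (custprompt_summarize) and the variable names (customerName, tone) are hypothetical placeholders for a prompt that would have been saved in the account, and the exact option names should be checked against the N/llm module help topic.

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(['N/llm'], (llm) => {
    function onRequest(context) {
        // Sketch only: 'custprompt_summarize' is a hypothetical ID of a
        // saved prompt, and 'customerName' / 'tone' are hypothetical
        // variables assumed to be defined on that prompt.
        const response = llm.evaluatePrompt({
            id: 'custprompt_summarize',
            variables: {
                customerName: 'Acme Co.',
                tone: 'formal'
            }
        });
        // The release notes say the method returns the LLM response,
        // similar to llm.generateText(options).
        context.response.write(response.text);
    }
    return { onRequest };
});
```

The appeal, if it works this way, is that the prompt text lives in one place and each script only supplies the variable values.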
These new methods provide more options to interact with LLMs using SuiteScript Generative AI APIs. For
more information about these APIs, see the help topic SuiteScript 2.x Generative AI APIs.
Does this mean you will be able to save a prompt once and reference it from different scripts?