Exploring the LLM Module in SuiteScript 2.1 with a Chatbot Example
Introduction:
As AI continues to transform industries, integrating natural
language processing into business applications has become increasingly
valuable. NetSuite’s SuiteScript 2.1 now offers the `N/llm` module, enabling
developers to harness the power of large language models directly in NetSuite.
In this blog, we'll dive into the basics of the `llm` module and walk through a
practical example of using it to create a chatbot Suitelet for seamless user
interactions.
What is the `N/llm` Module?
The `N/llm` module in SuiteScript 2.1 brings AI-driven
language generation into NetSuite scripts. This module allows for generating
text responses based on user prompts, making it an excellent tool for
applications like customer service chatbots, content generation, and automated
responses in workflows.
Key Components of the `llm` Module:
- generateText: The primary method for creating responses. It takes a user prompt and, optionally, the chat history, and returns a response object whose `text` property holds the contextually relevant reply.
- ChatRole: This enumeration sets the role (such as `USER` or `CHATBOT`) for each entry in the conversation history, helping the AI model maintain the dialogue context. Both pieces appear together in the short sketch below.
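To see how they fit together, here is a minimal, self-contained Suitelet sketch; the prompt text and history messages are placeholder values used purely for illustration:

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(["N/llm"], (llm) => ({
    onRequest: (context) => {
        // Pass the previous exchange as chat history so the model keeps context.
        const response = llm.generateText({
            prompt: "Can you summarize what SuiteScript is?",
            chatHistory: [
                { role: llm.ChatRole.USER, text: "Hello! Can you help me with NetSuite?" },
                { role: llm.ChatRole.CHATBOT, text: "Of course. What would you like to know?" },
            ],
        });
        // The response object exposes the generated reply through its text property.
        context.response.write(response.text);
    },
}));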
Building a Chatbot with the `llm` Module:
Let’s explore how to build a simple chatbot Suitelet using
the `llm` module, where users can submit questions and receive AI-generated
responses. This chatbot Suitelet will:
- Take user input through a form.
- Send it to the LLM for processing.
- Display the response and maintain the conversation history.
Step-by-Step Guide to the Chatbot Suitelet:
Step 1: We start by defining the Suitelet and importing the `N/ui/serverWidget` and `N/llm` modules. The full script below builds the chat form, reconstructs the conversation history on each POST, and sends the user's prompt to the LLM.
Example:
/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(["N/ui/serverWidget", "N/llm"], (serverWidget, llm) => ({
    onRequest: function (context) {
        // Build the chat form and read how many history fields the previous page carried.
        const form = serverWidget.createForm({ title: "Chat Bot" });
        const historySize = parseInt(context.request.parameters.custpage_num_chats || "0", 10);

        this.createFieldGroup(form);
        const numChats = this.createHiddenField(form, historySize);

        if (context.request.method === "POST") {
            // A prompt was submitted: rebuild the history and ask the LLM for a reply.
            this.handlePostRequest(context, form, numChats, historySize);
        } else {
            // First visit: start with an empty conversation.
            numChats.defaultValue = 0;
        }

        this.createPromptField(form);
        form.addSubmitButton({ label: "Submit" });
        context.response.writePage(form);
    },

    // Single-column field group that holds the whole conversation.
    createFieldGroup: function (form) {
        const fieldgroup = form.addFieldGroup({ id: "fieldgroupid", label: "Chat" });
        fieldgroup.isSingleColumn = true;
    },

    // Hidden integer field that tracks how many history fields are on the page.
    createHiddenField: function (form, historySize) {
        const numChats = form.addField({
            id: "custpage_num_chats",
            type: serverWidget.FieldType.INTEGER,
            container: "fieldgroupid",
            label: "History Size",
        });
        numChats.updateDisplayType({ displayType: serverWidget.FieldDisplayType.HIDDEN });
        numChats.defaultValue = historySize;
        return numChats;
    },

    handlePostRequest: function (context, form, numChats, historySize) {
        // Each round trip adds one user message and one chatbot reply to the history.
        numChats.defaultValue = historySize + 2;
        const chatHistory = this.loadChatHistory(context, form, historySize);
        this.processUserPrompt(context, form, chatHistory);
    },

    // Re-create the earlier messages from the request parameters and collect them
    // (oldest first) so they can be passed to llm.generateText as chat history.
    loadChatHistory: function (context, form, historySize) {
        const chatHistory = [];
        for (let i = historySize - 2; i >= 0; i -= 2) {
            const userMessage = context.request.parameters["custpage_hist" + i];
            const botMessage = context.request.parameters["custpage_hist" + (i + 1)];
            // Shift each existing pair down by two indexes so that 0 and 1
            // stay free for the newest exchange.
            this.createHistoryField(form, "You", userMessage, i + 2);
            this.createHistoryField(form, "ChatBot", botMessage, i + 3);
            chatHistory.push({ role: llm.ChatRole.USER, text: userMessage });
            chatHistory.push({ role: llm.ChatRole.CHATBOT, text: botMessage });
        }
        return chatHistory;
    },

    // Read-only text area that displays one earlier message.
    createHistoryField: function (form, label, message, index) {
        const field = form.addField({
            id: "custpage_hist" + index,
            type: serverWidget.FieldType.TEXTAREA,
            label: label,
            container: "fieldgroupid",
        });
        field.defaultValue = message;
        field.updateDisplayType({ displayType: serverWidget.FieldDisplayType.INLINE });
    },

    // Show the new prompt, send it (with the history) to the LLM, and display the reply.
    processUserPrompt: function (context, form, chatHistory) {
        const prompt = context.request.parameters.custpage_text;

        const promptField = form.addField({
            id: "custpage_hist0",
            type: serverWidget.FieldType.TEXTAREA,
            label: "You",
            container: "fieldgroupid",
        });
        promptField.defaultValue = prompt;
        promptField.updateDisplayType({ displayType: serverWidget.FieldDisplayType.INLINE });

        const result = form.addField({
            id: "custpage_hist1",
            type: serverWidget.FieldType.TEXTAREA,
            label: "ChatBot",
            container: "fieldgroupid",
        });
        result.defaultValue = llm.generateText({
            prompt: prompt,
            chatHistory: chatHistory,
        }).text;
        result.updateDisplayType({ displayType: serverWidget.FieldDisplayType.INLINE });
    },

    // Free-entry text area where the user types the next prompt.
    createPromptField: function (form) {
        form.addField({
            id: "custpage_text",
            type: serverWidget.FieldType.TEXTAREA,
            label: "Prompt",
            container: "fieldgroupid",
        });
    },
}));
Step 2: To test, deploy the Suitelet, open its URL, and engage with
the chatbot by entering various prompts. The chatbot will respond based on its
training, allowing you to create a conversational flow. Each message is
retained, so you can continue the conversation without losing context.
Sample Interaction:
1. Prompt: "Hello! Can you help me with NetSuite?"
2. Response: The bot replies with information on how it can assist.
3. Follow-up Prompt: "Tell me more about SuiteScript API."
4. Response: The bot provides details on SuiteScript.
Conclusion:
The `llm` module is a powerful tool for adding intelligent,
conversational features to NetSuite applications. Using the `llm.generateText`
method, we can generate responses tailored to specific prompts, making it easy
to build dynamic applications like chatbots. Whether for customer support,
FAQs, or more complex interactions, this module opens up new possibilities for
enhancing user experience within NetSuite.
If you're interested in exploring more, consider testing
different prompts, using varied histories, and trying out the other
capabilities of the `llm` module to deepen your understanding of its potential
for dynamic and engaging NetSuite solutions.
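As one direction to experiment with, `llm.generateText` can accept optional tuning settings in addition to `prompt` and `chatHistory`. The sketch below assumes a `modelParameters` option with settings such as `maxTokens` and `temperature`; treat these names as assumptions and confirm them against the N/llm documentation for your NetSuite version before relying on them:

/**
 * @NApiVersion 2.1
 * @NScriptType Suitelet
 */
define(["N/llm"], (llm) => ({
    onRequest: (context) => {
        const response = llm.generateText({
            prompt: "Draft a short welcome message for a new NetSuite user.",
            // Assumed option names: verify modelParameters and its settings
            // against the N/llm documentation for your account.
            modelParameters: {
                maxTokens: 200,   // cap the length of the reply
                temperature: 0.2, // lower values give more deterministic answers
            },
        });
        context.response.write(response.text);
    },
}));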



