# suitescript
l
hi all. I'm new to map reduce processing. I understand the benefit and why to use it. this is my hello world attempt, so I may sound dumb, and I'm looking for direction from the more experienced among you. I have a custom suitelet form; it searches and displays a sublist of sales orders with checkboxes for each line. taking the user's checkbox selections, the SOs are ready to be processed. I can certainly process them sequentially, but that's not ideal. having the data on hand, do I call a separate map reduce script and pass it the array of sales order ids? can a map reduce script take in array variables? if not, how does a suitelet communicate with a map reduce script and pass data to it? thanks!
n
You've got it! You can use the N/task module to schedule a map/reduce task and pass in the array of sales order IDs. From there, in the getInputData stage of the map/reduce, you'll grab the parameter with your IDs, parse them into an array, and return them out of getInputData. Each element in your array will get its own instance of the map stage.
A little example of this, pulled from something I have written. It's actually passing through an object, not an array, but the idea is exactly the same.
Copy code
// requires the N/task module, e.g. define(['N/task'], (task) => { ... })
const createMrTask = (invoiceObject) => {
  const mrTask = task.create({
    taskType: task.TaskType.MAP_REDUCE,
    scriptId: 'customscript_some_script_name',
    // deploymentId is omitted: NetSuite picks the first available deployment
    params: {
      custscript_some_parameter_names: JSON.stringify(invoiceObject)
    }
  });
  // submit() returns the task id, which you can keep for task.checkStatus() later
  return mrTask.submit();
};
šŸ™ 1
l
hi Nathan again. thanks for the confirmation. I'll go this way.
e
Is there a limit on the size of the object passed?
l
in the map reduce function, how do you read invoiceObject?
chatgpt gave me this, but it's not very clear how to read out the array
n
I usually set up the script parameter for the map/reduce task as a long text field, so your limit there is the character limit of the field. I'm sure there is a better way to do it for larger data sets if that ends up being a problem, but it's never been an issue for me, so I've never had to research it.
āœ… 1
Copy code
// requires the N/runtime module (log is available globally in SuiteScript 2.x)
const getInputData = (inputContext) => {
  try {
    const param = JSON.parse(
      runtime.getCurrentScript().getParameter({ name: 'custscript_some_parameter_names' })
    );
    return param;
  } catch (e) {
    log.error({ title: 'error in getInputData', details: e });
  }
};
šŸ™ 1
so we are just using the runtime module to get the data from the parameter, parse it so that it's an object, then returning the object. In your case you would just send the ids as a comma-separated string, then call .split(',') on the parameter to get an array of ids, which you would then return.
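For illustration, here's a minimal sketch of that split-and-return pattern together with a map stage that consumes one id per invocation. The parameter name custscript_so_update_list and the field custbody_example_flag are placeholders, and what you actually do with each SO depends on your use case.
Copy code
// sketch only: assumes define(['N/record', 'N/runtime'], (record, runtime) => { ... })
const getInputData = () => {
  // comma-separated ids set by the suitelet at task creation time
  const raw = runtime.getCurrentScript().getParameter({ name: 'custscript_so_update_list' }); // placeholder name
  return raw ? raw.split(',') : [];
};

const map = (mapContext) => {
  const soId = mapContext.value; // one sales order internal id per map invocation
  // placeholder processing: set an example field; replace with your real update
  record.submitFields({
    type: record.Type.SALES_ORDER,
    id: soId,
    values: { custbody_example_flag: true } // hypothetical field
  });
};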
l
the doc says the limit is 3000 seconds. not gonna happen. 🤣
n
I think long text is 100,000 characters
m
You can also save the data to process into a "pending" folder in the file cabinet. The map/reduce can just search that folder for files and process their contents, then move them to a "processed" folder in the summarize stage. The problem you might find with tasking is that you can only task one concurrent run, and it will error if the suitelet attempts to task again while another is still running. I do it both ways; it depends on the use case and the risk of concurrent requests.
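Very roughly, that file-folder approach could look like this sketch. The folder internal ids are placeholders, and a real version would track exactly which files were processed before moving them.
Copy code
// sketch only, assuming define(['N/search', 'N/file'], (search, file) => { ... })
const PENDING_FOLDER_ID = 123;    // placeholder internal id
const PROCESSED_FOLDER_ID = 456;  // placeholder internal id

const getInputData = () => {
  // every file currently sitting in the pending folder
  return search.create({
    type: 'file',
    filters: [['folder', 'anyof', PENDING_FOLDER_ID]],
    columns: ['name']
  });
};

const map = (mapContext) => {
  const fileId = JSON.parse(mapContext.value).id; // search results arrive as JSON in the map stage
  const contents = file.load({ id: fileId }).getContents();
  // ...process the contents...
};

const summarize = () => {
  // move whatever is still in the pending folder; a real script would only move files it processed
  search.create({
    type: 'file',
    filters: [['folder', 'anyof', PENDING_FOLDER_ID]]
  }).run().each((result) => {
    const f = file.load({ id: result.id });
    f.folder = PROCESSED_FOLDER_ID;
    f.save();
    return true;
  });
};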
l
oh, data length. not execution time.
n
Yeah, to @Marvin's point: when using the task module to schedule the map/reduce, if you leave the deploymentId parameter empty it will just pick the first available deployment of your map/reduce. If you have multiple users using the suitelet, or processing back to back, you can create multiple deployments of the map/reduce to ensure one is always available. (edit: had the wrong parameter name)
l
thanks @Marvin. my scenario expects 1-20 SO ids at a time, so doing it in memory should be faster. good to know using a file is also possible.
m
Thanks @Nathan L, I wasn't aware that leaving the deployment empty would work like that. Before I moved to more file processing, I was using a while loop and try/catch to run through the different deployments until one worked.
🧠 1
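That cycle-through-deployments idea might look roughly like the sketch below; the script and deployment ids are placeholders, and it assumes N/task is loaded.
Copy code
// sketch only: deployment ids are placeholders for deployments you created manually
const DEPLOYMENTS = ['customdeploy_my_mr_1', 'customdeploy_my_mr_2', 'customdeploy_my_mr_3'];

const submitToAnyDeployment = (params) => {
  for (let i = 0; i < DEPLOYMENTS.length; i++) {
    try {
      return task.create({
        taskType: task.TaskType.MAP_REDUCE,
        scriptId: 'customscript_my_mr', // placeholder script id
        deploymentId: DEPLOYMENTS[i],
        params: params
      }).submit();
    } catch (e) {
      // this deployment is busy (or otherwise unavailable); try the next one
      log.debug({ title: 'deployment unavailable', details: DEPLOYMENTS[i] });
    }
  }
  throw new Error('No free map/reduce deployment found');
};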
r
Consider automatically creating new deployments as needed (record.copy)
l
interesting find. changes made by the map reduce script are logged as set by System. the user who initiated my restlet submit function, which triggered the map reduce script, doesn't show up in the log.
r
there is no user for a map reduce script since it runs on the server side
you can pass that as a parameter and record that in some sort of custom way if necessary
l
chatgpt says there's no api to add to system notes.
true news 1
I can drop a message in the deployment log, but that's gonna be much less useful
r
what's the requirement? you want to know who created an SO?
l
custom hidden field, last changed by...... maybe
r
List/Record : Employee type field, hidden or not, and pass the internal id of the user as a param
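As a rough sketch of that idea: the calling side adds the triggering user's internal id to the task parameters, and the map/reduce stamps it on each record. The parameter ids (custscript_so_update_list, custscript_triggering_user), the field custbody_last_processed_by, and selectedSoIds are all made-up names.
Copy code
// calling side: assumes define(['N/runtime', 'N/task'], (runtime, task) => { ... })
const mrTask = task.create({
  taskType: task.TaskType.MAP_REDUCE,
  scriptId: 'customscript_my_mr', // placeholder
  params: {
    custscript_so_update_list: selectedSoIds.join(','),      // placeholder parameter
    custscript_triggering_user: runtime.getCurrentUser().id  // placeholder parameter
  }
});
mrTask.submit();

// map/reduce side: assumes N/record and N/runtime are loaded
const map = (mapContext) => {
  const userId = runtime.getCurrentScript().getParameter({ name: 'custscript_triggering_user' });
  record.submitFields({
    type: record.Type.SALES_ORDER,
    id: mapContext.value,
    values: { custbody_last_processed_by: userId } // hypothetical List/Record: Employee field
  });
};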
l
restlet form... user clicks some checkboxes on sales orders. on submit, they get sent to map reduce for updating.
r
restlet? restlets don't have a UI
suitelet you mean?
l
sorry, suitelet. yeah, i can pass the emp id along with the array, then save it somewhere.
šŸ‘ 1
I'm very close. I cannot read the array passed from the suitelet to the map reduce script. what goes in JSON.parse()?
n
So you'll need to create a parameter on the map/reduce script record, then grab the id of that field. Then in your task.create() you'll need to have the key "params" with the value being an object. Inside that object you have a key with your newly created parameter name, and the value is your array. Since you are using an array, in your suitelet just set the parameter to
resultArray.join(',')
Then in your map/reduce script, using the N/runtime module:
Copy code
const inputArray = runtime.getCurrentScript().getParameter({ name: 'custscript_your_parameter_name' }).split(',')
l
i think I'm a bit confused now. @Nathan L do you mean to create one of these parameters? then set it from the calling suitelet at runtime?
m
Yes, that is the parameter you need to create. You will set it from the suitelet when you do the task creation. Then in getInputData you will pull that parameter.
this 1
l
cool. then I guess when multiple instances of the task are triggered, the parameter values are different?
n
yeah, leave the parameter in the deployment blank; the suitelet will set it. If you have multiple instances of the map/reduce running at once you'll need to make sure you have multiple deployments of the map/reduce created. The parameter gets set per instance of the M/R that is running, and it happens at the time the M/R script is called from the task.
So a long way to say "yes, the values are different"
lol
l
if I call this parameter "so_update_list", and user a and user b both click submit on the suitelet form, which triggers 2 instances of the M/R task (task a and task b), are there two separate "so_update_list" values at runtime belonging separately to task a and task b?
if task a and b are executed sequentially, then it doesn't matter. hopefully šŸ™‚
or like you said. have multiple deployments
n
Yes. So when you do task.submit() on the MR task that you create in the suitelet, it grabs a deployment of that MR script and creates a running instance of the script. That instance gets the data that belongs to the user who triggered that specific task. If two users click the button at the same time, one after another, or whatever scenario causes two instances to run simultaneously... an instance of the MR script is created for each specific user, and each instance gets its own data set in that parameter. Again though, not to keep harping on it: you can only have as many simultaneous instances of the map/reduce as you have MR deployments. Otherwise one user is going to get an error saying there are no available deployments to run the script on.
šŸ™ 1
So every time the user clicks the go button, the MR runs in its own little black box with its own data set
šŸ™ 1
l
great. you are so knowledgeable. in short, I don't need additional code to handle concurrent execution of the same MR script.
n
Correct
l
just wanted to make sure I'm not overwriting a global variable that might get crossed between tasks
n
And not to overwhelm you with information, but keep in mind that Map/Reduce scripts run in a queue, so a Map/Reduce only runs when there is an available spot. Meaning if you only have 2 processors to run these MRs in, and users submit 4 different data sets, the newest MR instances will sit there and wait for one of the others to finish before they are able to run. And those available processors are shared amongst all scheduled scripts. I tell you that so you know there is a chance that clicking the "Go" button on your suitelet will not instantly start running the MR. All your data and your task will be created; it will just essentially be waiting to run. So users can't expect instant updates to your records when they click go.
That's not something you have to worry about in your scripting. More so just NetSuite behavior.
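If you ever want the suitelet to report whether a submitted task is still queued or finished, N/task can check on it by task id. A minimal sketch, assuming you kept the id returned by mrTask.submit():
Copy code
// assumes define(['N/task'], (task) => { ... }) and that taskId came from mrTask.submit()
const status = task.checkStatus({ taskId: taskId });
// status.status is one of PENDING, PROCESSING, COMPLETE, FAILED
log.debug({ title: 'MR task status', details: status.status });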
l
makes perfect sense. get in queue, wait for free cpu cycle.
šŸ’Æ 1
n
On the task-submitting side of this, you can trap the error that's returned if there's not an available deployment, create a new one, and re-submit the task. With regards to the user, try logging runtime.getCurrentUser(); you might be surprised, since it's being run by a user via N/task and a suitelet...
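Roughly, that trap-and-resubmit flow could look like the sketch below. createNewDeployment() is a placeholder for the record.create approach shown a few messages down, and the exact error thrown when no deployment is free should be verified in your own logs.
Copy code
// assumes define(['N/task'], (task) => { ... })
const submitMrTask = (params) => {
  const buildTask = () => task.create({
    taskType: task.TaskType.MAP_REDUCE,
    scriptId: 'customscript_my_mr', // placeholder
    params: params
  });
  try {
    return buildTask().submit();
  } catch (e) {
    // hedged: inspect e.name to confirm it really is the "no deployments available" error
    log.audit({ title: 'submit failed, creating a new deployment', details: e.name });
    createNewDeployment();        // placeholder helper; see the record.create example below
    return buildTask().submit();  // retry once against the freshly created deployment
  }
};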
m
@NElliott what module do you use to create a new deployment? Is it
N/record
?
r
record.copy
n
Actually @reptar I don't, I use record.create
Copy code
// assumes the N/record module is loaded as `record`
const scheduledScrDeployment = record.create({
  type: record.Type.SCRIPT_DEPLOYMENT,
  defaultValues: {
    script: scheduledScrInternalId,
  },
});
// set any required deployment fields (e.g. status) and call save() before using it
but you could use copy I guess. šŸ™‚
s
Note that if the account has only 2 processors, a MR script may not have any advantage over a scheduled script.
n
*besides any governance concerns.
s
governance is not a problem in 80% of the cases we face. Then again, it helps that our standard scheduled script template automatically checks for governance and reschedules itself
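For anyone curious, that check-and-reschedule pattern in a scheduled script is usually something like this sketch; the usage threshold is arbitrary, and workItems/processItem are placeholders for the real work.
Copy code
// assumes define(['N/runtime', 'N/task'], (runtime, task) => { ... })
const execute = (context) => {
  const script = runtime.getCurrentScript();
  for (let i = 0; i < workItems.length; i++) {   // workItems: whatever the script iterates over
    processItem(workItems[i]);                   // placeholder for the real work
    if (script.getRemainingUsage() < 200) {      // arbitrary safety threshold
      // out of governance headroom: reschedule this same deployment and stop
      task.create({
        taskType: task.TaskType.SCHEDULED_SCRIPT,
        scriptId: script.id,
        deploymentId: script.deploymentId
      }).submit();
      return;
    }
  }
};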