# suitescript
n
Does anyone know if there's a limit to memory (not governance) usage in the getInputData stage of a map/reduce? I'm trying to automate import of item and BOM data, so I'm reading from large CSV files and caching item data. Everything was working with the earlier files, but now the process hangs with no error message. It just runs until the max (~1 hour) and then reports "Failed", even though I have Yield After Minutes set to a much smaller value.
d
"The total persisted data used by a map/reduce script cannot exceed 200MB at any one time"
n
I've hit that limit with a prior version. I've shifted to putting all the lookup objects into cache objects that are later stitched together.
I've even gone so far as to move the creation of that lookup object (and the cache writes) into a Suitelet and return the result, rather than building it within the MR itself.
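The stitching approach above can be sketched generically: serialize the lookup map, split it into fixed-size pieces so no single persisted value blows a size budget, then rejoin the pieces when you need the lookup again. `chunkString` and `stitch` are hypothetical helpers for illustration, not NetSuite APIs; in a real Map/Reduce you'd `cache.put()` each piece under a numbered key (`'lookup_0'`, `'lookup_1'`, ...) via `N/cache` and reassemble with `cache.get()`.

```javascript
// Hedged sketch: chunk a large serialized lookup object into pieces
// small enough to store as separate cache entries, then stitch them
// back together before use. Plain JS so it can run outside NetSuite.

// Split a string into pieces of at most maxLen characters.
function chunkString(str, maxLen) {
    var pieces = [];
    for (var i = 0; i < str.length; i += maxLen) {
        pieces.push(str.substring(i, i + maxLen));
    }
    return pieces;
}

// Reassemble the pieces in order.
function stitch(pieces) {
    return pieces.join('');
}

// Example: a small item lookup map, serialized and split into
// <= 16-character pieces (a real budget would be far larger).
var lookupJson = JSON.stringify({ 'ITEM-1': 10, 'ITEM-2': 20 });
var pieces = chunkString(lookupJson, 16);
var restored = JSON.parse(stitch(pieces));
console.log(restored['ITEM-2']);
```

The key ordering (`lookup_0`, `lookup_1`, ...) matters: you must stitch the pieces back in the same order you wrote them, or the JSON won't parse.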
b
getInputData doesn't yield since it's only a single function, so that's not a huge surprise
you probably want to make sure that your summarize stage correctly logs errors
n
I'm like 99% sure it does, because I've blown this thing up back and forth during its development 🙂 I do get an error message back, but the object returned is simply an empty object
try { ... } catch (errorObj) { log.error({ title: '(getInputData) Error', details: errorObj }); }
b
let the error be thrown
and handle it in the summarize
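A sketch of what that summarize-side handling can look like, following the documented summaryContext shape (an error thrown from getInputData lands in `inputSummary.error`; per-key map errors come through the `mapSummary.errors` iterator). The mock object at the bottom is only there so the logic can run outside NetSuite; it stands in for what the platform would pass to the entry point.

```javascript
// Hedged sketch: collect errors in the summarize stage instead of
// swallowing them in getInputData. In a real script each entry in
// `problems` would go to log.error.

function summarize(summary) {
    var problems = [];

    // An error thrown (or re-thrown) in getInputData shows up here.
    if (summary.inputSummary.error) {
        problems.push('getInputData failed: ' + summary.inputSummary.error);
    }

    // Errors thrown per key in the map stage.
    summary.mapSummary.errors.iterator().each(function (key, error) {
        problems.push('map failed for key ' + key + ': ' + error);
        return true; // keep iterating
    });

    return problems;
}

// Mock summaryContext, just to exercise the logic outside NetSuite:
var mockSummary = {
    inputSummary: { error: '{"name":"SSS_USAGE_LIMIT_EXCEEDED"}' },
    mapSummary: {
        errors: {
            iterator: function () {
                return {
                    each: function (cb) { cb('item_42', '{"name":"TypeError"}'); }
                };
            }
        }
    }
};

console.log(summarize(mockSummary));
```

Because getInputData runs as a single unit, letting the error propagate is usually the only way to find out why that stage died; a catch-and-log there can leave you with exactly the empty-object symptom described above.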
I'd also generally recommend SuiteScript 2.0 if you are seeing weird behavior
you can go through the SuiteScript 2.x Map/Reduce Yielding or Soft Limits on Long-Running Map and Reduce Jobs help topics to see how NetSuite documents which 2 entry points yield
n
ok, I've added re-throwing of the error and kicked it off again. Hopefully I'll get a more useful error message in an hour 😕