# suitescript
c
const getInputData = context => {
    return search.load({id: '9256'});
}
This function is still running after three hours. The search has 255,000 results. Does this seem like an expected amount of time?
The search contains around 7 formulas. I would like to re-write them all as SuiteQL, but my major constraint is time.
n
I guess it depends how much you're doing in map/reduce but any script running for 3 hours sounds excessive
c
It's not hit the map stage yet
and this is after 3 hours
n
Yeah, no, that's baaad
Can you run the search yourself and return the results array instead? I typically do that and anecdotally find it quicker.
c
What does that mean?
I tried to page the search and populate an array with all the page results, but that timed out with TIME_LIMIT_EXCEEDED.
So now I'm letting the M/R script handle the paging internally.
b
the number of search results sounds doable, so I'm guessing there is too much data in your results to serialize
make the search only return an id, and then make the map do the full search for that one particular id
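A minimal sketch of that ids-only pattern. The `search` object below is a tiny stand-in for NetSuite's N/search module so the shape can run outside NetSuite (in the real script you'd require N/search instead); the saved-search id '9256' comes from the thread, but the fake result rows are purely illustrative:

```javascript
// Stand-in for N/search: mimics load().runPaged() just enough to show the pattern.
const search = {
    load: ({ id }) => ({
        runPaged: ({ pageSize }) => {
            const fake = [{ id: '101' }, { id: '102' }, { id: '103' }]; // illustrative rows
            const pages = [];
            for (let i = 0; i < fake.length; i += pageSize) {
                pages.push(fake.slice(i, i + pageSize));
            }
            return {
                pageRanges: pages.map((_, index) => ({ index })),
                fetch: ({ index }) => ({ data: pages[index] })
            };
        }
    })
};

// getInputData: collect only internal ids, so each serialized value stays tiny.
const getInputData = () => {
    const ids = [];
    const pagedData = search.load({ id: '9256' }).runPaged({ pageSize: 1000 });
    pagedData.pageRanges.forEach(pageRange => {
        pagedData.fetch({ index: pageRange.index }).data.forEach(result => {
            ids.push(result.id);
        });
    });
    return ids;
};

// map: do the expensive per-record work (formulas, lookups) for one id at a time.
const map = context => {
    const id = JSON.parse(context.value);
    // ...look up the heavy columns for just this record...
};
```

The point of the split is that getInputData only has to serialize a flat list of ids, while the formula-heavy work gets spread across map invocations (and processors).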
c
This is the criteria
I wouldn't say there is a lot of data being returned per result row.
n
You're "returning" the loaded saved search which means NS is running the search and I have no idea how that works, is it running the search and then handling passing each result one at a time or is it doing something janky like get me result 0 then result 1 then result 2 etc, running the search each time. (just as an example, you'd hope not šŸ˜„ ). I normally just run the search myself, paged, directly into an array and return that array. I'm speculating but I have definitely worked with that many results with maybe 20 or so columns with no issue. Although I now see your formulas šŸ¤”
c
@NElliott I tried this
const getInputData = context => {
    // note: naming this variable `search` would shadow the N/search module
    // and throw before the search ever ran
    const savedSearch = search.load({id: '9256'});

    const resultSet = [];
    const pagedData = savedSearch.runPaged({ pageSize: 1000 });
    pagedData.pageRanges.forEach(pageRange => {
        const myPage = pagedData.fetch({ index: pageRange.index });
        myPage.data.forEach(result => {
            resultSet.push(result);
        });
    });

    return resultSet;
}
Which timed out after an hour
n
Yeah, sounds like your formulas are hitting it hard šŸ˜ž
c
Yeah, I'd guess that's the issue
n
Disappointing.
c
I'm not familiar enough with netsuite2.com schema to port this to SQL quickly. This will definitely be delivered late :)
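For reference, a rough shape of the SuiteQL route. The `query` object below is a stand-in for NetSuite's N/query module so the snippet can run here (the real call is `query.runSuiteQL`, and since the non-paged call caps how many rows it returns, a 255k-row search would need the paged variant, `query.runSuiteQLPaged`). The SQL itself is only a placeholder: mapping the actual criteria and formulas onto the NetSuite2.com schema is exactly the part the thread says is hard:

```javascript
// Stand-in for N/query: returns two fake mapped rows, purely for illustration.
const query = {
    runSuiteQL: ({ query: sql }) => ({
        asMappedResults: () => [{ id: 101 }, { id: 102 }]
    })
};

const getInputData = () => {
    // Placeholder SQL — the real WHERE clause would reproduce the saved
    // search's criteria against the NetSuite2.com tables.
    const sql = `SELECT t.id FROM transaction t WHERE /* criteria here */ 1 = 1`;
    return query.runSuiteQL({ query: sql }).asMappedResults();
};
```
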
n
I'd love to help you but unfortunately my SQL-fu is sorely lacking and like you I'm not familiar with the schema.
c
Just read @battk’s suggestion - that sounds promising.
šŸ‘šŸ» 1
Although I did not build in a kill switch for the already running M/R 🫤
n
BTW, it might be more efficient in your map to load the record rather than perform a search for each record individually.
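A sketch of that load-the-record idea. The `record` object is a stand-in for NetSuite's N/record module so the shape runs here; the record type 'salesorder' and field id 'tranid' are purely illustrative, not from the thread:

```javascript
// Stand-in for N/record: load() returns an object with a getValue accessor.
const record = {
    load: ({ type, id }) => ({
        id,
        type,
        getValue: ({ fieldId }) => `value-of-${fieldId}` // fake field accessor
    })
};

// map: one search-result id per invocation. Loading the record directly
// avoids re-running a slow, formula-heavy search once per record.
const map = context => {
    const id = JSON.parse(context.value);
    const rec = record.load({ type: 'salesorder', id });
    return rec.getValue({ fieldId: 'tranid' });
};
```

A usage example, with the shape a Map/Reduce passes to map: `map({ value: JSON.stringify('101') })`.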
b
your screenshot suggests you have multiple processors available to you, so you could try making another copy of the script to use the other processors
c
Even if you use all processors, I'm under the impression that doesn't block other M/Rs from starting
then NetSuite just does some sort of round robin (or other algorithm) to manage processor time between processes; does that sound right?
I ask because the person who kicked off this M/R set it to use all processors.
b
only one processor is used for getInputData, leaving 4 others to do other stuff
although if you only had 2 processors, I would be wary of stalling the other processor in case the search is just so bad that it takes forever even with only 1 column
c
It's hit map šŸ˜„