# suitescript
Charan
Hi All, I have a question about the MapReduce script. I have CSV data coming from a third-party integration. Which stage is the most suitable for making an HTTPS request to the third party and fetching the CSV data? Is it a good approach to perform the request in the get input stage, retrieve the data, parse the CSV, and then send the refined data to the map stage to create custom records?
eblackey
Yes, retrieve the CSV in the getInputData stage. I find it's easiest to parse the CSV into a JSON array and return that. Then you'll get a JSON object to process on each iteration of the map stage.
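Something like this, as a rough, untested sketch (the URL and the naive comma-split parse are placeholders; see the parsing note below):
```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/https'], (https) => {
    // Hypothetical endpoint; swap in the URL from your integration.
    const CSV_URL = 'https://example.com/export.csv';

    const getInputData = () => {
        const response = https.get({ url: CSV_URL });
        // Naive comma-split parse (no quoted-field handling), just to
        // show the shape of the return value.
        const lines = response.body.trim().split(/\r?\n/);
        const headers = lines.shift().split(',');
        return lines.map((line) => {
            const values = line.split(',');
            const row = {};
            headers.forEach((header, i) => { row[header] = values[i]; });
            return row;
        });
    };

    const map = (context) => {
        // Each array element arrives here serialized as JSON.
        const row = JSON.parse(context.value);
        // ...create your custom record from row...
    };

    return { getInputData, map };
});
```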
Charan
Perfect! Thanks, @eblackey. Do you foresee any issues if we make multiple requests to the third-party system in the getInputData stage? I have a saved search that provides the third-party URLs to call. Would it be good practice to make multiple calls in getInputData? This script will trigger every five minutes, and there won't be many records.
eblackey
No, that shouldn't be a problem.
❤️ 1
If you don't have one already, take a look at papaparse for CSV parsing to JSON. I know a lot of folks here use it and it works well with SuiteScript.
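A minimal sketch of that, assuming you've uploaded papaparse.min.js to the File Cabinet (the ./lib path below is just an example) and load it as an AMD dependency:
```javascript
define(['N/https', './lib/papaparse.min'], (https, Papa) => {
    const getInputData = () => {
        const response = https.get({ url: CSV_URL }); // CSV_URL as in the sketch above
        // header: true keys each row object by column name;
        // skipEmptyLines drops the trailing blank record many exports have.
        const parsed = Papa.parse(response.body, {
            header: true,
            skipEmptyLines: true
        });
        return parsed.data;
    };
    // ...map stage as before...
    return { getInputData };
});
```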
thankyou 1
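And if you end up making several calls driven by your search, the pattern is basically the same: loop the search results and concatenate the parsed rows. A sketch, with a hypothetical saved search ID and column name, and parseCsv standing in for whichever parser you use:
```javascript
define(['N/https', 'N/search'], (https, search) => {
    const getInputData = () => {
        const rows = [];
        search.load({ id: 'customsearch_thirdparty_urls' }) // hypothetical saved search
            .run()
            .each((result) => {
                // Hypothetical column holding each endpoint URL.
                const url = result.getValue({ name: 'custrecord_endpoint_url' });
                const response = https.get({ url: url });
                // parseCsv: your CSV-string-to-array-of-objects helper.
                rows.push.apply(rows, parseCsv(response.body));
                return true; // keep iterating
            });
        return rows;
    };
    return { getInputData };
});
```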
raghav
By multiple, how many calls do you mean, @Charan?
Charan
@raghav It depends on the search data; probably 5-6.
Shawn Talbert
If there are "not many records," consider using a Scheduled Script instead.
thankyou 1
Charan
@Shawn Talbert thanks for your insights. The CSV data will be huge.
Shawn Talbert
In that case, I'd recommend saving the CSV to the file cabinet and having the MR script ingest the CSV by file handle.
then, I guess delete the CSV when you're done.
Depending on your definition of 'huge', getInputData() may fail if you try to do all the heavy lifting therein. Whereas, IIRC, map/reduce scripts have native support for ingesting a CSV file from the file cabinet line by line.
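Roughly, the whole flow could look like this (untested sketch; the folder ID and URL are placeholders, and note the CSV's header row arrives in map() like any other line):
```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/https', 'N/file'], (https, file) => {
    const CSV_URL = 'https://example.com/export.csv'; // hypothetical endpoint
    const STAGING_FOLDER_ID = 123;                    // hypothetical File Cabinet folder

    const getInputData = () => {
        // Fetch the CSV and park it in the File Cabinet first...
        const response = https.get({ url: CSV_URL });
        const fileId = file.create({
            name: 'incoming-' + Date.now() + '.csv',
            fileType: file.Type.CSV,
            contents: response.body,
            folder: STAGING_FOLDER_ID
        }).save();

        // ...then return a file reference so the platform streams the CSV
        // to map() one line at a time instead of holding it all in memory.
        return { type: 'file', id: fileId };
    };

    const map = (context) => {
        // context.value is one raw line of the CSV (including the header
        // line, so detect and skip that one).
        const fields = context.value.split(',');
        // ...create your custom record from fields...
    };

    const summarize = (summary) => {
        // Delete the staged CSV when you're done. Getting the file id here
        // takes a little plumbing (e.g. a fixed file name you re-look up,
        // or stashing the id somewhere); file.delete({ id: fileId }) does
        // the actual removal.
    };

    return { getInputData, map, summarize };
});
```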
Charan
Appreciate your thoughts on this, @Shawn Talbert. The data I get from the integration is just a CSV string. I convert the CSV string to JSON and pass it to the map stage for the actual processing, but I don't do any major operation in getInputData. The data is passed line by line to the map stage. Do you think there would still be any issues?
Shawn Talbert
It depends on how large (in bytes) that CSV string is: if processing it exceeds the governance limits or the memory limits, you'll be in trouble. Off the top of my head, I want to say there's a 50 MB RAM limit for the script. I may be off on that, but if it's right, your CSV must be smaller than that (not counting other RAM usage). That said, it's been a long time since I saw the memory-limit error, so I'm not sure if NetSuite has raised it or I've just been working with more memory-efficient scripts lately.
The reason I mention the CSV File Cabinet approach is that, in that scenario, it's the platform's responsibility to manage memory and to read the lines fed to map().