# suitescript
r
Trying to scrape all POs from the Coupa API and create them in NetSuite. I tried Map/Reduce but the payloads are too large to pass between stages. Tried a Scheduled script, but I have to do batches of 900 requests, and even then the JSON file is too large to save. Wondering how people have handled this.
a
Check if the API has pagination. A good RESTful API should allow you to get a range of results.
r
It does, I'm using pagination. Even 250 POs is too large to save to a file.
c
@reptar Do you have the option of saving the payload which is "too large to pass between stages" into the File Cabinet and pass the file ID between stages?
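The File Cabinet hand-off suggested above can be sketched like this. The N/file module only exists inside NetSuite, so here it's passed in as a parameter so the shape is visible (in a real script you'd wrap this in `define(['N/file'], ...)`); the folder ID and file names are placeholders.

```javascript
// getInputData: save each page of the Coupa response to the File Cabinet
// and return only the file IDs; the tiny IDs are what crosses the stage
// boundary, not the multi-MB payloads.
function savePageToCabinet(fileModule, pageIndex, pageJson, folderId) {
  var f = fileModule.create({
    name: 'coupa-pos-page-' + pageIndex + '.json',
    fileType: 'JSON', // file.Type.JSON in real SuiteScript
    contents: JSON.stringify(pageJson),
    folder: folderId  // placeholder folder internal ID
  });
  return f.save(); // returns the file's internal ID
}

// map: re-load one page per invocation from its file ID.
function loadPageFromCabinet(fileModule, fileId) {
  var f = fileModule.load({ id: fileId });
  return JSON.parse(f.getContents()); // subject to the 10 MB getContents limit
}
```

Note `getContents()` still has a 10 MB ceiling, so each saved page has to stay under that.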
r
It's not a very good option. I'd have to save 15k individual payloads.
👍 1
a
Why too large?
r
I think I might need a middleware app.
I forget the error, it's basically that `value`/`values` is too large.
That was for an individual PO too.
Coupa stores a lot of activity data.
a
do you need that data? can you not just pre-parse the response data and only pass around the stuff you need?
r
Yep, I had thought about that. I was hoping I could clean the data after collecting it, but I might have to.
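The pre-parsing idea above is basically a whitelist pass: keep only the fields the NetSuite PO needs and drop Coupa's activity history and attachment blobs before anything crosses a stage boundary. Field names here are illustrative, not Coupa's exact schema.

```javascript
// Strip a raw Coupa PO down to the handful of fields needed to create
// the NetSuite record; everything else (activity, attachments, audit
// trails) is discarded, which is where the payload bloat lives.
function slimPo(po) {
  return {
    id: po.id,
    number: po['po-number'],
    supplier: po.supplier && po.supplier.name,
    currency: po.currency && po.currency.code,
    lines: (po['order-lines'] || []).map(function (line) {
      return {
        description: line.description,
        quantity: line.quantity,
        price: line.price
      };
    })
  };
}
```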
a
I'm assuming you know that coupa HAS a NS integration bundle already and you're choosing NOT to use it?
r
We built a custom integration for other record types but not POs. We can't install the bundle rn. We're trying to get rid of Coupa.
a
fair enough, I actually hate their bundle tbh, was just making sure you knew about it at least
r
Ya, they built us a custom integration that's just as bad. I investigated installing their bundle and I was like, this is not an improvement.
e
How many POs are we talking about? Over what time span?
a
is this a one time data extract to migrate off?
r
15k
yep
a
Another way you can approach this: create a file in the getInputData stage, then in the map stage make as many requests as governance allows you and append your data to the file.
r
The file size limitation is preventing me from saving even 250 POs to a file without scrubbing the data tho.
a
There is no file size limitation when appending lines
a
and why can't you make 15k queries of w/e and just save each response to a file?... then once you've got the files you can just work with them locally, clean them up and ultimately throw them into a CSV and load into NS?
a
The limitation is on saving the file in one shot…
You can’t load or save a file bigger than 10MB by using file content. You can load and add lines…
a
i feel like if its a one time deal, the need to make it a repeatable integration flow doesn't actually gain you anything?
...that said i like everything @alien4u is saying right now 🙂
r
@alien4u I'm not following what you're saying.
@Anthony OConnor yes, i'm not trying to make it repeatable, just a one time dump into NetSuite. i will need to create them as POs tho
a
File Append line and file read line allow you to circumvent the 10MB limit.
r
but if you can't save... what are you suggesting?
a
You can save...
From the N/file Module: • Methods that load content in memory, such as File.getContents(), have a 10 MB size limit. This limit does not apply when content is streamed, such as when File.save() is called.
Unless each individual line of your file is bigger than 10MB you will be totally fine with: • File.appendLine(options)
👀 1
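In practice the appendLine approach means one compact JSON document per line ("JSON Lines"): no single line gets anywhere near 10 MB even though the file as a whole can grow far past it. Inside NetSuite you'd call `File.appendLine({ value: line })` per record and read back with the file's line iterator; the pure serialization/parsing half looks like this.

```javascript
// One record per line. JSON.stringify escapes embedded newlines as \n,
// so a record can never break the one-line-per-record invariant.
function toJsonLine(record) {
  return JSON.stringify(record);
}

// Parse a JSON Lines blob back into records, ignoring blank lines
// (e.g. a trailing newline from the last appendLine).
function parseJsonLines(text) {
  return text
    .split('\n')
    .filter(function (line) { return line.trim() !== ''; })
    .map(function (line) { return JSON.parse(line); });
}
```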
c
Just to throw a can of fuel on the fire, you could also use a set of custom records to store the data from the requests and process in a separate script
👀 1
e
Does Coupa have any relevant export capabilities? Would be nice if you could build nothing at all and do a CSV import or two instead.
r
Mmmm I need attachments.
e
Interesting, is it the attachments that are blowing up the payload size?
r
im sure. that and also activity info
The custom record idea is intriguing but I think a node.js server is going to make things so much easier. I can probably get my company to spin up a server.
e
Maybe a two-pass system? The first pass gets all the POs into NetSuite, the second gets attachments (potentially one PO at a time)
👍 1
Yeah if you want everything in one shot, middleware seems mandatory
💯 1
but that feels bad to spin all that up for one-time use
r
ya, but once i have it i can use it for all sorts of purposes
b
do a variation of 1 po per map / reduce
if you are extra lucky, that means get all the po ids in the getInputData then get each one to process in a map
if that isnt possible, its get all the pages in the getInputData, to pass 1 page to each map, and then 1 purchase order id to process in each reduce
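The fallback plan above (one page per map, one PO per reduce) hinges on exploding pages into per-PO work items so each reduce key maps to exactly one PO creation. A minimal sketch of that flattening, with a hypothetical `poIds` shape for a page:

```javascript
// Turn a list of pages (each carrying its PO IDs) into one work item per
// PO; in a Map/Reduce, map() would context.write() each item so every
// reduce() invocation creates exactly one purchase order.
function explodePagesToPoIds(pages) {
  var work = [];
  pages.forEach(function (page, pageIndex) {
    page.poIds.forEach(function (poId) {
      work.push({ page: pageIndex, poId: poId });
    });
  });
  return work;
}
```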
r
I forgot to mention: NetSuite is throwing timeout errors when I try to use pagination. That, combined with the file size limitations, means I'm probably just going to have to have 15k map stages with one request and a PO being created in each map instance. Which is doable but horrific from a design perspective.
a
i mean it's a one-off, I wouldn't get hung up on the horrific design 🙂
6 weeks later.... actually since this PO extract works so well lets just stay on coupa and keep using it forever 💀
😆 3
😭 1
r
At what they're charging, there's no chance of that
v
I personally wouldn't mess around with map/reduce etc. For a one-off, I'd grab the data outside of netsuite (python script or similar), then CSV in.
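The outside-of-NetSuite route sketched: pull each PO with a plain script, flatten to CSV rows, and run a CSV import. v suggested Python; shown here in Node for consistency with the rest of the thread, since the shape is the same. The fetch loop is omitted (it depends on Coupa credentials); the flattening and escaping is the part worth getting right, and the field names are illustrative.

```javascript
// RFC 4180-style escaping: quote a value if it contains a comma, quote,
// or newline, doubling any embedded quotes.
function csvEscape(value) {
  var s = String(value == null ? '' : value);
  return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

// One CSV row per PO line, repeating the header fields the way NetSuite's
// CSV import expects for multi-line transactions.
function poLinesToCsv(po) {
  return (po.lines || []).map(function (line) {
    return [po.number, po.supplier, line.item, line.quantity, line.rate]
      .map(csvEscape)
      .join(',');
  }).join('\n');
}
```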