# suitescript
j
What's the best way to get external data into a map/reduce script? We're using a RESTlet that might end up creating over 5000 records. I doubt the payload will ever be more than 5MB. I'm thinking of making a custom record to save the payload, and then executing the script. Does anyone have any insight?
I'll somehow find the new records in the `getInputData()` step... I think?
a
yeah that sounds fine? store the payload in a file or custom record, then in the MR's getInputData you just load the custom record / file, get the lines or whatever, and pass those into the map stage
obv. you'll have to parse the data in the getInputData stage
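To make that concrete, here's a minimal sketch of what that getInputData() could look like, assuming the payload is staged on a hypothetical custom record (`customrecord_payload`) with the raw JSON in a long-text field (`custrecord_payload_json`) and the staging record's id passed in as a script parameter; all of those ids are made up:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/record', 'N/runtime'], (record, runtime) => {

    // Load the staged payload record and hand its parsed contents to the map stage.
    const getInputData = () => {
        const payloadRecId = runtime.getCurrentScript()
            .getParameter({ name: 'custscript_payload_rec_id' }); // hypothetical script parameter

        const payloadRec = record.load({
            type: 'customrecord_payload',                  // hypothetical custom record type
            id: payloadRecId
        });

        const rawJson = payloadRec.getValue({ fieldId: 'custrecord_payload_json' }); // hypothetical long-text field

        // Assumes the payload is a JSON array; each element becomes one key/value pair for map().
        return JSON.parse(rawJson);
    };

    const map = (context) => {
        const line = JSON.parse(context.value); // each array element arrives as a JSON string
        // ... create the record for this line here ...
    };

    return { getInputData, map };
});
```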
j
Will I run into any issues with dumping the JSON into a text field?
a
umm might be a size issue with raw JSON... use a Long Text field, which i think has a 1,000,000 character limit
👍 1
a regular text field is like 300 characters, which obv. won't work
that's why i was thinking saving the JSON in the File Cabinet might be better
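If the File Cabinet route wins out instead, the RESTlet could stage the payload roughly like this (the folder id and file name are placeholders):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Restlet
 */
define(['N/file'], (file) => {

    // POST handler: stage the incoming JSON payload as a File Cabinet file.
    const post = (requestBody) => {
        const payloadFile = file.create({
            name: 'shipment-payload-' + Date.now() + '.json',
            fileType: file.Type.JSON,
            contents: JSON.stringify(requestBody),
            folder: 123                          // hypothetical File Cabinet folder internal id
        });
        return { fileId: payloadFile.save() };   // hand the id back so the MR can find the file later
    };

    return { post };
});
```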
is there a reason to not just do this in the restlet? can the calling system not just pass in a more reasonable size, break the POST data into pages effectively?
or just call it once per transaction or record, i.e. have the source system make multiple calls? that's usually how RESTlets work
j
Is there a reason to not just do this in the restlet?
Not really. Other than it's not something that I'm in control of, and I've only been doing SuiteScript for a couple months so I'm not sure what I'm doing
a
yeah I get that, so your approach CAN work, I'm just not sure it's the approach you SHOULD use
j
We're integrating with a system that does large scale shipments, so we'll be creating thousands of records for each sales order. Map/Reduce seemed like a good candidate because record creation is simple, it's just a lot of records.
a
oh so they send 1 sales order at a time... but it's 1000s of child records effectively?
yeah i guess there's not an easy way to pass that data in as smaller chunks then
your approach is certainly viable, just not a common way of doing things
you won't be able to respond back to the source system with record ids or anything though since you're handing it off to the MR
j
I believe you. If it were common I think I'd have found more resources online about it
a
or any errors either
j
That could be a pain point... how to error handle this kinda thing...
a
you could respond back with a file id, or custom record id I guess? and then when the MR processes you could initiate a call to the source system with any error info / ids of successful creation
but that's a 2-way integration now with both sides needing to auth against each other
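For illustration, a rough sketch of that idea: the RESTlet's post() stages the payload on a hypothetical custom record, kicks off the MR deployment, and responds with the staging record's id (every script, record, and field id below is invented):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Restlet
 */
define(['N/record', 'N/task'], (record, task) => {

    const post = (requestBody) => {
        // Stage the raw payload on a custom record; the long-text field holds the JSON.
        const staging = record.create({ type: 'customrecord_payload' });             // hypothetical type
        staging.setValue({ fieldId: 'custrecord_payload_json', value: JSON.stringify(requestBody) });
        staging.setValue({ fieldId: 'custrecord_payload_status', value: 'pending' }); // hypothetical status field
        const stagingId = staging.save();

        // Kick off the map/reduce, passing the staging record id as a script parameter.
        task.create({
            taskType: task.TaskType.MAP_REDUCE,
            scriptId: 'customscript_create_shipments_mr',       // hypothetical MR script id
            deploymentId: 'customdeploy_create_shipments_mr',   // hypothetical deployment id
            params: { custscript_payload_rec_id: stagingId }
        }).submit();

        // The caller only gets the staging id back; results have to be reported later.
        return { payloadRecordId: stagingId };
    };

    return { post };
});
```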
j
yeah, that sucks
👍 1
a
alternatively you just don't write anything back and have a manual process on the NS side to find and resolve errors... which isn't awful, since ideally you should be solid on the incoming data and NOT have errors... but that depends on the source data i guess
j
Oh, there's gonna be errors. It's gonna break. I think I'm going to have to report back... I'll burn that bridge when I come to it, I guess. I'll try and make a proof of concept for map/reduce and see how it goes
😂 1
a
idk what kind of controls you can put in place in the source system? can you give them a list of mandatory fields at least so they're not passing nulls into fields you NEED populated
j
Yeah, at that level it's going to be pretty well controlled. It's just the way things go that it fails for whatever reason. I think the payload is created without a lot of user input... I think
a
hmm... i guess you could write errors back to the custom record? and have the source system make a GET call instead of a POST call to your RESTlet with the id you respond back with from the POST after... idk, 15 mins? and you can give them success ids / errors at that time? still only 1-way, but they need to schedule a 2nd call to get the real response data
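The follow-up GET could then just load that staging record and echo back whatever the MR wrote to it; a sketch of a get() handler that would sit next to the post() in the RESTlet above (field ids and the query parameter name are still placeholders):

```javascript
// GET handler in the same RESTlet module: the source system polls with the id from the POST response.
const get = (requestParams) => {
    const staging = record.load({
        type: 'customrecord_payload',          // hypothetical custom record type
        id: requestParams.payloadRecordId      // hypothetical query parameter name
    });
    return {
        status: staging.getValue({ fieldId: 'custrecord_payload_status' }),
        errors: staging.getValue({ fieldId: 'custrecord_payload_errors' }) // populated by the MR's summarize stage
    };
};
```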
j
We have some sort of push/pull integration with NetSuite already, I'm not sure how that all works, but it's probably how things will be communicated
I know that map/reduce will collect errors, hopefully that can help smooth this over until we get it working.
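The error collection mentioned here surfaces in the MR's summarize() stage; a sketch of gathering map-stage failures and stashing them on the hypothetical staging record from the earlier examples:

```javascript
// summarize() in the same map/reduce script: collect map-stage errors and write them
// back to the staging record so the GET endpoint can report them.
const summarize = (summary) => {
    const errors = [];
    summary.mapSummary.errors.iterator().each((key, error) => {
        errors.push({ key: key, message: JSON.parse(error).message });
        return true; // keep iterating over remaining errors
    });

    const payloadRecId = runtime.getCurrentScript()
        .getParameter({ name: 'custscript_payload_rec_id' }); // hypothetical script parameter

    record.submitFields({
        type: 'customrecord_payload',                          // hypothetical custom record type
        id: payloadRecId,
        values: {
            custrecord_payload_status: errors.length ? 'error' : 'processed',
            custrecord_payload_errors: JSON.stringify(errors)
        }
    });
};
```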
a
well sounds like you have some pieces to work with then... I say go ahead and do your PoC with the MR and then figure out what's next 🙂
🤝 1
j
Thanks for chatting
a
sure, I love RPing as a rubber duck
rubber duck debugging 1
e
I process an external CSV file (30,000 lines) in these steps:
1. Copy the CSV file into the File Cabinet.
2. Read the CSV file and create a Custom Record for each line (map/reduce). One field says "pending".
3. Search the "pending" Custom Records and process them (we create invoices), marking each as "processed" or "error" (again map/reduce).
4. Any Custom Record can be mass updated via CSV import and set to "pending" again, or we can "discard" it so it is no longer processed.
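For illustration, step 3 of that flow maps cleanly onto a map/reduce where getInputData() returns a search over the "pending" custom records and map() processes each result; all record and field ids below are invented:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record'], (search, record) => {

    // Step 3: feed every "pending" staging record into the map stage.
    const getInputData = () => search.create({
        type: 'customrecord_csv_line',                           // hypothetical custom record type
        filters: [['custrecord_line_status', 'is', 'pending']],  // hypothetical status field
        columns: ['internalid', 'custrecord_line_data']          // hypothetical data field
    });

    const map = (context) => {
        const result = JSON.parse(context.value); // one search result per map invocation
        try {
            // ... create the invoice from result.values.custrecord_line_data here ...
            record.submitFields({
                type: 'customrecord_csv_line',
                id: result.id,
                values: { custrecord_line_status: 'processed' }
            });
        } catch (e) {
            record.submitFields({
                type: 'customrecord_csv_line',
                id: result.id,
                values: { custrecord_line_status: 'error' }
            });
        }
    };

    return { getInputData, map };
});
```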
s
@Edgar Valdes are you aware that M/R scripts can parse CSV files directly and efficiently? At least I think that was a use case - no need for the extra overhead of reformatting into custom records.
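I believe this refers to getInputData() being able to return a file reference, in which case the framework feeds the File Cabinet file to map() one line at a time; a minimal sketch under that assumption (the script parameter name is made up):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/runtime'], (runtime) => {

    // Returning a file reference should make the framework pass the CSV to map() line by line.
    const getInputData = () => ({
        type: 'file',
        id: runtime.getCurrentScript()
            .getParameter({ name: 'custscript_csv_file_id' }) // hypothetical parameter: File Cabinet id of the CSV
    });

    const map = (context) => {
        // context.value is one raw line of the CSV.
        const columns = context.value.split(',');
        // ... process the columns for this line here ...
    };

    return { getInputData, map };
});
```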
e
@Shawn Talbert I need the custom records in order to inspect and re-process single lines of the original CSV, each line can have a separate "status" after being processed.
👍 1