# suitescript
e
Every month I will have a CSV file with thousands of lines, let's say between 5 and 10 thousand. For every line, I need to create an Invoice and a Payment. All the required data is in the CSV file (customer id, subsidiary, items, etc.). What's the best approach: Scheduled Script or Map/Reduce?
f
I'd use the CSV file to build SOAP requests, batching the uploads without any server-side processing.
n
Agreed, but between the two (scheduled vs MR), always map/reduce
c
That is a good point on whether you want server-side processing. The CSV batching at least gives you the option to skip it, but the Map/Reduce will fire server-side scripts with XEDIT or EDIT event types, depending on how you set the fields.
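As an aside, a minimal guard along these lines in a User Event script is one common way to keep a bulk job from triggering heavy per-record logic (a sketch only; whether you skip XEDIT, EDIT, or the Map/Reduce context depends on what your scripts do):

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define(['N/runtime'], (runtime) => {
    const afterSubmit = (scriptContext) => {
        // Skip heavy per-record logic when the change comes from an inline edit (XEDIT)
        // or from a Map/Reduce run, so a bulk job doesn't fan out into expensive UE work.
        if (scriptContext.type === scriptContext.UserEventType.XEDIT ||
            runtime.executionContext === runtime.ContextType.MAP_REDUCE) {
            return;
        }
        // ...normal per-record processing here
    };
    return { afterSubmit };
});
```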
e
CSV upload to a custom record (staging table), then an MR script to create the invoices and payments from the custom records. Include a "processed" flag and a "date processed" field to track which ones successfully generated invoices and payments, and use a saved search of the custom record so that you can limit how many records you process in each run. With the staging-table approach you won't need to re-import the CSV when a batch has bad data or fails for one reason or another, and you can run your initial tests of the MR on a single record, then on a small set, to see the results before you run it on everything.
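A minimal sketch of what that Map/Reduce could look like; the staging record type and every `custrecord_stg_*` field id here are hypothetical, and governance management and fuller error handling are left out:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record'], (search, record) => {

    // Hypothetical staging record / field ids -- swap in your own.
    const STAGING = 'customrecord_invoice_staging';

    const getInputData = () => search.create({
        type: STAGING,
        filters: [['custrecord_stg_processed', 'is', 'F']],
        columns: ['internalid']
    });

    const map = (ctx) => {
        const stagingId = JSON.parse(ctx.value).id;
        const stg = record.load({ type: STAGING, id: stagingId });

        // Build the invoice from the staged values.
        const invoice = record.create({ type: record.Type.INVOICE, isDynamic: true });
        invoice.setValue({ fieldId: 'entity', value: stg.getValue({ fieldId: 'custrecord_stg_customer' }) });
        invoice.setValue({ fieldId: 'subsidiary', value: stg.getValue({ fieldId: 'custrecord_stg_subsidiary' }) });
        invoice.selectNewLine({ sublistId: 'item' });
        invoice.setCurrentSublistValue({ sublistId: 'item', fieldId: 'item', value: stg.getValue({ fieldId: 'custrecord_stg_item' }) });
        invoice.setCurrentSublistValue({ sublistId: 'item', fieldId: 'amount', value: stg.getValue({ fieldId: 'custrecord_stg_amount' }) });
        invoice.commitLine({ sublistId: 'item' });
        const invoiceId = invoice.save();

        // Pay it by transforming the invoice into a customer payment.
        record.transform({
            fromType: record.Type.INVOICE,
            fromId: invoiceId,
            toType: record.Type.CUSTOMER_PAYMENT,
            isDynamic: true
        }).save();

        // Mark the staging record processed so the next run's saved search skips it.
        record.submitFields({
            type: STAGING,
            id: stagingId,
            values: { custrecord_stg_processed: true, custrecord_stg_date_processed: new Date() }
        });
    };

    const summarize = (summary) => {
        // Anything that threw stays unprocessed and shows up here; the next run
        // (or a manual resubmit) can pick it up.
        summary.mapSummary.errors.iterator().each((key, err) => {
            log.error({ title: 'Staging record ' + key + ' failed', details: err });
            return true;
        });
    };

    return { getInputData, map, summarize };
});
```

Each map invocation handles one staging record, so a failure only affects that line rather than the whole batch.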
n
Ooo I like that. With an option to manually resubmit a failed line through a UE button or something
👍 1
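One way that resubmit button could be wired, purely as a sketch; the staging field id, the client script path, and the function name are made up, and the client script that actually re-queues the record isn't shown:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define([], () => {
    const beforeLoad = (scriptContext) => {
        if (scriptContext.type !== scriptContext.UserEventType.VIEW) return;

        // Only offer the button on staging records that haven't been processed yet.
        if (scriptContext.newRecord.getValue({ fieldId: 'custrecord_stg_processed' })) return;

        scriptContext.form.clientScriptModulePath = './stg_resubmit_client.js';
        scriptContext.form.addButton({
            id: 'custpage_resubmit_line',
            label: 'Resubmit Line',
            functionName: 'resubmitStagingLine' // defined in the client script above
        });
    };
    return { beforeLoad };
});
```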
f
With volume like that, you'll want to start thinking about moving things out of NetSuite; it gets expensive, both in performance and in cost. Something you will want to measure right away is the time to create a single transaction. Since you are dealing with transactions, you are going to have significant processing. You may consider taking the CSV, creating the transactions, and then slamming them in via SOAP. Previously, on another project where I was running batches of 50K transactions at a time, when I did it using a RESTlet I was able to get a transaction every 3.4 seconds, even after shutting off every possible script. This was a system with 15 processors. So assuming you had 5,000 lines with two transactions each, that's 10,000 * 3.4 / 60 / 60, or about 9.4 hours. When I changed that to SOAP and eliminated the server-side processing, I was able to get a transaction every 0.6 seconds. One nice thing about SOAP is that a lot of people hate it, so you can set every script to not execute in that context. Assuming you could get similar results, your process would take roughly 1.7 hours (10,000 * 0.6 / 60 / 60), less than a fifth of the time. Maybe write it the other way first and keep this one in your back pocket; then you can show them how you optimized the system to run much faster and they'll be very pleased.
🤯 1
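For what it's worth, the back-of-the-envelope math on those two rates (two transactions per CSV line):

```javascript
// Rough throughput comparison using the numbers quoted above.
const transactions = 5000 * 2;                   // one invoice + one payment per CSV line

const restletHours = transactions * 3.4 / 3600;  // 3.4 s each -> ~9.4 hours
const soapHours    = transactions * 0.6 / 3600;  // 0.6 s each -> ~1.7 hours

console.log(restletHours.toFixed(1) + ' h via RESTlet vs ' + soapHours.toFixed(1) + ' h via SOAP');
```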
e
Thanks for the insights @Fred Pope @Nathan L @creece @Eric B
🫡 1
b
Does anyone know of a way to make saved search CSV exports downloadable externally? Do I need to create my own script for this? The application here is importing captured tracking data to Google and Bing for offline conversion tracking. They support scheduled downloads from a URL. Perhaps I can just write the saved search results to a file in the file cabinet and make that file available without login (and keep overwriting the file so that the URL stays the same)? Any other suggestions?
n
A RESTlet that runs the saved search and returns a CSV file, maybe. Then point the scheduled download to the RESTlet URL. That would be safer than overwriting a file, just in case the URL of the file changed for some reason. With the RESTlet you could also save the files in the File Cabinet with a date-time stamp, so you keep a log of everything that's been sent through it.
👀 1
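Roughly what that RESTlet could look like; the saved search id and folder id are placeholders, and the header row / column formatting is left out. One caveat for the original use case: RESTlets can't be called anonymously, so the scheduled download would need to be able to authenticate (e.g., token-based auth headers), which may or may not fit the Google/Bing scheduled-download mechanism.

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType Restlet
 */
define(['N/search', 'N/file'], (search, file) => {

    // Hypothetical saved search id and File Cabinet folder id -- swap in your own.
    const SAVED_SEARCH_ID = 'customsearch_tracking_export';
    const ARCHIVE_FOLDER_ID = 123;

    const get = () => {
        const rows = [];
        search.load({ id: SAVED_SEARCH_ID }).run().each((result) => {
            rows.push(result.columns.map((col) => '"' + (result.getValue(col) || '') + '"').join(','));
            return true; // note: run().each() caps at 4,000 rows; page the search for more
        });
        const csv = rows.join('\n');

        // Keep a timestamped copy in the File Cabinet as a log of everything that was sent.
        file.create({
            name: 'tracking_export_' + new Date().toISOString().replace(/:/g, '-') + '.csv',
            fileType: file.Type.CSV,
            contents: csv,
            folder: ARCHIVE_FOLDER_ID
        }).save();

        // Return the CSV body to the caller (the scheduled download job).
        return csv;
    };

    return { get };
});
```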