# integrations
a
Does anyone know of alternatives for massive imports, other than the normal CSV import?
j
I had to do an import of > 1 million lines.
I put the .csv file in the file cabinet and wrote a map/reduce to go through it and add/update as needed.
a
How long did it take?
j
can’t remember tbh, I just let it run and came back later.
I think I still had to split it up if I recall
I used a shell script to split my CSV, put those files in the File Cabinet, and ran the MR a few times; each run would pick up one file, move it, and process it.
it got through them all eventually.
Better than having to sit around and babysit 50 manual CSV imports.
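(A minimal sketch of that pattern in SuiteScript 2.1, assuming hypothetical folder IDs and a hypothetical staging record type; the shell step is just something like `split -l 25000 big.csv chunk_`:)

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/file', 'N/record', 'N/search'], (file, record, search) => {

    // Hypothetical File Cabinet folder IDs -- replace with your own.
    const PENDING_FOLDER = 123;
    const PROCESSED_FOLDER = 456;

    const getInputData = () => {
        // Grab one pending CSV chunk; re-run the script until the folder is empty.
        const hit = search.create({
            type: 'file',
            filters: [['folder', 'anyof', PENDING_FOLDER]]
        }).run().getRange({ start: 0, end: 1 })[0];
        if (!hit) return []; // nothing left to process

        const csv = file.load({ id: hit.id });

        // Move the file so the next run picks up a different chunk.
        csv.folder = PROCESSED_FOLDER;
        csv.save();

        // One array element per line -> one map() invocation per line.
        return csv.getContents().split(/\r?\n/).slice(1); // slice(1) skips the header
    };

    const map = (context) => {
        // Naive split; use a real CSV parser if fields can contain commas.
        const [externalId, name] = context.value.split(',');
        if (!externalId) return;

        // Hypothetical custom record type and field -- swap in your own.
        const rec = record.create({ type: 'customrecord_import_stage' });
        rec.setValue({ fieldId: 'externalid', value: externalId });
        rec.setValue({ fieldId: 'name', value: name });
        rec.save();
    };

    const summarize = (summary) => {
        // Log failed lines so they can be re-imported later.
        summary.mapSummary.errors.iterator().each((lineKey, err) => {
            log.error({ title: `Failed line ${lineKey}`, details: err });
            return true;
        });
    };

    return { getInputData, map, summarize };
});
```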
t
You can use Celigo data loader. Our data loader tool is free to use: https://docs.celigo.com/hc/en-us/articles/226949488-Create-a-Data-Loader-flow
a
How quickly will it import and create 100k transactions?
t
Tough to answer; it depends on the number of fields, field validations, scripts running, workflows running, etc. On the Celigo side, just set the NetSuite connection concurrency to 25 and we'll keep that maxed out until everything goes through. So the bottleneck is really the concurrency limit you have.
s
Depending on the table, we've seen concurrency issues with anywhere from 6-8 connections when creating custom records. The problem is that even if you have more connections available, database locks can make performance level out at a certain point, and that affects things whether you use CSV import, script, UI, or web services. So there is a practical throughput limit you'll hit. Since this is a one-time import, it's not worth worrying about too much. I'd either use a custom Map/Reduce script or a third-party tool like Celigo just to make it easier, and start it running at the end of the business day if it can wait. You also have to deal with user event scripts, workflows, and any GL plug-ins running on transaction creation; those will also slow things down.
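(One common way to cut that user event overhead during a bulk load, as a minimal sketch rather than anyone's exact setup, is to short-circuit the UE script when the record is coming from an import context:)

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 */
define(['N/runtime'], (runtime) => {
    const beforeSubmit = (context) => {
        // Skip heavy validation/enrichment while a bulk job is creating records.
        const ctx = runtime.executionContext;
        if (ctx === runtime.ContextType.MAP_REDUCE ||
            ctx === runtime.ContextType.CSV_IMPORT) {
            return;
        }
        // ...normal user event logic here...
    };
    return { beforeSubmit };
});
```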
a
Thank you
s
The most important thing is to find a way to verify that all of the transactions were created after the import is done, and to possibly re-import any that failed or are missing. You'll probably want to set a unique external ID on each transaction that can be compared against the original import list.
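(A minimal sketch of that reconciliation, assuming the import stamped a hypothetical IMP- prefix on each external ID; the exact external ID filter and column behavior can vary by record type, so verify in your account:)

```javascript
require(['N/search'], (search) => {
    // Hypothetical: the full list of external IDs the import should have created.
    const expected = ['IMP-00001', 'IMP-00002' /* , ... */];

    const created = new Set();
    search.create({
        type: search.Type.TRANSACTION,
        filters: [
            ['externalidstring', 'startswith', 'IMP-'], 'and',
            ['mainline', 'is', 'T']
        ],
        columns: ['externalid']
    }).run().each((result) => {
        created.add(result.getValue({ name: 'externalid' }));
        return true; // keep iterating (each() pages up to 4,000 results)
    });

    // Anything in the source list but not in NetSuite needs a re-import.
    const missing = expected.filter((id) => !created.has(id));
    log.audit({ title: `${missing.length} missing`, details: missing.join(', ') });
});
```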
a
@Tyler Lamparter does the data loader tool get installed as a SuiteApp, or?
t
The data loader tool lives in Celigo, but you would need our SuiteApp to also be installed in your NS environment: https://docs.celigo.com/hc/en-us/articles/360050643132-Install-and-use-the-integrator-io-SuiteApp
b
For a large import job, we had a script that breaks the large file into 25,000-line CSVs. We scripted the CSV imports as jobs that load into a custom staging table first, then verified the completeness of the import. Other MR scripts then process the staged lines into transactions. We rarely, if ever, see any of the CSV lines getting skipped.
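(For reference, scripted CSV imports like this can be queued from SuiteScript via the N/task module's CSV_IMPORT task type; a minimal sketch, with hypothetical folder and saved-import-map IDs:)

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType ScheduledScript
 */
define(['N/task', 'N/file', 'N/search'], (task, file, search) => {
    const execute = () => {
        const CHUNK_FOLDER = 789;   // hypothetical folder holding the 25,000-line chunks
        const IMPORT_MAP_ID = 201;  // hypothetical saved CSV import map

        search.create({
            type: 'file',
            filters: [['folder', 'anyof', CHUNK_FOLDER]]
        }).run().each((result) => {
            const csvTask = task.create({ taskType: task.TaskType.CSV_IMPORT });
            csvTask.mappingId = IMPORT_MAP_ID;
            csvTask.importFile = file.load({ id: result.id });
            // Jobs queue behind each other; poll with task.checkStatus({ taskId }).
            const taskId = csvTask.submit();
            log.audit({ title: 'Queued CSV import', details: taskId });
            return true;
        });
    };
    return { execute };
});
```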