# suitescript
n
Good day, I'm brainstorming how to approach a full data backup of multiple NetSuite (non-OneWorld) accounts so I can then merge it into a new master account (OneWorld). The goal is to verify the data from each account (which will become subsidiaries) against the new master account before the other instances are shut down, ensuring all the data is correct before they transition into the new account. A backup of the data in CSV is required.

I have the customization, roles, and searches all sorted out; that was easy. I know there will be some changes, setup, and more challenges to come... The hard part is extracting the large amount of data. I'm trying to efficiently extract every single field for every record, regardless of its form, without missing anything. I don't want to sit and manually select every field in a saved search to pull all the data.

I was thinking about a script that would produce CSV files in a folder in the File Cabinet, which you could then download as a single zip, assuming these may need to be split due to governance. Or even have it set up to pull data from the new master account and import it via a RESTlet. What's your opinion on how to approach this as quickly and efficiently as possible? And for fun, I have to do this in less than 2 months. Thoughts?
b
i vote sit down and manually select every field, using a combination of the Add Multiple button and shift+clicking to select multiple fields to create your saved searches
that's probably not the hard part. You're probably going to do a slow, boring process of exporting the data, reimporting it using CSV Import, re-exporting it from your new account, and then comparing it to your original files to verify the data
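A minimal sketch of that verify step, assuming you stamp each record's External ID with its source internal id during the CSV import (re-imported records get new internal ids, so the two exports can only be joined on something you control, like `externalid`). The CSV parser here is naive (no quoted commas) and is just for illustration; the field names in the test data are made up.

```javascript
// Parse a simple CSV (header row + data rows, no quoted commas)
// into an array of { header: value } objects.
function parseCsv(text) {
  const [header, ...rows] = text.trim().split('\n').map(line => line.split(','));
  return rows.map(cells =>
    Object.fromEntries(header.map((h, i) => [h, cells[i]])));
}

// Diff the re-export from the new account against the original export,
// keyed on idField, and report every field-level mismatch.
function diffById(oldCsv, newCsv, idField) {
  const oldRows = new Map(parseCsv(oldCsv).map(r => [r[idField], r]));
  const mismatches = [];
  for (const row of parseCsv(newCsv)) {
    const original = oldRows.get(row[idField]);
    if (!original) {
      mismatches.push({ id: row[idField], reason: 'missing in old export' });
      continue;
    }
    for (const key of Object.keys(row)) {
      if (original[key] !== row[key]) {
        mismatches.push({ id: row[idField], field: key, was: original[key], now: row[key] });
      }
    }
  }
  return mismatches;
}
```

An empty result means the slice of data you exported round-tripped cleanly; anything else is a per-record, per-field discrepancy list you can hand back to whoever owns that record type.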
n
lol
I started creating a few just to get an idea of how long it would take to do this for EVERY single record. It shouldn't be too bad.
Yuppers. On top of that, some accounts will stay active, so I'll have to pull the new data and do it again.
I was playing with the idea of loading the record object (which pulls everything that exists) into an array, working some magic to split it up, and creating the CSVs in the File Cabinet.
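That "split it up and write CSVs" idea could be sketched with two plain-JS helpers that would drop into a Map/Reduce script. The `N/file` / `N/search` usage is only shown in comments because the record types, folder id, and chunk size are account-specific assumptions, not anything from a real setup.

```javascript
// Escape one row of values per RFC 4180 so commas, quotes, and
// newlines inside field values survive the round trip.
function toCsvLine(values) {
  return values.map(v => {
    const s = v === null || v === undefined ? '' : String(v);
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  }).join(',');
}

// Split the collected rows into fixed-size chunks so each generated
// file stays well under the N/file contents limit. rowsPerFile is a
// guess -- tune it against your widest record type.
function chunkRows(rows, rowsPerFile) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += rowsPerFile) {
    chunks.push(rows.slice(i, i + rowsPerFile));
  }
  return chunks;
}

// In a Map/Reduce summarize stage you'd then do roughly:
//   chunkRows(allRows, 50000).forEach((chunk, n) => {
//     file.create({
//       name: recordType + '_part' + n + '.csv',
//       fileType: file.Type.CSV,
//       contents: [toCsvLine(header)].concat(chunk.map(toCsvLine)).join('\n'),
//       folder: EXPORT_FOLDER_ID  // hypothetical File Cabinet folder id
//     }).save();
//   });
```

Keeping the CSV assembly pure like this also means you can unit test it outside NetSuite, which matters on a 2-month clock.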
b
you are essentially doomed on a 2-month timeframe if you have to rely on scripts to do your imports; your only hope is CSV imports
👍 1
n
lol, I'll use that inevitable doom as fuel to push this across the finish line 🙂
d
@netsuiteapplepie, check out https://www.tacticalconnect.com/
It exports data based off the Saved Searches you build. You can also export the results as CSVs (regardless of dataset size) to many endpoints like Azure, SharePoint, etc. All of this can be scheduled too.
@Josh_SatoriReporting is the man to speak to
n
Thanks Darren! I've used the ODBC before, and this looks like a good alternative.
j
@netsuiteapplepie let me know if you'd like to see a demo of Tactical Connect and/or start a free trial
r
Have you tried something like integrator.io? This seems like a straightforward use case for it, and depending on the number of flows you might be able to do it for free.