# general
j
Hi All - has anyone got some good solutions for extracting large amounts of data from Saved Searches in NetSuite? I'm hitting the row/size limits when trying to have a scheduled email of the results sent to me - from my experience the row limit is 10k rows and the size limit is 5 MB. I'm trying to extract as CSV and have already broken the query down into smaller chunks, but I still have around 20k rows per month.
m
We could make a tool for you pretty easily if you want. We've set up things like this for other clients in the past. Also, depending on where you want the data to end up, we have a data integration tool to sync saved searches into Google Sheets.
Where did you want the CSV files "sent to"?
j
What methods did you use? I've seen people online talking about using custom scripts to send the CSV extract, or using the SOAP/REST connection. We'll initially just send to a shared Google Drive folder, though eventually to a cloud storage bucket.
I'm happy to script it myself - just looking for ideas / inspiration as to where to start
m
You could probably do this with the integrator.io free tier.
i
A Map/Reduce script is the most applicable: the map stage splits the data and the reduce stage processes those chunks, which gives you multiple threads (5/10) if you have the Oracle add-on module that unlocks extra processors. It can handle as much data as you want if you set it up correctly. Don't use a scheduled script - the yield functionality will work, but Map/Reduce was built for exactly this requirement and replaced it. I'd also suggest using N/query to speed things up, but running a search in the code will be sufficient.
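Rough sketch of what that looks like in SuiteScript 2.1 - the saved search id (customsearch_monthly_extract), the chunk size, and the column ids are all placeholders you'd swap for your own:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/log'], (search, log) => {

    // getInputData: return the saved search; the framework streams every
    // result into map(), so normal search result limits don't apply.
    const getInputData = () => {
        return search.load({ id: 'customsearch_monthly_extract' }); // placeholder id
    };

    // map: key each row by a chunk number so reduce() receives manageable groups.
    const map = (context) => {
        const result = JSON.parse(context.value);
        // Rough chunking by internal id range (assumption - pick whatever bucketing suits you).
        const chunkKey = Math.floor(Number(result.id) / 5000);
        context.write({ key: String(chunkKey), value: context.value });
    };

    // reduce: turn one chunk of rows into CSV lines and hand the text to summarize().
    const reduce = (context) => {
        const lines = context.values.map((v) => {
            const row = JSON.parse(v).values;
            // Column ids depend on your search definition - these are examples.
            return [row.tranid, row.trandate, row.amount].join(',');
        });
        context.write({ key: context.key, value: lines.join('\n') });
    };

    // summarize: log what was produced; file creation / delivery goes here.
    const summarize = (summary) => {
        summary.output.iterator().each((key, value) => {
            log.audit(`chunk ${key}`, `${value.length} characters of CSV`);
            return true;
        });
    };

    return { getInputData, map, reduce, summarize };
});
```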
m
Yes - Map/Reduce to run the search and render the CSV file, then use the N/https module to send it to Google Drive. Or, if you have an email-to-folder capability, you can email it to a storage folder (I think Box or Dropbox can do this).
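Something like this for the summarize() stage - merge it into the Map/Reduce above. The File Cabinet folder id, the Drive simple-upload call, and especially how you get the OAuth bearer token (here just a script parameter) are assumptions; in practice you'd want a service account and proper token refresh:

```javascript
// Add N/file, N/https and N/runtime to the define() of the Map/Reduce above,
// then replace its summarize() stub with this version.
define(['N/file', 'N/https', 'N/runtime', 'N/log'], (file, https, runtime, log) => {

    const summarize = (summary) => {
        // Assumption: a Google Drive bearer token is stored in a script parameter.
        const token = runtime.getCurrentScript().getParameter({ name: 'custscript_drive_token' });

        summary.output.iterator().each((key, csvText) => {
            // Keep a copy in the File Cabinet (folder id is a placeholder).
            const csvFile = file.create({
                name: `extract_chunk_${key}.csv`,
                fileType: file.Type.CSV,
                contents: csvText,
                folder: 1234
            });
            csvFile.save();

            // Google Drive v3 simple upload: sends the raw CSV as the file body.
            // Setting the file's name/parent folder needs a follow-up metadata call (omitted).
            const response = https.post({
                url: 'https://www.googleapis.com/upload/drive/v3/files?uploadType=media',
                body: csvText,
                headers: {
                    'Content-Type': 'text/csv',
                    Authorization: `Bearer ${token}`
                }
            });
            log.audit(`chunk ${key} uploaded`, `HTTP ${response.code}`);
            return true;
        });
    };

    return { summarize };
});
```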