# suitescript
r
Hi, I have 1 million records and am writing a Map/Reduce script to delete them, but I'm getting a "Usage limit exceeded" error. What do I need to do?
d
Maybe run the script a couple of times.
s
No need to write a Map/Reduce. I'm using this public bundle, Techfino Mass Deletion Script. If any error occurs, you can run the script again until it finishes.
r
@David Na I'm getting the error in the getInputData stage only; it's unable to load all the records at once.
k
@Ravi it may be worth limiting your search to a certain number of records, then passing those to delete. In the summarize stage you could then queue the same script to run again. Also, in getInputData you could throw an error if no more records exist.
☝️ 1
n
Do not run getRange() in the input stage. Just load your search and return it from the getInputData stage. map will run for each record, and you can delete that record there.
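Following that advice, a minimal Map/Reduce delete script might look like this sketch. The saved search ID `customsearch_to_delete` is a hypothetical placeholder; getInputData returns the search object itself, so the framework pages through the results instead of loading everything at once:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record'], (search, record) => {
    // Return the search itself; the framework streams results
    // into map without loading them all into memory at once.
    const getInputData = () => search.load({ id: 'customsearch_to_delete' });

    // map is invoked once per search result.
    const map = (context) => {
        const result = JSON.parse(context.value);
        record.delete({ type: result.recordType, id: result.id });
    };

    // Log any per-record delete failures so the run can be retried.
    const summarize = (summary) => {
        summary.mapSummary.errors.iterator().each((key, error) => {
            log.error(`Failed to delete record ${key}`, error);
            return true; // keep iterating over remaining errors
        });
    };

    return { getInputData, map, summarize };
});
```

Because each delete happens in its own map invocation, governance usage is metered per record rather than against a single script's limit.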
a
For this use case a Mass Update script would be better, but if you still want to use Map/Reduce, the solution from @NickSuite is the right way to go for this number of records.
👍 1
k
For a million records, depending on the data in the columns, you will normally go over the 10 MB getInputData limit. As stated, I would limit the search and return a keyed object of your own making.
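Combining the two suggestions above, a sketch of the batched approach might look like this. The batch size, saved search ID `customsearch_to_delete`, and script ID `customscript_mass_delete` are all assumed placeholders; getInputData returns a capped object keyed by internal ID, and summarize requeues the script if records remain:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/record', 'N/task'], (search, record, task) => {
    // Assumed cap; tune so the serialized input stays under the 10 MB limit.
    const BATCH_SIZE = 100000;

    // Return a limited slice of results, keyed by internal id.
    const getInputData = () => {
        const results = {};
        let count = 0;
        search.load({ id: 'customsearch_to_delete' }).run().each((result) => {
            results[result.id] = result.recordType;
            count += 1;
            return count < BATCH_SIZE; // stop paging once the batch is full
        });
        return results;
    };

    // Each key/value pair (internal id / record type) arrives here.
    const map = (context) => {
        record.delete({ type: context.value, id: context.key });
    };

    // If the search still returns results, queue the script again.
    const summarize = () => {
        const remaining = search.load({ id: 'customsearch_to_delete' })
            .run().getRange({ start: 0, end: 1 });
        if (remaining.length > 0) {
            task.create({
                taskType: task.TaskType.MAP_REDUCE,
                scriptId: 'customscript_mass_delete' // hypothetical script id
            }).submit();
        }
    };

    return { getInputData, map, summarize };
});
```

The search must be one whose results shrink as records are deleted, so each rerun naturally picks up the next batch.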
r
Thanks all for the help.
n
@alien4u can a mass update script be used for deleting records?
a
@NickSuite Yes and very efficiently...
👍 1
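For reference, a Mass Update script for deletion can be very short. This is a minimal sketch; the user picks the record type and saved search criteria in the UI, and NetSuite calls each() once per matched record:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MassUpdateScript
 */
define(['N/record'], (record) => {
    // each() receives the type and internal id of one matched record.
    const each = (params) => {
        record.delete({ type: params.type, id: params.id });
    };
    return { each };
});
```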
m
A Mass Update will be single-threaded, though. With Map/Reduce you can use multiple processors. Another option is to use integrator.io; you can do this on the free tier.