Option 1: a RESTlet triggered by a crawler. Filter your search/query inside it based on system notes, get your results, build the JSON, and sync all the records at once.
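A minimal sketch of the payload-building step, in plain JavaScript rather than actual SuiteScript (the row shape, `modified` field, and `lastSyncedAt` parameter are assumptions for illustration, not NetSuite APIs):

```javascript
// Hypothetical sketch: keep only rows modified after the last successful
// sync (mimicking a system-notes date filter), then map them to the JSON
// payload the external API expects. Field names are illustrative only.
function buildSyncPayload(rows, lastSyncedAt) {
  var cutoff = new Date(lastSyncedAt).getTime();
  return rows
    .filter(function (r) {
      return new Date(r.modified).getTime() > cutoff;
    })
    .map(function (r) {
      return { id: r.id, name: r.name, modified: r.modified };
    });
}
```

In a real RESTlet you would build the equivalent filter with a saved search or `N/search` and a system-notes date column, then POST the resulting array in one batch.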
Do note: there is a high risk of data inconsistency with this approach, caused by searches/filters not handling date-times correctly, errors on either side, or NetSuite/the crawler going down at the moment your RESTlet was supposed to trigger.
Once it comes back up, even if you call the RESTlet, your search may return inaccurate results. Which is why I personally don't prefer customizations based on system notes.
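To make that downtime gap concrete, here's a tiny plain-JS illustration (again hypothetical names, not NetSuite APIs): if the next run naively queries "modified since the scheduled trigger time", anything changed while the system was down falls outside the window and is silently skipped.

```javascript
// Returns the records a naive "modified since the scheduled run time"
// query would MISS: anything changed before that cutoff, e.g. during
// an outage between the last successful run and the scheduled one.
function missedDuringDowntime(records, naiveSince) {
  var since = new Date(naiveSince).getTime();
  return records.filter(function (r) {
    return new Date(r.modified).getTime() < since;
  });
}
```

The safe fix is to window from the last *successful* sync timestamp, not the scheduled time, but that means persisting and maintaining that timestamp yourself.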
The best way to handle your use case, I feel, is a user event script.
So even if the sync fails for any reason, all you have to do is fix the problem and load-and-save the record, which you can do through the UI, a script, or a CSV import.
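Here's a rough sketch of that handler's logic in plain JS, assuming a user event `afterSubmit`-style flow. `pushToApi` and the `syncFailed` flag are made-up names for illustration; in real SuiteScript the flag would be a custom field you set via `N/record`:

```javascript
// Hypothetical afterSubmit-style handler: push the saved record to the
// external API; on failure, mark the record so a later load-and-save
// (UI, script, or CSV import) simply re-fires this handler and retries.
function afterSubmitSync(rec, pushToApi) {
  try {
    pushToApi({ id: rec.id, values: rec.values });
    rec.syncFailed = false; // assumed custom flag, cleared on success
  } catch (e) {
    rec.syncFailed = true;  // flagged for easy retry via re-save
  }
  return rec;
}
```

The design point is that the retry mechanism is just "save the record again" — no separate queue or timestamp bookkeeping.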
This is just my personal preference: even if it adds 4-5 seconds when saving the record, the reliability outweighs whatever benefit you might get from the other approach.
I have implemented all of these, and the user event felt like the best one.
That's assuming the API being called is fast enough to process it.
I've never had issues syncing to Algolia, Azure, 8base, and quite a few other DBs this way.
Alternatively, sync everything updated/created via a map/reduce (MR) script on a schedule, but again you need either system notes or a checkbox on each of those records to know what changed.
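The checkbox variant of the MR approach can be sketched like this in plain JS (the `syncPending` flag stands in for a custom checkbox field such as a hypothetical `custbody_sync_pending`; none of these names are real NetSuite fields):

```javascript
// Hypothetical map/reduce-style pass: pick up records whose sync
// checkbox is set, sync them, then clear the flag so the next
// scheduled run skips them.
function collectPending(records) {
  return records.filter(function (r) {
    return r.syncPending === true;
  });
}

function clearFlag(r) {
  r.syncPending = false; // unchecked after a successful sync
  return r;
}
```

The checkbox would be set by a user event on save (or by default on create), so the scheduled MR only ever touches records that actually changed.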