# integrations
d
Hi - for one of our customers, we want to implement a record-listener concept: when master records change, those changes should automatically flow to a data lake and update a table in a SQL database sitting on an on-premise server. I'd like to hear advice from the experts here on how we can achieve this functionality of replicating master records to another store, possibly a data lake. Thanks!
r
One user event with a schema for each record type you want to sync: based on the record type, fetch the schema, call the API, and sync your stuff. Depending on your use case and personal preferences, you have 2 options: 1. Compare the old record vs. the new record and only sync when the fields you care about changed. 2. Or just sync regardless. I personally prefer the 2nd, but it depends on the volume. Another way is to create an MR (Map/Reduce) script and sync at regular intervals. Yet another way is to create a RESTlet and deploy a crawler on a server that calls your RESTlet and syncs data at regular intervals (personally my least preferred option). There are some more ways, but these 3 are the most customizable ones.
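The old-vs-new comparison in option 1 can be sketched as a small helper. In a real user event you'd pass `context.oldRecord` and `context.newRecord`; here a plain object with a `getValue` method stands in for the record API, and all field names are illustrative:

```javascript
// Return the subset of `fields` whose values differ between the old
// and new record. `oldRec` may be null on create, in which case every
// field counts as changed.
function changedFields(oldRec, newRec, fields) {
  if (!oldRec) return fields.slice();
  return fields.filter(function (fieldId) {
    return oldRec.getValue(fieldId) !== newRec.getValue(fieldId);
  });
}

// Stand-in for the record API: a record is just a map of field values.
function fakeRecord(values) {
  return { getValue: function (fieldId) { return values[fieldId]; } };
}

var before = fakeRecord({ companyname: 'Acme', email: 'a@acme.com' });
var after  = fakeRecord({ companyname: 'Acme Ltd', email: 'a@acme.com' });
console.log(changedFields(before, after, ['companyname', 'email'])); // [ 'companyname' ]
```

If the changed-field list comes back empty you can skip the API call entirely, which is where option 1 pays off on high-volume records.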
b
The M/R approach is the best one; the user event would hinder performance on every save.
I built something like this for a data lake that accepted CSV files dropped via SFTP. I exported the results of a saved search using the N/task module; once the file had been exported to the File Cabinet, I used N/sftp to drop that file on the SFTP server, and their server processed it right away. The process was very efficient and quick and supported very large quantities of data; we are talking over 1 GB worth of data exported in 10 minutes.
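The assembly step of an export like that boils down to turning search rows into RFC 4180-style CSV before the file goes out. A minimal sketch, assuming the rows are already plain arrays (in NetSuite they'd come from the saved-search results):

```javascript
// Quote a CSV cell if it contains a comma, quote, or newline, doubling
// any embedded quotes (RFC 4180 style).
function csvCell(value) {
  var s = String(value == null ? '' : value);
  return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
}

// Join a header row and data rows into one CSV string.
function toCsv(header, rows) {
  return [header].concat(rows)
    .map(function (row) { return row.map(csvCell).join(','); })
    .join('\n');
}

var csv = toCsv(
  ['internalid', 'name'],
  [[101, 'Acme, Inc.'], [102, 'Bolt "B" Co']]
);
console.log(csv);
// internalid,name
// 101,"Acme, Inc."
// 102,"Bolt ""B"" Co"
```

Getting the quoting right matters here, because free-text fields with commas or line breaks are exactly what corrupts a bulk load on the data-lake side.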
d
Appreciate the response! If we want to sync data in real time, is there any option like record listeners, where the other system can watch for changes and sync, instead of us deploying a script on every master record and calling the other system? The primary concern I have here is the potential performance impact, because the request to the external URL takes time to complete and return a status.
r
The client script fieldChanged and other entry points kind of work like a listener, but you wouldn't want to sync something that might never get saved. Only after the record is successfully saved should you sync, in which case the user event afterSubmit is your friend.
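The afterSubmit hand-off can be sketched as below. In SuiteScript 2.x the event type arrives as a string on `context.type` ('create', 'edit', 'xedit', 'delete', ...); here the record object is mocked so the logic runs outside NetSuite, and in a real script the returned payload would be POSTed via N/https:

```javascript
// Event types that should trigger a push to the data lake.
var SYNCED_EVENTS = ['create', 'edit', 'xedit'];

function shouldSync(context) {
  return SYNCED_EVENTS.indexOf(context.type) !== -1;
}

// afterSubmit only fires once the record has been saved, so anything
// we see here is committed data, safe to sync.
function afterSubmit(context) {
  if (!shouldSync(context)) return null;
  // In a real script: https.post({ url: ..., body: JSON.stringify(payload) })
  return {
    recordType: context.newRecord.type,
    id: context.newRecord.id
  };
}

console.log(afterSubmit({ type: 'edit', newRecord: { type: 'customer', id: 42 } }));
console.log(afterSubmit({ type: 'delete', newRecord: { type: 'customer', id: 42 } }));
```

Filtering on event type up front is what keeps views and irrelevant events from generating outbound calls.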
d
Basically, the requirement is to sync data in real time from NetSuite to the data lake for master and transaction records. Is there something like a record listener, where the other system can watch for and fetch the data, without us writing scripts or deploying user event scripts?
r
A RESTlet triggered from a crawler: filter your search/query in it based on system notes, get your results, build the JSON, and sync all the records at once. Do note there's a high risk of data inconsistency: searches/filters not handling date-time properly, an error on either side, or NetSuite/the crawler going down at the moment your RESTlet was supposed to trigger. Once it comes back up, even if you call the RESTlet, your search might end up giving inaccurate results. Which is why I don't personally prefer customization based on system notes.

The best way to handle your use case, I feel, is a user event. Even if the sync fails for any reason, all you have to do is fix the problem and load and save the record, which you can do through the UI, a script, or a CSV import. This is just my personal preference: even if it adds 4-5 seconds to saving the record, the reliability outweighs the cost. I have implemented all of these, and I felt the user event was the best one, assuming the API being called is fast enough to process it. I never had issues syncing to Algolia, Azure, 8base, and quite a few other DBs.

Or sync everything created/updated from an MR on a scheduled basis, but again you need either system notes or a checkbox on each of those records.
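One way to soften the boundary problems described above is to overlap each polling window with the previous run and de-duplicate on internal id on the receiving side. A sketch of that idea (all names illustrative, timestamps in ms since epoch):

```javascript
// Build a [from, to] window that reaches back `overlapMs` before the
// last successful run, so records saved right at the boundary (or
// during an outage) are not silently missed.
function pollWindow(lastRunMs, nowMs, overlapMs) {
  return { from: Math.max(0, lastRunMs - overlapMs), to: nowMs };
}

// The overlap means the same record can be fetched twice; drop
// duplicates by internal id before loading into the lake.
function dedupeById(records) {
  var seen = {};
  return records.filter(function (r) {
    if (seen[r.id]) return false;
    seen[r.id] = true;
    return true;
  });
}

var win = pollWindow(1000000, 1900000, 300000);
console.log(win); // { from: 700000, to: 1900000 }
console.log(dedupeById([{ id: 1 }, { id: 2 }, { id: 1 }]).length); // 2
```

This doesn't remove the risk r describes, it just converts "missed records" into "duplicate fetches", which are much easier to handle downstream.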
b
Could be a mix as well: have a user event set a checkbox and trigger a Map/Reduce script to sync, and in addition have the Map/Reduce running every 15 minutes to sync the data. It's near real time.
r
Or what Adolfo said. Flip the checkbox every time there is a change, and your MR will pick up the work and clear it once synced.
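The checkbox-plus-Map/Reduce mix above works like a persisted work queue: the user event marks the record dirty, and the scheduled script drains whatever is flagged. A toy model of that hand-off, with a plain object standing in for a record and its checkbox field (in NetSuite the flag would be a custom checkbox like a hypothetical `custrecord_needs_sync`):

```javascript
// What the user event does on save: flag the record for pickup.
function markDirty(record) {
  record.needsSync = true;
}

// What the Map/Reduce pass does every 15 minutes: sync each flagged
// record and clear the flag only on success, so failures are retried
// automatically on the next run.
function drainQueue(records, syncFn) {
  var synced = [];
  records.forEach(function (record) {
    if (!record.needsSync) return;
    if (syncFn(record)) {
      record.needsSync = false;
      synced.push(record.id);
    }
  });
  return synced;
}

var records = [{ id: 1 }, { id: 2 }];
markDirty(records[0]);
console.log(drainQueue(records, function () { return true; })); // [ 1 ]
```

Because the flag lives on the record itself, nothing is lost if the Map/Reduce or the external API is down; the next scheduled run simply finds the same records still flagged.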