# suitescript
c
I have a client that consistently has large (for NetSuite) orders. We're looking at up to 800 items on a transfer order or inter-company sales order (occasionally even bigger). This means that scripts are always going over usage limits (particularly where values are calculated and saved against all line items). I've tried to work with the business to reduce line counts (breaking orders out into multiple smaller ones etc.) but that's being met with massive resistance.

Without the business changing their processes, my only option is to refactor a bunch of their scripts to use N/task to spin up scheduled scripts or map/reduce scripts to handle write operations against item lists where size > PREDEFINED_SIZE. It's going to be a pain to refactor their existing scripts (a large number), plus the regression test effort and risk. To be honest, I think NetSuite is a bit crap here; it's not uncommon for wholesalers and retailers to have large orders. Has anyone else worked with clients that consistently have a large number of lines on an order?
a
I assume these are afterSubmit UE scripts? There's no magic bullet here; refactoring to hand off to a Suitelet / MR is the way to go. You should be able to take the code out of the UE script and put it directly into the MR/Suitelet though - you shouldn't need to rewrite it, the code is presumably fine as it is.
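Something like this in the afterSubmit, maybe - just a sketch, the script/deployment/parameter ids and the 200-line cutoff are all made up:
```js
// afterSubmit sketch: run the existing logic inline for small orders,
// hand big ones off to a map/reduce with a much larger governance pool.
// All ids below are placeholders.
define(['N/task'], function (task) {
    var LINE_LIMIT = 200; // the PREDEFINED_SIZE threshold

    function afterSubmit(context) {
        var rec = context.newRecord;
        var lineCount = rec.getLineCount({ sublistId: 'item' });

        if (lineCount <= LINE_LIMIT) {
            processLinesInline(rec); // existing per-line logic, unchanged
            return;
        }

        // queue the same logic as a map/reduce and return immediately
        task.create({
            taskType: task.TaskType.MAP_REDUCE,
            scriptId: 'customscript_process_lines_mr',
            deploymentId: 'customdeploy_process_lines_mr',
            params: { custscript_target_tran_id: rec.id }
        }).submit(); // throws if no deployment is free, so worth a retry/queue strategy
    }

    function processLinesInline(rec) {
        // the code lifted straight out of the old UE script
    }

    return { afterSubmit: afterSubmit };
});
```
The MR just loads the transaction by the id parameter and runs the exact same per-line code, so the logic itself doesn't change.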
c
Some are beforeSubmit, some are afterSubmit. You're right, it's afterSubmit.
a
Just having lines doesn't use governance though... there has to be something happening on a per-line basis that has a governance cost associated with it. What would that be exactly? Editing lines on the current record doesn't cost governance.
I'm assuming it's a governance issue and not a timeout issue?
c
To be honest, I think I know the answer: lift and shift what I can into scripts with larger governance limits and call them with N/task. It's just painful, and the CTO is convinced NS doesn't support wholesale distribution because of this.
Governance issue but sometimes timeout too
a
If you're doing lookups to items or something for additional data, that would cost governance. You can avoid that by sourcing more data from the item onto the line so it's already there, for example.
but that IS a refactor of the current logic and you'd have to invest the time to improve the performance on any/all scripts that are giving you the issue.
Handing off via task might be a sledgehammer to crack a nut, but it's an approach that WILL work, you shouldn't have any surprises in there, and there are no logic changes even.
c
I think there are multiple wins that could be had:
1. Remove searches from loops
2. Calculate as much as possible in the script instead of relying on sourced data (where possible)
3. Move looping submitFields() calls to M/R, scheduled scripts etc. when the line item count is large
a
Depending on what data is required, you can also leverage N/cache - that can be a real governance cost and time saver depending on the script.
1. Yes
2. Sourced data that's on the record? That should be fine... that is quick and free
3. Yes
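For the N/cache idea, roughly this shape - the cache name, key format and lookup columns are just placeholders:
```js
// N/cache sketch: pay the item lookup cost once per TTL window instead of
// on every script execution. Cache name and columns are assumptions.
define(['N/cache', 'N/search'], function (cache, search) {
    function getItemData(itemId) {
        var itemCache = cache.getCache({
            name: 'ITEM_DATA',
            scope: cache.Scope.PROTECTED
        });

        var raw = itemCache.get({
            key: 'item_' + itemId,
            // loader only runs on a cache miss
            loader: function () {
                return JSON.stringify(search.lookupFields({
                    type: search.Type.ITEM,
                    id: itemId,
                    columns: ['cost', 'custitem_extra_info'] // placeholder columns
                }));
            },
            ttl: 3600 // seconds
        });

        return JSON.parse(raw);
    }

    return { getItemData: getItemData };
});
```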
c
This does all point to NS being poor at handling wholesale use cases though. I can refactor every line of custom code they have, but that doesn't stop a 1,500-line TO taking 5 minutes to save or open.
a
Don't have any real experience with TOs specifically; generally NS recommend (and normally enforce) 1000 lines max, I thought?
JEs being the obvious exception.
TOs are what exactly? Like inventory transfers between locations or something?
👍 1
c
1000 lines per order is still too big for NS to be honest.
especially on heavily scripted records.
a
So it's FROM location A TO location B at the header level, and then lines of items with qty? Or can each line have different TO/FROM locations? Or is just TO or just FROM at the header level?
c
Always header level for the locations
The issue is when you need to write data to each line.
a
I know you said you tried to get them to change process and reduce lines... could you maybe do that with script? Before it does anything else, take the TO that's just been saved, paginate the lines into blocks of, idk, 100, and generate as many child TOs as you need?
Just brainstorming, it's probably a dumb idea.
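Something vaguely like this, completely untested - the chunk size is arbitrary and I'm assuming record.copy carries the header across cleanly:
```js
// Very rough sketch of splitting an oversized TO into child TOs.
// Chunk size and field handling are assumptions; not production code.
define(['N/record'], function (record) {
    var CHUNK_SIZE = 100; // arbitrary

    function splitTransferOrder(sourceId) {
        var childIds = [];
        var total = record.load({ type: record.Type.TRANSFER_ORDER, id: sourceId })
            .getLineCount({ sublistId: 'item' });

        for (var start = 0; start < total; start += CHUNK_SIZE) {
            // copy carries the header (from/to locations, dates, memo) across
            var child = record.copy({ type: record.Type.TRANSFER_ORDER, id: sourceId });

            // drop every line outside the current chunk, from the bottom up
            // so the remaining line indexes stay valid
            for (var line = total - 1; line >= 0; line--) {
                if (line < start || line >= start + CHUNK_SIZE) {
                    child.removeLine({ sublistId: 'item', line: line });
                }
            }
            childIds.push(child.save());
        }
        return childIds; // the original TO would still need closing/deleting separately
    }

    return { splitTransferOrder: splitTransferOrder };
});
```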
c
That's one of my open suggestions: have a script just chop the order up into multiple orders. That requires lots of changes to downstream systems and reporting, which nobody is willing to do.
It's not a dumb idea, I've done that elsewhere.
ACS wrote a script to do that for sales orders (a test in sandbox); ironically, that script hits governance limits :-D
a
facepalm
😂 2
c
I think the solutions here are well understood, just a pain in the ass - every time I get a wholesale client, I know volume is going to be a pain point.
a
Yeah, sounds like you know the things that can/need to be done... just lots of time/pain and a pissy client till it's resolved.
s
What are the processes you need to trigger per line?
a
Refactoring will be due if you have any of these:
• A search or searchLookup inside any loop.
• submitFields inside any loop.
• Updating a bunch of records outside the transaction for whatever reason.
(My personal preference is to avoid loops as much as I can.)

The problems you are describing look more like design/code problems and not necessarily a NetSuite inability to handle wholesale customers. I'm not saying NetSuite is fast with 1000 lines, but it can handle a great deal of well-designed scripting against 1000-line records without running into any governance issues.
b
In general, what alien4u mentioned is correct: you can't do any logic that consumes governance points per line.
A lot of the time you can get out of doing that by optimizing your logic, the most obvious example being one search/query for the entire record instead of one per line.
🙌 1
e
Have you looked into using a staging table (custom record) to save the records, run the necessary processes against it, and then use the staging table to create the sales orders instead of directly creating the SOs in the native form?
c
You'll find many poorly designed scripts out in the wild. My job here is to identify all the use cases for writing data to each line item and look for a better way to achieve the requirement. The problem is the volume of scripts plus the downstream systems that already use the line item data.

There's one script here that writes a unit cost to each line item which is then consumed downstream by multiple other systems and reports - even if I manage to remove forEach(line items) => { submitFields() // loops 900 times... } from the script and come up with another technical architecture, there would be a tremendous amount of re-work required downstream.

It used to be the case that NetSuite could operate in a vacuum when it was mainly smaller customers, but the use cases have become larger over the years and one change to a script can have a ripple effect that touches upstream and downstream systems and processes.
I've got a script here that adds a timestamp to each item in the sublist and that timestamp is used downstream - if there are 800 items it will fail. It's hard to easily remove that without a ton of rework.
e
My personal observation is that governance doesn't seem to be an issue when records are being created/updated using SOAP web services. When I first started out, I wrote C#/.NET console applications that connected to NS via SOAP and never had these governance limits hamper those CRUD operations.
c
As long as you don't overwhelm SuiteTalk, that's fine. You can easily send too much data at once though. I've had to import 700,000 journal entries at end of month using the SFDC bulk API; that can get tricky when the data needs to land in a short timeframe.
Either way, I 100% feel like NS has some fundamental limitations here (stuff that would work in S/4HANA), and it's too simplistic to suggest removing all use cases that require writing data to each line item, although I certainly do try to limit that where possible.
a
But this would happen with any system; no system that I know of can selectively skip poor code to perform efficiently. It's still not a NetSuite problem, it's a technical debt problem. Code refactoring is due even if it's expensive; this is commonly the cost of initially cheap labour, or of allowing anyone to write your business process code without proper oversight. NetSuite scripts with searches, lookupFields, or submitFields inside loops are simply unacceptable because that is not scalable at all.
c
I'd love to read some stories where people have avoided / designed around having to write to each line item for various requirements / use cases.
a
Writing to lines should not be a problem at all; gathering the data needed to write to the lines could very well be the problem.
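E.g. once the per-line values are already computed, writing them is a flat cost - one load and one save for the whole order, however many lines. A rough sketch (the column id and the data source are placeholders):
```js
// Sketch: set a column on every line of a loaded record, then save once.
// Governance is roughly 10 units for the load and 20 for the save whether
// the order has 8 lines or 800. Field ids below are made up.
define(['N/record'], function (record) {
    function writeUnitCosts(transferOrderId, costByItem) {
        var rec = record.load({
            type: record.Type.TRANSFER_ORDER,
            id: transferOrderId,
            isDynamic: false
        });

        var lineCount = rec.getLineCount({ sublistId: 'item' });
        for (var line = 0; line < lineCount; line++) {
            var itemId = rec.getSublistValue({ sublistId: 'item', fieldId: 'item', line: line });
            rec.setSublistValue({
                sublistId: 'item',
                fieldId: 'custcol_unit_cost', // placeholder column
                line: line,
                value: costByItem[itemId]
            });
        }

        return rec.save();
    }

    return { writeUnitCosts: writeUnitCosts };
});
```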
e
I have a script that needs extra info for every line. Originally, the script had a lookup for every line, so a search inside the loop. Didn't scale. An extra column with sourcing was not ideal, because it would have needed more than a couple of line fields. In the end, the solution was to have the script loop the lines twice: one pass to collect all the items it needs, then a single search, then build an object with the results; a second pass to set values based on that object. The code is not perfect, but it now runs fast and doesn't hit governance or time limits.
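Roughly this shape (simplified sketch; the column ids are placeholders):
```js
// Two-pass pattern: collect the item ids, run one search for all of them,
// index the results in memory, then write the lines from that object.
define(['N/search'], function (search) {
    function enrichLines(rec) {
        var lineCount = rec.getLineCount({ sublistId: 'item' });

        // pass 1: gather every item id on the order
        var itemIds = [];
        for (var i = 0; i < lineCount; i++) {
            itemIds.push(rec.getSublistValue({ sublistId: 'item', fieldId: 'item', line: i }));
        }

        // one search for the whole order instead of a lookup per line
        var dataByItem = {};
        search.create({
            type: search.Type.ITEM,
            filters: [['internalid', 'anyof', itemIds]],
            columns: ['custitem_extra_info'] // placeholder column
        }).run().each(function (result) {
            dataByItem[result.id] = result.getValue({ name: 'custitem_extra_info' });
            return true;
        });

        // pass 2: set values from the in-memory object, no per-line governance cost
        for (var j = 0; j < lineCount; j++) {
            var itemId = rec.getSublistValue({ sublistId: 'item', fieldId: 'item', line: j });
            rec.setSublistValue({
                sublistId: 'item',
                fieldId: 'custcol_extra_info', // placeholder column
                line: j,
                value: dataByItem[itemId]
            });
        }
    }

    return { enrichLines: enrichLines };
});
```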
🙌 1