erictgrubaugh (12/07/2020, 5:13 PM)
Jordan Patterson (12/07/2020, 5:16 PM)
MTNathan (12/07/2020, 5:23 PM)
erictgrubaugh (12/07/2020, 5:39 PM)
battk (12/07/2020, 5:41 PM)
MTNathan (12/07/2020, 5:58 PM)
Sciuridae54696d (12/07/2020, 5:58 PM)
stalbert (12/07/2020, 5:59 PM)
Sciuridae54696d (12/07/2020, 6:00 PM)
stalbert (12/07/2020, 6:00 PM)
Sciuridae54696d (12/07/2020, 6:01 PM)
stalbert (12/07/2020, 6:01 PM)
n/https is blocking/async from the NS server perspective
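For reference on that point: server-side N/https methods such as https.get return only once the response arrives, so a call made inside a Map/Reduce map function holds up that invocation until the external service answers. A minimal sketch; the URL is a placeholder, not anything from this thread:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/https'], (https) => {
  const getInputData = () => [1]; // single dummy input just to drive one map invocation

  const map = (context) => {
    // https.get does not return until the external service responds, so this
    // map invocation is blocked (synchronous) from the NetSuite server's perspective
    const response = https.get({ url: 'https://example.com/ping' }); // placeholder URL
    log.debug('external status code', response.code);
  };

  return { getInputData, map };
});
```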
stalbert (12/07/2020, 6:01 PM)
stalbert (12/07/2020, 6:01 PM)
stalbert (12/07/2020, 6:03 PM)
Sciuridae54696d (12/07/2020, 6:05 PM)
stalbert (12/07/2020, 6:07 PM)
stalbert (12/07/2020, 6:08 PM)
Sciuridae54696d (12/07/2020, 6:12 PM)
Sciuridae54696d (12/07/2020, 6:14 PM)
stalbert (12/07/2020, 6:20 PM)
erictgrubaugh (12/07/2020, 6:27 PM)
erictgrubaugh (12/07/2020, 6:27 PM)
erictgrubaugh (12/07/2020, 6:28 PM)
David (12/11/2020, 2:08 AM)
David (12/11/2020, 2:10 AM)
David (12/11/2020, 2:10 AM)
erictgrubaugh (12/11/2020, 2:58 AM)
1. getInputData
   - Search finds Pending Purchase Orders
2. map
   - For each PO from 1, query external service for status; pass Fulfilled PO IDs on
3. reduce
   - For each Fulfilled PO from 2, transform corresponding Sales Order to Item Fulfillment
Any that either fail or simply are not fulfilled yet during map will remain as Pending POs, so the next time the script runs, it will find them again.
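A minimal SuiteScript 2.1 sketch of the flow described above. The status filter, the external endpoint, and the createdfrom link between the PO and its Sales Order are all assumptions, not details from the conversation:

```javascript
/**
 * @NApiVersion 2.1
 * @NScriptType MapReduceScript
 */
define(['N/search', 'N/https', 'N/record'], (search, https, record) => {

  // 1. getInputData - find Pending Purchase Orders
  const getInputData = () => search.create({
    type: search.Type.PURCHASE_ORDER,
    filters: [
      ['mainline', 'is', 'T'], 'and',
      ['status', 'anyof', 'PurchOrd:B'] // placeholder for whatever defines "Pending" here
    ],
    columns: ['tranid']
  });

  // 2. map - ask the external service whether each PO has been fulfilled
  const map = (context) => {
    const poId = JSON.parse(context.value).id;
    const response = https.get({
      url: 'https://example.com/api/po-status?id=' + poId // hypothetical endpoint
    });
    if (JSON.parse(response.body).fulfilled) {
      // only fulfilled POs are passed along to reduce; the rest stay Pending
      context.write({ key: poId, value: poId });
    }
  };

  // 3. reduce - transform the related Sales Order into an Item Fulfillment
  const reduce = (context) => {
    // createdfrom is an assumed link between the PO and its Sales Order
    const soId = search.lookupFields({
      type: search.Type.PURCHASE_ORDER,
      id: context.key,
      columns: ['createdfrom']
    }).createdfrom[0].value;

    record.transform({
      fromType: record.Type.SALES_ORDER,
      fromId: soId,
      toType: record.Type.ITEM_FULFILLMENT
    }).save();
  };

  return { getInputData, map, reduce };
});
```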
For the time being, capping my getInputData at 100 results (i.e. under the limit) and running more often allows me to run in parallel and should keep up with the volume. At a larger volume, perhaps your solution is better.
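One way the 100-result cap could look, reusing the placeholder search and the N/search import from the sketch above; getRange is just one option, and a saved search with its own limit would work as well:

```javascript
// Variant of getInputData that caps each run at 100 results,
// assuming smaller, more frequent executions are acceptable
const getInputData = () => {
  const pendingPOs = search.create({
    type: search.Type.PURCHASE_ORDER,
    filters: [
      ['mainline', 'is', 'T'], 'and',
      ['status', 'anyof', 'PurchOrd:B'] // same placeholder "Pending" filter as above
    ],
    columns: ['tranid']
  });
  // getRange returns at most the requested slice; anything beyond 100 waits for the next run
  return pendingPOs.run().getRange({ start: 0, end: 100 });
};
```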
David (12/11/2020, 3:35 AM)
erictgrubaugh (12/11/2020, 4:26 AM)
erictgrubaugh (12/11/2020, 4:26 AM)
erictgrubaugh (12/11/2020, 4:27 AM)
erictgrubaugh (12/11/2020, 4:28 AM)
erictgrubaugh (12/11/2020, 4:28 AM)