I have developed a 'payment' process that calls the Sage 300 API endpoints APPaymentAndAdjustmentBatches (vendor payments) and ARReceiptAndAdjustmentBatches (customer payments). I'm currently running these processes (one per API call) multiple times a day to reduce the number of rows returned per run.
Note: If the job is only run once a day it hits a heap size error ('too many rows returned') and the entire process fails.
I had originally thought I could retrieve payments based on DateLastEdited, which is a date/time field in Sage 300; that would let the job pull only the rows changed since the last date/time the job ran. Unfortunately, the Sage 300 API doesn't seem to recognize the time portion of this field.
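One thing worth checking: OData v4 represents date/time values as Edm.DateTimeOffset, and its filter literals are unquoted ISO 8601 strings with an explicit offset (e.g. 2024-05-01T14:30:00Z). If the filter passes a quoted string or a date-only value, some services silently drop the time portion. Below is a minimal sketch of building such a filter; the field name DateLastEdited comes from the post, but the exact literal format Sage 300 accepts is an assumption to verify against its docs.

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

def odata_datetime(dt):
    """Format a datetime as an unquoted OData v4 DateTimeOffset literal.

    OData v4 expects ISO 8601 with an explicit offset, e.g.
    2024-05-01T14:30:00Z. Whether Sage 300 honors the time portion in
    this form is the thing being tested here.
    """
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

# Timestamp of the previous run (illustrative value).
last_run = datetime(2024, 5, 1, 14, 30, 0, tzinfo=timezone.utc)

# Combine the datetime comparison with the status filter from the post.
filter_expr = f"DateLastEdited gt {odata_datetime(last_run)} and BatchStatus eq 'POSTED'"
query = urlencode({"$filter": filter_expr})
print(query)
```

If the service still ignores the time, a fallback is to filter server-side on the date only and discard already-seen rows client-side by comparing the full DateLastEdited value in the response.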
Does anyone have any thoughts on how to limit the number of rows pulled back from Sage?
Note: I already use OData to filter the result set, but $orderby doesn't appear to be recognized when combined with a $filter.
I'm also open to limiting the size of the payload; I only need a handful of fields, but the API returns the entire payment (JSON) for each transaction.
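If the Sage 300 OData endpoint honors the standard $select option, you can ask the server to return only the fields you need, which directly shrinks the payload. A small sketch of building such a query; the field names below are illustrative assumptions, not confirmed property names from the Sage 300 payload:

```python
from urllib.parse import urlencode

# Illustrative field names -- substitute the actual property names
# from the APPaymentAndAdjustmentBatches response payload.
fields = ["BatchNumber", "BatchStatus", "BankCurrencyCode", "DateLastEdited"]

# $select trims each entity to the named properties, if supported.
query = urlencode({
    "$select": ",".join(fields),
    "$filter": "BatchStatus eq 'POSTED'",
})
print(query)
```

Whether $select is honored (and whether it works alongside $filter) is worth testing against the endpoint before relying on it, given the $orderby behavior described above.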
This is the filter information I'm using:
- for the ARReceiptAndAdjustmentBatches the filter is:
  - DateLastEdited gt <previous run date/time>
  - BatchStatus eq 'POSTED'
    Note: I'm only looking for POSTED information
  - DefaultBankCurrency eq 'USD' or 'CAD'
    Note: I run the job twice, once for each currency, to reduce payload size
- for the APPaymentAndAdjustmentBatches the filter is:
  - DateLastEdited gt <previous run date/time>
  - BatchSelector eq 'PY'
    Note: I'm only looking for POSTED information
  - BankCurrencyCode eq 'USD' or 'CAD'
    Note: I run the job twice, once for each currency, to reduce payload size
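Another option for avoiding the heap error, independent of the filters above, is server-side paging with OData's $top and $skip, if the endpoint supports them. This sketch only generates the page URLs; the base URL and page size are placeholder assumptions, and a real job would loop until a page comes back with fewer than page_size rows rather than using a fixed page count:

```python
from urllib.parse import urlencode

def paged_queries(base_url, filter_expr, page_size, pages):
    """Yield query URLs that fetch the result set one page at a time,
    assuming the endpoint honors OData's $top and $skip options."""
    for page in range(pages):
        params = urlencode({
            "$filter": filter_expr,
            "$top": page_size,        # rows per request
            "$skip": page * page_size,  # offset into the result set
        })
        yield f"{base_url}?{params}"

# Placeholder route, not the confirmed Sage 300 URL structure.
urls = list(paged_queries(
    "https://host/Sage300WebApi/v1.0/-/COMPANY/AR/ARReceiptAndAdjustmentBatches",
    "BatchStatus eq 'POSTED'",
    page_size=100,
    pages=3,
))
for u in urls:
    print(u)
```

Note that reliable paging normally needs a stable sort order; since $orderby reportedly isn't recognized alongside $filter here, results may need verifying for duplicates or gaps across pages.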