REST API Question - using the GET ARReceiptAndAdjustmentBatches and APPaymentAndAdjustmentBatches calls

SOLVED

I have developed a 'payment' process that calls the Sage 300 API's APPaymentAndAdjustmentBatches endpoint for vendor payments and ARReceiptAndAdjustmentBatches for customer payments. I'm currently running these processes (one for each API call) multiple times a day to reduce the number of rows returned per run.

Note: If the job is run only once a day, it receives a 'Heap size' error (too many rows returned), which causes the entire process to fail.


I had originally thought I could retrieve payments based on DateLastEdited, which is a Date/Time field in Sage 300. This would allow the job to pull in only the rows changed since the last date/time the job ran; unfortunately, the Sage 300 API doesn't seem to recognize the time portion of this field.

Does anyone have any thoughts on:

How to limit the number of rows pulled back from Sage?

Note: I already use OData to filter the result set, but it doesn't appear that orderby is recognized when combined with a filter.

I'm also open to limiting the size of the payload; I only need a handful of fields, but the API returns the entire payment (JSON) for each transaction.
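Since the Sage 300 Web API is OData-based, the standard query options may cover both of these: $select to trim each record to the needed fields, and $top to cap the row count per request. A minimal Python sketch, assuming the endpoints honour those options (the base URL, credentials, and field list are placeholders; the real process runs in Salesforce):

```python
# A minimal sketch, assuming the Sage 300 Web API honours standard OData
# query options. Base URL, credentials, and field names are placeholders.
import requests

BASE = "https://server/Sage300WebApi/v1.0/-/SAMLTD"  # hypothetical tenant URL

params = {
    # $select trims each record to the handful of fields actually needed.
    "$select": "BatchNumber,BatchDate,DateLastEdited,BatchStatus",
    # $top caps the number of rows returned per request.
    "$top": "500",
}

resp = requests.get(
    f"{BASE}/AR/ARReceiptAndAdjustmentBatches",
    params=params,
    auth=("username", "password"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
rows = resp.json().get("value", [])
```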

This is the filter information I'm using (a sketch of the assembled query strings follows the list):

- for the ARReceiptAndAdjustmentBatches the filter is:

    - DateLastEdited <previous run date/time>

    - BatchStatus eq 'POSTED'

        Note: I'm only looking for 'POSTED' information

    - DefaultBankCurrency eq 'USD' or 'CAD'

        Note: I run the job twice, once for each currency; this is done to reduce payload size

- for the APPaymentAndAdjustmentBatches the filter is:

    - DateLastEdited <previous run date/time>

    - BatchSelector eq 'PY'

        Note: I'm only looking for 'POSTED' information

    - BankCurrencyCode eq 'USD' or 'CAD'

        Note: I run the job twice, once for each currency; this is done to reduce payload size
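Assembled as OData query strings, those filters might look like the sketch below. The gt operator and the ISO-8601 datetime literal are assumptions (the original post leaves the comparison operator out), and as noted above the API appears to ignore the time portion of DateLastEdited:

```python
# Sketch of the two $filter expressions; all values are placeholders.
last_run = "2024-01-01T00:00:00Z"  # stands in for <previous run date/time>
currency = "USD"                   # the job runs once per currency

ar_filter = (
    f"DateLastEdited gt {last_run} "
    "and BatchStatus eq 'POSTED' "
    f"and DefaultBankCurrency eq '{currency}'"
)

ap_filter = (
    f"DateLastEdited gt {last_run} "
    "and BatchSelector eq 'PY' "
    f"and BankCurrencyCode eq '{currency}'"
)
```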

Verified answer:

    Joe, you are brilliant to have come up with a solution to this problem! So, how did you do it?

    Well, I rewrote the process. I first make a call to retrieve a list of batch numbers based on the posting date. I use a 'select' to reduce the size of each JSON packet that is returned, so the list of batch numbers doesn't 'blow up' the memory heap in Salesforce. Then I loop through the list of batch numbers one at a time, making a second call to retrieve the payments associated with each batch number.

    With this technique, the first call can return a list of batch numbers for a 24-month period without causing a problem. Just to make sure, I split the process up further: you can set the number of cycles you want to run (the number of times to run the job), and each cycle the job will decrement the date (again by a specific offset) and keep retrieving payments as far back as you might want.
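    A sketch of that two-phase flow, under the same OData assumptions as the earlier snippets. The real implementation runs in Salesforce (Apex); the endpoint paths, auth, date window, and the BatchDate field used for the posting-date filter are all guesses:

```python
# Sketch of the two-phase approach: phase 1 fetches only batch numbers,
# phase 2 fetches payments one batch at a time. Placeholders throughout.
from datetime import datetime, timedelta

import requests

BASE = "https://server/Sage300WebApi/v1.0/-/SAMLTD"  # hypothetical tenant URL
AUTH = ("username", "password")                      # placeholder credentials


def batch_numbers(since, until):
    """Phase 1: fetch only batch numbers for a date window, so each JSON
    packet stays small and the list doesn't blow up the heap."""
    resp = requests.get(
        f"{BASE}/AP/APPaymentAndAdjustmentBatches",
        params={
            "$select": "BatchNumber",
            "$filter": (
                f"BatchDate ge {since:%Y-%m-%dT%H:%M:%SZ} "
                f"and BatchDate lt {until:%Y-%m-%dT%H:%M:%SZ} "
                "and BatchSelector eq 'PY'"
            ),
        },
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return [row["BatchNumber"] for row in resp.json().get("value", [])]


def payments_for_batch(number):
    """Phase 2: fetch the payments for a single batch number."""
    resp = requests.get(
        f"{BASE}/AP/APPaymentAndAdjustmentBatches",
        params={"$filter": f"BatchNumber eq {number}"},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])


# Walk backwards one window ("cycle") at a time, decrementing the date by a
# fixed offset each cycle, to reach as far back as needed.
cycles, offset = 4, timedelta(days=180)
until = datetime.utcnow()
for _ in range(cycles):
    since = until - offset
    for number in batch_numbers(since, until):
        for payment in payments_for_batch(number):
            pass  # extract the handful of fields actually needed
    until = since
```

    Because phase 2 pulls one batch per request, each response stays small no matter how wide the phase-1 date window is.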

    This works very well and runs much faster than the previous approach.