Pass TEMP0001 to next VI Job in chain?

SOLVED

On Sage 100 Advanced 2018, 6.00.5.0. Most of my experience with VI has been fairly basic, updating or adding customer accounts, but I have also written scripts to import to Repetitive Invoicing Entry and Cash Receipts. However, those used data I had already manipulated to present in the best way for VI to receive it, so this is a new challenge that I'm not sure is even possible.

I am curious to know if there is a way to pass a TEMP value from the first VI job to the next one in a chain. Basically, I have orders in a CSV file with all of an order's information on a single line (3 items max). Not ideal, but that is the way it is presented to me. Rather than require the user to pull it into Access or manipulate the data in any way, my hope is that with chained VI jobs I can create the SO header (using Next for the SO#) and import the first detail line in the first job, then pass the existing SO# to the next VI job so it can append the second item to the order, and finally a third VI job would add the last item. At the moment I can import all three items, but they go to three separate SO#s because I haven't found a way to store the SO# for use in the next VI job.

I realize this is unorthodox, but complexity on my end would equate to simplicity for the user, which is my goal. The only thing I don't know how to do is pass the SO# from the first job to the second, or how to identify the newly created SO#. There is only one order per customer (at most one order per customer per day), so I would think identifying the SO# in the second and third jobs would be possible using a conditional file assign, but I have not gotten it to work.

If I'm completely missing something and multiple passes through the order aren't required, I am open to suggestions.

  • 0
    SUGGESTED

    You can attempt to use the coSession'SetStorageVar and coSession'GetStorageVar methods (you may need to substitute coSession with %SYS_SS), but I'm not certain that the chained job uses the same session object; if it doesn't, this won't work. If the jobs do share a session object, you could use perform logic after the sales order number has been set to call coSession'SetStorageVar and store the "next" sales order number, then use perform logic before the sales order number is set to call coSession'GetStorageVar and, if the returned value is not blank, apply it instead of taking the next sales order number.

    You will likely need the SalesOrderNo field added twice: the first conditional on the variable returned from coSession'GetStorageVar not being blank, using the "Calculated" operation with that variable as the calculation; the second conditional on the variable being blank, using the "Next" operation.

    This will not work if your file contains more than one header record (essentially more than one row) where the second or subsequent rows belong to a different order, because chained jobs start only after all records have been read from the source file, so each record read would overwrite the stored temp variable.

    I think it would be worth more to focus on a proper source file layout instead. If I'm understanding you correctly, one line has order header details in, let's say, columns 1-9, then columns 10-15 contain line info for the first line, columns 16-20 for the second line, and columns 21-25 for the third line.

    If this is correct, you could always use an Excel workbook with cell references on a second sheet, Power Query, or a macro (either written by hand or recorded) to shape the data into three rows. This would just require the user to run through it quickly before importing.

    Alternatively, you can use perform logic on the "On Execution" event to run either ProvideX code or a VBScript to shape the data into a proper row and column format.
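    The reshaping either approach performs is the same regardless of tool. Here is a minimal sketch of that transform, shown in Python for illustration (the actual perform logic would be ProvideX or VBScript); the column layout is the one assumed in the answer above, and the function names are hypothetical:

```python
import csv
import io

def reshape_row(row):
    """Split one wide order row into (header_fields, line_groups).

    Assumed layout from the answer above (1-based columns):
    1-9 = header, 10-15 = line 1, 16-20 = line 2, 21-25 = line 3.
    Slices below are 0-based. Entirely blank line groups are dropped,
    so orders with fewer than three items produce fewer rows.
    """
    header = row[0:9]
    groups = [row[9:15], row[15:20], row[20:25]]
    lines = [g for g in groups if any(f.strip() for f in g)]
    return header, lines

def reshape_file(src_text):
    """Return CSV text with one output row per order line:
    the header fields repeated, followed by that line's fields."""
    out = io.StringIO()
    writer = csv.writer(out)
    for row in csv.reader(io.StringIO(src_text)):
        header, lines = reshape_row(row)
        for line in lines:
            writer.writerow(header + line)
    return out.getvalue()
```

    With the file in this one-row-per-line shape, a single VI job can import header and detail together and the chaining question goes away.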

    The benefit of having everything in one file is a concise job log that captures everything in one shot.

  • 0
    SUGGESTED

    I'd run a VBScript against the source file to reformat it into multiple lines; then three jobs are not required. You should be able to start the VI job from the script too (having the script put the processed import file in a preset location) to make things easier on the user.

    Alternatively, I suppose you could also trigger a VBScript after the first job to query the SO numbers (via ODBC) and add them to the source file as a new column for the subsequent imports... but the file handling would get tricky.

  • +1 in reply to Kevin M
    verified answer

    Another option is to have perform logic in the first job write the info you need to a temp file. Then, in the next import, perform logic reads from that file to get the info. You would need a unique key that doesn't use the info created by the initial VI job (the customer number, for example, since the SO# itself is what you're trying to recover). When the last document is imported, you would remove the temp file you created.
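    The hand-off described above can be sketched as three small routines: store, look up, clean up. This illustration is in Python (the real perform logic would be ProvideX or VBScript, but the steps are identical); the file name, field names, and sample keys are all hypothetical:

```python
import csv
import os

HANDOFF = "vi_sono_handoff.csv"  # hypothetical temp-file name

def store_so(customer_no, so_no, path=HANDOFF):
    """First job's perform logic: append key -> SO# after the
    sales order number is assigned. The key (customer number here)
    must NOT be something generated by the import itself."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([customer_no, so_no])

def lookup_so(customer_no, path=HANDOFF):
    """Chained job's perform logic: return the stored SO# for this
    key, or None so the job can fall back to its normal behavior."""
    if not os.path.exists(path):
        return None
    with open(path, newline="") as f:
        for key, so_no in csv.reader(f):
            if key == customer_no:
                return so_no
    return None

def cleanup(path=HANDOFF):
    """Last chained job's perform logic: remove the temp file so a
    stale SO# can never leak into tomorrow's run."""
    if os.path.exists(path):
        os.remove(path)
```

    Because the key is the customer number rather than anything the import creates, the second and third jobs can find the right order even though the SO# was assigned with "Next" in the first job.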