Unpack job



Unpack job is a processor that restores a job that was packed for transmission with the Pack job tool.

Outgoing data success connections receive the unpacked job, with restored metadata (if it was included in the packed archive). Outgoing log success connections receive a checksum-protected XML file that can serve as a formal confirmation that the job was received successfully; this file is recognized by the Monitor confirmation element.

Incoming jobs that do not have the appropriate format (or that were corrupted during transmission) are sent to the data error connection; if there is no such connection, they are moved to the problem jobs folder.

See Acknowledged job hand-off for an example of how to use this tool.

Keywords

Keywords can be used with the search function above the Elements pane.

The keywords for the Unpack job element are:


Connections

Unpack job supports outgoing traffic-light connections of the following types:


Properties

Property

Description

Name

The name of the flow element displayed in the canvas

Passwords

A list of passwords used for opening the packed data, if needed

A script expression can be used to determine a password dynamically, for example based on the sender of the packed data
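The Passwords property notes that a script expression can pick a password dynamically, for example per sender. The sketch below only models that idea as plain JavaScript: the map contents and the `passwordFor` helper are illustrative, not part of the Switch scripting API; consult the Switch scripting reference for the actual objects available inside a script expression.

```javascript
// Illustrative sketch: choose an unpack password based on the sender
// of the packed data. All names and values here are hypothetical.

// Map of known senders to the password they use when packing.
const passwordsBySender = {
  "prepress@example.com": "secret-prepress",
  "partner@example.com": "secret-partner",
};

// Return the password for a sender, or an empty string when none is known.
function passwordFor(sender) {
  return passwordsBySender[sender] || "";
}
```

In a real flow, the lookup key would come from metadata restored with the job (for example, the sender's email address), rather than from a hard-coded argument.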

Restore metadata

If set to yes, internal job ticket information and external metadata associated with the job are restored from the packed data (overwriting any metadata already associated with the packed data)

Embedded metadata is always included since it is part of the job file itself

Email info

Special options for email info in the internal metadata:


  • Restore and replace

  • Restore and merge

  • Don't restore

Hierarchy info

Special options for hierarchy info in the internal metadata:


  • Restore and replace

  • Restore and place at the top

  • Restore and place at the bottom

  • Don't restore

Fail duplicates

If set to yes, a packed data entity that arrives a second time (detected by the unique identifier stored inside) is sent to the error output; otherwise it is processed as usual

In both cases the duplicate is logged to the execution log

Keep info for (days)

The list of identifiers already processed is kept for at least this many days
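Together, the Fail duplicates and Keep info for (days) properties describe duplicate detection against a stored list of identifiers with a retention period. Switch implements this bookkeeping internally; the sketch below only models the idea, with a hypothetical `DuplicateTracker` class that is not part of any Switch API.

```javascript
// Illustrative model of duplicate detection with a retention window,
// as described by "Fail duplicates" and "Keep info for (days)".

const DAY_MS = 24 * 60 * 60 * 1000;

class DuplicateTracker {
  constructor(keepDays) {
    this.keepMs = keepDays * DAY_MS;
    this.seen = new Map(); // identifier -> timestamp of first sighting
  }

  // Returns true when the identifier was already seen within the
  // retention window; records first sightings as a side effect.
  isDuplicate(id, now = Date.now()) {
    // Forget identifiers older than the retention window.
    for (const [key, ts] of this.seen) {
      if (now - ts > this.keepMs) this.seen.delete(key);
    }
    if (this.seen.has(id)) return true;
    this.seen.set(id, now);
    return false;
  }
}
```

With Fail duplicates set to yes, a `true` result would correspond to routing the job to the error output; identifiers older than the retention period are forgotten, so a very late re-send is treated as a new job.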