enaio® server-api
| This documentation was generated by ECMind GmbH based on the original enaio® Server-API documentation, available at https://help.optimal-systems.com/enaio/v120/admin/PDF/OS_ServerApi_de.pdf. All rights to the documentation remain with OPTIMAL SYSTEMS GmbH. |
1. Machine-Readable Access
This documentation is additionally available in two machine-readable forms, both served directly from this container:

- Download skills as ZIP — 418 self-contained Claude skills (`SKILL.md` + `reference/` + `examples/` per operation), extractable into `~/.claude/skills/` or `<project>/.claude/skills/`. Suitable for Claude Code, the Anthropic SDK and FastMCP-compatible clients.
- MCP endpoint (`/mcp/`, Streamable HTTP) — live access to the same skills via FastMCP's `SkillsDirectoryProvider`, plus two MCP tools for clients without skill support: `search_documentation(query, language, limit)` (full-text search via SQLite FTS5) and `read_documentation(namespace, procedure, language)` (full adoc source of a procedure).
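For clients that speak the MCP Streamable HTTP transport directly, tool calls are JSON-RPC 2.0 messages. The following sketch only builds the request body for `search_documentation`; it does not perform the `initialize` handshake or the HTTP exchange, and the argument values are illustrative:

```python
import json

def build_tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: full-text search for a DMS job via the tool listed above.
payload = build_tools_call(1, "search_documentation",
                           {"query": "XMLInsert", "language": "en", "limit": 5})
```

A real client would POST this payload to `/mcp/` after initializing a session; a library such as FastMCP handles that exchange for you.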
2. Introduction
enaio® is a powerful document management, workflow and archiving system. A special characteristic of the product family is its high configurability and interface strength, which allows integration into other systems at various points.
The following document describes in detail the functions provided by the enaio® server-api. The document has the character of a function reference. Further information about the structure of enaio® and its individual components can be found in the corresponding available documentation.
The enaio® Server serves as a runtime environment for various engines in the archiving, DMS and workflow areas. Thus, the server performs various tasks for document-oriented information flow. These include the capture, management and processing of documents and their associated index data, full-text search, workflow management, archiving and many other functions.
2.1. Engine
An engine – also called Executor – is loaded by the application server and thus enabled to execute jobs. Each executor manages one or more engines, in which the server jobs are organized. The following engines are delivered by default:
| Engine | Description |
|---|---|
| DMS-Engine | Research and manipulation of index and document data |
| Workflow-Engine | Processing and management of workflow processes and models |
| Standard-Engine | Collection of various functions for archiving, file and document transfer |
| Full-text Engine | Querying the full-text engine |
| OCR-Engine | Optical character recognition from images or scanned documents |
| Subscription-Engine | Notification functions for changes to the document inventory |
| Core-Services | Initialization, licensing and session management |
| Database piping | Enables access to the database via the server interface in the form of an internal format or as Active Data Objects |
2.1.1. Serverjob
Jobs are tasks that are to be executed by the server and represent a specific acquisition, manipulation of data or control function. Thus, jobs can also be compared to functions. A server job has the following general structure:

- Job name: The name consists of the engine – in which the job is implemented – and the job designation (e.g. `dms.XMLInsert`).
- Input parameters
- Input file list: If the job requires files to be passed, the absolute path and name of each file is passed here.
- Output parameters
- Output file list: If the job returns files, the absolute path and name of each file is passed here.
- Return value: Each job generates a return value. In the case of success, this is always `0`. If an error occurred, it can be roughly qualified by the return value.
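The general job structure above can be sketched as a plain data container. The class and field names here are illustrative and not part of the enaio® API:

```python
from dataclasses import dataclass, field

@dataclass
class ServerJob:
    """Illustrative container mirroring the general server-job structure."""
    name: str                                  # "<engine>.<designation>", e.g. "dms.XMLInsert"
    input_params: dict[str, str] = field(default_factory=dict)
    input_files: list[str] = field(default_factory=list)   # absolute file paths
    output_params: dict[str, str] = field(default_factory=dict)
    output_files: list[str] = field(default_factory=list)  # absolute file paths
    return_value: int = 0                      # 0 on success, error class otherwise

    @property
    def engine(self) -> str:
        """The engine part of the job name (before the dot)."""
        return self.name.split(".", 1)[0]

job = ServerJob(name="dms.XMLInsert", input_params={"Flags": "0"})
```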
To execute jobs, a session must be created between a requesting client and the server. It consists of a connection via a technical protocol (e.g. TCP) and various information about this connection, such as authentication, station and license data. The BIN-RPC protocol has been established as the technical protocol for formulating requests and the results of their processing. This protocol defines how requests, parameters and results must be represented.
The communication process between client and server follows this scheme:

1. Establishing a TCP connection with the enaio® server
2. Initializing a session with corresponding parameters
3. Job calls and receiving response data
4. Ending the connection
The client calls a job by packaging the job designation and the corresponding parameters in XML and sending them to the server. The client then waits for a response. The server kernel interprets the received data, determines which engine can execute the job, and passes it to its queue mechanism for processing. The result is then transmitted from the engine via the server kernel to the client.
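As a rough illustration of the call described above, the following sketch packages a job name and its parameters into an XML fragment. The element and attribute names (`Job`, `Param`, `name`) are invented for illustration; the actual wire representation is defined by the BIN-RPC protocol:

```python
import xml.etree.ElementTree as ET

def package_job(job_name: str, params: dict[str, str]) -> bytes:
    """Build an illustrative XML request for a server job (schema is made up)."""
    root = ET.Element("Job", name=job_name)
    for key, value in params.items():
        # One element per input parameter, carrying its name and value.
        ET.SubElement(root, "Param", name=key).text = value
    return ET.tostring(root, encoding="utf-8")

request = package_job("dms.XMLInsert", {"Flags": "0"})
```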
| The BIN-RPC format specifies that all parameters are transmitted within a binary structure. When transferring files (i.e. binary data), it would be necessary to convert the files into a string using MIME encoding. For reasons of performance and memory usage, this deviation from the standard was made: the files are sent as a separate TCP stream after the job parameters have been sent. |
2.2. Interface Libraries
To hide the internal communication from the business requirements, several possibilities are available to use the protocol without deep knowledge of the communication sequence. For this purpose, classes and libraries are provided by Optimal Systems GmbH:
- COM interface for communication with the enaio® server (`oxsvrspt`)
- REST interface for communication with the enaio® server (Microservice DMS)
- Java interface for communication with the enaio® server (on request)
- Other libraries that encapsulate server calls (on request)
The interface libraries have the same basic structure in usage. After initialization and establishing a connection to the application server, job objects are created, to which input parameters and input files are passed. After execution, output parameters and output files can be read from the object. In addition, an error stack is provided that can capture possible business and technical error messages.
Input and output lists are represented as hash lists, arrays or other objects depending on the programming language used.
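The shared usage pattern of the interface libraries can be sketched in Python. The class names and the transport below are stand-ins for illustration, not an actual enaio® library:

```python
class ErrorStack:
    """Collects business and technical error messages from a job call."""
    def __init__(self) -> None:
        self.messages: list[str] = []

    def push(self, message: str) -> None:
        self.messages.append(message)

class JobObject:
    """Stand-in for a job object as exposed by an interface library."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.input_params: dict[str, str] = {}   # set before execution
        self.output_params: dict[str, str] = {}  # read after execution
        self.errors = ErrorStack()

    def execute(self, transport) -> int:
        """Send the job via the given transport; return the job return value."""
        result = transport(self.name, self.input_params)
        self.output_params = result.get("params", {})
        for message in result.get("errors", []):
            self.errors.push(message)
        return result.get("return_value", 0)

# A fake transport standing in for the real client-server connection.
def fake_transport(name: str, params: dict[str, str]) -> dict:
    return {"params": {"Echo": params.get("Flags", "")}, "return_value": 0}

job = JobObject("dms.XMLInsert")
job.input_params["Flags"] = "0"
rc = job.execute(fake_transport)
```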
2.3. Implementation of an Archive Connection
2.3.1. DMS Object Structure
The goal of an archive connection for various applications is to store documents with structural and index characteristics in the DMS, search the existing data inventory, download documents and, if necessary, delete them. As described above, these functions are performed by calling server jobs, which are implemented in the DMS and Standard engines.
enaio® uses the following DMS objects for the representation of information structures:
- Folder: The term folder was chosen as an analogy to a file cabinet. A file cabinet contains folders, registers and documents. It represents the highest management level in the DMS – all other DMS objects can only exist as sub-objects of a folder. A folder receives only one attribute, a name.
- Directory: Directories are described by index data and are containers for the objects below them. They are comparable to folders in the file system at the root level. Only one directory type exists per folder.
- Register: Registers serve for further fine-grained structuring of the information structure. Per folder, there can be several register types that differ in their index data structure.
- Documents: Documents are distinguished from index data by the fact that they are associated with document files. Each document type has the property of a main type, which indicates whether the associated files are images, Windows source documents or other file types. The enaio® client behaves differently when capturing and displaying documents depending on the main type.
The classes of folders, registers and documents are called object types. For example, a customer folder or an invoice type is an object type. All instances of the class have the same fields for indexing. For documents, the object type also defines the main type. The structure of the index data is defined by the object model, so that for each object type a specific set of fields for tagging and searching can be defined.
In the following, these are referred to simply as objects where a more precise distinction is unnecessary. Each object is characterized by index data and so-called base parameters. Base parameters are automatically assigned by the system and are intended for internal object management. They contain information about the creator of an object, its object ID, change dates, etc.
Each object in the DMS is described uniquely by its object type and object ID. For most jobs, object type and object ID are expected as input parameters.
2.3.2. Typical Scenarios
Research: Based on search parameters, index data and base parameters of objects are determined for further processing. The determined object IDs and object types can then be used for further research or to download the document files themselves.
Import: The calling application specifies index data and, if applicable, document files; new objects are then created in the system.
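The research scenario can be sketched as two chained job calls: a search that yields object type/ID pairs, followed by a download addressed by those pairs. The job names below (`dms.ResearchSketch`, `dms.DownloadSketch`) and the transport are invented for illustration; only `dms.XMLInsert` is named in this document:

```python
def call_job(name: str, params: dict) -> dict:
    """Stand-in for a real job call over an established session."""
    if name == "dms.ResearchSketch":      # invented name for illustration
        return {"hits": [{"object_type": 4, "object_id": 1001}],
                "return_value": 0}
    if name == "dms.DownloadSketch":      # invented name for illustration
        return {"files": ["C:\\tmp\\1001.pdf"], "return_value": 0}
    return {"return_value": 0}

# Research: search parameters in, object type/ID pairs of the hits out.
hits = call_job("dms.ResearchSketch", {"Name": "Invoice*"})["hits"]

# Each hit is uniquely addressed by object type and object ID,
# which most follow-up jobs expect as input parameters.
first = hits[0]
download = call_job("dms.DownloadSketch",
                    {"ObjectType": first["object_type"],
                     "ObjectID": first["object_id"]})
```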
2.4. Test capabilities
For testing individual jobs of the enaio® server and its engines, Optimal Systems GmbH provides the program `axlabjobs.exe`. The jobs can be executed with any parameters and as often as desired. A connect string can be specified to indicate which server the test lab should log in to. It is possible to work with many test labs on one server in order to perform load tests. The program is installed by default in the server directory.
For monitoring job calls, the enaio® enterprise-manager is available. There, the computers and jobs to be monitored can be specified. For jobs to which files are sent, these files can also be made accessible via a temporary directory.
The functions for job monitoring are available in the enterprise-manager under the area Advanced Administration > Monitoring > Job calls.
3. Engine Reference
The following sections describe all engines of the enaio® server together with their server jobs. Each engine is assigned a unique acronym, which is used to address its jobs.
3.1. Engine Overview
| Engine | Prefix | Description | Areas |
|---|---|---|---|
| DMS-Engine | dms | Requests and processing of index data, DMS objects and folders | XML Import · XML Export (Research) · Security system · Folders (Portfolios) · User-related data · Other jobs |
| Workflow-Engine | | Processing and management of workflow processes and models | Organization structure · Workflow model · Workflow process and activity · Workflow mask, event and script · Administration · History management · Other jobs · Server internal jobs |
| Standard-Engine | | Work, cache, file and archive management | Work, cache and archive management · File management · Internal jobs · Other jobs |
| Subscription-Engine | | Setup and control of subscriptions to inform about changes to the document inventory | |
| | | Conversion and access to image files | |
| Full-text Engine | | Processing of full-text requests of the clients | |
| OCR-Engine | | Optical character recognition | |
| | | Access to groups and users of enaio® | |
| Database piping | | Access to the database via the server interface (Active Data Objects) | |
| Core-Services (implemented directly in the server kernel) | | | |
| | | Batch management, server monitoring, registry management and administration of loaded engines at runtime | Registry management · Batch management · Server management · Session management · Engine management · Other jobs |
| | | Management of system files and configuration tasks on the server | |
| | | License management for the entire enaio® system | |
| | | Server-side call of the data transfer server (MS-Office data transfer) | |
4. Glossary
- Cache Directory: Directory below the server path (`..\server\CACHE`). It is used to keep archived documents that were read from archival media available for further access, so that the next user does not have to access the archival media again in a time-consuming way.
- Flag: Parameter that activates or marks a specific property.
- OSTEMP Directory: Directory declared in the environment variable `OSTEMP`, which is used by some jobs for temporary storage or the creation of temporary files.
- Work Directory: Directory below the server path (`..\server\WORK`), which is used as the storage location for all non-archived documents. For performance reasons, storage is done in this directory and not in the database.
- Optional Parameter: Parameters and return values that are enclosed in square brackets, e.g. `[Parameter]`, are optional. They do not have to be specified when calling the job if they are not needed.