Setting up the JDBC adapter in webMethods’ Integration Server to connect to an Oracle database

Today, I wanted to use the JDBC adapter in webMethods’ Integration Server to connect to a database from a Java service. As it turned out, the configuration was quite frustrating. Here’s a chronicle of the problems that appeared while I tried to get a database connection running:

Error encountered
[ART.118.5042] Adapter Runtime (Connection): Unable to enable connection resource ***
[ART.118.5036] Adapter Runtime (Connection): Unable to configure connection manager.
[ADA.1.200] The JDBC DataSource class "oracle.jdbc.pool.OracleDataSource " cannot be located.

The problem was a trailing space at the end of the DataSource class name (take a good look at the above message), although I have no idea how it got there.

Solution: Remove the trailing space.

[ADA.1.200] The JDBC DataSource class "oracle.jdbc.pool.OracleDataSource" cannot be located.

This was a “real” problem: IS (or rather the JDBC adapter) could not find the desired class. You need to add the driver classes to IS to make it work, e.g. by taking ojdbc6.jar from the installation directory of your local Oracle client or by downloading the JDBC drivers from Oracle’s website.

Solution: Copy the JAR file containing the needed classes to IS_DIR\packages\WmJDBCAdapter\code\jars and restart IS.
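Both class-loading errors above boil down to whether Class.forName can resolve the configured DataSource class. Outside of IS, you can reproduce the problem with a tiny stand-alone check (class and method names here are mine; note how even a trailing space makes the lookup fail, just like in the first error message):

```java
public class DriverCheck {

    // Returns true if the given class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isOnClasspath("java.lang.String"));  // true
        // The trailing space breaks the lookup, as in the [ADA.1.200] error above:
        System.out.println(isOnClasspath("java.lang.String ")); // false
    }
}
```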

[ADA.1.204] Cannot connect to the database with DataSource class "oracle.jdbc.pool.OracleDataSource".
Ungültiger Oracle-URL angegeben/Invalid Oracle URL specified: OracleDataSource.makeURL

Apparently, the Oracle driver had a problem creating the correct URL for connecting to the database, although I provided all the needed information like server, username etc. correctly.

Solution: Add driverType=oci under “Other Properties” in the connection’s properties.

no ocijdbc11 in java.library.path

Apparently, Oracle was missing some libraries that I could have copied from my local Oracle installation. Instead, I switched to Oracle’s thin driver, and it worked instantly.

Solution: Add driverType=thin under “Other Properties” in the connection’s properties.

So, to sum up my configuration to get an Oracle connection running in IS’s JDBC adapter:

  • Copy Oracle’s JDBC drivers (e.g. ojdbc6.jar) to IS_DIR\packages\WmJDBCAdapter\code\jars.
  • Restart IS.
  • Add driverType=thin under “Other Properties” in the connection’s properties.
  • Enable the connection.
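For context on the driverType property: the OCI and thin drivers expect differently shaped JDBC URLs, which is why makeURL failed without the hint. The standard formats are jdbc:oracle:oci:@&lt;tnsname&gt; and jdbc:oracle:thin:@&lt;host&gt;:&lt;port&gt;:&lt;SID&gt;. A small sketch of building the thin URL (host, port and SID values are placeholders):

```java
public class OracleUrls {

    // Builds a thin-driver JDBC URL in the host:port:SID form.
    static String thinUrl(String host, int port, String sid) {
        return "jdbc:oracle:thin:@" + host + ":" + port + ":" + sid;
    }

    public static void main(String[] args) {
        System.out.println(thinUrl("dbserver", 1521, "ORCL"));
        // prints jdbc:oracle:thin:@dbserver:1521:ORCL
    }
}
```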

Avoiding Natural Error 3009 (Adabas Timeout) in an RPC server

I published several Natural subprograms as a service in webMethods Integration Server via the EntireX adapter which uses a Natural RPC server on our SuSE Linux server to access the Natural modules. After a Natural service has not been called from the outside for a certain period of time (about 1 hour), the next call to the service always resulted in a Natural Error 3009:

NAT3009: Last transaction backed out of database :1:. Subcode :2:.

The Software AG documentation describes how to fix this problem: Avoiding Error Message NAT3009 from Server Program. However, there are a few additional things you need to check to make sure that the solution works. Here is what I needed to do to make it work:

  • Copy NATRPC39 from library SYSRPC to library SYSTEM in your FUSER (System Libraries, not User Libraries!).
  • Edit SYSTEM.NATRPC39 and make sure every database gets pinged correctly:

    01  ACB
      02  ACB-TYPE                      (B1) INIT<H'30'>
      02  ACB-FILL                      (B1)
      02  ACB-COMMAND                   (A2) INIT<'RC'>
      02  ACB-CID                       (A4) INIT<H'FFFFFFFF'>
      02  ACB-FILEID                    (B2)
      02  ACB-RSP                       (B2)
      02  ACB-ISN                       (B4)
      02  ACB-ISNL                      (B4)
      02  ACB-ISNQ                      (B4)
      02  ACB-FBL                       (B2)
      02  ACB-RBL                       (B2)
      02  ACB-SBL                       (B2)
      02  ACB-VBL                       (B2)
      02  ACB-IBL                       (B2)
      02  ACB-COP1                      (A1)
      02  ACB-COP2                      (A1)
      02  ACB-ADD1                      (A8)
      02  ACB-ADD2                      (A4)
      02  ACB-ADD3                      (A8)
      02  ACB-ADD4                      (A8)
      02  ACB-ADD5                      (A8)
      02  ACB-CMDT                      (B4)
      02  ACB-USER                      (A4)
      02  ACB-80                        (A80)
    ACB-RSP := 1 /* DBID to ping
    ACB-RSP := 2
    * add additional databases here
  • Check the setting SRVWAIT in the NATPARM of the RPC server. It needs to be greater than 0 and less than the TNA* settings for the Adabas database, e.g. 60 (seconds). The TNA* settings can be found in the database’s configuration file adanuc.prm.

  • Check the setting SERVER-NONACT in the Attribute File of the EntireX broker. It also needs to be less than the TNA* settings, e.g. 10M.

To make sure that NATRPC39 really gets called, you may add a WRITE WORK statement to its source that logs every invocation to a work file.


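A minimal sketch of such a WRITE WORK trace in Natural (the work file number and the message text are examples of mine) could be:

```natural
WRITE WORK FILE 1 'NATRPC39 called at' *TIMX
```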
In NATRPC39, be sure to only call modules in SYSTEM (the Steplib chain is not evaluated) and check that it does not throw any runtime errors whatsoever! Any Natural error in NATRPC39 is silently ignored and not logged anywhere (or at least I found no log).

To enable tracing for the RPC server and check whether SRVWAIT works as expected (see Using the Server Trace Facility), you need to modify its NATPARM file: set TRACE to 2 and Trace on error to OFF. Then set the Report Assignment for Report 10 to Device LPT10 and set the Physical Output Device for Device LPT10 to something like this: /bin/sh -c cat>>/tmp/rpctrace.log.

Be aware that the RPC server caches the log information, so you may need to deregister the RPC server after a few minutes to have it flush its log to the file.

How to create and deploy a web service with Java EE 7 and Wildfly 9

Apparently, creating and deploying a web service with Java EE 7 is just as easy as it was with Java EE 6. This may be due to the fact that support for web services and SOAP via JAX-WS hasn’t changed much since Java EE 6, as Adam Bien points out here: The State of SOAP / JAX-WS. However, just for the record, here’s the basic code needed to provide a simple web service:

package [...];

import javax.jws.WebMethod;
import javax.jws.WebService;

@WebService
public class Hello {

    private final String message = "Hello, ";

    public Hello() {
    }

    @WebMethod
    public String sayHello(final String name) {
        return message + name + ".";
    }
}
Simply compile the code and deploy the WAR file (e.g. YourApplicationName.war) into \standalone\deployments of the standalone Wildfly installation. You should now be able to access the generated WSDL under http://localhost:8080/YourApplicationName/Hello?wsdl.

Re-use your own existing Java code in a Java Service in webMethods Integration Server

Modularization is an important concept in programming. So, when I wanted to create a Java Service in webMethods’ Integration Server today, I wanted to re-use existing Java code that already provided the feature I was about to publish as a Service. However, this is not as easy as it sounds ;-) Here are the steps you need to perform to get your already existing Java code working in a Java Service:

  1. Compile and bundle your external Java project into a JAR file, e.g. using Apache Ant.
  2. Copy this JAR file into the subfolder code\jars\ of your package on Integration Server and reload the package. Example path: [...]\SoftwareAG\IntegrationServer\packages\YOURPACKAGE\code\jars\. Now, IS is able to use the JAR and automatically puts it into the package’s CLASSPATH.
  3. Copy the JAR file into the subfolder lib\ of your local Java project (in your local Eclipse workspace) and add it to the Eclipse project’s CLASSPATH, e.g. by adding <classpathentry kind="lib" path="lib/MyExternalProject.jar"/> to .classpath. Now, Software AG Designer (i.e. Eclipse) is able to resolve the external classes and can compile the code correctly, provide code completion etc.
  4. Import the needed packages from your external project and use the classes in your code. Here’s some example code from the resulting Java Service:

    package XML.Services;

    import com.wm...; // all webMethods imports
    import [...];     // imports from your external project

    public final class ValidateXML_SVC {

      /**
       * The primary method for the Java service
       * @param pipeline
       *            The IData pipeline
       * @throws ServiceException
       */
      public static final void ValidateXML(IData pipeline) throws ServiceException {
        xml = ISPipelineHelper.getInputParameter(pipeline, "xmlStream");
        schema = ISPipelineHelper.getInputParameter(pipeline, "schemaStream");
        validateXML();
      }

      // --- <<IS-BEGIN-SHARED-SOURCE-AREA>> ---
      private static InputStream xml;
      private static InputStream schema;
      private static boolean isValid;
      private static String validationError;

      private static void validateXML() throws ServiceException {
        isValid = true;
        validationError = "";
        try {
          // now you can use your own classes
          SaxXmlValidator validator = new SaxXmlValidator(xml, schema);
          validator.validate(); // method name illustrative; call your validation logic here
        } catch (XmlNotValidException e) {
          isValid = false;
          validationError = e.getMessage();
        } catch (Exception e) {
          throw new ServiceException(e);
        }
      }
      // --- <<IS-END-SHARED-SOURCE-AREA>> ---
    }

Error ISS.0088.9163 Could not retrieve WSDL for service in webMethods Integration Server

The following problem with webMethods Integration Server has already occurred twice in our environment, so I think it’s time to document the fix. When opening the generated WSDL for a web service, I got the following error:

[ISS.0088.9163] Could not retrieve WSDL for service [...], WSD not found.

The problem was the URL generated by the web service index page (/ws/): the colon – which is part of the service’s namespace – was encoded as an HTML entity, so Integration Server could not find the correct WSD. This seems like a small problem, but some of our clients used this URL and weren’t able to call the service anymore. The solution is to set an extended setting in Integration Server:


After a restart of Integration Server, the URLs generated by the index page were correct again and both versions of the URL – whether with colon or entity – worked fine.

This problem occurred once after installing Integration Server 9.5 Fix4, as described here: WSDL links are no longer working after installing 9.5 Fix4. And it occurred again after upgrading Integration Server 9.5 to 9.7.

Error ISS.0141.9208 Could not deploy the Web service descriptor in webMethods Integration Server

During the startup of webMethods Integration Server I found quite a few entries like this in server.log:

[ISS.0141.9998E] Exception --> [ISS.0141.9208] Could not deploy the Web service descriptor [...]. Cause: [ISC.0081.9164] Exception occurred during generation of WSDL for service [...]: [ISC.0124.9011] Document to XSD error: Simple type [...] does not exist

None of the mentioned web services were operational after a restart of Integration Server. A simple solution was to manually reload the corresponding packages, but I would need to do this every time I restarted Integration Server.

The root cause of this problem was the usage of documents from another package in the package for which the deployment of the WSD failed. For example, a web service in package Business uses a document defined in package Data Model. When Integration Server loads the packages – by default in alphabetical order – Data Model has not yet been loaded when the web service in Business gets deployed.

The solution is to configure the packages’ dependencies: package Business depends on package Data Model. Integration Server then loads the packages in the correct order and is able to deploy the WSD.

You can configure the dependencies in the package’s properties in Software AG Designer:

Configure Package Dependencies in Integration Server

How to change the Java version for webMethods Integration Server

Changing the Java version under which webMethods Integration Server runs and compiles Java services is quite easy (after searching for hours for the right files to change ;-)). You can check which version your Integration Server uses on the About page:

webMethods Integration Server About page: Java Version

Here’s how to change the version for Integration Server (in my case version 9.7 running on Windows Server 2012):

  1. Install the new JVM on the server Integration Server runs on, e.g. into C:\Program Files.
  2. In the Extended Settings of your Integration Server, set watt.server.compile to the new JVM, e.g. watt.server.compile=C:\Program Files\Java\jdk1.8.0_25\bin\javac -classpath {0} -d {1} {2}
    webMethods Integration Server Extended Settings: watt.server.compile
  3. Shutdown Integration Server.
  4. Edit [SAG]\profiles\IS_default\configuration\custom_wrapper.conf and add (or change) the setting, e.g.\Dev\Java\jdk1.8.0_25\bin\java
  5. Deregister the Windows service (as Administrator) with [SAG]\IntegrationServer\instances\default\support\win32\installSvc.bat unreg:
    Unregister webMethods Integration Server Windows service
  6. Register the Windows service (as Administrator) with [SAG]\IntegrationServer\instances\default\support\win32\installSvc.bat:
    Register webMethods Integration Server Windows service
  7. Start Integration Server and check the About page for whether the new Java version was set correctly:
    webMethods Integration Server About page: Java Version
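To double-check which JVM is actually in use, a trivial stand-alone program (or Java service body) can print the standard JVM system properties, comparable to what the About page shows:

```java
public class ShowJvm {
    public static void main(String[] args) {
        // Standard JVM system properties, available in every Java runtime.
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
    }
}
```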

Conventions for naming deployment artifacts in WmDeployer for webMethods Integration Server

For quite some time now I have used the default names for all the different artifacts (set, build, map, candidate) in WmDeployer, because I simply did not know how to name them in a meaningful way:

Default Deployment Properties in WmDeployer for webMethods Integration Server

However, after talking to an SAG consultant today, I finally came up with a meaningful naming convention:

  • Deployment Set: Simply name the Deployment Set according to the type of deployment it contains, e.g. IS or MWS.
    Create and name a Deployment Set for Integration Server in WmDeployer
  • Build: A meaningful name should contain a version number for the build. Just think of the build as a Java artifact (JAR), of which multiple versions can exist. A new version number is only needed if you plan to revert to an older version later; if you only have minor changes to deploy, you can simply re-build the existing version.
    Create and name a Deployment Set for Integration Server in WmDeployer
  • Deployment Map: You can define mappings for different target servers or groups. Therefore, the target’s name should be contained in the map’s name.
    Create and name a Deployment Set for Integration Server in WmDeployer
  • Deployment Candidate: A Deployment Candidate is simply the combination of a Build and a Deployment Map. This should be reflected in the candidate’s name, so you can quickly deploy a specific build to a specific target.
    Create and name a Deployment Set for Integration Server in WmDeployer

Unit-testing Flow Services in webMethods’ Integration Server with JUnit

Today, I built a small test bed for unit-testing Flow Services in webMethods’ Integration Server. If you develop Java Services for IS and follow the best practice of “no business logic in the service layer”, you can unit-test your logic in isolation in a plain old Java environment and deploy the library to IS, which then only acts as the service bus and simply forwards service calls to the library. However, if you create Flow Services directly in IS, which can get complicated pretty fast, I found no built-in tool for testing their logic. Here’s an example of a small Flow Service:

webMethods Integration Server Flow Service

So I went ahead and created a local Java project in which I could unit-test the remote services. I know that network access in unit tests is considered a “boundary” which should not be crossed, because tests run slower and may fail due to network problems. However, I would rather have a suite of (relatively) slow-running tests for my important services than have no tests at all and see programs fail due to regression. And debugging services in IS is no fun at all!

Calling the IS services directly from Java is surprisingly easy: Software AG Designer generates all the needed source code for you. Right-click on the service and choose Generate Code, then follow the steps in the wizard:

Calling IS Service Wizard 1

Calling IS Service Wizard 2

Calling IS Service Wizard 3

The generated code is directly executable, as long as you add references to IS’ client.jar and enttoolkit.jar to your Java project.

Testing IS Services References

The service call itself is implemented in this simple method:

public static IData invoke(Context context, IData inputDocument)
        throws IOException, ServiceException {
    IData out = context.invoke("PACKAGE", "NAME", inputDocument);
    IData outputDocument = out;
    return outputDocument;
}

However, as you can see above, the service accepts and returns IData objects, whose construction is not that easy and should not be scattered throughout your test code. Even providing a simple value requires at least the following code. And the code for getting the value out of IData to be able to call an assertion on it looks similar.

IData out = IDataFactory.create();
IDataCursor idc = out.getCursor();
idc.insertAfter("errorFlag", "true");

So, to make life easier for me, I spent some time and developed a small environment, in which I can work with plain old Java objects for service input and output in my tests and assert against the public fields of these objects, like this example of a simple date conversion service demonstrates:

protected ConvertDateToN8 sut;
protected ConvertDateToN8.Input input;

@Test
public void shouldConvertValidDate() throws Exception {
    input.Date = "2008-10-15";
    ConvertDateToN8.Output output =;
    assertThat(output.DateN8, is("20081015"));
}
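For reference, the conversion the ConvertDateToN8 service performs (“2008-10-15” to “20081015”) is equivalent to this plain-Java sketch (my own stand-alone version, not the flow service’s actual implementation):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateToN8 {

    // Converts an ISO date (yyyy-MM-dd) to the 8-digit N8 format (yyyyMMdd).
    static String toN8(String isoDate) throws ParseException {
        Date date = new SimpleDateFormat("yyyy-MM-dd").parse(isoDate);
        return new SimpleDateFormat("yyyyMMdd").format(date);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toN8("2008-10-15")); // prints 20081015
    }
}
```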

The service class looks like this:

public class ConvertDateToN8 extends ServiceCall<ConvertDateToN8.Input, ConvertDateToN8.Output> {

    public class Input extends ServiceInput {
        public String Date;
    }

    public class Output extends ServiceOutput {
        public String DateN8;
    }

    public ConvertDateToN8(String host) {
        super(host, "Common", "ConvertDateToN8");
    }

    public Input createInput() {
        return new Input();
    }

    protected Output createOutput() {
        return new Output();
    }
}

The only thing that needs to be added to the project to test another service is a class like the one above. I can simply define the input and output of the service in nested classes and everything else is magically built with generics and reflection. Here is an example of converting the generic Input object to IData in class ServiceCall:

protected IData createPipelineFromInput(ServiceInput input) {
    IData pipeline = IDataFactory.create();
    IDataCursor idc = pipeline.getCursor();

    Class<?> c = input.getClass();
    for (Field publicField : c.getFields()) {
        try {
            String fieldName = FieldNameConverter.convertToPipelineName(publicField.getName());
            Object fieldValue = FieldConverter.convertToPipeline(publicField.getType().getName(), publicField.get(input));
            IDataUtil.put(idc, fieldName, fieldValue);
        } catch (Exception e) {
            // ignore fields that cannot be converted
        }
    }
    return pipeline;
}
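The same reflection trick can be demonstrated without the webMethods classes by letting a plain Map stand in for IData (this is an illustrative sketch of mine, not the actual ServiceCall implementation):

```java
import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.Map;

public class PipelineSketch {

    public static class Input {
        public String errorFlag;
        public String errorCode;
    }

    // Copies all public fields of the input object into a map,
    // mirroring how createPipelineFromInput fills the IData pipeline.
    static Map<String, Object> toPipeline(Object input) throws IllegalAccessException {
        Map<String, Object> pipeline = new HashMap<>();
        for (Field field : input.getClass().getFields()) {
            pipeline.put(field.getName(), field.get(input));
        }
        return pipeline;
    }

    public static void main(String[] args) throws Exception {
        Input in = new Input();
        in.errorFlag = "true";
        in.errorCode = "0";
        System.out.println(toPipeline(in));
    }
}
```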

Even more complex data structures like objects or arrays can be constructed by extending the classes FieldConverter and PipelineConverter which encapsulate the access to IData:

public class Input extends ServiceInput {
    public String errorFlag;
    public String errorCode;
    public String errorMessage;
    public NaturalErrorInfo ERROR_INFO;
}

You can take a look at the implementation for converting the above input to and from IData in the source code, which can be downloaded below. Feel free to take a look at it and contact me for questions or additions!