RHPAM Custom Work Item Handler

Here are some notes about creating Custom Work Item Handlers for Red Hat Process Automation Manager (RHPAM). These notes should work for jBPM/KIE BPM as well.


Custom Tasks and Custom Work Item Handlers allow you to implement activities that do more than Script tasks or the OOTB task types (like Email or Log or REST Request). NOTE: I think the terms “Custom Task” and “Custom Work Item Handler” are essentially interchangeable…I can’t figure out why there are two ways to refer to this subject. As best I can tell, the Work Item Handler is the actual Java application (a JAR) that does the work, and a “Custom Task” is what it looks like once you add the Work Item Definition (WID) and such to your project. But I’ll just keep calling them Custom Work Item Handlers, or Custom WIH, for this post.

Getting Started

Though it may be tempting to start by cloning or copying an existing Custom WIH, I found that this can create some Maven POM issues that aren’t easy to resolve. The best way to start is to use the jBPM Maven archetype (like a template) to create a blank project and then add your customizations or paste in any borrowed code.

Here’s the Windows command line to initialize an empty project based on the Custom WIH archetype:

mvn archetype:generate -DarchetypeGroupId=org.jbpm -DarchetypeArtifactId=jbpm-workitems-archetype -DarchetypeVersion=7.52.0.Final -Dversion=1.0.0-SNAPSHOT -DgroupId=com.mygroup -DartifactId=workitem-name -DclassPrefix=workitemClass

The first three -D parameters identify the archetype itself and can stay as shown, except update the archetypeVersion to match whatever jBPM version you’re using (check for a Red Hat version if you’re on RHPAM – just make sure you are mapped to the Red Hat Maven repo using the documentation instructions). The -Dversion, -DgroupId and -DartifactId become the Maven GAV (group, artifact, version) data for your Custom WIH – update these to match your Custom WIH code. The GAV values will be used for dependency management in your RHPAM Project’s POM and in the WID in your RHPAM Project. -DclassPrefix just puts this value at the start of the initial class in your Custom WIH project.
Running that command will grab some files from your Maven repo, ask a simple project initialization question, and create a directory (where the command is run) to hold your Custom WIH project. You’ll have to press Y at one prompt to confirm the GAV parameters from the command. Then you’ll get two directories and a pom.xml file to get started.

It seems easiest to start up IntelliJ or VS Code or whatever your IDE of choice is and configure it to recognize the folder as a Java project with source in src > main > java and tests in src > test > java. The rest of this page will use IntelliJ, but I’m not using anything IDE-specific, so if you see any major discrepancies, please feel free to comment below.

There should be a single pre-generated class file in your project, and that is probably where you’ll want to start your actual development. It will be named something like {classPrefix}WorkItemHandler based on the archetype generation above.

It doesn’t seem like you need to make any major changes to the generated pom.xml except to add dependencies for libraries your Custom WIH will need, like JSON or Joda-Time or Mongo or whatever. See this thread about how your Custom WIH dependencies could impact your runtime.

Describe the Custom WIH with the WID

The Work Item Definition (WID) controls how your code looks to RHPAM/jBPM. You can author the WID as its own file or just use the @Wid annotation that is already provided in the class generated from the archetype. The WID is important for controlling how your Custom WIH looks to someone in Business Central. I tried using the text file like the RHPAM docs say, but I could never get it to work, so this annotation seems fine.
Here is the @Wid for a new project from the archetype with some comments about each section.

Custom WIH Java class file annotation (notes on each attribute follow the code):
@Wid(widfile="WorkitemClassDefinitions.wid", name="WorkitemClassDefinitions",
        displayName="WorkitemClassDefinitions",
        defaultHandler="mvel: new com.mygroup.WorkitemClassDefinitionHandler()",
        documentation = "workitem-name/index.html",
        category = "workitem-name",
        icon = "WorkitemClassDefinitions.png",
        parameters={
            @WidParameter(name="SampleParam", required = true),
            @WidParameter(name="SampleParamTwo", required = true)
        },
        results={
            @WidResult(name="SampleResult")
        },
        mavenDepends={
            @WidMavenDepends(group="com.mygroup", artifact="workitem-name", version="1.0.0-SNAPSHOT")
        },
        serviceInfo = @WidService(category = "workitem-name", description = "${description}",
                keywords = "",
                action = @WidAction(title = "Do an exciting Custom WIH"),
                authinfo = @WidAuth(required = true, params = {"SampleParam", "SampleParamTwo"},
                        paramsdescription = {"SampleParam", "SampleParamTwo"},
                        referencesite = "referenceSiteURL")
        )
)
widfile – Maven seems to auto-generate this file which is essentially just everything from this annotation, but in its own text file.

name – the name of the Custom WIH

displayName – what will appear in Business Central as the name of the Custom Task

defaultHandler – what a user will have to put in Project Settings > Deployments > Work Item Handlers. If you want to require a value here that ends up being used in the constructor for this class, use \" to escape the input. For example, the ExecuteSQL WIH has this value: defaultHandler = "mvel: new org.jbpm.process.workitem.executesql.ExecuteSqlWorkItemHandler(\"dataSourceName\")", which means you have to supply the dataSourceName when you pull this WIH into your project as a dependency or else the WIH won’t initialize. NOTE: the defaultHandler in the @Wid section generated from the archetype looks wrong, because the WIH is set up to require SampleParam and SampleParamTwo in its only constructor, so I think those parameters should be in the defaultHandler as well.
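To sketch the constructor side of that: the class and parameter names below are hypothetical, and in the real project the class would also extend the jBPM handler base class (e.g. AbstractLogOrThrowWorkItemHandler), omitted here so the snippet stands alone.

```java
// Matches a defaultHandler like:
//   "mvel: new com.mygroup.MyWorkItemHandler(\"dataSourceName\")"
// The escaped \"...\" argument in the mvel expression is what gets
// passed to this constructor when the handler is initialized.
public class MyWorkItemHandler {

    private final String dataSourceName;

    public MyWorkItemHandler(String dataSourceName) {
        this.dataSourceName = dataSourceName;
    }

    public String getDataSourceName() {
        return dataSourceName;
    }
}
```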

documentation – not really sure how a URL is used here…

category – This groups the Custom WIH on the BPMN authoring palette

icon – a picture for the Task/Activity box on the BPMN diagram

parameters – the Input parameters for your Custom WIH. You can use the "required" flag if you want, or leave it out so that the parameter is not required. You can also define a runtimeType like @WidParameter(name="SampleParamThree", runtimeType = "java.lang.Object"), which should be a fully qualified class name. This will help when a process author drops the Custom WIH into the BPMN flow.
NOTE: parameters appear to be entirely optional, so if your Custom WIH doesn’t need an input you don’t have to include this section.

results – the name of the object returned by the Custom WIH. You can also define the runtimeType just like SampleParamThree above.
NOTE: results also appear to be entirely optional, so if your Custom WIH doesn’t need to return anything, don’t include results.
To call the Work Item Manager’s completeWorkItem() method, you have to pass either null or a Map<String, Object>. The Map should have an entry for each Result parameter defined. So in this example, the Map<String, Object> "results" object will have an entry with "SampleResult" as the key and, for this example, a String as the value. It is perfectly fine to list multiple @WidResults and then use each of those @WidResult name values as a key in the Map<String, Object> result. Or not. See the notes below for help with results.

mavenDepends – as far as I can tell, this is the GAV for the Custom WIH, so I’m assuming that if you update the VERSION in the Custom WIH pom, you’ll want to update here as well. I guess you can also add other dependencies that you’ve added to your Custom WIH pom…? I’m not sure what happens if you do or don’t.

serviceInfo – this also seems to control how the Custom WIH appears in Business Central, but I’ll need to do some more testing.

description – notice that by default this points to a variable ${description}, but it doesn’t seem to be defined, so feel free to add description="something here" between documentation and category so the Custom WIH gets a description. I’m not sure how this is used, though.

@WidAction(title) – will show up after the Custom WIH name in Business Central, so it’s more of a headline about what the Custom WIH does, like "Send email" or "Send JMS Message" or "Call SOR API…"

@WidAuth – it is OK to leave authinfo = @WidAuth without the () values. But if anything is added here, these fields will display when the Custom WIH is added as a dependency to the process project using Settings > Custom Tasks > Install. I have no idea why this is considered "authentication"…

Code the actual Custom WIH

Make sure your class constructors match the defaultHandler from the WID and contain any required constructor values. Or leave it entirely blank – up to you. You can define more than one constructor, and you can also define any class-level variables you want, like the String sampleParam, String sampleParamTwo fields from the archetype.

Once you’re ready to write the real logic, almost everything goes into the executeWorkItem() method. A few pointers:

  1. Wrapping everything in a try/catch seems pretty standard.
  2. Just like the archetype does, use RequiredParameterValidator.validate(this.getClass(), workItem) to make sure any WID parameters marked as required=true were actually provided.
  3. To get the values of the Input Parameters from the BPMN, use the parameter name as a String for the workItem.getParameter() method and then cast the result to whatever you need for your code. If you defined the Parameter as a specific object in the WID, use that Class to cast because getParameter() just seems to return a generic object.
  4. Side note: the workItem object has a method to get the running Process Instance ID, which might be helpful: workItem.getProcessInstanceId()
  5. When your code is done you’ll have to call manager.completeWorkItem(workItem.getId(), results) to close out the Task. "results" is a required parameter for the method, but you can pass null if your Custom WIH doesn’t need to return anything.
    • If your Custom WIH needs to return anything, the results parameter for completeWorkItem is always of type Map<String, Object>.
      • Each key (the String) in the Map should correspond to whatever you’ve put in the results/@WidResult section. So if you have a @WidResult named "SampleResult", your code will call results.put("SampleResult", value), where value is whatever your result object is. If you use a runtimeType in your @WidResult section, make sure the Object in the Map is that same type.
      • NOTE: You can also put items into the results Map that aren’t listed in results/@WidResult. They will still be available to the BPMN, but will have to be added and mapped manually by the BPMN author when your Custom WIH is added to the BPMN palette and the author assigns the Data Mapping.
  6. You can probably leave the abortWorkItem() method alone…or maybe do something like close a connection in a scenario where your Custom WIH is interrupted for whatever reason.
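Putting those pointers together, here’s a minimal, self-contained sketch of the executeWorkItem() flow. The WorkItem/WorkItemManager stand-ins are simplified so the sketch compiles on its own; the real interfaces live in org.kie.api.runtime.process, and the parameter/result names are the sample ones from the @Wid above.

```java
import java.util.HashMap;
import java.util.Map;

public class SampleHandlerSketch {

    // Simplified stand-ins for the real jBPM interfaces
    interface WorkItem {
        long getId();
        Object getParameter(String name);
    }

    interface WorkItemManager {
        void completeWorkItem(long id, Map<String, Object> results);
    }

    public void executeWorkItem(WorkItem workItem, WorkItemManager manager) {
        try {
            // 1. In the real handler, validate required @WidParameters first:
            //    RequiredParameterValidator.validate(this.getClass(), workItem);

            // 2. getParameter() returns Object, so cast to the expected type
            String sampleParam = (String) workItem.getParameter("SampleParam");

            // 3. Do the actual work, then build the results map; each key
            //    should match a @WidResult name (or be mapped manually later)
            Map<String, Object> results = new HashMap<>();
            results.put("SampleResult", "processed: " + sampleParam);

            // 4. Close out the task; pass null instead of results if the
            //    handler returns nothing
            manager.completeWorkItem(workItem.getId(), results);
        } catch (Exception e) {
            // Standard pattern: rethrow so the engine can handle the failure
            throw new RuntimeException(e);
        }
    }
}
```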

Feel free to write any tests and such that you want. I’ll add some more notes on this later.

Build and Deploy and Integrate

It seems easiest to run the build from the command line in the project directory instead of using the IDE, but feel free to do this whatever way you want. From the command line, this seems to work best:

mvn clean package -Dmaven.test.skip=true

This command will do a ton of stuff but most importantly it creates a JAR in /target that you’ll want to upload to Business Central. It also creates a lot of other stuff that I have no idea how it’s used, so that will be for another day.

In order to fully integrate the Custom WIH in a Process Project, it seems like you need to follow all three (+1) of these steps from here:

  1. ADDING THE WORK ITEM HANDLER TO BUSINESS CENTRAL AS A CUSTOM TASK
    1. This is where you upload the mvn clean package JAR to BC
  2. INSTALLING THE CUSTOM TASK IN YOUR PROJECT – Custom Tasks
    • Project > Settings > Custom Tasks
      1. This should auto-generate the WID Asset for your process project once you click Save after Installing the Custom Task
  3. INSTALLING THE CUSTOM TASK IN YOUR PROJECT – Add GAV Dependency to your project’s pom.xml
    1. Use your Custom WIH’s GAV that is defined in the Custom WIH pom.xml. 
    2. I’m not really sure why you have to do this because BC makes it seem like it will add the dependency automatically, but it doesn’t at this point, so make sure it gets added under Project > Settings > Dependencies and that the Custom WIH GAV shows up in the Process App’s pom.xml.
  4. MAYBE: You might have to create an entry in Project > Settings > Deployments > Work Item Handlers with the @Wid name and @Wid defaultHandler values from your Custom WIH. This entry seems to show up automatically sometimes but not other times…not sure how this is related, but just make sure you have the Work Item Handler defined here.

NOTE: It seems to help if you add the following plugin to your Process Application Project’s pom.xml. This seems to help ensure your Project is built with any required downstream dependencies from assets like your Custom WIH.

<plugin>
    <artifactId>maven-assembly-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
</plugin>

Once you’ve saved all those project-level changes, you should be able to see your Custom WIH in your BPMN palette. Congratulations! Drag your new Custom WIH into your BPMN flow and map it up. REMEMBER: for Data Assignments > Data Outputs, the "Name" should match a key in the Map<String, Object> results parameter your Custom WIH passes to completeWorkItem(). The value behind that key resolves to a generic Object, so make sure you cast it before you try to do anything with it. You’ll want to do something like:

if (sqlResult != null) {
    java.util.List<String> lines = (java.util.List<String>) sqlResult;
}

That will get you a List<String> of the SQL results from the ExecuteSQL Work Item Handler. You know that "Result" parameter is a List<String> because you looked at the handler code and saw results.put(RESULT, processResults(resultSet)); where processResults returns a List<String>.

So hopefully these notes help anyone trying to develop a Custom Work Item Handler or Custom Task in RHPAM/jBPM.

Feel free to add a comment or send me a note if you have any feedback or questions.

Hopefully I’ll get around to Exception handling and such in a later post.

RHPAM, jBPM, Kogito…oh my

I’m taking a detour from IBM BAW/BPM for a bit and wanted to jot down some lessons learned while working on a Proof-of-Concept with Red Hat Process Automation Manager (RHPAM), which is also kind of just JBoss jBPM, but then also this Kogito KIE (“Knowledge Is Everything”) product…yeah, welcome to Open Source. We’ll call it RjK for this post.

For the most part, RjK is a full Business Process Management/Process Automation/Workflow solution. It offers design/authoring functionality in a web-based environment under the label “Business Central” (BC). In Business Central you can author process flows, rule assets, data models (like Business Objects), forms (like Coaches), and a few other asset types I haven’t used yet. RjK is much more Java than anything in the IBM BPM/BAW world: the Data Model asset is essentially a Java class file, the process assets are full BPMN-compliant XML files, etc. RjK covers process, rules (pretty much the Drools product), and then this OptaPlanner feature that I haven’t looked at yet but seems like a cool way to create optimization solutions for things like “the best way to plan delivery trips” or “solving a Sudoku puzzle”.

There are a ton of open source resources for RjK: examples on GitHub, posts on Stack Overflow, Medium.com posts, etc. All three products offer documentation, and some of it is copy/pasted from each other, but looking at examples is usually the best place to start.

RHPAM is Red Hat’s “hardened” version of jBPM with some Kogito/KIE functionality included. I think this means customers can pay for support and consulting for almost the latest version of jBPM/Kogito, but with a stamp of approval from Red Hat that the version is safe(r). From what I can tell, Kogito is the migration path for what was previously called jBPM. Kogito is putting a more cloud-friendly spin on the offering with things like Quarkus, Kubernetes, and GraalVM…way outside of my comfort zone, but still neat to tech-drop like that.

RjK applications are built into kJAR files which can run under the umbrella of a KIE server, or kJARs can be setup to run all by themselves, or even embedded in another application. There are a ton of API options for RjK, all the way from Business Central to KIE to the individual objects that actually execute the process and rules assets (REST APIs, Java APIs, other APIs I’m sure I don’t understand). It’s very compartmentalized and much more “micro-services” than IBM BPM.

This PoC sets up one Business Central instance on a Kubernetes pod and one KIE Server instance on a different Kubernetes pod. There is a shared Persistent Volume (PV) for the two pods; it is used for the Maven repo mentioned below. The PoC is essentially trying to mimic IBM BPM by using Business Central as a Process Center-type environment and KIE Server as a Process Server environment. Projects are deployed from Business Central to the KIE Server instance, which is running other projects/containers at the same time. Business Central is connected to KIE Server so BC can manage and start instances and interact with tasks.

I’m also messing around with Spring, Apache Camel, Kafka, and Mongo for some supplemental functionality. I need to scrub the code and get it into GitHub and I’ll share some learning notes on those capabilities next.

Again, these are just raw notes from some setup and configuration work. Hopefully more to come:

  1. Connect Business Central (BC) to KIE/Process Server
    1. This older 7.0 article mentions a specific login-module that needs to be enabled on BC to authenticate to KIE, but the 7.12 page doesn’t mention it:
      <login-module code="org.kie.security.jaas.KieLoginModule" flag="optional" module="deployment.business-central.war"/>
    2. We had to add that element to the BC standalone*.xml in order to get BC Workbench to see process data on KIE
  2. Asset Path – The way BC and KIE store assets (aka code) can be a little confusing…and obviously I’m not 100% sure all of this is correct.
    1. BC uses an internal Git repository (.niogit) in a hidden directory to store and manage design-time assets (versioning, branching, etc.)
      1. There is a lot of read/write to this repo so it needs to be local to BC
    2. BC then uses Maven to pull code from .niogit and push built assets to a Maven repository
      1. This repository can be local to BC only or external
        1. A local BC Maven Repo (local as in on the file system where BC is running) is accessible by the BC/maven2 REST endpoint, which has a special security configuration (we could not get this to work)
          1. NOTE: this might work if we add the login-module above to KIE standalone…but we’ll wait and try this later
        2. The external repo can be over HTTP (like Artifactory)
        3. Or a shared file location
          1. We chose to point BC maven to a Persistent Volume (PV) that KIE Maven could also reach
    3. At PAM Project deployment, BC sends container information to KIE (?) along with the build to the Maven Repo
      1. Totally not sure about how this really works…KIE has a configuration reference to BC’s controller, so I’m not sure if KIE is pinging BC to get deployment info…or if BC pushes or signals KIE in some way…
    4. KIE goes to the same Maven repo to pull down the assets (the KJAR and pom) and build (via Maven) the actual container that will run while keeping the build locally in its own Repo…? Again, not sure at all how this is working but this PoC has .niogit on BC’s filesystem, a Repo for BC and a Repo for Maven and it’s working.
  3. Events – It looks like there are three different event scenarios for PAM:
    1. Automatic Event Emitters
      1. This a feature where jBPM will automatically send/publish a Process, Task, or Case event to Kafka using the built-in jbpm.event.emitters feature. This feature has to be enabled and configured in standalone-*.xml. See notes here.
    2. Custom Event Emitter
      1. This is where an activity in a BPMN flow is built to send/publish an event in a specific scenario in the process.  This option requires a WorkitemHandler to be configured.  Here.
    3. Basic Event Processing
      1. This is how to configure KIE to consume and produce Kafka events using the built-in Kafka server functionality. This only works with an app deployed to a full KIE Server that is already configured to deal with Kafka. This configuration setup is slightly different than the Automatic Event Emitters.
      2. See here.
      3. NOTE: if you’re setting up a Receive Message Event (consume) in a BPMN asset, the event payload has to match the Message Start Event data mapping exactly. And it needs to be JSON in the form {"data":{ …JSON of input variable name/value pairs… }}. You can’t just use a plain string.
  4. Signals vs Messages
    1. I’m not really sure if this is correct based on the documentation, but apparently Messages are universal, which means you can’t target a specific Process Instance with a Message Event. Using the REST API or the Kafka connection to a Message Topic will fire the message on all active instances…?  This might be unique to the scenario of setting up a central KIE server to run multiple containers/deployment units.
    2. Signals use a specific Process Instance ID and can target only that instance.
  5.  Signals
    1. When firing a signal via the REST API, the payload/body has to include the class name of the output variable defined for the Signal. So if a Signal is mapped to an output variable like this: {"MyObject":{"myName":"Sue Smith","myAge":25}}, that JSON has to be in the body of the REST API call to fire the Signal, where "MyObject" is the class name of the Data Model for the output variable. Using the name of the output variable doesn’t seem to work.
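To make that gotcha concrete, here’s a tiny sketch of building that signal body. "MyObject" and the field values are the hypothetical ones from the example above, and in practice you’d use a real JSON library rather than string concatenation.

```java
// Builds the REST signal payload: the top-level key must be the CLASS name
// of the Signal's output variable ("MyObject" here), not the variable name.
public class SignalPayload {

    public static String build(String outputClassName, String fieldsJson) {
        return "{\"" + outputClassName + "\":" + fieldsJson + "}";
    }

    public static void main(String[] args) {
        String body = build("MyObject", "{\"myName\":\"Sue Smith\",\"myAge\":25}");
        // Produces: {"MyObject":{"myName":"Sue Smith","myAge":25}}
        System.out.println(body);
    }
}
```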

Some helpful links:

https://medium.com/capital-one-tech/using-machine-learning-and-open-source-bpm-in-a-reactive-microservices-architecture-96bb8dc9e962

https://snandaku.medium.com/integrating-red-hat-process-automation-manager-and-red-hat-amq-streams-on-openshift-in-4-steps-327aa2da7929

https://mswiderski.blogspot.com/2015/09/unified-kie-execution-server-part-3.html?force_isolation=true

Another IBM BPM/BAW Date Hack

The team had a user story to determine the Thursday of the 2nd week of the first month of a quarter (so January, April, July and October), recognizing that a week might only contain one day of the month. This would be server-side code running in a BAW Service Flow asset.

January 2021 is a good example:

Notice that January 1 and 2 fall on Friday and Saturday but our user story considers that a valid “week” for the month. So the first Thursday of the 2nd “week” would be January 7.

We had existing code from a prior version of this requirement where the user story was written as “the 2nd Thursday of the month”, which would land on January 14, 2021. That code found the first occurrence of the weekday in the month (Thursday is getDay() == 4 in this case) using mod (%) 7, then moved that day-of-month (1-31) forward by 7 until it landed on the correct 2nd occurrence: (2-1)*7.

With the new user story, we still determine the Thursday of the 2nd week (that code has been successfully tested), but then we check to see if that week is actually the 2nd week in the month. If not, we subtract 7 days and use that value.

Here’s the new code:

var c = new java.util.GregorianCalendar();
c.set(tw.local.targetDate.getFullYear(), tw.local.targetDate.getMonth(), tw.local.targetDate.getDate());
c.setMinimalDaysInFirstWeek(1);
var wk = c.get(java.util.Calendar.WEEK_OF_MONTH);

if (wk > 2) {
   // If we're not in the 2nd week, move back one week (7 days)
   tw.local.targetDate.setDate(tw.local.targetDate.getDate() - 7);
}

We start by creating an instance of the Java GregorianCalendar (the regular Calendar option doesn’t seem to work in either BPM, Java 8, or Rhino, for some reason). And we set the calendar to the matching year/month/date of targetDate (this was previously defined as the 2nd Thursday of the first month in the quarter using the existing code).

Then we call setMinimalDaysInFirstWeek to define how many days we consider to be in the first week of the year (don’t worry, this carries over to the rest of the calendar year – October 2021 has the same setup as January 2021). In this case, just 1 day is considered a “week”.

Then we use the WEEK_OF_MONTH constant to determine which week of the month we’re in and decide if we can keep the current targetDate or set it back 7 days to the previous week (it will never be more than 3 weeks off because of the existing “2nd Thursday of the month” logic).

The GregorianCalendar object was helpful and being able to merge the Java and JavaScript was much easier than trying to transcribe into JavaScript or write something entirely from scratch.

The Java GregorianCalendar has a few other helpful methods, one of which is “roll”, but it behaves differently in Java 7 vs Java 8, so consider your environment if you want to experiment with this object.
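The same week-of-month check can be sketched in plain Java for testing outside of BPM/Rhino. The dates assume the January 2021 example above, and firstDayOfWeek is pinned to Sunday so WEEK_OF_MONTH doesn’t depend on the default locale.

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class SecondWeekThursday {
    public static void main(String[] args) {
        // Start from Jan 14, 2021: the "2nd Thursday of the month" per the old logic
        Calendar c = new GregorianCalendar(2021, Calendar.JANUARY, 14);
        c.setFirstDayOfWeek(Calendar.SUNDAY);  // pin so WEEK_OF_MONTH is locale-independent
        c.setMinimalDaysInFirstWeek(1);        // a single day (Jan 1-2, 2021) counts as a "week"

        if (c.get(Calendar.WEEK_OF_MONTH) > 2) {
            // Not in the 2nd week of the month, so move back one week
            c.add(Calendar.DAY_OF_MONTH, -7);
        }

        // Jan 1, 2021 is a Friday, so week 2 is Jan 3-9 and the answer is Jan 7
        System.out.println(c.get(Calendar.DAY_OF_MONTH)); // 7
    }
}
```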

JMS Learning

IBM BPM added some event capabilities they call DEF for Dynamic Event Framework. It looks like this functionality could replace Tracking Groups and Tracking Points in process assets. Essentially DEF provides a way to dump notifications and data out of the process to an external system.

IBM KC Article on DEF
https://www.ibm.com/support/knowledgecenter/en/SSFPJS_8.5.7/com.ibm.wbpm.admin.doc/topics/capturingevents.html

My current role has a potential business case to connect an IBM BPM application to a Solace message engine. IBM BPM would both drop messages to Solace and also be able to consume messages (or be called from Solace to send a message over).

When I started some research on Solace and connecting it to WebSphere Application Server I realized I didn’t have any practical experience with JMS so I set aside some time this week to hack through some JMS tutorials and see what I could learn. Here are my notes from this journey.

Sites that helped

Old School JMS Tutorial from IBM: https://www.ibm.com/developerworks/java/tutorials/j-jms/j-jms-updated.html

JMS in Liberty:
https://github.com/WASdev/sample.jms.server

JMS Overview:
https://www.javatpoint.com/jms-tutorial

So first I just needed to learn the lingo of JMS and all of those sites helped with that. Next thing was to get coding, so I had to download the latest version of Eclipse (64-bit) for Java EE Developers. Easy enough. I already have a 64-bit Java JDK so there wasn’t a need to download anything new from Oracle.

My plan was to take the code samples from Mr. Farrell’s IBM JMS overview and put them into a new project that would run on Liberty using the queues and topics set up by lauracowen’s demo. I like to combine things to help understand how each of them works and to avoid the cookbook approach of just following directions without understanding what was happening.

I wanted to focus on Pub/Sub so I took Mr. Farrell’s code for TPublisher.java and TSubscriber.java into a new Eclipse project. I liked that his code prompted for TopicConnectionFactory and Topic names; laura’s sample had the values in the code. So I figured I could use laura’s server.xml for Liberty and just type in the values to Mr. Farrell’s app and all would be good.

Well it turns out I couldn’t really figure out how to connect Mr. Farrell’s code to the Liberty JMS stuff in server.xml. I mean I ran the code “On Server” but I don’t really know if it had all the necessary context. It was failing at the JNDI look-up.

So I ditched that approach and went with all of lauracowen’s code. I didn’t use Git for Eclipse, so I just created a new project called jms11-JMSSample, copied all of her Java files and packages into Eclipse, used her server.xml, and booted up Liberty.

I kept getting another JNDI failure. Ugh. It was here:

TopicConnectionFactory cf1 = (TopicConnectionFactory) new InitialContext().lookup("java:comp/env/jmsTCF");  

Again I couldn’t figure out what was the matter. The JNDI values seemed fine in server.xml:

	<jmsTopicConnectionFactory jndiName="jmsTCF"
		connectionManagerRef="ConMgr3" clientID="clientId1">
		<properties.wasJms />
	</jmsTopicConnectionFactory>

So what was the matter?

This is another case of not doing Java development frequently enough…I didn’t copy web.xml over. web.xml had the Java resource references that connected the code to the JNDI values. What’s interesting is that if I replaced the lookup value in the code with just "jmsTCF" it worked fine – yay! But it took me a bit more time to understand how web.xml fit into that flow and get it corrected. The web apps worked perfectly after that.
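For reference, the missing piece was a resource-ref along these lines in web.xml, which maps the java:comp/env/jmsTCF name the code looks up to the jmsTCF resource defined in server.xml. This is a sketch based on the standard Java EE deployment descriptor; the exact entry in the sample project may differ slightly.

```xml
<resource-ref>
    <!-- The name the code looks up under java:comp/env/ -->
    <res-ref-name>jmsTCF</res-ref-name>
    <res-type>javax.jms.TopicConnectionFactory</res-type>
    <res-auth>Container</res-auth>
</resource-ref>
```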

My next task was to find some sort of tool that would let me look at the JMS queues and topics in Liberty. The sample web app had a little bit of info but I wanted to change it up and start dropping messages, not consume them, but be able to look at them somewhere else.

I found this application: JMSToolBox

It looked like a good solution, so I downloaded the zip. Unfortunately I couldn’t get this tool to talk to my Liberty server. I have the full WAS JARs that are required and the Liberty JAR as well, but I’m getting hung up on the SSL stuff. I don’t need SSL (everything is just local) but either Liberty or JMSToolBox is somehow forcing something over https instead of just http…I’ll keep trying to work on this, or maybe try a different tool…or just try JConsole. I’ll post an update later.

But in the end I got a good refresher on Java development and Liberty, plus an overview of JMS. Now I get to keep hacking with the code and start writing my own messages and topics.

Then maybe I’ll move on to adding an Elasticsearch index as a source for the messages…?

Java notes

Just a few tidbits of information I reference pretty regularly for anything Java.

Java was the language of choice for all my CS classes in college but my first few career roles had nothing to do with programming except for a contract job using the LAMP stack.  The IT side of my financial services roles were mostly .NET.

When I moved to Big Blue I had to refresh my Java knowledge, primarily for a project using IBM Operational Decision Manager (ODM).  The IBM ODM rules platform at the time had a couple of ways to create what it called a Business Object Model (BOM) which is essentially a vocabulary of nouns and verbs that can be used to author business rules. 

For example – you might have an object called “a Customer” with properties like “name” (String) and “age” (Integer) and “can purchase alcohol” (Boolean).  When you wanted to write a rule about that object you could simply write something like:

If the age of the customer is more than 21 then make it true that the customer can purchase alcohol.

The BOM was supported by an Executable Object Model (XOM), which could be sourced from an XML Schema or a library of Java classes.

It was easy enough in Eclipse to create a class with some fields and have Eclipse automatically generate the “getters” and “setters”. Some of the real work came when you had to decide what functionality resided on the Java side as public or private methods and what functionality you wanted to expose in the BOM.
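As a rough sketch, a XOM class backing the Customer example above might look something like this (the class and field names are just made up to mirror the BOM vocabulary):

```java
// Hypothetical XOM class backing the "Customer" BOM element
public class Customer {
    private String name;
    private int age;
    private boolean canPurchaseAlcohol;

    public Customer(String name, int age) {
        this.name = name;
        this.age = age;
    }

    // Getters and setters like these are what Eclipse generates for you
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    public boolean isCanPurchaseAlcohol() { return canPurchaseAlcohol; }
    public void setCanPurchaseAlcohol(boolean value) { this.canPurchaseAlcohol = value; }
}
```

A rule like the one above would then read the age property and write the canPurchaseAlcohol property through these accessors.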

But either way – I had to refresh my basic Java skills and some of these notes came in handy.

Creating Objects

  1. Declaration: a variable declaration associates a variable name with an object type.
  2. Instantiation: The new keyword is a Java operator that creates the object.
  3. Initialization: The new operator is followed by a call to a constructor, which initializes the new object.
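All three steps usually happen in a single line of Java. A quick sketch (the Point class here is just an illustration):

```java
// Hypothetical class used to illustrate the three steps of object creation
public class Point {
    public int x;
    public int y;

    public Point(int x, int y) { // constructor invoked by the "new" operator
        this.x = x;
        this.y = y;
    }

    public static void main(String[] args) {
        // Declaration (Point origin), instantiation (new), initialization (constructor call)
        Point origin = new Point(0, 0);
        System.out.println(origin.x + "," + origin.y); // prints 0,0
    }
}
```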

Interfaces (“implements”)
• public class UsingClass implements InterfaceClass
• The interface InterfaceClass defines a set of constants and method signatures (methods with no bodies)
• The implementing class UsingClass has to actually define a method body (make the methods do something) for every method declared in the interface (in this example InterfaceClass)
• This creates a standard (like an API) for how to engage with the class
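A minimal sketch using the names from the bullets above (the greet method and GREETING constant are made up for illustration):

```java
// The interface declares the contract: a constant and a method signature
interface InterfaceClass {
    String GREETING = "Hello"; // interface fields are implicitly public static final

    String greet(String name); // no body here; implementers must provide one
}

// The implementing class must supply a body for every interface method
public class UsingClass implements InterfaceClass {
    @Override
    public String greet(String name) {
        return GREETING + ", " + name + "!";
    }

    public static void main(String[] args) {
        System.out.println(new UsingClass().greet("world")); // prints Hello, world!
    }
}
```

Any caller that only knows about InterfaceClass can work with UsingClass (or any other implementation) without caring how greet is actually implemented.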

Inheritance (“extends”)
• When you want to create a new class and there is already a class that includes some of the code that you want, you can derive your new class from the existing class. In doing this, you can reuse the fields and methods of the existing class without having to write (and debug!) them yourself.
• A subclass inherits all the members (fields, methods, and nested classes) from its superclass. Constructors are not members, so they are not inherited by subclasses, but the constructor of the superclass can be invoked from the subclass
• public class Bicycle {…..}
• public class MountainBike extends Bicycle {…}

• MountainBike has access to all of the (non-private) variables and methods of Bicycle
• MountainBike can declare a method with the same signature as one in Bicycle (which means MountainBike overrides the Bicycle method)
• super.methodName() calls the method of the superclass (the class that was extended)
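Filling in the Bicycle/MountainBike skeleton above with a small, hypothetical example of overriding and calling super:

```java
// Superclass with a field and a method that subclasses inherit
class Bicycle {
    int gear = 1;

    String describe() {
        return "Bicycle in gear " + gear;
    }
}

// MountainBike inherits gear and describe(), and overrides describe()
class MountainBike extends Bicycle {
    @Override
    String describe() {
        // super.describe() invokes the overridden Bicycle version
        return super.describe() + " (mountain)";
    }
}

public class Demo {
    public static void main(String[] args) {
        System.out.println(new MountainBike().describe()); // prints Bicycle in gear 1 (mountain)
    }
}
```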

WebSphere Notes

Obviously the Big Blue stack I used was based heavily in Java and at the time that meant WebSphere Application Server Network Deployment (ND) Profile was the server of choice for IBM BPM and IBM ODM.

Since then Liberty has taken on a much larger role, which is really nice to see.  But it seems like a lot of big enterprises still rely on traditional WAS ND, so I don’t think this knowledge is a complete waste.

When I got to my solutions role I knew very little about WAS beyond the fact that it was a Java application server.  I had some experience with WebLogic and WAS Community Edition on my personal Oracle VMs, but nothing very specific and nothing like managing a network deployment environment.  Installing one Java EE app on WAS CE didn’t compare to deploying applications across a three-node cell.  IBM BPM on WAS was a big introduction for me, and that made IBM ODM on WAS so much easier.

Here are some notes for IBM WebSphere Application Server ND/Traditional/Legacy (whatever you want to call it these days).  I’ll log another post with a few tiny notes on WAS Liberty Profile from a POC with IBM Process Federation Server.

I know this information is pretty basic for most folks, but for a guy brand-new to WAS ND, just understanding the layout and the components was extremely helpful.

WebSphere (Network Deployment)

  • Application Server – a Java application that runs other Java applications
  • Server – an entity that actually runs the Java EE application (one or more than one, depending on the configuration)
  • Node – for all intents and purposes, it’s a server running enterprise Java applications
  • Cell – a group of nodes
  • Deployment Manager – a specific server instance responsible for managing a cell.  It essentially consolidates server management for all the nodes in the cell.  It communicates with each node via a Node Agent.  Most people interact with the Deployment Manager using the WAS Integrated Solutions Console – essentially a web application that manages WebSphere.
  • Node Agent – an admin-type server program that communicates with the Deployment Manager to localize the management tasks.

This is a great resource for more in-depth information about Traditional ND and Liberty Profile:

https://www.redbooks.ibm.com/redbooks/pdfs/sg248022.pdf

And one more link with a great diagram of WAS ND:

https://itdevworld.wordpress.com/2009/05/03/websphere-concepts-cell-node-cluster-server/