Wednesday 20 August 2014

Quick tip: How to change database dynamically in a Mule flow for a JDBC endpoint

Let's say you have lots of databases that you, for some reason, would like to run the same integration flow against. For example, you might have a distributed company structure with one identically structured database per subsidiary, and you want to run a query against all of them and consolidate the results into some kind of reporting system.
To avoid having one connector per database (say you have 50 subsidiaries with the same financial system), and to avoid making each query specific to one subsidiary system, here is today's quick tip on one way to switch databases on the fly in your integration flow.

Note: I will not go into how to configure JDBC connectors, best practices around them, performance issues, or discussions about whether this solution is good or bad architecture. I'm just constructing a case to show how one can change a data source on the fly from within a Mule flow.
So here we go:

First, you can specify a datasource, something like this:

 <spring:bean id="dataSource" class="org.enhydra.jdbc.standard.StandardDataSource" destroy-method="shutdown" name="dataSource">  
       <spring:property name="driverName" value="net.sourceforge.jtds.jdbc.Driver"/>  
       <spring:property name="user" value="${dbusername}"/>  
       <spring:property name="password" value="${dbpassword}"/>  
 </spring:bean>  

Then you can set up a connector using that datasource, with all the queries you need, like this:

  <jdbc:connector name="db_conn" dataSource-ref="dataSource" pollingFrequency="5000" doc:name="Database" validateConnections="false">  
     <jdbc:query key="myFantasticQuery1" value="SELECT #[header:INBOUND:company] AS company, blah, blah, #[header:INBOUND:dynamicallysetvalue] AS blah FROM ... WHERE #[header:INBOUND:anotherdynamicallysetvalue]"/>  
     <jdbc:query key="myFantasticQuery2" value="SELECT blah blah blah... and so on"/>  
  </jdbc:connector>  

Now you can use an endpoint to call one of the queries, for example inside an "All" component: http://www.mulesoft.org/documentation/display/current/All+Flow+Control+Reference.

 <jdbc:outbound-endpoint connector-ref="db_conn" doc:name="DB" exchange-pattern="request-response" queryKey="myFantasticQuery1"/>  

But this would run the query for every subsidiary against the same database.

To change the database on the fly, based on header values, right before the call to the JDBC endpoint, a Java component comes in handy:

 public class ChangeDatabase implements Callable {  
           private String serverip;  
           private String serverport;  
           private String dbprefix;  

           public Object onCall(MuleEventContext eventContext) throws Exception {  
             boolean success = false;  
             MuleMessage message = eventContext.getMessage();  
             // look up the datasource bean from the Mule registry and repoint it  
             org.enhydra.jdbc.standard.StandardDataSource ds = (org.enhydra.jdbc.standard.StandardDataSource) eventContext.getMuleContext().getRegistry().lookupObject("dataSource");  
             ds.setUrl("jdbc:jtds:sqlserver://" + serverip + ":" + serverport + ";databaseName=" + dbprefix + message.getInboundProperty("company"));  
             success = true;  
             return success;  
           }  

           // plus setters for serverip, serverport and dbprefix so Spring can inject them  
 }  

This solution looks up the datasource object defined earlier from the Mule registry and sets the JDBC URL based on a dynamic header, in this case a "company" header set to identify the subsidiary.
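Stripped of the Mule APIs, the URL assembly inside onCall is plain string concatenation. A minimal sketch (class and method names are mine, the example values are made up):

```java
public class JdbcUrlBuilder {

    // Builds a jTDS SQL Server URL of the same shape as in ChangeDatabase:
    // jdbc:jtds:sqlserver://<ip>:<port>;databaseName=<prefix><company>
    public static String build(String serverip, String serverport, String dbprefix, String company) {
        return "jdbc:jtds:sqlserver://" + serverip + ":" + serverport
                + ";databaseName=" + dbprefix + company;
    }

    public static void main(String[] args) {
        System.out.println(build("10.0.0.5", "1433", "fin_", "subsidiary01"));
        // prints jdbc:jtds:sqlserver://10.0.0.5:1433;databaseName=fin_subsidiary01
    }
}
```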

To use it, simply declare it in your spring:beans section (you could use a singleton instead if your ChangeDatabase is thread safe):

     <spring:bean id="changeDB" class="ChangeDatabase">  
        <spring:property name="serverip" value="${dbserverip}"/>  
        <spring:property name="serverport" value="${dbserverport}"/>  
        <spring:property name="dbprefix" value="${dbprefix}"/>  
     </spring:bean>  

and add an object reference to it right before the jdbc:outbound-endpoint, and it will automatically change the database before the query runs.

     <spring-object bean="changeDB"/>  
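Putting the pieces together, the relevant part of the flow might look like this (the flow name and the surrounding steps are made up for illustration):

```xml
<flow name="consolidateSubsidiaryFlow">
    <!-- ... inbound endpoint and steps that set the "company" header ... -->
    <component doc:name="Change database">
        <spring-object bean="changeDB"/>
    </component>
    <jdbc:outbound-endpoint connector-ref="db_conn" exchange-pattern="request-response" queryKey="myFantasticQuery1" doc:name="DB"/>
</flow>
```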

That's it!

Happy hacking!

Friday 2 May 2014

Mule mockless integration testing tip, or "waiting for a mature Munit"

I really like the Munit framework that is being developed, and I will go back to using it as soon as it is more mature and stable, especially once the Anypoint Studio (formerly Mule Studio) support stabilizes.

However, my experience so far as an early adopter has given me issues I can't spend time on during day-to-day production work.

So how can I do Mule integration tests without Munit, and without having to mock every endpoint that has environmental bindings?

One way of doing it, which I would like to share, is by using the FunctionalTestCase as described in the Mulesoft documentation, but with a little twist.

To be able to interact with a "full featured" environment without mocking or modifying the application, I make use of the fact that Mule is based on Spring: I combine the Mulesoft FunctionalTestCase mentioned above with Spring's JUnit framework and Maven's failsafe plugin.

For example, let's say I would like to have an integration test that verifies the results from a flow with an HTTP endpoint, where a <mule-ss:http-security-filter> checks basic authentication against a configured <mule-ss:security-manager>, without using any mocking or modifications.

 <flow name="ToTest" doc:name="ToTest">  
     <inbound-endpoint name="testendpoint" address="${endpointbase_address_in}" doc:name="Generic" exchange-pattern="request-response" mimeType="application/json">  
          <!-- first authenticate -->  
          <mule-ss:http-security-filter realm="mule-realm"/>  
          <!-- check that the user is authorized to execute this flow -->  
          <custom-security-filter class="se.redpill.pnd.mulecomponents.SpringSecurityRoleFilter">  
            <spring:property name="allowedAuthorities" value="ROLE_SUPPLIER" />  
          </custom-security-filter>  
     </inbound-endpoint>  
     <jersey:resources doc:name="REST">  
          <custom-interceptor class="se.redpill.pnd.util.LoggingInterceptor"/>  
          <spring-object bean="testDataBean" />  
     </jersey:resources>  

     <!-- flow execution ... -->  
 </flow>  

This requires the test setup to send a valid username and password to the HTTP endpoint in order to be accepted, and the test must be able to evaluate the outcome of the flow.

To be able to do this, we need to share configuration between the application itself and the test setup, even if the application has environment-based configuration such as Maven filtering and/or property file overrides.

This could be property placeholders holding usernames for in-memory authentication providers, or placeholders for database-based authentication, but also environment-specific Spring application context setup such as beans, Spring Security settings, annotation config and so on.

Another thing I want to make sure of is that the unit test environment is not disturbed by my integration test setup.

 @PropertySource({"classpath:/myacceptancetest.properties", "classpath:/mytest.properties"})  
 public class ITTestData extends FunctionalTestCase {  

      @Autowired  
      Environment env;  

      @Override  
      protected String getConfigResources() {  
           return "MyTestflow.xml";  
      }  

      @Test  
      public void testSend() throws Exception {  
        MuleClient client = new MuleClient(muleContext);  
        String payload = FileUtils.readFileToString(new File("src/test/resources/testdata.json"));  
        Map<String, Object> properties = new HashMap<String, Object>();  
        properties.put("Content-Type", "application/json");  
        // look up the actual endpoint address from the running Mule instance  
        ImmutableEndpoint endpoint = muleContext.getRegistry().lookupObject("testendpoint");  
        String username = env.getProperty("test.user");  
        String passwd = env.getProperty("test.password");  
        String address = endpoint.getEndpointURI().getAddress();  
        // inject username:password into the address for basic authentication  
        int pos = address.indexOf("://");  
        String beginaddress = address.substring(0, pos + 3);  
        String endaddress = address.substring(pos + 3, address.length());  
        address = beginaddress + username + ":" + passwd + "@" + endaddress + "/signe/pnd/register/enkat";  
        MuleMessage result = client.send(address, payload, properties);  
        assertEquals("{\"Status\" : \"OK\"}", result.getPayloadAsString());  
      }  
 }  

Let's have a look at the example above.

We extend FunctionalTestCase and specify, by overriding getConfigResources(), that we want to launch a complete Mule instance with the configuration found in "MyTestflow.xml".
The instance lives only during the test and is launched and torn down accordingly. This also makes sure that all the context property placeholder resources referenced from that XML file are activated for our test. However, these are only available within the muleContext; values in the properties files are not automatically registered in the Mule registry, so we can't easily get hold of them at runtime.

To handle that, we use Spring configuration annotations. @PropertySource tells the test which property sources to use; note that these are shared, so specify both the ones you are testing, i.e. the same ones MyTestflow.xml references (first), and the ones you are overriding or complementing the test with (after).
You can also tell the test which Spring application context to use: if any settings or beans differ between test and production, set another application context here; otherwise use the same one referenced from MyTestflow.xml.

That's it. We now have the same contextual setup in the test as in the environment we are testing, which allows true integration testing.

In the example above we use a Mule client to call the endpoint, and we get the actual endpoint address to call by doing a runtime lookup in the Mule registry (i.e. no matter what configuration is used, we get the address the Mule instance is configured with).

We set the JSON payload from a file and send it to the endpoint with a username and password from our shared environment properties.
In a more complete case, the user and password would be created before and deleted after the test, so that they only live during the lifetime of the actual integration test.
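The credential injection into the address (the indexOf("://") and substring dance in the test above) can be factored into a small helper. A sketch with assumed names:

```java
public class BasicAuthUrl {

    // Inserts "user:password@" right after the scheme separator,
    // e.g. http://host:8090/app -> http://user:pw@host:8090/app
    public static String withCredentials(String address, String username, String password) {
        int pos = address.indexOf("://");
        if (pos < 0) {
            throw new IllegalArgumentException("No scheme in address: " + address);
        }
        String scheme = address.substring(0, pos + 3);
        String rest = address.substring(pos + 3);
        return scheme + username + ":" + password + "@" + rest;
    }

    public static void main(String[] args) {
        System.out.println(withCredentials("http://localhost:8090/app", "bob", "secret"));
        // prints http://bob:secret@localhost:8090/app
    }
}
```

Note that credentials embedded in a URL this way are fine for a local test, but should not be used against anything but a test system.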

Another beautiful thing about this setup is that it runs just fine within Anypoint (Mule) Studio when you execute your normal unit tests.

So how about Maven?

Yes. It runs just fine in a Maven integration-test phase as well, together with the failsafe plugin, which separates it from the unit tests that I presume you run in your test phase.

Let's take a look at how it could be configured in your pom.xml.

As stated in the Mule docs, integration test classes are named IT*, *IT or *ITCase and are located under src/it/java, so we need to tell Maven about this to make sure the integration test classes are compiled and loaded correctly.


Just add another plugin section to your pom.xml, specifying "generate-test-sources" as the phase and "src/it/java" as the source path.
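The original snippet is missing here; a common way to do exactly this is the build-helper-maven-plugin's add-test-source goal (the plugin choice and version are my suggestion, the post itself only states the phase and path):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.9.1</version>
  <executions>
    <execution>
      <id>add-integration-test-sources</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>add-test-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>src/it/java</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
```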

And finally another plugin specification for the actual failsafe configuration:


Set "integration-test" as the phase and BAM... now you can do:
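The failsafe snippet itself is not included in the post; a minimal configuration could look like this (version is my suggestion; failsafe's integration-test and verify goals bind to the corresponding lifecycle phases by default):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.18.1</version>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```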

mvn test

to execute your normal unit tests and

mvn integration-test

to run the integration-test phase with your now full-blown integration test!

Happy testing until Munit comes to conquer!

Monday 17 March 2014

Quick tip: Use generic dynamic endpoints when developing for web container embedded Mule ESB

If you are developing your application in Mule Studio and have separate Maven build settings for standalone and embedded deployment, as described in an earlier post, here is a small tip that might be useful.

Make sure you use not only dynamic endpoints but also generic endpoints!

Say you would like to expose a REST endpoint with an inbound HTTP/ HTTPS endpoint.

You do not want to use hard-coded values anywhere? Great - dynamic endpoints backed by external properties or MEL expressions to the rescue!

But what if you are supposed to deploy the application to an embedded Mule ESB let's say in a JBoss container?

In that case you do not want Mule to spawn new HTTP/HTTPS servers for each endpoint within the JBoss container; you want to use JBoss's web container when you deploy embedded, but you still want HTTP/HTTPS inbound endpoints when you run standalone.

But how can we do that easily, when the Mulesoft documentation states that the scheme/transport may not be generated dynamically?

Use generic endpoints instead!

It could be something like this:
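The snippet itself did not survive in this copy of the post; a hedged sketch of what a generic dynamic endpoint could look like (the property name matches the one used elsewhere on this blog, the rest is assumed):

```xml
<flow name="genericEndpointFlow">
    <!-- generic endpoint: the scheme (http vs servlet) comes from the
         property value, not from a scheme-specific element -->
    <inbound-endpoint address="${endpointbase_address_in}" exchange-pattern="request-response" doc:name="Generic"/>
    <!-- ... rest of the flow ... -->
</flow>
```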

As stated before, what you want to achieve here is for the endpoint to be an HTTP inbound server in standalone builds, and a servlet endpoint using the web container's web server in embedded builds.

Your endpoint base address can now be set in your Maven-build-sensitive Mule properties files (again, see the post mentioned above).

For a standalone configuration the properties would be:
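The original property lines are missing here; a hypothetical example (the property name `endpointbase_address_in` is reused from the endpoint above, host, port and path are made up):

```properties
endpointbase_address_in=http://localhost:8090/myapp/api
```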

but for embedded builds it would be:
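Again the original line is missing; with the servlet transport the equivalent could look like this (the path is made up):

```properties
endpointbase_address_in=servlet://api
```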

VOILA! Dynamic and generic endpoint in action!

Thursday 23 January 2014

Quick tip: How to log business events to the console inside Mule Studio

As you might have noticed, there is no default way of logging business events to the console log while developing Mule flows in Mule Studio, without deploying them and monitoring through the Mule Management Console.

This issue is described in the following JIRA:

I tried many things and discussed the matter with my colleague Pontus Ullgren who came up with a solution I could use.

The tip of the day is hence how to implement your own listener which can handle the matter.

First you need to setup the flow with your custom business event. Something like this:

      <flow doc:name="Business_event_logging" name="Business_event_logging">  
           <http:inbound-endpoint host="localhost" port="8090" path="test" doc:name="HTTP" exchange-pattern="request-response"/>  
           <message-filter doc:name="Filter favicon">  
                <wildcard-filter pattern="/favicon.ico" caseSensitive="true"/>  
           </message-filter>  
           <tracking:custom-event event-name="MyUserBusinessEvent" doc:name="Custom Business event">  
                <tracking:meta-data key="User_Firstname" value="Jon" />  
                <tracking:meta-data key="User_Lastname" value="Åkerström" />  
                <tracking:meta-data key="User_Action" value="#[message.payload]"/>  
           </tracking:custom-event>  
      </flow>  

Then, in your configuration XML, you also need to configure a notification listener referencing your own listener component.

     <spring:bean class="se.redpill.mulecomponents.BusinessEventHandlerListener" id="notificationListener"/>  
     <notification-listener ref="notificationListener"/>  

Then, to be able to log the events when they happen, simply implement your own listener by implementing the EventNotificationListener interface and taking appropriate action, i.e. logging when onNotification is called.

 package se.redpill.mulecomponents;  

 import org.apache.commons.logging.Log;  
 import org.apache.commons.logging.LogFactory;  
 import org.mule.api.MuleContext;  
 import org.mule.api.context.MuleContextAware;  
 import com.mulesoft.mule.tracking.event.EventNotification;  
 import com.mulesoft.mule.tracking.event.EventNotificationListener;  

 public class BusinessEventHandlerListener implements EventNotificationListener<EventNotification>, MuleContextAware {  

   MuleContext context;  
   protected static final Log LOGGER = LogFactory.getLog("MEEE");  

   public void onNotification(EventNotification notification) {  
        LOGGER.debug("Event notification: " + notification.toString());  
        LOGGER.debug("Event DATA: " + notification.getMetaDatas().toString());  
   }  

   public void setMuleContext(MuleContext context) {  
        this.context = context;  
   }  
 }  

Finally, make sure that you specify log level DEBUG in your log4j.properties file, globally or for the appropriate class, to actually see the business event and its data.

The code is now also available in Pontus Ullgren's Git repo:

Tuesday 21 January 2014

Quick tip: How to get rid of "REST client failed to route to service", "Socket connection closed" or "HTTP headers too large"

A common issue during development of a Mule RESTful web service client is that calls to the service are denied with error messages similar to "Failed to route..." or "java.net.SocketException: Connection reset", although you have specified the correct endpoint address in your <http:outbound-endpoint>.

If the REST service is in fact not called (check with logging or a breakpoint) and a traffic analyser like Wireshark tells you it is an HTTP "Bad request" or similar, although the TCP traffic looks correct, you might have an issue with your HTTP headers.

In fact, this is quite often related to Mule by default automatically transforming Mule headers to HTTP headers in the transport, and vice versa. This is a very nice feature if you, for example, need to call a REST service and still keep the session ID, or if split data is sent to the service item by item and you need to keep correlation IDs to perform aggregations on the result data further down the flow.
However, the MULE_SESSION header is quite large, and different application servers such as JBoss or Tomcat have different settings for the maximum allowed header size, although the HTTP specs have no such limitation. This is often what turns the request into a "Bad request", i.e. too-large headers.

The tip of the day is hence to set up a global connector which configures inbound or outbound endpoints to use a NullSessionHandler, simply by overriding the default session handler like this:

 <http:connector name="ConnectorWithoutMuleSession" doc:name="HTTP/HTTPS_Nosession">  
           <service-overrides sessionHandler="org.mule.session.NullSessionHandler"/>  
 </http:connector>  

In your flow you can now reference this connector, either from an outbound endpoint, which avoids sending the MULE_SESSION header, or from an inbound endpoint, which avoids processing the MULE_SESSION header on incoming messages.

Like this:

 <http:outbound-endpoint method="GET" exchange-pattern="request-response" address="${flow.myserviceurl}" contentType="application/json" doc:name="HTTP_out" connector-ref="ConnectorWithoutMuleSession"/>
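For completeness, the inbound side would reference the same connector in the same way (the property name for the address is made up here):

```xml
<http:inbound-endpoint exchange-pattern="request-response" address="${flow.myserviceinurl}" doc:name="HTTP_in" connector-ref="ConnectorWithoutMuleSession"/>
```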