Wednesday, August 20, 2014

Quick tip: How to change the database dynamically in a Mule flow for a JDBC endpoint

Let's say you have a number of databases that you would like to run the same integration flow against. For example, you might have a distributed company structure with one identically structured database per subsidiary, and you want to run a query against all of them and consolidate the results into some kind of reporting system.
To avoid having a connector for each of the databases (say you have 50 subsidiaries running the same financial system), and to avoid making each query specific to each subsidiary system, here is today's quick tip on one way to dynamically change the database on the fly in your integration flow.

Note: I will not go into how to configure JDBC connectors, best practices around them, performance issues, or whether this solution is good or bad architecture. I am just creating a case to show how you can change a data source on the fly from within a Mule flow.
So here we go:

First, specify a datasource, something like this:

 <spring:bean id="dataSource" class="org.enhydra.jdbc.standard.StandardDataSource" destroy-method="shutdown" name="dataSource">  
       <spring:property name="driverName" value="net.sourceforge.jtds.jdbc.Driver"/>  
       <spring:property name="user" value="${dbusername}"/>  
       <spring:property name="password" value="${dbpassword}"/>  
     </spring:bean>  
   </spring:beans>  

Then set up a connector using that datasource, with all the queries you need, like this:


  <jdbc:connector name="db_conn" dataSource-ref="dataSource" pollingFrequency="5000" doc:name="Database" validateConnections="false">  
     <jdbc:query key="myFantasticQuery1" value="SELECT #[header:INBOUND:company] as company, blah , blah, #[header:INBOUND:dynamicallysetvalue] as blah,FROM ... WHERE  #[header:INBOUND:anotherdynamicallysetvalue]);"/>           <jdbc:query key="myFantasticQuery2" value="SELECT blh blah blah.....and so on  

Now you can use an endpoint to call one of the queries, for example inside an "All" component: http://www.mulesoft.org/documentation/display/current/All+Flow+Control+Reference.

 <jdbc:outbound-endpoint connector-ref="db_conn" doc:name="DB" exchange-pattern="request-response" queryKey="myFantasticQuery1"/>

But this would make the queries for all subsidiaries run against the same database.

To change the database on the fly, based on header values, right before the call to the jdbc:outbound-endpoint, a Java component comes in handy:


 import org.mule.api.MuleEventContext;
 import org.mule.api.MuleMessage;
 import org.mule.api.lifecycle.Callable;

 public class ChangeDatabase implements Callable {
     private String serverip;
     private String serverport;
     private String dbprefix;

     @Override
     public Object onCall(MuleEventContext eventContext) throws Exception {
         MuleMessage message = eventContext.getMessage();
         // Look up the datasource bean defined in the Spring config from the Mule registry
         org.enhydra.jdbc.standard.StandardDataSource ds = (org.enhydra.jdbc.standard.StandardDataSource) eventContext.getMuleContext().getRegistry().lookupObject("dataSource");
         // Point the datasource at the subsidiary database identified by the "company" inbound property
         ds.setUrl("jdbc:jtds:sqlserver://" + serverip + ":" + serverport + ";databaseName=" + dbprefix + message.getInboundProperty("company"));
         return true;
     }

     // Setters needed for the Spring property injection shown below
     public void setServerip(String serverip) { this.serverip = serverip; }
     public void setServerport(String serverport) { this.serverport = serverport; }
     public void setDbprefix(String dbprefix) { this.dbprefix = dbprefix; }
 }

This solution looks up the datasource object defined earlier from the Mule registry and sets the JDBC URL based on a dynamic header, in this case a "company" header set to identify the subsidiary.

To use it, simply declare it in your spring:beans section (you could use a singleton if your ChangeDatabase is thread safe):


 <spring:beans>  
     <spring:bean id="changeDB" class="ChangeDatabase">  
        <spring:property name="serverip" value="${dbserverip}"/>   
        <spring:property name="serverport" value="${dbserverport}"/>   
        <spring:property name="dbprefix" value="${dbprefix}"/>   
     </spring:bean>  
   </spring:beans>  


and add an object reference to it before the jdbc:outbound-endpoint, and it will automatically change the database before the query runs.

 <component>  
     <spring-object bean="changeDB"/>  
 </component>  
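
Putting the pieces together, the relevant part of a flow could look roughly like this (just a sketch: the flow name is made up, while the bean, connector and query key are the ones defined above):

 <flow name="consolidateSubsidiaryFigures">
     <!-- ...something upstream sets the "company" inbound property... -->
     <component>
         <spring-object bean="changeDB"/>
     </component>
     <jdbc:outbound-endpoint connector-ref="db_conn" exchange-pattern="request-response" queryKey="myFantasticQuery1" doc:name="DB"/>
     <!-- ...consolidate the results further down the flow... -->
 </flow>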

That's it!

Happy hacking!

Friday, May 2, 2014

Mule mockless integration testing tip, or "waiting for a mature MUnit"

I really like the MUnit framework that is being developed:
https://github.com/mulesoft/munit/wiki
and I will continue using it once it gets more mature and stable, especially when the Anypoint Studio (formerly Mule Studio) support is more stable.

However, my experience so far as an early adopter has given me issues I can't afford to spend time on during day-to-day production work.

So how can I do Mule integration tests without MUnit and without having to mock every endpoint that has environment-specific bindings?

One way of doing it, which I would like to share, is by using
org.mule.tck.junit4.FunctionalTestCase
as described in the MuleSoft documentation, but with a little twist.

To be able to interact with a "full featured" environment without mocking or modifying the application, I make use of the fact that Mule is based on Spring. I use a combination of the MuleSoft FunctionalTestCase mentioned above and Spring's JUnit framework, along with Maven's failsafe plugin.

For example, let's say I would like an integration test that verifies the results from a flow with an HTTP endpoint that has a <mule-ss:http-security-filter>, where basic authentication is verified against a configured <mule-ss:security-manager>, without using any mocking.

 <flow name="ToTest" doc:name="ToTest">  
     <inbound-endpoint name="testendpoint" address="${endpointbase_address_in}" doc:name="Generic" exchange-pattern="request-response" mimeType="application/json">  
          <!-- first authenticate -->  
                <mule-ss:http-security-filter realm="mule-realm"/>  
                <!-- check that the user is authorized to execute this flow -->
          <custom-security-filter class="se.redpill.pnd.mulecomponents.SpringSecurityRoleFilter">  
            <spring:property name="allowedAuthorities" value="ROLE_SUPPLIER" />  
          </custom-security-filter>  
     </inbound-endpoint>  
           <jersey:resources doc:name="REST">  
                <custom-interceptor class="se.redpill.pnd.util.LoggingInterceptor"/>   
                <component>  
                     <spring-object bean="testDataBean" />  
                </component>  
           </jersey:resources>

           ....
           flow execution ....
           ....


This requires the test setup to send in a valid username and password for the HTTP endpoint to accept the call, and the test must be able to evaluate the outcome of the flow.

To be able to do this, we need to share configuration between the application itself and the test setup, even if the application has environment-based configuration such as Maven filtering and/or property file overrides.

This could be property placeholders holding usernames for in-memory authentication providers, property placeholders for database-based authentication and so on, but also environment-specific Spring application context setup such as beans, Spring Security settings, annotation config, etc.
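
As a concrete (and hedged) sketch of what that sharing can look like: the application configuration might load its environment-specific values through a standard Spring property placeholder, and the test then lists the very same file in its @PropertySource, followed by a test-only override file. The file names below match the test class further down; everything else is an assumption.

 <!-- assumed to live in MyTestflow.xml, or in a Spring context imported by it -->
 <context:property-placeholder location="classpath:myacceptancetest.properties"/>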

Another thing I want to make sure of is that the unit test environment is not disturbed by my integration test setup.


 import static org.junit.Assert.assertEquals;
 import static org.junit.Assert.assertNotNull;

 import java.io.File;
 import java.util.HashMap;
 import java.util.Map;

 import org.apache.commons.io.FileUtils;
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.mule.api.MuleMessage;
 import org.mule.api.endpoint.ImmutableEndpoint;
 import org.mule.module.client.MuleClient;
 import org.mule.tck.junit4.FunctionalTestCase;
 import org.springframework.beans.factory.annotation.Autowired;
 import org.springframework.context.annotation.Configuration;
 import org.springframework.context.annotation.PropertySource;
 import org.springframework.core.env.Environment;
 import org.springframework.test.context.ContextConfiguration;
 import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

 @Configuration
 @PropertySource({"classpath:/myacceptancetest.properties", "classpath:/mytest.properties"})
 @RunWith(SpringJUnit4ClassRunner.class)
 @ContextConfiguration("/test-application-context.xml")
 public class ITTestData extends FunctionalTestCase {

      @Autowired
      Environment env;

      @Override
      protected String getConfigResources() {
           // Mule configuration launched for the duration of the test
           return "MyTestflow.xml";
      }

      @Test
      public void testSend() throws Exception {
        MuleClient client = new MuleClient(muleContext);
        String payload = FileUtils.readFileToString(new File("src/test/resources/testdata.json"));
        Map<String, Object> properties = new HashMap<String, Object>();
        properties.put("Content-Type", "application/json");
        // Look up the endpoint address from the running Mule instance instead of hard-coding it
        ImmutableEndpoint endpoint = muleContext.getRegistry().lookupObject("testendpoint");
        assertNotNull(endpoint);
        String username = env.getProperty("test.user");
        String passwd = env.getProperty("test.password");
        String address = endpoint.getEndpointURI().getAddress();
        int pos = address.indexOf("://");
        String beginaddress = address.substring(0, pos + 3);
        String endaddress = address.substring(pos + 3, address.length());
        // Inject basic auth credentials: scheme://user:password@host:port/path
        address = beginaddress + username + ":" + passwd + "@" + endaddress + "/signe/pnd/register/enkat";
        MuleMessage result = client.send(address, payload, properties);
        assertEquals("{\"Status\" : \"OK\"}", result.getPayloadAsString());
      }
 }

Let's have a look at the example above.

We extend FunctionalTestCase and specify that we want to launch a complete Mule instance with the configuration found in "MyTestflow.xml" by overriding getConfigResources().
The instance lives only for the duration of the test and is launched and torn down accordingly. This also makes sure that all the property placeholder resources referenced from that XML file are activated for our test. However, these are only available within the muleContext, and values in the properties files are not automatically registered in the Mule registry, so we can't easily get hold of them at runtime.

To handle that we use four Spring configuration annotations. @PropertySource tells the test which property sources to use (note that these are shared, so specify both the ones the application under test uses, i.e. the same ones MyTestflow.xml references, first, and the ones you are overriding or complementing the test with after).
@ContextConfiguration tells the test which Spring application context to use. If you have any settings or beans that differ between the test and production, point to another application context here; otherwise use the same one that MyTestflow.xml references.
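
If you do need a separate context for the test, a hypothetical test-application-context.xml could simply import the production bean definitions and override the few things that differ. All file and class names below are made up for illustration:

 <beans xmlns="http://www.springframework.org/schema/beans"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
     <!-- reuse the production bean definitions (assumed file name) -->
     <import resource="classpath:application-context.xml"/>
     <!-- test-only overrides, e.g. a stand-in implementation behind the testDataBean used by the flow -->
     <bean id="testDataBean" class="se.example.test.TestDataStub"/>
 </beans>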

That's it. We now have the same contextual setup in the test as in the environment we are testing, which allows true integration testing.

In the example above we use a Mule client to call the endpoint, and we get the actual endpoint address to call by doing a runtime lookup in the Mule registry (i.e. no matter what configuration is used, we get the address that the Mule instance is configured with).

We set the JSON payload from a file and send it to the endpoint with a username and password from our shared environment properties.
In a more complete case, the user and password would be created before and deleted after the test, so that they only exist during the lifetime of the actual integration test.

Another beautiful thing about this setup is that it runs just fine within Anypoint (Mule) Studio when executing your normal unit tests.

So how about maven?

Yes. It runs just fine in the Maven integration-test phase as well, together with the failsafe plugin, which separates it from the unit tests that I presume you run in your test phase.

Let's take a look at how it could be configured in your pom.xml.

As stated in the Mule docs, integration test classes are named IT*, *IT or *ITCase and are located under src/it/java, so we need to tell Maven about this to make sure the integration test classes are compiled and loaded correctly.


  <plugin>  
     <groupId>org.codehaus.mojo</groupId>  
     <artifactId>build-helper-maven-plugin</artifactId>  
     <executions>  
      <execution>  
       <id>add-test-source</id>  
       <phase>generate-test-sources</phase>  
       <goals>  
        <goal>add-test-source</goal>  
       </goals>  
       <configuration>  
        <sources>  
         <source>src/it/java</source>  
        </sources>  
       </configuration>  
      </execution>  
     </executions>  
    </plugin>  

Just add another plugin section to your pom.xml specifying "generate-test-sources" as phase and "src/it/java" as source path.

And finally another plugin specification for the actual failsafe configuration:


 <plugin>  
     <groupId>org.codehaus.mojo</groupId>  
     <artifactId>failsafe-maven-plugin</artifactId>  
     <executions>  
      <execution>  
       <id>integration-test</id>  
       <phase>integration-test</phase>  
       <goals>  
        <goal>integration-test</goal>  
       </goals>  
      </execution>  
      <execution>  
       <id>verify</id>  
       <goals>  
        <goal>verify</goal>  
       </goals>  
      </execution>  
     </executions>  
    </plugin>  

Set "integration-test" as phase and BAM....now you can do:

mvn test

to execute your normal unit tests and

mvn integration-test

to run the integration test phase with your now full blown integration test!

Happy testing until Munit comes to conquer!

Monday, March 17, 2014

Quick tip: Use generic dynamic endpoints when developing for web container embedded Mule ESB

If you are developing your application in Mule Studio and have separate Maven build settings for standalone and embedded as described in:
http://mulehacks.blogspot.se/2013/08/debugging-maven-based-embedded-mule-esb.html
here is a small tip that might be useful.

Make sure you do not only use dynamic endpoints but also generic endpoints!
Why?

Say you would like to expose a REST endpoint with an inbound HTTP/ HTTPS endpoint.



You do not want to use hard-coded values anywhere? Great - dynamic endpoints backed by external properties or MEL expressions to the rescue!

But what if you are supposed to deploy the application to an embedded Mule ESB, let's say in a JBoss container?

In that case you do not want Mule to spawn new HTTP/HTTPS servers for each endpoint within the JBoss container; you want to use the web container within JBoss when you deploy embedded, but you still want to use HTTP/HTTPS inbound endpoints when you run standalone.

But how can we do that easily when the MuleSoft documentation states that the scheme/transport may not be generated dynamically?

Use generic endpoints instead!

It could be something like this:
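
(The names below are made up, but the idea is exactly this: declare the endpoint with the generic <inbound-endpoint> element and build the address from a property, so even the scheme comes from configuration.)

 <flow name="myRestFlow">
     <inbound-endpoint address="${endpointbase_address}/myservice" exchange-pattern="request-response" doc:name="Generic"/>
     <jersey:resources doc:name="REST">
         <component class="se.example.MyRestResource"/>
     </jersey:resources>
 </flow>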
for a connector:
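
(A hedged guess, since with generic endpoints you normally do not need a connector-ref at all: Mule picks a connector matching whatever scheme the address resolves to. If you want to declare and tune the connectors explicitly, it could look like this, assuming the servlet transport is available in the embedded build.)

 <!-- picked up when ${endpointbase_address} resolves to http://... (standalone build) -->
 <http:connector name="httpConnector"/>

 <!-- picked up when ${endpointbase_address} resolves to servlet://... (embedded build) -->
 <servlet:connector name="servletConnector"/>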

As stated before, what you want to achieve here is for the endpoint to be an HTTP inbound server in standalone builds and a servlet endpoint using the web container's web server in embedded builds.

Your endpoint base address can now be set in your Maven-build-specific Mule properties files (again, see
http://mulehacks.blogspot.se/2013/08/debugging-maven-based-embedded-mule-esb.html
).

For standalone configuration the properties would be:
endpointbase_address=http://localhost:8081 

but for embedded builds it would be
endpointbase_address=servlet://.

VOILA! Dynamic and generic endpoint in action!

Thursday, January 23, 2014

Quick tip: How to log business events to the console inside Mule Studio

As you might have noticed, there is no default way of logging business events to the console log while developing Mule flows in Mule Studio, without having to deploy the flows and monitor them through the Mule Management Console.

This issue is described in the following JIRA:

I tried many things and discussed the matter with my colleague Pontus Ullgren who came up with a solution I could use.

The tip of the day is hence how to implement your own listener that can handle the matter.

First you need to set up the flow with your custom business event. Something like this:

      <flow doc:name="Business_event_logging" name="Business_event_logging">  
           <http:inbound-endpoint host="localhost" port="8090" path="test" doc:name="HTTP" exchange-pattern="request-response"/>  
           <message-filter doc:name="Filter favicon">  
                <not-filter>  
                     <wildcard-filter pattern="/favicon.ico" caseSensitive="true"/>  
                </not-filter>  
           </message-filter>  
           <tracking:custom-event event-name="MyUserBusinessEvent" doc:name="Custom Business event">  
                <tracking:meta-data key="User_Firstname" value="Jon" />  
                <tracking:meta-data key="User_Lastname" value="Åkerström" />  
                <tracking:meta-data key="User_Action" value="#[message.payload]"/>    
           </tracking:custom-event>  
      </flow>       

Then, in your configuration XML, you also need to configure a notification listener referencing your own listener component.

   <spring:beans>
     <spring:bean id="notificationListener" class="se.redpill.mulecomponents.BusinessEventHandlerListener"/>
   </spring:beans>

   <notifications>
     <notification-listener ref="notificationListener"/>
   </notifications>

Then, to be able to log the events when they happen, simply implement your own listener by implementing the EventNotificationListener interface and take the appropriate action, i.e. log when onNotification is called.

 package se.redpill.mulecomponents;  
 import org.apache.commons.logging.Log;  
 import org.apache.commons.logging.LogFactory;  
 import org.mule.api.MuleContext;  
 import org.mule.api.context.MuleContextAware;  
 import com.mulesoft.mule.tracking.event.EventNotification;  
 import com.mulesoft.mule.tracking.event.EventNotificationListener;
  
 public class BusinessEventHandlerListener implements EventNotificationListener<EventNotification>, MuleContextAware {  
   MuleContext context;  
   protected static final Log LOGGER = LogFactory.getLog("MEEE");  
      @Override  
      public void onNotification(EventNotification notification) {  
           LOGGER.debug("Event notification: " + notification.toString());  
           LOGGER.debug("Event DATA:" + notification.getMetaDatas().toString());  
      }  
      @Override  
      public void setMuleContext(MuleContext context) {  
           this.context = context;  
      }  
 }  


Finally, make sure that you specify log level DEBUG in your log4j.properties file, either globally or for the appropriate class, to actually see the business event and its data.

The code is now also available on Pontus Ullgren's Git repo:
https://github.com/ullgren/my-mule-examples/tree/master/trackbuissnessevents

Tuesday, January 21, 2014

Quick tip: How to get rid of "REST client failed to route to service", "Socket connection closed" or "HTTP headers too large"

A common issue during development of a Mule RESTful web service client is that calls to the service are denied with error messages similar to "Failed to route..." or "java.net.SocketException: Connection reset", although you have specified the correct endpoint address in your <http:outbound-endpoint>.

If the REST service is in fact not called (check with logging or a breakpoint) and a traffic analyser like Wireshark tells you it is an HTTP "Bad request" or similar, although the TCP traffic looks correct, you might have an issue with your HTTP headers.

In fact this is quite often related to the fact that Mule by default automatically transforms Mule headers to HTTP headers in the transport and vice versa. This is a very nice feature if you, for example, need to call a REST service and still keep the session id, or if split data is sent to the service item by item and you need to keep correlation ids to perform aggregations on the result data further down the flow.
However, the MULE_SESSION header is quite large, and different application servers such as JBoss or Tomcat have different limits on the maximum allowed header size, although the HTTP specs have no such limitation. This is often what turns the request into a "Bad request", i.e. too large headers.

The tip of the day is hence to set up a global connector that configures inbound or outbound endpoints to use a NullSessionHandler, simply by overriding the default sessionHandler like this:

 <http:connector name="ConnectorWithoutMuleSession" doc:name="HTTP/HTTPS_Nosession">  
           <service-overrides sessionHandler="org.mule.session.NullSessionHandler"/>  
 </http:connector>  

In your flow you can now reference this connector either from an outbound endpoint, which avoids sending the MULE_SESSION header, or from an inbound endpoint, which avoids processing the MULE_SESSION header on incoming messages.

Like this:

 <http:outbound-endpoint method="GET" exchange-pattern="request-response" address="${flow.myserviceurl}" contentType="application/json" doc:name="HTTP_out" connector-ref="ConnectorWithoutMuleSession"/>

Monday, August 26, 2013

Debugging Maven based, embedded Mule ESB Community Edition projects in Mule Studio

If you are running Mule ESB Community Edition embedded in an app server and deploying your Mule projects as .war archives to that server, you might have come across issues using Maven as the build tool for your projects in Mule Studio to enable standalone debugging of your Mule apps.

When using this setup you are not following MuleSoft's recommended format for Maven projects and will probably struggle trying to import them into Mule Studio.
Furthermore, having a pom.xml that builds .war files for deployment is not compatible with the standalone nature of Mule Studio's setup.

You could of course do remote debugging on your app server platform, but I want to be able to work completely in Mule Studio, first making sure everything works standalone before deploying anything to the embedded Mule CE runtime. Another option is to simply debug as a Mule application without Maven, but then you run into issues having to add user libraries manually to Mule Studio each time to avoid runtime classloading problems.

My solution to this challenge was to create different profiles in the pom.xml and invoke them based on whether you are developing/debugging in Mule Studio or about to deploy, plus a small trick to make that setup work with Mule Studio.

First, import your Mule project into Mule Studio (you may need to do it with Maven support disabled to be able to import it if you are not following their project format). Then, from within the package explorer, enable Maven support.
Now edit your pom.xml and add the build plugin if it's not there already.


 <build>   
  <plugins>   
   <plugin>   
    <groupId>org.mule.tools</groupId>   
    <artifactId>maven-mule-plugin</artifactId>   
    <version>1.4</version>   
    <extensions>true</extensions>  
   </plugin>   
  </plugins>   
 </build>  

Now to the solution: edit your pom.xml and add two profiles, one "war" profile and one "standalone" profile.
Make the war version active by default and set its packaging type to "war".
In the "standalone" profile, set the packaging type to "mule" and have the activation triggered when a property called "environment" is set to "mule", like this:
  <profiles>  
     <profile>  
       <id>war</id>  
       <activation>  
         <activeByDefault>true</activeByDefault>  
       </activation>  
       <properties>  
         <packaging.type>war</packaging.type>  
       </properties>  
     </profile>  
     <profile>  
       <id>standalone</id>  
       <activation>  
             <property>  
                    <name>environment</name>  
                    <value>mule</value>  
                   </property>  
             </activation>  
       <properties>  
         <packaging.type>mule</packaging.type>  
       </properties>  
     </profile>  
  </profiles>  
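
For the packaging.type property to actually control the artifact type, the pom's packaging element needs to reference it. This is not shown above, so treat it as an assumption; property-driven packaging is a common, if not officially documented, Maven trick:

 <!-- at the top level of the pom.xml, next to groupId/artifactId/version -->
 <packaging>${packaging.type}</packaging>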

Now you should be able to build your project as a .war file for deployment on the embedded server just by running your normal Maven commands like "mvn clean package".

To be able to do "Debug as Mule application with Maven" in the package explorer menu of Mule Studio you will however need to trigger the other profile.
To do this go into
Windows -> Properties -> Mule Studio -> Maven Settings
and set MAVEN_OPTS environment variable to "-Denvironment=mule".

Now when you do "Debug As Mule Application with Maven" on all your Mule Studio projects it will try to find profiles that are triggered by the "environment" environment variable with value set to "mule" and run it accordingly making sure packaging is .zip rather than .war and that all dependencies and settings for that profile follows your pom structures.

Note that even though the projects will build just fine, there is a catch: if you reference libraries that are not included in the Mule CE runtime in a pom structure whose parent poms reside outside of the Mule project, the Mule Studio runtime will not find these libraries when they are marked as "provided" (read: provided by the parent pom) in the pom.xml. One solution is to define the dependencies inside the corresponding profile, with scope "provided" in the "war" profile and without it in the "standalone" profile. Example:

  <profile>  
       <id>war</id>  
       <activation>  
         <activeByDefault>true</activeByDefault>  
       </activation>  
       <dependencies>  
           <dependency>  
                <groupId>org.json</groupId>  
                <artifactId>json</artifactId>  
                <version>20090211</version>  
                <scope>provided</scope>  
           </dependency>  
       </dependencies>    
       <properties>  
         <packaging.type>war</packaging.type>  
       </properties>  
     </profile>  
     <profile>  
       <id>standalone</id>  
       <activation>  
           <property>  
               <name>environment</name>  
               <value>mule</value>  
           </property>  
       </activation>  
       <dependencies>
            <dependency>
              <groupId>org.json</groupId>
              <artifactId>json</artifactId>
              <version>20090211</version>
            </dependency>
       </dependencies>
       <properties>  
         <packaging.type>mule</packaging.type>  
       </properties>  
     </profile>  


Embedded Mule project Mavenized!

Friday, July 5, 2013

RESTful MongoDB and Mule ESB Community Edition pattern

There are some alternative ways to go when RESTifying your flows in Mule. However, they are poorly described in manuals and forums, and the components and tools available are not very active projects in the community. After struggling with Mule's REST router and other alternatives, I settled for the built-in Jersey way of doing it. However, all the examples in the Mule documentation and forums only describe how to attach REST components to your HTTP endpoint and return directly. They do not describe how to integrate the RESTful endpoint with your Mule ESB integration flows and thereby take advantage of Mule ESB's other powerful integration abilities.

Here is how I do it:

I use an HTTP endpoint (or HTTPS if secure) and couple it with a REST element like this:
 <jersey:resources doc:name="REST">  
      <component class="se.redpill.mulecomponents.mongo.RedpillRESTMongo"/>  
 </jersey:resources>  


The actual REST component is just plain Jersey Java, like this:
 @GET  
      @Path("mypath/{singleelement}")  
      @Produces({MediaType.APPLICATION_JSON})  
      public Map<String, Object> getQueryByName(@PathParam("singleelement") String name)
      {   
           Map<String, Object> query = new HashMap<String, Object>();  
           query.put("MyCoolDocument.Type", "Cool");  
           query.put("MyCoolDocument.Unit", name);  
           query.put("Year", new Integer(2012));  
           return query;  
      }  


All fine. Now we use a Choice element to check the inbound 'http.method' property to see whether the request is a GET, POST, PUT or DELETE, and route each case to its corresponding component. In my case I map to my own MongoDB components, which follow the same format as in my last blog post.
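
A trimmed-down sketch of such a choice block could look like this (MEL syntax for Mule 3.3+; QueryMongoComponent is the class shown further down, the other component name is made up):

 <choice doc:name="Route on http.method">
     <when expression="#[message.inboundProperties['http.method'] == 'GET']">
         <component class="se.redpill.mulecomponents.mongo.QueryMongoComponent"/>
     </when>
     <when expression="#[message.inboundProperties['http.method'] == 'POST']">
         <component class="se.redpill.mulecomponents.mongo.InsertMongoComponent"/>
     </when>
     <otherwise>
         <logger level="WARN" message="Unsupported http.method: #[message.inboundProperties['http.method']]" doc:name="Logger"/>
     </otherwise>
 </choice>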

Now to the real HACK to get this working:

If you have tried something like this and failed, you have probably seen that the object passed from the REST element to your component is actually
org.mule.module.jersey.MuleResponseWriter$1
which is a pain to handle. But the payload can be cast to an
org.mule.api.transport.OutputHandler, since its implementation comes from
org.mule.module.jersey.MuleResponseWriter.

To avoid having to serialize/deserialize your own return value (in my case a Map) from your REST service to your next element (in my case my MongoDB query component) with the casting techniques described above, you can instead do it in a much safer manner using the following statements:

           ContainerResponse cr = (ContainerResponse) eventContext.getMessage().getInvocationProperty("jersey_response");  
           Map<String, Object> map = (Map<String, Object>)cr.getResponse().getEntity();  

VOILA! We now have our Map back that we sent from our REST component.
My MongoDB query component, and all the others for PUT, POST and DELETE, are implemented in this fashion, like this:

 package se.redpill.mulecomponents.mongo;

 import java.util.Map;

 import org.mule.api.MuleEventContext;
 import org.mule.api.lifecycle.Callable;

 import com.mongodb.BasicDBObject;
 import com.mongodb.DBCursor;
 import com.mongodb.DBObject;
 import com.sun.jersey.spi.container.ContainerResponse;

 public class QueryMongoComponent extends AbstractMongoComponent implements Callable
 {
      @Override
      public Object onCall(MuleEventContext eventContext) throws Exception {
           // Fetch the Map returned by the Jersey resource from the "jersey_response" invocation property
           ContainerResponse cr = (ContainerResponse) eventContext.getMessage().getInvocationProperty("jersey_response");
           Map<String, Object> map = (Map<String, Object>) cr.getResponse().getEntity();
           DBObject queryFields = new BasicDBObject(map);
           // "db" is the MongoDB handle inherited from AbstractMongoComponent (see the previous blog post)
           String json = "[";
           if (queryFields.keySet().size() > 0)
           {
             DBCursor cursor = db.getCollection("mycollection").find(queryFields);
                try {
                  while (cursor.hasNext()) {
                    json += cursor.next() + ",";
                  }
                } finally {
                  cursor.close();
                }
                // strip the trailing comma if any documents were appended
                if (json.endsWith(",")) {
                  json = json.substring(0, json.length() - 1);
                }
           }
           json += "]";
           return json;
      }
 }

And that's it!

REST enabled MongoDB with the power of Mule ESB CE integration flows at hand!