B2B Integration Software Evaluation Guideline

1. Business-user friendly, drag-and-drop interface

Does the solution provide an easy-to-use framework for non-technical users to design and configure B2B solutions? How easy is it to onboard partners? Does it have a pre-built Partner Data Exchange portal for business users to invite and onboard partners into a trading partner network? Traditional B2B integration software requires developers such as Java or .NET programmers to build partner data exchanges. This lengthens development time and requires a dedicated team of IT developers to maintain the solution going forward. We believe that business users or citizen integrators can perform the same role as IT in onboarding new trading partners and configuring B2B transactions.

Does the solution enable SOA in terms of reusable services? Does it allow for collaboration within your organization, where users can build B2B transactions as a team? Does it have a metadata-driven architecture that supports faster development cycles, requires no code generation, and enables rapid deployment of changes into production?

What is the learning curve for the product? Does it have a graphical, browser-based collaborative design studio that allows multiple users, via single sign-on, to build and share work?

Does it integrate with SVN (Subversion)? How are artifacts managed in terms of version control? Is this built into the product, or can it connect to an external SVN repository? How easy is it for users to check in and check out different versions of the same artifact, and is there a centralized management page that shows the revision history?

Does it support self-documentation of maps and process flows?

We have written a lot in our blogs on the limitations of engineering-centric development platforms and on the advantages of a graphical, self-service B2B partner data exchange platform. So let's focus on some of the other factors that should be considered when evaluating a B2B integration solution.

2. Publish & Subscribe model for data exchange

A B2B solution should provide a configuration wizard for power users to publish data flow templates. Templates are common flows for processing any data being exchanged with partners. For example, collecting new enrollments of employee contributions into a pension plan can be published as a template.

Business users can invite partners into the trading partner network and grant them access to these templates in order to exchange data. Partners can log in to the Partner Data Exchange, subscribe to a template, and use the transaction to send data files.

A B2B solution should provide features such as secure SFTP or file triggers to pull all pending files at once and process them in parallel. Many B2B integration tools don't have these triggers built in and require third-party components to add this capability. Even when they do provide the feature, they often cannot process the files concurrently; they process them sequentially, which impairs near-real-time data synchronization with distributed systems.
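The difference between sequential and concurrent file handling can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; `process_file` and `on_trigger` are hypothetical names standing in for whatever a real file trigger would invoke.

```python
from concurrent.futures import ThreadPoolExecutor

def process_file(name):
    # Stand-in for the real parse/map/route logic applied to one file.
    return "processed:" + name

def on_trigger(file_names, max_workers=4):
    """Handle all picked-up files in parallel rather than one after another."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order while running workers concurrently.
        return list(pool.map(process_file, file_names))
```

With a sequential engine, total latency grows linearly with the number of files waiting in the pickup folder; a concurrent trigger keeps it close to the cost of the slowest single file.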

Looping through large data sets when processing bulk files with millions of records is another important feature. Target applications often limit the amount of data they can accept at any given time, so the B2B orchestration needs a looping mechanism to send the data in chunks. Related capabilities are equally important: support for resubmitting a specific data set, handling errors when data is rejected by the target application, data validation during mapping, and notifications in case of errors are all key features of a robust B2B/EDI solution.
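The chunked-submission loop with resubmission support might look like the following sketch. It assumes a hypothetical `send` callable representing the target application's batch endpoint; rejected chunks are collected rather than lost, so they can be resubmitted later.

```python
def send_in_chunks(records, target_limit, send):
    """Submit records to a target that caps batch size.

    Chunks the target rejects are returned for later resubmission
    instead of aborting the whole run.
    """
    failed = []
    for start in range(0, len(records), target_limit):
        chunk = records[start:start + target_limit]
        try:
            send(chunk)
        except Exception:
            failed.append(chunk)   # keep the rejected chunk for resubmit
    return failed
```

A real orchestration would also log each rejection and fire a notification, but the core pattern is the same: honor the target's limit, isolate failures per chunk, and make resubmission cheap.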

Support for recoverable orchestrations (or pipelines). Here are some points to use when reviewing this important feature. Does the B2B integration solution provide a recoverable option, so that if the server is shut down in the middle of a B2B flow execution it can restart from the last successfully executed activity? A major problem with many B2B tools is that their architecture does not support recovery and persistence of a process flow's "state" at run-time. A state-based orchestration engine that uses checkpoints to track the run-time execution of a process flow helps recover flows that are interrupted by system failures. There are other, more advanced features in this category, such as JTA rollbacks, that should also be taken into account.
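Checkpoint-based recovery reduces to a simple idea: persist the index of the last completed activity after each step, and skip already-completed steps on restart. The sketch below is an assumption-laden toy (a JSON file stands in for a real persistent state store), not a production engine.

```python
import json
import os

def run_flow(steps, state_file="flow_state.json"):
    """Run activities in order, checkpointing after each one.

    If a previous run was interrupted, restart resumes from the last
    successfully completed activity instead of step 1.
    """
    done = 0
    if os.path.exists(state_file):
        with open(state_file) as f:
            done = json.load(f)["done"]
    for i, step in enumerate(steps):
        if i < done:
            continue                      # completed before the crash
        step()
        with open(state_file, "w") as f:  # checkpoint: step i finished
            json.dump({"done": i + 1}, f)
    os.remove(state_file)                 # flow finished; clear checkpoint
</```

A stateless engine has no equivalent of `state_file`, which is why an interrupted flow must be rerun from the beginning, re-executing steps that may not be idempotent.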

Support for splitting large data into multiple chunks. Does the mapper support processing multiple chunks of data in parallel through concurrent threads? This is an important feature for processing large or bulk data files.

Support for scalability in handling large data volumes and high-frequency transactions through an active/active clustered deployment. This provides high availability as your data volumes grow over time.

An API development framework for publishing and consuming SOAP and REST web services. Factors to consider include publishing orchestrations as web services, and the ability to handle sessions and persist contextual data within the run-time of the process flow. How easy is it to connect to OAuth-based APIs? How does the user handle errors and connectivity issues, how are they reported, and what auto-recovery mechanisms exist to reinstate the session?
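The "reinstate the session" question above can be made concrete with a generic retry-with-reauthentication wrapper. Everything here is hypothetical scaffolding: `authenticate` stands in for an OAuth token request, `call` for the actual API invocation, and `PermissionError` for whatever expired-session error a real client library would raise.

```python
def call_with_reauth(call, authenticate, retries=2):
    """Invoke an API call; on a session failure, re-authenticate and retry.

    This is the auto-recovery pattern: interrupted sessions are reinstated
    transparently instead of surfacing as hard failures.
    """
    token = authenticate()
    for attempt in range(retries + 1):
        try:
            return call(token)
        except PermissionError:        # stand-in for an expired-session error
            if attempt == retries:
                raise                  # recovery exhausted; report upstream
            token = authenticate()     # refresh the session/token
```

In an evaluation, ask whether the platform provides this behavior declaratively (a retry/reauth policy on the connector) or forces you to hand-code it in every flow.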

Can the solution connect to major applications such as ERP, CRM, or databases? Does it rely on open-source drivers or proprietary ones, and if proprietary, how often are revisions made to keep up with changes in the target systems?

Does it provide the extensibility to add custom plugins into the SOA framework? This is a very important factor that allows developers to leverage existing programs and executables within the B2B tool, such as Java classes, JARs, SQL queries, and stored procedures.

Can we spawn or call sub-processes from within a main process flow? A well-thought-out solution design uses sub-processes so that if a process flow rule needs to change, you can make that change in one sub-process rather than in every flow that uses it.

Does the workflow handle dynamic binding? Consider a content-based routing scenario in which multiple types of data need to be processed and the behavior of the flow depends on the type of data received. It is better to have one "template" process that handles all data types than to create a separate flow for each data type.
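The template-process idea is essentially a dispatch table: one entry point, with the concrete handler bound at run-time from the message content. A minimal sketch, with hypothetical message and handler shapes:

```python
def route(message, handlers, default=None):
    """One 'template' flow: dispatch on message type at run-time,
    instead of maintaining a separate flow per data type."""
    handler = handlers.get(message["type"], default)
    if handler is None:
        raise ValueError("no handler registered for type %r" % message["type"])
    return handler(message)
```

Adding a new data type then means registering one new handler, not cloning and maintaining another end-to-end flow.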

How easy is it to monitor and track run-time workflows? Does the solution have monitoring dashboards that provide details on the status of each transaction and its exceptions, and how are those run-time process and data exceptions corrected?

3. Data Mapping for Structured and Unstructured Data

Support for graphical, browser-based mapping that allows business users to define simple and complex data transformation rules easily through a graphical notation.

Ability to auto-map source and target fields.
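Auto-mapping is typically a name-matching heuristic: normalize field names on both sides and propose pairs, leaving the rest for the user. The function below is an illustrative sketch of that idea, not any product's algorithm.

```python
def auto_map(source_fields, target_fields):
    """Propose source-to-target field pairs by normalized-name match.

    Returns (mapping, unmatched_targets); unmatched fields are left
    for the user to map manually in the graphical mapper.
    """
    def norm(name):
        return name.lower().replace("_", "").replace(" ", "")

    sources_by_norm = {norm(f): f for f in source_fields}
    mapping, unmatched = {}, []
    for target in target_fields:
        source = sources_by_norm.get(norm(target))
        if source:
            mapping[target] = source
        else:
            unmatched.append(target)
    return mapping, unmatched
```

Stronger products layer fuzzy matching or learned suggestions on top, but even simple normalization eliminates most of the drag-and-drop work on wide schemas.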

Ability to validate the data output by attaching sample data and seeing any errors resulting from the mapping rules. The advantage of a data preview is that it lets users identify errors that may occur at run-time and correct them at design time.

Ability to map to any database, such as Oracle, SQL Server, MySQL, DB2, and NoSQL databases.

Support for standard XSLT for interoperability with other XSLT-based mapping tools.

Ability to sort, join, and split data based on business rules.
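Two of these operations, rule-based splitting and key-based joining, can be expressed as small record-list utilities. This is a conceptual sketch over Python dicts standing in for parsed records:

```python
def split_by_rule(records, rule):
    """Partition records into (matching, rest) by a business rule predicate."""
    matching = [r for r in records if rule(r)]
    rest = [r for r in records if not rule(r)]
    return matching, rest

def join_on_key(left, right, key):
    """Inner join of two record lists on a shared key field."""
    index = {r[key]: r for r in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]
```

In a graphical mapper the user would configure the predicate and key fields visually; what matters in an evaluation is that these operations exist as first-class mapping constructs rather than requiring custom code.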

Ability to call custom Java programs and web services from within a mapping.

Ability to add a modified mapping into an SVN version control repository for tracking revision history.

4. ESB capabilities

Can the solution connect to message queues, cloud-based APIs, and on-premise applications?

Does it provide a governance framework for managing development and maintenance activities?

How easy is it to integrate with applications? What type of user is needed to work with the tool?

Does it support data mediation and routing of files?

5. Installation & Deployment

Some of the things to be considered from the installation and deployment perspective are:

How easy is it to install? Can the solution work on Linux, Windows, and Solaris?

Are there online training materials available to get started with the product?

What are the support services available for your business needs?

Can this be deployed on-premise or cloud or both?

Does it support clustering and high-availability?

What is the pricing structure and does it allow for subscription or perpetual options?

How many users can use it? Does it charge per connection? How many B2B flows can I configure in the tool? Are there any gotchas that would negatively impact me later?

Finally, once all of the above factors have been considered, think about the product road-map and what new innovations the vendor has planned for upcoming versions of the B2B integration solution. This gives you confidence that the technology is not aging and that the product team is working on innovations that will help you as a client down the road.

I hope the above guidelines provide a good framework for evaluating a B2B integration solution, and we at Adeptia are ready to help you in this process.