There are many technologies for implementing and understanding business data processing applications.

Some are easy to use.

Some are efficient.

Some are robust.

Some are “open.”

Some allow users to be highly productive.

Some are scalable.

Some can handle very complex logic.

Some are well suited to batch processing; some do real-time.

Some… well, you get the idea.

There are many technologies out there, each with different strengths. But only Ab Initio® has all of these strengths at the same time.

Ab Initio has a single architecture for processing files, database tables, message queues, web services, and metadata. This same architecture enables virtually any technical or business rule to be graphically defined, shared, and executed. It processes data in parallel across multiple processors, even processors on different servers. It can run the same rules in batch, in real time, and within a service-oriented architecture (SOA). It supports distributed checkpoint restart with application monitoring and alerting. And this same architecture enables end-to-end metadata to be collected, versioned, and analyzed by non-technical users.
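To make the checkpoint-restart idea concrete, here is a minimal generic sketch in Python, not Ab Initio code: processing commits its position after each batch, so a restarted run resumes from the last committed record instead of reprocessing everything. The file name, record format, and batch size are hypothetical.

```python
import json
import os

CHECKPOINT = "progress.json"  # hypothetical checkpoint file


def load_checkpoint():
    # Resume from the last committed position, if any.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_record"]
    return 0


def save_checkpoint(next_record):
    # Write to a temp file and rename, so a crash mid-write
    # never leaves a torn checkpoint behind.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"next_record": next_record}, f)
    os.replace(tmp, CHECKPOINT)


def process(records, batch_size=2):
    start = load_checkpoint()
    done = []
    for i in range(start, len(records), batch_size):
        batch = records[i:i + batch_size]
        done.extend(r.upper() for r in batch)  # stand-in for real work
        save_checkpoint(i + len(batch))        # a restart resumes here
    return done
```

A production platform distributes this bookkeeping across many processes and servers and coordinates the commits, but the core contract is the same: progress is durable at batch boundaries, so failure plus restart yields the same result as an uninterrupted run.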

This single architecture is what makes Ab Initio a general-purpose data processing platform. Users don’t need to stitch together a collection of technologies to get the job done. Everything from Ab Initio is designed from the beginning to form a unified processing platform – fully integrated by definition, as opposed to by marketing.

The Ab Initio architecture manifests itself through a wide range of technologies and capabilities, all of which are built on the same architectural foundation. These capabilities fall into the following general categories:

Application specification, design, and implementation

Business rules specification and implementation

A single engine for all aspects of application execution

Application orchestration

Operational management (monitoring, scheduling, and so on)

Software development life cycle management

Metadata capture, analysis, and display

Data management, including very large data storage (hundreds of terabytes to petabytes), data discovery, analysis, quality, and masking