
White Paper

May 2005

A New Perspective: Don't Integrate, Interoperate


Reducing Cost and Complexity with a Process Component Model

Written by:

Michael Thompson

Published May 2005 Butler Direct Limited


All rights reserved. This publication, or any part of it, may not be reproduced or adapted, by any method whatsoever, without the prior written consent of Butler Direct Limited.

Important Notice
This report contains data and information that are up-to-date and correct to the best of our knowledge at the time of preparation. The data and information come from a variety of sources outside our direct control, and therefore Butler Direct Limited cannot give any guarantees relating to the content of this report. Ultimate responsibility for all interpretations of, and use of, data, information, and commentary in this report remains with you. Butler Direct Limited will not be liable for any interpretations or decisions made by you.

About Butler Group


Butler Group is the premier European provider of Information Technology research, analysis, and advice. Founded in 1990 by Martin Butler, the Company is respected throughout the business world for the impartiality and incisiveness of its research and opinion. Butler Group provides a comprehensive portfolio of Research, Events, and Subscription Services, catering for the specialised needs of all levels of executive, from IT professionals to senior managers and board directors.


EXECUTIVE SUMMARY
Integration is all too often seen as the only answer to the problems facing organisations. These problems typically involve the historic implementation of systems and applications, which were installed to solve specific issues. However, if organisations solve the problems of the now without regard to the future, then they are simply wasting resource in a bid to maintain a certain position. The truly adaptive enterprise recognises the fact that the real way forward is to create an infrastructure of both technology and people that allows interoperability between all the elements of that infrastructure. By doing this, organisations can start to address the problems of the now and the problems of the future.

Integration is a technology issue; interoperability is the concern of processes. If we abstract what is currently seen as integration to a process focus, we can start to create that latter, more adaptive, infrastructure. IT has been bedevilled in the past by attempts to implement conceptual architectures, and many of these have caused the integration problems that exist today. The interoperability model utilises many newly emerging technologies and as such might be viewed as more concept than technology solution. This is not the case. Admittedly, we are still at the beginning of implementations that can remove much of the pain of integration, but that should not stop organisations from taking a step backward and asking exactly why they are undertaking the integration. We believe that once the 'why' instead of the 'how' of integration becomes the focus of organisational attention, the move towards interoperability will become unstoppable.

Processes are important for a number of reasons, not least of which is the simple fact that organisations are defined by their processes. It could be said that the processes are the organisation. It therefore makes eminent sense to put process at the top of the list. From an integration viewpoint, processes cross application boundaries, and therefore the most obvious route to take to ensure dialogue between the various applications is to hand integration over to the process itself. This is the basis of interoperability.

The technologies required to move away from the integration mindset are available, but they are not necessarily defined by a single acronym; they include Service Oriented Architectures (SOAs), Business Process Management (BPM), Event Based Architectures (EBAs), Web services, and a raft of others. This plethora of solutions should not be allowed to diffuse the focus, nor should they be seen as a one-stop shop for reducing integration. It is incumbent upon organisations that want to reduce the cost and complexity of their current integration strategy to understand the basic concepts of interoperability. This White Paper highlights the requirements of a solution for interoperability, alongside the arguments for interoperability. It also details a specific solution available from Ultimus that clearly demonstrates the availability of technology to make the concept a reality.


INTEGRATION OR INTEROPERABILITY
Integration is one of the nightmares of modern IT systems. The demands placed upon IT resource to solve the integration problem are huge, and take up a disproportionate amount of available budget when one considers the true value gained. It is easy to talk about the need to remove silos of information or data, or to pontificate upon the need to create an holistic IT infrastructure, but, at the end of the day, these are just words describing a concept with little added substance. A more pragmatic approach has to be to understand the need for integration from a business perspective rather than a technology one.

Integration, if left in its present form, will eventually take up so much budget and consume so much time that organisations will struggle to gain value from their IT systems. Integration can become self-perpetuating if left unchecked. The need to tie one system to another has become the mantra of Enterprise Application Integration (EAI) vendors, and organisations have taken the holy words on board and dedicated themselves to creating an amorphous mass of completely integrated applications. Even if this were an attainable goal, all that would be in place would be a highly structured IT system that answers the problems of the now, rather than being reactive or proactive to problems of the future.

The answer to creating an infrastructure that can solve future problems, as well as current ones, lies not in further construction, but in deconstructing the systems that are already in place. The answer lies in creating an understanding of why integration is needed. It is this understanding of the 'why' rather than the 'how' of integration that will ultimately lead to decoupled systems that are far more flexible than the current models. The 'why' of integration is simple: organisations require interoperability between systems. This interoperability is the key. If we stop looking at the problem from an integration perspective and start to consider it in terms of interoperability, then the way forward becomes clearer. In order to fully understand this, there are a small number of concepts that have to be taken in and accepted as true.

To take one example: many EAI vendors, System Integrators (SIs), consultants, and even analysts will demonstrate the need for some sort of linkage between ERP and CRM systems. In many organisations these two enterprise applications form the backbone of operations. They can be considered as the back office and front office of the organisation; the internal and the outward facing. As these two applications can form the whole operational structure, it makes perfect sense that they should be linked in some fashion. This is unarguable. What can come under discussion, however, is the best way to achieve this linkage.

It is also worth considering just what an enterprise application (or indeed any application) is. An application can be accurately described as a set of functionally related processes collected together to provide a single interface for the activities they perform. A Word Processor collects together a set of functional elements that extend its basic design aim of taking letters typed at a keyboard and displaying them on a screen. Modern Word Processors contain functional activities such as spell checkers, the ability to search for text, and so on. The full set of functional activities is reflected in the items in the drop-down menus.
If the ERP and CRM systems remain discrete from each other, then there will come a point within an operational process that requires the divide to be overcome. At some point a process started in an ERP system will need to have access to data, or will need to reference a process, in a CRM system (or vice versa). That is not to say that this will be true of every process, but if we extend the model we can say with a great degree of certainty that the vast majority of processes running within an organisation will touch more than one application during their lifespans. To put this another way, we can say that there will be explicit functional relationships between applications.


Processes cross application boundaries. It is this very fact upon which EAI vendors (and others) have built their businesses. Yet if we accept that it is processes that cross application boundaries, then surely the most obvious route to take to ensure relationality between the various applications is to hand that element over to the process itself? By abstracting the integration issue to the process we can start to look at the 'why' of integration; to move from integration for integration's sake to the point of interoperability. We want these systems to work together based on process requirements.

This is another factor that has to be understood. The importance of processes cannot be overstated. It is quite easy to build a model that demonstrates an organisation is defined by the processes it runs. If processes are efficient then the organisation is efficient. If processes can be modified quickly then the organisation is better able to react to changing conditions. If the integration strategy is based upon process, then the organisation is not simply answering the current set of problems, but creating an infrastructure that will answer future issues as they arise, in a timely fashion.

The truth of this is self-evident if one considers a typical organisation and its technical infrastructure. In a perfect world it might be that an organisation has tightly integrated its core applications. It could then be argued that any change in process becomes irrelevant: as the applications are completely integrated, any change in process will be reflected in the underlying applications. This is total integration for both explicit and implicit functional relationships. What this model tends to ignore is the introduction of new applications into the IT infrastructure. Any new application, in the old integration model, has to be completely tied in to all the existing applications. If that completeness of integration is not there, then the argument that a change in process will be reflected in the underlying infrastructure no longer holds true. There is no middle ground with integration. If complete integration is not carried out, then one is trying to second-guess future requirements, or play catch-up when those requirements become immediate.

By abstracting the problem to the process, new applications can be seamlessly incorporated into the operational infrastructure. Likewise, and this is an issue all too often ignored, retired applications do not have to be decoupled. Just as there is a cost burden to integration, there is also one with de-integration. The abstraction of integration to an interoperability metaphor is more than a concept; it is an achievable end. The remainder of this White Paper will demonstrate how the actuality can be put in place.
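To make the idea concrete, the following minimal sketch (with entirely hypothetical adapter and step names; no particular product is implied) shows a process definition that carries its own knowledge of which application each step touches. A small engine routes each step to the right system, so no ERP-to-CRM interface has to be coded.

```python
# Illustrative sketch only: the process definition owns the cross-application
# wiring, so no point-to-point ERP/CRM integration is required.
# All class, step, and field names here are hypothetical.

class ErpAdapter:
    def get_order(self, order_id):
        return {"order_id": order_id, "total": 120.0}   # stand-in for an ERP call

class CrmAdapter:
    def get_customer(self, order):
        return {"name": "Acme Ltd", "country": "UK"}    # stand-in for a CRM call

# The process, not the applications, declares which system each step touches.
PROCESS = [
    ("fetch_order",    "erp", lambda system, ctx: ctx.update(order=system.get_order(ctx["order_id"]))),
    ("fetch_customer", "crm", lambda system, ctx: ctx.update(customer=system.get_customer(ctx["order"]))),
]

def run_process(order_id):
    systems = {"erp": ErpAdapter(), "crm": CrmAdapter()}
    ctx = {"order_id": order_id}
    for name, system_key, action in PROCESS:
        action(systems[system_key], ctx)    # the engine routes each step to its system
    return ctx

print(run_process("INV-001"))
```

The point is not the code itself but where the routing knowledge lives: in the process definition, rather than in pairwise interfaces between applications.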

INTRODUCING PROCESS
The first item to note is that, up until this point, we have not used the phrase 'process layer'. When one reads about BPM this phrase is commonly encountered. In some instances it is highly accurate and reflective of the solutions under discussion. However, at this deeper level of utilising processes as an integration replacement, it is important to consider processes not as just another layer in the technology stack, but as a solution that exists as a layer within the stack while also having the capability to drill down through the stack to the underlying objects. This is, in our opinion, the most clearly defining element of a strong BPM solution. If the implemented BPM solution, whichever one it might be, has this ability to address the underlying data sources and to attach to the processes running within the enterprise applications, then it also has the ability to help redefine integration.

With this functionality many of the requirements (or perceived requirements) for integration disappear, and interoperability becomes a possibility. Although it is difficult to place precise figures on the savings to be made, as every organisation has individual requirements, it is not inconceivable that 70% or more of what might have been considered an integration issue (that is, something that would need dedicated resource and extensive cost allocated to it) can be subsumed into the larger picture of creating processes.

In order to give this some perspective, and to add weight to the argument for interoperability as opposed to integration, it should be stated that we recognise there are different types of integration that need to take place within an organisation. Utilising process in the manner described will not address the issues of, for example, having a single data source or a 'single version of the truth' as it is popularly (though incorrectly) put. Data and information integration is a completely separate issue, and whilst we would not argue against this need, we also believe that this, too, is an issue that can benefit from an interoperability and process viewpoint. Again, it comes down to examining the purpose for integration, rather than the integration itself, as the primary driver.

PROCESS INTEROPERABILITY MODEL


The model that needs to be built for interoperability, as opposed to integration, is one of process components. These components can best be described as the discrete activities that take place within a process. For example, a process that raises an invoice for goods sold (which may itself be considered a discrete activity within a larger sales process) would access a component that calculates the sales tax on the invoice total. As there are different rates of sales tax (and different ways of describing them) globally, it makes sense to hold the data on the rates as a separate data item within a data store. The process component is simply informed where to find the data store and the item to be used (in this instance based upon the country of the originating order).

Where the data resides is immaterial in this model. If it makes sense to hold it within an ERP system then that is where it should be; if it makes more sense to hold it within a CRM system then it can reside there. What is not required with this process component model is to tie the ERP and CRM systems together in order for processes that might cross these systems to utilise the data. The rules within the component point the component to the correct location.

To a certain extent this process component model is already recognised and has been implemented within many organisations. One example of a process component resides within Web services. When we examine Web services closely, what we find is a precise match to a process component. A Web service is a piece of discrete functionality delivered with common interfaces in order to simplify the integration of external resources. Web services introduced the idea of a SOA, which is essentially a framework by which organisations can assemble process components into applications that address a specific process requirement. Whilst people appear to be comfortable with the idea and implementation of Web services, there is little difference between them and more generic process components, apart from the former having more standards attached to them. One of the issues regarding Web services is the question of security, and this has been seen as a barrier to the implementation of the model. Within a SOA or a process-component model the question of security, along with other implementation issues, is itself abstracted from the component and becomes a requirement for the architectural framework on which the components run.
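The following is a minimal sketch of the sales-tax component described above, assuming a hypothetical tax-rate data store keyed by country; the rates and keys shown are purely illustrative. The component is told where the rates live, and nothing in it depends on which application owns them.

```python
# Minimal sketch of a process component. TAX_RATES stands in for a data store
# keyed by country; its physical location (ERP, CRM, or elsewhere) is
# irrelevant to the component. Rates shown are illustrative only.

TAX_RATES = {"UK": 0.175, "DE": 0.16, "US-NC": 0.045}

def calculate_sales_tax(invoice_total, country, rates=TAX_RATES):
    """Process component: look up the rate for the originating country and
    return the tax amount. The component is pointed at the rate store; it
    does not care which application holds it."""
    rate = rates[country]
    return round(invoice_total * rate, 2)

print(calculate_sales_tax(200.00, "UK"))   # 35.0
```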


Currently, Web services are the process component of choice, and there is nothing inherently wrong with this. What should be realised, however, is that a SOA is more than Web services, and in a fully realised model the current restrictions of Web services will no longer apply.

The advantages are numerous. Firstly, there is no need to maintain separate conversion tables between different applications (this could be considered as an alternative to integration, or even part of integration, depending upon your viewpoint and how the structure was modelled). Secondly, there are no issues with data inconsistency. If the sales tax for a country changes, then there is only one alteration needed; there is no need to understand all the applications that have this functionality as part of their makeup. We realise that in large, highly distributed systems this model might appear simplistic and unworkable. In organisations that operate with such systems, holding such a single data source might not be feasible, but this does not make the model itself unworkable or irrelevant. If we take the model as the highest level of abstraction, then we can still apply it in distributed systems. A little thought will show that implementing such a model will take what might be considered a complex integration problem and turn it into nothing more than a data replication issue. As most organisations already have solutions in place to ensure data consistency across global sites running multiple systems, the problem has, to all intents and purposes, been removed.

To return to the advantages of a process component model, the third advantage lies in the fact that alterations can be applied immediately, without the need to stop any running processes. To use another example, a component model would allow currency fluctuations to be reflected across the whole of the enterprise (or those parts of it deemed relevant). Costs that are variable due to exchange rate variations are instantly incorporated into the business model. Similarly, these variations can be reflected in the price charged. All of this can be managed from a business point of view, with rules defined to fire processes. In the above example a set of pre-defined rules could exist that would start a price increase/decrease process if the exchange rate passed certain parameters (a generic sketch of such a rule appears below). Finally, and perhaps most importantly, abstracting integration to the process means there is no longer any need to differentiate between explicit and implicit functional relationships. The implicit relationality that has to be put in place in the old integration model (which might be described as 'just in case' integration) no longer exists. All that now exists are explicit relationships that are defined by the process.

Previously we implied there were two types of BPM solution available: those that provide an abstracted layer, and those that have a deeper functionality in terms of their ability to drill down through the technology stack to reference the underlying objects. We talked of the latter as strong solutions. Although we firmly believe the process component model is the way forward to help change integration into interoperability, there are different degrees of excellence even among the strong solutions available. This White Paper has been produced to bring the possibilities into the light of day, and not to measure or compare offerings from different vendors.
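As a generic illustration of the kind of threshold rule referred to above, the sketch below starts a hypothetical re-pricing process when an exchange rate crosses pre-defined bounds. The bounds, process names, and rate feed are all assumptions made for the example, not part of any specific product.

```python
# Generic sketch of a business rule that fires a process when an exchange rate
# crosses pre-defined parameters. Thresholds, process names, and the rate feed
# are hypothetical.

UPPER, LOWER = 1.90, 1.70          # illustrative GBP/USD bounds

def start_process(name, **context):
    print(f"starting process '{name}' with {context}")   # stand-in for a BPM engine call

def on_rate_update(gbp_usd):
    """Rule: if the rate passes either parameter, start a re-pricing process."""
    if gbp_usd > UPPER:
        start_process("price_decrease_review", rate=gbp_usd)
    elif gbp_usd < LOWER:
        start_process("price_increase_review", rate=gbp_usd)

on_rate_update(1.95)   # above the upper bound, fires price_decrease_review
on_rate_update(1.80)   # within bounds, nothing fires
```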
However, it is worth considering what one vendor has to offer in this space, as it adds substance to what otherwise might appear to be nothing more than another concept.


ULTIMUS SOLUTION
Ultimus has a solution rich in BPM functionality, but there are a number of defining factors that allow it to sit most comfortably in any discussion of interoperability as opposed to integration. The ability to move from an integration model to one of interoperability is dependent upon two key factors.

The first of these is to bring the human element into the process definition and the implemented process flow. In the early days of BPM, when there was resistance to the concept, one of the major arguments against it was the belief that it was simply a re-branding of Business Process Automation (BPA). In order to overcome these objections it was common to highlight the fact that BPM had a strong focus on the human interaction with underlying systems. This was acceptable from a conceptual view, but did not provide detail on how it might be implemented in various offerings. Ultimus has that strong focus on the human element in its solution in a specific manner. Processes can only operate effectively if the process flow is clearly defined and managed from the point of view of the actions taken by the non-technical part of the process. In the previous sections of this White Paper we have discussed how we need to build functional components that undertake a specific activity within a flow. The same is true when that flow touches human operators. If we have a process definition that requires some specific human activity to take place, then we need to manage that activity in the same way as we do the technical components. In bald terms this might seem like some Big Brother solution, but it is anything but. What it does is create an environment that allows processes to flow between applications and humans in a completely transparent manner. It simply ensures that the human element of the flow is fully aware of the actions required, and is also presented with the tools to perform those actions in the most effective manner.

The second element that defines an interoperability solution rather than an integration one is the matter of coding. As discussed previously, we can tie systems together by coding interfaces; the cost and restrictions of this have already been uncovered. However, simply talking about abstracting this integration to a process level is of no value if solutions still require coding to address underlying data sources or to cross application boundaries. The Ultimus solution removes the technology requirement for understanding data sources and allows process experts to include data without having to understand schemas. Ultimus also uses Web services to cross organisational boundaries in its defined process flows. All of this is carried out with a simple UI.

Ultimus uses a technology called Flobots, agents that allow processes to automatically call applications. A Flobot can invoke a third-party Web service at any defined point within the process. A training metaphor is used for Flobots, and a process designer can train a Flobot to perform a single transaction or multiple transactions with any available Web service. During the process runtime the defined Web service is invoked automatically, either synchronously or asynchronously.
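The sketch below is a generic illustration only, not the Ultimus Flobot API: it shows, with a hypothetical endpoint and payload, what it means at runtime for a process step to be bound to a third-party Web service and invoked either synchronously or asynchronously.

```python
# Generic illustration (not the Ultimus implementation) of a process step bound
# to a third-party Web service. Endpoint URL and payload are hypothetical.
import threading
import urllib.request

def call_web_service(url, payload):
    req = urllib.request.Request(url, data=payload.encode("utf-8"),
                                 headers={"Content-Type": "text/xml"})
    with urllib.request.urlopen(req) as resp:        # a plain HTTP POST stands in for SOAP
        return resp.read()

def run_step(url, payload, asynchronous=False):
    """Invoke the bound service when the process reaches this step."""
    if asynchronous:
        threading.Thread(target=call_web_service, args=(url, payload)).start()
    else:
        return call_web_service(url, payload)

# Example (hypothetical endpoint, so left commented out):
# run_step("http://example.com/TaxService", "<CalculateTax>...</CalculateTax>")
```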


Flobots are not limited to Web services; the same functionality is also implemented for XML, .NET code, and database access. The design factor of most value, and of highest relevance to the use of Flobots, is that there is no requirement for coding when training Flobots, and they are inserted into the process flow to perform an explicit set of actions. This is componentisation at a very fine level of granularity, and it allows for rapid adaptation of processes, at any point of the process, to meet changing requirements without disturbing the process definition.

Ultimus Events Web Service

As there appears to be a growing distinction between SOA and Event Based Architecture (EBA), it is worth mentioning how Ultimus operates within this framework. Although in our view a true SOA is inherently event based, this does not diminish the importance of having the ability to manage and understand events within a process. The Ultimus Events Web Service allows Ultimus-managed processes to invoke third-party Web services from any event in a process (a generic sketch of this event-driven style appears at the end of this section). Such events can include:

Step-based Events: such as the completion of a step within a process. The decision whether or not to invoke the service in response to a step event is controllable through rules technology.
Form-based Events: invoke a third-party Web service when an Ultimus Form is loaded or unloaded.
Control-based Events: invoke third-party Web services when any Ultimus control, such as a button, is clicked. Web services can also be called from other control events, such as when a control gains or loses focus in a Form.

Although Ultimus uses the Web service model to deliver this interoperability, it should be noted that uptake is not restricted to Web services alone. The Ultimus solution is better viewed as a platform that allows for interoperability within a process component model. As previously discussed, Web services are simply the most common process component implementation available today. As we move this model forward and come to understand the benefits, Butler Group believes Web services will become one of many component types. This will not cause problems in terms of integration or interoperability, such as occurred in the past with EJBs, CORBA components, and DCOM, as it is now the frameworks that will handle the transformations needed, in a transparent manner. This will occur because the SOA model that organisations implement demands a simple, generic interface, such as exists with Web services. The only requirement for organisations to move forward in this space is to have the correct framework in place. The availability of these frameworks, and their effectiveness, will become the defining factor in the whole BPM space. Currently, there are few solutions that abstract the complexity of integration in a usable fashion. Ultimus is clearly one of them.
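The following sketch is again generic and hypothetical rather than a description of the Ultimus Events Web Service itself. It illustrates the pattern described above: step-, form-, and control-based events are each mapped to a service invocation, with a rule deciding whether a step event actually fires.

```python
# Generic sketch of event-driven service invocation. Event names, services,
# and the rule are all hypothetical.

def invoke(service, event, data):
    print(f"invoking {service} for {event} with {data}")   # stand-in for a real service call

RULES = {"step.completed": lambda data: data.get("amount", 0) > 1000}   # rule gating a step event

BINDINGS = {
    "step.completed":  "AuditService",     # step-based event
    "form.loaded":     "PrefillService",   # form-based event
    "control.clicked": "LookupService",    # control-based event
}

def on_event(event, data):
    rule = RULES.get(event, lambda d: True)     # no rule defined means always invoke
    if event in BINDINGS and rule(data):
        invoke(BINDINGS[event], event, data)

on_event("step.completed", {"amount": 2500})    # passes the rule, invokes AuditService
on_event("form.loaded", {"form": "Invoice"})    # no rule defined, invokes PrefillService
```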


CONCLUSION
Integration or interoperability: the answer starts from the viewpoint one takes. If getting disparate systems to communicate in order to fulfil a task is viewed as a technical problem, then integration is clearly the way forward. However, for those organisations that see interoperability as the real requirement, it is encouraging to see solutions coming to market that address the technology requirements. One way of viewing this is to ask whether the underlying technology is the master or the slave of the business. However you may want to describe this (and there are more acronyms involved in this space than most people can remember), what has to remain in focus is simply this: if we abstract integration to the process layer, and implement technology that removes technical involvement in change, then we are on the way to building the nirvana of the truly adaptive enterprise.


CONTACT DETAILS
Worldwide Headquarters: Ultimus, 15200 Weston Parkway, Suite 106, Cary, North Carolina 27513, USA. Tel: +1 919 678 0900, Fax: +1 919 678 0901, E-mail: info@ultimus.com, www.ultimus.com

Ultimus Germany: Brunnenbachstr. 40, 86343 Königsbrunn, Germany. Tel: +49 (0) 8231 989 70 -0, Fax: +49 (0) 8231 989 70 -19, E-mail: info_de@ultimus.com, www.ultimus.com/de

Ultimus Italy: Via Milazzo 34/Ter, 21052 Busto Arsizio, Varese, Italy. Tel: +39 (0331) 679787, Fax: +39 (0331) 629436, E-mail: info_it@ultimus.com

UK Office: Ultimus, 400 Thames Valley Park Drive, Thames Valley Park, Reading, Berkshire RG6 1PT, UK. Tel: +44 (0)1189 653451, Fax: +44 (0)1189 653551, E-mail: info_uk@ultimus.com, www.ultimus.com/uk

Ultimus France: 68, rue du Faubourg Saint Honoré, Paris 75008, France. Tel: +33 (0) 1 53 43 64 47, Fax: +33 (0) 1 53 43 63 00, E-mail: info_fr@ultimus.com, www.ultimus.com/fr

Ultimus Iberia: Puerta de Las Naciones, Ribera del Loira, 46, Campo de las Naciones, 28042 Madrid, Spain. Tel: +34 (91) 503 06 84, Fax: +34 (91) 503 00 99, E-mail: info_es@ultimus.com

Other Offices: Brazil, Central America, Mexico, S. Asia, China, Taiwan, Japan, Australia, and the Middle East


Europa House, 184 Ferensway, Hull, East Yorkshire, HU1 3UT, UK Tel: +44 (0)1482 586149 Fax: +44 (0)1482 323577 www.butlergroup.com
WP000053ANP
