October 1999
How Long Can You Wait For Real-Time Access To Legacy
Systems?
BY SCOTT OPITZ
When customers are accessing their own information through a Web-based or
telephone-based interaction, failing to reply quickly and accurately is simply
unacceptable. Call center managers recognize that fact and see the problem clearly, but
solving the problem is often perceived as a huge project with a multi-year timeframe and
daunting IT requirements. This may have been the case before, but it just isn't true
anymore.
In most cases, the required customer information exists; it just isn't available in the
only timeframe that matters: immediately. For a significant percentage of call centers,
customer information is inaccessible or is perceived as inaccessible because
it resides in legacy applications.
COMPELLING REASONS FOR CHANGE
Providing real-time access to legacy systems is the clear path to competitive
advantage. In addition, when access to legacy information is simplified and streamlined
through a Web or Windows front-end, agents are freed from learning arcane applications.
Since most companies spend weeks or months training new call center agents on back-end
systems, this can reduce costs and get agents on the phones faster. It also provides a
common baseline that more agents can reach with less training, leading many enterprises to
reduce the number of job classifications and, in some cases, staffing levels. There are no
more call hand-offs, and agents are transformed from phone clerks to problem solvers.
Since call center costs are directly related to the hiring, compensation, and retention
of competent agents, anything that can be done to heighten agent productivity and improve
employee retention represents increased profitability to the enterprise.
TAPPING LEGACY RESOURCES
Most legacy data resides on IBM mainframes, AS/400s, and various other core systems.
Access to legacy systems that provides true business benefit requires the following:
- Real-time access: Customers and suppliers expect the most up-to-date information
possible, whether they are communicating via phone, e-mail, the Internet, or with a
traditional brick-and-mortar retail outlet.
- A familiar user interface: For example, customers doing business over the Web expect to
see a standard browser-based application.
- Fast time-to-market: Rewriting legacy applications is just not an option.
- Acceptable investment/return and risk levels: A lack of understanding of the legacy
applications compounds the development problem, and in most cases the original developers
are long gone.
The best types of solutions address all of these issues.
LEGACY INTEGRATION EXPLAINED
Access to legacy data from modern applications can be obtained invasively (requiring
low-level access or changes to the host) or non-invasively (completely external to the
host).
Invasive Approaches:
- Direct access: Enable or simplify direct access to mainframe databases.
- API access: Use an Applications Programming Interface (API) to obtain access to
mainframe data.
- Re-engineering: Re-engineer the application so that it provides the data and user
interface required.
Non-Invasive Approaches:
- Screen scraping: Use existing 3270 screen-based applications as the vehicle for access.
- Application modeling: Combine the easy access of screen scraping with modern,
object-oriented, server-based technology by using multi-tier application modeling.
DIRECT ACCESS
Generally, host data is in some sort of database, so one obvious approach is to make
direct requests to the database. Unfortunately, the data is often stored in a very cryptic
manner. In particular, applications written many years ago when storage was expensive may
compress data in ways that are hard to unravel. (Many Y2K project managers have discovered
this the hard way!) These encoding schemes are not always well documented, and must first
be deciphered before direct database access is even feasible.
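As a concrete illustration of these cryptic encodings, mainframe numeric fields are frequently stored in IBM packed-decimal (COMP-3) format: two BCD digits per byte, with the sign tucked into the final nibble. A minimal decoder might look like the following sketch (the sample bytes and decimal scale are assumptions for illustration, not taken from any particular system):

```python
def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two BCD digits; the low nibble of the final
    byte is the sign (0xD = negative, 0xC or 0xF = positive).
    """
    digits = []
    sign = 1
    for i, byte in enumerate(raw):
        hi, lo = byte >> 4, byte & 0x0F
        digits.append(hi)
        if i == len(raw) - 1:
            sign = -1 if lo == 0x0D else 1
        else:
            digits.append(lo)
    value = 0
    for d in digits:
        value = value * 10 + d
    return sign * value / (10 ** scale)

# The bytes 0x12 0x34 0x5C encode +12345; with an implied
# two-digit decimal scale, that is 123.45.
unpack_comp3(b"\x12\x34\x5c", scale=2)  # -> 123.45
```

Without the copybook documenting each field's position, length, and scale, even this simple decoding step is guesswork, which is exactly the trap described above.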
In addition, direct access to the database requires the developer to replicate the
business logic. If the relationship of data items is poorly documented, there is the
possibility of error, damaging data integrity and violating rules regarding the
maintenance of audit trails. Finally, this approach necessitates the understanding of the
entire application system surrounding the database of interest.
It is not uncommon for older online applications to trigger one or more batch
applications (probably coupled with the cessation of online applications) contingent on
certain kinds or degrees of changes to the database. If the original legacy application
handles all the updates, there's no problem; if a new application is updating the database
directly in ignorance of such synchronization issues, disaster will eventually ensue.
API ACCESS
Another approach is to access host data through an API. Some applications, primarily
those designed for access via MQSeries, EPI, ECI, or CPI-C (mostly since the early
1990s), should probably continue to be accessed that way. But these form a
small percentage (approximately 15 percent according to many estimates) of all host
applications. Older legacy applications lacking such design can be modified to add in API
access, but this type of modification brings all the same pitfalls as those associated
with direct database access.
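The messaging style these interfaces use can be illustrated with a toy request/reply pair of queues. This is a conceptual sketch only, built on Python's standard queue module; the message layout is an assumption and none of it corresponds to actual MQSeries calls:

```python
import queue

request_q = queue.Queue()
reply_q = queue.Queue()

def host_program() -> None:
    """Stands in for a host program reading its input queue."""
    msg = request_q.get()
    reply_q.put({"correl_id": msg["msg_id"], "balance": "512.00"})

def client_request(account: str) -> dict:
    """Client side: put a request, then wait for the reply."""
    request_q.put({"msg_id": 1, "account": account})
    host_program()  # in reality the host runs independently
    return reply_q.get()
```

An application designed around such a request/reply interface already exposes a clean, screen-independent entry point, which is why systems built this way should keep being accessed through it.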
RE-ENGINEERING
With enough money, time, and programmers, it is possible to completely rewrite
and modernize the host application, simultaneously providing it with a modern user
interface. This is a viable approach if the intention is to completely rework the
application or if the entire business model has radically changed and the company wants to
start from scratch. Otherwise, it requires a huge outlay of time, money, and human
resources at a period when many development organizations are already faced with a tight
labor situation, tight cost controls, and a focus on Y2K/Euro readiness.
SCREEN SCRAPING
In situations in which the host system cannot be modified or reconfigured for external
access, none of the invasive approaches mentioned above can be used. In those situations,
screen scraping or multi-tier application modeling must be employed.
In screen scraping, a custom-coded application takes control of a 3270 or other
emulator session, sending and receiving datastreams that the host interprets as
ordinary traffic against its normal screens. The application then retrieves from those
datastreams (or places into them) the data the end user needs to access or wishes to
modify on the host. Screen scraping products range widely in their degree of
sophistication and scalability. In older, two-tier approaches, the client program
(providing the new user interface) communicates directly with the legacy back-end via the
IBM-defined High Level Language Applications Programming Interface (HLLAPI). More recent
products have provided less complex (though still quite programming-intensive) APIs that
allow developers to use API calls from their preferred development environment (e.g.,
Visual Basic, Java, PowerBuilder). Companies able to handle the tedious programming burden
of these tools do not need to modify the host applications and can achieve easy access to
the data through the screen datastreams.
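Conceptually, a screen scraper treats each 24-by-80 screen image as a flat character buffer and pulls fields out by fixed position. The sketch below is illustrative only; the field coordinates are assumptions, standing in for what a real product obtains through HLLAPI or a TN3270 emulation library:

```python
SCREEN_COLS = 80

def field(screen: str, row: int, col: int, length: int) -> str:
    """Extract a fixed-position field from a screen image captured
    as one flat string (row and col are 1-based, as on a 3270)."""
    start = (row - 1) * SCREEN_COLS + (col - 1)
    return screen[start:start + length].strip()

# A captured "customer inquiry" screen would be sliced like this
# (the coordinates are hypothetical):
#   name    = field(screen, 4, 20, 30)
#   balance = field(screen, 9, 52, 12)
```

The fragility is plain to see: if the host application moves a field by even one column, every extraction breaks, which is why this style of programming is so tedious to write and maintain.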
THE NEXT STEP: MULTI-TIER APPLICATION MODELING
The screen-scraping concept has continued to evolve, and recently there has been a major
step forward. In this approach, a server is introduced as an intermediate tier residing
between the client and the legacy applications. Legacy screen datastreams can be modeled
and managed on the server, fully isolating them from the new client applications. The
multi-tier approach has proven far superior to the two-tier approach, especially for
larger projects where ongoing growth and software maintainability are critical.
The benefits of multi-tier application modeling include all those associated with
traditional screen scraping, plus the following:
- Rapid deployment. Any modifications required due to legacy application changes are
restricted to the server system. There is never a need to reprogram and redeploy clients
due to changes in host screens or legacy application logic. Long-term maintenance and
total cost of ownership are improved.
- Client applications are totally isolated from legacy application access logic.
- Different types of clients can use the same server-based code and host screen models.
- Business logic can be separated from the data access logic; programming is simplified
both now and in the future.
- Because the legacy integration logic and screen models are self-contained at the server
level, they can be readily replicated across a number of physical servers. This approach
easily scales to meet the needs of even the largest companies.
- Since all legacy activity flows through a central group of servers, management is
simplified.
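The isolation described above can be sketched in miniature. The class and method names here are illustrative, not any vendor's API; the point is that all screen knowledge lives in the server-side model, and the client sees only a business-level call:

```python
class HostScreenModel:
    """Server-side model of a legacy screen dialog. In a real
    deployment this would drive a pooled 3270 session; here the
    navigation step is reduced to a single transact() call."""

    def __init__(self, session):
        self.session = session  # e.g. a pooled emulator session

    def get_balance(self, account_id: str) -> str:
        # Fill the inquiry screen, submit it, read the reply fields.
        reply = self.session.transact("INQ1", {"ACCT": account_id})
        return reply["BALANCE"]

# Any client (Web, Windows, IVR) calls get_balance(). If the host
# screen layout changes, only HostScreenModel is updated and
# redeployed; no client application is touched.
```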
Multi-tier application modeling provides flexible, manageable, scalable user access in
real time, and rapid time-to-market within a reasonable budget. For most organizations, it
is the best approach to accessing legacy data and turning under-utilized assets into
competitive advantages.
SUMMARY
While some companies are beginning to recognize the value of immediate access to all
available customer information, most don't realize that it can be achieved quickly
and easily, with no adverse impact on existing systems. The benefits of increased customer
satisfaction and more efficient, profitable call center operations speak for themselves.
Call center and IT managers owe it to their organizations and their customers to
investigate the options today.
Scott Opitz is responsible for all marketing, product management, and business
development activities for the Enterprise Integration Solutions Division of Computer
Network Technology (CNT), and he can be reached for comment at 508-870-3904. CNT provides
EAI solutions for bringing legacy information into new, business-critical applications.
For more information, please visit their Web site at www.cnt.com.