Tracking Service-Oriented and Web-Oriented Architecture

SOA & WOA Magazine



A New View on Business Intelligence

The power of data analysis in a portal architecture

With better technologies, new architectures, and innovative ways of thinking about old problems, there are new applications for business intelligence and data analysis. I am talking about the power of the intelligent portal.

But what do portals have to do with data analysis? Everything. A portal, by bringing together such a wide range of applications, services, and even entire businesses, not only provides a solid integration platform, but in doing so enables business intelligence. Data that was once separate, awkwardly related, and difficult to format can now be brought together under a common umbrella to communicate and be interpreted in new and innovative ways. A remote Web service, a legacy mainframe application, and an old COM component can now all talk to each other within a portal - which makes one think: what are they talking about? My suggestion: ask the logs.

Integration and Its Effect on Data Analysis
In the days of independent and uncommunicative applications, it was possible to have a complete view into the usage of each individual system - you could track that a user logged in and clicked here, waited for 15 seconds, then went to this page, then checked so-and-so data, and finally logged out.

However, to improve interoperability, companies adopted custom solutions to glue disparate applications together - one program speaking directly to another. The problem with point-to-point integration is that it becomes increasingly difficult to understand how a single user is using data across these applications. If each application has its own form of logging, merging those two sets of records, much less analyzing them accurately, can be a veritable nightmare. Even if both programs followed best practices and logged events in Coordinated Universal Time (UTC), slight differences between the machines' clocks could throw the interpretation of the data way off. Did the user click on this link or download that spreadsheet first?
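To make the clock problem concrete, here is a minimal sketch - the log entries, user ID, and the four-second skew are all invented for illustration - of how a naive merge of two applications' logs can reverse the true order of a user's actions:

```python
from datetime import datetime, timedelta

# Hypothetical log entries from two independently clocked machines.
# Machine B's clock runs 4 seconds fast relative to machine A's.
app_a_log = [("2004-09-01T10:00:00", "user42", "click loan-calculator")]
app_b_log = [("2004-09-01T10:00:03", "user42", "download rates.xls")]

def parse(entry):
    ts, user, action = entry
    return (datetime.fromisoformat(ts), user, action)

# Naive merge: trust each machine's own timestamps.
merged = sorted(map(parse, app_a_log + app_b_log))

# If we know B is 4 seconds fast, the download actually happened
# 1 second BEFORE the click - the naive merge got the order wrong.
b_skew = timedelta(seconds=4)
corrected = sorted(
    [(ts, u, a) for ts, u, a in map(parse, app_a_log)] +
    [(ts - b_skew, u, a) for ts, u, a in map(parse, app_b_log)]
)
```

In practice the skew is unknown, which is exactly why merging independently stamped logs after the fact is so unreliable.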

Newer integration strategies, such as service-oriented architecture (SOA), provide for applications of any language on any platform to talk to each other through a central location. With information from different applications flowing through the same central repository, it becomes easier to see and understand system usage.

Portal technologies integrate applications through a central location - the presentation layer - offering a view into the user's actions. Assuming an SOA, the uniform communications layer enables all logging to go through the portal. This ensures that events on a page for a user are given a well-defined order to eliminate any confusion. A small application that interprets those records containing events collected at the presentation layer would help to cleanly tie together and accurately portray site activity and usage. Analysts and planners would be able to bring data back together and understand how it is being used, by whom, and toward what meaningful end.
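One way to realize that well-defined order is to stamp every event with a single sequence number at the presentation layer, instead of trusting per-machine clocks. The class and field names below are a hypothetical sketch of the idea, not any particular portal's API:

```python
import itertools
import threading

class PortalEventLog:
    """Central log at the presentation layer: every event, whichever
    back-end application produced it, gets one monotonically increasing
    sequence number, so ordering never depends on per-machine clocks."""

    def __init__(self):
        self._seq = itertools.count(1)
        self._lock = threading.Lock()
        self.events = []

    def record(self, user, app, action):
        # The lock guarantees a total order even under concurrent requests.
        with self._lock:
            self.events.append({"seq": next(self._seq),
                                "user": user, "app": app, "action": action})

log = PortalEventLog()
log.record("user42", "mainframe-gateway", "fetch account balance")
log.record("user42", "loan-service", "open application form")
```

Because both back-end applications flow through the same choke point, the ambiguity of the clock-skew example disappears by construction.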

Creating a Superior Customer Experience
When companies roll out a general portal solution, whether or not it was built on an SOA, the initial navigation is often awkward. Poor design or careless placement of functionality can force a user to spend too much time on one page while missing better services offered on another. A robust logging solution would let organizations spot these issues quickly and take action. For example, consider an online bank. A report can tell site designers and administrators that most new users eventually end up on the loan application page, but have to sift through four or five links to get there. Other reports can show that some people register as new users and browse the site for a while, but don't sign up for any service - and that those same users later come back and apply for a loan. Clearly, these users would have appreciated more visible functionality. Thanks to log analysis, the organization can reasonably conclude that this frequently requested service should be moved closer to the site's entry point, where it is more easily seen. Attention to details such as these goes a long way toward improving the end user's experience and capturing more business.
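The loan-page report above amounts to a simple computation over per-session click streams. Here is a sketch - the session data and page names are invented - of measuring how many clicks users need before reaching a target page:

```python
from statistics import mean

# Hypothetical per-session click streams harvested from the portal log.
sessions = [
    ["home", "accounts", "rates", "calculators", "loan-application"],
    ["home", "products", "mortgages", "rates", "loan-application"],
    ["home", "loan-application"],
]

def clicks_to_reach(session, target):
    """Pages viewed before the target page, or None if never reached."""
    return session.index(target) if target in session else None

depths = [clicks_to_reach(s, "loan-application") for s in sessions]
avg_depth = mean(d for d in depths if d is not None)
# A high average depth supports moving the link toward the entry page.
```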

Adapting the Customer Experience to the Customer
Having a different, customized face on the same data isn't terribly new - it's called personalization. But imagine a company's reputation if it were able to tailor the user experience to the browsing habits of each individual user. Suppose an online newspaper wants to deliver the latest breaking news it can, but personalized for each customer. Instead of just posting news, it now needs to understand how that news is being used. With data analysis on the information gathered in the portal logs, you can learn what types of news a person most likes to see, and customize his or her homepage accordingly. Rather than having users manually change or customize preferences, why not automatically present the data they are seeking?

By analyzing aggregated data about site usage, it is possible to present what users want without their asking for it. As a user's browsing habits change over time, the site can adapt. For instance, if a person likes sports, simply showing him the sports page when he enters the site is nothing special. It would be more impressive to record that, on his last visit, he searched for the final standings of the U.S. Open - and then, when he returns, immediately show him highlights of the latest tennis tournament, along with its standings.

The customized data you provide need not be restricted to the last couple of visits. As seasons change, so do the sports offerings, so you should have rules in place to accommodate both the user's short-term interests and long-term tastes. The site would adapt to each user's personal viewing habits, providing a truly organic experience.
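As a rough sketch of such a rule, one could blend a long-term interest profile with a short-term boost derived from recent activity. All categories, weights, and the blending formula here are invented purely for illustration:

```python
# Long-term profile learned from months of portal logs (illustrative).
long_term = {"sports": 0.6, "business": 0.3, "world": 0.1}

# Short-term boost: recent searches on the last visit (illustrative).
recent_searches = ["us-open-standings", "tennis-highlights"]
recent_boost = {"sports": len(recent_searches) * 0.2}

def rank_sections(profile, boost, recency_weight=0.5):
    """Order homepage sections by a blend of long-term taste and
    short-term interest; recency_weight balances the two."""
    scores = {}
    for section, w in profile.items():
        scores[section] = ((1 - recency_weight) * w
                           + recency_weight * boost.get(section, 0.0))
    return sorted(scores, key=scores.get, reverse=True)

homepage_order = rank_sections(long_term, recent_boost)
```

Keeping the long-term term in the score is what stops one unusual visit from wiping out a user's established tastes.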

Adapting the Portal Interface to Improve Your Business Model
Good site management should encompass not only how the site looks and which functionality belongs on which pages, but also active log management. That's where all of the knowledge is; that's where smart decisions can be made. For instance, suppose an e-commerce Web site is a conglomeration of smaller companies acquired over the years, and currently has a portal solution that opens up each of the proprietary inventories. Thanks to the portal, user purchases can now be recorded across product domains. If one month brings a particularly strong showing for a certain combination of purchases across departments, you can offer exclusive discounts that help customers see that you are aware of their buying habits and needs. Likewise, if the logs show that users shop much more frequently in certain parts of your site during a given period, you can make the case to advertisers that those spots are more valuable and should cost more to advertise on. With user data to back up your claims, and a reasonable price, advertisers will have little reason to turn down your offer. It's a win-win-win situation: for them, more potential users clicking on their ads; for you, more revenue; for your customers, contextually relevant advertising that improves their overall experience.
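Finding those cross-department purchase combinations is a counting exercise over the unified log. A small sketch - the users, departments, and purchases are all made up - of surfacing the most frequent pairing:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchases per user, pulled from the unified portal log
# rather than from each subsidiary's silo.
purchases = {
    "u1": {"camping", "electronics"},
    "u2": {"camping", "electronics"},
    "u3": {"camping", "books"},
    "u4": {"electronics"},
}

# Count every pair of departments that appears in one user's purchases.
pair_counts = Counter()
for depts in purchases.values():
    for pair in combinations(sorted(depts), 2):
        pair_counts[pair] += 1

top_combo, count = pair_counts.most_common(1)[0]
# The most frequent pair is a natural candidate for a cross-department
# discount bundle.
```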

Becoming Predictive to Improve your Business Model
New software and services come about in two major ways: by assumption and by demand. With assumption there is risk: without really knowing what your customers want, you do your best to presume what services they might need and use. From time to time this pays off, but often not with as much return as anticipated. In the demand world, one meticulously reviews metrics across multiple applications to try to deduce how customers are combining separate services for their own use. Some companies will even ask users to rate the site or take a survey to gather feedback.

With data analysis of portal logs in place, much of that hassle disappears. As you record a user's steps through the site, you gain intimate knowledge of how he or she is using your data - not just data from one application, but data that crosses boundaries. By understanding the flow of a customer's visit, it is possible to build a vast repository of accurate statistics. From this knowledge you can extrapolate what kinds of new services customers are looking for and provide them preemptively. Custom-developed analysis tools that interpret these new types of logs thus serve not only as a record of and guide to the newly linked data being accessed, but also as a knowledgebase and rule-based foundation for the intelligent portal.
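One simple way to become predictive from flow data is to count which page usually follows which, then pre-fetch or promote the likely next step. This first-order model, with invented click streams, is only a sketch of the idea:

```python
from collections import defaultdict, Counter

# Hypothetical click streams taken from the portal log.
streams = [
    ["home", "rates", "loan-application"],
    ["home", "rates", "calculators"],
    ["home", "rates", "loan-application"],
]

# Count observed page-to-page transitions.
transitions = defaultdict(Counter)
for s in streams:
    for cur, nxt in zip(s, s[1:]):
        transitions[cur][nxt] += 1

def predict_next(page):
    """Most frequent follower of `page`, or None if never observed."""
    followers = transitions.get(page)
    return followers.most_common(1)[0][0] if followers else None
```

A real system would need far more history and smarter models, but even this crude table tells the site what most visitors do after viewing the rates page.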

The Intelligent Portal
Portal technologies are in a powerful position to provide cutting-edge statistics to managers, planners, and technologists alike. Managers want to gather meaningful feedback from a system so they can make better, more informed decisions. Planners need to know how a business process or service is being utilized so they can provide better, evolving interfaces that make working with the data easier. Technologists want to know what data is being requested so they can brainstorm the next great service before a customer has even thought to want it. All of these ideas are viable; all of them can be realized.

When getting started, I recommend using a blend of data analysis tools along with a custom implementation. IBM's more advanced portal offering, IBM WebSphere Portal Extend, comes with tools and built-in functionality designed to support the data analysis process. The edition includes a tool called Site Analyzer that captures, stores, and reports on site usage and content relevancy. WebSphere Portal Extend also provides rules-based personalization, which lets business managers define business rules that personalize content for users and groups. Its collaborative filtering uses recommendation engines that apply statistical models and other forms of intelligent software to extract trends from the behavior of portal users; this approach adapts to changes in visitor interest without business rules. Finally, to make the analysis process truly effective for your organization, it is key to customize your tooling and techniques. Designed and implemented well, this business intelligence will let you build on customer satisfaction as well as your business model, making the return on your investment immediately worthwhile.
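To show what collaborative filtering means in this context - recommending content a user hasn't seen based on what similar users favored - here is a deliberately crude, product-neutral sketch; the users, items, ratings, and similarity measure are all invented and bear no relation to any vendor's engine:

```python
# Illustrative ratings gathered from portal behavior (made-up data).
ratings = {
    "alice": {"tennis": 5, "golf": 4},
    "bob":   {"tennis": 5, "golf": 4, "sailing": 5},
    "carol": {"cooking": 5},
}

def similarity(a, b):
    """Crudest possible measure: how many items both users rated."""
    return len(ratings[a].keys() & ratings[b].keys())

def recommend(user):
    """Suggest the best-rated item, unseen by `user`, from the most
    similar other user."""
    peers = sorted((u for u in ratings if u != user),
                   key=lambda u: similarity(user, u), reverse=True)
    best_peer = peers[0]
    unseen = ratings[best_peer].keys() - ratings[user].keys()
    return max(unseen, key=ratings[best_peer].get) if unseen else None
```

Real engines use far richer statistical models, but the principle is the same: the recommendations come from observed behavior, with no business rules to maintain.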

Across all of these mindsets, what does the potential for log analysis in a portal environment have in common? Each stresses that the world today - from the user's or customer's point of view - should be predictive rather than reactive. The less you have to ask your target audience what they want, the more money you save and the less effort you expend. The key is to rapidly deliver services that are simultaneously personalized and contextually relevant.

More Stories By Joseph Marques

Joseph R. Marques is a member of Prolifics' WebSphere Consulting Division - a specialized team of experts whom IBM calls on to deliver training, mentoring, and development services, and to solve the toughest of their customers' challenges. Specializing and certified in Java; WebSphere architecture and best practices; application development and deployment; and portal development, Joseph is responsible for delivering distributed J2EE WebSphere solutions to clients worldwide.
