Transcript of "Vendor strategies: Operational Business Intelligence for Agile Enterprises"
Operational Business Intelligence for Agile Enterprises
By Kishore Jethanandani
VENDOR STRATEGIES AND PRODUCT OFFERINGS

Agile enterprises need business intelligence to improve their sensory perceptions: the ability to gauge reality, to view their resource flows, and the alacrity to remain on top of events. Business intelligence vendors are increasingly conscious that customers hate to be hobbled by their sunk costs in information technology. Instead, customers want to be able to rejig their existing technologies to adapt in fluid situations. Information needs to flow unimpeded by clunky technologies. In the past, technology has often inadvertently become a millstone instead of a lubricant of change. Customers are increasingly looking for technologies that read the pulse of their business activity and funnel information to all the employees who can communicate, collaborate, and act in time to respond to events.

The concerted effort that customers are making to lower latencies in data collection and decision making is best illustrated by the effort investment banks are making to speed up the processes that refresh the data influencing their portfolio management decisions. They are looking to receive information directly from the exchanges, options-trading exchanges, and ECNs so that they can weigh the impact of events on any of the securities they hold in their portfolios. Automatic trading tools enable traders to complete the calculus of risk and return when they buy or sell securities and need to evaluate the impact of major changes such as prices and interest rates. Their information architecture has to be constructed so that it can tap market data from a variety of servers, each with its own data format, convert the feeds into a single format, and move the data, aided by middleware, into the enterprise data infrastructure.

In the past, information systems were tenuously linked to the levers that companies could use to act as situations changed.
Increasingly, enterprises are looking to integrate decision-making processes and business processes so that the lag between the receipt of information and the response is minimized. They want to build in rules for predictable responses to known problems so that human resources can be reallocated to knottier problems. Where human intervention is required, companies want to be able to quickly visualize a situation and size up a problem before they act.

Above all, the best decisions happen when all the related information is brought to bear on a course of action. In addition, decision-makers want to simulate alternative scenarios and visualize them before they take their decisions.

The business intelligence industry is still evolving, and the jury is still out on who will eventually win. Clearly, established players with experience implementing large deals in the enterprise software industry have the best chance of integrating the several technologies required to gather information, analyze data, and communicate decisions. Business intelligence projects also require the consulting services that are so necessary for successful implementations, especially of the operational applications.
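The idea of building in rules for predictable responses to known problems can be sketched as a small condition-action table. This is a minimal illustration, not any vendor's rule engine; the event fields, thresholds, and actions are all hypothetical:

```python
# Minimal sketch of rule-driven responses to known operational events.
# Field names, thresholds, and actions are hypothetical illustrations.

def build_rules():
    """Each rule pairs a condition on an event with an automatic action."""
    return [
        # (condition, action) pairs evaluated in order
        (lambda e: e["inventory"] < 10,
         lambda e: f"reorder sku {e['sku']}"),
        (lambda e: e["error_rate"] > 0.05,
         lambda e: f"alert operator for {e['sku']}"),
    ]

def respond(event, rules):
    """Return automatic actions; an empty list means escalate to a human."""
    return [action(event) for cond, action in rules if cond(event)]

rules = build_rules()
print(respond({"sku": "A1", "inventory": 3, "error_rate": 0.0}, rules))
print(respond({"sku": "B2", "inventory": 50, "error_rate": 0.2}, rules))
```

Events matching no rule yield an empty list, which is the signal to hand the problem to a person, mirroring the reallocation of human attention described above.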
INTELLIGENCE ON TAP

Business managers in operational situations get the “missed the bus” feeling when they are unable to take decisions at the right time; their calculations can go haywire as the ground shifts underneath them. Well-timed moves help them grab opportunities and get ahead of competitors. Companies also need to take decisions before a problem snowballs into a crisis.

Delays in decision making can cause grievous losses to businesses. In the pharmaceutical industry, for example, counterfeiting, medical errors, and poor product quality can undermine confidence in companies and medical groups. The actual manufacturers could be blamed for the harm done to patients by counterfeits or by an odd batch of poorly manufactured products. The healthcare industry is now well equipped with laser vision, RFID, and other technologies that eliminate errors in data entry, and with data-gathering technologies that aggregate data so that the causes of any damage can be traced back to specific deliveries and preemptive action taken.

Edge Dynamics Inc. is one of the companies with applications for real-time decision making in the pharmaceutical industry. Typically, a complex set of contracts, deals, and regulatory policy binds the stakeholders in a supply chain consisting of manufacturers, wholesalers, and retailers. Edge Dynamics has software that can capture the transaction order stream originating from EDI or other sources such as a Web-services B2B network. The data is analyzed for discrepancies from forecasted numbers or deviations from agreements initiated at the outset. As data is received and analyzed, the partners in the supply chain will discover flaws in its design and work towards managing inventories better.
They will find better ways to optimize and reevaluate their partners and their logistical planning.

The technology that goes into gaining visibility into the operations of businesses is illustrated by the implementation of Siebel Analytics at Jostens, which sells class rings, graduation announcements, and yearbooks to schools across the country. Siebel Analytics aggregates data from the Oracle data warehouse, a Microsoft SQL Web/e-commerce application, and Microsoft Access for use by Jostens' sales staff, who can view the results on role-based, interactive dashboards. The Siebel Analytics software enables Jostens to track sales performance data for each segment of the business. With real-time data feeds, Jostens' sales staff can spot opportunities for cross-selling, up-selling, and the like.

Retail stores present a familiar scenario where markdowns happen almost every day when inventories pile up unexpectedly. All too often, retail store managements are taken by surprise as preferences change, the media influences attitudes, competitors announce new promotions, seasons change, or events affect purchases by consumers. At the local level, consumer behavior can be quirky, and the inventory in stock may not excite shoppers. Retail stores have to learn to stock an assortment of products that is in tune with the tastes of customers at each of their stores spread around the country, ensure that the stores will be profitable, and manage the supply chain so that products will be available in time for the season.

When supply closely matches demand, companies can not only pass the benefit of lower losses from stock-outs to consumers in the form of lower prices but also offer products that closely match their needs. Zara, a Spanish clothing company, takes less time than its competitors to respond to market needs. The managers at its stores send information about customer preferences from handheld devices, and it is all aggregated rapidly so that the most relevant products are displayed in its stores. Dyeing and printing are done only after the customer information is available.

The management of demand and supply has become more difficult as product life cycles get shorter and supply chains get longer, with goods sourced from more distant places. Increasingly, companies are looking at software that can aggregate point-of-sale data from multiple sources, analyze it to predict demand for individual categories of products, optimize the supply chain, and help with pricing.

Several different pieces of software are used in the management of demand and supply. One of them is revenue optimization software, which takes into account information on demand and costs and determines the best price to offer based on the elasticity of demand. Conversely, it can take the prices offered by competitors as given and produce the numbers for the desired demand and supply. Such software can also pinpoint the customer segments most likely to respond to a particular offer. Manugistics Inc. is one company that leads in this segment of the market. However, revenue optimization software takes only existing demand into account before it cranks out figures on prices and potential segments to target.

The suppliers would rather be able to forecast demand accurately and produce just as much, so that they can receive better price deals.
This is best achieved by using demand forecasting software. Successful implementation of demand forecasting tools presupposes the collection of point-of-sale data and the willingness of retailers to share such information with their vendors. Besides supply chain management software providers such as i2 and Manugistics, business intelligence vendors such as NCR Teradata, Business Objects, Cognos, and Prescient are players in this segment.

An additional piece of the puzzle is supply chain management software for collaborating with vendors in real time, managing logistics, and sharing information. Oracle's 11i E-Business Suite, for example, includes the iSupplier and Collaborative Planning portals to communicate with offshore contract manufacturers and suppliers through web-based tools.

MACHINE LEARNING FOR INSIGHTS

The growing size of data sets has changed the analytical paradigm. Well-known techniques, such as statistical techniques, are overwhelmed by the colossal volumes of data. Typically, statistical techniques begin with a hypothesis and a model, based on domain knowledge such as psychology, which they seek to validate. The involvement of human beings and uncertain processes precludes the use of such insights in real time.

When data sets are large and chaotic, it is much harder to decide on a methodology for verification. The dimensions overwhelm human cognition's ability to see the connections, and to do so quickly enough to make decisions. Increasingly, machine learning methods are required to reduce raw data to patterns before humans can look for the story that is relevant for decision making. These automated methods look for correlations in data over periods of time (time series), find clusters in activities such as crime, or classify data, as in decision trees. Market basket analysis, for example, looks for combinations of products customers tend to buy.

The kind of situation in which machine learning has compelling value is searching the web. Intelligence agencies have to look for terrorist activity, competitive intelligence analysts look for information on rivals, and content creators have to look for violations of their intellectual property rights. Companies such as FAST have created tools that are able to extract insights from such a labyrinth. Reuters, for example, uses FAST's search tool to zero in on content that looks suspiciously like its own.

Machine learning plays an important role in functions such as fraud detection, stock trading, and customer segmentation, where intelligence cannot wait for an analyst to extract it. Neural network software systems, for example, have reduced fraud in UK banks by as much as 30%.

The vendors in the space include SAS, STATISTICA Data Miner, S-Plus, Fair Isaac, SPSS Clementine, IBM Intelligent Miner, Affinium Model, Insightful Miner, KXEN, and Genelytics.

Tools that understand fuzzy concepts

Companies are best able to extract insights when they can search across all their data and classify and correlate it.
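The automated pattern-finding described in the machine learning section above reduces, in its simplest form, to counting. A minimal sketch of market basket analysis, for instance, counts which product pairs co-occur in purchase baskets (the basket data here is invented):

```python
from collections import Counter
from itertools import combinations

def pair_counts(baskets):
    """Count how often each unordered pair of products is bought together."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return counts

def frequent_pairs(baskets, min_support):
    """Keep pairs appearing in at least min_support baskets (a-priori style)."""
    return {p: c for p, c in pair_counts(baskets).items() if c >= min_support}

baskets = [
    ["bread", "milk"], ["bread", "milk", "eggs"],
    ["milk", "eggs"], ["bread", "milk"],
]
print(frequent_pairs(baskets, min_support=3))
```

Real mining tools add statistical measures (support, confidence, lift) and scale to millions of baskets, but the underlying idea of surfacing co-occurrence patterns for a human to interpret is the same.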
For decision support, companies have to be able to conduct searches on both structured and unstructured information. The need to search unstructured information is more urgent now, as it has a wide range of applications in areas such as law enforcement, customer service, drug discovery, and knowledge management. Companies are beginning to discover the enormous benefits of mining text and other unstructured information. The pharmaceutical industry, for example, is discovering that it could reduce the time required to commercialize new drugs if only it could search and analyze the information pouring in from clinical trials for all drugs. When safety data is available for all clinical trials, regulatory bodies can look for patterns that will help them come to decisions about approving drugs for human use faster than is the case now. XML is the bedrock for linking related databases and searching them with text mining tools.
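How XML tags make text searchable alongside structured attributes can be sketched with Python's standard library; the clinical-trial records below are invented stand-ins for the kind of repositories described above:

```python
import xml.etree.ElementTree as ET

# Hypothetical clinical-trial records tagged with metadata, illustrating how
# XML tags let text repositories be queried much like relational tables.
doc = ET.fromstring("""
<trials>
  <trial drug="X-101" phase="2"><safety>mild headache reported</safety></trial>
  <trial drug="X-101" phase="3"><safety>no adverse events</safety></trial>
  <trial drug="Y-202" phase="2"><safety>elevated blood pressure</safety></trial>
</trials>
""")

def safety_notes(root, drug):
    """Return the free-text safety notes for every trial of the given drug."""
    return [t.findtext("safety") for t in root.findall(f"trial[@drug='{drug}']")]

print(safety_notes(doc, "X-101"))
```

The attribute predicate in `findall` filters on structured metadata while the returned values are unstructured prose, which is precisely the pairing that makes tagged repositories searchable.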
Customers need a common language and search tools to parse all the relevant data and analyze it. Natural language is best able to express the nuances of human thought. Inevitably, natural language carries a variety of meanings, synonyms, connotations, and usages. New tools are required to see words in their context before any meaning can be drawn from them. This is best achieved when search engines have semantic capability.

The traditional and most widely used method of searching databases, the Structured Query Language, is inadequate for heterogeneous environments where data descriptions vary between databases. This form of querying is relevant only for structured data, and it presumes knowledge of the specific information a person is searching for. In most cases, people have knowledge only of the theme they are interested in exploring. In heterogeneous environments, searching with SQL would be impossible, since the number of sources, as well as their heterogeneity, overwhelms its ability to extract meaningful information and knowledge.

With the advent of XML technologies, it is now possible to classify unstructured information as well. Individual elements of unstructured information can be described by tags, the metadata that describes the information content within. The detailed description of the content helps in searching repositories with large volumes of content, much as SQL queries extract information from relational databases. XQuery can search both content repositories and databases and extract related quantitative and qualitative information. Microsoft's SQL Server 2000 is one product that supports XQuery and is able to use both structured and unstructured data for analytical purposes.

Another approach to searching unstructured data is to use search engines. However, a search conducted on unstructured data all too often yields a jumble of results, an experience all too familiar to users of the World Wide Web.
Similar searches on corporate intranets are worse, since the information is not even linked, as it is on the World Wide Web.

Search technologies for corporate databases seek the significance in a mass of words. For example, someone looking for information on a crime committed by a suspect named John Lear of San Francisco will find meaningful information when inter-related information about the person's background, the time, the location, and previous associations with the victim is presented. Databases of unstructured information can have variables such as time, location, and biographical data as the dimensions of a data warehouse, storing the related facts associated with each of them. A search conducted on such databases is more likely to find related results instead of a jumble.

At the center of semantic search technologies is the ontology, the knowledge base that helps to define the “beings,” or personae, that are pivotal to understanding the universe under consideration. For example, students and professors form the axis of the universe of an educational institution. Semantic search tools create a taxonomy to describe the entities in a universe and their relationships with the world around them. The information about the university is classified by entities, which helps to create links between the available records.

Search engines for unstructured data are now able to find information in an organized way by using tokenization, linking, and taxonomies. In essence, these methods look for patterns in the unstructured data. The tools are designed to look for associated text; a word like “crime” is related to gang membership, the academic performance of the person, incidents of drug or alcohol abuse, and so on, and the information is presented in its relevant context.

The impact of correlating structured and unstructured data can easily be visualized in the decision analysis required for store location. Typically, retail companies will need structured data such as the demographics of the neighborhood. They will also need map information in the form of satellite imagery. And they would like unstructured information, on crime for example, to gauge the attractiveness of the location, besides lifestyle trends in the region.

One example of the use of intelligent search engines is the ISYS search engine from Odyssey Development. Ventura County in California uses it to search through its numerous repositories for related information. It could, for example, use blood examination data from a structured database and find related information, from several other repositories, on burglaries committed by the same individual.

One of the several semantic search tools in the market has been created by Semagix for searching media sources. The ontology is a hierarchy of categories, beginning with general classifications like News, Business, and Entertainment and moving to more specific terms like cricket, soccer, and so on.
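A category hierarchy of that kind can be sketched as a small tree, with a lookup that returns the chain of categories leading to a term. The taxonomy below is invented for illustration, not Semagix's actual ontology:

```python
# A toy taxonomy in the spirit of the hierarchy described above.
TAXONOMY = {
    "News": {"Politics": [], "Weather": []},
    "Entertainment": {"Sport": ["cricket", "soccer"], "Music": []},
}

def path_to(term, tree, trail=()):
    """Return the chain of categories leading to a term, or None if absent."""
    for name, sub in tree.items():
        if isinstance(sub, dict):
            found = path_to(term, sub, trail + (name,))
            if found:
                return found
        elif term in sub:
            return trail + (name, term)
    return None

print(path_to("cricket", TAXONOMY))
```

Knowing that “cricket” sits under Entertainment and Sport is what lets a semantic engine scope a query to cricket tournaments while excluding tournaments in unrelated categories.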
Searches can be done by theme, such as cricket tournaments, which excludes the possibility of tangential information, such as tournaments in all sports, appearing.

IBM is one company bringing a great deal of intellectual property to the table for searching unstructured data. With IBM WebSphere Information Integrator OmniFind Edition, it has pushed the envelope by launching its Unstructured Information Management Architecture (UIMA), a platform for integrating structured data and unstructured information. The platform supports a variety of functions, such as linking analytics software and enterprise applications, and gives developers tools to conveniently create new or reusable text analytics components. With this architecture, unstructured data in a host of formats and languages can be searched, whether it is located in databases, e-mail files, audio recordings, pictures, or video images.

The searches are unlike familiar keyword searches; they use concepts to look for related pieces of information. Text analytics components supported by UIMA can use WebSphere Information Integrator OmniFind Edition to define the ontology, look for relationships in data, mine text to find hidden knowledge, and extract useful business information. An example of how these kinds of search engines can look for inter-related information is customer satisfaction: it would be possible to search maintenance records, market research studies, call center records, and warranty claims to find the products that customers find most satisfactory, or vice versa.

Altogether, a total of fifteen companies plan to use this architecture; they include Attensity, SPSS, Endeca, Factiva, Kana, ClearForest, Cognos, and SAS. Factiva and QL2 will provide data for analysis.

They get it with visuals

Decision-makers are constantly intimidated by information clutter and are looking for tools to help them digest information rapidly. There is a great deal of noise in large volumes of information, and the noteworthy nugget can easily elude decision makers. In industries such as the securities industry, the value of information decays quickly unless its substance is absorbed quickly.

Visualization is an indispensable tool for real-time assimilation of the relationships in large volumes of data and their implications for decision making. One instance of this is American Water, which has to monitor the threat of hostile intrusions on its IT network. It receives thousands of alerts, the large majority of them false alarms. Visualization tools help it map the source address of a packet and its destination to isolate any suspicious activity.

Decision makers prefer interactive visualization tools that help them test their hypotheses visually. Excel-type static graphics have been the staple of visualization in enterprises. Decision makers need to be able to examine alternative scenarios, and they like visuals that are three-dimensional, pliable enough for impromptu reconfiguration in response to queries, and easy to flip so that a problem can be viewed from a variety of angles. The visuals are made lifelike by the use of artifacts, colors, and animations that convey the meaning of the information displayed. All of these attributes are meant to contribute to effective communication of a message.
None of the widely available visuals that come with spreadsheets can achieve this.

Visual queries are one means of isolating relevant data from the clutter and portraying it visually. Much like a structured query, a visual query extracts specific pieces of information from a mass of relational data and displays them on a graph. An alternative way to zero in on selected information is through the choice of dimensions; an analyst might want to compare bad debt losses by region, such as the Midwest and the West Coast, which is possible when a cube is created. Cognos Visualizer, which works in combination with Cognos PowerPlay to aggregate data from multiple sources, is one product that enables users to choose the dimensions and the corresponding numbers they want to display graphically.

The vendors in this field include the business intelligence vendors and another group of specialists with a focus on visualization. Among the leading BI vendors are Cognos, Business Objects, and SAS. The specialists, on the other hand, are companies like Vizible Corporation and Visual Mining, and visualization platform providers such as Antarctica Systems' Visual Net and Spotfire's DecisionSite. The platform providers are the most versatile, as they are designed to use data from any source, and they can customize analytical tools to conduct the desired kind of visualization. Typically, the platform providers focus on industries that generate enormous quantities of data, such as the pharmaceutical industry or the natural resources industry.

Spreadsheets are forever

An aspect of real-time access to data for decision making is the option for users to continue using familiar tools. Spreadsheets have been the tool most widely used for the analytics required for decision making. Integration of spreadsheets with business intelligence software is critical to its widespread adoption in the enterprise.

Excel spreadsheets are ubiquitous in enterprises despite the fact that they inexorably fragment data sources. The flexibility of Excel allows users to create their own data marts, and they can add formulas of their own choosing. On the other hand, spreadsheets contribute to a fragmentation of data sources and perceptions that conflicts with the objective of gaining a consistent view of the enterprise. Excel spreadsheets are also not scalable, and they are not the tool of choice when large teams have to work together. In addition, Excel spreadsheets cannot manage business processes based on the analysis conducted. Above all, Excel spreadsheets cannot pull together information from a diverse set of corporate databases.

Business intelligence vendors have, in the past, provided partial integration with Excel spreadsheets by adding features for exporting data to Excel sheets that can be loaded onto a server. However, Excel users routinely create their own formulas, producing additional data series that were not included in the process of integration.
One of the exceptions was SRC Software, recently acquired by Business Objects, which had closely integrated a variety of databases with an Excel interface.

Lately, business intelligence vendors have changed course and offered Excel as the interface to their business performance or business intelligence software. The software provides a consistent view of the data and uploads formulas, together with information on changes made by any of the workers in the enterprise, so that they all have access to the same information.

Beyond a patchwork of integration

The issue of integration has gained urgency as companies seek to manage their sprawling supply chains, collaborate in product development, offer self-service options to customers, and outsource business processes. A common denominator in these applications is that they span several systems and applications; enterprises need conduits for information to flow across all of them.

In the past, middleware was the accepted way to integrate applications; it was a step forward from the arduous custom coding that was the norm. While custom coding is the recommended method for joining two applications, middleware simplifies matters when an application has to be connected to several others. The middleware is the intermediate junction that allows information to flow in several different directions. On the way, the middleware prepares the data, reformatting, merging, and so on, so that it can be accepted by another database or application. Data transfers in a message-oriented architecture are akin to e-mail, which remains in a server queue until it can move through the network and is eventually downloaded to a desktop when desired.

Middleware created a patchwork of joins that grew in number and complexity over time and became increasingly difficult to manage. At this point, a need was felt for a platform from which the gamut of middleware and associated software, such as business process management, could be managed. Enterprise Application Integration (EAI) suites meet these needs. An EAI network has subscribing applications that replicate data received by any one of them into the others.

Integration of applications is not limited to specialized EAI vendors such as Tibco, webMethods, and SeeBeyond. These vendors have strengths in integrating several applications into a continuous process but are less likely to be able to combine a broad range of complex business processes. Another group of vendors comprises the large enterprise software companies offering platforms, such as IBM, Microsoft, and BEA. Finally, the application vendors are also offering integrated application suites such as SAP's NetWeaver and Siebel's Universal Application Network (UAN).

In the early stages of integration, message-oriented EAI software was most widely used to transport data from one end to another. In a typical message-oriented integration technology, data can be transferred asynchronously, so that running applications don't have to interrupt their functions to receive new data feeds. An intermediate layer of middleware receives the data feeds and transmits them to another application without the need to alter either application in any way. The EAI network facilitates the navigation of data over a variety of networks, programming languages, and applications.
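The asynchronous, e-mail-like transfer described above can be sketched with an in-memory queue: the sender enqueues and moves on, and the receiver drains the queue whenever it is ready, so neither side blocks the other. The message contents are invented for illustration:

```python
from queue import Queue

# Sketch of message-oriented transfer: messages wait in a queue, like e-mail
# on a server, until the receiving application chooses to collect them.
channel = Queue()

def send(message):
    """Enqueue and return immediately; the sender never waits on the receiver."""
    channel.put(message)

def drain():
    """Receiver pulls whatever has accumulated, in arrival order."""
    received = []
    while not channel.empty():
        received.append(channel.get())
    return received

send({"order": 17, "status": "shipped"})
send({"order": 18, "status": "delayed"})
print(drain())
```

Production middleware adds durability, routing, and delivery guarantees on top of this queue, but the decoupling of sender from receiver is the essential property.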
When only two applications are integrated, a point-to-point integration will suffice. A publish/subscribe methodology, on the other hand, works better when data has to be transferred to multiple applications.

One case study of an EAI implementation is Cincinnati-based HealthBridge, which acts as the junction for flows of information among hospitals and providers in the Cincinnati metropolitan area. The company navigates the data streams from 28 hospital clinical applications and delivers more than 940,000 clinical results per month to 2,900 physicians. The HealthBridge network uses clinical messaging software to transmit data from an application in a hospital to another in a physician's office.

EAI departs from the piecemeal integration of some of an enterprise's applications by tying them together with middleware. While integration by means of middleware is much less expensive than an EAI implementation, the benefits, too, are far more limited. By integrating the entire enterprise, an EAI also enables companies to view their business processes and applications in their entirety. Process parameters are monitored so that companies can track their performance. They can optimize their business processes by modeling them, simulating the impact of alternative designs that could help lower costs, and redesigning the processes.

IBM's MQ series is the market leader in the space for message-oriented integration technologies, with an estimated 65% of the market. IBM's leading position is accounted for by its open architecture, which means that applications on a diverse range of platforms can be integrated without any special programming. Microsoft, the other major player in the market, does not support any platform other than its own without custom code. IBM's WebSphere MQ connects applications and passes messages between them. It has a library of connectors to Oracle, SAP, and Siebel Systems applications, as well as mainframe systems such as CICS and IMS. BEA Systems offers WebLogic Integration, which uses XML-based application adaptors to interlink applications.

The other method is to integrate at the level of data. This method relies on database technologies like gateways, metadata, queries, data set transfers, and bulk data loading tools such as ETL or data grids. The most commonly used are ETL tools, traditionally used to feed data into data warehouses, products like Data Junction's Integration Studio and Engine and Informatica's PowerCenter. These tools extract data from operational data stores, transform the data, and load it into data warehouses.

Increasingly, the former ETL specialists have morphed into data integration specialists. Informatica, for example, has incorporated PowerAnalyzer into its data integration product PowerCenter Advanced Edition and is working closely with Composite Software on EII-style integration. IBM extended its data integration capability with its acquisition of Ascential and has incorporated Ascential's ETL engine, renamed DataStage TX, which also includes the EAI capabilities that came with Ascential's acquisition of Mercator Software Inc.
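The extract-transform-load pattern these tools implement can be sketched minimally; the source rows and target schema below are hypothetical stand-ins for an operational store and a warehouse:

```python
# Minimal extract-transform-load sketch. Real ETL tools add scheduling,
# error handling, and connectors; the three-stage shape is the same.
def extract():
    """Pull raw, text-typed rows from an operational source (invented here)."""
    return [{"sku": "A1", "qty": "5", "price": "2.50"},
            {"sku": "B2", "qty": "3", "price": "4.00"}]

def transform(rows):
    """Cast text fields to numbers and derive a revenue measure per row."""
    return [{"sku": r["sku"], "qty": int(r["qty"]),
             "revenue": int(r["qty"]) * float(r["price"])} for r in rows]

def load(rows, warehouse):
    """Append the cleaned rows to the warehouse table."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```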
The IBM WebSphere Information Integrator plays the data integration role.

Integration of data sources is crucial not only to consolidating data but also to checking its quality. This is illustrated by the case of The Scotts Company of Marysville, Ohio, the lawn and garden products company, which needed a way to forecast consumer demand with greater accuracy. In the past, it had to depend on its own shipment data instead of point-of-sale data, which was scattered and could only be accessed from the EDI systems routinely used by retail chains. The data integration major Ascential designed a solution that allows the company to access consolidated point-of-sale data in a form that can be read by its SAP applications. A comparison of actual shipment, production, and inventory data with demand helps to determine the trends in consumer demand and to make the relevant adjustments.

Enterprise Information Integration (EII) is a set of tools that integrates a federation of data sources as well as applications. EII provides a single point of access to all the data in the enterprise, whatever its format, with metadata to describe all of it. The better-known products in this space include BEA's Liquid Data and IBM's DB2 Information Integrator. Among the newer companies are Attunity, Avaki, Composite, and MetaMatrix. Composite has an alliance with Cognos to integrate business intelligence software, and its product Composite Information Server 3.0 starts at a price of $100,000. Avaki's first product, Avaki 6.0, was launched at $50,000, while the total cost of deployment averages $175,000 to $250,000.

These systems have two important components: they need a data model to aid the conversion of data from one source to another, and they provide graphical tools that show the configuration of the network of applications and data sources, along with a directory of terms for the access methods and fields in the data sources.

Among the key players are start-ups such as MetaMatrix, which has partnerships with Business Objects and Hyperion, while Composite Software has alliances with Informatica and Cognos. Both of these vendors come from a background in relational databases, and their products offer SQL. Another category of vendors, such as Ipedo, provides XML-based query techniques. Composite Software enables the aggregation of data into a portal view, while any manipulation of this data has to be done manually by the user.

A much-desired integration method is to orchestrate business processes in order to enhance the ability to control the levers that help a company respond quickly to changes in the business environment. The ability to manipulate business processes puts business executives in control of IT, and they can direct enterprise resources independent of the IT department. The need to automate business processes had been alluded to by vendors in the content management and workflow management spaces as well as in the EAI space.
However, the management of business processes has so far been piecemeal and has not progressed to the point where all of them can be managed from a single platform of their own, independent of other systems. A single platform for business process management provides the means to adapt business processes to changing requirements, rather than leaving them set in stone inside an application. One of the key barriers to configuring a series of business processes is that they have always been embedded in applications. Once they are decoupled from applications, business processes can be broken into components, reused for a variety of tasks, and connected to complete the job at hand. The Business Process Modeling Language (BPML) provides the means to model the path a business process follows, while the Business Process Execution Language (BPEL) plays a complementary role at execution time, managing the flow of the processes. The conceptual bedrock of independent business process management is the Pi Calculus, which provides a method for taking processes apart and recombining them for another purpose. Just as properties define an object, Pi Calculus acts like metadata that spells out the tasks an individual unit of a business process can complete and the roles it expects related processes to play. In such a world, business processes are akin to packets in a network, which can be made to follow different routes depending on the addresses they are directed to.
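The idea of processes decoupled from applications and re-routed like packets can be illustrated with a deliberately minimal sketch: each step is a reusable unit, and a "flow" is just data describing the route, so the same steps can be recombined without touching any application code. Step names and the two routes are invented for illustration.

```python
# Minimal, assumed-name sketch of business-process steps decoupled from any
# application: each step is a self-contained unit operating on a payload.

def check_credit(order):
    order["credit_ok"] = order["amount"] < 10_000
    return order

def reserve_stock(order):
    # Default to True so the step also works in flows without a credit check.
    order["reserved"] = order.get("credit_ok", True)
    return order

def send_invoice(order):
    order["invoiced"] = order.get("reserved", False)
    return order

STEPS = {"check_credit": check_credit,
         "reserve_stock": reserve_stock,
         "send_invoice": send_invoice}

def run_flow(route, payload):
    """Execute steps in the order given by the route, like a BPEL sequence."""
    for step in route:
        payload = STEPS[step](payload)
    return payload

# The same reusable steps, routed two different ways:
order_to_cash = ["check_credit", "reserve_stock", "send_invoice"]
prepaid = ["reserve_stock", "send_invoice"]  # hypothetical variant, no credit check

result = run_flow(order_to_cash, {"amount": 2500})
```

Changing the business process here means editing a list, not an application, which is the manageability a dedicated process platform aims to deliver.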
Intalio is one of the pioneers in the design of platforms for the management of business processes in their own right rather than as a component of an application. Other notable products in the same space are Microsoft's BizTalk Server and Holosofx, which was acquired by IBM, renamed Business Integration Modeler and incorporated into the WebSphere platform.

Siebel's Universal Application Network (UAN) creates a process-centric environment for the integration of applications and data in an enterprise. At the heart of this strategy is a library of business processes, such as quote-to-cash, campaign-to-lead and order-to-pay, which can be used to compose solutions. An overarching service-oriented architecture enables other vendors to tie their applications into the overall solution. UAN incorporates vendor-neutral interfaces based on SOAP (Simple Object Access Protocol), WSDL (Web Services Description Language) and XML (Extensible Markup Language) to create an environment in which a diverse range of vendors can plug in their applications. UAN uses the syntax of the BPEL4WS (Business Process Execution Language for Web Services) standard to define all business processes, which means individual vendors can hook in their own servers.

Lately, another variety of data integration software has appeared, for interlinking data sources across a grid. This kind of software, such as the Avaki Data Grid, spans a wide area network and allows access to resources available across enterprises without the cumbersome process of using multiple passwords to reach them. Data access is made possible by a universal directory that provides the path to each data source. With just this one access point, users are able to extract data from numerous sources and have it delivered to specific users.

Some of the key players in the grid computing industry are HP with its Adaptive Enterprise, IBM with On Demand, Oracle with 10g, and Sun with N1.
IBM's On Demand program uses its WebSphere and Tivoli products for policy-based management. HP has a similar offering, the OpenView platform, for the management of the grid, and integrates Talking Blocks Web services technology into it.

The key advantage of grids is the possibility of lowering latencies: a cluster of servers and storage devices is not clogged when traffic spikes unexpectedly, as tends to happen when numerous applications have to be operated simultaneously. An array of storage and servers spreads the load over several devices, which are also better utilized because they are not dedicated to specific applications. Charles Schwab, for example, was able to lower query response times from four minutes to as little as fifteen seconds.

One of the earliest applications of grid computing is the case of Hewitt Associates, a global HR company, which has deployed an IBM WebSphere-based grid to run its pension-calculation software.

Corporate radars

The payoff of integrating an enterprise's information assets is the ability to monitor any sign of a threat or an opportunity so that the enterprise can take timely action. Business Activity Monitoring servers play the role of radar: they spot exceptional events, such as missed schedules in transportation, and then analyze the impact of those events on related activities, for example by communicating with a truck that could use its spare capacity to take on the load the assigned truck missed. Business Activity Monitoring involves a series of related activities: observing business processes, comparing them with metrics, analyzing and visualizing the implications, and communicating for corrective action.

One application of real-time activity monitoring is the case of Brocade Communications, which needed to keep track of the performance of its contract manufacturers to inform its outsourcing decisions. It acquired a business activity monitoring tool to be able to do this in real time.

Among the leading players in the industry is Informatica, which has incorporated its Business Activity Platform, developed jointly with webMethods, into PowerCenter RT, its integration platform. webMethods' application integration capabilities and Informatica's information integration have been combined, though they offer, at best, lightweight business process management capabilities. Ascential Software's DataStage, now integrated as IBM WebSphere DataStage, has a Real Time Integration (RTI) Services component.

There are also companies from the business process domain, such as Microsoft, whose BizTalk Server 2004 has added a Business Activity Monitoring engine. BizTalk Server 2004 allows users of Microsoft Office 2003 to monitor business processes from the desktop. Middleware specialist TIBCO offers BusinessFactor, the technology it acquired from Praja. Celequest is among the more prominent pure-play companies in the domain with its ActivityServer suite, which can stream data from operational systems and compare metrics against business rules to determine whether an alert needs to be sent.
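The observe-compare-alert loop at the core of Business Activity Monitoring can be sketched in a few lines. The rules, metric names and thresholds below are illustrative assumptions, not any product's configuration.

```python
# Hedged sketch of the BAM loop: observe events, compare each metric against
# a business rule, and raise an alert when a rule is breached.

RULES = [
    # (metric, breach predicate, alert message) -- all illustrative
    ("delay_minutes", lambda v: v > 30, "shipment running late"),
    ("fill_rate", lambda v: v < 0.9, "truck under-utilized"),
]

def monitor(events):
    """Yield an (event id, message) alert for every rule an event breaches."""
    for event in events:
        for metric, breached, message in RULES:
            if metric in event and breached(event[metric]):
                yield event["id"], message

events = [
    {"id": "TRK-17", "delay_minutes": 45},                    # missed schedule
    {"id": "TRK-22", "delay_minutes": 5, "fill_rate": 0.6},   # spare capacity
]

alerts = list(monitor(events))
```

A real BAM server would evaluate such rules continuously over streamed operational data and route each alert to the person or system able to act on it, but the comparison of metrics against rules is the same.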
Among the database companies, Teradata, with its Active Data Warehousing product, is focused on business activity monitoring.

A nose for bad data

In its early stages of growth, the data quality industry was largely populated by independent players. In more recent years, the independents have been bought up by the large business intelligence companies. Prominent examples are the acquisition of DataFlux by SAS and of Vality by Ascential, which was in turn bought by IBM. FirstLogic, an innovator in the space, was bought by Pitney Bowes. Group 1, a data quality vendor, took a different course and acquired an ETL vendor, Sagent, before Pitney Bowes purchased it. The process of extracting, transforming and loading is expensive, and companies hope to lower their costs by merging the associated routines of validation, transformation, filtering and standardization. These processes are an optimal way of improving data quality when companies want to do data mining in a data warehouse environment, while much more needs to be achieved in a real-time environment.

An example of the functionality available with products that offer both data consolidation and data quality services is IBM's WebSphere Data Integration Suite. Its ProfileStage component automates the matching of data structures and data formats from different databases so that the data from the source and the target are consistent. Similarly, the QualityStage component standardizes data for individual entities, such as a customer, and ensures that disparate conventions for storing data do not introduce inconsistencies. The data quality components are provided with DataStage, an ETL tool.

When data is updated in a real-time environment, using technologies such as message queues, data quality functions need to be executed early on, in the transactional databases or by other means such as metadata. The automation of data quality functions presupposes a comprehensive solution that provides universal definitions of data and a means to convert from the definition used by one application to another, removing duplicates and correcting errors in addresses, names and the like. Master data management is a means by which a database of metadata is created, along with rules for converting from one format to another.

A key problem with current methodologies is that data cleansing is done after the data has already been extracted, so the source of an error is not detected. When data reconciliation takes place with the help of master data management systems, the source of the error is also identified.

It is now possible to buy rudimentary master data management products from a variety of vendors. Among the platform vendors, the leaders are IBM and HP; PeopleSoft and SAP are the players in the applications category; Ascential and Informatica lead among the integration vendors, alongside the system integrators; and Hyperion represents the BI category.

An example of the implementation of a master data management system is the case of Unilever, which needed a centralized way of managing its data to pursue a global policy for brand management and supply chain management.
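The two data-quality steps described above, standardizing disparate conventions for the same entity and then using the standard form to catch duplicates, can be sketched as follows. This is an illustrative toy, not QualityStage or any vendor's matching engine, and the abbreviation table and sample records are invented.

```python
# Illustrative sketch of data standardization followed by duplicate matching.

ABBREVIATIONS = {"st.": "street", "st": "street", "rd.": "road", "rd": "road"}

def standardize(record):
    """Trim, normalize case, and expand common address abbreviations so that
    different conventions for the same entity converge on one form."""
    words = record["address"].lower().strip().split()
    words = [ABBREVIATIONS.get(w, w) for w in words]
    return {"name": record["name"].strip().title(),
            "address": " ".join(words)}

def find_duplicates(records):
    """Report pairs of records whose standardized forms collide."""
    seen, dupes = {}, []
    for r in records:
        s = standardize(r)
        key = (s["name"], s["address"])
        if key in seen:
            dupes.append((seen[key], r))
        else:
            seen[key] = r
    return dupes

customers = [
    {"name": "acme corp", "address": "12 Main St."},
    {"name": "Acme Corp ", "address": "12 main street"},
]

dupes = find_duplicates(customers)
```

The point made in the text holds here in miniature: if the raw records are cleansed only after extraction, the collision is found but its origin is lost; reconciling against a master definition keeps the link back to the offending source record.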
Historically, Unilever followed a decentralized policy for the management of its subsidiaries, which meant that its IT systems were fragmented. It has now implemented a master data repository that helps to align its transactional systems for a consistent view of its data.

Sensors everywhere

Automation of data collection is one means of collecting data free from errors. Sensors also help companies gain visibility into their environment and expand the universe of problems they are able to address. Data volumes grow with the use of sensors and present new challenges in data processing. Increasingly, RFID tags and other types of sensors are available for commercial application. For mass adoption, on the other hand, the cost of RFID tags will have to fall substantially before they are accepted in applications such as supply chain management.

General Electric has been one of the early pioneers in the use of sensors for analytical purposes. Its sensors gather data on the state of health of its jet engines. The data is collected in one place, where analytical software looks for signs of trouble and provides early warning to customers. The military has often been a leader in the adoption of new technologies, and its willingness to accept RFID technology is a pointer to wider diffusion in industry at large. Cost factors are less binding where high-value activities, such as the manufacture of aircraft engines, are involved, or in the military, where security is an overriding consideration.

The use of RFID in mass applications such as supply chain management will bring it closer to ubiquity. In the early days, the industry had to improve the technology, particularly read reliability, so that it would work in commercial environments. Wal-Mart is working with a hundred of its partners to expand the use of RFID for supply chain management. NEC of Japan has reported early successes with RFID in its PC assembly plants. Unlike bar codes, RFID tags do not require manual scanning; as a result, productivity at NEC's Yonezawa plant increased by 10%, in addition to the benefits of just-in-time replenishment of inventories.

It will take middleware, or some other form of integration technology, for companies to be able to receive information from a variety of sources and funnel it to a central database. One of the significant initiatives to popularize the use of RFID is the partnership among Oracle, Intel and Xpaseo to supply tools for the management of information received from sensors. These tools will mediate the flow of information from sensors and integrate with existing products from Oracle and Intel, including Oracle Application Server 10g, Oracle Database 10g and the Oracle E-Business Suite. In addition, the partnership will extend the scope of pervasive computing by linking data from other devices, such as handhelds, PCs and mobile devices, using Intel communication and server platforms.

Other companies with RFID offerings include SAP, IBM and Sun.
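The middleware role described above can be sketched in a hypothetical, much-reduced form: collect raw reads from several RFID readers, filter out the repeated pings a tag generates while it sits in a reader's range, and funnel clean events to one central store. The tag IDs, reader names and filtering rule are illustrative assumptions.

```python
# Hypothetical sketch of RFID middleware: dedupe raw reads and funnel the
# resulting events to a central collection point.

from collections import OrderedDict

def funnel(raw_reads):
    """Keep the first read per (tag, reader) pair; a tag sitting in range
    produces many identical reads, and only one event should reach the
    central database."""
    central = OrderedDict()
    for read in raw_reads:
        key = (read["tag"], read["reader"])
        if key not in central:
            central[key] = read
    return list(central.values())

raw_reads = [
    {"tag": "EPC-001", "reader": "dock-3", "ts": 1},
    {"tag": "EPC-001", "reader": "dock-3", "ts": 2},   # same tag, still in range
    {"tag": "EPC-002", "reader": "dock-3", "ts": 2},
    {"tag": "EPC-001", "reader": "shelf-9", "ts": 7},  # same tag, new location
]

events = funnel(raw_reads)
```

Production middleware layers add buffering, reader management and delivery into enterprise applications, but this read-smoothing step is what keeps sensor volume from overwhelming the central database.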
There are also smaller specialist companies that offer middleware to integrate RFID technology with the rest of the enterprise software stack.

THE BIG PICTURE

The real-time enterprise has to put several moving parts in place before the entire vehicle for rapid response takes shape. Vendors are able to offer most of the components of an adaptive-enterprise solution and are acquiring companies to consolidate their products into a complete package. Some remarkable breakthroughs have already been achieved, especially in the analysis of data. Real-time interpretation of data, aided by machine learning techniques and predictive analytics, has equipped enterprises to grasp the dimensions of the problems they encounter and to anticipate outcomes and consequences. The imminent prospect of automated data gathering and business process design will trigger as much excitement as dashboards have in the recent past. Larger vendors such as Oracle, IBM, Microsoft, SAS, SAP, Business Objects and Hyperion are emerging stronger than before and are best equipped to become the leaders. In all probability, infrastructure providers from among the former ERP companies will have the strongest foundations to supply the products and the consulting skills needed to win over customers in the future.