(Optim Performance Management Solution
Improve performance, reduce costs, speed development and agility)
Today, we’ll learn how the staff at one fictional organization, the Great Outdoors
Company, takes advantage of Optim integrated data management solutions for
performance management to help them work collaboratively to gain more value from
their business-critical applications. You will see how the team:
• Detects and corrects an emergent database problem
• Accelerates the performance and stability of an existing application without requiring code changes
• Builds performance into a new application right from the start, thereby reducing downstream risks and costs.
(Resolve problems before they affect the business)
(Optim Performance Manager Extended Edition)
It’s not been a good day at the Great Outdoors Company. The manager in charge of
customer service is upset because they are unable to meet their goals for average call
time. The customer service reps say that the application that retrieves the customer order
history is taking way too long. Managers are calling managers and it finally ends up in
the DBA’s lap to figure out what is going on. Eric takes on the challenge because he
knows that with the Extended Insight capability of Optim™ Performance Manager
Extended Edition, he will be able to narrow down the cause of the problem quickly.
Eric logs into the Web-based console provided by Optim Performance Manager, where
he can easily change the view of collected performance information to a specific time
period. In this case, he narrows his view to the last hour. The red flags on the Extended
Insight Analysis Dashboard confirm that critical thresholds have been exceeded. By
drilling down, he sees that it is specifically the order history report application that has
exceeded its critical threshold. With a quick glance at the end-to-end graph, Eric sees
that the response times in the last hour have exceeded both the critical and warning
thresholds that had previously been configured.
To narrow down where in the software stack the problem is occurring, Eric drills down on
the application to get the end-to-end details. The average end-to-end response time graph
indicates when this application was run during the last hour, and includes the average and
maximum end-to-end response times. The distribution graph shows that the majority of
the time was spent on the data server, compared to other layers such as the network and the
application. If this were a WebSphere® application, he could also see connection pool
information. From the top 3 SQL statements, Eric selects the statement with the
longest average data server time and drills into the detailed statement information. The
pie chart shows the distribution of response time for this statement, which shows that
most time is clearly spent in the data server.
Eric views the entire SQL statement, and then right from this query he launches into
Optim Query Tuner, with the selected SQL statement ready to be tuned.
(Optim Query Tuner for DB2 for Linux, UNIX, and Windows)
When choosing query tuning activities, Eric has the choice of running several advisors,
including statistics, query, access path, and index. He decides to run the Statistics
Advisor first, as outdated statistics are often a cause of poorly performing SQL. He also
specifies that he wants an access plan graph generated, and that he wants both advanced
query formatting and query annotation, which annotates the query with relevant statistics.
The Statistics Advisor gathers information from the catalog, comes up with
recommendations, and generates the utility control statements for collecting and updating
statistics. In this case, since the information is current, he does not need to run these
statements immediately, but if needed they can be saved for use in the regular
maintenance cycle or in the DB2 auto-runstats profile.
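The control statements the advisor generates are ordinary DB2 RUNSTATS commands; a minimal sketch, with a hypothetical table name standing in for the Great Outdoors schema:

```sql
-- Illustrative only: the schema and table name are hypothetical
RUNSTATS ON TABLE GOSALES.ORDER_HISTORY
  WITH DISTRIBUTION AND DETAILED INDEXES ALL
```

Statements of this form can be saved and folded into the regular maintenance cycle mentioned above.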
Eric examines the formatted and annotated query. Looking to the Access Plan Graph for
clues, he sees that there are two table scans occurring for the join between two customer
order tables. He can drill down into a node on the access plan graph to see more detail.
Eric thinks that perhaps the data growth for this application has led to the need to create
indexes for these tables. He decides to run the Index Advisor to validate his hypothesis.
The Query Tuner index advisor recommends candidate indexes. The recommendation details show
both the current indexes and the suggested new indexes. The estimated performance
improvement with the new indexes is included, as well as the columns and the estimated
disk space required. Optim Query Tuner provides the DDL to create those indexes,
which he can now run or save to a file for later execution after reviewing with his team.
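The generated DDL takes the familiar CREATE INDEX form; the index and column names below are illustrative, not the advisor's actual output:

```sql
-- Hypothetical example of advisor-style index DDL
CREATE INDEX GOSALES.IX_ORDHIST_CUST
  ON GOSALES.ORDER_HISTORY (CUSTOMER_ID, ORDER_DATE);
-- Collect statistics on the new index so the optimizer can use it
RUNSTATS ON TABLE GOSALES.ORDER_HISTORY AND INDEXES ALL
```

Saving the DDL to a file, as Eric does here, lets the team review it before it is run in production.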
Eric is able to get back to his colleagues quickly with a resolution and a plan to roll out
the new indexes, which should bring performance of the order history application back
into an acceptable range.
(Resolve problems quickly – summary)
Optim solutions for performance management can help DBAs identify, diagnose, solve,
and prevent database performance problems quickly, before they affect the business.
To help you identify problems early, the browser-based user interface lets you log on
from anywhere, dashboards provide visual cues of potential or current problems, and you can
have alerts sent to you.
Optim Performance Manager Extended Edition integrates its deep database performance
insight with the broad enterprise-wide insights provided by IBM Tivoli® transaction
monitoring. This powerful combination extends transaction response time monitoring
from the database to the complete end-to-end transaction path.
Problem diagnosis is guided from overview dashboards to more detailed diagnostic
dashboard views. Extended insights into Java™ and CLI applications help you pinpoint
problems quickly. Predefined application views are available for key applications.
To solve problems caused by poorly performing SQL, you can launch into Optim Query
Tuner directly from Optim Performance Manager to access expert advice and
recommendations for query analysis and tuning.
Finally, to prevent problems from occurring, Optim Performance Manager Extended
Edition provides integrated configuration tooling for DB2 workload manager, giving you
configuration and monitoring capabilities in a single tool. To help with trend analysis,
such as is required for capacity planning, you can use interactive reports with detailed performance data.
(Improve performance, manageability, and security of
new or existing applications)
(OPM Extended Edition and pureQuery Runtime)
Now that our Great Outdoors DBA has solved the immediate problem, let’s see how he
can proactively use Optim solutions to accelerate and stabilize the performance of an existing application.
Looking at the Optim Performance Manager Overview Dashboard, Eric notices that a
critical threshold has been crossed for the package cache hit ratio, which if left unchecked
could cause overall system performance problems.
On average applications are only finding about 72% of the required resources in the
cache. This means that a significant number of SQL statements are being compiled when
applications request them instead of being reused from the cache. Before immediately
concluding that a larger cache is the only way to deal with this problem, he decides to
investigate further and see if he can determine why this is happening.
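The ratio Eric is looking at can also be computed directly from DB2's snapshot monitor elements; a sketch, assuming the standard SYSIBMADM.SNAPDB administrative view:

```sql
-- Hit ratio = (lookups - inserts) / lookups; an insert means a statement
-- was not found in the package cache and had to be compiled
SELECT PKG_CACHE_LOOKUPS,
       PKG_CACHE_INSERTS,
       DEC((PKG_CACHE_LOOKUPS - PKG_CACHE_INSERTS) * 100.0
           / NULLIF(PKG_CACHE_LOOKUPS, 0), 5, 2) AS PKG_CACHE_HIT_RATIO
FROM SYSIBMADM.SNAPDB
```

A ratio of about 72, as in Eric's case, means more than a quarter of lookups are forcing a fresh compile.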
Looking at the package cache hit ratio graph he sees an obvious dip that occurred around
1:17 PM. His next step is to determine the specific applications that were connected to
the monitored database during this timeframe. Optim Performance Manager has a
powerful repository of performance metrics that are presented not only in dashboards but
also in a set of predefined and interactive reports. Eric generates a Database Connection
report for that specific time frame to narrow down the problematic application. Drilling
into the suspected periodic order application reveals many more details, including a large
amount of dynamic SQL. Ah ha, here is the package hit ratio, which is showing a
miserable 2%. For some reason, this particular application is not able to make efficient
use of the dynamic statement cache.
Eric opens the Extended Insight dashboard and drills down into the application details.
The SQL Statements tab makes it clear that there are many similar statements using
literals for values such as customer code and status. From DB2’s perspective, even
though they are so similar, each of these statements is unique and must be compiled separately.
Eric knows that if the literals could be replaced with parameter markers in the
application, it would eliminate repeated preparation time and would improve both the cache hit
ratio and performance, not only for this application but also for other applications running
at the same time.
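The effect Eric is counting on can be sketched in plain Java: a toy statement cache keyed by SQL text shows why literals defeat reuse while a parameter marker yields one reusable entry. The table and column names are illustrative, not from the Great Outdoors schema.

```java
import java.util.HashSet;
import java.util.Set;

// Toy model of a dynamic statement cache: entries are keyed by the exact
// SQL text, so embedded literal values make every statement look unique.
public class StatementCacheDemo {
    // Returns how many distinct cache entries (i.e., compilations)
    // 100 customer-code lookups produce.
    static int compileCount(boolean useParameterMarker) {
        Set<String> cache = new HashSet<>();
        for (int code = 1; code <= 100; code++) {
            String sql = useParameterMarker
                ? "SELECT * FROM ORDERS WHERE CUST_CODE = ?"        // one entry, reused
                : "SELECT * FROM ORDERS WHERE CUST_CODE = " + code; // 100 distinct entries
            cache.add(sql); // a cache miss here would trigger a prepare/compile
        }
        return cache.size();
    }

    public static void main(String[] args) {
        System.out.println("compilations with literals:          " + compileCount(false)); // 100
        System.out.println("compilations with parameter markers: " + compileCount(true));  // 1
    }
}
```

In a real application the same reuse is achieved with `java.sql.PreparedStatement` and bound parameters rather than concatenated literals.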
Rather than go back to the application team to recode this application, Eric decides to see
what he can do to optimize and stabilize performance by using the client optimization
capability available with pureQuery.
The client optimization process consists simply of:
• Capturing the SQL statements into a metadata file while the existing application is running
• Configuring the capture file and, optionally, binding the statements into a package for static execution
• Optionally, running the application in static mode.
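Client optimization is driven by driver properties rather than code changes; a minimal sketch of a pdq.properties file, assuming the pdq.* property names of the pureQuery Runtime (file names are illustrative, and the exact property set should be verified against your release):

```properties
# Capture executed SQL into a pureQueryXml metadata file
pdq.captureMode=ON
pdq.pureQueryXml=orderapp.pdqxml
# Leave execution dynamic while capturing; switch to STATIC after binding
pdq.executionMode=DYNAMIC
```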
Now, since Eric has already captured the SQL using the option for converting literals,
from the SQL Outline view he can see that the literals have been replaced with parameter
markers, which the database treats as a single statement that needs to be
compiled only once. If he wanted, Eric could run the application dynamically with just
this optimization in place.
However, because Eric strongly believes that static execution will bring the most benefit,
he takes the next step to bind the captured SQL into database packages.
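The bind step is typically run from the command line; a hedged sketch, assuming the pureQuery StaticBinder tool (the class name and options are from IBM's pureQuery documentation but may vary by release, and the host, database, and file names are illustrative):

```
# Bind the captured SQL into DB2 packages for static execution
java com.ibm.pdq.tools.StaticBinder \
    -url jdbc:db2://dbhost:50000/GOSALES \
    -username eric \
    -pureQueryXml orderapp.pdqxml
```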
Later, after deploying the new package, Eric uses the Optim Performance Manager
database connection report once again to analyze the application behavior and sees that
there are no longer any dynamically executing SQL statements, and that the package hit
ratio is 100%. Simply by optimizing this one application, he has been able to dramatically
increase the efficiency of the cache without needing to take additional resources from
other parts of the system.
(OPM and pureQuery client optimization summary)
By using pureQuery and client optimization, DBAs can proactively optimize and secure
database applications, helping to reduce costs for growing applications. Internal IBM
testing has shown that pureQuery static execution can dramatically increase database
transaction throughput rates to nearly double that of dynamically executing JDBC.
Increasing efficiency allows more work to be loaded onto existing hardware.
And, by using static execution, the improved performance can be locked in, securing
quality of service. DBAs can give database packages meaningful names, making it easier
to monitor and correlate SQL with applications.
Finally, security and auditability are improved. Captured SQL can be reviewed for audit
purposes, and the security risk of dynamic SQL injection is reduced by using static SQL.
And all of these benefits are possible without changing any application code!
(Build performance into applications – right from the start)
The Great Outdoors Company is implementing a new application to support their
growing business partner relationships. The development and DBA staff working on the
new application have decided to collaborate on the project, using Optim solutions to create
applications that have database best practices and performance built in from the start.
Optim solutions can help them collaborate on performance considerations earlier in the
cycle, where problems are cheaper and less disruptive to find and correct, thereby reducing the
overall risk of the project while enhancing developer productivity and agility.
(Optim Development Studio)
Let’s see how the premier database development environment provided by Optim
Development Studio and pureQuery gives database developers and DBAs unique insight
into performance. Simply by running the application, actual performance data
can be collected and viewed. Here, Roslyn can see performance data and quickly identify the
SQL statements that run most frequently or have the longest elapsed times. The table
view supports sorting, so she can easily locate data access hot spots in large applications.
Roslyn saves the current performance data as a baseline. She now wants to do some basic analysis.
First, to get context on what seems to be a possible problem area, Roslyn clicks on the
SQL statement and is taken directly to the corresponding place in the Java source code. It’s
also easy to find out which tables are used in the query, so developers and DBAs can
understand the impact of database changes on the application, or investigate the
characteristics of a table, including data distribution. As shown here, the distribution data
can give a DBA an indication that skewed data could be affecting query performance.
(Optim Query Tuner)
Next, Roslyn tunes the query by using the Optim Query Tuner seamlessly integrated with
the Development Studio. She wants to get some initial advice on indexes that might be
helpful to improve performance of that query in her application. She also has the option
of running an HTML query summary report which includes information about the access
path of the query and recommended actions which she can easily share with her DBA.
The Optim Query Tuner index advisor recommends candidate indexes. A developer or DBA can create
the indexes now, or save the DDL to be reviewed and executed by the DBA.
After the indexes are created and Roslyn has done additional testing dynamically, she and
Eric decide to bind the application so that it will use static execution, which enables DB2
to lock in the access paths that use the new indexes.
They compare the new performance data with the run that Roslyn did before the indexes
were created and before binding statically. They are both pleased to see a dramatic
performance improvement both in the query now using indexes as well as with other
queries that are taking advantage of static execution.
(Summary: Build performance into applications - right from the start)
Optim Development Studio and Query Tuner together enable a powerhouse Java
development environment that moves database performance front and center with such
capabilities as easy identification of SQL hot spots and a seamless switch to static SQL.
DBAs and developers can collaborate effectively on performance early in the cycle with
graphs and reports that efficiently communicate information a DBA needs to help with
query tuning. Finally, Optim Development Studio can improve productivity and reduce
risk through database-centric tooling that helps prevent coding errors and makes dependency
analysis a snap. Optim helps developers create Java applications that even DBAs will love.
We’ve seen how the Performance Management Solution by Optim has helped the Great
Outdoors Company reduce costs, reduce risk, and improve agility by enabling the staff to
work collaboratively to:
• Find and correct problems more efficiently and proactively,
• Make existing applications faster and more stable,
• Build new applications that are streamlined and optimized for database access.
To find out more, visit us on the web and join the community.