Data Studio Update Part 7

Find and Tune Bad Queries from the Package Cache

Dean Compher

30 October 2013

 

 

In this article I will show you how to find poorly performing queries in the dynamic package cache and tune them using the same tools that I described in my Data Studio Part 6 article, such as visual explain.  There are times when you will need to go looking for problem queries: the system seems to have slowed down, users are complaining about query performance, or any number of other reasons.  The easiest way to get started is to capture the dynamic package cache in databases that process dynamic SQL.  Data Studio provides a way to gather the current contents of the dynamic package cache and sort it by various metrics such as average or total CPU time, average or total elapsed time, number of executions and several other important measures.  There will typically be a great deal of SQL in the cache, with most of it causing very little overhead, and you don’t want to waste your time on queries that consume few resources and are infrequently executed.

 

 

Getting Started

 

Before you can start tuning any queries in Data Studio, someone must have created the explain tables on the database where you are tuning the query, and those explain tables must be for the version of DB2 that is currently running.  Sometimes explain tables will have been created prior to upgrading to the current version and will be out of date.  To verify that everything is in place for query tuning, please follow the steps on the Preparing for Tuning page in the Information Center.  If anything is not ready, follow the steps on the Configure the Database page of the Information Center, which gives quick instructions, or follow the steps in chapter 7 of the Getting Started with Data Studio for DB2 e-Book, which gives step-by-step instructions for configuring the database.  Since you may not be allowed to create the explain tables in production, I describe how to make your development environment look like production to the optimizer in the “Tuning the Queries in Test” section near the bottom of this article.  You can capture the package cache queries from production without having any explain tables there and then tune those queries against your test database. 
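 

If the explain tables turn out to be missing or left over from an older release, one way to create them from a DB2 command line is the SYSINSTALLOBJECTS procedure.  This is only a sketch, assuming you have the authority to create the tables and that the defaults (SYSTOOLS schema, default table space) are acceptable; the GSDBTEST database name is just the example database used later in this article.

   -- Connect to the database you will tune against (example database name)
   CONNECT TO GSDBTEST;

   -- Create the explain tables for the current DB2 version ('C' = create).
   -- NULL table space and schema take the defaults (SYSTOOLS schema).
   CALL SYSPROC.SYSINSTALLOBJECTS('EXPLAIN', 'C',
        CAST(NULL AS VARCHAR(128)), CAST(NULL AS VARCHAR(128)));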

 

Gather Queries from the Package Cache

 

To find the worst performing queries in the package cache, start in the Administration Explorer, right-click on the database that interests you and choose “Start Tuning” from the menu as shown in Figure 1.  This will bring up a view in a tab called “QTProjectN” where N is a sequential number.  If the “Capture” tab on the left side is not selected, then click on it.  Notice that the QTProjectN tab’s window was maximized to take up the entire Data Studio window.  If you need to look at something else in Data Studio, just double-click on the tab and you will see the other views.  To maximize it again, just double-click the tab again.  Please note that I am starting in the Database Administration perspective to get this set of views.  See the first article in this series if you need more information about perspectives. 

 

Figure 1.  Begin Looking for Bad SQL. 

 

 

Once you are in the capture window, make sure that you have a green check next to your database connection as shown in Figure 2, where the GSDBTEST database is circled in red.  Then click on the “Package Cache” link on the left side under the appropriate source.  If you do not have any filters created to narrow the amount of data returned from the package cache, you need to create one by clicking the “New…” button.  You will then see a box where you can add criteria for filtering.  In Figure 2 I have indicated that I only want queries that read at least one row.  You can then click Finish, or click Next through screens where you can set sort criteria and remove report columns that you do not want.  While I wanted to discuss capturing queries from the package cache, it is worth noting that you could also capture them from the explain tables, stored procedures, or even from event monitor tables if you are using any event monitors. 
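 

If you ever want to see roughly the same information outside of Data Studio, the package cache can also be queried directly with the MON_GET_PKG_CACHE_STMT table function (DB2 9.7 and later).  The following is only a sketch of the “at least one row read” filter from Figure 2; the column list is my own choice, not necessarily what Data Studio uses internally.

   -- Dynamic statements in the package cache that have read at least one row
   SELECT NUM_EXECUTIONS,
          ROWS_READ,
          STMT_EXEC_TIME,
          TOTAL_CPU_TIME,
          SUBSTR(STMT_TEXT, 1, 100) AS STMT_TEXT
   FROM   TABLE(MON_GET_PKG_CACHE_STMT('D', NULL, NULL, -2)) AS T
   WHERE  ROWS_READ >= 1;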

 

Figure 2.  Begin Capture Process

 

 

 

 

  

 

Once you have created a filter, you can press the “Capture Now” button that had been greyed out before you created a filter.  This will display the contents of the package cache as shown in Figure 3.  This report is similar to what you could see in the old Activity Monitor in Control Center.  You can sort the queries by any of the columns in the report by clicking on them, so you can sort by STMT_EXEC_TIME, TOTAL_CPU_TIME, LOCK_WAIT_TIME and several other performance indicators to see which queries are taking the longest or using the most resources. 

 

Figure 3.  Report of Package Cache SQL Measurements

 

 

There are many more measurements for the captured SQL than are shown in Figure 3.  You can sort the list by any of them by clicking on that column heading.  So, for example, if you are running out of CPU resources on your database server, you may want to sort by TOTAL_CPU_TIME and see if there are any queries that are taking excessive CPU cycles.  Once you have determined which query you want to review, you can just right-click on it and then select “Run Single-Query Advisors and Tools on the Selected Statement” as shown in Figure 4.
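 

As a quick cross-check outside the GUI, the same “top CPU consumers” view can be approximated with the MON_GET_PKG_CACHE_STMT function shown earlier; this sketch simply orders the dynamic statements by total CPU time (reported in microseconds) and keeps the ten heaviest.

   -- Ten dynamic statements consuming the most total CPU time
   SELECT TOTAL_CPU_TIME,
          STMT_EXEC_TIME,
          NUM_EXECUTIONS,
          SUBSTR(STMT_TEXT, 1, 100) AS STMT_TEXT
   FROM   TABLE(MON_GET_PKG_CACHE_STMT('D', NULL, NULL, -2)) AS T
   ORDER  BY TOTAL_CPU_TIME DESC
   FETCH FIRST 10 ROWS ONLY;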

 

Figure 4.  Tune a Query from the Package Cache

 

 

 

Once you click the “Run Single-Query…” menu choice, you will see the window in Figure 5 below.  You are then ready to begin tuning the query as described starting with Figure 2 in my Data Studio Update Part 6 – Query Tuning for DB2 LUW and z/OS article.  To get back to the list of queries from the package cache, just click the “2. Capture” tab on the far left side of the window. 

 

Figure 5. Begin Tuning a Query

 

 

 

Tuning the Queries in Test

 

While you need to connect to production to see which queries in the package cache need to be tuned, you may not want to do the tuning against production for any number of reasons.  However, your test database will likely contain much less data than production, so the query plans that the DB2 optimizer generates there will likely be different.  There is a solution to this: you can copy the statistics that the optimizer uses from production to development.  Once the statistics are in development, DB2 will generate similar plans there.  In DB2 LUW you use the db2look command to accomplish that task.  Scroll down to the “Mimicking statistics for tables” section of the Mimicking databases using db2look page of the Information Center. 
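 

As a rough sketch of that process, the -m (mimic) option of db2look writes UPDATE statements for the catalog statistics, which you can then apply to the test database.  The database names, schema name and output file below are examples only; see the Information Center page referenced above for the full set of options.

   # On the production server: extract the catalog statistics for one schema
   db2look -d PRODDB -z MYSCHEMA -m -o prod_stats.sql

   # On the test server: apply those statistics to the test database
   db2 connect to GSDBTEST
   db2 -tvf prod_stats.sql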

 

 

This is the seventh in a series of Data Studio articles that I am writing.  Earlier editions that may interest you include:

Data Studio Update Part 1 – Links for installing, summary of my favorite features

Data Studio Update Part 2: Perspectives, Views and Database Connections – Navigating Data Studio, connecting to your databases. 

Data Studio Update Part 3: View DB Objects and Attributes – View tables, columns, indexes and model diagram. 

Data Studio Update Part 4: Changing Database Objects – Generate changes that preserve data and objects. 

Data Studio Update Part 5: Query Building and Editing – Ways to create and edit queries in Data Studio. 

Data Studio Update Part 6: Query Tuning for DB2 LUW and z/OS – Tune individual Queries. 

 

 

***

 

There are many more options for capturing and tuning your SQL in Data Studio than I have shown in this article.  Please take a few minutes to click around and see what is available.  Please add any features that you really like to my Facebook Page or my db2Dean and Friends Community and help the extended community. 

 

 
