
Cary Millsap from Method-R Came to Town in November

From my internal blog: Last changed: Nov 06, 2008 11:01 by Stephen Feldman

For the last few weeks I’ve been planning a training visit from one of my favorite authors and Oracle performance experts, Cary Millsap. If you are an avid reader of my blog, there’s a solid chance that you have read about Cary and/or his former company, Hotsos. Cary is coming on-site for about two and a half days to work with our team on some of our Oracle performance challenges. I wanted to use this blog to cover the agenda, as well as a few details about our objectives for the engagement. Because of the R9 schedule, the team will have limited interaction with Cary; I will cover that in full detail below.

Cary is on-site Tuesday through part of Thursday. Most likely David Zhang, Patrick Kee and I will be working most directly with Cary. I’ve asked him to do an afternoon session with the North American team, which we will record, and I will also ask him to do some hands-on benchmarking with the team on Wednesday.

There are four core topics I’ve asked Cary to cover with the group:

  • Introduction to Method-R: How to make practical use of Method-R with the data we have
  • Oracle Wait Events, 10046 Tracing and the P5 Profiler (see the trace-enabling sketch after this list)
    • Sub-Topic: Other Tools to Use Besides P5 for Reading 10046 Traces (TKPROF or Other Tools)
  • Lightweight Tools for Instrumenting Oracle Performance
  • Treating Every SQL Statement Like a User Transaction
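
Since 10046 tracing anchors a couple of these topics, here is a minimal sketch of how we would turn it on for a test session (the trace file identifier is a hypothetical example, not one of ours):

    -- Enable extended SQL trace (event 10046) at level 12 (waits + binds)
    -- for the current session; the identifier makes the trace file easy to find.
    ALTER SESSION SET tracefile_identifier = 'bb_r9_test';
    ALTER SESSION SET EVENTS '10046 trace name context forever, level 12';

    -- ... run the workload under test ...

    ALTER SESSION SET EVENTS '10046 trace name context off';

    -- The raw trace can then be summarized with TKPROF, e.g.:
    --   tkprof <instance>_ora_<spid>_bb_r9_test.trc out.prf sort=exeela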

Besides these four topics, there are other aspects to our agenda that I would like to cover:

  • Introduce Cary to our Performance Engineering Practice
    • SPE Overview
    • SPT Overview
      • How We Benchmark: Calibration, Abandonment and PARs
      • The Role of Our Data Model
      • How We Instrument and Measure
        • Tools in the Lab
        • Galileo
  • Review the Bb Architecture
    • Take a spin through the application
    • Review the components of the architecture: Application Layer and then Database Layer
  • Review the Performance Lab Deployment Architecture
    • How we configure Oracle
      • Our Base Configuration and Initialization Parameters
      • Physical and Logical Architecture
  • Execute Multiple Benchmarks
    • Benchmark 1: PVT Scaling Exercise
      • Goal: Instrument and Determine CLP-Datagen Data Inefficiencies
    • Benchmark 2: EDB Calibration Exercise
      • Goal: Use Method-R to determine where to focus our optimization efforts
      • Goal: Lightweight instrumentation prior to profiling (digging for the right information; see the instrumentation sketch after this list)
      • Goal: End-to-End profiling (Most likely a focus on Oracle Profiling)
  • Open Question Time with Cary about Oracle Performance
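
For the lightweight-instrumentation goal above, the kind of thing I have in mind looks like the sketch below: a business task tags itself with DBMS_APPLICATION_INFO so that trace data and wait data can be attributed back to it. The module and action names here are hypothetical.

    BEGIN
      -- Tag the session so this work shows up under a business-meaningful
      -- module/action in 10046 traces, V$SESSION and ASH data.
      DBMS_APPLICATION_INFO.SET_MODULE(
        module_name => 'gradebook',                -- hypothetical module
        action_name => 'calculate_final_grades');  -- hypothetical action

      -- ... the business task runs here ...

      -- Clear the tags when the task completes.
      DBMS_APPLICATION_INFO.SET_MODULE(NULL, NULL);
    END;
    /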

Hotsos Symposium 2008 Day Three

Today was my final day at the conference. Well, actually not the day of this posting, but March 5th; since I was traveling last night, I was not able to get a posting into my blog. I had a great time at the conference. It was quite possibly the best conference I’ve ever attended or participated in. The Hotsos community is simply the best. I’m going to do a quick recap of day three below, covering only the sessions that I attended.

Measure Once…Cut Twice
Cary Millsap gave a really interesting presentation on software development practices, built on an analogy to carpentry. Apparently Cary is an accomplished furniture builder, and he sees a lot of parallels between building furniture and building software. What’s interesting is that the usual saying is the opposite: “Measure Twice…Cut Once.” The idea behind that phrase is that it’s better to be cautious up front in order to reduce risk later on. As Cary discusses in his paper, things change over time, and all of that upfront planning doesn’t necessarily save you anything in the end. You might find that it actually ends up hurting you.

For any of our agile programmers…this was a great read…

Better Visualization Tools
When Neil Gunther walks into a room, more often than not he’s the smartest person in it. The problem with Neil Gunther is that he knows he is smarter than everyone else and wants you to know it. The premise of this presentation was better visualization of performance data. All of his examples can be seen here. In his presentation he did call out Apdex, of which I am a participating member. He apparently gave a presentation at CMG last year on Apdex and visualization. Gunther is clearly more accomplished than I am. So who am I in the scheme of things? A possible buyer of his books…well, never again will I buy any of his books.

Singing SQL: Natural Data Clustering
Dan Tow gave my absolute favorite presentation on SQL performance. He’s the author of SQL Tuning from O’Reilly. He spent a fair amount of time talking about why Oracle prefers nested loops over hash joins, and why it can be really important to identify when it’s appropriate to use a hash join versus a nested loop. Take a look at the article Dan presented.
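
As a hedged illustration of the trade-off (using the classic EMP/DEPT demo schema, not ours), you can force each join method with hints and compare the plans:

    -- Nested loops: tends to win when the driving row source is small and the
    -- matching rows are well clustered around the join key.
    SELECT /*+ USE_NL(d e) */ e.ename, d.dname
      FROM dept d JOIN emp e ON e.deptno = d.deptno
     WHERE d.loc = 'DALLAS';

    -- Hash join: tends to win for large, unselective joins that touch a big
    -- fraction of both tables.
    SELECT /*+ USE_HASH(d e) */ e.ename, d.dname
      FROM dept d JOIN emp e ON e.deptno = d.deptno;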

One thing about Dan…we need to bring this guy here for a seminar. I did touch base with him and he seemed quite interested.

Average Active Sessions
The last presentation I went to was by Kyle Hailey. I really enjoyed hearing from Kyle. He’s pretty straightforward and just makes a lot of sense. His presentation was on the Average Active Sessions metric, which is computed from the Active Session History data available in Oracle. He’s written a number of tools that help pull that data out of Oracle. Feel free to look at his presentation on wait events as well.
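
For a flavor of the metric, here is a minimal sketch (assuming a version and license that expose v$active_session_history) that approximates average active sessions per minute; ASH samples each active session roughly once per second, so dividing the sample count by 60 per minute gives the average:

    SELECT TRUNC(sample_time, 'MI')  AS minute,
           ROUND(COUNT(*) / 60, 2)   AS avg_active_sessions
      FROM v$active_session_history
     WHERE sample_time > SYSDATE - 1/24   -- last hour
     GROUP BY TRUNC(sample_time, 'MI')
     ORDER BY minute;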

Hotsos Symposium 2008 Day One

Day One is in the books for the 2008 Hotsos Symposium. It was pretty crazy getting here. I arrived around midnight last night after nine hours of airplane madness. Who would think that Dallas would have massive thunderstorms the first week of March? Certainly not me…

The conference started off somewhat uneventfully. There was no crazy welcome with smoke and flashing lights…just Gary Goodman, the CEO of Hotsos, kicking things off by covering logistics and welcoming the keynote speaker, Cary Millsap (http://carymillsap.blogspot.com/). For those of you who do not know Cary, he’s one of the best performance engineers in the world. He wrote Optimizing Oracle Performance back in 2003, and it remains the most-read performance book in my library. It’s quite frankly my favorite performance engineering book ever.

Cary’s keynote was a retrospective of his keynote from 2003. Back in the late ’90s and early 2000s, “Oracle tuning” was based on trial and error: study the system as an aggregate, make a single change, observe the new performance characteristics, and start the process again. Much of this process was aimed at improving percentages and ratios that had nothing to do with how users actually perceived performance. That’s where Method-R comes into play. Method-R is about focusing on the operations and transactions that are most important to the business, and with it the idea of tuning is replaced by the process of optimization.

Method-R forces the performance engineer to ask better questions:

* Are the tasks fast? (Quantify/Measure)
* Are the tasks efficient?
* Is the system efficient?
* What would happen if?

Semantic Query Optimization
The first presentation I attended was on Semantic Query Optimization, by Toon Koppelaars. Two queries are semantically equivalent if they return the same answer for any database state satisfying a given set of integrity constraints; SQO exploits those constraints to rewrite one query into the other. The overall point of the presentation was to identify weak predicates and replace them with the stronger predicates the constraints imply. Definitely take a look at the presentation; it includes tons of example scripts in the zip archive.
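
To make the idea concrete, here is a hypothetical illustration. Given the constraint CHECK (shipped_date >= order_date) on an orders table, the two queries below are semantically equivalent, but the second adds the implied predicate on order_date, which lets the optimizer use an index on that column:

    -- Original query: only the shipped_date predicate.
    SELECT * FROM orders
     WHERE shipped_date < DATE '2008-01-01';

    -- Equivalent under the constraint: shipped_date >= order_date implies
    -- order_date <= shipped_date < DATE '2008-01-01', so the extra predicate
    -- cannot change the result set, but it can change the access path.
    SELECT * FROM orders
     WHERE shipped_date < DATE '2008-01-01'
       AND order_date < DATE '2008-01-01';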

Leveraging Oracle’s Extended SQL Trace
The next presentation I attended was Leveraging Oracle’s Extended SQL Trace Data to Expedite Software Development. The idea behind this presentation is to use the Oracle 10046 trace as an instrumentation tool for software developers. Included in the archive are a bunch of Perl scripts to parse the 10046 trace data. The premise is to use 10046 to identify software anti-patterns that are not completely obvious, or that are obvious but don’t make sense for performance.
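
For anyone who hasn’t looked at raw 10046 data, the lines these kinds of scripts parse look roughly like the following (a representative sketch, not output from a real trace). Each PARSE/EXEC/FETCH line carries CPU (c=) and elapsed (e=) time for a database call, and each WAIT line names a wait event and its duration (ela=); anti-patterns such as a parse-per-execute loop show up as long runs of near-identical lines:

    PARSE #3:c=0,e=412,p=0,cr=0,cu=0,mis=1,r=0,dep=0,og=1,tim=1362508746
    EXEC #3:c=0,e=95,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,tim=1362508899
    WAIT #3: nam='db file sequential read' ela= 8342 file#=4 block#=129 blocks=1 tim=1362517312
    FETCH #3:c=10000,e=9567,p=1,cr=4,cu=0,mis=0,r=1,dep=0,og=1,tim=1362518602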

Dealing with Software Agoraphobia
The third presentation I attended was a Solaris-focused system performance engineering talk on “software agoraphobia.” It was a heavily Sun-focused presentation with an emphasis on prstat, lockstat and DTrace, and the message was to focus on CPU latency rather than aggregate measures. Take a look at the speaker’s blog.