Day 1 at Velocity is officially in the books. It was a really exciting day meeting other web performance specialists. The best part had to be sitting next to Steve Souders and even getting the chance to talk with him for a bit. He reminds me a lot of Cary Millsap: incredibly hands-on, which is awesome.
So where to begin…
The day started with Souders giving a presentation called Website Performance Analysis. Before he spoke a single word, he had YSlow's auto-run mode working its way through the Alexa top 20 sites. He encouraged all of the attendees to look at Google Page Speed. Surprisingly, he wasn't part of its development team; he offered some consulting, but that's it.
He also announced his new book, Even Faster Web Sites, which came out last week. My plan is definitely to pick up a copy this week at the conference, and yes, I will get a signed copy. What's unique about this book is that several authors contribute chapters; we're talking about the likes of Douglas Crockford, Nicholas Zakas, Stoyan Stefanov, and Nicole Sullivan. The last two are the developers of Smush.it.
He showed a slide comparing old browsers (IE 6/7, FF 3, Chrome 1, and Safari 3) with new ones (IE 8, FF 3.5, Chrome 2, and Safari 4). At best we see a 20% improvement; new browsers still block. Souders then demonstrated Cuzillion, a tool he wrote himself that lets developers model their rendering edge cases by creating mock pages out of HTML objects such as scripts, images, CSS, etc. I definitely think it's worth having the team look at it.
Functions that execute before the onload event, but aren't actually needed that early, are opportunities for lazy loading. Souders brought up this point, which was covered in this Ajaxian article, as a means of minimizing script-blocking latency.
Souders' slides referenced, without any verbal acknowledgement, a tool from Microsoft called Doloto, which analyzes how an application's code can be better split up. The tool was presented last year at Velocity. Doloto analyzes application workloads and automatically performs code splitting on existing large Web 2.0 applications. After being processed by Doloto, an application initially transfers only the portion of code necessary for initialization. The rest of the application's code is replaced by short stubs; the actual function code is transferred lazily in the background or, at the latest, on demand at first execution. Since code download is interleaved with application execution, users can start interacting with the web application much sooner, without waiting for code that implements extra, unused features.
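Doloto itself rewrites JavaScript, but the stub idea translates easily. Here's a hypothetical Java analogue (my own sketch, nothing from the Doloto presentation) where a memoizing stub stands in for lazily transferred code:

```java
import java.util.function.Supplier;

// Illustration of Doloto's stub idea: the stub stands in for a feature whose
// real code is only fetched (here: constructed) on first use.
class LazyStub<T> implements Supplier<T> {
    private Supplier<T> loader;   // simulates the deferred code transfer
    private T value;
    private boolean loaded = false;

    LazyStub(Supplier<T> loader) { this.loader = loader; }

    @Override
    public synchronized T get() {
        if (!loaded) {            // first call: "download" the real code
            value = loader.get();
            loader = null;        // drop the loader once resolved
            loaded = true;
        }
        return value;             // later calls: reuse the cached result
    }
}

public class DolotoSketch {
    public static void main(String[] args) {
        LazyStub<String> spellChecker = new LazyStub<>(() -> {
            System.out.println("loading spell-checker module...");
            return "spell-checker ready";
        });
        // The application starts without paying for the feature...
        System.out.println("app initialized");
        // ...and the cost is only incurred at first use.
        System.out.println(spellChecker.get());
        System.out.println(spellChecker.get()); // no second load
    }
}
```

The names here (LazyStub, the spell-checker feature) are mine, chosen just to show the stub-and-lazy-load pattern.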
This was really interesting, but probably not relevant to us, as I believe we are limited to one domain for all of our code and then N domains for external content. Now that I think about it, though, it may apply to those external edge cases. The process is called domain sharding. Browsers cap the number of parallel downloads per hostname, so spreading objects across hostnames gets more downloads in flight at once. Say you have to download 4 objects from one domain and 4 from another: a modern browser can parallelize both domains' downloads in one waterfall, whereas an older browser's waterfall looks more like a staircase. I've attached his presentation on the topic and would love for the team to review it.
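A quick back-of-the-envelope simulation (my own numbers and connection limit, not anything from the talk) shows why sharding helps. Assume every object takes 100 ms and the browser opens at most 2 parallel connections per hostname:

```java
// Toy model of domain sharding: domains download in parallel with each other,
// but each domain serves at most connectionsPerDomain objects at a time.
public class ShardingSketch {
    static int downloadTimeMs(int[] objectsPerDomain, int connectionsPerDomain) {
        int worst = 0;
        for (int objects : objectsPerDomain) {
            // Each domain downloads in "rounds" of up to connectionsPerDomain objects.
            int rounds = (objects + connectionsPerDomain - 1) / connectionsPerDomain;
            worst = Math.max(worst, rounds * 100); // domains proceed concurrently
        }
        return worst;
    }

    public static void main(String[] args) {
        System.out.println("1 domain, 8 objects: " + downloadTimeMs(new int[]{8}, 2) + " ms");
        System.out.println("2 domains, 4+4:      " + downloadTimeMs(new int[]{4, 4}, 2) + " ms");
        // Same 8 objects, half the wall-clock time when split across 2 domains.
    }
}
```

Obviously real downloads vary in size and DNS lookups add their own cost, but the staircase-vs-waterfall shape falls straight out of the per-hostname cap.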
I'm not sure if this is something we are capable of doing in Java. Essentially, Souders talks about how you can flush the HTML document early to speed up the download of content, creating a waterfall-like scenario. He cites examples in PHP, Ruby, etc., but makes no mention of Java. It looks like Java has a similar flush() method on the response writer, but I'm not certain. I'll add it to our list.
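To sketch what I think this would look like on our side: in a servlet you would write the &lt;head&gt; and call flush() on the response writer so the browser can start fetching CSS/JS while the server finishes rendering. Here's a plain-Java mock of the idea (the StringWriter stands in for the real HTTP response; this is my assumption about the technique, not code from the talk):

```java
import java.io.PrintWriter;
import java.io.StringWriter;

// Sketch of early flushing. In a real servlet the PrintWriter would come from
// response.getWriter(), and flush() would push bytes onto the wire so the
// browser can begin downloading the referenced CSS/JS immediately.
public class EarlyFlushSketch {
    public static void main(String[] args) throws Exception {
        StringWriter response = new StringWriter(); // stand-in for the HTTP response
        PrintWriter out = new PrintWriter(response);

        // 1. Emit the head, which references external resources, and flush it
        //    right away instead of buffering the whole page.
        out.println("<html><head><link rel=\"stylesheet\" href=\"site.css\"></head>");
        out.flush();

        // 2. Simulate slow server-side work (DB queries, rendering, etc.).
        Thread.sleep(50);

        // 3. Emit the body once it is ready, then flush again.
        out.println("<body>...content...</body></html>");
        out.flush();

        System.out.println(response);
    }
}
```

The file name site.css is just a placeholder. Whether our framework buffers the response behind our backs is exactly the thing we'd need to verify.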
I'll save the bulk of CSS for a later blog about Nicole Sullivan's perspective on Object-Oriented CSS. Souders brought up CSS in the context of understanding rules and elements. He referenced an article by David Hyatt from 9 years ago that is an absolute must-read. Souders himself wrote a blog post about simplifying CSS selectors, which is something we must read as well.
Ok UI team, we really need to talk and act. Sprites aren't just a craze…they are legit. Every major web site has moved away from GIF and JPG in favor of Sprites and occasionally PNG (as a replacement for GIF). We need to make a serious move to Sprites ASAP. There's a great project out there that will do the conversion.
- Souders talked about how YSlow started out as a Greasemonkey script that interacted with Firefox. What about the idea of developing a Bb extension that acts as a Bb performance plugin? I'm not sure exactly what it would be, but the idea of us developing tools for the team is intriguing and may well be our next step as a team.
- Should we make the investment and buy HTTPWatch for the team?
- We need to investigate the Doloto Tool
- How many DNS lookups do we have for a Bb load?
- Can we perform a flush call in Java?
- We really need to do an isolated analysis of our CSS. If designers rather than developers are writing our CSS, we run a bigger risk that it wasn't written with performance in mind. We should not only audit the CSS itself but also run a cross-browser performance analysis, and this should be part of every release that changes the CSS and/or introduces a new browser.
- We really need to make a serious push to Sprites, as well as encourage content authors in Bb to use them.
- I'm serious: I think we should write a document specifically for content authors about making their pages more usable and responsive.
- We also need to abandon GIFs and JPGs in favor of Sprites and PNG. More on this in a later blog.