Month: January 2005

Mangled IIS Log Files and REGEXP

A while back I asked if anyone had a REGEXP to deal with IIS log files. Well, it was more complex than that: the log files are mangled by the MSFT Log Parser tool into a very weird format.

And here is the REGEXP I had to use.

/^(\S+) (\d+) (\d+-\d+-\d+) (\d+:\d+:\d+) (\S+) (-) (\S+) (\S+) (\S+) (\d+) (\S+) (.+?) (.+?) (\S+) (\S+) (-) (\S+) (\S+) (\S+) (\S+) (\S+) (\S+) (.+?) (.*)$/

Nice, isn’t it?
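The same idea translates to other languages. Here is a minimal Python sketch of matching a few leading fields of a W3C-extended-style log line; the field order and the sample line are illustrative assumptions, since the actual order depends on the log's #Fields: directive.

```python
import re

# Hedged sketch: a simplified pattern for a few leading fields of a
# W3C-extended-style IIS log line (date, time, client IP, method, URI,
# status). Field order varies with the #Fields: directive, so this is
# illustrative, not a drop-in replacement for the full pattern above.
LOG_RE = re.compile(
    r"^(\d{4}-\d{2}-\d{2}) "   # date
    r"(\d{2}:\d{2}:\d{2}) "    # time
    r"(\S+) "                  # client IP
    r"(\S+) "                  # HTTP method
    r"(\S+) "                  # URI stem
    r"(\d{3})"                 # status code
)

line = "2005-01-12 08:30:15 192.168.1.10 GET /index.html 200"
m = LOG_RE.match(line)
if m:
    date, time_, ip, method, uri, status = m.groups()
```

Each field gets its own capture group, so adding or reordering columns means adding or reordering groups, which is exactly why the full 24-group monster above looks the way it does.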

Web DESIGN Standards

Jakob Nielsen has a great article on Web Design Standards. You often hear me discuss things along these lines at the application and HTML layer — HTTP and (X)HTML/CSS standards. I agree with what Jakob is saying: designers must consider how people will use their site, not just how they want them to use their site.

The Takeaway:

Why Websites Should Comply With Design Standards

One simple reason:

  • Jakob’s Law of the Internet User Experience: users spend most of their time on other websites.

FULLTEXT Indices — The Final Homer!

Ok, figured out the problem with the "/" [root document] query using FULLTEXT indices. It’s actually two problems:

  1. It is shorter than the minimum indexed word length, which is four characters by default; I reduced that to three.
  2. It isn’t an alphanumeric character, so it never gets tokenized at all.

So, my corner-case hack would have been necessary anyway.
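The two rules can be illustrated with a small Python sketch that roughly mimics a FULLTEXT tokenizer's decision (the function name and the simplified rules are my own; MySQL's actual tokenization is more involved):

```python
MIN_WORD_LEN = 4  # MySQL's default minimum indexed word length

def fulltext_indexable(token: str, min_len: int = MIN_WORD_LEN) -> bool:
    """Rough mimic of FULLTEXT token rules: a token must meet the
    minimum length AND contain word (alphanumeric) characters."""
    return len(token) >= min_len and any(c.isalnum() for c in token)

# "/" fails on length, and even with min_len lowered to 3 it still
# fails because it contains no alphanumeric characters -- hence the
# corner-case hack is needed either way.
```

This shows why lowering the minimum word length alone wasn't enough: "/" still trips the second rule.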

Cost Recovery for the Unrecoverable

The non-billable hour has this great post on the idea of recovering costs for items such as printers, faxes, phones, etc.

Oh, this is nice. What a great way to get your customers to come back to you: let’s bill them for everything we touch. I can’t wait for the cost-recovery implant that gets stuck into professionals’ brains by their firms in order to capture the exact amount of brain-power a project took.

More stupidity in my Web Server Log Application

Ok, as Tim Bray does, I am here to expose the stupid with the great in all the code I write.

Mine was even more brain-damaged. I generate a series of aggregate Web log stats, after filtering out bots/crawlers/images/CSS/robots.txt/etc., to produce meaningful information on the visitors to my sites.
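That filtering step looks something like the following Python sketch; the patterns and bot names here are examples of the kind of noise being excluded, not my actual list.

```python
import re

# Hedged sketch of a noise filter for Web log records. The extensions
# and bot user-agent substrings are illustrative examples only.
NOISE_URI = re.compile(r"(robots\.txt$)|(\.(gif|jpg|png|css|js)$)",
                       re.IGNORECASE)
BOT_AGENTS = ("googlebot", "slurp", "msnbot")

def is_noise(uri: str, user_agent: str) -> bool:
    """True for requests that should be excluded from visitor stats."""
    ua = user_agent.lower()
    return bool(NOISE_URI.search(uri)) or any(b in ua for b in BOT_AGENTS)
```

Every record that survives this predicate feeds the aggregate stats.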

However, the aggregate stats were not meshing with the drilldowns I was generating to examine data views, such as who hit what pages, who hit during what hour, etc.

Well, it helps if you look at the same data. The drilldowns used the following SQL filter:

ap.bytes != 0

Ummm…if I built the TEMP table for the aggregate data view page using the same filter, the information would be aligned.
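The fix amounts to sharing one predicate between both views. A minimal Python sketch of the idea (the record fields here are illustrative, not my actual schema):

```python
# Hedged sketch: build the aggregate rollup and the drilldown rows
# from one shared predicate, so the two views can never disagree.
records = [
    {"page": "/",      "bytes": 0},
    {"page": "/about", "bytes": 1024},
    {"page": "/blog",  "bytes": 2048},
]

def keep(r):
    """Same filter as the drilldown SQL: ap.bytes != 0."""
    return r["bytes"] != 0

aggregate_hits = sum(1 for r in records if keep(r))
drilldown = [r["page"] for r in records if keep(r)]
```

With the filter defined once, the aggregate count and the drilldown row count always match.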

A Homer Moment…DOH!

Web performance paper

Peter Lin has a Web performance paper here. It focuses on Java and Tomcat. Comments are here.

Too bad he misses the point of external monitoring, because some of the other points he makes are very good. There is too much focus on the data itself and not enough on how it can be used to improve performance.

Appears that he lives in the area. Peter, if you read this, say hi!

Copyright © 2025 Performance Zen
