Month: December 2004

Google and You

It seems that Dave Winer shares my paranoid feelings toward Google.

I have been ranting to friends and colleagues for a long time that I would never work for Google, as it is going to be the Microsoft of the early 21st century. The rosy glow will wear off as the company becomes more and more omnipresent.

They also hired a VP away from a company I used to work for. Now, that isn't so bad, except that this gentleman has a great deal of experience that can't appear on his resume for reasons of national security.

Great search engine, but that’s all Google will get from me.

If you want to extend your paranoia, go here and watch the movie.

Comcast scanning Port 80

You would almost expect it, but to do it in such an obvious way is mind-boggling. Comcast started scanning my home IP every 15 minutes on November 24, 2004 for signs of a live Port 80. A simple iptables rule and that's the end of that.
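For the curious, the rule looks something like this. This is a sketch only: the scanner's source address below is a placeholder, and `eth0` may not be your WAN interface.

```shell
# Silently drop the scanner's inbound SYNs to port 80.
# 10.0.0.1 is a placeholder for the actual scanning host.
iptables -A INPUT -i eth0 -p tcp --dport 80 -s 10.0.0.1 -j DROP
```

Using DROP rather than REJECT means the scanner gets no response at all, so from its point of view there is simply nothing there.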

Now, that move will likely get me in more trouble than having an active Web server on my IP.

Sears Shuts Me Down

Starting at approximately 18:00 EST on December 14, 2004, Sears began blocking all incoming requests from my public IP address, most likely due to the GrabPERF testing I do from here.

[Screenshot: Sears blocking requests, December 14, 2004]

How do I know it's a filter and not a performance issue? Well, when I re-route traffic through one of the many thousands of open proxy servers available, the page comes up instantly. Also, they are doing the smart thing and not sending an explicit TCP Reset; they are simply routing the TCP SYN requests into the "bit bucket" (a DROP rule, for the firewall wonks out there).

I have terminated testing of this site. Guess this means my wife can’t access her order information for the oven part — my bad.

Need a Regular Expression to Mangle IIS Log Files

OK, I need some help writing a regular expression that will allow me to split IIS logfiles into a usable format for entry into a database. I have one for NCSA/Apache logs, but no one on the web seems to have one for the mangled output that I am getting from the IIS logs I have been asked to analyze.

Drop me a line if you can help out.
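In case anyone lands here with the same problem: IIS's W3C extended format is space-delimited and announces its columns in a `#Fields:` directive, so a dictionary-based split may be enough without a full regular expression. Here is a sketch in Python; the field names and the sample line are illustrative, not pulled from my actual logs.

```python
import re

def parse_iis_log(lines):
    """Parse IIS W3C extended log lines into dicts keyed by field name."""
    fields = []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("#Fields:"):
            # The directive names the space-separated columns that follow.
            fields = line.split()[1:]
            continue
        if line.startswith("#") or not line:
            continue  # other directives (#Software, #Date, ...) and blanks
        # Values are space-delimited; IIS writes '-' for empty fields.
        values = re.split(r"\s+", line)
        yield dict(zip(fields, values))

sample = [
    "#Fields: date time c-ip cs-method cs-uri-stem sc-status",
    "2004-12-14 18:00:01 192.168.1.10 GET /default.htm 200",
]
rows = list(parse_iis_log(sample))
```

Each row comes back as a dict, which maps straight onto an INSERT statement for the database load.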

Service Level Agreements in Web Performance

Service Level Agreements (SLAs) appear to finally be maturing in the realm of Web performance. Both of the Web performance companies that I have worked for have understood their importance, but convincing the market of the importance of these metrics has been a challenge up until recently.

In the bandwidth and networking industries, SLAs have been a key component of contracts for many years. However, as Doug Kaye outlined in his book Strategies for Web Hosting and Managed Services, SLAs can also be useless.

The key to a useful Web performance SLA rests on two clear business concepts: relevance and enforceability. Many papers have been written on how to calculate SLAs, but that still leaves companies knowing they need SLAs without actually understanding them.

Relevance

Relevance is key because an SLA defined by someone else may have no bearing on the metrics your business measures itself by. Whether the SLA is based on performance, availability, or a weighted virtual metric designed specifically by the parties bound by the agreement, it has to be meaningful to both of them.

The classic SLA is an average performance of X seconds and an availability of Y% over period Z. This is not particularly useful to businesses, as they have defined business metrics that they already use.

Take, for example, a stock-trading company. In most cases, it is curious about, but not concerned with, its Web performance and availability between 17:00 and 08:00 Eastern Time. But when the markets are open, these metrics are critical to the business.

Now, try to take your stock-trading metric and overlay it on Amazon or eBay. It doesn't fit. So, in classic consultative fashion, SLAs have to be developed by asking what is useful to the client.

  • Who is to be involved in the SLA process?
  • How do SLAs for internal groups differ from those for external vendors?
  • Will this be pure technical measurement? Will business data be factored in?

Asking and answering these questions makes the SLA definition process relevant to the Web performance objectives set by the organization.
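To make the relevance point concrete, here is a sketch comparing a 24-hour availability number with one restricted to market hours. The probe data and the two failure times are invented for illustration.

```python
# One synthetic availability probe per hour; failures at 03:00 and 10:00.
samples = [(hour, hour not in (3, 10)) for hour in range(24)]

def availability(samples, start=0, end=24):
    """Percent of successful probes whose hour falls in [start, end)."""
    window = [ok for hour, ok in samples if start <= hour < end]
    return 100.0 * sum(window) / len(window)

overall = availability(samples)        # the classic 24-hour view
market = availability(samples, 9, 17)  # roughly market hours

# The 03:00 failure drags down the overall number but is irrelevant to
# a trading firm; the 10:00 failure is the one that actually matters.
```

The two numbers disagree, and only the second one reflects the business the SLA is supposed to protect.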

Enforceability

The idea that an SLA with no teeth could exist is almost funny. But if you examine the majority of SLAs that are contracted between businesses in the Web performance space today, you will find that they are so vaguely defined and meaningless to the business objectives that actually enforcing the penalty clauses is next to impossible.

As real-world experience shows, it is extremely difficult for most companies to enforce SLAs. If the relevance objectives discussed above are hammered out so that the targets are clear and precise, then enforcement becomes a snap. The relevance objective often fails because the SLA is imposed by one party on the other, or because an SLA is included in a contract as a feature; when something goes wrong, the escape path is clear for the "violating" party.

If an organization would like to try and develop a process to define enforceable SLAs, start with the internal business units. These are easier to develop, as everyone has a shared business objective, and all disputes can be arbitrated by internal executives or team leaders.

Once the internal teams understand and are able to live with the metrics used to measure the SLAs, then this can be extended to vendors. The important part of this extension is that third-party SLA measurement organizations will need to become involved in this process.

Some would say that I am tooting my own horn by advocating the use of these third-party measurement organizations, as I have worked for two of the leaders in this area. The need for a neutral third-party is crucial in this scenario; it would be like watching a soccer match (football for the enlightened among you) without the mediating influence of the referee.


If your organization is now considering implementing SLAs, then it is crucial that these agreements are relevant and enforceable. That way, both parties understand and will strive to meet easily measured and agreed upon goals, and understand that there are penalties for not delivering performance excellence.

Firefox Rising

MozillaZine has some interesting statistics-war data on the Firefox arrival. Now, of course, Microsoft is disputing the numbers. But that is sort of like arguing that Sammy Sosa's home-run record doesn't mean as much as Roger Maris's or Babe Ruth's because of the extended season.

Microsoft has lost this battle. If they are willing to suck it up and try to win the war, they have to agree to the following:

  1. Internet standards (CSS, XHTML, RSS, and HTTP/1.1) do matter. In fact, when looking at Web performance, I have to fall back on the following argument: "I don't care if MSIE does it; it is not in the accepted standards." Microsoft has a very active research division that has participated in the development of these standards, which means they must buy into them at some level.
  2. MSIE hasn’t had any buzz since the start of the pop-up wars.
  3. MSIE suffers from a general perception of being insecure. Microsoft hates when the others can effectively use FUD against their products.
  4. Being multi-platform has its advantages. And Macintosh doesn't count, especially since Microsoft doesn't ship a current version of its browser for OS X. Geeks use Linux, and OS X, and BSD, and Windows. But geeks all love Firefox, and it runs on ALL platforms. So do 90% of the extensions and themes.

I like Firefox. But I do not underestimate the power that Microsoft can bring to bear if it wants to really develop a kick-ass browser. And in the end, all Web users win.
