This week we have looked at the different problems that browsers can cause for developers, some of the ways to identify them early in the design process, and some tools to help fix them. One major question remains: why do all of these browsers, built around the same standards, cause web applications to perform and appear so differently?

A little history of web browsers

Much like the mind maps I described in a previous article, computing professionals and business people alike have wanted, since the early 1980s, a way to link documents together without relying on the underlying operating system. They wanted individual documents to loosely join together as a web of information, allowing quick cross-referencing. In 1984 Neil Larson wrote the MaxThink outline program, which allowed linking between documents and DOS files.(WP) I won’t go through the entire history. The idea was to expand this network of linked documents first within government, corporate, and educational institutions for the purpose of sharing information and linking it together. It then expanded to the global community, bringing information together in amazing ways and allowing you to view these thoughts in this post right now.

In 1993 Mosaic, from the NCSA, became the first widely adopted graphical web browser. It was quickly followed by browsers from commercial companies such as IBM, Microsoft, and AOL. Each had its own commercial goals, patents, and other focuses that resulted in differences in how the browser operated. The race to build the most popular browser had begun.

Current rendering engines

Today we fortunately have a set of standards that browsers should comply with, or at least understand. These come together in the Document Object Model, or DOM: a fancy term for the structured representation of the page that JavaScript, CSS, and HTML combine to build in the browser. Each browser’s interpretation of the DOM, and the order in which these individual objects load, varies, which in turn changes the way the user sees the site. You may find that the same site not only looks different but feels different in a certain browser. Some web applications may not even “support” certain browsers’ interpretations of their content, hence the “looks best in” terminology that you might see on some sites. Understanding and supporting more browsers is certainly better for expanding the reach of a web site or application. Fortunately, one thing helps: there are only a handful of major DOM rendering engines that most browsers use.
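Because each engine interprets the DOM a little differently, a common defensive pattern is to probe for a capability directly rather than guess from the browser’s name. Here is a minimal JavaScript sketch; the `env` parameter is a hypothetical stand-in for the browser’s `window` object, used here only so the logic can be shown outside a browser:

```javascript
// Feature detection: test for the capability itself instead of
// parsing the user-agent string, which browsers routinely spoof.
// `env` stands in for the browser's window object.
function pickEventBinder(env) {
  if (typeof env.addEventListener === "function") {
    return "addEventListener"; // W3C DOM event model (Gecko, WebKit)
  }
  if (typeof env.attachEvent === "function") {
    return "attachEvent";      // legacy Trident (Internet Explorer) model
  }
  return null;                 // neither event model is available
}

// Stubbed environments illustrating the two event models
var w3cEnv = { addEventListener: function () {} };
var ieEnv  = { attachEvent: function () {} };

console.log(pickEventBinder(w3cEnv)); // "addEventListener"
console.log(pickEventBinder(ieEnv));  // "attachEvent"
```

In a real page you would call `pickEventBinder(window)` once and branch on the result, rather than sniffing for “MSIE” or “Gecko” in the user-agent string.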

  • WebKit – This is the underlying engine for a number of browsers, with growing success. It powers the iPhone, Google Chrome, and Apple’s Safari browser. The kit itself is Open Source, enabling developers to create new browsers for different platforms and to use a number of great tools to ensure that web applications are compatible with the rendering engine.
  • Mozilla – One of the oldest and most popular web rendering engines, this powers Netscape, certain versions of AOL, Firefox, and a number of other browsers. After many years as a commercial engine at Netscape, it was rewritten from the ground up as an Open Source project, Gecko. It also provides a great set of developer tools, many of which were mentioned in the previous article. It differs from WebKit in that it is a complete rendering engine rather than a sum of parts, which makes it faster and more complete, but also less compact. The complexity of a web application can significantly affect performance because of this structure, so testing browser performance is important.
  • Trident – This powers Internet Explorer, the current leader in the browser wars, along with many derivatives such as Avant. Its optimization for the Windows operating system enables it to run a bit faster. It has also seen the most changes between versions, which makes compatibility an issue. Developers can also use it as an engine for a software application through its COM object.

There are a few other engines, which are listed on Wikipedia.

Looking to the future of web browsers

While browsers have been working harder to meet standards, such as the Acid2 test, there really isn’t a single defining standard for how a browser must operate. This leaves the developers of the web’s most popular browsers a lot of room in how they design their engines. One of the things I was alerted to when interviewing companies for this series is that Internet Explorer 8 will try to speed up the user’s experience by opening more connections to the web server, similar to what Fasterfox and other extensions accomplish. For web application developers this means the server will need to handle more connections, and that the sequence in which things load into a web page will happen not only faster but with less predictability. This makes testing even more important before releasing a web application. So I would suggest creating your own acid test: capture the popular sequences and components of your site and make sure they still work whenever a new browser is released.
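One way to build such a personal acid test is a small checklist runner: name each critical capability or component of your site, express it as a boolean check, and run the whole list whenever a new browser version ships. The sketch below is a minimal illustration; the check names and the stubbed environment are hypothetical placeholders, and in a real page you would pass `window` as the environment:

```javascript
// A tiny "personal acid test": runs a map of named checks against
// whatever environment object you hand it and reports the failures.
// Replace the example checks with the critical pieces of your own site.
function runAcidTest(checks, env) {
  var failures = [];
  for (var name in checks) {
    var passed;
    try {
      passed = checks[name](env) === true;
    } catch (e) {
      passed = false; // a throwing check counts as a failure
    }
    if (!passed) failures.push(name);
  }
  return failures; // an empty array means every check passed
}

// Hypothetical example checks against a stubbed environment
var checks = {
  "JSON support":  function (env) { return typeof env.JSON === "object"; },
  "XHR available": function (env) { return typeof env.XMLHttpRequest === "function"; }
};

var stubEnv = { JSON: {}, XMLHttpRequest: function () {} };
console.log(runAcidTest(checks, stubEnv)); // []
```

Keeping the checks small and independent means a new browser release only requires rerunning the list, not rewriting it.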