Acid3

The Acid3 test is a web test page from the Web Standards Project that checks a web browser's compliance with elements of various web standards, particularly the Document Object Model (DOM) and JavaScript.

When the test runs successfully, the Acid3 page displays a gradually increasing fraction counter below a series of colored rectangles. The percentage shown on screen is simply the number of subtests passed. It is not a true percentage of conformance, since the test does not track how many subtests were actually started (100 is assumed). Moreover, the browser must also render the page exactly as the reference page renders in that same browser. Like the text of the Acid2 test, the text of the Acid3 reference rendering is not a bitmap, in order to allow for certain differences in font rendering.

Acid3 was in development from April 2007, and released on 3 March 2008. The main developer was Ian Hickson, a Google employee who also wrote the Acid2 test. Acid2 focused primarily on Cascading Style Sheets (CSS), but this third Acid test also focuses on technologies used on modern, highly interactive websites characteristic of Web 2.0, such as ECMAScript and DOM Level 2. A few subtests also concern Scalable Vector Graphics (SVG), Extensible Markup Language (XML), and data URIs. Controversially, it includes several elements from the CSS2 recommendation that were later removed in CSS2.1, but reintroduced in World Wide Web Consortium (W3C) CSS3 working drafts that have not made it to candidate recommendations yet.

The test
The main part of Acid3 is written in ECMAScript (JavaScript) and consists of 100 subtests in six groups called “buckets”, including four special subtests (0, 97, 98, and 99). The compliance criteria require that the test be run with a browser's default settings. The final rendering must have a 100/100 score and must be pixel-identical with the reference rendering. On browsers designed for personal computers, the animation must also be smooth, taking no more than 33 ms per subtest on reference hardware equivalent to a top-of-the-line Apple laptop; slower performance on a slow device does not imply non-conformance.
 * Bucket 1: DOM Traversal, DOM Range, HTTP
 * Bucket 2: DOM2 Core and DOM2 Events
 * Bucket 3: DOM2 Views, DOM2 Style, CSS 3 selectors and Media Queries
 * Bucket 4: Behavior of HTML tables and forms when manipulated by script and DOM2 HTML
 * Bucket 5: Tests from the Acid3 Competition (SVG, HTML, SMIL, Unicode, …)
 * Bucket 6: ECMAScript
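
The per-subtest timing criterion can be sketched as follows. This is a hypothetical harness for illustration only, not the actual Acid3 driver; the function and constant names are invented:

```javascript
// Hypothetical timing harness illustrating the 33 ms-per-subtest
// smoothness criterion; the real Acid3 test schedules its subtests
// differently.
const SUBTEST_BUDGET_MS = 33;

function timeSubtest(subtest) {
  const start = Date.now();
  subtest();
  const elapsed = Date.now() - start;
  return { elapsed, withinBudget: elapsed <= SUBTEST_BUDGET_MS };
}

// A trivial subtest completes well within the budget.
console.log(timeSubtest(() => {}).withinBudget); // true
```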

To pass the test, the browser must also display a generic favicon in the browser toolbar, not the favicon image served by the Acid3 web server. When asked for the favicon, the Acid3 server returns a 404 response code but with image data in the body. This checks that the web browser correctly handles the 404 status code when fetching the favicon: it must treat the fetch as a failure and display the generic icon instead.
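
The required favicon behavior amounts to checking the status code before trusting the response body. A minimal sketch (the helper name is hypothetical; real browsers implement this internally):

```javascript
// Hypothetical helper: decide whether a fetched favicon may be used.
// Acid3 requires that a 404 response be treated as a failure even if
// the response body contains valid image data.
function shouldUseFavicon(response) {
  // Only a successful (2xx) response counts as a usable favicon.
  return response.status >= 200 && response.status < 300;
}

// A 404 carrying image bytes must still fall back to the generic icon.
console.log(shouldUseFavicon({ status: 404, body: "<image bytes>" })); // false
console.log(shouldUseFavicon({ status: 200, body: "<image bytes>" })); // true
```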

While the test is running, rectangles are added to the rendered image; the number of subtests passed in each bucket determines the color of its rectangle.
 * 0 subtests passed: No rectangle shown.
 * 1–5 subtests passed: Black rectangle.
 * 6–10 subtests passed: Grey rectangle.
 * 11–15 subtests passed: Silver rectangle.
 * All 16 subtests passed: Colored rectangle (left to right: red, orange, yellow, lime, blue, purple).

Note that Acid3 does not display exactly how many subtests passed in a bucket. For example, 3 subtests passing and 4 subtests passing in bucket 2 would both render a black rectangle.
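
The mapping from a bucket's pass count to its rectangle color can be written as a small lookup. A sketch following the thresholds listed above (the function name is invented):

```javascript
// Sketch of Acid3's per-bucket rectangle coloring, following the
// thresholds listed above. Full buckets get a distinct color by
// bucket position (0-5, rendered left to right).
const FULL_COLORS = ["red", "orange", "yellow", "lime", "blue", "purple"];

function rectangleColor(bucketIndex, passed) {
  if (passed === 0) return null;      // no rectangle shown
  if (passed <= 5) return "black";
  if (passed <= 10) return "grey";
  if (passed <= 15) return "silver";
  return FULL_COLORS[bucketIndex];    // all 16 subtests passed
}

// 3 and 4 passes in bucket 2 (index 1) are indistinguishable.
console.log(rectangleColor(1, 3));  // "black"
console.log(rectangleColor(1, 4));  // "black"
console.log(rectangleColor(0, 16)); // "red"
```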

Detailed results
After the Acid3 test page has completely rendered, the word Acid3 can be clicked (or shift-clicked to open a new window) to show an alert explaining exactly which subtests failed and with what error messages. If a subtest passed but took too much time, the report includes timing results for that subtest. The alert also reports the total running time of the whole Acid3 test.
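
The kind of report the alert presents can be sketched as follows. This is a hypothetical format with invented field names and messages, not Acid3's actual output:

```javascript
// Hypothetical sketch of an Acid3-style failure/timing report:
// failed subtests with their error messages, timing for passing
// subtests that exceeded the 33 ms budget, and the total time.
function formatReport(results, totalMs) {
  const lines = [];
  for (const r of results) {
    if (!r.passed) {
      lines.push(`Test ${r.id} failed: ${r.message}`);
    } else if (r.elapsedMs > 33) {
      lines.push(`Test ${r.id} passed but took ${r.elapsedMs} ms`);
    }
  }
  lines.push(`Total elapsed time: ${totalMs} ms`);
  return lines.join("\n");
}

console.log(formatReport(
  [
    { id: 26, passed: false, message: "assertion failed", elapsedMs: 1 },
    { id: 70, passed: true, elapsedMs: 120 },
  ],
  450
));
```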

In order to render the test correctly, user agents need to implement the CSS 3 Text Shadows and CSS 2.x Downloadable Fonts specifications, which were under consideration by the W3C for standardization. This is required because the test uses a custom TrueType font, called "AcidAhemTest", to cover up a 20×20 red square. Supporting TrueType fonts, however, is not required by the CSS specification: a browser supporting only OpenType fonts with CFF outlines, or Embedded OpenType fonts, could support the CSS standard but still fail this part of the Acid3 test. The glyph, when rendered by the downloaded font, is just a square, made white with CSS, and thus invisible.

In addition, the test uses Base64-encoded images, some more advanced selectors, CSS 3 color values (HSLA), as well as bogus selectors and values that should be ignored.
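
A Base64 data URI of the kind the test relies on embeds the encoded bytes directly in the page. A minimal Node.js sketch (illustrative only; the helper name and bytes are invented, and Acid3's actual images differ):

```javascript
// Build a Base64 data URI from raw bytes. Node's Buffer handles the
// Base64 encoding; in a browser, btoa on a binary string would do.
function toDataUri(mimeType, bytes) {
  const base64 = Buffer.from(bytes).toString("base64");
  return `data:${mimeType};base64,${base64}`;
}

// The first three bytes of a GIF file ("GIF"), purely illustrative.
console.log(toDataUri("image/gif", [0x47, 0x49, 0x46]));
// → data:image/gif;base64,R0lG
```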

Development and impact
Google employee Ian Hickson started working on the test in April 2007, but development progressed slowly. In December 2007, work restarted, and the project received public attention on January 10, 2008, when it was mentioned in blogs by Anne van Kesteren. At the time, the project resided at a URL clearly showing its experimental nature: http://www.hixie.ch/tests/evil/acid/003/NOT_READY_PLEASE_DO_NOT_USE.html. Despite the notice in the URL, the test received widespread attention in the web-development community. At that point only 84 subtests had been written, and on January 14 Ian Hickson announced a competition to fill in the missing 16.

The following developers contributed to the final test through this competition:
 * Sylvain Pasche: subtests 66 and 67: DOM.
 * David Chan: subtest 68: UTF-16/UCS-2.
 * Simon Pieters (Opera) and Anne van Kesteren (Opera): subtest 71: HTML parsing.
 * Jonas Sicking (Mozilla) and Garrett Smith: subtest 72: dynamic modification of style blocks' text nodes.
 * Jonas Sicking (Mozilla): subtest 73: Nested events.
 * Erik Dahlström (Opera): subtests 74 to 78: SVG and SMIL.
 * Cameron McCormack (Batik SVG library): subtest 79: SVG fonts.

Even before its official release, Acid3's impact on browser development was dramatic. In particular, WebKit's score rose from 60 to 87 in less than a month.

The test was officially released on March 3, 2008. A guide and commentary were expected to follow within a few months, but, as of March 2011, only the commentary had been released. The announcement that the test was complete meant only that it was considered "stable enough" for actual use. A few problems and bugs were found in the test, and it was modified to fix them. On March 26, 2008—the day both the Opera and WebKit teams announced a 100/100 score—WebKit developers contacted Hickson about a critical bug in Acid3 that presumably allowed a violation of the SVG 1.1 standard to pass. Hickson fixed the bug with the help of Cameron McCormack, a member of the W3C's SVG Working Group.

Presto and WebKit based browsers
In 2008, development versions of the Presto and WebKit layout engines (used by Opera and Safari respectively) scored 100/100 on the test and rendered the test page correctly. At the time, no browser using the Presto or WebKit layout engines passed the performance aspect of the test.

Google Chrome and Opera Mobile also displayed a score of 100/100; security concerns over downloadable fonts had delayed Chrome from passing.

Firefox
At the time of Acid3's release, Mozilla Firefox developers had been preparing for the imminent release of Firefox 3, focusing more on stability than Acid3 success. Consequently, Firefox 3 had a score of 71. Firefox 3.5 scored 93/100, and Firefox 3.6 scored 94/100. Initially, Firefox 4 scored 97/100, because it did not support SVG fonts. Later, Firefox 4 scored 100/100, because the SVG font tests were removed from Acid3.

According to Mozilla employee Robert O'Callahan, Firefox did not support SVG fonts because Mozilla considered WOFF a superior alternative. Another Mozilla engineer, Boris Zbarsky, claimed that the subset of the specification implemented in WebKit and Opera gives no benefit to web authors or users over WOFF, and asserted that implementing SVG fonts fully in a web browser is hard because the specification was "not designed with integration with HTML in mind".

On April 2, 2010, Ian Hickson made minor changes to the test after Mozilla, due to privacy concerns, altered the way Gecko handles a particular CSS pseudo-class.

Internet Explorer
Microsoft said that Acid3 did not align with the goals of Internet Explorer 8 and that IE8 would improve only some of the standards tested by Acid3. IE8 scored 20/100, far lower than all of its major competitors at the time of Acid3's release, and had some problems rendering the Acid3 test page. On 18 November 2009, the Internet Explorer team posted a blog entry about the early development of Internet Explorer 9, based on the PDC presentation, showing that an internal build of the browser could score 32/100.

Throughout 2010, several public Developer Previews improved Internet Explorer 9's test scores from 55/100 (on 16 March) to 95/100 (as of 4 August). Dean Hachamovitch, general manager of the IE team, argued that striving for 100/100 on the Acid3 test was neither necessary nor desirable. He claimed that the two remaining Acid3 failures related to features (SVG fonts and SMIL animation) that were "in transition".

Criticism
Early iterations of the test were criticized for being a cherry-picked collection of features that were rarely used, as well as those that were still in a W3C working draft. Eric A. Meyer, a notable web standards advocate, wrote, "The real point here is that the Acid3 test isn't a broad-spectrum standards-support test. It's a showpiece, and something of a Potemkin village at that. Which is a shame, because what's really needed right now is exhaustive test suites for specifications—XHTML, CSS, DOM, SVG."

"Implementing just enough of the standard to pass a test is disingenuous, and has nothing to do with standards compliance," argued Mozilla UX lead Alex Limi, in his article "Mythbusting: Why Firefox 4 won’t score 100 on Acid3." Limi argued that some of the tests, particularly those for SVG fonts, have no relation to real usage, and implementations in some browsers have been created solely for the point of raising scores.

September 2011 test changes
On September 17, 2011, Ian Hickson announced an update to Acid3. In Hickson's words, Håkon Wium Lie (from Opera Software) and he commented out "the parts of the test that might get changed in the specs." They hoped that this change would "allow the specs to change in whatever way is best for the Web, rather than constraining the changes to only be things that happened to fit what Acid3 tested!"

As a result, Firefox 4 and Internet Explorer 9 achieved a score of 100/100 on Acid3, although Internet Explorer 9 did not render the test properly because it lacked support for text-shadow, which arrived only in Internet Explorer 10.

Standards tested
Parts of the following standards are tested by Acid3:
 * DOM Level 2 (Core, Events, Views, Style, Traversal, Range, and HTML)
 * ECMAScript
 * CSS 3 selectors and Media Queries
 * HTML and HTTP
 * SVG and SMIL
 * XML and data URIs
 * Unicode (UTF-16/UCS-2)

Passing conditions
A passing score is only considered valid if the browser's default settings were used.

The following browser settings and user actions may invalidate the test:
 * Zooming in or out
 * Disabling images
 * Applying custom fonts, colors, styles, etc.
 * Having add-ons or extensions installed and enabled
 * Installed and enabled User JavaScript or Greasemonkey scripts

Desktop browsers
Since the release of Internet Explorer 10 in 2012, the latest versions of all major desktop browsers, including Internet Explorer, Chrome, Firefox, Opera, and Safari, score 100/100 and render the test correctly. The most commonly used browser that does not score 100/100 on Acid3, according to StatCounter, is Internet Explorer 8, with about 1% usage share.