CSS vendor prefixes, again...

We still have a major problem in CSS with vendor prefixes... We have -moz-*, -webkit-*, -ms-*, -o-* and more all over the place, and we all agree that what we have is suboptimal. During the W3C Plenary Meeting this week, we discussed CSS Gradients. Introduced by Apple with a -webkit-* prefix, we now have at least 3 incompatible versions of gradients spreading across Web pages. Being myself an HTML+CSS editor implementor, I can tell you this is pure hell for me. But beyond editing issues, it's also a serious issue for Web authors, because they have to maintain multiple versions of the same property in their stylesheets to stay compatible with browsers that are not cutting-edge. Let's face it: we have a problem with Gradients (a really great feature, by the way) because a browser vendor shipped it to the masses, and there is no way to prevent Web authors from immediately and massively adopting such a great feature. It happened in the past for Gradients; it will happen again for other features.
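To make the maintenance burden concrete, here is a sketch of what a single gradient looked like in an author stylesheet at the time; the exact argument forms varied across browser versions, so treat this as illustrative rather than exhaustive:

```css
/* One visual effect, declared five times. Note that the original
   WebKit syntax is structurally different from all the others. */
.button {
  background: -webkit-gradient(linear, left top, left bottom,
                               from(#ffffff), to(#999999)); /* original WebKit */
  background: -webkit-linear-gradient(top, #ffffff, #999999); /* newer WebKit */
  background: -moz-linear-gradient(top, #ffffff, #999999);    /* Gecko */
  background: -o-linear-gradient(top, #ffffff, #999999);      /* Opera */
  background: linear-gradient(to bottom, #ffffff, #999999);   /* proposed standard */
}
```

Each browser ignores the declarations it does not understand and keeps the last one it does, which is why authors must repeat the line for every engine.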

On the other hand, browser vendors need a way to implement experimental features and ship them, even if only for testing purposes. The CSS WG decided a while ago that if a given feature is stable enough across browsers, it should move faster along the REC track, and the CSS WG should issue a call for implementations, tested against an official test suite.

But that's probably not enough. We still have one issue here: the "extraction" of a given feature from a given spec is not always easy, requires editing work, and is not instantaneous with respect to the W3C Process. And if we keep the feature in its original spec, it means we call for implementations before the spec reaches global interoperability, and that's not what the W3C Process wants. Thinking out loud, I wonder if a better solution isn't the following:

  • new strictly-experimental prefixed properties and values can be shipped, but they are disabled by default in content. That means they will output a warning on the console at parse time, will not be stored in the CSS OM, and will not be honored in layout. I understand this could slow CSS parsers down a little, but nothing comes for free.
  • these experimental features can be enabled by the user using a special "Debug" menu.
  • a web page can programmatically prompt the user to enable these features, through a new API, probably on document. App-level CSS (what we call chrome CSS in XUL) does not need that; experimental features remain enabled by default in app-level CSS. I think enabling should be domain-wide, not global.
  • all experimental features are disabled again for all domains when the browser is updated.

Then browser vendors can still implement, ship, have the community test the implementation, and allow Web authors to use the features in experimental web sites.

Opinions? (again, I'm only thinking out loud here to start the discussion)


1. On Saturday 5 November 2011, 14:22 by CAFxX

So, you're basically suggesting slowing down the web at large because the W3C can't keep up? That's as bad an idea as our PM's idea of solving the slowness of our judicial system by putting a time limit on trials...

2. On Saturday 5 November 2011, 14:37 by Daniel Glazman

@CAFxX: because the "W3C" can't keep up? WARF!!! And what do you think the W3C is? It's a consortium, and the browser vendors are part of it. Our problem is related to the vendors themselves and their desire to ship faster than standardization. When WebKit shipped gradients, gradients were their proposal only, and they knew that standardization would probably change the original proposal drastically. But because the feature was made entirely public in an unstable form (i.e. not discussed with other vendors), it's all over the place now and we can't really get rid of it. So please don't blame the W3C alone; *everyone* is involved here, including browser vendors.

3. On Saturday 5 November 2011, 14:58 by CAFxX

I know fairly well who the W3C is composed of and the fact that browser vendors are (an important) part of it doesn't undermine my remark: slowing down the web (by making vendor-specific extensions basically unusable in "production" websites) because the W3C can't keep up is not a solution.

4. On Saturday 5 November 2011, 15:11 by FremyCompany

This is not the right way. I think Microsoft solved this problem rather efficiently by having more than one rendering engine version in the same browser. You could have an "edge" rendering engine, which is in BETA and allows developers to use unstable features themselves, and a "stable" one everybody else would use. There's no noticeable slowdown when you use the "stable" version: the 'new' property simply doesn't exist in that rendering engine.

When Microsoft wants to add support for a non-standard property, it does so by shipping "Platform Previews" that developers can use to test-drive the new functionality while keeping the web in a clean state. If a feature isn't safe enough to be published in a stable version, it isn't published. Other browser vendors should do the same.

Microsoft is an example to follow because they took part in the First Browser War and introduced a bunch of proprietary extensions that took them years to get rid of. They don't want to take the risk of the same situation arising again, and that's why their model is so prudent (on that specific point).

BTW, to work efficiently, UA implementors would need a way to "REC-ise" part of a working draft (the specific properties they all want to implement in their "stable" engines).

5. On Saturday 5 November 2011, 15:30 by Asbjørn

No way. This is definitely not the right solution. I think that a common prefix for experimental properties would work better, because you are right, the current situation is a mess (though a mess that can be eased significantly using LESS and similar tools).
But this issue is really a result of the W3C being slow as molasses. Come on, CSS 2.1 has only just been finalized, and it has been in development since the late '90s. We have all moved on to CSS3 in the meantime, and where is the W3C? Judging by the current rate, we will maybe have a finalized CSS3 in 2020 - if we're lucky. The W3C needs to adopt a pace appropriate for the web platform, that is, at most 2 years from idea to final specification.

6. On Saturday 5 November 2011, 15:43 by CAFxX

IMHO one of the few viable ways to solve this is to avoid monolithic specifications (a la "CSS3") and start working on microspecs (a la "CSS-[Feature]-[Version]", e.g. "CSS-RoundedBorders-4") with explicit versioning inside stylesheets (e.g. "@css { BoxLayout:2; RoundedBorders:4 }" to signify that the developer expects the current stylesheet to be interpreted according to CSS-BoxLayout-2 and CSS-RoundedBorders-4).
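As a purely hypothetical illustration of that idea (the @css rule, the feature names, and the version numbers come from the comment above; no browser implements anything like this), a stylesheet might declare its target spec versions up front:

```css
/* Hypothetical microspec versioning: the stylesheet declares which
   spec versions its declarations should be interpreted against. */
@css {
  BoxLayout: 2;
  RoundedBorders: 4;
}

.panel {
  border-radius: 8px; /* to be interpreted per CSS-RoundedBorders-4 */
}
```

A UA implementing only an older version of a microspec would then know the author expected different semantics, instead of silently guessing.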

Just my two eurocents...

7. On Saturday 5 November 2011, 15:59 by Mike Taylor

+1. The notion that CSS3 or the W3C is moving too slowly is perhaps a symptom of the real problem: ignorance of what standardization means, and of the fact that evolving features are just that, evolving. Gradients are a mess, Flexbox is a mess - why? They're unstable, moving targets with multiple slightly different implementations in the wild.

Do we really want stable cross-browser features? That takes experimentation on the part of both authors and vendors (and time for the dust to settle). Some kind of debug mode would allow this to happen in an environment that is explicitly experimental, rather than in production sites (and elsewhere) that have to keep up with changing specs and syntax. In my own experience, authors rarely go back to update CSS because of a new spec version.

8. On Saturday 5 November 2011, 16:20 by Mardeg

I think the idea has merit, especially if all vendors used the same "experimental" prefix so styles aren't all bloated or fail to parse when we try to combine all the vendor prefixes in a comma separated list for a single rule. But not for CSS3, it should only be considered for CSS4. That way I'll die of old age before having to deal with it.

9. On Saturday 5 November 2011, 16:36 by Dao

"But because the feature was made entirely public in an unstable form (ie not discussed with other vendors), it's all over the place now and we can't really get rid of it."

This doesn't seem to make any sense. The W3C doesn't need to get rid of vendor-prefixed gradient syntaxes. The W3C simply shouldn't care about them. It should standardize something for vendors to implement without a prefix.

10. On Saturday 5 November 2011, 16:45 by Marat Tanalin

It's not the spec's (or the W3C's) responsibility at all to dictate default settings to browser vendors. It's entirely a vendor's right to implement anything, as long as it is prefixed.

Anyway, your idea is a BAD one since:

— the progress of the web would be slowed down enormously, since manually enabling a setting is no better than installing a custom build -- nobody will do this;

— prefixed CSS properties would require JavaScript to be enabled, which would be odd, to say the least.

Completely wrong way.

11. On Saturday 5 November 2011, 19:11 by Federico Brigante

First you talk about how suboptimal it is for developers to have multiple property names, then you propose to block them from using those names for that very reason. I understand you want the best for the web dev community, but this seems "Apple-like" rather than Internet-like. If developers want to be cutting-edge, let them live in "this hell"; people who don't want hell can just stay out of it.

12. On Sunday 6 November 2011, 00:04 by Henrik

I agree with your description of the problem but I would like to put forward another suggestion instead.
Why not make it a breach of the standard to have a vendor-specific (prefixed) implementation side by side with the standard one? That way, everyone who uses a vendor prefix would take extra care, since they know it will go away in upcoming versions. The risk, of course, is that usage becomes so widespread that the glory of getting a few extra points in the standards race is not worth it...

13. On Sunday 6 November 2011, 09:04 by Robert O'Callahan

The problem with gradients is that we had a "good enough" spec (much better than Webkit's original implementation) quite a long time ago, but various people on www-style can't resist the desire to keep making incremental (non-compatible) "improvements". The latest round of suggestions change the syntax to something completely different that is nothing like anyone's experimental prefixed implementation.

14. On Sunday 6 November 2011, 20:50 by Ian Thomas (thelem)

I don't really see what benefit this brings. If a web developer doesn't want to use these extensions then they can just ignore them until a standardised version is ready. If they don't mind the hassle of checking browser compatibility then they can use the cutting edge features.

15. On Monday 7 November 2011, 12:14 by sporniket

I'm OK with that proposal: it explicitly says "you're using this bleeding-edge feature at your own risk, don't require others to support it".

Personally, I already have too much work - for my taste - making things work the same across several browsers with standard properties.

16. On Monday 7 November 2011, 12:52 by MarkC

I would suggest having three property names:

1) A vendor prefixed version
2) An "experimental" prefixed version that is a synonym for the browser's vendor prefixed version
3) The final W3C approved name

Now let's suppose that all browsers quickly converge on a compatible syntax for a feature. As a web developer I can use #2, the "-exp-feature" name, and know that each browser will map it to "-moz-feature", "-o-feature", "-webkit-feature" as appropriate. When the feature is approved, I can drop the "-exp-" prefix (or add another line so that older UAs still work - but at least it's only two lines to support).

Now consider a feature that is mostly interoperable but differs in one browser (such as Gradients in WebKit). Now I have one rule, "-exp-gradient", which covers most browsers, and a subsequent "-webkit-gradient" rule to provide the WebKit-specific syntax.
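In stylesheet terms, that scheme might look like the following sketch (the -exp- prefix is hypothetical; the -webkit-gradient form is the real divergent syntax):

```css
/* Hypothetical -exp- prefix: each browser maps it to its own
   vendor-prefixed implementation where the syntaxes agree, and a
   later vendor-specific declaration overrides it where they don't. */
.header {
  background: -exp-linear-gradient(top, #ffffff, #999999);
  background: -webkit-gradient(linear, left top, left bottom,
                               from(#ffffff), to(#999999)); /* WebKit-only syntax */
}
```

Browsers that don't recognise -webkit-gradient fall back to the -exp- declaration; WebKit, which recognises both, keeps the later one.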

For really early experimentation, a browser might *only* recognise its vendor prefix, adding the "-exp-" synonym only as things stabilise.

This approach keeps things flexible enough to allow vendors to experiment, but reduces the burden on authors when there is broad compatibility across browsers.