2) The URL spec (from either source; per the above, it doesn’t matter which) is as backwards compatible with RFC 3986 + RFC 3987 as HTML5 is with HTML4, which is to say that it is not. There are things specified by prior versions of those specs that were never implemented, are broken, or don’t reflect current reality as implemented by contemporary web browsers.
3) Some (Roy Fielding in particular) would prefer a more layered approach, where an error-correcting parsing specification is layered over a data format, much in the way that HTML5 is layered over DOM4.
Analysis of points 1, 2, and 3 above:
1) What this means is that any choice between WHATWG and W3C specs is non-technical. Furthermore, any choice to wait until either of those reaches an arbitrary maturity level is also non-technical. It doesn’t make any sense to bring any of these discussions back to the HTML WG as these decisions will ultimately be made by W3C Management based on input from the AC.
2) In any case where the URL spec (either one; it matters not) differs from the relevant RFCs, from an HTML point of view the URL specification is the correct one. This means that tools other than browsers may parse URIs differently than web browsers do. While clearly unfortunate, this will likely take years, possibly a decade or more, to resolve.
3) If somebody were willing to do the work that Roy proposes, it could be evaluated; to date, though, quite a few parties have had good ideas in this space but haven’t delivered on them.
RFC 3986 provides for the ability to register new URI schemes; the WHATWG/W3C URL specification does not. URIs that depend on schemes not defined by the URL specification would therefore not be compatible. Anne has indicated a willingness to incorporate specifications that others may develop for additional schemes; however, he has also indicated that his personal interest lies in documenting what web browsers support.
Meanwhile, this is a concrete counterexample to the notion that the URL specification is a strict superset of RFC 3986 + RFC 3987. Producers of URLs that want to be conservative in what they send (in the Postel sense) would be best served by restricting themselves to the as-yet-undefined intersection of these sets of specifications.
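One way to make the scheme gap concrete: Python’s urllib.parse follows RFC 3986’s generic syntax, so it parses an arbitrary, even unregistered, scheme the same way it parses http, whereas a WHATWG-style parser maintains a fixed list of “special” schemes with their own rules. A minimal sketch — the “example” scheme and host below are hypothetical, chosen purely for illustration:

```python
from urllib.parse import urlparse

# RFC 3986 defines one generic syntax for all schemes, so a parser
# like urllib.parse handles a newly registered (or made-up) scheme
# exactly the way it handles http:
generic = urlparse("example://host.test/a/b?q=1")
print(generic.scheme, generic.netloc, generic.path)  # example host.test /a/b

# A scheme-specific form without an authority component still parses
# under the same generic rules:
mail = urlparse("mailto:user@example.com")
print(mail.scheme, mail.path)  # mailto user@example.com
```

A WHATWG-conformant parser gives its special schemes (http, https, ws, file, and a few others) extra normalization and error correction that generic schemes don’t get, which is exactly why the two sets of rules can diverge for anything outside that list.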
While I am optimistic that at some point in the future the W3C will feel comfortable referencing stable, consensus-driven specifications produced by the WHATWG, it is likely that some changes will be required to one or both organizations for this to occur. Meanwhile, I encourage the W3C to continue on the path of standardizing a snapshot version of the WHATWG URL specification, and HTML5 to reference the W3C version of the specification.
Furthermore, there has been talk of holding HTML5 until the W3C URL specification reaches Candidate Recommendation status. I see no basis for this in the requirements for Normative References. HTML5’s dependence on the URL specification is weak; the open bugs have been analyzed, and a determination made that those changes would not affect HTML5. Furthermore, the value of a “CR” phase for a document that is meant to capture and catch up to implementations is questionable. Finally, waiting any small number of months won’t address the gap between URLs as implemented by web browsers and URIs as specified and used by formats such as RDF.
Should a more suitable (example: architecturally layered) specification become available in the HTML 5.1 time-frame, the HTML WG should evaluate its suitability.
New laptop for work: MBP 15.4/2.6/16GB/1TBFlash. First time I ever went the Apple route. I did so as I figured with those specs, I could run multiple operating systems simultaneously. So far, so good. I’m using VirtualBox to do so.
Notes for Mac OS X 10.9, Ubuntu 14.04, Windows 8.1, and Red Hat Enterprise Linux 6.5.
Joe Gregorio: But something else has happened over the past ten years; browsers got better. Their support for standards improved, and now there are evergreen browsers: automatically updating browsers, each version more capable and standards compliant than the last. With newer standards like HTML Imports, Object.observe, Promises, and HTML Templates I think it’s time to rethink the model of JS frameworks. There’s no need to invent yet another way to do something, just use HTML+CSS+JS.
I’m curious as to where Joe believes that these features came from.
My current service is “Standard Cable” (70+ channels, no premium ones) and “Standard Internet” (nominally 15 Mbps down, 1 Mbps up). At the end of the month, I will have had basic cable with Time Warner at the same location for 22 contiguous years, and standard Internet for more than half of that.
With that context: today I received notification in the mail that my rates are set to go up by 60% as my “Promotional” rates (seriously? a twenty-two-year-long promotion?) are expiring. After spoofing my User-Agent, as the chat function doesn’t recognize my browser/operating-system combination, I verified with “Veronica” that this is indeed the plan. I was then provided a transcript and directed to an online survey, which promptly logged me off once I had completed it, without submitting my feedback.
Based on this idea, I created a Wunderbar jquery filter to “desugar” Wunderbar calls into JQuery calls. The tests show some of the conversions. I also updated my Bootstrap modal dialog directive to make use of this: before => after.
Tim Bray: We’re at an inflection point in the practice of constructing software. Our tools are good, our server developers are happy, but when it comes to building client-side software, we really don’t know where we’re going or how to get there.

While I agree with much of this post, I really don’t think the conclusion is as bad as Tim portrays things. I agree that there are good server-side frameworks, and that doing things like MVC is the way to go. I just happen to believe that this is true on the client too, including MVC. Not perfect, perhaps, but more than workable. And, full disclosure, I’m firmly on that side of the fence.
Leonard Richardson: Hey, folks, I got some pretty exciting news. Now that RESTful Web APIs has come out, there’s really no reason to buy 2007’s RESTful Web Services. So Sam Ruby and I and O’Reilly have gotten together and started giving the old book away. You can get a PDF from the RESTful Web APIs website or from my now-ancient RESTful Web Services site. The license is BY-NC-ND.
I finally debugged why my cable service was so poor. Long story short: an inexplicable 7 dB drop in the incoming line, a bad arrangement of splitters, and another unexplained 7 dB drop someplace in the house, which leads to the following question:
If Time Warner Cable is moving towards digital only service, shouldn’t they be providing enough signal strength to drive all of the devices in the house?
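The losses above add up quickly, since splitter and cabling losses in dB simply sum. Here is a rough signal-budget sketch; the splitter figures are typical published insertion losses, and the tap level is an assumed illustrative value, not a measurement from my house:

```python
# Rough downstream signal budget for the situation described above.
# The two 7 dB drops are from the post; the splitter losses are
# typical values (about 3.5 dB per 2-way leg, 7 dB per 4-way leg).
INCOMING_DROP_DB = 7.0            # unexplained drop on the incoming line
HOUSE_DROP_DB = 7.0               # second unexplained drop inside the house
SPLITTER_LOSSES_DB = [3.5, 7.0]   # e.g. a 2-way splitter feeding a 4-way

total_loss = INCOMING_DROP_DB + HOUSE_DROP_DB + sum(SPLITTER_LOSSES_DB)
print(f"Total loss: {total_loss:.1f} dB")  # Total loss: 24.5 dB

# If the tap delivered, say, +10 dBmV (an assumed figure), an outlet
# at the end of that chain would see:
outlet_level = 10.0 - total_loss
print(f"Outlet level: {outlet_level:.1f} dBmV")  # Outlet level: -14.5 dBmV
```

A level around -14.5 dBmV sits below the roughly -10 to +10 dBmV range many cable boxes and modems are comfortable with, which is consistent with the poor service described.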