Let me put it mildly.
There was a certain laxness on the part of the coders at Twitter. They simply disregarded Rule 1:
Regard any user-generated input as tainted until proven benign.
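A minimal sketch of what such a taint check might look like – the function name and the whitelist regex are purely illustrative, not Twitter's actual validation:

```python
import re

# Hypothetical whitelist for a tweet-like text field: word characters,
# whitespace and a handful of harmless punctuation marks. Anything
# outside the whitelist marks the input as tainted.
SAFE = re.compile(r'^[\w\s.,:;!?@#\'"-]*$')

def is_benign(text: str) -> bool:
    """Return True only if every character is on the whitelist."""
    return SAFE.match(text) is not None

print(is_benign('Just a harmless tweet!'))     # True
print(is_benign('<script>alert(1)</script>'))  # False
```

Whitelisting (prove it clean) rather than blacklisting (spot the evil) is the point of the rule: the burden of proof lies on the input.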
Above a certain volume of traffic, taint checks can become rather resource-intensive – to the point where the overhead becomes prohibitive. For this situation existeth Rule 2:
If you can’t prove your user’s input benign → escape the living shit out of it by default!
No script gets by without comparisons, assignments or plain old strings – plus certain special characters whose absence will trigger a parse error.
By simply HTML-escaping all user-generated input you disable all size comparisons, boolean logic and bit-shifts, and send every string to the nirvana of unterminated literals. For good measure, also encode every instance of = and you have nailed assignments as well.
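As a sketch, here is the whole rule in a few lines of Python – `escape_all` is a hypothetical helper built on the standard library's `html.escape`, which handles the angle brackets, ampersands and quotes; the extra replace takes out assignments:

```python
import html

def escape_all(user_input: str) -> str:
    """HTML-escape user input aggressively: & < > and both quote
    styles via html.escape, then encode '=' to kill assignments."""
    escaped = html.escape(user_input, quote=True)  # & < > " '
    return escaped.replace('=', '&#61;')           # no assignments either

print(escape_all('<script>a="x"</script>'))
# → &lt;script&gt;a&#61;&quot;x&quot;&lt;/script&gt;
```

Whatever survives this is inert markup: no `<` for comparisons or tags, no quotes to open a string, no `=` to assign with.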
Life would of course be a lot easier if browsers were not as lenient as they are. The recent XSS exploit that bit Twitter wasn’t even wrapped in a CDATA section, yet browsers ran it anyway instead of throwing an XML parse error, as they should have in the face of unescaped comparison operators, bare ampersands and other garbage in the middle of an XHTML doc.
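You can watch a strict parser do the right thing with nothing more than Python's stdlib XML parser – the fragment below is the kind of unescaped script soup a lenient browser happily renders:

```python
import xml.etree.ElementTree as ET

# Unescaped '<' and '&&' inside element content: invalid XML.
fragment = '<p>if (a < b && c) document.write("pwned")</p>'

try:
    ET.fromstring(fragment)
    print('parsed fine')
except ET.ParseError as err:
    print('parse error:', err)
```

A standards-enforcing XHTML browser would behave like `ET.fromstring` here: refuse the whole document rather than guess its way into executing the payload.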
90% of all websites would either break or vanish completely if browsers enforced standards. About time they did – for all our safety.
Update, Mon, 13 Apr 2009 21:06:13 +0200
As usual, users of Firefox who surf with the NoScript add-on need not worry. By default, NoScript nails the Twitter worm.