Category: CSS Help
HTML / CSS identifiers

"_" (underscore) is a natively valid character in HTML identifiers. Brilliantly, it is not a natively valid character in CSS identifiers. However, you can indicate the underscore in a CSS identifier by using a hexadecimal Unicode escape sequence (\00005F).
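As a sketch of what this looks like in practice (the id "MY_SECTION" is a made-up example):

```css
/* Assumes an HTML element with id="MY_SECTION".
   \00005F is the six-digit escape for U+005F, the underscore. */
#MY\00005FSECTION { border: 1px solid black; }
```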

So, I have an HTML document with elements whose identifiers contain underscores (e.g. 'SECTION_QUALIFICATIONS'), and a stylesheet that references those identifiers using the escape sequence in place of the underscore (i.e. 'SECTION\00005FQUALIFICATIONS'). This works as expected in NS 6.1, but it is not working in IE 5.0 (I could swear it was last night...). If I use a literal underscore character in the stylesheet instead of the escape sequence, it works in IE 5.0, but I'm not going to do that because it's not valid CSS, and the whole point of what I'm doing is to make a W3C-valid page.

Does anyone know anything about this? Is it simply IE 5.0 being f'ed up and not properly supporting CSS? I just did a little experiment, and I can see that IE is not handling the escape sequences properly: I left the escapes in the stylesheet and changed the identifier of an element, replacing the literal underscore with the literal text "\00005F", and then things worked in IE — so IE is treating the escape as ordinary characters rather than decoding it.

So does anyone know why IE would not parse the escape sequences? Is this a character encoding issue (I'm using UTF-8)?

Any help with this appreciated. It's going to be galling if this can't be resolved, because all of the stuff I'm actually doing with CSS works, and I really don't want to change my elements' identifiers.

Try using this:
\u00005F

With some experimentation, I find that an underscore in IE and NS6 is
\u005F

Thanks for responding, Nemi. I'm not sure what you're describing; \u005F did nothing for me in either browser. Anyway, after I made this post I continued my research and eventually found my way into the CSS2 errata, which reveal that the underscore is in fact supposed to be allowed in identifiers. Apparently this isn't (yet?) reflected in the W3C CSS validator, though. As for the escape sequences, according to the spec, this is the syntax:



Third, backslash escapes allow authors to refer to characters they can't easily put in a document. In this case, the backslash is followed by at most six hexadecimal digits (0..9A..F), which stand for the ISO 10646 ([ISO10646]) character with that number. If a digit or letter follows the hexadecimal number, the end of the number needs to be made clear. There are two ways to do that:

1. with a space (or other whitespace character): "\26 B" ("&B")
2. by providing exactly 6 hexadecimal digits: "\000026B" ("&B")
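A minimal sketch of those two forms as CSS rules (the class names are hypothetical):

```css
/* Both declarations produce the text "&B" (\26 is U+0026, "&"). */
.space-terminated:before { content: "\26 B"; }    /* escape ended by a space */
.six-digit-form:before   { content: "\000026B"; } /* exactly six hex digits, so no terminator needed */
```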


Netscape 6 properly evaluates the escape sequence, IE 5 does not. But the literal underscore is allowed, thank god.

Cool, I am glad you got it working. What I did was something like:


<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>
<head>
<title>Untitled</title>
<script type="text/javascript">
alert("\u005F");
</script>
</head>
<body>
</body>
</html>


That puts an underscore in an alert when the page loads in either IE or NS6. I didn't go all out and do it with identifiers though.

Oh, OK, I see what you mean — that's the JavaScript string escape. In CSS everything is Unicode, so the escapes are just \###### (# = hexadecimal digit), or the alternate whitespace-terminated form.
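For completeness, here are both CSS escape forms for the underscore (U+005F), sketched with a hypothetical id of "MY_SECTION" — both selectors match the same element:

```css
#MY\00005FSECTION { color: green; } /* exactly six hex digits */
#MY\5F SECTION { color: green; }    /* short form; the space terminates the escape and is consumed */
```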
