

When I validate any of the pages, it says there are meta tags in uppercase and not closed properly, nesting errors, tags not allowed and more. The company told me their code was written with Dreamweaver 8 and was correct. They pointed out that if I validated Google's or Firefox's or Microsoft's main page, those pages show errors too. So what is the value of WW3 validation, and is "bad" code in this case actually bad? Will it not work in some browsers? For the moment I have no answer to give them. The site, which is half-live (!), is villa-stbarths com. I wrote another website myself, henry-bar net, which has no errors (I corrected them all) and it works fine. Should I pursue my gripe or let it go? Will it work as is? Henry

The value in building with standards is cross-browser compatibility and sustainability. Also, the fewer errors there are in your code, the less trouble you'll have working with it in the future, and the less trouble other people will have working with it.

As others have said, the main advantage of code that validates is that you can be reasonably sure that different browsers will interpret and therefore display it correctly and consistently. Another advantage is that it helps you sleep better if you are crazy about this sort of thing.

But an important advantage is that it shows that you know how to code properly. A web-design company that makes invalid code is simply a company that doesn't know how to do its job.

Originally Posted by henrybarnett

When I validate any of the pages it said there were Meta tags in uppercase and not closed properly, nesting errors, tags not allowed and more.

That means the page would not work if it were served as real XHTML (application/xhtml+xml): you'd just get a Yellow Screen of Death in Firefox. The page has to be served as text/html to work at all, which is really an abuse of XHTML. The infamous W3C note says you 'may' serve a subset of XHTML markup as text/html, but it implies that the document must still work when served as real XHTML.
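To illustrate the kind of errors being reported, here is a hypothetical fragment (not taken from the actual site) showing the uppercase, unclosed style next to an XHTML-valid equivalent:

```html
<!-- Invalid in XHTML: tag names are case-sensitive (must be lowercase)
     and every element, including meta, must be closed -->
<META http-equiv="Content-Type" content="text/html; charset=utf-8">

<!-- Valid XHTML 1.0 equivalent: lowercase and self-closed -->
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
```

Because XHTML is XML, a strict parser treats `<META>` and `<meta>` as different element names and refuses a document where the element is never closed; that refusal is exactly the Yellow Screen of Death.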

Originally Posted by henrybarnett

They pointed out that if I validated Google's or Firefox's or Microsoft's main page, they will and do show errors.

That's like saying, 'Since <insert famous person's name> cannot spell correctly, I don't have to care about spelling either'. Two wrongs do not make a right.

Originally Posted by henrybarnett

So what is the value of WW3 validation and is "bad" code in this case bad?

The value is quality control. Invalid code can be harmless or disastrous, depending on what's wrong with it. By ascertaining that there are no formal errors, you can devote your energy to working round browser bugs and quirks instead.
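As a hypothetical illustration of that range (neither fragment is from the site in question), some validation errors are nearly cosmetic while others leave each browser to guess at your document structure:

```html
<!-- Usually harmless: an unencoded ampersand in text; validators
     flag it, but browsers recover predictably -->
<p>Fish & chips</p>

<!-- Potentially disastrous: misnested tags; each browser repairs
     the tree its own way, so styling and scripts that depend on the
     structure may behave differently from one browser to the next -->
<p><b><i>bold italic?</b></i></p>
```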

Originally Posted by henrybarnett

Will it not work in some browsers?

Who knows? Perhaps it will work in any browser today, but what about tomorrow? Browsers may become stricter and less buggy. What if some browser, defying the rules, decides to parse XHTML markup as XHTML even if it's served as HTML? It's not entirely inconceivable; Internet Explorer does content sniffing already, and will render an HTML page even if it's served as text/plain.

Originally Posted by henrybarnett

Should I pursue my gripe or let it go? Will it work as is?

Consider an analogy: Let's say you hired a professional copywriter to produce the site's content, and got back a horrible mess with spelling mistakes and grammar errors. Would you accept that? Would every reader be able to understand it?

You all seem to be saying what I thought. I've paid for a website, and if it is correctly and professionally written it should be correct for today (and tomorrow too). The analogy that 'since <a famous person> cannot spell, it's OK if there are spelling mistakes' is bang on. I can see a battle ahead. They said 'we write it with DW8', but many of you say that DW is verbose to say the least, and that a website writer should be able to correct bad code without relying on DW or another wysiwyg tool, so the company needs reminding of its responsibilities. Watch this space!

Dreamweaver is perfectly capable of creating valid strict code if it is configured and used correctly. You need a good knowledge of HTML yourself to be able to configure it to work that way though - often it works out easier to just work from the code view directly rather than figuring out the configuration and correct way of using the wysiwyg option to generate valid code.

I don't see how work can even be seen as "finished" if the code isn't valid. Sure, I've had many clients who take the code and eventually modify it so that it is no longer valid (most don't understand validation either), but as a web designer, my final product has to be valid for me to feel like I did what I promised my client.

If I were you, I'd request valid code for your website. As members before me have stated, why settle for an incomplete job?

I guess you mean the W3C validator, not WW3 (World War 3), though I find the comparison rather funny!

The company who told you that the code was correct was (not to mince words) lying to you. The W3C validator is indisputable, and the errors you saw are a valid concern, because they may affect the way your website is displayed (some browsers may render elements incorrectly). What this really shows is the lack of knowledge of the people you paid: they obviously do not have the decency to do the job you paid them for properly, and trying to push you off with lies shows the lack of respect they have for clients. That many of the top businesses have non-validating websites is beside the point. If it were me, I would demand that the website validate, and otherwise refuse payment on the grounds that they are selling you a buggy website which is obviously not fit for purpose. (But I am pretty web-standards strict) lol

Browser rendering works much the same no matter what platform the end user uses, with a few exceptions. Font sizes may appear different based on the resolution, DPI and typeface availability; colors may look different depending on the color depth of the machine and its monitor; and layout, overflow and scrolling differences may occur depending on the screen size or device. There is a real difference in how a website will look on a mobile device as opposed to a desktop or laptop, but generally speaking, visiting a website using Chrome on your machine will look the same as Chrome on my machine, unless of course the user chooses to override settings such as disabling stylesheets or scripting, custom styling, text resizing and zoom.

Remember that there are a variety of web browsers with their own rendering engines: IE, Firefox and Chrome each have an independent engine, and it may also be worth checking the website in Opera, Safari and older versions of IE (through a product like IETester).

How search engines interpret your website is pretty cryptic; although we have best-practice guides, we do not have any standard information to work with. What I would say is that if your meta tags are not properly written, and (especially) your content is not marked up correctly and does not at minimum pass validation, your search engine ranking may well be compromised as a result. Also note that META tags these days are often ignored by places like Google, who focus more on the "meat" of the website than the META, which makes it all the more important to make sure your website validates and is coded properly.

Originally Posted by henrybarnett

If the site seems to work on my computer in FF, IE and Chrome (and there are others), does it follow that it will work like that on your computer?

Unfortunately, no. It's likely, but not certain. Other users can have a different monitor resolution (I'm talking DPI here, not monitor size which people incorrectly call 'resolution'). They may not have the fonts that the page uses. They may have set a minimum font size due to visual impairment. They may have plug-ins, JavaScript, images and/or CSS disabled. And so on...
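One practical response to those unknowns is to code defensively. A minimal CSS sketch (hypothetical values, not taken from the site under discussion):

```css
body {
  /* Fallback stack: if Verdana is missing, the browser tries the next
     font, ending in a generic family every platform can supply */
  font-family: Verdana, Arial, Helvetica, sans-serif;
  /* Respect the user's configured default size instead of fixing pixels */
  font-size: 100%;
}
h1 {
  /* em units scale with the user's setting, so an enforced minimum
     font size degrades the layout gracefully instead of breaking it */
  font-size: 1.5em;
}
```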

Originally Posted by henrybarnett

I also presume that if the doctype is XHTML then meta tags in uppercase and not closed are ignored by some spiders etc?

I assume the pages will be served as text/html (since no version of Internet Explorer has any support for XHTML whatsoever). If so, it IS HTML, not XHTML, as far as browsers, spiders and other user agents are concerned. All browsers (except a very obscure one) have forgiving HTML parsers that let you get away with the (in HTML) incorrect '/>' syntax for certain elements, and spiders are equally forgiving.
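The difference is easy to demonstrate outside a browser. This small Python sketch (my own illustration, nothing from the sites discussed in this thread) feeds the same sloppy markup to a strict XML parser and to a forgiving tag-soup HTML parser, which is roughly the difference between serving a page as application/xhtml+xml and as text/html:

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Uppercase META, never closed -- valid-ish HTML, not well-formed XML
sloppy = '<html><head><META charset="utf-8"><title>Hi</title></head></html>'

# A strict XML parser rejects the document outright: META is never
# closed, so the markup is not well-formed. This is the parse failure
# behind Firefox's Yellow Screen of Death for real XHTML.
try:
    ET.fromstring(sloppy)
    xml_ok = True
except ET.ParseError:
    xml_ok = False

# A tag-soup HTML parser shrugs, lowercases the tag name and carries on.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(sloppy)

print(xml_ok)                     # False: rejected as XML
print('meta' in collector.tags)   # True: accepted (and normalized) as HTML
```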

Not really :/ The differences between say, FF3 on Ubuntu and FF3 on Vista, and IE7 on Vista and IE7 on XP, can be significant.

Those differences are pretty much the exceptions Alex already listed though.

Ubuntu's problem is that it comes with Gnome. Gnome messes with the widths of things Firefox normally has more control over, like form submit buttons and text inputs. In my browser, almost everyone's single-line search bar wraps to two lines, and that never happens in Firefox on a Windows machine (I don't know about KDE; I'd love to know). But this has something to do with a combination of default fonts, font sizing and resolution, I'm pretty sure.

As for IE7 on Vista versus XP, I'm not sure what those differences would come from. We're assuming they aren't the ones Alex mentioned (especially if the XP is a virtual machine on the Vista box), where resolution etc. are all the same? Though I've noticed from posts here that the new fonts that came with Vista are rather small. A compensation for some change in Vista?

Originally Posted by henrybarnett

I also presume that if the doctype is XHTML then meta tags in uppercase and not closed are ignored by some spiders etc?

I would be surprised if a spider rejected META in place of meta. Since most of the content a spider looks at is text content, I would imagine it's moot. But that is my speculation only.