Redirects For A Global Website

I am currently working on a client's global website and will be the first to admit I am not the most technically astute person. The client has a global website, www.client.com, which has a 302 redirect set up to //www.client.com/index.jspx;jsessioni...E9F44CAADE.pa05, which is then redirected to the English version of the site, www.client.com/en-US/index.jspx;jse...E9F44CAADE.pa05.

So in the search results for the brand, you see www.client.com, but when you click the result, you are redirected to www.client.com/en-US/index.jspx. They have IP detection in place so that the main site, www.client.com, detects where you are and redirects you to the proper international version of the site.

So my question is: is the 302 redirect the proper HTTP response for www.client.com, or should it be a 301 redirect? Or should www.client.com return a 200 OK, as all of the different versions of the site do (www.client.com/en-US/index.jspx, for example)? I want to make sure the proper redirects are in place so that all link popularity is funneled and transferred properly, and that multiple URLs aren't indexed for the same page because things aren't redirecting properly.
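To make the 301-versus-302 distinction concrete, here is a minimal sketch (not the client's actual setup) using Python's standard library: a toy server whose root issues a permanent 301 redirect to the English home page, the behavior most SEO advice recommends for a root that must forward somewhere. The path /en-US/index.jspx is simply borrowed from the question as an illustration.

```python
import http.server
import http.client
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Toy server: the root issues a permanent redirect, the way
    a 301 setup for www.client.com would behave."""

    def do_GET(self):
        if self.path == "/":
            # 301 tells crawlers the canonical home lives at /en-US/index.jspx
            self.send_response(301)
            self.send_header("Location", "/en-US/index.jspx")
            self.end_headers()
        else:
            # Every real page answers 200 OK
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html>English home page</html>")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Fetch the root and inspect the status line the way a crawler would
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))
```

A 302 in the same spot tells crawlers the move is temporary, which historically meant the redirect target was less likely to inherit the root URL's link popularity.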

They have a few other branded websites, and all of them handle this situation differently, which makes it difficult to say. Another one of their sites redirects in this manner:

Redirecting from the root to a subdirectory or file is bad, for various reasons. If they can't place the site in the root (which, as developers, they can; they just can't be bothered), then set the folder/file up to point there in .htaccess.
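If .htaccess is the route they take, a hedged sketch of that rule (assuming an Apache server with mod_rewrite enabled, and using /en-US/index.jspx from the question purely as an example target) might look like:

```apache
# Hypothetical .htaccess at the web root -- permanently redirect
# a bare request for the root to the English home page.
RewriteEngine On
RewriteRule ^$ /en-US/index.jspx [R=301,L]
```

The R=301 flag makes Apache answer with a permanent redirect rather than its default 302.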

Okay, there are definitely some redirect issues you'll probably want to get a better handle on. But that's not the part that concerns me first and most.

In your question you indicate the final URL has what looks to be a session ID in it; in this case a Java session ID, by the looks of it. That's what the jsessionid part is.

So my question is whether or not these session IDs actually show up in the URL strings. And if they do for you in a normal browser, does the server give the search engines a free pass around them? Or is it a required element? Or a required element only if the browser/user agent doesn't allow a cookie to be set?
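Java servlet containers append ;jsessionid=... to the path (URL rewriting) when the client won't accept a cookie, which is why it can end up in indexed URLs. When comparing what the engines have indexed against the canonical URL, it helps to normalize that parameter away. A small sketch (the helper name is my own, not anything from the site in question):

```python
import re

def strip_jsessionid(url: str) -> str:
    """Remove a ';jsessionid=...' path parameter from a URL.

    The session ID sits between the path and any query string or
    fragment, so strip everything from ';jsessionid=' up to the
    next '?' or '#'.
    """
    return re.sub(r";jsessionid=[^?#]*", "", url, flags=re.IGNORECASE)

clean = strip_jsessionid(
    "http://www.client.com/en-US/index.jspx;jsessionid=ABC123.pa05")
print(clean)  # http://www.client.com/en-US/index.jspx
```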

Most likely you're going to want to do a little research and try to get as close as possible to viewing the site the same way the search engine spiders do. You can do this somewhat by using the site: operator at the various search engines, paying attention to the URL structure of the pages they have indexed and reviewing the cache they have on file for those pages.

Or another way to do it is to set up your browser to look (and act) like the search engine spiders. In other words, you'll want to tweak the way the browser identifies itself so that it looks like Googlebot or Slurp, and also have it refuse cookies and not process JavaScript or client-side Java.
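The same check can be scripted. A sketch using Python's urllib (the User-Agent string below is Google's published crawler identifier; the URL is just the one from the question): urllib sends no cookies and runs no JavaScript unless you add them, which already matches how a cookieless spider behaves.

```python
import urllib.request

# Googlebot announces itself with this User-Agent string.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def spider_request(url: str) -> urllib.request.Request:
    """Build a request that presents itself as Googlebot.

    urllib sends no cookies by default, so the server sees a
    cookieless client -- the case where jsessionid URL rewriting
    typically kicks in.
    """
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

req = spider_request("http://www.client.com/")
# urllib.request.urlopen(req) would then fetch the page as the
# spider sees it; inspect the final URL and body for jsessionid.
```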

That's where I'd start. Mostly because whether the pages can be spidered and cached properly is the most important thing. Only when you've confirmed the pages can be spidered and cached do you need to start sorting out the redirect stuff. That's a whole other ball game.