Hello Sean and all the team,
I'm joining the party a bit late, but I wanted to share some thoughts
about the text proposed for this action.
I'm afraid I have different views on a number of topics; I'll try to
keep this as short as possible.
I will quote the relevant text, but in case others, like me, have just
joined the list, the original e-mail from Sean Patterson is here:
http://lists.w3.org/Archives/Public/public-bpwg-ct/2007Sep/0029.html
> Chapter 2.3.1.2 Most content is not designed for mobile devices
> [CUT]
> "Regular web content frequently assumes that it will be displayed
> using the hardware of a desktop computer. Content transformation
> servers can reduce the hardware requirements of the content so that
> it works better on a mobile device."
I agree that most of the content available on the web today is not
yet designed for mobile devices. I disagree, however, that the
objective of transcoding engines is to reduce the hardware
requirements of web pages: the web and HTML do not inherently require
much CPU power. Today's web sites are unsuitable for mobile mostly
because they assume a bigger screen and a different input method, so
I think referring to "hardware" is not correct. I am also sure many
3G operators would disagree with the implication that their networks
are not broadband and fully connected.
I think the value of transcoding engines lies in adapting images and
layouts that were not designed for mobile, working around JavaScript
where possible, and so on.
> Chapter 2.3.1.5 Eliminates the need for a least common denominator
> solution
> "One approach to the problem of the variation of mobile devices is to
> create a "least common denominator" page that works on all (or almost
> all) mobile devices. "
Could you please explain why this approach would surpass the LCD
approach? Not that I think LCD is the best possible approach, but I
don't think that building pages for desktop computers and passing
them through a transcoding engine would make them look better or be
more accessible on a mobile device.
I think the best way to let transcoding engines and on-client
adaptation engines do a good job is to provide some kind of metadata
in the markup, for example by using title and heading elements
properly.
> Chapter 2.3.1.7 A content transformation server can do a better job
> of following
> mobile best practices
I disagree with the entire paragraph. I am sorry, but I do not see
how automatic software can transform any given content better than
the individual owner of that content: the one who knows the content
best is its original author. The right place for implementing best
practices is, I think, authoring tools and software such as CMSs.
I suggest removing this chapter.
> Chapter 2.3.2.2 The User-Agent header
I disagree. As stated previously, the one who knows best is always
the content owner. If a site has decided to provide certain content
to mobile devices, the author will have good reasons and should know
what his users want.
I do not understand why you suggest hiding the user-agent string sent
by the browser and then providing it in a secondary header. Either
you think it should be hidden or you don't. If it is provided in a
secondary header, the remote server can still recognize the browser
and provide different content, defeating your initial premise. The
entire chapter seems self-contradictory. I think the user-agent
string should be left as-is. If the content transformation engine
discovers that a server is able to provide good mobile content, it
should leave that content alone, even if it thinks it could do a
better job for the device.
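To illustrate why the secondary-header scheme seems contradictory to me, here is a minimal Python sketch. The header name "X-Device-User-Agent" and the product token "ExampleTranscoder/1.0" are my own invention for illustration, not taken from the draft:

```python
def rewrite_headers(headers):
    # Sketch of what a transforming proxy might do: replace the device's
    # User-Agent with its own and copy the original into a secondary
    # header. Header and product names here are hypothetical.
    out = dict(headers)
    out["X-Device-User-Agent"] = out.get("User-Agent", "")
    out["User-Agent"] = "ExampleTranscoder/1.0"
    return out

device_request = {"User-Agent": "SomePhone/2.0 Profile/MIDP-2.0"}
proxied = rewrite_headers(device_request)
# The origin server can still read the real device identity from the
# secondary header, so the browser is not actually hidden at all.
print(proxied["X-Device-User-Agent"])  # SomePhone/2.0 Profile/MIDP-2.0
```

As the sketch shows, any origin server that cares about the device can simply look at the secondary header, so hiding the primary User-Agent achieves nothing.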
> 2.3.2.3 Identifying the mobile browser
If the group agrees with me about chapters 2.3.1.7 and 2.3.2.2, I
think this chapter and practice should be dropped.
> Chapter 2.3.2.5
> "This lets the browser and any other content transformation servers in
> the request/response"
Looks like the sentence is missing something.
> Chapter 2.3.2.6 Identification of mobile content
I think there is a general issue here, because most servers applying
adaptation will NOT serve mobile content when the user-agent string
in the request is that of a desktop browser. Google's GWT, according
to my tests [1], reads the page content and, if it finds a "link
alternate" element, automatically redirects the browser to the mobile
page. This is a possible approach, but wouldn't it require web
designers to provide an alternate for each page on their servers? I
think there's a bit of a chicken-and-egg issue here.
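The behaviour I observed can be sketched as follows. This is my own illustration, not Google's actual code, and I am assuming the hint is a link element with rel="alternate" and media="handheld":

```python
from html.parser import HTMLParser

class AlternateFinder(HTMLParser):
    # Look for <link rel="alternate" media="handheld" href="...">,
    # the kind of hint a transcoder could use to redirect a mobile
    # browser to the author's own mobile page.
    def __init__(self):
        super().__init__()
        self.mobile_url = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and "alternate" in a.get("rel", "")
                and a.get("media") == "handheld"):
            self.mobile_url = a.get("href")

def find_mobile_alternate(html):
    parser = AlternateFinder()
    parser.feed(html)
    return parser.mobile_url

page = ('<html><head><link rel="alternate" media="handheld" '
        'href="http://example.com/mobile"></head><body>Hi</body></html>')
print(find_mobile_alternate(page))  # http://example.com/mobile
```

If no such element is present, the function returns None and the proxy would have no author-provided alternate to offer, which is exactly the chicken-and-egg problem.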
> Also, the following text:
> "if the response content is identified as mobile, the content
> transformation server should be conservative and try to perform only
> non-layout and non-format changing transformations. For example, it
> would be OK to accelerate the content (by removing non-layout
> whitespace, non-lossy compression, etc.), add a header and/or
> footer to
> the page, apply content corrections, etc. It would less desirable to
> remove HTML tables, change the size and/or format of an image, etc.
> However, if the content returned from the origin server uses features
> that the content transformation server "knows" that the client device
> does not support (e.g., by examining the User-Agent header sent the
> mobile web browser), it is permissible to make more extensive
> changes to
> make the content more suitable for the client device. For example, if
> an origin server returns an image in GIF format to a device that does
> not support GIF images, it would be OK for the content transformation
> server to transform the image into a different format that the client
> device did support."
I will mention Google again, as this is a content transformation
engine that anyone has access to, regardless of page, operator or
device. They add a footer to transformed pages, and I have to say
that, as a user, I understand its usefulness. Still, I think any
recommendation saying that transformation engines may add something
to a page should limit what they can change. Google's approach is
very conservative: it simply adds basic information and a couple of
simple links, to the original source and to the same page without
images. I think this document should give a direction about what is
welcome and what is not. Off the top of my head, some ideas:
- note that the content is adapted (as short as possible)
- link to non-transformed source
- a switch to turn adaptation on or off by default
I think most of these might fit in a footer. I would suggest that
developers of transcoding engines keep it as small as possible to
avoid consuming memory. Accesskeys MAY be used ONLY IF they do NOT
interfere with the interface of the site being transformed.
You mention that the transcoding engine knows better than the remote
server what the device and browser do and do not support. I don't
think you can be sure you always know best. Sites optimized for the
iPhone have popped up all over the US, and soon in Europe; I believe
some of them have done deep testing and will know A LOT about the
device. I can also imagine sites optimizing the experience for a
certain browser; think of Soonr and its interface optimized for Opera
Mini.
I see that Jo has already made some of these points in previous
replies. I like the idea of using the HEAD request; it is not very
well known, but I think it would really help in these cases to let
the proxy know whether it should or should NOT act.
As was mentioned, issuing a HEAD request prior to every GET might
overload some servers. I think transcoding engines should do their
best to limit this and keep track of sites that have already provided
good mobile content.
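A rough sketch of that caching idea, purely my own illustration: the interface is invented, and I am assuming the proxy can treat common mobile media types (XHTML-MP, WML) as a signal that a host already serves mobile content.

```python
class MobileSiteCache:
    # Hypothetical sketch: remember, per host, whether a site already
    # serves mobile content, so the proxy need not repeat the HEAD
    # request on every GET. `head` is an injected callable returning
    # response headers for a URL (no real network access in this sketch).
    MOBILE_TYPES = ("application/vnd.wap.xhtml+xml", "text/vnd.wap.wml")

    def __init__(self, head):
        self.head = head
        self.known = {}  # host -> bool (serves mobile content?)

    def serves_mobile(self, host, url):
        if host not in self.known:
            ctype = self.head(url).get("Content-Type", "")
            self.known[host] = any(t in ctype for t in self.MOBILE_TYPES)
        return self.known[host]

calls = []
def fake_head(url):
    calls.append(url)
    return {"Content-Type": "application/vnd.wap.xhtml+xml; charset=utf-8"}

cache = MobileSiteCache(fake_head)
cache.serves_mobile("m.example.com", "http://m.example.com/a")
cache.serves_mobile("m.example.com", "http://m.example.com/b")
print(len(calls))  # 1 -- only one HEAD per host, not one per GET
```

A real engine would of course need cache expiry and a finer-grained key than the host, but even this much would keep the extra HEAD traffic bounded.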
I haven't seen anything about what the engine should do when a site
is providing mobile content, or about how the software should
recognize it. I think this is an important item if we agree that the
software should try to identify mobile content when available.
Last week dotMobi, the company I work for, published some documents
[2] [3] that I think are relevant to this conversation.
[1] http://blog.trasatti.it/2007/09/googles-gwt.html
[2] http://dev.mobi/node/611
[3] http://dev.mobi/node/612
[4] http://dev.mobi/files/ContentTransformation_consultative.html
Andrea Trasatti
Director of Device Initiatives, mTLD
mTLD Top Level Domain Limited is a private limited company
incorporated and registered in the Republic of Ireland with
registered number 398040 and registered office at Arthur Cox
Building, Earlsfort Terrace, Dublin 2.