Matthew,
Why do the function and method names start with a capital letter?
Check the D style guide at
http://www.digitalmars.com/d/dstyle.html
Walter,
How did you allow that!?
Let me check the other contribution from Matthew...
oh, no, registry.d has the same problem...
Ant


I agree... Matthew makes good libraries but I don't like his coding style
much.


Also note the overview says:
"D is not a scripting language, nor an interpreted language. It doesn't come
with a VM, a religion, or an overriding philosophy. It's a practical language
for practical programmers who need to get the job done quickly, reliably, and
leave behind maintainable, easy to understand code."
Personally I like his coding style, but I'm also biased towards anything that
looks more BASIC than C ;)


Oh, so many possible answers to this question, some penitent, some
exasperated, some disinterested, some patient. I shall content myself with:
I have lots of different styles, depending on the context. It's conceivable
that my D style will evolve to fall in with the "official style", I suppose,
but there again it may not.
I think it's a little too early in the game to be prioritising this to the
degree that Walter and I, or anyone else, would spend time that is in such
short supply on it, when there are other issues that are more pressing. I
for one am more than prepared for all my D libraries to have to change
conventions, and my D client code to break, before 1.0 comes around. What's
of far more concern, surely, is that there are a raft of libraries yet to be
written, and Walter wants to release 1.0 in March! (A little recursive
search and replace is not going to be too taxing, is it, especially since we
now have a recursive search library ...)
Unless and until the picture clears around the different paradigms that will
lead D forward, in much the same way that STL, C-API and OO paradigms all
have different styles in C++, I'm going to maintain my position of
preparedness. We may all be changing ...


BTW, why do you define "alias int boolean"? I think this is a big
mistake. What makes it in your opinion better than "alias bit bool"?

why is everyone getting hung up on all this crap? I don't think boolean is
better than bool, but when I wrote the reg api the issue of bool was still
way up in the air, so I chose to avoid possible clashes at that time. I do
think int is better than bit, until such time as we have a strongly typed
boolean. But, AFAIK, the issue was never resolved, and no consensus was
reached, so I just continued in my holding pattern.
As I mentioned in the other post, I'm ready for wholesale syntactic changes
to all of my D code (lib and client) prior to 1.0, but I'm not going to
spend serious amounts of time - of which I have very little atm - on things
that are still subject to change. recls has simply followed the reg api,
since it seems to me to be sensible that I, as their author, be consistent
so that when I have to make these changes it is simple to do.
I'd be interested to hear any feedback - -ve or +ve - on the semantics of
the libs.

I'd be interested to hear any feedback - -ve or +ve - on the semantics of
the libs.

Should think about upgrading it from the "A" APIs to the "W" APIs.

So it will not work on Win9x?
Or has a D-standard mechanism been established for dynamically determining
which APIs to use, depending on the current OS? If so, I missed it.
(FTR, I am not of the opinion that its current form is *the* form, merely
that AFAIK the issue is yet to be resolved.)

I'd be interested to hear any feedback - -ve or +ve - on the semantics of
the libs.

Should think about upgrading it from the "A" APIs to the "W" APIs.

So it will not work on Win9x?

After January 16, 2004 MS is dropping support for Win98. If I read it right,
ME is already dropped (check out
http://support.microsoft.com/default.aspx?scid=fh;[LN];Lifeneom).
Personally I wouldn't worry about Win9x - it's sooo last millennium. ;-)

Or has a D-standard mechanism been established for dynamically determining
which APIs to use, depending on the current OS? If so, I missed it.
(FTR, I am not of the opinion that its current form is *the* form, merely
that AFAIK the issue is yet to be resolved.)

I'd be interested to hear any feedback - -ve or +ve - on the semantics of
the libs.

Should think about upgrading it from the "A" APIs to the "W" APIs.

So it will not work on Win9x?

Sure it will. Win9x supports the "W" api's. What 95 does not support is
UTF-8, but that's irrelevant because the UTF-8 to UTF-16 translation is
handled by the D library, not 95. 95 also does not support UTF-16 surrogate
pairs, but again, that's an issue for the high level D programmer, the
runtime library need not care.

Whatever do you mean? Are you saying that Windows 95/98 have functioning
implementations of FindFirstFileW, CopyFileW, etc. etc.? I can assure you
that they do not.
From the MSDN:
"
Windows 95/98/Me: FindFirstFileW is supported by the Microsoft Layer for
Unicode. To use this, you must add certain files to your application, as
outlined in Microsoft Layer for Unicode on Windows 95/98/Me Systems.
"
"
Windows 95/98/Me: CopyFileW is supported by the Microsoft Layer for Unicode.
To use this, you must add certain files to your application, as outlined in
Microsoft Layer for Unicode on Windows 95/98/Me Systems.
"
etc. etc. The vast majority of W functions are not supported. The exceptions
are a *very* few UpperCase functions and the lstr???W ones.


What happens if you call FindFirstFileW on 95 without MSLU?

It'll return INVALID_HANDLE_VALUE, and GetLastError() will return
ERROR_NOT_IMPLEMENTED. I don't have refs, but there're several sections in
MSDN that list the miserly number of supported 9x Unicode functions and the
vast number of unsupported ones.
This is how MS got Windows 95 to fit on all those crusty old boxes, and was
a deliberate design decision motivated by allowing people to upgrade from
3.x without having to change machines. Smart marketing, but terrible
technology. They junked almost the entire Unicode API, security, and several
other bits. I guess it worked, but it sure left us all a lot of shit to
shovel.
You need to decide how you want to play this Walter, as it's extremely
important.
Personally, I'd like a D or DMC equivalent to MSLU, and the D compiler hides
the searching and loading (of the D9xUL.dll) in the exe, without troubling
the users. One possible nice thing would be that if D9xUL.dll was not
present, the exe could download and install it seamlessly, but I guess
connectivity, exe-size and security will augur against that.
However it's done, it's a lot of work, and this little puppy's got no
interest in doing it.
But either someone does it, or we all do it. There's no escaping the
problem, except by saying D is not for 9x. But since one can compile and
test a PE exe built on D in NT, there's nothing to stop someone trying it on
9x, especially given the likely amount of "free" software available from
this excellent group of code studs over the next couple of years. Given that, it
seems impossible to take the stance that D will not support Win9x. It'll
quickly get the reputation of producing buggy software.
BTW, did you read the article? It describes several potential strategies we
can take. Given that we (i.e. you) control the Win32 compiler, there is a lot
of potential to take some of the simpler, but more restrictive, techniques I
describe, by having it do a bit of custom linking smarts.
Cheers
Matthew

http://support.microsoft.com/default.aspx?scid=http://support.microsoft.com:80/support/kb/articles/Q210/3/41.ASP&NoWebContent=1
Thanks, but it doesn't say anything useful. It just references a paper
"Differences in Win32 API Implementations Among Windows Operating Systems by
Noel Nyman" that google can't find.
Sadly, searching for "CreateFileW" on MSDN returns "Content that matches
your query is not available at this time."

I found some info in Richter's book. I think the thing to do is to call the
"W" function, and if it fails with the not implemented error, fall back to
calling the "A" function.

That's a horrid method. It's inefficient, it's easy to lose last-error
(although not difficult to preserve with a scoping class), and there's no
reason for dynamic tests. Whether or not a function is implemented is fixed
absolutely. For example lstrcpyW() is not implemented on Windows 95, and is
implemented on Windows 98 or later. Why test for something that is
immutable?
We need an equivalent to MSLU, or to use MSLU.


The MSLU is ok for some applications, but the necessity to ship a DLL
with it disqualifies it for others.
For example, I would want to be able to write a self-extractor or an
Installer in D. That would not be possible if MSLU is required.
I have quite a bit of experience in writing platform abstraction code,
and I have learned that the number of string related OS functions you
use is actually pretty "small" (say ~100).
I think the D standard library should use its own wrappers for these
functions, so that they are linked statically. The amount of work
required for this is not as much as people tend to think.
About the implementation of these wrappers: I think the most sensible
course of action would be to have a global bool that specifies whether
we're on Win9x or WinNT, and then simply call the corresponding version.
Always calling the W version and in the case of an error falling back to
the A version has 2 downsides:
- It's slower on Win9x
- You need quite a bit of code for the fallback. You have to check the
error code of the W function before falling back to A, because even on
WinNT a W function may fail while the A version might not (for example,
the A implementation might need less memory).
With a boolean and some good string conversion routines it's simple:
if (isWinNT())
    return SomeFuncW(myString);
else
    return SomeFuncA(toAnsi(myString));
Hauke


The MSLU is ok for some applications, but the necessity to ship a DLL
with it disqualifies it for others.

Acknowledged.

For example, I would want to be able to write a self-extractor or an
Installer in D. That would not be possible if MSLU is required.
I have quite a bit of experience in writing platform abstraction code,
and I have learned that the number of string related OS functions you
use is actually pretty "small" (say ~100).
I think the D standard library should use its own wrappers for these
functions, so that they are linked statically. The amount of work
required for this is not as much as people tend to think.

Indeed. In my own work I use a statically linked layer that contains only
the needed functions.

About the implementation of these wrappers: I think the most sensible
course of action would be to have a global bool that specifies whether
we're on Win9x or WinNT, and then simply call the corresponding version.
Always calling the W version and in the case of an error falling back to
the A version has 2 downsides:
- It's slower on Win9x
- You need quite a bit of code for the fallback. You have to check the
error code of the W function before falling back to A, because even on
WinNT a W function may fail while the A version might not (for example,
the A implementation might need less memory).
With a boolean and some good string conversion routines it's simple:
if (isWinNT())
    return SomeFuncW(myString);
else
    return SomeFuncA(toAnsi(myString));

Your philosophy is right, but the implementation is wrong. There should be a
single function that library code calls. The last thing we'd want is people
testing the boolean themselves. Something along the lines of
HANDLE FindFirstFile(
    LPCTSTR lpFileName,              // file name
    LPWIN32_FIND_DATA lpFindFileData // data buffer
);

HANDLE _DLU_FindFirstFileW(wchar_t *searchSpecW, WIN32_FIND_DATAW *dataW)
{
    if (bWinNT)
    {
        return FindFirstFileW(searchSpecW, dataW);
    }
    else
    {
        char *searchSpecA = . . .; // translate searchSpecW to searchSpecA
        WIN32_FIND_DATAA dataA;
        HANDLE hFind = FindFirstFileA(searchSpecA, &dataA);
        if (INVALID_HANDLE_VALUE != hFind)
        {
            . . . // translate dataA to dataW
        }
        return hFind;
    }
}
What I was referring to in an earlier post about our using the compiler to
good effect would be for it to translate use of FindFirstFileW to
_DLU_FindFirstFileW without troubling the user. Of course, this'd need a lot
of thought before it was accepted.
If Walter could concoct a way to intercept references to Win32 API W
functions and hook them into such translation functions, everyone could
simply program to the Win32 API without ever caring about Win9x limitations.
The only cost is the fact that each W function will cause a small amount of
bloat that is unneeded on WinNT. In my experience, the bloat is a price
worth paying to save us from all the programming and the distribution
hassles.

If Walter could concoct a way to intercept references to Win32 API W
functions and hook them into such translation functions, everyone could
simply program to the Win32 API without ever caring about Win9x limitations.
The only cost is the fact that each W function will cause a small amount of
bloat that is unneeded on WinNT. In my experience, the bloat is a price
worth paying to save us from all the programming and the distribution
hassles.

I don't want to pay that price to support a proprietary OS that is unsupported
today, and known to have been replaced by far better OS's for years.
I know some weren't able to switch. But we shouldn't all pay just for those;
better let the ones that have an old, bad OS pay, i.e. make an OS wrapper for
those (with DLLs you make downloadable and addable to your program's folder,
that wrap your W to A functions, for example).
That's more the way I think it should go.

With a boolean and some good string conversion routines it's simple:
if (isWinNT())
    return SomeFuncW(myString);
else
    return SomeFuncA(toAnsi(myString));

Your philosophy is right, but the implementation is wrong. There should be a
single function that library code calls. The last thing we'd want is people
testing the boolean themselves. Something along the lines of
HANDLE FindFirstFile(
LPCTSTR lpFileName, // file name
LPWIN32_FIND_DATA lpFindFileData // data buffer
);
HANDLE _DLU_FindFirstFileW(wchar_t *searchSpecW, WIN32_FIND_DATAW *dataW)

I was referring to the way the wrapper functions are implemented. This
should be done once for each function, not every time the function is
called.
Another thing: from what you write it seems to me like you want to
create a "better" Win32 API for the programmer. I was just referring to
the way the standard library is implemented. I do not think that such an
API should be exposed by the standard library, since it will be very
incomplete. Something like a full Unicode Layer for Windows should not
be integrated into the D core library - this has nothing to do with the
language itself and should be a separate library if someone wants to do it.
Also, a "true" Unicode layer is pretty difficult to implement, as this
would include having to wrap the message queue and converting all
messages before they are handled by the application (MSLU does this to
some degree). For the standard library we won't have that problem, since
we do not access the message queue there.
And another thing: the MSLU does still exist, even if it is not required
by the core D library. If developers want to have a reasonably complete
Unicode layer they may still link to MSLU (if their application type
allows it).
I guess my point is that a full Unicode layer is too much work and has
nothing to do with the core D language.
Hauke

I was referring to the way the wrapper functions are implemented. This
should be done once for each function, not every time the function is
called.

Ok. :)

Another thing: from what you write it seems to me like you want to
create a "better" Win32 API for the programmer.

Absolutely! These issues are a total PITA, and I see no reason why, at this
early stage, we cannot decide to step in and fix it for one and all.
And who's to say analogous issues are not to be had on Solaris, or Mac, or
VMS, or whatever?

I was just referring to
the way the standard library is implemented. I do not think that such an
API should be exposed by the standard library, since it will be very
incomplete.

It's not a trivial job, to be sure.

Something like a full Unicode Layer for Windows should not
be integrated into the D core library - this has nothing to do with the
language itself and should be a separate library if someone wants to do it.

But people will omit it. The single biggest problem is that most of us
developers are (i) English speakers (whether first language, or just been
forced to be good at it because it's the lingua franca of programming and
popular culture) and (ii) we use NT family boxes.

Also, a "true" Unicode layer is pretty difficult to implement, as this
would include having to wrap the message queue and converting all
messages before they are handled by the application (MSLU does this to
some degree). For the standard library we won't have that problem, since
we do not not access the message queue there.

Excellent point. It is indeed a hard one.

And another thing: the MSLU does still exist, even if it is not required
by the core D library. If developers want to have a reasonably complete
Unicode layer they may still link to MSLU (if their application type
allows it).
I guess my point is that a full Unicode layer is too much work and has
nothing to do with the core D language.

Agreed on both points. But the issue still has to be addressed.
The fact that big W was unaware of the stubbed nature of the 9x W functions
demonstrates that this is a minefield for all developers, of whatever level
of experience.
How about if we start thinking about this a little differently? Maybe rather
than concerning ourselves about Windows, we can have a general approach to
function declarations. We could have a keyword, dynamic_encoding_function in
the following example (a better name would be suggested by an imaginative
soul), that would allow us to stipulate that the compiler and/or linker must
generate the dispatching code for us. We'd need to tell it what the in and
out parameters are, how it makes the determination of which to call, and
such like - it's not trivial - but it could be done.
HANDLE FindFirstFileA(char *searchSpec, WIN32_FIND_DATAA data);
HANDLE FindFirstFileW(wchar_t *searchSpec, WIN32_FIND_DATAW data);

dynamic_encoding_function HANDLE FindFirstFile(wchar_t *searchSpec,
                                               WIN32_FIND_DATAW data)
{
    dispatch_switch = std.windows.isWinNT,
    dispatch_case =
    {
        true,            /* When std.windows.isWinNT is true */
        FindFirstFileW,
        false,           /* Don't do anything on entry */
        false            /* Don't do anything on exit */
    },
    dispatch_default =
    {
        std.windows.opSys.win9x,
        FindFirstFileA,
        true,            /* Do something on entry (in this case translate
                            searchSpec) - we need to work out a general way
                            to decree what is to be done */
        true             /* Do something on exit (in this case translate
                            data) - we need to work out a general way to
                            decree what is to be done */
    },
}
Obviously there's more to it, but I think (hope) you get my drift. This
could then be used for more than just character encoding scheme translation,
and would not necessarily only be useful for Win32.
It'd need a clear and concise form, but I'll leave it to others to suggest a
better one.

I guess my point is that a full Unicode layer is too much work and has
nothing to do with the core D language.

Agreed on both points. But the issue still has to be addressed.
The fact that big W was unaware of the stubbed nature of the 9x W functions
demonstrates that this is a minefield for all developers, of whatever level
of experience.

Yes, but changing the way an application interacts with the OS is the
wrong way to go, IMHO. I certainly wouldn't want to have such stubs
between me and the OS, because I already have an abstraction library.
Unnecessary layers mean unnecessary bugs.
So there are really two issues here:
1) How are the Windows calls in the standard library implemented?
2) Is it possible to help people write better, more compatible Unicode
applications on Windows?
Number 1 is fairly easy to solve by just creating a small number of
wrapper functions that are used internally.
Number 2 is a completely different beast. Writing a complete Unicode
layer is a HUGE project - not even Microsoft has managed to do it up to
now. For example, the Windows Common Controls are not supported in the
MSLU, even though they are used in almost all applications.
Also, I think the goal of Number 2 is not inside the scope of a
programming language like D. Improving the OS is not the job of the
language, it is the job of either the OS creators or library developers.
So I see three options:
a) Just leave it as it is and provide good tool support for easily
linking in the MSLU, if the developer wants to do that. The D docs could
also feature prominent notice of the MSLU's existence and describe what
it does.
b) Create a new wrapper library in D. As I mentioned, this is a lot of
work. But there could be some advantages as well: for example, we could
write the wrappers in a way so that they take D string objects (of the
"new" kind that hides the encoding), instead of raw string data. That
would make it even easier for newbies, since then they wouldn't be
required to understand which encoding they need for which functions. It
could also save some speed on Win9x (if the encoding used in the String
objects is not UTF-16), because then we could convert directly to "ANSI"
encoding (i.e. the system code page), instead of first converting to
UTF16 and then to ANSI.
Solution b will probably always be incomplete and by the time it is
reasonably usable many people might have moved on to other OSs.
c) Stick to the MSLU for the time being. And instead of slightly
improving the Windows API, invest that work into developing platform
independent libraries that work for multiple OSs. If you get right down
to it, the goal of this discussion is to have newbies write code that is
compatible with all Windows OSs. Wouldn't it be even better if that code
also ran on Linux and MacOS? This is also a huge task, but at least
there are already a lot of libraries one could utilize (like the Apache
Portable Runtime, wxWindows, GTK, Qt, whatever). The key would be to make
these libraries easily accessible, so that newbies use them instead of
programming the Windows API directly.
I think if you want to protect people from the hazards of an OS then
solution c) is the best way to go. But I wouldn't go that far and make
this a part of the D standard library. That would mean that all those
platform independent libs HAVE to be implemented for every OS D is
ported to - and that could severely hamper the range of systems D is
available for.
Better make it a semi-official set of libraries that are "recommended"
to use, if available.
Hauke

If Walter could concoct a way to intercept references to Win32 API W
functions and hook them into such translation functions, everyone could
simply program to the Win32 API without ever caring about Win9x limitations.
The only cost is the fact that each W function will cause a small amount of
bloat that is unneeded on WinNT. In my experience, the bloat is a price
worth paying to save us from all the programming and the distribution
hassles.

I've written such an interceptor. But it is way beyond the scope of D to try
to paper over deficiencies in the underlying OS.


Ok, fine. But you have to decide how this is to be handled, whether that be
MSLU, or DLU.dll, or DLU.lib, or we don't support 9x execution, or we leave
it to the poor developer.


At this point, I don't know what to do. I just don't understand why MS
didn't make it an update to the OS, and now they never will as they have
officially abandoned 9x.
D relies on the underlying OS for unicode to work, otherwise each executable
will have to carry around a huge bloated unicode layer. That might work for
a VM based language where customers are used to bloat <g>, but not for a
systems language.
I think the right solution is to say that unicode will not work on 9x - just
ascii - and provide some kludge for code pages.

The MSLU is ok for some applications, but the necessity to ship a DLL
with it disqualifies it for others.

I find the necessity of shipping a DLL with any D executable to be
unacceptable.

For example, I would want to be able to write a self-extractor or an
Installer in D. That would not be possible if MSLU is required.
I have quite a bit of experience in writing platform abstraction code,
and I have learned that the number of string related OS functions you
use is actually pretty "small" (say ~100).
I think the D standard library should use its own wrappers for these
functions, so that they are linked statically. The amount of work
required for this is not as much as people tend to think.
About the implementation of these wrappers: I think the most sensible
course of action would be to have a global bool that specifies whether
we're on Win9x or WinNT, and then simply call the corresponding version.
Always calling the W version and in the case of an error falling back to
the A version has 2 downsides:
- It's slower on Win9x

It's not as big a problem as it seems. The first time you call it, you can
do the check, and then set a global flag. Subsequent times, just test the
flag.

- You need quite a bit of code for the fallback. You have to check the
error code of the W function before falling back to A, because even on
WinNT a W function may fail while the A version might not (for example,
the A implementation might need less memory).

On NT and later, all the A APIs are a shell around the W APIs that
convert to UTF-16 first.

Now that I realize that MSLU cannot be installed as part of the OS, you're
right.

http://www.microsoft.com/globaldev/handson/dev/mslu_announce.mspx
MSLU appears to consist of a tiny loader, which binds into the
application, and the DLL. If the DLL is not found, the loader simply
redirects the calls to the OS. Which results in the same as if the
loader was not there.
Thus you can call applications which use such a loader "MSLU-aware", and
the MSLU DLL is then as good as an operating system update?
Besides, the strategy can be chosen by specifying a version at
compile-time: Unicode with the MSLU loader, lightweight Unicode (NT only,
probably little sense), and lightweight non-Unicode (Windows 9x, good
for many fairly language-agnostic applications). BTW, with versions and
stuff, the time is coming to think of a better build system. Thankfully
the language semantics make it possible.
-eye

I found some info in Richter's book. I think the thing to do is to call the
"W" function, and if it fails with the not implemented error, fall back to
calling the "A" function.

That's a horrid method. It's inefficient, it's easy to lose last-error
(although not difficult to preserve with a scoping class), and there's no
reason for dynamic tests. Whether or not a function is implemented is fixed
absolutely. For example lstrcpyW() is not implemented on Windows 95, and is
implemented on Windows 98 or later. Why test for something that is
immutable?
We need an equivalent to MSLU, or to use MSLU.

Sadly, MSLU went out of its way to be a feature of an application, not of
the operating system. In other words, it will not install as an operating
system upgrade. That and its documented specificity to VC++7 makes it nearly
useless for D.
All it mostly does, though, is convert the unicode to the current local code
page and then call the "A" function. The problem with the D library doing
that is I don't have the mappings between code pages and unicode.

I think the D compiler should handle the onerous linker crud for us
automatically.

The more I look at it, the more it looks like the wrong solution for D.

I can't conceive of a right solution, but I know that shipping NT-family
compiled exes to Win9x and having them fail is going to look bad for D,
however unfair and unjustified that is.

I always use the A APIs unless there's no equivalent for a W API that I need
to use. In VB all strings (which would be equal to wchar[], or UTF-16) get
converted automatically to char[] by the runtime to call the A API, and then
back again.
It seems to me that this type of approach should be the solution if one wants
to generically support both systems (9x and NT). Maybe some kind of flag for
NT only, that would use only W APIs?
As was stated though, a flag for NT-ness would probably be best: #If NT
{use W API;} else {use A API;} might also work.
However (this could possibly get me flamed lol), I also think, because
Windows is by far the most popular OS, that a specific Windows-only support
library should be built, separate from possibly another library for Unix,
Linux etc... ;) it can't hurt to wish lol
Another possible solution may be to come up with a new format for defining
Windows-only APIs, so that the compiler can pick which one to use depending
on the OS, dunno...
Hopefully in a year from now I can look back at some of my posts and wonder
how I could say such useless things lol. I guess leaving something unsaid
that may be helpful is useless also.
regards
Lewis

In what way is it not useful? It lists the Unicode functions that are
supported by Windows 95. Any other W functions are going to be just stubs.
Pretty important information, I would have thought.
"Walter" <walter digitalmars.com> wrote in message
news:btbbts$1q9h$1 digitaldaemon.com...