Community

Sergey Gromov wrote:
> What can I say. I do not agree. C++ committee is not the best
> exemplar. Looking into implementation should be required to learn the
> details, not to understand what the hell is going on. Creating the DSLs
> should be a privilege of scripting languages, that's what they are for,
> after all.
As long as you understand the language, you should understand more or
less 'what the hell is going on'. It doesn't make much difference for
comprehension whether it's written with syntactic sugar or not, yet it
may be more pleasant to read *with* the sugar. As for your argument
about scripting languages: do you suggest that the binding to Lua from
C++ should be done... with a scripting language? That wouldn't make
much sense. And yet, the folks have managed to create a very nice DSL
within C++ that makes the code look really nice. Or should my GL
wrapper *for D* be running through a scripting language?
What can I say. I do not agree. Scripting languages have their uses, but
when the 'main' language is powerful enough, you don't need them that much.
--
Tomasz Stachowiak
http://h3.team0xf.com/
h3/h3r3tic on #D freenode

Tom S Wrote:
> Well then, the whole C++ committee is following bad practice as well, by
> making << and >> work with streams.
Right, a committee, too, can make bad choices; that operator usage is very unnatural, and it always will be for a C-based language. Operator overloading is useful, but as in all situations where you have more power, you need more control too (this will become very clear when D gets macros: they will require very strong self-discipline from the programmer, otherwise they will lead to big problems, mostly at the level of the *language community* rather than of a single program. I have explained this in my 4th post of notes). So you have to use operators without changing their original semantics too much; otherwise the code becomes a jungle that a different programmer can't read well.
As far as I know, Java outlaws operator overloading essentially to avoid that class of problems. Java's success comes from being good for the *community* of Java programmers, not just for the single programmer. Making the language simple and regular allows everyone to understand code written by others, and to modify it, share it, etc. It also allows firms to take young programmers and use them to manage legacy Java code, and it allows a *lot* of less experienced programmers to use the language, like most of the people who today are using Processing, a graphics programming language that is essentially Java with a few libraries and some added sugar.
You can also read this section, it says right things:
http://en.wikipedia.org/wiki/Operator_overloading#Criticisms
Other languages don't outlaw it, like Python (despite offering fewer operator-overloading possibilities, to help keep things tidier). But you have to remember that, generally, even if something isn't outlawed, that doesn't mean it's better for everyone to use it all the time. You don't have to use operator overloading *too much*, and you need some programming experience to know what "too much" means in a specific context. In this context, program readability is more important than shrinking the code from 4 lines to 2 by overloading operators.
Note that DSLs (Domain Specific Languages) aren't something mostly for scripting languages; they come from Lisp (and probably Forth too), which can be compiled into a normal executable. D with macros will be fit for that too.
Bye,
bearophile

bearophile wrote:
> Tom S Wrote:
>> Well then, the whole C++ committee is following bad practice as well, by
>> making << and >> work with streams.
>
> Right, a committee, too, can make bad choices; that operator usage is very unnatural, and it always will be for a C-based language.
But perhaps there was no way to make it look better?
> Operator overloading is useful, but as in all situations where you have more power, you need more control too (this will become very clear when D gets macros: they will require very strong self-discipline from the programmer, otherwise they will lead to big problems, mostly at the level of the *language community* rather than of a single program. I have explained this in my 4th post of notes). So you have to use operators without changing their original semantics too much; otherwise the code becomes a jungle that a different programmer can't read well.
You've also stated that the programmer doesn't have to understand what's
going on inside a standard lib (as long as it reads well and is
consistent across its span). The code of mine which spawned this
discussion is to be used like a standard lib for interfacing with
OpenGL. Same situation. And it doesn't abuse operator overloading 'too
much'. It's just one specific construct.
> As far as I know, Java outlaws operator overloading essentially to avoid that class of problems. Java's success comes from being good for the *community* of Java programmers, not just for the single programmer. Making the language simple and regular allows everyone to understand code written by others, and to modify it, share it, etc.
Disclaimer: I don't like Java.
// It also is a total PITA to write.
> It also allows firms to take young programmers and use them to manage legacy Java code, etc,
// I'm grateful for that. Otherwise thedailywtf.com would lose half of
its sources.
> it allows a *lot* of less experienced programmers to use the language, like most of the people who today are using Processing, a graphics programming language that is essentially Java with a few libraries and some added sugar.
If Java was so great, why did Processing need the extra sugar?
Similarly, I don't want to use something different from D to write my
code just because I feel like having more sugar. D can provide it as well.
> You can also read this section, it says right things:
> http://en.wikipedia.org/wiki/Operator_overloading#Criticisms
It also highlights that there are two sides in the debate. And it
doesn't seem like the discussion ever ends, so I'd rather leave this
thread as is and not let it turn into a complete flame war ;)
> Note that DSL (Domain Specific Language) isn't something mostly for scripting languages, they come from Lisp (and Forth too, probably) that can be compiled into a normal executable. D with macros will be fit for that too.
That was also my point.
--
Tomasz Stachowiak
http://h3.team0xf.com/
h3/h3r3tic on #D freenode

Anders Bergh <anders1@gmail.com> wrote:
> On Feb 8, 2008 3:20 AM, Sergey Gromov <snake.scaly@gmail.com> wrote:
> > What can I say. I do not agree. C++ committee is not the best
> > exemplar. Looking into implementation should be required to learn the
> > details, not to understand what the hell is going on. Creating the DSLs
> > should be a privilege of scripting languages, that's what they are for,
> > after all.
>
> How is this different from other code? You have to look in any
> implementation code to figure out what's going on internally, be it
> with syntax sugar or not.
If there is a bug, and I know it's /not/ in the OpenGL sub-system, and I
can tell that OpenGL sub-system from the rest of the code, then I don't
need to look into it. On the other hand, if I can't recognize the code,
I automatically consider it error prone due to its non-obvious nature.
I just have to look into it, to make sure it works.
--
SnakE

Sergey Gromov wrote:
> If there is a bug, and I know it's /not/ in the OpenGL sub-system, and I
> can tell that OpenGL sub-system from the rest of the code, then I don't
> need to look into it. On the other hand, if I can't recognize the code,
> I automatically consider it error prone due to its non-obvious nature.
> I just have to look into it, to make sure it works.
>
Uhm.. you consider code error-prone because you can't parse it in your
head? Wow. Then there's lots of error-prone code out there in other
languages I don't know! They're all bad and error-prone languages!

Alexander Panek <alexander.panek@brainsware.org> wrote:
> Sergey Gromov wrote:
> > If there is a bug, and I know it's /not/ in the OpenGL sub-system, and I
> > can tell that OpenGL sub-system from the rest of the code, then I don't
> > need to look into it. On the other hand, if I can't recognize the code,
> > I automatically consider it error prone due to its non-obvious nature.
> > I just have to look into it, to make sure it works.
>
> Uhm.. you consider code error-prone because you can't parse it in your
> head? Wow. Then there's lots of error-prone code out there in other
> languages I don't know! They're all bad and error-prone languages!
Don't you see the difference between "it is" and "I consider"? If I'm
searching for a bug, and I don't understand some code, I *must* check
it.
--
SnakE

Sergey Gromov wrote:
> Alexander Panek <alexander.panek@brainsware.org> wrote:
>> Sergey Gromov wrote:
>>> If there is a bug, and I know it's /not/ in the OpenGL sub-system, and I
>>> can tell that OpenGL sub-system from the rest of the code, then I don't
>>> need to look into it. On the other hand, if I can't recognize the code,
>>> I automatically consider it error prone due to its non-obvious nature.
>>> I just have to look into it, to make sure it works.
>> Uhm.. you consider code error-prone because you can't parse it in your
>> head? Wow. Then there's lots of error-prone code out there in other
>> languages I don't know! They're all bad and error-prone languages!
>
> Don't you see the difference between "it is" and "I consider" ? If I'm
> searching for a bug, and I don't understand some code, I *must* check
> it.
>
Are there bugs in there?

Tomasz Stachowiak:
>You've also stated that the programmer doesn't have to understand what's going on inside a standard lib (as long as it reads well and is consistent across its span).<
I think all I have said was "I agree" :-) So I can explain a bit better.
I like to understand what's inside the std lib, because I think that if you want to use a system well, you have to know something about the layer just below the one you are using (C if you use Python; assembly/CPU if you use C; the GC, the std lib, and the assembly/CPU if you use D; etc). I have run into problems using D (to build some collection data structures) because I have yet to understand how the heck the Phobos GC actually behaves.
Some languages (and their communities), like Java, think that having opaque objects is better. Other languages, like Python, think that's not always a good thing, and the usual coding style is different. I think both views have their merits and disadvantages; I like both styles, and I think they fit the languages, and the typical situations, they are used in. If you want, I can offer more details.
Bye,
bearophile