
No you didn't. You made some argument about keeping values in registers as opposed to loading them from RAM, etc. But that can be trivially optimized away by the compiler; this is exactly what compiler optimizations are for. The fact that you don't believe it's possible reveals more about your (lack of) knowledge of modern compilers than about C++ or std::bitset.
Of course, you may still prove me wrong. Just provide some code that proves your point, that is, code that runs faster using manual bit-fiddling instead of a std::bitset. The fact that you still haven't done this suggests that you can't, though.

Oh well, I must have misunderstood that bit.

I care about how easy it is to read for other programmers. I don't care about non-programmers, as they will by definition not be able to understand it anyway.

Um, actually it's not an example at all, because C++11 effectively does have binary literals: as I said earlier, user-defined literals are there for just that.

Oh my, you're not understanding the concept of a standard.

Let me explain to you, Mr AC, that a standard STANDARDIZES the way to do things.
Being able to specify things in a standard way gives automatic compatibility between everything that implements the standard and anything else that implements it.

Standardizing the way of making user-defined literals is not the same as having a standard way of writing a particular kind of literal.
Try to think about this: 'Everything you didn't specify will fail or stop working at some point'.

It's already being done in other programming languages (Java 7 comes to mind).
It doesn't require a large addition to the standard, and it can make life easier for people writing software.

Slow down a bit. User-defined literals are part of the standard, so they should work wherever the standard is supported. If you say "but X is bound to introduce compiler C which has a small bug there preventing that custom-literal from working as expected" then you're talking about a non-compliant compiler and you should ask X to fix it rather than complaining about the standard.

I expect the real reason binary literals weren't added, if they were even considered, is that "it's just another feature", and Mr Stroustrup has been trying to keep the language as manageable as possible. That Java now has it is no argument; you could never design one language that has every feature popular in any other language. Admittedly it's not a difficult feature to add, but it's still another feature without a major user base.

Personally I think binary is a lot more useful than octal, but that's beside the point.

NOPE, I'm not talking about that. I'm not talking about bugs, but about different software doing things in different ways because the standard did NOT specify them.

The thing is that two people or organizations would each make the 'same' custom literal, but written differently.
It would not be universal, and code written against one wouldn't work with code written against the other.
Each piece of software would be perfectly fine on its own, but adapting all software to all the variants from all parties would be an impossible task.

Your user-defined literals don't tell you what the literals stand for.
That's not a bug in the software; it's unspecified, precisely because we are talking about custom literals.
Universally usable binary literals would be very welcome in many places.