
There are two problems. The first is that the OS allows you to run porn.jpg.exe at all, having downloaded it from some random place on the 'net. I don't think that either OS X or Windows does: they'll both pop up a dialog saying 'You are trying to run a program downloaded from the Internet, do you really want to?', which isn't normally something that happens when people try to open a file, so it ought to prompt them to avoid it (and if it doesn't, then seeing the .exe extension probably won't either).

The second is that the OS allows programs and other file types to set icons at all before their first run. This also leads to confused-deputy-like attacks, where you think you're opening a file with one program but are actually opening it with something that will interpret it as code. The solution to this is probably to have programs keep their generic program icon until after their first run. If you double-click on something that has a generic program icon, then you probably intend to run it...

This post needs a '-1 not even wrong' moderation. I started to write a reply to your points, but I honestly can't tell if you're trolling or completely fail to understand any of the parts of the hardware and software stacks involved.

Not really. Self has multiple prototype chains, Smalltalk and Java have only single inheritance. Both JavaScript and Smalltalk lack any declarative mechanism for defining classes in the language and do so by manipulating objects.

an LLVM-based bytecode for its shading language to remove the need for a compiler from the graphics drivers

This removes the need for a shader language parser in the graphics driver. It still needs a compiler, unless you think the GPU is going to natively execute the bytecode. If you remove the compiler from a modern GPU driver, then there's very little left...

There are a few. Smalltalk obviously does, as does Self. Lisp also can, depending on how it's used. JavaScript is probably the mainstream language that fits closest (it's actually very close to Smalltalk). Java and C# both mostly do, though they wouldn't count as pure OO.

C++ is a language that is very good for generic programming. It doesn't really meet Alan Kay's definition of OO (and he's the one who coined the term), nor does it pass the Ingalls Test for OO. It has classes, but method dispatch is tied to the class hierarchy so if you want to really adopt an OO style you need to use multiple inheritance and pure abstract base classes, which is a very cumbersome way of using C++.

The worst C++ code is written by people who are thinking in C when they write C++, but the second-worst C++ code is written by people who are really thinking in Smalltalk. If you're one of these people, then learn Objective-C: the language is far better at representing how you think about programs.

Any programming language can be used to write code in any style. You can write good OO code with a macro assembler if you want. However, every language has a set of styles that fit naturally with the language and ones that don't. You can force C++ to behave in an OO way, and it sort-of works, but it's not using the language in the most efficient way.

The main advantage of auto is in templates, where the type is something complex derived from template instantiations. You simply don't want to make it explicit. Oh, and in lambdas, where explicitly writing the type of the lambda is very hard (though you generally cast to std::function for assignment). As to using a float instead of an int... I have no idea how you'd do that with auto. The type of auto is the type that you initialise it with. If you're using a platform where you don't have hardfloat, then why do you have APIs that return floats? There's your bug, the use of auto was just a symptom.

And if you've never seen anyone use the algorithms library, then you must be using C++ in a very specialised environment. I've not seen any C++11 code that didn't use std::move. I've rarely seen a nontrivial C++ file that didn't use std::min or std::max, and most code will also use std::copy. The only code that I've seen that didn't use the algorithms library included its own poorly optimised, buggy versions of several of them.

First, Window Maker doesn't use Objective-C; it's written in C. However, GNUstep, which is the open source implementation of the Cocoa frameworks (originally the OpenStep specification, but they're tracking Apple's changes), could use more help! Oh, and we support (on *NIX) a superset of the Objective-C language that Apple supports on their products, so I wouldn't say that Obj-C is more limited on Linux.

That said, and I say this as the maintainer of the GNUstep Objective-C implementation, I'd recommend C++, but with two caveats:

C++ is not an OO language. It sort-of supports OOP, but writing OO code in C++ is not the natural way of using the language.

Don't look at any version of the language before C++11. It's just terrible and will damage your brain.

C++11 and C++14 have cleaned up C++ a lot. With shared_ptr and unique_ptr, you can write code with sane memory management. With perfect forwarding, lambdas, and variadic templates, you can write code that has most of the benefits of a late-bound language. I like a lot of Objective-C, but Apple broke the 'simple, orthogonal syntax' when they added declared properties and a few other things. Any successful programming language eventually becomes a mess of compromises and ugly corners. Some, like Python and C++, start that way, but at least C++ has been slowly improving over the last couple of versions.

The one thing where Objective-C is still a clear winner is in writing libraries that want to maintain a stable ABI. This is insanely difficult in C++ because the language doesn't have a clean separation of interface and implementation and relies a lot on inlining and static binding for performance. The down side, of course, is that once you have a library in Objective-C you're limited to consumers who also want to use Objective-C.

Oh, and Qt GUIs suck beyond belief on OS X - not sure what they're like on Windows, but I wouldn't recommend them for a portable UI. Good MVC design and a native UI is the only way to go if you really want a cross-platform GUI app that doesn't suck.

If Pluto is a planet, then so is Eris (which is more massive), and Earth's moon (around five times Pluto's mass) is possibly half of a binary planet. Ganymede, the largest moon in the solar system, is under 3% of the mass of Earth yet around ten times the mass of Pluto. There are quite a lot of moons bigger than Pluto, so would you want to classify them all as planets?

62F is apparently around 16C, which is the sort of temperature where you're going to be warm enough to not need a coat while cycling. I've not had problems cycling at any temperature above freezing (and then it's the ice on the road that's the problem, not the temperature). Do you cycle naked or something?

Zoom right in on the bits that you think are white, so that they fill your entire monitor. They're obviously blue. For a lot of us, that's the colour that we see when we look at it in context as well. I can see how you'd interpret it as being white by overcompensating for the colour in the bottom right, but that doesn't stop you from being wrong. The gold bits are gold when you zoom in (mostly, some are black), but a shiny black often looks yellow-gold in overexposed photos.

Who says the OS should provide nothing useful and let app makers make their money on it?

If you set up a straw man, then it's very easy to kill it. The issue is not an OS providing something, it's that Microsoft, which had a near-monopoly in the desktop space, used the money from selling the OS to fund development in another market (browsers) and then bundled their version, undercutting the competition with cross subsidies. There was a thriving browser market before IE was introduced, but it's hard to compete when most of your customers are forced to pay to fund the development of your competitor.

150 km a day on a bike? How long does that take? According to my phone GPS, which isn't spectacularly accurate, I do about 18 km/h (though I'm far from the fastest cyclist), so even if you're twice as fast as me that sounds like it would involve a bit over 4 hours on a bike. That's a lot of time to spend commuting each day; it's adding over 50% to the normal work day!

Generally Fundamental Evangelical Christians teach humility and service to others and subscribe to the view that others are more important than me. That's exactly opposite to what you claim "ALL" religion is.

Really? Because that's exactly the set of values that I'd choose to indoctrinate my serfs with.

"You know that those who are recognized as rulers of the Gentiles lord it over them; and their great men exercise authority over them. But it is not this way among you, but whoever wishes to become great among you shall be your servant; "For even the Son of Man did not come to be served, but to serve, and to give His life a ransom for many."

Or, to summarise: 'Hey oppressed people, don't think about following a leader from amongst yourselves, that kind of thing always ends badly'.