And what is wrong with .NET? I have each version installed and it causes no problems on my systems. It is actually fairly decent to code in as well (I use C#, F# and VC/VB.NET). This ties right into my topic on the aversion to the .NET Framework. Most complaints about .NET seem to stem from a bandwagon approach of knocking anything MS.

Darwin, Josh & all the other idiots that are quick to jump to conclusions... apart from stating that LangOver requires .NET to run and that it's a pity because I can't test it, how on earth did you decide I have something against .NET?

Oded, you should mention the requirements on your website to save the hassle for others who don't have .NET installed on their PC.

If you don't like mouth-breathing idiots such as myself jumping to conclusions, try being more explicit about what you mean when you post. I've been caught out this way myself in the past, but have yet to call those who jump to conclusions because of my mistakes idiots, or any other derogatory names.

<offtopic>I'm one of the people who (after I matured a bit and got over my OMFGASSEMBLY! period) have taken a liking to .NET, and find the general "it just sucks" aversion to .NET pretty silly and tiring.

There are a few relevant issues with it, though:

.NET apps are more memory-hungry than native equivalents, and there's CPU-speed overhead as well - although it's not as simple as saying "always slower", but that's a topic of its own.

For older Windows versions, .NET doesn't come preinstalled, and it's a large download if you're on dialup. Yes, some people still are.

.NET doesn't get really interesting to program in before 3.0, whose minimum OS requirement is XP SP2. Some people are still using older OSes.

That's what I can think of off the top of my head, and IMHO for most stuff it's not a reason to discard a .NET-based language... unless you're writing stuff that needs to run on really low-end hardware or old operating systems. There are of course "portability issues", but given Mono I'd say .NET apps are somewhat more portable than native Windows apps (OK, OK, Wine).</offtopic>

.NET apps are more memory-hungry than native equivalents, and there's CPU-speed overhead as well - although it's not as simple as saying "always slower", but that's a topic of its own.


Actually, with .NET the reason that it appears to use more memory is the way that memory is managed in the .NET Framework, specifically with regard to GC. The collection process requires resources. Garbage collection often tracks information regarding an object's use to make decisions on its future availability. Garbage collection may occur at inopportune times, which can result in delays or slowdowns that may be unacceptable. Additionally, since garbage collection typically does not happen immediately when an object is no longer needed, the overall memory footprint of the application may be higher. The GC in .NET also forces .NET programs to have poor locality, i.e. it interacts with local system resources in an often less-than-optimal manner, namely because they often consume more memory than they actually use.
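To make the "not freed immediately" point concrete, here's a quick sketch (my own illustration, not from any official docs - the allocation size and helper name are made up) using GC.GetTotalMemory:

```csharp
using System;

class GcTimingDemo
{
    // Allocates a large array and lets the reference die on return.
    static void AllocateAndDrop()
    {
        byte[] buffer = new byte[10 * 1024 * 1024];
        buffer[0] = 1; // touch it so the allocation isn't optimized away
    }

    static void Main()
    {
        AllocateAndDrop();

        // Without forcing a collection, the dead array is usually still
        // counted - it's garbage, but not yet collected.
        long before = GC.GetTotalMemory(false);

        // Passing true forces a full collection first (done here only to
        // make the timing visible - don't make a habit of it).
        long after = GC.GetTotalMemory(true);

        // Typically True: the memory was awaiting collection, not leaked.
        Console.WriteLine(after < before);
    }
}
```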

Putting the long explanation behind a cut


The simplified model of .NET GC. The rules:

All objects are allocated from one contiguous range of memory (with one exception).

The heap is divided into generations (see diagram).

Objects in any generation should be roughly the same age.

Objects in higher generations are considered more likely to be stable.

The oldest objects are in the lowest addresses on the heap.

An allocation pointer marks the boundary between allocated and free memory.

Periodically, dead objects are removed and everything is "slid up" towards the lower address range on the heap (eliminating fragmentation).

The order of objects on the heap is the order in which they were created (with one exception).

There are never any gaps between objects. (yes, another exception!)

Some of the free memory is committed; when more is needed, additional memory is requested from the operating system out of the reserved space.
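The generation rules above can be observed from code - a sketch, assuming the usual workstation GC behavior (exact promotion timing can vary by configuration):

```csharp
using System;

class GenerationDemo
{
    static void Main()
    {
        object survivor = new object();

        // Fresh allocations land in generation 0.
        Console.WriteLine(GC.GetGeneration(survivor));

        // Surviving a collection promotes the object one generation...
        GC.Collect();
        Console.WriteLine(GC.GetGeneration(survivor));

        // ...up to the oldest generation (gen 2).
        GC.Collect();
        Console.WriteLine(GC.GetGeneration(survivor));
    }
}
```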

wraith808: well, .NET apps don't just appear to use more memory than an equivalent native app, they use more memory; one reason is the structure of the .NET library and the idioms around it (there are a fair number of objects going 'round), another is the way .NET uses its heap. Your Win32 Private Bytes usage is going to be comparatively higher than what a native app has.

That isn't necessarily a problem, though, and it's a shame that people who think they've got a clue are quick to ramble on and on without knowing what's going on under the hood; there's a whole bunch of reasons why the way .NET manages memory is a good thing for a wide class of applications, and why the net effect is a smaller total toll on system resources... but that's too big (and way too off-topic) a thing to write about here.

The GC in .NET also forces .NET programs to have poor locality, i.e. it interacts with local system resources in an often less-than-optimal manner, namely because they often consume more memory than they actually use.

Unless I misunderstand what you're writing, that's not what I understand as locality. Yes, the .NET heap can be compacted and stuff can move around, but you should generally get fine locality as long as you're using contiguous data structures... you might get temporary bumps because of compacting, but if your data lives long enough to suffer from this, it'll probably end up on the gen2 heap and not be affected much after that.

Nope (well, I don't know enough of the internals to say which environment has the upper hand head-to-head, but they use similar techniques). I'm in favor of .NET (for a lot of things), btw, if that didn't come across clearly enough.

Could a moderator perhaps split out the whole stuff-about-.NET to a separate thread so it doesn't distract from LangOver's product announcement? - sorry for continuing the off-topicness

Could a moderator perhaps split out the whole stuff-about-.NET to a separate thread so it doesn't distract from LangOver's product announcement? - sorry for continuing the off-topicness

I'll wait until this is done to continue discussion.

Update: Though since I had to post to say that, let me just say that when I say appear to use more, I mean that memory is still allocated to the application which is not actually in use; when you dispose of objects, the memory is not freed immediately. This can't rightly be said to be used by the application, since the developer is not using the objects... the memory is still just in that application's space.

Update: Though since I had to post to say that, let me just say that when I say appear to use more, I mean that memory is still allocated to the application which is not actually in use; when you dispose of objects, the memory is not freed immediately. This can't rightly be said to be used by the application, since the developer is not using the objects... the memory is still just in that application's space.

The application might not be 'using' the memory, but the thing that counts wrt. memory footprint is the Win32 Private Bytes size, not the amount of CLR memory. Keep in mind that even after GC has run, the win32 memory used by the CLR isn't necessarily reduced.
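The distinction being drawn here can be seen directly from code - a sketch comparing the two numbers (Process.PrivateMemorySize64 is the Win32 Private Bytes figure):

```csharp
using System;
using System.Diagnostics;

class MemoryViews
{
    static void Main()
    {
        // What the GC tracks: managed objects (live or awaiting collection).
        long clrBytes = GC.GetTotalMemory(false);

        // What the OS charges the process for: committed private memory,
        // including CLR-reserved-but-unused heap, JIT'ed code, and so on.
        long privateBytes = Process.GetCurrentProcess().PrivateMemorySize64;

        // The gap between these two numbers is the "appears to use more
        // memory" effect - this prints True.
        Console.WriteLine(clrBytes < privateBytes);
    }
}
```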

well, .NET apps don't just appear to use more memory than an equivalent native app, they use more memory; one reason is the structure of the .NET library and the idioms around it

That's not necessarily true, although it will be in many cases. There are a couple of things that can allow for more efficient memory usage.

For example, the immutability of strings allows the reuse of a single instance of a string value, without allocating multiple redundant values.

Also, the CLR design of generics is much more efficient than any other language/platform that I'm aware of. In some cases this can allow the source code to be much smaller. Basically, the definition of MyGeneric<MyClass> only needs to be stored once; whereas C++ for example must separately compile this for each different MyClass that's used.
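The single-definition point can be checked at runtime - a small sketch (note that behind the scenes the JIT does specialize per value type, though it shares native code across reference-type instantiations):

```csharp
using System;
using System.Collections.Generic;

class GenericSharing
{
    static void Main()
    {
        List<int> ints = new List<int>();
        List<string> strings = new List<string>();

        // Both instantiations point back at the one List<T> definition.
        Console.WriteLine(ints.GetType().GetGenericTypeDefinition()
                          == strings.GetType().GetGenericTypeDefinition());
    }
}
```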

In general, then, some kinds of programs will take more memory, and some may take less. But that's really comparing the same program, ported to different platforms. I'm betting that if you design your code from the ground up with an understanding of .Net (or whatever platform you're building for), you should be able to come up with a design that meshes well with whatever criteria are important to you.

If you are trying to reduce the total amount of memory your application allocates, keep in mind that interning a string has two unwanted side effects. First, the memory allocated for interned String objects is not likely to be released until the common language runtime (CLR) terminates. The reason is that the CLR's reference to the interned String object can persist after your application, or even your application domain, terminates. Second, to intern a string, you must first create the string. The memory used by the String object must still be allocated, even though the memory will eventually be garbage collected.
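Both sides of the interning trade-off are easy to demonstrate - a quick sketch:

```csharp
using System;

class InternDemo
{
    static void Main()
    {
        // Literals are interned automatically: one shared instance.
        string a = "hello";
        string b = "hello";
        Console.WriteLine(ReferenceEquals(a, b)); // True

        // A string built at runtime is a distinct object...
        string c = new string(new char[] { 'h', 'e', 'l', 'l', 'o' });
        Console.WriteLine(ReferenceEquals(a, c)); // False

        // ...and string.Intern maps it onto the pooled instance. Note
        // the caveats above: 'c' still had to be allocated first, and
        // the pooled copy lives until the CLR shuts down.
        Console.WriteLine(ReferenceEquals(a, string.Intern(c))); // True
    }
}
```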

Also, the CLR design of generics is much more efficient than any other language/platform that I'm aware of. In some cases this can allow the source code to be much smaller. Basically, the definition of MyGeneric<MyClass> only needs to be stored once; whereas C++ for example must separately compile this for each different MyClass that's used.

Might be true for the IL code generated, but what happens when the JIT'er runs? - also, there's object allocation overhead every time you use a delegate... which includes the very innocent-looking lambda expressions. Setting up the closures might be relatively inexpensive, but it isn't free (I measured a 10x speed hit in object serialization because of an INotifyPropertyChanged implementation using lambda expressions).
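The closure allocation mentioned here is visible from the delegate itself - a sketch (variable names are made up):

```csharp
using System;

class ClosureCost
{
    static void Main()
    {
        int counter = 0;

        // Capturing 'counter' makes the compiler hoist it into a
        // heap-allocated closure object - one allocation per closure.
        Action increment = delegate { counter++; };

        // The delegate's Target is that compiler-generated closure.
        Console.WriteLine(increment.Target != null); // True

        increment();
        Console.WriteLine(counter); // 1
    }
}
```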

And as mentioned earlier, we have to keep the difference between win32 memory usage and CLR memory usage in mind. There are reasons for it; not freeing win32 memory right away means subsequent CLR allocations can be done faster. But holding on to (win32) memory until system memory pressure is high enough might leave other apps deciding against, say, allocating more cache because the available (win32) memory is low. Pros and cons.

In general, then, some kinds of programs will take more memory, and some may take less. But that's really comparing the same program, ported to different platforms. I'm betting that if you design your code from the ground up with an understanding of .Net (or whatever platform you're building for), you should be able to come up with a design that meshes well with whatever criteria are important to you.

Wise words. Idiomatic .NET (at least C#) programming does tend to involve a fair amount of objects being created, though. Fortunately a lot of them are short-lived and get collected fast, not putting much pressure on the win32 memory. Still, there's a fair amount of memory overhead from the framework. This becomes pretty inconsequential on larger apps that need a lot of memory for <whatever> processing, but it can be noticeable on small tools. Whether this matters depends on the situation.

The CLR memory model is really interesting when considering long-running server applications; for normal native apps, memory fragmentation can end up being a pretty big issue, unless you're writing custom allocators. With .NET, you get address-space defragmentation for free.

Might be true for the IL code generated, but what happens when the JIT'er runs?

Hmmm, that's a good question. I never thought about it at that level. I don't know enough about how the CLR represents things at that level to know, but it may be below the level at which it makes a difference. That is, it might be that it can continue to use just a single implementation, because at that point it's already getting a vtable as input and need not worry about the details of how it got to be so. Do you have any specific knowledge of this?

...which shouldn't be a big deal, but in real life turns out to be a significant source of memory leakage for those who don't know it works this way. As I said, you've got to know your platform. I do this pretty extensively, and haven't had an issue with it -- but I know that I've got to clean up the delegates when I'm done, or the referenced objects won't be GC'ed.
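A minimal sketch of the leak-and-cleanup pattern described above (the Publisher/Subscriber names are made up for illustration):

```csharp
using System;

class Publisher
{
    public event EventHandler Tick;

    public void Raise()
    {
        if (Tick != null) Tick(this, EventArgs.Empty);
    }
}

class Subscriber
{
    public int Seen;
    public void OnTick(object sender, EventArgs e) { Seen++; }
}

class EventLifetimeDemo
{
    static void Main()
    {
        Publisher pub = new Publisher();
        Subscriber sub = new Subscriber();

        pub.Tick += sub.OnTick;  // pub now references sub via the delegate
        pub.Raise();

        // Until you unsubscribe, sub can't be collected while pub lives -
        // the classic .NET event "leak". Clean up when done:
        pub.Tick -= sub.OnTick;
        pub.Raise();             // no longer delivered

        Console.WriteLine(sub.Seen); // 1
    }
}
```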

which includes the very innocent-looking lambda expressions. Setting up the closures might be relatively inexpensive, but it isn't free

Sure, but at the same time, it makes it much more straightforward to write code that operates on huge amounts of data that stream through the app. A naive approach would involve loading all the data at once. Having closures makes it simple to write a program that deals with such enormous data with a minimal memory footprint. This is possible in other languages (e.g., C++), but involves more difficult coding.
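A sketch of the streaming style being described, using an iterator so only one element (plus the running total) is ever in memory at a time; the ten-million count is arbitrary:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class StreamingDemo
{
    // Lazily yields values one at a time - the full sequence is never
    // materialized in memory.
    static IEnumerable<int> Numbers(int count)
    {
        for (int i = 0; i < count; i++)
            yield return i;
    }

    static void Main()
    {
        // LINQ composes further lazy stages on top; only the running
        // sum lives in memory, not ten million elements.
        long sum = Numbers(10000000)
            .Where(n => n % 2 == 0)
            .Aggregate(0L, (acc, n) => acc + n);

        Console.WriteLine(sum); // 24999995000000
    }
}
```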

Update: Though since I had to post to say that, let me just say that when I say appear to use more, I mean that memory is still allocated to the application which is not actually in use; when you dispose of objects, the memory is not freed immediately. This can't rightly be said to be used by the application, since the developer is not using the objects... the memory is still just in that application's space.

The application might not be 'using' the memory, but the thing that counts wrt. memory footprint is the Win32 Private Bytes size, not the amount of CLR memory. Keep in mind that even after GC has run, the win32 memory used by the CLR isn't necessarily reduced.

Well, yes... but if the memory used by the CLR isn't reduced, then it is because of one of those rules stated above, correct? So isn't that still a function of the programmer? If you dispose of an object, and remove all pointers to it, GC should clear it up... or am I missing something?

wraith808, garbage collection is done entirely at the CLR's mercy; depending on win32 memory pressure and which heap generation your object is in, when it's collected can vary a lot. This is just one of the reasons why you shouldn't depend on finalizers being called. Furthermore, just because a bunch of your CLR objects are being collected doesn't mean the used win32 memory is released - this makes sense because allocating system memory is "slow", so (if you're thinking only of the running .NET process and not the entire system) it makes a lot of sense to hang on to the win32 memory even if it's no longer strictly needed.

There's several different GC profiles your app can use, with different heuristics for when and how the GC works. There's also manual GC interaction you can do, but you should be really careful about this since it can seriously pessimize your app performance.
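For reference, a sketch of what those knobs look like - GCSettings.LatencyMode is .NET 3.5+, and server GC is enabled via <gcServer enabled="true"/> in the app config (this assumes the default workstation GC; LowLatency isn't meant for server GC):

```csharp
using System;
using System.Runtime;

class GcTuning
{
    static void Main()
    {
        // Which collector flavor this process got.
        Console.WriteLine("Server GC: " + GCSettings.IsServerGC);

        // LowLatency defers expensive gen-2 collections during a critical
        // region - restore the old mode when done, and use sparingly.
        GCLatencyMode old = GCSettings.LatencyMode;
        GCSettings.LatencyMode = GCLatencyMode.LowLatency;
        try
        {
            // ... latency-critical work would go here ...
        }
        finally
        {
            GCSettings.LatencyMode = old;
        }

        // Manual collection - almost always a pessimization, since it
        // runs collections the heuristics didn't ask for and promotes
        // surviving young objects prematurely.
        GC.Collect();

        Console.WriteLine("done");
    }
}
```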

wraith808, garbage collection is done entirely at the CLR's mercy; depending on win32 memory pressure and which heap generation your object is in, when it's collected can vary a lot. This is just one of the reasons why you shouldn't depend on finalizers being called. Furthermore, just because a bunch of your CLR objects are being collected doesn't mean the used win32 memory is released - this makes sense because allocating system memory is "slow", so (if you're thinking only of the running .NET process and not the entire system) it makes a lot of sense to hang on to the win32 memory even if it's no longer strictly needed.

There's several different GC profiles your app can use, with different heuristics for when and how the GC works. There's also manual GC interaction you can do, but you should be really careful about this since it can seriously pessimize your app performance.

Well, yes... I know it's at the CLR's discretion, but I thought that the rules pretty much covered most cases. I also know that the generation plays a role, because it doesn't run a full GC in most cases, but I thought that memory was compacted if there were holes in between and the memory wasn't from a generation higher than what was GC'd. Is that understanding not correct?

I do have my own list of issues with .NET, but they are more about the C# language specifically:

System.Windows.Forms: Who came up with the idea that data (Listboxes specifically [they're wonderful things, all things considered]) should be kept on the control, and not in a database somewhere? And not only that, said data is in a read-only collection.

That's the easiest way to sort a ListView, and even then, it's assuming the column contains an int. (The application it's used in has all but the first column as ints, and I hardcoded a check appropriate to that. The alternative was to pad the ints to three figures to keep them from going out of order.)

I do have my own list of issues with .NET, but they are more about the C# language specifically:

System.Windows.Forms: Who came up with the idea that data (Listboxes specifically [they're wonderful things, all things considered]) should be kept on the control, and not in a database somewhere? And not only that, said data is in a read-only collection.

Most collections have no .Sort() method.

The following code snippet:

The first two items on your list are .NET Framework issues, not C# issues.

I don't get your item #1 - nobody (in their right mind) keeps their data in user interface controls... keep it in your model-layer objects, manage lifetime with a persistence layer, and present the objects in the GUI layer (you can use databinding, or you can shuffle values back and forth manually - your choice).

#2 - not all collections can be sorted efficiently, so it's best not to add the method where it doesn't make sense.

#3 - ugh. You're approaching things wrongly - exactly how to do things right depends on a whole bunch of things, though. But in general, you'll want to bind your controls to objects (as opposed to string/int/whatever representations of individual properties) and use proper sorting: check out the IComparable<T> and IComparer<T> interfaces. There are several ways to handle sorting, and there's more to it than just the sorting itself... for instance, it's often better practice to not sort your object data directly, but bind the GUI element to a filter/sort adapter that constructs the binding collection from its source collection.
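A sketch of the IComparer<T> approach for the ListView case from #3 - sorting real ints on model objects instead of padded strings (the Row type and its property names are made up for illustration):

```csharp
using System;
using System.Collections.Generic;

// A model-layer item - sort these, not the strings shown in the ListView.
class Row
{
    public string Name;
    public int Count;
}

// IComparer<T> keeps the "how to order" logic out of the GUI code,
// and compares real ints - no zero-padding tricks needed.
class ByCountDescending : IComparer<Row>
{
    public int Compare(Row x, Row y)
    {
        return y.Count.CompareTo(x.Count);
    }
}

class SortDemo
{
    static void Main()
    {
        List<Row> rows = new List<Row>
        {
            new Row { Name = "a", Count = 7 },
            new Row { Name = "b", Count = 101 },
            new Row { Name = "c", Count = 23 },
        };

        rows.Sort(new ByCountDescending());

        foreach (Row r in rows)
            Console.WriteLine(r.Name + " " + r.Count);
        // b 101
        // c 23
        // a 7
    }
}
```

After sorting the model collection, you'd rebuild or rebind the ListView items from it, rather than sorting the control's own item strings.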

One of the real pains of distributing a dot net application seems to be worrying about, and figuring out ways to deal with, the situation where a required edition of the dot net CLR isn't installed.

And one of the things that absolutely infuriates and confounds me is why MS did such a horrible job of helping the user to understand the problem when a dot net program runs and can't find the dot net CLR runtime libraries. They couldn't find a nice way to instruct the user that they need to install them? And why didn't they write standard functions/utilities to check if the needed runtimes are installed and send the user to a nice clean simple URL for lay people to read about and download the proper runtimes needed?

It just boggles my mind that in designing a next generation language framework like this they couldn't have put more time into making a smoother process out of helping users figure out they need to install the runtimes and how to do it.

Amen to that, mouser - the error message you get when you don't have the correct .NET framework installed is less than helpful! I don't think it would be too hard writing a little wrapper program that reads your main program assembly (or a configuration file or whatever); a quick-and-dirty test shows that Paint.NET launches even if I rename its executable to "flafgiraf_PaintDotNet.exe.quox"