Introduction

.NET as a technology has brought about great changes. Besides moving to a managed execution environment, .NET provides developers with the tools to rapidly develop applications. However, one attribute of .NET processes has been largely overlooked by many: memory consumption.

Since .NET processes pull in a lot of code from the Framework class libraries, each process has a large footprint. When many processes run, the same libraries end up loaded repeatedly into separate address spaces, all consuming valuable system resources. The answer to memory conservation comes in the form of AppDomains in .NET. With AppDomains, multiple applications can run in the same process, thereby sharing the .NET runtime libraries.

Sharing the .NET runtime using AppDomains does come at a slight performance cost, though this will not be noticeable unless the application makes heavy use of static methods and fields; the improvement in memory consumption, however, is phenomenal. Fully homogeneous .NET solutions are not yet common in industry, so the issue of memory has not been brought to the forefront. Yet memory consumption is always a consideration in good software design, and this article shows ways to use system resources more efficiently with AppDomains in .NET.

The Benchmark

To benchmark the memory consumption, I created two very simple applications: StayinAlive.exe, a .NET assembly, and Win32StayinAlive.exe, a native Win32 executable. Both programs print a message to the console every second indicating that they are alive, and run until shut down by the user. To illustrate the magnitude of the problem, I ran 5 copies of the .NET version to show the added memory consumption and to demonstrate that running multiple copies of the same program gains no efficiency in library usage. Consider the diagram below and notice that the Win32 process takes about 700KB of memory, whereas its comparable .NET counterpart takes around 4.5MB. This may not be a problem for simple applications, but when running multiple .NET applications on a system, it adds up very quickly: our 5 console applications consume about 22.5MB of memory!

A Bit About AppDomains

Application domains remained a mystery to me for quite a while, but after some general reading on the topic, I came to the following understanding: in .NET, the basic unit of execution is NOT the process; it is the application domain. The only true process is what is called a runtime host. The CLR is a DLL, which means that, in order to run, it must be hosted in some process: the runtime host. Three runtime hosts ship with the .NET Framework: Internet Explorer, ASP.NET, and the Windows shell. Also note that if you are running both .NET 1.0 and 1.1, there will be separate runtime hosts for each version (i.e., six runtime hosts). There are numerous topics related to application domains, such as threading and inter-domain communication using remoting, that are beyond the scope of this article, but I urge you to look up application domains in the MSDN documentation.

The point I’m trying to make here is that everything in .NET runs within an application domain. Even if you never create an AppDomain explicitly, the runtime host creates a default domain for you before your application runs. What makes domains even more powerful is that a single process can have multiple domains running within it. Unlike threads, application domains are isolated from one another: code in one domain cannot directly touch objects or static state in another. So where’s the benefit? What do we get by running multiple applications in the same process versus running multiple processes, each with its single default application domain?
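As a minimal illustration of the mechanics (this is my own sketch for .NET Framework, not code from the article's download, and the path is hypothetical), creating a second domain in the current process and running an assembly in it takes only a few lines:

```csharp
using System;

class AppDomainDemo
{
    static void Main()
    {
        // The runtime host already created a default domain for us.
        Console.WriteLine("Running in: " + AppDomain.CurrentDomain.FriendlyName);

        // Create a second, isolated domain inside this same process.
        AppDomain child = AppDomain.CreateDomain("ChildDomain");

        // Run another .NET executable inside it (hypothetical path);
        // this call blocks until that application's Main returns.
        child.ExecuteAssembly(@"C:\temp\StayinAlive.exe");

        // Tearing down the domain unloads its assemblies
        // (except any that were loaded domain-neutral).
        AppDomain.Unload(child);
    }
}
```

Note that these AppDomain creation APIs are specific to the .NET Framework CLR; they are not supported on .NET Core and later.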

Quoting MSDN:

“If an assembly is used by multiple domains in a process, the assembly's code (but not its data) can be shared by all domains referencing the assembly. This reduces the amount of memory used at run time.”

But…

“The performance of a domain-neutral assembly is slower if that assembly contains static data or static methods that are accessed frequently.”

So, would I run multiple data processing applications that used static methods in the same process, just to save memory? Of course not, but I certainly would consider it if I had some high latency applications that, just like my dog Debug, slept around for a good part of the day.
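For completeness, on .NET Framework the host can actually ask for assemblies to be loaded domain-neutral (shared) via the LoaderOptimization attribute on the entry point. This sketch (my own, not from the article's download) shows the knob behind the trade-off the MSDN quotes describe:

```csharp
using System;

class SharedHost
{
    // Ask the CLR to load assemblies domain-neutral where possible, so their
    // code (but not their data) is shared across all domains in this process.
    // The flip side is the slower access to static data quoted above.
    [LoaderOptimization(LoaderOptimization.MultiDomain)]
    static void Main()
    {
        Console.WriteLine("Assemblies will be loaded domain-neutral where possible.");
    }
}
```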

The Solution

To illustrate the power of AppDomains and the actual improvement in memory consumption, I created an ApplicationHost process. This process will load multiple .NET executables into multiple application domains, in the same process, so we can see just how much memory we save.

First, I created a class HostedApp that represents an application being hosted in my process, as follows:
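The exact listing lives in the article's download; since it is not reproduced in the text here, the following is my reconstruction of what HostedApp plausibly looks like, based on the description that follows (the original may differ in detail):

```csharp
using System;
using System.Threading;

// Reconstruction (not the article's verbatim source): hosts one .NET
// executable in its own AppDomain, on a dedicated thread.
public class HostedApp
{
    private readonly string exePath;

    public HostedApp(string exePath)
    {
        this.exePath = exePath;
    }

    // Privately scoped; matches the ThreadStart delegate signature.
    // Creates the domain and runs the hosted executable inside it.
    private void RunApp()
    {
        AppDomain domain = AppDomain.CreateDomain(exePath);
        domain.ExecuteAssembly(exePath);
    }

    // Called on the instance to start the hosted process
    // in its own thread and AppDomain.
    public void Run()
    {
        Thread t = new Thread(new ThreadStart(RunApp));
        t.Start();
    }
}
```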

Each HostedApp instance creates a new AppDomain in which to run the hosted .NET executable. The RunApp() method is privately scoped and matches the ThreadStart delegate signature. Calling Run() on an instance starts the hosted application in its own thread and AppDomain.

And the verdict is…

That’s right, folks: 7.6MB for 5 processes plus the hosting application. That is an approximate savings of 15MB of memory to get the same job done.

Issues

When you run the above code, you will notice one thing: since all hosted applications run in the same process, they share the same console. Hence, if you have console applications that require user interaction from the console, this can get messy. There is hope, though. Windows Forms applications are each hosted in separate threads and run independently. In the source code provided, I have also included a simple StayinAliveForm.exe program that launches a form (instead of a console) displaying the same message in a Label control along with the current time. For extra credit, change the hosted application from StayinAlive.exe to StayinAliveForm.exe and watch the program run with 5 independent, fully functional forms. Of course, your memory savings will be comparatively larger, due to the bigger footprint of Windows applications.

Also, note that the .NET runtime is not dumb in the least. Much of this memory will eventually get paged out, but it is still committed memory and can be paged back into physical memory at any time.

Conclusion

Application domains are very powerful, yet they can become very complex very quickly if you dive into advanced features such as moving data between domains and communicating with domains hosted on remote machines. If you have large assemblies of your own, for example rich user interface component libraries shared by multiple processes in your product, you may want to consider using AppDomains to improve memory consumption. Again, I refer you (and myself) to the MSDN documentation to further your (and my) knowledge of application domains. Good luck, and thanks for the memory!

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.

About the Author

Neil Baliga is the founder of Verifide Technologies, Inc. (www.verifide.com), an initiative for automated test systems for product verification used in manufacturing. He strongly believes that the value in software is in its simplicity. His experience includes UNIX, Win32 API, TCP/IP multithreaded servers, C#, C++ et. al, and Radio Frequency (RF) measurement science. He came across .NET in 2001 and has been in love with it ever since. He is an avid LA Lakers and Denver Broncos fan and loves to hang out with his dog 'Reboot'. He is extremely lucky to have the love and support of his beautiful wife Jyothi.

1. Copy StayingAlive.exe and StayingAliveAppDomains.exe to the C:\temp folder.
2. Double-click StayingAlive.exe five times.
3. Examine Task Manager and record the memory used by each (the working set).
4. Close all five, noting the total memory used.
5. Double-click StayingAliveAppDomains.exe and note the memory usage as reported by Task Manager.

Results of the comparison:

Each copy of StayingAlive, run separately, used 1860KB, for a total of about 9.2MB of memory. In contrast, StayingAliveAppDomains used just 3.6MB of memory, a saving of 61%!

Dear author,
I have run your code on my machine, and I am getting these results:
1. Running 5 instances of StayinAlive.exe consumed around 1.7MB each, for a total of 8.5MB.
2. Running a single instance of Verifide.ApplicationHost.exe consumed between 7MB and 9.3MB.

In the best case, the gain was 1.5MB.
I used the same running conditions for all runs.

Besides, Task Manager is not a reliable benchmark; such differences can be caused by Windows and the .NET Framework themselves, so judging by these numbers lacks rigor.

Actually, I am investigating AppDomains, and so far I am finding them to be more overhead than gain. Please correct me if I am wrong.

Thanks a lot, and I am really sorry if this comes across as tough or rude, but I am keen to find real evidence for what is being said about AppDomains. Maybe I just have not used them in the right way.

When I try the example code with a generic, fully managed assembly, it works fine. However, when I try executing a mixed-mode assembly (one that uses MFC and STL), I get the following exception:

FileLoadException Attempt to load an unverifiable executable with fixups
(IAT with more than 2 sections or a TLS section.) (Exception from HRESULT: 0x80131019)

Must mixed-mode assemblies be run in separate processes, or is there a technique I can use to work around this issue? A mixed-mode assembly, when started as its own process, still runs in its own AppDomain, so I'm not sure why the article's technique can't be used to run it...

I'm not a memory guru, but what I DO know is that there are at least two different "types" of memory - shared and private. Shared memory is shared between processes while private memory is per-process.

The CLR will load NGEN'd assemblies into shared memory, which results in smaller working sets for all applications using said assemblies. Most (if not all) of the BCL is NGEN'd. This means the argument that each individual .NET process contains a copy of the BCL has no validity whatsoever.

Hi,
I have a process 'A' running, and I need to start a thread in process 'A' in a separate application domain.
Do I need to create a separate assembly to contain that thread?
Or, in other words, must the executing contents of an AppDomain be present in an assembly?
Regards,

The benefit is that the physical memory for shared assemblies is shared between AppDomains within the same process.

So, ideally, you would have the root AppDomain act simply as a loader and monitor of additional AppDomains. Can you subscribe to an AppDomain.OnUnload or Dispose event and reload the AppDomain when it crashes? You could even instrument and log such an app.
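There is no Dispose event on AppDomain (there is a DomainUnload event, raised in the domain being unloaded), but the loader-and-monitor pattern can be approximated without one: run the hosted executable on a thread, and when the call returns or throws, unload the domain and build a fresh one. A hedged sketch of that restart loop, with a hypothetical path and restart limit:

```csharp
using System;
using System.Threading;

class RestartingLoader
{
    // Keeps one hosted exe alive: if it exits or throws, recycle its domain.
    static void HostLoop(string exePath, int maxRestarts)
    {
        for (int i = 0; i < maxRestarts; i++)
        {
            AppDomain domain = AppDomain.CreateDomain(exePath + "#" + i);
            try
            {
                domain.ExecuteAssembly(exePath);   // blocks until the app exits
            }
            catch (Exception ex)
            {
                Console.WriteLine("Hosted app failed: " + ex.Message);
            }
            finally
            {
                AppDomain.Unload(domain);          // throw away the used domain
            }
        }
    }

    static void Main()
    {
        // Hypothetical path; a real loader would run one loop per hosted app.
        Thread t = new Thread(delegate() { HostLoop(@"C:\temp\StayinAlive.exe", 3); });
        t.Start();
        t.Join();
    }
}
```

One caveat: from .NET 2.0 onward, an unhandled exception on any thread tears down the whole process by default, so this only recovers from failures that surface as exceptions on the hosting thread.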

I hear that interned string instances are shared between AppDomains in a process, so, theoretically, two AppDomains could both lock on the same string instance.

Why should it be unreliable? It gets this information from the operating system, which must have knowledge of the memory allocated to the process, the handles the process has open, the peak memory consumption, and more. After all, that is the operating system's purpose.

Sure, profiling your code can give you a more detailed analysis of what is going on inside your application and what memory is being allocated for, but the operating system, by its very nature, knows about everything it has "handed out" to the process.

I second that, though there is one issue I am aware of with Task Manager (TM): the statistics TM reports are based upon the samples it takes. So, for example, the reported peak memory usage is not the actual peak, just the maximum value it has sampled. To check this, I ran the following quick sample:
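The sample itself is not reproduced in the thread; this is a reconstruction consistent with the description that follows (the exact original may differ):

```csharp
using System;

class MemSample
{
    static void Main()
    {
        // Allocate roughly 40MB: 10 million ints at 4 bytes each.
        int[] big = new int[10000000];
        for (int i = 0; i < big.Length; i++)
            big[i] = i;                 // touch the pages so they are committed

        // Console.ReadLine();          // uncomment to pause while 'big' is live

        big = null;                     // drop the only reference
        GC.Collect();                   // the memory is now eligible for reclamation

        Console.ReadLine();             // pause here to inspect Task Manager
    }
}
```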

When you run this, you would expect at least 40MB (4 bytes × 10 million ints) in memory usage, peak memory usage, or VM size. Or do you? When I run this program, I only see about 5MB (even in High update speed mode). However, if I uncomment the first Console.ReadLine() call, I see the 40MB accounted for, though it is paged out to virtual memory very quickly.

For the purposes of the article at hand, this is not an issue. I am not dynamically allocating and freeing memory ad hoc, and the numbers, though not perfect, still support the conclusion that memory is saved using AppDomains. Thanks for your input.
Cheers

------------ Neil -----------
Give someone a program,
and you frustrate them for a day
Teach them to program,
and you frustrate them for life
-----------------------------

OK, please someone correct me if I'm wrong, but I thought this was how it works:

- The "Mem Usage" field in Task Manager is actually the working set size, which is how much physical memory Windows has allocated to the process.

- Areas of the process's memory that haven't been used for a while are removed from the working set and paged out to disk. When you click the minimize button on a window, Windows aggressively tries to trim as much as possible from the process's working set to free up memory. Try it and watch the Mem Usage fall!

- Also, the working set includes shared libraries. A DLL that has been loaded by multiple processes may be counted in the working set of each process, even though it is only in memory once. Each 4MB process is probably sharing most of that 4MB with every other process.

- The "Private Bytes" field in Task Manager (you have to go into options to add the column) shows the total memory allocated to that process and not shared with any other process. In this situation, this field is more useful for measuring the memory footprint.

I don't doubt that this technique saves memory, but it's not a good idea to read the Task Manager figures this way.

One situation where this technique would give a huge performance benefit is if you were using remoting. A remoting call to a different AppDomain in the same process should (I haven't tested it myself!) be many times quicker than a call to a different process.
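For reference, an in-process cross-domain call on .NET Framework looks roughly like this (my own sketch, not from the article's download); the object lives in the second domain and the caller talks to it through a transparent proxy:

```csharp
using System;

// Objects deriving from MarshalByRefObject are accessed across domain
// boundaries through a proxy rather than being copied by value.
public class Greeter : MarshalByRefObject
{
    public string Hello(string name)
    {
        return "Hello " + name + " from " + AppDomain.CurrentDomain.FriendlyName;
    }
}

class CrossDomainDemo
{
    static void Main()
    {
        AppDomain other = AppDomain.CreateDomain("Other");

        // Instantiate Greeter inside the other domain; we get back a proxy.
        Greeter g = (Greeter)other.CreateInstanceAndUnwrap(
            typeof(Greeter).Assembly.FullName,
            typeof(Greeter).FullName);

        // This call crosses the AppDomain boundary but never leaves the process.
        Console.WriteLine(g.Hello("world"));

        AppDomain.Unload(other);
    }
}
```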