.NET World Follow Up
First published on TechNet on Apr 21, 2005

My last blog entry on the memory bloat of managed (.NET) applications generated the controversy and misinterpretation of my primary points that I expected, so I’m following up with some clarification.

First, I stated that managed code is ideal for client-server applications, especially on the server side. On a server, security is critical, and because server applications are long-running, avoiding memory leaks is also critical. For client-side-only applications, however, security is only important if the client is network-enabled and connects to arbitrary and potentially compromised servers (like Internet Explorer does). Memory leaks are not as important because most client applications don’t require continuous access and restarting the application (or rebooting if necessary) resets a leak.

Memory footprint is much more important for a client-side-only application since there can be many such applications running concurrently and clients often have limited memory. Someone stated that by the time Longhorn ships most new systems will have 1 to 2 GB of memory. In corporate environments clients have at least 3-year life cycles, and home users even in prosperous nations might upgrade less often. In developing nations you’ll see system configurations lagging the mainstream by 5 years. That means that most of the world’s computers won’t have 1-2 GB of memory until several years after Longhorn finally ships.

It’s amazing to me that no matter how much memory we add, how much faster we make our CPUs, and how much faster we make our disks spin and seek, computing doesn’t seem to get faster. If you have a Windows NT 4 system around compare its boot time and common tasks with that of a Windows XP system. Then compare their system specs. Then ask yourself what you really can do on the Windows XP system that you can’t do on the Windows NT 4 system.


Originally by Mark Russinovich on 4/21/2005 10:59:00 AM
Migrated from original Sysinternals.com/Blog

# re: .NET World Follow Up

My only major problem with NT 4.0 was the horrible memory manager it had. I had more crashes with NT 4.0 than I have ever had with any other OS, Windows or not.

When I went to 2000, my system became much more stable, and XP (security holes or no) has been the most stable yet.

4/21/2005 12:23:00 PM by VesperDEM


# re: .NET World Follow Up

I can enjoy a well-rounded Teletubbies eXPerience, thank you very much, Dr. Russinovich.

4/21/2005 9:16:00 PM by Anonymous


# re: .NET World Follow Up

i have been administering a lot of computers running windows for more than 12 years now.

i've started with windows 3.11, switched to microsoft's nt-line with nt 3.51 and then all the way up to xp today.

the people i work for have to do a great deal of their daily work with those machines. they want to get things done with them.

if they ask me "hey admin, windows 2000/XP/2003 has been published. do we have to upgrade?" i always answer with a new question "is there a unique functionality to this new os that you need for your daily work?". the answer to this is mostly "no".

if there is a real need for 2K or higher (mostly usb peripherals) i install it because of the missing functionality under nt 4.0. but the majority of the machines are still running the good old windows nt 4.0.

why? because it works and the people still get things done with it.

4/22/2005 10:44:00 AM by Anonymous


# re: .NET World Follow Up

If memory footprint was the overriding criteria, we'd all still be running Windows 95.

4/22/2005 10:44:00 AM by Anonymous


# re: .NET World Follow Up

If a program crashes fourteen times a day because of memory leaks, I'm going to dump it. Restarting the application isn't a solution in that case.

However, memory leaks are the programmer's fault and responsibility to fix, no matter if he's writing in a .NET language or toggling switches on a PDP-4 front panel. No excuses should be allowed.

4/22/2005 10:59:00 AM by Ross


# re: .NET World Follow Up

I've been working on a port of our server-side code from VB-6 to .NET as a proof of concept to either pave (or block) the way for the rest of our server-side apps.

One app currently handles 192 simultaneous clients. In VB6, this equates to about a 40MB footprint on the server side. After my initial port to .NET, that same set of 192 clients causes the memory footprint to bloat to 110MB.

My next test was to start up all 192 clients performing activity (as opposed to just logging on and being idle). The VB-6 app handles the load with no sweat (it barely breaks 10% CPU on a 2.4Ghz Dual Pentium).

Contrast this with .NET: when all 192 clients start performing activity, the CPU is totally bogged down at 99% (both Pentiums sweating away; the fans on our test machine even start spinning faster!). Then many clients start timing out because the server-side app just can't keep up.

I was very excited when I began the port to .NET because many things are so much sweeter, cleaner, and more elegant to code in .NET. However, with these two very simple tests, I have become quite disheartened by .NET's performance. Basically, there's no way we could sell our app in .NET if its performance is that poor.

And with all that said, even a server-class PC with gigs of RAM and multiple CPUs has a tough job of keeping up when compared to the nearly identical functionality that the VB-6 app provided.

4/22/2005 11:47:00 AM by Todd Lindberg


# re: .NET World Follow Up

"If memory footprint was the overriding criteria, we'd all still be running Windows 95."

Hey - that's not true. We'd be running Windows 3.1x. When I upgraded from 3.1 to 95 on my old ThinkPad, I remember how traumatic it was having to double my RAM (to 16MB). 95 was HUGE! ;-)

4/22/2005 12:10:00 PM by Wes


# re: .NET World Follow Up

At a PDC I was at many years ago, no less than Mr. Cutler said, "Intel will keep making faster and faster processors, and we'll use up every cycle we can get."

And of course there's the maxim (now two CEOs out of date): "Andy giveth and Bill taketh away."

- Sean (the tall one)

4/22/2005 2:00:00 PM by Sean


# re: .NET World Follow Up

I started computing in 1979 with a TRS-80 from Radio Shack. That machine greeted you instantly with the friendly "Ready >_" prompt upon power-up.

Switching to a CP/M machine, followed by an MS-DOS machine, meant waiting for DOS to load from floppy disk.

Each version of Windows requires more time to boot, more memory, and a faster machine. Does it really require billions and billions of CPU cycles just to boot up? Linux is appearing more and more attractive...

4/23/2005 9:28:00 PM by Kevin Rice


# re: .NET World Follow Up

"Linux is appearing more and more attractive..."

Linux without a shell, perhaps. Run KDE or Gnome and it's no better than XP as far as boot performance or runtime bloat.

4/23/2005 10:25:00 PM by Anonymous


# re: .NET World Follow Up

I do completely share Mark's opinion, even more so after reading the answer posts of Cullen Waters and Ken Henderson.

They're praising .NET as the choice for more "complex" applications. For me as a software user, this is really scary: I think that the future of software development should move away from clumsy, feature-bloated (and consequently bug-packed) applications towards simple, well-designed, easy-to-use applications that appear immediately upon a mouse click. That's what I call productive! By that standard, Notepad is the best choice for the comparison!

.NET is definitely the wrong way.

4/25/2005 7:37:00 AM by florian


# re: .NET World Follow Up

NT4 was good for its time. Sure its boot time was decent and the resource usage was low in comparison to current offerings.

Here comes the but: Anyone who thinks NT4 is worth having on any hardware within the realms of "current" hasn't been keeping an eye on developments.

NT4 does not outperform Windows 2000 on identical hardware once you hit a certain point. Your tests may differ, but mine showed that for Quake 3 (yes yes, I'm a gamer too), once you hit 256MB RAM and a P3 450 or so, NT4 started to lose on fps and load times. There are a few factors to that testing: lots of disabled services, clean installs, and hardware as near to the HCL as possible except for the video cards.

While I haven't yet found the level of hardware where Windows XP beats Windows 2000, I have noticed a trend for vastly better memory management above 2GB in XP. I expect that at 2.5GHz and 4GB RAM the XP install will win out or at worst come even.

I'm also a netadmin, and when people ask if it's time to upgrade to the latest OS I always say "Ready when you are". I like new tech; anyone who doesn't has no place working in IT.

What can I do with XP that I can't with NT4? GET SUPPORT!

4/26/2005 9:14:00 PM by rivet


# re: .NET World Follow Up

'Then ask yourself what you really can do on the Windows XP system that you can’t do on the Windows NT 4 system.'

Connect it to an ADSL line?

I give you 10 minutes before NT4 is brought down irreparably. Admittedly, it's the same with an unpatched XP.

4/27/2005 1:46:00 AM by Ian Murphy


# re: .NET World Follow Up

I agree with Mark's comment on .NET applications.

Applications written in .NET are not actually free of memory leaks. However, the architecture of .NET really does help programmers write memory-leak-free network applications more easily, so I have chosen .NET as a server-side application platform. On the desktop side, I really do not appreciate .NET applications, since they launch more slowly and eat up more memory than needed, as Mark remarks here.
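
For a concrete illustration of the kind of leak that does survive in managed code, here is a minimal C# sketch of the classic rooted-event-handler pattern (all names and sizes are made up for the example): objects stay alive because a long-lived publisher still references their handlers, so the GC never reclaims them.

```csharp
using System;

// A long-lived publisher whose static event roots every subscriber.
static class Telemetry
{
    public static event EventHandler Tick;   // lives for the process lifetime

    public static void Raise()
    {
        if (Tick != null) Tick(null, EventArgs.Empty);
    }
}

class View
{
    // Held only to make the leak visible: ~1 MB per instance.
    private readonly byte[] buffer = new byte[1024 * 1024];

    public View()
    {
        Telemetry.Tick += OnTick;   // subscribes...
    }

    private void OnTick(object sender, EventArgs e) { }

    // ...but never unsubscribes, so every View ever created stays
    // reachable through Telemetry.Tick and its buffer is never reclaimed.
}

class Program
{
    static void Main()
    {
        for (int i = 0; i < 100; i++)
            new View();   // ~100 MB the application no longer uses, but the GC cannot free

        GC.Collect();
        Console.WriteLine("Memory after collect: {0} bytes", GC.GetTotalMemory(true));
    }
}
```

The fix is simply to unsubscribe (Telemetry.Tick -= OnTick) when the object is done, which is the managed equivalent of remembering to free.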

5/2/2005 5:14:00 AM by Anonymous


# re: .NET World Follow Up

Two remarks:

(a) On a Windows Terminal Server we have several GB of RAM, BUT THIS IS SHARED BY 10-100 USERS AT THE SAME TIME! So here a too big RAM footprint really hurts.

(b) Does anyone remember UCSD Pascal? Kind of Java's and .NET's grand(grand)father. At the time it was popular (Apple II ...) machines had at most 64K of RAM, 360K disks and no hard disk drives. But they already had useful applications running (compilers, editors, SQL databases, word processors, spreadsheets, ...). And this was partly due to the fact that the interpreted P-code used was very compact. For a more recent example of this kind, look at Niklaus Wirth's Oberon operating system and (OO+GC) programming language -- 3c6c

5/8/2005 3:14:00 AM by Anonymous


# re: .NET World Follow Up

While I agree that .NET is bad from a pure performance perspective, I still think that .NET is good overall. In order to increase productivity while programming, it helps to abstract things out, which is what .NET really brings to the table.

As to the NT4 to WinXP comparison, or really what Kevin Rice has said with going back further to old command line systems, yes the new computer is not any faster at doing the exact same thing. But, there are many other important considerations.

First, adding extra functionality generally follows an exponential curve in terms of the computation the application does for the end user: the simple bits of functionality are cheap to add, and each feature after that costs more.

Second, the extra functionality also seems to be exponential in terms of programmer time. Generally the first features are the simple-to-implement ones, so when you look at how to improve the application next you're looking at something much more complex to implement. The management of the application also has its impact: a nice, simple-to-maintain app won't stay that way if you add more functionality to it. That requires the app to become more modular, which in turn decreases the performance of the application.

5/10/2005 10:09:00 AM by Eli Allen


# re: .NET World Follow Up

Just because each release of anything from Microsoft is a greater resource hog than its predecessors doesn't mean it needs to be so. I work on several platforms, among them WinXP and Apple's OS X... and since OS X 10.1 every subsequent release has run faster on the same hardware.

Now, personally, I think .NET has a great deal to offer, but I also think that a clean framework for C++ would've accomplished many of the same goals. It isn't so much that Microsoft's Java knockoff is wonderful, it's that MFC/COM was so bloody awful. But, like it or not, .NET isn't disappearing anytime soon.

If memory footprint really turns out to be that much of a problem, there are other platforms out there perfectly able to fill the need. People who think Windows is the only real choice need to update their educations.

5/16/2005 11:07:00 PM by Anonymous


# re: .NET World Follow Up

As a developer in MS technologies I have progressed from VB6 to .NET. For the past 3 years I have been developing both C# and VB.NET applications and I must say I really like the framework. Slamming .NET based on one port of a VB6 application to .NET is not a fair comparison, and much is being overlooked. For example, I could write horrible VB6 code that would perform very slowly, the same way I could write horrible VB.NET code.

VB.NET is a major change from VB6, and many factors should be considered when porting an application. One factor that instantly comes to mind is the use of late binding. In VB6 it was perfectly legal to declare a variable as a Variant data type and cast it to something at runtime. In .NET this is highly frowned upon because it is expensive and slow. A properly designed .NET application can perform quite well.

Furthermore, .NET applications tend to load slowly the first time they are run, but after that the application runs faster because it has been compiled. There is also an option to not wait for just-in-time compilation but to compile your .NET assemblies for your platform at install time. There is a great example on MSDN of porting a classic ASP e-commerce site to .NET. It includes an unoptimized and an optimized .NET version; the optimized version, which follows best practices, runs quite fast.
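
As a rough illustration of the late-binding cost described above (the class and iteration count here are hypothetical, not taken from any real port), a small C# sketch contrasting an early-bound call with the reflection-style dispatch that naive Variant-to-Object ports effectively pay for on every call:

```csharp
using System;
using System.Collections;
using System.Diagnostics;
using System.Reflection;

class LateBindingCost
{
    static void Main()
    {
        const int iterations = 1000000;

        // Early-bound: the call target is resolved at compile time.
        var list = new ArrayList();
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            list.Add(i);
        sw.Stop();
        Console.WriteLine("early-bound: {0} ms", sw.ElapsedMilliseconds);

        // Late-bound: the target is looked up and invoked through reflection
        // each time, roughly what VB6-style Variant calls turn into when
        // ported naively to .NET.
        object o = new ArrayList();
        MethodInfo add = o.GetType().GetMethod("Add");
        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            add.Invoke(o, new object[] { i });
        sw.Stop();
        Console.WriteLine("late-bound:  {0} ms", sw.ElapsedMilliseconds);
    }
}
```

The install-time compilation mentioned above is done with the Framework's ngen.exe tool, which pre-compiles an assembly to native code so the first launch skips most of the JIT work.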

5/17/2005 9:43:00 AM by Dennis


# re: .NET World Follow Up

The computer world making more work (and money) for itself.... Most people don't even know what is going on, and most of those who know, couldn't care less!

All we need is a revolutionary code interpreter that can mix and match and translate code. Somebody is working on it... I'm sure.

So where is the linux that is optimised for boot-speed? Because it isn't much better!

I once tried a 4MB Windows 95 installation that could run Netscape. And it worked. I bet it would boot in a second.

The old windows' were buggy and crashed a lot more than Windows XP, that is also true.

What I am curious about, is why someone hasn't made a linux that boots instantly. Why can't you listen to your mp3's and start working on your spreadsheet while your pc is still 'booting'?

Windows XP can be set up to be pretty quick, and I was surprised to see it boot in 20 seconds on a PII 266.

I am rambling, yes. But the bottom line is that apparently no-one has come up with a money-making scheme of high-enough priority that involves speeding up code.

Actually, it's quite simple why the programming world is in its current state!

Remember just 3 years ago, how long you would have to wait for your C++ code to compile?

I have done some comparisons on C, pascal and assembler compilers. What I found is that the C and Pascal compilers overall still produce up to 70% redundant machine code in their output.

The problem is thus with the compiler models currently in use.

Maybe someone is hoping that more abstract code and languages will somehow be the gateway to more efficient compilation...

So what is the most efficient compiler? Who is funding research into better techniques? Where can I get paid to pursue this in what little time I have free?

6/12/2005 10:44:00 PM by Anonymous


# re: .NET World Follow Up

Rico Mariani had a series recently on blogs.msdn.com wherein he compared the performance of an app written in both c# and c++.

http://blogs.msdn.com/ricom/

The results were interesting. Comparing first iterations of the two apps, the c# app far outperformed the c++ app. After several rounds of optimisation, the c# app still beat the c++ app, but not by as much.

His conclusion on memory usage:
...
Even with all these overheads we did a pretty darn good job right out of the gate. Remember we turned in a fabulous bottom line with zero effort even including the downsides mentioned above.

What can you learn from this? I think the main lesson is that you get the best "deal" from the CLR when you give it a meaty amount of work to do. For very small programs, if you don't mind a few megabytes of overhead you're fine but if you need to be tiny you'll be disappointed. For large existing programs, you'll be very sad indeed if you add managed code to your program to do just one very small thing -- it's not a good deal.

If you're introducing managed code, give us something meaty and you'll be much more pleased. Or else use the managed code at a time when space is not at a premium. The worst thing you could possibly do is add managed code to an already long startup sequence and have it do just a wee little something that's inconvenient to do in unmanaged code. Benchmarkers keep this in mind, and I've said this before but it bears repeating: If you want to make this framework look bad, or any framework for that matter, then benchmark it doing almost nothing. You can get all the badness you like that way, but it's vacuous badness.

If you instead keep these parameters in mind and use the system thoughtfully you can have great success -- often with comparative ease! Personally I think this Dictionary reader, the managed version, was a tremendous success.

6/13/2005 1:50:00 PM by Anonymous


# re: .NET World Follow Up

Todd: While I do not know the specifics of your case, it very much sounds like you've just not done enough instrumentation. Coming from the non-managed world it is very easy to write managed code that performs very poorly, and even if you are an expert on managed code and .NET you should still instrument while developing.

It's possible to make high performance apps in .NET but without years of experience you really need to profile often so you get to know the costs of all the abstractions provided.

6/24/2005 9:03:00 AM by Anonymous


# re: .NET World Follow Up

Regarding the .NET startup costs and boot time..

It's very simple. Create a managed service that periodically does something. This causes the .NET runtime to be constantly loaded, similar to what we can expect in Longhorn. Now all managed apps will start faster because the .NET files are kept in the file cache.
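
A minimal sketch of such a do-nothing managed service; the service name and interval are placeholders, and it would still need to be registered (for example with installutil.exe) and set to start automatically:

```csharp
using System;
using System.ServiceProcess;
using System.Timers;

// A trivial managed Windows service that wakes up periodically so the CLR
// and framework assemblies stay resident in the file cache.
public class ClrWarmerService : ServiceBase
{
    private Timer timer;

    public ClrWarmerService()
    {
        ServiceName = "ClrWarmer";   // placeholder name
    }

    protected override void OnStart(string[] args)
    {
        timer = new Timer(60000);    // fire once a minute
        timer.Elapsed += delegate { GC.GetTotalMemory(false); };   // trivial periodic work
        timer.Start();
    }

    protected override void OnStop()
    {
        if (timer != null) timer.Stop();
    }

    public static void Main()
    {
        ServiceBase.Run(new ClrWarmerService());
    }
}
```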

Second.. Forget what Mark Russinovich said. Use only managed apps if you can. When there are no memory-leaking crappy legacy apps around, you can keep using STR (suspend to RAM) to shut down the computer every day. When you need it again, the "boot time" is a non-issue.

And developers.. get VSTS and use the instrumentation for managed code! It's great.

6/24/2005 9:12:00 AM by Anonymous


# re: .NET World Follow Up

it doesn't matter if you like it or not, boys, Microsoft likes .NET! there are two ways you can go. one, you write your applications in c++, which gives a small footprint and runs faster, but takes lots of time to implement, debug and design the user interface for. two, you can "upgrade yourself" and start using c# or vb .net (they are the same anyway), write code 10x faster than writing native code, and be there when Longhorn hits the shelves.
Microsoft made its decision a long, long time ago. And in this ecosystem we are only tiny little ants who have no effect on the big picture.

7/19/2005 6:43:00 AM by Kaan Gonel


# re: .NET World Follow Up

These 2 sentences seem to cancel each other out:

Memory leaks are not as important because most client applications don’t require continuous access and restarting the application (or rebooting if necessary) resets a leak.

Memory footprint is much more important for a client-side-only application since there can be many such applications running concurrently and clients often have limited memory.


I agree footprint is a big deal, but wouldn't a leaky c++ app have a much bigger footprint than a managed one?

For example, I've been working on a mixed C++ and .net app, and accidentally had a leak on the C++ side, and suddenly a rarely used code-path that should have used 40-50Megs was using over 500. I had to go back and find and fix the leak, whereas in .net it would have been GC'ed. If the leak had been only an extra 20-30megs I might not have noticed and never fixed the problem at all.

7/28/2005 11:23:00 PM by RyanLeeSchneider


# re: .NET World Follow Up

For all the .NET proponents, I ask what other platforms you can run your C# and .NET programs on. As I currently see it, your apps are stuck in a Microsoft-specific language and programming environment that is useless when it comes time to port to *nix.

I agree MFC is not much better in the portability respect, but much of the memory-leak handling and network-operation abstraction can be built into a set of standard C++ classes that can be fully portable.

NT 4.0 is perfectly happy on a DSL line as long as you disable all unnecessary services. No need for a firewall if none of the ports are open. It is the services that get exploited. Install a decent non-IE browser and you are good to go.

8/23/2005 6:14:00 PM by Anonymous


# re: .NET World Follow Up

>For all the .NET proponents, I ask what other platforms you can run your C# and .NET programs on.

Those platforms which Mono and Rotor allow. I've run my .NET 2.0 app in the latest Mono build on Debian and it works very well; 5 lines needed changing since Mono is a few months behind the latest .NET 2.0 beta changes.

If you need to do GUI then there are many approaches. Do some research next time you comment.

8/25/2005 12:17:00 PM by Anonymous


# re: .NET World Follow Up

"What I am curious about, is why someone hasn't made a linux that boots instantly. Why can't you listen to your mp3's and start working on your spreadsheet while your pc is still 'booting'?"

If that's what rocks your boat, try linuxbios.org. They load the kernel straight into the BIOS ROM. No need to POST; the kernel is the first thing that gets loaded. So yeah, instant boot has been possible for a long while. Add to that init-ng, which boots on my system in about 10 seconds (after POST, which could be skipped with LinuxBIOS). The reason these projects aren't mainstream is that there isn't enough financial support for them. Both are being developed by a handful of people with hardly any funds.

"For example, I've been working on a mixed C++ and .net app, and accidentally had a leak on the C++ side, and suddenly a rarely used code-path that should have used 40-50Megs was using over 500. I had to go back and find and fix the leak, whereas in .net it would have been GC'ed. If the leak had been only an extra 20-30megs I might not have noticed and never fixed the problem at all."

I know in corporate environments time is money, and if you spend time fixing a leak you might get too expensive. But really, you can't be serious, right? Memory leaks should be fixed. Bad code should be fixed. Almost makes me want to say MS wants .NET so that we can all put more crappy code on our systems. Who cares whether it's bad or not, at least it won't crash your system. No really, that's why I don't install .NET, and don't run .NET stuff. .NET makes it too easy for people to write sloppy code and get away with it.

10/21/2005 12:05:00 PM by node


# re: .NET World Follow Up

The developers in a project I was involved with recently used .NET to build a server-side web application. It pulls database records into memory to reduce the amount of time it takes the user to load the next page. That aside, it starts out "clean" at just over 200 MB of private memory. Log about 10 users on and it can quickly jump to 1 GB. Talk about bloatware. I've actually seen it go down on a Dual Xeon 2GHz with 4 GB of memory with only 20 users logged in.

11/10/2005 11:18:00 AM by s33ker


# re: .NET World Follow Up

"I have done some comparisons on C, pascal and assembler compilers. What I found is that the C and Pascal compilers overall still produce up to 70% redundant machine code in their output."

But typical applications spend 95% of their time executing 5% of the code, so hand-optimising that 5% of the code produced by the compiler can give 95% of the benefits of a mythical perfect compiler.
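
A quick back-of-the-envelope check of that reasoning, taking the 70%-redundancy and 95/5 figures from the two comments and assuming the "perfect" compiler would speed everything up uniformly:

```csharp
using System;

class HotSpotMath
{
    static void Main()
    {
        double hotFraction  = 0.95;   // share of run time spent in the hot 5% of code
        double coldFraction = 0.05;
        double optimised    = 0.30;   // relative cost after removing 70% redundancy

        // Perfect compiler: everything gets the optimisation.
        double perfect = (hotFraction + coldFraction) * optimised;        // 0.30

        // Hand-optimising only the hot path, leaving the cold code alone.
        double handTuned = hotFraction * optimised + coldFraction;        // 0.335

        Console.WriteLine("Perfect-compiler speedup: {0:F2}x", 1 / perfect);    // ~3.33x
        Console.WriteLine("Hot-path-only speedup:    {0:F2}x", 1 / handTuned);  // ~2.99x
        Console.WriteLine("Share of the ideal gain captured: {0:P0}",
            (1 / handTuned - 1) / (1 / perfect - 1));                           // ~85%
    }
}
```

Under these assumptions the hot-path-only approach captures roughly 85% of the ideal gain, which is in the same ballpark as the claim.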

3/3/2006 3:54:00 PM by Anonymous


# re: .NET World Follow Up

A Windows NT 4.0 system cannot get patched up easily for old vulnerabilities since Microsoft makes their money by selling new operating systems.

I ran Windows 98 for a long time before I upgraded to windows 2000 in year 2006. I needed USB and video cards which NT didn't support but Win98 did.

Windows 2000 lets me hibernate the computer, and the load times are really fast when you come out of hibernation. Windows 2000's startup time is slow compared to Windows 3.11 or Windows 3.1, but Windows 3.1 apps crash like crazy. I still have some old Win 3.1 apps for MIDI, but they always crash on Windows 2000 and I have lost music work done in MIDI several times.

Windows 2000 needs to be restarted about once a week or sometimes once every 2 weeks. Windows 98 needed to be restarted once a day. It's definitely gotten better that way.

As for Windows NT - the big problem is hardware manufacturer support and Microsoft updates. Since Windows NT is dated and MS wants you to buy new products from them, it will be hard to get updates for an NT 4.0 system.

What ticks me off is that people are moving toward XML and .NET because supposedly processors are getting faster and faster. It's funny because I've seen 1.5Ghz laptops for sale for several eons.

1.5GHz was here about two years ago or earlier, and they are still selling 1.5GHz laptops in October of 2006, the date I am writing this.

I think that processors are no longer getting faster as they once were. A few years ago it was typical to upgrade your 200Mhz pentium machine up to a 600Mhz machine. That's three times the MHz. But now I see 1.5Ghz computers for sale, and these have been for sale for ages.

I don't see 6.5Ghz computers all over the stores. And I don't see 12.8Ghz computers all over the stores. I don't see 24Ghz computers. CPU speeds are no longer doubling, folks.

The other thing to consider about the future of 12Ghz and 35Ghz computers with 60GB of ram - ELECTRICITY costs! Electricity prices have always gone up, because electricity requires fuel. Fuel prices are always going up.

So if my 12Ghz machine requires 1500W of electricity this will be as much as an electric heater. To have an electric heater running constantly (since I leave my machine on 16 hours out of the day) is going to cost a lot of money. Especially if one has 4 computers in the house.

More and more households are adding additional desktops. People are moving away from family-shared computers. There are usually at least 2 or 3 computers in a household: one for a kid, one for the parents, and maybe one for another kid.

10/17/2006 1:22:00 PM by Lars

