What makes a “lightweight” desktop environment lightweight?

Over the last few days I have been wondering what a “lightweight” desktop actually is. And I must say I couldn’t come up with an answer to that question.

I considered various criteria like “being memory efficient”, which I discarded for obvious reasons. First of all, it’s difficult to measure memory usage correctly (I haven’t seen anyone who publishes numbers do it correctly, and that especially includes Phoronix). And then it’s mostly comparing apples to oranges. Loading a high-resolution wallpaper might make all the difference in memory usage. Also, if desktop environment Foo provides features which are not provided by Bar, it’s obvious that Foo uses more memory. But it’s still apples vs. oranges: it’s not a comparison of memory, it’s a comparison of features. And of course one might also consider the time-memory trade-off.
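For illustration of why naive numbers mislead: summing each process’s RSS double-counts pages shared between processes (shared libraries, for instance), while the kernel’s PSS value divides each shared page among its users. A minimal sketch, assuming a Linux system that provides the /proc/&lt;pid&gt;/smaps_rollup file (available in recent kernels); the helper names are my own:

```python
# Sketch: compare RSS (double-counts shared pages) with PSS
# (splits shared pages proportionally). Linux-only; assumes
# /proc/<pid>/smaps_rollup exists (kernel 4.14 and later).

def parse_kib(text, field):
    """Extract a 'Field:   1234 kB' value from smaps_rollup text."""
    for line in text.splitlines():
        if line.startswith(field + ":"):
            return int(line.split()[1])  # value is reported in KiB
    return 0

def memory_of(pids):
    """Sum RSS and PSS (in KiB) over a list of process IDs."""
    rss = pss = 0
    for pid in pids:
        with open(f"/proc/{pid}/smaps_rollup") as f:
            text = f.read()
        rss += parse_kib(text, "Rss")
        pss += parse_kib(text, "Pss")
    return rss, pss
```

Run over all processes of a desktop session, the RSS sum will typically come out far larger than the PSS sum, which is one concrete way a benchmark that just adds up per-process RSS goes wrong.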

So is it all about features? Obviously not. If there is a feature a user needs and uses, it cannot be bloat. And having only a few features cannot be the key to being lightweight either. Being evil now: many people say GNOME is removing features, but nobody would say that GNOME is lightweight.

What about support for old systems? That’s not lightweight, that’s support for old hardware. And it’s something which doesn’t make any sense given Moore’s law. Which raises the first question: what is old hardware? One year, two years, ten years? Is it a moving target, or is a Pentium III the reference for all time? Optimizing for old hardware means not making use of modern hardware capabilities. But does it make sense not to use modern hardware if it is available? Using the GPU for things the GPU can do better than the CPU is a good thing, isn’t it? Parallelizing a computation across multiple cores where possible is a good thing, isn’t it? But if you do so, you are optimizing for modern hardware and not for old hardware. So does saying you are good for old hardware imply you are bad on new hardware? Also I’m wondering how one can optimize for old hardware at all. Developers tend to have new hardware, precisely to avoid problems like this. And how can one keep support for old hardware when the complete stack is moving towards new hardware? Who tests the kernel against old hardware? Who provides DRI drivers for obsolete hardware which doesn’t fit into modern mainboards (who remembers AGP or PCI)? Who ensures that software still works on 32-bit systems, and who would notice such a breakage, for example in the X server? So lightweight cannot mean fit for old hardware. And remember: optimizing for old hardware is not the same as optimizing for modern low-end hardware. Even the Raspberry Pi has a stronger CPU (700 MHz) than the oldest Pentium III (450 MHz) – not to mention things like OpenGL…

What is it then? Let’s ask Wikipedia. For Xfce it tells us that “it aims to be fast and lightweight, while still being visually appealing and easy to use”. Unfortunately there’s no link for lightweight and also no reference. Let’s try again, LXDE: “The goal of the project is to provide a desktop environment that is fast and energy efficient”. Again no real definition, just a stated goal. But it goes on:

LXDE is designed to work well with computers on the low end of the performance spectrum such as older resource-constrained machines, new generation netbooks, and other small computers, especially those with low amounts of RAM.

I have no idea what a “new generation netbook” is, but it sounds like something that was modern half a decade ago. But we are back to “being good on old hardware”, which we just discarded. Interestingly, Wikipedia has references to prove that LXDE is good on RAM – unfortunately they’re references to Phoronix. Shame on you, Wikipedia: that’s a “benchmark” which has been considered seriously flawed by people who understand the topic.

OK, Phoronix aside, there is one more of the lightweight desktops to check. Razor-qt “is a lightweight, Qt-based free software desktop environment for the X Window System. Unlike the KDE desktop environment which is also based on Qt, Razor-qt usually performs well even when used in computers with older hardware”. Damn, again only claims and not a reference for anything. Who says that the KDE desktop environment (what’s that again?) is not performing well on older hardware? And what does the “usually” mean in that sentence?

So all Wikipedia gives us is buzzword bingo and various claims without any proof (and not even a “citation needed”). Let’s try something different and go to the projects directly.

Xfce tells us that it “is a lightweight desktop environment for UNIX-like operating systems. It aims to be fast and low on system resources, while still being visually appealing and user friendly.” Which sounds very similar to what Wikipedia wrote. Unfortunately there is nothing that tells us what lightweight means, or what fast or low on system resources are. Fuzzy is fuzzy.

LXDE also has the same definition as shown on Wikipedia (maybe we should mark those wiki articles as advertisement?) but at least gives us a definition of what lightweight means to them: “It needs less CPU and performs extremely well with reasonable memory.” Fuzzy is fuzzy. What’s “reasonable memory” and “less CPU”? It goes on about various other things like “fast” and “energy saving” (which even contradict each other, e.g. fast is defined as working on hardware from 1999 and not requiring 3D – how that can be energy saving on new hardware, someone has to show me). The point about energy saving is quite funny, as it just says they are better than “other systems”. Oh well, fuzzy is fuzzy.

Last but not least: Razor-qt. Razor doesn’t say it’s lightweight:

Razor-qt is an advanced, easy-to-use, and fast desktop environment based on Qt technologies. It has been tailored for users who value simplicity, speed, and an intuitive interface. Unlike most desktop environments, Razor-qt also works fine with weak machines.

Fuzzy is fuzzy. What’s a “weak machine”, what are the “most desktop environments”? Where’s the proof for other DEs not working well on those not-defined “weak machines”?

And here, poor fool! with all my lore I stand, no wiser than before. All I could derive from studying what lightweight means is that one just has to claim to be lightweight. Bonus points if you include it in your name. It seems that if you claim it often enough, people will repeat it and it will turn into truth.


In a comment on Jos Poortvliet’s Google+ page you say “What it needs is a new decoration designed for low performance”. By decoration, do you mean the window decoration and/or the decoration of windows and components (buttons, etc.)? Where could I get info on how to make a decoration, and what are the things that make one “lightweight”? Thanks in advance.

I do not get your first question. For the second: there is some information in our KWin wiki space about how to implement window decorations. What we need is a window decoration which would perform somewhat OK-ish with network transparency. Though I’m thinking about whether that’s still a valid use case given Qt 5.

I believe most people think “lightweight” just means that your software doesn’t demand many CPU cycles to do basic things like opening a menu, moving a window, opening multiple files, etc.

I leave RAM out, as I follow the old idea that unused RAM is wasted RAM (remember those Windows users who boasted that only 20–30% of their 64/128 MB of RAM was used, while Unix people had 1–2 megabytes free?). The same goes even more for storage space: no need to have a 2-terabyte drive if you only use 60 gigabytes.

But the CPU I would consider to be what makes something “heavy” or “light”, as it is what draws power from the battery or the plug and causes a higher power bill.

Unfortunately, I cannot contribute a scientific definition. All I have is experience. On any computer I own with a discrete graphics card, GNOME and KDE run very well. They also run well on Nvidia or AMD motherboard graphics. On my netbook, an Acer Aspire One, KDE does not run at anything approaching an acceptable level – unless you like waiting 1–4 minutes (or more) to launch a program. It seems to be mostly the GUI’s fault, but I’m not sure. On that system, Xfce runs perfectly, as do LXDE and Fluxbox. Were it for my sole use, I’d just run Fluxbox and save all the CPU cycles for programs. But it needs to be wife/kid friendly. As a note: every computer in my house that’s powerful enough runs KDE, as that is my favorite DE and I make heavy use of activities.

I use only KDE on my netbook and I don’t notice any lag or frame dropping with it. It is blazing fast even though I only have an Intel Atom 450 CPU with 512 KB cache at 1.66 GHz (two threads). For RAM I have the default 512 MB. It is an Asus 1000HE, which has a slightly newer CPU in it. I also used an Acer Aspire One (the first generation) and it worked perfectly fine, but I didn’t like the smaller keyboard.

I don’t use any activities (I haven’t found any good reason for them to exist overall) but I keep almost all bells and whistles turned on in KWin.

I don’t use many different applications, mainly digiKam, Rekonq, Amarok, Dolphin, Basket, Calligra Words and Kate. And I typically have 4–5 of those running at the same time.

(Not to start anything, but on the same computer the latest GNOME lags and drops frames a lot, and LXDE or Xfce 4 run well but are not as nice from a usability standpoint, so I don’t use them, though I have them installed.)

I think Will Stephenson identified what ‘lightweight’ often means: a quick startup time.

It also refers to “works well on my computer”.

In the end it also just means reliable. If a process starts chewing CPU for no reason that’s being ‘heavyweight.’

Given what you develop, Martin, I guess that’s particularly frustrating, since essentially you are at the whim of the device drivers. (Though with KDE 5 the entire desktop will be… maybe that will actually shake out to make things easier, since it means dropping support for hardware without working OpenGL stacks.)

So you are correct that in the end it doesn’t mean much at all. But obviously stuff like quick startup times and reliability are good goals for any software project. If we see criticism of KDE being too heavy we should probably just assume that’s what is meant.

> Over the last few days I have been wondering what a “lightweight” desktop actually is. And I must say I couldn’t come up with an answer to that question.
>
> I considered various criteria like “being memory efficient”, which I discarded for obvious reasons. First of all, it’s difficult to measure memory usage correctly (I haven’t seen anyone who publishes numbers do it correctly, and that especially includes Phoronix).

Easily measurable or not, memory efficiency is a quality you can take into consideration when developing software to make it more memory efficient. Measuring memory usage of individual processes present in a piece of software and summing the numbers is also a viable way of measuring memory usage.

> And then it’s mostly comparing apples to oranges. Loading a high-resolution wallpaper might make all the difference in memory usage.

I completely agree with that. I think a more valid comparison would be to examine which software loads the same wallpaper in the most efficient manner, desktop wallpapers thankfully being a feature most DEs support.

> Also, if desktop environment Foo provides features which are not provided by Bar, it’s obvious that Foo uses more memory. But it’s still apples vs. oranges: it’s not a comparison of memory, it’s a comparison of features.

It’s not a given that more features means higher memory usage. A case to prove my point is “notepad.exe” versus “vi”. I think you will agree that vi has a richer feature set, but notepad.exe is more memory expensive.

> And of course one might also consider the time-memory trade-off.

Agreed. If anything, CPU time spent and memory usage should be tuned to the specific requirements of the application, and the scarcity of these resources. That doesn’t change the fact, though, that unless you’ve reached the unlikely case of algorithmic perfection, one piece of software can perform the same tasks as another in seemingly (to the end user) the same way, but in a more efficient manner — both in terms of CPU usage and RAM usage.

> So is it all about features? Obviously not. If there is a feature a user needs and uses, it cannot be bloat. And having only a few features cannot be the key to being lightweight either.

I don’t understand the basis of any of these conclusions. I’d say that having only a few features might very well be a key to being lightweight, but like you, I’d be arguing on the basis of my opinions and not facts.

> Being evil now: many people say GNOME is removing features, but nobody would say that GNOME is lightweight.

I don’t know about this, so I will not address it beyond saying that having fewer features isn’t the only aspect of being lightweight.

> What about support for old systems? That’s not lightweight, that’s support for old hardware.

As far as I’m concerned, “support for old hardware” is just another way of saying that the software has low enough resource requirements to run on less than an average desktop computer. How you come to the conclusion that that’s not lightweight, I don’t understand, since the question you are asking is what lightweight means.

> And it’s something which doesn’t make any sense given Moore’s law.

I’m not sure how Moore’s law has anything to do with this. Moore’s law, as a prediction of the increase in complexity of computer hardware, has nothing to say about software complexity or efficiency. It doesn’t deal with how the increasing amount of resources should be spent.

> Which raises the first question: what is old hardware? One year, two years, ten years? Is it a moving target, or is a Pentium III the reference for all time?

That is a good question that I can’t answer. In a general DE sense, I think people use this term to address systems of less capacity than the average desktop computer, but I agree that it’s a vague notion at best.

> Optimizing for old hardware means not making use of modern hardware capabilities.

Not necessarily. Software can be built in such a way that modern hardware capabilities are utilized if available, falling back on procedures that lower-end systems support.
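That pattern, probing for a capability and falling back, can be sketched like this. The capability names and renderer labels are made up for illustration and are not any desktop environment’s actual API:

```python
# Sketch of graceful degradation: use the fancy path when the
# hardware supports it, otherwise fall back to a software path.
# Capability keys and renderer names are hypothetical.

def choose_renderer(capabilities):
    """Pick the best renderer the reported capabilities allow."""
    if capabilities.get("opengl", False):
        return "opengl-compositing"
    if capabilities.get("xrender", False):
        return "xrender-compositing"
    return "no-compositing"  # plain 2D fallback for the oldest hardware
```

An environment built this way isn’t “optimized for old hardware” at the expense of new hardware; it simply degrades to whatever the machine offers.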

> But does it make sense not to use modern hardware if it is available? Using the GPU for things the GPU can do better than the CPU is a good thing, isn’t it?

Yes, probably. One should consider the negative impact of introducing features that specifically require GPU hardware, though. There ought to be a compelling reason to introduce them, since they naturally exclude working hardware that is still in use. That’s one of the reasons I think it’s good to have a wide range of desktop environments that all focus on different user requirements. As far as I’m concerned, most of what I want out of a window manager in terms of graphics can be done with simple blitter hardware. That has generally been available in x86 PCs for more than a decade, and in home computers in general since the MSX2 and the Amiga 1000. X11 already uses the graphics hardware available for this, reverting to the CPU if not.

> Parallelizing a computation across multiple cores where possible is a good thing, isn’t it? But if you do so, you are optimizing for modern hardware and not for old hardware.

Agreed. As far as desktop environments go, they all already do this by running different tasks in different pieces of software in different processes.

> So does saying you are good for old hardware imply you are bad on new hardware?

I disagree on the basis of the points I made above.

> Also I’m wondering how one can optimize for old hardware at all. Developers tend to have new hardware, precisely to avoid problems like this.

Good developers also have a basic understanding of how their compiler works and understand how their code affects complexity. Optimizing for old hardware is as simple as decreasing the use of resources that are scarce on old hardware. A developer will have a good idea of these parameters without even running the software.

> And how can one keep support for old hardware when the complete stack is moving towards new hardware?

Old hardware is not the only arena where restrictive use of resources is desirable. Consider embedded platforms, phones, laptops, set-top boxes, servers, you name it — they all have different requirements.

Phones are a particularly good example in that an abundance of resources doesn’t necessarily mean you should utilize it fully at all times, since that comes at the expense of battery life.

> Who tests the kernel against old hardware?

Users. Kernel breakage doesn’t usually go unnoticed.

> Who provides DRI drivers for obsolete hardware which doesn’t fit into modern mainboards (who remembers AGP or PCI)?

> Who ensures that software still works on 32-bit systems, and who would notice such a breakage, for example in the X server?

Again, users. 32-bit systems are still abundant, and I don’t see them going away any time soon. 8-bit and 16-bit systems are also abundant, both modern and old. If they didn’t go away, why should 32-bit systems?

> So lightweight cannot mean fit for old hardware.

I’m not sure how you arrive at this conclusion, even if the points you made above had all been correct. Could you elaborate on that?

> And remember: optimizing for old hardware is not the same as optimizing for modern low-end hardware. Even the Raspberry Pi has a stronger CPU (700 MHz) than the oldest Pentium III (450 MHz) – not to mention things like OpenGL…

The Raspberry Pi isn’t really low-end hardware in terms of the kind of measly hardware the Linux kernel actually supports. Optimizing for old hardware in the most general sense (by minimizing algorithmic complexity and RAM usage) is exactly the same as optimizing for modern low-end hardware.

And you’ll have to back up that claim of the Raspberry Pi CPU being faster than a 450 MHz P3. The Raspberry Pi foundation claims something closer to a 300 MHz Pentium II, but without citing any performance benchmarks. The ultimate fallacy here I guess is that you made your conclusion on the basis of the different clock rates. Clock rate comparison isn’t very meaningful when comparing two vastly different architectures, and neither is the instruction execution rate.

> What is it then? Let’s ask Wikipedia. For Xfce it tells us that “it aims to be fast and lightweight, while still being visually appealing and easy to use”. Unfortunately there’s no link for lightweight and also no reference. Let’s try again, LXDE: “The goal of the project is to provide a desktop environment that is fast and energy efficient”. Again no real definition, just a stated goal. But it goes on:
>
> LXDE is designed to work well with computers on the low end of the performance spectrum such as older resource-constrained machines, new generation netbooks, and other small computers, especially those with low amounts of RAM.
>
> I have no idea what a “new generation netbook” is, but it sounds like something that was modern half a decade ago. But we are back to “being good on old hardware”, which we just discarded.

By a “new generation netbook” I assume they mean netbooks, which are still heavily in use. I bought one two years ago with 1 GB RAM and a dual-core Atom CPU, and needless to say, the computing resources available weren’t those of a typical desktop system at the time. Also, “it aims to be fast and lightweight, while still being visually appealing and easy to use” is, although unreferenced, completely true unless the Xfce developers are lying about their aims on the official project website.

> Interestingly, Wikipedia has references to prove that LXDE is good on RAM – unfortunately they’re references to Phoronix. Shame on you, Wikipedia: that’s a “benchmark” which has been considered seriously flawed by people who understand the topic.

The link you use as a reference doesn’t argue why Phoronix is flawed on any technical basis. Neither do any of the links in the article. If this supposed expert really is an expert on benchmarking, I’m sure he would be able to prove his point, and as an expert, it would probably also be in his interest to do so. Just looking at who said something isn’t a particularly strong basis for a claim like that.

> OK, Phoronix aside, there is one more of the lightweight desktops to check. Razor-qt “is a lightweight, Qt-based free software desktop environment for the X Window System. Unlike the KDE desktop environment which is also based on Qt, Razor-qt usually performs well even when used in computers with older hardware”. Damn, again only claims and not a reference for anything. Who says that the KDE desktop environment (what’s that again?) is not performing well on older hardware? And what does the “usually” mean in that sentence?

I definitely agree that that’s a vague and probably completely arbitrary claim. I would like to see hard numbers or a running test before forming an opinion.

> So all Wikipedia gives us is buzzword bingo and various claims without any proof (and not even a “citation needed”). Let’s try something different and go to the projects directly.

That’s Wikipedia for you! I always propose the option of correcting articles that make these kinds of unreferenced statements.

> Xfce tells us that it “is a lightweight desktop environment for UNIX-like operating systems. It aims to be fast and low on system resources, while still being visually appealing and user friendly.” Which sounds very similar to what Wikipedia wrote. Unfortunately there is nothing that tells us what lightweight means, or what fast or low on system resources are. Fuzzy is fuzzy.

Fuzzy, but then again it’s a statement of aims, not a statement of fact. “Visually appealing” and “user friendly” are of course also concepts that only exist in the eye of the beholder.

> LXDE also has the same definition as shown on Wikipedia (maybe we should mark those wiki articles as advertisement?) but at least gives us a definition of what lightweight means to them: “It needs less CPU and performs extremely well with reasonable memory.” Fuzzy is fuzzy. What’s “reasonable memory” and “less CPU”? It goes on about various other things like “fast” and “energy saving” (which even contradict each other, e.g. fast is defined as working on hardware from 1999 and not requiring 3D – how that can be energy saving on new hardware, someone has to show me). The point about energy saving is quite funny, as it just says they are better than “other systems”. Oh well, fuzzy is fuzzy.

Again fuzzy, but make no mistake in thinking that graphics cards aren’t power hogs. The minimal rendering tasks LXDE performs might be more energy efficient when performed on a CPU than on a GPU, and the fact that you don’t even need a modern GPU for it to work means you can cut power usage down by a few watts. You could of course change my mind if you provided hard evidence for your claim that “fast” and “energy saving” are contradictory in this particular case.

> Last but not least: Razor-qt. Razor doesn’t say it’s lightweight:
>
> Razor-qt is an advanced, easy-to-use, and fast desktop environment based on Qt technologies. It has been tailored for users who value simplicity, speed, and an intuitive interface. Unlike most desktop environments, Razor-qt also works fine with weak machines.
>
> Fuzzy is fuzzy. What’s a “weak machine”, what are the “most desktop environments”? Where’s the proof for other DEs not working well on those not-defined “weak machines”?

Yes, fuzzy is still fuzzy, but hey, try running Gnome or KDE on a Raspberry Pi, then try Razor-qt and evaluate the experience and you’ll see their point. “Most desktop environments” might be a fallacious assumption, but their claims to be fast and working on “weak machines” holds true compared to at least KDE and Gnome. They don’t have to prove it for it to be true, either. It’s their project website, not an encyclopedia.

> And here, poor fool! with all my lore I stand, no wiser than before. All I could derive from studying what lightweight means is that one just has to claim to be lightweight. Bonus points if you include it in your name. It seems that if you claim it often enough, people will repeat it and it will turn into truth.

You are no wiser because you dismissed any basis for which the claim of being “lightweight” can be made, using dubious reasoning, misunderstandings, an army of straw men, and without the factual basis you expect the LXDE and Razor-qt developers to deliver.

+1 for this post. As much as I respect Martin, I think his post has serious flaws.

“Lightweight” just can’t be measured, so there is not much to prove without extensive comparisons. Is a feather lightweight? The first thing that comes to mind is: yes, of course! But I think it’s a rather heavy, bloated compilation of atoms, which really are lightweight. The God particle still laughs at my opinion though.

Every ‘label’ you will ever think of is a temporary one and carries the mark of the one who came up with it. In the end, ladies and gentlemen, most people don’t care about memory usage and CPU cycles. They will be happy when a desktop runs fluidly on their machine.

So let’s take the other approach: categories on a scale. We divide computers into 3 categories, based on power + features (aka GPU stuff). We set 3 markers on it, and desktops can claim the mark they can reach:

33%: Lightweight – the desktop runs fluidly on the top of the 33% lowest-end PCs sold in the last 10 years
66%: Middleweight – the desktop runs fluidly on the top of the 66% lowest-end PCs sold in the last 10 years
100%: Heavyweight – the desktop runs only on the most powerful ‘still common’ hardware available in PC stores right now

This is probably as close as I can get to ‘human perception’ when those statements are read. No hard science, just the most common interpretation.
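As a toy sketch of that scale: the thresholds are the ones proposed above, and the input, the percentile rank (weakest machines first) of the cheapest PC the desktop still runs fluidly on, is assumed to come from some hypothetical hardware survey:

```python
# Toy classifier for the proposed weight scale. The input is the
# percentile rank (0-100, weakest machines first) of the cheapest
# PC sold in the last 10 years that the desktop runs fluidly on.
# How to actually measure "runs fluidly" is the unsolved part.

def weight_class(weakest_machine_percentile):
    if weakest_machine_percentile <= 33:
        return "lightweight"
    if weakest_machine_percentile <= 66:
        return "middleweight"
    return "heavyweight"
```

A desktop that needs at least a median machine would thus land in the middleweight bucket, and one that needs current top-end hardware in the heavyweight bucket.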

““Lightweight” just can’t be measured, so there is not much to prove without extensive comparisons. Is a feather lightweight? The first thing that comes to mind is: yes, of course! But I think it’s a rather heavy, bloated compilation of atoms, which really are lightweight. The God particle still laughs at my opinion though.

Every ‘label’ you will ever think of is a temporary one and carries the mark of the one who came up with it. In the end, ladies and gentlemen, most people don’t care about memory usage and CPU cycles. They will be happy when a desktop runs fluidly on their machine.”

Has the definition of lightweight changed in the last 300–400 years? No… lightweight is still lightweight; the question is what you compare against and where you use it.

Old computers or slow computers are not “lightweight”, they are old or slow computers.
New, old… those are temporary definitions. The moment you carry your computer out of the store, it is an old one.

Around 2010–2011 I did a project where I installed Linux on an old Pentium III 500 MHz with 512 MB RAM. The KDE I installed at that time (4.4 if I remember correctly) flew through everything. It was amazing that you could get an over-ten-year-old computer working perfectly with an old Radeon 8500 card.

Your arguments and Iric’s are flawed, because you don’t account for the rest of the software system affecting how the graphical user interface works. I made that setup very minimal at the time: I didn’t install the full KDE SC, only what I actually required. I basically didn’t install much at all; the whole software system took only about 700 MB of disk space.
Boot times were acceptable enough that I thought I could take the computer to the garage to be used for browsing, writing and other basic things.

And the last time KWin was a hot topic for being “heavy and bloated” etc., I ran tests with KDE (4.5–4.6 if I remember correctly) with 3D off and on, on my laptop. The result was that with 3D enabled, battery life was better. Why? Because less power was wasted on drawing graphics with 3D than without. If you can use the GPU for drawing a new window or its movement, it does it faster, even at a higher wattage, than using the CPU to do it. KDE actually gave longer battery life than LXDE did (using only a browser, time measured with a watch, but even giving LXDE a 15-minute bonus it would have lost).

I am just finishing installing a Linux system on an Acer TravelMate 800, and you can check its specs… 15″ with a 1.7 GHz CPU and 1 GB RAM. This is from 2003–2004, so it is 9–10 years old, and KDE FLIES on this computer, no problems at all. It can boot to KDM in under 10 seconds (Fedora took a good 30 seconds) and to the desktop in another 10 (it could be faster if that 4-second wait time were removed by default).
Replace the slow 5400 RPM HDD with an SSD and this would be a very capable computer today, because it has a 1400×1050 display, which is a higher resolution than most of today’s laptop displays.
If it just didn’t have such bad physical damage (case cracked at the front, a couple of parts missing from the screen bezel)… it could very well be used in public.

Can you guess what makes KDE fly and be perfect on these old computers? I don’t use bloated Ubuntu or any Ubuntu variant. I don’t use Fedora or openSUSE (sorry guys!), which install all kinds of bloated system services and so on.
I install only the software that is actually needed and then maintain it.

ps. The KDE setup is not minimal, I just drop kde-edu, kde-games and most of kde-graphics from the setup. I always use the 3D mode in KWin with Present Windows + Desktop Grid, shadows, blur and minimization effects. The main applications installed now are Rekonq, KMail, digiKam, Gwenview, the Calligra suite, Amarok, Dragon Player, Okular, Kate, Basket and Chromium (Rekonq does not render all pages correctly), but it has dozens of others as well.

The current problem with KDE SC is the Akonadi framework, which uses a database as a cache? How stupid is that. Just compare Trojitá with KMail 2 and see the difference in speed. Besides, the Akonadi indexing backends like Virtuoso and Nepomuk are always sitting on one core and sucking power for no useful reason. So I am happy to get rid of them, because I know where I store my stuff and I know how to use grep and find for the harder cases. So lightweight, in my opinion, means only using CPU cycles on the user’s behalf. That means not indexing stupid stuff I don’t need but have plenty of…

I have not had indexing problems since the first Nepomuk release…
If I throw a lot of old files (hundreds of gigabytes) at a new setup, I expect Nepomuk to take its CPU time and I/O to get them indexed. But it is niced lower than other processes, so it does not slow things down.
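“Niced lower” refers to the classic Unix priority mechanism: a background worker voluntarily raises its own nice value so the scheduler favors everything else. A minimal sketch (real indexers typically also lower their I/O priority, e.g. via ionice, which is not shown here):

```python
import os

# A background worker can raise its own nice value: 0 is the
# default, 19 the lowest scheduling priority. Raising it never
# requires privileges; lowering it back down would.
def deprioritize(increment=10):
    """Raise this process's nice value and return the new value."""
    return os.nice(increment)
```

An indexer that calls something like this at startup still gets CPU time when the machine is idle, but yields it whenever foreground work appears.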

After it has indexed the files, it offers speed and features very well. Even though I know where my files are, I don’t always remember what content I had in which files. So when I search for specific keywords across hundreds of PDFs on a specific topic, I am happy to get the list of files within a second or two.

Let me tell you the reason why I have definitely dropped the most recent versions of KDE and GNOME and have a poor feeling towards the forthcoming Cinnamon. They are probably the same reasons that made me hate Windows 8.
All of these environments require, to be used, thousands of useless mouse movements to activate features that are too deeply “hidden” in a compulsive quest for useless effects and misunderstood graphical politeness.
What we need is: a stable desktop manager with a simple menu, possibly a task bar with shortcuts to the most used applications, a simple and stable file manager to manage documents, images etc. In short: no frills. The elements of the menu must be reachable with a minimum amount of mouse movements. Once activated they must stay visible until a choice is made or the menu dropped. The shortcuts must be easily created (possibly with drag and drop). The choice of applications must be wide enough to allow the satisfaction of individual needs without too much “googling”. The screen, the mouse and the keyboard must stay responsive even when the system is overloaded. Removable units like USB keys and external disks must be auto-mounted without the need to tamper with lots of mount parameters. The audio system must work out of the box. Fonts must be light (the opposite of bold) and highly readable. Color combinations must put users’ eyes at ease. Connecting printers and Wi-Fi devices MUST NOT be a nightmare, and neither must user management. The installation process must be EASILY UNDERSTANDABLE (not necessarily easy, but manageable by people without specific technical skills). After all that, if they come, frills are welcome. But not “instead of” the above features.
Give us such a desktop environment and I don’t know if it can be called lightweight. Nevertheless, I am inclined to think that it would be widely used.
Best regards.

“What we need is: a stable desktop manager with a simple menu, possibly a task bar with shortcuts to the most used applications, a simple and stable file manager to manage documents, images etc. In short: no frills. The elements of the menu must be reachable with a minimum amount of mouse movements. Once activated they must stay visible until a choice is made or the menu dropped. The shortcuts must be easily created (possibly with drag and drop).”

Which “we” are you talking about? There are large groups of seniors, kids, etc. who need a different kind of graphical user interface than what you listed there. In particular, they need the effects: they need visual feedback for every action that happens, or that they perform, so that they learn quickly and don’t panic because something just jumped onto the screen. They need flexibility, so that the needed applications and features can be added and everything else removed: instead of 5+n toolbar buttons they get just the 3 they need, while still being able to invoke a needed hidden function easily during remote support (over phone or email…)

I can agree that shortcuts need to be created by drag and drop. The current desktop locking is stupid: you need to keep the desktop unlocked to use it, but doing so causes usability problems.

“Give us such a desktop environment and I don’t know if it can be called lightweight. Nevertheless, I am inclined to think that it would be widely used.
Best regards.”

Most of the things you listed are in KDE by default. It just needs more polishing, like removing activities, removing desktop locking, and making virtual desktops easy to change in number (adding or removing them as easily as in the desktop grid). WiFi, printer, etc. tasks are just fine (they can always be done better), but it is non-KDE software that actually does the work, as long as CUPS is installed and the WiFi hardware is supported.

What we need is to design KDE for seniors and kids… to be as simple and clean as possible, and then give users lots of options, so that if they don’t like something or need something, they can change it to suit their needs and taste. And it needs to be easy: at most 4-5 mouse clicks away (like System Settings > Look > Font).

> “In particular, they need the effects: they need visual feedback for every action that happens, or that they perform, so that they learn quickly and don’t panic because something just jumped onto the screen.”

Can you cite any empirical studies that support this statement? Kids were avid users of computers long before computers had graphical operating systems, and in my experience they adapt to new software more quickly than their parents do. This is all anecdotal, of course, so maybe you can show me some data which suggests otherwise.

In my opinion, suggesting out of the blue that seniors and children “need” visual effects is quite condescending. If I were to make assumptions on a similar basis, I’d say that children don’t need to be pampered when it comes to technology: if they are interested in something, they will figure it out. As for seniors, they will use anything that they are familiar with, something which few desktop environments are able to deliver.

KDE devs (and enthusiasts) need to understand, and focus on, the aim their desktop environment needs to serve.

1. Keep your claims simple – KDE does not say anywhere that it works on a machine with a Celeron, Pentium M or Atom.
2. Devs could perhaps list hardware info in a table that users could use as guidance (just like what MS states on their website). This way devs are not liable if the speed or whatever is not to a user’s taste.
3. The philosophical divide is what is important. You cannot satisfy everyone.
4. Focus on producing what KDE devs like.

Alternatively, everything in KDE should be installable individually. Perhaps that is technically difficult – I do not know.

PS: In your “Leave a Reply” form it should say “Website”, not “Webseite” – a German typo.

Quickly going over the feature descriptions of desktop environments and software suites that don’t claim to be lightweight, I find claims just as arbitrary and vague – and, depending on how you interpret them, more or less misleading.

Looking at Gnome, it claims to be “Simple and easy to use”, that it “Helps you get things done”, that it “Puts you in control”, and that it’s “Finely crafted” — all more or less subjective interpretations.

Examining the feature descriptions of some KDE software, starting with Konqueror, they claim it’s a “next-generation” browser and that it “supports the full gamut of current Internet technologies”, all while impressing us with supposed next-generation features such as HTML 4.0 support, Netscape Communicator plugin support and CSS2.

Plasma Desktop? It “offers all the tools required for a modern desktop computing experience”. It has “Beautiful looks”. It has features that give you “Easy deployment” options. Aren’t all of these somewhat vague and ultimately entirely subjective?

KWin? “It gives you complete control over your windows, making sure they’re not in the way but aid you in your task.” — “not in the way”? “complete control”? Features that “can make window management smoother, easier, more efficient and more natural.”? Not providing any factual basis for these claims. Oh, and “It’s hardware requirements are very modest”.

Gnome and KDE certainly don’t go to any lengths to prove any of these supposed qualities (and I really don’t blame them), so if you see vaguely defined buzzwords in feature descriptions as a problem, how come you didn’t start looking at home base?

My opinion is that “lightweight” is a variable status, not a feature. For example, at home on an 8 GB RAM, 2-core Intel machine, I can use “heavy” applications like Eclipse + Tomcat + Amarok + Firefox (?) with loads of tabs, even a second machine in VirtualBox running a Postgres/MySQL instance at the same time, so running KDE (my default), e17 or even razor-qt doesn’t make any difference to performance. Now take my workstation (sigh)… which sadly has only 2 GB RAM and 2 AMD cores. Can I open Eclipse + a Tomcat instance + a browser (any) + KDE? I just can’t: things start getting sluggish, so Openbox + fbpanel is my choice, sometimes Fluxbox. That means KDE isn’t “lightweight” in this environment, but {Open/Black/Flux}box are.

> And here, poor fool! with all my lore I stand, no wiser than before.
That’s blasphemy. Dubbing Goethe is not in the cards.

To answer your question: “lightweight” describes a relation to the actual conditions, i.e. SW on HW in our case.
Every so-called “lightweight” system of today would totally suck, being heavily bloated, on the oldest machine I possess.

You’d call Win95 “lightweight”?
Well apparently you never installed it on a 486@66MHz from 24 floppies.

But DOS was “lightweight”!
Yeah, no. Not the later versions on a 286@4MHz (that’s no typo, kiddies – 4 MHz, not 400, and not 4 GHz either).

“Lightweight” simply means picking the proper SW for the present HW. It is *not* the opposite of “bloated”, which means wasting resources on unwanted/unused features, nor of “inefficient”, which means wasting resources through dumbness.

There are two strategies to cope with that:
a) for HW from the 90ies, pick SW from the 90ies;
b) scalable SW, i.e. a basic system (though don’t try running Linux 2.6+ on a *real* i386) where you can activate more features at will – you get a “lightweight” system, as it can adapt (or be adapted) to the present HW.

Interesting thought. It seems unlikely to me, though, that systems boasting about being “lightweight” do so entirely in terms of hardware expense, perhaps for the exact reasons you mentioned. In the case of, for example, desktop environments, “lightweight” seems to be a claim based on (possibly shallow) comparisons with other contemporary DEs – in particular popular and widely used ones – and thought of not only in terms of hardware resource use, but also in terms of functionality “bloat” and the size of the feature set. That’s the way I use the word, too, for what it’s worth. So I agree that it’s a relative term, but relative not to a particular use case, rather to current standards. In those terms, neither MS-DOS nor Windows 95 was particularly lightweight: DOS compared to the ROM BASIC implementations typically running on micros at the time, and Windows 95 compared to Workbench, Windows 3.1, DOS, System 7, etc.

On a slightly unrelated note, to me this also means that being lightweight, while I definitely think it’s a tangible quality, isn’t necessarily a good thing. As much as I personally wouldn’t install KDE, I can definitely see its advantages over simpler solutions like LXDE or Xfce in cases that might not apply to me, but probably do apply to a large portion of computer users.

If you can run code on a GPU instead of a CPU, that is good, as long as it’s code the GPU can run well. (Metrics: speed and energy efficiency, both of which really depend on the particular combination of CPU and GPU.)

But code that you do not run at all is more lightweight than code that runs on either CPU or GPU.
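That point can be illustrated with a small, hypothetical sketch in Python: an “effect” computed eagerly costs time even when its result is never shown, while guarding it behind the feature check makes the disabled case essentially free. The sleep is just a stand-in for any expensive render pass.

```python
import time

def expensive_effect():
    time.sleep(0.05)  # stand-in for a costly render pass
    return "blurred background"

def draw_eager(effects_enabled):
    frame = expensive_effect()       # cost is paid on every call
    return frame if effects_enabled else "plain background"

def draw_lazy(effects_enabled):
    if effects_enabled:              # cost is paid only when used
        return expensive_effect()
    return "plain background"

start = time.perf_counter()
draw_lazy(False)
lazy_cost = time.perf_counter() - start

start = time.perf_counter()
draw_eager(False)
eager_cost = time.perf_counter() - start

print(f"lazy: {lazy_cost:.4f}s, eager: {eager_cost:.4f}s")
```

With the effect disabled, the lazy path skips the work entirely; the code that never runs is the lightest of all.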

I have a vague memory (pre-WWW-everywhere, so I can’t Google it to jog my memory) of a BeBox demonstration featuring “4 videos and bad music”, showing off the pervasive multithreading of BeOS by playing multiple audio and video files at the same time. They did that with maybe two 200 MHz CPUs and 32 MB of RAM.

Now we have GHz-class multiprocessors and gigabytes of memory. Instant response to mouse and keyboard actions should be a given: windows should pop up with full decorations within one frame at 60 Hz, even on 10-year-old computers. I should never have to wait for my computer.
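The frame-budget arithmetic behind that expectation is simple; a quick back-of-the-envelope sketch (all figures rough illustrations, not measurements):

```python
refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz            # time available to draw one frame
print(f"frame budget at {refresh_hz} Hz: {frame_budget_ms:.2f} ms")

# Even a modest 1 GHz core retiring roughly one instruction per cycle gets
# on the order of this many cycles per frame (ignoring IPC, stalls, etc.):
cycles_per_frame = 1_000_000_000 * frame_budget_ms / 1000
print(f"~{cycles_per_frame:,.0f} cycles per frame on a 1 GHz core")
```

Tens of millions of cycles per frame, even on decade-old hardware, is why waiting for a window to appear feels like a software problem rather than a hardware one.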

You’re right, but as far as I understand, faster CPUs and bigger memories are not built to make PCs faster, but to sell new units and to make them easier to program. Portability of code is also more important than in the past.
To make that BeBox demo they wrote a lot of hardware-specific code that might not have run on different hardware – maybe even some assembly code.

But today, with new hardware released almost every 6 months, who cares about writing code specific to something that will be obsolete next year? It seems more important to me to “write once, run everywhere”, and to make it easy to write. Sure, JavaScript code will run slower than vanilla C code, but can you tell a few milliseconds’ difference on our super-fast hardware?
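A rough, hypothetical timing sketch of that trade-off, using Python as the stand-in for the high-level language: the interpreted loop is many times slower than the equivalent built-in implemented in C, yet on modern hardware both finish a small everyday workload in milliseconds.

```python
import timeit

def sum_by_loop(xs):
    # interpreted, bytecode-dispatched loop
    total = 0
    for x in xs:
        total += x
    return total

data = list(range(100_000))

interpreted = timeit.timeit(lambda: sum_by_loop(data), number=10)
native = timeit.timeit(lambda: sum(data), number=10)  # built-in, implemented in C

print(f"pure-Python loop: {interpreted * 1000:.1f} ms")
print(f"built-in sum:     {native * 1000:.1f} ms")
```

The ratio is large, but the absolute difference for a task this small is exactly the “few milliseconds” the comment is talking about; it only starts to matter once the workload does.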

Optimizations are important indeed, but to me, giving everyone the opportunity to realize genuinely new, cool ideas is a lot more important than those optimizations.

And there goes our hope of a better future! “Write once, debug everywhere” has been a motto for two decades and people still don’t get it. There is no silver bullet in computer science.

Optimization gives you a better experience, longer battery life, less money to spend on new hardware, and so on.

But the same goes for the process of software development. Use JavaScript for what it is good at: small, high-level scripts you don’t need to spend much time on. Don’t lose your time trying to write a game engine in JavaScript, because in the end you will always be stuck in the sandbox…

Everything is about balance. Optimize what needs to be optimized, and in the right direction. Write code fast, or write fast code, depending on what its usage will be.

For a long time, video card support in Linux/*BSD lagged, and just having reasonably stable and performant 2D support was considered lucky – forget any notion of 3D. Over the years the situation improved, to the point where most 2D hardware worked reasonably well and some basic OpenGL support existed for some of the more popular 3D hardware.

Then came the turning point. New video cards are all about 3D and treat 2D as some legacy thing that might be used to boot the computer or whatever. Somebody realized that the new hardware needed to be thought of differently, but didn’t bother to consider the importance of backwards compatibility. Suddenly I find myself in a situation where only the newest video hardware has any sort of support, and that support is still in development, so it is buggy and not performant. Meanwhile, the only way to have ANY hardware OpenGL, or in some cases ANYTHING more than VESA, is to use an old X server and old Mesa that are no longer maintained. That means any remaining bugs are not addressed, even if patches are supplied by users. It also means that some applications are stuck at older versions, not receiving patches, which may leave them with glaring vulnerabilities.

When you write off old hardware, you forget two critical things: 1) not everyone can afford to buy the newest gear every year; 2) not everyone wants to throw away hardware that works just because it isn’t new. Point 2 is especially valid if you consider that the older stuff was built better. I have boxes more than a decade old that run rock solid, and I have stuff built in the last 2 years where some part doesn’t work, because it either never did or some component burned up or malfunctioned within months. Between those points there is a gap, as the hardware made in that period is all dead.

Especially for video hardware, you have to consider that not everyone has the newest GPU. Many server motherboards have rather old but “well supported” (prior to the KMS move and the Mesa “reboot”) onboard video and might offer nothing beyond a PCI slot for alternate video hardware. Sure, plenty of people say you shouldn’t even run X on a server, but sometimes you just have to fire it up for a task (installing certain software packages, for example). On several of my servers with “modern” onboard video, I disable it and slap in an old PCI card just to free up memory bandwidth. For example, my file server has Intel video in the chipset, which uses system RAM and eats 3% of the memory bandwidth just to draw an 80×25 text-mode console with no screen activity. All the PCIe slots are full of network cards and disk controllers, so I put a Matrox Millennium II in a PCI slot. That frees memory bandwidth for the stuff that needs it (serving files), I can run a higher-resolution console, and the card is actually performant in text mode (the new stuff that treats 2D as legacy might not have working VESA, and when it does, scrolling a full screen of text can be so slow as to make a noticeable difference in kernel compilation time, just from the burden of rendering the compiler output inefficiently). Also, if you actually use X11 over a network connection as it was meant to be used, all this 3D crap falls on its face.

There is also a disturbing trend in development of assuming not only that all users will constantly buy new hardware, with no desire to keep using the hardware they already spent good money on, but also that technological progress will keep pace or accelerate. We have already seen CPU clock rates peak and actually back off as the manufacturers upped the core count. Not everything can be parallelized efficiently, and we see plenty of evidence of that. The notion that users will have the same ultra-fast box some developer has, shortly after the code is done and released, is a fallacy. Even if it were true, software bloat has outpaced hardware development in the past decade.

I recently built a monster of a desktop, and it still just FEELS slow at some things that should be fast. Sometimes I think my memory might be deceiving me, but all I have to do is fire up one of my 10+ year old boxes and watch it fly. I boot up my SGI Indy, running the last IRIX 6.5.x that supports it (released over a decade later than the hardware), and it flies compared to any Linux desktop, or any Mac. Even Windows 2000 on a P2 feels fast in comparison. What really blows my mind is my cell phone: 1 GHz CPU, 1 GB RAM, 64 GB flash, it runs Linux, and it feels slow. Switching apps on it takes seconds, as does launching them. My first PC was a 486 with 24 MB RAM and a 2 GB hard drive (after upgrades), and it ran OS/2 like a champ. I can fire up OS/2 on a P2 with 512 MB RAM and some absurdly large disk and it too flies: apps launch in fractions of a second and task switching is instant. I remember when Linux was the efficient little thing you put on the old box that wouldn’t run the new Windows version too well. Now it’s a monster that performs worse than anything else (except maybe the BSDs in terms of video, which are largely still using UMS for X11, or VESA if you are unfortunate enough to be stuck on new hardware).

I run KDE4 on my desktop because I want some of the apps and it’s good enough. It’s still a crap interface compared to KDE3, but it has become untenable to keep KDE3 working, and many new apps won’t run on it (though there are a few good old apps that don’t work on KDE4). On everything else, I run Enlightenment – ironically, because when I first installed Linux with a 2.0.x kernel on my 486, Enlightenment was the eye-candy WM that was dog slow compared to things like FVWM, and even KDE1 when that came around (yes, I’ve used KDE through every version). Now Enlightenment is the quick one that does just fine rendering on the CPU at the maximum resolution of a 15-year-old video card.

The purpose of making faster hardware was to do the same tasks as before, only quicker, and to enable new tasks that were previously not practical. What can I do now that I couldn’t do a decade ago? Nothing, really. The difference? What I could do before I still can – but slower. Wait, slower? Yes, SLOWER! I don’t need a fancied-up interface that takes so much of the system’s resources that I am waiting on it. THAT is what is perceived as “heavy”, and what we want to avoid with “lightweight” alternatives (not that any of the ones you mention are noticeably lighter).

I call a solution “lightweight” when it does not grab my attention. It is stable (does not grab my attention through erratic behaviour). It does not have unnecessary decoration. It is reasonably fast, so I do not notice delays. It does not interfere with other running programs (does not take all the RAM). It has few dependencies. For frameworks: a framework is lightweight if it is easy to understand and use.