Posted
by
timothy
on Thursday May 19, 2011 @07:18PM
from the modified-seldon-plan dept.

fatalGlory writes "Despite some initial reservations about Gnome-shell, it appears to be coming out very nicely. In some preliminary benchmarking tests I've been conducting, Ubuntu's Unity desktop on 11.04 Natty uses roughly double the memory that Gnome-shell uses."

He was comparing to a LiveCD non-install of Fedora (the reason being unclear), and I'm not sure you can switch between GNOME 2 and Unity on a LiveCD (one potential reason being limited ramdisk space).

I think something is wrong somewhere. Windows XP SP3 can run very comfortably in a 512 MB virtual machine, and the VM doesn't even have to use all 512 MB of the host machine's RAM. So what does the GUI give you for all that memory use? Or is it mostly bloat?

I doubt it provides "faster performance". Seems to me most GUI designers either don't really care about performance or are clueless about it. They keep creating extra steps to do common stuff. Or insert artificial delays so that they can sh

I was under the impression that the point of Unity was to be useful on small form-factor devices. I have a little ARM-based machine, with a 10" screen, which shipped with Ubuntu. GNOME sucks on it - most of the dialog boxes don't fit on the screen - so I was interested to see if Unity was an improvement. Turns out? It won't even start Unity, because it claims my system is not up to it. Designing a UI for small devices that requires a high-end machine to run seems a bit silly. Even when we've all got q

I always hate it when Windows fanbois pull that kind of switch in debate tactics: first they claimed that switching to Linux and OpenOffice would entail all kinds of expensive training costs. Then, after Office 2007 came out, suddenly retraining costs weren't a problem.

So, similarly, I don't feel like brooking people who seem to feel a need to engage in apologetics for Gnome. First: Switch to Linux from bloated Windows. Then: Bloat? So what?

Canonical's Gnome 3 PPA is a mess right now; it spontaneously explodes, and when you try to go back to either classic or Unity there will be issues PPA Purge doesn't fix. Better to spend your time looking for your next distro in the 5 months before you lose classic mode. Debian, Mint, Arch, Puppy, Pardus, Mandriva, Fedora, FreeBSD, Gentoo, Sabayon, PCLinuxOS, PC-BSD, and MEPIS will all be in better shape than 11.10, so try 'em out and pick one.

I'm partial to Enlightenment - and Sabayon has a nice offering. In fact, it's the only 64 bit distro that offers Enlightenment working out of the box. Others offer E17, but you have to work at making it work.

However - most people who are using Ubuntu came from Windows, and they aren't especially likely to leave Ubuntu for the sake of a more efficient desktop manager. Most of them have little idea where they are on the food chain, and those who do, feel little need to climb any higher.

Never tried Sabayon, but I might check it out. I tried Mint, hated the menu, but otherwise thought it quite nice. Tried PCLinuxOS and liked it quite a lot, but am wary of its over-reliance on TexStar.

We moved our old laptop to Lubuntu 10.04 (=Ubuntu 10.04 with LXDE), which is not officially supported but nonetheless considered an LTS release. The laptop now uses less than 200MB RAM when it's running with a number of applications (Thunderbird, Chromium, Pidgin, some shells and file managers) and almost a

As long as I can still install XFCE in 11.10 I'll probably stick with Ubuntu for my desktop workstation for now. I hate that Gnome 2.x will no longer be an option, but XFCE is a decent alternative that is still being actively developed.

> Just saying. People didn't pick Ubuntu for the G, they picked it because it is the easiest, most "stable" (it just works), most friendly out there.

and a few lines later

> Debian (...)

If you want software versions everyone has been using for a decade, try the "experimental" branch. Cat is still under review for inclusion in stable.

Decide: either stable + just works (in my experience, that's Debian stable), or fresh, with the latest packages. New users should prefer the former. I like newer packages and per

I feel the urge to troll because I loved using Ubuntu until 11.04 but have now switched to Kubuntu 11.04. Instead I will say I look forward to the continual improvements that will be made to both Unity and Gnome Shell.

Why does everyone bring this up? That support is leaving in 11.10, so the argument only works as a stopgap for another few months. Sure, you can choose that now, but in the long run, if you don't want to use Unity (and who does?), you'll have to switch someday.

The big question is where people will go: Slackware? Fedora? Xubuntu? I know Ubuntu is trying to get more mainstream, but they'll lose some of their hardcore users, and I have to wonder how that'll affect their devs.

What more do you need to know? I installed (and fully updated) Natty this weekend, and crashed it 3 times in 20 minutes with different Unity bugs. Then, I hit up the goog, and found out how to get my classic gnome interface back (it's in a dropdown at the login prompt). Waste of 22 minutes, if you ask me. I can't imagine how much time the Unity devs wasted on that crap.

everyone knows that! it's built on a solid, stable unix foundation, with a 'keep it simple' philosophy that guarantees performance and stability! with 12 overhead cams and a dual plated stainless steel cooking surface, your family will be sure to love the new Unity.

I have a nice new laptop now, W510 (thanks to my brother), and I have Ubuntu 11.04 on it.

In the 3 weeks I've had it, it has crashed on me about 5-6 times. It gets stuck and the keyboard/mouse stop responding. I can't ssh into it either, so I have to shut it down with the power button (ouch).

Also, this is my first experience with 11.04 and I am hating it. Except for Unity (which I don't use; to me the obvious benchmark move is: switch to classic), I am finding all sorts of real problems with this distro.

Years ago I had KDE lock up on me every now and then, sometimes I could ssh in, but I sure couldn't figure out how to save the work done in the GUI. There might have been a way to do it, but how many people in the world knew of

And I haven't had to post as an Anonymous Coward since I registered. Maybe you just don't know how to log in. Actually I'm sure you're just not a people person. Now get this odd saying that only I get over to that place, you know the one.

Wow, you haven't done anything interesting at _all_ on Windows then (or run on hardware that overheats...) XP doesn't blue-screen anywhere near as often as Win98 or 98SE or WinMe did, but it still knows how.

Then you haven't turned a computer on. I'll admit I haven't had Windows 7 blue-screen on me, but I haven't worked in it with as much frequency as previous versions, but I've had my share of BSODs on Vista, XP, Server 2000 and Server 2003. Most have been driver problems or related to failing hard drives. Any system can crash badly. But hey, you got to call someone a "fag" on Slashdot, so I suppose making blatantly moronic claims to get there is just fine for you.

I've never seen Unity (or Ubuntu for that matter) crash. I upgraded to 11.04 on a netbook and two laptops. Use 'em everyday. Never crashed on my desktop, either, but I did have a wireless problem so I'm sticking with 10.10 there.

It's really not that big a deal these days: you want to use memory, because memory is fast and, compared to the old days, dirt cheap. Loading things into memory is not an automatic sign of bloat; sometimes it is a sign of doing what you should, which is putting memory to use.

It really depends, but as a general rule, the OS and the environment shouldn't take up very much memory as that's not typically why one buys a computer. As much memory as possible should be available for the applications the person wants to use. A half gig isn't really that much, however if you're into programs that use a lot of memory, that's memory that could be used for your rendering software or VM.

Using more memory isn't automatically a bad thing, but if it's software that you have to run in order to d

I want to use my memory myself, not have it used up by some bloated piece of shit window manager

+1, not to mention that shared memory pooling isn't actually happening as well as people seem to think. A lot of apps sit idle and hoard their memory, leaving various other applications pining for RAM. When 8 GB of RAM becomes the standard baseline you'll still see applications hoarding memory, until we have some advances in shared memory management beyond today's.

On my 4 GB system, if all the applications running on my system right now claimed 100% of the RAM they're using as private, unshared RAM, that would still leave half of my RAM available for other programs. This with an instance of Google Chrome with eight open tabs that's been in use and running non-stop for the last six hours.
Frankly, unless you are running some sort of critical application that must have RAM now, then even an OS that is horribly inefficient at handling shared memory management will do a
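Incidentally, on Linux there is a concrete way to check the private-vs-shared split described above: each process's /proc/&lt;pid&gt;/smaps reports both Rss (every shared page counted in full) and Pss (each shared page split among the processes mapping it). A minimal sketch, assuming a Linux /proc filesystem:

```python
# Sketch assuming Linux: sum Rss vs Pss from /proc/<pid>/smaps.
# Rss charges every shared page in full to every process mapping it;
# Pss divides shared pages among their users, so summing Pss across
# processes gives a fairer answer to "who is really using the RAM".
def smaps_totals(pid="self"):
    rss = pss = 0
    with open("/proc/%s/smaps" % pid) as f:
        for line in f:
            if line.startswith("Rss:"):
                rss += int(line.split()[1])   # kB
            elif line.startswith("Pss:"):
                pss += int(line.split()[1])   # kB
    return rss, pss

rss, pss = smaps_totals()
print("Rss=%d kB  Pss=%d kB" % (rss, pss))  # Pss <= Rss by construction
```

Summing Pss over all processes approximates the true total, which is exactly why naive "double the memory" comparisons of desktop environments can mislead.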

If that "bloated piece of shit window manager" is making you hit 100% RAM usage with whatever program you're using, perhaps you might want to spend the $40 and get two 2 GB sticks.

If you're not hitting 100% RAM usage, then it doesn't matter. RAM that sits unused is wasted RAM. You don't get karma points by not using it if you have it. The computer isn't happier just because you're letting the RAM sit there and do nothing.

An efficient and smart personal computer operating system should have 100% of ava

I'm hip with the "put all your memory to use" paradigm - that makes perfect sense. However, I want that memory to be in use for my applications. The Window Manager is just a utility, not an end in itself.

At my place, my primary box is an x86_64 machine with 2 GB of RAM and it flies. It would easily run Natty, if I cared to install it (I don't like Gnome). But I still boot up my old PIII 555 MHz with 128 MB RAM just to see how far we've gotten. My conclusion after 11 years (bought that old PIII in 2000) is tha

What happens when every app decides it is "the one true app" and should use as much memory as it can grab? When you have a half dozen or more programs all deciding they can use all of your RAM for their caches and you start swapping everything else out to disk, it can be extremely painful to switch between tasks.

I realize that not everyone has a problem with it, but Firefox kills my Vista laptop with 3GB RAM after leaving it running for a day or two with a dozen tabs open, and at some point, will often s

> What happens when every app decides it is "the one true app" and should use as much memory as it can grab? When you have a half dozen or more programs all deciding they can use all of your RAM for their caches and you start swapping everything else out to disk, it can be extremely painful to switch between tasks.

You close a couple programs? Or get more RAM? In all seriousness, there's never been more free RAM for programs to use than right now, and it's only going to get better. It could be worse: it could be 1985 and you have one program running at a time due to memory constraints.

> I realize that not everyone has a problem with it, but Firefox kills my Vista laptop with 3GB RAM after leaving it running for a day or two with a dozen tabs open, and at some point it will often start pausing for 60 seconds or more due to swapping. On the flip side, with my Linux desktop with 6GB of RAM, I hardly notice it chewing up crazy amounts of memory (1.2-1.5GB) until it's been open for a couple weeks.

You should try Windows 7 instead of Vista. Its memory management is years ahead of Vista's. As for UI choices, I can respect your decision.

> You close a couple programs? Or get more RAM? In all seriousness, there's never been more free RAM for programs to use than right now, and it's only going to get better. It could be worse: it could be 1985 and you have one program running at a time due to memory constraints.

My laptop is maxed out, the chipset only supports 4 GB max and unfortunately, it came with Vista 32 rather than 64 despite having a 64 bit processor so it'll only address 3 GB of that anyway. I could upgrade the OS, but, well, I don't really care for windows, I just use it out of convenience (and also have gentoo installed on it as well). So while RAM is cheap, most portable systems have significantly lower limits than traditional desktops.

Memory is used even if no application is using it; it's used to cache disk. Free memory is not wasted memory. Bloat is a relative term: if one application, with feature parity with another, uses much less memory, it makes the other seem bloated. If Unity is using much more memory than Gnome3 while there is feature parity, then Unity is bloated. It's an arms race we all win, because our computers get faster for free. :-)

Plus unused memory is wasted. Basically any free memory might as well be used as cache, which can be freed up instantly if an application needs it. That used to happen only at the OS level with disk cache, but nowadays apps do it too.
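The cache-as-free-memory point is easy to see on Linux by parsing /proc/meminfo (a sketch, assuming a /proc filesystem): the page cache is reported as "used", yet the kernel hands it back the instant an application needs the pages.

```python
# Sketch assuming Linux: parse /proc/meminfo. "Cached" memory is disk
# cache the kernel will give back as soon as applications ask for it,
# so MemFree alone understates how much memory is really available.
def meminfo():
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            info[key.strip()] = int(rest.split()[0])  # values in kB
    return info

m = meminfo()
print("free: %d kB, cache: %d kB" % (m["MemFree"], m["Cached"]))
```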

But people are used to thinking of memory as a precious resource and want to have a nice big unused chunk all the time... some have a point in that they will then use it later; most don't. I've seen people rant

yeah except the usage heuristics fall short almost all the time, leaving the machine bloated and clunky as each app assumes it is the center of your universe on that machine.. how about using what's needed, and dumping what's not? I don't mind a bit of caching, but this relatively new trend of caching almost everything whether it's really needed or not is fucking stupid.

Well - there is another vector here: Developer productivity. Libraries use extra memory, but they enable a developer to do more in less time. Imagine coding firefox entirely in assembler. The process would likely use much less memory, but how many coders would be required to maintain and extend it?

That being said, I agree wasting memory for the purpose of wasting memory is bad: Inefficient data structures, caching with low hit ratio etc etc. There are no excuses for inherently bad design.

There is a sweet spot, though, isn't there? Assembler is slow to develop in, but the result is very fast. C can be used almost as a high-level assembler; you can be pretty clear about how you want things to compile, yet C is much faster to develop in than assembler. Some languages (say, Python) are very quick to develop in, but if you care about speed at all they're the wrong choice. I would say C and C++ are the sweet spot where you get the most bang for the buck in terms of speed cost and productivity gain. Afterwards, it does seem

Absolutely worthless comparison, as it compares vastly different distributions. He isn't even comparing two Debian-based distros or trying to control for different running services; why is there not even an attempt to isolate the memory usage of the DE/WM?

Perhaps this could have been useful as a comparison of distro memory usage, but even in that it fails: it's comparing an installed Debian-based distro to live-CD-based Fedora. Why wasn't Fedora installed and compared (perhaps using VMs), or Ubuntu run from LiveCD?

For that matter, if you care about memory usage, you only care about it in low-memory environments; in large-memory environments, usage patterns will be different. I would hope that while few applications are running and there is lots of memory free, the operating system would just cache everything it ever thought about loading from any drive or network. Similarly, it might be more speed-efficient to allocate large blocks of memory for certain tasks, so as to keep RAM access contiguous.

I just installed 11.04 this evening. The reason /. has it in for Unity is: Unity sucks. If you don't already know where to find something, you will never find it. In my 20+ years of using computers I've never had a UI hide the details of getting shit done nearly as well as Unity. Sure thing: if all you want to do is open Firefox or an office suite nobody on this board has ever used, it is pretty damn slick. But want to do anything 'normal' besides that (or God forbid: advanced!) and unless you know e

Right click on the Applications icon in the launcher and your sorted categories are right there, just like you used to have. Or use the drop-down option when typing into the search area (to the far right) to limit to specific categories. Or just click Applications and expand the Installed section and everything you have is right there.

So far it typically takes me 2-3 times as many clicks to get what I want once I know where Unity has hidden it, compared to 10.10, or more when I don't, and no, "everything you have is right there" isn't true if you have more than a dozen or so things.

Keep in launcher (one click). Or frequently run apps will show up at the top of the applications window (two clicks). Everything installed shows up when you expand Installed (three clicks). Sort applications by right click (three clicks) or by dropdown (four clicks) to find something by category.

How are you taking 2-3 times as many clicks beyond maybe the first time?

Is number of clicks really a useful measure of usability? I'm sure it's not the only one...

If you don't know the name, how are you going to find it in a drop down list of applications?

Search could be improved by specifying keywords for applications instead of a strict title search. I should be able to search for "burn a dvd" or "listen to music" and have all relevant apps show up. How much more intuitive is that versus "Brasero" or "Rhythmbox"?
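For what it's worth, the freedesktop.org desktop-entry format does define a Keywords field that search-driven shells can match against. A hypothetical entry (field values here are illustrative, not copied from the real Brasero package) might look like:

```ini
[Desktop Entry]
Type=Application
Name=Brasero
Comment=Create and copy CDs and DVDs
# Extra search terms, so a query like "burn a dvd" can match even
# though the Name gives no hint of the application's purpose:
Keywords=burn;dvd;cd;disc;iso;
Exec=brasero
```

Whether a given launcher actually consults Keywords is up to the shell, of course; the point is that the metadata hook exists.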

I agree. I find that Windows used to be almost useless in this department before they implemented a working and useful search function in Windows 7. Windows users usually don't deal with as many installed applications as you get on unices. My macports/bin directory sports about 1600 applications coming from about 150 ports; I assumed the other 50% of the 300 ports were all libraries. Even if you organized things per port, you'd still get more than 100 items to deal with. Windows' "Start" menu is a horrible interf

> If you don't know the name, how are you going to find it in a drop down list of applications?

By picking the right category and then the most fitting function description? "Internet / Firefox Web Browser". "Utilities / gedit text editor".

> How much more intuitive is that versus "Brasero" or "Rhythmbox"?

The usual way of doing things is searching only once, when you don't know what you're looking for. When you've found it, you either give it a name yourself or accept the name it already has. But when you

I guess the reason I see it differently is I like finding things by search rather than drilling down. I grew to like it on Win7 and going back to the nested menus on 10.04 wasn't fun. But you're right - it sucks if you want to browse.

Almost a year ago now I installed Ubuntu for a neighbor who is nice enough but whose IQ is below average. And she had NO problem with it. Icons switched? Not an issue. She just dealt with it rather than throwing a hissy fit like most of Slashdot.

She managed her updates nicely, just clicking them, BUT unfortunately that also included the 11.04 upgrade... and since then she can't find anything. Yes, she is stupid, but wasn't Unity supposed to be easier for people like her? The average Jane? The people who clean fo

Pointless benchmarks. How about more commentary on usability speed? My fairly typical 2008-spec desktop rig has 4 GB of RAM and my aging monitor is 1920x1080. My slightly battered old notebook is not even far behind this. So why do developers insist on using as little memory as possible even if you happen to have an ass-load of it?

Why do they have to waste my time making me click extra, hiding things away just to save another 50px of screen real estate?

Because most people don't load up with 4 gigs of RAM (or more) just to run the desktop environment. Sure, Unity doesn't use anywhere near 4 gigs, but every megabyte that's used by the desktop environment is a megabyte that's not available for whatever it is you're using the computer for.

But more than that, the sort of sloppy coding practices which lead to bloat also lead to other problems, ones which are of larger importance.

At the end of the day, I know that it's not realistic to eliminate all the bloa

Others have correctly pointed out that comparing memory usage on two different distros is pointless. On top of that, comparing total memory usage is stupid.

Look, you have memory in your system to be used. If you dug into it and found out that most of that memory consisted of massive, unused libraries, duplicate code, empty datastructures, or garbage that wasn't getting cleaned up, then sure, you could give it a hard time. But if it's full of cached images and icons so that the interface can be quicker and more responsive, well, isn't that why you have all that RAM?

A perfect program/OS would very quickly gobble up all available memory by storing and caching useful stuff... and then free it up the instant it was needed elsewhere. That turns out to be harder than it sounds, since processes generally don't know or care about total memory usage, but still, the ideal should not be the opposite extreme.
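One small-scale illustration of "cache it, but let it go the moment the memory is wanted" is a weak-reference cache. This Python sketch (the names are invented for illustration) keeps entries only while someone else still holds them:

```python
import gc
import weakref

class Icon:
    """Stand-in for an expensive cached object (a decoded image, say)."""
    def __init__(self, name):
        self.name = name

# Entries vanish automatically once the last strong reference is gone,
# so the cache never prevents the memory from being reclaimed.
cache = weakref.WeakValueDictionary()

icon = Icon("folder.png")
cache["folder.png"] = icon
assert "folder.png" in cache      # cached while still in use

del icon                          # application no longer needs it
gc.collect()                      # collector reclaims the object
print("folder.png" in cache)      # the cache has let it go
```

It's only a toy, but it captures the asymmetry being argued for: the cache speeds up repeat access yet never outbids the rest of the system for RAM.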

Yes, but a lot of people don't understand this. The prefetch functionality of Vista/7 gobbles up free memory for this very purpose (and it does seem to help speed things up over time), however it's also one of the things people choose to bash them for without knowing how it works.

I will admit that 7 improved upon Vista in that it's a fair bit less aggressive and won't prefetch much if there's limited RAM available, whereas Vista prefetched regardless of the impact of using what little RAM was av

You could turn it off with Vista, or at least I think you could. I ran it for a bit with a half gig of RAM and didn't have any trouble doing it. I did turn off Aero, but then again my video card didn't support it anyways.

Maybe they're doing a great job of caching some things, but they're mostly hiding the applications I use and making me wait for the animated graphics to pop up different parts of the menu system so I can get at them, so I don't count that as a win.

So far, Unity has gotten me interested in taking the time to learn Lubuntu or Xubuntu, so it may end up having been useful, but I don't think that was how they intended it,

It is simple: Slashdot editors don't like Unity, so any benchmark, no matter how bad, that shows Unity in a bad light will be posted. Just like any article about Windows, no matter how bad the facts, will get published if it shows Windows in a bad light.

> But if it's full of cached images and icons so that the interface can be quicker and more responsive, well, isn't that why you have all that RAM?

> A perfect program/OS would very quickly gobble up all available memory by storing and caching useful stuff... and then free it up the instant it was needed elsewhere.

The problem with filling up all your memory is that you don't know what the priority for memory use is. Is all that RAM on my system there to cache icons, or is it there to cache my database? As an application developer, you don't know how important your program is to the system compared to another program. As an OS developer, you similarly don't know. Chewing up all the available RAM so that one application is optimally fast, may be exactly the wrong thing to do if, as a user, I don't care about the sp

> Look, you have memory in your system to be used. If you dug into it and found out that most of that memory consisted of massive, unused libraries, duplicate code, empty datastructures, or garbage that wasn't getting cleaned up, then sure, you could give it a hard time. But if it's full of cached images and icons so that the interface can be quicker and more responsive, well, isn't that why you have all that RAM?

That caching should be done at the operating system level, not the application level. That way the OS can decide what to keep in the cache. It does depend on applications not overusing memory and driving things out of cache, but no system is perfect.
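In that spirit, POSIX gives applications a way to hint rather than hoard: posix_fadvise lets a program tell the kernel how it will use a file, and then the OS-level page cache decides what to keep. A minimal sketch (assuming Linux and Python 3.3+, where os.posix_fadvise is available):

```python
import os
import tempfile

# Sketch: instead of building a private cache, hint the kernel's page
# cache about our access pattern and let the OS arbitrate memory.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"x" * 4096)
    path = f.name

fd = os.open(path, os.O_RDONLY)
os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_SEQUENTIAL)  # expect sequential reads
data = os.read(fd, 4096)
os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)    # done: pages may be dropped
os.close(fd)
os.unlink(path)
print(len(data))
```

The advice is non-binding by design, which is exactly the point: the kernel, seeing the whole system, gets the final say over what stays cached.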

That was one of my biggest complaints. I have two monitors, one of which is 1920x1200, and Unity was a serious pain on that monitor. Plus, because I can't move the bar, it ends up in the middle of the setup. Sure, I could move my monitors or change which one is the primary display, but I shouldn't have to; I should be able to move the damned bar. The space savings are moot and the interface itself doesn't work well. Windows randomly maximize and windows don't move around the screen smoothly when dragged. Someti

When I had my netbook (1024x600) it was actually pretty difficult to get a decent browsing experience even, reducing the two bars to one, using the single menu instead of split... and a few other tweaks (hiding the menubar in firefox).. it wasn't too bad... though it made me painfully aware of checking the screen size for popovers in browser, and never hiding buttons.... got really used to F11 (full screen) and alt with +|-|0 etc for scaling. Actually it was the better scaling that got me over to chrome...

Unity still has some quirky behavior, and frankly I'm just not a fan in general.

Gnome-shell is close, yet far away. It seems more extensible, which is promising. I just want two things:

- When I highlight an 'application icon' in the activities view, automatically show only the windows for that application in the window previews.
- Provide a means by which I can start typing and search window title bar contents (like KDE and Compiz Window Title Filter).

> When I highlight an 'application icon' in the activities view, automatically show only the windows for that application in the window previews.

Unity shows a thumbnail preview of all apps running when you click on the icon in the launcher. Well, if you're in another app like email, clicking the Firefox icon will take you to the last Firefox window you were using. Clicking the icon again then shows you a thumbnail preview of all Firefox windows. Works that way for most apps I've seen, except for LibreOffice,

Ever since I upgraded to Ubuntu 11.04 my desktop doesn't redraw: when I close the last app on screen, its last bitmap stays up. And switching between apps takes about 700 ms, even when there's not much going on.

My PC is an old P4/2.4GHz/2GB with Intel 82845G/GL[Brookdale-G]/GE integrated graphics, so Ubuntu refused to install Unity and left GNOME. Yeah, it's old, but one reason I prefer Linux to Windows (or Mac) is that I expect to get the full performance out of the old stuff,

I saw the same behavior with Xubuntu 11.04 on a Lenovo netbook. Luckily, being XFCE, I simply turned off compositing. lspci shows the video as Intel 945GME. 3D is not really needed there, and I noted the machine freezing if a 3D screensaver was left running for a while, so I changed that too.

I think support for Intel 8xx graphics has been rather poor since the change nearly two years ago to the GEM infrastructure. My i845 locked up about once an hour in Fedora 12 unless I reverted to the non-accelerated vesa driver. This turns out to be a GPU bug which happens to be triggered very rarely with the old drivers. A few patches have been found to work around this problem, but I haven't tried them. AFAIK comprehensive GPU documentation from Intel is only available for i810 and i965, so for eve

Hate to say it, but they both suck anyway. I updated to 11.04 and Unity, and it took about 30 minutes to decide to try out Gnome 3. I should also mention I love(d) Gnome 2 with Cairo Dock and Compiz. Now I'm back to KDE, running 4.x. I personally hate both Unity and Gnome 3. The interface is just plain awkward, and yes, I took the time to customize a bit, hoping it would be usable, by setting up my own task bars, icons, etc....

Since Gnome Shell nowadays is nothing more than an extended window manager :-( Gone are the days of a desktop; I just wonder what they were thinking in sacrificing all that desktop space. I am glad that there are alternatives to Gnome. It was nice knowing you, but Gnome 3 in many areas is a step in the wrong direction, in some the right.

Wow, I thought, somebody did a decent evaluation of GnomeShell and Unity. So I followed the link. Some may think this was my first mistake. But then I read the article, and the first words were "This morning I decided to play with the Fedora based Gnome3". Well, super. This is news? I played with Gnome3 on Ubuntu weeks ago. Oh yes, I didn't post results of that playing, so nobody considered this news.

To be honest the memory figures look strange to say the least. He compares different applications in combin

And hey, big surprise, comments disabled on that article. "You must log in to post a comment." And even if I did want yet another account just for commenting on a single blog, I don't see anywhere to register.

Just for fun, I'll respond here. I might even try to email it to him, if that works.

> There are three fundamental entities that make up our universe: matter, energy and information.

I'm not sure information is an "entity" in any relevant sense. It's a phenomenon. More in a moment...

> Now, creating and communicating information is demonstrably a mental process, requiring an intelligent mind to create and to receive the information.

Actually, it's demonstrably a physical process, one which can be performed entirely by machines, unless you are willing to describe my laptop as an "intelligent mind." But it depends what you mean by "information", in this case, as you point out:

> In this way, information is distinct from data, which Shannon unfortunately referred to as "information" in his work on statistical-level information-theory, leading to the present ambiguity.

If that is the way in which information is distinct from data, then I work with a hell of a lot more data than information. My computer creates, interprets, communicates, and manipulates all sorts of data that no "intelligent mind" will ever touch, unless, again, you're willing to allow that my cell phone is an "intelligent mind."

> This reality has been demonstrated amply in the book by Gitt and is expressed in a streamlined form in this lecture by Wilder-Smith.

I'm not willing to buy and read a book, but maybe I'll listen to the lecture.

> Yet, so many software engineers remain evolutionists.

...what? Unless you're referring to the arguments you referenced via an amazon link and an mp3 file, I see nothing in your argument which requires intelligent design or negates evolution. Even if I accepted your premise that information must have an intelligent designer -- sorry, a god -- as its originator, as a "software engineer," I'd hope you understand that humans can and have written programs which simulate the genetic process at various levels -- why, then, could this god not design evolution as part of the "program" of the universe, fire it off and let it run, exactly as human beings do all the time?

> Despite interpreting and often designing language conventions every day, very few software engineers seem to have considered the implications of language and information-theory for genetics, biology and metaphysics.

Again, out of the blue, you're introducing a new topic -- languages -- along with committing a stupidly trivial fallacy. Just guessing here, because you didn't actually deliver an argument, but if you did, I imagine it would look like this:

1. Humans can create languages.
2. Humans have intelligent minds.
3. Given 1 and 2, intelligent minds can create languages.
4. DNA is a language.
5. Given 3 and 4, an intelligent mind can create DNA (the language).

Therefore, only an intelligent mind could have created DNA.

Both 5 and the conclusion are absurd on their face, and I hope you can see that. 5 is fallacious because 3 asserts only that intelligent minds can create languages, not that they can create all languages. Even if 5 were sound, the conclusion is fallacious because 5 asserts only that an intelligent mind can create DNA, and not that only an intelligent mind can create (or could have created) DNA.
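To make the quantifier slip concrete, here is a toy model (every name in it is invented for illustration) in which the premises hold but the "only an intelligent mind" conclusion is false, which is exactly what it means for the inference to be invalid:

```python
# Toy counterexample model: minds CAN create languages and DNA IS a
# language, yet something mindless also created a language -- so
# "only a mind could have created DNA" does not follow.
minds = {"human"}
languages = {"english", "dna"}
creates = {("human", "english"),          # a mind created English
           ("natural_process", "dna")}    # a mindless process made DNA

# Premise 3: intelligent minds can create languages (witnessed above).
premise_minds_create = any(c in minds and l in languages
                           for c, l in creates)

# Premise 4: DNA is a language.
premise_dna_language = "dna" in languages

# Conclusion: ONLY minds create languages -- false in this model.
conclusion_only_minds = all(c in minds
                            for c, l in creates if l in languages)

print(premise_minds_create, premise_dna_language, conclusion_only_minds)
```

One consistent model where the premises are true and the conclusion is false is all it takes to show the argument form proves nothing.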

And again, what about this falsifies evolution? If it worked, it would falsify abiogenesis. Evolution can happen without intervention once we have DNA, just as a program can run without human intervention once we start it running.

So I just killed about an hour listening to the mp3 and thinking about how I'd respond... not much new there.

Starts with a few intelligence-insulting analogies, like "Bricks don't build houses." Yeah, bricks also don't reproduce.

"We ought to be able to put the contents of life into a test tube and see that it would build life." In other words, if we can't put something in a test tube and get something out of it, we require an intelligent agency for that process? Seriously? If we assume this guy isn't as dum

The reason you want to run at least some parts of Unity as a plugin for the window manager is that there are race conditions in X such that only the window manager is guaranteed to know the instant a window has appeared or disappeared. So either you write the code in the window manager, or you have another mechanism to communicate from the window manager to your program. Or you jump through a whole bunch of hoops to stop your app from c