
When you run filesystem tests, please specify whether Windows Defender or any other antivirus software is running; it makes a huge difference.

The nobarrier option of ext4 also makes a huge difference, but then we'd move on to other tweaks, and so on. So every system stays at its default settings.
Actually, a 26.5% difference is rather surprising. In practice I find ext4 almost twice as fast as NTFS.

The nobarrier option of ext4 also makes a huge difference, but then we'd move on to other tweaks, and so on. So every system stays at its default settings.

The question is: who is Phoronix's "ideal" client?
Given that a large portion of the applications used in these benchmarks are server software, the claim that the sysadmin is stupid and could not (or should not?) be expected to flip certain performance-improving switches in their distribution (be it write barriers in ext4 or disabling 8.3 names in NTFS) is ridiculous at best.

The ext4 developers cannot (and must not!) assume that you have a battery-backed HBA or a good UPS; only the sysadmin can make that call and decide whether the performance improvement is worth the additional risk.

BTW, take the time to compare ext3 in ordered mode (the default) with journal mode. You'll be amazed by the performance hit.
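For anyone curious what flipping those switches actually looks like, here is a minimal /etc/fstab sketch. The device names and mount points are made up for illustration, and (as warned above) barrier=0 is only sane with a battery-backed controller or a reliable UPS:

```
# /etc/fstab -- illustrative entries, not a recommendation
# ext4 with write barriers disabled (risky without battery/UPS):
/dev/sda1  /mnt/data     ext4  defaults,barrier=0     0  2
# ext3 with full data journaling, to compare against the
# default ordered mode:
/dev/sdb1  /mnt/journal  ext3  defaults,data=journal  0  2
```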

As a huge difference makes the nobarrier option of ext4 but then we go to other things etc etc.

Correct. Any important information regarding the configuration of the tested system has to be clearly stated.
It might be OK to leave the file system's own parameters at their defaults, but it makes less sense not to turn off Windows Defender, since it's not a "core component" of the file system.

So every system stays at its default settings.

Why? I don't think there's any law mandating that!
When you test performance, you do it for a reason: to see the best numbers you can get out of a system and to learn which parameters make a difference.
Basically no one in the world runs a system without making any changes.
So what's the point of testing the default settings?

If someone is concerned about system performance, it means they are already tweaking the system.

Also... the upcoming tests will use Apache, SQLite, and PostgreSQL on Windows 7, and those are not the default software used on Windows 7. No one uses Windows 7 to get the best numbers out of that software. It would make much more sense to test on Windows Server, not on the client edition.

Unless we assume that those benchmarks are just a pointless waste of time.

I don't know much about file systems, but I see that many more technical users have reported that NTFS is much slower than ext4, so wouldn't it be useful to include real-world file system tests? Even basic ones like file creation, copy, delete, etc. This could probably enlighten less technical users (like myself) about the real advantages of one or the other in everyday usage.
Even in my ignorance of these matters, I believe something must be really well done on the ext4 side, since I installed Ubuntu 10.04 on my Eee PC the other day and it only took 10 minutes, whereas a normal Windows XP installation could take more than half an hour.
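A rough sketch of the kind of basic real-world test suggested above (create, copy, and delete many small files) could look like the script below. The directory, file count, and payload are arbitrary assumptions, and a serious benchmark would also need to flush caches and sync between phases before drawing any conclusions:

```shell
#!/bin/sh
# Naive file-operation micro-benchmark: create many small files,
# copy the tree, then delete everything, printing rough elapsed
# wall-clock time per phase.
set -e
dir=$(mktemp -d)
n=300

t0=$(date +%s)
i=0
while [ "$i" -lt "$n" ]; do
    printf 'some small payload\n' > "$dir/file$i"
    i=$((i + 1))
done
echo "create: $(( $(date +%s) - t0 ))s for $n files"

t0=$(date +%s)
cp -r "$dir" "$dir.copy"
echo "copy:   $(( $(date +%s) - t0 ))s"

t0=$(date +%s)
rm -rf "$dir" "$dir.copy"
echo "delete: $(( $(date +%s) - t0 ))s"
```

On most machines this finishes in well under a second because everything lands in the page cache, which is exactly why published benchmarks use bigger working sets and explicit syncing.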

Correct. Any important information regarding the configuration of the tested system has to be clearly stated.
It might be OK to leave the file system's own parameters at their defaults, but it makes less sense not to turn off Windows Defender, since it's not a "core component" of the file system.

These tests are about the out-of-the-box experience. We want to see the experience of the ordinary user who buys a computer with Windows 7 or Ubuntu preinstalled, since that kind of user rarely tweaks things.
It's funny, though, that you bring up things that may hurt Windows performance, while Ubuntu can be tweaked in many more ways than just disabling some processes...

Why? I don't think there's any law mandating that!

Certainly not, but still, Linux with its glorious configurability can easily spark serious arguments about the options that were used instead of others, or the many more that were not used at all, or even the distro, the DE, etc.

When you test performance, you do it for a reason: to see the best numbers you can get out of a system and to learn which parameters make a difference.

These tests don't show performance in general, but performance at the default settings.
If pure performance were the goal, then the choice wouldn't be Ubuntu but Gentoo, with its vast tweaking abilities.

Basically no one in the world runs a system without making any changes.
So what's the point of testing the default settings?

The vast majority of people never go so far as to change anything more than their desktop wallpaper, let alone change things that affect the performance of a file system or a web server... Don't assume the average computer user hangs around in forums like this one...
Hey, that's what Apple has promoted all these years, and it works quite well: "a great experience out of the box".

If someone is concerned about system performance, it means they are already tweaking the system.

Not really. A lot of people suffer from the low performance of their computers but have absolutely no idea what causes it or how it can be fixed.

Also... the upcoming tests will use Apache, SQLite, and PostgreSQL on Windows 7, and those are not the default software used on Windows 7. No one uses Windows 7 to get the best numbers out of that software. It would make much more sense to test on Windows Server, not on the client edition.

The tests show the performance differences between two desktop OSes.
It's normal to test Ubuntu against Windows 7. If Windows Server were the candidate, then it would normally be challenged by Ubuntu Server, RHEL, or CentOS.

Unless we assume that those benchmarks are just a pointless waste of time.

For a company or an individual who cares about the out-of-the-box experience, these tests are rather valuable. A user (like me) who likes to get the maximum performance can take these specific results as a baseline for comparison.

For a company or an individual who cares about the out-of-the-box experience, these tests are rather valuable. A user (like me) who likes to get the maximum performance can take these specific results as a baseline for comparison.

Here's where your argument completely fails.
Out-of-the-box performance may be relevant (to some extent) when you're talking about desktop applications such as games and general utilities (lzma/gzip, mp3 encoding, etc.).
However, the out-of-the-box experience loses all meaning once you start talking about server applications such as Apache or PostgreSQL, and specialized applications such as ray tracing or simulation software.