7. Applying 'read only' to a folder and all the subfolders/files doesn't work. The pull-down menu just abruptly disappears.

8. Files that are marked 'read only' and then copied to a flash disk become 'read and write' again. But files that were made 'read only' during previous sessions are copied intact, with their 'read only' attributes.

Overall, this is a pretty long list of gripes. I can't believe Canonical hasn't found these bugs, or hasn't fixed them in the 5 months since releasing 10.04. Moreover, 10.04 doesn't work at all on my desktop. Test-driving the OS just gives me a blank screen, and downloading the latest .iso (10.04.1) doesn't help either; it only gets me to the Ubuntu chime (still with a blank screen).

I think Ubuntu is a great OS... IF they could iron out the quirks and make it run on ALL computers. This isn't the first time I've experienced Ubuntu not even 'test-driving' on a PC. Ubuntu and Kubuntu, from 9.04 to 10.04... I've seen it several times. The OS just doesn't work.

And yes, to gain widespread acceptance you can't tell people (or me) to enter this or that at the terminal. I grew up with DOS but Linux commands seem... Greek. We could probably use it to communicate with aliens.

Also, I wish they'd step off the 6-month release schedule. Sticking to the 6-month cycle means they have to rush out the OS even when there are still a ton of bugs to weed out. I'd rather they polish the OS before releasing it. What's happening now is they keep rolling out buggy software that just turns people off in the long run. I've been using Ubuntu for the past year, but I'm now feeling Ubuntu just might be a crappy OS. And yes, apparently a lot of bugs reported on the Ubuntu site don't seem to be getting addressed.

As you get older, you don't lose your friends.. you just find out who the real ones are.

Some of what you're talking about may be bugs with your specific hardware. It's near impossible for Canonical to write drivers for all the hardware out there; there's simply too much, and most of it is poorly documented. In Windows-land the market is so big that manufacturers are forced into writing drivers, because chances are the user is going to run Windows. That's not the case with Ubuntu, or Linux in general. That's why you see good support for popular hardware or useful niche hardware, but less and less support as you move away from that. There's not much they can do other than throw more time at it, and that's expensive and potentially wasteful.

There are numerous bugs. There are lots of quirks you have to work around, etc. But the base is good and always improving, and Ubuntu is seeing some excellent integration work with outside services that really excites me. Sadly, I think they're hurting themselves in other areas like laptops, where the hardware driver issue is even greater, and for some reason each new Ubuntu release seems to hurt battery life even more.

I wasn't aware Avast has AV software that runs on Linux... and if it does, it would probably only be useful for scanning Windows files.

As for defragging, I do think a hard disk will eventually become fragmented if you don't have much spare space, regardless of the strategies the file system designers have put in place. I also think Linux needs better file system and hard disk utilities.

ronch wrote:And yes, the OS is constantly being updated via system update.

Would you rather be out of date and lack security enhancements?

ronch wrote:2. Disk Utility doesn't even let me check my hard disk for file system errors (because it's mounted, and I can't unmount it in the GUI... sorry, I'm not a command-line expert under Linux).

Severe problems are caught during boot, and the OS runs a file system scan automatically every 26 boots.

ronch wrote:1. Sometimes the touchpad quits working (and at the most inconvenient scenarios too, when I don't have a mouse).

It would help if we knew your hardware to possibly help you.

ronch wrote:3. Vsync isn't working on the distro's media player.

The default player is merely adequate. May I suggest MPlayer or VLC?

ronch wrote:4. I had to wait several months before avast! antivirus started to run without requiring me to enter some sudo stuff on the command prompt.

This is a third-party application. How exactly is the behavior of third-party applications the responsibility of Canonical? I don't even know how useful it would be on Linux (I didn't even know they made it for Linux, to be honest).

I'm guessing this is the result of kernel mode setting. You can turn this off if you like.
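For the record, the usual way to turn kernel mode setting off is the nomodeset boot parameter. A sketch of how that looked around 10.04 (the kernel version and file paths below are examples from memory, so double-check on your own system):

```shell
# One-off: at the GRUB menu press 'e', find the line starting with
# "linux" (or "kernel" on legacy GRUB) and append nomodeset, e.g.:
#   kernel /boot/vmlinuz-2.6.32-24-generic root=UUID=... ro quiet splash nomodeset
#
# Permanent (GRUB 2): edit /etc/default/grub so the default kernel
# command line includes nomodeset...
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# ...then regenerate the boot menu:
#   sudo update-grub
```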

ronch wrote:7. Applying 'read only' to a folder and all the subfolders/files doesn't work. The pull-down menu just abruptly disappears.

8. Files that are marked 'read only' and then copied to a flash disk become 'read and write' again. But files that were made 'read only' during previous sessions are copied intact, with their 'read only' attributes.

I don't use the GUI a lot for copying. However, the "-p" option to your copy command will retain the attributes. Overall this seems like a lot of complaining about an OS you barely understand yet.
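To make that concrete, here's a throwaway sketch (temp files only, nothing on your system is touched) showing what "-p" preserves that a plain copy does not:

```shell
#!/bin/sh
# Demo: plain cp filters the file mode through the umask; cp -p keeps it.
set -e
umask 022
src=$(mktemp -d); dst=$(mktemp -d)

echo hello > "$src/doc.txt"
chmod 664 "$src/doc.txt"               # group-writable source file

cp    "$src/doc.txt" "$dst/plain.txt"  # new file gets 664 & ~022 = 644
cp -p "$src/doc.txt" "$dst/kept.txt"   # -p: mode/timestamps preserved

stat -c '%a' "$dst/plain.txt"          # prints 644
stat -c '%a' "$dst/kept.txt"           # prints 664

rm -rf "$src" "$dst"
```

Whether attributes survive also depends on the destination filesystem, which is the FAT32 wrinkle mentioned elsewhere in the thread.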

VLC last I tried in Ubuntu had major CPU usage issues. mplayer also has the vsync issue. Hell, vsync is a real bug that extends to not only the video player but the desktop environment. I get constant screen tearing in Ubuntu.

Skrying wrote:VLC last I tried in Ubuntu had major CPU usage issues. mplayer also has the vsync issue. Hell, vsync is a real bug that extends to not only the video player but the desktop environment. I get constant screen tearing in Ubuntu.

True, it exists in X itself. However, I have noticed that enabling XvMC minimizes the tearing.

ronch wrote:Don't get me wrong, I love Ubuntu as much as the next guy, and it works well enough, IF it works at all.

Thing is, I've been using Ubuntu 9.10 previously and am now using 10.04 on my laptop and I have a few gripes. And yes, the OS is constantly being updated via system update.

1. Sometimes the touchpad quits working (and at the most inconvenient scenarios too, when I don't have a mouse).

Dunno about this one. It may actually be an issue with the Linux driver for the touchpad, so I can't argue here.

ronch wrote:2. Disk Utility doesn't even let me check my hard disk for file system errors (because it's mounted, and I can't unmount it in the GUI... sorry, I'm not a command-line expert under Linux).

You are assuming that the system works like Windows does and that it needs this. It is actually good that you cannot muck with your filesystem in such a way while it is in use. As someone else pointed out, if there is anything amiss at boot (basically, the filesystem wasn't cleanly shut down), a basic check is automatically run, and a complete scan is run on a recurring schedule based on the number of boots.
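If you're curious, that boot-count schedule is stored in the filesystem itself and is visible (and adjustable) with tune2fs. A sketch against a throwaway ext3 image file standing in for a real partition, so it's harmless to run as-is:

```shell
#!/bin/sh
# Inspect/adjust the fsck-every-N-mounts counter on a scratch ext3 image.
set -e
img=$(mktemp)
dd if=/dev/zero of="$img" bs=1M count=16 2>/dev/null
mkfs.ext3 -Fq "$img"                       # throwaway filesystem image

tune2fs -l "$img" | grep -i 'mount count'  # current and maximum counts
tune2fs -c 26 "$img" >/dev/null            # full check every 26 mounts
tune2fs -l "$img" | grep -i 'maximum mount count'

rm -f "$img"
```

On a real system you'd point tune2fs (with sudo) at something like /dev/sda1; the image here just keeps the demo safe.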

ronch wrote:3. Vsync isn't working on the distro's media player.

Having never had a reason to try turning on vsync on any Linux media player, I can't speak to this one. Certainly could be an OS related issue, but given the amount of trouble I have with media players on Windows, I tend to not care either.

ronch wrote:4. I had to wait several months before avast! antivirus started to run without requiring me to enter some sudo stuff on the command prompt.

Third-party software, and not needed anyway unless you are using the system to serve files up to Windows systems. At some point in the future Linux antivirus will be needed, but not right now.

ronch wrote:5. No defrag utility.

Again, you're assuming that Linux works like Windows. In general, most current Unix file systems don't need defragmentation. In fact, in many of them the concept doesn't even apply.

OK, this one is actually an Ubuntu problem, and one I ran into. It annoyed me as well, though in reality, if this is your worst gripe, then things are actually pretty good. There are fixes for this, but they do require you to actually interact with your machine at the command prompt level, and they aren't technically supported.

ronch wrote:7. Applying 'read only' to a folder and all the subfolders/files doesn't work. The pull-down menu just abruptly disappears.

Can't speak to this one, but it does sound like a bug to me.

ronch wrote:8. Files that are marked 'read only' and then copied to a flash disk become 'read and write' again. But files that were made 'read only' during previous sessions are copied intact, with their 'read only' attributes.

Blame Microsoft for this one. Your flash disk is likely formatted with FAT32, which does not support Unix-style permissions, so permissions are lost when files are copied to a FAT32-formatted drive. If you don't need to access the flash disk from Windows, format it with some version of the ext filesystem and this won't happen anymore.

ronch wrote:Overall, this is a pretty long list of gripes. I can't believe Canonical hasn't found these bugs, or hasn't fixed them in the 5 months since releasing 10.04. Moreover, 10.04 doesn't work at all on my desktop. Test-driving the OS just gives me a blank screen, and downloading the latest .iso (10.04.1) doesn't help either; it only gets me to the Ubuntu chime (still with a blank screen).

I think Ubuntu is a great OS... IF they could iron out the quirks and make it run on ALL computers. This isn't the first time I've experienced Ubuntu not even 'test-driving' on a PC. Ubuntu and Kubuntu, from 9.04 to 10.04... I've seen it several times. The OS just doesn't work.

And yes, to gain widespread acceptance you can't tell people (or me) to enter this or that at the terminal. I grew up with DOS but Linux commands seem... Greek. We could probably use it to communicate with aliens.

Also, I wish they'd step off the 6-month release schedule. Sticking to the 6-month cycle means they have to rush out the OS even when there are still a ton of bugs to weed out. I'd rather they polish the OS before releasing it. What's happening now is they keep rolling out buggy software that just turns people off in the long run. I've been using Ubuntu for the past year, but I'm now feeling Ubuntu just might be a crappy OS. And yes, apparently a lot of bugs reported on the Ubuntu site don't seem to be getting addressed.

Well, since you aren't paying for the OS, I don't know that you have any real claim to gripe about its state. Part of the problem (and I'm not picking on you here; it is rather common) is that people assume that if it doesn't behave like Windows, then it is broken or wrong. As far as the bugginess goes, if you don't like it, go pay several hundred dollars for the latest version of Windows. But then you have to deal with all its bugs too.

I don't in the least bit disagree that Linux is not ready for the average computer user. It isn't, and it never will be. To run any OS other than the way its creators intended requires you to understand how the OS works and how to manipulate it behind the scenes. That applies to Windows, OSX, Linux, you name it. Linux was created with a different vision, one that is foreign to the average computer user who has never done it any way other than the Microsoft way.

SecretSquirrel wrote:I don't in the least bit disagree that Linux is not ready for the average computer user.

Did you actually mean "Linux is not ready for the average Windows user"?

I've actually got both my sister and my parents running Ubuntu, and they are 5 timezones and a 7-hour plane ride away. Yes, my father is computer literate, but my mother is terrible and my sister is in between. All are happy with Ubuntu because it just works once set up.

Yes, there are some bugs with Ubuntu (and the OP is experiencing some of them, but not everything on that list is a bug), but compared to the bugs I've run into with Windows, I'll take Ubuntu every time, even without factoring in the cost. The speed of security fixes, plus the ability to fix some things myself and submit patches back, is just a bonus.

I've not used 10.04 extensively yet, but I have been quite happy with the 8.04 LTS and 9.10 releases, which I've been using as my primary desktop OSes at work and home respectively. I did use 10.04 Server for my new homebrewed router, will be setting up another server with 10.04 tonight, and hope to upgrade my home desktop to 10.04 within the next week or so.

SecretSquirrel wrote:I don't in the least bit disagree that Linux is not ready for the average computer user.

Did you actually mean "Linux is not ready for the average Windows user"?

I've actually got both my sister and my parents running Ubuntu, and they are 5 timezones and a 7-hour plane ride away. Yes, my father is computer literate, but my mother is terrible and my sister is in between. All are happy with Ubuntu because it just works once set up.

Yes, there are some bugs with Ubuntu (and the OP is experiencing some of them, but not everything on that list is a bug), but compared to the bugs I've run into with Windows, I'll take Ubuntu every time, even without factoring in the cost. The speed of security fixes, plus the ability to fix some things myself and submit patches back, is just a bonus.

Unfortunately, the "average computer user" and "average Windows user" are synonymous. And I would argue that your family falls outside of "average". Mine does too. They have someone in the family to get them set up and running, explain the differences, and set expectations.

Yes, do not underestimate the importance of the "explain the differences" part! I keep putting off upgrading the last Win2K system in the house here because it belongs to my wife, and she does not like it when things change!

(I think I've managed to ease her into it by getting her a WinXP based netbook nearly a year ago... I think she'd be OK with her desktop changing to WinXP now. I'm not even going to consider Linux for her. No way!)

SecretSquirrel wrote:Well, since you aren't paying for the OS, I don't know that you have any real claim to gripe about its state.

I think that's pretty unhelpful. If Ubuntu is asking people to "try Ubuntu", which they most surely are, then people who try Ubuntu are as justified in complaining about this or that as anyone else. And most people don't "pay" for Windows either: it comes with their computer, and they would have to put in extra effort to find a computer it doesn't come on.

Part of the problem (and I'm not picking on you here; it is rather common) is that people assume that if it doesn't behave like Windows, then it is broken or wrong.

But whose problem is that? It's the Linux distro's problem, if the Linux distro has any intention of claiming to be an alternative to Windows.

As far as the bugginess goes, if you don't like it, go pay several hundred dollars for the latest version of Windows. But then you have to deal with all its bugs too.

Yep, dead on. Personally, I could not get along without Windows (I hope that changes some day), but keeping an Ubuntu computer around is very, very worthwhile. I prefer to use my Ubuntu machine when I can, but I end up using the Windows machine whenever I need to game or use CAD or graphics applications. And the graphics applications for Linux suck, and it's amazing that they continue to suck as much as they do. How hard could it be to get a GNU knock-off of IrfanView up and running, or Paint.NET? And Gimp is no Photoshop alternative, no way, no how. Anyone who says so has either not used Photoshop in any serious way or is turning a blind eye to Gimp's deficiencies. F-Spot plain sucks; what a pile of crap it is. 99% of the photo editing I do is extremely simple stuff: resizing, cropping, simple color corrections, and batch processing. IrfanView is a figurative godsend for this. GNU either needs to plagiarize it or convince Irfan to open-source it.

I don't in the least bit disagree that Linux is not ready for the average computer user. It isn't, and it never will be. Linux was created with a different vision, one that is foreign to the average computer user who has never done it any way other than the Microsoft way.

That's very unfortunate insofar as "it never will be". But part of what you say isn't quite true, since Linux itself is just the kernel, and the GNU applications and interfaces, and then the various GUIs that run on top of it, very often most certainly are aimed at the "average user". One of Ubuntu's marketing claims is how easy it is to install (just 7 steps). Why would Canonical bother bringing that up if it was not aiming at the "average computer user"? So Linux itself may not aim at the average user, but Canonical's Ubuntu most certainly does, and it's not fair or true to say that it does not.

Having said all that, I love Ubuntu. I enjoy the Gnome GUI, and I enjoy using the command line quite a lot too. I like many of the applications that come installed. I love the simple ability to install any of thousands of applications with a few keystrokes; that beats Windows for simplicity any day of the week. Canonical is doing great things with Ubuntu and I'm a huge fan.

axeman wrote:Ext3 volumes can get pretty heavily fragmented if the disk gets very full and you have lots of large files. It does annoy me slightly that there isn't a real obvious way to deal with this; on the flip side, it's nowhere near as bad as NTFS. A brand-new install of Windows ends up with 10% of its files fragmented or some nonsense when there are 100s of GB of free space... wtf? By contrast, you're only likely to run into noticeable performance degradation due to fragmentation on Linux in special cases, like a disk that's chock full of large video files. My biggest issue with this was a disk with a bunch of virtual machines on it that was getting quite full, causing serious performance issues when trying to run VMs. The thing is, "special cases" are more likely to occur nowadays.

Ext4 (which AFAIK most distros have moved to as the default file system type) includes a number of new features which are supposed to help reduce fragmentation.

I agree that the lack of production-quality defrag tools is somewhat annoying, but IMO even taking that into account ext3/4 is still a more sane filesystem implementation than NTFS.

been seein' those "not ready for prime time" accusations since OS/2 days - the only common factor is that 'prime time' means 'what I am familiar with'

Every software setup has its quirks, its bugs, and its way of doing things.

It seems to be only with computers that people expect the tool to adapt to them rather than take the traditional view that a craftsman knows his tools -- and don't get me started on those computer classes that cater to nonproductive paradigms for mass consumption.

As far as Ubuntu and the lay public -- the FUD mongering there is also way off base IMHO. Ubuntu is different, yes, but it is as easy or even easier to install than more popular systems, often configures to hardware easier, and has all of the bases covered.

Hey! if you don't like something, that's fine. Choose something else. But don't cast judgment based on personal preference. Others have just as much right to their choices, too.

And if you are having tech support problems, address them appropriately and you'll be much more likely to find a solution.

bryanl wrote:been seein' those "not ready for prime time" accusations since OS/2 days - the only common factor is that 'prime time' means 'what I am familiar with'

Every software setup has its quirks, its bugs, and its way of doing things.

It seems to be only with computers that people expect the tool to adapt to them rather than take the traditional view that a craftsman knows his tools...

This comes about because Linux is essentially an operating system for programmers. It's branching out and moving on, very slowly, but that's how it started and that's still how most of its maintainers and developers think. So there's very little interest in developing their own usable UI (because time spent on UI is time not spent making better programming interfaces, etc.), and so the GUI elements that exist are all adopted from Windows. So most Linux installs use a bad GUI paradigm in order to make the transition easier for Windows users, which leads to the expectation that it can be controlled like Windows, which leads people to complain when it doesn't. Now, it is becoming a better Windows "emulator" all the time, and eventually this will likely not be a problem, but it isn't there yet.

And before you accuse me of being any more of a Linux hater than I admit to being, please realize that I spend all my time at work on a Debian development server building a filesystem for Linux.

re: "So there's very little interest in developing their own usable UI" -- these kinds of allegations come across as rationalizations, because they seem to ignore Gnome, KDE, Compiz, and many more such user interfaces.

Programmers use computers, too, and often with more vigor and attention than other computer users who are really just using a browser or part of an office suite and have minimal interaction with the underlying environment.

Of course, you could get into Linux being only the kernel or whatnot but the topic here is Ubuntu, one of many distributions that provide a complete environment.

But all of that begs the real measure. Ubuntu can install on many machines with minimal hassle compared to alternatives. It provides a wealth of functionality and auxiliary applications with an ease of acquisition not found in its alternatives. As many who have installed it for 'Mom' know, its usability and need for user support are on a par with (or better than) its alternatives. Then you get into the licensing and upgrade hassles and Ubuntu has it all over the alternatives in that realm. The measures are clear. There are very few areas where Ubuntu or its ilk come up short and those are usually quite specific and narrow in scope.

To me, the real issue is the continual and ongoing efforts to rationalize Linux distributions as being unsuitable, difficult, user antagonistic or whatever. That implies something about human behavior and cognition that is difficult to measure but important to understand. - That's the real UI problem.

7. Applying 'read only' to a folder and all the subfolders/files doesn't work. The pull-down menu just abruptly disappears.

8. Files that are marked 'read only' and then copied to a flash disk become 'read and write' again. But files that were made 'read only' during previous sessions are copied intact, with their 'read only' attributes.

Overall, this is a pretty long list of gripes. I can't believe Canonical hasn't found these bugs, or hasn't fixed them in the 5 months since releasing 10.04. Moreover, 10.04 doesn't work at all on my desktop. Test-driving the OS just gives me a blank screen, and downloading the latest .iso (10.04.1) doesn't help either; it only gets me to the Ubuntu chime (still with a blank screen).

I think Ubuntu is a great OS... IF they could iron out the quirks and make it run on ALL computers. This isn't the first time I've experienced Ubuntu not even 'test-driving' on a PC. Ubuntu and Kubuntu, from 9.04 to 10.04... I've seen it several times. The OS just doesn't work.

And yes, to gain widespread acceptance you can't tell people (or me) to enter this or that at the terminal. I grew up with DOS but Linux commands seem... Greek. We could probably use it to communicate with aliens.

Also, I wish they'd step off the 6-month release schedule. Sticking to the 6-month cycle means they have to rush out the OS even when there are still a ton of bugs to weed out. I'd rather they polish the OS before releasing it. What's happening now is they keep rolling out buggy software that just turns people off in the long run. I've been using Ubuntu for the past year, but I'm now feeling Ubuntu just might be a crappy OS. And yes, apparently a lot of bugs reported on the Ubuntu site don't seem to be getting addressed.

This is a reason why I use Gentoo Linux. It does not have a team of developers who think they know how to design an X window system better than upstream, so it uses a stock X server. That is, unless you decide to modify the source code, in which case you are on your own.

Why do you need to check your hard disk for file system errors? It will be done automatically if necessary, as long as you did not do something silly like mounting your hard drive without a journal.

Use a different media player. SMPlayer is nice.

UNIX systems practice chastity, so they do not catch STDs like certain other operating systems. Avoiding self destructive acts against which "protection" is roughly 87% effective is a feature.

UNIX file systems are designed to avoid file fragmentation, such that it does not build up to a level where it will degrade performance or cause your computer to crash. It is a feature.

This is a reason why I use Gentoo Linux. I can customize the splash screen to be whatever I want. The default is a very attractive wall of text.

You should be using chmod for that.
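A minimal sketch of the recursive case the GUI chokes on, run against a scratch directory (substitute your real folder, and note that making directories read-only also blocks creating new files inside them):

```shell
#!/bin/sh
# Recursively strip write permission, then verify and clean up.
set -e
umask 022
top=$(mktemp -d)
mkdir -p "$top/sub"
echo data > "$top/sub/file.txt"

chmod -R a-w "$top"                  # read-only for everyone, recursively

stat -c '%a' "$top/sub/file.txt"     # prints 444 (was 644)

chmod -R u+w "$top"                  # restore write so cleanup can work
rm -rf "$top"
```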

This is why I use the terminal.

You should try Sabayon Linux. It is based on Gentoo Linux, so familiarizing yourself with it will enable you to make an easy transition to Gentoo Linux when you decide that third parties cannot make design decisions (e.g. which media player to install) about your computer better than you can. Alternatively, you could ask on freenode in your distribution's channel (e.g. #ubuntu) for information on alternative software that you can install.

When posting lists on message boards, you should use the list bbcode.

You could switch to Debian if you want longer release cycles, although software for UNIX systems changes so quickly that it could be obsolete and potentially unsuitable for your purposes by the time you receive it. Waiting longer does not mean that quality will be higher, especially since the distribution designers do not really coordinate with the upstream developers in a way that weeds out bugs. The way things usually work is that someone finds an issue, complains, and then one of three things happens:

1. It is a distribution issue, which is at some point fixed, but it is out of the user's hands until then.

2. It is an upstream issue that was detected by users of a distribution with a shorter release cycle and already fixed; in that case, the user will have to wait until the next release for the fix unless it is a security or stability issue.

3. It is something new, and it is sent upstream; in that case the downstream package maintainers and upstream developers might collaborate on a fix.

In the case of Ubuntu, it is typically one or two. The release cycles are the main things that keep software versions in synchronization, allowing end users the benefit of the improvements made by upstream, and allowing upstream the benefit of feedback from end users.

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.

Thanks for the replies, everyone, and apologies for the late reply. As I've said, I believe Ubuntu is a great OS, and I only wish Canonical could iron out the bugs and shortcomings I've encountered, as well as all the other bugs other users are encountering. To address some of the comments posted here...

1. My computer (laptop) uses an Nvidia 7000M GPU.

2. Handling of Avast has been fixed, so kudos to Ubuntu. I use Avast to scan USB thumb drives which I use with Windows PCs.

3. Although the OS automatically checks the file system after a number of boots, I'd appreciate it if I could run the checks whenever I choose. Perhaps this is possible, but I don't know how.

4. I actually prefer the media player bundled with Ubuntu. Very easy to use. And although I find the tearing a bit irritating, especially in fast scenes, the convenience of the package more than makes up for it (say, the mouse wheel can be used to forward/reverse scenes, or F11 to toggle full screen).

5. Even if ext3 or ext4 avoids fragmenting files, there will come a time when so much data has been written and deleted that fragmentation sets in, and that's when a defrag tool would come in handy.

6. The biggest concern I'm having now, which I didn't include in my first post, is that Ubuntu doesn't work with all PCs. Even booting off the CD causes the PC to just hang with a blank screen, as if the display went into sleep mode. I've tried alternative boot parameters, but my desktop just doesn't run properly. This is true for Ubuntu and Kubuntu 10.04 and Linux Mint 9 KDE. My desktop uses an ATI HD5670 1GB video card. Perhaps it's the GPU or X, but I just couldn't figure it out. Canonical acknowledges this problem, and I hope they fix it, because inviting people to use Ubuntu who then find out the OS doesn't work just disappoints them and paints a bad picture of the OS.

7. I still say Ubuntu (and other distros) should step out of the 6-month release cycle. If it ain't ready, don't let it out. It's that way for processors, video cards, Windows, games, etc., and Linux distros should be no different. If it needs to be delayed, delay it. I'd rather wait a bit longer for better peace of mind than get something that's available right now but messes my data up. I'm sure everyone agrees with that.

I'm now trying Linux Mint 9 KDE and using ext4 (instead of ext3). Some concerns, such as the vsync issue, applying read-only attributes to a folder and all its subfolders/files, the touchpad not working, etc., appear to be non-issues here, but I've been running this distro for only a few days now, so I'm still observing it. It has its fair share of bugs/annoyances, such as:

1. During installation, I specified that I would be logged in automatically, but every time the OS starts it asks for my password. Points for Ubuntu on this one.

2. I configured the wireless connection manager to log in to our home network automatically, but when the OS starts it still asks me every time for my network password. Our network password is about 16 characters long, so I find it a bit irritating. Another point for Ubuntu.

3. Changing the desktop effects used when switching desktops (the cube animation is cool) causes the configuration screen (where you specify which graphical effect to use) to briefly display itself even though I closed it earlier. A bummer when showing off the OS's cool effects to other people.

This is actually a nice alternative to Ubuntu, on which it's based. I've always wanted to try Kubuntu, but for some reason annoyances/glitches have always gotten in the way. Some reviews say Linux Mint 9 KDE is what Kubuntu should have been all along. I'll try this distro for a week or two. If I find it better to live with, then maybe this is it. Otherwise, perhaps Ubuntu again for me. The new version, 10.10, looks enticing, but maybe the LTS is a better choice.


RE the 6-month release cycle... There's no obligation to keep up with the 6-month cycle; indeed, it isn't really recommended that you do. The idea is that if you want a long-term stable system, you stick with LTS releases. If you're someone who always wants the latest and greatest and doesn't mind fiddling with stuff, then go with the non-LTS versions.

I do agree that 10.04 has a lot more bugs than I'd have liked for an LTS, specifically to do with the no-display-on-installation problem you were running into. I've run into that myself on a lot of machines, mostly running Nvidia hardware. I believe it's related to a change in the basic driver included on the CD, and the workaround involves installing with the Alternate Install CD and installing the proprietary Nvidia driver from the command line before trying to use the desktop (but don't quote me on that).

Hardware and software are ALWAYS being pushed out the door before they're ready. Personally, I think there's a very strong argument that neither XP nor Vista was really "ready for prime time" at release (just look at the Blaster worm that hit XP, and the various bugs in Vista, e.g. very slow file sharing, etc.).

RE checking system partitions... To manually check the partitions you need them mounted read-only. Obviously you can't do that while you're logged in normally, so you have to start your machine in recovery mode and run the check from the command line (assuming you've done the standard Ubuntu install of putting everything on a single partition).
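
To make that concrete, here is a hedged sketch. The device name and the use of ext3 are assumptions for illustration only; substitute your own partition. The runnable part performs the same kind of check on a scratch image file so nothing on a real disk is touched (it assumes e2fsprogs is installed):

```shell
# In recovery mode the check itself would be roughly (illustrative only;
# /dev/sda1 is a placeholder for your actual root partition):
#   mount -o remount,ro /
#   fsck -f /dev/sda1
#
# Safe demonstration on a scratch ext3 image file instead (no root needed):
dd if=/dev/zero of=/tmp/fsck-demo.img bs=1M count=16 status=none
mkfs.ext3 -q -F /tmp/fsck-demo.img   # -F: it's a plain file, not a block device
e2fsck -f -n /tmp/fsck-demo.img      # -f: force a full check; -n: open read-only
rm /tmp/fsck-demo.img
```

As a bonus, the summary line e2fsck prints includes the percentage of non-contiguous files, which is where fragmentation figures like the ones quoted elsewhere in this thread come from.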

I'll grant you that this is fiddly and complicated, but since it's something you shouldn't be doing all the time, I don't see it as a problem.

RE the command line... I get the impression you really don't like this and feel that it's something that went out with the Ark. However, it is to a large extent how things work on computers, be they Linux, Mac, and yes, Windows too! Some knowledge of the command line is essential to anyone who considers himself a "power user". It doesn't take long to find fixes for Windows problems that call for opening up the command prompt or delving into the registry (which personally I find worse than the command line, as it isn't self-documenting... type a command wrong and you'll usually get some explanation of why it didn't work and what you should have done).
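
As a small illustration of that "self-documenting" point (this assumes GNU ls; the bogus option name is made up):

```shell
# Mistype an option and the tool tells you what went wrong and where to
# look next, rather than failing silently:
ls --bogus-option 2>&1 || true
# GNU ls prints something like:
#   ls: unrecognized option '--bogus-option'
#   Try 'ls --help' for more information.
```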

Obviously Linux requires by far the most command line knowledge of the three major OSs, and while the amount of time you need to spend using it is continually diminishing, this will always be the case.

RE your wifi and login issues... I think the problem you've got is related to your keyring. This is what keeps all your stored passwords encrypted while you're not logged in. If you're set up to automatically log in, then the keyring doesn't get unlocked, as you didn't enter your password, so the passphrase either doesn't get saved or isn't automatically recovered (I forget which).

Since you're set up to log in automatically but it's still asking for a password and not remembering your wifi key, I guess something has gone pretty wrong somewhere.

RE fragmentation... I've got some servers running ext3 file systems that have been going for 6 years and are still only 5% fragmented. Granted, they aren't doing the kinds of tasks that create fragmented files (big database files are the usual cause, as I understand it), but they do have GBs of data being written and deleted on them all the time. It just doesn't seem to be a problem that you need to worry about (hence no one has written a tool to deal with it).

cheesyking wrote:RE fragmentation... I've got some servers running ext3 file systems that have been going for 6 years and are still only 5% fragmented. Granted, they aren't doing the kinds of tasks that create fragmented files (big database files are the usual cause, as I understand it), but they do have GBs of data being written and deleted on them all the time. It just doesn't seem to be a problem that you need to worry about (hence no one has written a tool to deal with it).

Clearly Linux file systems live up to what has been said regarding fragmentation. I would still be concerned if my drives were maxed out space-wise, but then I suppose it would be time for a bigger drive.

I'm not a Linux nut, but every once in a while I mess around with it. Knoppix is what I've used the most, considering its many uses, but for regular desktop use, Mandriva is the best out-of-the-box distro available.

l33t-g4m3r wrote:... but for regular desktop use, Mandriva is the best out-of-the-box distro available.

...if you don't mind relying on a company which is teetering on the edge of bankruptcy, just laid off most of its staff, and has publicly stated that they are going to focus on the server market (not desktop) going forward. I'm doubtful they will survive; no sane server admin would rely on a distro from a company in their condition.

ronch wrote:7. I still say Ubuntu (and other distros) should step out of the 6 month release cycle. If it ain't ready, don't let it out. It's that way for processors, video cards, Windows, games, etc. and Linux distros should be no different. If it needs to be delayed, delay it. I'd rather wait a bit longer for better peace of mind than get something that's available right now but messes my data up. I'm sure everyone agrees with that.

Google has moved to the 6-month release cycle with Chrome. Google has explained the reasons for doing so, and they make a lot of sense. The reasons Ubuntu has a 6-month cycle are probably very similar to Google's. Coming from a business environment myself, I know we wouldn't be able to do our jobs without some firm deadlines. Deadlines don't just allow us to make progress; they force us to make progress.

Think about Intel's tick-tock. It's exactly the same thing, just on a 2-year cadence rather than a 6-month one. Considering the differences between CPU design and manufacturing on the one hand and software design and production on the other, the difference in cadence makes sense.

Moral of the story: deadlines make things happen. Deadlines result in progress. Deadlines let you plan and focus and perform.

Given Ubuntu's progress since I started using it in 2008, I'm very impressed with where their release cadence has gotten them.

If you want the "when it's bug free" release mentality, that's Debian; if you want the "get it done" mentality, that's Ubuntu.

I would even take that one step further: IMO the Ubuntu LTS (long term support) releases are a pretty good compromise between those two extremes. They are released at approximately the same frequency as Debian, but are nominally based on a snapshot of the Debian "testing" branch instead of the latest official Debian release. What this effectively means is that you get newer versions of everything than you'd get with the official Debian release, but fewer bugs than a regular (non-LTS) Ubuntu release.

ronch wrote:5. Even if ext3 or ext4 avoids fragmenting files, there will come a time when so much data being written and deleted will eventually cause fragmentation, and that's when a defrag tool will come in handy.

I have never heard of anyone having issues with ext3 fragmentation. There is a strong possibility that you will die before this becomes a problem.

ronch wrote:6. The biggest concern I'm now having, which I haven't included in my first post, is that Ubuntu doesn't work with all PCs. Even booting off the CD will cause the PC to just hang up with a blank screen that went off to sleep mode. I've tried alternative boot parameters but my desktop just doesn't run properly. This is true for Ubuntu and Kubuntu 10.04 and Linux Mint 9 KDE. My desktop uses an ATI HD5670 1GB video card. Perhaps it's the GPU or X, but I just couldn't figure it out. Canonical acknowledges this problem, and I hope they fix it because inviting people to use Ubuntu and then later finding out the OS doesn't work just makes them disappointed and paint a bad picture of the OS.

If you get a graphical environment, the issue is with those particular distributions and not Linux distributions in general. Gentoo Linux in particular rarely has issues with hardware compatibility provided that there exists a Linux distribution capable of running on the given hardware. System Rescue CD is a Gentoo Linux based distribution that exists solely as a LiveCD. My suggestion is that you try Sabayon Linux on such systems if System Rescue CD works. Sabayon Linux is also Gentoo Linux based.

Neither System Rescue CD nor Sabayon has hardware support as good as Gentoo Linux itself, but unlike Gentoo Linux, they automate the process of getting hardware working. Gentoo Linux's strength in hardware support comes from a combination of having the latest software versions and the fact that the user is responsible for getting the hardware to work. Sabayon and System Rescue CD to some extent give up the excellent hardware support you get when you take that task into your own hands, but they do a very good job of handling such basic tasks for you, and being Gentoo Linux based, they make it easy to go into the system and fix things as you would on a Gentoo Linux system. In fact, System Rescue CD is meant primarily for fixing systems (e.g. Windows, Linux, Mac OS X, etcetera) and is not meant to be used as a permanent distribution because it is a LiveCD. Sabayon Linux, on the other hand, is meant to be used as a permanent distribution, which is why I wholeheartedly recommend that you try it.

If you are willing to learn (and I have not scared you away), the best distribution you can install on your system is Gentoo Linux. It empowers you to take matters into your own hands and automates a great deal of the drudge work involved. The only things it does not automate that you will miss are hard disk configuration and kernel compilation and installation, but there is a good reason for that: it is considered impossible to get those right for every individual without having them do it themselves.

ronch wrote:7. I still say Ubuntu (and other distros) should step out of the 6 month release cycle. If it ain't ready, don't let it out. It's that way for processors, video cards, Windows, games, etc. and Linux distros should be no different. If it needs to be delayed, delay it. I'd rather wait a bit longer for better peace of mind than get something that's available right now but messes my data up. I'm sure everyone agrees with that.

Not every distribution is on a 6-month release cycle. Gentoo Linux releases continuously. Debian is on a 2-year release cycle, but it rarely keeps to it, so it can take 3 or 4 years between releases. Red Hat releases a new major OS version every 5 to 6 years.

There are also Linux alternatives such as FreeBSD, which is in some respects better designed than Linux. It has a more traditional release cycle.

ronch wrote:I'm now trying Linux Mint 9 KDE and using ext4 (instead of ext3). Some concerns, such as the vsync issue, applying read-only attributes to the folder and all subfolders/files, touchpad not working, etc. appear to be a non-issue here, but I've been running this Linux distro for only a few days now so I'm still observing it. It has its fair share of bugs/annoyances such as.

By default, all folders should be set to 755, with all files set to 644. Why would you want it to be any different?
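
For what it's worth, those defaults can be applied recursively from the shell. find is used so that directories and files get different modes (a plain chmod -R 644 would strip the execute bit that directories need in order to be entered). The /tmp path is made up for the demo:

```shell
# Hypothetical demo tree; the paths are for illustration only.
mkdir -p /tmp/permdemo/sub
touch /tmp/permdemo/sub/file.txt
# Directories get 755 (rwxr-xr-x), regular files get 644 (rw-r--r--):
find /tmp/permdemo -type d -exec chmod 755 {} +
find /tmp/permdemo -type f -exec chmod 644 {} +
stat -c '%a %n' /tmp/permdemo/sub /tmp/permdemo/sub/file.txt
# prints:
#   755 /tmp/permdemo/sub
#   644 /tmp/permdemo/sub/file.txt
```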

By the way, if you do not like Linux Mint, I suggest that you try Sabayon Linux. Linux Mint is transitively Debian based while Sabayon Linux is directly Gentoo Linux based. The two distributions take significantly different approaches to package management, so any quirks that Linux Mint inherited from Debian are likely not present in Sabayon Linux.

ronch wrote:1. During installation, I specified that I would be logged in automtically, but everytime the OS starts it asks for my password. Points for Ubuntu on this one.

That is probably a bug in the installer. I require password logins on all of my systems, so I do not know how to fix it, but I know that it can be done. I have OpenSUSE configured to do this on my mother's laptop.

ronch wrote:2. I configured the wireless connection manager to log in automatically to our home network, but when the OS starts it still asks me everytime for my network password. Our network password is about 16 chars long so I find it a bit irritating. Another point for Ubuntu.

Wireless functionality has traditionally been buggy under Linux, although it has recently become much better. What wireless connection manager are you using? In my experience, nm-applet, which GNOME uses, works well. Wicd is a solid alternative. KDE's KNetworkManager and, more recently, the network management plasmoid that replaces its GUI have only recently become usable to the point where they do not routinely lose network connection information. Since I use Gentoo's testing tree, I get the latest software fairly quickly, and I am presently enjoying the improvements. Since you use Linux Mint, you will likely need to wait for the next release before you see them, although you could switch wireless connection managers to work around that. Alternatively, you could install Sabayon Linux, which is based on Gentoo Linux's testing tree and should see these improvements much sooner than Debian-based distributions, although that is not necessary.

ronch wrote:3. Changing the desktop effect used when switching desktops (the cube animation is cool) causes the configuration screen (where you specify which graphical effect to use) to briefly reappear even though I already closed it earlier. A bummer when showing off the OS's cool effects to other people.

Sign up for a KDE Bugzilla account and vote for the bug to get the attention of KDE's developers. You could also post a bit about your own experiences with it. It is an annoying bug that I will be happy to see die. If I had time, I would try fixing it myself and submitting the patches upstream. If you know C++, you could try your hand at that in addition to voting and possibly adding some feedback to the bug report.

ronch wrote:This is actually a nice alternative to Ubuntu, on which it's based. I've always wanted to try Kubuntu, but some annoyances/glitches have always prevented me from sticking with it. Some reviews say Linux Mint 9 KDE is what Kubuntu should have been all along. I'll try this distro for a week or two. If I find it better to live with, then maybe this is it. Otherwise, perhaps Ubuntu again for me. The new version, 10.10, looks enticing, but maybe the LTS is a better choice.

Kubuntu is really Ubuntu with KDE packages installed instead of GNOME packages. That is an eccentricity of Canonical's, and no other distribution provider does that. Ubuntu/Kubuntu has a reputation for being a terribly done KDE distribution, and I really do not advise using it with KDE.

OpenSUSE is the best Linux distribution for KDE. Slackware Linux, Gentoo Linux, and Sabayon Linux are good as well. Every mainstream distribution I can think of is better than Ubuntu at running KDE.

Dirge wrote:

cheesyking wrote:RE fragmentation... I've got some servers running ext3 file systems that have been going for 6 years and are still only 5% fragmented. Granted, they aren't doing the kinds of tasks that create fragmented files (big database files are the usual cause, as I understand it), but they do have GBs of data being written and deleted on them all the time. It just doesn't seem to be a problem that you need to worry about (hence no one has written a tool to deal with it).

Clearly Linux file systems live up to what has been said regarding fragmentation. I would still be concerned if my drives were maxed out space-wise, but then I suppose it would be time for a bigger drive.

That is actually not the issue you would think it is unless you fill the drive as root. Linux file systems by default have 5% reserved space, so they report 100% when they are only 95% full, as a safety feature. Only root can write additional files after the 100% mark has been reached, assuming there is reserved space available.
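
The reserved-space figure is easy to verify with tune2fs from e2fsprogs. A sketch on a scratch ext3 image (no root needed for an image file; on a real system you would point tune2fs -l at the partition device):

```shell
dd if=/dev/zero of=/tmp/reserve-demo.img bs=1M count=16 status=none
mkfs.ext3 -q -F /tmp/reserve-demo.img
# "Reserved block count" is 5% of "Block count" by default:
tune2fs -l /tmp/reserve-demo.img | grep -E '^(Block|Reserved block) count'
rm /tmp/reserve-demo.img
```

If 5% is too much on a large drive, tune2fs -m lets you change the percentage.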

As far as the spectrum of Linux distributions goes, Red Hat Enterprise Linux and its clone CentOS are probably the best in terms of a "bug free" experience, while Gentoo Linux's testing tree is probably the best in terms of "get it done". No release of Ubuntu is better than these distributions at either of those extremes. FreeBSD is a better option in my opinion than RHEL, but it is not a Linux distribution.

Disclaimer: I over-analyze everything, so try not to be offended if I over-analyze something you wrote.