Ubuntu :: Seems Slow In Some Areas - High CPU Usage?

I've been using Ubuntu consistently for about three days now. I really, really love the interface and how everything works and all that, but I've been having a couple of weird problems with speed.

Graphics things seem to work really well. When I go into the overview of all my workspaces, it's instant and looks great. The problem is when I open and use some applications.

For example, when I open the software center, it takes noticeably longer to start up than it did the first time. Also, when I drag a window out of its maximized state, it takes literally about five seconds before it shows up as being dragged by my mouse.

When I look at the system monitor, each of my CPU cores is constantly sitting at about 20% usage. I have a 3-core CPU; could that be the problem?

Another example: when I went to ..... just now, it would take a second for any volume changes in the video to register.

I also have smooth scrolling enabled in Firefox, but it's very unresponsive now; it's slow as all hell. Even notifications are showing up more slowly.

So, what's the deal? What could I have done wrong?

CPU: AMD Phenom II X3 2.8 GHz
GPU: ATI Radeon HD 4860 1 GB
RAM: 4 GB

One more thing: I have really bad screen tearing when I try to move windows around, as if there isn't any vsync on. Where can I turn it on or fix this?

I'm still pretty new to Ubuntu and Linux, but I'm an advanced computer user. Since Linux seems to be hammering my CPU, I'm going to boot back into Windows 7 for now. If I can fix this problem, though, I'd like to keep Ubuntu as my main OS and keep Windows around just for games, unless I can get Wine to run them properly.

I was browsing a folder with lots of images. After I finished, I closed Nautilus and noticed that my computer had become slow, so I checked with System Monitor and found that Nautilus was using almost 100 MB of RAM (with 4 tabs open). I'm not sure whether this is normal, because when I reopened the same folder with PCManFM it consumed less than 20 MB of RAM (also with 4 tabs open). Here's the screenshot from System Monitor.
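Whether 100 MB is normal depends on what Nautilus has cached (thumbnails in particular). To compare the two file managers fairly, look at each process's resident set size (VmRSS), which is the figure that actually occupies RAM. A minimal sketch, Linux-only and using our own PID as a stand-in for the Nautilus/PCManFM PIDs:

```python
# Minimal sketch (Linux-only): read a process's resident memory (VmRSS)
# from /proc, roughly the figure System Monitor reports as "Memory".
import os

def rss_kib(pid):
    """Return VmRSS in KiB for the given PID, or None if unavailable."""
    try:
        with open(f"/proc/{pid}/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1])  # value is reported in kB
    except FileNotFoundError:
        return None

# Using our own process as an example; substitute the real PIDs of
# nautilus and pcmanfm to compare them.
print(f"this process uses {rss_kib(os.getpid())} KiB resident")
```

Comparing VmRSS for the two PIDs side by side gives a more meaningful number than a single System Monitor snapshot.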

I have recently upgraded from Kubuntu 9.10 to 10.04. I am having trouble with CPU usage by the X server. When running KDE programs (for example Kontact and Amarok) my computer is becoming very sluggish. This seems to be because X is using 90-100% of the CPU constantly when there are any changes to the application (any sort of redraw, even the mouse moving across the screen). I don't experience this with other applications (I am currently writing this from Chromium with no issues at all). Below is all the relevant information I can think to include.

INSTALL PROCESS: Fresh install from the Ubuntu 10.04 LiveCD. My /home directory is on a separate partition, so it has remained unchanged from 9.10. After installing Ubuntu, I installed the kubuntu-desktop package to get KDM and the KDE desktop.

The only real change from my last install is that /var is now also a separate partition (in a bid to save my MySQL data should I have to reinstall again) but it is on the same hard drive as my root partition and so shouldn't be causing any performance issues.

MY X SETUP: Three-screen setup using onboard and PCI graphics cards. This worked perfectly before and, after reconfiguring X using nvidia-xconfig, works almost as well again, except that one of my monitors now autodetects at 1280x1024 where before it ran at a higher (and widescreen) resolution.

The issue only presents itself when KDE application windows are visible, and it disappears when they are hidden (either minimised or closed). As far as I can tell it could be an X, Qt, or KDE issue, but I am not sure which versions of these changed in the upgrade.

My problem seems to be very simple: high memory usage. I occasionally use Movie Player to watch a few shows, and I use Firefox as well. My memory usage starts out quite small, about 500 MB, but after using Firefox lightly along with Movie Player it jumps to almost 2 GB, and it stays there even after they've been closed. What gives? I've attached an image so you can see what I'm talking about.

I don't understand why I have 800 MB of usage at startup. I didn't add any programs to run at startup. I do have 10 primary partitions that are auto-mounted at startup, but that shouldn't use up so much; I have 2 GB of RAM total.

At startup, Nautilus uses ~400 MB of virtual memory according to System Monitor.

Also, in the Resources tab of System Monitor, which figure is RAM and which is swap: "Virtual Memory" or "Memory"?
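Roughly speaking, "Memory" in System Monitor is the resident set (RAM the process actually occupies), while "Virtual Memory" is the whole address space it has mapped, most of which may never touch RAM; that is why Nautilus's ~400 MB virtual figure is not 400 MB of RAM in use. Swap is reported separately. A minimal sketch (Linux-only) of pulling the system-wide RAM and swap figures straight from /proc/meminfo:

```python
# Minimal sketch (Linux-only): distinguish physical RAM from swap by
# reading /proc/meminfo, the same source System Monitor uses.

def meminfo():
    """Parse /proc/meminfo into a dict of values in KiB."""
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.split()[0])  # strip the trailing "kB"
    return info

m = meminfo()
# Memory held by buffers/cache is reclaimable, so subtract it to get
# the RAM that applications are really using.
ram_used = m["MemTotal"] - m["MemFree"] - m.get("Buffers", 0) - m.get("Cached", 0)
swap_used = m["SwapTotal"] - m["SwapFree"]
print(f"RAM used (excluding cache): {ram_used} KiB")
print(f"Swap used: {swap_used} KiB")
```

The buffers/cache subtraction also explains why "used" memory at startup often looks much higher than what your programs actually need.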

I play RuneScape most of the time I'm on the computer, and I always keep an eye on performance. Yesterday and the days before, everything was fine: the CPU usage it caused was between 30 and 60%. Now it won't go below 70%; it's always at 70-99% for some reason. I have the graphics set to OpenGL (they were set that way before too, with no problem), and it started causing high CPU usage. If I set the graphics to Safe Mode it hits 100%. I never tweaked the drivers; I have an NVIDIA GeForce 9600 GT with the OpenGL libraries installed along with my NVIDIA driver. Ubuntu 10.10 32-bit.

OSS, even though it's emulated *through* ALSA, consumes a microscopic fraction of straight ALSA's CPU usage. For example, when I'm playing something through ALSA with XMMS2, I'm getting about 25-30% CPU usage, but with OSS output, only 2%, tops. The same applies to any other application in which I can choose the output method. And, I'm on 10.04.

I use an Ubuntu 10.04 minimal install with only the packages I want, I like to keep things pretty slim.

On bootup, when I execute "startx", everything starts to load, and this is fine. However, just off a cold boot with only my startup apps running, conky shows around 450 MB of RAM usage. It's usually only around 150-175 MB.

Sometimes when I boot my computer Conky shows the normal RAM usage in the 150mb range, sometimes it's in the 450mb range.

Now I'm getting confused about my server's memory usage. I have just 3 sites (1 blog and 2 company profiles), but Apache's memory usage is more than 300 MB, and total memory use on my server is more than 500 MB (out of a maximum of 512 MB burst memory).

I am using Drupal for my websites. Is this normal? As of last week, memory consumption on my server was no more than 380 MB.

I have a problem where the modem-manager process eats up all the CPU resources, making it impossible to do anything on my PC. This issue comes up randomly, even during boot, and unfortunately very often. The twist is that I cannot even kill the process using kill or killall (I tried killing it from top as well, with sudo of course).

How can I shut down a process when kill does not help? Is there any way to disable this process from starting during boot?
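One common reason even `sudo kill -9` appears to do nothing is that the process is stuck in uninterruptible sleep (the "D" state), usually waiting on hardware I/O; no signal can take effect until that wait completes. As a minimal sketch (Python, Linux-only, with the PID assumed known), you can check the state from /proc before concluding that kill failed:

```python
# Minimal sketch (Linux-only): check a process's state before killing it.
# A process in "D" (uninterruptible sleep) cannot be killed until its
# blocking I/O finishes, which is why SIGKILL can seem to be ignored.
import os
import signal

def proc_state(pid):
    """Return the one-letter state code from /proc/<pid>/stat (R, S, D, Z, ...)."""
    with open(f"/proc/{pid}/stat") as f:
        # The state field comes right after the parenthesised command name;
        # rsplit on ")" copes with command names that contain ")" themselves.
        return f.read().rsplit(")", 1)[1].split()[0]

def try_kill(pid):
    state = proc_state(pid)
    if state == "D":
        return f"PID {pid} is in uninterruptible sleep; SIGKILL cannot take effect yet"
    os.kill(pid, signal.SIGKILL)
    return f"sent SIGKILL to PID {pid} (state was {state})"

print(proc_state(os.getpid()))  # a process reading its own stat is running: "R"
```

If the process really is in "D" state, the fix is usually on the driver/hardware side rather than in how you send the signal; a zombie ("Z") likewise cannot be killed, only reaped by its parent.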

I just upgraded KDE on my openSuSE 11.2 installation. I have never had any problems doing this in the past, but this time, when I rebooted, I noticed that after a few seconds of idle time, my CPU usage goes sky high. I ran top in a console and noticed the culprit was xorg. I am using an NVIDIA card on an AMD64 3200+ with 1 gig of memory. KDE version is currently 4.5.85. Like I said, I didn't have this problem until the last update. Any ideas as to what could be causing this?

I am calling the clock_gettime() function to get the time with nanosecond accuracy. My program works fine on Ubuntu but has a high CPU usage problem on CentOS 5: it takes 40% CPU on Ubuntu and 90% CPU on CentOS. Kindly give me a solution so that I can reduce the high CPU usage on CentOS. You can build the code like this: "gcc -lrt gettime.c -o gettime.e"
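A common cause here is not clock_gettime() itself (a single call is cheap) but a loop that polls the clock continuously without ever sleeping; CPU usage then just reflects how fast the loop spins on each system. A minimal sketch of the usual fix, shown in Python for brevity rather than the poster's C, is to sleep for most of the remaining interval between clock samples:

```python
# Minimal sketch: keep nanosecond-resolution timestamps from the
# monotonic clock, but sleep between samples instead of busy-polling,
# which is what typically drives CPU usage to 90%+.
import time

def wait_until(deadline, granularity=0.001):
    """Wait until CLOCK_MONOTONIC reaches `deadline`, sleeping instead of spinning."""
    while True:
        now = time.clock_gettime(time.CLOCK_MONOTONIC)
        if now >= deadline:
            return now
        # Sleep for most of the remainder rather than burning CPU polling.
        time.sleep(min(granularity, deadline - now))

start = time.clock_gettime(time.CLOCK_MONOTONIC)
end = wait_until(start + 0.01)  # wait ~10 ms without a busy loop
print(f"waited {(end - start) * 1e3:.1f} ms")
```

The same structure carries over to C: insert a `nanosleep()` (or `clock_nanosleep()`) for most of the remaining interval, then take the final clock_gettime() sample for the precise timestamp.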

I'm using Karmic Koala 9.10 64-bit. I have three HDDs: two WD Raptor 36 GB drives and a Seagate 160 GB. The two WD Raptors are striped (RAID 0, Intel ICH9R), and all drives are formatted ext4. The problem is that when reading from or writing to the disks heavily, the system freezes for a moment or slows down dramatically. When I used Windows XP there was no problem. Moreover, another machine running Karmic Koala 9.10 64-bit with ext3 and no RAID works fine.

The first thing I did when I installed 10.04 was to rearrange my icons. I clicked on the first one and started to drag. The icon did not move, and Nautilus used 94% of my CPU for a good 15-20 seconds, after which gtk-window-decorator used 95-100% of my CPU for a few more seconds. Finally, after a few more unsuccessful tries with the same results, I was able to move the icons.

I assumed this was a bug in nautilus, however the same thing occurred in Chrome when I tried to re-arrange my bookmarks, and the same thing in gnome-panel. The only thing in common I see is gtk-window-decorator.

I am running the latest apache2 available in the Lucid repos on my desktop; all packages are up to date as of this moment. In the root of my web server I have placed several soft links that point to folders on other ext3/NTFS partitions on the same disk. When I try to download any large file (say, above 500 MB) from this server using Firefox, my desktop freezes as soon as the 'save' window appears, and I notice very high CPU, RAM, and disk usage, even though I have not yet clicked 'OK' to save the file. The issue is not present when the file is small. Note that Firefox and the web server are running on the same computer.

I have also tried nginx and lighttpd, and the issue is present there as well. When I tried downloading the same files using Internet Explorer 6.0 in an XP VM, the issue was not present. However, using Firefox on Windows the issue recurs.

I have been using Ubuntu for 4 years now on my decent laptop (2 GB RAM, dual-core Centrino, etc.). Yet in all those years of using this superior OS, I still have to do hard shutdowns because some program runs wild. Lately there are two scenarios where I have to intervene:

1: Amarok crashes and leaves the Python script for the GNOME shortcut keys running at 100% CPU. Or: thunderbird-bin keeps running after an apparently clean close of Thunderbird. That doesn't really bother me; I just kill both processes.

The bigger problem is scenario 2: VLC starts eating all my RAM (for no reason), my swap starts filling, and my computer becomes unusable for 10 minutes. Or: my MATLAB script is too big and eats up too much RAM, with the same result. Note: I have nothing against swap, because at many other times it's very useful.

These are stupid and annoying problems where there is an easy solution:

1) automatically kill the stupid process that runs at 100% CPU
2) automatically kill the stupid process that eats up all my RAM

Is there any program to terminate a process automatically if it uses more than 90-100% CPU for more than 5 seconds? I have a program that opens a web browser, does some things in it, and closes it, but the problem is that sometimes the browser starts using 100% CPU.
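Dedicated tools exist for this, but the underlying idea is simple enough to sketch. The following is a minimal Python illustration (Linux-only, reading /proc directly; the 90% threshold and 5-second strike count are arbitrary choices matching the question): sample a process's utime+stime twice, convert the jiffy delta to a CPU percentage, and send SIGKILL once it stays over the limit for several consecutive samples.

```python
# Minimal sketch of a CPU watchdog (Linux-only): kill a process that
# stays above a CPU threshold for several seconds in a row.
import os
import signal
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # clock ticks (jiffies) per second

def cpu_jiffies(pid):
    """Total user+system jiffies consumed by a PID, from /proc/<pid>/stat."""
    with open(f"/proc/{pid}/stat") as f:
        # Fields after the parenthesised command name; utime and stime are
        # the 12th and 13th of these (fields 14 and 15 of the full line).
        fields = f.read().rsplit(")", 1)[1].split()
    return int(fields[11]) + int(fields[12])

def cpu_percent(pid, interval=1.0):
    """Approximate CPU% of one process, averaged over `interval` seconds."""
    before = cpu_jiffies(pid)
    time.sleep(interval)
    after = cpu_jiffies(pid)
    return 100.0 * (after - before) / (CLK_TCK * interval)

def watchdog(pid, limit=90.0, strikes=5):
    """SIGKILL `pid` once it exceeds `limit`% CPU for `strikes` consecutive seconds."""
    over = 0
    while True:
        over = over + 1 if cpu_percent(pid) > limit else 0
        if over >= strikes:
            os.kill(pid, signal.SIGKILL)
            return
```

Run `watchdog(browser_pid)` from a supervising script right after launching the browser; requiring consecutive over-limit samples avoids killing a process for a brief legitimate CPU spike.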

I installed Debian sid about one month ago (first Xfce, then GNOME) but have noticed that it's really slow. Upgrades take ages, launching (and using) Firefox takes a long time, and so on. Compared to my Ubuntu and Arch Linux installs on the same computer, or to my previous installation of Debian, there is clearly a problem somewhere. Today I ran top sorted by memory usage: 3.5% xulrunner-stub, 2.1% dropbox, 1.4% aptitude (doing an upgrade), 1.4% clementine... nothing terrible, but I still have 2.7 GB of RAM used (more than 50%).

However, I would like to ask if anyone else is experiencing higher-than-normal CPU usage with the openSuSE 11.4 PR. My system, which is probably similar to others, speeds up the CPU fan when it is working hard, and mine gets rather noisy (it's not on the floor, but right next to me). This fan speed increase has been much more prevalent in the newest version of openSuSE.

I have also used 'top', and it looks like Xorg is one of the biggest hogs. However, when I use VMware it goes into hyperdrive a lot.

I'm diagnosing why my application is slow, and I narrowed down the problem by writing a simple TCP client/server where each side sends/receives 64 KiB at a time. The bottleneck is on the receiving end, which has a Realtek 8139 100 Mbit/s NIC and runs Windows 7. I'm able to get the expected 11.9 MiB/s, but the CPU usage is very high: 80% on CPU 0 and ~20% on CPU 1. Task Manager shows conhost.exe taking 25% CPU and my program taking 10%.

When I try the same receiving program on Debian, I can still get 11.9 MiB/s, but the CPU usage is now only 1% (user & system). On the sending side, the CPU usage is practically 0% in both Windows 7 and Debian.
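For anyone wanting to reproduce the measurement, the 64 KiB send/receive test described above is easy to sketch. The following is a minimal Python version run over loopback (the 64 KiB chunk size matches the post; the 4 MiB total is an illustrative stand-in, and real hosts would replace 127.0.0.1):

```python
# Minimal sketch of the 64 KiB-at-a-time TCP test described above,
# run over loopback; substitute real hosts to measure a NIC.
import socket
import threading

CHUNK = 64 * 1024
TOTAL = 4 * 1024 * 1024  # 4 MiB keeps the demonstration quick

def run_pair():
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def sender():
        s = socket.create_connection(("127.0.0.1", port))
        payload = b"x" * CHUNK
        for _ in range(TOTAL // CHUNK):
            s.sendall(payload)
        s.close()

    t = threading.Thread(target=sender)
    t.start()
    conn, _ = srv.accept()
    received = 0
    while True:
        data = conn.recv(CHUNK)  # receive up to 64 KiB per call
        if not data:             # empty read means the sender closed
            break
        received += len(data)
    t.join()
    conn.close()
    srv.close()
    return received

print(f"received {run_pair()} bytes")
```

Timing the receive loop while watching per-core CPU (e.g. with top) makes the Windows-vs-Linux difference described above directly visible.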

I am trying to figure out why the remote X performance of our RedHat 5.3 system is so bad. We have tried running X (a GNOME session) from several different X servers (Xceed and XWin Pro on Windows; Xnest on Linux, both Fedora 11 and CentOS 5.3; and Xnest on Mac OS), and the system is barely usable. I have monitored the network traffic on the RHEL system and it goes up to 6 MB/s at some points, which seems a bit too high for X traffic. I have disabled IPv6 and the ip_tables modules, and that has helped a bit, but it's still not very good. I suspected the network hardware and driver, but I cannot see how they would cause this kind of traffic. I wonder if there are any X server network settings I might check, or whether XFCE would be a better option than GNOME. If so, do I get the xfce group from a CentOS repo, or is there something better suited to RHEL?

For some reason, after I close Firefox, firefox-bin keeps running and eats a lot more CPU than when Firefox was open. The same thing happens with SeaMonkey: seamonkey-bin keeps running after it is closed. Attached are two pictures: 1) while Firefox is running, notice the low CPU usage; 2) with Firefox closed, CPU is almost 50%.