In other words, if it is not an AGP or PCI-X video card/chip, I do not believe that DRI is going to work...
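A quick way to check whether DRI is actually active, for what it's worth (this assumes X is running and that glxinfo from the Mesa utilities is installed):

glxinfo | grep "direct rendering"

If it prints "direct rendering: Yes", DRI is working; "No" means X has fallen back to software rendering.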

Sorry about that, I was really tired when I posted that. It didn't make sense to me, either... lol

No worries

The card is on an AGP port; I think it's a 4x port.
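(If you want to confirm the rate rather than guess, lspci can usually show it. A rough sketch; the exact wording of the output varies by chipset, and on some systems lspci lives in /sbin:

/sbin/lspci -vv | grep -i "rate="

Look for something like "Rate=x4" under the video card's entry.)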

Well, to be honest, I've just looked around at how many fps people were able to pull from this card under Windows, and guess what: they talk of something like 100-150 fps max all the time... So compared to what I get, it's actually not that bad!

800x600 at 16 bits is a common setting. Multiply 800x600x16; that's how many bits the GPU/video hardware has to push around for every frame it draws.

Now try the math for 1024x768 at 32 bits: 1024x768x32. Many, many more bits per frame.

So common sense dictates that the lower resolution and bit depth make things faster, which is true, up to a point. The major drawbacks are that lower bit depths make things look weird and lower resolutions make everything HUGE. So you have to make tradeoffs.
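Just to put numbers on that (a quick sketch at the shell; this only counts the raw framebuffer bits per frame, ignoring textures, overdraw and the rest):

echo $((800*600*16))    # 7680000 bits per frame at 800x600, 16 bit
echo $((1024*768*32))   # 25165824 bits per frame at 1024x768, 32 bit

So the higher setting is roughly 3.3 times more data to move for every frame drawn.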

I was wondering if the NVIDIA drivers still need to be patched. I'm using a 2.6.17 kernel now and haven't needed to put the drivers in just yet, but I was going to do it soon. I keep an already-patched executable around so I don't have to keep redoing it, but it's old and I want newer ones.

Should I be patching the drivers to have them work properly?

The little I have found on this tells me no... but I figured I would ask.

Yes, it works out of the box on Slackware 10.2 with the default 2.4.31 kernel, the 2.6.13 kernel, and 2.6.16.20; no patches needed.
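For what it's worth, the "out of the box" route is just running the stock installer with no patch at all, from a console with X shut down. A minimal sketch; the filename below is only an example, use whichever .run you actually downloaded:

sh NVIDIA-Linux-x86-1.0-8762-pkg1.run

If the installer builds its kernel module cleanly against your kernel, no patching is needed.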

Quote:

Originally Posted by Old_Fogie

Question 2: Is the Slack testing kernel meant to be used with the script at the front of the page too? Just wondering; I wasn't sure whether that was for kernels above 2.6.16 or not, the way I read it.

The script at the beginning of the thread is not needed for the 2.6.13 testing kernel.

The script at the beginning wasn't needed for anything I ran into; I guess I'm lucky.

Now I do have one good question.

How do I get rid of this obnoxious NVIDIA full-screen branding of my PC that shows just as X starts, before the log-on screen?

I'm shocked no one appears to have an issue with this. If anyone has the right to brand my "box", it's Pat, or Linus, or GNU, and NOT NVIDIA! Heck, linuxquestions.org has more right to brand my box than they do. How do I get rid of it?
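In case it helps: with the NVIDIA binary driver, the splash screen can be switched off with the NoLogo option in the Device section of your X config. A minimal sketch, assuming an /etc/X11/xorg.conf setup; keep your existing Identifier and just add the Option line:

Section "Device"
    Identifier "NVIDIA Card"
    Driver     "nvidia"
    Option     "NoLogo" "true"
EndSection

Restart X and the full-screen logo should be gone.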