Whenever I am doing processor-intensive tasks on my main computer
(e.g., burning a disc, encoding a movie), I will leave my computer
alone and not do anything else on it until the task is finished (I will
even go as far as shutting off my screen saver). I do this so that as
much processor power as possible can be devoted to performing the task,
and the task can be completed as quickly as possible--not to mention
that I hope to avoid introducing any errors into whatever task is being
performed.

I will, however, work on another computer in my house. If I need a
file from my main computer, I will access it over the network from a
mapped drive I have set up.

A friend has told me that accessing files through the network,
though not as intensive, is almost as bad as working directly on
my main computer while it is performing the burning or encoding. I
disagree; I think that by accessing these files over a network, I am
using only the main computer's NIC and hard drive (and perhaps very
minimal processor power).

Can you please tell me which of us is correct?

•

Sure! You both are.

You see, it really depends on exactly what you're doing. Some of
your scenario leads me to believe that you are right, and other
portions of your scenario have me siding with your friend.

I'll give you the slight edge here because I can't see your
friend's "almost as bad" comment really coming into play.

•

I've actually "lived" this situation myself. I've set my desktop
machine loose on some processor-intensive task like video or audio
encoding, and moved to my laptop to continue my work so as not to place
any additional load on the desktop. And of course, as luck would have
it, there would be some file on my desktop that I needed, so I'd copy it over
the network to the laptop - simple as that.

"In most cases the impact of remote access
across a network is negligible ..."

Did it have any appreciable impact on the encoding or other work
happening on my desktop? Not really.

And that's the situation in most cases. In most cases, the impact of
remote access across a network is negligible for all the reasons you
mention: file access requires very little processor overhead, and
similarly the network interface is also a very low processor-usage
device.

So most of the time, I'm with you. There's no appreciable
impact.

But.

Here are a few things that could, in fact, cause the remote file
copy we're talking about to have a noticeable, though still small,
impact on whatever's happening on your primary machine:

Encoding is more than just processor work. Those
bits have to come from, and be written to, somewhere. So, by accessing
the hard disk remotely in any way, it's conceivable you could slow down
the hard disk accesses required to do whatever it is you asked your
primary machine to do. This can be exacerbated by a fragmented hard
disk as well. While the task you set it to might have it calmly and
quickly reading and writing from some locations on the hard disk that
are near to each other, the remote access could cause the disk head to
need to fly all over the media, slowing it down.
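The seek-contention effect can be sketched, though it's hard to reproduce reliably. The following Python sketch (my illustration, with hypothetical file names, not anything from the article) times a sequential read of two files against interleaved reads of the same two files. On an SSD, or when the files are still sitting in the OS cache (as freshly written files usually are), the two times will be close; on a busy spinning disk reading cold data, the interleaved pattern is noticeably slower because the head must seek between the two files' locations.

```python
import os
import tempfile
import time

CHUNK = 1024 * 1024   # 1 MiB per read/write
N_CHUNKS = 32         # 32 MiB per file; small enough for a quick demo

def make_file(path, chunks):
    with open(path, "wb") as f:
        for _ in range(chunks):
            f.write(os.urandom(CHUNK))

def read_sequential(path):
    with open(path, "rb") as f:
        while f.read(CHUNK):
            pass

def read_interleaved(path_a, path_b):
    # Alternate reads between two files; on spinning media this forces
    # the disk head to seek back and forth between the two layouts.
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a = fa.read(CHUNK)
            b = fb.read(CHUNK)
            if not a and not b:
                break

tmpdir = tempfile.mkdtemp()
f1 = os.path.join(tmpdir, "encode_input.bin")   # hypothetical names
f2 = os.path.join(tmpdir, "remote_copy.bin")
make_file(f1, N_CHUNKS)
make_file(f2, N_CHUNKS)

t0 = time.perf_counter()
read_sequential(f1)
read_sequential(f2)
seq = time.perf_counter() - t0

t0 = time.perf_counter()
read_interleaved(f1, f2)
inter = time.perf_counter() - t0

print(f"sequential: {seq:.3f}s  interleaved: {inter:.3f}s")

os.remove(f1)
os.remove(f2)
os.rmdir(tmpdir)
```

The interleaved pattern is roughly what happens when a remote file copy competes with an encode reading from the same disk.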

The disk might be slower than you think. As I
discovered when my gigabit network wasn't giving me gigabit throughput,
disks themselves are occasionally the choke point for throughput. If
the task you've set your machine to brings it close to that limit, then
a remote access adding more work could push it over the edge.
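You can get a rough number for your own disk's ceiling with a quick sustained-write measurement. This is a sketch, not a proper benchmark (a real one would use larger sizes and multiple runs), but the `os.fsync` call at least pushes the data past the OS cache to the device:

```python
import os
import tempfile
import time

SIZE_MB = 64
buf = os.urandom(1024 * 1024)  # 1 MiB of data, written repeatedly

fd, path = tempfile.mkstemp()
t0 = time.perf_counter()
with os.fdopen(fd, "wb") as f:
    for _ in range(SIZE_MB):
        f.write(buf)
    f.flush()
    os.fsync(f.fileno())   # force the data to the device, past the OS cache
elapsed = time.perf_counter() - t0
os.remove(path)

mb_per_s = SIZE_MB / elapsed
print(f"sustained write: {mb_per_s:.0f} MB/s")
# Gigabit Ethernet tops out around 125 MB/s, so a disk slower than
# that is the real choke point, not the wire.
```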

The network card might not be as good as you might
want. Certainly most don't place an undue load on the
processor, but that's most. I'm sure that there are inexpensive
interfaces that have a higher CPU impact than we might like.

You might be asking for a lot of data. Naturally,
the impact on the remote machine, whatever it is, will likely be in
proportion to the size of the files you're accessing, and how you're
accessing them. A simple file copy of a small file? No problem. A
larger file? Perhaps there will be a bit more impact. Accessing a database file
hosted on that machine might well involve a lot more file interaction than
you expect.
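A back-of-envelope estimate makes the proportionality concrete. The helper below is my own illustration; the 0.7 efficiency factor is an assumption standing in for protocol overhead and disk stalls, not a measured figure:

```python
def transfer_seconds(file_bytes, link_mbps, efficiency=0.7):
    """Rough transfer time over a network link.

    efficiency is an assumed fudge factor: real-world throughput
    runs well below a link's rated speed."""
    usable_bytes_per_s = link_mbps * 1_000_000 / 8 * efficiency
    return file_bytes / usable_bytes_per_s

# A 2 MB Word document over 100 Mb Ethernet: a fraction of a second.
print(f"{transfer_seconds(2e6, 100):.2f} s")
# A 4.7 GB DVD image over the same link: several minutes, and several
# minutes of extra disk activity on the machine serving the file.
print(f"{transfer_seconds(4.7e9, 100) / 60:.1f} min")
```

The small copy is over before the encode notices; the big one keeps the remote machine's disk busy long enough to matter.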

So yes, there are scenarios where your friend might be somewhat
right.

But in general and most commonly, the impact is negligible.

And certainly not "almost as bad as" doing whatever directly on the
same machine. (Though even that is typically less than we think,
depending on what "whatever" is.)

Leo A. Notenboom has been playing with computers since he
was required to take a programming class in 1976. An 18-year career as a programmer at Microsoft soon followed.
After "retiring" in 2001, Leo started Ask Leo! in 2003 as a place for answers
to common computer and technical questions. More about Leo.

It really depends on how much you're doing, and how fast your system is. Transferring a few Word documents across the network is trivial. Transferring several GB of data across the network can be another story.

It depends on what kind of NIC you have (cheap ones put vastly more load on your CPU; better ones, like Intel PRO/1000 cards, do much more on the NIC itself and don't rely nearly as much on the CPU), and how fast your system is. On a reasonably new system on a 100 Mb network, file copies of even large amounts of data should have an insignificant impact on system performance. On a gigabit network, or with an older system, pushing significant amounts of data can have a major impact on system performance.
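One way to see the software side of that per-byte CPU cost is to push data through a loopback socket and watch process CPU time. This sketch (my illustration) deliberately bypasses the NIC, so it shows only the kernel's and Python's work per byte; a real NIC with checksum/segmentation offload shifts some of that to hardware, while a cheap NIC leaves it on the CPU:

```python
import socket
import threading
import time

PAYLOAD = b"x" * (1024 * 1024)   # 1 MiB per send
N = 128                          # 128 MiB total

def drain(listener):
    # Accept one connection and read until the sender closes it.
    conn, _ = listener.accept()
    with conn:
        while conn.recv(65536):
            pass

srv = socket.socket()
srv.bind(("127.0.0.1", 0))       # loopback, ephemeral port
srv.listen(1)
t = threading.Thread(target=drain, args=(srv,))
t.start()

cli = socket.create_connection(srv.getsockname())
cpu0, wall0 = time.process_time(), time.perf_counter()
with cli:
    for _ in range(N):
        cli.sendall(PAYLOAD)
t.join()
srv.close()

# process_time() counts CPU used by both threads (sender and
# receiver), since they run in the same process.
cpu = time.process_time() - cpu0
wall = time.perf_counter() - wall0
print(f"moved {N} MiB: wall {wall:.2f}s, CPU {cpu:.2f}s")
```

If CPU time is a small fraction of wall time on your machine, file serving really is cheap for the processor, which is the common case.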

The days of worrying about killing a CD or DVD burn because your system can't keep up are pretty much long gone. This isn't 1996 anymore, when that was truly a concern. With multi-core processors being the standard, and most applications being single-threaded so they only use a single core, your system should stay highly responsive even while doing these types of things. And there isn't any reason to worry about something going wrong if you use the computer for something else - the worst that can happen, unless you're using a buggy application, is that it will take longer to complete.

Ramesh Sahoo
December 30, 2008 11:28 PM

I want to know the procedure for installing Exchange Server 2003 and the required service packs for it.

Richard
January 2, 2009 9:11 PM

Even the newest CD/DVD burning applications still suggest in the help file that:
a) the disk be defragmented before attempting the burn
b) the burning application be the sole application running while a CD/DVD is being burned
c) any anti-virus software's "on-access" scanner be turned off, or ignore rules be defined for the types of files being burned (e.g. .wav, .mp3, and .wma audio files can be safely ignored by the on-access scanner)

I also suggest that all desktop PCs have more than one hard drive installed, and that all burning be done from copies of files on the second disk, just in case you or your system decides to do something else with the C:\ drive -
e.g. System Restore decides that it's time to create a restore point right in the middle of your CD/DVD burn, and the disk has a fit jumping back and forth between the files being read for the burn and the
restore point being created. Also, nearly everything passes through the swap/page file, so having two hard drives raises overall system performance.

Remotely retrieving a huge file from the same disk while burning or encoding isn't a good idea; even if the system had 8 CPUs, the bottleneck would still be the drive's performance threshold.

I've remotely retrieved small files many times with no ill effect while doing live audio recording, but the recording is going to another disk and not the one I'm accessing.

Chris
January 2, 2009 11:43 PM

I think we need a refresher on the job of the processor. When moving a file, the processor's job is to transfer the bits of data from the hard drive to the output (your NIC), so you are in fact using the data bus to move data from your hard drive to your CPU's cache; calculations are then done in RAM to create the data packet, which goes back through the CPU and out to your network card. Think of it as a map with your CPU in the middle, since it is the component that transfers data and does calculations.

Now, where you see the burden greatly lowered when copying/accessing a file over the network versus on the source machine is in the GUI calculations. For example, when you double-click a Word document on the source machine, your CPU must load the interface you see on the monitor as well as access the data in the actual document. What happens? You are not just loading the data in the document to the CPU; rather, you are loading the data plus extra information so you can view it on the same machine. You will quickly bottleneck, as this type of software is much more CPU-oriented (lots of calculations that require access to RAM as well).

Now, when you access the file over the network, you are only moving the actual data in the document to the CPU and then executing the Word program on the remote machine. The result: you can save up to 80% of the load on the source machine's CPU if you open the file remotely, but you will use the same amount of CPU load on the remote machine. More work overall, but less work on the source machine. This is exactly what you want if you are running a CPU-intensive program on the source machine, like encoding a video.

Nicholas Gimbrone
January 3, 2009 9:04 AM

Disk I/O is a common thread here, and disks are (still) pretty slow. The other common thread is the memory consumed: remember, most operating systems like to use memory caching of recently read documents, as well as needing to do simple buffering of the disk and network I/O (if nothing else). Both of these effects can lead to an interaction between the needs of the network file access and other processes (such as DVD burning).
