Use PowerShell to Copy Files to a Shared Drive

Microsoft Scripting Guy, Ed Wilson, is here. Well, it is Monday in Charlotte, North Carolina, in the United States. Today is a cardio day. I spend the day running around going from meeting to meeting. I also spend a significant amount of time jumping through hoops to meet various deadlines for items that have no lead-time. The end result is a great workout that expends several hundred calories. Like running on a treadmill, it is a bit difficult to see any actual forward progress. But hey, such things are often necessary.

Anyway, with a significant amount of time taken up by the system idle process, it is important that the remaining processes are efficient. One problem I have always had involves finding scripts I have written. For one thing, I have a hard time remembering what scripts I have written, and if I do not remember having written a script, it is hard to search for it. To put it another way, I often end up browsing for my scripts rather than searching for them. This is one reason I give my scripts such descriptive names: to make it easier for me to recognize them once I find them.

In the image that follows, I show the script folder that contains the scripts I wrote for the Windows 7 Resource Kit that was published by Microsoft Press.

For most of the chapters, I wrote an average of 15 scripts. But for some of the chapters, I did not write any scripts, and for other chapters I wrote as many as 40 scripts. In addition, each collection of scripts is related to a particular topic. Therefore, if I need to find the script I wrote that sets a static IP address, subnet mask, default gateway, and DNS server, I spend a lot of time clicking, or I use search to attempt to find the script.

For me, anyway, it is easier to look in a single folder for a script titled something like Set-StaticIPAddress.ps1. The thought of clicking through 35 folders and copying and pasting to another folder, however, really creeps me out; not to mention that my wrist and clicky finger would probably give out about halfway through the process. No, this is a job for Windows PowerShell, not for the mouse. Yep, the pen is more powerful than the sword, and Windows PowerShell is more powerful than the mouse.
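A command along the lines of the following does the job of gathering the scripts into one place. This is a sketch of the approach described, not the exact command from the post: the destination path Z:\ScriptShare is a hypothetical stand-in for the shared drive.

```powershell
# Find every .ps1 file under the Scripts folder and copy it to one place.
# Z:\ScriptShare is a hypothetical destination; substitute your own share.
dir C:\data\BookDOcs\Win7ResKit\Scripts\ -Filter *.ps1 -Recurse |
    Copy-Item -Destination Z:\ScriptShare
```

Because every matching file is copied to the same destination folder, the result is a single flat folder of scripts instead of 35 nested ones.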

There are a couple of things to keep in mind about this command. The first thing is that when I specify the path to copy the files, wildcards are permitted. Therefore, one might expect that a command such as the following would work. It does not, because it points to a specific set of files; the path must point to a folder that serves as the starting point.

dir C:\data\BookDOcs\Win7ResKit\Scripts\*.ps1 -Recurse

In my command, I could have used the Include parameter instead of the Filter parameter because the Include parameter modifies the Path parameter. Therefore, the command that is shown here states that I want to start at the \scripts directory and burrow down until I reach the bottom (that is the Recurse portion of the command). I then want to include only the files that end with an extension of ps1. When you use the Include parameter, you need to use the Recurse switch for it to be effective.

dir C:\data\BookDOcs\Win7ResKit\Scripts\ -Recurse -Include *.ps1

Instead of using the Include parameter, I decided to use the Filter parameter. The idea is that the Filter should be more efficient because the provider should filter the files before returning them to Windows PowerShell, instead of returning everything to Windows PowerShell and causing Windows PowerShell to do the filtering.
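Rewritten to use the Filter parameter, the command from earlier looks something like this:

```powershell
# The FileSystem provider applies the *.ps1 filter while enumerating,
# instead of returning everything for Windows PowerShell to filter.
dir C:\data\BookDOcs\Win7ResKit\Scripts\ -Filter *.ps1 -Recurse
```

The output is the same set of files either way; the difference is where the filtering work happens.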

To test this idea, I use the Measure-Command cmdlet, timing the Include version of the command first.
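The two timing commands take this general shape (with the paths from the earlier examples):

```powershell
# Time the Include version, then the Filter version, of the same search.
Measure-Command { dir C:\data\BookDOcs\Win7ResKit\Scripts\ -Recurse -Include *.ps1 }
Measure-Command { dir C:\data\BookDOcs\Win7ResKit\Scripts\ -Filter *.ps1 -Recurse }
```

Measure-Command returns a TimeSpan object, so the elapsed time appears in properties such as TotalMilliseconds and TotalSeconds.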

The results state that the command took 107 milliseconds, which is 19 milliseconds longer than the Filter version. Keep in mind that the Measure-Command cmdlet is not accurate at the millisecond level; therefore, the results essentially state that the two commands took basically the same amount of time.

Keep in mind that this was a small test; it is certainly not conclusive, and you should not rely on it when you need to move massive amounts of data. But for small operations, such as the one I just performed, use either the Filter or the Include parameter, whichever one you are most comfortable with. After all, if it takes you an extra five minutes to get your command working just because you think that Filter will be faster, you have squandered your 19-millisecond advantage big time.

I hope you have a great day and an awesome week. I look forward to seeing you tomorrow.

Just to help you out 🙂 I got some completely different numbers from a recursive search on a network drive! The Filter approach took

TotalSeconds : 22,5731829

The Include variant took

TotalSeconds : 44,8390221

So just about double the time of the first solution!

And the simple DOS query

Measure-Command {cmd /c dir H:Script*.ps1 /s }

took only: TotalSeconds : 6,4462931

Strange? Well, the comparison is not really that fair, because we get back objects from PowerShell and text from DOS! But anyway, the .NET Framework 2.0 classes used to traverse the file system aren't very efficient, and Windows PowerShell builds upon them! That's the reason why I sometimes fall back to using the Scripting.FileSystemObject COM object, which is particularly efficient when it comes to handling a large number of files and directories! I think times may change, and PowerShell 3.0 may do better with the .NET Framework 4.0 as its basis.
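As an illustration of the commenter's fallback, here is a minimal sketch of enumerating files with the Scripting.FileSystemObject COM object from Windows PowerShell. The C:\Scripts path is a hypothetical example, and this is one possible way to use the object, not the commenter's exact code:

```powershell
# Minimal sketch: list .ps1 files in one folder by using the
# Scripting.FileSystemObject COM object instead of Get-ChildItem.
# C:\Scripts is a hypothetical path; substitute your own.
$fso = New-Object -ComObject Scripting.FileSystemObject
$folder = $fso.GetFolder('C:\Scripts')
foreach ($file in $folder.Files) {
    if ($file.Name -like '*.ps1') { $file.Path }
}
```

To recurse, you would walk the $folder.SubFolders collection the same way; the trade-off is that you get plain COM objects back rather than the richer FileInfo objects that Get-ChildItem returns.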