Jaap Brasser’s Favorite PowerShell Tips and Tricks

When I met up with Aleksandar Nikolic at TechEd Europe I was asked if I wanted to write an article for PowerShell Magazine. I hope my tips are useful to you and feel free to leave a comment or question if you would like me to clarify anything.

Using -ErrorVariable to quickly check effective permissions for a user

When working with permissions it is useful to know which folders a user can access and which folders are denied. Although the Windows GUI does offer an effective permissions view, this is hardly practical when a large folder structure needs to be checked. This is where Get-ChildItem comes into play:
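A minimal sketch of such a command; the folder path here is just an example, and the variable name passed to -ErrorVariable is the one the rest of this tip refers to:

```powershell
# Recurse through the folder tree; any errors (access denied and others)
# are collected in $AccessDenied instead of stopping the listing.
# Note: -ErrorVariable takes the name without the $ prefix.
Get-ChildItem -Path 'C:\Shares\Finance' -Recurse `
    -ErrorVariable AccessDenied -ErrorAction SilentlyContinue
```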

This will display all files and folders that are accessible, and store the errors for all inaccessible folders in the $AccessDenied variable. For the purpose of this example we will focus on the files and folders that generated an error, so we suppress the regular output of Get-ChildItem by redirecting it to $null:
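The same command as before, with the regular output discarded; again the path is only an example:

```powershell
# Redirect the normal output to $null; only the errors collected
# in $AccessDenied remain for us to inspect
Get-ChildItem -Path 'C:\Shares\Finance' -Recurse `
    -ErrorVariable AccessDenied -ErrorAction SilentlyContinue > $null
```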

This gives us a list of all errors generated by Get-ChildItem, which conveniently includes every path that was inaccessible.

$AccessDenied | ForEach-Object {$_.Exception}

This command displays all the inaccessible paths including the error messages. To display just the paths we can use the TargetObject property.

$AccessDenied | ForEach-Object {$_.TargetObject}

This will display the names of the files and folders that were inaccessible. Because this catches all errors, there might be reasons other than permissions restricting access to a file or folder. It is therefore important to verify the errors before assuming why access to a folder was denied.

As a bonus, since we should already be using PowerShell v3, we can shorten the last two commands:

$AccessDenied.Exception
$AccessDenied.TargetObject

Parse Robocopy output in PowerShell to find long path names – Workaround for 260 character limit in Windows

The specified path, file name, or both are too long. The fully qualified file name must be less than 260 characters, and the directory name must be less than 248 characters.

The dreaded error; I think everyone working with Windows has run into it at some point. Even in Windows 8/Server 2012 we are still limited by this. It might be a file server with an exotic folder structure or just a single file with a long name that we cannot get rid of. This tip will show how to use Robocopy to detect the files and folders that have a long path.

Let’s start out with a simple example displaying the difference in speed between using Get-ChildItem in combination with Measure-Object and using Robocopy to count the number of files and the total size of a folder:
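A sketch of the comparison; the folder path is an example, and the dummy destination `NULL` is only there because robocopy requires a destination argument even in list-only mode:

```powershell
# Count files and total size with Get-ChildItem and Measure-Object
Measure-Command {
    Get-ChildItem -Path 'C:\Deeppathtest' -Recurse -File |
        Measure-Object -Property Length -Sum
}

# The same information from robocopy in list-only mode:
# /L = list only, no copying; /E = include subfolders; /BYTES = sizes in bytes;
# /NFL /NDL = suppress the per-file and per-directory listing; /NJH = no job header
Measure-Command {
    robocopy 'C:\Deeppathtest' 'NULL' /L /E /BYTES /NFL /NDL /NJH /R:0 /W:0
}
```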

Both these commands generate similar output, but by using Robocopy we manage to get the results several times faster.

Now on to the more interesting stuff: using Robocopy to detect long file and folder paths. Since Robocopy produces predictable output, we can parse it to extract the information we require. In this scenario we will use Robocopy to find files with a path longer than the maximum allowed. The next example will output a list of files that have a path longer than 260 characters:
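A sketch of the parsing approach, assuming an example folder; the exact set of robocopy switches is one reasonable combination, not the only one:

```powershell
# /L lists what would be copied without copying anything; /FP logs full
# paths; /NC /NS /NP suppress the class, size, and progress columns;
# /NJH /NJS drop the job header and summary so only paths remain
$RoboOutput = robocopy 'C:\Deeppathtest' 'NULL' /L /E /B /FP /NC /NS /NP /NJH /NJS /R:0 /W:0

# Each remaining line holds one full path; keep those over 260 characters
$RoboOutput |
    ForEach-Object { $_.Trim() } |
    Where-Object { $_.Length -gt 260 }
```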

To simplify this command I have written a short function to automate this process. It can be downloaded from the TechNet Script Repository: Get-LongPathName.ps1

Using this function, we can search for files and folders with long pathnames. The function outputs an object that contains the PathLength, Type (File or Folder) and the FullPath to the file or folder. Here is an example of how to utilize this function:

Get-LongPathName -FolderPath 'C:\Deeppathtest' -MaxDepth 200

This will output all files and folders with a path longer than 200 characters in the C:\Deeppathtest folder. And because this function returns objects, the output can be sorted and otherwise manipulated, making it easy to work with.
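For instance, the objects can be sorted on the PathLength property mentioned above; this sketch assumes the function from the repository is loaded:

```powershell
# Show the ten deepest paths first, using the PathLength, Type and
# FullPath properties that Get-LongPathName emits
Get-LongPathName -FolderPath 'C:\Deeppathtest' -MaxDepth 200 |
    Sort-Object -Property PathLength -Descending |
    Select-Object -First 10 -Property PathLength, Type, FullPath
```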

The time it takes the robocopy command to run will be greatly affected by whether it is run first or second due to caching that takes place after the first run. Try reversing the order of the commands and you will see robocopy start to fall behind.

Maybe I’m missing something here. Using this method I can generate a list of the folder paths that are too long for PowerShell to use. What do I do next, just say “Oh well, I just won’t go to those folders”?
I mean, what’s the point? I guess knowing which folder paths you can’t use is better than not knowing, but it’s not better than a workaround that actually lets you use them.

That is correct, it is intended to detect the long paths rather than resolve them.

There are a number of options available for long paths; the simplest is using robocopy to relocate folders to a path that is not as deep. Alternatively you could use symbolic links or shared folders to allow access to the deep paths. There are also alternative APIs, such as QuickIO and AlphaFS, that you could use in PowerShell to access the long paths.


Jaap, I’m in an odd position where I need to do a Get-ACL on each subfolder in the file structure (which is deep and massive) to determine what Active Directory Groups are used.
I’m able to access the folder names using robocopy.exe and export them to CSV but when I attempt to access the ACLs I’m running into an issue.
Currently I am mapping a drive to each folder found and doing a -recurse to give me a depth of 520 but it results in the script taking forever to complete.
Do you have any advice?

Absolutely, the mapped drive method is one way, another slightly more efficient method would be to use symbolic links instead of mapped drives. In PowerShell 5.0 the New-Item cmdlet supports the creation of symbolic links.
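A sketch of creating such a link with New-Item; both paths are hypothetical, and the command needs an elevated PowerShell 5.0 session:

```powershell
# Create a symbolic link at C:\DeepLink that points into a deeply
# nested folder, shortening the effective path for subsequent cmdlets
New-Item -ItemType SymbolicLink -Path 'C:\DeepLink' `
    -Target 'C:\Very\Deep\Nested\Folder\Structure'
```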

As for gathering ACLs that are in deep paths, I would recommend you attempt to use the File System Security PowerShell Module, it allows for deep paths as it utilizes a different API, the AlphaFS project, to access files and folders. Have a look at it here and let me know if that solves your problem:
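A sketch of how the module might be used for this, assuming it is already installed; Get-ChildItem2 and Get-NTFSAccess are cmdlets shipped with the NTFSSecurity module, and the path is an example:

```powershell
Import-Module NTFSSecurity

# Get-ChildItem2 uses the AlphaFS API and can traverse paths beyond the
# 260-character limit; pipe each folder into Get-NTFSAccess for its ACL
Get-ChildItem2 -Path 'C:\Deeppathtest' -Recurse -Directory |
    Get-NTFSAccess
```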

Hi Jaap
I am not a very experienced scripting guy or programmer, but my issue is this.
I have downloaded the NTFS Security modules and installed them, and I am trying to do the same thing as Matt.
My question is:
how are these modules meant to help if, in order to get Get-NTFSAccess output, I need to run Get-ChildItem?
Did I not understand something?
I also read your post about using robocopy to bypass this limitation, but I can’t figure out a way to link the robocopy source path to something like a variable instead of the destination path, then recall the variable in PowerShell and finally run Get-NTFSAccess.
Thanks a million for any help

Hi RBA
You can import the AlphaFS module (Get-AlphafsChilditem). It worked for me for a while, then stopped, and I did not have time to figure out why. But it did work a couple of times.

Indeed Vinc, the problem with AlphaFS is that it might introduce other errors and bugs. If you are experiencing problems with the Get-AlphafsChilditem you could check the project page to see if your problems are known issues.

Hi Jaap
this is a good article
I am trying to do the same as Matt, using robocopy to get all the files and folders, but cannot get it working properly.
What I would like to do is get a list of all the files and folders, then run Get-ACL and find the ones without permission.
Can you give me any help?