Also, for debugging:
If you put the script into a function and add [CmdletBinding()], you can run it with -Debug.
That means you can add lines like Write-Debug "variable is $variable"; those only show up when -Debug is specified, and in Windows PowerShell each one pauses for you. You can also add Write-Verbose "message"; those only show up when -Verbose is specified.
To do this while still running it as a .ps1 script with arguments, you'd have to be a little tricky. If you made the script a .psm1, added an Export-ModuleMember -Function funcName; line at the end, and then ran ipmo -Force -Global \path\to\psm1 (Import-Module) in a PowerShell console, you could run your script as a function that you can import into any script or console, and pass -Debug and/or -Verbose directly when running the function.
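As a minimal sketch of the idea (the function name, parameter, and module file name here are just examples):

```powershell
# Example function with CmdletBinding so the common -Debug/-Verbose switches work
function Install-MyApp {
    [CmdletBinding()]
    param([string]$Share)

    Write-Verbose "Only shown when -Verbose is passed"
    Write-Debug   "Share is $Share"   # only shown when -Debug is passed (pauses in Windows PowerShell)
}
Install-MyApp -Share '\\server\share'
```

If you save that in a .psm1 ending with Export-ModuleMember -Function Install-MyApp, then after Import-Module -Force -Global \path\to\psm1 you can run Install-MyApp -Share '\\server\share' -Verbose -Debug from any console or script.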

@lebrun78 We had no idea this wasn't a Windows network share (if you already mentioned that, I overlooked it, sorry). Most people use them, so we assume that by default. I may be wrong here, but that would point to more of a script issue. I think @JJ-Fullmer is working on a post in this thread about mounting a share in PowerShell as SYSTEM. It could also be how you get your certificate from the cert store (cert:\CurrentUser\TrustedPublisher). SYSTEM is sometimes not considered a user, and is instead considered the LocalMachine in the cert store. Basically, I would recommend adding some debugging statements to your script and seeing where it's failing (e.g. whether it's getting the cert correctly, or whether it's just the mounting code that's going wrong).

If the SYSTEM account isn't able to access the cert store, you can also create an AES key instead of a cert. The key can be stored like the secure password file I mentioned in my other post, thus keeping the security of the PowerShell credential objects.
i.e.
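Here's a rough sketch of generating such a key, assuming a 256-bit key and a comma-separated text file as the storage format (the file path is just an example; in practice, put it somewhere only SYSTEM/admins can read):

```powershell
# Generate a random 32-byte (256-bit) AES key and store it in a file
$keyFile = Join-Path $env:TEMP 'fog-aes.key'   # example path
$tab_key = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($tab_key)
$tab_key -join ',' | Set-Content $keyFile

# Later, in the script, load the key back:
$tab_key = (Get-Content $keyFile) -split ',' | ForEach-Object { [byte]$_ }
```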

You can then use a variable pointing to that file in place of $tab_key in your script.

Also, something else I just remembered: a great way to troubleshoot PowerShell and batch scripts being run as the SYSTEM account is using PsExec to open a PowerShell prompt as the SYSTEM user.
You can download PsExec as part of PsTools from https://technet.microsoft.com/en-us/sysinternals/bb897553.aspx
Then you can run the following to open an interactive PowerShell console as the SYSTEM user to test your scripts:
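Something along these lines, run from an elevated prompt (assumes psexec is on your PATH, otherwise use its full path):

```powershell
# Open an interactive (-i) PowerShell console as SYSTEM (-s) using PsExec
psexec -i -s powershell.exe

# In the new window, confirm who you are:
# whoami    ->  nt authority\system
```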

@lebrun78
So, looking at how you map the drive, I think that may be where the problem lies. When a PowerShell script mounts a drive, the mount defaults to the scope of the script that created it; in other words, it won't persist once the script ends.
My recommendation is to change it up a little by using New-PSDrive with a few specific arguments:

if (!(Test-Path -Path 'P:')) {
    New-PSDrive -Name 'P' -Scope Global -PSProvider FileSystem -Persist -Root $serveur -Credential $credential;
    # Note: -Name takes the drive letter without a colon.
    # The -Persist switch and -Scope Global ensure that your drive stays mounted outside of the script.
    # This also lets you use your PowerShell credential object without converting it to plain text
    # when it's used in the command (which is what .GetNetworkCredential().Password does).
}

Then at the end of the script you do this to remove the PSDrive:

Remove-PSDrive -Force -Scope Global -Name 'P';

One other thing you may need to do is set EnableLinkedConnections. It's a registry value at
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System\EnableLinkedConnections

If you create and set that value as a REG_DWORD with value data 0x1 (1), all mounted drives become visible across elevation levels. By default, drives mounted by a user in standard (non-elevated) mode won't show up if you run Get-PSDrive in an admin prompt, and vice versa. Setting that registry value makes them visible to one another, which ensures the SYSTEM account that runs the FOG service will be able to access the mounted drive. This security "feature" of elevated/non-elevated sessions not seeing each other's drives is part of Windows UAC.
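Setting that value could look something like this, run from an elevated PowerShell (a reboot is typically needed for it to take effect):

```powershell
# Create/overwrite the EnableLinkedConnections DWORD value (requires admin rights)
New-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System' `
                 -Name 'EnableLinkedConnections' -PropertyType DWord -Value 1 -Force
```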

Also, a side note: you can pipe the secure password to a file stored on the local machine so you don't have to create the secure string each time or have the password in plain text anywhere. Just make sure the file can only be read by the SYSTEM and admin accounts, e.g. add a hidden folder to the FOG service called keys, or secure, or even just ~, and lock it down to allow only admin access. Then, as part of your image, you can store secure password files for mounting drives with PowerShell without needing to put your password in plain text in any file. You handle the key file the same way, since it's needed for decryption, and you just put the path to the password file in your script. Since you have to be an admin to run the secure-string commands and to access the password file and the cert file, there's not really a risk of a non-admin being able to see the decrypted password. Just a thought that might help simplify writing scripts; personally, I made a module for mounting shares with PowerShell that I use in my install scripts.
The contents of the password file would look like what you are putting into your $pwd string variable, so you could essentially copy and paste that into a text file. Or run:
$pwd | ConvertFrom-SecureString -Key $tab_key | Out-File "C:\Program Files (x86)\Fog\secure"
You could put the password file anywhere you want, really.
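The whole round trip could look roughly like this; $tab_key is the AES key from earlier, and the password, username, and file path are placeholders:

```powershell
# Demo key; in practice, load your stored 32-byte key instead
$tab_key = [byte[]](1..32)
$pwFile  = Join-Path $env:TEMP 'fog-secure.txt'   # example path

# One-time setup (run interactively as admin):
$securePw = ConvertTo-SecureString 'ExamplePassword' -AsPlainText -Force
$securePw | ConvertFrom-SecureString -Key $tab_key | Out-File $pwFile

# Later, in the install script, rebuild the credential from the file:
$pw = Get-Content $pwFile | ConvertTo-SecureString -Key $tab_key
$credential = New-Object System.Management.Automation.PSCredential('DOMAIN\user', $pw)
```

The on-disk file only contains the AES-encrypted blob, so it's useless without the key file.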

One other note that might be causing problems: try not to use $pwd as a custom variable name, because it is already an automatic variable that holds the current working directory (the output of Get-Location). So it's possible, although maybe not super likely, that your $pwd variable could be overwritten with your working directory. Try changing it to $pw instead.

@lebrun78 That indicates it's a network share permission issue or a script issue, as we have been saying. While that may work, you'd have to alter every machine, and it's a workaround for the underlying issue. If it's a network share permission problem, fixing the share permissions to allow SYSTEM access (even if just to a single public folder) is the route we recommend. If it's an issue with how you mount/decrypt your share, then the script just needs to be made SYSTEM-compatible. The client was built with SYSTEM permissions in mind, and therefore I cannot vouch for the security, or functionality, of the client running as a different user.

@lebrun78 Snapins run as SYSTEM (which is something like a limited root on Unix). If a script works when run manually but not as a snapin, then your script is not SYSTEM-compliant. Usually it's an issue when someone tries to use network shares, which it appears you are doing. SYSTEM is a non-domain user, so if your network share requires a domain user to access it, that's one potential issue. Ultimately this is a network share issue, and without seeing exactly how you are mounting/using it, we cannot help.

@lebrun78 Well, if it's "installing MS Office, as the file name kind of suggests," I'm guessing the PS1 has to either download the installer files and/or start them across a network share. Are you running the installation in "silent" mode? Does running it manually on the command line require any user input?