cd c:\some-directory
$xml = [xml](Get-Content "file.xml")
$xml.Parent.Node.Id = [System.Guid]::NewGuid().ToString()
$xml.Save("file.xml")
When I ran the script, the file didn’t change. I couldn’t for the life of me figure out why. This was the first time I was working with the XML libraries in PowerShell, and I was certain the problem must be in how I was using the library. Finally, after about a half hour of searching, I discovered that the changed file was being saved to the original directory, not to the directory I had changed to. It turns out that when you change directories in PowerShell, you don’t change the current directory of the process environment, so the .NET libraries think you’re still in the original directory. Since I read the file using PowerShell’s Get-Content but saved it using the .NET XML library, I was reading the file correctly, but saving it in the wrong place.
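You can see the two notions of “current directory” diverge directly. This is a minimal illustration (the directory name is just an example):

```powershell
Set-Location c:\some-directory

# PowerShell's own location has changed...
$PWD.Path

# ...but the process-wide current directory that .NET classes
# consult has not; it still points wherever the shell started.
[System.Environment]::CurrentDirectory
```

Any .NET call that resolves a relative path, such as $xml.Save("file.xml"), uses the second value, which is why the file landed in the original directory.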
This quirk is apparently deliberate, having something to do with enhancements in a future version of PowerShell that will allow jobs to be suspended in the background. I’m not sure this is the best way to deal with that eventuality, but there you go.
There are two ways around this problem. The first, and simplest, is to use the Resolve-Path cmdlet:
$xml.Save((Resolve-Path "file.xml").Path)
This passes the full path of the file to the .NET method, so the file is saved in the correct location. Note the extra parentheses: a cmdlet call must be wrapped in its own parentheses to be used as a method argument.
The second option, which may be preferable in some situations, is to change the directory using the .NET Directory class:
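The code example appears to be missing here; based on the surrounding text, it would look something like this (directory name is illustrative):

```powershell
# Update the process-wide current directory so .NET classes
# agree with where we want to work.
[System.IO.Directory]::SetCurrentDirectory("c:\some-directory")
```

Note that this changes the .NET current directory without moving PowerShell’s own location, so you may want to pass (Get-Location).Path to keep the two in sync.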
This changes the current directory of the process environment. Resolve-Path is probably the safer choice, but it won’t be sufficient when a .NET class must operate on the current directory itself and there is no way to pass in an explicit path.