Winrm-transport Decode-Base64File OutOfMemoryException

Ohai Chefs!

I keep encountering a PowerShell OutOfMemoryException when trying to converge a node. The error message is:


Creating the scheduled task.
SUCCESS: The scheduled task "chef-tk" has successfully been created.
Transferring files to

------Exception-------
Class: Kitchen::ActionFailed
Message: Failed to complete #converge action: [[WinRM::Transport::FileTransporter] Upload failed (exitcode: 1)
Exception of type 'System.OutOfMemoryException' was thrown.
At line:3 char:66

+ function Decode-Base64File($src, $dst) {folder (split-path $dst);sc -force
  -Enco ...
+ CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

]
>>>>>> ----------------------
>>>>>> Please see .kitchen/logs/kitchen.log for more details
>>>>>> Also try running `kitchen diagnose --all` for configuration

------------------------------------------------------------------------------------------------------------------

The Chef run (using chef_zero_scheduled_task) transfers a 22MB base64-encoded file to the node and the error occurs when the file is decoded.  The file contains all the cookbooks required for the Chef run.

I've seen other posts about similar errors, but they appear to have been caused by an earlier 'buggy' version of winrm-transport; here I'm using winrm-transport 1.0.3, and the memory exception occurs in both v1.0.2 and v1.0.3.  The node's PowerShell MaxMemoryPerShellMB setting is 1024MB (the default) and the node has 2GB of memory allocated.  I've monitored memory usage during the run and observed that the available memory drops rapidly when the PowerShell decode process runs.

My theory is that the exception occurs when PowerShell tries to allocate memory up to its MaxMemoryPerShellMB quota but there isn't enough available memory on the node to satisfy it.  Unfortunately, I can't increase the node's total allocated memory at the moment, as my physical host machine doesn't have enough RAM to let me create two nodes (more RAM is on order, but I'm stuck with 8GB at present).

I've seen many posts confirming that PowerShell's 'get-content' can be extremely memory-hungry when it pulls an entire file into memory at once.  The PowerShell function here uses 'sc', which I assume is the built-in alias for 'set-content'.  Is that correspondingly memory-hungry?
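
For reference, the alias can be checked directly on the node with:

Get-Alias sc

which should resolve to Set-Content in Windows PowerShell.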

I can see a few possible ways ahead that I haven't tried yet:

1) Add a recipe to increase the node's PowerShell MaxMemoryPerShellMB setting prior to running my other recipes (see the rough sketch after this list).  I've hesitated to do this because I need to restart the WinRM service for the new setting to take effect.  Also, if my theory above is correct, increasing the value won't help.

2) Find some way to avoid the PowerShell 'Decode-Base64File' function and use Ruby's base64-decode functionality instead.  I'm not sure how I would achieve this.

3) Wait until I get more physical RAM and then allocate more RAM to my node VMs.  This is an annoying delay but may be my only option.
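
For option 1, the recipe I have in mind would just run something like the following on the node ahead of the main run (untested on my part, and 2048 is an arbitrary value):

Set-Item -Path WSMan:\localhost\Shell\MaxMemoryPerShellMB -Value 2048
Restart-Service winrm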

Any thoughts, ideas, or solutions will be gratefully received.  Sorry for asking so many questions, but they might generate an interesting discussion!

Thanks in advance.

Jim

Unfortunately, WinRM uploads are not the most efficient. In fact, mainly thanks to limitations of the protocol, it's kind of a “worst of” when it comes to file transport mechanisms. We hope to make this MUCH better as we switch to PSRP for uploads.

You are likely correct that you've hit the limits of your shell's memory quota. It's not so much that Set-Content is a hog here; we're just using it poorly. We grab all the encoded bytes into memory and then dump them, decoded, into their rightful place. The transport should really buffer the decoding to avoid this scenario.
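
For illustration only - this is not what the gem generates today - a buffered version could look roughly like the sketch below. The function name is hypothetical; it streams the encoded file through a FromBase64Transform so memory use stays roughly flat regardless of file size, and it assumes .NET 4.0+ on the node for Stream.CopyTo:

function Decode-Base64FileBuffered($src, $dst) {
  # Hypothetical buffered alternative: decode in small chunks instead of
  # reading the whole encoded file into memory with gc/sc.
  $null = New-Item -ItemType Directory -Force -Path (Split-Path $dst)  # mirror the generated folder helper
  $in        = [System.IO.File]::OpenRead($src)
  $out       = [System.IO.File]::Create($dst)
  $transform = New-Object System.Security.Cryptography.FromBase64Transform
  $mode      = [System.Security.Cryptography.CryptoStreamMode]::Write
  $decoder   = New-Object System.Security.Cryptography.CryptoStream -ArgumentList $out, $transform, $mode
  try {
    $in.CopyTo($decoder)   # copies in small chunks, so the shell quota is never approached
  } finally {
    $decoder.Dispose()     # flushes the final base64 block and closes $out
    $in.Dispose()
  }
}

That is roughly what “buffering the decoding” would mean in practice.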

The winrm-transport gem is being deprecated in favor of the winrm-fs gem, which will likely make it into the next TK release. Unfortunately, the decoding logic is exactly the same in both, so winrm-fs won't fix this. It would be great if you could file an issue against winrm-fs to get this on our radar.

Thanks!

Just a note @jgmccolm @Matt_Wrock:

I increased the size of the VM to 8GB because I didn't know what else to do (I'm not a Windows expert) and still hit this issue.

It's actually not the size of the VM you are likely hitting, but rather the maximum memory capacity of the WinRM shell. Different versions of Windows have different thresholds: 2012 R2 is now fairly reasonable with a 1GB cap, but 2008 R2's is pretty small. You can adjust this with:

Set-Item -Path WSMan:\localhost\Shell\MaxMemoryPerShellMB -value 2048

Yeah - I tested that and it works. I was just replying to @jgmccolm's item no. 3.

I found and used these instructions:
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 5000
Set-Item WSMan:\localhost\Plugin\Microsoft.PowerShell\Quotas\MaxMemoryPerShellMB 5000

Restart-Service winrm

They worked for me. Now I am trying to figure out how to add that to the test-kitchen workflow, or whether I should create another Vagrant box…

BTW, I was using a Windows 2012 R2 Vagrant box. Maybe I should have checked what the default was.
I really should check “What's in the box” before I use a Vagrant box - the parameter was set to 300MB…
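
For the record, the quick way to see what a box is actually set to is:

Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB

which just reports the current value without changing anything.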

Cool. With Test-Kitchen, the best way to work around this is to use the elevated property of the transport. If you add:

transport:
  name: winrm
  elevated: true

That runs the commands via a scheduled task which should completely circumvent the winrm shell memory limitations.

thx @Matt_Wrock. I also probably should have used your Vagrant box, as I assume it would work 🙂

It just dawned on me that the elevated “trick” will not impact files copied to the instance by Test-Kitchen. So if you are facing shell memory limitations during that upload process, your best remedy is to increase the shell memory limit of the base box. As for the mwrock/Windows2012R2 box you are referring to: it sticks with the default 1GB limit.

I have also faced this issue with the following setup:
gems:

  • knife-windows (1.9.0, 0.8.6)
  • winrm (2.2.3, 1.3.4)
  • winrm-elevated (1.1.0)
  • winrm-fs (1.2.0)
  • winrm-s (0.3.1)
  • winrm-transport (1.0.2)

chef-client version: 12.22.1
OS version: Windows Server 2008R2

Configuration of MaxMemoryPerShellMB:

powershell -Command "Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 4096"
powershell -Command "Set-Item WSMan:\localhost\plugin\microsoft.powershell\quotas\MaxMemoryPerShellMB 4096"
C:\windows\syswow64\cmd.exe /c powershell -Command "Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 4096"
C:\windows\syswow64\cmd.exe /c powershell -Command "Set-Item WSMan:\localhost\plugin\microsoft.powershell\quotas\MaxMemoryPerShellMB 4096"

Raising the MaxMemoryPerShellMB values did not help; when a powershell_script resource was executed, it still crashed with an out-of-memory exception. The problem was resolved after I installed WMF 4.0 (PowerShell 4.0).
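
For anyone hitting the same wall on 2008 R2: before (or as well as) raising the quotas, it may be worth checking which PowerShell version the node is actually running, since 2008 R2 ships with PowerShell 2.0 unless a newer WMF has been installed:

$PSVersionTable.PSVersion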