Overriding node attributes with a wrapper cookbook

I’m hoping someone can help me figure this one out.

my_wrapper_cookbook depends on the community hadoop cookbook. Each recipe
in the hadoop cookbook has an include_recipe 'hadoop::repo' call in order
to set up the appropriate yum/apt repos for installation.

I want to use local repo mirrors to speed up installation, because I'm
installing hadoop clusters via automation to run tests every night and
the installation currently takes a long time.

I've tried adding an attributes file to my wrapper cookbook
(my_wrapper_cookbook/attributes/repos.rb) and setting node.override there,
but those overrides do not seem to be picked up when I converge the node.
Instead, the repo attributes are set by hadoop::repo on every run, since
each recipe in the hadoop cookbook includes the repo recipe.
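
For reference, here's roughly what I have in the attributes file (the
attribute key below is made up for illustration; I haven't confirmed the
exact keys the hadoop cookbook uses):

# my_wrapper_cookbook/attributes/repos.rb
# Point the hadoop repos at an internal mirror instead of the public ones.
# NOTE: 'mirror_url' is an illustrative key, not the cookbook's real one.
node.override['hadoop']['mirror_url'] = 'http://repo-mirror.internal/hadoop'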

Hi Jay,

Try using node.set['attr_name'] = 'value' in your wrapper recipe before
you include_recipe 'hadoop'.

If the node attribute you are trying to set from the wrapper cookbook is
used to compute other attributes in the hadoop cookbook, you also have to
reload the hadoop attributes [0].

E.g. in my_wrapper::default recipe:

# Set the attribute at "normal" precedence before hadoop's recipes run.
node.set['hadoop']['foo'] = 'bar'
# Re-evaluate hadoop's default attributes file so any attributes computed
# from ['hadoop']['foo'] pick up the new value.
node.from_file(run_context.resolve_attribute("hadoop", "default.rb"))
include_recipe 'hadoop::default'
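
For your repo mirror case, the node.set line would set whichever repo URL
attribute(s) hadoop::repo actually reads (check the cookbook's attributes
files for the exact keys), replacing the node.override you currently have
in repos.rb.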

HTH,
Torben

[0] http://docs.opscode.com/chef/essentials_cookbook_recipes.html#reload-attributes
