API Interaction from node

Hi all,
I'm writing a little script to populate a data bag each time it's run - the purpose is to store MySQL bin log data in a data bag when a slave is backed up. I was originally thinking of using the API from the node and just using the local client.pem for authentication, but while I can read data bags this way, I get a 403 when writing to them, so I'm looking for a breakdown of which objects the client is able to modify.
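For concreteness, the script boils down to roughly this (the bag and item names are made up for illustration):

    require 'chef/config'
    require 'chef/rest'

    # /etc/chef/client.rb supplies node_name, client_key (/etc/chef/client.pem)
    # and chef_server_url, i.e. the node's own client identity.
    Chef::Config.from_file('/etc/chef/client.rb')
    rest = Chef::REST.new(Chef::Config[:chef_server_url],
                          Chef::Config[:node_name],
                          Chef::Config[:client_key])

    # Reading works fine with the node's client key ...
    item = rest.get_rest('data/binlog_positions/latest')

    # ... but writing the item back is where the 403 comes from.
    item['log_pos'] = 107
    rest.put_rest('data/binlog_positions/latest', item)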
I could also just change where I'm storing the data, or, if that doesn't work, some middleware with another key would do the job, but I'd be interested in hearing about any other approaches if I'm Doing It Wrong™.

fwiw, the script that’s interacting with chef is run from cron.

Cheers,
Ant

Yo Ant,

You won't be able to update data bag items that have already been created
using a node's client like that - at least not with the default permissions,
I think. I do believe you will be able to create new data bag items inside
a data bag.

Hosted Chef or OSS?

On OSS, you can make the node's client an admin. On Hosted/Private Chef, you
can use RBAC to allow creating items in that data bag. knife-acls may help
with automating this.
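Something like this, run from a workstation whose key is already an admin, is roughly the Ruby equivalent of knife client edit with "admin": true (the client name is made up):

    require 'chef/config'
    require 'chef/api_client'

    # Uses an existing admin identity from the workstation's knife.rb.
    Chef::Config.from_file(File.expand_path('~/.chef/knife.rb'))

    client = Chef::ApiClient.load('backup-slave-01')   # hypothetical client name
    client.admin(true)
    client.save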

Cheers,

AJ

Thanks AJ,

OSS Chef - I got 403s back when adding a new data bag item, though. I think making the node's client an admin will be a good workaround, and I'll write some middleware in front of it.
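If I go the middleware route, I'm picturing something very rough like this - a tiny service that holds the privileged key and does the write on behalf of the slaves (everything here, names and paths included, is hypothetical):

    require 'sinatra'
    require 'json'
    require 'chef/config'
    require 'chef/data_bag_item'

    # Configured with a key that is allowed to write data bags.
    Chef::Config.from_file('/etc/chef-middleware/client.rb')

    post '/binlog/:backup_id' do
      item = Chef::DataBagItem.new
      item.data_bag('binlog_positions')
      item.raw_data = JSON.parse(request.body.read).merge('id' => params[:backup_id])
      item.save
      'ok'
    end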

Cheers,
Ant

Anthony,

I would say "You're Doing It Wrong."

Chef is for configuration, not bulk storage. Storing backups in chef-server
is Doing It Wrong. There are plenty of other approaches. You can run
backups on a schedule (or on demand) separately from chef-client runs.
Chef-client can provision the backup script and everything necessary to get
it running so that it's a one-liner in the crontab or a one-liner from
SSH. If the script is copying data to remote storage, it should not be to
chef-server. Instead, chef can provision a remote storage node with
replicated disks and can provision the slave's backup script to know about
it, or you can use a storage service like S3 and chef can provision the
slave's backup script to know about that instead.
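As a sketch, a recipe fragment along these lines (script name, schedule and attribute names are placeholders) is all chef-client needs to manage; the backup itself then runs entirely outside of chef-client:

    template '/usr/local/bin/mysql-binlog-backup' do
      source 'mysql-binlog-backup.sh.erb'
      owner  'root'
      mode   '0755'
      variables(s3_bucket: node['mysql_backup']['s3_bucket'])
    end

    cron 'mysql-binlog-backup' do
      minute  '0'
      hour    '3'
      user    'root'
      command '/usr/local/bin/mysql-binlog-backup'
    end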

Cheers,
Jay

I think he means storing the bin log position in a data-bag (quite a common
approach to allowing slaves to find the correct position)

No?

--AJ

Man I hope so.

Be kind of awesome if he's packing actual binlogs into a JSON value,
though.

Adam

In re-reading, maybe I misunderstood. Ant, care to confirm? You aren't storing
bloody big fuck off binary logs in JSON on the chef-server in a data-bag,
are you?

--AJ

And if you are, you get a beer for most bold use of JSON in 2013 so far. :)

Adam

lmao, that'd be awesome ;)
Fortunately AJ is right - I'm only storing the bin-log position (and filename) in the data bag.

The process is:

  • backup starts (using backup gem) on a backup slave
  • pre-backup hook stops replication and dumps the current bin-log position and filename into a data bag (rough sketch after this list)
  • backup completes, replication resumes
  • a new slave can now get spun up automagically and know what position to begin replication from by querying the data bag for the item matching the backup ID
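For anyone curious, the pre-backup hook is roughly this shape (the bag name, credentials and use of the mysql2 gem are illustrative - it could just as easily shell out to the mysql client):

    #!/usr/bin/env ruby
    require 'mysql2'               # illustrative; mysql -e "SHOW SLAVE STATUS\G" works too
    require 'chef/config'
    require 'chef/data_bag_item'

    Chef::Config.from_file('/etc/chef/client.rb')

    db = Mysql2::Client.new(host: 'localhost', username: 'backup', password: 'secret')
    db.query('STOP SLAVE')
    status = db.query('SHOW SLAVE STATUS').first

    item = Chef::DataBagItem.new
    item.data_bag('binlog_positions')                   # hypothetical bag name
    item.raw_data = {
      'id'       => "backup-#{Time.now.strftime('%Y%m%d%H%M%S')}",
      'log_file' => status['Relay_Master_Log_File'],    # master binlog the slave has executed up to
      'log_pos'  => status['Exec_Master_Log_Pos']
    }
    item.save   # needs a key the server will accept for writes, per the rest of this thread

A new slave then just does the reverse - Chef::DataBagItem.load('binlog_positions', backup_id) - and feeds log_file/log_pos into its CHANGE MASTER TO.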

Ant
