We have a use case in front of us with Chef where we are going to need to embed several hundred moderately sized text files (think HTML content and such). Our workflow is to berks package the cookbook and store it as a build artifact in Artifactory before uploading it to our Chef server. The cookbook containing this content compresses down quite nicely as a tar.gz (500 KB), but the uncompressed cookbook is almost 200 MB.
I’m wondering if there is already compression happening in transit between the Chef server and chef-clients? Or should we archive these files into a single file embedded within the cookbook and handle the extraction within our recipe? I’d like to keep this as simple as possible, but I also know that if left uncompressed, this will cause us problems.
I really really wouldn’t do this via cookbook files. Chef does not excel as a file transfer system compared to tools like rsync, SFTP, etc. Even just putting a tarball on the same Artifactory server and using Chef to download and unpack it would be a better plan. If you must do it in-band, storing a tarball in the cookbook and unpacking it separately might be better, but you’ll have to work out how you want to handle things like files that get removed. A git resource, or an execute resource plus rsync, is likely to be much better though.
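For illustration, a minimal sketch of the download-and-unpack approach, assuming a hypothetical Artifactory URL, destination paths, and a checksum attribute (all placeholders you'd swap for your own):

```ruby
# Hypothetical URL and paths; substitute your own Artifactory coordinates.
content_url = 'https://artifactory.example.com/artifactory/builds/site-content-1.0.0.tar.gz'

remote_file '/var/cache/site-content.tar.gz' do
  source content_url
  # Pinning by SHA-256 keeps the download idempotent: the file is only
  # re-fetched (and re-extracted) when the artifact actually changes.
  checksum node['site_content']['checksum'] # hypothetical attribute
  notifies :run, 'execute[unpack site content]', :immediately
end

execute 'unpack site content' do
  command 'tar -xzf /var/cache/site-content.tar.gz -C /var/www/content'
  action :nothing
  # Note: this does not remove files that disappear from the archive;
  # that's the cleanup problem mentioned above.
end
```

On Chef Infra Client 15 or newer, the execute block could be replaced with the built-in archive_file resource, which extracts via libarchive instead of shelling out to tar.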