On-prem Depot: 404 "Not Found" for core/node, but visible in GUI

I’m continuing my on-prem hab depot journey, and I just ran into something interesting. When I use the latest core/node package (core/node/8.11.1/20180608201931), I get the good ol’ “we tried 5 times but could not download” error. However, if I demote that one from stable and fall back to the previous version (core/node/8.11.1/20180504222455), it works fine…

I’d like to wipe out the latest one and re-sync it, if possible, to rule out some package-sync weirdness on my side. How would I go about doing that?

I should also mention that I get this behavior both when using build within the studio and when running hab pkg install core/node.

Here’s the output with RUST_LOG set to debug

This repeats 5 times:

↓ Downloading core/node/8.11.1/20180608201931
DEBUG 2018-07-13T12:32:04Z: habitat_http_client::api_client: GET https://hab-depot.dbright.com/v1/depot/pkgs/core/node/8.11.1/20180608201931/download with ApiClient { endpoint: "https://hab-depot.dbright.com/v1", inner: Client { redirect_policy: FollowAll, read_timeout: Some(Duration { secs: 120, nanos: 0 }), write_timeout: Some(Duration { secs: 120, nanos: 0 }), proxy: None }, proxy: None, target_scheme: "https", user_agent_header: UserAgent("hab/0.58.0/20180629144346 (x86_64-linux; 4.9.87-linuxkit-aufs)") }
DEBUG 2018-07-13T12:32:04Z: hyper::http::h1: request line: Get "/v1/depot/pkgs/core/node/8.11.1/20180608201931/download" Http11
DEBUG 2018-07-13T12:32:04Z: hyper::http::h1: headers=Headers { Host: hab-depot.dbright.com
, User-Agent: hab/0.58.0/20180629144346 (x86_64-linux; 4.9.87-linuxkit-aufs)
, Authorization: Bearer _Qk9YLTEKYmxkci0yMDE4MDcxMjEyMzczNwpibGRyLTIwMTgwNzEyMTIzNzM3CkdiUDBUUDRLV2hUY0hXS2Frcms3d0lGUWkxZXhqSi93CnB2cmxhbFZ5L0VCQ2VXS2JSbTlJMElhM2c4RGpyMFk5a01NcllhZ2tBYytzTS82bg==
, }
DEBUG 2018-07-13T12:32:04Z: hyper::client::response: version=Http11, status=NotFound
DEBUG 2018-07-13T12:32:04Z: hyper::client::response: headers=Headers { Server: nginx/1.13.10
, Date: Fri, 13 Jul 2018 12:32:31 GMT
, Content-Length: 0
, Connection: keep-alive
, Access-Control-Allow-Origin: *
, Access-Control-Allow-Headers: authorization, range
, Access-Control-Allow-Methods: PUT, DELETE, PATCH
, Access-Control-Expose-Headers: content-disposition
, }
DEBUG 2018-07-13T12:32:04Z: habitat_depot_client: Response: Response { status: NotFound, headers: Headers { Server: nginx/1.13.10
, Date: Fri, 13 Jul 2018 12:32:31 GMT
, Content-Length: 0
, Connection: keep-alive
, Access-Control-Allow-Origin: *
, Access-Control-Allow-Headers: authorization, range
, Access-Control-Allow-Methods: PUT, DELETE, PATCH
, Access-Control-Expose-Headers: content-disposition
, }, version: Http11, url: "https://hab-depot.dbright.com/v1/depot/pkgs/core/node/8.11.1/20180608201931/download", status_raw: RawStatus(404, "Not Found"), message: Http11Message { is_proxied: false, method: None, stream: Wrapper { obj: Some(Reading(SizedReader(remaining=0))) } } }
✗✗✗
✗✗✗ We tried 5 times but could not download core/node/8.11.1/20180608201931. Giving up.
✗✗✗

I should also add, this is a new install from scratch, using the Minio store for hab packages.

Seems pretty strange. I’m not an on-prem expert, but with this backend it should be pretty close to an all-or-nothing situation unless some packages are just straight-up missing. A few things to check:

  1. Validate, inside the Minio UI, the release number of the version of node that’s being called there (core/node/8.11.1/20180608201931). You’ll want to make sure that specific release exists.
  2. Verify that you are on the latest release of the builder packages.
  3. Check that the specific package hasn’t been set to private in your depot.

Chances are we may end up needing to see some of the builder-api logs, since the client doesn’t expose the source of the error here, just the error it’s experiencing.

Interesting, I don't see the release in Minio:

[screenshot: Minio bucket contents, with the release not present]

The install is fresh as of yesterday, here are the package versions running:

[root@hab-depot on-prem-builder]# hab sup status
package                                         type        state  elapsed (s)  pid    group
habitat/builder-originsrv/7352/20180521233105   standalone  up     16072        48983  builder-originsrv.default
habitat/builder-sessionsrv/7352/20180521233106  standalone  up     16064        49044  builder-sessionsrv.default
habitat/builder-minio/0.1.0/20180612201128      standalone  up     16082        48854  builder-minio.default
habitat/builder-router/7352/20180521233104      standalone  up     16082        48860  builder-router.default
habitat/builder-api/7417/20180615204610         standalone  up     16082        48867  builder-api.default
habitat/builder-datastore/7311/20180426183913   standalone  up     16082        48914  builder-datastore.default
habitat/builder-api-proxy/7411/20180613221149   standalone  up     16082        48920  builder-api-proxy.default

As far as I can tell, the origin is public, I left it public (core) when I created it.

Hmm, I’m not certain why it’s not there. It should be getting imported if your upstream is set, even if it wasn’t there to start, but @salam or @raskchanky are likely better placed to answer that. As a workaround for now, you should be able to create the directory structure inside Minio to match the other packages, with the correct release number, and then upload the hart file manually, which should get you past the error.
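That workaround might look roughly like the sketch below. The object layout and bucket name are assumptions on my part — mirror whatever an existing, working package shows in your Minio browser, and the `mc cp` step (left as a comment) assumes you have the minio client configured with your depot’s S3 credentials:

```shell
# Hedged sketch of the manual-upload workaround. The object layout is an
# assumption -- copy whatever an existing, working package shows in Minio.
ORIGIN=core
PKG=node
VERSION=8.11.1
RELEASE=20180608201931
HART="${ORIGIN}-${PKG}-${VERSION}-${RELEASE}-x86_64-linux.hart"

# Build the object path to match the working packages already in the bucket.
OBJECT_PATH="${ORIGIN}/${PKG}/${VERSION}/${RELEASE}/x86_64/linux/${HART}"
echo "$OBJECT_PATH"

# With the minio client (mc) configured against your depot's credentials,
# the upload itself would be something like:
#   mc cp "/hab/svc/builder-api/data/pkgs/${HART}" \
#     "onprem/<your-bucket>/${OBJECT_PATH}"
```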

We had a similar situation the other day, where hab pkg install tried to install 5 times and failed, returning a 404 for the package with debug logging turned on, even though the package existed inside Minio. The problem ended up being incorrectly set permissions, such that builder-api couldn’t create the temporary directories it needed.

When you go to download a package via hab pkg install, builder-api will attempt to create a temporary directory, download the package from minio to the temp dir, then send it from there to the client. Can you double check to make sure that the user builder-api is running as has permission to create directories in /hab/svc/builder-api?
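One quick way to run that check is a small writability probe, sketched below under the assumption that the service runs as the hab user (as it does in a default install):

```shell
# Writability probe: try to create (and then remove) a scratch directory.
check_writable() {
  dir="$1"
  probe="${dir}/.perm-probe.$$"
  if mkdir "$probe" 2>/dev/null; then
    rmdir "$probe"
    echo "writable"
  else
    echo "NOT writable"
  fi
}

# On the depot host you would point it at the service dir as the hab user:
#   sudo -u hab sh -c '. ./probe.sh; check_writable /hab/svc/builder-api'
# Demo against a directory that is always writable:
check_writable /tmp
```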

Permissions are as follows:

[root@hab-depot builder-api]# ls -alt
total 12
-rwxr-xr-x. 1 root root 310 Jul 13 07:25 run
drwxr-xr-x. 9 root root 134 Jul 13 07:25 .
-rw-r--r--. 1 root root   5 Jul 13 07:25 PID
drwxrwx---. 3 hab  hab   18 Jul 12 09:18 data
drwxrwx---. 2 hab  hab   72 Jul 12 08:37 files
drwxr-xr-x. 2 root root  68 Jul 12 08:35 logs
drwxrwx---. 2 hab  hab   34 Jul 12 08:35 var
drwxrwx---. 2 hab  hab   25 Jul 12 08:35 config
drwxr-xr-x. 2 root root  37 Jul 12 08:35 hooks
drwxr-xr-x. 9 root root 169 Jul 12 08:35 ..
drwxrwx---. 2 hab  hab    6 Jul 12 08:35 static
-rw-r--r--. 1 root root 588 Jul 12 08:34 user.toml

I changed ownership so the hab user can create directories now. However, how do I go about cleaning up the Hab Depot to re-start the upstream download for that package?

A new package will be downloaded from upstream if you try to view the package details for it and it’s not present. One way to do that would be something like:

curl https://my.on-prem.depot.com/v1/depot/pkgs/core/node/8.11.1/20180608201931

Assuming that returns a 404, it will add a request to the queue to download the package from the upstream depot. Those requests are processed at 60-second intervals.
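That trigger-and-wait flow could be sketched as a small polling loop, using the hypothetical depot URL from above. The block only defines the function; you’d call it from a machine that can actually reach your depot:

```shell
# Trigger-and-wait sketch: the first GET is expected to 404, which queues
# the upstream download; the queue is drained roughly every 60 seconds,
# so poll a few times before giving up.
DEPOT="https://my.on-prem.depot.com/v1/depot/pkgs"
IDENT="core/node/8.11.1/20180608201931"

trigger_and_wait() {
  url="${DEPOT}/${IDENT}"
  for attempt in 1 2 3 4 5; do
    # -w '%{http_code}' prints only the response status code.
    code=$(curl -sk -o /dev/null -w '%{http_code}' "$url")
    echo "attempt ${attempt}: HTTP ${code}"
    [ "$code" = "200" ] && return 0
    sleep 60
  done
  return 1
}
# Call trigger_and_wait from a host that can reach the depot.
```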

Also, the permissions need to be set on the data/pkgs directory you shared in your previous comment.

@raskchanky I curl it and get data back:

curl https://hab-depot.dbright.com/v1/depot/pkgs/core/node/8.11.1/20180608201931 -k
{"channels":["stable","unstable"],"checksum":"95bd7714e9225a111f1797d3282b616bd602d54818ef9a02f1b010c069f25120","config":"","deps":[{"name":"bash","origin":"core","release":"20180608092913","version":"4.4.19"},{"name":"gcc-libs","origin":"core","release":"20180608091701","version":"7.3.0"},{"name":"glibc","origin":"core","release":"20180608041157","version":"2.27"},{"name":"python2","origin":"core","release":"20180608145156","version":"2.7.14"}],"exposes":[],"ident":{"name":"node","origin":"core","release":"20180608201931","version":"8.11.1"},"is_a_service":false,"manifest":"# core / node\nNode.js® is a JavaScript runtime built on Chrome's V8 JavaScript engine.\n\n* __Maintainer__: The Habitat Maintainers <humans@habitat.sh>\n* __Version__: 8.11.1\n* __Release__: 20180608201931\n* __Architecture__: x86_64\n* __System__: linux\n* __Target__: x86_64-linux\n* __Upstream URL__: [https://nodejs.org/](https://nodejs.org/)\n* __License__: MIT \n* __Source__: [https://nodejs.org/dist/v8.11.1/node-v8.11.1.tar.gz](https://nodejs.org/dist/v8.11.1/node-v8.11.1.tar.gz)\n* __SHA__: `86678028f13b26ceed08efc4b838921ca1bf514c0b7e8151bfec8ba15c5e66ad`\n* __Path__: `/hab/pkgs/core/node/8.11.1/20180608201931`\n* __Build Dependencies__: `core/gcc core/grep core/make `\n* __Dependencies__: `core/glibc core/gcc-libs core/python2 core/bash `\n* __Interpreters__: `bin/node `\n\n# Plan\n\n## Build Flags\n\n```bash\nCFLAGS: -I/hab/pkgs/core/glibc/2.27/20180608041157/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/Include -I/hab/pkgs/core/gcc/7.3.0/20180608051919/include -I/hab/pkgs/core/make/4.2.1/20180608100733/include\nCPPFLAGS: -I/hab/pkgs/core/glibc/2.27/20180608041157/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/Include -I/hab/pkgs/core/gcc/7.3.0/20180608051919/include -I/hab/pkgs/core/make/4.2.1/20180608100733/include\nCXXFLAGS: 
-I/hab/pkgs/core/glibc/2.27/20180608041157/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/include -I/hab/pkgs/core/python2/2.7.14/20180608145156/Include -I/hab/pkgs/core/gcc/7.3.0/20180608051919/include -I/hab/pkgs/core/make/4.2.1/20180608100733/include\nLDFLAGS: -L/hab/pkgs/core/glibc/2.27/20180608041157/lib -L/hab/pkgs/core/gcc-libs/7.3.0/20180608091701/lib -L/hab/pkgs/core/python2/2.7.14/20180608145156/lib -L/hab/pkgs/core/gcc/7.3.0/20180608051919/lib\nLD_RUN_PATH: /hab/pkgs/core/node/8.11.1/20180608201931/lib:/hab/pkgs/core/glibc/2.27/20180608041157/lib:/hab/pkgs/core/gcc-libs/7.3.0/20180608091701/lib:/hab/pkgs/core/python2/2.7.14/20180608145156/lib\n```\n\n## Plan Source\n\n```bash\npkg_name=node\npkg_origin=core\npkg_version=8.11.1\npkg_description=\"Node.js® is a JavaScript runtime built on Chrome's V8 JavaScript engine.\"\npkg_upstream_url=https://nodejs.org/\npkg_license=('MIT')\npkg_maintainer=\"The Habitat Maintainers <humans@habitat.sh>\"\npkg_source=https://nodejs.org/dist/v${pkg_version}/node-v${pkg_version}.tar.gz\npkg_shasum=86678028f13b26ceed08efc4b838921ca1bf514c0b7e8151bfec8ba15c5e66ad\npkg_deps=(core/glibc core/gcc-libs core/python2 core/bash)\npkg_build_deps=(core/gcc core/grep core/make)\npkg_bin_dirs=(bin)\npkg_include_dirs=(include)\npkg_interpreters=(bin/node)\npkg_lib_dirs=(lib)\n\n# the archive contains a 'v' version # prefix, but the default value of\n# pkg_dirname is node-${pkg_version} (without the v). This tweak makes build happy\npkg_dirname=node-v$pkg_version\n\ndo_prepare() {\n  # ./configure has a shebang of #!/usr/bin/env python2. 
Fix it.\n  sed -e \"s#/usr/bin/env python#$(pkg_path_for python2)/bin/python2#\" -i configure\n}\n\ndo_build() {\n  ./configure \\\n    --prefix \"${pkg_prefix}\" \\\n    --dest-cpu \"x64\" \\\n    --dest-os \"linux\"\n\n  make -j\"$(nproc)\"\n}\n\ndo_install() {\n  do_default_install\n\n  # Node produces a lot of scripts that hardcode `/usr/bin/env`, so we need to\n  # fix that everywhere to point directly at the env binary in core/coreutils.\n  grep -nrlI '^\\#\\!/usr/bin/env' \"$pkg_prefix\" | while read -r target; do\n    sed -e \"s#\\#\\!/usr/bin/env node#\\#\\!${pkg_prefix}/bin/node#\" -i \"$target\"\n    sed -e \"s#\\#\\!/usr/bin/env sh#\\#\\!$(pkg_path_for bash)/bin/sh#\" -i \"$target\"\n    sed -e \"s#\\#\\!/usr/bin/env bash#\\#\\!$(pkg_path_for bash)/bin/bash#\" -i \"$target\"\n    sed -e \"s#\\#\\!/usr/bin/env python#\\#\\!$(pkg_path_for python2)/bin/python2#\" -i \"$target\"\n  done\n\n  # This script has a hardcoded bare `node` command\n  sed -e \"s#^\\([[:space:]]\\)\\+node\\([[:space:]]\\)#\\1${pkg_prefix}/bin/node\\2#\" -i 
\"${pkg_prefix}/lib/node_modules/npm/bin/node-gyp-bin/node-gyp\"\n}\n```","target":"x86_64-linux","tdeps":[{"name":"bash","origin":"core","release":"20180608092913","version":"4.4.19"},{"name":"bzip2","origin":"core","release":"20180608091727","version":"1.0.6"},{"name":"cacerts","origin":"core","release":"20180608102212","version":"2018.03.07"},{"name":"gcc-libs","origin":"core","release":"20180608091701","version":"7.3.0"},{"name":"gdbm","origin":"core","release":"20180608094002","version":"1.14.1"},{"name":"glibc","origin":"core","release":"20180608041157","version":"2.27"},{"name":"linux-headers","origin":"core","release":"20180608041107","version":"4.15.9"},{"name":"ncurses","origin":"core","release":"20180608091810","version":"6.1"},{"name":"openssl","origin":"core","release":"20180608102213","version":"1.0.2n"},{"name":"python2","origin":"core","release":"20180608145156","version":"2.7.14"},{"name":"readline","origin":"core","release":"20180608092900","version":"7.0.3"},{"name":"sqlite","origin":"core","release":"20180608141313","version":"3130000"},{"name":"zlib","origin":"core","release":"20180608050617","version":"1.2.11"}],"visibility":"public"}

@eeyun - that dir is owned by the hab user too, and it already has contents that have worked properly (previous upstream downloads):

[root@hab-depot pkgs]# ls -alt
total 330008
drwxr-xr-x. 2 hab hab      4096 Jul 13 15:23 .
-rw-r--r--. 1 hab hab     77033 Jul 12 15:13 core-zlib-1.2.11-20180608050617-x86_64-linux.hart
-rw-r--r--. 1 hab hab    604805 Jul 12 15:13 core-sqlite-3130000-20180608141313-x86_64-linux.hart
-rw-r--r--. 1 hab hab    297225 Jul 12 15:13 core-readline-7.0.3-20180608092900-x86_64-linux.hart
-rw-r--r--. 1 hab hab  16602833 Jul 12 15:13 core-python2-2.7.14-20180608145156-x86_64-linux.hart
-rw-r--r--. 1 hab hab   2257669 Jul 12 15:13 core-openssl-1.0.2n-20180608102213-x86_64-linux.hart
-rw-r--r--. 1 hab hab    862621 Jul 12 15:13 core-ncurses-6.1-20180608091810-x86_64-linux.hart
-rw-r--r--. 1 hab hab    974657 Jul 12 15:13 core-linux-headers-4.15.9-20180608041107-x86_64-linux.hart
-rw-r--r--. 1 hab hab   9250993 Jul 12 15:13 core-glibc-2.27-20180608041157-x86_64-linux.hart
-rw-r--r--. 1 hab hab    146209 Jul 12 15:13 core-gdbm-1.14.1-20180608094002-x86_64-linux.hart
-rw-r--r--. 1 hab hab   3387241 Jul 12 15:12 core-gcc-libs-7.3.0-20180608091701-x86_64-linux.hart
-rw-r--r--. 1 hab hab    118429 Jul 12 15:12 core-cacerts-2018.03.07-20180608102212-x86_64-linux.hart
-rw-r--r--. 1 hab hab     59957 Jul 12 15:12 core-bzip2-1.0.6-20180608091727-x86_64-linux.hart
-rw-r--r--. 1 hab hab   1274505 Jul 12 15:12 core-bash-4.4.19-20180608092913-x86_64-linux.hart
-rw-r--r--. 1 hab hab   7210869 Jul 12 13:45 core-hab-launcher-7797-20180625172404-x86_64-linux.hart
-rw-r--r--. 1 hab hab   3599941 Jul 12 13:10 core-hab-studio-0.58.0-20180629150552-x86_64-linux.hart
-rw-r--r--. 1 hab hab   2531349 Jul 12 13:10 core-hab-sup-0.58.0-20180629150614-x86_64-linux.hart
-rw-r--r--. 1 hab hab 288639613 Jul 12 10:18 e2255c1a-fece-452c-a747-a356cd6ab091.tmp

That last .tmp file hasn’t changed since then.

@danielcbright strange indeed. Have you followed the workaround to get unblocked?

@eeyun, the latest core/node exists only in the Hab Depot API, not in the Minio db. Is there a way to remove a package completely? I’d like to clean it out and re-try. The only workaround I’ve done so far is to demote the latest version from stable and let the depot fall back to the previous stable release…

Unfortunately, we don’t have a supported way to delete packages. In your case, deleting the package from Builder means accessing the PG console for the builder_originsrv database and manually deleting records. If you choose to go this route, you’ll need to set the search path inside psql like so:

set search_path to shard_30;

This will put you on the database shard where all the core packages live. From there, it’s just a matter of looking in the origin_packages table and deleting the appropriate record. There will likely be at least one other record in the origin_channel_packages table as well, due to foreign key constraints, but psql will tell you about that. =D
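Putting that together, the cleanup might look like the sketch below. The table names and the shard come from this thread, but the column names in origin_channel_packages are an assumption on my part — inspect the schema with \d origin_channel_packages before running anything destructive:

```shell
# Hedged cleanup sketch for the builder_originsrv database. The package_id
# column name is an assumption -- verify it with \d first.
IDENT="core/node/8.11.1/20180608201931"

SQL=$(cat <<EOF
set search_path to shard_30;
-- Channel membership rows go first because of the foreign key constraint.
DELETE FROM origin_channel_packages
 WHERE package_id IN (SELECT id FROM origin_packages WHERE ident = '${IDENT}');
DELETE FROM origin_packages WHERE ident = '${IDENT}';
EOF
)

echo "$SQL"
# On the depot host: psql builder_originsrv -c "$SQL"
```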

OK, so I brushed off my SQL skills and was able to remove the constraint and the row (although I do need to add the constraint back; I’m not sure of the syntax for that). This removed the package from the GUI, putting me in a state where the package doesn’t exist in the GUI or in Minio. I then tried to download the package again with hab pkg install core/node. It went through the download step but never actually added the package to Minio, so now I’m in the same state I was in before. It seems to be an issue with this specific package, as I haven’t run into it with others. I’m going to test that theory today when I get a chance.

I had some time, so I went ahead and tried it. I created the proper dirs and moved the core/node package into the proper Minio directory so it would download, which got me past that part, but then I ran into the same issue with core/bash:

» Installing core/node
☁ Determining latest version of core/node in the 'stable' channel
↓ Downloading core/node/8.11.1/20180608201931
    11.28 MB / 11.28 MB | [======================================================================================] 100.00 % 13.05 MB/s
☛ Verifying core/node/8.11.1/20180608201931
↓ Downloading core/bash/4.4.19/20180608092913
↓ Downloading core/bash/4.4.19/20180608092913
↓ Downloading core/bash/4.4.19/20180608092913
↓ Downloading core/bash/4.4.19/20180608092913
↓ Downloading core/bash/4.4.19/20180608092913
✗✗✗
✗✗✗ We tried 5 times but could not download core/bash/4.4.19/20180608092913. Giving up.
✗✗✗

I see the package downloaded and sitting in /hab/svc/builder-api/data/pkgs but it’s not getting moved to the proper Minio db directory apparently.

This is pretty mystifying at this point. I haven’t seen any behavior like this before, where some packages work and some don’t. So, just to recap:

  1. It’s not a permissions issue. All permissions inside /hab/svc/builder-api have been verified to be writable by the user that the service is running as, presumably hab.
  2. It’s not a connectivity issue with the upstream depot, because other packages download from the upstream depot without problems.
  3. It’s not a connectivity issue with minio, because other packages are getting stored in minio and download fine (via hab pkg install).

Is that list correct?

If it is, then the only remaining thing I can think to do would be to enable debug logging on your server and comb the logs for some kind of error. Enabling debug logging can be done by following the instructions outlined in the README.

@raskchanky

1 and 2 are correct.

As of my last post, I verified this is happening with all upstream core packages: they download and just sit in the /hab/svc/builder-api/data/pkgs directory. I can manually move them over to the Minio db, but this doesn’t happen automatically.

[root@hab-depot pkgs]# ls -alt
total 50608
drwxr-xr-x. 2 hab hab     4096 Jul 17 09:04 .
-rw-r--r--. 1 hab hab  2533861 Jul 17 09:00 core-hab-sup-0.59.0-20180712161546-x86_64-linux.hart
-rw-r--r--. 1 hab hab    77033 Jul 12 15:13 core-zlib-1.2.11-20180608050617-x86_64-linux.hart
-rw-r--r--. 1 hab hab   604805 Jul 12 15:13 core-sqlite-3130000-20180608141313-x86_64-linux.hart
-rw-r--r--. 1 hab hab   297225 Jul 12 15:13 core-readline-7.0.3-20180608092900-x86_64-linux.hart
-rw-r--r--. 1 hab hab 16602833 Jul 12 15:13 core-python2-2.7.14-20180608145156-x86_64-linux.hart
-rw-r--r--. 1 hab hab  2257669 Jul 12 15:13 core-openssl-1.0.2n-20180608102213-x86_64-linux.hart
-rw-r--r--. 1 hab hab   862621 Jul 12 15:13 core-ncurses-6.1-20180608091810-x86_64-linux.hart
-rw-r--r--. 1 hab hab   974657 Jul 12 15:13 core-linux-headers-4.15.9-20180608041107-x86_64-linux.hart
-rw-r--r--. 1 hab hab  9250993 Jul 12 15:13 core-glibc-2.27-20180608041157-x86_64-linux.hart
-rw-r--r--. 1 hab hab   146209 Jul 12 15:13 core-gdbm-1.14.1-20180608094002-x86_64-linux.hart
-rw-r--r--. 1 hab hab  3387241 Jul 12 15:12 core-gcc-libs-7.3.0-20180608091701-x86_64-linux.hart
-rw-r--r--. 1 hab hab   118429 Jul 12 15:12 core-cacerts-2018.03.07-20180608102212-x86_64-linux.hart
-rw-r--r--. 1 hab hab    59957 Jul 12 15:12 core-bzip2-1.0.6-20180608091727-x86_64-linux.hart
-rw-r--r--. 1 hab hab  1274505 Jul 12 15:12 core-bash-4.4.19-20180608092913-x86_64-linux.hart
-rw-r--r--. 1 hab hab  7210869 Jul 12 13:45 core-hab-launcher-7797-20180625172404-x86_64-linux.hart
-rw-r--r--. 1 hab hab  3599941 Jul 12 13:10 core-hab-studio-0.58.0-20180629150552-x86_64-linux.hart
-rw-r--r--. 1 hab hab  2531349 Jul 12 13:10 core-hab-sup-0.58.0-20180629150614-x86_64-linux.hart
drwxrwx---. 3 hab hab       18 Jul 12 09:18 ..
[root@hab-depot pkgs]# pwd
/hab/svc/builder-api/data/pkgs

Is it possible that the auth credentials for minio aren’t configured correctly in Builder?

You can find the auth credentials for minio in /hab/svc/builder-api/config/config.toml in a section titled [s3]. Can you verify that the information in that section is accurate by using it to manually login to your minio instance?
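A manual check might look like the sketch below. It assumes the minio client (mc) is installed; the values here are placeholders standing in for whatever is actually in your config.toml, and mc alias set is the newer client syntax (older releases use mc config host add):

```shell
# Placeholder values -- substitute the [s3] section of
# /hab/svc/builder-api/config/config.toml.
ENDPOINT="http://localhost:9000"
KEY_ID="depot"
SECRET_KEY="password"
BUCKET="habitat-builder-artifact-store.local"

# Register the credentials and list the bucket; a bad key/secret or a bad
# endpoint fails right here, which confirms or rules out this theory.
if command -v mc >/dev/null 2>&1; then
  mc alias set onprem "$ENDPOINT" "$KEY_ID" "$SECRET_KEY" \
    && mc ls "onprem/${BUCKET}" \
    || echo "credential or connectivity problem"
else
  echo "mc not installed; skipping live check"
fi
```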

The credentials stored in the config are correct:

[s3]
backend = "minio"
bucket_name = "habitat-builder-artifact-store.local"
endpoint = "http://localhost:9000"
key_id = "depot"
secret_key = "password"

I did get a bit further, though. Following the journalctl logs, I’m seeing this error:

Jul 19 08:47:20 hab-depot.dbright.com hab[48809]: builder-api-proxy.default(O): 192.168.86.26 - - [19/Jul/2018:12:47:20 +0000] "HEAD /health HTTP/1.1" 308 0 "-" "curl/7.54.1"
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O): 2018-07-19 12:47:21 GMT ERROR:  duplicate key value violates unique constraint "origin_packages_ident_key"
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O): 2018-07-19 12:47:21 GMT DETAIL:  Key (ident)=(core/7zip/16.04/20170131110814) already exists.
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O): 2018-07-19 12:47:21 GMT CONTEXT:  SQL statement "INSERT INTO origin_packages (origin_id, owner_id, name, ident, checksum, manifest, config, target, deps, tdeps, exposes, visibility)
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O):                       VALUES (op_origin_id, op_owner_id, op_name, op_ident, op_checksum, op_manifest, op_config, op_target, op_deps, op_tdeps, op_exposes, op_visibility)
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O):                       RETURNING *"
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O):         PL/pgSQL function insert_origin_package_v3(bigint,bigint,text,text,text,text,text,text,text,text,text,text) line 6 at SQL statement
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-datastore.default(O): 2018-07-19 12:47:21 GMT STATEMENT:  SELECT * FROM insert_origin_package_v3($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-originsrv.default(O): ERROR 2018-07-19T12:47:21Z: habitat_builder_originsrv::server::handlers: [err: DATA_STORE, msg: vt:origin-package-create:1], Error creating package in database, database error: ERROR: duplicate key value violates unique constraint "origin_packages_ident_key"
Jul 19 08:47:21 hab-depot.dbright.com hab[48809]: builder-api.default(O):  WARN 2018-07-19T12:47:21Z: habitat_depot::upstream: Failed to download package from upstream, err NetError(NetError(code: DATA_STORE msg: "vt:origin-package-create:1"))

Yeah, that error is what I would expect to see, since you can browse to this package in the Builder UI but can’t download it. The error is saying that the metadata for the package already exists in the database (which is why it shows up fine in the Builder UI). But it doesn’t shed any light on why packages aren’t making it from your server’s local disk into Minio.

@eeyun do you have any further thoughts here?

Yes, actually! I’m not sure how it ended up in that state, but we use that check to validate whether or not we should upload to the backend, since a package should never get added to the database without being uploaded to Minio first.

We’re in a state here, it seems, where the package doesn’t exist in Minio but does exist in the database. As such, it’s not possible to upload it without either removing it from the database and then running hab pkg upload, OR following the workaround I originally posted.