how to run as an intermediate container in a multistage Dockerfile? · Issue #661 · docker-library/postgres
Closed
@the-vampiire

Description

Given that this isn't an error with the image, you might have better responses asking over at the Docker Community Forums, the Docker Community Slack, or Stack Overflow, as these repositories are for issues with the image and not necessarily for questions of usability.

I have followed that recommendation and asked this question on Reddit, Stack Overflow, and the Docker Community Slack, but have not received any help. I am turning to the maintainers because I am at a loss.

use case

I am looking for advice on the best way to accomplish a build. I have it working manually, but I would like to set it up as a multistage Docker build to automate it.

goal: a Docker image (from postgres:9.4) that has a fully set-up database and users, for testing and distribution to my team.

The setup scripts take a long time to run, which is why I'd like the final image to contain the resulting data rather than re-executing the scripts every time a container is run.

I would then like to extend this pre-set image by adding additional user scripts. This is another problem I am having (with the manually created image), because initdb does not execute if the data directory is already populated, and I run into the same issue as in the multistage build: needing to run an intermediate container to execute the scripts. If I can learn how the multistage build works, I can figure out the extending problem.
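
For concreteness, the extension I have in mind would look roughly like this (the image tag and script directory are placeholder names, not real ones):

FROM my-team/postgres-preset:9.4
# these scripts are never executed on startup: the entrypoint only runs
# /docker-entrypoint-initdb.d/ scripts when the data directory is empty,
# i.e. when it also runs initdb
COPY extra-user-scripts/ /docker-entrypoint-initdb.d/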

manual steps

1. [image 1] write a Dockerfile that builds from postgres:9.4 and copies over the custom scripts

2. run a container from it (to execute the scripts and set up the users / dbs) with a mounted volume so the postgres data is stored on the host

3. [image 2] write another Dockerfile that builds from postgres:9.4 and copies in the data from the mounted volume

4. run a container from the final image (2), which now has all the data ready to go without needing to execute the scripts (rough commands for the whole flow are sketched below)
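
In concrete terms the flow is roughly this (paths, image tags, and the password are placeholders, not my exact commands):

# [image 1] Dockerfile.init: base image plus the setup scripts
FROM postgres:9.4
COPY custom-scripts/ /docker-entrypoint-initdb.d/

# build it, then run it once with the data directory mounted from the host
docker build -f Dockerfile.init -t pg-init .
docker run -d --name pg-init-run -e POSTGRES_PASSWORD=password \
    -v "$PWD/pgdata:/var/lib/postgresql/data" pg-init
# ...wait for the init scripts to finish, then:
docker stop pg-init-run

# [image 2] Dockerfile.final: base image plus the already-initialized data
FROM postgres:9.4
COPY pgdata/ /var/lib/postgresql/data/

docker build -f Dockerfile.final -t pg-preset .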

Is this possible to do in a multistage Dockerfile so it can be automated?

multistage automated attempt

FROM postgres:9.4 AS builder
COPY custom-scripts/ /docker-entrypoint-initdb.d/
COPY data/ /tmp
# run the postgres process in the container?
# I saw this as the CMD for the postgres image
# (also tried running /usr/local/bin/docker-entrypoint.sh)
RUN postgres

FROM postgres:9.4
# copy the initialized data directory out of the builder stage
COPY --from=builder /var/lib/postgresql/data /var/lib/postgresql/data
# shell-form CMD so $DB_USER / $DB_NAME are expanded at runtime
HEALTHCHECK --interval=30s --timeout=30s --start-period=5s --retries=3 \
    CMD pg_isready -U $DB_USER -d $DB_NAME
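
For reference, I build this with a plain docker build (the tag name here is arbitrary):

docker build -t pg-preset .

The builder stage fails at the RUN postgres step with the error below.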

what is happening

"root" execution of the PostgreSQL server is not permitted.
The server must be started under an unprivileged user ID to prevent
possible system security compromise. See the documentation for
more information on how to properly start the server.

So I tried adding USER postgres before the RUN postgres directive and got the following error instead:

postgres cannot access the server configuration file "/var/lib/postgresql/data/postgresql.conf": No such file or directory
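
At that point the builder stage looked roughly like this:

FROM postgres:9.4 AS builder
COPY custom-scripts/ /docker-entrypoint-initdb.d/
COPY data/ /tmp
USER postgres
# fails: there is no initialized cluster under /var/lib/postgresql/data yet
RUN postgres

If I am reading this right, the data directory is still empty at build time: in the official image, initdb (and the /docker-entrypoint-initdb.d/ scripts) are run by docker-entrypoint.sh when a container starts, not during the image build, so running the postgres binary in a RUN step finds no postgresql.conf.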

Labels: question (Usability question, not directly related to an error with the image)
