Docker: Run Postgres, Expose to Localhost

I needed to spin up a Postgres database for testing a new application, so I figured I’d do it via Docker to keep my system clean.

So, the plan is to develop an application on my PC (e.g. in PyCharm) and connect to the Postgres instance running inside the Docker container over localhost.

Getting & Running Postgres

This is actually quite trivial:

docker pull postgres
docker run --name postgres -e POSTGRES_PASSWORD=password -d -p 5432:5432 postgres

The first command pulls the image from Docker Hub. The second runs the container in the background (-d) and maps container port 5432 to the same port on the host (-p 5432:5432), so that we can communicate with it from our local host.
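If you happen to have the psql client installed on your host, you can verify the mapped port is reachable from outside the container right away. This is just a sketch: it assumes host-installed psql and uses the password set via POSTGRES_PASSWORD in the run command above.

```shell
# Connect from the host through the mapped port (not from inside the
# container); the URL uses the default "postgres" user and the password
# from the docker run command above.
psql "postgresql://postgres:password@localhost:5432/postgres" -c "SELECT version();"
```

If this prints a Postgres version string, the port mapping is working.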

Setting Up pgAdmin

The pgAdmin utility is a wonderful UI for working with Postgres. You can download it from the pgAdmin website.

In my case, I actually want to verify that connecting to Postgres works from outside of the container environment, so I chose to install the Windows version locally. If you are so inclined, pgAdmin also ships as a Docker image, so you can run it in a container as well and keep your system clean.

Once you’ve installed pgAdmin, open it, go to the browser tree on the left panel, right-click on “Servers”, and register a new server targeting host “localhost” and port “5432”, with username “postgres” and the password you set in the docker run command.

Once you’ve added the server and opened it, you should see live monitoring statistics for it. From there you can create a new database and work with it at will!
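If you’d rather skip the UI, the same verification can be done from the command line without installing anything on the host, by running psql inside the container itself. The database name “appdb” below is just a placeholder for illustration.

```shell
# Run psql inside the running container via docker exec; "postgres" here
# is the container name from the earlier docker run command.
docker exec -it postgres psql -U postgres -c "CREATE DATABASE appdb;"

# List databases to confirm it was created.
docker exec -it postgres psql -U postgres -c "\l"
```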

Persistent Data

Remember, if you delete your Docker container, the data for the database goes away with it, since it lives in the container’s writable layer. You can stop and start the container as much as you like, though. If you need to be able to delete the container without losing the data, look into using a Docker volume (this is exactly what they’re for).
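A minimal sketch of the volume approach, as a variant of the earlier run command: “pgdata” is an arbitrary volume name I chose, and /var/lib/postgresql/data is the data directory used by the official postgres image.

```shell
# Create (implicitly) and mount a named volume over the image's data
# directory, so the database files survive "docker rm postgres".
docker run --name postgres \
  -e POSTGRES_PASSWORD=password \
  -v pgdata:/var/lib/postgresql/data \
  -d -p 5432:5432 postgres
```

If you later delete and re-create the container with the same -v flag, it picks up the existing data.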