AWS-cli as custom docker container

Hi guys.

Battling… trying to deploy aws-cli as a local custom container, mounting a local dataset to /download in the container, and mounting /root/.aws from a local directory that will contain the configuration file.

Anyone got this working who's willing to share some screenshots?

I am away from my environment for the holidays, so I don't have any screenshots to share, but I do have aws-cli working. Remember that aws-cli isn't an app or service that runs in the background like a database or web server would. Instead, when the container runs it executes the command and then quits. This is because aws-cli is like 'ls' or 'cd': a command that does something and then exits. The container will not stay running. It was a little tricky to understand at first, but thinking about what the container was actually doing helped me make sense of the behaviour I witnessed.
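A quick way to see this run-and-exit behaviour for yourself (a sketch, assuming the official amazon/aws-cli image from Docker Hub):

```shell
# The container starts, runs the single aws-cli command, prints its output, and exits.
docker run --rm amazon/aws-cli --version

# Listing containers afterwards shows nothing left running from that image:
docker ps --filter ancestor=amazon/aws-cli
```

The `--rm` flag also removes the stopped container once the command finishes, so nothing accumulates between runs.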

Here’s some documentation that will be helpful to you:

Good luck.

hi hi

Makes sense, what you're saying. What I probably actually need is a simple Linux container with aws-cli installed, allowing me to remote into the container and use aws-cli to pull files from S3.
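If you really do want an interactive shell inside the container rather than one-shot commands, one way is to override the image's entrypoint. A minimal sketch, assuming the official amazon/aws-cli image (which has bash) and a hypothetical dataset path `/mnt/pool/dataset`:

```shell
# Drop into a shell inside the container, with credentials and the
# dataset mounted; run aws-cli commands interactively from there.
docker run --rm -it \
  --entrypoint /bin/bash \
  -v ~/.aws:/root/.aws \
  -v /mnt/pool/dataset:/download \
  amazon/aws-cli

# Inside the container you could then pull files straight into the dataset:
#   aws s3 sync s3://my-lovely-bucket/my_directory /download
```

That said, the alias approach below avoids needing to "remote in" at all.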

G

I prefer using an alias to invoke aws-cli directly. Your paths will vary:

$ alias aws='docker run --rm -ti -v ~/.aws:/root/.aws -v "$(pwd)":/aws amazon/aws-cli'

Then simply use: $ aws s3 sync /directory/x s3://my-lovely-bucket/my_directory

This approach makes aws-cli behave as if it’s installed on TrueNAS natively and not from a container—no need to remote into the container. Think of the container as a command wrapper, not a VM. Unlike services that run continuously (mariadb, nginx), aws-cli commands execute and exit immediately, just like ls or cd. Once you understand this distinction, it makes perfect sense.
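To tie this back to the original goal of pulling files from S3 into a local dataset: because the alias mounts the current directory as /aws (the image's working directory), you just cd into the target directory first. A sketch, with `/mnt/pool/dataset` and the bucket name as hypothetical placeholders:

```shell
# Make the alias survive new shells (path and bucket are examples only):
echo "alias aws='docker run --rm -ti -v ~/.aws:/root/.aws -v \"\$(pwd)\":/aws amazon/aws-cli'" >> ~/.bashrc

# Pull files from S3 into the dataset directory:
cd /mnt/pool/dataset
aws s3 sync s3://my-lovely-bucket/my_directory .
```

Each invocation spins up a fresh container, runs the one command, and exits, exactly the behaviour described earlier in the thread.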

Now that worked perfectly… the alias to the container, using the local Docker engine.

thanks.
G