AWS Pass tool - usage

Posted on 2018-12-27 Updated on 2019-10-03

Automating a multi-account AWS setup in a secure way. Having all credentials secured is good, but it is even better when access is both automatic and secure.

Pass automation

Great news! Almost no work is needed. In practice, if you have the awspass tool working, the job is almost done.

But what is it all about?

Great question, I'm glad you've asked. :) Let's say you have several AWS accounts - separate ones for testing, UAT, and production. Also, let's say you work with another system that has only two environments: testing and production. So what? So you need to decide whether your user acceptance tests run against their test environment or their production one. Usually test does not have enough data, so production is used. Now every check, report, and reconciliation has to deal with two of your accounts: an account switch, queries, and so on, every single time.

If that is not your case, then let's say you are debugging something on the test environment and someone asks for a quick check on production. You have to interrupt your work, do what is needed, and then get back to where you were.

Both cases suck. The good news is that pass can help with both. It is worth noting that it is not the only solution: you can use environment / shell variables to control AWS credentials, so instead of putting the information into ~/.aws/ one can use the environment. But that has to be done per shell, and since I usually have many tabs open, with different things here and there... you probably see my pain. So I use a different way.
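
For reference, the environment-variable approach looks roughly like this with boto3 - the values below are placeholders, and this is exactly what would have to be repeated in every shell or process:

import os
import boto3

# Placeholder credentials - boto3 (like the AWS CLI) picks these variables up
# automatically, so nothing has to live in ~/.aws/ for this process.
os.environ['AWS_ACCESS_KEY_ID'] = 'AKIA...'
os.environ['AWS_SECRET_ACCESS_KEY'] = '...'
os.environ['AWS_DEFAULT_REGION'] = 'eu-west-1'

s3 = boto3.client('s3')
print([b['Name'] for b in s3.list_buckets()['Buckets']])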

Dealing with almost

As I wrote in the first paragraph, you are almost done. So now it is time to get it done.

Let's say I have to query databases, managed by RDS, from different accounts. To do that, a tunnel is usually used. One can even set up tunnels to query many databases at the same time (a sketch of that follows the snippet below). A tunnel from Python:

import psycopg2
from sshtunnel import SSHTunnelForwarder

# env_name, db_info, config, get_db_credentials and get_time_info come from
# the surrounding script and its configuration.

# The with-block opens the tunnel and closes it when done: local port 15432
# is forwarded to the RDS endpoint through the environment's gate host.
with SSHTunnelForwarder(
        (f'{env_name}-gate-host', 22),
        ssh_username='user_name',
        ssh_pkey='/var/ssh/rsa_key',
        remote_bind_address=(db_info['url'], 5432),
        local_bind_address=('0.0.0.0', 15432)):

    db_credentials = get_db_credentials(env_name, config)
    db_connection_string = (
        f"dbname='{db_info['db']}' user='{db_credentials['username']}' "
        f"host='127.0.0.1' port=15432 password='{db_credentials['password']}'")
    start_point, location = get_time_info(env_name)

    with psycopg2.connect(db_connection_string) as conn, \
            conn.cursor() as cur, \
            open(location, 'w') as outfile:

        # COPY ... TO STDOUT streams the query result straight into the CSV file.
        cur.copy_expert(f"""
        COPY (
            select *
            from performed_orders
            where created >= '{start_point}'::timestamp
            order by created)
        TO STDOUT
        WITH CSV HEADER QUOTE AS '"'
        FORCE QUOTE id,eod""", outfile)
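
As mentioned above, several tunnels can be open at the same time. A minimal sketch, with made-up host names, where each database gets its own local port:

from sshtunnel import SSHTunnelForwarder

# Sketch only: two tunnels side by side; 127.0.0.1:15432 reaches the first
# database and 127.0.0.1:25432 the second one.
with SSHTunnelForwarder(('test-gate-host', 22),
                        ssh_username='user_name',
                        ssh_pkey='/var/ssh/rsa_key',
                        remote_bind_address=('test-db-host', 5432),
                        local_bind_address=('0.0.0.0', 15432)), \
     SSHTunnelForwarder(('prod-gate-host', 22),
                        ssh_username='user_name',
                        ssh_pkey='/var/ssh/rsa_key',
                        remote_bind_address=('prod-db-host', 5432),
                        local_bind_address=('0.0.0.0', 25432)):
    ...  # query both databases here, exactly as in the snippet above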

To do that, sshtunnel is needed: pip3 install sshtunnel. Some notes on the tunnel code above: get_db_credentials fetches the database credentials, which can be stored in pass or in SSM. Here is a piece of code for the latter case:

import boto3

# Per-account credentials come from the same config used in the tunnel snippet.
ssm = boto3.client('ssm',
                   aws_access_key_id=config['aws_access_key_id'],
                   aws_secret_access_key=config['aws_secret_access_key'])
res = ssm.get_parameter(Name=config['db_password_key'], WithDecryption=True)

and res['Parameter']['Value'] contains the password, decrypted. COPY is a PostgreSQL clause that allows data export, so the result is stored in the provided outfile. Note that you can open the tunnel independently, whenever you wish, so a detailed report can be produced at any time. For get_db_credentials to work, the username is also needed; usually that is a piece of cake.
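
For the pass case, get_db_credentials can simply shell out to pass. A minimal sketch - the entry paths below are hypothetical and depend on how your password store is organised:

import subprocess

def get_db_credentials(env_name, config):
    # Hypothetical store layout: db/<env>/username and db/<env>/password.
    # config is kept only to match the signature used in the tunnel snippet.
    def pass_show(entry):
        out = subprocess.run(['pass', 'show', entry],
                             capture_output=True, text=True, check=True)
        # pass prints the decrypted entry; the value is on the first line
        return out.stdout.splitlines()[0]

    return {'username': pass_show(f'db/{env_name}/username'),
            'password': pass_show(f'db/{env_name}/password')}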

Final question

Is this safe? Well, yes. Every time you use pass, the GPG passphrase must be provided. gpg-agent caches it for a moment, but still, nothing can happen without you.