Writing re-runnable shell scripts

I recently started playing with [devstack][1] again (an all-in-one OpenStack developer setup). The last time was over 3 years ago; I remember it because of a [pull request for a missing dependency][2] at the time.

The installation docs provide the information to bootstrap your system with the [necessary user and privileges][3]; however, like many software setup docs, they give one-off instructions.

adduser stack
echo "stack ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers

When you write operations code you need to be thinking about “testability” and “automation” at all times. It is important to write re-runnable code. Wherever possible, write parameterized code that can be refactored into reusable functions at any time.

Here is a good example demonstrating a simple test condition that makes the initial instructions re-runnable.

sudo su -
NEW_USER="stack"
# useradd creates a default group with the same name as the user
# and a default HOME in /home/stack
[ $(grep -c "^${NEW_USER}:" /etc/passwd) -eq 0 ] && useradd -s /bin/bash -m ${NEW_USER}
NEW_USER_SUDO_FILE="/etc/sudoers.d/${NEW_USER}"
# umask 226 ensures the sudoers file is created with 0440 permissions
[ ! -s ${NEW_USER_SUDO_FILE} ] && umask 226 && echo "${NEW_USER} ALL=(ALL) NOPASSWD: ALL" > ${NEW_USER_SUDO_FILE}
ls -l ${NEW_USER_SUDO_FILE}
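
Taking the earlier point about parameterized code one step further, the same steps can be wrapped in a reusable function. The sketch below is my own illustration (the function name create_nopasswd_user is not from the devstack docs) and assumes it is run as root:

create_nopasswd_user() {
  local new_user="$1"
  local sudo_file="/etc/sudoers.d/${new_user}"
  # Only add the user if it does not already exist
  if ! grep -q "^${new_user}:" /etc/passwd; then
    useradd -s /bin/bash -m "${new_user}"
  fi
  # Only write the sudoers entry if the file is missing or empty;
  # the subshell keeps umask 226 (file mode 0440) from leaking into the caller
  if [ ! -s "${sudo_file}" ]; then
    ( umask 226 && echo "${new_user} ALL=(ALL) NOPASSWD: ALL" > "${sudo_file}" )
  fi
  ls -l "${sudo_file}"
}

# Safe to call repeatedly, and for more than one user
create_nopasswd_user "stack"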

[1]: http://docs.openstack.org/developer/devstack/
[2]: https://github.com/ronaldbradford/devstack/commit/59b3f402cd9a997836b03560c7d31cf96304ca1e
[3]: http://docs.openstack.org/developer/devstack/guides/single-machine.html

