provashell – testing shell scripts the easy way

In this post I will describe the provashell project, an Apache 2.0 licensed bash and shell unit testing library that uses annotations. The library is a self-contained script file with no dependencies that can be embedded in or used easily by your project. Extra care has been taken to use POSIX shell features, and the code has been tested in the popular shells bash, dash and zsh (latest versions at the time of writing). I will add some detail on what drove me to write it, what it does, how it works, and some examples.

Unit tests should be everywhere there is code. Tests materialise our expectations of behaviour, prevent obscure bugs, generally induce more elegant designs and also serve as very effective, up-to-date documentation. The oft-touted benefits are well worth it.

An area commonly overlooked in unit testing is code that is “just a script”. That label rests on an unfortunate and common misconception: code is just code and should be treated as such, with solid engineering principles and rigorous testing. Here’s a thought experiment: try to tell yourself what the differences between a ‘script’ and ‘proper code’ really are. Is it length? There might be a short piece of code that configures all your company’s firewall rules or backups, and that is quite an important piece of logic, isn’t it? Is it criticality? Working non-critical code tends to end up included in critical systems and by extension becomes ‘mission-critical’ as well. Is it the language it is coded in? Most languages are pretty much logically equivalent (and Turing complete), so if something coded in Python is translated into Java to do the same thing, its essential nature has not changed at all. Is it necessity? We could go on. Code is just code, and it can and should be tested.

Unit testing for bash and shell scripts

Testing bash and shell scripts is unfortunately not that common. As discussed, shell scripts should be tested as thoroughly as any other code. There are plenty of shell testing libraries out there, usually bash-specific implementations, under various licenses, and they are mature and well-tested. However, I was on the lookout for an Apache 2.0 licensed one that was simple, with no dependencies (such as the latest C++ compiler!), and I could not find one that suited my taste. One never knows anywhere near all there is to know about shell programming (trust me, you do not, especially when taking the different implementations into account), so I set about writing one myself that had the outlined characteristics, which would also help me learn more about shell coding.

Main features

The specific characteristics of provashell are as follows:

* Be unit-tested itself – Using plain shell test structures to do the assertion tests themselves. One can of course test the basic assert function and then leverage that tested function to check the other assertions, but I wanted to avoid false positives and keep concerns separate. Using provashell’s own assert functions to test itself results in more elegant code but is potentially confusing when failures occur, due to cascading effects (a short sketch of the plain-shell approach follows this list).

* Be as POSIX-compliant as possible – To that effect, the library has been tested in the latest (as of writing) versions of bash, zsh and dash, the latter being quite POSIX-strict. While bash-isms can be very practical when coding and in interactive shell sessions, cross-shell testing is a good code quality exercise that forces engineers to double-check lots of assumptions and shortcuts, generally leading to better scripts. Even though I much prefer zsh for interactive sessions (especially when paired with toolchains such as the brilliant oh my zsh), once you have the mindset that shell implementations are real programming languages, it is fairly easy to mentally switch to bash or, even better, POSIX shell ‘mode’ when writing persistent scripts. Such mental gymnastics will greatly help if you find yourself working on an old or unfamiliar system with only ksh installed, or something like that.

* Run no user-supplied code – This is an important security characteristic. The very first version of provashell used eval to run assertion checks in a flexible way; this resulted in elegant code, but it also meant that test data could include shell code that would be run by the test framework. This is insecure and should be avoided if there are other solutions at hand, even if they are a bit more complex or less elegant. The latest version does not use eval anywhere in the code, so it will not execute any user or library-user code, except for running the configured setup and teardown functions, of course. Whatever happens in those functions is up to the test developer. In any case, automated unit tests should never include user-supplied or variable data, to significantly lower the risk of attacks through Continuous Integration systems or any automated test routines. Following that strategy, provashell does not run any user-supplied code other than the configured setup and teardown functions and the declared tests, and the assertion functions do not execute any external code to the best of my knowledge (you should check the code yourself anyway; grep -R eval src is your friend here). It goes without saying that test shells should pretty much never be run as root, unless there is a very good reason (which there isn’t).

* Do not do (too much) funky stuff – Try to be as simple as possible and reuse code wherever feasible, so there is little repetition in the test library. It should also be easy to read and understand by any reasonably experienced shell coder. It is worth stressing again that shell scripts are real code and should be treated as such at all times.

* Use annotations to identify tests – Tests can be named any way the developer wants. I like annotations for tests because even though appending ‘Test’ to test names is a really simple convention, it sometimes adds extra cruft in a context where it is not actually needed. For instance, a test called ‘errorIsReturnedOnNetworkTimeout’ is quite self-explanatory and easily understood in a test class context. Naming it ‘errorIsReturnedOnNetworkTimeoutTest’ does not add much to the definition and extends the name needlessly. It is of course a matter of style, and it could be argued that adding annotations to tests adds the same cruft, just in a different place. In any case, provashell uses annotations to identify tests and related functions, which works well and is simple to use. Here’s a summary diagram of all the supported annotations (they need to go in a bash comment line):

[Diagram: function annotations defined by provashell]
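
As a side note on the first point above, here is a minimal sketch of the plain-shell self-testing approach (my own illustration, not provashell’s actual internals): an assertEq-style function is checked with bare [ ] tests and plain if statements, so a bug in the assert cannot mask itself.

# Sketch: verify an assertEq-style function using plain shell tests only.
# 'myAssertEq' is a hypothetical stand-in, not provashell's real function.
myAssertEq() {	# arguments: message, expected, actual
	[ "$2" = "$3" ] || { printf '%s\n' "$1"; return 1; }
}

failures=0
# the assert itself is checked with bare if statements, no framework involved
if ! myAssertEq 'should have passed' foo foo; then
	failures=$((failures+1))
fi
if myAssertEq 'should have failed' foo bar >/dev/null; then
	failures=$((failures+1))
fi
[ "$failures" -eq 0 ] && echo 'assert self-test OK'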

Yeah! Give me some shell test examples

Usage is pretty straightforward and can be demonstrated with an example. Imagine we have a function to test that counts the number of a’s in a text passed as a parameter. It is probably not a very good implementation, but it is a good enough example:

countAs() {
	# keep only the 'a' characters, drop the trailing newline and count what is left,
	# handing the count back as the function's exit status
	c=$(printf %s "$1" | sed 's/[^a]*//g' | tr -d '\n' | wc -m)
	return "$c"
}
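
Note that the count is handed back through the function’s exit status (which can only hold values from 0 to 255), so callers read it from $? right after the call:

countAs 'banana'
echo $?	# prints 3, as 'banana' contains three a's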

We then have two tests like this:

#@Test
countAsNormally() {

	countAs 'a'
	assertEq "Does not count one 'a' in a string with only one 'a'" 1 $?

	countAs 'b'
	assertEq "Does not count zero a's in a string with no a's" 0 $?

	countAs 'aaa'
	assertEq "Does not count three straight a's when they are there" 3 $?

	countAs 'abab'
	assertEq "Does not count two a's in a string when they are there" 2 $?

}

#@Test
countAsEdgeCases() {
	
	countAs ''
	assertEq 'Does not count empty string correctly' 0 $?
	
}

Once we have defined the tests we need to source the library like this (using whatever path we have for the provashell file):

. provashell

This will run the provashell code within the current context, causing the tests to be executed as expected, including the annotated pre and post functions.
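
Putting it all together, a complete test script might look roughly like the sketch below. The #@Test annotation comes straight from the examples above, while the setup and teardown annotation names used here are hypothetical placeholders; check the project page for the exact supported set:

#!/bin/sh

# NOTE: '#@Setup' and '#@Teardown' are hypothetical annotation names used
# for illustration only; see the provashell docs for the real ones

#@Setup
setup() {
	tmpfile=$(mktemp)
}

#@Teardown
teardown() {
	rm -f "$tmpfile"
}

#@Test
tmpFileHoldsWrittenValue() {
	printf 'hello' > "$tmpfile"
	assertEq 'temp file should hold the written value' 'hello' "$(cat "$tmpfile")"
}

. provashell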

Complete documentation

Extensive docs and examples can be found at the GitHub provashell project page. In any case, the source is quite short and readable.

Contributing

The provashell library uses the Apache 2.0 license. Pull requests are welcome so fork away!

Building an FTP server Maven plugin from scratch

In this post we design and code a new Maven plugin that fires up an FTP server to aid in integration testing of FTP-bound processes, thus demonstrating the flexibility and power of Maven plugins.
Continue reading “Building an FTP server Maven plugin from scratch”

Components on the server (6): adding Integration Testing

In this installment of the server-side OSGi series, we add integration testing capabilities to our project. Integration testing goes beyond plain unit testing and checks the interactions between real components. This is in contrast with unit testing, which generally uses mockups to represent components outside the one being tested. Please take a look at previous installments, as usual.

In the case of integration testing, it is mainly used in a pre-production environment, with a valid build that has passed all its unit tests. It can even be used in production just after a deployment is made, taking care not to include destructive checks or massive load tests in the integration test code. YMMV.

To achieve integration testing we need to check that the various deployed OSGi components interact in the way that is expected of them. We therefore need to test the components as a group and not in isolation. Doing that in the OSGi world means having access to the OSGi context from within the tests, so we can obtain services, call them, check their responses, and so on.

To allow for this kind of integration testing within the OSGi environment, we make a slight modification to the excellent test.extender we have already patched in the previous installment.

The stock test.extender seeks out any JUnit test classes within the fragment bundle, creates an instance using the empty constructor and then fires up the tests. This is activated either by default when the fragment is loaded or by using ‘test ’ in the console. For further information please see the previous post on this subject.

For our integration testing, we add an extra command to test.extender:

public Object _integrationTest(CommandInterpreter intp) {
	String nextArgument = intp.nextArgument();
	testExtender.integrationTest(Long.parseLong(nextArgument));
	return null;
}

And we refactor the TestExtender to add the integrationTest method, which reuses some of the existing code to instantiate test cases using a constructor that accepts the OSGi context as a parameter.

Constructor<?>[] constructors = clazz.getConstructors();
boolean foundConstructor = false;
for (int i = 0; i < constructors.length && !foundConstructor; i++) {
	Constructor<?> constructor = constructors[i];
	Class<?>[] types = constructor.getParameterTypes();
	// look for a constructor taking a single parameter compatible with the OSGi context
	if (types.length == 1 && types[0].isInstance(context)) {
		foundConstructor = true;
		EClassUtils.testClass(inspectClass, constructor.newInstance(context));
	}
}

The OSGi context is passed onto the constructor and then the test class is run. It is obviously up to the test class to use the context appropriately for its integration testing.

In our cache project setup, we can do some useful integration testing on the cache.controller component, basically checking that its interaction with the provider components behaves as we expect. The integration testing is also added to a fragment that can be deployed optionally, of course.

We start by creating the fragment and adding a testing class like this:

[Screenshot: adding the test class to the integration fragment]

Next, we add the constructor that accepts an OSGi context, which is very simple:

public CacheIntegrationTest(BundleContext ctx) {
	super();
	this.context = ctx;
}

In the setup and teardown methods we get and unget the cache service to perform the testing:


public void setUp() throws Exception {
	// look up the cache service through the OSGi context received in the constructor
	serviceReference = context.getServiceReference(Cache.class.getName());
	controller = (CacheControllerCore) context.getService(serviceReference);
}

public void tearDown() throws Exception {
	context.ungetService(serviceReference);
	controller = null;
}

In this case we get the controller cache service and store it in an instance field used to perform the tests. This is quite simple and fulfills our intended purpose, but we still have the flexibility to do more complex integration testing if needed.

Next we create as many test cases as needed:

public void testGet() {
	try {
		controller.init();
		double v = Math.random();
		String k = "/k" + v;
		controller.set(k, v);
		assertEquals(v, controller.get(k));
	} catch (CacheProviderException e) {
		e.printStackTrace();
		fail(e.getMessage());
	}
}

It should be noted that while this looks like regular testing code, it is actually using real services from the OSGi environment as opposed to mockups. This means we are testing the real integration between components as well as the individual controller component code. The disadvantage is that if there is an error in the controller we might mistake it for an issue with the services being used. In conclusion, having integration tests doesn’t negate the need for unit tests.

Once we load the fragment onto the environment, we first need to obtain the bundle id of the integration fragment and can then launch the integration testing in this manner:


osgi> integrate 125
Bundle : [125] : com.calidos.dani.osgi.cache.controller.integration
_
CLASS : [com.calidos.dani.osgi.cache.controller.CacheIntegrationTest]
___________________________________________________________________________
Method : [ testInit ] PASS
Method : [ testInitInt ] PASS
Method : [ testSize ] PASS
14:21:43,077 WARN CacheControllerCore Couldn't clear some of the provider caches as operation is unsupported
14:21:43,077 WARN CacheControllerCore Couldn't clear some of the provider caches as operation is unsupported
Method : [ testClear ] PASS
Method : [ testSet ] PASS
Method : [ testGet ] PASS
Method : [ testGetStatus ] PASS
___________________________________________________________________________

The results tell us that all operations are OK, but we need to bear in mind that the clear operation is not supported by some of the backend caches. If this is what the operator expects, then all is fine.

We take advantage of the new integration testing functionality to make some extensive changes to the logging and exception handling of the controller code. By running the integration tests we can make sure everything still works (even though we still need some proper unit testing of the controller). The modifications are made quite quickly thanks to the integration tests.

To recap, we’ve added integration testing support to the existing ‘test.extender’ bundle and created integration testing code for the cache controller component. This has allowed us to make code changes quickly with less risk of mistakes.

Here you can find a patch for the test extender project as well as the patched testing bundle already compiled. Enjoy!

Components on the server (5): better Unit Testing

In this installment of the OSGi series, we add more complete Unit Testing support to the project. We also establish that some behaviour of the Servlet Bridge may not be what we want, and then provide a way to customize it.

Continue reading “Components on the server (5): better Unit Testing”