Object Storage: Validating S3 Compatibility

Chris Evans

This post is the fourth in a series of independently written technical briefings on the AWS S3 API and how it has become the de facto standard for object stores.  The series is commissioned by Cloudian Inc.  For more information on Cloudian and the company’s HyperStore object storage technology, please visit their site at: http://www.cloudian.com/

In previous posts we looked at some of the technical aspects of S3.  This time, we’ll look at testing the S3 API on object store platforms.  Note that this is an ongoing post: results will be added over time, so it’s worth checking back in the future.

The S3 API can be tested by writing to it directly through the REST-based interface.  Unfortunately this requires a reasonable amount of coding, mainly in creating lots of test cases.  So I decided to have a look at the open source S3 Tests code on GitHub (link).  The code is written in Python and uses a couple of testing frameworks.  I chose an Ubuntu distribution for my work and found it relatively easy to set up.  The testing itself uses a config file that holds information on the target object store endpoint and two users; two users are required in order to perform some of the security (ACL) related testing.  You also need to specify a test bucket prefix, as the tests create a large number of buckets with random names; prefixing them with whatever the tester specifies keeps them unique.
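To give a sense of what writing directly against the REST interface involves, here is a minimal sketch using the boto3 SDK (my choice of client for illustration, not something s3-tests mandates); the endpoint, credentials and bucket name are placeholders for whatever the object store under test provides:

```python
import boto3

# Placeholder values - substitute the endpoint and credentials
# issued by the object store under test.
ENDPOINT = "https://s3.example.internal"
ACCESS_KEY = "AKIAEXAMPLE"
SECRET_KEY = "example-secret-key"
BUCKET = "compat-test-bucket-001"

# Point the client at the object store rather than at AWS itself.
s3 = boto3.client(
    "s3",
    endpoint_url=ENDPOINT,
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
)

# A tiny subset of what a compatibility suite exercises: create a
# bucket, write an object, read it back and compare, then clean up.
s3.create_bucket(Bucket=BUCKET)
s3.put_object(Bucket=BUCKET, Key="hello.txt", Body=b"hello world")
body = s3.get_object(Bucket=BUCKET, Key="hello.txt")["Body"].read()
assert body == b"hello world", "GET returned unexpected data"
s3.delete_object(Bucket=BUCKET, Key="hello.txt")
s3.delete_bucket(Bucket=BUCKET)
print("Basic PUT/GET round trip succeeded")
```

Multiply that by ACLs, multipart uploads, versioning and all the error cases, and the scale of the coding effort becomes obvious, hence the appeal of an existing suite.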

The output from the s3-tests scripts shows the status of each test, plus a summary count at the end of the run.  Results are grouped into “skipped”, “errors” and “failures”, although there’s no detail to explain what differentiates each of these categories.  The bulk of the testing output is redirected to stderr and contains a lot of detail on each test run.  The tests themselves are hardcoded into the “s3-tests/s3tests” directory and it’s possible to trawl through the code to see what’s being run, but it isn’t easy.
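One way to make the output easier to digest is to capture the run to a file and pull the headline numbers out afterwards.  The following is just a rough helper, assuming a nose-style summary at the end of the log (lines such as “Ran 423 tests in …” and “FAILED (SKIP=…, errors=…, failures=…)”); adjust the patterns if your runner emits something different:

```python
import re
import sys

# Rough helper for a captured s3-tests run, e.g. one produced by
# redirecting stderr to a file:  nosetests ... 2> results.log
# Assumes a nose-style summary such as:
#   Ran 423 tests in 1815.9s
#   FAILED (SKIP=51, errors=41, failures=34)

def summarise(log_path):
    text = open(log_path).read()
    ran = re.search(r"Ran (\d+) tests", text)
    total = int(ran.group(1)) if ran else 0
    counts = {}
    for key in ("SKIP", "errors", "failures"):
        m = re.search(key + r"=(\d+)", text)
        counts[key] = int(m.group(1)) if m else 0
    return total, counts

if __name__ == "__main__":
    total, counts = summarise(sys.argv[1])
    print("ran {}: skipped {}, errors {}, failures {}".format(
        total, counts["SKIP"], counts["errors"], counts["failures"]))
```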

In my initial testing I saw a huge number of errors.  It turned out that the s3-tests scripts attempt to create more than 100 buckets, which is the default AWS limit per account (and a limit that vendors typically mirror).  Once that limit was raised, I saw more reasonable results.  So far these have been as follows:

Platform              Commit    Tests Run   Skipped   Errors   Failures   Success Rate
AWS S3                a72fc4a   423         51        41       34         82.3%
Cloudian HyperStore   a72fc4a   423         51        35       29         84.9%

On the initial testing we can see that both platforms skip the same number of tests.  Strangely, AWS, which should be 100% compatible with itself, doesn’t actually score 100% on the test.  Also shown are the results for Cloudian’s HyperStore, which scored higher than AWS.  I’m also working on testing against Swift, Ceph and other platforms that vendors are prepared to provide access to.
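For clarity, the success-rate column in the table above appears to be calculated by counting only errors and failures against the total, with skipped tests left in the denominator; this is my reading of the numbers rather than anything the suite reports itself.  Checking it against the AWS S3 row:

```python
# Apparent derivation of the success-rate column: skipped tests stay
# in the denominator; only errors and failures count against it.
tests_run, errors, failures = 423, 41, 34
success_rate = (tests_run - errors - failures) / tests_run
print("{:.1%}".format(success_rate))   # -> 82.3%
```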

The Architect’s View

The current range of available tools for S3 compatibility testing isn’t great.  The s3-tests suite has a good go at testing, but has a number of faults: first, the output isn’t particularly readable; second, the tests aren’t reported by functional group (unless they are run separately), making it difficult to see which part of the testing failed; third, unless you’re aware of the problems, things like bucket count limits cause tests to fail.  There could also be other issues occurring that aren’t easy to spot, such as the differences in bucket naming standards between AWS regions.

One other major issue is that the test procedures are continually being updated.  In my initial testing, the scripts ran 423 tests.  Using a version of the code from a month later, the test count was up to 497, with no obvious detail as to what the new tests were covering.  In the short term I’ll continue to run s3-tests against a range of test environments; however, I’m also working on a tool that will run a set of tests in a more user-friendly way, grouping the results by function.  This will include testing basic and more advanced functionality.  In the meantime, results will be tracked on a dedicated page – https://blog.architecting.it/resources/s3-compatibility-testing/.


Copyright (c) 2007-2020 – Post #80C1 – Brookend Ltd, first published on https://www.architecting.it/blog, do not reproduce without permission.