tdc - Adding test cases for tdc

Author: Lucas Bates - lucasb@mojatatu.com

ADDING TEST CASES
-----------------

User-defined tests should be added by defining a separate JSON file. This
will help prevent conflicts when updating the repository. Refer to
template.json for the required JSON format for test cases.

Include the 'id' field, but do not assign a value. Running tdc with the -i
option will generate a unique ID for that test case.

tdc will recursively search the 'tc' subdirectory for .json files. Any
test case files you create in these directories will automatically be included.
If you wish to store your custom test cases elsewhere, be sure to run tdc
with the -f argument and the path to your file.

Be aware of required escape characters in the JSON data - particularly when
defining the match pattern. Refer to the tctests.json file for examples when
in doubt.


TEST CASE STRUCTURE
-------------------

Each test case has required data:

id:           A unique alphanumeric value to identify a particular test case
name:         Descriptive name that explains the command under test
category:     A list of single-word descriptions covering what the command
              under test is testing. Example: filter, actions, u32, gact, etc.
setup:        The list of commands required to ensure the command under test
              succeeds. For example: if testing a filter, the command to create
              the qdisc would appear here.
              This list can be empty.
              Each command can be a string to be executed, or a list consisting
              of a string which is a command to be executed, followed by 1 or
              more acceptable exit codes for this command.
              If only a string is given for the command, then an exit code of 0
              will be expected.
cmdUnderTest: The tc command being tested itself.
expExitCode:  The code returned by the command under test upon its termination.
              tdc will compare this value against the actual returned value.
verifyCmd:    The tc command to be run to verify successful execution.
              For example: if the command under test creates a gact action,
              verifyCmd should be "$TC actions show action gact"
matchPattern: A regular expression to be applied against the output of the
              verifyCmd to prove the command under test succeeded. This pattern
              should be as specific as possible so that a false positive is not
              matched.
matchCount:   How many times the regex in matchPattern should match. A value
              of 0 is acceptable.
teardown:     The list of commands to clean up after the test is completed.
              The environment should be returned to the same state as when
              this test was started: qdiscs deleted, actions flushed, etc.
              This list can be empty.
              Each command can be a string to be executed, or a list consisting
              of a string which is a command to be executed, followed by 1 or
              more acceptable exit codes for this command.
              If only a string is given for the command, then an exit code of 0
              will be expected.


SETUP/TEARDOWN ERRORS
---------------------

If an error is detected during the setup/teardown process, execution of the
tests will immediately stop with an error message and the namespace in which
the tests are run will be destroyed. This is to prevent inaccurate results
in the test cases.

Repeated failures of the setup/teardown may indicate a problem with the test
case, or possibly even a bug in one of the commands that are not being tested.

It's possible to include acceptable exit codes with the setup/teardown command
so that it doesn't halt the script for an error that doesn't matter. Turn the
individual command into a list, with the command being first, followed by all
acceptable exit codes for the command.
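Putting the fields above together, a hypothetical test case file might look
like the sketch below. The action, index, and match pattern are illustrative
only (the id shown is a placeholder of the kind tdc -i would generate), and
the setup entry demonstrates the list form of a command with acceptable exit
codes, so a pre-existing flush failure does not halt the run:

```
[
    {
        "id": "abcd",
        "name": "Add gact pass action",
        "category": [
            "actions",
            "gact"
        ],
        "setup": [
            [
                "$TC actions flush action gact",
                0,
                1,
                255
            ]
        ],
        "cmdUnderTest": "$TC actions add action pass index 8",
        "expExitCode": "0",
        "verifyCmd": "$TC actions list action gact",
        "matchPattern": "action order [0-9]*: gact action pass.*index 8",
        "matchCount": "1",
        "teardown": [
            "$TC actions flush action gact"
        ]
    }
]
```

Note that any backslash needed in the matchPattern regex must be doubled to
survive JSON parsing - for example, whitespace would be matched with "\\s".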