On Tue, Aug 02, 2016 at 11:30:37AM +0100, Milosz Wasilewski wrote:
For the migration to V2, we kept compatibility with the lava test shell definitions, as far as runtime operation is concerned. Once the V1 code is removed, we do have plans for improving lava test shell in ways which will break compatibility, but those changes cannot be made until then. In the meantime, the advice being written into the documentation is to prepare custom scripts in the test definition repository which do all the parsing and other work in a self-contained way, so that they can be tested locally against log files from the device. This is a
erm, do you suggest not to use LAVA at all?
I think the point here is (as we chatted privately):
Do Not Lock Yourself Out of Your Tests™, i.e.
- don't make your test code depend on the LAVA infrastructure in any way; make sure you can always run your tests by downloading the test code to a target device, installing its dependencies (the test code itself could do this), and running a single script (see the sketches below).
- make the LAVA-specific part as small as possible: just enough to e.g. gather any inputs that you get via LAVA, call the main test program, and translate your regular output into ways to tell LAVA how the test went (if needed). There is a sketch of what that could look like right below.
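To make that second point concrete, here is a minimal sketch of what the LAVA glue could look like. Everything in it is made up for illustration (the ./run-tests.sh entry point and its "TEST <name>: PASS|FAIL" output convention), except lava-test-case, which is the helper that lava test shell provides on the device; the wrapper only calls it when it is actually present:

  #!/usr/bin/env python3
  # lava-glue.py (hypothetical): the only LAVA-aware piece of the test
  # definition. It runs the self-contained test suite and forwards each
  # result to LAVA via lava-test-case when that helper is available.
  import re
  import shutil
  import subprocess
  import sys

  # Run the self-contained entry point (assumed name) and capture its output.
  proc = subprocess.run(["./run-tests.sh"], capture_output=True, text=True)
  sys.stdout.write(proc.stdout)

  # Assumed output convention: one "TEST <name>: PASS|FAIL" line per case.
  results = re.findall(r"^TEST (\S+): (PASS|FAIL)$", proc.stdout, re.MULTILINE)

  have_lava = shutil.which("lava-test-case") is not None
  for name, result in results:
      if have_lava:
          # Inside lava test shell, lava-test-case records the result in LAVA.
          subprocess.run(["lava-test-case", name, "--result", result.lower()],
                         check=True)
      else:
          # Outside LAVA the wrapper still runs; it just prints a summary.
          print("%s: %s" % (name, result.lower()))

  sys.exit(proc.returncode)

The test definition then only needs to call this one script; nothing else in the repository has to know that LAVA exists.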
That, of course, can't help with existing tests that already assume too much of LAVA to be usable without it.
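For new tests, the "scripts you can test locally against log files from the device" advice quoted above can be as simple as keeping the parsing in a plain script with no LAVA in it at all. Again a sketch, using the same made-up log format as the wrapper above:

  #!/usr/bin/env python3
  # parse-results.py (hypothetical): knows nothing about LAVA, so it can be
  # developed and debugged on any machine against a saved device log, e.g.
  #   ./parse-results.py saved-device.log
  import fileinput
  import re

  # Made-up convention: "TEST <name>: PASS" / "TEST <name>: FAIL" lines.
  PATTERN = re.compile(r"^TEST (\S+): (PASS|FAIL)\b")

  for line in fileinput.input():
      match = PATTERN.match(line)
      if match:
          print("%s: %s" % (match.group(1), match.group(2).lower()))

Saving the serial log from one LAVA run and feeding it to a script like that is enough to iterate on the parsing locally, without queueing a single job.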