How to integration/unit test software-hardware interfaces

I'm working on a small, fun project that builds a robot. We, the programmers, work in parallel with the people building the robot, so it is very often the case that we are trying to run changed software while the builders have changed the hardware. If the software tests fail, it is always hard to figure out whether the software or the hardware is at fault, or, even worse, the integration. Automating tests for these issues has some hard parts.

We have figured out some ways of breaking things down. We use RC control to put the robot through some movements without any software, to make sure the hardware itself still works. Then we run some software tests that make the robot perform defined figures to show that the software behaves the same way as before. But this always ends up being a very time-consuming task, because it can't be automated: someone has to start the test, watch it, and try to figure out whether the robot did what it should do.

Another problem is that constant testing on the real hardware wears out parts: joints, motors, gear wheels, and so on.

But not testing has proven to cause so much trouble and consume so much time that I would like to know what techniques other projects dealing with hardware/software interaction use, and whether there are tools out there that can help.

Wantage answered 17/6, 2009 at 3:51

The interface between the robot and the software must be defined first; not necessarily exhaustively, as this can be done incrementally. Start small, for instance with basic moves (forward, backward); then, once those have been fully tested, both in isolation and integrated, add some behaviour (e.g. turn left, turn right) and retest. That way, the whole team can apply what it has learned throughout the project to extend the interface, minimizing interface rework.
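As a rough sketch of what such an incremental interface could look like (all names below are made up for illustration, not taken from the question), the first increment might only cover forward/backward, with a fake implementation that higher-level code is unit-tested against:

    # Hypothetical sketch of an incrementally grown robot interface:
    # start with forward/backward, test, then extend with turns later.
    from abc import ABC, abstractmethod


    class DriveInterface(ABC):
        """First increment of the agreed software/hardware interface."""

        @abstractmethod
        def forward(self, distance_cm: float) -> None: ...

        @abstractmethod
        def backward(self, distance_cm: float) -> None: ...


    class FakeDrive(DriveInterface):
        """In-memory stand-in for unit tests; records what was commanded."""

        def __init__(self) -> None:
            self.log: list[tuple[str, float]] = []

        def forward(self, distance_cm: float) -> None:
            self.log.append(("forward", distance_cm))

        def backward(self, distance_cm: float) -> None:
            self.log.append(("backward", distance_cm))


    def shuttle(drive: DriveInterface, distance_cm: float) -> None:
        """Higher-level behaviour written only against the interface."""
        drive.forward(distance_cm)
        drive.backward(distance_cm)


    if __name__ == "__main__":
        fake = FakeDrive()
        shuttle(fake, 50.0)
        assert fake.log == [("forward", 50.0), ("backward", 50.0)]
        print("behaviour verified against the fake drive")

The turn commands would then be added to the interface and the fake in the next increment, and the same tests rerun.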

The Progress before Hardware article describes such a process in greater detail, focusing on the test-driven development (TDD) aspect.

See also answers to the How to do TDD with hardware question.

Lightness answered 17/6, 2009 at 8:16

I think it's a very interesting situation.

I believe there's no problem with your testing process. If you mock your robot and test against this mock, it's all good.
If the hardware robot acts differently from your mocked robot, there's another, bigger problem: the communication.

The interface between the software and the hardware is the "protocol" specification. In my opinion it must not be changed without discussion. The hardware guys may not change it on their own, and neither may you software guys! You may only change it together. In your situation, everybody changes it on their own.
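Purely as an illustration of what "one shared protocol specification" could mean in practice (the frame layout and command codes below are invented, not the poster's actual protocol), both teams could code against a single, versioned message definition:

    # Illustrative only: one versioned definition of the wire protocol
    # that both the hardware and the software side code against.
    import struct

    PROTOCOL_VERSION = 3  # bump only after both teams agree on a change

    # One command frame: version, command id, signed 16-bit argument.
    FRAME_FORMAT = ">BBh"  # big-endian: uint8, uint8, int16

    CMD_FORWARD = 0x01
    CMD_TURN_LEFT = 0x02


    def encode(cmd: int, arg: int) -> bytes:
        """Pack a command into the agreed frame layout."""
        return struct.pack(FRAME_FORMAT, PROTOCOL_VERSION, cmd, arg)


    def decode(frame: bytes) -> tuple[int, int]:
        """Unpack a frame, rejecting frames from a different protocol version."""
        version, cmd, arg = struct.unpack(FRAME_FORMAT, frame)
        if version != PROTOCOL_VERSION:
            raise ValueError(f"protocol mismatch: got v{version}, expected v{PROTOCOL_VERSION}")
        return cmd, arg


    if __name__ == "__main__":
        cmd, arg = decode(encode(CMD_FORWARD, 250))
        assert (cmd, arg) == (CMD_FORWARD, 250)
        print("frame round-trips under protocol version", PROTOCOL_VERSION)

Bumping the version number only after a joint decision makes unannounced changes fail loudly instead of producing puzzling robot behaviour.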

In your situation, your teams seem to be working against each other. So focus your efforts on your interface and especially on your communication, not on integration tests that won't work anyway.

My suggestion would be to use a software mock of the robot as the one and only specification. That way you can rely on your mock, and there is a central point that defines the connection between hardware and software.
When the hardware guys want to change it, fine: they have to discuss it with you, and you will change the software mock. If the hardware was changed but the mock was not, you have an excuse, because you developed against your specification.
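A minimal sketch of this "mock as specification" idea (names and behaviour are hypothetical): the same contract tests run against the mock now and can later be pointed at the real hardware driver by overriding make_robot.

    # Sketch: the mock robot *is* the specification. Any interface change
    # happens here first, and the same tests must pass on the real driver.
    import unittest


    class MockRobot:
        """Reference behaviour both teams agree on."""

        def __init__(self) -> None:
            self.position = 0

        def forward(self, distance_cm: int) -> None:
            if distance_cm <= 0:
                raise ValueError("distance must be positive")
            self.position += distance_cm


    class RobotContractTest(unittest.TestCase):
        """Run against MockRobot now; run against the real driver later."""

        def make_robot(self):
            return MockRobot()

        def test_forward_moves_robot(self):
            robot = self.make_robot()
            robot.forward(10)
            self.assertEqual(robot.position, 10)

        def test_rejects_nonpositive_distance(self):
            robot = self.make_robot()
            with self.assertRaises(ValueError):
                robot.forward(0)


    if __name__ == "__main__":
        unittest.main()

If the real driver cannot pass these tests, the discrepancy is by definition an interface change that should have been discussed first.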

Good luck!

Eclogite answered 17/6, 2009 at 5:48
