roshi at fedoraproject.org
Fri Apr 4 00:13:56 UTC 2014
I've recently been working on a _rough_ proof of concept for my Test
Maps idea - and it's just far enough along to get across what I have
in mind. As I outlined on my blog, our test matrices are currently
large, and testcases are only logically grouped together on the wiki.
This leads to two issues I can see: 1 - you have to look through the
given matrix to find something you can/want to test and 2 - only those
who have worked through the matrix several times (or happened to write
the matrix) can easily know when and what to test.
The current testing workflow requires a lot of organic knowledge for new
testers to learn. A new contributor joins in on IRC, says "What can I
test?" and those in channel will ask about the h/w the contributor has,
their DE preferences, etc., to give them a starting point somewhere on
the matrix.
What I envision is a simple web-based tool that new contributors can go
to, answer some questions (What arch do you have? What DE do you use?
What install media do you have available? etc.) and be handed a list of
tests they can do in a sensible order (which would also aid testing of
the multiple WG products - each could make its own test map). The issue
with this is that we don't have an easy way to find out what the h/w and
software requirements are for every test case we have.
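To make the idea concrete, here is a minimal sketch of what matching
test cases to a tester's answers could look like, once per-testcase
requirement data exists. The testcase names and requirement fields
below are invented for illustration - none of this is in the proof of
concept yet.

```python
# Hypothetical testcase records: each one declares the arch, install
# media, and desktop environments it applies to. A "de" of None means
# the test is desktop-agnostic. All names here are made up.
TESTCASES = [
    {"name": "QA:Testcase_Boot_default_install",
     "arch": {"x86_64", "i386"}, "media": {"dvd", "netinst"}, "de": None},
    {"name": "QA:Testcase_desktop_login",
     "arch": {"x86_64"}, "media": {"live"}, "de": {"GNOME", "KDE"}},
    {"name": "QA:Testcase_Anaconda_updates.img_via_URL",
     "arch": {"x86_64", "i386"}, "media": {"dvd"}, "de": None},
]

def matching_tests(arch, media, de):
    """Return names of test cases whose requirements fit the tester's setup."""
    results = []
    for tc in TESTCASES:
        if arch not in tc["arch"]:
            continue
        if media not in tc["media"]:
            continue
        if tc["de"] is not None and de not in tc["de"]:
            continue
        results.append(tc["name"])
    return results

# A tester with x86_64 hardware, a DVD image, and GNOME would be handed
# the two DVD-capable tests and skip the live-media login test.
print(matching_tests(arch="x86_64", media="dvd", de="GNOME"))
```

The point is just that once requirements live in data rather than in
contributors' heads, "What can I test?" becomes a lookup instead of an
IRC conversation.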
Until that data exists, we have to hand-write lists of which tests make
sense to do one after the other (updating testcase requirements as we
go). This brings us to the proof of concept, which lives here:
With this proof of concept, I'm hoping the idea solidifies a bit more
in people's heads. Then we can determine whether this would be a useful
thing for us to pursue going forward - or if it isn't as nifty an idea
as I think it could be.
There is currently only one "test map" to click through, and a plethora
of features aren't yet implemented. The key features that would need to
be written before we could *actually use it* include:
- Create new "test maps" without hard-coding them
- Select which map to follow
- Easily add or update testcases
- Add/remove h/w and software requirements
- Dynamically find tests to run based on the user's h/w
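As a rough illustration of the first item above, a test map could be
plain data loaded at runtime instead of being hard-coded: each step
names a testcase and points at the step(s) that sensibly follow it.
The format and testcase names below are entirely hypothetical, just a
sketch of one way it might work.

```python
import json

# A hypothetical "test map" expressed as JSON, so new maps can be
# added or edited without touching application code. Each step lists
# the indices of the steps that follow it.
TESTMAP_JSON = """
{
  "name": "Default install",
  "steps": [
    {"test": "QA:Testcase_Boot_default_install", "then": [1]},
    {"test": "QA:Testcase_partitioning_guided_empty", "then": [2]},
    {"test": "QA:Testcase_desktop_login", "then": []}
  ]
}
"""

def walk(testmap, start=0):
    """Yield testcase names by following each step's 'then' pointers."""
    step = testmap["steps"][start]
    yield step["test"]
    for nxt in step["then"]:
        yield from walk(testmap, nxt)

testmap = json.loads(TESTMAP_JSON)
# Walking the map hands the tester the steps in a sensible order.
print(list(walk(testmap)))
```

Branching maps (e.g. a different next step per DE) would just mean
more than one index in a step's "then" list.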
If this idea proves to be something we want to work on, I can put
together a more complete road-map/feature-list for review.
Here are some of the cooler things we could potentially do with a system
like this:
- FAS Integration (keep track of hardware profiles and post results,
control edits to testcases)
- Track test results
-- See results in real time
-- Stats on testing and hardware usage
- Edit Testcases (and push them back to the wiki)
- Badges integration
I'm sure there's plenty of stuff I haven't thought of or had pointed
out to me - so if you have any thoughts or questions, reply!