At our last meeting (https://meetbot.fedoraproject.org/meeting_matrix_fedoraproject-org/2024-09-1...), I agreed to try to summarize the current state of our discussion and to describe a possible solution. Here we go.
We have been discussing this topic for a long time; the initial tracking issue #63 (https://pagure.io/fedora-server/issue/63) is two years old.
Our initial intention was (and still is):
* Systematization of activities according to criteria and objective needs
* As a supplement to automated tests, covering aspects that may not be amenable to automated testing
* Integration and coordination with distribution-wide QA
* Discovery of new problems for which, of course, there are no automated tests (yet).
* This includes, among other things, monitoring the release changes that could potentially have side effects for Server.
* Checking the documentation for necessary updates
Topic "WHAT to test" ====================
One position was/is that manual testing is more or less completely redundant because everything relevant is now covered by automated testing.
One argument against this is that in the past, some problems were only noticed and found in the course of manual testing (e.g. software RAID when switching to GPT), or were not found at all, or not in time, because manual testing of a release was not carried out or was insufficient (e.g. the problems with LVM administration, one of the most important functionalities for Server).
And somehow it doesn't feel right to not test our installation media at all and to use them for an installation or upgrade for the first time only after the release has been published.
I suppose we can agree that "it is good to have human testing of the deliverables written to real physical media on real physical systems" (adamwill). And this is exactly what we did in the past. At times our manual testing program also included our central services, virtualization and containerization, which are highly consequential in the event of a failure.
Additionally, it seems agreeable "to test whatever workflows (we) have that *aren't* covered in the validation tests" (adamwill), although we may need to clarify what the "validation tests" actually cover.
We can probably agree on the following list of human / manual test tasks:
* Test DVD installation media on physical hardware
* Test netboot installation media on physical hardware
* Test VM (KVM) instantiation including first steps after first boot
* In both cases check:
**** Everything works without breakage
**** No distortion of the graphical or terminal output
**** No irritating, inaccurate or misleading error messages
**** Besides the graphical guided steps, check shell access (<F1> etc.) incl. access to log files, print screen etc.
**** Accuracy of the relevant documentation
* In case of DVD installation, additionally (here running on hardware):
**** Installing virtualization
**** Installing containerization with systemd-nspawn (see the sketch after this list)
**** Installing containerization with podman (as soon as we have documentation and procedures ready)
* Test the dnf upgrade procedure on hardware and on a "real life" instance, not just the minimal default (see the sketch after this list)
* Test the dnf upgrade procedure on a "real life" VM instance, not just the minimal default
* Create a list of any special or one-off tests that are likely to be needed, based on the list of changes
**** Perform and monitor these tests
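For reference, and to make the upgrade and nspawn items above concrete, here are minimal sketches of the procedures as I understand them. The release number 41, the container path and the package set are only examples for illustration, not fixed parts of the test plan.

Upgrade (the documented dnf system-upgrade workflow):

  sudo dnf upgrade --refresh                          # bring the current release fully up to date
  sudo dnf install dnf-plugin-system-upgrade          # install the plugin if not already present
  sudo dnf system-upgrade download --releasever=41    # download packages for the target release
  sudo dnf system-upgrade reboot                      # reboot into the offline upgrade step

Container instantiation with systemd-nspawn (illustrative only):

  sudo dnf install systemd-container                  # provides systemd-nspawn and machinectl
  sudo dnf --releasever=41 --installroot=/var/lib/machines/f41 \
       install systemd passwd dnf fedora-release      # bootstrap a minimal container tree
  sudo systemd-nspawn -D /var/lib/machines/f41 passwd # set a root password inside the tree
  sudo systemd-nspawn -b -D /var/lib/machines/f41     # boot the container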
Topic "HOW to test" ====================
Previously, we created a corresponding list as a tracking issue. For this release, I created a wiki page, which is easier to use.
As adamwill noted, this page "definitely shouldn't exist, we should fold anything important it covers into one of the existing pages". It takes us away from our original goal of aligning our testing with the distribution-wide QA. It is a stopgap solution because the current pages do not offer us this capability.
To organize the testing practically, we need a concise and clear list of all the tasks that need to be completed and that we can “tick off”. It would be good to have a structure like the one offered by the wiki page (https://fedoraproject.org/wiki/Server/QA_Manual_Testing_Overview). And nothing that is not part of the Server test program belongs on this page.
It would be really great if the current QA pages could be added to or changed accordingly.
A good starting page would be the server page that is now sent in the announcement emails: https://fedoraproject.org/wiki/Test_Results:Fedora_41_Branched_20240924.n.0_...
We should remove all items that have nothing to do with Server, starting with the download list.
The test matrix and coverage page / lists should be split into automated tests and manual/human tests.
The lists themselves can probably consist largely of annotated links to existing pages, preferably with anchors pointing directly to the relevant spot. They should also indicate who tested what and, above all, with which result. And it must be clear at a glance what has not yet been tested.
And we need a place for one-off, release-specific tests, should the need arise.
Topic "SUPPLEMENTING the tests" ===============================
Of our server-specific services (or roles), only two are currently covered by tests: PostgreSQL and IPA. This needs to be completed.
The process is:
1. define these as blocking roles in the PRD / tech spec
2. cover them in the release criteria
3. write wiki test cases
4. automate them
The first task is done. We should continue with the second one. Pragmatically, we should focus on the services for which documentation and procedures already exist: virtualization, containerization (nspawn), web server and NFS server.
The biggest issue might be the Apache server. Its current installation procedure effectively results in an unusable and broken instance. There is a lot of work to be done.
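To illustrate what "broken" means here: as far as I can tell, the bare installation path today amounts to little more than the following, which leaves only the default test page, with no content, no TLS and no site configuration. This is a sketch of the status quo, not a recommended procedure:

  sudo dnf install httpd                              # install the Apache httpd packages
  sudo systemctl enable --now httpd.service           # start the service with the stock configuration
  sudo firewall-cmd --permanent --add-service=http    # open the firewall for plain HTTP
  sudo firewall-cmd --reload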
But we should discuss this separately from the general release tests. We have tracking issue #61 for this (https://pagure.io/fedora-server/issue/61).
--
Peter Boy
https://fedoraproject.org/wiki/User:Pboy
PBoy@fedoraproject.org
Timezone: CET (UTC+1) / CEST (UTC+2)
Fedora Server Edition Working Group member
Fedora Docs team contributor and board member
Java developer and enthusiast