Now another attempt at a summary
What to manually test
=====================
A) regular QA tests
-------------------
1. "Default boot and install“ tests on „USB“ and optimally „DVD“ https://fedoraproject.org/wiki/Test_Results:Current_Installation_Test
Currently it is divided into 2 tests: DVD (or local) and Netboot. But we have 4 ways per boot procedure:
- local interactive
- local remote (via VNC)
- local automatic (via ks file on OEMDRV USB; see the sketch at the end of this section)
- remote automatic
How do we want to deal with this?
There are some automated tests for Anaconda and VNC. Do they work on hardware?
2. Additionally: running the install test with arm-image-installer. There should already be a topic for this somewhere.
3. IPA Tests: https://fedoraproject.org/wiki/QA:Testcase_freeipa_replication_advanced
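For the "local automatic" item above, a minimal sketch of how such a stick could be prepared, assuming /dev/sdX1 is the USB partition and treating the user name, timezone and package set as placeholders (Anaconda picks up a ks.cfg from a volume labeled OEMDRV automatically):

# 1. Write a minimal kickstart file (all values below are examples only)
cat > ks.cfg <<'EOF'
lang en_US.UTF-8
keyboard us
timezone Europe/Berlin --utc
rootpw --lock
user --name=admin --groups=wheel --password=changeme --plaintext
clearpart --all --initlabel
autopart --type=lvm
%packages
@^server-product-environment
%end
reboot
EOF

# 2. Put it on a USB partition labeled OEMDRV (this reformats /dev/sdX1!)
sudo mkfs.ext4 -L OEMDRV /dev/sdX1
sudo mount /dev/sdX1 /mnt
sudo cp ks.cfg /mnt/
sudo umount /mnt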
B) Specific server-related special cases
-----------------------------------------
1. "maybe we should test more storage operations post-install“
An integral part of our post-installation work is the expansion / addition of LVs (a sketch follows at the end of this section). We may review the current tests regarding custom storage configuration when we discuss expanding the current server-related release criteria.
2. Another topic may be Virtualization, where we adopt a functioning internal network (virbr0) for internally protected communication and services such as a database and IPA, including internal DNS (split DNS via systemd-resolved, which currently doesn't work anyway; a sketch of the intended wiring follows at the end of this section).
We should postpone this for our discussion about adding server-specific services according to our updated technical specifications.
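For item 1 above (storage), a rough sketch of the kind of post-install LV operations meant here; the volume group name "fedora", the LV names, sizes and mount point are assumptions, not a prescribed layout:

# Add a new LV for application data (names and sizes are examples)
sudo lvcreate -L 20G -n data fedora
sudo mkfs.xfs /dev/fedora/data
sudo mount /dev/fedora/data /srv/data   # hypothetical mount point

# Grow the root LV into free VG space and resize its filesystem in one step
sudo lvextend -r -L +10G /dev/fedora/root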
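For item 2 (virtualization), a sketch of the split-DNS wiring that is meant; 192.168.122.1 assumes the libvirt default network on virbr0, and "server.lan" is a made-up internal zone:

# Send lookups for the internal zone to the dnsmasq on the bridge, per interface
sudo resolvectl dns virbr0 192.168.122.1
sudo resolvectl domain virbr0 '~server.lan'

# Check which server answers for an internal name
resolvectl query ipa.server.lan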
C) Possible effects of individual changes specifically on Server
-----------------------------------------------------------------
"A domain-specific review of the changes is a good idea"
Changeset available at https://fedoraproject.org/wiki/Releases/%7Brelnum%7D/ChangeSet
We should manage this with a tracking issue for each new release.
How to organize manual testing
==============================
1. In the medium term, the current server-specific page https://fedoraproject.org/wiki/Test_Results:Fedora_41_RC_1.3_Server should be supplemented and tailored more clearly to servers.
a) The download list filtered by Server
b) The subtitle "Key" might be better expressed as "Examples of how to fill in the results".
c) The section "Test Matrix" should be expanded with a table of links to the manual test cases that are Server-related (currently it can be misinterpreted as "everything is already done" by coconut).
2. In the medium term we should introduce a "test week". That would include the manual tests and the review of the docs (as an example, see the CoreOS test week).
--
Peter Boy
https://fedoraproject.org/wiki/User:Pboy
PBoy@fedoraproject.org
Timezone: CET (UTC+1) / CEST (UTC+2)
Fedora Server Edition Working Group member
Fedora Docs team contributor and board member
Java developer and enthusiast
On Wed, 2024-10-23 at 16:04 +0200, Peter Boy Uni via server wrote:
Now another attempt at a summary
What to manually test
A) regular QA tests
- "Default boot and install“ tests on „USB“ and optimally „DVD“
https://fedoraproject.org/wiki/Test_Results:Current_Installation_Test
Currently it is divided into 2 tests: DVD (or local) and Netboot. But we have 4 ways per boot procedure:
- local interactive
- local remote (via VNC)
- local automatic (via ks file on OEMDRV USB)
- remote automatic
How do we want to deal with this?
What do you mean "local automatic" and "remote automatic" exactly? Are you talking about kickstart installs? There are other test cases in the Installation matrix for kickstart, and openQA does run kickstart install tests (from various sources).
There are some automated tests for Anaconda and VNC. Do they work on hardware?
No automated tests run on bare metal ATM. They all run on VMs. You did find a case this cycle where VNC worked great on VMs but not on bare metal, which was significant; given that, we might want to add a bare metal column for that test case, or something. But in general there's a combinatorial explosion problem here (there's a lot of specific circumstances in which any given test case might fail, and we'd go insane trying to add columns to cover them all).
- Additionally: running the install test with arm-image-installer
There should already be a topic for this somewhere.
This is the "AArch64 disk images" table in the Installation matrix. It uses the https://fedoraproject.org/wiki/QA:Testcase_arm_image_deployment test case, which is about using arm-image-installer. There's a "Server" row in the table, and an 'aarch64 HW' column which would map to "do an install of the Server disk image to real aarch64 hardware using arm- image-installer".
How to organize manual testing
- In the medium term, the current server-specific page
https://fedoraproject.org/wiki/Test_Results:Fedora_41_RC_1.3_Server should be supplemented and tailored more clearly to servers.
a) The download list filtered by Server
b) The subtitle "Key" might be better expressed as "Examples of how to fill in the results".
c) The section "Test Matrix" should be expanded with a table of links to the manual test cases that are Server-related (currently it can be misinterpreted as "everything is already done" by coconut).
There's not really such a thing as a "manual test case", in my conception of how this works. Test cases are test cases. They can be tested by humans or robots or both. As long as the test case was followed, the result is valid.
We have automated coverage of all the non-optional tests in that page currently, which I see as good; it's a goal to automate as much testing as possible, since testing is tedious.
We do have a few test cases in the Installation matrix that we specifically require a human to do (or require to be done on bare metal, which at present implies they must be done by a human), as a human-in-the-loop requirement for sanity. But I wouldn't want to take that too far; see "testing is tedious". On the whole, if a test can be reliably done by automation it should be. Overall I'm saying I'd just want to approach this from the angle of "what is not currently tested but should be tested", not "it's a problem that all the tests are automated".