[AutoQA] #111: depcheck test
by fedora-badges
#111: depcheck test
--------------------+-------------------------------------------------------
Reporter: wwoods | Owner:
Type: task | Status: new
Priority: major | Milestone: autoqa depcheck
Component: tests | Version: 1.0
Keywords: |
--------------------+-------------------------------------------------------
Write an actual depcheck test. This test should take, as inputs:
* one or more new package builds, and
* the name of a target repository for the new build(s)
The test should examine the PRCO (provides/requires/conflicts/obsoletes)
data for the new package(s) and the PRCO data for the target repository
(and all its parent repos).
The test should fail if the new builds would cause missing/broken
dependencies or unresolvable conflicts in the target repo(s).
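The core of such a check could be sketched as below. This is only a rough illustration of the PRCO comparison: it matches requirements by plain name, ignores versioned dependencies and conflicts, and the dict layout and function name are hypothetical.

```python
# Hypothetical sketch of the depcheck logic (name-only matching,
# no versions or conflicts -- those are assumptions for illustration).

def find_broken_deps(new_pkgs, repo_pkgs):
    """Return (pkg, missing_requirement) pairs that would result from
    adding new_pkgs to a repo containing repo_pkgs.

    Each package is a dict with 'name', 'provides', 'requires' and
    'obsoletes' keys.
    """
    merged = {p['name']: p for p in repo_pkgs}
    merged.update({p['name']: p for p in new_pkgs})  # new builds replace old

    # Drop anything obsoleted by a new build.
    obsoleted = {o for p in new_pkgs for o in p.get('obsoletes', ())}
    merged = {n: p for n, p in merged.items() if n not in obsoleted}

    # Collect everything still provided after the update.
    provided = set(merged)
    for p in merged.values():
        provided.update(p.get('provides', ()))

    broken = []
    for p in merged.values():
        for req in p.get('requires', ()):
            if req not in provided:
                broken.append((p['name'], req))
    return broken
```

A real implementation would read this data from the repo metadata and honor RPM version comparison, but the failure condition is the same: some package's requirement no longer resolves after the update.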
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/111>
AutoQA <http://autoqa.fedorahosted.org>
Automated QA project
[AutoQA] #118: New test proposal: Python debuggability
by fedora-badges
#118: New test proposal: Python debuggability
--------------------+-------------------------------------------------------
Reporter: kparal | Owner:
Type: task | Status: new
Priority: minor | Milestone:
Component: tests | Version: 1.0
Keywords: |
--------------------+-------------------------------------------------------
From Adam Williamson:
I've cut some of the context, but
basically David wants to write a test case for checking whether Python
debugging is possible as intended, and I asked exactly how he wanted it
to be used.
{{{
On Tue, 2010-01-26 at 15:30 -0500, David Malcolm wrote:
> > > Can I request a test case to cover debuggability of the Python
> > > runtime please (both in Fedora and in RHEL).
> > >
> > > This is in relation to:
> > > https://bugzilla.redhat.com/show_bug.cgi?id=556975
> > > https://bugzilla.redhat.com/show_bug.cgi?id=558977
> > > https://bugzilla.redhat.com/show_bug.cgi?id=557772
> > >
> > > as there seem to be gcc and gdb issues, which are conspiring to
> > > make python impossible to debug.
> > >
> > >
> > > The requirement is: within "gdb python", I must be able to select
> > > a PyEval_EvalFrameEx frame, and have the following work:
> > > (gdb) print co
> > > (gdb) print f
> > >
> > > rather than have <variable optimized out>
> > >
> > > so that I can do this:
> > > (gdb) print (char*)(((PyStringObject*)co->co_name)->ob_sval)
> > > to get the function name
> > >
> > > (gdb) print (char*)(((PyStringObject*)co->co_filename)->ob_sval)
> > > to get the source filename
> > >
> > > and
> > >
> > > (gdb) print f->f_lineno
> > > to get the approximate source line number.
> > >
> > > If the above isn't working, it becomes extremely hard to
> > > meaningfully debug any issues that arise inside Python.
> This is probably scriptable, and a good candidate for AutoQA and for
> RHTS.
>
}}}
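A check like this is indeed scriptable, for example by driving gdb in batch mode and inspecting its output. The sketch below is an assumption about how such a harness might look: the gdb command sequence, the function names, and the target binary are hypothetical, and a real test would additionally need the matching debuginfo packages installed. Only the output-checking helper is meant to be exact.

```python
# Hypothetical harness sketch: run gdb in batch mode against python,
# break in PyEval_EvalFrameEx, and verify 'co' and 'f' are printable.
import subprocess

GDB_COMMANDS = [
    "break PyEval_EvalFrameEx",
    "run -c 'pass'",
    "print co",
    "print f",
]

def frame_vars_debuggable(gdb_output):
    """True if neither variable came back as optimized out."""
    return ("<variable optimized out>" not in gdb_output
            and "<optimized out>" not in gdb_output)

def run_gdb_check(python_binary="python"):
    # Assumed invocation; a real test would need error handling and
    # would also parse co->co_name / f->f_lineno as the ticket asks.
    args = ["gdb", "--batch"]
    for cmd in GDB_COMMANDS:
        args += ["-ex", cmd]
    args.append(python_binary)
    out = subprocess.run(args, capture_output=True, text=True).stdout
    return frame_vars_debuggable(out)
```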
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/118>
[AutoQA] #184: Add tags for destructive testing
by fedora-badges
#184: Add tags for destructive testing
----------------------------+-----------------------------------------------
Reporter: kparal | Owner:
Type: task | Status: new
Priority: major | Milestone: Virtualization
Component: infrastructure | Version: 1.0
Keywords: |
----------------------------+-----------------------------------------------
We need to add tags to autotest and autoqa tests that will determine
whether a test is (or may be) destructive. This tag must be propagated and
an appropriate test client must be selected. After the test finishes, the
test machine must be restored to a previous safe state.
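A minimal sketch of how such a tag could drive client selection, assuming tests and hosts are described by simple metadata dicts; the tag name, the `restorable` field, and the function name are all hypothetical:

```python
# Illustrative only: route destructive tests to hosts that can be
# rolled back to a safe snapshot afterwards. Names are assumptions.

DESTRUCTIVE = "destructive"

def pick_host(test, hosts):
    """Return the name of the first suitable client, or None.

    Destructive tests may only run on restorable hosts; anything
    else takes the first available host.
    """
    for host in hosts:
        if DESTRUCTIVE in test.get("tags", ()):
            if host.get("restorable"):
                return host["name"]
        else:
            return host["name"]
    return None
```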
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/184>
[AutoQA] #181: Quick and dirty support for Virtualization
by fedora-badges
#181: Quick and dirty support for Virtualization
----------------------------+-----------------------------------------------
Reporter: kparal | Owner:
Type: task | Status: new
Priority: major | Milestone: Virtualization
Component: infrastructure | Version: 1.0
Keywords: |
----------------------------+-----------------------------------------------
The task is to implement quick and dirty support for virtualization for
AutoQA tests. We can already install autotest-client into a virt machine,
but there are some tests which must not run inside a virt environment
(tests using virt themselves, for example). Therefore we must ensure all
our tests are marked properly ("may use virt", "may not use virt"), all
our test clients are marked properly ("is virt", "is bare metal"), and
this information is propagated from AutoQA to autotest-server and used
properly when selecting appropriate hosts and scheduling jobs.
This could be quite easy, since there is already some support for tags in
autotest.
As an enhancement, we may think about preferences. Do we want to mark a
test as preferring to run in virt, while also allowing bare metal? How do
we achieve the correct behavior in autotest-server for such a task? For
example, we may want to prefer executing sanity tests in a virt env, but
it's not really mandatory, just desired. Food for thought.
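The tag matching, including the preference idea, might be sketched like this; the tag strings, the host tuples, and the ranking scheme are assumptions, not autotest's actual tag API:

```python
# Illustrative sketch of test-tag vs host-tag matching.
# Tag names and data shapes are hypothetical.

def eligible_hosts(test_tag, hosts):
    """Return host names eligible for a test, best candidates first.

    test_tag: 'may_use_virt', 'may_not_use_virt', or 'prefer_virt'.
    hosts: list of (name, is_virt) pairs.
    """
    if test_tag == "may_not_use_virt":
        # Hard constraint: bare metal only.
        return [name for name, is_virt in hosts if not is_virt]
    if test_tag == "prefer_virt":
        # Soft preference: virt hosts sort first, bare metal stays eligible.
        ranked = sorted(hosts, key=lambda h: not h[1])
        return [name for name, _ in ranked]
    # 'may_use_virt': anything goes, original order.
    return [name for name, _ in hosts]
```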
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/181>
[AutoQA] #107: writing a python test (modeled similarly to install.py)
by fedora-badges
#107: writing a python test (modeled similarly to install.py)
-------------------+--------------------------------------------------------
Reporter: liam | Owner:
Type: task | Status: new
Priority: major | Milestone:
Component: tests | Version: 1.0
Keywords: |
-------------------+--------------------------------------------------------
write a python test (modeled similarly to install.py) that takes as input:
* a URL to a kickstart file (the URL can be local, e.g. file://, or
remote, e.g. http://, ftp://, nfs://, ...) ... but start with the easy
case first.
* a URL for the install media (again, keep this simple for now and
assume file:///var/lib/libvirt/images/Fedora-12-x86_64-DVD.iso)
* a URL to a configuration file that describes the environment -
again, perhaps optional for now. But eventually we'll need
something that tells the test to create a guest with 4 NICs vs 1
NIC, 3 SCSI drives etc... Don't worry about being fancy at
first ... just take the defaults. This is just where I might
see it headed in 6+ months. Copy from the kvm autotest project
if you like.
At the beginning, we focus on the basics and on something that gets things
far enough along that we can review, adjust, and repeat.
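Input handling for such a test might start out as simple as the sketch below; the supported-scheme list and the function name are assumptions, and the config-file URL is left out since the ticket calls it optional for now:

```python
# Hypothetical input validation for the install test described above.
from urllib.parse import urlparse

SUPPORTED_SCHEMES = ("file", "http", "ftp", "nfs")  # assumed set

def check_install_inputs(kickstart_url, media_url):
    """Validate the two required URLs; raise ValueError otherwise.

    Returns the parsed (kickstart, media) URL pair for later use.
    """
    for label, url in (("kickstart", kickstart_url), ("media", media_url)):
        scheme = urlparse(url).scheme
        if scheme not in SUPPORTED_SCHEMES:
            raise ValueError("%s URL has unsupported scheme: %r"
                             % (label, scheme))
    return urlparse(kickstart_url), urlparse(media_url)
```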
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/107>
[AutoQA] #115: Update post-tree-compose to recognize new stage2 URL
by fedora-badges
#115: Update post-tree-compose to recognize new stage2 URL
-------------------------+--------------------------------------------------
Reporter: jlaska | Owner:
Type: enhancement | Status: new
Priority: major | Milestone: Automate installation test plan
Component: watchers | Version: 1.0
Keywords: |
-------------------------+--------------------------------------------------
During Fedora 13, release engineering will provide scheduled drops of
rawhide for automated testing against rats_install. The provided data
will include a compose tree (without packages). For proper automated
testing of these images, post-tree-compose will need to monitor a new URL
for testing (the likely candidate is
http://alt.fedoraproject.org/pub/alt/stage/rawhide-testing).
In addition, rats_install will need to support installing with stage2 and
the package repo in different locations. That will be tracked in another
ticket.
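On the watcher side, the change might amount to adding the new location to whatever list of URLs post-tree-compose polls. A hypothetical sketch of that merge (the helper and its name are assumptions; only the rawhide-testing URL comes from this ticket):

```python
# Illustrative only: extend the watcher's poll list with the new
# stage URL, de-duplicating while preserving order.

WATCH_URLS = [
    "http://alt.fedoraproject.org/pub/alt/stage/rawhide-testing",
]

def tree_urls_to_check(known_urls, extra_urls=WATCH_URLS):
    """Merge the existing watch list with the new location(s)."""
    seen, merged = set(), []
    for url in list(known_urls) + list(extra_urls):
        if url not in seen:
            seen.add(url)
            merged.append(url)
    return merged
```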
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/115>
[AutoQA] #129: create auto install test script which maps 1x1 to the install method.
by fedora-badges
#129: create auto install test script which maps 1x1 to the install method.
-------------------+--------------------------------------------------------
Reporter: liam | Owner:
Type: task | Status: new
Priority: major | Milestone: Automate installation test plan
Component: tests | Version: 1.0
Keywords: |
-------------------+--------------------------------------------------------
Each install method should have an install test script. For example:
* Single ISO install - dvd_install.py
* Multi ISO install - cd_install.py
* Hard Drive ISO install - hdiso_install.py
* NFS install - nfs_install.py
* NFS ISO install - nfsiso_install.py
* HTTP/FTP install - url_install.py <-- similar to
rats_install.py now
We need to create these scripts, but each script should share as much code
as possible.
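One common way to share that code is a base class holding the common flow, with one small subclass per install method; the class and method names below are hypothetical, not an agreed design:

```python
# Sketch of code sharing across the per-method scripts.
# Names are illustrative assumptions.

class InstallTest:
    method = None  # e.g. 'dvd', 'cd', 'hdiso', 'nfs', 'nfsiso', 'url'

    def prepare_media(self):
        """Per-method media setup; subclasses must override."""
        raise NotImplementedError

    def run(self):
        """Common flow shared by every *_install.py script."""
        self.prepare_media()
        # ... shared kickstart handling, guest boot, result checks ...
        return "installed via %s" % self.method

class DVDInstallTest(InstallTest):
    method = "dvd"
    def prepare_media(self):
        pass  # would attach the single ISO here

class NFSInstallTest(InstallTest):
    method = "nfs"
    def prepare_media(self):
        pass  # would mount the NFS export here
```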
--
Ticket URL: <https://fedorahosted.org/autoqa/ticket/129>