Hi,
As you may know, I have been working on a complete rewrite of the HTML report and guide for the upcoming openscap 1.1.0 release. It's a feature that will touch almost every user of openscap. I would like to gather feedback from the scap-security-guide community so that we can make sure there are no blocker issues in the release. It is natural that there will be small issues that we will iron out in minor releases; basically, we would just like to make sure the new report and guide aren't missing anything crucial that would prevent adoption.
See https://mpreisle.fedorapeople.org/openscap/1.1.0_xslt/ for sample HTML report and guide from SSG for RHEL6.
Looking forward to feedback.
I like the new look and functionality.
Two first-blush comments:

1) On the report document, I can imagine my security officials freaking out over the in-your-face "*The system is not compliant!*" text. What is the recommended course to ensure this text does not appear if you're running the scan on a webserver, for example? Is it as simple as creating a custom profile derived from the STIG profile? Does anyone directly use the STIG profile, have a completely compliant system, and have a server that actually does anything useful? Up to now, I've left tests in that I have waivers for, and then pointed at the waivers to justify the test failures. Perhaps I will need to change that practice.

2) On the guide document, the text beginning "Providing system administrators" occurs twice.
On Thu, Aug 28, 2014 at 11:49 AM, Martin Preisler mpreisle@redhat.com wrote:
[snip]
--
Martin Preisler
--
SCAP Security Guide mailing list
scap-security-guide@lists.fedorahosted.org
https://lists.fedorahosted.org/mailman/listinfo/scap-security-guide
https://github.com/OpenSCAP/scap-security-guide/
Agree about the "*The system is not compliant!*" text. A lot of our security people will freak out over it. Maybe offer different non-compliance messages based on a percentage, or non-compliance messages that are less alarming.
Gabe
On Thu, Aug 28, 2014 at 12:29 PM, Andrew Gilmore agilmore2@gmail.com wrote:
[snip]
I agree with Gabe and others. Another option would be to display something like "The system needs to remediate X controls to reach compliance."
Rodney
On Aug 28, 2014, at 3:09 PM, Gabe Alford redhatrises@gmail.com wrote:
[snip]
----- Original Message -----
From: "Andrew Gilmore" agilmore2@gmail.com
To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Thursday, August 28, 2014 8:29:48 PM
Subject: Re: New report and guide in openscap 1.1.0
[snip]
- On the report document, I can imagine my security officials freaking out over the in-your-face "*The system is not compliant!*" text. What is the recommended course to ensure this text does not appear if you're running the scan on a webserver, for example? [snip] Up to now, I've left tests in that I have waivers for, and then pointed at the waivers to justify the test failures. Perhaps I will need to change that practice.
Isn't that a good thing? They should freak out, their system is not compliant! The recommended course is to tailor the profile, leaving out rules that make no sense on your system. Then you fix the remaining rules using remediation. In the end the machine will be compliant.
The job of openscap is to check your machines for compliance over and over. When the machines are suddenly not compliant you really want to know that!
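For reference, tailoring along these lines can be expressed as an XCCDF tailoring file that derives a local profile from the shipped STIG profile and deselects rules that do not apply. A minimal sketch follows; the profile and rule identifiers are illustrative, not the real SSG IDs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative tailoring sketch: derive a webserver profile from the
     STIG profile and deselect one rule. IDs below are made up. -->
<xccdf:Tailoring xmlns:xccdf="http://checklists.nist.gov/xccdf/1.2"
                 id="xccdf_org.example_tailoring_webserver">
  <xccdf:version time="2014-09-01T00:00:00">1</xccdf:version>
  <xccdf:Profile id="xccdf_org.example_profile_stig_webserver"
                 extends="xccdf_org.ssgproject.content_profile_stig">
    <xccdf:title>STIG tailored for web servers</xccdf:title>
    <!-- Deselect rules that make no sense on this class of machine -->
    <xccdf:select idref="xccdf_org.ssgproject.content_rule_example"
                  selected="false"/>
  </xccdf:Profile>
</xccdf:Tailoring>
```

Such a file can then be passed to the scanner with something like `oscap xccdf eval --tailoring-file tailoring.xml --profile xccdf_org.example_profile_stig_webserver ...` (the `--tailoring-file` option is available in recent openscap versions).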
- On the guide document, the text beginning "Providing system administrators" occurs twice.
Looks like an issue with SSG but I will look more into it.
On 8/29/14, 5:37 AM, Martin Preisler wrote:
[snip]
Feel free to start a dedicated thread on which rules cause you the most problems. Feedback would be great.
[snip]
As Martin pointed out, such a finding should be alarming! Culturally, though, IV&V/SCA staff may overreact when they see "The system is not compliant!"
Perhaps a combination, including Rodney's suggestion, would soften the message, e.g.: "The system is not compliant! The system needs to remediate X controls to reach compliance."
- On the guide document, the text beginning "Providing system administrators" occurs twice.
Looks like an issue with SSG but I will look more into it.
I believe it's something within the stylesheet.
$ grep -rin "Providing system administrators with such guidanc" *
guide.xml:14:Providing system administrators with such guidance informs them how to securely
Full code @ https://github.com/OpenSCAP/scap-security-guide/blob/master/RHEL/6/input/gui...
----- Original Message -----
From: "Shawn Wells" shawn@redhat.com
To: scap-security-guide@lists.fedorahosted.org
Sent: Sunday, August 31, 2014 7:56:04 AM
Subject: Re: New report and guide in openscap 1.1.0
[snip]
- On the guide document, the text beginning "Providing system administrators" occurs twice.
Looks like an issue with SSG but I will look more into it.
I believe it's something within the stylesheet.
You were right, fixed in b961b25474500fa898be4c5f2e17cfd10f58148d
I had provided a comment a while back that I never heard back on.
"I am not sure if it has been mentioned, but I personally would find it useful to include details on the results.
For instance, considering a check that ensures all libraries meet certain permissions, it would be useful to identify all entries that are non-compliant, if failed.
The SCC scanner does this, sort of: it lists every item verified, which in some cases can make it difficult to identify just the failed items (needle in a haystack).
In some cases, the simplicity of the result details in OpenSCAP reports is desirable (management audience). In other cases, the verbosity of the SCC reports is useful (engineer/technician audience).
It would be ideal if OpenSCAP could offer both.
More specifically, it would be ideal if an option could be specified when generating the report to dictate the verbosity of the report details. Or perhaps even a filter within the report that allows the verbosity to be toggled.
I would consider the following three verbosity levels most useful:
LOW - No details, just the overall outcome for each check (the current OpenSCAP report scheme).
MEDIUM - Includes all (and only) failed items for each check that fails.
HIGH - Includes all items verified (both pass and fail items) for every check."
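The three levels sketched above could be modeled roughly like this; a hypothetical illustration of the proposed behavior, not an existing OpenSCAP option or API (the function and item structure are invented for the example):

```python
# Hypothetical sketch of the proposed verbosity filter. "items" stands in
# for the per-item results an OVAL check produces; not an OpenSCAP API.
def filter_items(items, verbosity):
    """Select which checked items to show in the report.

    items: list of (name, result) tuples, where result is "pass" or "fail".
    verbosity: "low", "medium", or "high".
    """
    if verbosity == "low":
        return []                                      # overall outcome only
    if verbosity == "medium":
        return [i for i in items if i[1] == "fail"]    # failed items only
    if verbosity == "high":
        return list(items)                             # everything checked
    raise ValueError("unknown verbosity: %r" % verbosity)

items = [("/lib/libfoo.so", "pass"), ("/lib/libbar.so", "fail")]
print(filter_items(items, "medium"))  # → [('/lib/libbar.so', 'fail')]
```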
Any thoughts on including this?
Best regards,
Trey Henefield, CISSP Senior IAVA Engineer
Ultra Electronics Advanced Tactical Systems, Inc. 4101 Smith School Road Building IV, Suite 100 Austin, TX 78744 USA
Trey.Henefield@ultra-ats.com Tel: +1 512 327 6795 ext. 647 Fax: +1 512 327 8043 Mobile: +1 512 541 6450
www.ultra-ats.com
----- Original Message -----
From: "Trey Henefield" trey.henefield@ultra-ats.com
To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Thursday, August 28, 2014 9:28:34 PM
Subject: RE: New report and guide in openscap 1.1.0
[snip]
For instance, considering a check that ensures all libraries meet certain permissions, it would be useful to identify all entries that are non-compliant, if failed.
We already do that for a lot of checks but not all. For example it's done for file permission checks.
Random examples:
- "Verify and Correct File Permissions with RPM"
- "Verify that All World-Writable Directories Have Sticky Bits Set"
- "Ensure All Files Are Owned by a User"
- "Set Password Minimum Length in login.defs"
Is there any type of a check that is missing this functionality where it is essential?
Ah, my apologies, I did not see those checks. Very nice. I have to say, it looks beautifully structured. Great job!
Is it possible there could be a way to filter or toggle between failed items versus all items checked? It would be useful for SCAP content testing and for reassurance of what was checked.
On 8/31/14, 12:25 AM, Trey Henefield wrote:
[snip]
Is it possible there could be a way to filter or toggle between failed items versus all items checked? It would be useful for SCAP content testing and for reassurance of what was checked.
Absolutely! Check/Uncheck the fields under "Rule Overview" as you see fit:
Yes I did see that, which I also thought was very useful. What I was referring to was not the ability to filter the overall results, but the details in each check.
So in the linked report, for the check titled “Verify that All World-Writable Directories Have Sticky Bits Set”, the “OVAL details” section lists the two directories that failed the check (it’s a little mind-boggling that a scanner would create a finding). Instead, have the option of listing all directories that were checked and identifying which ones passed and failed. This is what I meant by details.
Judging by the finding in the referenced check, I know that SCC has been installed. The details they provide in their report are a good example. The bad parts of that example are separating the passed vs. failed items by color (I can’t search by color) and not being able to toggle between all of the OVAL details of the check versus just the failed items that caused the check to fail. This would also be nice for checks that have passed, just to be able to see what files or entries were validated against the check.
I hope that better clarifies what I was hoping could be achieved. I know all these details can be retrieved from the OVAL results XML file, but it would be nice to be able to view that information in the much better organized, human-readable structure that has been created in the new report.
Thanks!
----- Original Message -----
From: "Trey Henefield" trey.henefield@ultra-ats.com
To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Tuesday, September 2, 2014 2:33:28 PM
Subject: RE: New report and guide in openscap 1.1.0
[snip]
Judging by the finding in the referenced check, I know that SCC has been installed. [snip]
The result file I used is not from my machine. Sadly I do not have access to SCC.
[snip] The bad part of that example is separating the passed vs. failed items by color, and not being able to toggle between all of the OVAL details versus just the failed items. [snip]
This is an interesting feature request, but I do not have engineering time for it right now. Please consider opening a ticket on the Customer Portal.
----- Original Message -----
From: "Trey Henefield" trey.henefield@ultra-ats.com
To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Sunday, August 31, 2014 6:25:06 AM
Subject: RE: New report and guide in openscap 1.1.0
[snip]
Is it possible there could be a way to filter or toggle between failed items versus all items checked? It would be useful for SCAP content testing and for reassurance of what was checked.
I am not sure I get what you mean. You can already filter rules by their result. Can you be more specific?
On 8/29/14, 5:41 AM, Martin Preisler wrote:
[snip]
Is there any type of a check that is missing this functionality where it is essential?
Honestly, it'd be incredibly useful for all of them.
From the table on File Permissions with RPM, I noticed the stylesheet creates the "OVAL details" label. I tried searching through the OpenSCAP code to see how the XSLT gets this information, to no avail: https://github.com/OpenSCAP/openscap/search?utf8=%E2%9C%93&q=%22OVAL+det...
Is there something we need to expose in content, or is the inclusion of test results accomplished via OpenSCAP?
----- Original Message -----
From: "Shawn Wells" shawn@redhat.com
To: scap-security-guide@lists.fedorahosted.org
Sent: Sunday, August 31, 2014 8:14:05 AM
Subject: Re: New report and guide in openscap 1.1.0
On 8/29/14, 5:41 AM, Martin Preisler wrote:
[snip]
From the table on File Permissions with RPM, noticed the stylesheet creates the "OVAL details" label. Tried searching through the OpenSCAP code to see how the XSLT gets this information to no avail: https://github.com/OpenSCAP/openscap/search?utf8=%E2%9C%93&q=%22OVAL+det...
The way it works is somewhat complicated :-)
See https://github.com/OpenSCAP/openscap/blob/master/xsl/xccdf-report-impl.xsl#L...
We try to locate an OVAL results file, query the relevant objects using XPath and run templates on them, generating HTML.
https://github.com/OpenSCAP/openscap/blob/master/xsl/xccdf-report-oval-detai... contains support for some OVAL objects. If you need more it has to be added into that file.
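As a rough illustration of the mechanism Martin describes (the template, variable, and mode names here are simplified and hypothetical, not the actual templates in the openscap stylesheets):

```xml
<!-- Hypothetical sketch: find the OVAL definition result referenced by a
     rule-result and render its tested items as HTML. The real templates
     live in xsl/xccdf-report-oval-details.xsl and are more involved. -->
<xsl:template name="oval-details"
              xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
              xmlns:oval-res="http://oval.mitre.org/XMLSchema/oval-results-5">
  <xsl:param name="definition-id"/>
  <!-- $oval-results points at the located OVAL results document -->
  <xsl:for-each select="$oval-results//oval-res:definition[@definition_id = $definition-id]">
    <h4>OVAL details</h4>
    <!-- hand each tested object off to a type-specific template;
         unsupported object types need a new template in that file -->
    <xsl:apply-templates select=".//oval-res:criterion" mode="oval-detail"/>
  </xsl:for-each>
</xsl:template>
```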
I agree with Trey. The details would be more useful if they contained more information about how something failed. I would expect that in a detail.
I'd be OK with the default report ONLY showing details for the severity=high items that failed, with all other details going either into a second report I could open if I wanted, or into generic detail descriptions on the web rather than part of the report.
The report could contain information about the system, but I'm not sure there is a lot of value in all the (static) details being in the report by default.
Greg Elin http://govready.org - Making FISMA compliance easier for innovators
email: gregelin@gitmachines.com phone: 917-304-3488
----- Original Message -----
From: "Greg Elin" gregelin@gitmachines.com
To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Friday, August 29, 2014 11:18:37 PM
Subject: Re: New report and guide in openscap 1.1.0
I agree with Trey. The details would be more useful if they contained more information about how something failed. I would expect that in a detail.
As I said in my previous email, we show these details. We always have!
They were not present in one of the previous ! draft ! versions of XCCDF report. When I released that draft I explicitly said that check details are not there but will be there in the final version. Are you sure you are looking at the new report?
I'd be OK with default report ONLY showing details for the severity=high items that I failed and all other details went either into a second report
We show check system details for all failed rules, always. This works for both OVAL and SCE checks.
I could open if I wanted, or the generic detail descriptions were on the WEB rather than part of the report.
Are you sure you want detailed reports of vulnerabilities on your infrastructure to end up on the "WEB"? One of our goals is to keep the reports and guides self sufficient. We do not want to rely on remote tools. That's why we bundle even all the JavaScript and CSS.
The report could contain information about the system, but I'm not sure if there is a lot of value of all the (static) details being in the report by default.
Not sure what you mean by static details. The IP and MAC addresses? They help identify the machine if the hostname is not set properly. The CPE platforms should be there as well IMO, as they help explain applicable and not-applicable results.
On Mon, Sep 1, 2014 at 5:32 AM, Martin Preisler mpreisle@redhat.com wrote:
----- Original Message -----
From: "Greg Elin" gregelin@gitmachines.com To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org Sent: Friday, August 29, 2014 11:18:37 PM Subject: Re: New report and guide in openscap 1.1.0
I agree with Trey. The details would be more useful if they contained more information about how something failed. I would expect that in a detail.
As I said in my previous email, we show these details. We always have!
They were not present in one of the previous ! draft ! versions of XCCDF report. When I released that draft I explicitly said that check details are not there but will be there in the final version. Are you sure you are looking at the new report?
Thanks for the prodding. I looked more closely last night. At least two of us had some confusion with this.
AFAIK, OVAL results only show up if `--oval-results` is added to the `oscap xccdf eval` command. It seems to me specific errors should be shown by default and a flag used to hide them. Yes, it is in the man page, and of course we all read the man page... (But I guess that is a comment for the OpenSCAP list?)
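For concreteness, a sketch of the invocation being discussed. The content path and profile ID are taken from the RHEL6 sample mentioned in this thread, so treat them as assumptions and adjust for your system; the command is echoed rather than executed so the sketch is safe to run anywhere:

```shell
# Sketch of an evaluation that keeps per-item OVAL details in the HTML
# report. The content path and profile ID are assumptions taken from the
# RHEL6 sample; adjust them for your system.
CONTENT=/usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml
PROFILE=usgcb-rhel6-server

# Without --oval-results, the report lacks the per-item OVAL details.
CMD="oscap xccdf eval --profile $PROFILE --oval-results --results results.xml --report report.html $CONTENT"

# Echoed rather than executed so the sketch runs anywhere:
echo "$CMD"
```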
OVAL details show up for only some failed tests in the sample report ( https://mpreisle.fedorapeople.org/openscap/1.1.0_xslt/report.html): "package_aide_installed", for example, does not show any OVAL details though it fails, while "service_atd_disabled" does show OVAL results.
Last night I built OpenSCAP from source for the first time to see if the issue was related to the sample report.
In the report I ran on a RHEL 6.4 64-bit system with the `--oval-results` flag, I encountered OVAL results more often than in the sample report, but there were still some without any OVAL details, like "rsyslog_send_messages_to_logserver" and "accounts_password_pam_cracklib_dcredit".
It seems OVAL details show up when there are specific "items violating." That is _incredibly_ helpful for a test like "rpm_verify_permissions". It's awesome.
It seems OVAL details do not show up if the control just fails. That is confusing. Didn't an OVAL criterion fail? Which one? Maybe there is more than one criterion for a given test? The fact sometimes OVAL details show up and sometimes they don't seems like yet another secret, undocumented detail that is obvious to those in the know, but confusing to those that aren't. At the very least, there should always be an explanation of the OVAL detail even if it is "OVAL failed without any detail. Here is the criterion that failed: "
I'd be OK with default report ONLY showing details for the severity=high items that I failed and all other details went either into a second report
We show check system details for all failed rules, always. This works for both OVAL and SCE checks.
What I was trying to get at is that I think fails that are high severity should be treated differently in the report than those of medium or low severity. For example, they could have their own section, or -- as I was musing -- be the only fails included in the report.
Consider a scan result indicating 100 errors: 5 are high severity, 20 are medium, and 75 are low.
Speaking for myself, I am eager to fix the 5 that are high severity. I'll glance at the 20 and the 75 to see if there is a relevant pattern (e.g., sub-groupings), but generally speaking I am going to leave those for later.
The five high-severity fails I want listed so I can see the issues right away: what failed and how to fix it. I could print that report and check items off. I would be OK with having to add a flag to get details for the medium and low fails.
(Note: this is also a question of quantity. When I have a small number of fails, I want to see them. When I have ten or more fails, I want a way of grouping the issues.)
So, it would be helpful to get a report that by default listed out the details of the 5 severe errors and just had a link to the others.
BTW - having the report shorter now is helpful! Thanks! Mine came to a manageable 25 pages; small if printed double-sided.
I'll try to mockup what I mean.
I could open if I wanted, or the generic detail descriptions were on the WEB rather than part of the report.
Are you sure you want detailed reports of vulnerabilities on your infrastructure to end up on the "WEB"? One of our goals is to keep the reports and guides self sufficient. We do not want to rely on remote tools. That's why we bundle even all the JavaScript and CSS.
No, I do not want results of what controls *my* system passed or failed posted on the web.
I am speaking about the generic content that is a part of each control being on the web: rationale, links, related identifiers, remediation scripts, etc.
The most important information in the results of my scan is the details about my system and my scan results. Generic information is _generic_. I only want that generic information co-located with details when it specifically helps me take an action.
Maybe what I am driving at as a general theme here is the difference between an "Action/Working Report" optimized to help me get to work resolving issues and "Complete Report" that represents an artifact capturing a state that is less transitory.
I'll try to mockup what I mean.
The report could contain information about the system, but I'm not sure if there is a lot of value of all the (static) details being in the report by default.
Not sure what you mean by static details. The IP and MAC addresses? They help identify the machine if the hostname is not set properly. The CPE platforms should be there as well IMO, as they help explain applicable and not-applicable results.
By static details I am referring to information in the SSG that does not change from system to system or from scan result to scan result.
I feel I should say a bit more about this because the above may seem contradictory.
It might seem I am on the one hand suggesting more information be added to the report to make the report easier for beginners (e.g., include OVAL detail on all failed tests).
On the other hand, it might seem I am suggesting less information be included in the report (e.g., removing generic static details, or putting medium and low control fails details in a second report).
I am arguing for clarity and a less cognitively taxing presentation that does way better than most other scan reports out there.
Only some fails having an OVAL detail section seems ambiguous. Meanwhile including all explanatory detail for every item often seems extraneous.
If I knew the perfect algorithm, I would share it. I do know what I want to read most of all in the report is: "The control ____ failed because __________" and be able to follow a link to get more detail if I happen to need it.
I think I really want a simple report that lists what failed and why, and then a second report (or hidden second half of the report) that has way more details.
If a control's one criterion failed, just tell me in the summary and don't make me look it up. If multiple of a control's criteria failed, say that multiple items caused the violation and link to detail. I can always look up the full control details on the web since they are public after all. (And if the control is a custom, agency-only rule... well, include that detail in the report since it is specific to the system/environment and not generic.)
And yes, I'll work on a mockup...
Greg
-- Martin Preisler
----- Original Message -----
From: "Greg Elin" gregelin@gitmachines.com To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org Sent: Tuesday, September 2, 2014 5:21:23 PM Subject: Re: New report and guide in openscap 1.1.0
On Mon, Sep 1, 2014 at 5:32 AM, Martin Preisler mpreisle@redhat.com wrote:
----- Original Message -----
From: "Greg Elin" gregelin@gitmachines.com To: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org Sent: Friday, August 29, 2014 11:18:37 PM Subject: Re: New report and guide in openscap 1.1.0
I agree with Trey. The details would be more useful if they contained more information about how something failed. I would expect that in a detail.
As I said in my previous email, we show these details. We always have!
They were not present in one of the previous ! draft ! versions of XCCDF report. When I released that draft I explicitly said that check details are not there but will be there in the final version. Are you sure you are looking at the new report?
Thanks for the prodding. I looked more closely last night. At least two of us had some confusion with this.
Many more people are confused by this AFAIK :-)
AFAIK, OVAL results only show up if `--oval-results` is added to the `oscap xccdf eval` command. It seems to me specific errors should be shown by default and a flag used to hide them. Yes, it is in the man page, and of course we all read the man page... (But I guess that is a comment for the OpenSCAP list?)
Yes, that is the case. We are planning to fix this with Result DataStreams at some point in the future but it's not in git yet.
OVAL details show up for only some failed tests in the sample report ( https://mpreisle.fedorapeople.org/openscap/1.1.0_xslt/report.html): "package_aide_installed", for example, does not show any OVAL details though it fails, while "service_atd_disabled" does show OVAL results.
Our OVAL details implementation supports a subset of all possible OVAL objects. I think we support most of the use-cases where you really need to know which item is the culprit.
We may support more at some point in the future. Patches are always welcome.
[snip]
It seems OVAL details show up when there are specific "items violating." That is _incredibly_ helpful for a test like "rpm_verify_permissions". It's awesome.
It seems OVAL details do not show up if the control just fails. That is confusing. Didn't an OVAL criterion fail? Which one? Maybe there is more than one criterion for a given test? The fact sometimes OVAL details show up and sometimes they don't seems like yet another secret, undocumented detail that is obvious to those in the know, but confusing to those that aren't. At the very least, there should always be an explanation of the OVAL detail even if it is "OVAL failed without any detail. Here is the criterion that failed: "
When we don't show anything in OVAL details it most likely means we don't support details for that particular OVAL object.
I'd be OK with default report ONLY showing details for the severity=high items that I failed and all other details went either into a second report
We show check system details for all failed rules, always. This works for both OVAL and SCE checks.
What I was trying to get at is that I think fails that are high severity should be treated differently in the report than those of medium or low severity. For example, they could have their own section, or -- as I was musing -- be the only fails included in the report.
I wanted to implement some sort of a "remediation priority list" that the report would generate for users on demand. You could print it and go through it in order. However I didn't have enough time to implement this in the end.
If there is customer demand it may happen in the near future.
[snip]
Are you sure you want detailed reports of vulnerabilities on your infrastructure to end up on the "WEB"? One of our goals is to keep the reports and guides self sufficient. We do not want to rely on remote tools. That's why we bundle even all the JavaScript and CSS.
No, I do not want results of what controls *my* system passed or failed posted on the web.
I am speaking about the generic content that is a part of each control being on the web: rationale, links, related identifiers, remediation scripts, etc.
The most important information in the results of my scan is the details about my system and my scan results. Generic information is _generic_. I only want that generic information co-located with details when it specifically helps me take an action.
This is a general feature/issue with XCCDF and/or Source/Result DataStreams. Sorry, but we can't fix this in the report without making proprietary extensions.
Maybe what I am driving at as a general theme here is the difference between an "Action/Working Report" optimized to help me get to work resolving issues and "Complete Report" that represents an artifact capturing a state that is less transitory.
I'll try to mockup what I mean.
Maybe consider patching the XSLT directly. I have restructured it and it's much easier to follow how it works now.
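A possibly useful trick when iterating on the XSLT: `oscap xccdf generate report` re-renders already saved XCCDF results to HTML, so stylesheet changes can be previewed against an existing results file without re-scanning. The file names below are placeholders, and the command is echoed rather than executed so the sketch runs anywhere:

```shell
# Sketch: preview report changes against saved results instead of
# re-scanning. results.xml is a placeholder for a file produced by a
# previous `oscap xccdf eval --results results.xml ...` run.
RESULTS=results.xml
CMD="oscap xccdf generate report --output report.html $RESULTS"

# Echoed rather than executed so the sketch runs anywhere:
echo "$CMD"
```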
[snip]
By static details I am referring to information in the SSG that does not change from system to system or from scan result to scan result.
I feel I should say a bit more about this because the above may seem contradictory.
It might seem I am on the one hand suggesting more information be added to the report to make the report easier for beginners (e.g., include OVAL detail on all failed tests).
On the other hand, it might seem I am suggesting less information be included in the report (e.g., removing generic static details, or putting medium and low control fails details in a second report).
I am arguing for clarity and a less cognitively taxing presentation that does way better than most other scan reports out there.
Only some fails having an OVAL detail section seems ambiguous. Meanwhile including all explanatory detail for every item often seems extraneous.
If I knew the perfect algorithm, I would share it. I do know what I want to read most of all in the report is: "The control ____ failed because __________" and be able to follow a link to get more detail if I happen to need it.
OK, this clarification helped a lot. The only superfluous detail I know about is rule description. And we only show this in the modal dialog. Titles, IDs, references, identifiers, remediation fixes, severity, ... were all feature requests.
I think I really want a simple report that lists what failed and why, and then a second report (or hidden second half of the report) that has way more details.
Having multiple reports adds maintenance costs. But maybe we can hide this additional information under a [+] button or something like that.
Thanks Martin.
I will look at the XSLT.
Helpful to know that only a subset is supported and that this is why the error details are sometimes missing.
Glad to hear the XSLT was cleaned up. It was pretty complex.
Greg Elin P: 917-304-3488 E: gregelin@gitmachines.com
Sent from my iPhone
Hello Martin,
thank you for the preview of the new look / functionality.
A couple of points for the report case (not sure they have been mentioned already): 1) Text under the Characteristics paragraph:
" User root started the evaluation at 2014-08-28T16:44:12. Evaluation finished at 2014-08-28T16:50:10. The target machine was called localhost.localdomain.
Benchmark from /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml was used. Profile usgcb-rhel6-server was selected."
Might it look better when organized into a table (example below)?
Evaluation Run:
Performed by:           root
Started:                2014-08-28T16:44:12   \  here maybe also split Y-M-D
Finished:               2014-08-28T16:50:10   /  with a space from H-M-S?
Target (of Evaluation): localhost.localdomain
Benchmark Location:     /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml
Evaluated Profile:      usgcb-rhel6-server
2) Regarding colours - inconsistency in colour shades between the "Compliance and Scoring" and "Rule Overview" sections. Would it be possible to merge the shades? (IOW use just one shade of red, green, gray, orange, etc. across the document)
3) Missing the "Rule overview" & "Rule details" anchors (they were present in the previous version). Reasoning: "Rule overview" isn't displayed when viewing the top of the page. Would it be possible to have a "Rule overview" anchor in the top panel to be able to navigate there quickly?
Regarding "Rule details" -- since they aren't displayed by default, a "Rule details" anchor would also enable the "Show all result details" behaviour (after clicking "Rule details" the page would behave as if the "Show all result details" button had been clicked, and would navigate to the start of the details table).
4) Rule titles aren't displayed in the colour of the result -- not sure we want this, but could you possibly provide a preview of a case where passed rule titles would be coloured green (the same colour as the bounding box around the rule result), unknown-state rule titles in orange, notchecked rule titles in gray, etc.?
5) Regarding the "Result Details" table - it generally looks fine, but sometimes the inner rule description is larger than the outer red-coloured table. Example rule: "Verify and Correct File Permissions with RPM"
The inner table spans outside the red-coloured bounding box. Would it be possible either to reduce the inner table or enlarge the outer bounding box?
6) (I think) with the current layering the particular OVAL check test comment (often clarifying the requirement) might not be immediately visible / noticeable:
Example (current output):
OVAL details
nosuid on /dev/shm
mount point | device | uuid | fs type | mount options | mount options | mount options | total space | space used | space left
/dev/shm | tmpfs | tmpfs | rw | seclabel | relatime | 128830 | 57 | 128773
Here the first row table header ("nosuid on /dev/shm") is that comment. Would it be possible to highlight it somehow? E.g.
OVAL details
Requirement: nosuid on /dev/shm
(Evaluated) System status: mount point device uuid ... ... ...
Or at least use a bold font for the "nosuid on /dev/shm" OVAL comment.
Otherwise I think in general the output is very nice. Should I notice other points, I will share them.
Thank you && Regards, Jan. -- Jan iankko Lieskovsky / Red Hat Security Technologies Team
On 8/29/14, 4:52 AM, Jan Lieskovsky wrote:
Hello Martin,
thank you for the preview of the new look / functionality.
A couple of points for the report case (not sure they have been mentioned already):
Text under Characteristics paragraph:
" User root started the evaluation at 2014-08-28T16:44:12. Evaluation finished at 2014-08-28T16:50:10. The target machine was called localhost.localdomain.
Benchmark from /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml was used. Profile usgcb-rhel6-server was selected."
Might it look better when organized into a table (example below)?
Evaluation Run:
Performed by:           root
Started:                2014-08-28T16:44:12   \  here maybe also split Y-M-D
Finished:               2014-08-28T16:50:10   /  with a space from H-M-S?
Target (of Evaluation): localhost.localdomain
Benchmark Location:     /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml
Evaluated Profile:      usgcb-rhel6-server
+1
Regarding colours - inconsistency in colour shades between the "Compliance and Scoring" and "Rule Overview" sections. Would it be possible to merge the shades? (IOW use just one shade of red, green, gray, orange, etc. across the document)
Missing the "Rule overview" & "Rule details" anchors (they were present in the previous version). Reasoning: "Rule overview" isn't displayed when viewing the top of the page. Would it be possible to have a "Rule overview" anchor in the top panel to be able to navigate there quickly?
Regarding "Rule details" -- since they aren't displayed by default, a "Rule details" anchor would also enable the "Show all result details" behaviour (after clicking "Rule details" the page would behave as if the "Show all result details" button had been clicked, and would navigate to the start of the details table).
Rule titles aren't displayed in the colour of the result -- not sure we want this, but could you possibly provide a preview of a case where passed rule titles would be coloured green (the same colour as the bounding box around the rule result), unknown-state rule titles in orange, notchecked rule titles in gray, etc.?
Regarding the "Result Details" table - it generally looks fine, but sometimes the inner rule description is larger than the outer red-coloured table. Example rule: "Verify and Correct File Permissions with RPM"
The inner table spans outside the red-coloured bounding box. Would it be possible either to reduce the inner table or enlarge the outer bounding box?
(I think) with the current layering the particular OVAL check test comment (often clarifying the requirement) might not be immediately visible / noticeable:
Example (current output):
OVAL details
nosuid on /dev/shm
mount point | device | uuid | fs type | mount options | mount options | mount options | total space | space used | space left
/dev/shm | tmpfs | tmpfs | rw | seclabel | relatime | 128830 | 57 | 128773
Here the first row table header ("nosuid on /dev/shm") is that comment. Would it be possible to highlight it somehow? E.g.
OVAL details
Requirement: nosuid on /dev/shm (Evaluated) System status:
mount point device uuid ... ... ...
Or at least use a bold font for the "nosuid on /dev/shm" OVAL comment.
Otherwise I think in general the output is very nice. Should I notice other points, I will share them.
Thank you && Regards, Jan.
Jan iankko Lieskovsky / Red Hat Security Technologies Team
This took a bit longer to process and fix.
----- Original Message -----
From: "Jan Lieskovsky" jlieskov@redhat.com To: "Martin Preisler" mpreisle@redhat.com Cc: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org Sent: Friday, August 29, 2014 10:52:44 AM Subject: Re: New report and guide in openscap 1.1.0
Hello Martin,
thank you for the preview of the new look / functionality.
A couple of points on the report (not sure whether they have been mentioned already):
Text under the Characteristics paragraph:
" User root started the evaluation at 2014-08-28T16:44:12. Evaluation finished at 2014-08-28T16:50:10. The target machine was called localhost.localdomain.
Benchmark from /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml was used. Profile usgcb-rhel6-server was selected."
Might it look better when organized into a table (example below)?
Evaluation Run:
Performed by:           root
Started:                2014-08-28T16:44:12  \ here maybe also split Y-M-D with space from H-M-S?
Finished:               2014-08-28T16:50:10  /
Target (of Evaluation): localhost.localdomain
Benchmark Location:     /usr/share/xml/scap/ssg/content/ssg-rhel6-xccdf.xml
Evaluated Profile:      usgcb-rhel6-server
Fixed, see 4cc9cdc5f33c6d74c85498d29b7cdb6b0d265700
- Regarding colours - inconsistent colour shades in the "Compliance and Scoring" vs "Rule Overview" sections. Would it be possible to merge the shades? (IOW use just one shade of red, green, gray, orange, etc. across the document)
Fixed, see 1f72e5e3c3e3fad4b0d0b02558dafaa818085682
- Missing the "Rule overview" & "Rule details" anchors (they were present in the previous version). Reasoning: "Rule overview" isn't visible when the top of the page is displayed. Would it be possible to have a "Rule overview" anchor in the top panel, to be able to navigate there quickly?
As for "Rule details" -- since they aren't displayed by default, a "Rule details" anchor could enable the "Show all result details" behaviour (after clicking it, the page would behave as if the "Show all result details" button had been clicked, and would navigate to the start of the details table)
I don't see any reason to include those. If you want this behavior, disable JavaScript and reload the report. I think it's inferior to the modal dialogs.
- Rule titles aren't displayed in the colour of the result -- not sure we want this, but could you possibly provide a preview of a case where passed rule titles would be coloured green (the same colour as the bounding box around the rule result), unknown-state rule titles in orange, notchecked rule titles in gray, etc.?
Correct, I highlight rules that need attention. All other rules have plain color. I don't want to make the report even more colorful than it is :-)
Sorry but I do not have time to do this, patches are welcome of course.
- Regarding the "Result Details" table - it generally looks fine, but sometimes the inner rule description is larger than the outer red-coloured box. Example rule: "Verify and Correct File Permissions with RPM".
The inner table spans outside the red-coloured bounding box. Would it be possible either to shrink the inner table or to enlarge the outer bounding box?
I made the check-system-details div scroll when overflowing, see e4d6b3a2476f0487319127d56fbc338832585b42
Done the same for remediation fixes in cd68636eb9dde7a5d00dc8b5830d95015cc8d667
- (I think) from the current layering the particular OVAL check test comment
(often clarifying the requirement) might not be immediately visible / noticeable:
Example (current output):
OVAL details

nosuid on /dev/shm
mount point  device  uuid  fs type  mount options  mount options  mount options  total space  space used  space left
/dev/shm     tmpfs         tmpfs    rw             seclabel       relatime       128830       57          128773
Here the first row table header ("nosuid on /dev/shm") is that comment. Would it be possible to highlight it somehow? E.g.
OVAL details
Requirement: nosuid on /dev/shm (Evaluated) System status:
mount point device uuid ... ... ...
Or use at least bold font for the "nosuid on /dev/shm" OVAL comment.
See 75f5f4f316a7d3cab582e5c9a09f8f89f103e24e
It now says "Items violating {OVAL test}:" because that's exactly what we are showing there.
Otherwise I think in general the output is very nice. If I notice other points, I will share them.
Looking forward to that.
Thank you for the fixes, Martin. (I think) it looks better than previous version.
A couple more issues, based on review of the attachment (sorry if I'm demanding a lot):
* "Rule result breakdown" section - for rules with an unknown result, the 'unknown' label isn't shown after the count (the '6' digit),
* Same for "Failed rules by severity breakdown" - there's a leading '2' digit, but it isn't stated that it labels the 'important' (category of) rules,
* "Score" table - the rules that passed are scored as a percentage ('58.46%'), while the failed ones aren't - could the same percentage be added there?,
* there's one more case (FWICT) of an overflowing table -- for the "Ensure auditd Collects Unauthorized Access Attempts to Files (unsuccessful)" rule. Not in the "dialog view", but rather after clicking the "Show all rule details" button (the dialog view is fine).
* (Kind request) - could the default link colour be set to the same light blue background colour used in the "identifiers:" field (to further reduce the number of colours)?
Thank you && Regards, Jan.
-- Jan iankko Lieskovsky / Red Hat Security Technologies Team
----- Original Message -----
From: "Martin Preisler" mpreisle@redhat.com
To: "Jan Lieskovsky" jlieskov@redhat.com
Cc: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Monday, September 1, 2014 4:01:07 PM
Subject: Re: New report and guide in openscap 1.1.0
- Missing the "Rule overview" & "Rule details" anchors (they were present
in previous version). Reasoning "Rule overview" isn't displayed when displaying top of the page. Would it be possible to have "Rule overview" anchor in the top panel to be able quickly to navigate there?
Ad "Rule details" -- since they aren't displayed by default, having "Rule details" anchor would either enable the "Show all result details" button (the page display would behave after clicking "Rule details" like the "Show all result details" button was clicked + the top of the page would be navigated to the start of the details table)
I don't see any reason to include those. If you want this behavior, disable JavaScript and reload the report. I think it's inferior to the modal dialogs.
Ok.
- Rule titles aren't displayed in the colour of the result -- not sure we
want this, but could you possibly provide preview of a case, where passed rules titles would be coloured out in green (same colour as the bounding box has around the rule result), unknown state rule titles would be in orange, notchecked rule titles in gray etc.
Correct, I highlight rules that need attention. All other rules have plain color. I don't want to make the report even more colorful than it is :-)
Sorry but I do not have time to do this, patches are welcome of course.
Ok :).
----- Original Message -----
From: "Jan Lieskovsky" jlieskov@redhat.com
To: "Martin Preisler" mpreisle@redhat.com
Cc: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Monday, September 1, 2014 5:57:18 PM
Subject: Re: New report and guide in openscap 1.1.0
Thank you for the fixes, Martin. (I think) it looks better than previous version.
Couple of issues yet -- based on the review of the attachment (sorry if being demanding a lot):
"Rule result breakdown" section - for the rules with unknown result, the 'unknown' title isn't stated after the count ('6' digit),
Same for "Failed rules by severity breakdown" - there's starting '2' digit, but it's not stated this labels 'important' (category of) rules,
I think the reason why is obvious. It just doesn't fit there. If you hover over the progress bar you will get an explicit explanation.
- "Score" table - the rules that passed are scored in percentage ('58.46%'), while the failed ones aren't - could same percentage be added there?,
This is just XCCDF scoring, please refer to the specification.
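For readers without the spec at hand: the XCCDF default scoring model is a weighted average over the selected rules, with a pass contributing full weight and a fail contributing zero, which is why the report carries a single score line rather than per-result percentages. A deliberately simplified flat-case sketch, with hypothetical counts chosen only to land near the 58.46% figure mentioned above (the real model recurses over groups and honours per-rule weights):

```shell
# Flat-case sketch of the XCCDF "default" scoring model: each selected rule
# carries a weight (1.0 here); a pass scores 100, a fail scores 0; the
# overall score is the weighted average. Counts below are hypothetical.
awk 'BEGIN {
  passed = 38; failed = 27
  score = 100 * passed / (passed + failed)
  printf "score: %.2f%%\n", score   # prints: score: 58.46%
}'
```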
- there's one case (FWICT) of overflowing table yet -- for the "Ensure auditd Collects Unauthorized Access Attempts to Files (unsuccessful)" rule. Not in the "dialog view', but rather after clicking the "Show all rule details" button (dialog view is fine).
Good catch. It's because it's the description that's overflowing there, not the remediation or OVAL results.
Fixed in 03745668e4d6e43ff668a5285507482e7118b41c
- (Kind request) - could the default links colour be set to be the same as the background light blue colour used in the "identifiers:" field (to reduce count of colours yet)
No. I use Bootstrap CSS and try to deviate from it as little as possible. I am not a designer. Tell this to the Bootstrap folks at getbootstrap.com
----- Original Message -----
From: "Martin Preisler" mpreisle@redhat.com
To: "Jan Lieskovsky" jlieskov@redhat.com
Cc: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Monday, September 1, 2014 7:15:14 PM
Subject: Re: New report and guide in openscap 1.1.0
----- Original Message -----
From: "Jan Lieskovsky" jlieskov@redhat.com
To: "Martin Preisler" mpreisle@redhat.com
Cc: "SCAP Security Guide" scap-security-guide@lists.fedorahosted.org
Sent: Monday, September 1, 2014 5:57:18 PM
Subject: Re: New report and guide in openscap 1.1.0
Thank you for the fixes, Martin. (I think) it looks better than previous version.
Couple of issues yet -- based on the review of the attachment (sorry if being demanding a lot):
"Rule result breakdown" section - for the rules with unknown result, the 'unknown' title isn't stated after the count ('6' digit),
Same for "Failed rules by severity breakdown" - there's starting '2' digit, but it's not stated this labels 'important' (category of) rules,
I think the reason why is obvious. It just doesn't fit there. If you hover over the progress bar you will get an explicit explanation.
Would it be possible to use a legend in both cases then, and not display the category names at all (to be consistent)?
e.g. something like:

Rule result breakdown
  [red_square] Failed   [green_square] Passed   [orange_square] Notchecked

then display just the proportion bar

  [green][red][orange]

(without the Failed etc. square labels)

Failed rules by severity breakdown
  [red_square] Important   [orange_square] Medium   [light_blue_square] Low

then display just the proportion bar

  [red][orange][light_blue]

(without the Important, Medium etc. labels)
- "Score" table - the rules that passed are scored in percentage ('58.46%'), while the failed ones aren't - could same percentage be added there?,
This is just XCCDF scoring, please refer to the specification.
Ok.
- there's one case (FWICT) of overflowing table yet -- for the "Ensure auditd Collects Unauthorized Access Attempts to Files (unsuccessful)" rule. Not in the "dialog view', but rather after clicking the "Show all rule details" button (dialog view is fine).
Good catch. It's because it's the description that's overflowing there, not the remediation or OVAL results.
Fixed in 03745668e4d6e43ff668a5285507482e7118b41c
Thank you.
- (Kind request) - could the default links colour be set to be the same as the background light blue colour used in the "identifiers:" field (to reduce count of colours yet)
No. I use Bootstrap CSS and try to deviate from it as little as possible. I am not a designer. Tell this to the Bootstrap folks at getbootstrap.com
Ok.
Thank you && Regards, Jan.
-- Jan iankko Lieskovsky / Red Hat Security Technologies Team
-- Martin Preisler
On 8/28/14, 1:49 PM, Martin Preisler wrote:
[...]
This is really, really great. I attended a presentation last week where a community member used one of your earlier mockups in a presentation!
Could the "show all results" button also be added to the top? Slightly inconvenient to scroll all the way to the bottom, then have to scroll all the way to the top to begin full review.
----- Original Message -----
From: "Shawn Wells" shawn@redhat.com
To: scap-security-guide@lists.fedorahosted.org
Sent: Sunday, August 31, 2014 8:20:07 AM
Subject: Re: New report and guide in openscap 1.1.0
This is really, really great. I attended a presentation last week where a community member used one of your earlier mockups in a presentation!
Could the "show all results" button also be added to the top? Slightly inconvenient to scroll all the way to the bottom, then have to scroll all the way to the top to begin full review.
Not sure I understand this right. The purpose of "Show all results" is to allow you to go through all rules one by one without any tree structure. If you are doing a review you can just click the rule name to spawn a modal dialog.
Why would you click show all results and then scroll back up?
Martin,
Is the new report in the current master branch on Github.com/OpenSCAP or Fedora?
From your blog post, 'https://git.fedorahosted.org/cgit/openscap.git/log/?h=xslt-devel' returns an invalid branch.
Greg
Nevermind. It looks like it was merged in with OpenSCAP master on Fedora. Greg