[Xccdf-dev] NOTCHECKED vs UNKNOWN rule result

[Xccdf-dev] NOTCHECKED vs UNKNOWN rule result

joval
In the XCCDF 1.2 specification, section 6.6.4.2 (Table 26) says that check systems that are not supported should result in a NOTCHECKED result.  A NOTCHECKED result will end up having no impact on a score (see Tables 40 and 41 in sections 7.3.2.2 and 7.3.2.3).  But is that really what's desired?  It seems to me that in such cases, perhaps an UNKNOWN result would make more sense.
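
To make the scoring impact concrete, here is a minimal sketch (Python, with made-up rule results; a simplified flat-style model, not the spec's exact algorithms):

    # Simplified flat-style scoring sketch (hypothetical data).  NOTCHECKED
    # is excluded from the maximum score, so unchecked rules silently
    # inflate the percentage.
    EXCLUDED = {"notapplicable", "notchecked", "informational", "notselected"}

    def flat_score(results):
        # results: list of (result, weight) pairs
        max_score = sum(w for r, w in results if r not in EXCLUDED)
        earned = sum(w for r, w in results if r == "pass")
        return 100.0 * earned / max_score if max_score else 0.0

    # Two OVAL rules pass; three OCIL rules go unchecked:
    results = [("pass", 1.0), ("pass", 1.0),
               ("notchecked", 1.0), ("notchecked", 1.0), ("notchecked", 1.0)]
    print(flat_score(results))  # 100.0: full credit despite three unchecked rules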

Does anyone else have an opinion?

Regards,
--David Solin


Re: [Xccdf-dev] NOTCHECKED vs UNKNOWN rule result

Martin Preisler
Hi,
the specification is explicit in this case, so OpenSCAP is implemented to report
NOTCHECKED as the result.

quote from XCCDF 1.2 spec (6.6.4.2):
> The testing tool encountered some problem and the result is unknown. For example, a result of
> ‘unknown’ might be given if the testing tool was unable to interpret the output of the checking
> engine (the output has no meaning to the testing tool).

Not supporting the check engine could certainly count as a problem. However,
the example implies that there were problems processing output from a checking
engine, which means there has to be some output from a checking engine.

It seems to me that "notchecked" is a special case of "unknown" rather than
an overlap. At least that's how I see it :-)

I also find it strange that it has no impact on the score. That could be
dangerous, because people don't always inspect all the results and might just
look at the score.

--
Martin Preisler


Re: [Xccdf-dev] NOTCHECKED vs UNKNOWN rule result

joval
Yes, I agree that NOTCHECKED is like a special case of UNKNOWN; it's just the fact that it has no impact on the score that I think is dangerous.  Many scanners don't support OCIL or SCE, but they'd produce a score that essentially gives "credit" for rules that went unchecked.

Regards,
--David Solin


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Harrison, Timothy [USA]
While I agree, in part or in whole, with both of your positions, one scenario that requires consideration is when multiple tools are leveraged in order to achieve comprehensive coverage across all checking engines.  If you leverage more than one tool in such a fashion, then NOTCHECKED results would impact scoring.  Granted, if conflicting results between the two tools are properly handled, the non-NOTCHECKED results would take precedence, thereby addressing the potential for NOTCHECKED results to skew your scores.  I don't believe this is addressed in any of the SCAP specifications, but please let me know for future reference if it is.
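
To sketch the precedence I have in mind (Python; the rule-id-to-result dictionaries are hypothetical, not any actual tool's data model):

    # Merge results from two tools: any concrete result overrides a
    # NOTCHECKED from the other tool (hypothetical data model).
    def merge(results_a, results_b):
        merged = dict(results_a)
        for rule, result in results_b.items():
            if merged.get(rule, "notchecked") == "notchecked":
                merged[rule] = result
        return merged

    tool_a = {"rule-1": "pass", "rule-2": "notchecked"}
    tool_b = {"rule-2": "fail", "rule-3": "notchecked"}
    print(merge(tool_a, tool_b))
    # {'rule-1': 'pass', 'rule-2': 'fail', 'rule-3': 'notchecked'}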

Additionally, keep in mind that, depending on the XCCDF Rule selection, a Rule may go unchecked simply because it was not selected, and as a result it should be excluded from scoring.  While one might exclude the unselected Rule(s) from the results, a case may be made for including such results.  Where multiple profiles exist, it may be beneficial to have results for all Rules in order to allow post-evaluation review based on any one of the profiles defined within the XCCDF document.  Only where Values are refined differently between the profiles would there be a need for re-evaluation based on the selected profile, and even then only the delta would require re-evaluation.

The questions that need to be answered are whether these are the only instances in which one might want to exclude NOTCHECKED results from scoring, and whether they, or others, warrant such an exclusion.

V/r,
Tim Harrison


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

joval
Hi Tim,

There is also a NOTSELECTED result that addresses your second point.  My concern is that, from a scoring perspective, there should probably be a difference between NOTCHECKED and NOTSELECTED.  NOTCHECKED is more like UNKNOWN than NOTSELECTED, and should be scored accordingly.

Basically, it's a matter of the specification being quite clear, but apparently not well thought out on this point.

Best regards,
--David Solin


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Harrison, Timothy [USA]
David,

My mistake; I overlooked the NOTSELECTED status, but that does not fully nullify the second point.  If you were to switch between profiles and were not able to re-evaluate the delta, the NOTSELECTED status should change to NOTCHECKED with respect to that profile.

I'd be careful in suggesting that it was not well thought out, as there may have been a specific rationale behind excluding these from scoring.  Granted, it could simply have been the fact that UNKNOWNs don't get included and, as you stated, this is a form of UNKNOWN.

To adjust course on this: a discussion of how to score this type of result has yet to be put forward.  Maybe there's an avenue to account for all unknown results by calculating the percentage of checks that returned other result values versus how many checks were selected.  I think someone exercising due diligence would review the unknown results and would leverage a dashboard to bring those numbers to their attention, but how would you suggest these be included in the score?
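
For example, a rough sketch of that calculation (Python; the result values are hypothetical):

    # Coverage ratio sketch: the fraction of selected rules that produced
    # a concrete result (hypothetical data).
    def coverage(results):
        selected = [r for r in results if r != "notselected"]
        concrete = [r for r in selected if r not in ("unknown", "notchecked")]
        return len(concrete) / len(selected) if selected else 0.0

    print(coverage(["pass", "fail", "notchecked", "unknown", "notselected"]))
    # 0.5: half of the selected rules were actually evaluated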

V/r,
Tim Harrison


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

joval
I'm always open to the possibility that I'm wrong.  In this case I'd even welcome being wrong; if there was a specific rationale, I'd like to know what it is.  But as good and useful as they are, these specifications are pretty complex, and some parts are more mature than others.

As for a way forward... we could, say, produce a range of scores: highest and lowest, the difference being the combined weights of unknown and unchecked (but selected) rules.  I think that would be the most accurate way to represent the uncertainty.
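
A minimal sketch of the calculation (Python; the same simplified flat-style scoring as in my first message, with made-up results):

    # Bound the score by assuming every unknown/notchecked (but selected)
    # rule either passes (max) or fails (min).  Simplified flat-style model.
    EXCLUDED = {"notapplicable", "informational", "notselected"}
    UNCERTAIN = {"unknown", "notchecked"}

    def flat_score(results):
        max_score = sum(w for r, w in results if r not in EXCLUDED)
        earned = sum(w for r, w in results if r == "pass")
        return 100.0 * earned / max_score if max_score else 0.0

    def score_range(results):
        best = flat_score([("pass" if r in UNCERTAIN else r, w) for r, w in results])
        worst = flat_score([("fail" if r in UNCERTAIN else r, w) for r, w in results])
        return worst, best

    results = [("pass", 1.0), ("fail", 1.0), ("notchecked", 1.0), ("unknown", 1.0)]
    print(score_range(results))  # (25.0, 75.0)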

Regards,
--David Solin


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Harrison, Timothy [USA]

I think in practice a range of scores would be difficult for CNA to accommodate.  I think at most you could have two scores, similar to how NVD provides a base score and a temporal score, but ideally you’d want to come up with a single score.  Can you share a formula to demonstrate how the approach you suggest might be calculated?


V/r,

Tim Harrison



Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

joval
Hi Tim,

You'd simply create two scenarios, one in which every unknown and notchecked rule passes, and another where every one fails.  The former is the max score, the latter is the min score.

It could alternatively be expressed as the median score (i.e., the midpoint between min and max), plus an "uncertainty" score equal to half the range from min to max.  Scanners that support all the systems used in a benchmark would then provide a score with an uncertainty of 0.
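
Expressed as a sketch (Python; the min/max inputs are hypothetical, e.g. what the score_range() sketch earlier in the thread would produce):

    # Midpoint plus uncertainty from the (min, max) score bounds.
    def midpoint_and_uncertainty(worst, best):
        return (worst + best) / 2.0, (best - worst) / 2.0

    print(midpoint_and_uncertainty(25.0, 75.0))  # (50.0, 25.0): report 50 +/- 25
    print(midpoint_and_uncertainty(80.0, 80.0))  # (80.0, 0.0): full coverage, no uncertainty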

Either way, it's better than the current score, which sort of pretends that these checks just don't matter.

Regards,
--David Solin


Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Harrison, Timothy [USA]

Hi David,


I like the simplicity.  I could envision the tool reporting a “median” score of ‘x’ followed by ‘±y’ where uncertainty exists, but I’m still thinking this may paint too inaccurate a picture of the true system state, as well as create a headache for CNA.  I do agree it is important to know what has not been checked, and adjusting the score to account for this uncertainty is probably not a bad idea.  But I foresee some challenges with such an approach: on the tool side, it would highlight which tools support the most/least types of checking; on the SA side, it may bloat or water down the score too severely to be reliable; and on the CNA side, it may be challenging to assign a compliant/non-compliant status depending on the margin of uncertainty.


Regards,

Tim Harrison



Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

joval
LOL -- but that is the whole idea, isn't it?!  How can you say whether or not a system is compliant or non-compliant with a rule that you have not actually checked?

It's easy to write a tool that gives credit for passing all the rules that it doesn't check, but that obviously defeats the whole mission of compliance!

We need a way in the standard to accommodate the reality that not every tool supports every check system.

On 9/19/2013 2:03 PM, Harrison, Timothy [USA] wrote:

Hi David,

 

I like the simplicity, I could envision the tool reporting maybe the “median” score of ‘x’ followed by a ‘±y’ in the instance where uncertainty exists, but I’m still thinking this may provide too inaccurate of a picture of the true system state as well as create a headache for CNA.  I do agree it is important to know what has not been checked and that adjusting the score to account for this uncertainty is probably not a bad idea, but I foresee some challenges to such an approach as on the tool side it would highlight which tools support the most/least types of checking, on the SA side it may bloat or water down the score too severely to be reliable, and on the CNA side it may be challenging to assign a compliant/non-compliant status depending on the margin of uncertainty.

 

Regards,

Tim Harrison

 

From: [hidden email] [[hidden email]] On Behalf Of David Solin
Sent: Thursday, September 19, 2013 12:20 PM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

 

Hi Tim,

You'd simply create two scenarios, one in which every unknown and notchecked rule passes, and another where every one fails.  The former is the max score, the latter is the min score.

It could alternatively be expressed as the median score (between min and max), plus an "uncertainty" score which would be half the range from min to max.  Scanners that support all the systems used in a benchmark would then provide a score with an uncertainty of 0.

Either way, it's better than the current score, which sort of pretends that these checks just don't matter.

Regards,
--David Solin

On 9/19/2013 9:20 AM, Harrison, Timothy [USA] wrote:

I think in practice a range of scores would be difficult for CNA to accommodate.  I think at most you could have two scores similar to how NVD provides a base score and a temporal score, but ideally you’d want to come up with a single score.  Can you share a formula to demonstrate how the approach you suggest might be calculated?

 

V/r,

Tim Harrison

 

From: [hidden email] [[hidden email]] On Behalf Of David Solin
Sent: Wednesday, September 18, 2013 9:03 PM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

 

I'm always open to the possibility that I'm wrong.  In this case I'd even welcome being wrong; if there was a specific rationale I'd like to know what it is.  But as good and useful as they are, these specifications are pretty complex, and some parts are more mature than others.

As for a way forward... we could, say, produce a range of scores: highest and lowest, the difference being the combined weights of unknown and unchecked (but selected) rules.  I think that would be the most accurate way to represent the uncertainty.

Regards,
--David Solin

On 9/17/2013 9:10 AM, Harrison, Timothy [USA] wrote:

David,

My mistake, I overlooked the NOTSELECTED status, but that does not fully nullify the second point.  If you were to switch between profiles and were not able to re-evaluate the delta, the NOTSELECTED status should change to NOTCHECKED with respect to that profile.

I'd be careful in suggesting that it was not well thought out as there may have been specific rationale behind excluding these from scoring.  Granted, it could simply have been the fact that UNKNOWN's don't get included and as you stated this is a form of UNKNOWN.

To try and adjust course on this, some discussion on the approach to scoring this type of result has yet to be put forward.  Maybe there's an avenue to account for all unknown results by performing some calculation of the percentage of checks which returned other result values vs. how many checks were selected.  I think someone giving due diligence would review the unknown results and would leverage a dashboard to bring those numbers to their attention, but how would you suggest these be included in the score?

V/r,
Tim Harrison

 

Timothy Harrison, CISSP

Associate

Booz | Allen | Hamilton


Mobile: 717-372-5768

[hidden email]


From: [hidden email] [[hidden email]] on behalf of David Solin [[hidden email]]
Sent: Tuesday, September 17, 2013 9:55 AM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Hi Tim,

There is also a NOTSELECTED result that addresses your second point.  My concern is that, from a scoring perspective, there should probably be a difference between NOTCHECKED and NOTSELECTED.  NOTCHECKED is more like UNKNOWN than NOTSELECTED, and should be scored accordingly.

Basically, it's a matter of the specification being quite clear, but apparently not well thought out on this point.

Best regards,
--David Solin

On 9/17/2013 8:48 AM, Harrison, Timothy [USA] wrote:

While I agree, in part or in whole, with both of your positions, one scenario which requires consideration is when multiple tools are leveraged in order to comprehensive coverage for all checking engines.  If you leverage more than one tool in such a fashion then NOTCHECKED results would impact scoring.  Granted, if properly handled the conflicting results between the two tools the non-NOTCHECKED results would take precedence there by addressing the potential for NOTCHECKED results to skew your scores.  I don't believe this is addressed in any of the SCAP specification, but please let me know if it is for future reference.

Additionally, you need to keep in mind that depending on the XCCDF Rule selection a Rule may not be checked due to it not being selected and as a result should be excluded from scoring.  While one might exclude the unselected Rule(s) from the results a case may be made for including such results.  In the instance where multiple profiles exist it may be beneficial to have the results for all Rules in order to allow post evaluation review based on any one of the profiles defined within the XCCDF document.  Only in the instance where Values are refined differently between the profiles would there be a need for re-evaluation based on the selected profile, but even then only the delta would require re-evaluation.

The questions which need to be answered are whether these are the only instances in which one may want to exclude NOTCHECKED results from scoring and whether they, or others, warrant such an exclusion.

V/r,
Tim Harrison

Timothy Harrison, CISSP

Associate

Booz | Allen | Hamilton


Mobile: 717-372-5768

[hidden email]


--

jOVAL.org: SCAP Simplified.
Learn More | Features | Download




_______________________________________________
XCCDF-dev mailing list
[hidden email]
To unsubscribe, send an email message to [hidden email].

 


jOVAL.org: OVAL implemented in Java.
Scan any machine from any machine. For free!
Learn More | Features | Download

Reply | Threaded
Open this post in threaded view
|  
Report Content as Inappropriate

Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

Harrison, Timothy [USA]

Touché.  I guess what I was trying to say is that I’m unsure what the right approach or formula would be for such a vexing issue.

 

I agree some form of standard approach should be defined, but more community involvement and interest will be necessary for that to happen.

 

From: [hidden email] [mailto:[hidden email]] On Behalf Of David Solin
Sent: Thursday, September 19, 2013 3:22 PM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

 

LOL -- but that is the whole idea, isn't it?!  How can you say whether or not a system is compliant or non-compliant with a rule that you have not actually checked?

It's easy to write a tool that gives credit for passing all the rules that it doesn't check, but that obviously defeats the whole mission of compliance!

We need a way in the standard to accommodate the reality that not every tool supports every check system.

On 9/19/2013 2:03 PM, Harrison, Timothy [USA] wrote:

Hi David,

 

I like the simplicity.  I could envision the tool reporting a “median” score of ‘x’ followed by ‘±y’ where uncertainty exists, but I’m still thinking this may paint too inaccurate a picture of the true system state, as well as create a headache for CNA.  I do agree it is important to know what has not been checked, and adjusting the score to account for this uncertainty is probably not a bad idea, but I foresee some challenges with such an approach: on the tool side, it would highlight which tools support the most (or fewest) types of checking; on the SA side, it may bloat or water down the score too severely to be reliable; and on the CNA side, it may be challenging to assign a compliant/non-compliant status depending on the margin of uncertainty.

 

Regards,

Tim Harrison

 

From: [hidden email] [[hidden email]] On Behalf Of David Solin
Sent: Thursday, September 19, 2013 12:20 PM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

 

Hi Tim,

You'd simply create two scenarios: one in which every unknown and notchecked rule passes, and another in which every one fails.  The former yields the max score, the latter the min score.

It could alternatively be expressed as the median score (between min and max), plus an "uncertainty" score which would be half the range from min to max.  Scanners that support all the systems used in a benchmark would then provide a score with an uncertainty of 0.

Either way, it's better than the current score, which sort of pretends that these checks just don't matter.
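
A rough Python sketch of the calculation, assuming the flat scoring model where each rule contributes its weight (the rules and weights below are made up):

    # Sketch: the flat-model score is the weight of passing rules over
    # the total weight of selected rules; uncertain rules give a min/max.
    rules = [  # (result, weight), illustrative values
        ("pass", 1.0), ("fail", 1.0), ("pass", 1.0),
        ("notchecked", 1.0), ("unknown", 1.0),
    ]

    total = sum(w for _, w in rules)
    passed = sum(w for r, w in rules if r == "pass")
    uncertain = sum(w for r, w in rules if r in ("unknown", "notchecked"))

    score_min = 100.0 * passed / total                # all uncertain fail
    score_max = 100.0 * (passed + uncertain) / total  # all uncertain pass
    median = (score_min + score_max) / 2
    uncertainty = (score_max - score_min) / 2

    print(f"score = {median:.1f} +/- {uncertainty:.1f} "
          f"(min {score_min:.1f}, max {score_max:.1f})")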

Regards,
--David Solin

On 9/19/2013 9:20 AM, Harrison, Timothy [USA] wrote:

I think that, in practice, a range of scores would be difficult for CNA to accommodate.  At most you could have two scores, similar to how NVD provides a base score and a temporal score, but ideally you’d want to come up with a single score.  Can you share a formula to demonstrate how the approach you suggest might be calculated?

 

V/r,

Tim Harrison

 

From: [hidden email] [[hidden email]] On Behalf Of David Solin
Sent: Wednesday, September 18, 2013 9:03 PM
To: [hidden email]
Subject: Re: [Xccdf-dev] [External] Re: NOTCHECKED vs UNKNOWN rule result

 

I'm always open to the possibility that I'm wrong.  In this case I'd even welcome being wrong; if there was a specific rationale I'd like to know what it is.  But as good and useful as they are, these specifications are pretty complex, and some parts are more mature than others.

As for a way forward... we could, say, produce a range of scores: highest and lowest, the difference being the combined weights of unknown and notchecked (but selected) rules.  I think that would be the most accurate way to represent the uncertainty.

Regards,
--David Solin



_______________________________________________
XCCDF-dev mailing list
[hidden email]
To unsubscribe, send an email message to [hidden email].