
Quality and Security Review Matrix


Olaf Renner, Muddasar Ahmed, Amy Zwarico

This is a drafting space for LFN Quality and Security goals to:

  1. Define common quality and security goals across LFN projects

  2. Define metrics and tools to measure and verify whether goals are reached

  3. Define templates to guide, document, and review project progress

Background:
It was agreed in the LFN TAC that the security of projects should be increased and security best practices implemented across LFN. This goal was extended to also define overall quality goals across LFN, since project health was traditionally reviewed by the TAC but the review did not include security targets and was not done in a consistent, automatable way.

Why do we need this:
With regulatory requirements like the US Cybersecurity Executive Order and the EU Cyber Resilience Act, companies increasingly evaluate how secure and well maintained the open source software they use in their products is. To ensure that LFN continues to be seen as a source of quality software, we should review and agree on which goals we want to reach and how to measure and verify them.
With new projects coming to LFN, it becomes even more important to have a clear definition of what will be expected and how these goals can be reached.

Working group:

As this work is expected to take some time and may require dedicated meetings to discuss the details, the formation of a working group is proposed.
Because this is an LFN-wide activity, volunteers from the projects are needed to help define and review the goals and criteria. Please add your name.

Assessment of existing documentation, guidelines and tools:

As a first step toward defining goals, the following table collects what is currently documented in LFN, what other organizations have defined, and what tools could be used for measurement and verification.
The first column lists the quality/security criteria from the various sources added in the other columns. Colored rows organize criteria into groups and contain sources that can be used for the criteria under the group (if not further detailed in the rows).

Quality Goal

LFN Wiki

CNCF templates

OpenSSF

Key Measures

LFX

GitHub

Notes

Project Vitals

Project Data Template (currently used both for induction ☑️ and health review ✅ *)

LFN Lifecycle states and guidelines (metrics per lifecycle stage)

LFN Security Forum Best Practices

Best Practices

Passing badge

Scorecard

Project Name

☑️

README-template.md

PCC Project Definition

Project Creation Date

☑️

Age?

PCC Project Definition

Project License

☑️

LICENSE; README-template.md

[floss_license][floss_license_osi][license_location]

Degree of FOS

PCC Project Definition

Legal Details and checks

PCC Project Definition

Community Size

☑️

#s

Contributing organizations (Diversity)

☑️ ✅

☑️

#s

Number of contributors

☑️

#s

Lifecycle Stage

☑️

PCC Project Definition

Release schedule

☑️

Months

Adoption

☑️

Not sure how to measure. Downloads?

Health Review

CNCF DevStats; CLOMonitor

Criticality Score

PCC Health Metrics; Insights

GitHub health metrics

Release Information

Number of commits (over last year)

Number of active committers

Number of Active committers per organization

Number of PR/changeset

Mailing list activity

Project & Community Resources

[discussion]

Website

☑️

README.md

[description_good]

Yes/No

PCC Domain

Wiki

☑️

README.md

Yes/No

PCC Wiki

Mailing List

☑️

README.md

Yes/No

PCC mailing list

Slack

☑️

README.md

Yes/No

Community Meetings

☑️

README.md

Yes/No

PCC manage meetings

Project Governance

TSC/TOC

☑️

☑️

GOVERNANCE.md; GOVERNANCE-elections.md; GOVERNANCE-maintainer.md; 

Yes/No

Charter

☑️

Yes/No

Code of Conduct

☑️

CODE_OF_CONDUCT.md; README.md

Yes/No

How to contribute

☑️

☑️

CONTRIBUTING.md; README.md

[interact]; [contribution];[contribution_requirements]

Yes/No

Project Roles

☑️

☑️

CONTRIBUTOR_LADDER.md

Yes/No

Maintainers

☑️

☑️

MAINTAINERS.md

How to Review

☑️

☑️

REVIEWING.md

Adding/Removing PTLs

☑️

☑️

MAINTAINERS.md ??

Sub-Project Lifecycle

☑️

☑️

GOVERNANCE-subprojects.md

Dispute Resolution

☑️

Adding/removing committers

☑️

☑️

Sub-projects without a lead

☑️

Documentation

[english]

Technical Documentation

☑️

☑️

[documentation_basics][documentation_interface]

What should the minimum criteria be, or should there be a scale?

Contributor onboarding Documentation

☑️

[interact]; [contribution][contribution_requirements]

Yes/No

Company Diversity (past 12 months)

☑️

Number of Contributors

☑️

Release Management

☑️

[version_unique][version_semver][version_tags][release_notes][release_notes_vulns]

Release notes contain patched and outstanding defects (details)

CI CD integration

☑️

[build];[build_common_tools][build_floss_tools]

Degree of automation, analysis tools, traceability of overrides, failures

Adoption

☑️

Security Design Principles

☑️

Use Case/ Problem Statement

Problem that project solves

☑️

README.md

[description_good]

Use Case Scenarios

☑️

README.md

Infrastructure Tooling

Wiki

☑️

Repos

☑️

[repo_public][repo_track][repo_interim][repo_distributed]

Bug Tracking tool

☑️

[report_tracker]

Code review

☑️

CI/CD tooling

☑️

[build];[build_common_tools][build_floss_tools]

Collaboration Tooling

☑️

Roadmap

Roadmap Guide

Near/long-term objectives

☑️

Milestones

☑️

Risks/Challenges

☑️

Timeline

☑️

Security Best Practices

Security Guidelines for New Projects

Security

Security Contacts

SECURITY-CONTACTS.md

[know_secure_design][know_common_errors]

Yes; channels

We are lacking security contacts from projects

Code Scanning

☑️

license, code vulnerability, static, dynamic, manual

Snyk, BluBracket

Seed code handoff

☑️

Coding Standards

☑️

☑️

[warnings][warnings_fixed][warnings_strict]

Security design principles

☑️

OSSF Scorecard; OSSF Best Practices

Self-assessment or audit; outside assessment/audit

Vulnerability Reporting

☑️

SECURITY.md; incident-response.md

[release_notes_vulns][vulnerability_report_process][vulnerability_report_private][vulnerability_report_response]

Bug reporting

[report_process][report_tracker][report_responses][enhancement_responses][report_archive]

Days to patch; bug found during (unit, static, dynamic, integration, field operations)

Demonstrate Security Awareness

☑️

All of this column.

Practice Secure Lifecycle Management (per release)

☑️

Cryptographic practices; secured delivery against man-in-the-middle (MITM) attacks; publicly known vulnerabilities fixed; [no_leaked_credentials]

Security Documentation

☑️

[vulnerabilities_fixed_60_days] [vulnerabilities_critical_fixed]

CI/CD best practices

☑️

Secure project architecture

☑️

[sites_https]

Supply Chain Security

☑️

CNCF Supply Chain Security

code intake scans, 3rd party code ver alignment, 3rd party code vulnerability reporting

There is also OpenSSF S2C2F

SBOM creation

☑️

Yes/No

export SBOM

OpenChain Telco SBOM

Static Application Security Testing (SAST)

☑️

[static_analysis][static_analysis_common_vulnerabilities][static_analysis_fixed][static_analysis_often]

<--

Dynamic Application Security Testing (DAST)

☑️

[dynamic_analysis][dynamic_analysis_unsafe][dynamic_analysis_enable_assertions][dynamic_analysis_fixed]

<--

Software Composition Analysis (SCA)

☑️

Container vulnerability scanning

☑️

Code Coverage Testing

☑️

[test][test_invocation][test_most][test_continuous_integration]; [test_policy][tests_are_added][tests_documented_added]

<--

Code Quality

☑️

Quality Goals

CNCF Templates

  • Should LFN introduce similar templates to be used in the repos? This seems like low-hanging fruit: LFN could create a “blueprint repo” for new projects so that the required information is documented in a consistent way (details can still be documented by just providing a link in the template, e.g. to the wiki).
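As an illustration only, such a blueprint repo skeleton could be bootstrapped with stub files. The repo name and stub contents below are hypothetical; the file names follow the CNCF templates already listed in the matrix above.

```shell
# Sketch of a possible "blueprint repo" skeleton (hypothetical name and
# contents); file names follow the CNCF templates listed in the matrix.
mkdir -p blueprint-repo
for f in README.md CONTRIBUTING.md GOVERNANCE.md MAINTAINERS.md \
         CODE_OF_CONDUCT.md SECURITY.md CONTRIBUTOR_LADDER.md LICENSE; do
  # Each file starts as a stub pointing to the wiki, so projects can keep
  # the details there and only fill in a short summary in the repo.
  printf '# %s\n\nSee the project wiki for details.\n' "${f%.md}" > "blueprint-repo/$f"
done
```

New projects could copy such a skeleton during induction and replace the stubs with project-specific content.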

Security goals

Security Contacts

  • Projects shall have their security contacts documented.
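A minimal sketch of what a SECURITY.md documenting such contacts could look like; the contact address below is a placeholder, not a real LFN address.

```shell
# Write a minimal SECURITY.md stub; the address is a placeholder that a
# project would replace with its actual security contacts.
cat > SECURITY.md <<'EOF'
# Security Policy

## Security Contacts
- security@example.org (placeholder; list the project's actual contacts)

## Reporting a Vulnerability
Please report suspected vulnerabilities privately to the contacts above
instead of opening a public issue.
EOF
```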

Code quality

  • ODL: auto-formatting tools are helpful for making code understandable and maintainable; patterns and static analysis, e.g. for logging (exception handling, log levels); test coverage of 70-80% (Robert Varga can provide a link to the ODL documentation)

OpenSSF best practices

  • Functest: noted that this doesn’t really fit for Functest. Cédric Ollivier to share which criteria do not fit and which alternative tools to use for reaching and measuring the goals.
