2025 Quality & Security Goals
Overview
@Olaf Renner @Muddasar Ahmed @Amy Zwarico
This is a drafting space for LFN Quality and Security goals, with the aims to:
Define common quality and security goals across LFN projects
Define metrics and tools to measure and verify whether goals are reached
Define templates to guide, document and review project progress
Background:
The LFN TAC agreed that the security of projects should be increased and that security best practices should be implemented across LFN. This goal was extended to also cover overall quality goals across LFN: project health was traditionally reviewed by the TAC, but that review did not include security targets and was not done in a consistent, automatable way.
Why do we need this:
With regulatory requirements like the US Cybersecurity Executive Order and the EU Cyber Resilience Act, companies increasingly evaluate how secure and well maintained the open source software they use in their products is. To ensure that LFN continues to be seen as a source of quality software, we should review and agree on which goals we want to reach and how to measure and verify them.
With new projects coming to LFN, it becomes even more important to have a clear definition of what will be expected and how these goals can be reached.
Working group:
As this work is expected to take some time and may require dedicated meetings to discuss the details, the formation of a working group is proposed.
Because this is an LFN-wide activity, volunteers from the projects are needed to help define and review the goals and criteria. Please add your name below.
Participants
Project | Contact
---|---
ONAP | @Amy Zwarico @Muddasar Ahmed
ODL | @Robert Varga
Anuket | @Gergely Csatari @Cédric Ollivier (functest)
Nephio |
Review Matrix
Assessment of existing documentation, guidelines and tools:
As a first step toward defining goals, the following table collects what is currently documented in LFN, what other organizations have defined, and what tools could be used for measurement and verification.
The first column collects the quality/security criteria from the various sources listed in the other columns. Colored rows organize criteria into groups and contain sources that apply to the criteria under the group (if not further detailed in the rows).
As the table lists the status quo, some criteria may appear as duplicates, although in different contexts. To mark related criteria, add the row number of the duplicate/related criterion in the corresponding column.
# | Quality Goal | Priority | Related | LFN Wiki / CNCF Templates | OpenSSF | Key Measures | Notes
---|---|---|---|---|---|---|---
1 | Project Vitals | | | Project Data Template (currently used both for induction and health review *); LFN Lifecycle states and guidelines (metrics per lifecycle stage) | Best Practices; Scorecard | |
2 | Project Name | 1 @Amy Zwarico | | README-template.md | | |
3 | Project Creation Date | | | | | Age? |
4 | Project License | 1 @Amy Zwarico | | LICENSE; README-template.md | | Degree of FOS |
5 | Legal Details and checks | | | | | |
6 | Community Size | | | | | #s |
7 | Contributing organizations (Diversity) | | 41 | | | #s |
8 | Number of contributors | 1 @Amy Zwarico | | | | #s |
9 | Lifecycle Stage | | | | | |
10 | Release schedule | | | | | Months |
11 | Adoption | | | | | | Not sure how to measure. Downloads?
12 | Health Review | | | | | |
13 | Release Information | | | | | |
14 | Number of commits (over last year) | 1 @Amy Zwarico | | | | |
15 | Number of active committers | 1 @Amy Zwarico | | | | |
16 | Number of active committers per organization | 1 @Amy Zwarico | | | | |
17 | Number of PRs/changesets | | | | | |
18 | Mailing list activity | | | | | |
19 | Project & Community Resources | | | | | |
20 | Website | | | README.md | | Yes/No |
21 | Wiki | | | README.md | | Yes/No |
22 | Mailing List | 1 @Amy Zwarico | | README.md | | Yes/No |
23 | Slack | | | README.md | | Yes/No |
24 | Community Meetings | | | README.md | | Yes/No |
25 | Project Governance | | | | | |
26 | TSC/TOC | | | GOVERNANCE.md; GOVERNANCE-elections.md; GOVERNANCE-maintainer.md | | Yes/No |
27 | Charter | | | | | Yes/No |
28 | Code of Conduct | | | CODE_OF_CONDUCT.md; README.md | | Yes/No |
29 | How to contribute | | | CONTRIBUTING.md; README.md | | Yes/No |
30 | Project Roles | | | CONTRIBUTOR_LADDER.md | | Yes/No |
31 | Maintainers | 1 @Amy Zwarico | | MAINTAINERS.md | | |
32 | How to Review | | | REVIEWING.md | | |
33 | Adding/Removing PTLs | | | MAINTAINERS.md ?? | | |
34 | Sub-Project Lifecycle | | | GOVERNANCE-subprojects.md | | |
35 | Dispute Resolution | | | | | |
36 | Adding/removing committers | | | | | |
37 | Sub-projects without a lead | | | | | |
38 | Documentation | | | | [english] | |
39 | Technical Documentation | | | | | | What should be the minimum criteria? Or a scale?
40 | Contributor onboarding documentation | | | | | Yes/No |
41 | Company Diversity (past 12 months) | | | | | |
42 | Number of Contributors | | | | | |
43 | Release Management | | | | [version_unique][version_semver][version_tags][release_notes][release_notes_vulns] | Release notes contain patched and outstanding defects (details) |
44 | CI/CD integration | | | | | Degree of automation, analysis tools, traceability of overrides, failures |
45 | Adoption | | | | | |
46 | Security Design Principles | | | | | |
47 | Use Case / Problem Statement | | | | | |
48 | Problem that the project solves | | | README.md | | |
49 | Use Case Scenarios | | | README.md | | |
50 | Infrastructure Tooling | | | | | |
51 | Wiki | | | | | |
52 | Repos | | | | | |
53 | Bug Tracking tool | | | | | |
54 | Code review | | | | | |
55 | CI/CD tooling | | | | | |
56 | Collaboration Tooling | | | | | |
57 | Roadmap | | | | | |
58 | Near/long-term objectives | | | | | |
59 | Milestones | | | | | |
60 | Risks/Challenges | | | | | |
61 | Timeline | | | | | |
62 | Security Best Practices | | | | | |
63 | Security Contacts | 1 @Amy Zwarico | | | | Yes; channels | We are lacking security contacts from projects
64 | Code Scanning | | | | | License, code vulnerability, static, dynamic, manual | Snyk, BluBracket
65 | Seed code handoff | | | | | |
66 | Coding Standards | | | | | |
67 | Security design principles | | | | OSSF Scorecard; OSSF Best Practices | Self-assessment or audit; outside assessment/audit |
68 | Vulnerability Reporting | | | SECURITY.md; incident-response.md | [release_notes_vulns][vulnerability_report_process][vulnerability_report_private][vulnerability_report_response] | |
69 | Bug reporting | | | | [report_process][report_tracker][report_responses][enhancement_responses][report_archive] | Days to patch; bug found during (unit, static, dynamic, integration, field operations) |
70 | Demonstrate Security Awareness | | | | All of this column | |
71 | Practice Secure Lifecycle Management (per release) | | | | | Cryptographic practices |
72 | Security Documentation | 1 @Amy Zwarico | | | [vulnerabilities_fixed_60_days][vulnerabilities_critical_fixed] | |
73 | CI/CD best practices | | | | | |
74 | Secure project architecture | | | | | |
75 | Supply Chain Security | | | | | Code intake scans; 3rd-party code version alignment; 3rd-party code vulnerability reporting | There is also OpenSSF S2C2F
76 | SBOM creation | | | | | Yes/No |
77 | Static Application Security Testing (SAST) | 1 @Amy Zwarico | | | [static_analysis][static_analysis_common_vulnerabilities][static_analysis_fixed][static_analysis_often] | <-- | Low-hanging fruit
78 | Dynamic Application Security Testing (DAST) | | | | [dynamic_analysis][dynamic_analysis_unsafe][dynamic_analysis_enable_assertions][dynamic_analysis_fixed] | <-- |
79 | Software Composition Analysis (SCA) | 1 @Amy Zwarico | | | | | SonarCloud / Sonatype could be used.
80 | Container vulnerability scanning | | | | | |
81 | Code Coverage Testing | | | | [test][test_invocation][test_most][test_continuous_integration]; [test_policy][tests_are_added][tests_documented_added] | <-- |
82 | Code Quality | | | | | |
Quality Goals
CNCF Templates
Should LFN introduce similar templates to be used in the repos? This seems like low-hanging fruit: LFN could create a "blueprint repo" for new projects so that the required information is documented in a consistent way (details can still be documented by simply providing a link in the template, e.g. to the wiki).
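To make the blueprint-repo idea concrete, project compliance could be checked automatically. The following is a minimal sketch, assuming a local repo checkout; the file list is taken from the CNCF template names in the review matrix, and the function name is a placeholder:

```python
from pathlib import Path

# Baseline community files, per the CNCF templates referenced in the review matrix.
BASELINE_FILES = [
    "README.md",
    "LICENSE",
    "CONTRIBUTING.md",
    "CODE_OF_CONDUCT.md",
    "GOVERNANCE.md",
    "MAINTAINERS.md",
    "SECURITY.md",
]

def missing_baseline_files(repo_root: str) -> list[str]:
    """Return the baseline community files absent from a repo checkout."""
    root = Path(repo_root)
    return [name for name in BASELINE_FILES if not (root / name).is_file()]
```

A check like this could run in CI for every project repo and feed directly into the Yes/No cells of the review matrix.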
LFX
v1 Insights was used to track a lot of the measurements. v2 is broken/not useful. @Casey Cain escalated this: a new team will be put in place to work on it. We should provide our feedback on what we need.
Subprojects need to be better supported.
Issues: https://support.linuxfoundation.org
Missing data: Anuket is missing two contributors.
Historical data is not maintained/available consistently.
Project feedback:
Nephio
a. LFX does not provide credible data to determine eligible voters (calculated from contributions collected in LFX Insights). The Porch subproject was not counted in Insights.
b. LFX Insights data accuracy and trustworthiness: overall, individual contributor and company statistics looked implausible.
c. LFX dashboard data presentation: Bitergia was superior in displaying data.
Anuket
LFX does not provide relevant information important for company OSPOs, such as:
a. Company contributions per project and sub-project during a given timeframe
b. Individual contributions per project and sub-project during a given time period
c. Contributor company association is not working (although set in the contributor profile)
d. It is impossible to figure out the source of the data (Contributor Leaderboard), e.g. Contributor: John Doe, Activities: 217, Change: +36, % Contribution: 27
- Who is John Doe? No link to a profile
- Activities: no link to details such as what was contributed (Gerrit, GitHub) and where (which sub-projects)
Positive: they finally implemented the feature to check who is going to which conference (https://myorg.lfx.dev/<company>/events).
Missing items important for community leaders:
a. No breakdown of data per sub-project (RA2, Functest and so on)
b. No aggregation of data across VCSs (who are the main contributors in Gerrit + GitHub?)
c. No transparent display of data for vote eligibility, nor a breakdown of that data
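The cross-VCS aggregation gap in (b) could be bridged project-side with a small script that merges per-contributor activity counts, assuming exports from Gerrit and GitHub are available. The data shape below is hypothetical; identities must be pre-reconciled (e.g. by email), since Gerrit and GitHub use different account names:

```python
from collections import Counter

def merge_contributions(*per_vcs_counts: dict) -> Counter:
    """Merge per-contributor activity counts from several VCS exports
    (e.g. one dict from Gerrit, one from GitHub) into one leaderboard."""
    total = Counter()
    for counts in per_vcs_counts:
        total.update(counts)
    return total

# Hypothetical exports keyed by a reconciled contributor identity.
gerrit = {"jdoe@example.org": 120, "asmith@example.org": 40}
github = {"jdoe@example.org": 97, "blee@example.org": 15}
leaderboard = merge_contributions(gerrit, github).most_common()
```

The same merge would also expose the data source per contributor if each export is tagged before merging, which addresses the traceability complaint in (d) above.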
Security goals
Security Contacts
Projects shall have their security contacts documented.
Code quality
ODL: auto-formatting tools are helpful for keeping code understandable and maintainable; patterns and static analysis, e.g. for logging (exception handling, log levels); test coverage of 70–80% (for example, as noted in the OpenDaylight best practices).
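A coverage target like the 70–80% noted above is typically enforced as a CI gate. A tool-agnostic sketch, assuming the CI step can obtain line counts from its coverage tool (the function name and threshold default are illustrative):

```python
def coverage_gate(covered_lines: int, total_lines: int, threshold: float = 70.0) -> bool:
    """Pass when line coverage meets the agreed threshold, fail otherwise."""
    if total_lines == 0:
        return True  # nothing to cover, nothing to gate on
    percent = 100.0 * covered_lines / total_lines
    return percent >= threshold
```

In practice, most coverage tools already offer such a gate natively (e.g. a fail-under option), so this logic would only be needed when aggregating coverage across sub-projects.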
OpenSSF best practices
Functest: noted that this doesn't really fit for Functest. @Cédric Ollivier to share which criteria do not fit and which alternative tools to use for reaching and measuring goals.
SAST
Low-hanging fruit.
SCA
SonarCloud can create a lot of alerts when first used (e.g. issues flagged not only in production code but also in tests). Pick the right metrics and have best practices in place.
Priority 1 Security Goals for LFN Projects
Goal | Description | Next Steps
---|---|---
Project Name | Project shall have a documented name that will be used in project communications and naming conventions. | Parent project: wiki and GitHub names should be the same. Sub-projects should have consistent names, reflective of the parent project, in both locations. The name should remain the same from onboarding authorization onward.
Project License | Project shall document all licenses that will be inherited by users of the project. | Wiki, GitHub, SBOM (example: Debian 2.2?). Should also remain consistent from onboarding. AI: what to do with multiple types of licenses.
Number of contributors | Project shall maintain a viewable list of all contributors. | LFX Insights. AI: no aggregate data available. AI: with Gerrit-to-GitHub mirroring, how do we ensure committers show up correctly on GitHub?
Number of commits (over last year) | Project shall use a version control system such as Git to track commits. |
Number of active committers | Project shall use a version control system such as Gerrit or GitHub to track active committers. | Tools to track committers; info.yaml (needs discussion).
Number of active committers per organization | Project shall use a version control system such as Git to track committers' organizations. |
Mailing List | Project shall maintain a mailing list of project participants and interested parties. | AI: what are the minimum types of mailing lists?
Maintainers | Project shall maintain a list of project maintainers. |
Security Contacts | Project shall maintain a list of security contacts for the project. |
Security Documentation | Project shall document and maintain the security practices and requirements users can expect of the project team. Practices include SLAs for fixing vulnerabilities, scanning code for vulnerabilities, etc. |
Static Application Security Testing (SAST) | Project shall scan all code produced by the project with a SAST tool and fix vulnerabilities found in each release. |
Software Composition Analysis (SCA) | Project shall scan all code with an SCA tool and update vulnerable and out-of-date packages found in each release. |
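The SAST and SCA goals both imply a per-release triage step: deciding which findings block the release. A minimal sketch of that gate, using a hypothetical finding format (real scanners such as SonarCloud emit richer, tool-specific reports):

```python
# Severities that block a release; the set itself is a policy decision
# each project would make, not something mandated by the goals above.
BLOCKING_SEVERITIES = {"CRITICAL", "HIGH"}

def release_blockers(findings: list) -> list:
    """Return the scanner findings that must be fixed before a release ships."""
    return [
        f for f in findings
        if f.get("severity", "").upper() in BLOCKING_SEVERITIES
    ]

# Hypothetical SCA output after normalization.
findings = [
    {"id": "CVE-2024-0001", "severity": "HIGH"},
    {"id": "CVE-2024-0002", "severity": "LOW"},
]
blockers = release_blockers(findings)
```

Wiring such a gate into CI would also produce the evidence trail (what was found, what was fixed, what was deferred) that the Security Documentation goal asks projects to maintain.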