Open Source
The number of Free/Libre and Open Source Software (FLOSS) projects grows every year. The developer community aims to deliver high-quality software with comprehensive documentation, solid test coverage, and a high level of security. To coordinate efforts toward these goals, a dedicated organization was created: the Open Source Security Foundation (OpenSSF).
Respected IT companies have contributed to the OpenSSF security best practices and to tools for the security analysis of open source projects [1]. A baseline deliverable is the OpenSSF Best Practices badge project [2], which has been incorporated into OpenSSF.
Figure 1. Some badge earners
Currently, over 4,800 projects participate in the OpenSSF Best Practices program [3].
The OpenSSF Best Practices badge is a way for open source projects to show that they follow the best practices of mature software development. The badge allows third-party developers to quickly evaluate a project's maturity: badged projects are less likely to introduce security issues when re-used, and the evaluation helps a project's authors identify areas for improvement. In short, the badge indicates that an open source project follows security best practices and is built to withstand known and future security issues.
To receive an OpenSSF Best Practices badge, a project has to pass evaluation against a well-defined, standardized methodology. There are three badge levels: "passing", "silver", and "gold". Each level requires the project to meet a set of criteria; for "silver" and "gold", this includes reaching the previous level. The "passing" level has 67 criteria, "silver" adds another 55, and "gold" another 23. All evaluation criteria fall into five categories: Basics, Change Control, Quality, Security, and Analysis.
OpenSSF also provides the Scorecard tool to automate self-evaluation of open source projects hosted on GitHub. Evaluation metrics include well-defined security policies, the code review process, continuous testing coverage, fuzzing, and static code analysis [4]. By default, the following checks are run against the target project: Binary-Artifacts, Branch-Protection, CI-Tests, CII-Best-Practices, Code-Review, Contributors, Dangerous-Workflow, Dependency-Update-Tool, Fuzzing, License, Maintained, Pinned-Dependencies, Packaging, SAST, Security-Policy, Signed-Releases, Token-Permissions, Vulnerabilities [5].
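As an illustration, here is a minimal sketch of how such a self-evaluation can be run locally with the Scorecard command-line tool; the repository path and the selected checks are placeholders rather than the actual Home Edge configuration:

```
# Scorecard queries the GitHub API, so an access token is expected
# (GITHUB_AUTH_TOKEN is the variable name used by the tool).
export GITHUB_AUTH_TOKEN=<your_token>

# Run the default set of checks against a target repository
# (replace <org>/<project> with the repository to evaluate).
scorecard --repo=github.com/<org>/<project>

# Limit the run to a few specific checks and print per-check details
scorecard --repo=github.com/<org>/<project> \
          --checks=Code-Review,SAST,Vulnerabilities \
          --show-details
```

The same checks can also run periodically in CI, so the score is tracked continuously rather than only at badge-evaluation time.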
Figure 2. Home Edge architecture
Home Edge is an open source project, hosted by LF Edge, that aims to enable robust, reliable, and intelligent home edge computing. It provides an open source framework, platform, and ecosystem for executing various scenarios on a variety of smart home IoT devices. To accelerate deployment of the edge computing services ecosystem, the Home Edge project offers users an interoperable, flexible, and scalable edge computing service platform together with a set of APIs.
In December 2020, the Home Edge team received the OpenSSF Best Practices "passing" badge and decided to go further. In November 2021, the "silver" badge followed. And now we are happy to announce that the Home Edge project (v1.1.14) has reached the OpenSSF Best Practices "gold" badge.
Figure 3. Home Edge reached the OpenSSF Best Practices "gold" badge
Only 12 of 4,898 projects have received the OpenSSF Best Practices “gold” badge, and we are proud that Home Edge is among them. We would like to express our sincere gratitude and appreciation to the LF Edge contributors who made this achievement possible: Peter Moonki Hong, Taras Drozdovskyi, Taewan Kim, and Suresh LC, and especially to Brett Preston, Eric Ball, and Jim White from the Linux Foundation (LF) community.
On the way to the OpenSSF Best Practices “gold” badge, the Home Edge project received the following improvements:
▣ improved documentation (security, testing, and code review policies; contribution guide; external API descriptions)
▣ improved the build and testing system (GitHub Actions; automatic code analysis with gofmt, go vet, and golint – a sketch of the corresponding commands follows this list)
▣ improved test coverage: up to 80%
▣ improved security checks:
√ SonarCloud: Security Hotspots – 37 -> 0; Code Smells – 253 -> 13; Duplications – 7.8% -> 3.7%
√ CodeQL analysis and LGTM services (newly integrated): Security Alerts – 29 -> 0
√ LFX Security:
◾ Common Vulnerability Scoring System (CVSS) score – 7.3/10 -> 6.5/10
◾ Secrets and Compliance Risk Score (SCRS) – 2.4/10 -> 1.2/10
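To illustrate the kind of checks wired into the build and testing system above, the commands below sketch how the formatting, vetting, linting, and coverage steps can be reproduced locally. These are generic Go tooling invocations, not a copy of the actual Home Edge CI configuration:

```
# List files whose formatting differs from gofmt's canonical style
gofmt -l .

# Report suspicious constructs (e.g. printf argument mismatches)
go vet ./...

# Style suggestions from golint
golint ./...

# Run the test suite and report per-function and total coverage
go test -coverprofile=coverage.out ./...
go tool cover -func=coverage.out
```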
We enriched the Home Edge development infrastructure with various tools for automatic security code analysis, which improved both the security and the stability of the product. Better documentation is also essential for adopters and new contributors. In addition, while earning the badge, the developers gained rich hands-on experience in SecDevOps, security assessment and automation, and technical writing.
The Home Edge team has also integrated the Scorecard tool into its GitHub repository. The tool automatically monitors the security status of the project throughout the entire development lifecycle, which saves the team the effort of tracking security manually and lets it focus on creating new functionality.
Everyone is welcome to share feedback on the LF Edge – Home Edge GitHub. We also invite new contributors to help improve the project and expand the LF Edge community. You can reach us on the LF Edge Slack channel (#homeedge) or subscribe to our mailing list (homeedge-tsc@lists.lfedge.org).
[1] https://www.linuxfoundation.org/blog/open-source-security-foundation-openssf-reflection-and-future/ (CC-BY-4.0)
[2] https://bestpractices.coreinfrastructure.org/en (CC-BY-3.0+)
[3] https://bestpractices.coreinfrastructure.org/en/projects (CC-BY-3.0+)
[4] https://openssf.org/blog/2020/11/06/security-scorecards-for-open-source-projects/ (CC-BY-4.0)