
Update and Split Trivy #80

Merged
merged 37 commits
Jan 25, 2024
Changes from 35 commits

37 commits
b757fce
updating trivy plugin versions and commit hash
faizan12123 Dec 13, 2023
54939c7
Editing Trivy plugin to read whether dev dependencies are included o…
faizan12123 Dec 13, 2023
e50d191
Fixed the parser for scanning lock files. Still need to fix the parse…
faizan12123 Dec 21, 2023
fae5b31
updated the artemisdb model to fix column level constraint issue caus…
faizan12123 Jan 3, 2024
49a2142
Finished package-lock generator. Now generates package-lock files if …
faizan12123 Jan 4, 2024
3d9ca25
split up the code for trivy into multiple files for better readabilit…
faizan12123 Jan 4, 2024
d32ed8a
temporarily disabling image scanning from trivy for A/B testing
faizan12123 Jan 8, 2024
d0a70d1
Merge branch 'main' into Updating-Trivy
faizan12123 Jan 9, 2024
c868d37
updating axios from 1.6.1 to 1.6.5
faizan12123 Jan 9, 2024
81b2d72
Updating cryptography dependency
faizan12123 Jan 10, 2024
6372b53
moving write_npmrc to libs so that it can be a shared component betwe…
faizan12123 Jan 10, 2024
5b3834e
removing files that have been relocated/deduplicated into one location
faizan12123 Jan 10, 2024
d238d97
fixing bug with generate_locks and logging errors
faizan12123 Jan 12, 2024
41e33a6
formatting files with black auto formatter for python to keep code co…
faizan12123 Jan 12, 2024
021bf68
fixing write_npmrc import for node_dependencies plugin since the file…
faizan12123 Jan 12, 2024
de16f9c
returning errors and warnings from the package lock generation to inf…
faizan12123 Jan 16, 2024
2154738
changing the write_npmrc file to accept a log as a parameter so that …
faizan12123 Jan 18, 2024
3ba837f
fixing node_dependencies unit tests to match param changes to write_n…
faizan12123 Jan 18, 2024
5c4adb3
fixing unit tests from changes made to write_npmrc
faizan12123 Jan 19, 2024
edb21be
splitting trivy into 2 plugins. Trivy SCA and trivy image scanning
faizan12123 Jan 22, 2024
fb26a5c
fixing typo
faizan12123 Jan 22, 2024
99cfdfc
updating lingUI for internationalization
faizan12123 Jan 22, 2024
dbc58fc
separating unit tests for trivy backend changes and then fixing black…
faizan12123 Jan 22, 2024
3520f26
changing the trivy container API to just trivy so that users can reta…
faizan12123 Jan 23, 2024
1df5d04
changing trivy_image folder to just trivy to match API changes
faizan12123 Jan 23, 2024
197644a
updating the log messages to be more specific
faizan12123 Jan 23, 2024
98529fb
running prettier linter on the server.ts to meet coding standards
faizan12123 Jan 23, 2024
6fac9c9
updating trivy test file name
faizan12123 Jan 24, 2024
870f43d
fixing discrepancy between tool key and API name
faizan12123 Jan 24, 2024
0518ebd
added unit tests for package-lock generation warnings in trivy tests …
faizan12123 Jan 24, 2024
7ccb930
Updating UI test to look for both trivy plugins
faizan12123 Jan 24, 2024
7c784cf
fixing unit tests
faizan12123 Jan 24, 2024
c5753cd
does not need the conditional of trivy_sca to be met as this should b…
faizan12123 Jan 24, 2024
018d402
fixing error made when updating UI unit tests
faizan12123 Jan 24, 2024
ae4a0c5
Merge branch 'main' into Updating-Trivy
faizan12123 Jan 24, 2024
c7fc92e
changed json.dump output to not boolean. Removed unused import. Clea…
faizan12123 Jan 25, 2024
cb4b87f
changing how we add include dev dependency tag and making logging mor…
faizan12123 Jan 25, 2024
5 changes: 3 additions & 2 deletions backend/Dockerfiles/Dockerfile.dind
@@ -3,8 +3,8 @@ FROM docker:20.10.14-dind
 ARG MAINTAINER
 LABEL maintainer=$MAINTAINER

-ARG TRIVY_VER=v0.16.0
-ARG TRIVY_COMMIT=0285a89c7cce9b2d07bd5826cd2fed68420ca546
+ARG TRIVY_VER=v0.48.0
+ARG TRIVY_COMMIT=01edbda3472ebbbe71f341d1741194ec35de7d69
 ARG SNYK_VER=v1.889.0

 # Copy Artemis libraries into /src for installation
@@ -17,6 +17,7 @@ COPY ./libs/ /src/
 # - Symlink python3 to python for Analyzer Engine benefit
 RUN apk update && apk add git unzip python3 py3-pip libgcc libstdc++ && \
     apk upgrade && \
+    apk add npm && \
     pip3 install --upgrade pip setuptools boto3 && \
     pip3 install boto3 && \
     ln -s /usr/bin/python3 /usr/bin/python && \
4 changes: 2 additions & 2 deletions backend/Makefile
@@ -202,8 +202,8 @@ SEVERITY_LEVELS := critical,high
 PHP_SCANNER_VER := 1.0.0

 # Trivy
-TRIVY_COMMIT := 0285a89c7cce9b2d07bd5826cd2fed68420ca546
-TRIVY_VER := v0.16.0
+TRIVY_COMMIT := 01edbda3472ebbbe71f341d1741194ec35de7d69
+TRIVY_VER := v0.48.0

 # Snyk
 SNYK_VER := v1.889.0
48 changes: 24 additions & 24 deletions backend/Pipfile.lock

Some generated files are not rendered by default.

73 changes: 73 additions & 0 deletions backend/engine/plugins/lib/trivy_common/generate_locks.py
@@ -0,0 +1,73 @@
import subprocess
import os
from glob import glob
from engine.plugins.lib import utils
from engine.plugins.lib.write_npmrc import handle_npmrc_creation

logger = utils.setup_logging("trivy_sca")


def install_package_files(include_dev, path, root_path):
    # Create a package-lock.json file if it doesn't already exist
    logger.info(
        f'Generating package-lock.json for {path.replace(root_path, "")} (including dev dependencies: {include_dev})'
    )
    cmd = [
        "npm",
        "install",
        "--package-lock-only",  # Generate the needed lockfile
        "--legacy-bundling",  # Don't dedup dependencies so that we can correctly trace their root in package.json
        "--legacy-peer-deps",  # Ignore peer dependencies, which is the NPM 6.x behavior
        "--no-audit",  # Don't run an audit
    ]
    if not include_dev:
        cmd.append("--only=prod")
    return subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=path, check=False)


def check_package_files(path: str, include_dev: bool) -> tuple:
    """
    Main function.
    Finds all of the package.json files in the repo and builds lock files for them if they don't have one already.
    Parses the results and returns them along with any errors.
    """

    errors = []
    alerts = []

    # Find and loop through all the package.json files in the path
    files = glob("%s/**/package.json" % path, recursive=True)

    logger.info("Found %d package.json files", len(files))

    # If there are no package.json files, exit function
    if len(files) == 0:
        return errors, alerts

    # Build a set of all directories containing package files
    paths = set()
    for filename in files:
        paths.add(os.path.dirname(filename))

    # Write a .npmrc file based on the set of package.json files found
    handle_npmrc_creation(logger, paths)

    # Loop through paths that have a package file and generate a package-lock.json for them (if one does not exist)
    for sub_path in paths:
        lockfile = os.path.join(sub_path, "package-lock.json")
        lockfile_missing = not os.path.exists(lockfile)
        if lockfile_missing:
            msg = (
                f"No package-lock.json file was found in path {sub_path.replace(path, '')}."
                " Please consider creating a package-lock file for this project."
            )
            logger.warning(msg)
            alerts.append(msg)
            r = install_package_files(include_dev, sub_path, path)
            if r.returncode != 0:
                error = r.stderr.decode("utf-8")
                logger.error(error)
                errors.append(error)
                return errors, alerts

    # Return the results
    return errors, alerts
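The helper above is meant to be driven by the new Trivy SCA plugin before scanning. A minimal sketch of such a caller, assuming the plugin already has the cloned repo path and an include_dev flag; the wrapper name and returned dict shape are illustrative, not the plugin's actual interface:

# Illustrative only: a minimal caller for check_package_files. The real
# Trivy SCA plugin's entry point and output schema may differ.
from engine.plugins.lib.trivy_common.generate_locks import check_package_files

def prepare_lockfiles(repo_path: str, include_dev: bool = False) -> dict:
    # Generate package-lock.json files for any package.json without one,
    # collecting npm install failures and "missing lockfile" warnings.
    errors, alerts = check_package_files(repo_path, include_dev)
    return {
        "errors": errors,  # npm install failures (stderr text)
        "alerts": alerts,  # advisory warnings surfaced to the scan report
    }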
89 changes: 89 additions & 0 deletions backend/engine/plugins/lib/trivy_common/parsing_util.py
@@ -0,0 +1,89 @@
"""
trivy output parser
"""
import json
from typing import NamedTuple
from engine.plugins.lib import utils

logger = utils.setup_logging("trivy")

DESC_REMEDIATION_SPLIT = "## Recommendation"


def parse_output(output: list) -> list:
results = []
for item in output:
source = item["Target"]
component_type = convert_type(item.get("Type", "N/A"))
if item.get("Vulnerabilities") is None:
continue
cve_set = set()
for vuln in item["Vulnerabilities"]:
vuln_id = vuln.get("VulnerabilityID")
if vuln_id in cve_set:
continue
cve_set.add(vuln_id)
description_result = get_description_and_remediation(vuln.get("Description"), vuln.get("FixedVersion"))

component = vuln.get("PkgName")
if vuln.get("InstalledVersion"):
component = f'{component}-{vuln.get("InstalledVersion")}'
results.append(
{
"component": component,
"source": source,
"id": vuln_id,
"description": description_result.description,
"severity": vuln.get("Severity", "").lower(),
"remediation": description_result.remediation,
"inventory": {
"component": {
"name": vuln.get("PkgName"),
"version": vuln.get("InstalledVersion"),
"type": component_type,
},
"advisory_ids": sorted(
list(set(filter(None, [vuln_id, vuln.get("PrimaryURL")] + vuln.get("References", []))))
),
},
}
)
return results


def convert_type(component_type: str) -> str:
if component_type == "bundler":
return "gem"
return component_type.lower()


def get_description_and_remediation(description, fixed_version) -> NamedTuple:
"""
gets the description and remediation fields after pulling them from the vuln and appending/removing additional info
:param fixed_version:
:param description:
:return: NamedTuple containing the description and remediation
"""
result = NamedTuple("DescriptionResult", [("description", str), ("remediation", str)])
if not description:
description = ""
remediation = ""
if DESC_REMEDIATION_SPLIT in description:
des_split = description.split(DESC_REMEDIATION_SPLIT)
remediation = des_split[1].strip()
description = des_split[0].strip()
if fixed_version:
remediation = f"Fixed Version: {fixed_version}. {remediation}".strip()
result.description = description
result.remediation = remediation
return result


def convert_output(output_str: str):
if not output_str:
return None
try:
return json.loads(output_str)
except json.JSONDecodeError as e:
logger.error(e)
return None
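To make the parser's behavior concrete, here is a hedged example of feeding a Trivy result through convert_output and parse_output. The sample JSON is fabricated for illustration and only includes fields the parser actually reads; real Trivy output carries many more.

# Illustrative only: exercise convert_output/parse_output with a tiny,
# made-up Trivy result. Field names mirror what parse_output reads.
from engine.plugins.lib.trivy_common.parsing_util import convert_output, parse_output

raw = """
[
  {
    "Target": "package-lock.json",
    "Type": "npm",
    "Vulnerabilities": [
      {
        "VulnerabilityID": "CVE-2021-23337",
        "PkgName": "lodash",
        "InstalledVersion": "4.17.20",
        "FixedVersion": "4.17.21",
        "Severity": "HIGH",
        "Description": "Command injection via template.",
        "PrimaryURL": "https://avd.aquasec.com/nvd/cve-2021-23337"
      }
    ]
  }
]
"""

parsed = convert_output(raw)      # returns None on empty or invalid JSON
findings = parse_output(parsed or [])
print(findings[0]["component"])   # -> "lodash-4.17.20"
print(findings[0]["severity"])    # -> "high"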
backend/engine/plugins/lib/write_npmrc.py
@@ -3,10 +3,8 @@

 NODE_CRED_KEY = "node-dep-creds"

-log = utils.setup_logging("node_dependencies")
-

-def handle_npmrc_creation(paths: set, home_dir=None) -> bool:
+def handle_npmrc_creation(logger, paths: set, home_dir=None) -> bool:
     """
     Main npmrc creation function. Checks if the .npmrc file exists, and if not,
     gets and writes the private registries currently in the package.jsons
@@ -16,28 +14,28 @@ def handle_npmrc_creation(paths: set, home_dir=None) -> bool:
         home_dir = os.environ["HOME"]
     npmrc = os.path.join(home_dir, ".npmrc")
     if os.path.exists(npmrc):
-        log.info("%s already exists, skipping", npmrc)
+        logger.info("%s already exists, skipping", npmrc)
         return False

-    scope_list = get_config_matches_in_packages(paths)
+    scope_list = get_config_matches_in_packages(logger, paths)
     if not scope_list:
-        log.info("No supported private packages found. Skipping .npmrc creation.")
+        logger.info("No supported private packages found. Skipping .npmrc creation.")
         return False
-    write_npmrc(npmrc, scope_list)
+    write_npmrc(logger, npmrc, scope_list)
     return True


-def get_config_matches_in_packages(paths: set) -> list:
+def get_config_matches_in_packages(logger, paths: set) -> list:
     """
     - Gets a list of the private scopes supported from Secrets Manager.
     - Checks to see if the package.jsons in our paths have any supported private scopes.
     - If so, the scopes are saved in a list and returned.
     return: List of private scopes
     """
     # get list of configs to check for
-    configs = get_scope_configs()
+    configs = get_scope_configs(logger)
     if configs is None:
-        log.warning("List of configs is empty. Skipping .npmrc creation.")
+        logger.warning("List of configs is empty. Skipping .npmrc creation.")
         return None

     private_scope_list = []
@@ -50,18 +48,18 @@ def get_config_matches_in_packages(paths: set) -> list:
            contents = f.read()
            for config in configs:
                if f'"@{config["scope"]}/' in contents:
-                    log.info(f"%s has @%s packages", package_file, config["scope"])
+                    logger.info(f"%s has @%s packages", package_file, config["scope"])
                    private_scope_list.append(config)
    return private_scope_list


-def get_scope_configs():
+def get_scope_configs(logger):
    """
    grabs the list of private registries from secrets manager
    """
-    creds = utils.get_secret(NODE_CRED_KEY, log)
+    creds = utils.get_secret(NODE_CRED_KEY, logger)
    if not creds:
-        log.error("Unable to retrieve Node registry configs.")
+        logger.error("Unable to retrieve Node registry configs.")
        return None
    return creds
@@ -81,9 +79,9 @@ def build_npm_config(scope, registry, token, username, email, **kwargs):
     )


-def write_npmrc(npmrc, scope_list: list) -> None:
+def write_npmrc(logger, npmrc, scope_list: list) -> None:
     # Write the configs for flagged scopes to npmrc
     with open(npmrc, "a") as f:
         for scope in scope_list:
-            log.info(f"Writing {scope['scope']} config to %s", npmrc)
+            logger.info(f"Writing {scope['scope']} config to %s", npmrc)
             f.write(build_npm_config(**scope))
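Because handle_npmrc_creation now takes the caller's logger as its first argument, each plugin passes its own named logger so .npmrc messages are attributed to the right scan. A small sketch of a caller; the directory paths are illustrative and the "trivy_sca" logger name simply mirrors generate_locks.py above:

# Illustrative only: a plugin hands its own logger to the shared helper,
# so .npmrc messages show up under that plugin's log name.
from engine.plugins.lib import utils
from engine.plugins.lib.write_npmrc import handle_npmrc_creation

logger = utils.setup_logging("trivy_sca")

# Directories that were found to contain a package.json (illustrative paths)
package_dirs = {"/work/base", "/work/base/services/api"}

wrote_npmrc = handle_npmrc_creation(logger, package_dirs)
if wrote_npmrc:
    logger.info("Wrote .npmrc for private scopes found in package.json files")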
4 changes: 2 additions & 2 deletions backend/engine/plugins/node_dependencies/main.py
@@ -9,7 +9,7 @@
 from engine.plugins.lib.line_numbers.resolver import LineNumberResolver
 from engine.plugins.node_dependencies.audit import npm_audit
 from engine.plugins.node_dependencies.parse import parse_advisory
-from engine.plugins.node_dependencies.write_npmrc import handle_npmrc_creation
+from engine.plugins.lib.write_npmrc import handle_npmrc_creation

 log = utils.setup_logging("node_dependencies")

@@ -35,7 +35,7 @@ def check_package_files(path: str, include_dev: bool = False) -> tuple:
         paths.add(os.path.dirname(filename))

     # Write a .npmrc file based on the set of package.json files found
-    handle_npmrc_creation(paths)
+    handle_npmrc_creation(log, paths)

     for sub_path in paths:
         absolute_package_file = f"{sub_path}/package.json"