
Merge branch 'main' into 'exception_work'

# Conflicts:
#   pom.xml
#   src/main/java/rest/CreateScanRequest.java
#   src/main/java/rest/CreateScanResource.java
#   src/main/java/rest/CreateStartScan.java
#   src/main/java/rest/StoreData.java
local_dev_fixes
Nicholas Caughey 3 years ago
parent commit 275014b89a
.gitignore | 12
README.md | 127
k8s/linux-krb5.conf | 36
k8s/stage/edgeroute.yml | 21
k8s/stage/kerberos-config.yaml | 44
pom.xml | 177
schema/.gitkeep | 0
schema/OffRegScraper.py | 30
schema/populate.sql | 126
schema/schema.sql | 81
src/main/docker/Dockerfile.jvm | 1
src/main/java/dto/BrewObj.java | 29
src/main/java/dto/BrewObjPayload.java | 33
src/main/java/dto/ConnectDB.java | 28
src/main/java/dto/GitObj.java | 22
src/main/java/dto/GitObjPayload.java | 30
src/main/java/dto/PncObj.java | 16
src/main/java/dto/PncObjPayload.java | 28
src/main/java/dto/ScanObj.java | 24
src/main/java/dto/ScanObjPayload.java | 30
src/main/java/rest/CreateGetResource.java | 78
src/main/java/rest/CreateScanRequest.java | 66
src/main/java/rest/CreateScanResource.java | 35
src/main/java/rest/CreateStartScan.java | 37
src/main/java/rest/RemoveScan.java | 69
src/main/java/rest/Scan.java | 6
src/main/java/rest/StoreData.java | 92
src/main/java/rest/UsersResource.java | 36
src/main/resources/application.properties | 35
src/test/java/dto/TestPayload.java | 107

12
.gitignore

@@ -0,0 +1,12 @@
.dcignore
.idea
*.iml
dev/
# Maven
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
release.properties

127
README.md

@@ -1,112 +1,51 @@
See https://docs.google.com/document/d/15yod6K_ZbNkJ_ern7gwpxjBkdJIlHXORfYZ3CGQhnEM/edit?usp=sharing for a full version with images
# code-with-quarkus
# Introduction
Currently we rely on CPaaS to submit requests to PSSaaS, which then invokes the PSSC scanning container. The idea behind the ScanChain API is to act as an interaction point for services to directly access our scan tooling.
This project uses Quarkus, the Supersonic Subatomic Java Framework.
Our API will be written in Quarkus for ease of use and deployment to OpenShift; we will also use Tekton to assist with CI/CD.
If you want to learn more about Quarkus, please visit its website: https://quarkus.io/ .
# How to build
## Running the application in dev mode
To set up the environment after cloning the repository:
```
cd <repository>/
quarkus create app quarkus:dev
mvn -N io.takari:maven:wrapper
```
Also, it is necessary to create a local PostgreSQL instance. For development purposes, the parameters are:
```
username = postgresql
password = password
```
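For development, a throwaway local instance can be stood up with Docker (a sketch; the scandb database name matches the JDBC URL commented in dto/ConnectDB.java, and the image tag is an assumption):
```shell script
docker run --name scandb -d -p 5432:5432 \
  -e POSTGRES_USER=postgresql -e POSTGRES_PASSWORD=password \
  -e POSTGRES_DB=scandb postgres:15
```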
ToDo: Create Database Model
To run the Quarkus build in dev mode simply run:
```
You can run your application in dev mode that enables live coding using:
```shell script
./mvnw compile quarkus:dev
```
All endpoints should be available on localhost:8080/{endpoint}. The endpoints are listed in the endpoints section.
# Deploying to OpenShift (https://quarkus.io/guides/deploying-to-openshift)
Part of the advantage of working with Quarkus is the ease with which we can deploy it to OpenShift. We have the OpenShift extension already installed via the pom.
All that should be required to build and deploy to OpenShift is to log in via the usual method (oc login (creds) for example) before running a build command.
You can then expose the routes (oc expose {route}), and your application should be accessible on the OpenShift cluster. This is verifiable either by using the console to request which services are running (oc get svc) or by using the web console, which should display the service graphically.
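A sketch of that flow, assuming the stage profile and the osh service name from k8s/stage/ (the deploy command mirrors the example comment in application.properties):
```shell script
oc login (creds)
mvn deploy -Dquarkus.profile=stage -Dquarkus.kubernetes.deploy=true
oc expose svc/osh
oc get svc
```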
# Design diagram
API endpoint diagram with all endpoints, DB links, and connections to further services (PNC API, etc.)
# API endpoints
## /{scanId} - GET request for retrieving scans
This is a simple request for retrieving scans that are stored in our PostgreSQL database. The assigned scanId will return the whole scan payload in JSON format.
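For example (a sketch against the dev-mode server with an illustrative scanId; note the resource class in this commit mounts it under /scanGet):
```shell script
curl http://localhost:8080/scanGet/42
```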
## / - POST request takes a JSON payload to start scans (maybe isn't relevant/shouldn't be included in the future)
Creating scans via passing fully formed JSON payloads. The standard JSON format should contain:
- product-id
- event-id
- is-managed-service
- component-list

See appendix 1 for a provided example
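A hedged sketch of such a request (field names follow the list above; the values are illustrative and the exact key casing should be checked against ScanObjPayload):
```shell script
curl -X POST http://localhost:8080/ \
  -H "Content-Type: application/json" \
  -d '{"product-id": "openshift-container-platform", "event-id": "event-1", "is-managed-service": "false", "component-list": "component-a"}'
```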
## /scanRequest - POST request for starting scans
There are several different types of build that should be retrieved from the backend source. Different inputs are required based on the build source.

The required fields for BREW builds are:
- buildSystemType
- brewId
- brewNVR - matches brewId
- pncId
- artifactType
- fileName
- builtFromSource

The required fields for git builds are:
- buildSystemType
- repository
- reference
- commitId

The required fields for PNC builds are:
- buildSystemType
- buildId

This information should give us everything required for retrieving and then starting a scan when requested from the relevant sources. An example request for a BREW build is sketched below.
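For a BREW build, the resource class in this commit mounts the endpoint at /scanRequest/brew and BrewObjPayload reads snake_case keys, so a request might look like this (all values are illustrative):
```shell script
curl -X POST http://localhost:8080/scanRequest/brew \
  -H "Content-Type: application/json" \
  -d '{
        "build_system_type": "BREW",
        "brew_id": "12345",
        "brew_nvr": "example-1.0-1.el8",
        "pnc_id": "67890",
        "artifact_type": "rpm",
        "file_name": "example-1.0-1.el8.src.rpm",
        "built_from_source": true
      }'
```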
> **_NOTE:_** Quarkus now ships with a Dev UI, which is available in dev mode only at http://localhost:8080/q/dev/.
## /startScan - PUT request to start off the relevant scan
## Packaging and running the application
Only requires the scanId and should start the relevant scan; it should return success only on completion, or failure if there is no further response after the timeout.
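For example (a sketch assuming the scanId is passed as a path parameter like the other endpoints):
```shell script
curl -X PUT http://localhost:8080/startScan/42
```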
## /removeScan - DELETE request to remove a scan build from DB
The application can be packaged using:
```shell script
./mvnw package
```
It produces the `quarkus-run.jar` file in the `target/quarkus-app/` directory.
Be aware that it’s not an _über-jar_ as the dependencies are copied into the `target/quarkus-app/lib/` directory.
Only requires the scanId and should remove the relevant scan from our DB. It should return success or failure.
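For example (illustrative scanId; note the resource class in this commit is mounted at /deleteScan rather than /removeScan):
```shell script
curl -X DELETE http://localhost:8080/deleteScan/42
```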
The application is now runnable using `java -jar target/quarkus-app/quarkus-run.jar`.
# Expanded work to do
If you want to build an _über-jar_, execute the following command:
```shell script
./mvnw package -Dquarkus.package.type=uber-jar
```
## Jenkins
The application, packaged as an _über-jar_, is now runnable using `java -jar target/*-runner.jar`.
Haven't looked into the correct way for the API to interact with Jenkins; needs more investigation.
## Creating a native executable
## Jira tickets still to do:
- https://issues.redhat.com/browse/PSSECMGT-1548
- https://issues.redhat.com/browse/PSSECMGT-1549
- https://issues.redhat.com/browse/PSSECMGT-1550
- https://issues.redhat.com/browse/PSSECMGT-1551
- https://issues.redhat.com/browse/PSSECMGT-1552
- https://issues.redhat.com/browse/PSSECMGT-1553
- https://issues.redhat.com/browse/PSSECMGT-1554
You can create a native executable using:
```shell script
./mvnw package -Pnative
```
Or, if you don't have GraalVM installed, you can run the native executable build in a container using:
```shell script
./mvnw package -Pnative -Dquarkus.native.container-build=true
```
# Appendix
You can then execute your native executable with: `./target/code-with-quarkus-1.0.0-SNAPSHOT-runner`
Appendix 1
If you want to learn more about building native executables, please consult https://quarkus.io/guides/maven-tooling.
## Related Guides

36
k8s/linux-krb5.conf

@@ -0,0 +1,36 @@
includedir /etc/krb5.conf.d/
# depending on your config, you may wish to uncomment the following:
# includedir /var/lib/sss/pubconf/krb5.include.d/
[libdefaults]
default_realm = IPA.REDHAT.COM
dns_lookup_realm = true
dns_lookup_kdc = true
rdns = false
dns_canonicalize_hostname = false
ticket_lifetime = 24h
forwardable = true
udp_preference_limit = 1
default_ccache_name = KEYRING:persistent:%{uid}
max_retries = 1
kdc_timeout = 1500
[realms]
REDHAT.COM = {
    default_domain = redhat.com
    dns_lookup_kdc = true
    master_kdc = kerberos.corp.redhat.com
    admin_server = kerberos.corp.redhat.com
}
IPA.REDHAT.COM = {
    default_domain = ipa.redhat.com
    dns_lookup_kdc = true
    # Trust tickets issued by legacy realm on this host
    auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
    auth_to_local = DEFAULT
}
#DO NOT ADD A [domain_realms] section
#https://mojo.redhat.com/docs/DOC-1166841

21
k8s/stage/edgeroute.yml

@@ -0,0 +1,21 @@
#oc create route edge --service=osh --dry-run=client -o yaml > edgeroute.yml
apiVersion: route.openshift.io/v1
kind: Route
metadata:
  creationTimestamp: null
  labels:
    app.kubernetes.io/name: osh
    app.kubernetes.io/version: 1.0.0-SNAPSHOT
    app.openshift.io/runtime: quarkus
    env: stage
  name: osh
spec:
  port:
    targetPort: http
  tls:
    termination: edge
  to:
    kind: ""
    name: osh
    weight: null
status: {}

44
k8s/stage/kerberos-config.yaml

@@ -0,0 +1,44 @@
#oc create configmap kerberos-config --from-file=linux-krb5.conf --dry-run=client -o yaml > kerberos-config.yaml
apiVersion: v1
data:
  linux-krb5.conf: |
    includedir /etc/krb5.conf.d/
    # depending on your config, you may wish to uncomment the following:
    # includedir /var/lib/sss/pubconf/krb5.include.d/
    [libdefaults]
    default_realm = IPA.REDHAT.COM
    dns_lookup_realm = true
    dns_lookup_kdc = true
    rdns = false
    dns_canonicalize_hostname = false
    ticket_lifetime = 24h
    forwardable = true
    udp_preference_limit = 1
    default_ccache_name = KEYRING:persistent:%{uid}
    max_retries = 1
    kdc_timeout = 1500
    [realms]
    REDHAT.COM = {
        default_domain = redhat.com
        dns_lookup_kdc = true
        master_kdc = kerberos.corp.redhat.com
        admin_server = kerberos.corp.redhat.com
    }
    IPA.REDHAT.COM = {
        default_domain = ipa.redhat.com
        dns_lookup_kdc = true
        # Trust tickets issued by legacy realm on this host
        auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
        auth_to_local = DEFAULT
    }
    #DO NOT ADD A [domain_realms] section
    #https://mojo.redhat.com/docs/DOC-1166841
kind: ConfigMap
metadata:
  creationTimestamp: null
  name: kerberos-config
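Both manifests above were generated with the oc create ... --dry-run commands shown in their header comments; applying them to the stage cluster would look like this (a sketch, assuming you are logged in to the correct project):
```
oc apply -f k8s/stage/kerberos-config.yaml
oc apply -f k8s/stage/edgeroute.yml
```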

177
pom.xml

@@ -1,17 +1,16 @@
<?xml version="1.0"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<repositories>
<repository>
<id>jboss</id>
<name>JBoss repository</name>
<url>http://repository.jboss.org/maven2</url>
</repository>
</repositories>
<modelVersion>4.0.0</modelVersion>
<groupId>com.redhat.ncaughey</groupId>
<artifactId>rest-json-quickstart</artifactId>
<groupId>com.redhat.pctOshWrapper</groupId>
<artifactId>osh</artifactId>
<version>1.0.0-SNAPSHOT</version>
<properties>
<compiler-plugin.version>3.10.1</compiler-plugin.version>
@@ -33,63 +32,44 @@
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.json/json -->
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.12.1</version>
</dependency> -->
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.12.1</version>
</dependency> -->
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.fabric8</groupId>
<artifactId>tekton-client</artifactId>
<version>6.7.2</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-openshift</artifactId>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20220320</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.postgresql/postgresql -->
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.hibernate.orm/hibernate-core -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<groupId>io.quarkiverse.kerberos</groupId>
<artifactId>quarkus-kerberos</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>org.glassfish.jaxb</groupId>
<artifactId>jaxb-runtime</artifactId>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-openshift</artifactId>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20220320</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.postgresql/postgresql -->
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.hibernate.orm/hibernate-core -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
</dependency>
<dependency>
<groupId>org.glassfish.jaxb</groupId>
<artifactId>jaxb-runtime</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
@@ -99,45 +79,49 @@
<groupId>io.quarkus</groupId>
<artifactId>quarkus-arc</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-agroal</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-junit5</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.26</version>
<scope>provided</scope>
</dependency>
<!-- Bean Validation API and RI -->
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
</dependency>
<!-- https://mvnrepository.com/artifact/jakarta.persistence/jakarta.persistence-api -->
<dependency>
<groupId>jakarta.persistence</groupId>
<artifactId>jakarta.persistence-api</artifactId>
<version>3.1.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.eclipse.microprofile.rest.client/microprofile-rest-client-api -->
<dependency>
<groupId>org.eclipse.microprofile.rest.client</groupId>
<artifactId>microprofile-rest-client-api</artifactId>
<version>3.0.1</version>
</dependency>
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.5.2</version>
</dependency> -->
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.26</version>
<scope>provided</scope>
</dependency>
<!-- Bean Validation API and RI -->
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
</dependency>
<!-- https://mvnrepository.com/artifact/jakarta.persistence/jakarta.persistence-api -->
<dependency>
<groupId>jakarta.persistence</groupId>
<artifactId>jakarta.persistence-api</artifactId>
<version>3.1.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.eclipse.microprofile.rest.client/microprofile-rest-client-api -->
<dependency>
<groupId>org.eclipse.microprofile.rest.client</groupId>
<artifactId>microprofile-rest-client-api</artifactId>
<version>3.0.1</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>jboss</id>
<name>JBoss repository</name>
<url>http://repository.jboss.org/maven2</url>
</repository>
</repositories>
<build>
<plugins>
<plugin>
@@ -193,6 +177,19 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>io.smallrye</groupId>
<artifactId>jandex-maven-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<id>make-index</id>
<goals>
<goal>jandex</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>

0
schema/.gitkeep

30
schema/OffRegScraper.py

@@ -0,0 +1,30 @@
from bs4 import BeautifulSoup
import requests
import re
import csv
results = {}
URL = "https://product-security.pages.redhat.com/offering-registry/"
r = requests.get(URL)
soup = BeautifulSoup(r.text, 'html.parser')
table = soup.find("table")
rows = table.findAll("tr")
for row in rows:
    # skip the table header row
    if row.contents[1].text == 'Offering':
        continue
    # extract the offering short name from the URL and map it to the display name
    re_search = re.search('/offering-registry/offerings/(.*)/', row.contents[1].contents[0].attrs["href"])
    results[re_search.group(1)] = row.contents[1].contents[0].text
print(results)
with open('offerings.csv', 'w') as csv_file:
    writer = csv.writer(csv_file)
    for key, value in results.items():
        writer.writerow([key, value])
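A run sketch (assumes network access to the offering registry; beautifulsoup4 and requests are the only third-party dependencies):
```
pip install beautifulsoup4 requests
python3 schema/OffRegScraper.py   # writes offerings.csv next to the script
```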

126
schema/populate.sql

@@ -0,0 +1,126 @@
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-automation-platform','Ansible Automation Platform (AAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('advisor','Insights Advisor');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-aws','Ansible on AWS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-azure','Ansible on Azure');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-gcp','Ansible on GCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-wisdom-service','Ansible Wisdom Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cert-manager','cert-manager Operator for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('compliance','Insights Compliance');
INSERT INTO osh.offerings(offering_id,description) VALUES ('connected-customer-experience','Connected Customer Experience (CCX)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cost-management','Cost Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('custom-metric-autoscaler','OpenShift Custom Metrics Autoscaler');
INSERT INTO osh.offerings(offering_id,description) VALUES ('developer-sandbox-for-red-hat-openshift','Developer Sandbox for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('dotnet','.NET');
INSERT INTO osh.offerings(offering_id,description) VALUES ('drift','Insights Drift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eclipse-vertx','Red Hat build of Eclipse Vert.x');
INSERT INTO osh.offerings(offering_id,description) VALUES ('edge-management','Edge Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eventing','Insights Eventing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('fastdatapath','RHEL Fast Datapath');
INSERT INTO osh.offerings(offering_id,description) VALUES ('host-management-services','Host Management Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hosted-control-planes','Hosted Control Planes (Hypershift)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hybrid-application-console','Hybrid Application Console (HAC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('insights-essential','Insights Essentials');
INSERT INTO osh.offerings(offering_id,description) VALUES ('kernel-module-management','Kernel Module Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('logging-subsystem-for-red-hat-openshift','Logging Subsystem for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('lvms-operator','LVMS Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('malware-detection','Insights Malware Detection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('mgmt-platform','Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-applications','Migration Toolkit for Applications (MTA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-containers','Migration Toolkit for Containers (MTC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-runtimes','Migration Toolkit for Runtimes (MTR)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-virtualization','Migration Toolkit for Virtualization (MTV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('network-observability-operator','Network Observability Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-healthcheck-operator','Node HealthCheck Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-maintenance-operator','Node Maintenance Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('nvidia-gpu-add-on','NVIDIA GPU Add-On');
INSERT INTO osh.offerings(offering_id,description) VALUES ('oadp','OpenShift API for Data Protection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-platform','Openshift Container Platform (OCP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-storage','OpenShift Container Storage (OCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-data-foundation-managed-service','Red Hat OpenShift Data Foundation Managed Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-dedicated','OpenShift Dedicated (OSD/ROSA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-helm','OpenShift Developer Tools and Services (Helm)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-jenkins','OpenShift Developer Tools and Services (Jenkins)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-distributed-tracing','OpenShift Distributed Tracing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-on-azure','Openshift on Azure (ARO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-run-once-duration-override-operator','OpenShift Run Once Duration Override Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-sandboxed-containers','Openshift Sandboxed Containers');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-secondary-scheduler-operator','OpenShift Secondary Scheduler Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-servicemesh','OpenShift Service Mesh');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-virtualization','OpenShift Virtualization (CNV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-web-terminal-operator','OpenShift Web Terminal Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-winc','Windows Container Support for OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('patch','Insights Patch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('product-discovery','Product Discovery');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-3scale-api-management-platform','Red Hat 3scale API Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-advanced-cluster-management','Red Hat Advanced Cluster Management (RHACM)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-broker','Red Hat AMQ Broker');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-clients','Red Hat AMQ Clients');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-interconnect','Red Hat AMQ Interconnect');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-online','Red Hat AMQ Online');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-streams','Red Hat AMQ Streams');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-apicurio-registry','Red Hat build of Apicurio Registry (formerly known as Integration Service Registry)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-quarkus','Red Hat Build of Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-extensions-quarkus','Red Hat Camel Extensions for Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-k','Red Hat Camel K');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-spring-boot','Red Hat Camel for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-ceph-storage','Red Hat Ceph Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certificate-system','Red Hat Certificate System (RHCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certification-program','Red Hat Certification Program (rhcertification)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-code-quarkus','Red Hat Code Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-core-os','Red Hat CoreOS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-data-grid','Red Hat Data Grid');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-debezium','Red Hat Debezium');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-decision-manager','Red Hat Decision Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-hub','Red Hat Developer Hub');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-toolset','Red Hat Developer Toolset (DTS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-devtools-compilers','Red Hat Developer Tools (DevTools Compilers)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-directory-server','Red Hat Directory Server (RHDS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-10','Red Hat Enterprise Linux (RHEL) 10');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-6','Red Hat Enterprise Linux (RHEL) 6');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-7','Red Hat Enterprise Linux (RHEL) 7');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-8','Red Hat Enterprise Linux (RHEL) 8');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-9','Red Hat Enterprise Linux (RHEL) 9');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-fuse','Red Hat Fuse');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-gluster-storage','Red Hat Gluster Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-in-vehicle-os','Red Hat In-Vehicle Operating System (RHIVOS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-core-services','Red Hat JBoss Core Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-eap','Red Hat JBoss Enterprise Application Platform (EAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-web-server','Red Hat JBoss Web Server');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-observability-service','Red Hat Observability Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-database-access','Red Hat OpenShift Database Access');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-shift-data-science','Red Hat OpenShift Data Science (RHODS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openjdk','Red Hat OpenJDK');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-api-management','Red Hat OpenShift API Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-builds-v2','Red Hat OpenShift Builds V2');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-connectors','Red Hat OpenShift Connectors (RHOC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-control-plane-service','Red Hat OpenShift Control Plane Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-data-foundation','Red Hat OpenShift Data Foundation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-dev-spaces','Red Hat OpenShift Dev Spaces');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-gitops','Red Hat OpenShift GitOps');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-local','Red Hat OpenShift Local');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-pipelines','Red Hat OpenShift Pipelines');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-serverless','Red Hat OpenShift Serverless');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-service-registry','Red Hat OpenShift Service Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-streams-apache-kafka','Red Hat OpenShift Streams for Apache Kafka (RHOSAK)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openstack-platform','Red Hat OpenStack Platform (RHOSP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-optaplanner','Red Hat Optaplanner');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-plug-ins-for-backstage','Red Hat Plug-ins for Backstage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-process-automation-manager','Red Hat Process Automation Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quarkus-registry','Red Hat Quarkus Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quay','Red Hat Quay');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-satellite','Red Hat Satellite');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-service-interconnect','Red Hat Service Interconnect (formerly known as Application Interconnect)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-single-sign-on','Red Hat Single Sign-On (RHSSO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-software-collections','Red Hat Software Collections');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-support-for-spring-boot','Red Hat support for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-trusted-application-pipeline','Red Hat Trusted Application Pipeline (RHTAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-update-infrastructure','Red Hat Update Infrastructure (RHUI)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-virtualization','Red Hat Virtualization');
INSERT INTO osh.offerings(offering_id,description) VALUES ('resource-optimization','Insights Resource Optimization (ROS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rh-vulnerability-for-ocp','Insights Vulnerability for OCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rhacs','Red Hat Advanced Cluster Security for Kubernetes (RHACS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('self-node-remediation','Self Node Remediation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-central','Subscription Central');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-watch','Subscription Watch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('telco-sw-components','Telco SW Components');
INSERT INTO osh.offerings(offering_id,description) VALUES ('vulnerability','Vulnerability');

81
schema/schema.sql

@@ -0,0 +1,81 @@
CREATE SCHEMA osh;
GRANT USAGE ON SCHEMA osh TO postgres;
CREATE TABLE IF NOT EXISTS osh.offerings(
offering_id VARCHAR(100),
description VARCHAR(200),
PRIMARY KEY (offering_id)
);
CREATE TABLE IF NOT EXISTS osh.results(
results_id SERIAL,
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
state BOOLEAN,
logs bytea,
task_reference VARCHAR(50),
PRIMARY KEY (results_id)
);
CREATE TABLE IF NOT EXISTS osh.scans(
scan_id SERIAL,
offering_id VARCHAR(100),
event_id VARCHAR(100) NOT NULL,
is_managed_service BOOLEAN NOT NULL,
component_list VARCHAR(100),
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
owner VARCHAR(50) NOT NULL,
results SERIAL,
status VARCHAR (50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
PRIMARY KEY(scan_id),
FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
FOREIGN KEY (results) REFERENCES osh.results(results_id)
);
CREATE TABLE IF NOT EXISTS osh.archive(
scan_id SERIAL,
offering_id VARCHAR(100),
event_id VARCHAR(100) NOT NULL,
is_managed_service BOOLEAN NOT NULL,
component_list VARCHAR(100),
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
owner VARCHAR(50) NOT NULL,
results SERIAL,
status VARCHAR (50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
PRIMARY KEY(scan_id),
FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
FOREIGN KEY (results) REFERENCES osh.results(results_id)
);
CREATE TABLE IF NOT EXISTS osh.gitscans (
id SERIAL,
build_system_type VARCHAR(80),
repository VARCHAR(150),
reference VARCHAR(100),
commit_id VARCHAR(100),
-- SHA256 has a length of 256 bits, so 256 bits would represent 64 hex characters
hashsum VARCHAR(64),
PRIMARY KEY(id)
);
CREATE TABLE IF NOT EXISTS osh.pncscans(
id SERIAL,
build_system_type VARCHAR(80),
build_id VARCHAR(100),
PRIMARY KEY(id)
);
CREATE TABLE IF NOT EXISTS osh.brewscans(
id SERIAL,
build_system_type VARCHAR(80),
brew_id VARCHAR(100),
brew_nvr VARCHAR(100),
pnc_id VARCHAR(100),
artifact_type VARCHAR(100),
file_name VARCHAR(100),
built_from_source BOOLEAN,
PRIMARY KEY(id)
);
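Loading the schema into the dev database described in the README might look like this (a sketch; the scandb name comes from the JDBC URL commented in dto/ConnectDB.java):
```
psql -U postgresql -d scandb -f schema/schema.sql
psql -U postgresql -d scandb -f schema/populate.sql
```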

1
src/main/docker/Dockerfile.jvm

@@ -86,6 +86,7 @@ COPY --chown=185 target/quarkus-app/*.jar /deployments/
COPY --chown=185 target/quarkus-app/app/ /deployments/app/
COPY --chown=185 target/quarkus-app/quarkus/ /deployments/quarkus/
EXPOSE 8080
USER 185
ENV JAVA_OPTS="-Dquarkus.http.host=0.0.0.0 -Djava.util.logging.manager=org.jboss.logmanager.LogManager"

29
src/main/java/dto/BrewObj.java

@@ -6,21 +6,24 @@ import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
@Getter
@ToString
@Jacksonized
public class BrewObj implements Serializable {
public String buildSystemType;
public String brewId;
public String brewNvr;
public String pncId;
public String artifactType;
public String fileName;
public String buildFromSource;
}
public static final String SQL = "INSERT INTO brewscans " +
"(build_system_type, brew_id, brew_nvr, pnc_id, artifact_type, file_name, built_from_source)" +
"VALUES (? ? ? ? ? ? ?)";
private final String buildSystemType;
private final String brewId;
private final String brewNvr;
private final String pncId;
private final String artifactType;
private final String fileName;
private final Boolean builtFromSource;
}

33
src/main/java/dto/BrewObjPayload.java

@@ -1,23 +1,20 @@
package dto;
import org.eclipse.microprofile.config.ConfigProvider;
// import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
// import org.jboss.pnc.api.dto.HeartbeatConfig;
// import org.jboss.pnc.api.dto.Request;
import java.net.URI;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;
import java.sql.Struct;
import java.util.*;
import org.json.JSONObject;
import org.json.JSONArray;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import org.json.JSONException;
import org.json.JSONObject;
public class BrewObjPayload {
public static BrewObj constructScanPayload(JSONObject brewObj) throws URISyntaxException {
return new BrewObj(brewObj.getString("buildSystemType"),brewObj.getString("brewId"),brewObj.getString("brewNvr"),brewObj.getString("pncId"),brewObj.getString("artifactType"),brewObj.getString("fileName"),brewObj.getString("builtFromSource"));
public static BrewObj constructScanPayload(JSONObject jsonObj) throws JSONException {
return new BrewObj(
jsonObj.getString("build_system_type"),
jsonObj.getString("brew_id"),
jsonObj.getString("brew_nvr"),
jsonObj.getString("pnc_id"),
jsonObj.getString("artifact_type"),
jsonObj.getString("file_name"),
jsonObj.getBoolean("built_from_source"));
}
}
private BrewObjPayload() {}
}

28
src/main/java/dto/ConnectDB.java

@@ -1,35 +1,25 @@
package dto;
import constants.PSGQL;
import org.json.JSONException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import static constants.PSGQL.user;
import static constants.PSGQL.password;
import static constants.PSGQL.url;
import static constants.PSGQL.*;
// @TODO Replace hard-coded credentials; make use of our secure db connection practice
public class ConnectDB{
// private final String url = "jdbc:postgresql://localhost:5432/scandb";
// private final String user = "postgres";
// private final String password = "password";
public class ConnectDB {
/**
* Connect to the PostgreSQL database
*
* @return a Connection object
*/
public Connection connect() {
Connection conn = null;
public Connection connect() throws JSONException {
try {
conn = DriverManager.getConnection(url, user, password);
System.out.println("Connected to the PostgreSQL server successfully.");
Connection conn = DriverManager.getConnection(url, user, password);
System.out.println("Connected to PostgreSQL server");
return conn;
} catch (SQLException e) {
System.out.println(e.getMessage());
}
return conn;
return null;
}
}

22
src/main/java/dto/GitObj.java

@@ -5,19 +5,21 @@ import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
@Getter
@ToString
@Jacksonized
public class GitObj implements Serializable {
public String buildSystemType;
public String repository;
public String reference;
public String commitId;
public static final String SQL = "INSERT INTO gitscans " +
"(build_system_type, repository, reference, commit_id)" +
"VALUES (? ? ? ?)";
private final String buildSystemType;
private final String repository;
private final String reference;
private final String commitId;
}

30
src/main/java/dto/GitObjPayload.java

@@ -1,23 +1,17 @@
package dto;
import org.eclipse.microprofile.config.ConfigProvider;
// import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
// import org.jboss.pnc.api.dto.HeartbeatConfig;
// import org.jboss.pnc.api.dto.Request;
import java.net.URI;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;
import java.sql.Struct;
import java.util.*;
import org.json.JSONObject;
import org.json.JSONArray;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import org.json.JSONException;
import org.json.JSONObject;
public class GitObjPayload {
public static GitObj constructScanPayload(JSONObject gitObj) throws URISyntaxException {
return new GitObj(gitObj.getString("buildSystemType"),gitObj.getString("repository"),gitObj.getString("reference"),gitObj.getString("commitId"));
public static GitObj constructScanPayload(JSONObject jsonObj) throws JSONException {
return new GitObj(
jsonObj.getString("build_system_type"),
jsonObj.getString("repository"),
jsonObj.getString("reference"),
jsonObj.getString("commit_id"));
}
}
private GitObjPayload() {}
}

16
src/main/java/dto/PncObj.java

@@ -5,17 +5,17 @@ import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
@Getter
@ToString
@Jacksonized
public class PncObj implements Serializable {
public String buildSystemType;
public String buildId;
public static final String SQL = "INSERT INTO pncscans (build_system_type, build_id) VALUES (? ?)";
private final String buildSystemType;
private final String buildId;
}

28
src/main/java/dto/PncObjPayload.java

@@ -1,23 +1,15 @@
package dto;
import org.eclipse.microprofile.config.ConfigProvider;
// import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
// import org.jboss.pnc.api.dto.HeartbeatConfig;
// import org.jboss.pnc.api.dto.Request;
import java.net.URI;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;
import java.sql.Struct;
import java.util.*;
import org.json.JSONObject;
import org.json.JSONArray;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import org.json.JSONException;
import org.json.JSONObject;
public class PncObjPayload {
public static PncObj constructScanPayload(JSONObject pncObj) throws URISyntaxException {
return new PncObj(pncObj.getString("buildSystemType"),pncObj.getString("buildId"));
public static PncObj constructScanPayload(JSONObject jsonObj) throws JSONException {
return new PncObj(
jsonObj.getString("build_system_type"),
jsonObj.getString("build_id"));
}
}
private PncObjPayload() {}
}

24
src/main/java/dto/ScanObj.java

@@ -5,22 +5,24 @@ import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
import java.io.Serializable;
// import org.jboss.pnc.api.dto.Request;
//still need to fix all the scan objects to be significantly less poorly written
//TODO add interface for the scan objects (is probably the cleanest solution)
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
@Getter
@ToString
@Jacksonized
public class ScanObj implements Serializable {
public String scanId;
public String productId;
public String eventId;
public String isManagedService;
public String componentList;
public static final String SQL = "INSERT INTO scans " +
"(scan_id, offering_id, event_id, is_managed_service, component_list) " +
"VALUES (? ? ? ? ?)";
private final String scanId;
private final String productId;
private final String eventId;
private final String isManagedService;
private final String componentList;
}

30
src/main/java/dto/ScanObjPayload.java

@@ -1,23 +1,17 @@
package dto;
import org.eclipse.microprofile.config.ConfigProvider;
// import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
// import org.jboss.pnc.api.dto.HeartbeatConfig;
// import org.jboss.pnc.api.dto.Request;
import java.net.URI;
import java.net.URISyntaxException;
import java.nio.charset.StandardCharsets;
import java.sql.Struct;
import java.util.*;
import org.json.JSONObject;
import org.json.JSONArray;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import org.json.JSONException;
import org.json.JSONObject;
public class ScanObjPayload {
public static ScanObj constructScanPayload(JSONObject scanObj) throws URISyntaxException {
return new ScanObj(scanObj.getString("scanId"),scanObj.getString("productId"),scanObj.getString("eventId"),scanObj.getString("isManagedService"),scanObj.getString("componentList"));
public static ScanObj constructScanPayload(JSONObject jsonObj) throws JSONException {
return new ScanObj(
jsonObj.getString("scan_id"),
jsonObj.getString("offering_id"),
jsonObj.getString("event_id"),
jsonObj.getString("is_managed_service"),
jsonObj.getString("component_list"));
}
}
private ScanObjPayload() {}
}

78
src/main/java/rest/CreateGetResource.java

@@ -3,83 +3,57 @@ package rest;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Set;
import dto.ScanObj;
import dto.ConnectDB;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import java.util.Set;
import java.util.stream.Collectors;
import javax.inject.Inject;
import javax.ws.rs.Consumes;
import java.sql.*;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;
// import org.hibernate.EntityManager;
import jakarta.persistence.EntityManager;
import jakarta.persistence.Cacheable;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import jakarta.persistence.NamedQuery;
import jakarta.persistence.QueryHint;
import jakarta.persistence.SequenceGenerator;
import jakarta.persistence.Table;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
// @Path("/api/v1/[osh-scan]")
@Path("/scanGet")
@Authenticated
public class CreateGetResource {
// @Inject
// EntityManager em;
private static final Logger logger = LoggerFactory.getLogger(CreateGetResource.class);
CreateScanService createScanService;
private Set<ScanObj> Scans = Collections.newSetFromMap(Collections.synchronizedMap(new LinkedHashMap<>()));
public CreateGetResource() {
// LDB: @TODO either put some code here or remove this not used public constructor
}
@GET
@Path("/{scanId}")
public Set<ScanObj> list(@PathParam("scanId") String scanId) {
//used to return specific scanIds; we will be querying the db directly here
try {
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "SELECT * FROM scans WHERE scanid=" +scanId;
stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery(sql);
ConnectDB connectDB = new ConnectDB();
String sql = "SELECT * FROM scans WHERE scan_id=?";
try(Connection conn = connectDB.connect();
PreparedStatement pstmt = conn.prepareStatement(sql)) {
pstmt.setString(1, scanId);
ResultSet rs = pstmt.executeQuery();
while (rs.next()) {
//very ugly solution needs some change to where we put the query
Scans.add(new ScanObj(rs.getString("scanid"),rs.getString("productid"),rs.getString("eventid"),rs.getString("ismanagedservice"),rs.getString("componentlist")));
conn.close();
Scans.add(new ScanObj(
rs.getString("scan_id"),
rs.getString("offering_id"),
rs.getString("event_id"),
rs.getString("is_managed_service"),
rs.getString("component_list")));
}
} catch (SQLException e){
System.out.println(e);
} catch (SQLException e) {
logger.error(e.getMessage());
}
return Scans;
}
}
}

66
src/main/java/rest/CreateScanRequest.java

@@ -1,66 +1,56 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ScanObj;
import dto.BrewObj;
import dto.ConnectDB;
import dto.ScanObjPayload;
import dto.BrewObjPayload;
import dto.GitObj;
import dto.GitObjPayload;
import dto.PncObj;
import dto.PncObjPayload;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.json.JSONException;
import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.PreparedStatement;
import java.sql.SQLException;
@Authenticated
@Path("/scanRequest")
public class CreateScanRequest {
//all of these need cleaning up to be a more sensible solution
private static final Logger logger = LoggerFactory.getLogger(CreateScanRequest.class);
@RestClient
CreateScanService createScanService;
@POST
@Path("/brew")
@Consumes({ "application/json" })
//in theory should take List<String> to clean it up
public BrewObj invokeScanAnalyze(@Valid String scanInvocation) throws URISyntaxException {
// in theory should take List<String> to clean it up
public BrewObj invokeBrewScanAnalyze(@Valid String scanInvocation) throws JSONException {
JSONObject jsonData = new JSONObject(scanInvocation);
BrewObj brewObj = BrewObjPayload.constructScanPayload(jsonData);
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "INSERT INTO brewscans (buildsystemtype, brewid, brewnvr, pncid, artifacttype, filename, builtfromsource) VALUES ('"+brewObj.buildSystemType+"','"+brewObj.brewId+"','"+brewObj.brewNvr+"','"+brewObj.pncId+"','"+brewObj.artifactType+"','"+brewObj.fileName+"','"+brewObj.buildFromSource+"')";
try{
stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery(sql);
conn.close();
} catch (SQLException e){
System.out.println(e);
try(Connection conn = connectDB.connect();
PreparedStatement pstmt = conn.prepareStatement(BrewObj.SQL)) {
pstmt.setString(1, brewObj.getBuildSystemType());
pstmt.setString(2, brewObj.getBrewId());
pstmt.setString(3, brewObj.getBrewNvr());
pstmt.setString(4, brewObj.getPncId());
pstmt.setString(5, brewObj.getArtifactType());
pstmt.setString(6, brewObj.getFileName());
pstmt.setBoolean(7, brewObj.getBuiltFromSource());
pstmt.executeUpdate();
} catch (SQLException e) {
logger.error(e.getMessage());
}
return brewObj;
}

35
src/main/java/rest/CreateScanResource.java

@@ -1,38 +1,31 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ConnectDB;
import dto.ScanObjPayload;
import dto.ScanObj;
import dto.ScanObjPayload;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.json.JSONObject;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.json.JSONException;
import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ScanObj;
import dto.ConnectDB;
import dto.ScanObjPayload;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
@Path("/")
public class CreateScanResource {
private static final Logger logger = LoggerFactory.getLogger(CreateScanResource.class);
@RestClient
CreateScanService createScanService;

37
src/main/java/rest/CreateStartScan.java

@@ -1,41 +1,28 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ConnectDB;
import dto.ScanObj;
import io.quarkus.security.Authenticated;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PUT;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ScanObj;
import dto.ConnectDB;
import dto.ScanObjPayload;
import javax.ws.rs.PathParam;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.SQLException;
@Authenticated
@Path("/startScan")
public class CreateStartScan {
private static final Logger logger = LoggerFactory.getLogger(CreateStartScan.class);
@RestClient
CreateScanService createScanService;

69
src/main/java/rest/RemoveScan.java

@@ -1,70 +1,43 @@
package rest;
import dto.ConnectDB;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PUT;
import javax.ws.rs.DELETE;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ScanObj;
import dto.ConnectDB;
import dto.ScanObjPayload;
import javax.ws.rs.PathParam;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
@Path("/deleteScan")
public class RemoveScan {
private static final Logger logger = LoggerFactory.getLogger(RemoveScan.class);
// @Inject
@RestClient
CreateScanService createScanService;
// ScanObjPayload scanObjPayload;
@DELETE
@Path("/{scanId}")
public boolean invokeScanAnalyze(@PathParam("scanId") String scanId) throws URISyntaxException {
public boolean invokeScanAnalyze(@PathParam("scanId") String scanId) {
boolean rc = false;
//send task to the actual interface here using the resultset returned (should multiple scanids be allowed):
//once the task is complete AND we have confirmation that the scan is done run the following sql
String qry = "DELETE FROM scans WHERE scan_id=?";
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
//this is ugly and needs to be rewritten
Statement stmt = null;
ScanObj finalScan = null;
//fix this
Boolean success = false;
String sql = "DELETE FROM scans WHERE scanid=" + scanId;
//need to figure out an archive system and whether it's necessary (archive value??)
try{
stmt = conn.createStatement();
//TODO add proper checks
stmt.executeUpdate(sql);
//send task to the actual interface here using the resultset returned (should multiple scanids be allowed):
//once the task is complete AND we have confirmation that the scan is done run the following sql
conn.close();
} catch (SQLException e){
System.out.println(e);
}
success = true;
return success;
try(Connection conn = connectDB.connect();
PreparedStatement pstmt = conn.prepareStatement(qry)) {
pstmt.setString(1, scanId);
pstmt.executeUpdate();
rc = true;
} catch (SQLException e) {
logger.error(e.getMessage());
}
return rc;
}
}

6
src/main/java/rest/Scan.java

@@ -1,8 +1,6 @@
package rest;
package rest;
import javax.persistence.Entity;
public class Scan {
public class Scan {
private int scanId;
private String productId;
private String eventId;

92
src/main/java/rest/StoreData.java

@@ -1,91 +1,21 @@
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
// @Path("/storeData")
// public class StoreData {
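The imports retained above point at Hibernate's native bootstrap API, presumably for when StoreData is re-enabled. For orientation, a minimal self-contained sketch of that bootstrap follows; the hibernate.cfg.xml and the persisted entity are assumptions, not the project's actual wiring.

```java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;

public class HibernateBootstrapSketch {
    public static void main(String[] args) {
        // configure() reads hibernate.cfg.xml from the classpath (assumed to exist).
        StandardServiceRegistry registry =
                new StandardServiceRegistryBuilder().configure().build();
        try {
            Metadata metadata = new MetadataSources(registry).buildMetadata();
            SessionFactory sessionFactory = metadata.buildSessionFactory();
            try (Session session = sessionFactory.openSession()) {
                Transaction tx = session.beginTransaction();
                // session.persist(...) would go here once StoreData is re-enabled.
                tx.commit();
            }
            sessionFactory.close();
        } catch (Exception e) {
            // On failure, tear the registry down so it is not leaked.
            StandardServiceRegistryBuilder.destroy(registry);
            throw new RuntimeException(e);
        }
    }
}
```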

36
src/main/java/rest/UsersResource.java

@ -0,0 +1,36 @@
package rest;
import io.quarkiverse.kerberos.KerberosPrincipal;
import io.quarkus.security.Authenticated;
import io.quarkus.security.identity.SecurityIdentity;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
@Path("/testKerberos")
@Authenticated
public class UsersResource {
@Inject
SecurityIdentity identity;
@Inject
KerberosPrincipal kerberosPrincipal;
@GET
@Path("/me")
@Produces("text/plain")
public String me() {
return identity.getPrincipal().getName();
}
}
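If more than the principal name is needed while testing, the injected SecurityIdentity also exposes roles. A small companion endpoint along these lines could be added; this is a sketch using only getPrincipal() and getRoles() from the Quarkus security API, not something present in this commit.

```java
// Sketch of an additional endpoint on UsersResource; not part of this commit.
@GET
@Path("/roles")
@Produces("text/plain")
public String roles() {
    // getRoles() comes from io.quarkus.security.identity.SecurityIdentity.
    return identity.getPrincipal().getName() + " roles=" + identity.getRoles();
}
```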

35
src/main/resources/application.properties

@ -1,7 +1,40 @@
#Example deploy - mvn deploy -Dquarkus.profile=stage -Dquarkus.kubernetes.deploy=true
# quarkus.rest-client."rest.CreateScanService".url=https://localhost:8080/
# quarkus.rest-client."rest.CreateScanService".scope=javax.inject.Singleton
# couchdb.name=scan-results
# couchdb.url=https://localhost:5984
# quarkus.hibernate-orm.database.generation=drop-and-create
%dev.quarkus.kerberos.keytab-path=HTTP_osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM.keytab
%dev.quarkus.kerberos.service-principal-name=HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM
%stage.quarkus.openshift.name=osh
%stage.quarkus.openshift.labels.env=stage
%stage.quarkus.log.level=DEBUG
# Only available in Quarkus >= 3.x
%stage.quarkus.openshift.route.tls.termination=edge
# As we can't create an edge-terminated route on Quarkus < 3.x, disable route creation for now
%stage.quarkus.openshift.route.expose=false
%stage.quarkus.openshift.route.target-port=https
%stage.quarkus.openshift.route.tls.insecure-edge-termination-policy=redirect
##########################################
# Kerberos Specifics #
##########################################
%stage.quarkus.openshift.secret-volumes.osh-wrapper.secret-name=kerberos-keytab-osh
%stage.quarkus.openshift.mounts.osh-wrapper.path=/kerberos
%stage.quarkus.openshift.mounts.osh-wrapper.read-only=true
%stage.quarkus.kerberos.keytab-path=/kerberos/kerberos-keytab-osh
%stage.quarkus.kerberos.service-principal-name=HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.path=/etc/krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.sub-path=linux-krb5.conf
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.config-map-name=kerberos-config
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.items."linux-krb5.conf".path=linux-krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.read-only=true
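At runtime the active profile (%dev vs %stage) decides which of these values wins. A quick way to confirm what the application actually resolved is to read the property back through MicroProfile Config, which is available in any Quarkus app; the resource below is an illustration, not part of this commit.

```java
import org.eclipse.microprofile.config.ConfigProvider;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;

// Illustrative resource: echoes the keytab path resolved for the
// active profile. Not part of this commit.
@Path("/configCheck")
public class ConfigCheckResource {
    @GET
    @Produces("text/plain")
    public String keytabPath() {
        return ConfigProvider.getConfig()
                .getValue("quarkus.kerberos.keytab-path", String.class);
    }
}
```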

107
src/test/java/dto/TestPayload.java

@ -0,0 +1,107 @@
package dto;
import org.json.JSONObject;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static org.junit.jupiter.api.Assertions.*;
class TestPayload {
private static final Logger logger = LoggerFactory.getLogger(TestPayload.class);
@Test
void TestBrew() {
JSONObject jsonObject = new JSONObject();
jsonObject.put("build_system_type", "brew");
jsonObject.put("brew_id", "1");
jsonObject.put("brew_nvr", "1.1.0");
jsonObject.put("pnc_id", "153");
jsonObject.put("artifact_type", "arti");
jsonObject.put("file_name", "myfile");
jsonObject.put("built_from_source", true);
BrewObj brewObj1 = BrewObjPayload.constructScanPayload(jsonObject);
BrewObj brewObj2 = new BrewObj(
jsonObject.getString("build_system_type"),
jsonObject.getString("brew_id"),
jsonObject.getString("brew_nvr"),
jsonObject.getString("pnc_id"),
jsonObject.getString("artifact_type"),
jsonObject.getString("file_name"),
jsonObject.getBoolean("built_from_source"));
logger.info("BrewObj1: " + brewObj1.toString());
logger.info("BrewObj2: " + brewObj2.toString());
assertEquals(brewObj1.getBuildSystemType(), brewObj2.getBuildSystemType());
assertEquals(brewObj1.getBrewId(), brewObj2.getBrewId());
assertEquals(brewObj1.getBrewNvr(), brewObj2.getBrewNvr());
assertEquals(brewObj1.getPncId(), brewObj2.getPncId());
assertEquals(brewObj1.getArtifactType(), brewObj2.getArtifactType());
assertEquals(brewObj1.getFileName(), brewObj2.getFileName());
assertEquals(brewObj1.getBuiltFromSource(), brewObj2.getBuiltFromSource());
}
@Test
void TestGit() {
JSONObject jsonObject = new JSONObject();
jsonObject.put("build_system_type", "git");
jsonObject.put("repository", "repo");
jsonObject.put("reference", "ref");
jsonObject.put("commit_id", "c6385a754421a57cd0a26ccba187cd687c8d1258");
GitObj gitObj1 = GitObjPayload.constructScanPayload(jsonObject);
GitObj gitObj2 = new GitObj(
jsonObject.getString("build_system_type"),
jsonObject.getString("repository"),
jsonObject.getString("reference"),
jsonObject.getString("commit_id"));
logger.info("GitObj1: " + gitObj1.toString());
logger.info("GitObj2: " + gitObj2.toString());
assertEquals(gitObj1.getBuildSystemType(), gitObj2.getBuildSystemType());
assertEquals(gitObj1.getRepository(), gitObj2.getRepository());
assertEquals(gitObj1.getReference(), gitObj2.getReference());
assertEquals(gitObj1.getCommitId(), gitObj2.getCommitId());
}
@Test
void TestPnc() {
JSONObject jsonObject = new JSONObject();
jsonObject.put("build_system_type", "pnc");
jsonObject.put("build_id", "153");
PncObj pncObj1 = PncObjPayload.constructScanPayload(jsonObject);
PncObj pncObj2 = new PncObj(
jsonObject.getString("build_system_type"),
jsonObject.getString("build_id"));
logger.info("PncObj1: " + pncObj1.toString());
logger.info("PncObj2: " + pncObj2.toString());
assertEquals(pncObj1.getBuildSystemType(), pncObj2.getBuildSystemType());
assertEquals(pncObj1.getBuildId(), pncObj2.getBuildId());
}
@Test
void TestScan() {
JSONObject jsonObject = new JSONObject();
jsonObject.put("scan_id", "ABC");
jsonObject.put("offering_id", "product#");
jsonObject.put("event_id", "event#");
jsonObject.put("is_managed_service", "TRUE");
jsonObject.put("component_list", "components");
ScanObj scanObj1 = ScanObjPayload.constructScanPayload(jsonObject);
ScanObj scanObj2 = new ScanObj(
jsonObject.getString("scan_id"),
jsonObject.getString("offering_id"),
jsonObject.getString("event_id"),
jsonObject.getString("is_managed_service"),
jsonObject.getString("component_list"));
logger.info("ScanObj1: " + scanObj1.toString());
logger.info("ScanObj2: " + scanObj2.toString());
assertEquals(scanObj1.getScanId(), scanObj2.getScanId());
assertEquals(scanObj1.getProductId(), scanObj2.getProductId());
assertEquals(scanObj1.getEventId(), scanObj2.getEventId());
assertEquals(scanObj1.getIsManagedService(), scanObj2.getIsManagedService());
assertEquals(scanObj1.getComponentList(), scanObj2.getComponentList());
}
}
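The three build payload types are distinguished by the build_system_type field, which each test above sets first. A dispatcher over that field might look like the following sketch; the constructScanPayload calls are taken from the tests, but the dispatcher class itself is hypothetical and not part of this commit.

```java
package dto;

import org.json.JSONObject;

// Hypothetical helper, not part of this commit: routes an incoming payload
// to the matching constructScanPayload based on its build_system_type field.
public final class PayloadDispatcher {
    private PayloadDispatcher() {
    }

    public static Object dispatch(JSONObject json) {
        String type = json.getString("build_system_type");
        switch (type) {
            case "brew":
                return BrewObjPayload.constructScanPayload(json);
            case "git":
                return GitObjPayload.constructScanPayload(json);
            case "pnc":
                return PncObjPayload.constructScanPayload(json);
            default:
                throw new IllegalArgumentException("Unknown build_system_type: " + type);
        }
    }
}
```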