# Conflicts:
#	pom.xml
#	src/main/java/rest/CreateScanRequest.java
#	src/main/java/rest/CreateScanResource.java
#	src/main/java/rest/CreateStartScan.java
#	src/main/java/rest/StoreData.java

local_dev_fixes
30 changed files with 869 additions and 587 deletions
@@ -0,0 +1,12 @@
.dcignore
.idea
*.iml

dev/

# Maven
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
release.properties
@@ -1,112 +1,51 @@
See https://docs.google.com/document/d/15yod6K_ZbNkJ_ern7gwpxjBkdJIlHXORfYZ3CGQhnEM/edit?usp=sharing for a full version with images.

# code-with-quarkus

# Introduction
Currently we rely on CPaaS to submit requests to PSSaaS, which then invokes the PSSC scanning container. The idea behind the ScanChain API is to act as an interaction point for services to directly access our scan tooling.

This project uses Quarkus, the Supersonic Subatomic Java Framework. Our API is written in Quarkus for ease of use and of deployment to OpenShift; we also use Tekton to assist with CI/CD. If you want to learn more about Quarkus, please visit its website: https://quarkus.io/ .
# How to build
## Running the application in dev mode

To set up the environment after cloning the repository:

```
cd <repository>/
mvn -N io.takari:maven:wrapper
```

It is also necessary to create a local PostgreSQL instance. For development purposes, the parameters are:
```
username = postgresql
password = password
```
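As a convenience, one way to stand up such an instance is with a container. This is only a sketch: the container name and the `scandb` database mirror the commented-out JDBC URL in `ConnectDB` (`jdbc:postgresql://localhost:5432/scandb`), but any local PostgreSQL will do:
```shell script
docker run --name scan-postgres \
  -e POSTGRES_USER=postgresql \
  -e POSTGRES_PASSWORD=password \
  -e POSTGRES_DB=scandb \
  -p 5432:5432 -d postgres
```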
ToDo: Create Database Model

You can run the application in dev mode, which enables live coding, using:
```shell script
./mvnw compile quarkus:dev
```
All endpoints should then be available on localhost:8080/{endpoint}. The endpoints are listed in the endpoints section below.
# Deploying to OpenShift (https://quarkus.io/guides/deploying-to-openshift)
Part of the advantage of working with Quarkus is the ease with which we can deploy to OpenShift. The OpenShift extension is already installed via the pom.

All that should be required to build and deploy to OpenShift is to log in to the cluster via the usual method (for example, `oc login` with your credentials) before running a build command.

You can then expose the route (`oc expose {route}`), and your application should be accessible on the OpenShift cluster. This is verifiable either by using the CLI to list which services are running (`oc get svc`) or by using the web console, which displays the services graphically. A sketch of the full flow is below.
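This sketch assumes the `stage` profile and the `osh` service name configured later in this change; the deploy command comes from the comment at the top of `application.properties`, and the login placeholders are yours to fill in:
```shell script
oc login --token=<token> --server=<cluster-api-url>
mvn deploy -Dquarkus.profile=stage -Dquarkus.kubernetes.deploy=true
oc get svc            # verify the osh service is running
oc expose svc/osh     # create a route (plain HTTP)
```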

# Design diagram
API endpoint diagram with all endpoints, DB links, and connections to further services (PNC API, etc.).
# API endpoints

## /{scanId} - GET request for retrieving scans
This is a simple request for retrieving scans that are stored in our PostgreSQL database. The assigned scanId will return the whole scan payload in JSON format.
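For illustration, retrieving a stored scan by id from a local dev instance (the id here is made up):
```shell script
curl http://localhost:8080/1
```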

## / - POST request takes a JSON payload to start scans (maybe isn't relevant / shouldn't be included in the future)

Creating scans via passing fully formed JSON payloads. The standard JSON format should contain:
- product-id
- event-id
- is-managed-service
- component-list

See appendix 1 for a provided example.
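A minimal illustrative request (the values are invented; `is-managed-service` is passed as a string to match the unit tests later in this change):
```shell script
curl -X POST http://localhost:8080/ \
  -H "Content-Type: application/json" \
  -d '{"product-id": "red-hat-quay", "event-id": "event-1", "is-managed-service": "TRUE", "component-list": "components"}'
```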

## /scanRequest - POST request for starting scans

There are several different types of build that should be retrieved from the backend source. Different inputs are required based on the build source.

The required fields for Brew builds are:
- buildSystemType
- brewId
- brewNVR - matches brewId
- pncId
- artifactType
- fileName
- builtFromSource

The required fields for git builds are:
- buildSystemType
- repository
- reference
- commitId

The required fields for PNC builds are:
- buildSystemType
- buildId

This information should allow us to have all the requirements for retrieving and then starting a scan when requested from the required sources. Example payloads for each build type are sketched below.
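Note that the dto payload classes in this change parse snake_case keys, so illustrative request bodies (values taken from the unit tests in TestPayload below) would look like:
```shell script
# Brew build
curl -X POST http://localhost:8080/scanRequest \
  -H "Content-Type: application/json" \
  -d '{"build_system_type": "brew", "brew_id": "1", "brew_nvr": "1.1.0", "pnc_id": "153", "artifact_type": "arti", "file_name": "myfile", "built_from_source": true}'

# git build
curl -X POST http://localhost:8080/scanRequest \
  -H "Content-Type: application/json" \
  -d '{"build_system_type": "git", "repository": "repo", "reference": "ref", "commit_id": "c6385a754421a57cd0a26ccba187cd687c8d1258"}'

# PNC build
curl -X POST http://localhost:8080/scanRequest \
  -H "Content-Type: application/json" \
  -d '{"build_system_type": "pnc", "build_id": "153"}'
```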
> **_NOTE:_** Quarkus now ships with a Dev UI, which is available in dev mode only at http://localhost:8080/q/dev/.

## /startScan - PUT request to start off the relevant scan

Only requires the scanId and should start off the relevant scan. Should return a success only once finished, or a failure if there is no further response after a timeout.

## /removeScan - DELETE request to remove a scan build from the DB

Only requires the scanId and should remove the relevant scan from our DB. Should return a success or failure. Example requests for both endpoints are sketched below.
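For illustration (the scan id and the shape of the startScan path are assumptions; note that the handler in `RemoveScan.java` below is actually mounted at `/deleteScan/{scanId}`, so the heading above may be stale):
```shell script
curl -X PUT http://localhost:8080/startScan/1
curl -X DELETE http://localhost:8080/deleteScan/1
```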

## Packaging and running the application

The application can be packaged using:
```shell script
./mvnw package
```
It produces the `quarkus-run.jar` file in the `target/quarkus-app/` directory.
Be aware that it's not an _über-jar_ as the dependencies are copied into the `target/quarkus-app/lib/` directory.

The application is now runnable using `java -jar target/quarkus-app/quarkus-run.jar`.

If you want to build an _über-jar_, execute the following command:
```shell script
./mvnw package -Dquarkus.package.type=uber-jar
```

The application, packaged as an _über-jar_, is now runnable using `java -jar target/*-runner.jar`.

# Expanded work to do

## Jenkins

We haven't yet looked into the correct way for the API to interact with Jenkins; this needs more investigation.
## Jira tickets still to do:
- https://issues.redhat.com/browse/PSSECMGT-1548
- https://issues.redhat.com/browse/PSSECMGT-1549
- https://issues.redhat.com/browse/PSSECMGT-1550
- https://issues.redhat.com/browse/PSSECMGT-1551
- https://issues.redhat.com/browse/PSSECMGT-1552
- https://issues.redhat.com/browse/PSSECMGT-1553
- https://issues.redhat.com/browse/PSSECMGT-1554

## Creating a native executable

You can create a native executable using:
```shell script
./mvnw package -Pnative
```

Or, if you don't have GraalVM installed, you can run the native executable build in a container using:
```shell script
./mvnw package -Pnative -Dquarkus.native.container-build=true
```
You can then execute your native executable with: `./target/code-with-quarkus-1.0.0-SNAPSHOT-runner`

If you want to learn more about building native executables, please consult https://quarkus.io/guides/maven-tooling.

## Related Guides

# Appendix

Appendix 1
@@ -0,0 +1,36 @@
includedir /etc/krb5.conf.d/

# depending on your config, you may wish to uncomment the following:
# includedir /var/lib/sss/pubconf/krb5.include.d/

[libdefaults]
default_realm = IPA.REDHAT.COM
dns_lookup_realm = true
dns_lookup_kdc = true
rdns = false
dns_canonicalize_hostname = false
ticket_lifetime = 24h
forwardable = true
udp_preference_limit = 1
default_ccache_name = KEYRING:persistent:%{uid}
max_retries = 1
kdc_timeout = 1500

[realms]

REDHAT.COM = {
  default_domain = redhat.com
  dns_lookup_kdc = true
  master_kdc = kerberos.corp.redhat.com
  admin_server = kerberos.corp.redhat.com
}

IPA.REDHAT.COM = {
  default_domain = ipa.redhat.com
  dns_lookup_kdc = true
  # Trust tickets issued by legacy realm on this host
  auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
  auth_to_local = DEFAULT
}

# DO NOT ADD A [domain_realms] section
# https://mojo.redhat.com/docs/DOC-1166841
@@ -0,0 +1,21 @@
# oc create route edge --service=osh --dry-run=client -o yaml > edgeroute.yml
apiVersion: route.openshift.io/v1
kind: Route
metadata:
  creationTimestamp: null
  labels:
    app.kubernetes.io/name: osh
    app.kubernetes.io/version: 1.0.0-SNAPSHOT
    app.openshift.io/runtime: quarkus
    env: stage
  name: osh
spec:
  port:
    targetPort: http
  tls:
    termination: edge
  to:
    kind: ""
    name: osh
    weight: null
status: {}
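The edge-terminated route can then be created from this file (a sketch; assumes you are logged in to the target project):
```shell script
oc apply -f edgeroute.yml
```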
@@ -0,0 +1,44 @@
# oc create configmap kerberos-config --from-file=linux-krb5.conf --dry-run=client -o yaml > kerberos-config.yaml
apiVersion: v1
data:
  linux-krb5.conf: |
    includedir /etc/krb5.conf.d/

    # depending on your config, you may wish to uncomment the following:
    # includedir /var/lib/sss/pubconf/krb5.include.d/

    [libdefaults]
    default_realm = IPA.REDHAT.COM
    dns_lookup_realm = true
    dns_lookup_kdc = true
    rdns = false
    dns_canonicalize_hostname = false
    ticket_lifetime = 24h
    forwardable = true
    udp_preference_limit = 1
    default_ccache_name = KEYRING:persistent:%{uid}
    max_retries = 1
    kdc_timeout = 1500

    [realms]

    REDHAT.COM = {
      default_domain = redhat.com
      dns_lookup_kdc = true
      master_kdc = kerberos.corp.redhat.com
      admin_server = kerberos.corp.redhat.com
    }

    IPA.REDHAT.COM = {
      default_domain = ipa.redhat.com
      dns_lookup_kdc = true
      # Trust tickets issued by legacy realm on this host
      auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
      auth_to_local = DEFAULT
    }

    # DO NOT ADD A [domain_realms] section
    # https://mojo.redhat.com/docs/DOC-1166841
kind: ConfigMap
metadata:
  creationTimestamp: null
  name: kerberos-config
@@ -0,0 +1,30 @@
from bs4 import BeautifulSoup
import requests
import re
import csv

results = {}

URL = "https://product-security.pages.redhat.com/offering-registry/"
r = requests.get(URL)

soup = BeautifulSoup(r.text, 'html.parser')
table = soup.find("table")
rows = table.find_all("tr")

for row in rows:
    # The second child of each <tr> is the cell holding the offering link
    cell = row.contents[1]
    # Skip the table header row
    if cell.text == 'Offering':
        continue
    # We extract the short name from the offering URL,
    # e.g. /offering-registry/offerings/<short-name>/
    re_search = re.search('/offering-registry/offerings/(.*)/', cell.contents[0].attrs["href"])
    results[re_search.group(1)] = cell.contents[0].text

print(results)

# Dump the short-name -> description mapping to CSV
with open('offerings.csv', 'w') as csv_file:
    writer = csv.writer(csv_file)
    for key, value in results.items():
        writer.writerow([key, value])
@@ -0,0 +1,126 @@
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-automation-platform','Ansible Automation Platform (AAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('advisor','Insights Advisor');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-aws','Ansible on AWS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-azure','Ansible on Azure');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-gcp','Ansible on GCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-wisdom-service','Ansible Wisdom Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cert-manager','cert-manager Operator for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('compliance','Insights Compliance');
INSERT INTO osh.offerings(offering_id,description) VALUES ('connected-customer-experience','Connected Customer Experience (CCX)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cost-management','Cost Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('custom-metric-autoscaler','OpenShift Custom Metrics Autoscaler');
INSERT INTO osh.offerings(offering_id,description) VALUES ('developer-sandbox-for-red-hat-openshift','Developer Sandbox for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('dotnet','.NET');
INSERT INTO osh.offerings(offering_id,description) VALUES ('drift','Insights Drift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eclipse-vertx','Red Hat build of Eclipse Vert.x');
INSERT INTO osh.offerings(offering_id,description) VALUES ('edge-management','Edge Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eventing','Insights Eventing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('fastdatapath','RHEL Fast Datapath');
INSERT INTO osh.offerings(offering_id,description) VALUES ('host-management-services','Host Management Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hosted-control-planes','Hosted Control Planes (Hypershift)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hybrid-application-console','Hybrid Application Console (HAC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('insights-essential','Insights Essentials');
INSERT INTO osh.offerings(offering_id,description) VALUES ('kernel-module-management','Kernel Module Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('logging-subsystem-for-red-hat-openshift','Logging Subsystem for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('lvms-operator','LVMS Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('malware-detection','Insights Malware Detection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('mgmt-platform','Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-applications','Migration Toolkit for Applications (MTA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-containers','Migration Toolkit for Containers (MTC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-runtimes','Migration Toolkit for Runtimes (MTR)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-virtualization','Migration Toolkit for Virtualization (MTV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('network-observability-operator','Network Observability Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-healthcheck-operator','Node HealthCheck Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-maintenance-operator','Node Maintenance Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('nvidia-gpu-add-on','NVIDIA GPU Add-On');
INSERT INTO osh.offerings(offering_id,description) VALUES ('oadp','OpenShift API for Data Protection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-platform','Openshift Container Platform (OCP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-storage','OpenShift Container Storage (OCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-data-foundation-managed-service','Red Hat OpenShift Data Foundation Managed Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-dedicated','OpenShift Dedicated (OSD/ROSA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-helm','OpenShift Developer Tools and Services (Helm)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-jenkins','OpenShift Developer Tools and Services (Jenkins)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-distributed-tracing','OpenShift Distributed Tracing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-on-azure','Openshift on Azure (ARO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-run-once-duration-override-operator','OpenShift Run Once Duration Override Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-sandboxed-containers','Openshift Sandboxed Containers');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-secondary-scheduler-operator','OpenShift Secondary Scheduler Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-servicemesh','OpenShift Service Mesh');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-virtualization','OpenShift Virtualization (CNV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-web-terminal-operator','OpenShift Web Terminal Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-winc','Windows Container Support for OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('patch','Insights Patch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('product-discovery','Product Discovery');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-3scale-api-management-platform','Red Hat 3scale API Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-advanced-cluster-management','Red Hat Advanced Cluster Management (RHACM)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-broker','Red Hat AMQ Broker');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-clients','Red Hat AMQ Clients');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-interconnect','Red Hat AMQ Interconnect');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-online','Red Hat AMQ Online');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-streams','Red Hat AMQ Streams');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-apicurio-registry','Red Hat build of Apicurio Registry (formerly known as Integration Service Registry)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-quarkus','Red Hat Build of Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-extensions-quarkus','Red Hat Camel Extensions for Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-k','Red Hat Camel K');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-spring-boot','Red Hat Camel for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-ceph-storage','Red Hat Ceph Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certificate-system','Red Hat Certificate System (RHCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certification-program','Red Hat Certification Program (rhcertification)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-code-quarkus','Red Hat Code Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-core-os','Red Hat CoreOS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-data-grid','Red Hat Data Grid');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-debezium','Red Hat Debezium');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-decision-manager','Red Hat Decision Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-hub','Red Hat Developer Hub');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-toolset','Red Hat Developer Toolset (DTS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-devtools-compilers','Red Hat Developer Tools (DevTools Compilers)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-directory-server','Red Hat Directory Server (RHDS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-10','Red Hat Enterprise Linux (RHEL) 10');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-6','Red Hat Enterprise Linux (RHEL) 6');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-7','Red Hat Enterprise Linux (RHEL) 7');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-8','Red Hat Enterprise Linux (RHEL) 8');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-9','Red Hat Enterprise Linux (RHEL) 9');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-fuse','Red Hat Fuse');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-gluster-storage','Red Hat Gluster Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-in-vehicle-os','Red Hat In-Vehicle Operating System (RHIVOS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-core-services','Red Hat JBoss Core Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-eap','Red Hat JBoss Enterprise Application Platform (EAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-web-server','Red Hat JBoss Web Server');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-observability-service','Red Hat Observability Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-database-access','Red Hat OpenShift Database Access');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-shift-data-science','Red Hat OpenShift Data Science (RHODS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openjdk','Red Hat OpenJDK');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-api-management','Red Hat OpenShift API Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-builds-v2','Red Hat OpenShift Builds V2');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-connectors','Red Hat OpenShift Connectors (RHOC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-control-plane-service','Red Hat OpenShift Control Plane Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-data-foundation','Red Hat OpenShift Data Foundation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-dev-spaces','Red Hat OpenShift Dev Spaces');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-gitops','Red Hat OpenShift GitOps');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-local','Red Hat OpenShift Local');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-pipelines','Red Hat OpenShift Pipelines');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-serverless','Red Hat OpenShift Serverless');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-service-registry','Red Hat OpenShift Service Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-streams-apache-kafka','Red Hat OpenShift Streams for Apache Kafka (RHOSAK)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openstack-platform','Red Hat OpenStack Platform (RHOSP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-optaplanner','Red Hat Optaplanner');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-plug-ins-for-backstage','Red Hat Plug-ins for Backstage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-process-automation-manager','Red Hat Process Automation Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quarkus-registry','Red Hat Quarkus Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quay','Red Hat Quay');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-satellite','Red Hat Satellite');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-service-interconnect','Red Hat Service Interconnect (formerly known as Application Interconnect)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-single-sign-on','Red Hat Single Sign-On (RHSSO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-software-collections','Red Hat Software Collections');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-support-for-spring-boot','Red Hat support for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-trusted-application-pipeline','Red Hat Trusted Application Pipeline (RHTAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-update-infrastructure','Red Hat Update Infrastructure (RHUI)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-virtualization','Red Hat Virtualization');
INSERT INTO osh.offerings(offering_id,description) VALUES ('resource-optimization','Insights Resource Optimization (ROS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rh-vulnerability-for-ocp','Insights Vulnerability for OCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rhacs','Red Hat Advanced Cluster Security for Kubernetes (RHACS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('self-node-remediation','Self Node Remediation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-central','Subscription Central');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-watch','Subscription Watch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('telco-sw-components','Telco SW Components');
INSERT INTO osh.offerings(offering_id,description) VALUES ('vulnerability','Vulnerability');
@@ -0,0 +1,81 @@
CREATE SCHEMA osh;

GRANT USAGE ON SCHEMA osh TO postgres;

CREATE TABLE IF NOT EXISTS osh.offerings(
    offering_id VARCHAR(100),
    description VARCHAR(200),
    PRIMARY KEY (offering_id)
);

CREATE TABLE IF NOT EXISTS osh.results(
    results_id SERIAL,
    datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
    state BOOLEAN,
    logs bytea,
    task_reference VARCHAR(50),
    PRIMARY KEY (results_id)
);

CREATE TABLE IF NOT EXISTS osh.scans(
    scan_id SERIAL,
    offering_id VARCHAR(100),
    event_id VARCHAR(100) NOT NULL,
    is_managed_service BOOLEAN NOT NULL,
    component_list VARCHAR(100),
    datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
    owner VARCHAR(50) NOT NULL,
    -- plain INTEGER: SERIAL would attach an unwanted auto-increment sequence to a foreign-key column
    results INTEGER,
    status VARCHAR(50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
    last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
    PRIMARY KEY(scan_id),
    FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
    FOREIGN KEY (results) REFERENCES osh.results(results_id)
);

CREATE TABLE IF NOT EXISTS osh.archive(
    scan_id SERIAL,
    offering_id VARCHAR(100),
    event_id VARCHAR(100) NOT NULL,
    is_managed_service BOOLEAN NOT NULL,
    component_list VARCHAR(100),
    datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
    owner VARCHAR(50) NOT NULL,
    -- plain INTEGER: see the note on osh.scans above
    results INTEGER,
    status VARCHAR(50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
    last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
    PRIMARY KEY(scan_id),
    FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
    FOREIGN KEY (results) REFERENCES osh.results(results_id)
);

CREATE TABLE IF NOT EXISTS osh.gitscans (
    id SERIAL,
    build_system_type VARCHAR(80),
    repository VARCHAR(150),
    reference VARCHAR(100),
    commit_id VARCHAR(100),
    -- SHA256 has a length of 256 bits, so 256 bits would represent 64 hex characters
    hashsum VARCHAR(64),
    PRIMARY KEY(id)
);

CREATE TABLE IF NOT EXISTS osh.pncscans(
    id SERIAL,
    build_system_type VARCHAR(80),
    build_id VARCHAR(100),
    PRIMARY KEY(id)
);

CREATE TABLE IF NOT EXISTS osh.brewscans(
    id SERIAL,
    build_system_type VARCHAR(80),
    brew_id VARCHAR(100),
    brew_nvr VARCHAR(100),
    pnc_id VARCHAR(100),
    artifact_type VARCHAR(100),
    file_name VARCHAR(100),
    built_from_source BOOLEAN,
    PRIMARY KEY(id)
);
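Assuming the schema and seed data above are saved as schema.sql and offerings.sql (the file names here are illustrative), they can be loaded into the local dev instance with:
```shell script
psql -h localhost -U postgresql -d scandb -f schema.sql
psql -h localhost -U postgresql -d scandb -f offerings.sql
```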
@@ -1,23 +1,20 @@
 package dto;
 
-import org.eclipse.microprofile.config.ConfigProvider;
 // import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
 // import org.jboss.pnc.api.dto.HeartbeatConfig;
 // import org.jboss.pnc.api.dto.Request;
 
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.nio.charset.StandardCharsets;
-import java.sql.Struct;
-import java.util.*;
-
-import org.json.JSONObject;
-import org.json.JSONArray;
-
-import static constants.HttpHeaders.AUTHORIZATION_STRING;
+import org.json.JSONException;
+import org.json.JSONObject;
 
 public class BrewObjPayload {
-    public static BrewObj constructScanPayload(JSONObject brewObj) throws URISyntaxException {
-        return new BrewObj(brewObj.getString("buildSystemType"), brewObj.getString("brewId"), brewObj.getString("brewNvr"), brewObj.getString("pncId"), brewObj.getString("artifactType"), brewObj.getString("fileName"), brewObj.getString("builtFromSource"));
-    }
-}
+
+    public static BrewObj constructScanPayload(JSONObject jsonObj) throws JSONException {
+        return new BrewObj(
+                jsonObj.getString("build_system_type"),
+                jsonObj.getString("brew_id"),
+                jsonObj.getString("brew_nvr"),
+                jsonObj.getString("pnc_id"),
+                jsonObj.getString("artifact_type"),
+                jsonObj.getString("file_name"),
+                jsonObj.getBoolean("built_from_source"));
+    }
+
+    private BrewObjPayload() {}
+}
@@ -1,35 +1,25 @@
 package dto;
 
-import constants.PSGQL;
+import org.json.JSONException;
 
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.SQLException;
 
-import static constants.PSGQL.user;
-import static constants.PSGQL.password;
-import static constants.PSGQL.url;
+import static constants.PSGQL.*;
 
 // @TODO Replace hard-coded credentials; make use of our secure db connection practice
 
-public class ConnectDB{
-    // private final String url = "jdbc:postgresql://localhost:5432/scandb";
-    // private final String user = "postgres";
-    // private final String password = "password";
+public class ConnectDB {
 
     /**
      * Connect to the PostgreSQL database
      *
      * @return a Connection object
      */
-    public Connection connect() {
-        Connection conn = null;
+    public Connection connect() throws JSONException {
         try {
-            conn = DriverManager.getConnection(url, user, password);
-            System.out.println("Connected to the PostgreSQL server successfully.");
+            Connection conn = DriverManager.getConnection(url, user, password);
+            System.out.println("Connected to PostgreSQL server");
+            return conn;
         } catch (SQLException e) {
             System.out.println(e.getMessage());
         }
-
-        return conn;
+        return null;
     }
 }
@@ -1,23 +1,17 @@
 package dto;
 
-import org.eclipse.microprofile.config.ConfigProvider;
 // import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
 // import org.jboss.pnc.api.dto.HeartbeatConfig;
 // import org.jboss.pnc.api.dto.Request;
 
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.nio.charset.StandardCharsets;
-import java.sql.Struct;
-import java.util.*;
-
-import org.json.JSONObject;
-import org.json.JSONArray;
-
-import static constants.HttpHeaders.AUTHORIZATION_STRING;
+import org.json.JSONException;
+import org.json.JSONObject;
 
 public class GitObjPayload {
-    public static GitObj constructScanPayload(JSONObject gitObj) throws URISyntaxException {
-        return new GitObj(gitObj.getString("buildSystemType"), gitObj.getString("repository"), gitObj.getString("reference"), gitObj.getString("commitId"));
-    }
-}
+
+    public static GitObj constructScanPayload(JSONObject jsonObj) throws JSONException {
+        return new GitObj(
+                jsonObj.getString("build_system_type"),
+                jsonObj.getString("repository"),
+                jsonObj.getString("reference"),
+                jsonObj.getString("commit_id"));
+    }
+
+    private GitObjPayload() {}
+}
@@ -1,23 +1,15 @@
 package dto;
 
-import org.eclipse.microprofile.config.ConfigProvider;
 // import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
 // import org.jboss.pnc.api.dto.HeartbeatConfig;
 // import org.jboss.pnc.api.dto.Request;
 
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.nio.charset.StandardCharsets;
-import java.sql.Struct;
-import java.util.*;
-
-import org.json.JSONObject;
-import org.json.JSONArray;
-
-import static constants.HttpHeaders.AUTHORIZATION_STRING;
+import org.json.JSONException;
+import org.json.JSONObject;
 
 public class PncObjPayload {
-    public static PncObj constructScanPayload(JSONObject pncObj) throws URISyntaxException {
-        return new PncObj(pncObj.getString("buildSystemType"), pncObj.getString("buildId"));
-    }
-}
+
+    public static PncObj constructScanPayload(JSONObject jsonObj) throws JSONException {
+        return new PncObj(
+                jsonObj.getString("build_system_type"),
+                jsonObj.getString("build_id"));
+    }
+
+    private PncObjPayload() {}
+}
@@ -1,23 +1,17 @@
 package dto;
 
-import org.eclipse.microprofile.config.ConfigProvider;
 // import org.jboss.pnc.api.deliverablesanalyzer.dto.AnalyzePayload;
 // import org.jboss.pnc.api.dto.HeartbeatConfig;
 // import org.jboss.pnc.api.dto.Request;
 
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.nio.charset.StandardCharsets;
-import java.sql.Struct;
-import java.util.*;
-
-import org.json.JSONObject;
-import org.json.JSONArray;
-
-import static constants.HttpHeaders.AUTHORIZATION_STRING;
+import org.json.JSONException;
+import org.json.JSONObject;
 
 public class ScanObjPayload {
-    public static ScanObj constructScanPayload(JSONObject scanObj) throws URISyntaxException {
-        return new ScanObj(scanObj.getString("scanId"), scanObj.getString("productId"), scanObj.getString("eventId"), scanObj.getString("isManagedService"), scanObj.getString("componentList"));
-    }
-}
+
+    public static ScanObj constructScanPayload(JSONObject jsonObj) throws JSONException {
+        return new ScanObj(
+                jsonObj.getString("scan_id"),
+                jsonObj.getString("offering_id"),
+                jsonObj.getString("event_id"),
+                jsonObj.getString("is_managed_service"),
+                jsonObj.getString("component_list"));
+    }
+
+    private ScanObjPayload() {}
+}
@@ -1,70 +1,43 @@
 package rest;
 
 import dto.ConnectDB;
-
-import org.eclipse.microprofile.rest.client.inject.RestClient;
+import dto.ScanObj;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
 
-import javax.inject.Inject;
-import javax.validation.Valid;
-import javax.ws.rs.Consumes;
-import javax.ws.rs.POST;
 import javax.ws.rs.Path;
-import javax.ws.rs.PUT;
 import javax.ws.rs.DELETE;
-import java.net.URI;
-import java.net.URISyntaxException;
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-import java.util.UUID;
-import org.json.JSONObject;
-import org.json.JSONArray;
-import dto.ScanObj;
-import dto.ConnectDB;
-import dto.ScanObjPayload;
-
 import javax.ws.rs.PathParam;
 
-import static constants.HttpHeaders.AUTHORIZATION_STRING;
 import java.sql.Connection;
-import java.sql.DriverManager;
+import java.sql.PreparedStatement;
 import java.sql.SQLException;
 
-import java.sql.Connection;
-import java.sql.DriverManager;
-import java.sql.ResultSet;
-import java.sql.Statement;
-
 @Path("/deleteScan")
 public class RemoveScan {
 
-    // @Inject
-    @RestClient
-    CreateScanService createScanService;
-    // ScanObjPayload scanObjPayload;
+    private static final Logger logger = LoggerFactory.getLogger(RemoveScan.class);
 
     @DELETE
     @Path("/{scanId}")
-    public boolean invokeScanAnalyze(@PathParam("scanId") String scanId) throws URISyntaxException {
-        ConnectDB connectDB = new ConnectDB();
-        Connection conn = connectDB.connect();
-        //this is ugly, needs to be rewritten
-        Statement stmt = null;
-        ScanObj finalScan = null;
-        //fix this
-        Boolean success = false;
-        String sql = "DELETE FROM scans WHERE scanid=" + scanId;
-        //need to figure out an archive system and whether it's necessary (archive value??)
-        try{
-            stmt = conn.createStatement();
-            //TODO add proper checks
-            stmt.executeUpdate(sql);
-            //send task to the actual interface here using the resultset returned (should multiple scanids be allowed):
-            //once the task is complete AND we have confirmation that the scan is done run the following sql
-            conn.close();
-        } catch (SQLException e){
-            System.out.println(e);
-        }
-        success = true;
-        return success;
+    public boolean invokeScanAnalyze(@PathParam("scanId") String scanId) {
+        boolean rc = false;
+        //send task to the actual interface here using the resultset returned (should multiple scanids be allowed):
+        //once the task is complete AND we have confirmation that the scan is done run the following sql
+        String qry = "DELETE FROM scans WHERE scan_id=?";
+        ConnectDB connectDB = new ConnectDB();
+        try(Connection conn = connectDB.connect();
+            PreparedStatement pstmt = conn.prepareStatement(qry)) {
+            pstmt.setString(1, scanId);
+            pstmt.executeUpdate();
+            rc = true;
+        } catch (SQLException e) {
+            logger.error(e.getMessage());
+        }
+        return rc;
     }
 }
@@ -0,0 +1,36 @@
package rest;

import dto.ConnectDB;
import dto.ScanObj;
import io.quarkiverse.kerberos.KerberosPrincipal;
import io.quarkus.security.Authenticated;
import io.quarkus.security.identity.SecurityIdentity;

import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Set;
import javax.ws.rs.Produces;

@Path("/testKerberos")
@Authenticated
public class UsersResource {
    @Inject
    SecurityIdentity identity;
    @Inject
    KerberosPrincipal kerberosPrincipal;

    @GET
    @Path("/me")
    @Produces("text/plain")
    public String me() {
        return identity.getPrincipal().getName();
    }
}
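A quick way to exercise this endpoint once deployed, assuming you hold a valid ticket and your curl has GSS-API/SPNEGO support (the hostname matches the service principal in application.properties below):
```shell script
kinit <user>@IPA.REDHAT.COM
curl --negotiate -u : https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/testKerberos/me
```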
@@ -1,7 +1,40 @@
# Example deploy - mvn deploy -Dquarkus.profile=stage -Dquarkus.kubernetes.deploy=true
# quarkus.rest-client."rest.CreateScanService".url=https://localhost:8080/
# quarkus.rest-client."rest.CreateScanService".scope=javax.inject.Singleton

# couchdb.name=scan-results
# couchdb.url=https://localhost:5984

# quarkus.hibernate-orm.database.generation=drop-and-create

%dev.quarkus.kerberos.keytab-path=HTTP_osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM.keytab
%dev.quarkus.kerberos.service-principal-name=HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM

%stage.quarkus.openshift.name=osh
%stage.quarkus.openshift.labels.env=stage
%stage.quarkus.log.level=DEBUG

# Only in Quarkus > 3.x
%stage.quarkus.openshift.route.tls.termination=edge
# As we can't create an edge-terminated route (Quarkus < 3.x), disable route creation for now
%stage.quarkus.openshift.route.expose=false
%stage.quarkus.openshift.route.target-port=https
%stage.quarkus.openshift.route.tls.insecure-edge-termination-policy=redirect

##########################################
#           Kerberos Specifics           #
##########################################
%stage.quarkus.openshift.secret-volumes.osh-wrapper.secret-name=kerberos-keytab-osh
%stage.quarkus.openshift.mounts.osh-wrapper.path=/kerberos
%stage.quarkus.openshift.mounts.osh-wrapper.read-only=true
%stage.quarkus.kerberos.keytab-path=/kerberos/kerberos-keytab-osh
%stage.quarkus.kerberos.service-principal-name=HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM

%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.path=/etc/krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.sub-path=linux-krb5.conf
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.config-map-name=kerberos-config
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.items."linux-krb5.conf".path=linux-krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.read-only=true
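For the stage profile above to work, the keytab secret and krb5 ConfigMap these properties reference have to exist in the project. A sketch of creating them (the local keytab file name is illustrative; the secret key must be `kerberos-keytab-osh` to match the mounted keytab path, and the ConfigMap command is the one recorded in kerberos-config.yaml above):
```shell script
oc create secret generic kerberos-keytab-osh --from-file=kerberos-keytab-osh=service.keytab
oc create configmap kerberos-config --from-file=linux-krb5.conf
```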
@@ -0,0 +1,107 @@
package dto;

import org.json.JSONObject;
import org.junit.jupiter.api.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import static org.junit.jupiter.api.Assertions.*;

class TestPayload {

    private static final Logger logger = LoggerFactory.getLogger(TestPayload.class);

    @Test
    void TestBrew() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("build_system_type", "brew");
        jsonObject.put("brew_id", "1");
        jsonObject.put("brew_nvr", "1.1.0");
        jsonObject.put("pnc_id", "153");
        jsonObject.put("artifact_type", "arti");
        jsonObject.put("file_name", "myfile");
        jsonObject.put("built_from_source", true);

        BrewObj brewObj1 = BrewObjPayload.constructScanPayload(jsonObject);
        BrewObj brewObj2 = new BrewObj(
                jsonObject.getString("build_system_type"),
                jsonObject.getString("brew_id"),
                jsonObject.getString("brew_nvr"),
                jsonObject.getString("pnc_id"),
                jsonObject.getString("artifact_type"),
                jsonObject.getString("file_name"),
                jsonObject.getBoolean("built_from_source"));

        logger.info("BrewObj1: " + brewObj1.toString());
        logger.info("BrewObj2: " + brewObj2.toString());
        assertEquals(brewObj1.getBuildSystemType(), brewObj2.getBuildSystemType());
        assertEquals(brewObj1.getBrewId(), brewObj2.getBrewId());
        assertEquals(brewObj1.getBrewNvr(), brewObj2.getBrewNvr());
        assertEquals(brewObj1.getPncId(), brewObj2.getPncId());
        assertEquals(brewObj1.getArtifactType(), brewObj2.getArtifactType());
        assertEquals(brewObj1.getFileName(), brewObj2.getFileName());
        assert(brewObj1.getBuiltFromSource() == brewObj2.getBuiltFromSource());
    }

    @Test
    void TestGit() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("build_system_type", "git");
        jsonObject.put("repository", "repo");
        jsonObject.put("reference", "ref");
        jsonObject.put("commit_id", "c6385a754421a57cd0a26ccba187cd687c8d1258");

        GitObj gitObj1 = GitObjPayload.constructScanPayload(jsonObject);
        GitObj gitObj2 = new GitObj(
                jsonObject.getString("build_system_type"),
                jsonObject.getString("repository"),
                jsonObject.getString("reference"),
                jsonObject.getString("commit_id"));
        logger.info("GitObj1: " + gitObj1.toString());
        logger.info("GitObj2: " + gitObj2.toString());
        assertEquals(gitObj1.getBuildSystemType(), gitObj2.getBuildSystemType());
        assertEquals(gitObj1.getRepository(), gitObj2.getRepository());
        assertEquals(gitObj1.getReference(), gitObj2.getReference());
        assertEquals(gitObj1.getCommitId(), gitObj2.getCommitId());
    }

    @Test
    void TestPnc() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("build_system_type", "pnc");
        jsonObject.put("build_id", "153");

        PncObj pncObj1 = PncObjPayload.constructScanPayload(jsonObject);
        PncObj pncObj2 = new PncObj(
                jsonObject.getString("build_system_type"),
                jsonObject.getString("build_id"));
        logger.info("PncObj1: " + pncObj1.toString());
        logger.info("PncObj2: " + pncObj2.toString());
        assertEquals(pncObj1.getBuildSystemType(), pncObj2.getBuildSystemType());
        assertEquals(pncObj1.getBuildId(), pncObj2.getBuildId());
    }

    @Test
    void TestScan() {
        JSONObject jsonObject = new JSONObject();
        jsonObject.put("scan_id", "ABC");
        jsonObject.put("offering_id", "product#");
        jsonObject.put("event_id", "event#");
        jsonObject.put("is_managed_service", "TRUE");
        jsonObject.put("component_list", "components");

        ScanObj scanObj1 = ScanObjPayload.constructScanPayload(jsonObject);
        ScanObj scanObj2 = new ScanObj(
                jsonObject.getString("scan_id"),
                jsonObject.getString("offering_id"),
                jsonObject.getString("event_id"),
                jsonObject.getString("is_managed_service"),
                jsonObject.getString("component_list"));
        logger.info("ScanObj1: " + scanObj1.toString());
        logger.info("ScanObj2: " + scanObj2.toString());
        assertEquals(scanObj1.getScanId(), scanObj2.getScanId());
        assertEquals(scanObj1.getProductId(), scanObj2.getProductId());
        assertEquals(scanObj1.getEventId(), scanObj2.getEventId());
        assertEquals(scanObj1.getIsManagedService(), scanObj2.getIsManagedService());
        assertEquals(scanObj1.getComponentList(), scanObj2.getComponentList());
    }
}