Compare commits

...

81 Commits

Author SHA1 Message Date
Jonathan Christison 5ca4917bf4 Fix app properties service account reference 3 years ago
Jonathan Christison f66b910de7 Use SA in general profile 3 years ago
Jonathan Christison 7d61509b12 Changing this to Blocking as the vertx thread is timing out 3 years ago
Jonathan Christison 69445ad0d4 Use stage env DB setup 3 years ago
Jonathan Christison bfd887c40e Add hack/examples of calling single endpoints 3 years ago
Jonathan Christison 43d06684fb Make task handler public so vertx can trigger it 3 years ago
Jonathan Christison bae9ca9be8 Add quarkus-kubernetes-config for DB setup on stage 3 years ago
Jonathan Christison 3f1e048fe6 Use postgres deployment in pct-sec namepsace 3 years ago
Jonathan Christison c0e9019fd6 Add missing class 3 years ago
Jonathan Christison 5c96690241 Add authenticated annotation to endpoints that have actions 3 years ago
Jonathan Christison 24059710cb Remove JSON elements when NULL 3 years ago
Jonathan Christison c724442abb We dont need to return every field, especially private ones 3 years ago
Jonathan Christison 707e9d4fbb Fix cascade for brew and pnc types 3 years ago
Jonathan Christison 603dc500de Hacking around eventBus injection by passing it to be set 3 years ago
Jonathan Christison be2f54c9c0 Set some fields to be public for json return 3 years ago
Jonathan Christison 2fd582e2ca ORM kinda working 3 years ago
Jonathan Christison 354745f7a0 Still adding ORM 3 years ago
Jonathan Christison 72e3b0f0b3 Start adding picocli as way of verifying CLI args 3 years ago
Jonathan Christison 7fffbe22ba Decouple ScanRequest from tekton runs 3 years ago
Jonathan Christison 5149a807f3 Fix typo add swagger-ui 3 years ago
Jonathan Christison 64c84c8764 Call run on the single-git-scan.sh 3 years ago
Jonathan Christison 85c798b0a7 Add pipeline run code for demo 3 years ago
Jonathan Christison 045df604a0 Start adding ORM support 3 years ago
Jonathan Christison 5b76dcfb83 Change SA and RBAC 3 years ago
Jonathan Christison 9e9d3b36ed Run under osh-wrapper-client-sa 3 years ago
Jonathan Christison a15014e106 Example usage 3 years ago
Jonathan Christison 6b9aa75213 Add tekton task, use defaultClient, injection woes 3 years ago
Jonathan Christison c3aa30b2b4 Add jackson support 3 years ago
Jonathan Christison 24660d8f85 Start adding validation and some primative endpoints 3 years ago
Jonathan Christison 5f8498bb0c Remove implementation to start again 3 years ago
Nicholas Caughey b073fa4c73 Merge branch 'k8s_organisation' into 'main' 3 years ago
Jonathan Christison c9bf460cad Now SA has been extended with oshwrapper pull secret switch to it for 3 years ago
Jonathan Christison 91d7911338 Link osh SA created by quarkus with osh-wrapper-client-pull secrets 3 years ago
Jonathan Christison 4a4892c9ba Remove service account declartion in application properties 3 years ago
Jonathan Christison e773802b00 Move to hack dir 3 years ago
Jonathan Christison 7c674c215c Link to the osh-wrapper client service account 3 years ago
Jonathan Christison 23499fd7e8 Bind existing SA and tekton role 3 years ago
Jonathan Christison b39b28dc65 Clean up of yaml files before kustomization 3 years ago
Nicholas Caughey 5c3671a2c6 fixed every warning from quarkus except the split package issue which can be tracked with issue#4 3 years ago
Nicholas Caughey 641bac76cb fixing duplicate dependencys 3 years ago
Nicholas Caughey 1d97dcbbdb add docker file to k8 3 years ago
Nicholas Caughey 4559146ad1 Merge branch 'openshift_inject_rte_fix' into 'main' 3 years ago
Jonathan Christison 02c461395c Fix problem with openshift deploy and jakarta injection 3 years ago
Nicholas Caughey 12f983cd22 Merge branch 'local_dev_fixes' into 'main' 3 years ago
Jonathan Christison 9d76e78393 Disable Kerberos and other Auths in dev profile 3 years ago
Nicholas Caughey 865c628bb2 fixed the pom and other issues from a bad merge 3 years ago
Nicholas Caughey c0b2f78719 Merge branch 'brewbuild' into 'main' 3 years ago
Nicholas Caughey e35882520f added the brewnvr innvocation 3 years ago
Nicholas Caughey f7853db788 Merge branch 'exception_work' into 'main' 3 years ago
Nicholas Caughey 275014b89a Merge branch 'main' into 'exception_work' 3 years ago
Nicholas Caughey 5ec9e2392f Merge branch 'db_schema' into 'main' 3 years ago
Nicholas Caughey 5246c2f604 Merge branch 'main' into 'db_schema' 3 years ago
Nicholas Caughey 5de2051b86 updated exceptions to throw on unimplemented functionality 3 years ago
Nicholas Caughey 0173d5e9bc added exceptions to unimplemented code 3 years ago
Nicholas Caughey 1c1007b811 changing the groupid to be associated with the project 3 years ago
Nicholas Caughey 0d7678a990 Merge branch 'kerberos_auth' into 'main' 3 years ago
Jonathan Christison fa4ea264e2 Add a comment on how the file was created 3 years ago
Jonathan Christison e755fe945c Use edge TLS termination 3 years ago
Jonathan Christison c15a0c5ee1 Add example deploy and set TLS to edge 3 years ago
Jonathan Christison b1942b512a Change kerberos settings 3 years ago
Jonathan Christison e3fcecac06 Change to osh rather than osh-stage 3 years ago
Jonathan Christison 2e38ec0461 Add krb5.conf to container as config map 3 years ago
Jonathan Christison 4526231088 Secure volume mount example 3 years ago
Jonathan Christison 1081288418 Hacky attempt at adding DB Dev services for local development 3 years ago
Leonid Bossis 63fef64f31 checkpoint #2 3 years ago
Leonid Bossis a178a7fc18 add logging facility 3 years ago
Leonid Bossis 8975fff63d change db table field names from mixed naming convention to pythonic convention xxx_yyy_zzz and stop using CamelCase 3 years ago
Leonid Bossis c61e6fb0f6 checkpoint #1 3 years ago
jperezde fee2bd340f Added Kerberos auth to methods 3 years ago
jperezde 1ab0639941 Test keytab 3 years ago
jperezde af4a80b04a Added Kerberos dependency 3 years ago
Leonid Bossis c6385a7544 First changes after code review, making use of prepared statements, code cleanup 3 years ago
jperezde d3e2990851 Modified application.properties 3 years ago
jperezde 94d72b95c8 Added kerberos dependendency in pom.xml 3 years ago
jperezde 22c0be081b Test install Kerberos 3 years ago
jperezde 6df7da6c10 Test Dependency 3 years ago
Juan Perez de Algaba bb63891276 Update schema.sql to replace covscan for osh 3 years ago
Juan Perez de Algaba 674e248c1d Update modified covscanrest for osh 3 years ago
jperezde f3c9338181 Modified schema, create scraper and populate file for offerings file 3 years ago
Juan Perez de Algaba f502e758c7 Uploaded schema file 3 years ago
Juan Perez de Algaba 9449cf7fd0 Add new directory for db schema 3 years ago
  1. .gitignore (12)
  2. README.md (127)
  3. docker/osh-client/Dockerfile (16)
  4. hack/osh-client-from-source-pipeline-run.yaml (99)
  5. hack/pssaas-request-curl.sh (4)
  6. hack/sample-pssaas-bad.json (13)
  7. hack/sample-pssaas.json (15)
  8. hack/single-brew-scan-stage.sh (9)
  9. hack/single-brew-scan.sh (9)
  10. hack/single-git-scan-stage.sh (10)
  11. hack/single-git-scan.sh (10)
  12. hack/stage-debug-pod.yaml (48)
  13. k8s/stage/app/edge-route.yaml (21)
  14. k8s/stage/app/kerberos-config.yaml (44)
  15. k8s/stage/app/linux-krb5.conf (36)
  16. k8s/stage/app/service-account.yaml (15)
  17. k8s/stage/app/tekton-rbac.yaml (31)
  18. k8s/stage/osh-client-tekton/osh-client-config.yaml (79)
  19. k8s/stage/osh-client-tekton/osh-client-pvc.yaml (28)
  20. k8s/stage/osh-client-tekton/pipline/osh-client-from-source-pipeline.yaml (98)
  21. k8s/stage/osh-client-tekton/task/osh-client-from-source-clearup-workspace.yaml (25)
  22. k8s/stage/osh-client-tekton/task/osh-client-from-source.yaml (78)
  23. k8s/stage/osh-client-tekton/task/osh-client-git-cli-modified.yaml (146)
  24. k8s/stage/osh-client-tekton/task/osh-scan-task.yaml (63)
  25. mvnw (310)
  26. mvnw.cmd (182)
  27. pom.xml (175)
  28. schema/.gitkeep (0)
  29. schema/OffRegScraper.py (30)
  30. schema/populate.sql (126)
  31. schema/schema.sql (81)
  32. src/main/docker/Dockerfile.jvm (12)
  33. src/main/docker/Dockerfile.legacy-jar (12)
  34. src/main/docker/Dockerfile.native (4)
  35. src/main/docker/Dockerfile.native-micro (4)
  36. src/main/java/com/redhat/pctsec/model/BrewBuild.java (37)
  37. src/main/java/com/redhat/pctsec/model/BuildType.java (35)
  38. src/main/java/com/redhat/pctsec/model/Git.java (29)
  39. src/main/java/com/redhat/pctsec/model/PNCBuild.java (39)
  40. src/main/java/com/redhat/pctsec/model/RequestType.java (3)
  41. src/main/java/com/redhat/pctsec/model/Scan.java (116)
  42. src/main/java/com/redhat/pctsec/model/ScanRequest.java (110)
  43. src/main/java/com/redhat/pctsec/model/ScanRequests.java (111)
  44. src/main/java/com/redhat/pctsec/model/ScanResult.java (19)
  45. src/main/java/com/redhat/pctsec/model/ScanTask.java (78)
  46. src/main/java/com/redhat/pctsec/model/ScanTaskState.java (3)
  47. src/main/java/com/redhat/pctsec/model/api/request/Component.java (17)
  48. src/main/java/com/redhat/pctsec/model/api/request/ComponentJsonDeserializer.java (31)
  49. src/main/java/com/redhat/pctsec/model/api/request/build.java (49)
  50. src/main/java/com/redhat/pctsec/model/api/request/git.java (53)
  51. src/main/java/com/redhat/pctsec/model/api/request/pssaas.java (70)
  52. src/main/java/com/redhat/pctsec/model/api/request/scanChain.java (4)
  53. src/main/java/com/redhat/pctsec/model/jpa/ScanRepository.java (16)
  54. src/main/java/com/redhat/pctsec/model/jpa/ScanRequestRepository.java (18)
  55. src/main/java/com/redhat/pctsec/model/jpa/ScanRequestsRepository.java (18)
  56. src/main/java/com/redhat/pctsec/model/jpa/UriConverter.java (22)
  57. src/main/java/com/redhat/pctsec/model/osh/paramMapper.java (72)
  58. src/main/java/com/redhat/pctsec/rest/v1alpha1/Kerberos.java (27)
  59. src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanRequestResource.java (50)
  60. src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanRequestsResource.java (41)
  61. src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanResource.java (122)
  62. src/main/java/com/redhat/pctsec/tekton/TaskHandler.java (139)
  63. src/main/java/com/redhat/pctsec/tekton/brewTaskRun.java (49)
  64. src/main/java/com/redhat/pctsec/tekton/scmUrlPipelineRun.java (74)
  65. src/main/java/constants/HttpHeaders.java (92)
  66. src/main/java/constants/PSGQL.java (7)
  67. src/main/java/dto/BrewObj.java (26)
  68. src/main/java/dto/BrewObjPayload.java (23)
  69. src/main/java/dto/ConnectDB.java (35)
  70. src/main/java/dto/GitObj.java (23)
  71. src/main/java/dto/GitObjPayload.java (23)
  72. src/main/java/dto/PncObj.java (21)
  73. src/main/java/dto/PncObjPayload.java (23)
  74. src/main/java/dto/ScanInterface.java (9)
  75. src/main/java/dto/ScanObj.java (26)
  76. src/main/java/dto/ScanObjPayload.java (23)
  77. src/main/java/rest/CreateGetResource.java (85)
  78. src/main/java/rest/CreateScanRequest.java (109)
  79. src/main/java/rest/CreateScanResource.java (58)
  80. src/main/java/rest/CreateScanService.java (16)
  81. src/main/java/rest/CreateStartScan.java (76)
  82. src/main/java/rest/RemoveScan.java (70)
  83. src/main/java/rest/Scan.java (43)
  84. src/main/java/rest/StoreData.java (124)
  85. src/main/java/rest/TektonResourceClient.java (20)
  86. src/main/java/rest/TektonTaskCreate.java (175)
  87. src/main/java/rest/callTekton.java (116)
  88. src/main/resources/META-INF/resources/index.html (288)
  89. src/main/resources/Scan.hbm.xml (19)
  90. src/main/resources/application.properties (79)
  91. src/main/resources/baseScan.yml (16)
  92. src/main/resources/hibernate.cfg.xml (21)
  93. src/test/java/com/redhat/pctsec/model/osh/paramMapperTest.java (22)

12
.gitignore vendored

@@ -0,0 +1,12 @@
.dcignore
.idea
*.iml
dev/
# Maven
target/
pom.xml.tag
pom.xml.releaseBackup
pom.xml.versionsBackup
release.properties

127
README.md

@@ -1,112 +1,51 @@
See https://docs.google.com/document/d/15yod6K_ZbNkJ_ern7gwpxjBkdJIlHXORfYZ3CGQhnEM/edit?usp=sharing for a full version with images
# code-with-quarkus
# Introduction
Currently, we rely on CPaaS to submit requests to PSSaaS, which then invokes the PSSC scanning container. The idea behind the ScanChain API is to act as an interaction point for services to directly access our scan tooling.
This project uses Quarkus, the Supersonic Subatomic Java Framework.
Our API will be written in Quarkus for ease of use and deployment to OpenShift; we will also use Tekton to assist with CI/CD.
If you want to learn more about Quarkus, please visit its website: https://quarkus.io/ .
# How to build
## Running the application in dev mode
To set up the environment after cloning the repository:
```
cd <repository>/
quarkus create app quarkus:dev
mvn -N io.takari:maven:wrapper
```
Also, it is necessary to create a local PostgreSQL instance. For development purposes, the parameters are:
```
username = postgresql
password = password
```
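If you do not already have PostgreSQL running locally, one way to bring up a throwaway instance with the dev credentials above is a container (a sketch, assuming podman is available; the container name and image tag are arbitrary choices, not project conventions):

```
podman run --rm -d --name scanchain-db \
  -e POSTGRES_USER=postgresql \
  -e POSTGRES_PASSWORD=password \
  -p 5432:5432 \
  docker.io/library/postgres:15
```

`docker run` with the same arguments works equally well; stop the container to discard the database.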
ToDo: Create Database Model
To run the Quarkus build in dev mode simply run:
````
You can run your application in dev mode that enables live coding using:
```shell script
./mvnw compile quarkus:dev
````
All endpoints should be available on localhost:8080/{endpoint}. The endpoints are listed in the endpoints section.
# Deploying to OpenShift (https://quarkus.io/guides/deploying-to-openshift)
Part of the advantage of working with Quarkus is the ease with which we can deploy it to OpenShift. We have the OpenShift extension already installed via the pom; all that should be required to build and deploy to OpenShift is to log in via the usual method (for example, `oc login` with your credentials) before running a build command.
You can then expose the routes (oc expose {route}), then your application should be accessible on the OpenShift cluster. This is verifiable either by using the console to request which services are running (oc get svc) or by using the web console which should display the service graphically.
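A typical login-and-deploy invocation looks like the following (a sketch based on the linked Quarkus guide; the token and server placeholders must be filled in, and the exact deploy property may differ depending on the pom configuration):

```
oc login --token=<token> --server=<api-server-url>
./mvnw clean package -Dquarkus.kubernetes.deploy=true
```

The build then produces the container image and applies the generated manifests to the currently selected project.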
# Design diagram
API endpoint diagram with all endpoints, DB links, and connections to further services (PNC API etc.)
# API endpoints
## /{scanId} - GET request for retrieving scans
This is a simple request for retrieving scans that are stored in our postgresql database. The assigned scanId will return the whole scan payload in JSON format.
## / - POST request takes a JSON payload to start scans (maybe isn't relevant/shouldn't be included in the future)
Creating scans via passing fully formed JSON payloads. The standard JSON format should contain:
product-id
event-id
is-managed-service
component-list
See appendix 1 for a provided example
## /scanRequest - Post request for starting scans
There are several different types of build that should be retrieved from the backend source. Different inputs are required based on the build source.
The required fields for BREW builds are:
buildSystemType
brewId
brewNVR - matches brewId
pncId
artifactType
fileName
builtFromSource
The required fields for git builds are:
buildSystemType
repository
reference
commitId
The required fields for PNC builds are:
buildSystemType
buildId
```
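As an illustration, a git-build request body using the fields listed above might look like this (a hypothetical sketch: the exact field casing and the `"GIT"` value for buildSystemType are assumptions, and the repository/commit values are taken from the sample scripts elsewhere in this change):

```json
{
  "buildSystemType": "GIT",
  "repository": "https://code.engineering.redhat.com/gerrit/quarkusio/quarkus.git",
  "reference": "2.13.8.Final-redhat-00001",
  "commitId": "6a13a9fe4e5526bee4a8ea5e425d89945bea1c17"
}
```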
This information should give us everything required to retrieve and then start a scan when requested from the relevant sources.
> **_NOTE:_** Quarkus now ships with a Dev UI, which is available in dev mode only at http://localhost:8080/q/dev/.
## /startScan - PUT request to start off the relevant scan
## Packaging and running the application
Only requires the scanId and should start the relevant scan; should return success only once finished, or failure if there is no further response after a timeout.
## /removeScan - DELETE request to remove a scan build from DB
The application can be packaged using:
```shell script
./mvnw package
```
It produces the `quarkus-run.jar` file in the `target/quarkus-app/` directory.
Be aware that it’s not an _über-jar_ as the dependencies are copied into the `target/quarkus-app/lib/` directory.
Only requires the scanId and should remove the relevant scan from our DB. Should return success or failure.
The application is now runnable using `java -jar target/quarkus-app/quarkus-run.jar`.
# Expanded work to do
If you want to build an _über-jar_, execute the following command:
```shell script
./mvnw package -Dquarkus.package.type=uber-jar
```
## Jenkins
The application, packaged as an _über-jar_, is now runnable using `java -jar target/*-runner.jar`.
We haven't looked into the correct way for the API to interact with Jenkins; this needs more investigation.
## Creating a native executable
## Jira tickets still to do:
https://issues.redhat.com/browse/PSSECMGT-1548
https://issues.redhat.com/browse/PSSECMGT-1549
https://issues.redhat.com/browse/PSSECMGT-1550
https://issues.redhat.com/browse/PSSECMGT-1551
https://issues.redhat.com/browse/PSSECMGT-1552
https://issues.redhat.com/browse/PSSECMGT-1553
https://issues.redhat.com/browse/PSSECMGT-1554
You can create a native executable using:
```shell script
./mvnw package -Pnative
```
Or, if you don't have GraalVM installed, you can run the native executable build in a container using:
```shell script
./mvnw package -Pnative -Dquarkus.native.container-build=true
```
# Appendix
You can then execute your native executable with: `./target/code-with-quarkus-1.0.0-SNAPSHOT-runner`
Appendix 1
If you want to learn more about building native executables, please consult https://quarkus.io/guides/maven-tooling.
## Related Guides

16
docker/osh-client/Dockerfile

@@ -0,0 +1,16 @@
FROM ubi9
MAINTAINER "Mert Bugra Bicak" <mbicak@redhat.com>
RUN curl http://hdn.corp.redhat.com/rhel7-csb-stage/RPMS/noarch/redhat-internal-cert-install-0.1-31.el7.noarch.rpm -o redhat-internal-cert-install-0.1-28.el7.noarch.rpm && dnf install -y chkconfig java-headless && rpm -i redhat-internal-cert-install-0.1-28.el7.noarch.rpm && rm redhat-internal-cert-install-0.1-28.el7.noarch.rpm
RUN curl -L http://download.devel.redhat.com/rel-eng/RCMTOOLS/rcm-tools-rhel-9-baseos.repo -o /etc/yum.repos.d/rcm-tools-rhel-9-baseos.repo
RUN dnf -y update && dnf install -y dnf-plugins-core https://dl.fedoraproject.org/pub/epel/epel-release-latest-9.noarch.rpm && dnf copr enable -y copr.devel.redhat.com/kdudka/covscan && dnf install -y covscan-client koji brewkoji krb5-workstation
RUN mkdir /home/covscan && chmod g+rw /home/covscan
WORKDIR /home/covscan
ENTRYPOINT ["covscan"]

99
hack/osh-client-from-source-pipeline-run.yaml

@@ -0,0 +1,99 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: osh-client-from-source-run-
  #openshift.io/scc: pipelines-scc
spec:
  serviceAccountName: osh
  podTemplate:
    securityContext:
      runAsNonRoot: true
      runAsUser: 65532
  pipelineRef:
    name: osh-client-from-source
  params:
  - name: repo-url
    value: https://code.engineering.redhat.com/gerrit/messaging/activemq-artemis.git
  - name: revision
    value: amq-broker-7.11
  workspaces:
  - name: sources
    persistentVolumeClaim:
      claimName: osh-client-sources
  - name: source-tars
    persistentVolumeClaim:
      claimName: osh-client-source-tars
  - name: ssl-ca-directory
    configmap:
      name: config-trusted-cabundle
---
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: osh-client-from-source-run-
  #openshift.io/scc: pipelines-scc
spec:
  serviceAccountName: osh
  podTemplate:
    securityContext:
      runAsNonRoot: true
      runAsUser: 65532
  pipelineRef:
    name: osh-client-from-source
  params:
  - name: repo-url
    value: https://code.engineering.redhat.com/gerrit/quarkusio/quarkus.git
  - name: revision
    value: 2.13.8.Final-redhat-00001
  workspaces:
  - name: sources
    persistentVolumeClaim:
      claimName: osh-client-sources
  - name: source-tars
    persistentVolumeClaim:
      claimName: osh-client-source-tars
  - name: ssl-ca-directory
    configmap:
      name: config-trusted-cabundle
---
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: osh-client-from-source-run-
  #openshift.io/scc: pipelines-scc
spec:
  serviceAccountName: osh
  podTemplate:
    securityContext:
      runAsNonRoot: true
      runAsUser: 65532
  pipelineRef:
    name: osh-client-from-source
  params:
  - name: repo-url
    value: https://code.engineering.redhat.com/gerrit/quarkusio/quarkus-platform.git
  - name: revision
    value: 6a13a9fe4e5526bee4a8ea5e425d89945bea1c17
  workspaces:
  - name: sources
    persistentVolumeClaim:
      claimName: osh-client-sources
  - name: source-tars
    persistentVolumeClaim:
      claimName: osh-client-source-tars
  - name: ssl-ca-directory
    configmap:
      name: config-trusted-cabundle

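Because these manifests use `generateName`, they are submitted with `oc create` rather than `oc apply`. A plausible way to kick off and follow a run (assuming you are logged in to the target namespace, the `osh-client-from-source` Pipeline is installed, and the `tkn` CLI is available):

```
oc create -f hack/osh-client-from-source-pipeline-run.yaml
tkn pipelinerun list
tkn pipelinerun logs --last -f
```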
4
hack/pssaas-request-curl.sh

@@ -0,0 +1,4 @@
#!/bin/bash
curl -H 'Content-Type: application/json' -d '@sample-pssaas.json' localhost:8080/api/v1a/Scan/PSSaaS/run -vv | jq
#curl -H 'Content-Type: application/json' -d '@sample-pssaas-bad.json' localhost:8080/api/v1a/Scan/PSSaaS -vv

13
hack/sample-pssaas-bad.json

@@ -0,0 +1,13 @@
{
  "product-id": "jochrist-dev-test-rhbq",
  "is-managed-service": false,
  "cpaas-version": "latest",
  "component-list": [
    {"build-id": "ASLKGOMQVVAAA", "type": "pnc"},
    {"build-id": "ASLMBTBCNVAAA", "type": "pnc"},
    {"foo": "bar"}
  ],
  "some-other-list": [{"this": "shouldn't work"}]
}

15
hack/sample-pssaas.json

@@ -0,0 +1,15 @@
{
  "product-id": "jochrist-dev-test-rhbq",
  "is-managed-service": false,
  "cpaas-version": "latest",
  "component-list": [
    {"build-id": "ASLKGOMQVVAAA", "type": "pnc"},
    {"build-id": "ASLMBTBCNVAAA", "type": "pnc"},
    {"type": "git",
     "repo": "https://code.engineering.redhat.com/gerrit/quarkusio/quarkus.git",
     "ref": "2.13.8.Final-redhat-00001"}
  ]
}

9
hack/single-brew-scan-stage.sh

@@ -0,0 +1,9 @@
#!/bin/bash
curl --get \
--data-urlencode "brewId=xterm-366-8.el9" \
https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/single/brew -vv
curl --get https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/2 -vv
curl --get https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/2/run -vv

9
hack/single-brew-scan.sh

@@ -0,0 +1,9 @@
#!/bin/bash
curl --get \
--data-urlencode "brewId=xterm-366-8.el9" \
localhost:8080/api/v1a/Scan/single/brew -vv
curl --get localhost:8080/api/v1a/Scan/2 -vv
curl --get localhost:8080/api/v1a/Scan/2/run -vv

10
hack/single-git-scan-stage.sh

@@ -0,0 +1,10 @@
#!/bin/bash
curl --get \
--data-urlencode "repo=https://code.engineering.redhat.com/gerrit/quarkusio/quarkus.git" \
--data-urlencode "ref=2.13.8.Final-redhat-00001" \
https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/single/git -vv
curl --get https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/1 -vv
curl --get https://osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com/api/v1a/Scan/1/run -vv

10
hack/single-git-scan.sh

@@ -0,0 +1,10 @@
#!/bin/bash
curl --get \
--data-urlencode "repo=https://code.engineering.redhat.com/gerrit/quarkusio/quarkus.git" \
--data-urlencode "ref=2.13.8.Final-redhat-00001" \
localhost:8080/api/v1a/Scan/single/git -vv
curl --get localhost:8080/api/v1a/Scan/1 -vv
curl --get localhost:8080/api/v1a/Scan/1/run -vv

48
hack/stage-debug-pod.yaml

@@ -0,0 +1,48 @@
apiVersion: v1
kind: Pod
metadata:
  name: image-debug-with-mount
  namespace: pct-security-tooling
spec:
  serviceAccountName: deployer
  containers:
  - command:
    - /bin/sh
    image: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7f0ed1d2500a005e8e134920085ec6b28770b915b0449d45c4a44fbec818f33f
    name: debug-with-mount
    volumeMounts:
    - name: osh-wrapper
      mountPath: /mounts/kerberos
    - name: osh-wrapper-config-vol
      mountPath: /mounts/wraper-config
    - name: osh-client-sources
      mountPath: /mounts/osh-client-sources
    - name: osh-client-tgz
      mountPath: /mounts/osh-client-tgz
    resources: {}
    securityContext: {}
    stdin: true
    stdinOnce: true
    tty: true
  restartPolicy: Never
  volumes:
  - name: osh-wrapper
    secret:
      defaultMode: 384
      optional: false
      secretName: kerberos-keytab-osh
  - configMap:
      defaultMode: 384
      items:
      - key: linux-krb5.conf
        path: linux-krb5.conf
      name: kerberos-config
      optional: false
    name: osh-wrapper-config-vol
  - name: osh-client-sources
    persistentVolumeClaim:
      claimName: osh-client-sources
  - name: osh-client-tgz
    persistentVolumeClaim:
      claimName: osh-client-source-tars
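Once created, the debug pod can be attached to interactively for poking at the mounted secrets and PVCs (a sketch; assumes you have access to the pct-security-tooling namespace):

```
oc create -f hack/stage-debug-pod.yaml
oc attach -it image-debug-with-mount -n pct-security-tooling
```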

21
k8s/stage/app/edge-route.yaml

@ -0,0 +1,21 @@
#oc create route edge --service=osh --dry-run=client -o yaml > edgeroute.yml
apiVersion: route.openshift.io/v1
kind: Route
metadata:
creationTimestamp: null
labels:
app.kubernetes.io/name: osh
app.kubernetes.io/version: 1.0.0-SNAPSHOT
app.openshift.io/runtime: quarkus
env: stage
name: osh
spec:
port:
targetPort: http
tls:
termination: edge
to:
kind: ""
name: osh
weight: null
status: {}

44
k8s/stage/app/kerberos-config.yaml

@ -0,0 +1,44 @@
#oc create configmap kerberos-config --from-file=linux-krb5.conf --dry-run=client -o yaml > kerberos-config.yaml
apiVersion: v1
data:
linux-krb5.conf: |
includedir /etc/krb5.conf.d/
# depending on your config, you may wish to uncomment the following:
# includedir /var/lib/sss/pubconf/krb5.include.d/
[libdefaults]
default_realm = IPA.REDHAT.COM
dns_lookup_realm = true
dns_lookup_kdc = true
rdns = false
dns_canonicalize_hostname = false
ticket_lifetime = 24h
forwardable = true
udp_preference_limit = 1
default_ccache_name = KEYRING:persistent:%{uid}
max_retries = 1
kdc_timeout = 1500
[realms]
REDHAT.COM = {
default_domain = redhat.com
dns_lookup_kdc = true
master_kdc = kerberos.corp.redhat.com
admin_server = kerberos.corp.redhat.com
}
IPA.REDHAT.COM = {
default_domain = ipa.redhat.com
dns_lookup_kdc = true
# Trust tickets issued by legacy realm on this host
auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
auth_to_local = DEFAULT
}
#DO NOT ADD A [domain_realms] section
#https://mojo.redhat.com/docs/DOC-1166841
kind: ConfigMap
metadata:
creationTimestamp: null
name: kerberos-config

36
k8s/stage/app/linux-krb5.conf

@ -0,0 +1,36 @@
includedir /etc/krb5.conf.d/
# depending on your config, you may wish to uncomment the following:
# includedir /var/lib/sss/pubconf/krb5.include.d/
[libdefaults]
default_realm = IPA.REDHAT.COM
dns_lookup_realm = true
dns_lookup_kdc = true
rdns = false
dns_canonicalize_hostname = false
ticket_lifetime = 24h
forwardable = true
udp_preference_limit = 1
default_ccache_name = KEYRING:persistent:%{uid}
max_retries = 1
kdc_timeout = 1500
[realms]
REDHAT.COM = {
default_domain = redhat.com
dns_lookup_kdc = true
master_kdc = kerberos.corp.redhat.com
admin_server = kerberos.corp.redhat.com
}
IPA.REDHAT.COM = {
default_domain = ipa.redhat.com
dns_lookup_kdc = true
# Trust tickets issued by legacy realm on this host
auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
auth_to_local = DEFAULT
}
#DO NOT ADD A [domain_realms] section
#https://mojo.redhat.com/docs/DOC-1166841

15
k8s/stage/app/service-account.yaml

@ -0,0 +1,15 @@
apiVersion: v1
kind: ServiceAccount
metadata:
labels:
app.kubernetes.io/name: osh-wrapper-client-sa
app.kubernetes.io/version: 1.0.0-SNAPSHOT
app.openshift.io/runtime: quarkus
env: stage
name: osh-wrapper-client-sa
namespace: pct-security-tooling
imagePullSecrets:
- name: pct-security-osh-wrapper-client-pull-secret
- name: osh-dockercfg-tfhlr
secrets:
- name: osh-dockercfg-tfhlr

31
k8s/stage/app/tekton-rbac.yaml

@@ -0,0 +1,31 @@
kind: Role
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  labels:
    app.kubernetes.io/component: tekton
  name: osh-wrapper-tekton
  namespace: pct-security-tooling
rules:
- apiGroups:
  - tekton.dev
  resources:
  - taskruns
  - pipelineruns
  verbs:
  - create
  - get
  - watch
  - list
---
kind: RoleBinding
apiVersion: rbac.authorization.k8s.io/v1
metadata:
  name: osh-wrapper-tekton-rolebinding
  namespace: pct-security-tooling
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: osh-wrapper-tekton
subjects:
- kind: ServiceAccount
  name: osh-wrapper-client-sa

79
k8s/stage/osh-client-tekton/osh-client-config.yaml

@@ -0,0 +1,79 @@
apiVersion: v1
kind: ConfigMap
metadata:
  annotations:
  name: kerberos-config-osh-client
  namespace: pct-security-tooling
data:
  linux-krb5.conf: |
    includedir /etc/krb5.conf.d/
    # depending on your config, you may wish to uncomment the following:
    # includedir /var/lib/sss/pubconf/krb5.include.d/
    [libdefaults]
        default_realm = IPA.REDHAT.COM
        dns_lookup_realm = true
        dns_lookup_kdc = true
        rdns = false
        dns_canonicalize_hostname = false
        ticket_lifetime = 24h
        forwardable = true
        udp_preference_limit = 1
        default_ccache_name = FILE:/tmp/krb5cc_%{uid}
        max_retries = 1
        kdc_timeout = 1500
    [realms]
        REDHAT.COM = {
            default_domain = redhat.com
            dns_lookup_kdc = true
            master_kdc = kerberos.corp.redhat.com
            admin_server = kerberos.corp.redhat.com
        }
        IPA.REDHAT.COM = {
            default_domain = ipa.redhat.com
            dns_lookup_kdc = true
            # Trust tickets issued by legacy realm on this host
            auth_to_local = RULE:[1:$1@$0](.*@REDHAT\.COM)s/@.*//
            auth_to_local = DEFAULT
        }
    #DO NOT ADD A [domain_realms] section
    #https://mojo.redhat.com/docs/DOC-1166841
---
#oc create configmap osh-client-config --from-file=client.conf --dry-run=client -o yaml > osh-client-config.yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: osh-client-config
  namespace: pct-security-tooling
data:
  client.conf: |+
    # client config file for covscan
    # Hub XML-RPC address.
    HUB_URL = "https://cov01.lab.eng.brq2.redhat.com/covscanhub/xmlrpc"
    BREW_URL = "https://brewhub.engineering.redhat.com/brewhub"
    KOJI_URL = "https://koji.fedoraproject.org/kojihub"
    KOJI_PROFILES = "brew,koji"
    CIM_SERVER = "cov01.lab.eng.brq2.redhat.com"
    CIM_PORT = "8080"
    DEFAULT_MOCKCONFIG = "fedora-rawhide-x86_64"
    # Hub authentication method: "krbv", "password", or "gssapi"
    AUTH_METHOD = "krbv"
    KRB_REALM = "IPA.REDHAT.COM"
    # Kerberos principal. If commented, default principal obtained by kinit is used.
    KRB_PRINCIPAL = "HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM"
    # Kerberos keytab file.
    KRB_KEYTAB = "/kerberos/kerberos-keytab-osh"
    # Enables XML-RPC verbose flag
    DEBUG_XMLRPC = 0

28
k8s/stage/osh-client-tekton/osh-client-pvc.yaml

@@ -0,0 +1,28 @@
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: osh-client-sources
  namespace: pct-security-tooling
spec:
  accessModes:
  - ReadWriteMany
  resources:
    requests:
      storage: 5Gi
  storageClassName: dynamic-nfs
  volumeMode: Filesystem
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: osh-client-source-tars
  namespace: pct-security-tooling
spec:
  accessModes:
  - ReadWriteMany
  resources:
    requests:
      storage: 10Gi
  storageClassName: dynamic-nfs
  volumeMode: Filesystem

98
k8s/stage/osh-client-tekton/pipline/osh-client-from-source-pipeline.yaml

@@ -0,0 +1,98 @@
# requires running `tkn hub install task "git-cli"` first
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: osh-client-from-source
spec:
  description: This pipeline clones a repo, archives it with git, then sends it to covscan to be scanned with snyk
  params:
    - name: repo-url
      description: The SCMURL
      type: string
    - name: revision
      description: The revision or tag
      type: string
    - name: archive-name
      description: The name of the git archive file
      type: string
      default: $(context.pipelineRun.uid).tar.gz
  workspaces:
    - name: sources
      description: This workspace contains our cloned sources and is temporary
    - name: source-tars
      description: This workspace contains our source tar gzips for covscan and is semi-persistent
    - name: ssl-ca-directory
      description: Location of the CA bundle for SSL verification with internal services
  tasks:
    - name: clone
      taskRef:
        name: git-clone
      workspaces:
        - name: output
          workspace: sources
          subPath: $(context.pipelineRun.name)
        - name: ssl-ca-directory
          workspace: ssl-ca-directory
      params:
        - name: url
          value: $(params.repo-url)
        - name: revision
          value: $(params.revision)
        - name: verbose
          value: "true"
    - name: archive
      runAfter:
        - clone
      taskRef:
        name: git-cli
      workspaces:
        - name: source
          workspace: sources
          subPath: $(context.pipelineRun.name)
        - name: source-tars
          workspace: source-tars
          subPath: $(context.pipelineRun.name)
      params:
        - name: USER_HOME
          value: /home/git
        - name: archive-name
          value: $(params.archive-name)
        - name: GIT_SCRIPT
          value: |
            git config --global --add safe.directory /workspace/source
            git archive --format=tar.gz HEAD -o /workspace/source-tars/$(params.archive-name)
      # results:
      #   - name: archive-name
      #     description: The name of the tar.gz we created
    - name: covscan
      params:
        - name: targz-file
          value: $(params.archive-name)
      runAfter:
        - archive
      taskRef:
        name: osh-scan-task-from-source
      workspaces:
        - name: source-tars
          workspace: source-tars
          subPath: $(context.pipelineRun.name)
  finally:
    - name: cleanup-workspace
      params:
        - name: clear-dir
          value: $(context.pipelineRun.name)
      taskRef:
        name: cleanup-workspace
      workspaces:
        - name: sources
          workspace: sources
          # Note: we don't provide a subPath, this way we can clean up the whole folder
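A run of this pipeline can be started with the `tkn` CLI. A hedged sketch: the workspace claim names match the PVCs defined earlier in this changeset, while the repo URL and the `ssl-ca-directory` ConfigMap name are illustrative assumptions:

```shell
# repo-url and the CA-bundle ConfigMap name are illustrative assumptions.
# archive-name falls back to its default ($(context.pipelineRun.uid).tar.gz)
# via --use-param-defaults.
tkn pipeline start osh-client-from-source \
  --namespace pct-security-tooling \
  --param repo-url=https://gitlab.example.com/some/repo.git \
  --param revision=main \
  --use-param-defaults \
  --workspace name=sources,claimName=osh-client-sources \
  --workspace name=source-tars,claimName=osh-client-source-tars \
  --workspace name=ssl-ca-directory,config=internal-ca-bundle \
  --showlog
```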

25
k8s/stage/osh-client-tekton/task/osh-client-from-source-clearup-workspace.yaml

@@ -0,0 +1,25 @@
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: cleanup-workspace
spec:
  params:
    - name: cleanup
      type: string
      default: "true"
      description: Should we actually clean up the sources dir
    - name: clear-dir
      type: string
  workspaces:
    - name: sources
      description: Where we checked out our sources
  steps:
    - name: cleanup-sources
      image: registry.access.redhat.com/ubi9/ubi:9.2-696
      script: |
        #!/bin/bash
        echo "Clearing up sources from $(params.clear-dir)"
        rm -rv /workspace/sources/$(params.clear-dir)

78
k8s/stage/osh-client-tekton/task/osh-client-from-source.yaml

@@ -0,0 +1,78 @@
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: osh-scan-task-from-source
spec:
  stepTemplate:
    env:
      - name: "HOME"
        value: "/tekton/home"
  params:
    - name: targz-file
      type: string
      default: "source.tar.gz"
      description: The filename of the tar.gz we'll be uploading to covscan
    - name: scan-profile
      type: string
      description: The scan profile we will use
      default: "snyk-only-unstable"
    - name: tarball-build-script
      type: string
      description: The tarball build script passed to covscan's --tarball-build-script option
      default: ":"
  volumes:
    - name: osh-client-kerb-vol
      secret:
        defaultMode: 292
        optional: false
        secretName: kerberos-keytab-osh
    - name: osh-client-kerb-config-vol
      configMap:
        name: kerberos-config-osh-client
        items:
          - key: linux-krb5.conf
            path: linux-krb5.conf
        defaultMode: 292
        optional: false
    - name: osh-client-config-vol
      configMap:
        name: osh-client-config
        items:
          - key: client.conf
            path: client.conf
        optional: false
  workspaces:
    - name: source-tars
      description: source tar gzips are kept here
  steps:
    - name: perform-buildid-scan
      image: quay.io/pct-security/osh-wrapper-client:latest
      workingDir: /home/covscan
      volumeMounts:
        - name: osh-client-kerb-vol
          mountPath: /kerberos
          readOnly: true
        - name: osh-client-config-vol
          mountPath: /etc/osh/client.conf
          readOnly: true
          subPath: client.conf
        - name: osh-client-kerb-config-vol
          mountPath: /etc/krb5.conf
          readOnly: true
          subPath: linux-krb5.conf
      script: |
        #!/bin/bash
        echo $(params.scan-profile)
        echo $(params.tarball-build-script)
        echo $(params.targz-file)
        covscan mock-build -p $(params.scan-profile) --tarball-build-script=$(params.tarball-build-script) /workspace/source-tars/$(params.targz-file)

146
k8s/stage/osh-client-tekton/task/osh-client-git-cli-modified.yaml

@@ -0,0 +1,146 @@
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  annotations:
    tekton.dev/categories: Git
    tekton.dev/displayName: git cli
    tekton.dev/pipelines.minVersion: 0.21.0
    tekton.dev/platforms: linux/amd64,linux/s390x,linux/ppc64le
    tekton.dev/tags: git
  creationTimestamp: "2023-06-20T22:58:05Z"
  generation: 2
  labels:
    app.kubernetes.io/version: "0.4"
    hub.tekton.dev/catalog: tekton
  name: git-cli
  namespace: pct-security-tooling
  resourceVersion: "3453559180"
  uid: 95fc93dd-8780-41ab-9477-b698762dc1de
spec:
  description: |-
    This task can be used to perform git operations.
    Git command that needs to be run can be passed as a script to the task. This task needs authentication to git in order to push after the git operation.
  params:
    - default: cgr.dev/chainguard/git:root-2.39@sha256:7759f87050dd8bacabe61354d75ccd7f864d6b6f8ec42697db7159eccd491139
      description: |
        The base image for the task.
      name: BASE_IMAGE
      type: string
    - default: ""
      description: |
        Git user name for performing git operation.
      name: GIT_USER_NAME
      type: string
    - default: ""
      description: |
        Git user email for performing git operation.
      name: GIT_USER_EMAIL
      type: string
    - default: |
        git help
      description: The git script to run.
      name: GIT_SCRIPT
      type: string
    - default: /root
      description: |
        Absolute path to the user's home directory. Set this explicitly if you are running the image as a non-root user or have overridden
        the gitInitImage param with an image containing custom user configuration.
      name: USER_HOME
      type: string
    - default: "true"
      description: Log the commands that are executed during `git-clone`'s operation.
      name: VERBOSE
      type: string
  results:
    - description: The precise commit SHA after the git operation.
      name: commit
      type: string
    - name: archive-name
      type: string
      description: The archive name produced by the git archive
  steps:
    - env:
        - name: HOME
          value: $(params.USER_HOME)
        - name: PARAM_VERBOSE
          value: $(params.VERBOSE)
        - name: PARAM_USER_HOME
          value: $(params.USER_HOME)
        - name: WORKSPACE_OUTPUT_PATH
          value: $(workspaces.output.path)
        - name: WORKSPACE_SSH_DIRECTORY_BOUND
          value: $(workspaces.ssh-directory.bound)
        - name: WORKSPACE_SSH_DIRECTORY_PATH
          value: $(workspaces.ssh-directory.path)
        - name: WORKSPACE_BASIC_AUTH_DIRECTORY_BOUND
          value: $(workspaces.basic-auth.bound)
        - name: WORKSPACE_BASIC_AUTH_DIRECTORY_PATH
          value: $(workspaces.basic-auth.path)
      image: $(params.BASE_IMAGE)
      name: git
      resources: {}
      script: |
        #!/usr/bin/env sh
        set -eu

        if [ "${PARAM_VERBOSE}" = "true" ] ; then
          set -x
        fi

        if [ "${WORKSPACE_BASIC_AUTH_DIRECTORY_BOUND}" = "true" ] ; then
          cp "${WORKSPACE_BASIC_AUTH_DIRECTORY_PATH}/.git-credentials" "${PARAM_USER_HOME}/.git-credentials"
          cp "${WORKSPACE_BASIC_AUTH_DIRECTORY_PATH}/.gitconfig" "${PARAM_USER_HOME}/.gitconfig"
          chmod 400 "${PARAM_USER_HOME}/.git-credentials"
          chmod 400 "${PARAM_USER_HOME}/.gitconfig"
        fi

        if [ "${WORKSPACE_SSH_DIRECTORY_BOUND}" = "true" ] ; then
          cp -R "${WORKSPACE_SSH_DIRECTORY_PATH}" "${PARAM_USER_HOME}"/.ssh
          chmod 700 "${PARAM_USER_HOME}"/.ssh
          chmod -R 400 "${PARAM_USER_HOME}"/.ssh/*
        fi

        # Setting up the config for the git.
        git config --global user.email "$(params.GIT_USER_EMAIL)"
        git config --global user.name "$(params.GIT_USER_NAME)"

        eval '$(params.GIT_SCRIPT)'

        RESULT_SHA="$(git rev-parse HEAD | tr -d '\n')"
        EXIT_CODE="$?"
        if [ "$EXIT_CODE" != 0 ]
        then
          exit $EXIT_CODE
        fi
        # Make sure we don't add a trailing newline to the result!
        printf "%s" "$RESULT_SHA" > "$(results.commit.path)"
      workingDir: $(workspaces.source.path)
  workspaces:
    - description: custom source tar location
      name: source-tars
    - description: A workspace that contains the fetched git repository.
      name: source
    - description: |
        An optional workspace that contains the files that need to be added to git. You can
        access the workspace from your script using `$(workspaces.input.path)`, for instance:

          cp $(workspaces.input.path)/file_that_i_want .
          git add file_that_i_want
          # etc
      name: input
      optional: true
    - description: |
        A .ssh directory with private key, known_hosts, config, etc. Copied to
        the user's home before git commands are executed. Used to authenticate
        with the git remote when performing the clone. Binding a Secret to this
        Workspace is strongly recommended over other volume types.
      name: ssh-directory
      optional: true
    - description: |
        A Workspace containing a .gitconfig and .git-credentials file. These
        will be copied to the user's home before any git commands are run. Any
        other files in this Workspace are ignored. It is strongly recommended
        to use ssh-directory over basic-auth whenever possible and to bind a
        Secret to this Workspace over other volume types.
      name: basic-auth
      optional: true

63
k8s/stage/osh-client-tekton/task/osh-scan-task.yaml

@@ -0,0 +1,63 @@
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: osh-scan-task
spec:
  stepTemplate:
    env:
      - name: "HOME"
        value: "/tekton/home"
  params:
    - name: buildId
      type: string
    - name: scanProfile
      type: string
  volumes:
    - name: osh-client-kerb-vol
      secret:
        defaultMode: 384
        optional: false
        secretName: kerberos-keytab-osh
    - name: osh-client-kerb-config-vol
      configMap:
        name: kerberos-config-osh-client
        items:
          - key: linux-krb5.conf
            path: linux-krb5.conf
        defaultMode: 384
        optional: false
    - name: osh-client-config-vol
      configMap:
        name: osh-client-config
        items:
          - key: client.conf
            path: client.conf
        optional: false
  steps:
    - name: perform-buildid-scan
      image: quay.io/pct-security/osh-wrapper-client:latest
      workingDir: /home/covscan
      volumeMounts:
        - name: osh-client-kerb-vol
          mountPath: /kerberos
          readOnly: true
        - name: osh-client-config-vol
          mountPath: /etc/osh/client.conf
          readOnly: true
          subPath: client.conf
        - name: osh-client-kerb-config-vol
          mountPath: /etc/krb5.conf
          readOnly: true
          subPath: linux-krb5.conf
      script: |
        #!/bin/bash
        echo $(params.buildId)
        echo $(params.scanProfile)
        covscan mock-build -p $(params.scanProfile) --brew-build $(params.buildId)
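This task takes only params (no workspaces), so it can be exercised on its own with the `tkn` CLI. A sketch, where the buildId NVR is an illustrative assumption and the scan profile mirrors the from-source task's default:

```shell
# buildId is an illustrative brew NVR; scanProfile reuses the
# snyk-only-unstable default seen in osh-scan-task-from-source.
tkn task start osh-scan-task \
  --namespace pct-security-tooling \
  --param buildId=some-package-1.0-1.el9 \
  --param scanProfile=snyk-only-unstable \
  --showlog
```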

310
mvnw vendored

@@ -1,310 +0,0 @@
#!/bin/sh
# ----------------------------------------------------------------------------
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ----------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# Maven Start Up Batch script
#
# Required ENV vars:
# ------------------
# JAVA_HOME - location of a JDK home dir
#
# Optional ENV vars
# -----------------
# M2_HOME - location of maven2's installed home dir
# MAVEN_OPTS - parameters passed to the Java VM when running Maven
# e.g. to debug Maven itself, use
# set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
# MAVEN_SKIP_RC - flag to disable loading of mavenrc files
# ----------------------------------------------------------------------------
if [ -z "$MAVEN_SKIP_RC" ] ; then
if [ -f /etc/mavenrc ] ; then
. /etc/mavenrc
fi
if [ -f "$HOME/.mavenrc" ] ; then
. "$HOME/.mavenrc"
fi
fi
# OS specific support. $var _must_ be set to either true or false.
cygwin=false;
darwin=false;
mingw=false
case "`uname`" in
CYGWIN*) cygwin=true ;;
MINGW*) mingw=true;;
Darwin*) darwin=true
# Use /usr/libexec/java_home if available, otherwise fall back to /Library/Java/Home
# See https://developer.apple.com/library/mac/qa/qa1170/_index.html
if [ -z "$JAVA_HOME" ]; then
if [ -x "/usr/libexec/java_home" ]; then
export JAVA_HOME="`/usr/libexec/java_home`"
else
export JAVA_HOME="/Library/Java/Home"
fi
fi
;;
esac
if [ -z "$JAVA_HOME" ] ; then
if [ -r /etc/gentoo-release ] ; then
JAVA_HOME=`java-config --jre-home`
fi
fi
if [ -z "$M2_HOME" ] ; then
## resolve links - $0 may be a link to maven's home
PRG="$0"
# need this for relative symlinks
while [ -h "$PRG" ] ; do
ls=`ls -ld "$PRG"`
link=`expr "$ls" : '.*-> \(.*\)$'`
if expr "$link" : '/.*' > /dev/null; then
PRG="$link"
else
PRG="`dirname "$PRG"`/$link"
fi
done
saveddir=`pwd`
M2_HOME=`dirname "$PRG"`/..
# make it fully qualified
M2_HOME=`cd "$M2_HOME" && pwd`
cd "$saveddir"
# echo Using m2 at $M2_HOME
fi
# For Cygwin, ensure paths are in UNIX format before anything is touched
if $cygwin ; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --unix "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --unix "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --unix "$CLASSPATH"`
fi
# For Mingw, ensure paths are in UNIX format before anything is touched
if $mingw ; then
[ -n "$M2_HOME" ] &&
M2_HOME="`(cd "$M2_HOME"; pwd)`"
[ -n "$JAVA_HOME" ] &&
JAVA_HOME="`(cd "$JAVA_HOME"; pwd)`"
fi
if [ -z "$JAVA_HOME" ]; then
javaExecutable="`which javac`"
if [ -n "$javaExecutable" ] && ! [ "`expr \"$javaExecutable\" : '\([^ ]*\)'`" = "no" ]; then
# readlink(1) is not available as standard on Solaris 10.
readLink=`which readlink`
if [ ! `expr "$readLink" : '\([^ ]*\)'` = "no" ]; then
if $darwin ; then
javaHome="`dirname \"$javaExecutable\"`"
javaExecutable="`cd \"$javaHome\" && pwd -P`/javac"
else
javaExecutable="`readlink -f \"$javaExecutable\"`"
fi
javaHome="`dirname \"$javaExecutable\"`"
javaHome=`expr "$javaHome" : '\(.*\)/bin'`
JAVA_HOME="$javaHome"
export JAVA_HOME
fi
fi
fi
if [ -z "$JAVACMD" ] ; then
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD="$JAVA_HOME/jre/sh/java"
else
JAVACMD="$JAVA_HOME/bin/java"
fi
else
JAVACMD="`which java`"
fi
fi
if [ ! -x "$JAVACMD" ] ; then
echo "Error: JAVA_HOME is not defined correctly." >&2
echo " We cannot execute $JAVACMD" >&2
exit 1
fi
if [ -z "$JAVA_HOME" ] ; then
echo "Warning: JAVA_HOME environment variable is not set."
fi
CLASSWORLDS_LAUNCHER=org.codehaus.plexus.classworlds.launcher.Launcher
# traverses directory structure from process work directory to filesystem root
# first directory with .mvn subdirectory is considered project base directory
find_maven_basedir() {
if [ -z "$1" ]
then
echo "Path not specified to find_maven_basedir"
return 1
fi
basedir="$1"
wdir="$1"
while [ "$wdir" != '/' ] ; do
if [ -d "$wdir"/.mvn ] ; then
basedir=$wdir
break
fi
# workaround for JBEAP-8937 (on Solaris 10/Sparc)
if [ -d "${wdir}" ]; then
wdir=`cd "$wdir/.."; pwd`
fi
# end of workaround
done
echo "${basedir}"
}
# concatenates all lines of a file
concat_lines() {
if [ -f "$1" ]; then
echo "$(tr -s '\n' ' ' < "$1")"
fi
}
BASE_DIR=`find_maven_basedir "$(pwd)"`
if [ -z "$BASE_DIR" ]; then
exit 1;
fi
##########################################################################################
# Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
# This allows using the maven wrapper in projects that prohibit checking in binary data.
##########################################################################################
if [ -r "$BASE_DIR/.mvn/wrapper/maven-wrapper.jar" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found .mvn/wrapper/maven-wrapper.jar"
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Couldn't find .mvn/wrapper/maven-wrapper.jar, downloading it ..."
fi
if [ -n "$MVNW_REPOURL" ]; then
jarUrl="$MVNW_REPOURL/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar"
else
jarUrl="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar"
fi
while IFS="=" read key value; do
case "$key" in (wrapperUrl) jarUrl="$value"; break ;;
esac
done < "$BASE_DIR/.mvn/wrapper/maven-wrapper.properties"
if [ "$MVNW_VERBOSE" = true ]; then
echo "Downloading from: $jarUrl"
fi
wrapperJarPath="$BASE_DIR/.mvn/wrapper/maven-wrapper.jar"
if $cygwin; then
wrapperJarPath=`cygpath --path --windows "$wrapperJarPath"`
fi
if command -v wget > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found wget ... using wget"
fi
if [ -z "$MVNW_USERNAME" ] || [ -z "$MVNW_PASSWORD" ]; then
wget "$jarUrl" -O "$wrapperJarPath"
else
wget --http-user=$MVNW_USERNAME --http-password=$MVNW_PASSWORD "$jarUrl" -O "$wrapperJarPath"
fi
elif command -v curl > /dev/null; then
if [ "$MVNW_VERBOSE" = true ]; then
echo "Found curl ... using curl"
fi
if [ -z "$MVNW_USERNAME" ] || [ -z "$MVNW_PASSWORD" ]; then
curl -o "$wrapperJarPath" "$jarUrl" -f
else
curl --user $MVNW_USERNAME:$MVNW_PASSWORD -o "$wrapperJarPath" "$jarUrl" -f
fi
else
if [ "$MVNW_VERBOSE" = true ]; then
echo "Falling back to using Java to download"
fi
javaClass="$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.java"
# For Cygwin, switch paths to Windows format before running javac
if $cygwin; then
javaClass=`cygpath --path --windows "$javaClass"`
fi
if [ -e "$javaClass" ]; then
if [ ! -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Compiling MavenWrapperDownloader.java ..."
fi
# Compiling the Java class
("$JAVA_HOME/bin/javac" "$javaClass")
fi
if [ -e "$BASE_DIR/.mvn/wrapper/MavenWrapperDownloader.class" ]; then
# Running the downloader
if [ "$MVNW_VERBOSE" = true ]; then
echo " - Running MavenWrapperDownloader.java ..."
fi
("$JAVA_HOME/bin/java" -cp .mvn/wrapper MavenWrapperDownloader "$MAVEN_PROJECTBASEDIR")
fi
fi
fi
fi
##########################################################################################
# End of extension
##########################################################################################
export MAVEN_PROJECTBASEDIR=${MAVEN_BASEDIR:-"$BASE_DIR"}
if [ "$MVNW_VERBOSE" = true ]; then
echo $MAVEN_PROJECTBASEDIR
fi
MAVEN_OPTS="$(concat_lines "$MAVEN_PROJECTBASEDIR/.mvn/jvm.config") $MAVEN_OPTS"
# For Cygwin, switch paths to Windows format before running java
if $cygwin; then
[ -n "$M2_HOME" ] &&
M2_HOME=`cygpath --path --windows "$M2_HOME"`
[ -n "$JAVA_HOME" ] &&
JAVA_HOME=`cygpath --path --windows "$JAVA_HOME"`
[ -n "$CLASSPATH" ] &&
CLASSPATH=`cygpath --path --windows "$CLASSPATH"`
[ -n "$MAVEN_PROJECTBASEDIR" ] &&
MAVEN_PROJECTBASEDIR=`cygpath --path --windows "$MAVEN_PROJECTBASEDIR"`
fi
# Provide a "standardized" way to retrieve the CLI args that will
# work with both Windows and non-Windows executions.
MAVEN_CMD_LINE_ARGS="$MAVEN_CONFIG $@"
export MAVEN_CMD_LINE_ARGS
WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
exec "$JAVACMD" \
$MAVEN_OPTS \
-classpath "$MAVEN_PROJECTBASEDIR/.mvn/wrapper/maven-wrapper.jar" \
"-Dmaven.home=${M2_HOME}" "-Dmaven.multiModuleProjectDirectory=${MAVEN_PROJECTBASEDIR}" \
${WRAPPER_LAUNCHER} $MAVEN_CONFIG "$@"

182
mvnw.cmd vendored

@@ -1,182 +0,0 @@
@REM ----------------------------------------------------------------------------
@REM Licensed to the Apache Software Foundation (ASF) under one
@REM or more contributor license agreements. See the NOTICE file
@REM distributed with this work for additional information
@REM regarding copyright ownership. The ASF licenses this file
@REM to you under the Apache License, Version 2.0 (the
@REM "License"); you may not use this file except in compliance
@REM with the License. You may obtain a copy of the License at
@REM
@REM http://www.apache.org/licenses/LICENSE-2.0
@REM
@REM Unless required by applicable law or agreed to in writing,
@REM software distributed under the License is distributed on an
@REM "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
@REM KIND, either express or implied. See the License for the
@REM specific language governing permissions and limitations
@REM under the License.
@REM ----------------------------------------------------------------------------
@REM ----------------------------------------------------------------------------
@REM Maven Start Up Batch script
@REM
@REM Required ENV vars:
@REM JAVA_HOME - location of a JDK home dir
@REM
@REM Optional ENV vars
@REM M2_HOME - location of maven2's installed home dir
@REM MAVEN_BATCH_ECHO - set to 'on' to enable the echoing of the batch commands
@REM MAVEN_BATCH_PAUSE - set to 'on' to wait for a keystroke before ending
@REM MAVEN_OPTS - parameters passed to the Java VM when running Maven
@REM e.g. to debug Maven itself, use
@REM set MAVEN_OPTS=-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=8000
@REM MAVEN_SKIP_RC - flag to disable loading of mavenrc files
@REM ----------------------------------------------------------------------------
@REM Begin all REM lines with '@' in case MAVEN_BATCH_ECHO is 'on'
@echo off
@REM set title of command window
title %0
@REM enable echoing by setting MAVEN_BATCH_ECHO to 'on'
@if "%MAVEN_BATCH_ECHO%" == "on" echo %MAVEN_BATCH_ECHO%
@REM set %HOME% to equivalent of $HOME
if "%HOME%" == "" (set "HOME=%HOMEDRIVE%%HOMEPATH%")
@REM Execute a user defined script before this one
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPre
@REM check for pre script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_pre.bat" call "%HOME%\mavenrc_pre.bat"
if exist "%HOME%\mavenrc_pre.cmd" call "%HOME%\mavenrc_pre.cmd"
:skipRcPre
@setlocal
set ERROR_CODE=0
@REM To isolate internal variables from possible post scripts, we use another setlocal
@setlocal
@REM ==== START VALIDATION ====
if not "%JAVA_HOME%" == "" goto OkJHome
echo.
echo Error: JAVA_HOME not found in your environment. >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
:OkJHome
if exist "%JAVA_HOME%\bin\java.exe" goto init
echo.
echo Error: JAVA_HOME is set to an invalid directory. >&2
echo JAVA_HOME = "%JAVA_HOME%" >&2
echo Please set the JAVA_HOME variable in your environment to match the >&2
echo location of your Java installation. >&2
echo.
goto error
@REM ==== END VALIDATION ====
:init
@REM Find the project base dir, i.e. the directory that contains the folder ".mvn".
@REM Fallback to current working directory if not found.
set MAVEN_PROJECTBASEDIR=%MAVEN_BASEDIR%
IF NOT "%MAVEN_PROJECTBASEDIR%"=="" goto endDetectBaseDir
set EXEC_DIR=%CD%
set WDIR=%EXEC_DIR%
:findBaseDir
IF EXIST "%WDIR%"\.mvn goto baseDirFound
cd ..
IF "%WDIR%"=="%CD%" goto baseDirNotFound
set WDIR=%CD%
goto findBaseDir
:baseDirFound
set MAVEN_PROJECTBASEDIR=%WDIR%
cd "%EXEC_DIR%"
goto endDetectBaseDir
:baseDirNotFound
set MAVEN_PROJECTBASEDIR=%EXEC_DIR%
cd "%EXEC_DIR%"
:endDetectBaseDir
IF NOT EXIST "%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config" goto endReadAdditionalConfig
@setlocal EnableExtensions EnableDelayedExpansion
for /F "usebackq delims=" %%a in ("%MAVEN_PROJECTBASEDIR%\.mvn\jvm.config") do set JVM_CONFIG_MAVEN_PROPS=!JVM_CONFIG_MAVEN_PROPS! %%a
@endlocal & set JVM_CONFIG_MAVEN_PROPS=%JVM_CONFIG_MAVEN_PROPS%
:endReadAdditionalConfig
SET MAVEN_JAVA_EXE="%JAVA_HOME%\bin\java.exe"
set WRAPPER_JAR="%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.jar"
set WRAPPER_LAUNCHER=org.apache.maven.wrapper.MavenWrapperMain
set DOWNLOAD_URL="https://repo.maven.apache.org/maven2/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar"
FOR /F "tokens=1,2 delims==" %%A IN ("%MAVEN_PROJECTBASEDIR%\.mvn\wrapper\maven-wrapper.properties") DO (
IF "%%A"=="wrapperUrl" SET DOWNLOAD_URL=%%B
)
@REM Extension to allow automatically downloading the maven-wrapper.jar from Maven-central
@REM This allows using the maven wrapper in projects that prohibit checking in binary data.
if exist %WRAPPER_JAR% (
if "%MVNW_VERBOSE%" == "true" (
echo Found %WRAPPER_JAR%
)
) else (
if not "%MVNW_REPOURL%" == "" (
SET DOWNLOAD_URL="%MVNW_REPOURL%/io/takari/maven-wrapper/0.5.6/maven-wrapper-0.5.6.jar"
)
if "%MVNW_VERBOSE%" == "true" (
echo Couldn't find %WRAPPER_JAR%, downloading it ...
echo Downloading from: %DOWNLOAD_URL%
)
powershell -Command "&{"^
"$webclient = new-object System.Net.WebClient;"^
"if (-not ([string]::IsNullOrEmpty('%MVNW_USERNAME%') -and [string]::IsNullOrEmpty('%MVNW_PASSWORD%'))) {"^
"$webclient.Credentials = new-object System.Net.NetworkCredential('%MVNW_USERNAME%', '%MVNW_PASSWORD%');"^
"}"^
"[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12; $webclient.DownloadFile('%DOWNLOAD_URL%', '%WRAPPER_JAR%')"^
"}"
if "%MVNW_VERBOSE%" == "true" (
echo Finished downloading %WRAPPER_JAR%
)
)
@REM End of extension
@REM Provide a "standardized" way to retrieve the CLI args that will
@REM work with both Windows and non-Windows executions.
set MAVEN_CMD_LINE_ARGS=%*
%MAVEN_JAVA_EXE% %JVM_CONFIG_MAVEN_PROPS% %MAVEN_OPTS% %MAVEN_DEBUG_OPTS% -classpath %WRAPPER_JAR% "-Dmaven.multiModuleProjectDirectory=%MAVEN_PROJECTBASEDIR%" %WRAPPER_LAUNCHER% %MAVEN_CONFIG% %*
if ERRORLEVEL 1 goto error
goto end
:error
set ERROR_CODE=1
:end
@endlocal & set ERROR_CODE=%ERROR_CODE%
if not "%MAVEN_SKIP_RC%" == "" goto skipRcPost
@REM check for post script, once with legacy .bat ending and once with .cmd ending
if exist "%HOME%\mavenrc_post.bat" call "%HOME%\mavenrc_post.bat"
if exist "%HOME%\mavenrc_post.cmd" call "%HOME%\mavenrc_post.cmd"
:skipRcPost
@REM pause the script if MAVEN_BATCH_PAUSE is set to 'on'
if "%MAVEN_BATCH_PAUSE%" == "on" pause
if "%MAVEN_TERMINATE_CMD%" == "on" exit %ERROR_CODE%
exit /B %ERROR_CODE%

175
pom.xml

@@ -1,28 +1,20 @@
<?xml version="1.0"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<repositories>
<repository>
<id>jboss</id>
<name>JBoss repository</name>
<url>http://repository.jboss.org/maven2</url>
</repository>
</repositories>
<modelVersion>4.0.0</modelVersion>
<groupId>com.redhat.ncaughey</groupId>
<artifactId>rest-json-quickstart</artifactId>
<groupId>com.redhat.pctsec</groupId>
<artifactId>osh-wrapper-service</artifactId>
<version>1.0.0-SNAPSHOT</version>
<properties>
<compiler-plugin.version>3.10.1</compiler-plugin.version>
<compiler-plugin.version>3.11.0</compiler-plugin.version>
<maven.compiler.release>17</maven.compiler.release>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<quarkus.platform.artifact-id>quarkus-bom</quarkus.platform.artifact-id>
<quarkus.platform.group-id>io.quarkus.platform</quarkus.platform.group-id>
<quarkus.platform.version>2.16.5.Final</quarkus.platform.version>
<quarkus.platform.version>3.1.2.Final</quarkus.platform.version>
<skipITs>true</skipITs>
<surefire-plugin.version>3.0.0-M7</surefire-plugin.version>
<surefire-plugin.version>3.0.0</surefire-plugin.version>
</properties>
<dependencyManagement>
<dependencies>
@@ -33,113 +25,94 @@
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.json/json -->
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.12.1</version>
</dependency> -->
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.12.1</version>
</dependency> -->
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.fabric8</groupId>
<artifactId>tekton-client</artifactId>
<version>6.7.2</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-openshift</artifactId>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20220320</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.postgresql/postgresql -->
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.hibernate.orm/hibernate-core -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
</dependency>
<dependency>
<groupId>org.glassfish.jaxb</groupId>
<artifactId>jaxb-runtime</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<groupId>io.quarkiverse.kerberos</groupId>
<artifactId>quarkus-kerberos</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy-reactive-jackson</artifactId>
<artifactId>quarkus-resteasy-reactive</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-openshift</artifactId>
</dependency>
<dependency>
<groupId>io.quarkiverse.tektonclient</groupId>
<artifactId>quarkus-tekton-client</artifactId>
<version>1.0.1</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-arc</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-validator</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy-reactive-jackson</artifactId>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-jdbc-postgresql</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-hibernate-orm-panache</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-smallrye-openapi</artifactId>
</dependency>
<dependency>
<groupId>info.picocli</groupId>
<artifactId>picocli</artifactId>
<version>4.7.4</version>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-vertx</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-kubernetes-config</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-junit5</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.26</version>
<scope>provided</scope>
</dependency>
<!-- Bean Validation API and RI -->
<dependency>
<groupId>javax.validation</groupId>
<artifactId>validation-api</artifactId>
<version>1.0.0.GA</version>
</dependency>
<!-- https://mvnrepository.com/artifact/jakarta.persistence/jakarta.persistence-api -->
<dependency>
<groupId>jakarta.persistence</groupId>
<artifactId>jakarta.persistence-api</artifactId>
<version>3.1.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.eclipse.microprofile.rest.client/microprofile-rest-client-api -->
<dependency>
<groupId>org.eclipse.microprofile.rest.client</groupId>
<artifactId>microprofile-rest-client-api</artifactId>
<version>3.0.1</version>
</dependency>
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.5.2</version>
</dependency> -->
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.smallrye</groupId>
<artifactId>jandex-maven-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<id>make-index</id>
<goals>
<goal>jandex</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>${quarkus.platform.group-id}</groupId>
<artifactId>quarkus-maven-plugin</artifactId>

0
schema/.gitkeep

30
schema/OffRegScraper.py

@@ -0,0 +1,30 @@
from bs4 import BeautifulSoup
import requests
import re
import csv

results = {}
URL = "https://product-security.pages.redhat.com/offering-registry/"
r = requests.get(URL)
soup = BeautifulSoup(r.text, 'html.parser')
table = soup.find("table")
rows = table.findAll("tr")
for row in rows:
    # Skip the header row
    if row.contents[1].text == 'Offering':
        continue
    # We extract the short name of the offering from the link URL
    re_search = re.search('/offering-registry/offerings/(.*)/', row.contents[1].contents[0].attrs["href"])
    if re_search:
        results[re_search.group(1)] = row.contents[1].contents[0].text
print(results)
with open('offerings.csv', 'w', newline='') as csv_file:
    writer = csv.writer(csv_file)
    for key, value in results.items():
        writer.writerow([key, value])

126
schema/populate.sql

@@ -0,0 +1,126 @@
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-automation-platform','Ansible Automation Platform (AAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('advisor','Insights Advisor');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-aws','Ansible on AWS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-azure','Ansible on Azure');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-on-gcp','Ansible on GCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('ansible-wisdom-service','Ansible Wisdom Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cert-manager','cert-manager Operator for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('compliance','Insights Compliance');
INSERT INTO osh.offerings(offering_id,description) VALUES ('connected-customer-experience','Connected Customer Experience (CCX)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('cost-management','Cost Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('custom-metric-autoscaler','OpenShift Custom Metrics Autoscaler');
INSERT INTO osh.offerings(offering_id,description) VALUES ('developer-sandbox-for-red-hat-openshift','Developer Sandbox for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('dotnet','.NET');
INSERT INTO osh.offerings(offering_id,description) VALUES ('drift','Insights Drift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eclipse-vertx','Red Hat build of Eclipse Vert.x');
INSERT INTO osh.offerings(offering_id,description) VALUES ('edge-management','Edge Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('eventing','Insights Eventing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('fastdatapath','RHEL Fast Datapath');
INSERT INTO osh.offerings(offering_id,description) VALUES ('host-management-services','Host Management Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hosted-control-planes','Hosted Control Planes (Hypershift)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('hybrid-application-console','Hybrid Application Console (HAC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('insights-essential','Insights Essentials');
INSERT INTO osh.offerings(offering_id,description) VALUES ('kernel-module-management','Kernel Module Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('logging-subsystem-for-red-hat-openshift','Logging Subsystem for Red Hat OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('lvms-operator','LVMS Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('malware-detection','Insights Malware Detection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('mgmt-platform','Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-applications','Migration Toolkit for Applications (MTA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-containers','Migration Toolkit for Containers (MTC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-runtimes','Migration Toolkit for Runtimes (MTR)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('migration-toolkit-for-virtualization','Migration Toolkit for Virtualization (MTV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('network-observability-operator','Network Observability Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-healthcheck-operator','Node HealthCheck Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('node-maintenance-operator','Node Maintenance Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('nvidia-gpu-add-on','NVIDIA GPU Add-On');
INSERT INTO osh.offerings(offering_id,description) VALUES ('oadp','OpenShift API for Data Protection');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-platform','Openshift Container Platform (OCP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-container-storage','OpenShift Container Storage (OCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-data-foundation-managed-service','Red Hat OpenShift Data Foundation Managed Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-dedicated','OpenShift Dedicated (OSD/ROSA)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-helm','OpenShift Developer Tools and Services (Helm)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-developer-tools-and-services-jenkins','OpenShift Developer Tools and Services (Jenkins)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-distributed-tracing','OpenShift Distributed Tracing');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-on-azure','Openshift on Azure (ARO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-run-once-duration-override-operator','OpenShift Run Once Duration Override Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-sandboxed-containers','Openshift Sandboxed Containers');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-secondary-scheduler-operator','OpenShift Secondary Scheduler Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-servicemesh','OpenShift Service Mesh');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-virtualization','OpenShift Virtualization (CNV)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-web-terminal-operator','OpenShift Web Terminal Operator');
INSERT INTO osh.offerings(offering_id,description) VALUES ('openshift-winc','Windows Container Support for OpenShift');
INSERT INTO osh.offerings(offering_id,description) VALUES ('patch','Insights Patch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('product-discovery','Product Discovery');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-3scale-api-management-platform','Red Hat 3scale API Management Platform');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-advanced-cluster-management','Red Hat Advanced Cluster Management (RHACM)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-broker','Red Hat AMQ Broker');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-clients','Red Hat AMQ Clients');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-interconnect','Red Hat AMQ Interconnect');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-online','Red Hat AMQ Online');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-amq-streams','Red Hat AMQ Streams');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-apicurio-registry','Red Hat build of Apicurio Registry (formerly known as Integration Service Registry)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-build-quarkus','Red Hat Build of Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-extensions-quarkus','Red Hat Camel Extensions for Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-k','Red Hat Camel K');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-camel-spring-boot','Red Hat Camel for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-ceph-storage','Red Hat Ceph Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certificate-system','Red Hat Certificate System (RHCS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-certification-program','Red Hat Certification Program (rhcertification)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-code-quarkus','Red Hat Code Quarkus');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-core-os','Red Hat CoreOS');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-data-grid','Red Hat Data Grid');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-debezium','Red Hat Debezium');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-decision-manager','Red Hat Decision Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-hub','Red Hat Developer Hub');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-developer-toolset','Red Hat Developer Toolset (DTS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-devtools-compilers','Red Hat Developer Tools (DevTools Compilers)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-directory-server','Red Hat Directory Server (RHDS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-10','Red Hat Enterprise Linux (RHEL) 10');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-6','Red Hat Enterprise Linux (RHEL) 6');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-7','Red Hat Enterprise Linux (RHEL) 7');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-8','Red Hat Enterprise Linux (RHEL) 8');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-enterprise-linux-9','Red Hat Enterprise Linux (RHEL) 9');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-fuse','Red Hat Fuse');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-gluster-storage','Red Hat Gluster Storage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-in-vehicle-os','Red Hat In-Vehicle Operating System (RHIVOS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-core-services','Red Hat JBoss Core Services');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-eap','Red Hat JBoss Enterprise Application Platform (EAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-jboss-web-server','Red Hat JBoss Web Server');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-observability-service','Red Hat Observability Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-database-access','Red Hat OpenShift Database Access');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-open-shift-data-science','Red Hat OpenShift Data Science (RHODS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openjdk','Red Hat OpenJDK');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-api-management','Red Hat OpenShift API Management');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-builds-v2','Red Hat OpenShift Builds V2');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-connectors','Red Hat OpenShift Connectors (RHOC)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-control-plane-service','Red Hat OpenShift Control Plane Service');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-data-foundation','Red Hat OpenShift Data Foundation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-dev-spaces','Red Hat OpenShift Dev Spaces');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-gitops','Red Hat OpenShift GitOps');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-local','Red Hat OpenShift Local');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-pipelines','Red Hat OpenShift Pipelines');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-serverless','Red Hat OpenShift Serverless');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-service-registry','Red Hat OpenShift Service Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openshift-streams-apache-kafka','Red Hat OpenShift Streams for Apache Kafka (RHOSAK)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-openstack-platform','Red Hat OpenStack Platform (RHOSP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-optaplanner','Red Hat Optaplanner');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-plug-ins-for-backstage','Red Hat Plug-ins for Backstage');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-process-automation-manager','Red Hat Process Automation Manager');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quarkus-registry','Red Hat Quarkus Registry');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-quay','Red Hat Quay');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-satellite','Red Hat Satellite');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-service-interconnect','Red Hat Service Interconnect (formerly known as Application Interconnect)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-single-sign-on','Red Hat Single Sign-On (RHSSO)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-software-collections','Red Hat Software Collections');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-support-for-spring-boot','Red Hat support for Spring Boot');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-trusted-application-pipeline','Red Hat Trusted Application Pipeline (RHTAP)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-update-infrastructure','Red Hat Update Infrastructure (RHUI)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('red-hat-virtualization','Red Hat Virtualization');
INSERT INTO osh.offerings(offering_id,description) VALUES ('resource-optimization','Insights Resource Optimization (ROS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rh-vulnerability-for-ocp','Insights Vulnerability for OCP');
INSERT INTO osh.offerings(offering_id,description) VALUES ('rhacs','Red Hat Advanced Cluster Security for Kubernetes (RHACS)');
INSERT INTO osh.offerings(offering_id,description) VALUES ('self-node-remediation','Self Node Remediation');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-central','Subscription Central');
INSERT INTO osh.offerings(offering_id,description) VALUES ('subscription-watch','Subscription Watch');
INSERT INTO osh.offerings(offering_id,description) VALUES ('telco-sw-components','Telco SW Components');
INSERT INTO osh.offerings(offering_id,description) VALUES ('vulnerability','Vulnerability');

81
schema/schema.sql

@ -0,0 +1,81 @@
CREATE SCHEMA osh;
GRANT USAGE ON SCHEMA osh TO postgres;
CREATE TABLE IF NOT EXISTS osh.offerings(
offering_id VARCHAR(100),
description VARCHAR(200),
PRIMARY KEY (offering_id)
);
CREATE TABLE IF NOT EXISTS osh.results(
results_id SERIAL,
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
state BOOLEAN,
logs bytea,
task_reference VARCHAR(50),
PRIMARY KEY (results_id)
);
CREATE TABLE IF NOT EXISTS osh.scans(
scan_id SERIAL,
offering_id VARCHAR(100),
event_id VARCHAR(100) NOT NULL,
is_managed_service BOOLEAN NOT NULL,
component_list VARCHAR(100),
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
owner VARCHAR(50) NOT NULL,
results SERIAL,
status VARCHAR (50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
PRIMARY KEY(scan_id),
FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
FOREIGN KEY (results) REFERENCES osh.results(results_id)
);
CREATE TABLE IF NOT EXISTS osh.archive(
scan_id SERIAL,
offering_id VARCHAR(100),
event_id VARCHAR(100) NOT NULL,
is_managed_service BOOLEAN NOT NULL,
component_list VARCHAR(100),
datetime TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
owner VARCHAR(50) NOT NULL,
results SERIAL,
status VARCHAR (50) CONSTRAINT valid_status CHECK(status in ('PENDING', 'DELETED', 'COMPLETED', 'IN PROGRESS')),
last_updated TIMESTAMP WITHOUT TIME ZONE DEFAULT (NOW() AT TIME ZONE 'utc') NOT NULL,
PRIMARY KEY(scan_id),
FOREIGN KEY (offering_id) REFERENCES osh.offerings(offering_id),
FOREIGN KEY (results) REFERENCES osh.results(results_id)
);
CREATE TABLE IF NOT EXISTS osh.gitscans (
id SERIAL,
build_system_type VARCHAR(80),
repository VARCHAR(150),
reference VARCHAR(100),
commit_id VARCHAR(100),
-- A SHA-256 digest is 256 bits, which is 64 hex characters
hashsum VARCHAR(64),
PRIMARY KEY(id)
);
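The 64-character sizing of the `hashsum` column can be sanity-checked against Python's `hashlib` (a quick sketch for illustration; the input bytes are arbitrary):

```python
import hashlib

# A SHA-256 digest is 32 bytes = 256 bits; its hex encoding is 64 characters,
# which is why the hashsum column is VARCHAR(64).
digest = hashlib.sha256(b"example-tarball-contents").hexdigest()
print(len(digest))  # 64
```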
CREATE TABLE IF NOT EXISTS osh.pncscans(
id SERIAL,
build_system_type VARCHAR(80),
build_id VARCHAR(100),
PRIMARY KEY(id)
);
CREATE TABLE IF NOT EXISTS osh.brewscans(
id SERIAL,
build_system_type VARCHAR(80),
brew_id VARCHAR(100),
brew_nvr VARCHAR(100),
pnc_id VARCHAR(100),
artifact_type VARCHAR(100),
file_name VARCHAR(100),
built_from_source BOOLEAN,
PRIMARY KEY(id)
);

12
src/main/docker/Dockerfile.jvm

@ -7,18 +7,20 @@
#
# Then, build the image with:
#
# docker build -f src/main/docker/Dockerfile.jvm -t quarkus/rest-json-quickstart-jvm .
# docker build -f src/main/docker/Dockerfile.jvm -t quarkus/osh-wrapper-service-jvm .
#
# Then run the container using:
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart-jvm
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service-jvm
#
# If you want to include the debug port into your docker image
# you will have to expose the debug port (default 5005) like this : EXPOSE 8080 5005
# you will have to expose the debug port (5005 being the default) like this : EXPOSE 8080 5005.
# Additionally you will have to set -e JAVA_DEBUG=true and -e JAVA_DEBUG_PORT=*:5005
# when running the container
#
# Then run the container using :
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart-jvm
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service-jvm
#
# This image uses the `run-java.sh` script to run the application.
# This script computes the command line to execute your Java application, and
@ -75,7 +77,7 @@
# accessed directly. (example: "foo.example.com,bar.example.com")
#
###
FROM registry.access.redhat.com/ubi8/openjdk-17:1.14
FROM registry.access.redhat.com/ubi8/openjdk-17:1.15
ENV LANGUAGE='en_US:en'

12
src/main/docker/Dockerfile.legacy-jar

@ -7,18 +7,20 @@
#
# Then, build the image with:
#
# docker build -f src/main/docker/Dockerfile.legacy-jar -t quarkus/rest-json-quickstart-legacy-jar .
# docker build -f src/main/docker/Dockerfile.legacy-jar -t quarkus/osh-wrapper-service-legacy-jar .
#
# Then run the container using:
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart-legacy-jar
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service-legacy-jar
#
# If you want to include the debug port into your docker image
# you will have to expose the debug port (default 5005) like this : EXPOSE 8080 5005
# you will have to expose the debug port (5005 being the default) like this : EXPOSE 8080 5005.
# Additionally you will have to set -e JAVA_DEBUG=true and -e JAVA_DEBUG_PORT=*:5005
# when running the container
#
# Then run the container using :
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart-legacy-jar
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service-legacy-jar
#
# This image uses the `run-java.sh` script to run the application.
# This script computes the command line to execute your Java application, and
@ -75,7 +77,7 @@
# accessed directly. (example: "foo.example.com,bar.example.com")
#
###
FROM registry.access.redhat.com/ubi8/openjdk-17:1.14
FROM registry.access.redhat.com/ubi8/openjdk-17:1.15
ENV LANGUAGE='en_US:en'

4
src/main/docker/Dockerfile.native

@ -7,11 +7,11 @@
#
# Then, build the image with:
#
# docker build -f src/main/docker/Dockerfile.native -t quarkus/rest-json-quickstart .
# docker build -f src/main/docker/Dockerfile.native -t quarkus/osh-wrapper-service .
#
# Then run the container using:
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service
#
###
FROM registry.access.redhat.com/ubi8/ubi-minimal:8.6

4
src/main/docker/Dockerfile.native-micro

@ -10,11 +10,11 @@
#
# Then, build the image with:
#
# docker build -f src/main/docker/Dockerfile.native-micro -t quarkus/rest-json-quickstart .
# docker build -f src/main/docker/Dockerfile.native-micro -t quarkus/osh-wrapper-service .
#
# Then run the container using:
#
# docker run -i --rm -p 8080:8080 quarkus/rest-json-quickstart
# docker run -i --rm -p 8080:8080 quarkus/osh-wrapper-service
#
###
FROM quay.io/quarkus/quarkus-micro-image:2.0

37
src/main/java/com/redhat/pctsec/model/BrewBuild.java

@ -0,0 +1,37 @@
package com.redhat.pctsec.model;

import jakarta.persistence.Entity;
import org.eclipse.microprofile.openapi.annotations.media.Schema;

import java.net.URI;
import java.net.URL;

@Entity
public class BrewBuild extends BuildType {

    public BrewBuild(String buildRef) {
        super(buildRef);
    }

    public BrewBuild() {
        super();
    }

    @Override
    public URI SCMURL() {
        return null;
    }

    @Override
    public URL URL() {
        return null;
    }

    @Override
    public String revision() {
        return null;
    }
}

35
src/main/java/com/redhat/pctsec/model/BuildType.java

@ -0,0 +1,35 @@
package com.redhat.pctsec.model;

import com.fasterxml.jackson.annotation.JsonProperty;
import jakarta.persistence.*;

import java.net.URI;
import java.net.URL;
import java.util.UUID;

@Entity
@DiscriminatorColumn(name = "REF_TYPE")
abstract public class BuildType {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private UUID id;

    @JsonProperty()
    @Column(name = "buildref")
    public String buildRef;

    public BuildType(String buildRef) {
        this.buildRef = buildRef;
    }

    public BuildType() {
    }

    // This is the git URL of the sources
    abstract public URI SCMURL();

    // This is the URL of the build
    abstract public URL URL();

    abstract public String revision();
}

29
src/main/java/com/redhat/pctsec/model/Git.java

@ -0,0 +1,29 @@
package com.redhat.pctsec.model;

import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import org.eclipse.microprofile.openapi.annotations.media.Schema;

import java.net.URI;
import java.util.UUID;

@Entity
public class Git {

    public Git() {
        super();
    }

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private UUID id;

    public URI repo;
    public String ref;

    public Git(String repo, String ref) {
        this.repo = URI.create(repo);
        this.ref = ref;
    }
}

39
src/main/java/com/redhat/pctsec/model/PNCBuild.java

@ -0,0 +1,39 @@
package com.redhat.pctsec.model;

import jakarta.persistence.Entity;

import java.net.URI;
import java.net.URL;

@Entity
public class PNCBuild extends BuildType {

    public PNCBuild() {
        super();
    }

    public PNCBuild(String buildRef) {
        super(buildRef);
    }

    @Override
    public URI SCMURL() {
        return null;
    }

    @Override
    public URL URL() {
        return null;
    }

    @Override
    public String revision() {
        return null;
    }

    public static boolean isValidRef(String ref) {
        // New-style PNC refs are exactly 14 characters long
        return ref.length() == 14;
    }
}
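`isValidRef` accepts only 14-character identifiers. The same length check, sketched in Python for illustration (the sample refs are invented, not real PNC build IDs):

```python
def is_valid_pnc_ref(ref: str) -> bool:
    # Mirrors PNCBuild.isValidRef: new-style PNC build refs are 14 characters
    return len(ref) == 14

print(is_valid_pnc_ref("A" * 14))  # True
print(is_valid_pnc_ref("12345"))   # False
```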

3
src/main/java/com/redhat/pctsec/model/RequestType.java

@ -0,0 +1,3 @@
package com.redhat.pctsec.model;
public enum RequestType {BREW, PNC, GIT}

116
src/main/java/com/redhat/pctsec/model/Scan.java

@ -0,0 +1,116 @@
package com.redhat.pctsec.model;
import com.fasterxml.jackson.annotation.JsonIgnore;
import jakarta.persistence.*;
import jakarta.transaction.Transactional;
import jakarta.validation.constraints.Email;
import jakarta.validation.constraints.NotNull;
import org.hibernate.annotations.CreationTimestamp;
import org.hibernate.annotations.UpdateTimestamp;
import java.time.Instant;
import java.util.UUID;
enum ScanState {
    CREATED, TRIGGERED, RUNNING, SUCCESS, FAIL;
}

@Entity
public class Scan {

    public Scan() {
        this.scanRequests = new ScanRequests();
    }

    public Instant getCreationTimestamp() {
        return creationTimestamp;
    }

    public void setCreationTimestamp(Instant creationTimestamp) {
        this.creationTimestamp = creationTimestamp;
    }

    public ScanState getState() {
        return state;
    }

    public void setState(ScanState state) {
        this.state = state;
    }

    public String getProductName() {
        return productName;
    }

    public void setProductName(String productName) {
        this.productName = productName;
    }

    public String getRequestor() {
        return requestor;
    }

    public void setRequestor(String requestor) {
        this.requestor = requestor;
    }

    public String getEmail() {
        return email;
    }

    public void setEmail(String email) {
        this.email = email;
    }

    public ScanRequests getScanRequests() {
        return scanRequests;
    }

    public void setScanRequests(ScanRequests scanRequests) {
        this.scanRequests = scanRequests;
    }

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    public UUID id;

    /*
    @OneToOne
    @NotNull
    @JoinColumn(name = "product_id", referencedColumnName = "id")
    private String productName;
    */
    @Column(name = "product_name")
    private String productName;

    //@Temporal(TemporalType.TIMESTAMP)
    @CreationTimestamp
    @JsonIgnore
    @Column(name = "creation_timestamp")
    //@NotNull
    private Instant creationTimestamp;

    @UpdateTimestamp
    @JsonIgnore
    @Column(name = "update_timestamp")
    //@NotNull
    private Instant updateTimestamp;

    @Column(name = "state")
    @Enumerated(EnumType.STRING)
    private ScanState state;

    @Column(name = "requestor")
    @NotNull
    private String requestor;

    @Column(name = "report_email")
    @Email
    private String email;

    @OneToOne(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
    @JoinColumn(name = "scan_requests_id", referencedColumnName = "id")
    public ScanRequests scanRequests;
}

110
src/main/java/com/redhat/pctsec/model/ScanRequest.java

@ -0,0 +1,110 @@
package com.redhat.pctsec.model;

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.redhat.pctsec.model.api.request.git;
import com.redhat.pctsec.tekton.brewTaskRun;
import com.redhat.pctsec.tekton.scmUrlPipelineRun;
import io.vertx.mutiny.core.eventbus.EventBus;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.inject.Singleton;
import jakarta.persistence.*;

import java.util.HashMap;
import java.util.UUID;

@ApplicationScoped
@Entity
public class ScanRequest {

    @Id
    @GeneratedValue
    protected UUID id;

    private String metadata;
    private String oshScanOptions;

    public EventBus getBus() {
        return bus;
    }

    public void setBus(EventBus bus) {
        this.bus = bus;
    }

    @Transient
    @JsonIgnore
    @Inject
    EventBus bus;

    public RequestType getType() {
        return type;
    }

    private RequestType type;

    @OneToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
    @JoinColumn(name = "brew_build_id", referencedColumnName = "id")
    @JsonInclude(JsonInclude.Include.NON_NULL)
    public BrewBuild brewBuild;

    @OneToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
    @JoinColumn(name = "pnc_build_id", referencedColumnName = "id")
    @JsonInclude(JsonInclude.Include.NON_NULL)
    public PNCBuild pncBuild;

    @OneToOne(fetch = FetchType.LAZY, cascade = CascadeType.ALL)
    @JoinColumn(name = "git_id", referencedColumnName = "id")
    @JsonInclude(JsonInclude.Include.NON_NULL)
    public Git git;

    public String getOshScanOptions() {
        return oshScanOptions;
    }

    public void setOshScanOptions(String oshScanOptions) {
        this.oshScanOptions = oshScanOptions;
    }

    public String getScanProperties() {
        return scanProperties;
    }

    public void setScanProperties(String scanProperties) {
        this.scanProperties = scanProperties;
    }

    @Column(name = "scan_properties")
    public String scanProperties;

    public ScanRequest() {
    }

    public ScanRequest(BrewBuild brewBuild) {
        this.type = RequestType.BREW;
        this.brewBuild = brewBuild;
    }

    public ScanRequest(PNCBuild pncBuild) {
        this.type = RequestType.PNC;
        this.pncBuild = pncBuild;
    }

    public ScanRequest(Git git) {
        this.type = RequestType.GIT;
        this.git = git;
    }

    public ScanRequest(String repo, String ref) {
        this.type = RequestType.GIT;
        this.git = new Git(repo, ref);
    }

    public ScanTask executeScan() {
        ScanTask st = new ScanTask(this);
        st.execute();
        return st;
    }
}

111
src/main/java/com/redhat/pctsec/model/ScanRequests.java

@ -0,0 +1,111 @@
package com.redhat.pctsec.model;

import com.redhat.pctsec.model.api.request.pssaas;
import com.redhat.pctsec.model.api.request.scanChain;
import io.vertx.mutiny.core.eventbus.EventBus;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.persistence.*;

import java.util.*;
import java.util.stream.Collectors;

@ApplicationScoped
@Entity
@Table(name = "ScanRequests")
public class ScanRequests {

    @Id
    @GeneratedValue
    protected UUID id;

    @OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    @JoinColumn(name = "scan_request_id", referencedColumnName = "id")
    private Set<ScanRequest> scanRequests;

    @Column(name = "scan_properties")
    private String globalScanProperties;

    @Column(name = "scan_metadata")
    private String scanMetadata;

    public ScanRequests() {
        // Default to the Snyk scan
        this.globalScanProperties = "-p snyk-only-unstable --tarball-build-script=\":\"";
        this.scanRequests = new HashSet<>();
    }

    public ScanRequests(pssaas pssaas) {
        this();
        pssaas.componentList.stream().filter(c -> c.getType().equals("git")).forEach(g -> this.addGit(g.getRepo().toString(), g.getRef()));
        pssaas.componentList.stream().filter(c -> c.getType().equals("brew")).forEach(b -> this.addBrewBuild(b.getBuildId()));
        pssaas.componentList.stream().filter(c -> c.getType().equals("pnc")).forEach(p -> this.addPNCBuild(p.getBuildId()));
    }

    public ScanRequests(scanChain scanchain) {
        this();
    }

    public void addBrewBuild(String brewBuildId) {
        scanRequests.add(new ScanRequest(new BrewBuild(brewBuildId)));
    }

    public void addGit(String repo, String rev) {
        scanRequests.add(new ScanRequest(new Git(repo, rev)));
    }

    public void addPNCBuild(String pncBuildId) {
        scanRequests.add(new ScanRequest(new PNCBuild(pncBuildId)));
    }

    // Create a Tekton pipeline/task run for every request
    public List<ScanTask> execute(EventBus eventBus) {
        scanRequests.forEach(s -> s.setBus(eventBus));
        return scanRequests.stream().map(ScanRequest::executeScan).collect(Collectors.toList());
    }

    public Set<ScanRequest> getScanRequests() {
        return scanRequests;
    }

    public void setScanRequests(Set<ScanRequest> scanRequests) {
        this.scanRequests = scanRequests;
    }

    public String getGlobalScanProperties() {
        return globalScanProperties;
    }

    public void setGlobalScanProperties(String globalScanProperties) {
        this.globalScanProperties = globalScanProperties;
    }

    public String getScanMetadata() {
        return scanMetadata;
    }

    public void setScanMetadata(String scanMetadata) {
        this.scanMetadata = scanMetadata;
    }
}
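The `ScanRequests(pssaas)` constructor fans a mixed component list out into per-type requests by filtering on the component type. The same fan-out, sketched in Python with invented component tuples:

```python
# Hypothetical mixed component list: (type, reference) pairs
components = [
    ("git", "https://example.com/app.git"),
    ("brew", "2212561"),
    ("pnc", "A6DCW3KTGWYAA1"),
    ("git", "https://example.com/lib.git"),
]

# Mirror the three filtered streams in the ScanRequests(pssaas) constructor:
# each type bucket becomes its own kind of ScanRequest
git_refs = [ref for kind, ref in components if kind == "git"]
brew_ids = [ref for kind, ref in components if kind == "brew"]
pnc_ids = [ref for kind, ref in components if kind == "pnc"]

print(len(git_refs), len(brew_ids), len(pnc_ids))  # 2 1 1
```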

19
src/main/java/com/redhat/pctsec/model/ScanResult.java

@ -0,0 +1,19 @@
package com.redhat.pctsec.model;

import java.net.URI;
import java.net.URL;

public class ScanResult {

    public URL covScanTask;

    // Store files in document store
    private void storeResults() {
    }

    private void fetchResults() {
    }
}

78
src/main/java/com/redhat/pctsec/model/ScanTask.java

@ -0,0 +1,78 @@
package com.redhat.pctsec.model;

import com.fasterxml.jackson.annotation.JsonIgnore;
import io.vertx.core.eventbus.impl.EventBusImpl;
import io.vertx.mutiny.core.eventbus.EventBus;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.context.Dependent;
import jakarta.inject.Inject;
import jakarta.persistence.*;

import java.util.UUID;

@Entity
@ApplicationScoped
public class ScanTask {

    @Id
    @GeneratedValue
    protected UUID id;

    @JsonIgnore
    @Transient
    @Inject
    EventBus bus;

    public ScanTaskState state;

    public void setTektonRunId(String tektonRunId) {
        this.tektonRunId = tektonRunId;
    }

    public String tektonRunId;

    @OneToOne(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    @JoinColumn(name = "scan_result_id", referencedColumnName = "id")
    public ScanRequest scanRequest;

    public ScanTask(ScanRequest scanRequest) {
        this();
        this.scanRequest = scanRequest;
        this.bus = scanRequest.getBus();
        //this.bus = new EventBus(new EventBusImpl());
    }

    public ScanTask() {
    }

    public void execute() {
        bus.publish("tekton", this);
    }

    public ScanTaskState getState() {
        return state;
    }

    public void setState(ScanTaskState state) {
        this.state = state;
    }

    public ScanRequest getScanRequest() {
        return scanRequest;
    }

    public void setScanRequest(ScanRequest scanRequest) {
        this.scanRequest = scanRequest;
    }
}

3
src/main/java/com/redhat/pctsec/model/ScanTaskState.java

@ -0,0 +1,3 @@
package com.redhat.pctsec.model;
public enum ScanTaskState {AWAIT, TRIGGERED, RUNNING, SUCCESS, FAILURE}

17
src/main/java/com/redhat/pctsec/model/api/request/Component.java

@ -0,0 +1,17 @@
package com.redhat.pctsec.model.api.request;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import java.net.URI;
@JsonDeserialize(using = ComponentJsonDeserializer.class)
public interface Component {
public String getType();
public String getBuildId();
public URI getRepo();
public String getRef();
}

31
src/main/java/com/redhat/pctsec/model/api/request/ComponentJsonDeserializer.java

@ -0,0 +1,31 @@
package com.redhat.pctsec.model.api.request;
import com.fasterxml.jackson.core.JacksonException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import java.io.IOException;
import java.net.URI;
public class ComponentJsonDeserializer extends JsonDeserializer<Component> {
@Override
public Component deserialize(JsonParser jsonParser, DeserializationContext deserializationContext) throws IOException, JacksonException {
JsonNode node = jsonParser.readValueAsTree();
JsonNode componentT = node.get("type");
// Fail fast when the "type" discriminator is missing instead of throwing an NPE later
if(componentT == null)
{
throw new IOException("component is missing the \"type\" field");
}
if(componentT.asText().equals("git"))
{
URI repo = URI.create(node.get("repo").asText());
String ref = node.get("ref").asText();
return new git(repo, ref);
}
else
{
return new build(componentT.asText(), node.get("build-id").asText());
}
}
}
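ComponentJsonDeserializer picks the concrete Component by the "type" field: "git" yields a git component, anything else a build reference. A stdlib-only sketch of that dispatch (simplified placeholder records standing in for the real classes, no Jackson required):

```java
import java.util.Map;

public class ComponentDispatchSketch {
    // Hypothetical stand-ins for the git and build component classes
    record GitComponent(String repo, String ref) {}
    record BuildComponent(String type, String buildId) {}

    // Mirrors ComponentJsonDeserializer: "git" selects a git component,
    // any other type is treated as a build reference
    static Object dispatch(Map<String, String> node) {
        String type = node.get("type");
        if ("git".equals(type)) {
            return new GitComponent(node.get("repo"), node.get("ref"));
        }
        return new BuildComponent(type, node.get("build-id"));
    }

    public static void main(String[] args) {
        Object a = dispatch(Map.of("type", "git", "repo", "https://example.com/r.git", "ref", "main"));
        Object b = dispatch(Map.of("type", "brew", "build-id", "xterm-366-8.el9"));
        System.out.println(a.getClass().getSimpleName()); // GitComponent
        System.out.println(b.getClass().getSimpleName()); // BuildComponent
    }
}
```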

49
src/main/java/com/redhat/pctsec/model/api/request/build.java

@ -0,0 +1,49 @@
package com.redhat.pctsec.model.api.request;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
import jakarta.validation.constraints.NotNull;
import java.net.URI;
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"type",
"build-id"
})
public class build implements Component {
public final String type;
public final String buildId;
public build(@NotNull String type, @NotNull String buildId) {
this.type = type;
this.buildId = buildId;
}
@Override
@NotNull
@JsonProperty("type")
public String getType() {
return this.type;
}
@NotNull
@JsonProperty("build-id")
@Override
public String getBuildId() {
return this.buildId;
}
@Override
public URI getRepo() {
return URI.create("");
}
@Override
public String getRef() {
return "";
}
}

53
src/main/java/com/redhat/pctsec/model/api/request/git.java

@ -0,0 +1,53 @@
package com.redhat.pctsec.model.api.request;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
import jakarta.validation.constraints.NotNull;
import java.net.URI;
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
"type",
"repo",
"ref"
})
public class git implements Component{
@NotNull
@JsonProperty("type")
// Instance field so Jackson serializes it; Jackson skips static fields by default
public final String type = "git";
@NotNull
@JsonProperty("repo")
public URI repo;
@NotNull
@JsonProperty("ref")
public String ref;
public git(@NotNull URI repo, @NotNull String ref) {
this.repo = repo;
this.ref = ref;
}
@Override
public String getType() {
return this.type;
}
@Override
public String getBuildId() {
return "";
}
@Override
public URI getRepo() {
return this.repo;
}
@Override
public String getRef() {
return this.ref;
}
}

70
src/main/java/com/redhat/pctsec/model/api/request/pssaas.java

@ -0,0 +1,70 @@
package com.redhat.pctsec.model.api.request;
import java.util.Set;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyDescription;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import jakarta.validation.Valid;
import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;
public class pssaas {
/**
* The product ID associated with the scan.
* (Required)
*
*/
@JsonProperty("product-id")
@JsonPropertyDescription("The product ID associated with the scan.")
@NotNull
public String productId;
/**
* The submission event ID associated with the scan.
*
*/
@JsonProperty("event-id")
@JsonPropertyDescription("The submission event ID associated with the scan.")
public String eventId;
/**
* Indicates whether or not the product is a managed service.
* (Required)
*
*/
@JsonProperty("is-managed-service")
@JsonPropertyDescription("Indicates whether or not the product is a managed service.")
@NotNull
public Boolean isManagedService;
/**
* The version of CPaaS that submitted the scan.
*
*/
@JsonProperty("cpaas-version")
@JsonPropertyDescription("The version of CPaaS that submitted the scan.")
public String cpaasVersion;
/**
* URL of Jenkins job that submitted the scan.
*
*/
@JsonProperty("job-url")
@JsonPropertyDescription("URL of Jenkins job that submitted the scan.")
public String jobUrl;
/**
* List of components to be scanned.
* (Required)
*
*/
@JsonProperty("component-list")
@JsonDeserialize(as = java.util.LinkedHashSet.class)
@JsonPropertyDescription("List of components to be scanned.")
@Size(min = 1)
@Valid
@NotNull
public Set<Component> componentList;
}
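A minimal request body matching the pssaas annotations would look like the following sketch. The JSON keys come from the @JsonProperty annotations above; all values are illustrative:

```java
public class PssaasPayloadExample {
    // Keys taken from the @JsonProperty annotations on pssaas; values are placeholders
    static final String BODY = """
            {
              "product-id": "example-product",
              "is-managed-service": false,
              "component-list": [
                { "type": "git", "repo": "https://example.com/repo.git", "ref": "main" },
                { "type": "brew", "build-id": "xterm-366-8.el9" }
              ]
            }
            """;

    public static void main(String[] args) {
        // The three @NotNull fields must always be present
        for (String required : new String[]{"product-id", "is-managed-service", "component-list"}) {
            if (!BODY.contains("\"" + required + "\"")) {
                throw new IllegalStateException("missing required field: " + required);
            }
        }
        System.out.println("payload carries all required fields");
    }
}
```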

4
src/main/java/com/redhat/pctsec/model/api/request/scanChain.java

@ -0,0 +1,4 @@
package com.redhat.pctsec.model.api.request;
public class scanChain {
}

16
src/main/java/com/redhat/pctsec/model/jpa/ScanRepository.java

@ -0,0 +1,16 @@
package com.redhat.pctsec.model.jpa;
import com.redhat.pctsec.model.Scan;
import io.quarkus.hibernate.orm.panache.PanacheRepositoryBase;
import jakarta.enterprise.context.ApplicationScoped;
import java.util.UUID;
@ApplicationScoped
public class ScanRepository implements PanacheRepositoryBase<Scan, UUID> {
// Blocking Panache: firstResult() returns the entity itself, not a Uni
public Scan findByProduct(String product)
{
return find("product", product).firstResult();
}
}

18
src/main/java/com/redhat/pctsec/model/jpa/ScanRequestRepository.java

@ -0,0 +1,18 @@
package com.redhat.pctsec.model.jpa;
import com.redhat.pctsec.model.ScanRequest;
import io.quarkus.hibernate.orm.panache.PanacheRepositoryBase;
import jakarta.enterprise.context.ApplicationScoped;
import java.util.UUID;
@ApplicationScoped
public class ScanRequestRepository implements PanacheRepositoryBase<ScanRequest, UUID> {
// Blocking Panache: firstResult() returns the entity itself, not a Uni
public ScanRequest findByProduct(String product)
{
return find("product", product).firstResult();
}
}

18
src/main/java/com/redhat/pctsec/model/jpa/ScanRequestsRepository.java

@ -0,0 +1,18 @@
package com.redhat.pctsec.model.jpa;
import com.redhat.pctsec.model.ScanRequests;
import io.quarkus.hibernate.orm.panache.PanacheRepositoryBase;
import jakarta.enterprise.context.ApplicationScoped;
import java.util.UUID;
@ApplicationScoped
public class ScanRequestsRepository implements PanacheRepositoryBase<ScanRequests, UUID> {
// The repository's entity type is ScanRequests; Uni<Scan> was a copy-paste slip
public ScanRequests findByProduct(String product)
{
return find("product", product).firstResult();
}
}

22
src/main/java/com/redhat/pctsec/model/jpa/UriConverter.java

@ -0,0 +1,22 @@
package com.redhat.pctsec.model.jpa;
import jakarta.persistence.AttributeConverter;
import jakarta.persistence.Converter;
import java.net.URI;
@Converter(autoApply = true)
public class UriConverter implements AttributeConverter<URI, String>
{
@Override
public String convertToDatabaseColumn(URI uri) {
return (uri == null) ? null : uri.toString();
}
@Override
public URI convertToEntityAttribute(String s) {
// Guard against null as well as empty columns
return (s == null || s.isBlank()) ? null : URI.create(s.trim());
}
}
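The converter's round-trip logic can be exercised in isolation. This stdlib-only sketch reproduces both directions, including the null/empty guard the entity-attribute side needs:

```java
import java.net.URI;

public class UriConversionSketch {
    // URI -> column value (mirrors convertToDatabaseColumn)
    static String toColumn(URI uri) {
        return (uri == null) ? null : uri.toString();
    }

    // Column value -> URI, guarding against null as well as blank strings
    static URI toAttribute(String s) {
        return (s == null || s.isBlank()) ? null : URI.create(s.trim());
    }

    public static void main(String[] args) {
        URI repo = URI.create("https://example.com/repo.git");
        // Round trip preserves the value
        System.out.println(repo.equals(toAttribute(toColumn(repo))));
        // Null and blank columns map to a null attribute instead of throwing
        System.out.println(toAttribute(null) == null && toAttribute("  ") == null);
    }
}
```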

72
src/main/java/com/redhat/pctsec/model/osh/paramMapper.java

@ -0,0 +1,72 @@
package com.redhat.pctsec.model.osh;
import picocli.CommandLine;
import picocli.CommandLine.Option;
public class paramMapper {
@Option(names = {"-p", "--profile"}, description = "scanning profile to use")
private String profile;
@Option(names = {"-a", "--analyzer"}, description = "list of analyzers to use (see command 'list-\n" +
" analyzers'); use comma as a separator: e.g. \"\n" +
" --analyzer=gcc,clang,cppcheck\"")
private String analyzers;
@Option(names = {"--tarball-build-script"}, description = "With this option osh-cli accepts path to\n" +
" tarball specified via first argument and then\n" +
" the tarball will be scanned. This option sets\n" +
" command which should build the package,\n" +
" usually this should be just \"make\", in case\n" +
" of packages which doesn't need to be built,\n" +
" just pass \"true\".\n")
private String tarballBuildScript;
@Option(names = {"--brew-build"}, description = "use a brew build (specified by NVR) instead\n" +
" of a local file")
private String brewBuild;
public paramMapper(){}
public paramMapper(String params){
new CommandLine(this).parseArgs(params.split("\\s+"));
}
public String getProfile() {
return profile;
}
public void setProfile(String profile) {
this.profile = profile;
}
public String getAnalyzers() {
return analyzers;
}
public void setAnalyzers(String analyzers) {
this.analyzers = analyzers;
}
public String getTarballBuildScript() {
return tarballBuildScript;
}
public void setTarballBuildScript(String tarballBuildScript) {
this.tarballBuildScript = tarballBuildScript;
}
public String getBrewBuild() {
return brewBuild;
}
public void setBrewBuild(String brewBuild) {
this.brewBuild = brewBuild;
}
}
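paramMapper hands the raw scan-properties string to picocli after a whitespace split. The effect of that split-and-bind step can be sketched with the stdlib alone (a deliberately simplified stand-in for picocli, handling `--option value` pairs only):

```java
import java.util.HashMap;
import java.util.Map;

public class ParamSplitSketch {
    // Simplified stand-in for CommandLine.parseArgs: split on whitespace,
    // then pair each --option (or -o) with the token that follows it
    static Map<String, String> parse(String params) {
        String[] tokens = params.trim().split("\\s+");
        Map<String, String> options = new HashMap<>();
        for (int i = 0; i + 1 < tokens.length; i += 2) {
            if (!tokens[i].startsWith("-")) {
                throw new IllegalArgumentException("Invalid OSH Parameter: " + tokens[i]);
            }
            options.put(tokens[i].replaceFirst("^-+", ""), tokens[i + 1]);
        }
        return options;
    }

    public static void main(String[] args) {
        Map<String, String> opts = parse("--profile default --analyzer gcc,clang");
        System.out.println(opts.get("profile"));   // default
        System.out.println(opts.get("analyzer"));  // gcc,clang
    }
}
```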

27
src/main/java/com/redhat/pctsec/rest/v1alpha1/Kerberos.java

@ -0,0 +1,27 @@
package com.redhat.pctsec.rest.v1alpha1;
import io.quarkiverse.kerberos.KerberosPrincipal;
import io.quarkus.arc.profile.UnlessBuildProfile;
import io.quarkus.security.Authenticated;
import io.quarkus.security.identity.SecurityIdentity;
import jakarta.inject.Inject;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
@UnlessBuildProfile("dev")
@Path("/Kerberos")
@Authenticated
public class Kerberos {
@Inject
SecurityIdentity identity;
@Inject
KerberosPrincipal kerberosPrincipal;
@GET
@Path("/me")
@Produces("text/plain")
public String me() {
return identity.getPrincipal().getName();
}
}

50
src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanRequestResource.java

@ -0,0 +1,50 @@
package com.redhat.pctsec.rest.v1alpha1;
import com.redhat.pctsec.model.ScanRequest;
import com.redhat.pctsec.model.jpa.ScanRequestRepository;
import com.redhat.pctsec.model.osh.paramMapper;
import io.quarkus.security.Authenticated;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.transaction.Transactional;
import jakarta.ws.rs.*;
import picocli.CommandLine;
import java.util.UUID;
@ApplicationScoped
@Path("/api/v1a/ScanRequest/{id}")
public class ScanRequestResource {
@Inject
ScanRequestRepository scanRequestRepository;
@GET
@Produces({"application/json"})
public ScanRequest getScanRequest(String id)
{
ScanRequest scanRequest = scanRequestRepository.findById(UUID.fromString(id));
return scanRequest;
}
@PATCH
@Path("ScanProperties/{scanProperties}")
@Consumes({"application/octet-stream"})
@Produces({"application/json"})
@Authenticated
@Transactional
public ScanRequest patchScanRequest(String id, String scanProperties)
{
ScanRequest scanRequest = scanRequestRepository.findById(UUID.fromString(id));
try {
// Parse purely for validation; the raw property string is what gets persisted
new paramMapper(scanProperties);
}catch(CommandLine.ParameterException e)
{
throw new BadRequestException("Invalid OSH Parameter");
}
scanRequest.setScanProperties(scanProperties);
scanRequestRepository.persist(scanRequest);
return scanRequest;
}
}
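Calling the endpoints above from plain Java needs nothing beyond java.net.http. This sketch only builds the requests (the base URL is a placeholder, and no call is actually made):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ScanRequestCalls {
    // Placeholder base URL; the real host depends on the deployment
    static final String BASE = "https://osh-wrapper.example.com";

    static HttpRequest get(String id) {
        return HttpRequest.newBuilder(URI.create(BASE + "/api/v1a/ScanRequest/" + id))
                .GET()
                .build();
    }

    static HttpRequest patchProperties(String id, String props) {
        // Scan properties travel in the path, matching @Path("ScanProperties/{scanProperties}")
        return HttpRequest.newBuilder(URI.create(BASE + "/api/v1a/ScanRequest/" + id + "/ScanProperties/" + props))
                .method("PATCH", HttpRequest.BodyPublishers.noBody())
                .build();
    }

    public static void main(String[] args) {
        System.out.println(get("123e4567-e89b-12d3-a456-426614174000").uri());
        System.out.println(patchProperties("123e4567-e89b-12d3-a456-426614174000", "--profile%20default").method());
    }
}
```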

41
src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanRequestsResource.java

@ -0,0 +1,41 @@
package com.redhat.pctsec.rest.v1alpha1;
import com.redhat.pctsec.model.ScanRequest;
import com.redhat.pctsec.model.ScanRequests;
import com.redhat.pctsec.model.jpa.ScanRequestsRepository;
import io.quarkus.security.Authenticated;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.ws.rs.*;
import org.jboss.resteasy.reactive.common.NotImplementedYet;
import java.util.UUID;
@ApplicationScoped
@Path("/api/v1a/ScanRequests/{id}")
public class ScanRequestsResource {
@Inject
ScanRequestsRepository sr;
@GET
@Produces({"application/json"})
public ScanRequests getScanRequests(String id)
{
ScanRequests scanRequests = sr.findById(UUID.fromString(id));
return scanRequests;
}
@POST
@Produces({"application/json"})
@Consumes({"application/json"})
@Authenticated
public ScanRequests addScanRequest(String id, ScanRequest scanRequest)
{
throw new NotImplementedYet();
}
}

122
src/main/java/com/redhat/pctsec/rest/v1alpha1/ScanResource.java

@ -0,0 +1,122 @@
package com.redhat.pctsec.rest.v1alpha1;
import com.redhat.pctsec.model.*;
import com.redhat.pctsec.model.api.request.pssaas;
import com.redhat.pctsec.model.jpa.ScanRepository;
import io.quarkus.security.Authenticated;
import io.vertx.mutiny.core.eventbus.EventBus;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.transaction.Transactional;
import jakarta.validation.Valid;
import jakarta.ws.rs.*;
import org.jboss.resteasy.reactive.RestQuery;
import java.util.List;
import java.util.UUID;
@ApplicationScoped
@Path("/api/v1a/Scan")
public class ScanResource {
@Inject
ScanRepository sr;
@Inject
EventBus bus;
@POST
@Path("PSSaaS")
@Consumes({ "application/json" })
@Transactional
@Authenticated
public Scan createPSSAAS(@Valid pssaas scanRequest)
{
ScanRequests scanRequests = new ScanRequests(scanRequest);
Scan s = new Scan();
s.setRequestor("cpaas");
s.setScanRequests(scanRequests);
sr.persist(s);
return s;
}
@POST
@Path("PSSaaS/run")
@Consumes({ "application/json" })
@Transactional
@Authenticated
public List<ScanTask> createRunPSSAAS(@Valid pssaas scanRequest)
{
Scan s = this.createPSSAAS(scanRequest);
return s.scanRequests.execute(bus);
}
@GET
@Path("All")
@Produces({"application/json"})
public List<Scan> list()
{
return sr.listAll();
}
@GET
@Path("{id}")
@Produces({"application/json"})
public Scan scanRequest(String id)
{
Scan s = sr.findById(UUID.fromString(id));
return s;
}
@GET
@Path("{id}/run")
@Authenticated
public List<ScanTask> scanRequestExe(String id)
{
Scan s = sr.findById(UUID.fromString(id));
return s.scanRequests.execute(bus);
}
@GET
@Path("single/git")
@Produces({"application/json"})
@Transactional
@Authenticated
public Scan singleGit(@RestQuery String repo, @RestQuery String ref)
{
Scan s = new Scan();
s.setRequestor("jochrist");
s.getScanRequests().addGit(repo,ref);
sr.persist(s);
return s;
}
@GET
@Path("single/brew")
@Produces({"application/json"})
@Transactional
@Authenticated
public Scan singleBrew(@RestQuery String brewId)
{
Scan s = new Scan();
s.setRequestor("jochrist");
s.getScanRequests().addBrewBuild(brewId);
sr.persist(s);
return s;
}
@GET
@Path("single/pnc")
@Produces({"application/json"})
@Transactional
@Authenticated
public Scan singlePNC(@RestQuery String pncId)
{
Scan s = new Scan();
s.setRequestor("jochrist");
s.getScanRequests().addPNCBuild(pncId);
sr.persist(s);
return s;
}
}

139
src/main/java/com/redhat/pctsec/tekton/TaskHandler.java

@ -0,0 +1,139 @@
package com.redhat.pctsec.tekton;
import com.redhat.pctsec.model.RequestType;
import com.redhat.pctsec.model.ScanTask;
import com.redhat.pctsec.model.ScanTaskState;
import io.fabric8.kubernetes.api.model.ConfigMapVolumeSource;
import io.fabric8.kubernetes.api.model.PersistentVolumeClaimVolumeSource;
import io.fabric8.kubernetes.api.model.PodSecurityContext;
import io.fabric8.kubernetes.api.model.PodSecurityContextBuilder;
import io.fabric8.tekton.client.TektonClient;
import io.fabric8.tekton.pipeline.v1beta1.*;
import io.quarkus.vertx.ConsumeEvent;
import io.smallrye.common.annotation.Blocking;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import org.apache.commons.lang3.RandomStringUtils;
import org.eclipse.microprofile.config.inject.ConfigProperty;
import java.util.ArrayList;
import java.util.List;
// Must be a CDI bean so Quarkus registers the @ConsumeEvent handler
@ApplicationScoped
public class TaskHandler {
@ConfigProperty(name = "quarkus.openshift.namespace")
String NAMESPACE;
@ConfigProperty(name = "tekton.pipeline.ref")
String PIPELINE_REFERENCE;
@ConfigProperty(name = "tekton.service-account")
String SERVICE_ACCOUNT;
@ConfigProperty(name = "tekton.task.ref")
String TASK_REFERENCE;
@Inject
TektonClient tektonClient;
@ConsumeEvent("tekton")
@Blocking
public ScanTask consume(ScanTask scanTask)
{
switch(scanTask.getScanRequest().getType())
{
case BREW:
scanTask.setTektonRunId(invokeScanTask(scanTask.getScanRequest().brewBuild.buildRef));
scanTask.setState(ScanTaskState.RUNNING);
break;
case PNC:
String repo = scanTask.getScanRequest().pncBuild.SCMURL().toString();
String ref = scanTask.getScanRequest().pncBuild.revision();
scanTask.setTektonRunId(invokeOshScmScanPipeline(repo, ref));
scanTask.setState(ScanTaskState.RUNNING);
break;
case GIT:
scanTask.setTektonRunId(invokeOshScmScanPipeline(scanTask.getScanRequest().git.repo.toString(), scanTask.getScanRequest().git.ref));
scanTask.setState(ScanTaskState.RUNNING);
break;
}
return scanTask;
}
public String invokeScanTask(String buildId) {
// String buildId = "xterm-366-8.el9";
String scanProfile = "snyk-only-unstable";
// random taskrun name generating for now
TaskRun taskRun = new TaskRunBuilder().withNewMetadata().withName("osh-scan-taskrun-" + RandomStringUtils.randomAlphanumeric(8).toLowerCase())
.endMetadata()
.withNewSpec()
.withServiceAccountName(SERVICE_ACCOUNT)
.withNewTaskRef()
.withName(TASK_REFERENCE)
.endTaskRef()
.withParams(
new Param("buildId", new ArrayOrString(buildId)),
new Param("scanProfile", new ArrayOrString(scanProfile)))
.endSpec()
.build();
tektonClient.v1beta1().taskRuns().inNamespace(NAMESPACE).resource(taskRun).create();
return taskRun.getMetadata().getName();
}
public String invokeOshScmScanPipeline(String repo, String ref) {
PodSecurityContext securityContext = new PodSecurityContextBuilder()
.withRunAsNonRoot(true)
.withRunAsUser(65532L)
.build();
WorkspaceBinding sourcesWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("sources")
.withPersistentVolumeClaim(new PersistentVolumeClaimVolumeSource("osh-client-sources", null))
.build();
WorkspaceBinding sourceTarsWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("source-tars")
.withPersistentVolumeClaim(new PersistentVolumeClaimVolumeSource("osh-client-source-tars", null))
.build();
WorkspaceBinding sslCaDirectoryWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("ssl-ca-directory")
.withConfigMap(new ConfigMapVolumeSource(null, null, "config-trusted-cabundle", null))
.build();
List<WorkspaceBinding> workspaceBindings = new ArrayList<>();
workspaceBindings.add(sourcesWorkspaceBinding);
workspaceBindings.add(sourceTarsWorkspaceBinding);
workspaceBindings.add(sslCaDirectoryWorkspaceBinding);
PipelineRun pipelineRun = new PipelineRunBuilder()
.withNewMetadata().withName("osh-scm-scan-" + RandomStringUtils.randomAlphanumeric(8).toLowerCase()).endMetadata()
.withNewSpec()
.withNewPodTemplate()
.withSecurityContext(securityContext)
.endPodTemplate()
.withServiceAccountName(SERVICE_ACCOUNT)
.withNewPipelineRef().withName(PIPELINE_REFERENCE).endPipelineRef()
.addNewParam().withName("repo-url").withNewValue(repo).endParam()
.addNewParam().withName("revision").withNewValue(ref).endParam()
.withWorkspaces(workspaceBindings)
.endSpec()
.build();
tektonClient.v1beta1().pipelineRuns().inNamespace(NAMESPACE).resource(pipelineRun).create();
return pipelineRun.getMetadata().getName();
}
}
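The consume handler above is a per-type dispatch: BREW requests become Tekton TaskRuns, PNC and GIT requests become SCM-scan PipelineRuns, and the task is marked RUNNING either way. A stdlib sketch of that routing (hypothetical enum and record stand-ins, no Tekton client involved):

```java
public class TaskDispatchSketch {
    enum RequestType { BREW, PNC, GIT }
    enum State { AWAIT, RUNNING }

    // Stand-in for a ScanTask after consume(): which run kind was created, and its state
    record Dispatched(String runKind, State state) {}

    // Mirrors TaskHandler.consume: BREW -> TaskRun, PNC/GIT -> PipelineRun
    static Dispatched consume(RequestType type) {
        return switch (type) {
            case BREW -> new Dispatched("TaskRun", State.RUNNING);
            case PNC, GIT -> new Dispatched("PipelineRun", State.RUNNING);
        };
    }

    public static void main(String[] args) {
        System.out.println(consume(RequestType.BREW).runKind());   // TaskRun
        System.out.println(consume(RequestType.GIT).runKind());    // PipelineRun
    }
}
```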

49
src/main/java/com/redhat/pctsec/tekton/brewTaskRun.java

@ -0,0 +1,49 @@
package com.redhat.pctsec.tekton;
import io.fabric8.tekton.client.DefaultTektonClient;
import io.fabric8.tekton.client.TektonClient;
import io.fabric8.tekton.pipeline.v1beta1.ArrayOrString;
import io.fabric8.tekton.pipeline.v1beta1.Param;
import io.fabric8.tekton.pipeline.v1beta1.TaskRun;
import io.fabric8.tekton.pipeline.v1beta1.TaskRunBuilder;
import org.apache.commons.lang3.RandomStringUtils;
@io.quarkus.arc.Unremovable
public class brewTaskRun {
public static final String NAMESPACE = "pct-security-tooling";
public static final String BUILD_ID = "buildId";
public static final String SCAN_PROFILE = "scanProfile";
public static final String TASK_REFERENCE = "osh-scan-task";
public static final String SERVICE_ACCOUNT = "osh-wrapper-client-sa";
//@Inject
TektonClient tektonClient = new DefaultTektonClient();
public String invokeScanTask(String buildId) {
// String buildId = "xterm-366-8.el9";
String scanProfile = "snyk-only-unstable";
// random taskrun name generating for now
TaskRun taskRun = new TaskRunBuilder().withNewMetadata().withName("osh-scan-taskrun-" + RandomStringUtils.randomAlphanumeric(8).toLowerCase())
.endMetadata()
.withNewSpec()
.withServiceAccountName(SERVICE_ACCOUNT)
.withNewTaskRef()
.withName(TASK_REFERENCE)
.endTaskRef()
.withParams(
new Param("buildId", new ArrayOrString(buildId)),
new Param("scanProfile", new ArrayOrString(scanProfile)))
.endSpec()
.build();
tektonClient.v1beta1().taskRuns().inNamespace(NAMESPACE).resource(taskRun).create();
return "Scan invoked";
}
}

74
src/main/java/com/redhat/pctsec/tekton/scmUrlPipelineRun.java

@ -0,0 +1,74 @@
package com.redhat.pctsec.tekton;
import io.fabric8.kubernetes.api.model.ConfigMapVolumeSource;
import io.fabric8.kubernetes.api.model.PersistentVolumeClaimVolumeSource;
import io.fabric8.kubernetes.api.model.PodSecurityContext;
import io.fabric8.kubernetes.api.model.PodSecurityContextBuilder;
import io.fabric8.tekton.client.DefaultTektonClient;
import io.fabric8.tekton.client.TektonClient;
import io.fabric8.tekton.pipeline.v1beta1.*;
import org.apache.commons.lang3.RandomStringUtils;
import java.util.ArrayList;
import java.util.List;
public class scmUrlPipelineRun {
public static final String NAMESPACE = "pct-security-tooling";
public static final String REPO_URL = "repo-url";
public static final String REVISION = "revision";
public static final String PIPELINE_REFERENCE = "osh-client-from-source";
public static final String SERVICE_ACCOUNT = "osh-wrapper-client-sa";
TektonClient tektonClient = new DefaultTektonClient();
public String invokeOshScmScanPipeline(String repo, String ref) {
PodSecurityContext securityContext = new PodSecurityContextBuilder()
.withRunAsNonRoot(true)
.withRunAsUser(65532L)
.build();
WorkspaceBinding sourcesWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("sources")
.withPersistentVolumeClaim(new PersistentVolumeClaimVolumeSource("osh-client-sources", null))
.build();
WorkspaceBinding sourceTarsWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("source-tars")
.withPersistentVolumeClaim(new PersistentVolumeClaimVolumeSource("osh-client-source-tars", null))
.build();
WorkspaceBinding sslCaDirectoryWorkspaceBinding = new WorkspaceBindingBuilder()
.withName("ssl-ca-directory")
.withConfigMap(new ConfigMapVolumeSource(null, null, "config-trusted-cabundle", null))
.build();
List<WorkspaceBinding> workspaceBindings = new ArrayList<>();
workspaceBindings.add(sourcesWorkspaceBinding);
workspaceBindings.add(sourceTarsWorkspaceBinding);
workspaceBindings.add(sslCaDirectoryWorkspaceBinding);
PipelineRun pipelineRun = new PipelineRunBuilder()
.withNewMetadata().withName("osh-scm-scan-" + RandomStringUtils.randomAlphanumeric(8).toLowerCase()).endMetadata()
.withNewSpec()
.withNewPodTemplate()
.withSecurityContext(securityContext)
.endPodTemplate()
.withServiceAccountName(SERVICE_ACCOUNT)
.withNewPipelineRef().withName(PIPELINE_REFERENCE).endPipelineRef()
.addNewParam().withName(REPO_URL).withNewValue(repo).endParam()
.addNewParam().withName(REVISION).withNewValue(ref).endParam()
.withWorkspaces(workspaceBindings)
.endSpec()
.build();
tektonClient.v1beta1().pipelineRuns().inNamespace(NAMESPACE).resource(pipelineRun).create();
return "Scan invoked. PipelineRun name: " + pipelineRun.getMetadata().getName();
}
}

92
src/main/java/constants/HttpHeaders.java

@ -1,92 +0,0 @@
package constants;
/**
* Copied from io.undertow.util.Headers
*/
public class HttpHeaders {
public static final String ACCEPT_STRING = "Accept";
public static final String ACCEPT_CHARSET_STRING = "Accept-Charset";
public static final String ACCEPT_ENCODING_STRING = "Accept-Encoding";
public static final String ACCEPT_LANGUAGE_STRING = "Accept-Language";
public static final String ACCEPT_RANGES_STRING = "Accept-Ranges";
public static final String AGE_STRING = "Age";
public static final String ALLOW_STRING = "Allow";
public static final String AUTHENTICATION_INFO_STRING = "Authentication-Info";
public static final String AUTHORIZATION_STRING = "Authorization";
public static final String CACHE_CONTROL_STRING = "Cache-Control";
public static final String COOKIE_STRING = "Cookie";
public static final String COOKIE2_STRING = "Cookie2";
public static final String CONNECTION_STRING = "Connection";
public static final String CONTENT_DISPOSITION_STRING = "Content-Disposition";
public static final String CONTENT_ENCODING_STRING = "Content-Encoding";
public static final String CONTENT_LANGUAGE_STRING = "Content-Language";
public static final String CONTENT_LENGTH_STRING = "Content-Length";
public static final String CONTENT_LOCATION_STRING = "Content-Location";
public static final String CONTENT_MD5_STRING = "Content-MD5";
public static final String CONTENT_RANGE_STRING = "Content-Range";
public static final String CONTENT_SECURITY_POLICY_STRING = "Content-Security-Policy";
public static final String CONTENT_TYPE_STRING = "Content-Type";
public static final String DATE_STRING = "Date";
public static final String ETAG_STRING = "ETag";
public static final String EXPECT_STRING = "Expect";
public static final String EXPIRES_STRING = "Expires";
public static final String FORWARDED_STRING = "Forwarded";
public static final String FROM_STRING = "From";
public static final String HOST_STRING = "Host";
public static final String IF_MATCH_STRING = "If-Match";
public static final String IF_MODIFIED_SINCE_STRING = "If-Modified-Since";
public static final String IF_NONE_MATCH_STRING = "If-None-Match";
public static final String IF_RANGE_STRING = "If-Range";
public static final String IF_UNMODIFIED_SINCE_STRING = "If-Unmodified-Since";
public static final String LAST_MODIFIED_STRING = "Last-Modified";
public static final String LOCATION_STRING = "Location";
public static final String MAX_FORWARDS_STRING = "Max-Forwards";
public static final String ORIGIN_STRING = "Origin";
public static final String PRAGMA_STRING = "Pragma";
public static final String PROXY_AUTHENTICATE_STRING = "Proxy-Authenticate";
public static final String PROXY_AUTHORIZATION_STRING = "Proxy-Authorization";
public static final String RANGE_STRING = "Range";
public static final String REFERER_STRING = "Referer";
public static final String REFERRER_POLICY_STRING = "Referrer-Policy";
public static final String REFRESH_STRING = "Refresh";
public static final String RETRY_AFTER_STRING = "Retry-After";
public static final String SEC_WEB_SOCKET_ACCEPT_STRING = "Sec-WebSocket-Accept";
public static final String SEC_WEB_SOCKET_EXTENSIONS_STRING = "Sec-WebSocket-Extensions";
public static final String SEC_WEB_SOCKET_KEY_STRING = "Sec-WebSocket-Key";
public static final String SEC_WEB_SOCKET_KEY1_STRING = "Sec-WebSocket-Key1";
public static final String SEC_WEB_SOCKET_KEY2_STRING = "Sec-WebSocket-Key2";
public static final String SEC_WEB_SOCKET_LOCATION_STRING = "Sec-WebSocket-Location";
public static final String SEC_WEB_SOCKET_ORIGIN_STRING = "Sec-WebSocket-Origin";
public static final String SEC_WEB_SOCKET_PROTOCOL_STRING = "Sec-WebSocket-Protocol";
public static final String SEC_WEB_SOCKET_VERSION_STRING = "Sec-WebSocket-Version";
public static final String SERVER_STRING = "Server";
public static final String SERVLET_ENGINE_STRING = "Servlet-Engine";
public static final String SET_COOKIE_STRING = "Set-Cookie";
public static final String SET_COOKIE2_STRING = "Set-Cookie2";
public static final String SSL_CLIENT_CERT_STRING = "SSL_CLIENT_CERT";
public static final String SSL_CIPHER_STRING = "SSL_CIPHER";
public static final String SSL_SESSION_ID_STRING = "SSL_SESSION_ID";
public static final String SSL_CIPHER_USEKEYSIZE_STRING = "SSL_CIPHER_USEKEYSIZE";
public static final String STATUS_STRING = "Status";
public static final String STRICT_TRANSPORT_SECURITY_STRING = "Strict-Transport-Security";
public static final String TE_STRING = "TE";
public static final String TRAILER_STRING = "Trailer";
public static final String TRANSFER_ENCODING_STRING = "Transfer-Encoding";
public static final String UPGRADE_STRING = "Upgrade";
public static final String USER_AGENT_STRING = "User-Agent";
public static final String VARY_STRING = "Vary";
public static final String VIA_STRING = "Via";
public static final String WARNING_STRING = "Warning";
public static final String WWW_AUTHENTICATE_STRING = "WWW-Authenticate";
public static final String X_CONTENT_TYPE_OPTIONS_STRING = "X-Content-Type-Options";
public static final String X_DISABLE_PUSH_STRING = "X-Disable-Push";
public static final String X_FORWARDED_FOR_STRING = "X-Forwarded-For";
public static final String X_FORWARDED_PROTO_STRING = "X-Forwarded-Proto";
public static final String X_FORWARDED_HOST_STRING = "X-Forwarded-Host";
public static final String X_FORWARDED_PORT_STRING = "X-Forwarded-Port";
public static final String X_FORWARDED_SERVER_STRING = "X-Forwarded-Server";
public static final String X_FRAME_OPTIONS_STRING = "X-Frame-Options";
public static final String X_XSS_PROTECTION_STRING = "X-Xss-Protection";
}

7
src/main/java/constants/PSGQL.java

@ -1,7 +0,0 @@
package constants;
public class PSGQL {
public static final String url = "jdbc:postgresql://localhost:5432/mydb";
public static final String user = "postgres";
public static final String password = "password";
}

26
src/main/java/dto/BrewObj.java

@ -1,26 +0,0 @@
package dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
public class BrewObj implements Serializable {
public String buildSystemType;
public String brewId;
public String brewNvr;
public String pncId;
public String artifactType;
public String fileName;
public String buildFromSource;
}

23
src/main/java/dto/BrewObjPayload.java

@ -1,23 +0,0 @@
package dto;
import java.net.URISyntaxException;
import org.json.JSONObject;
public class BrewObjPayload {
public static BrewObj constructScanPayload(JSONObject brewObj) throws URISyntaxException {
return new BrewObj(brewObj.getString("buildSystemType"),brewObj.getString("brewId"),brewObj.getString("brewNvr"),brewObj.getString("pncId"),brewObj.getString("artifactType"),brewObj.getString("fileName"),brewObj.getString("builtFromSource"));
}
}

35
src/main/java/dto/ConnectDB.java

@ -1,35 +0,0 @@
package dto;
import constants.PSGQL;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import static constants.PSGQL.user;
import static constants.PSGQL.password;
import static constants.PSGQL.url;
public class ConnectDB{
// private final String url = "jdbc:postgresql://localhost:5432/scandb";
// private final String user = "postgres";
// private final String password = "password";
/**
* Connect to the PostgreSQL database
*
* @return a Connection object
*/
public Connection connect() {
Connection conn = null;
try {
conn = DriverManager.getConnection(url, user, password);
System.out.println("Connected to the PostgreSQL server successfully.");
} catch (SQLException e) {
System.out.println(e.getMessage());
}
return conn;
}
}

23
src/main/java/dto/GitObj.java

@ -1,23 +0,0 @@
package dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
public class GitObj implements Serializable {
public String buildSystemType;
public String repository;
public String reference;
public String commitId;
}

23
src/main/java/dto/GitObjPayload.java

@ -1,23 +0,0 @@
package dto;
import java.net.URISyntaxException;
import org.json.JSONObject;
public class GitObjPayload {
public static GitObj constructScanPayload(JSONObject gitObj) throws URISyntaxException {
return new GitObj(gitObj.getString("buildSystemType"),gitObj.getString("repository"),gitObj.getString("reference"),gitObj.getString("commitId"));
}
}

21
src/main/java/dto/PncObj.java

@ -1,21 +0,0 @@
package dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
public class PncObj implements Serializable {
public String buildSystemType;
public String buildId;
}

23
src/main/java/dto/PncObjPayload.java

@ -1,23 +0,0 @@
package dto;
import java.net.URISyntaxException;
import org.json.JSONObject;
public class PncObjPayload {
public static PncObj constructScanPayload(JSONObject pncObj) throws URISyntaxException {
return new PncObj(pncObj.getString("buildSystemType"),pncObj.getString("buildId"));
}
}

9
src/main/java/dto/ScanInterface.java

@ -1,9 +0,0 @@
package dto;
import java.io.Serializable;
//interface for the scan objects
public interface ScanInterface extends Serializable {
String constructPayload();
}

26
src/main/java/dto/ScanObj.java

@ -1,26 +0,0 @@
package dto;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.ToString;
import lombok.extern.jackson.Jacksonized;
// import org.jboss.pnc.api.dto.Request;
//TODO: these scan objects still need significant cleanup
//TODO: add an interface for the scan objects (probably the cleanest solution)
import java.io.Serializable;
@ToString
@Getter
@AllArgsConstructor
@Jacksonized
@Builder
public class ScanObj implements Serializable {
public String scanId;
public String productId;
public String eventId;
public String isManagedService;
public String componentList;
}

23
src/main/java/dto/ScanObjPayload.java

@ -1,23 +0,0 @@
package dto;
import java.net.URISyntaxException;
import org.json.JSONObject;
public class ScanObjPayload {
public static ScanObj constructScanPayload(JSONObject scanObj) throws URISyntaxException {
return new ScanObj(scanObj.getString("scanId"),scanObj.getString("productId"),scanObj.getString("eventId"),scanObj.getString("isManagedService"),scanObj.getString("componentList"));
}
}

85
src/main/java/rest/CreateGetResource.java

@ -1,85 +0,0 @@
package rest;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Set;
import java.util.stream.Collectors;
import dto.ScanObj;
import dto.ConnectDB;
import java.sql.*;
import javax.inject.Inject;
import javax.ws.rs.Consumes;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;
// import org.hibernate.EntityManager;
import jakarta.persistence.EntityManager;
// @Path("/api/v1/[osh-scan]")
@Path("/scanGet")
public class CreateGetResource {
// @Inject
// EntityManager em;
CreateScanService createScanService;
private Set<ScanObj> Scans = Collections.newSetFromMap(Collections.synchronizedMap(new LinkedHashMap<>()));
public CreateGetResource() {
}
@GET
@Path("/{scanId}")
public Set<ScanObj> list(@PathParam("scanId") String scanId) {
//returns scans matching the given scanId; queries the DB directly here for now
try {
ConnectDB connectDB = new ConnectDB();
try (Connection conn = connectDB.connect();
PreparedStatement stmt = conn.prepareStatement("SELECT * FROM scans WHERE scanid = ?")) {
stmt.setString(1, scanId);
ResultSet rs = stmt.executeQuery();
while (rs.next()) {
Scans.add(new ScanObj(rs.getString("scanid"),rs.getString("productid"),rs.getString("eventid"),rs.getString("ismanagedservice"),rs.getString("componentlist")));
}
}
} catch (SQLException e){
System.out.println(e);
}
return Scans;
}
}

109
src/main/java/rest/CreateScanRequest.java

@ -1,109 +0,0 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.BrewObj;
import dto.ConnectDB;
import dto.ScanObjPayload;
import dto.BrewObjPayload;
import dto.GitObj;
import dto.GitObjPayload;
import dto.PncObj;
import dto.PncObjPayload;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
@Path("/scanRequest")
public class CreateScanRequest {
//all of these need cleaning up to be a more sensible solution
@RestClient
CreateScanService createScanService;
@POST
@Path("/brew")
@Consumes({ "application/json" })
//in theory should take List<String> to clean it up
public BrewObj invokeScanAnalyze(@Valid String scanInvocation) throws URISyntaxException {
JSONObject jsonData = new JSONObject(scanInvocation);
BrewObj brewObj = BrewObjPayload.constructScanPayload(jsonData);
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "INSERT INTO brewscans (buildsystemtype, brewid, brewnvr, pncid, artifacttype, filename, builtfromsource) VALUES ('"+brewObj.buildSystemType+"','"+brewObj.brewId+"','"+brewObj.brewNvr+"','"+brewObj.pncId+"','"+brewObj.artifactType+"','"+brewObj.fileName+"','"+brewObj.buildFromSource+"')";
try{
stmt = conn.createStatement();
stmt.executeUpdate(sql);
conn.close();
} catch (SQLException e){
System.out.println(e);
}
return brewObj;
}
@POST
@Path("/git")
@Consumes({ "application/json" })
public GitObj invokeGitScanAnalyze(@Valid String scanInvocation)throws URISyntaxException {
JSONObject jsonData = new JSONObject(scanInvocation);
GitObj gitObj = GitObjPayload.constructScanPayload(jsonData);
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "INSERT INTO gitscans (buildsystemtype, repository, reference, commitid) VALUES ('"+gitObj.buildSystemType+"','"+gitObj.repository+"','"+gitObj.reference+"','"+gitObj.commitId+"')";
try{
stmt = conn.createStatement();
stmt.executeUpdate(sql);
conn.close();
} catch (SQLException e){
System.out.println(e);
}
return gitObj;
}
@POST
@Path("/pnc")
@Consumes({ "application/json" })
public PncObj invokePncScanAnalyze(@Valid String scanInvocation)throws URISyntaxException {
JSONObject jsonData = new JSONObject(scanInvocation);
PncObj pncObj = PncObjPayload.constructScanPayload(jsonData);
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "INSERT INTO pncscans (buildsystemtype, buildid) VALUES ('"+pncObj.buildSystemType+"','"+pncObj.buildId+"')";
try{
stmt = conn.createStatement();
stmt.executeUpdate(sql);
conn.close();
} catch (SQLException e){
System.out.println(e);
}
return pncObj;
}
}
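The three handlers above build INSERT statements by string concatenation, which lets a quote in the request body rewrite the SQL. A sketch of the parameterized alternative — `SafeInsertSketch` and its `insertPncScan` helper are hypothetical, shown only to contrast the two forms:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class SafeInsertSketch {
    // For contrast: the concatenated form splices untrusted input straight
    // into the SQL text, so a quote in the input changes the statement.
    static String concatenatedSql(String buildSystemType, String buildId) {
        return "INSERT INTO pncscans (buildsystemtype, buildid) VALUES ('"
                + buildSystemType + "','" + buildId + "')";
    }

    // Parameterized form: values travel separately from the SQL text,
    // so they cannot alter the statement.
    static final String PARAMETERIZED_SQL =
            "INSERT INTO pncscans (buildsystemtype, buildid) VALUES (?, ?)";

    // Hypothetical helper; executeUpdate (not executeQuery) is the right
    // call for INSERT/UPDATE/DELETE and returns the affected row count.
    static int insertPncScan(Connection conn, String buildSystemType, String buildId)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(PARAMETERIZED_SQL)) {
            ps.setString(1, buildSystemType);
            ps.setString(2, buildId);
            return ps.executeUpdate();
        }
    }
}
```

The try-with-resources also closes the statement on every path, which the handlers above currently skip when an exception is thrown.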

58
src/main/java/rest/CreateScanResource.java

@ -1,58 +0,0 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ConnectDB;
import dto.ScanObjPayload;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
@Path("/")
public class CreateScanResource {
@RestClient
CreateScanService createScanService;
@POST
@Consumes({ "application/json" })
//in theory should take List<String> to clean it up
public ScanObj invokeScanAnalyze(@Valid String scanInvocation) throws URISyntaxException {
JSONObject jsonData = new JSONObject(scanInvocation);
ScanObj scanObj = ScanObjPayload.constructScanPayload(jsonData);
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
Statement stmt = null;
String sql = "INSERT INTO scans (scanid, productid, eventid, ismanagedservice, componentlist) VALUES ('" +scanObj.scanId+"', '"+scanObj.productId+"', '"+scanObj.eventId+"', '"+scanObj.isManagedService+"', '"+scanObj.componentList+"')";
try{
stmt = conn.createStatement();
stmt.executeUpdate(sql);
conn.close();
} catch (SQLException e){
System.out.println(e);
}
return scanObj;
}
}

16
src/main/java/rest/CreateScanService.java

@ -1,16 +0,0 @@
package rest;
import dto.ScanObj;
import org.eclipse.microprofile.rest.client.inject.RegisterRestClient;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
@Path("/")
@RegisterRestClient
public interface CreateScanService {
//intended to validate/normalize incoming POST data; not yet implemented
@POST
ScanObj invokeScanAnalysis(ScanObj scanObj);
}

76
src/main/java/rest/CreateStartScan.java

@ -1,76 +0,0 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PUT;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ConnectDB;
import dto.ScanObjPayload;
import javax.ws.rs.PathParam;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
@Path("/startScan")
public class CreateStartScan {
@RestClient
CreateScanService createScanService;
@PUT
@Path("/{scanId}")
public ScanObj invokeScanAnalyze(@PathParam("scanId") String scanId) throws URISyntaxException {
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
//this is ugly, needs to be rewritten
Statement stmt = null;
ScanObj finalScan = null;
String sql = "SELECT * FROM scans WHERE scanid=" + scanId;
//need to figure out an archive system and whether it's necessary (archive flag??)
try{
stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery(sql);
//TODO: need to add unique keys to DBs
if (rs.next()) {
finalScan = new ScanObj(rs.getString("scanid"),rs.getString("productid"),rs.getString("eventid"),rs.getString("ismanagedservice"),rs.getString("componentlist"));
String copySql = "INSERT INTO archive (scanid, productid, eventid, ismanagedservice, componentlist) VALUES ('" +finalScan.scanId+"', '"+finalScan.productId+"', '"+finalScan.eventId+"', '"+finalScan.isManagedService+"', '"+finalScan.componentList+"')";
stmt.executeUpdate(copySql);
//TODO add proper checks
String deleteSql = "DELETE FROM scans WHERE scanid=" + scanId;
stmt.executeUpdate(deleteSql);
}
//send the task to the scanner interface here using the row returned (should multiple scanIds be allowed?)
//once the task is complete AND we have confirmation that the scan is done, run the cleanup SQL
conn.close();
} catch (SQLException e){
System.out.println(e);
}
return finalScan;
}
}

70
src/main/java/rest/RemoveScan.java

@ -1,70 +0,0 @@
package rest;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import dto.ScanObj;
import javax.inject.Inject;
import javax.validation.Valid;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PUT;
import javax.ws.rs.DELETE;
import java.net.URI;
import java.net.URISyntaxException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import org.json.JSONObject;
import org.json.JSONArray;
import dto.ConnectDB;
import dto.ScanObjPayload;
import javax.ws.rs.PathParam;
import static constants.HttpHeaders.AUTHORIZATION_STRING;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
@Path("/deleteScan")
public class RemoveScan {
// @Inject
@RestClient
CreateScanService createScanService;
// ScanObjPayload scanObjPayload;
@DELETE
@Path("/{scanId}")
public boolean invokeScanAnalyze(@PathParam("scanId") String scanId) throws URISyntaxException {
ConnectDB connectDB = new ConnectDB();
Connection conn = connectDB.connect();
//this is ugly, needs to be rewritten
Statement stmt = null;
boolean success = false;
String sql = "DELETE FROM scans WHERE scanid=" + scanId;
//need to figure out an archive system and whether it's necessary (archive flag??)
try{
stmt = conn.createStatement();
//TODO add proper checks
stmt.executeUpdate(sql);
//send the task to the scanner interface here using the row returned (should multiple scanIds be allowed?)
//once the task is complete AND we have confirmation that the scan is done, run the cleanup SQL
conn.close();
success = true;
} catch (SQLException e){
System.out.println(e);
}
return success;
}
}

43
src/main/java/rest/Scan.java

@ -1,43 +0,0 @@
package rest;
import javax.persistence.Entity;
public class Scan {
private int scanId;
private String productId;
private String eventId;
private String isManagedService;
private String componentList;
public int getScanId() {
return scanId;
}
public void setScanId(int scanId) {
this.scanId = scanId;
}
public String getProductId() {
return productId;
}
public void setProductId(String productId) {
this.productId = productId;
}
public String getEventId() {
return eventId;
}
public void setEventId(String eventId) {
this.eventId = eventId;
}
public String getIsManagedService(){
return isManagedService;
}
public void setIsManagedService(String isManagedService){
this.isManagedService = isManagedService;
}
public String getComponentList(){
return componentList;
}
public void setComponentList(String componentList){
this.componentList = componentList;
}
}

124
src/main/java/rest/StoreData.java

@ -1,124 +0,0 @@
package rest;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
@Path("/storeData")
public class StoreData {
//all of these need cleaning up to be a more sensible solution
// @RestClient
// CreateScanService createScanService;
@GET
public void Store() {
System.out.println("hmm");
//Create typesafe ServiceRegistry object
StandardServiceRegistry ssr = new StandardServiceRegistryBuilder().configure("hibernate.cfg.xml").build();
Metadata meta = new MetadataSources(ssr).getMetadataBuilder().build();
SessionFactory factory = meta.getSessionFactoryBuilder().build();
Session session = factory.openSession();
Transaction t = session.beginTransaction();
System.out.println("i assume we fail before here?");
Scan e1=new Scan();
e1.setScanId(2);
e1.setProductId("1");
e1.setEventId("Chawla");
e1.setIsManagedService("aa");
e1.setComponentList("aaa");
session.save(e1);
t.commit();
System.out.println("successfully saved");
session.close();
factory.close();
}
}

20
src/main/java/rest/TektonResourceClient.java

@ -1,20 +0,0 @@
// package rest;
// import java.util.List;
// import jakarta.enterprise.context.ApplicationScoped;
// import io.fabric8.tekton.client.TektonClient;
// import io.fabric8.tekton.pipeline.v1beta1.Pipeline;
// @ApplicationScoped
// public class TektonResourceClient {
// // @Inject
// TektonClient tektonClient;
// public List<Pipeline> listPipelines() {
// return tektonClient.v1beta1().pipelines().list().getItems();
// }
// }

175
src/main/java/rest/TektonTaskCreate.java

@ -1,175 +0,0 @@
package rest;
import io.fabric8.tekton.client.DefaultTektonClient;
import io.fabric8.tekton.client.TektonClient;
import io.fabric8.tekton.pipeline.v1beta1.TaskBuilder;
import io.fabric8.tekton.pipeline.v1beta1.TaskRunBuilder;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import org.eclipse.microprofile.rest.client.inject.RestClient;
import org.json.JSONObject;
@Path("/tekton")
public class TektonTaskCreate {
private static final String NAMESPACE = "default";
@POST
@Consumes({ "application/json" })
public void invokeTektonTask(String data) {
JSONObject jsonData = new JSONObject(data);
// ScanObj scanObj = ScanObjPayload.constructScanPayload(jsonData);
//don't leave this in live; needs adjusting (currently would cause a ton of issues)
String tektonArgs = "osh-cli mock-build --config=" + jsonData.get("config") + " --brew-build " + jsonData.get("nvr");
try (TektonClient tkn = new DefaultTektonClient()) {
// Create Task
tkn.v1beta1().tasks().inNamespace(NAMESPACE).resource(new TaskBuilder()
.withNewMetadata().withName("tekton-osh-client").endMetadata()
.withNewSpec()
.addNewStep()
.withName("osh-client")
.withImage("alpine:3.12")
.withCommand("osh-cli")
.withArgs(tektonArgs)
.endStep()
.endSpec()
.build()).createOrReplace();
// Create TaskRun
tkn.v1beta1().taskRuns().inNamespace(NAMESPACE).resource(new TaskRunBuilder()
.withNewMetadata().withName("tekton-osh-client-task-run").endMetadata()
.withNewSpec()
.withNewTaskRef()
.withName("tekton-osh-client")
.endTaskRef()
.endSpec()
.build()).createOrReplace();
}
}
}
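`tektonArgs` above is handed to `withArgs` as one space-joined string, so the container receives it as a single argument. A sketch of building the osh-cli invocation as separate tokens instead — the `OshArgsSketch` helper is hypothetical, assuming the same `config`/`nvr` inputs the handler reads from the request JSON:

```java
import java.util.List;

public class OshArgsSketch {
    // Build the osh-cli argument vector as separate tokens; each token then
    // reaches the process as its own argument, and values containing spaces
    // survive intact without shell-style quoting.
    static List<String> mockBuildArgs(String config, String nvr) {
        return List.of("mock-build", "--config=" + config, "--brew-build", nvr);
    }
}
```

In the TaskBuilder chain this would replace the single `withArgs(tektonArgs)` call with the token list, e.g. passing each element of `mockBuildArgs(...)` as a separate arg.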

116
src/main/java/rest/callTekton.java

@ -1,116 +0,0 @@
// package rest;
// import io.fabric8.tekton.client.DefaultTektonClient;
// import io.fabric8.tekton.client.TektonClient;
// import io.fabric8.tekton.pipeline.v1beta1.Task;
// public class callTekton {
// private static final String NAMESPACE = "default";
// public static void main(String[] args) {
// try (TektonClient tkn = new DefaultTektonClient()) {
// // Load Task object from YAML
// Task task = tkn.v1beta1()
// .tasks()
// .load(callTekton.class.getResourceAsStream("../resources/baseScan.yml")).get();
// // Create Task object into Kubernetes
// tkn.v1beta1().tasks().inNamespace(NAMESPACE).createOrReplace(task);
// // Get Task object from APIServer
// String taskName = task.getMetadata().getName();
// task = tkn.v1beta1().tasks().inNamespace(NAMESPACE)
// .withName(taskName)
// .get();
// // Delete Task object
// tkn.v1beta1().tasks().inNamespace(NAMESPACE).withName(taskName).delete();
// }
// }
// }

288
src/main/resources/META-INF/resources/index.html

File diff suppressed because one or more lines are too long

19
src/main/resources/Scan.hbm.xml

@ -1,19 +0,0 @@
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE hibernate-mapping PUBLIC
"-//Hibernate/Hibernate Mapping DTD 3.0//EN"
"http://www.hibernate.org/dtd/hibernate-mapping-3.0.dtd">
<hibernate-mapping>
<class name="rest.Scan" table="scans">
<id name="scanId">
<generator class="increment"/>
</id>
<property name="productId"></property>
<property name="eventId"></property>
<property name="isManagedService"></property>
<property name="componentList"></property>
</class>
</hibernate-mapping>

79
src/main/resources/application.properties

@ -1,7 +1,84 @@
#Example deploy - mvn deploy -Dquarkus.profile=stage -Dquarkus.kubernetes.deploy=true
# quarkus.rest-client."rest.CreateScanService".url=https://localhost:8080/
# quarkus.rest-client."rest.CreateScanService".scope=javax.inject.Singleton
# couchdb.name=scan-results
# couchdb.url=https://localhost:5984
# quarkus.hibernate-orm.database.generation=drop-and-create
#temporary fix, we need to enable it with a working devservices setup
%dev.quarkus.kerberos.enabled=false
%dev.quarkus.security.auth.enabled-in-dev-mode=false
#Also tried
#%dev.quarkus.security.enabled=false
#%dev.quarkus.http.auth.proactive=false
#%dev.quarkus.http.auth.basic=false
#%dev.quarkus.http.auth.permission.permit1.paths=/Ping/Ping
#%dev.quarkus.http.auth.permission.permit1.policy=permit
#%dev.quarkus.http.auth.permission.permit1.methods=GET,HEAD
#%quarkus.arc.unremovable-types=io.quarkiverse.kerberos.*,io.quarkiverse.kerberos.KerberosPrincipal
#%dev.quarkus.kerberos.keytab-path= HTTP_osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM.keytab
#%dev.quarkus.kerberos.service-principal-name= HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM
##########################################
# Data Source #
##########################################
%dev.quarkus.datasource.devservices.enabled=true
%dev.quarkus.datasource.db-kind = postgresql
%dev.quarkus.datasource.username = quarkus
%dev.quarkus.datasource.password = quarkus
#%dev.quarkus.datasource.jdbc.url = jdbc:postgresql://localhost:5432/hibernate_db
%dev.quarkus.hibernate-orm.database.generation=drop-and-create
%stage.quarkus.kubernetes-config.secrets.enabled=true
quarkus.kubernetes-config.secrets=postgresql
%stage.quarkus.datasource.jdbc.url=jdbc:postgresql://postgresql:5432/${database-name}
%stage.quarkus.datasource.username=${database-user}
%stage.quarkus.datasource.password=${database-password}
%stage.quarkus.hibernate-orm.database.generation=drop-and-create
# Always include Swagger UI, even outside dev mode
quarkus.swagger-ui.always-include=true
%dev.quarkus.openshift.service-account=osh-wrapper-client-sa
%dev.quarkus.openshift.namespace=pct-security-tooling
%stage.quarkus.openshift.name=osh
quarkus.openshift.service-account=osh-wrapper-client-sa
%stage.quarkus.openshift.labels.env=stage
%stage.quarkus.log.level=DEBUG
quarkus.arc.remove-unused-beans=false
#Only in Quarkus > 3.x
%stage.quarkus.openshift.route.tls.termination=edge
# As we can't create an edge-terminated route on Quarkus < 3.x, disable route creation for now
%stage.quarkus.openshift.route.expose=false
%stage.quarkus.openshift.route.target-port=https
%stage.quarkus.openshift.route.tls.insecure-edge-termination-policy=redirect
%stage.quarkus.openshift.namespace=pct-security-tooling
##########################################
# Kerberos Specifics #
##########################################
%stage.quarkus.openshift.secret-volumes.osh-wrapper.secret-name=kerberos-keytab-osh
%stage.quarkus.openshift.mounts.osh-wrapper.path=/kerberos
%stage.quarkus.openshift.mounts.osh-wrapper.read-only=true
%stage.quarkus.kerberos.keytab-path= /kerberos/kerberos-keytab-osh
%stage.quarkus.kerberos.service-principal-name= HTTP/osh-pct-security-tooling.apps.ocp-c1.prod.psi.redhat.com@IPA.REDHAT.COM
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.path=/etc/krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.sub-path=linux-krb5.conf
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.config-map-name=kerberos-config
%stage.quarkus.openshift.config-map-volumes.osh-wrapper-config-vol.items."linux-krb5.conf".path=linux-krb5.conf
%stage.quarkus.openshift.mounts.osh-wrapper-config-vol.read-only=true
##########################################
# Tekton Specifics (Used in app) #
##########################################
tekton.pipeline.ref=osh-client-from-source
tekton.task.ref=osh-scan-task
tekton.service-account=${quarkus.openshift.service-account}
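The `${database-name}`-style references above are Quarkus (SmallRye Config) property expressions, resolved here against the `postgresql` Kubernetes secret exposed via `quarkus.kubernetes-config.secrets`. A minimal plain-Java sketch of that substitution, for illustration only — Quarkus performs this internally, and `PlaceholderDemo` is a hypothetical name:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PlaceholderDemo {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)}");

    // Replace each ${key} with its value from the given source map;
    // unknown keys are left as-is (Quarkus would instead fail startup).
    static String resolve(String value, Map<String, String> source) {
        Matcher m = PLACEHOLDER.matcher(value);
        StringBuilder sb = new StringBuilder();
        while (m.find()) {
            m.appendReplacement(sb, Matcher.quoteReplacement(
                    source.getOrDefault(m.group(1), m.group(0))));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> secret = Map.of("database-name", "osh");
        System.out.println(resolve("jdbc:postgresql://postgresql:5432/${database-name}", secret));
        // prints jdbc:postgresql://postgresql:5432/osh
    }
}
```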

16
src/main/resources/baseScan.yml

@@ -1,16 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: basescan
spec:
  params:
    - name: buildId
      type: string
    - name: config
      type: string
  steps:
    - name: baseScan
      image: openshift
      script: |
        #!/bin/bash
        osh-cli mock-build --config=$(params.config) --brew-build $(params.buildId)

21
src/main/resources/hibernate.cfg.xml

@@ -1,21 +0,0 @@
<?xml version='1.0' encoding='UTF-8'?>
<!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
        "http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <property name="hbm2ddl.auto">update</property>
        <property name="dialect">org.hibernate.dialect.PostgreSQLDialect</property>
        <property name="connection.driver_class">org.postgresql.Driver</property>
        <property name="connection.url">jdbc:postgresql://localhost:5432/mydb</property>
        <property name="connection.username">postgres</property>
        <property name="connection.password">password</property>
        <!-- <property name="connection.driver_class">oracle.jdbc.driver.OracleDriver</property> -->
        <property name="show_sql">true</property>   <!-- Log SQL to the console -->
        <property name="format_sql">true</property> <!-- Pretty-print the logged SQL -->
        <mapping resource="Scan.hbm.xml"/>
    </session-factory>
</hibernate-configuration>
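In a Quarkus application the standalone hibernate.cfg.xml above is redundant — datasource and ORM settings belong in application.properties, which is presumably why this file is deleted. The removed settings map roughly to the following properties (illustrative only; values copied from the XML above, not from the project's actual configuration):

```properties
quarkus.datasource.db-kind=postgresql
quarkus.datasource.jdbc.url=jdbc:postgresql://localhost:5432/mydb
quarkus.datasource.username=postgres
quarkus.datasource.password=password
# hbm2ddl.auto=update
quarkus.hibernate-orm.database.generation=update
# show_sql / format_sql
quarkus.hibernate-orm.log.sql=true
quarkus.hibernate-orm.log.format-sql=true
```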

22
src/test/java/com/redhat/pctsec/model/osh/paramMapperTest.java

@@ -0,0 +1,22 @@
package com.redhat.pctsec.model.osh;

import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.Test;

@QuarkusTest
public class paramMapperTest {

    @Test
    public void testSnykScan() {
        // Smoke test: construct a mapper from a raw OSH CLI argument string
        paramMapper pm = new paramMapper("-p snyk-only-unstable --tarball-build-script=\":\"");
        System.out.println(pm);
    }
}
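The constructor argument above is a raw CLI string, so paramMapper presumably tokenizes it before mapping to Tekton params. Its body isn't shown in this diff; the following is a hypothetical, self-contained sketch of a quote-aware splitter for such strings (`ArgSplitter` and its behavior are assumptions, not the project's actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class ArgSplitter {
    // Split a CLI-style string on whitespace, keeping double-quoted
    // segments intact and stripping the quote characters themselves.
    static List<String> split(String input) {
        List<String> out = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (char c : input.toCharArray()) {
            if (c == '"') {
                inQuotes = !inQuotes;           // toggle quoted mode
            } else if (Character.isWhitespace(c) && !inQuotes) {
                if (cur.length() > 0) {          // end of a token
                    out.add(cur.toString());
                    cur.setLength(0);
                }
            } else {
                cur.append(c);
            }
        }
        if (cur.length() > 0) out.add(cur.toString());
        return out;
    }

    public static void main(String[] args) {
        System.out.println(split("-p snyk-only-unstable --tarball-build-script=\":\""));
        // prints [-p, snyk-only-unstable, --tarball-build-script=:]
    }
}
```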