If you want to test your deployed Google App Engine applications (i.e. not just the code you have in your source repository) you can do so with Snyk by downloading the artifacts from Google Cloud Storage. The following demonstrates a proof-of-concept of doing so.
You'll need to set up a few Google Cloud tools
You'll need the SDK installed to provide access to the gcloud and gsutil command line tools. Instructions for different platforms at: https://cloud.google.com/sdk/install
As well as authenticating with Google Cloud, you'll need to create or use a service account with access to Google Cloud Storage. You can list available accounts like so:
gcloud iam service-accounts list
Assuming you identify one, you can generate a local key file with the following command, choosing the IAM account email address from the list above.
gcloud iam service-accounts keys create appengine.json --iam-account <your account name>
More details about IAM at https://cloud.google.com/iam/docs/quickstart
You'll then need to export the GOOGLE_APPLICATION_CREDENTIALS environment variable to point at the file you generated. If you saved that file locally then try:
export GOOGLE_APPLICATION_CREDENTIALS=appengine.json
More details about providing the authentication details can be found in https://cloud.google.com/docs/authentication/production
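Before moving on it can be worth checking the key file is what the client libraries expect. The following is a stdlib-only sanity check, a sketch assuming the key was saved as appengine.json as above; the field names checked are the ones found in keys generated by gcloud iam service-accounts keys create.

```python
import json
import os

# Fields present in a service account key file generated by
# `gcloud iam service-accounts keys create`.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}


def check_credentials(path):
    """Return a list of problems with the key file at path (empty if it looks OK)."""
    if not os.path.isfile(path):
        return ["file not found: %s" % path]
    with open(path) as f:
        try:
            key = json.load(f)
        except ValueError:
            return ["file is not valid JSON"]
    problems = []
    missing = REQUIRED_FIELDS - set(key)
    if missing:
        problems.append("missing fields: %s" % ", ".join(sorted(missing)))
    if key.get("type") != "service_account":
        problems.append("type is not 'service_account'")
    return problems


if __name__ == "__main__":
    # Check whatever GOOGLE_APPLICATION_CREDENTIALS points at.
    path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS", "appengine.json")
    for problem in check_credentials(path):
        print(problem)
```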
gcs-fetcher is a utility used to download artifacts from Google Cloud Storage based on a manifest file. You'll need a Go toolchain installed to use it.
go install github.com/GoogleCloudPlatform/cloud-builders/gcs-fetcher/cmd/gcs-fetcher
First you'll need to identify the manifest for your application. These are stored in a bucket named for your application, in the /ae/ directory. You're looking for the manifest.json files.
$ gsutil ls -r gs://staging.garethr.appspot.com/ae
gs://staging.garethr.appspot.com/ae/56b5ffd5-a6fe-42ae-98f4-8b4821d59771/:
gs://staging.garethr.appspot.com/ae/56b5ffd5-a6fe-42ae-98f4-8b4821d59771/manifest.json
gs://staging.garethr.appspot.com/ae/b0bf3187-d4ec-4912-99ef-8305ef6c61de/:
gs://staging.garethr.appspot.com/ae/b0bf3187-d4ec-4912-99ef-8305ef6c61de/manifest.json
Note that to help with identifying the latest manifest you can use -L, e.g.:
$ gsutil ls -rL gs://staging.garethr.appspot.com/ae
...
gs://staging.garethr.appspot.com/ae/b0bf3187-d4ec-4912-99ef-8305ef6c61de/manifest.json:
Creation time: Tue, 07 Jan 2020 08:01:10 GMT
Update time: Tue, 07 Jan 2020 08:01:10 GMT
Storage class: STANDARD
Content-Length: 773
Content-Type: application/json
Hash (crc32c): csVRsw==
Hash (md5): S5zhOTDRdRyvseAWrRM9ZA==
ETag: CP37y+6C8eYCEAE=
Generation: 1578384070213117
Metageneration: 1
...
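If you have several versions deployed, picking the newest manifest by eye gets tedious. The sketch below parses captured gsutil ls -rL output and returns the manifest with the most recent creation time. It relies on the human-readable listing format shown above, which isn't a stable interface, so treat it as a convenience rather than something robust.

```python
import re
from datetime import datetime

def latest_manifest(listing):
    """Given `gsutil ls -rL` output, return the gs:// URL of the
    manifest.json with the most recent creation time (None if absent)."""
    entries = []
    current = None
    for line in listing.splitlines():
        line = line.strip()
        # Object headers look like "gs://.../manifest.json:".
        header = re.match(r"(gs://\S+/manifest\.json):", line)
        if header:
            current = header.group(1)
            continue
        # Metadata lines look like "Creation time: Tue, 07 Jan 2020 08:01:10 GMT".
        created = re.match(r"Creation time:\s+(.+)", line)
        if created and current:
            when = datetime.strptime(created.group(1),
                                     "%a, %d %b %Y %H:%M:%S %Z")
            entries.append((when, current))
            current = None
    return max(entries)[1] if entries else None
```

You could feed it the output of a subprocess call to gsutil, or a saved listing.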
You can look at the manifest with cat. Note in this case I had a Python application with four files. The manifest file has the names of the files and their locations on Google Cloud Storage.
$ gsutil cat gs://staging.garethr.appspot.com/ae/b0bf3187-d4ec-4912-99ef-8305ef6c61de/manifest.json
{
"app.yaml": {
"sourceUrl": "https://storage.googleapis.com/hello-garethr-unique/a019edb1f4e2a68ad6f1e573bf4e02953fc3a11b",
"sha1Sum": "a019edb1_f4e2a68a_d6f1e573_bf4e0295_3fc3a11b"
},
"main.py": {
"sourceUrl": "https://storage.googleapis.com/hello-garethr-unique/9a5b7abbf336fa296b988b912b5732faa0de6ea1",
"sha1Sum": "9a5b7abb_f336fa29_6b988b91_2b5732fa_a0de6ea1"
},
"main_test.py": {
"sourceUrl": "https://storage.googleapis.com/hello-garethr-unique/7824eeacd13c187cfe57b939d5f840e4134fed33",
"sha1Sum": "7824eeac_d13c187c_fe57b939_d5f840e4_134fed33"
},
"requirements.txt": {
"sourceUrl": "https://storage.googleapis.com/hello-garethr-unique/2157301ebba8dc9ee54d69d497113971324009e2",
"sha1Sum": "2157301e_bba8dc9e_e54d69d4_97113971_324009e2"
}
}
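Since the manifest is plain JSON, it's also easy to work with programmatically. As an illustration, the sketch below summarises which files a manifest references; the sample manifest and bucket name are made up but mirror the structure shown above.

```python
import json

def summarise(manifest):
    """Return (filename, sourceUrl) pairs sorted by filename."""
    return sorted((name, entry["sourceUrl"]) for name, entry in manifest.items())

# Minimal stand-in with the same shape as a real manifest;
# the bucket name and digests here are invented.
sample = json.loads("""{
  "app.yaml": {
    "sourceUrl": "https://storage.googleapis.com/example-bucket/aaaa",
    "sha1Sum": "aaaaaaaa_bbbbbbbb_cccccccc_dddddddd_eeeeeeee"
  },
  "main.py": {
    "sourceUrl": "https://storage.googleapis.com/example-bucket/bbbb",
    "sha1Sum": "11111111_22222222_33333333_44444444_55555555"
  }
}""")

for name, url in summarise(sample):
    print("%s -> %s" % (name, url))
```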
gcs-fetcher is used to recreate the content locally, using the names and source URLs from the manifest to download the files and store them under their original names. Point it at the manifest file like so:
gcs-fetcher --type Manifest --location gs://staging.garethr.appspot.com/ae/b0bf3187-d4ec-4912-99ef-8305ef6c61de/manifest.json
This should download the files that make up the currently running version of the application.
$ ls
app.yaml appengine.json main.py main_test.py requirements.txt
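The manifest's sha1Sum values also let you confirm the downloaded files match what's deployed. In the manifest the 40-character SHA-1 digest is written as underscore-separated groups of eight hex characters, so strip the underscores before comparing. The sketch below assumes you have also saved the manifest locally (e.g. with gsutil cp) alongside the downloaded files.

```python
import hashlib
import json

def sha1_of(path):
    """Hex SHA-1 digest of a file's contents."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(manifest_path):
    """Yield (filename, ok) for every entry in the manifest, comparing
    each downloaded file against its sha1Sum (underscores stripped)."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    for name, entry in sorted(manifest.items()):
        expected = entry["sha1Sum"].replace("_", "")
        yield name, sha1_of(name) == expected
```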
At this point you should be able to test with Snyk as normal. Either install Snyk locally and run snyk test, or use the Docker images, which will also install any dependencies needed as part of scanning:
docker run --rm -it --env SNYK_TOKEN -v $(PWD):/app snyk/snyk:python