Kubernetes – Get terminationGracePeriodSeconds and Other Values Missing From Describe Pod/Deployment

When checking what is running in Kubernetes, people generally do something like this:

kubectl get deploy -n <namespace>
kubectl get pods -n <namespace>

And to describe extended parameters on a deployment or pod:

kubectl describe deploy -n <namespace> <deployment-name>
kubectl describe pod -n <namespace> <pod-name>

Interestingly, these more verbose describe commands are still missing a lot of information. It turns out that the only way to get *all* of the information is to go back to the get command and to tell it to output everything to YAML or a similar format:

kubectl get deploy -n <namespace> -o yaml
kubectl get pods -n <namespace> -o yaml

These commands will yield far more configuration options than the describe commands. Things like terminationGracePeriodSeconds will be readily available here.
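
If you only care about one or two of these fields, you can also pull them out directly with JSONPath instead of scanning the full YAML. As a quick sketch (using the same placeholders as above), something like this prints each pod's grace period:

kubectl get pods -n <namespace> -o jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.spec.terminationGracePeriodSeconds}{"\n"}{end}'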

Presto – Get and List the Connectors on All Nodes in Cluster

Some problems in Presto are the result of having connector definitions on only a subset of the nodes in the cluster. For example, a recent error on the presto-sql forum during an insert into a Hive table was:

java.lang.IllegalArgumentException: No page sink provider for catalog 'hive'
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:216)
at io.prestosql.split.PageSinkManager.providerFor(PageSinkManager.java:67)
at io.prestosql.split.PageSinkManager.createPageSink(PageSinkManager.java:61)
at io.prestosql.operator.TableWriterOperator$TableWriterOperatorFactory.createPageSink(TableWriterOperator.java:114)
at io.prestosql.operator.TableWriterOperator$TableWriterOperatorFactory.createOperator(TableWriterOperator.java:105)
at io.prestosql.operator.DriverFactory.createDriver(DriverFactory.java:114)
at io.prestosql.execution.SqlTaskExecution$DriverSplitRunnerFactory.createDriver(SqlTaskExecution.java:941)
at io.prestosql.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1069)
at io.prestosql.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
at io.prestosql.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:484)

If you have a decent-sized cluster, it is very painful to go to each node and check its catalogs. The problem can be even worse if an old node rejoins the cluster after maintenance or something like that.

In any case, you can use the following URL on Presto (/v1/service/presto) to list all nodes and their registered connectors in one shot. This will help you track down the problem fast :). You can even be lazy and parse the JSON in Chrome dev tools, or use the quick one-liner shown after the example output, so you don’t have to eyeball every node.

https://nonprod.presto.your-company.com/v1/service/presto

Example Output

  "environment": "nonprod",
"services": [
{
"id": "a35ae2a7-fa95-43c9-b893-180449a48c5a",
"nodeId": "blue-presto-worker-865b8db58-g92wn",
"type": "presto",
"pool": "general",
"location": "/blue-presto-worker-865b8db58-g92wn",
"properties": {
"node_version": "331-n-2.6.1",
"coordinator": "false",
"https": "https://10-234-232-180.nonprod-presto.pod.cluster.local:8443",
"https-external": "https://10-234-232-180.nonprod-presto.pod.cluster.local:8443",
"connectorIds": "hive-dl,system,cr-meta,ar-meta,dc-meta"
}
},
{
"id": "b8dd0f39-00b0-4c78-b0c0-ff8e753419d8",
"nodeId": "blue-presto-worker-865b8db58-d2nsz",
"type": "presto",
"pool": "general",
"location": "/blue-presto-worker-865b8db58-d2nsz",
"properties": {
"node_version": "331-n-2.6.1",
"coordinator": "false",
"https": "https://10-234-234-106.nonprod-presto.pod.cluster.local:8443",
"https-external": "https://10-234-234-106.nonprod-presto.pod.cluster.local:8443",
"connectorIds": "hive-dl,system,cr-meta,ar-meta,dc-meta"
}
},
...
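
If you would rather stay on the command line, a quick curl plus jq one-liner (assuming jq is installed; add -k if the endpoint uses an internal certificate) gets the same answer without any eyeballing:

curl -s https://nonprod.presto.your-company.com/v1/service/presto | jq -r '.services[] | "\(.nodeId): \(.properties.connectorIds)"'

Any node whose connectorIds list is missing the catalog named in the error is the one to fix.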

Google Drive API v3 + Sheets + Shared Drives in Java

There are plenty of examples of how to use the Google Drive API online. A ton of them are for old versions though, and most only cover basic cases (nothing on restricted sharing options and the like). Also, virtually none of them show you how to work with shared drives.

I had to do all of this recently, so I hope this helps someone else avoid the pain I went through =). The only thing this assumes is that you have a valid credentials file generated from the developer console.

Defining Scopes

These scopes should all be enabled for your credentials on the consent screen portion of the developer console. They also need to be listed in your code:

private static final List<String> SCOPES;

static {
    SCOPES = new ArrayList<>();
    SCOPES.add(SheetsScopes.DRIVE);
    SCOPES.add(SheetsScopes.DRIVE_FILE);
    SCOPES.add(SheetsScopes.SPREADSHEETS);
}

Get Credentials

private HttpRequestInitializer getCredentials(NetHttpTransport httpTransport) {
    GoogleCredential credential = null;
    try {
        credential = GoogleCredential.fromStream(new FileInputStream(credentialsFilePath), httpTransport, JSON_FACTORY)
                .createScoped(SCOPES)
                .createDelegated(svcAccount);
    } catch (IOException e) {
        logger.error("Error occurred while authorizing with the provided credentials.", e);
    }
    return setHttpTimeout(credential);
}

Get Sheet and Drive Services

private Sheets getSheetService(String applicationName, NetHttpTransport httpTransport) throws FileNotFoundException {
    return new Sheets.Builder(
            httpTransport,
            JSON_FACTORY,
            getCredentials(httpTransport))
            .setApplicationName(applicationName)
            .build();
}

private Drive getDriveService(String applicationName, NetHttpTransport httpTransport) throws FileNotFoundException {
    return new Drive.Builder(
            httpTransport,
            JSON_FACTORY,
            getCredentials(httpTransport))
            .setApplicationName(applicationName)
            .build();
}

Create a Spreadsheet and Control Permissions

You can create a sheet easily with the Sheets service. But if you want to put your sheet in a specific parent folder and control its permissions and sharing settings, you need to create it with the Drive service instead, setting the spreadsheet MIME type.

You can find your folder ID by navigating to the folder in Google Drive and copying the ID from the URL. Since we set “supports all drives”, we can create this file in a folder on our shared drive; without that setting, calls against shared drives fail with an authorization-type error.

private File createSpreadSheet(Drive driveService, String sheetTitle, String userFolderId) {
    try {
        File fileSpec = new File();
        fileSpec.setName(sheetTitle);
        fileSpec.setParents(Collections.singletonList(userFolderId));
        fileSpec.setMimeType("application/vnd.google-apps.spreadsheet");

        File sheetFile = driveService.files()
                .create(fileSpec)
                .setSupportsAllDrives(true) // Shared drives don't work without this parameter.
                .execute();

        // Lock down sharing on the new sheet. Use a fresh metadata object for the update and
        // remember to call execute(), otherwise the update request is never actually sent.
        File sharingUpdate = new File();
        sharingUpdate.setViewersCanCopyContent(false);
        sharingUpdate.setCopyRequiresWriterPermission(true);
        sharingUpdate.setWritersCanShare(false);
        driveService.files()
                .update(sheetFile.getId(), sharingUpdate)
                .setSupportsAllDrives(true)
                .execute();

        return sheetFile;
    } catch (IOException e) {
        throw new RuntimeException("Error occurred while creating the sheet.\n" + e);
    }
}

Write Data to a Spreadsheet

private void writeToSpreadSheet(Sheets service, String spreadSheetId, String json) {
    final String range = "Sheet1";
    ValueRange body = new ValueRange()
            .setValues(getJsonData(json));
    UpdateValuesResponse response;
    try {
        response = service
                .spreadsheets()
                .values()
                .update(spreadSheetId, range, body)
                .setValueInputOption(VALUE_INPUT_OPTION)
                .execute();
    } catch (IOException e) {
        throw new RuntimeException("Error occurred while inserting/updating values in Google spreadsheet: " + spreadSheetId + "\n" + e);
    }
    logger.info(response.getUpdatedCells() + " cells updated.");
}

Find a Folder in Another Folder

private String getFolderIdIfExists(Drive driveService, String folderName) throws IOException {

    FileList folders = driveService.files().list()
            .setSupportsAllDrives(true)
            .setIncludeItemsFromAllDrives(true)
            .setQ(String.format("'%s' in parents and mimeType = 'application/vnd.google-apps.folder' and name = '%s'",
                    mainFolderId, folderName))
            .execute();

    return folders.getFiles().size() == 1 ? folders.getFiles().get(0).getId() : null;
}

Create a Folder In a Specific Folder

private String createUserFolderAndGetId(Drive driveService, String folderName) throws IOException {

    File fileSpec = new File();
    fileSpec.setName(folderName);
    fileSpec.setParents(Collections.singletonList(mainFolderId));
    fileSpec.setMimeType("application/vnd.google-apps.folder");

    File targetFolder = driveService.files()
            .create(fileSpec)
            .setSupportsAllDrives(true) //Share drives don't work without this parameter.
            .execute();

    return targetFolder.getId();
}

Helm 3 / GitLab Uninstall If Exists

Helm 3 unfortunately does not seem to have a good way to “uninstall if exists”. So, we had to find a workaround to make sure we could reliably wipe out a previous deployment in CI/CD (for the rare cases where we had to change a deployment version).

As we use GitLab, we found this trick in the docs:

If any of the script commands return an exit code different from zero, the job will fail and further commands won’t be executed. This behavior can be avoided by storing the exit code in a variable:

job:
  script:
    - false || exit_code=$?
    - if [ $exit_code -ne 0 ]; then echo "Previous command failed"; fi;

Using this, you can do:

helm uninstall -n your-namespace some-deployment-0-0-5 || exit_code=$?

And while helm will print a note that there was nothing to uninstall on any run after the release is already gone, the pipeline will continue on fine.
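
Putting it together, a rough sketch of what the GitLab job can look like (the job name, stage, namespace, and release name here are just placeholders for our real ones):

cleanup-old-release:
  stage: deploy
  script:
    # Swallow the non-zero exit code when the release is already gone.
    - helm uninstall -n your-namespace some-deployment-0-0-5 || exit_code=$?
    - if [ $exit_code -ne 0 ]; then echo "Nothing to uninstall, continuing"; fi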

IntelliJ Maven Not Resolving Dependencies / Not Applying Excludes

I recently had a ton of issues with dependencies in IntelliJ with Maven, on multiple consecutive occasions. This is pretty odd, as I’ve used IntelliJ and Maven for probably around 10 years (I even have the top YouTube videos on that combination!).

I’m on IntelliJ 2020.1 currently, and I found a few things through painful trial and error here. I hope they help you.

  1. Apparently at some point they removed the “Automatically Import” option that used to pop up when you created or imported a Maven project. This used to make things resolve automatically as you changed your POM. Now you need to make sure you build to pull in dependencies, and for good measure I also click the “re-import” button on the Maven tool window (the little icon at the top row).
  2. There is now an “offline” mode. So, you may keep failing to install because you can’t resolve dependencies on, say, Maven Central. This is super confusing, as you may actually be online, looking at Maven Central, and seeing the dependencies right there. If this happens, check for and disable offline mode!
  3. You may add excludes to a complex dependency (like the Hive metastore in my case) to remove transitive dependencies that break your app/framework (like Spring Boot). You should make sure to change the POM, clean, and then re-import the Maven project again to ensure they’re really gone (see the command-line check after this list). I kept seeing them in the external dependencies list in the object browser, and my builds kept failing, until I did that last re-import step.
  4. If all else fails: clean, invalidate-and-restart (in the File menu), install, and re-import. That seems to be a good catch-all for when you’re completely lost.
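
As a quick command-line check for #3, you can confirm the excludes actually took effect before blaming IntelliJ. The group ID below is just an illustrative example; substitute whatever you excluded:

# The excluded artifacts should no longer show up in the filtered tree.
mvn clean dependency:tree -Dincludes=org.apache.hive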

This all seems pretty crazy to me, but I’ve gone through it a few times now and it seems right. I hope it helps you save some of the time I wasted!