Logging in Serverless Spark (Part 3)

Mrudula Madiraju
2 min read · Feb 14, 2022

Export Logs for each application

Overview

In Part 1 and Part 2 of this series, we covered the fundamentals of logging in Analytics Engine Serverless Spark.

For simple applications, you can use the LogDNA interface to troubleshoot or analyze how an application performed. For long-running applications, you might find it useful to export the logs so that you can analyze them locally or pass them on to a team member.

This last part covers how you can export the logs from LogDNA programmatically.

LogDNA Service Key & Export API

Follow the steps here to get hold of your LogDNA service key and the API endpoint for exporting the logs.
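For the commands that follow, it helps to set the service key and the application ID as environment variables. A minimal setup sketch, with placeholder values you would substitute with your own:

export service_key=<your-logdna-service-key>
export application_id=<your-spark-application-id>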

Execute the Export API for your application

Assuming your Platform LogDNA instance exists in us-south, you can execute the following command to export the logs.

In this example:
- query is the application_id
- from is the timestamp 24 hours (86400 seconds) before the current timestamp
- to is the current timestamp

Note that the command appends 000 to the epoch seconds returned by date +%s, converting them to milliseconds.

curl -s "https://api.us-south.logging.cloud.ibm.com/v2/export?query=$application_id&to=$(date +%s)000&from=$(($(date +%s)-86400))000" -u $service_key: > logs.json
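If you export logs regularly, you could wrap the call in a small script. Below is a minimal sketch under the same assumptions (us-south endpoint, service key in $service_key); the script name and the hours_back parameter are just illustrations:

#!/bin/bash
# export_logs.sh -- export LogDNA lines for one Spark application
# Usage: ./export_logs.sh <application_id> [hours_back]
application_id=$1
hours_back=${2:-24}                                # default: last 24 hours

to=$(date +%s)000                                  # now, in milliseconds
from=$(($(date +%s) - hours_back * 3600))000       # window start, in milliseconds

curl -s "https://api.us-south.logging.cloud.ibm.com/v2/export?query=${application_id}&to=${to}&from=${from}" \
  -u "${service_key}:" > "${application_id}.json"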

Extract Application Log Lines

Next, you can use the jq tool to extract the “message” lines from the JSON output. Each message element in the exported JSON contains one application log line.

cat logs.json | jq '.lines | .[] | .message' > $application_id.log
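Note that jq prints each message as a quoted JSON string. If you prefer plain text lines, the -r (raw output) flag strips the quotes and unescapes the content:

jq -r '.lines[].message' logs.json > $application_id.log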

Debug with grep

A simple grep for “task” or any other term of interest in the log file can give you the insights you need.
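For example (the search terms here are just illustrations):

# Case-insensitive search for task-related lines
grep -i "task" $application_id.log

# Scan for errors and exceptions, with line numbers and one line of trailing context
grep -n -A 1 -E "ERROR|Exception" $application_id.log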

That’s it!

Try out Log Analysis with Analytics Engine. Exporting the logs is a good practice for long-running applications where you need to examine them in detail.

