Get Logs for a Spark Job
The most common log-related tasks for Spark jobs are listed below; a sketch of typical `kubectl` invocations for each task follows the list.

* Retrieve the initial logs that contain information about the pod spin-up.
* Retrieve the pod ID.
* Retrieve logs from failed jobs.
* Tail logs from running containers by using the `-f` (follow) option.
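The exact pod names and namespace depend on your deployment. The following is a minimal sketch of these tasks using standard `kubectl` commands, with `<namespace>`, `<driver-pod>`, and `<failed-pod>` as placeholders rather than names taken from a real cluster:

```bash
# List pods in the job's namespace to retrieve the pod ID
# (driver and executor pods appear here while they exist).
kubectl get pods -n <namespace>

# Retrieve the initial logs, including pod spin-up information, from the driver pod.
kubectl logs <driver-pod> -n <namespace>

# Retrieve logs from a pod that belongs to a failed job.
kubectl logs <failed-pod> -n <namespace>

# Tail (follow) logs from a running container with the -f option.
kubectl logs -f <driver-pod> -n <namespace>
```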
Spark deletes failed and successful executor pods. Fusion provides a cleanup Kubernetes cron job that removes successfully completed driver pods every 15 minutes. |
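To see this cleanup behavior on your cluster, standard `kubectl` queries can list the scheduled cron jobs and any completed driver pods that have not yet been removed. The cleanup job's actual name is deployment-specific and is not shown here:

```bash
# List Kubernetes cron jobs in the namespace; the Fusion cleanup job appears here
# (its exact name depends on the deployment).
kubectl get cronjobs -n <namespace>

# List completed (Succeeded) pods that the cleanup job eventually removes.
kubectl get pods -n <namespace> --field-selector=status.phase=Succeeded
```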
Viewing the Spark UI
If you need to monitor or inspect your Spark job executions, you can use port forwarding to access the Spark UI in your web browser. Port forwarding maps a port on your local machine to the port on the pod that is running the Spark driver.
To view the Spark UI, find the pod that is running the Spark driver and run the following command:
kubectl -n <namespace> port-forward <driver-pod> 4040:4040
You can then access the Spark UI at `http://localhost:4040`.
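If you are not sure which pod is running the driver, one way to locate it is by label. This sketch assumes the deployment applies the standard Spark on Kubernetes `spark-role=driver` label to driver pods, which may not hold for every Fusion job type:

```bash
# List only driver pods, assuming the standard spark-role=driver label is applied.
kubectl get pods -n <namespace> -l spark-role=driver
```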
For related topics, see Spark Operations.