---
layout: page
title: Livy Docs - REST API
---
{% include JB/setup %}
## REST API

### GET /sessions

Returns all the active interactive sessions.

#### Request Parameters
Name | Description | Type |
---|---|---|
from | The start index to fetch sessions | int |
size | Number of sessions to fetch | int |

#### Response Body

Name | Description | Type |
---|---|---|
from | The start index of fetched sessions | int
total | Number of sessions fetched | int
sessions | Session list | list
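As a sketch of how the pagination parameters above map onto the request, the query string can be built like this (the host, port, and helper function are illustrative, not part of the API):

```python
from urllib.parse import urlencode

# Hypothetical Livy endpoint; adjust host/port for your deployment.
LIVY_URL = "http://localhost:8998"

def sessions_url(start=0, size=20):
    """Build the GET /sessions URL with `from`/`size` pagination parameters."""
    return f"{LIVY_URL}/sessions?{urlencode({'from': start, 'size': size})}"

print(sessions_url(start=0, size=10))
# → http://localhost:8998/sessions?from=0&size=10
```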
### POST /sessions

Creates a new interactive Scala, Python, or R shell in the cluster.

#### Request Body
Name | Description | Type |
---|---|---|
kind | The session kind (required) | session kind |
proxyUser | User to impersonate when starting the session | string |
jars | Jars to be used in this session | list of strings
pyFiles | Python files to be used in this session | list of strings
files | Files to be used in this session | list of strings
driverMemory | Amount of memory to use for the driver process | string |
driverCores | Number of cores to use for the driver process | int |
executorMemory | Amount of memory to use per executor process | string |
executorCores | Number of cores to use for each executor | int |
numExecutors | Number of executors to launch for this session | int |
archives | Archives to be used in this session | list of strings
queue | The name of the YARN queue to which the session is submitted | string
name | The name of this session | string
conf | Spark configuration properties | Map of key=val
heartbeatTimeoutInSecond | Timeout in seconds after which the session is orphaned if no heartbeat is received | int
#### Response Body

The created Session.
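A minimal request body for creating a session might look like the following. Only `kind` is required; the other fields (including the user name and configuration values) are placeholders for illustration:

```python
import json

# Illustrative POST /sessions payload; only `kind` is required.
payload = {
    "kind": "pyspark",                 # session kind from the table above
    "proxyUser": "alice",              # hypothetical user to impersonate
    "executorMemory": "2g",
    "numExecutors": 2,
    "conf": {"spark.sql.shuffle.partitions": "64"},
}
body = json.dumps(payload)
print(body)
```

The resulting JSON string is what a client would send with a `Content-Type: application/json` header.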
### GET /sessions/{sessionId}

Returns the session information.

#### Response Body

The Session.

### GET /sessions/{sessionId}/state

Returns the state of the session.

#### Response Body
Name | Description | Type |
---|---|---|
id | Session id | int |
state | The current state of the session | string
### DELETE /sessions/{sessionId}

Kills the Session job.
### GET /sessions/{sessionId}/log

Gets the log lines from this session.

#### Request Parameters
Name | Description | Type |
---|---|---|
from | Offset | int |
size | Max number of log lines to return | int |

#### Response Body

Name | Description | Type |
---|---|---|
id | The session id | int |
from | Offset from start of log | int |
size | Max number of log lines | int |
log | The log lines | list of strings |
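One way to page through the log is to advance `from` by the number of lines actually returned. The helper below is a sketch, not part of Livy, and operates on a response body shaped like the table above:

```python
# Hypothetical paging helper: given a GET /sessions/{sessionId}/log response
# body (a dict with `from` and `log` as described above), compute the `from`
# offset for the next page.
def next_log_offset(resp):
    return resp["from"] + len(resp["log"])

page = {"id": 0, "from": 100, "size": 3, "log": ["a", "b", "c"]}
print(next_log_offset(page))  # → 103
```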
### GET /sessions/{sessionId}/statements

Returns all the statements in a session.

#### Response Body
Name | Description | Type |
---|---|---|
statements | statement list | list |
### POST /sessions/{sessionId}/statements

Runs a statement in a session.

#### Request Body
Name | Description | Type |
---|---|---|
code | The code to execute | string |
#### Response Body

The statement object.
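The request body is just the code to run. A minimal illustrative payload:

```python
import json

# Illustrative POST /sessions/{sessionId}/statements body: only `code`.
statement = {"code": "1 + 1"}
body = json.dumps(statement)
print(body)
# → {"code": "1 + 1"}
```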
### GET /sessions/{sessionId}/statements/{statementId}

Returns a specified statement in a session.

#### Response Body

The statement object.
### POST /sessions/{sessionId}/statements/{statementId}/cancel

Cancels the specified statement in this session.

#### Response Body
Name | Description | Type |
---|---|---|
msg | Always "cancelled" | string
### GET /batches

Returns all the active batch sessions.

#### Request Parameters
Name | Description | Type |
---|---|---|
from | The start index to fetch sessions | int |
size | Number of sessions to fetch | int |

#### Response Body

Name | Description | Type |
---|---|---|
from | The start index of fetched sessions | int |
total | Number of sessions fetched | int |
sessions | Batch list | list |
### POST /batches

Creates a new batch session and submits the application for execution.

#### Request Body

Name | Description | Type |
---|---|---|
file | File containing the application to execute | path (required) |
proxyUser | User to impersonate when running the job | string |
className | Application Java/Spark main class | string |
args | Command line arguments for the application | list of strings |
jars | Jars to be used in this session | list of strings
pyFiles | Python files to be used in this session | list of strings
files | Files to be used in this session | list of strings
driverMemory | Amount of memory to use for the driver process | string |
driverCores | Number of cores to use for the driver process | int |
executorMemory | Amount of memory to use per executor process | string |
executorCores | Number of cores to use for each executor | int |
numExecutors | Number of executors to launch for this session | int |
archives | Archives to be used in this session | list of strings
queue | The name of the YARN queue to which the session is submitted | string
name | The name of this session | string |
conf | Spark configuration properties | Map of key=val |
#### Response Body

The created Batch object.
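An illustrative request body for submitting a Spark application as a batch. The jar path, class name, and arguments are placeholders, not real artifacts:

```python
import json

# Illustrative POST /batches body; `file` is the only required field.
batch = {
    "file": "/user/alice/app.jar",       # application to execute (placeholder)
    "className": "com.example.Main",     # hypothetical main class
    "args": ["--input", "/data/in"],
    "executorMemory": "4g",
    "numExecutors": 4,
    "conf": {"spark.dynamicAllocation.enabled": "false"},
}
body = json.dumps(batch)
print(body)
```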
### GET /batches/{batchId}

Returns the batch session information.

#### Response Body

The Batch.

### GET /batches/{batchId}/state

Returns the state of the batch session.

#### Response Body
Name | Description | Type |
---|---|---|
id | Batch session id | int |
state | The current state of the batch session | string
### DELETE /batches/{batchId}

Kills the Batch job.
### GET /batches/{batchId}/log

Gets the log lines from this batch.

#### Request Parameters
Name | Description | Type |
---|---|---|
from | Offset into the log | int
size | Max number of log lines to return | int
#### Response Body

Name | Description | Type |
---|---|---|
id | The batch id | int
from | Offset from start of log | int
size | Number of log lines returned | int
log | The log lines | list of strings |
## REST Objects

### Session

A session represents an interactive shell.
Name | Description | Type |
---|---|---|
id | The session id | int |
appId | The application id of this session | string |
owner | Remote user who submitted this session | string |
proxyUser | User to impersonate when running | string |
kind | Session kind (spark, pyspark, or sparkr) | session kind |
log | The log lines | list of strings |
state | The session state | string |
appInfo | The detailed application info | Map of key=val |

#### Session State

Value | Description |
---|---|
not_started | Session has not been started |
starting | Session is starting |
idle | Session is waiting for input |
busy | Session is executing a statement |
shutting_down | Session is shutting down |
error | Session errored out |
dead | Session has exited |
success | Session is successfully stopped |
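A client typically polls `GET /sessions/{sessionId}/state` until the session leaves `not_started`/`starting`. A minimal sketch of such a loop, with the HTTP call replaced by a stub so it runs standalone:

```python
import time

# States from which the session will not make further progress.
TERMINAL = {"error", "dead", "success"}

def wait_until_idle(fetch_state, interval=0.0):
    """Poll until the session is idle or reaches a terminal state.

    `fetch_state` stands in for a real GET /sessions/{sessionId}/state call.
    """
    while True:
        state = fetch_state()
        if state == "idle" or state in TERMINAL:
            return state
        time.sleep(interval)

# Simulate a session that starts up and becomes ready.
states = iter(["not_started", "starting", "idle"])
print(wait_until_idle(lambda: next(states)))  # → idle
```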
#### Session Kind

Value | Description |
---|---|
spark | Interactive Scala Spark session |
pyspark | Interactive Python 2 Spark session |
pyspark3 | Interactive Python 3 Spark session |
sparkr | Interactive R Spark session |
#### pyspark

To change the Python executable the session uses, Livy reads the path from the environment variable `PYSPARK_PYTHON` (same as pyspark). Like pyspark, if Livy is running in `local` mode, just set the environment variable. If the session is running in `yarn-cluster` mode, set `spark.yarn.appMasterEnv.PYSPARK_PYTHON` in SparkConf so the environment variable is passed to the driver.
#### pyspark3

To change the Python executable the session uses, Livy reads the path from the environment variable `PYSPARK3_PYTHON`. Like pyspark, if Livy is running in `local` mode, just set the environment variable. If the session is running in `yarn-cluster` mode, set `spark.yarn.appMasterEnv.PYSPARK3_PYTHON` in SparkConf so the environment variable is passed to the driver.
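For a `yarn-cluster` session, the SparkConf entry above can be supplied directly in the `conf` field of the `POST /sessions` body. The payload below is illustrative; the Python path is a placeholder:

```python
import json

# Illustrative POST /sessions body pointing a yarn-cluster pyspark session
# at a specific Python executable (the path is a placeholder).
payload = {
    "kind": "pyspark",
    "conf": {
        "spark.yarn.appMasterEnv.PYSPARK_PYTHON": "/opt/python3/bin/python",
    },
}
print(json.dumps(payload))
```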
### Statement

A statement represents the result of an execution statement.
Name | Description | Type |
---|---|---|
id | The statement id | integer |
code | The execution code | string |
state | The execution state | statement state |
output | The execution output | statement output |
#### Statement State

Value | Description |
---|---|
waiting | Statement is enqueued but execution hasn't started |
running | Statement is currently running |
available | Statement has a response ready |
error | Statement failed |
cancelling | Statement is being cancelled |
cancelled | Statement is cancelled |
#### Statement Output

Name | Description | Type |
---|---|---|
status | Execution status | string |
execution_count | A monotonically increasing number | integer |
data | Statement output | An object mapping a mime type to the result. If the mime type is ``application/json``, the value is a JSON value. |
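Because `data` maps a MIME type to the result, a client picks the representation it can handle. A sketch against an output dict shaped like the table above (the values are made up for illustration):

```python
# Illustrative statement output, shaped like the Statement Output table.
output = {
    "status": "ok",
    "execution_count": 1,
    "data": {"text/plain": "2"},
}

# `data` maps a MIME type to the result; prefer plain text when present.
result = output["data"].get("text/plain")
print(result)  # → 2
```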
### Batch

Name | Description | Type |
---|---|---|
id | The session id | int |
appId | The application id of this session | string |
appInfo | The detailed application info | Map of key=val |
log | The log lines | list of strings |
state | The batch state | string |