Welcome to Livy
===============

Livy is an open source REST interface for interacting with Apache Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in YARN.

* Interactive Scala, Python and R shells
* Batch submissions in Scala, Java, Python
* Multiple users can share the same server (impersonation support)
* Can be used for submitting jobs from anywhere with REST
* Does not require any code change to your programs

The code is currently incubating in Hue but will hopefully eventually graduate into its own top-level
project. `Pull requests`_ are welcome!

.. _Pull requests: https://github.com/cloudera/hue/pulls

Quick Start
===========

Livy powers the Spark snippets of the `Hadoop Notebook`_ in `Hue 3.8`_; you can see the
`implementation here`_.

See the API documentation below and some curl examples:

* `Interactive shells`_
* `Batch jobs`_
* `Shared RDDs`_

.. _Interactive shells: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-for-interactive-spark/
.. _Batch jobs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-submitting-batch-jar-python-and-streaming-spark-jobs/
.. _Shared RDDs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-sharing-spark-rdds-and-contexts/
.. _Hadoop Notebook: http://gethue.com/new-notebook-application-for-spark-sql/
.. _Hue 3.8: http://gethue.com/hue-3-8-with-an-oozie-editor-revamp-better-performances-improved-spark-ui-is-out/
.. _implementation here: https://github.com/cloudera/hue/blob/master/apps/spark/src/spark/job_server_api.py

Prerequisites
=============

To build/run Livy, you will need:

Debian/Ubuntu:

* mvn (from the ``maven`` package or the Maven 3 tarball)
* openjdk-7-jdk (or Oracle Java 7 JDK)
* Spark 1.4+ (from the `Apache Spark tarball`_)
* Python 2.6+
* R 3.x

Redhat/CentOS:

* mvn (from the ``maven`` package or the Maven 3 tarball)
* java-1.7.0-openjdk (or Oracle Java 7 JDK)
* Spark 1.4+ (from the `Apache Spark tarball`_)
* Python 2.6+
* R 3.x

MacOS:

* Xcode command line tools
* Oracle JDK 1.7+
* Maven (from Homebrew)
* apache-spark 1.5 (from Homebrew)
* Python 2.6+
* R 3.x

.. _Apache Spark tarball: https://spark.apache.org/downloads.html

Building Livy
=============

Livy is currently built by the `Hue Build System`_; it can also be built on
its own (that is, without any other Hue dependency) with `Apache Maven`_. To
check out and build Livy, run:

.. code:: shell

   % git clone git@github.com:cloudera/hue.git
   % cd hue
   % cd apps/spark/java
   % mvn -DskipTests clean package

By default Livy is built with the Cloudera distribution of Spark (currently
based on Spark 1.5.0), but it is simple to support other versions, such as
Spark 1.4.1, by compiling Livy with:

.. code:: shell

   % mvn -DskipTests -Dspark.version=1.4.1 clean package

.. _Hue Build System: https://github.com/cloudera/hue/#getting-started
.. _Apache Maven: http://maven.apache.org

Running Tests
=============

In order to run the Livy tests, first follow the instructions in `Building
Livy`_. Then run:

.. code:: shell

   % export SPARK_HOME=/usr/lib/spark
   % export HADOOP_CONF_DIR=/etc/hadoop/conf
   % mvn test

Running Livy
============

In order to run Livy with local sessions, first export these variables:

.. code:: shell

   % export SPARK_HOME=/usr/lib/spark
   % export HADOOP_CONF_DIR=/etc/hadoop/conf

Then start the server with:

.. code:: shell

   % ./bin/livy-server

Or start it with YARN sessions by running:

.. code:: shell

   % env \
     LIVY_SERVER_JAVA_OPTS="-Dlivy.server.session.factory=yarn" \
     CLASSPATH=`hadoop classpath` \
     $LIVY_HOME/bin/livy-server

Livy Configuration
==================

The properties of the server can be modified by copying
`livy-defaults.conf.template`_ and renaming it ``conf/livy-defaults.conf``. The
Livy configuration directory can be placed in an alternative location by defining
``LIVY_CONF_DIR``.

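For example, assuming your configuration files live in ``/etc/livy/conf`` (a hypothetical
path), the server can be pointed at them with:

.. code:: shell

   % env \
     LIVY_CONF_DIR=/etc/livy/conf \
     $LIVY_HOME/bin/livy-server
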
In particular, ``YARN`` mode (the default is a ``local`` process, for development) can be enabled with:

.. code:: shell

   livy.server.session.factory = yarn

.. _livy-defaults.conf.template: https://github.com/cloudera/hue/blob/master/apps/spark/java/conf/livy-defaults.conf.template

Spark Configuration
===================

Livy's Spark sessions are configured through two mechanisms. The first is by way of the local
`Spark configuration`_. Create or modify the Spark configuration files as directed, and point
Livy at this directory with:

.. code:: shell

   % env \
     SPARK_CONF_DIR=... \
     $LIVY_HOME/bin/livy-server

The second mechanism is whitelisting the `Spark configuration`_ options that a user may set when
creating a Spark session. This list can be created by copying
`spark-user-configurable-options.template`_ to ``spark-user-configurable-options`` and listing
the options the user may specify in the ``conf`` session field.

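As a sketch, a whitelist that only lets users tune executor sizing might contain entries
like the following (the one-option-per-line format and these particular options are
assumptions for illustration, not a recommendation):

.. code:: shell

   # conf/spark-user-configurable-options (illustrative contents)
   spark.executor.memory
   spark.executor.cores
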
*Warning*: Be careful before enabling options. Some options may allow a malicious user to
read files that are accessible to the Livy Server process user. Among other things, this might
allow a user to access the Livy TLS private key, Kerberos tickets, and more.

.. _Spark configuration: https://spark.apache.org/docs/latest/configuration.html
.. _spark-user-configurable-options.template: https://github.com/cloudera/hue/blob/master/apps/spark/java/conf/spark-user-configurable-options.template

Spark Example
=============

Now let's see this in action by interacting with Livy in Python using the `Requests`_
library. By default Livy runs on port 8998 (which can be changed with the
``livy_server_port`` config option). We'll start off with a Spark session that
takes Scala code:

.. code:: shell

   % sudo pip install requests

.. code:: python

   >>> import json, pprint, requests, textwrap
   >>> host = 'http://localhost:8998'
   >>> data = {'kind': 'spark'}
   >>> headers = {'Content-Type': 'application/json'}
   >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
   >>> r.json()
   {u'state': u'starting', u'id': 0, u'kind': u'spark'}

Once the session has finished starting up, it transitions to the idle state:

.. code:: python

   >>> session_url = host + r.headers['location']
   >>> r = requests.get(session_url, headers=headers)
   >>> r.json()
   {u'state': u'idle', u'id': 0, u'kind': u'spark'}

Now we can execute Scala by passing in a simple JSON command:

.. code:: python

   >>> statements_url = session_url + '/statements'
   >>> data = {'code': '1 + 1'}
   >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
   >>> r.json()
   {u'output': None, u'state': u'running', u'id': 0}

If a statement takes longer than a few milliseconds to execute, Livy returns
early and provides a URL that can be polled until it is complete:

.. code:: python

   >>> statement_url = host + r.headers['location']
   >>> r = requests.get(statement_url, headers=headers)
   >>> pprint.pprint(r.json())
   [{u'id': 0,
     u'output': {u'data': {u'text/plain': u'res0: Int = 2'},
                 u'execution_count': 0,
                 u'status': u'ok'},
     u'state': u'available'}]

That was a pretty simple example. More interesting is using Spark to estimate
Pi. This is from the `Spark Examples`_:

.. code:: python

   >>> data = {
   ...   'code': textwrap.dedent("""\
   ...     val NUM_SAMPLES = 100000;
   ...     val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
   ...       val x = Math.random();
   ...       val y = Math.random();
   ...       if (x*x + y*y < 1) 1 else 0
   ...     }.reduce(_ + _);
   ...     println(\"Pi is roughly \" + 4.0 * count / NUM_SAMPLES)
   ...     """)
   ... }
   >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
   >>> pprint.pprint(r.json())
   {u'id': 1,
    u'output': {u'data': {u'text/plain': u'Pi is roughly 3.14004\nNUM_SAMPLES: Int = 100000\ncount: Int = 78501'},
                u'execution_count': 1,
                u'status': u'ok'},
    u'state': u'available'}

Finally, let's close our session:

.. code:: python

   >>> session_url = 'http://localhost:8998/sessions/0'
   >>> requests.delete(session_url, headers=headers)
   <Response [204]>

.. _Requests: http://docs.python-requests.org/en/latest/
.. _Spark Examples: https://spark.apache.org/examples.html

PySpark Example
===============

PySpark has the exact same API, just with a different initial command. Note that
``session_url`` and ``statements_url`` must be updated to point at the new session's id:

.. code:: python

   >>> data = {'kind': 'pyspark'}
   >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
   >>> r.json()
   {u'id': 1, u'state': u'idle'}

The Pi example from before can then be run as:

.. code:: python

   >>> data = {
   ...   'code': textwrap.dedent("""\
   ...     import random
   ...     NUM_SAMPLES = 100000
   ...     def sample(p):
   ...       x, y = random.random(), random.random()
   ...       return 1 if x*x + y*y < 1 else 0
   ...
   ...     count = sc.parallelize(xrange(0, NUM_SAMPLES)).map(sample) \
   ...               .reduce(lambda a, b: a + b)
   ...     print "Pi is roughly %f" % (4.0 * count / NUM_SAMPLES)
   ...     """)
   ... }
   >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
   >>> pprint.pprint(r.json())
   {u'id': 12,
    u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                u'execution_count': 12,
                u'status': u'ok'},
    u'state': u'available'}

SparkR Example
==============

SparkR also has the same API:

.. code:: python

   >>> data = {'kind': 'sparkr'}
   >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
   >>> r.json()
   {u'id': 1, u'state': u'idle'}

The Pi example from before can then be run as:

.. code:: python

   >>> data = {
   ...   'code': textwrap.dedent("""\
   ...     n <- 100000
   ...     piFunc <- function(elem) {
   ...       rands <- runif(n = 2, min = -1, max = 1)
   ...       val <- ifelse((rands[1]^2 + rands[2]^2) < 1, 1.0, 0.0)
   ...       val
   ...     }
   ...     piFuncVec <- function(elems) {
   ...       message(length(elems))
   ...       rands1 <- runif(n = length(elems), min = -1, max = 1)
   ...       rands2 <- runif(n = length(elems), min = -1, max = 1)
   ...       val <- ifelse((rands1^2 + rands2^2) < 1, 1.0, 0.0)
   ...       sum(val)
   ...     }
   ...     slices <- 2
   ...     rdd <- parallelize(sc, 1:n, slices)
   ...     count <- reduce(lapplyPartition(rdd, piFuncVec), sum)
   ...     cat("Pi is roughly", 4.0 * count / n, "\n")
   ...     """)
   ... }
   >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
   >>> pprint.pprint(r.json())
   {u'id': 12,
    u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                u'execution_count': 12,
                u'status': u'ok'},
    u'state': u'available'}

Community
=========

* User group: http://groups.google.com/a/cloudera.org/group/hue-user
* Umbrella JIRA: https://issues.cloudera.org/browse/HUE-2588
* Pull requests: https://github.com/cloudera/hue/pulls

REST API
========

GET /sessions
-------------

Returns all the active interactive sessions.

Response Body
^^^^^^^^^^^^^

+----------+-----------------+------+
| name     | description     | type |
+==========+=================+======+
| sessions | `session`_ list | list |
+----------+-----------------+------+

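For example, the sessions list can be fetched with the `Requests`_ setup used in the
examples above (``host`` and ``headers`` as defined there):

.. code:: python

   >>> r = requests.get(host + '/sessions', headers=headers)
   >>> r.json()['sessions']  # a list of session objects, as described below
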
POST /sessions
--------------

Creates a new interactive Scala, Python or R shell in the cluster.

Request Body
^^^^^^^^^^^^

+--------------------+--------------------------------------------------------------------------------+-----------------+
| name               | description                                                                    | type            |
+====================+================================================================================+=================+
| kind               | The session kind (required)                                                    | `session kind`_ |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| proxyUser          | The user to impersonate when running this session (e.g. bob)                  | string          |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| jars               | Files to be placed on the Java classpath                                      | list of paths   |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| pyFiles            | Files to be placed on the PYTHONPATH                                          | list of paths   |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| files              | Files to be placed in executor working directory                              | list of paths   |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| driverMemory       | Memory for driver (e.g. 1000M, 2G)                                            | string          |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| driverCores        | Number of cores used by driver (YARN mode only)                               | int             |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| executorMemory     | Memory for executor (e.g. 1000M, 2G)                                          | string          |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| executorCores      | Number of cores used by executor                                              | int             |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| totalExecutorCores | Total number of cluster cores used by the executors (Standalone mode only)    | int             |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| numExecutors       | Number of executors (YARN mode only)                                          | int             |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| archives           | Archives to be uncompressed in the executor working directory (YARN mode only)| list of paths   |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| queue              | The YARN queue to submit to (YARN mode only)                                  | string          |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| name               | Name of the application                                                       | string          |
+--------------------+--------------------------------------------------------------------------------+-----------------+
| conf               | Spark configuration properties                                                | Map of key=val  |
+--------------------+--------------------------------------------------------------------------------+-----------------+

Response Body
^^^^^^^^^^^^^

The created `Session`_.

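As a sketch, a session request that fills in a few of the optional fields might look
like this (the user name and resource sizes are illustrative):

.. code:: python

   >>> data = {
   ...     'kind': 'pyspark',
   ...     'proxyUser': 'bob',               # illustrative impersonated user
   ...     'executorMemory': '2G',           # illustrative sizing
   ...     'executorCores': 2,
   ...     'conf': {'spark.ui.enabled': 'true'},  # must be a whitelisted option
   ... }
   >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
   >>> r.json()  # the created session object
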
GET /sessions/{sessionId}
-------------------------

Returns the session information.

Response
^^^^^^^^

The `Session`_.

DELETE /sessions/{sessionId}
----------------------------

Kills the `Session`_ job.

GET /sessions/{sessionId}/logs
------------------------------

Gets the log lines from this session.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-----------------------------------+------+
| name | description                       | type |
+======+===================================+======+
| from | offset                            | int  |
+------+-----------------------------------+------+
| size | max number of log lines to return | int  |
+------+-----------------------------------+------+

Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The session id        | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total amount of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+

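A sketch of paging through a session's logs, reusing ``session_url`` from the examples
above (the offsets are illustrative):

.. code:: python

   >>> r = requests.get(session_url + '/logs',
   ...                  params={'from': 0, 'size': 100},  # first 100 log lines
   ...                  headers=headers)
   >>> r.json()['log']  # a list of log lines
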
GET /sessions/{sessionId}/statements
------------------------------------

Returns all the statements in a session.

Response Body
^^^^^^^^^^^^^

+------------+-------------------+------+
| name       | description       | type |
+============+===================+======+
| statements | `statement`_ list | list |
+------------+-------------------+------+

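For example, with ``statements_url`` from the examples above:

.. code:: python

   >>> r = requests.get(statements_url, headers=headers)
   >>> r.json()['statements']  # a list of statement objects, as described below
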
POST /sessions/{sessionId}/statements
-------------------------------------

Executes a statement in a session.

Request Body
^^^^^^^^^^^^

+------+---------------------+--------+
| name | description         | type   |
+======+=====================+========+
| code | The code to execute | string |
+------+---------------------+--------+

Response Body
^^^^^^^^^^^^^

The `statement`_ object.

GET /batches
------------

Returns all the active batch jobs.

Response Body
^^^^^^^^^^^^^

+---------+---------------+------+
| name    | description   | type |
+=========+===============+======+
| batches | `batch`_ list | list |
+---------+---------------+------+

POST /batches
-------------

Creates a new batch job.

Request Body
^^^^^^^^^^^^

+----------------+--------------------------------------------------+-----------------+
| name           | description                                      | type            |
+================+==================================================+=================+
| proxyUser      | The user to impersonate when running the job     | string          |
+----------------+--------------------------------------------------+-----------------+
| file           | File containing the application to execute       | path (required) |
+----------------+--------------------------------------------------+-----------------+
| args           | Command line arguments                           | list of strings |
+----------------+--------------------------------------------------+-----------------+
| className      | Application's Java/Spark main class              | string          |
+----------------+--------------------------------------------------+-----------------+
| jars           | Files to be placed on the Java classpath         | list of paths   |
+----------------+--------------------------------------------------+-----------------+
| pyFiles        | Files to be placed on the PYTHONPATH             | list of paths   |
+----------------+--------------------------------------------------+-----------------+
| files          | Files to be placed in executor working directory | list of paths   |
+----------------+--------------------------------------------------+-----------------+
| driverMemory   | Memory for driver (e.g. 1000M, 2G)               | string          |
+----------------+--------------------------------------------------+-----------------+
| driverCores    | Number of cores used by driver                   | int             |
+----------------+--------------------------------------------------+-----------------+
| executorMemory | Memory for executor (e.g. 1000M, 2G)             | string          |
+----------------+--------------------------------------------------+-----------------+
| executorCores  | Number of cores used by executor                 | int             |
+----------------+--------------------------------------------------+-----------------+
| numExecutors   | Number of executors                              | int             |
+----------------+--------------------------------------------------+-----------------+
| archives       | Archives to be uncompressed (YARN mode only)     | list of paths   |
+----------------+--------------------------------------------------+-----------------+
| queue          | The YARN queue to submit to (YARN mode only)     | string          |
+----------------+--------------------------------------------------+-----------------+
| name           | Name of the application                          | string          |
+----------------+--------------------------------------------------+-----------------+
| conf           | Spark configuration properties                   | Map of key=val  |
+----------------+--------------------------------------------------+-----------------+

Response Body
^^^^^^^^^^^^^

The created `Batch`_ object.

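As a sketch, submitting a jar as a batch job might look like the following (the file
path and class name are hypothetical placeholders; the file must be reachable by the
cluster):

.. code:: python

   >>> data = {
   ...     'file': '/path/to/my-app.jar',     # hypothetical path
   ...     'className': 'com.example.MyApp',  # hypothetical main class
   ...     'args': ['arg1', 'arg2'],
   ... }
   >>> r = requests.post(host + '/batches', data=json.dumps(data), headers=headers)
   >>> r.json()  # the created batch object, including its id and state
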
GET /batches/{batchId}
----------------------

Returns the batch information.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-----------------------------------+------+
| name | description                       | type |
+======+===================================+======+
| from | offset                            | int  |
+------+-----------------------------------+------+
| size | max number of log lines to return | int  |
+------+-----------------------------------+------+

Response Body
^^^^^^^^^^^^^

+-------+-----------------------------+-----------------+
| name  | description                 | type            |
+=======+=============================+=================+
| id    | The batch id                | int             |
+-------+-----------------------------+-----------------+
| state | The state of the batch      | string          |
+-------+-----------------------------+-----------------+
| log   | The output of the batch job | list of strings |
+-------+-----------------------------+-----------------+

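A sketch of polling a batch until it leaves the ``running`` state (the batch id is
illustrative, and the terminal state names are not enumerated in this document):

.. code:: python

   >>> import time
   >>> batch_url = host + '/batches/0'  # 0 is an illustrative batch id
   >>> while requests.get(batch_url, headers=headers).json()['state'] == 'running':
   ...     time.sleep(1)  # poll once per second until the batch finishes
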
DELETE /batches/{batchId}
-------------------------

Kills the `Batch`_ job.

GET /batches/{batchId}/logs
---------------------------

Gets the log lines from this batch.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-----------------------------------+------+
| name | description                       | type |
+======+===================================+======+
| from | offset                            | int  |
+------+-----------------------------------+------+
| size | max number of log lines to return | int  |
+------+-----------------------------------+------+

Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The batch id          | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total amount of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+

REST Objects
============

Session
-------

A session represents an interactive shell.

+-------+------------------------------------------+----------------------------+
| name  | description                              | type                       |
+=======+==========================================+============================+
| id    | The session id                           | int                        |
+-------+------------------------------------------+----------------------------+
| kind  | session kind (spark, pyspark, or sparkr) | `session kind`_ (required) |
+-------+------------------------------------------+----------------------------+
| log   | The log lines                            | list of strings            |
+-------+------------------------------------------+----------------------------+
| state | The session state                        | string                     |
+-------+------------------------------------------+----------------------------+

Session State
^^^^^^^^^^^^^

+-------------+----------------------------------+
| name        | description                      |
+=============+==================================+
| not_started | session has not been started     |
+-------------+----------------------------------+
| starting    | session is starting              |
+-------------+----------------------------------+
| idle        | session is waiting for input     |
+-------------+----------------------------------+
| busy        | session is executing a statement |
+-------------+----------------------------------+
| error       | session errored out              |
+-------------+----------------------------------+
| dead        | session has exited               |
+-------------+----------------------------------+

Session Kind
^^^^^^^^^^^^

+---------+----------------------------------+
| name    | description                      |
+=========+==================================+
| spark   | interactive scala/spark session  |
+---------+----------------------------------+
| pyspark | interactive python/spark session |
+---------+----------------------------------+
| sparkr  | interactive R/spark session      |
+---------+----------------------------------+

Statement
---------

A statement represents a code execution request and its result.

+--------+----------------------+---------------------+
| name   | description          | type                |
+========+======================+=====================+
| id     | The statement id     | integer             |
+--------+----------------------+---------------------+
| state  | The execution state  | `statement state`_  |
+--------+----------------------+---------------------+
| output | The execution output | `statement output`_ |
+--------+----------------------+---------------------+

Statement State
^^^^^^^^^^^^^^^

+-----------+----------------------------------+
| name      | description                      |
+===========+==================================+
| running   | Statement is currently executing |
+-----------+----------------------------------+
| available | Statement has a ready response   |
+-----------+----------------------------------+
| error     | Statement failed                 |
+-----------+----------------------------------+

Statement Output
^^^^^^^^^^^^^^^^

+-----------------+-------------------+----------------------------------+
| name            | description       | type                             |
+=================+===================+==================================+
| status          | execution status  | string                           |
+-----------------+-------------------+----------------------------------+
| execution_count | a monotonically   | integer                          |
|                 | increasing number |                                  |
+-----------------+-------------------+----------------------------------+
| data            | statement output  | an object mapping a mime type to |
|                 |                   | the result. If the mime type is  |
|                 |                   | ``application/json``, the value  |
|                 |                   | will be a JSON value             |
+-----------------+-------------------+----------------------------------+

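A small sketch of reading statement results by mime type, continuing the `Requests`_
examples above (preferring ``application/json`` over ``text/plain`` is just one
reasonable choice):

.. code:: python

   >>> stmts = requests.get(statements_url, headers=headers).json()['statements']
   >>> for stmt in stmts:
   ...     out = stmt['output']
   ...     if out and out['status'] == 'ok':
   ...         data = out['data']
   ...         # prefer structured JSON output when present, else plain text
   ...         print data.get('application/json', data.get('text/plain'))
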
Batch
-----

A batch represents a submitted batch job.

+-------+------------------------------------------+----------------------------+
| name  | description                              | type                       |
+=======+==========================================+============================+
| id    | The batch id                             | int                        |
+-------+------------------------------------------+----------------------------+
| kind  | session kind (spark, pyspark, or sparkr) | `session kind`_ (required) |
+-------+------------------------------------------+----------------------------+
| log   | The log lines                            | list of strings            |
+-------+------------------------------------------+----------------------------+
| state | The batch state                          | string                     |
+-------+------------------------------------------+----------------------------+

License
=======

Apache License, Version 2.0

http://www.apache.org/licenses/LICENSE-2.0