
Welcome to Livy, the REST Spark Server
======================================

Livy is an open source REST interface for interacting with Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in YARN.

* Interactive Scala, Python and R shells
* Batch submissions in Scala, Java, Python
* Multiple users can share the same server (impersonation support)
* Can be used for submitting jobs from anywhere with REST
* Does not require any code change to your programs

The code is currently incubating in Hue but will hopefully graduate into its own top-level project. `Pull requests`_ are welcome!

.. _Pull requests: https://github.com/cloudera/hue/pulls

Quick Start
===========

Livy powers the Spark snippets of the `Hadoop Notebook`_ in `Hue 3.8`_; you can
see the `implementation here`_.

See the API documentation below and some curl examples:

* `Interactive shells`_
* `Batch jobs`_
* `Shared RDDs`_

.. _Interactive shells: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-for-interactive-spark/
.. _Batch jobs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-submitting-batch-jar-python-and-streaming-spark-jobs/
.. _Shared RDDs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-sharing-spark-rdds-and-contexts/
.. _Hadoop Notebook: http://gethue.com/new-notebook-application-for-spark-sql/
.. _Hue 3.8: http://gethue.com/hue-3-8-with-an-oozie-editor-revamp-better-performances-improved-spark-ui-is-out/
.. _implementation here: https://github.com/cloudera/hue/blob/master/apps/spark/src/spark/job_server_api.py

Prerequisites
=============

To build/run Livy, you will need:

Debian/Ubuntu:
  * mvn (from ``maven`` package or maven3 tarball)
  * openjdk-7-jdk (or Oracle Java7 jdk)
  * spark 1.5 (from `Apache Spark tarball`_)
  * Python 2.6+
  * R 3.x

Redhat/CentOS:
  * mvn (from ``maven`` package or maven3 tarball)
  * java-1.7.0-openjdk (or Oracle Java7 jdk)
  * spark 1.5 (from `Apache Spark tarball`_)
  * Python 2.6+
  * R 3.x

MacOS:
  * Xcode command line tools
  * Oracle's JDK 1.7+
  * Maven (via Homebrew)
  * apache-spark 1.5 (via Homebrew)
  * Python 2.6+
  * R 3.x

.. _Apache Spark tarball: https://spark.apache.org/downloads.html

Building Livy
=============

Livy is currently built by the `Hue Build System`_, but it can also be built on
its own (that is, without any other Hue dependency) with `Apache Maven`_. To
build, check out the code, go to the Livy directory, and run:

.. code:: shell

    git clone git@github.com:cloudera/hue.git
    cd hue

.. code:: shell

    % cd apps/spark/java
    % mvn -DskipTests clean package

.. _Hue Build System: https://github.com/cloudera/hue/#getting-started
.. _Apache Maven: http://maven.apache.org

Running Tests
=============

In order to run the Livy tests, first follow the instructions in `Building
Livy`_. Then run:

.. code:: shell

    % export SPARK_HOME=/usr/lib/spark
    % export HADOOP_CONF_DIR=/etc/hadoop/conf
    % mvn test

Running Livy
============

In order to run Livy with local sessions, first export these variables:

.. code:: shell

    % export SPARK_HOME=/usr/lib/spark
    % export HADOOP_CONF_DIR=/etc/hadoop/conf

Then start the server with:

.. code:: shell

    % ./bin/livy-server

Or with YARN sessions by running:

.. code:: shell

    % env \
        LIVY_SERVER_JAVA_OPTS="-Dlivy.server.session.factory=yarn" \
        CLASSPATH=`hadoop classpath` \
        ./bin/livy-server

Livy Configuration
==================

The properties of the server can be modified by copying the template file
`livy-defaults.conf.tmpl <https://github.com/cloudera/hue/blob/master/apps/spark/java/conf/livy-defaults.conf.tmpl>`_
and renaming it ``livy-defaults.conf``.

In particular, sessions can be switched to YARN mode (the default is a ``local``
process, which is intended for development) with:

.. code:: shell

    livy.server.session.factory = yarn

Spark Example
=============

Now let's see it in action by interacting with it in Python with the `Requests`_
library. By default Livy runs on port 8998 (which can be changed with the
``livy_server_port`` config option). We'll start off with a Spark session that
takes Scala code:

.. code:: shell

    % sudo pip install requests

.. code:: python

    >>> import json, pprint, requests, textwrap
    >>> host = 'http://localhost:8998'
    >>> data = {'kind': 'spark'}
    >>> headers = {'Content-Type': 'application/json'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'state': u'starting', u'id': 0, u'kind': u'spark'}

Once the session has completed starting up, it transitions to the idle state:

.. code:: python

    >>> session_url = host + r.headers['location']
    >>> r = requests.get(session_url, headers=headers)
    >>> r.json()
    {u'state': u'idle', u'id': 0, u'kind': u'spark'}

Now we can execute Scala by passing in a simple JSON command:

.. code:: python

    >>> statements_url = session_url + '/statements'
    >>> data = {'code': '1 + 1'}
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'output': None, u'state': u'running', u'id': 0}

If a statement takes longer than a few milliseconds to execute, Livy returns
early and provides a URL that can be polled until it is complete:

.. code:: python

    >>> statement_url = host + r.headers['location']
    >>> r = requests.get(statement_url, headers=headers)
    >>> pprint.pprint(r.json())
    [{u'id': 0,
      u'output': {u'data': {u'text/plain': u'res0: Int = 2'},
                  u'execution_count': 0,
                  u'status': u'ok'},
      u'state': u'available'}]
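
Rather than issuing a single GET, a client will typically poll the statement
until it leaves the ``running`` state. Here is a minimal sketch of that loop,
reusing the ``headers`` and ``statement_url`` from above (``wait_for_statement``
is a hypothetical helper, not part of the Livy API):

.. code:: python

    >>> import time
    >>> def wait_for_statement(url, interval=1.0):
    ...     """Poll a statement URL until Livy reports it is done running."""
    ...     while True:
    ...         result = requests.get(url, headers=headers).json()
    ...         # As shown above, the URL may return a list of statements;
    ...         # in that case, inspect the most recent one.
    ...         statement = result[-1] if isinstance(result, list) else result
    ...         if statement['state'] != 'running':
    ...             return statement
    ...         time.sleep(interval)
    ...
    >>> wait_for_statement(statement_url)['state']
    u'available'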

That was a pretty simple example. More interesting is using Spark to estimate
Pi. This is from the `Spark Examples`_:

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     val NUM_SAMPLES = 100000;
    ...     val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
    ...       val x = Math.random();
    ...       val y = Math.random();
    ...       if (x*x + y*y < 1) 1 else 0
    ...     }.reduce(_ + _);
    ...     println(\"Pi is roughly \" + 4.0 * count / NUM_SAMPLES)
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 1,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.14004\nNUM_SAMPLES: Int = 100000\ncount: Int = 78501'},
                 u'execution_count': 1,
                 u'status': u'ok'},
     u'state': u'available'}

Finally, let's close our session:

.. code:: python

    >>> session_url = 'http://localhost:8998/sessions/0'
    >>> requests.delete(session_url, headers=headers)
    <Response [204]>

.. _Requests: http://docs.python-requests.org/en/latest/
.. _Spark Examples: https://spark.apache.org/examples.html

PySpark Example
===============

PySpark has the exact same API, just with a different initial command. Note
that the new session gets a new id, so the statement URL has to be rebuilt
from the ``Location`` header:

.. code:: python

    >>> data = {'kind': 'pyspark'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'id': 1, u'state': u'idle'}
    >>> statements_url = host + r.headers['location'] + '/statements'

The Pi example from before can then be run as:

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     import random
    ...     NUM_SAMPLES = 100000
    ...     def sample(p):
    ...       x, y = random.random(), random.random()
    ...       return 1 if x*x + y*y < 1 else 0
    ...
    ...     count = sc.parallelize(xrange(0, NUM_SAMPLES)).map(sample) \
    ...               .reduce(lambda a, b: a + b)
    ...     print "Pi is roughly %f" % (4.0 * count / NUM_SAMPLES)
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 12,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                 u'execution_count': 12,
                 u'status': u'ok'},
     u'state': u'running'}

SparkR Example
==============

SparkR also has the same API. Again, rebuild the statement URL for the new
session:

.. code:: python

    >>> data = {'kind': 'sparkR'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'id': 1, u'state': u'idle'}
    >>> statements_url = host + r.headers['location'] + '/statements'

The Pi example from before can then be run as (``slices`` here is the number of
partitions):

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     n <- 100000
    ...     slices <- 2
    ...     piFunc <- function(elem) {
    ...       rands <- runif(n = 2, min = -1, max = 1)
    ...       val <- ifelse((rands[1]^2 + rands[2]^2) < 1, 1.0, 0.0)
    ...       val
    ...     }
    ...     piFuncVec <- function(elems) {
    ...       message(length(elems))
    ...       rands1 <- runif(n = length(elems), min = -1, max = 1)
    ...       rands2 <- runif(n = length(elems), min = -1, max = 1)
    ...       val <- ifelse((rands1^2 + rands2^2) < 1, 1.0, 0.0)
    ...       sum(val)
    ...     }
    ...     rdd <- parallelize(sc, 1:n, slices)
    ...     count <- reduce(lapplyPartition(rdd, piFuncVec), sum)
    ...     cat("Pi is roughly", 4.0 * count / n, "\n")
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 12,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                 u'execution_count': 12,
                 u'status': u'ok'},
     u'state': u'running'}

Community
=========

* User group: http://groups.google.com/a/cloudera.org/group/hue-user
* Umbrella Jira: https://issues.cloudera.org/browse/HUE-2588
* Pull requests: https://github.com/cloudera/hue/pulls

REST API
========

GET /sessions
-------------

Returns all the active interactive sessions.

Response Body
^^^^^^^^^^^^^

+----------+-----------------+------+
| name     | description     | type |
+==========+=================+======+
| sessions | `session`_ list | list |
+----------+-----------------+------+
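
For example, with the same ``host`` and ``headers`` as in the Spark example
above, the active sessions can be listed as follows (the session shown is
illustrative):

.. code:: python

    >>> r = requests.get(host + '/sessions', headers=headers)
    >>> r.json()
    {u'sessions': [{u'state': u'idle', u'id': 0, u'kind': u'spark'}]}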

POST /sessions
--------------

Creates a new interactive Scala, Python, or R shell in the cluster.

Request Body
^^^^^^^^^^^^

+----------------+--------------------------------------------------------------------------------+------------------+
| name           | description                                                                    | type             |
+================+================================================================================+==================+
| kind           | The session kind (required)                                                    | `session kind`_  |
+----------------+--------------------------------------------------------------------------------+------------------+
| proxyUser      | The user to impersonate that will run this session (e.g. bob)                  | string           |
+----------------+--------------------------------------------------------------------------------+------------------+
| jars           | Files to be placed on the java classpath                                       | list of paths    |
+----------------+--------------------------------------------------------------------------------+------------------+
| pyFiles        | Files to be placed on the PYTHONPATH                                           | list of paths    |
+----------------+--------------------------------------------------------------------------------+------------------+
| files          | Files to be placed in executor working directory                               | list of paths    |
+----------------+--------------------------------------------------------------------------------+------------------+
| driverMemory   | Memory for driver (e.g. 1000M, 2G)                                             | string           |
+----------------+--------------------------------------------------------------------------------+------------------+
| driverCores    | Number of cores used by driver (YARN mode only)                                | int              |
+----------------+--------------------------------------------------------------------------------+------------------+
| executorMemory | Memory for executor (e.g. 1000M, 2G)                                           | string           |
+----------------+--------------------------------------------------------------------------------+------------------+
| executorCores  | Number of cores used by executor                                               | int              |
+----------------+--------------------------------------------------------------------------------+------------------+
| numExecutors   | Number of executors (YARN mode only)                                           | int              |
+----------------+--------------------------------------------------------------------------------+------------------+
| archives       | Archives to be uncompressed in the executor working directory (YARN mode only) | list of paths    |
+----------------+--------------------------------------------------------------------------------+------------------+
| queue          | The YARN queue to submit to (YARN mode only)                                   | string           |
+----------------+--------------------------------------------------------------------------------+------------------+
| name           | Name of the application                                                        | string           |
+----------------+--------------------------------------------------------------------------------+------------------+
| conf           | Spark configuration properties                                                 | list of key=val  |
+----------------+--------------------------------------------------------------------------------+------------------+

Response Body
^^^^^^^^^^^^^

The created `Session`_.

GET /sessions/{sessionId}
-------------------------

Returns the session information.

Response
^^^^^^^^

The `Session`_.

DELETE /sessions/{sessionId}
----------------------------

Kills the `Session`_ job.

GET /sessions/{sessionId}/logs
------------------------------

Gets the log lines from this session.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+

Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The session id        | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total number of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+
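
As a sketch, the first ten log lines of a live session could be fetched with a
session URL like the ``session_url`` from the Spark example above:

.. code:: python

    >>> r = requests.get(session_url + '/logs', params={'from': 0, 'size': 10}, headers=headers)
    >>> r.json()['log']
    [...]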

GET /sessions/{sessionId}/statements
------------------------------------

Returns all the statements in a session.

Response Body
^^^^^^^^^^^^^

+------------+-------------------+------+
| name       | description       | type |
+============+===================+======+
| statements | `statement`_ list | list |
+------------+-------------------+------+

POST /sessions/{sessionId}/statements
-------------------------------------

Executes a statement in a session.

Request Body
^^^^^^^^^^^^

+------+---------------------+--------+
| name | description         | type   |
+======+=====================+========+
| code | The code to execute | string |
+------+---------------------+--------+

Response Body
^^^^^^^^^^^^^

The `statement`_ object.

GET /batches
------------

Returns all the active batch jobs.

Response Body
^^^^^^^^^^^^^

+---------+---------------+------+
| name    | description   | type |
+=========+===============+======+
| batches | `batch`_ list | list |
+---------+---------------+------+

POST /batches
-------------

Submits a new batch job.

Request Body
^^^^^^^^^^^^

+----------------+----------------------------------------------------+-----------------+
| name           | description                                        | type            |
+================+====================================================+=================+
| proxyUser      | The user to impersonate that will execute the job  | string          |
+----------------+----------------------------------------------------+-----------------+
| file           | File containing the application to execute         | path (required) |
+----------------+----------------------------------------------------+-----------------+
| args           | Command line arguments                             | list of strings |
+----------------+----------------------------------------------------+-----------------+
| className      | Application's Java/Spark main class                | string          |
+----------------+----------------------------------------------------+-----------------+
| jars           | Files to be placed on the java classpath           | list of paths   |
+----------------+----------------------------------------------------+-----------------+
| pyFiles        | Files to be placed on the PYTHONPATH               | list of paths   |
+----------------+----------------------------------------------------+-----------------+
| files          | Files to be placed in executor working directory   | list of paths   |
+----------------+----------------------------------------------------+-----------------+
| driverMemory   | Memory for driver (e.g. 1000M, 2G)                 | string          |
+----------------+----------------------------------------------------+-----------------+
| driverCores    | Number of cores used by driver                     | int             |
+----------------+----------------------------------------------------+-----------------+
| executorMemory | Memory for executor (e.g. 1000M, 2G)               | string          |
+----------------+----------------------------------------------------+-----------------+
| executorCores  | Number of cores used by executor                   | int             |
+----------------+----------------------------------------------------+-----------------+
| numExecutors   | Number of executors                                | int             |
+----------------+----------------------------------------------------+-----------------+
| archives       | Archives to be uncompressed (YARN mode only)       | list of paths   |
+----------------+----------------------------------------------------+-----------------+
| queue          | The YARN queue to submit to (YARN mode only)       | string          |
+----------------+----------------------------------------------------+-----------------+
| name           | Name of the application                            | string          |
+----------------+----------------------------------------------------+-----------------+
| conf           | Spark configuration properties                     | list of key=val |
+----------------+----------------------------------------------------+-----------------+

Response Body
^^^^^^^^^^^^^

The created `Batch`_ object.
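
A minimal submission sketch in the same Python style as the examples above;
the jar path and class name are placeholders, not files shipped with Livy, and
the response shown is illustrative:

.. code:: python

    >>> data = {'file': '/user/bob/spark-examples.jar',
    ...         'className': 'org.apache.spark.examples.SparkPi',
    ...         'args': ['100']}
    >>> r = requests.post(host + '/batches', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'state': u'running', u'id': 0, u'log': []}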

GET /batches/{batchId}
----------------------

Returns the batch job information.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+

Response Body
^^^^^^^^^^^^^

+-------+-----------------------------+-----------------+
| name  | description                 | type            |
+=======+=============================+=================+
| id    | The batch id                | int             |
+-------+-----------------------------+-----------------+
| state | The state of the batch      | `batch`_ state  |
+-------+-----------------------------+-----------------+
| log   | The output of the batch job | list of strings |
+-------+-----------------------------+-----------------+

DELETE /batches/{batchId}
-------------------------

Kills the `Batch`_ job.

GET /batches/{batchId}/logs
---------------------------

Gets the log lines from this batch.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+

Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The batch id          | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total number of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+

REST Objects
============

Session
-------

A session represents an interactive shell.

+----------------+--------------------------------------------------+----------------------------+
| name           | description                                      | type                       |
+================+==================================================+============================+
| id             | The session id                                   | int                        |
+----------------+--------------------------------------------------+----------------------------+
| kind           | session kind (spark, pyspark, or sparkr)         | `session kind`_ (required) |
+----------------+--------------------------------------------------+----------------------------+
| log            | The log lines                                    | list of strings            |
+----------------+--------------------------------------------------+----------------------------+
| state          | The session state                                | string                     |
+----------------+--------------------------------------------------+----------------------------+

Session State
^^^^^^^^^^^^^

+-------------+----------------------------------+
| name        | description                      |
+=============+==================================+
| not_started | session has not been started     |
+-------------+----------------------------------+
| starting    | session is starting              |
+-------------+----------------------------------+
| idle        | session is waiting for input     |
+-------------+----------------------------------+
| busy        | session is executing a statement |
+-------------+----------------------------------+
| error       | session errored out              |
+-------------+----------------------------------+
| dead        | session has exited               |
+-------------+----------------------------------+
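
Since a new session passes through ``starting`` before it can accept
statements, clients usually poll until it settles. A small sketch, reusing
``requests`` and ``headers`` from the examples above (``wait_until_idle`` is a
hypothetical helper, not part of the Livy API):

.. code:: python

    >>> import time
    >>> def wait_until_idle(session_url, interval=1.0):
    ...     """Poll a session until it reaches a stable state."""
    ...     while True:
    ...         state = requests.get(session_url, headers=headers).json()['state']
    ...         if state in ('idle', 'error', 'dead'):
    ...             return state
    ...         time.sleep(interval)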

Session Kind
^^^^^^^^^^^^

+---------+----------------------------------+
| name    | description                      |
+=========+==================================+
| spark   | interactive scala/spark session  |
+---------+----------------------------------+
| pyspark | interactive python/spark session |
+---------+----------------------------------+
| sparkr  | interactive R/spark session      |
+---------+----------------------------------+

Statement
---------

A statement represents the result of a snippet of code submitted for execution.

+--------+----------------------+---------------------+
| name   | description          | type                |
+========+======================+=====================+
| id     | The statement id     | integer             |
+--------+----------------------+---------------------+
| state  | The execution state  | `statement state`_  |
+--------+----------------------+---------------------+
| output | The execution output | `statement output`_ |
+--------+----------------------+---------------------+

Statement State
^^^^^^^^^^^^^^^

+-----------+----------------------------------+
| name      | description                      |
+===========+==================================+
| running   | Statement is currently executing |
+-----------+----------------------------------+
| available | Statement has a ready response   |
+-----------+----------------------------------+
| error     | Statement failed                 |
+-----------+----------------------------------+

Statement Output
^^^^^^^^^^^^^^^^

+-----------------+-------------------+----------------------------------+
| name            | description       | type                             |
+=================+===================+==================================+
| status          | execution status  | string                           |
+-----------------+-------------------+----------------------------------+
| execution_count | a monotonically   | integer                          |
|                 | increasing number |                                  |
+-----------------+-------------------+----------------------------------+
| data            | statement output  | an object mapping a mime type to |
|                 |                   | the result. If the mime type is  |
|                 |                   | ``application/json``, the value  |
|                 |                   | will be a JSON value             |
+-----------------+-------------------+----------------------------------+
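
In client code this usually translates into a check on ``status`` followed by
a mime-type lookup. A sketch, assuming ``statement`` is a statement object
fetched as in the examples above:

.. code:: python

    >>> output = statement['output']
    >>> if output['status'] == 'ok':
    ...     print output['data']['text/plain']
    ... else:
    ...     print 'statement failed'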

Batch
-----

A batch represents a submitted batch job.

+----------------+--------------------------------------------------+----------------------------+
| name           | description                                      | type                       |
+================+==================================================+============================+
| id             | The batch id                                     | int                        |
+----------------+--------------------------------------------------+----------------------------+
| kind           | session kind (spark, pyspark, or sparkr)         | `session kind`_ (required) |
+----------------+--------------------------------------------------+----------------------------+
| log            | The log lines                                    | list of strings            |
+----------------+--------------------------------------------------+----------------------------+
| state          | The batch state                                  | string                     |
+----------------+--------------------------------------------------+----------------------------+

License
=======

Apache License, Version 2.0

http://www.apache.org/licenses/LICENSE-2.0