Welcome to Livy, the REST Spark Server
========================================

Livy is an open source REST interface for interacting with Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in YARN.

* Interactive Scala, Python and R shells
* Batch submissions in Scala, Java, Python
* Multiple users can share the same server (impersonation support)
* Can be used for submitting jobs from anywhere with REST
* Does not require any code change to your programs

The code is currently incubating in Hue but will hopefully eventually graduate into its own top-level project. `Pull requests`_ are welcome!

.. _Pull requests: https://github.com/cloudera/hue/pulls


Quick Start
===========

Livy powers the Spark snippets of the `Hadoop Notebook`_ in `Hue 3.8`_; you can see the
`implementation here`_.

See the API documentation below and some curl examples:

* `Interactive shells`_
* `Batch jobs`_
* `Shared RDDs`_

.. _Interactive shells: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-for-interactive-spark/
.. _Batch jobs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-submitting-batch-jar-python-and-streaming-spark-jobs/
.. _Shared RDDs: http://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-sharing-spark-rdds-and-contexts/
.. _Hadoop Notebook: http://gethue.com/new-notebook-application-for-spark-sql/
.. _Hue 3.8: http://gethue.com/hue-3-8-with-an-oozie-editor-revamp-better-performances-improved-spark-ui-is-out/
.. _implementation here: https://github.com/cloudera/hue/blob/master/apps/spark/src/spark/job_server_api.py


Prerequisites
=============

To build and run Livy, you will need:

Debian/Ubuntu (see the example install command after these lists):

* mvn (from the ``maven`` package or a Maven 3 tarball)
* openjdk-7-jdk (or Oracle Java 7 JDK)
* Spark 1.5 (from the `Apache Spark tarball`_)
* Python 2.6+
* R 3.x

Redhat/CentOS:

* mvn (from the ``maven`` package or a Maven 3 tarball)
* java-1.7.0-openjdk (or Oracle Java 7 JDK)
* Spark 1.5 (from the `Apache Spark tarball`_)
* Python 2.6+
* R 3.x

MacOS:

* Xcode command line tools
* Oracle's JDK 1.7+
* Maven (via Homebrew)
* apache-spark 1.5 (via Homebrew)
* Python 2.6+
* R 3.x

.. _Apache Spark tarball: https://spark.apache.org/downloads.html

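
For example, on Debian/Ubuntu most of the build prerequisites can usually be installed with ``apt-get``; the ``python`` and ``r-base`` package names here are assumptions, and Spark itself still comes from the tarball above:

.. code:: shell

    % sudo apt-get install maven openjdk-7-jdk python r-base
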

Building Livy
=============

Livy is currently built by the `Hue Build System`_; it can also be built on
its own (i.e. without any other Hue dependency) with `Apache Maven`_. To build,
check out the code, go to the Livy directory and run:

.. code:: shell

    % git clone git@github.com:cloudera/hue.git
    % cd hue

.. code:: shell

    % cd apps/spark/java
    % mvn -DskipTests clean package

.. _Hue Build System: https://github.com/cloudera/hue/#getting-started
.. _Apache Maven: http://maven.apache.org


Running Tests
=============

In order to run the Livy tests, first follow the instructions in `Building
Livy`_. Then run:

.. code:: shell

    % export SPARK_HOME=/usr/lib/spark
    % mvn test


Running Livy
============

In order to run Livy with local sessions, start the server with:

.. code:: shell

    % export SPARK_HOME=/usr/lib/spark
    % ./bin/livy-server

Or with YARN sessions by running:

.. code:: shell

    % export SPARK_HOME=/usr/lib/spark
    % env \
        LIVY_SERVER_JAVA_OPTS="-Dlivy.server.session.factory=yarn" \
        CLASSPATH=`hadoop classpath` \
        ./bin/livy-server

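
Once the server is up, a quick sanity check is to list the interactive sessions over REST; this assumes the default port 8998 described in the `Spark Example`_ below, and a fresh server simply returns an empty session list:

.. code:: shell

    % curl localhost:8998/sessions
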

Livy Configuration
==================

The server properties can be modified by copying the template configuration file
`livy-defaults.conf.tmpl <https://github.com/cloudera/hue/blob/master/apps/spark/java/conf/livy-defaults.conf.tmpl>`_
and renaming the copy ``livy-defaults.conf``.

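
For example, from the Livy directory (``apps/spark/java``, as used in `Building Livy`_):

.. code:: shell

    % cp conf/livy-defaults.conf.tmpl conf/livy-defaults.conf
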

In particular, YARN mode (the default is a ``local`` process, intended for development) can be enabled with:

.. code:: shell

    livy.server.session.factory = yarn


Spark Example
=============

Now let's see it in action by interacting with it in Python with the `Requests`_
library. By default Livy runs on port 8998 (which can be changed with the
``livy_server_port`` config option). We'll start off with a Spark session that
takes Scala code:

.. code:: shell

    % sudo pip install requests

.. code:: python

    >>> import json, pprint, requests, textwrap
    >>> host = 'http://localhost:8998'
    >>> data = {'kind': 'spark'}
    >>> headers = {'Content-Type': 'application/json'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'state': u'starting', u'id': 0, u'kind': u'spark'}


Once the session has completed starting up, it transitions to the idle state:

.. code:: python

    >>> session_url = host + r.headers['location']
    >>> r = requests.get(session_url, headers=headers)
    >>> r.json()
    {u'state': u'idle', u'id': 0, u'kind': u'spark'}


Now we can execute Scala by passing in a simple JSON command:

.. code:: python

    >>> statements_url = session_url + '/statements'
    >>> data = {'code': '1 + 1'}
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'output': None, u'state': u'running', u'id': 0}


If a statement takes longer than a few milliseconds to execute, Livy returns
early and provides a URL that can be polled until it is complete:

.. code:: python

    >>> statement_url = host + r.headers['location']
    >>> r = requests.get(statement_url, headers=headers)
    >>> pprint.pprint(r.json())
    [{u'id': 0,
      u'output': {u'data': {u'text/plain': u'res0: Int = 2'},
                  u'execution_count': 0,
                  u'status': u'ok'},
      u'state': u'available'}]

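
The same polling can also be done from the shell with curl against the ``GET /sessions/{sessionId}/statements`` endpoint documented below (session ``0`` in this example); repeat the request until the statement's ``state`` is ``available``:

.. code:: shell

    % curl localhost:8998/sessions/0/statements
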

That was a pretty simple example. More interesting is using Spark to estimate
Pi. This is from the `Spark Examples`_:

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     val NUM_SAMPLES = 100000;
    ...     val count = sc.parallelize(1 to NUM_SAMPLES).map { i =>
    ...       val x = Math.random();
    ...       val y = Math.random();
    ...       if (x*x + y*y < 1) 1 else 0
    ...     }.reduce(_ + _);
    ...     println(\"Pi is roughly \" + 4.0 * count / NUM_SAMPLES)
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 1,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.14004\nNUM_SAMPLES: Int = 100000\ncount: Int = 78501'},
                 u'execution_count': 1,
                 u'status': u'ok'},
     u'state': u'available'}

Finally, let's close our session:

.. code:: python

    >>> session_url = 'http://localhost:8998/sessions/0'
    >>> requests.delete(session_url, headers=headers)
    <Response [204]>

.. _Requests: http://docs.python-requests.org/en/latest/
.. _Spark Examples: https://spark.apache.org/examples.html


PySpark Example
===============

PySpark has the exact same API, just with a different initial command (note that we recompute the statement URL here, since the first session was deleted above):

.. code:: python

    >>> data = {'kind': 'pyspark'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'id': 1, u'state': u'idle'}
    >>> session_url = host + r.headers['location']
    >>> statements_url = session_url + '/statements'

The Pi example from before can then be run as:

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     import random
    ...     NUM_SAMPLES = 100000
    ...     def sample(p):
    ...       x, y = random.random(), random.random()
    ...       return 1 if x*x + y*y < 1 else 0
    ...
    ...     count = sc.parallelize(xrange(0, NUM_SAMPLES)).map(sample) \
    ...                .reduce(lambda a, b: a + b)
    ...     print "Pi is roughly %f" % (4.0 * count / NUM_SAMPLES)
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 12,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                 u'execution_count': 12,
                 u'status': u'ok'},
     u'state': u'running'}


SparkR Example
==============

SparkR also has the same API:

.. code:: python

    >>> data = {'kind': 'sparkR'}
    >>> r = requests.post(host + '/sessions', data=json.dumps(data), headers=headers)
    >>> r.json()
    {u'id': 1, u'state': u'idle'}
    >>> session_url = host + r.headers['location']
    >>> statements_url = session_url + '/statements'

The Pi example from before can then be run as:

.. code:: python

    >>> data = {
    ...   'code': textwrap.dedent("""\
    ...     n <- 100000
    ...     piFunc <- function(elem) {
    ...       rands <- runif(n = 2, min = -1, max = 1)
    ...       val <- ifelse((rands[1]^2 + rands[2]^2) < 1, 1.0, 0.0)
    ...       val
    ...     }
    ...     piFuncVec <- function(elems) {
    ...       message(length(elems))
    ...       rands1 <- runif(n = length(elems), min = -1, max = 1)
    ...       rands2 <- runif(n = length(elems), min = -1, max = 1)
    ...       val <- ifelse((rands1^2 + rands2^2) < 1, 1.0, 0.0)
    ...       sum(val)
    ...     }
    ...     rdd <- parallelize(sc, 1:n, slices)
    ...     count <- reduce(lapplyPartition(rdd, piFuncVec), sum)
    ...     cat("Pi is roughly", 4.0 * count / n, "\n")
    ...     """)
    ... }
    >>> r = requests.post(statements_url, data=json.dumps(data), headers=headers)
    >>> pprint.pprint(r.json())
    {u'id': 12,
     u'output': {u'data': {u'text/plain': u'Pi is roughly 3.136000'},
                 u'execution_count': 12,
                 u'status': u'ok'},
     u'state': u'running'}


Community
=========

* User group: http://groups.google.com/a/cloudera.org/group/hue-user
* Umbrella Jira: https://issues.cloudera.org/browse/HUE-2588
* Pull requests: https://github.com/cloudera/hue/pulls


REST API
========

GET /sessions
-------------

Returns all the active interactive sessions.

Response Body
^^^^^^^^^^^^^

+----------+-----------------+------+
| name     | description     | type |
+==========+=================+======+
| sessions | `session`_ list | list |
+----------+-----------------+------+


POST /sessions
--------------

Creates a new interactive Scala, Python or R shell in the cluster.

Request Body
^^^^^^^^^^^^

+----------------+--------------------------------------------------------------------------------+-----------------+
| name           | description                                                                    | type            |
+================+================================================================================+=================+
| kind           | The session kind (required)                                                    | `session kind`_ |
+----------------+--------------------------------------------------------------------------------+-----------------+
| proxyUser      | The user to impersonate that will run this session (e.g. bob)                 | string          |
+----------------+--------------------------------------------------------------------------------+-----------------+
| jars           | files to be placed on the java classpath                                       | list of paths   |
+----------------+--------------------------------------------------------------------------------+-----------------+
| pyFiles        | files to be placed on the PYTHONPATH                                           | list of paths   |
+----------------+--------------------------------------------------------------------------------+-----------------+
| files          | files to be placed in executor working directory                               | list of paths   |
+----------------+--------------------------------------------------------------------------------+-----------------+
| driverMemory   | memory for driver (e.g. 1000M, 2G)                                             | string          |
+----------------+--------------------------------------------------------------------------------+-----------------+
| driverCores    | number of cores used by driver (YARN mode only)                                | int             |
+----------------+--------------------------------------------------------------------------------+-----------------+
| executorMemory | memory for executor (e.g. 1000M, 2G)                                           | string          |
+----------------+--------------------------------------------------------------------------------+-----------------+
| executorCores  | number of cores used by executor                                               | int             |
+----------------+--------------------------------------------------------------------------------+-----------------+
| numExecutors   | number of executors (YARN mode only)                                           | int             |
+----------------+--------------------------------------------------------------------------------+-----------------+
| archives       | Archives to be uncompressed in the executor working directory (YARN mode only) | list of paths   |
+----------------+--------------------------------------------------------------------------------+-----------------+
| queue          | The YARN queue to submit to (YARN mode only)                                   | string          |
+----------------+--------------------------------------------------------------------------------+-----------------+

Response Body
^^^^^^^^^^^^^

The created `Session`_.

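
For example, the Scala session from the `Spark Example`_ can also be created with curl, using the same ``kind`` and ``Content-Type`` header as above:

.. code:: shell

    % curl -X POST -H "Content-Type: application/json" \
        -d '{"kind": "spark"}' \
        localhost:8998/sessions
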

GET /sessions/{sessionId}
-------------------------

Returns the session information.

Response
^^^^^^^^

The `Session`_.

DELETE /sessions/{sessionId}
----------------------------

Kills the `Session`_ job.


GET /sessions/{sessionId}/logs
------------------------------

Gets the log lines from this session.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+


Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The session id        | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total number of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+


GET /sessions/{sessionId}/statements
------------------------------------

Returns all the statements in a session.

Response Body
^^^^^^^^^^^^^

+------------+-------------------+------+
| name       | description       | type |
+============+===================+======+
| statements | `statement`_ list | list |
+------------+-------------------+------+

POST /sessions/{sessionId}/statements
-------------------------------------

Executes a statement in a session.

Request Body
^^^^^^^^^^^^

+------+---------------------+--------+
| name | description         | type   |
+======+=====================+========+
| code | The code to execute | string |
+------+---------------------+--------+

Response Body
^^^^^^^^^^^^^

The `statement`_ object.

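
For example, the ``1 + 1`` statement from the `Spark Example`_ can be submitted with curl as:

.. code:: shell

    % curl -X POST -H "Content-Type: application/json" \
        -d '{"code": "1 + 1"}' \
        localhost:8998/sessions/0/statements
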

GET /batches
------------

Returns all the active batch jobs.

Response Body
^^^^^^^^^^^^^

+---------+---------------+------+
| name    | description   | type |
+=========+===============+======+
| batches | `batch`_ list | list |
+---------+---------------+------+


POST /batches
-------------

Submits a new batch job.

Request Body
^^^^^^^^^^^^

+----------------+---------------------------------------------------+-----------------+
| name           | description                                       | type            |
+================+===================================================+=================+
| proxyUser      | The user to impersonate that will execute the job | string          |
+----------------+---------------------------------------------------+-----------------+
| file           | file containing the application to run            | path (required) |
+----------------+---------------------------------------------------+-----------------+
| args           | command line arguments                            | list of strings |
+----------------+---------------------------------------------------+-----------------+
| className      | application's java/spark main class               | string          |
+----------------+---------------------------------------------------+-----------------+
| jars           | files to be placed on the java classpath          | list of paths   |
+----------------+---------------------------------------------------+-----------------+
| pyFiles        | files to be placed on the PYTHONPATH              | list of paths   |
+----------------+---------------------------------------------------+-----------------+
| files          | files to be placed in executor working directory  | list of paths   |
+----------------+---------------------------------------------------+-----------------+
| driverMemory   | memory for driver (e.g. 1000M, 2G)                | string          |
+----------------+---------------------------------------------------+-----------------+
| driverCores    | number of cores used by driver                    | int             |
+----------------+---------------------------------------------------+-----------------+
| executorMemory | memory for executor (e.g. 1000M, 2G)              | string          |
+----------------+---------------------------------------------------+-----------------+
| executorCores  | number of cores used by executor                  | int             |
+----------------+---------------------------------------------------+-----------------+
| numExecutors   | number of executors                               | int             |
+----------------+---------------------------------------------------+-----------------+
| archives       | Archives to be uncompressed (YARN mode only)      | list of paths   |
+----------------+---------------------------------------------------+-----------------+
| queue          | The YARN queue to submit to (YARN mode only)      | string          |
+----------------+---------------------------------------------------+-----------------+

Response Body
^^^^^^^^^^^^^

The created `Batch`_ object.

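
A minimal sketch of submitting a batch job with curl; the jar path, class name and arguments are hypothetical placeholders, not files shipped with Livy:

.. code:: shell

    % curl -X POST -H "Content-Type: application/json" \
        -d '{"file": "/user/hue/my-app.jar", "className": "com.example.MyApp", "args": ["--input", "/tmp/data"]}' \
        localhost:8998/batches
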

GET /batches/{batchId}
----------------------

Returns the batch job information.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+

Response Body
^^^^^^^^^^^^^

+-------+-----------------------------+-----------------+
| name  | description                 | type            |
+=======+=============================+=================+
| id    | The batch id                | int             |
+-------+-----------------------------+-----------------+
| state | The state of the batch      | `batch`_ state  |
+-------+-----------------------------+-----------------+
| log   | The output of the batch job | list of strings |
+-------+-----------------------------+-----------------+


DELETE /batches/{batchId}
-------------------------

Kills the `Batch`_ job.

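
For example, batch ``0`` can be killed with:

.. code:: shell

    % curl -X DELETE localhost:8998/batches/0
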

GET /batches/{batchId}/logs
---------------------------

Gets the log lines from this batch.

Request Parameters
^^^^^^^^^^^^^^^^^^

+------+-------------------------------+------+
| name | description                   | type |
+======+===============================+======+
| from | offset                        | int  |
+------+-------------------------------+------+
| size | number of log lines to return | int  |
+------+-------------------------------+------+

Response Body
^^^^^^^^^^^^^

+------+-----------------------+-----------------+
| name | description           | type            |
+======+=======================+=================+
| id   | The batch id          | int             |
+------+-----------------------+-----------------+
| from | offset                | int             |
+------+-----------------------+-----------------+
| size | total number of lines | int             |
+------+-----------------------+-----------------+
| log  | The log lines         | list of strings |
+------+-----------------------+-----------------+


REST Objects
============

Session
-------

A session represents an interactive shell.

+----------------+------------------------------------------+----------------------------+
| name           | description                              | type                       |
+================+==========================================+============================+
| id             | The session id                           | int                        |
+----------------+------------------------------------------+----------------------------+
| kind           | session kind (spark, pyspark, or sparkr) | `session kind`_ (required) |
+----------------+------------------------------------------+----------------------------+
| log            | The log lines                            | list of strings            |
+----------------+------------------------------------------+----------------------------+
| state          | The session state                        | string                     |
+----------------+------------------------------------------+----------------------------+


Session State
^^^^^^^^^^^^^

+-------------+----------------------------------+
| name        | description                      |
+=============+==================================+
| not_started | session has not been started     |
+-------------+----------------------------------+
| starting    | session is starting              |
+-------------+----------------------------------+
| idle        | session is waiting for input     |
+-------------+----------------------------------+
| busy        | session is executing a statement |
+-------------+----------------------------------+
| error       | session errored out              |
+-------------+----------------------------------+
| dead        | session has exited               |
+-------------+----------------------------------+


Session Kind
^^^^^^^^^^^^

+---------+----------------------------------+
| name    | description                      |
+=========+==================================+
| spark   | interactive scala/spark session  |
+---------+----------------------------------+
| pyspark | interactive python/spark session |
+---------+----------------------------------+
| sparkr  | interactive R/spark session      |
+---------+----------------------------------+


Statement
---------

A statement represents the result of an execution.

+--------+----------------------+---------------------+
| name   | description          | type                |
+========+======================+=====================+
| id     | The statement id     | integer             |
+--------+----------------------+---------------------+
| state  | The execution state  | `statement state`_  |
+--------+----------------------+---------------------+
| output | The execution output | `statement output`_ |
+--------+----------------------+---------------------+


Statement State
^^^^^^^^^^^^^^^

+-----------+----------------------------------+
| name      | description                      |
+===========+==================================+
| running   | Statement is currently executing |
+-----------+----------------------------------+
| available | Statement has a ready response   |
+-----------+----------------------------------+
| error     | Statement failed                 |
+-----------+----------------------------------+


Statement Output
^^^^^^^^^^^^^^^^

+-----------------+-------------------+----------------------------------+
| name            | description       | type                             |
+=================+===================+==================================+
| status          | execution status  | string                           |
+-----------------+-------------------+----------------------------------+
| execution_count | a monotonically   | integer                          |
|                 | increasing number |                                  |
+-----------------+-------------------+----------------------------------+
| data            | statement output  | an object mapping a mime type to |
|                 |                   | the result. If the mime type is  |
|                 |                   | ``application/json``, the value  |
|                 |                   | will be a JSON value             |
+-----------------+-------------------+----------------------------------+


Batch
-----

+----------------+------------------------------------------+----------------------------+
| name           | description                              | type                       |
+================+==========================================+============================+
| id             | The batch id                             | int                        |
+----------------+------------------------------------------+----------------------------+
| kind           | session kind (spark, pyspark, or sparkr) | `session kind`_ (required) |
+----------------+------------------------------------------+----------------------------+
| log            | The log lines                            | list of strings            |
+----------------+------------------------------------------+----------------------------+
| state          | The batch state                          | string                     |
+----------------+------------------------------------------+----------------------------+


License
=======

Apache License, Version 2.0

http://www.apache.org/licenses/LICENSE-2.0