
HUE-8815 [docs] Simplify some of the README, user and admin content

Romain 6 years ago
parent
commit
b03adf2d37

+ 1 - 2
README.md

@@ -61,5 +61,4 @@ Community
 
 License
 -----------
-Apache License, Version 2.0
-http://www.apache.org/licenses/LICENSE-2.0
+[Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0)

+ 18 - 15
docs/docs-site/content/administrator/administration/operations.md

@@ -5,7 +5,7 @@ draft: false
 weight: 2
 ---
 
-## Quick Start Wizard
+## Admin Wizard
 
 The Quick Start wizard allows you to perform the following Hue setup
 operations by clicking the tab of each step or sequentially by clicking
@@ -22,7 +22,7 @@ Next in each screen:
     import users and a checkbox to enable and disable collection of
     usage information.
 
-## Configuration
+### Configuration
 
 Displays a list of the installed Hue applications and their
 configuration. The location of the folder containing the Hue
@@ -63,14 +63,8 @@ located in the `/etc/hue` directory.  Files that are alphabetically later
 take precedence.
 </div>
 
-# Administration
 
-Now that you've installed and started Hue, you can feel free to skip ahead
-to the <<usage,Using Hue>> section. Administrators may want to refer to this
-section for more details about managing and operating a Hue installation.
-
-
-## Configuration Validation
+### Configuration Validation
 
 Hue can detect certain invalid configuration.
 
@@ -79,19 +73,19 @@ To view the configuration of a running Hue instance, navigate to
 application.
 
 
-## Server Logs
+### Server Logs
 
 Displays the Hue Server log and allows you to download the log to your
 local system in a zip file.
 
-## Threads
+### Threads
 
 Read more on the [Threads and Metrics pages
  blog post](http://gethue.com/easier-administration-of-hue-with-the-new-threads-and-metrics-pages/)
 
 The Threads page can be very helpful for debugging purposes. It includes a daemonic thread and the thread objects serving concurrent requests. The host name, thread name identifier and current stack frame of each are displayed. These are useful when Hue “hangs”, sometimes because a request is too CPU intensive. There is also a REST API to get the dump of threads via 'desktop/debug/threads'.
 
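 A hedged illustration of calling that endpoint with Python, assuming the default address `http://myserver:8888` and an already-authenticated session cookie (named `sessionid` by default in Django); both are deployment-specific placeholders:
 
     import requests
 
     # Fetch the current thread dump from a running Hue instance.
     # The cookie value is a placeholder for a real authenticated admin session.
     response = requests.get(
         "http://myserver:8888/desktop/debug/threads",
         cookies={"sessionid": "<authenticated-session-cookie>"},
     )
     print(response.text)  # host names, thread names and current stack frames
 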
-## Metrics
+### Metrics
 
 Read more on the [Threads and Metrics pages
  blog post](http://gethue.com/easier-administration-of-hue-with-the-new-threads-and-metrics-pages/)
@@ -111,7 +105,7 @@ The below metrics of most concern to us are displayed on the page:
 Among the most useful are the percentiles of request response times and the count of active users.
 Admins can either filter a particular property across all the metrics or select a particular metric for all properties.
 
-## Logging
+### Logging
 
 The Hue logs are found in `/var/log/hue`, or in a `logs` directory under your
 Hue installation root. Inside the log directory you can find:
@@ -131,13 +125,22 @@ If users on your cluster have problems running Hue, you can often find error
 messages in these log files. If you are unable to start Hue from the init
 script, the `supervisor.log` log file can often contain clues.
 
-### Viewing Recent Log Messages
-
 In addition to logging `INFO` level messages to the `logs` directory, the Hue
 web server keeps a small buffer of log messages at all levels in memory. You can
 view these logs by visiting `http://myserver:8888/hue/logs`. The `DEBUG` level
 messages shown can sometimes be helpful in troubleshooting issues.
 
+## Commands
+
+Type the following commands from the Hue installation root:
+
+    cd /usr/lib/hue (or /opt/cloudera/parcels/CDH-XXXXX/share/hue if using parcels and CM)
+    build/env/bin/hue shell
+
+To list all the available commands:
+
+    build/env/bin/hue
+
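+A hedged sketch of what the shell gives you, assuming a standard install where Hue accounts are regular Django auth users:
+
+    # Run inside `build/env/bin/hue shell`: the Django ORM is available,
+    # so a quick sanity check is to count and list the registered users.
+    from django.contrib.auth.models import User
+
+    print(User.objects.count())
+    print(list(User.objects.values_list("username", flat=True)))
+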
 ## Troubleshooting
 
 To troubleshoot why Hue is slow or consuming high memory, admins can enable instrumentation by setting the `instrumentation` flag to True.
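 
 A hedged example of enabling it, assuming the flag lives in the `[desktop]` section of `hue.ini` as in recent releases:
 
     [desktop]
     # Collect instrumentation to help diagnose slowness or high memory usage.
     instrumentation=true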

+ 1 - 1
docs/docs-site/content/administrator/administration/user-management.md

@@ -127,7 +127,7 @@ groups. It does not import any new users or groups.
 3.  In the **Sync LDAP users and groups** dialog, click **Sync** to
     perform the sync.
 
-## Reset a password
+### Reset a password
 
 **Programmatically**
 
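 A minimal sketch of a programmatic reset from the Hue shell (`build/env/bin/hue shell`), assuming accounts are standard Django auth users; the username and password below are placeholders:
 
     from django.contrib.auth.models import User
 
     # Look up the account and set a new password.
     user = User.objects.get(username="example_user")
     user.set_password("new_password")
     user.save()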

+ 1 - 1
docs/docs-site/content/developer/api/_index.md

@@ -2,7 +2,7 @@
 title: "APIs"
 date: 2019-03-13T18:28:09-07:00
 draft: false
-weight: 3
+weight: 5
 ---
 
 Hue can be accessed directly via a Django Python Shell or by its REST API.

+ 1 - 1
docs/docs-site/content/developer/application/_index.md

@@ -2,7 +2,7 @@
 title: "Applications"
 date: 2019-03-13T18:28:09-07:00
 draft: false
-weight: 5
+weight: 4
 ---
 
 Building a brand new application is more work but is ideal for creating a custom solution.

+ 1 - 1
docs/docs-site/content/developer/editor/_index.md

@@ -1,5 +1,5 @@
 ---
-title: "Editor / Notebook"
+title: "Editor"
 date: 2019-03-13T18:28:09-07:00
 draft: false
 weight: 2

+ 145 - 2
docs/docs-site/content/user/browsers/_index.md

@@ -43,11 +43,13 @@ To learn more, watch the video on [Data Import Wizard](http://gethue.com/import-
 
 ### Traditional Databases
 
-Read more about [ingesting data from traditional databases](http://gethue.com/importing-data-from-traditional-databases-into-hdfshive-in-just-a-few-clicks/).
+Import data from relational databases into HDFS files or Hive tables using Apache Sqoop. It lets you bring large amounts of data into the cluster in just a few clicks via an interactive UI. The imports run on YARN and are scheduled by Oozie.
+
+Learn more about it on the [ingesting data from traditional databases](http://gethue.com/importing-data-from-traditional-databases-into-hdfshive-in-just-a-few-clicks/) post.
 
 ### Indexing
 
-In the past, indexing data into Solr to then explore it with a [Dynamic Dashboard](http://gethue.com/search-dashboards/) has been quite difficult. The task involved writing a Solr schema and a Morphlines file then submitting a job to YARN to do the indexing. Often times getting this correct for non trivial imports could take a few days of work. Now with Hue's new feature you can start your YARN indexing job in minutes. This tutorial offers a step by step guide on how to do it.
+In the past, indexing data into Solr to then explore it with a [Dynamic Dashboard](http://gethue.com/search-dashboards/) has been quite difficult. The task involved writing a Solr schema and a Morphlines file, then submitting a job to YARN to do the indexing. Oftentimes getting this correct for non-trivial imports could take a few days of work. Now with Hue's new feature you can start your YARN indexing job in minutes.
 
 [Read more about it here](http://gethue.com/easy-indexing-of-data-into-solr/).
 
@@ -104,6 +106,147 @@ Learn more about it on the [ADLS integration post](http://gethue.com/browsing-ad
 
 Google file system is currently not supported.
 
+### HBase Browser
+
+We'll take a look at the [HBase Browser App](http://gethue.com/the-web-ui-for-hbase-hbase-browser).
+
+**Note**: With just a few changes in the [Python API](https://github.com/cloudera/hue/blob/master/apps/hbase/src/hbase/api.py),
+the HBase browser could be compatible with Apache Kudu or Google Big Table.
+
+
+#### SmartView
+
+The SmartView is the view that you land on when you first enter a table.
+On the left-hand side are the row keys, and hovering over a row reveals a
+list of controls on the right. Click a row to select it, and once
+selected you can perform batch operations, sort columns, or do any
+amount of standard database operations. To explore a row, simply scroll
+to the right. By scrolling, the row should continue to lazily load cells
+until the end.
+
+#### Adding Data
+
+To initially populate the table, you can insert a new row or bulk upload
+CSV/TSV/etc. type data into your table.
+
+
+On the right-hand side of a row is a '+' sign that lets you insert
+columns into your row.
+
+#### Mutating Data
+
+To edit a cell, simply click to edit inline.
+
+If you need more control or data about your cell, click “Full Editor” to
+edit.
+
+In the full editor, you can view cell history or upload binary data to
+the cell. Binary data of certain MIME Types are detected, meaning you
+can view and edit images, PDFs, JSON, XML, and other types directly in
+your browser!
+
+Hovering over a cell also reveals some more controls (such as the delete
+button or the timestamp). Click the title to select a few cells and do batch
+operations on them.
+
+If you need some sample data to get started and explore, check out this
+[how to create example HBase tables
+tutorial](http://gethue.com/hadoop-tutorial-how-to-create-example-tables-in-hbase).
+
+
+#### Smart Searchbar
+
+The "Smart Searchbar" is a sophisticated tool that helps you zero in on
+your data. The smart search supports a number of operations. The most
+basic ones include finding and scanning row keys. Here I am selecting
+two row keys with:
+
+
+    domain.100, domain.200
+
+
+Submitting this query gives me the two rows I was looking for. If I want
+to fetch rows after one of these, I have to do a scan. This is as easy
+as writing a '+' followed by the number of rows you want to fetch.
+
+
+    domain.100, domain.200 +5
+
+
+This fetches domain.100 and domain.200 followed by the next 5 rows. If you're
+ever confused about your results, you can look down below at the query
+bar and click in to edit your query.
+
+The Smart Search also supports column filtering. On any row, I can
+specify the specific columns or families I want to retrieve. With:
+
+
+    domain.100[column_family:]   
+
+
+I can select a bare family, or mix columns from different families like
+so:
+
+
+    domain.100[family1:, family2:, family3:column_a]
+
+
+Doing this will restrict my results from one row key to the columns I
+specified. If you want to restrict column families only, the same effect
+can be achieved with the filters on the right. Just click to toggle a
+filter.
+
+
+Finally, let's try some more complex column filters. I can query for
+bare columns:
+
+
+    domain.100[column_a]
+
+This will multiply my query over all column families. I can also do
+prefixes and scans:
+
+
+    domain.100[family: prefix* +3]
+
+
+This will fetch me all columns that start with prefix\* limited to 3
+results. Finally, I can filter on range:
+
+
+    domain.100[family: column1 to column100]
+
+
+This will fetch me all columns in 'family:' that are lexicographically
+\>= column1 but <= column100. The first column ('column1') must be a
+valid column, but the second can just be any string for comparison.
+
+The Smart Search also supports prefix filtering on rows. To select a
+prefixed row, simply type the row key followed by a star \*. The prefix
+should be highlighted like any other searchbar keyword. A prefix scan is
+performed exactly like a regular scan, but with a prefixed row.
+
+
+    domain.10* +10
+
+
+Finally, as a new feature, you can also take full advantage of the
+HBase filtering language by typing your filter
+string between curly braces. HBase Browser autocompletes your filters
+for you so you don't have to look them up every time. You can apply
+filters to rows or scans.
+
+
+    domain.1000 {ColumnPrefixFilter('100-') AND ColumnCountGetFilter(3)}
+
+
+This doc only covers a few basic features of the Smart Search. You can
+take advantage of the full querying language by referring to the help
+menu when using the app. These include column prefix, bare columns,
+column range, etc. Remember that if you ever need help with the
+searchbar, you can use the help menu that pops up while typing, which
+will suggest next steps to complete your query.
 
 ## Solr Indexes / Collections
 

+ 9 - 7
docs/docs-site/content/user/concept/_index.md

@@ -44,7 +44,7 @@ From top to bottom we have:
 Learn more on the [The Hue 4 user interface in detail](http://gethue.com/the-hue-4-user-interface-in-detail/).
 
 
-## Top search
+### Top search
 
 Have you ever struggled to remember table names related to your project? Does it take much too long to find those columns or views? Hue now lets you easily search for any table, view, or column across all databases in the cluster. With the ability to search across tens of thousands of tables, you're able to quickly find the tables that are relevant for your needs for faster data discovery.
 
@@ -63,21 +63,21 @@ Example of searches:
 
 Learn more on the [Tagging](https://blog.cloudera.com/blog/2017/05/new-in-cloudera-enterprise-5-11-hue-data-search-and-tagging/).
 
-## Tagging
+### Tagging
 
 In addition, you can also now tag objects with names to better categorize them and group them to different projects. These tags are searchable, expediting the exploration process through easier, more intuitive discovery.
 
-## Left assist
+### Left assist
 
 Data where you need it when you need it.
 
 Find your documents, HDFS and S3 files and more in the left assist panel. Right-clicking items shows a list of actions, and you can also drag-and-drop a file to get its path in your editor, and more.
 
-## Right assist
+### Right assist
 
 This assistant content depends on the context of the application selected and will display the current tables or available UDFs.
 
-## Sample popup
+### Sample popup
 
 This popup offers a quick way to see a sample of the data and other statistics on databases, tables, and columns. You can open the popup from the SQL Assist or with a right-click on any SQL object (table, column, function…). In this release, it also opens faster and caches the data.
 
@@ -100,8 +100,10 @@ Shared documents will show-up with a little blue icon in the homepage.
 
 Via the Home page, saved documents can be exported for backups or transferring to another Hue.
 
-## Changing the language
+## Settings
+
+### Changing the language
 
 The language is automatically detected from the Browser or OS. English, Spanish, French, German, Korean, Japanese and Chinese are supported.
 
-The language can be manual set by a user in the "My Profile" page. Please go to My Profile > Step2 Profile and Groups > Language Preference and choose the language you want.
+The language can be manually set by a user in the "My Profile" page. Please go to My Profile > Step2 Profile and Groups > Language Preference and choose the language you want.

+ 0 - 283
docs/docs-site/content/user/contrib/_index.md

@@ -1,283 +0,0 @@
----
-title: "Contrib"
-date: 2019-03-13T18:28:09-07:00
-draft: false
-weight: 6
----
-
-Those modules are not active enough to be officially maintained in the core Hue but those
-are pretty functional and should still fit your needs. Any [contribution](https://github.com/cloudera/hue/blob/master/docs/CONTRIBUTING.md) is welcomed!
-
-## SDK
-
-Check the [Developer guide]({{% param baseURL %}}developer/index.html) or contact the community about how to build your own custom app.
-
-## HBase Browser
-
-We'll take a look at the [HBase Browser App](http://gethue.com/the-web-ui-for-hbase-hbase-browser).
-
-**Note**: With just a few changes in the [Python API](https://github.com/cloudera/hue/blob/master/apps/hbase/src/hbase/api.py),
-the HBase browser could be compatible with Apache Kudu.
-
-
-### SmartView
-
-The smartview is the view that you land on when you first enter a table.
-On the left hand side are the row keys and hovering over a row reveals a
-list of controls on the right. Click a row to select it, and once
-selected you can perform batch operations, sort columns, or do any
-amount of standard database operations. To explore a row, simple scroll
-to the right. By scrolling, the row should continue to lazily-load cells
-until the end.
-
-### Adding Data
-
-To initially populate the table, you can insert a new row or bulk upload
-CSV/TSV/etc. type data into your table.
-
-
-On the right hand side of a row is a '+' sign that lets you insert
-columns into your
-row
-
-### Mutating Data
-
-To edit a cell, simply click to edit inline.
-
-If you need more control or data about your cell, click “Full Editor” to
-edit.
-
-In the full editor, you can view cell history or upload binary data to
-the cell. Binary data of certain MIME Types are detected, meaning you
-can view and edit images, PDFs, JSON, XML, and other types directly in
-your browser!
-
-Hovering over a cell also reveals some more controls (such as the delete
-button or the timestamp). Click the title to select a few and do batch
-operations:
-
-If you need some sample data to get started and explore, check out this
-howto create [HBase table
-tutorial](http://gethue.com/hadoop-tutorial-how-to-create-example-tables-in-hbase).
-
-
-### Smart Searchbar
-
-The "Smart Searchbar" is a sophisticated tool that helps you zero-in on
-your data. The smart search supports a number of operations. The most
-basic ones include finding and scanning row keys. Here I am selecting
-two row keys with:
-
-
-    domain.100, domain.200
-
-
-Submitting this query gives me the two rows I was looking for. If I want
-to fetch rows after one of these, I have to do a scan. This is as easy
-as writing a '+' followed by the number of rows you want to fetch.
-
-
-    domain.100, domain.200 +5
-
-
-Fetches domain.100 and domain.200 followed by the next 5 rows. If you're
-ever confused about your results, you can look down below and the query
-bar and also click in to edit your query.
-
-The Smart Search also supports column filtering. On any row, I can
-specify the specific columns or families I want to retrieve. With:
-
-
-    domain.100[column_family:]   
-
-
-I can select a bare family, or mix columns from different families like
-so:
-
-
-    domain.100[family1:, family2:, family3:column_a]
-
-
-Doing this will restrict my results from one row key to the columns I
-specified. If you want to restrict column families only, the same effect
-can be achieved with the filters on the right. Just click to toggle a
-filter.
-
-
-Finally, let's try some more complex column filters. I can query for
-bare columns:
-
-
-    domain.100[column_a]
-
-This will multiply my query over all column families. I can also do
-prefixes and scans:
-
-
-    domain.100[family: prefix* +3]
-
-
-This will fetch me all columns that start with prefix\* limited to 3
-results. Finally, I can filter on range:
-
-
-    domain.100[family: column1 to column100]
-
-
-This will fetch me all columns in 'family:' that are lexicographically
-\>= column1 but <= column100. The first column ('column1') must be a
-valid column, but the second can just be any string for comparison.
-
-The Smart Search also supports prefix filtering on rows. To select a
-prefixed row, simply type the row key followed by a star \*. The prefix
-should be highlighted like any other searchbar keyword. A prefix scan is
-performed exactly like a regular scan, but with a prefixed row.
-
-
-    domain.10* +10
-
-
-Finally, as a new feature, you can also take full advantage of the
-[HBase filtering](denied:about:blank)language, by typing your filter
-string between curly braces. HBase Browser autocompletes your filters
-for you so you don't have to look them up every time. You can apply
-filters to rows or scans.
-
-
-    domain.1000 {ColumnPrefixFilter('100-') AND ColumnCountGetFilter(3)}
-
-
-This doc only covers a few basic features of the Smart Search. You can
-take advantage of the full querying language by referring to the help
-menu when using the app. These include column prefix, bare columns,
-column range, etc. Remember that if you ever need help with the
-searchbar, you can use the help menu that pops up while typing, which
-will suggest next steps to complete your query.
-
-## Sqoop 1 Importer
-Iport data from relational databases to HDFS file or Hive table using Apache Sqoop 1. It enables us to bring large amount of data into the cluster in just few clicks via interactive UI. This Sqoop connector was added to the existing import data wizard of Hue.
-
-In the past, importing data using Sqoop command line interface could be a cumbersome and inefficient process. The task expected users to have a good knowledge of Sqoop . For example they would need put together a series of required parameters with specific syntax that would result in errors easy to make. Often times getting those correctly can take a few hours of work. Now with Hue's new feature you can submityour Sqoop job in minutes. The imports run on YARN and are scheduled by Oozie. This tutorial offers a step by step guide on how to do it.
-
-Learn more about it on the [Importing data from traditional databases into HDFS/Hive in just a few clicks
-](http://gethue.com/importing-data-from-traditional-databases-into-hdfshive-in-just-a-few-clicks/) post.
-
-## Sqoop 2 Editor
-
-The Sqoop UI enables transfering data from a relational database
-to Hadoop and vice versa. The UI lives uses Apache Sqoop to do this.
-See the [Sqoop Documentation](http://sqoop.apache.org/docs/1.99.2/index.html) for more details on Sqoop.
-
-### Creating a New Job
-
-1. Click the **New job** button at the top right.
-2. In the Name field, enter a name.
-3. Choose the type of job: import or export.
-   The proceeding form fields will change depending on which type is chosen.
-4. Select a connection, or create one if it does not exist.
-5. Fill in the rest of the fields for the job.
-   For importing, the "Table name", "Storage type", "Output format", and "Output directory" are necessary at a minimum.
-   For exporting, the "Table name" and "Input directory" are necessary at a minimum.
-6. Click **save** to finish.
-
-
-### Running a Job
-
-There's a status on each of the items in the job list indicating
-the last time a job was ran. The progress of the job should dynamically
-update. There's a progress bar at the bottom of each item on the job list
-as well.
-
-1. In the list of jobs, click on the name of the job.
-2. On the left hand side of the job editor, there should be a panel containing actions.
-   Click **Run**.
-
-### Creating a New Connection
-
-1. Click the **New job** button at the top right.
-2. At the connection field, click the link titled **Add a new connection**.
-3. Fill in the displayed fields.
-4. Click **save** to finish.
-
-### Editing a Connection
-
-1. Click the **New job** button at the top right.
-2. At the connection field, select the connection by name that should be edited.
-3. Click **Edit**.
-4. Edit the any of the fields.
-5. Click **save** to finish.
-
-### Removing a Connection
-
-1. Click the **New job** button at the top right.
-2. At the connection field, select the connection by name that should be deleted.
-3. Click **Delete**.
-
-NOTE: If this does not work, it's like because a job is using that connection.
-      Make sure not jobs are using the connection that will be deleted.
-
-### Filtering Sqoop Jobs
-
-The text field in the top, left corner of the Sqoop Jobs page enables fast filtering
-of sqoop jobs by name.
-
-
-## ZooKeeper Browser
-
-
-The main two features are:
-
-- Listing of the ZooKeeper cluster stats and clients
-- Browsing and edition of the ZNode hierarchy
-
-
-ZooKeeper Browser requires the [ZooKeeper
-REST](https://github.com/apache/zookeeper/tree/trunk/src/contrib/rest)
-service to be running. Here is how to setup this one:
-
-First get and build ZooKeeper:
-
-<pre>
-git clone https://github.com/apache/zookeeper
-cd zookeeper
-ant
-Buildfile: /home/hue/Development/zookeeper/build.xml
-
-init:
-       [mkdir] Created dir: /home/hue/Development/zookeeper/build/classes
-       [mkdir] Created dir: /home/hue/Development/zookeeper/build/lib
-       [mkdir] Created dir: /home/hue/Development/zookeeper/build/package/lib
-       [mkdir] Created dir: /home/hue/Development/zookeeper/build/test/lib
-
-   ...
-</pre>
-
-And start the REST service:
-
-<pre>
-cd src/contrib/rest
-nohup ant run&
-</pre>
-
-If ZooKeeper and the REST service are not on the same machine as Hue, go
-update the [Hue
-settings](https://github.com/cloudera/hue/blob/master/desktop/conf.dist/hue.ini#L581)
-and specify the correct hostnames and ports:
-
-<pre>
-    [zookeeper]
-
-      [[clusters]]
-
-        [[[default]]]
-          # Zookeeper ensemble. Comma separated list of Host/Port.
-          # e.g. localhost:2181,localhost:2182,localhost:2183
-          ## host_ports=localhost:2181
-
-          # The URL of the REST contrib service
-          ## rest_url=http://localhost:9998
-</pre>
-
-## Git
-
-A basic read only version is done [HUE-951](https://issues.cloudera.org/browse/HUE-951).

+ 1 - 1
docs/docs-site/content/user/editor/_index.md

@@ -1,5 +1,5 @@
 ---
-title: "SQL Editor / Notebook"
+title: "Editor"
 date: 2019-03-13T18:28:09-07:00
 draft: false
 weight: 2