Thrift API for HDFS
===================

Introduction
============

The Hadoop Distributed File System (HDFS) is written in Java. An
application that wants to store or fetch data in HDFS can use the Java
API. This means that applications not written in Java have no elegant
way to access HDFS.

Thrift is a software framework for scalable cross-language services
development. It combines a powerful software stack with a code generation
engine to build services that work efficiently and seamlessly
between C++, Java, Python, PHP, and Ruby.

This project exposes the HDFS API through the Thrift software stack,
which lets applications written in many languages access HDFS
elegantly.


The Application Programming Interface (API)
===========================================
The HDFS API that is exposed through Thrift can be found in if/hadoopfs.thrift.

Compilation
===========
The compilation process creates a server, org.apache.hadoop.thriftfs.HadoopThriftServer,
that implements the Thrift interface defined in if/hadoopfs.thrift.

The Thrift compiler is used to generate API stubs in Python, PHP, Ruby,
Cocoa, etc. The generated code is checked into the gen-* directories.
The generated Java API is checked in as lib/hadoopthriftapi.jar.
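
For illustration, this is roughly how the generated Python stubs are
brought onto the module path and imported, assuming they were generated
into a gen-py directory (for example with "thrift --gen py
if/hadoopfs.thrift"; exact compiler flags vary by Thrift version). The
hadoopfs module and ThriftHadoopFileSystem names follow Thrift's naming
convention for the service declared in if/hadoopfs.thrift and are
illustrative, not prescriptive:

    import sys

    # Make the Thrift-generated Python stubs importable. A gen-py
    # directory is what the Thrift compiler produces for the Python target.
    sys.path.append('gen-py')

    from hadoopfs import ThriftHadoopFileSystem   # generated service stub
    from hadoopfs.ttypes import Pathname          # generated types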

There is a sample Python script, hdfs.py, in the scripts directory.
When invoked, it starts a HadoopThriftServer in the background and then
communicates with HDFS through the Thrift API. The script is for
demonstration purposes only.
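
For a flavor of what such a client session looks like, here is a
minimal sketch that talks to an already-running HadoopThriftServer
using the Thrift Python runtime. The port number, the
ThriftHadoopFileSystem service name, and the Pathname wrapper struct
are assumptions based on the interface in if/hadoopfs.thrift; check the
IDL and the server's startup output for the actual values:

    import sys
    sys.path.append('gen-py')   # Thrift-generated stubs, as above

    from thrift.transport import TSocket, TTransport
    from thrift.protocol import TBinaryProtocol

    from hadoopfs import ThriftHadoopFileSystem   # assumed service name
    from hadoopfs.ttypes import Pathname          # assumed path struct

    # Connect to a HadoopThriftServer that is already running; the host
    # and port are placeholders -- the server reports its port on startup.
    transport = TTransport.TBufferedTransport(
        TSocket.TSocket('localhost', 9090))
    protocol = TBinaryProtocol.TBinaryProtocol(transport)
    client = ThriftHadoopFileSystem.Client(protocol)

    transport.open()
    try:
        path = Pathname(pathname='/tmp/thriftfs-demo')
        client.mkdirs(path)         # create a directory in HDFS
        print(client.exists(path))  # True once the call round-trips
        print(client.listStatus(Pathname(pathname='/')))  # FileStatus list
    finally:
        transport.close()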