Latest commit: d049e13037 by Todd Lipcon, 15 years ago: HUE-219. Add hadoop test jar to the minicluster test path to work around a Java classloader issue
gen-py                351f1925dc  15 years ago  Regenerate all thrift code with Thrift 0.5.0rc1
java                  06df4fcbe1  15 years ago  Update DatanodePlugin to ByteBuffer based thrift API for 0.5.0
src                   d049e13037  15 years ago  HUE-219. Add hadoop test jar to the minicluster test path to work around a Java classloader issue
static-group-mapping  54769f3d64  15 years ago  HUE-219. Make Hue work with CDH3 trunk
.gitignore            ae333e405b  15 years ago  hue (formerly Cloudera Desktop) from internal commit 4694ac0434dad85170854c5378a389585d87f679
Makefile              54769f3d64  15 years ago  HUE-219. Make Hue work with CDH3 trunk
README                ae333e405b  15 years ago  hue (formerly Cloudera Desktop) from internal commit 4694ac0434dad85170854c5378a389585d87f679
regenerate-thrift.sh  ae333e405b  15 years ago  hue (formerly Cloudera Desktop) from internal commit 4694ac0434dad85170854c5378a389585d87f679
setup.py              7dd63f9bec  15 years ago  Bumped version number to 1.0.

README

If you modify any of the Thrift files, you need to regenerate
the generated code by running regenerate-thrift.sh in this
directory. Finally, check in any files that were regenerated
in the previous step.
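
For example, after editing a .thrift file the workflow might look like
the following (a sketch only: it assumes the Thrift compiler used by
regenerate-thrift.sh is installed, and the gen-py path and commit
message shown here are illustrative):

  $ ./regenerate-thrift.sh
  $ git status        # review which generated files changed (e.g. under gen-py/)
  $ git add gen-py
  $ git commit -m "Regenerate thrift code after interface change"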