This is a slightly updated draft of a talk I was planning to give at Hadoop Summit in 2015; however, the abstract was rejected. Rather than toss it, I'm sharing it with all of you on the (almost) one-year anniversary of the first big commit of this feature!
Keep in mind that this is (currently) locked away in trunk. If you ever want to see this feature see the light of day, bug your vendors....
40. Default *.out log rotation:

function hadoop_rotate_log
{
  local log=$1;
  local num=${2:-5};

  if [[ -f "${log}" ]]; then
    # rotate logs
    while [[ ${num} -gt 1 ]]; do
      let prev=${num}-1
      if [[ -f "${log}.${prev}" ]]; then
        mv "${log}.${prev}" "${log}.${num}"
      fi
      num=${prev}
    done
    mv "${log}" "${log}.${num}"
  fi
}

namenode.out.1 -> namenode.out.2
namenode.out   -> namenode.out.1
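The rotation above can be exercised on its own. The sketch below reproduces the function from the slide and simulates two daemon restarts in a scratch directory; the file names and contents are illustrative, not from Hadoop itself.

```shell
# Default rotation function, as on the slide above.
function hadoop_rotate_log
{
  local log=$1;
  local num=${2:-5};

  if [[ -f "${log}" ]]; then
    # rotate logs
    while [[ ${num} -gt 1 ]]; do
      let prev=${num}-1
      if [[ -f "${log}.${prev}" ]]; then
        mv "${log}.${prev}" "${log}.${num}"
      fi
      num=${prev}
    done
    mv "${log}" "${log}.${num}"
  fi
}

# Simulate two successive daemon starts in a scratch directory.
dir=$(mktemp -d)
cd "${dir}"
echo "first run"  > namenode.out
hadoop_rotate_log namenode.out     # namenode.out -> namenode.out.1
echo "second run" > namenode.out
hadoop_rotate_log namenode.out     # .1 -> .2, then namenode.out -> .1
```

After the second rotation the directory holds namenode.out.1 (newest) and namenode.out.2 (oldest), and namenode.out is gone until the daemon recreates it.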
41. Put a replacement rotate function w/gzip support in hadoop-user-functions.sh!

function hadoop_rotate_log
{
  local log=$1;
  local num=${2:-5};

  if [[ -f "${log}" ]]; then
    while [[ ${num} -gt 1 ]]; do
      let prev=${num}-1
      if [[ -f "${log}.${prev}.gz" ]]; then
        mv "${log}.${prev}.gz" "${log}.${num}.gz"
      fi
      num=${prev}
    done
    mv "${log}" "${log}.${num}"
    gzip -9 "${log}.${num}"
  fi
}

namenode.out.1.gz -> namenode.out.2.gz
namenode.out      -> namenode.out.1
gzip -9 namenode.out.1 -> namenode.out.1.gz
42. What if we wanted to log every daemon start in syslog?
43. Default daemon starter:

function hadoop_start_daemon
{
  local command=$1
  local class=$2
  shift 2

  hadoop_debug "Final CLASSPATH: ${CLASSPATH}"
  hadoop_debug "Final HADOOP_OPTS: ${HADOOP_OPTS}"

  export CLASSPATH
  exec "${JAVA}" "-Dproc_${command}" ${HADOOP_OPTS} "${class}" "$@"
}
44. Put a replacement start function in hadoop-user-functions.sh!

function hadoop_start_daemon
{
  local command=$1
  local class=$2
  shift 2

  hadoop_debug "Final CLASSPATH: ${CLASSPATH}"
  hadoop_debug "Final HADOOP_OPTS: ${HADOOP_OPTS}"

  export CLASSPATH
  logger -i -p local0.notice -t hadoop "Started ${command}"
  exec "${JAVA}" "-Dproc_${command}" ${HADOOP_OPTS} "${class}" "$@"
}
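This works because bash keeps only the most recent definition of a function: the stock function library is sourced first and hadoop-user-functions.sh afterwards, so a same-named function there silently wins. A minimal sketch of that mechanism (the two file names are real; the function bodies are illustrative stand-ins):

```shell
# Demonstrate function override via sourcing order, the mechanism the
# slides rely on. Bodies are placeholders, not real Hadoop code.
dir=$(mktemp -d)

# Stand-in for the stock function library.
cat > "${dir}/hadoop-functions.sh" <<'EOF'
function hadoop_start_daemon { echo "stock starter"; }
EOF

# Stand-in for the admin-supplied override file.
cat > "${dir}/hadoop-user-functions.sh" <<'EOF'
function hadoop_start_daemon { echo "user starter with logging"; }
EOF

# The framework sources the stock library first, user overrides second,
# so the later definition replaces the earlier one.
. "${dir}/hadoop-functions.sh"
. "${dir}/hadoop-user-functions.sh"

result=$(hadoop_start_daemon)
echo "${result}"
```

No patching of the shipped scripts is needed; upgrades replace the stock library while the overrides survive untouched.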
48. # hadoop-user-functions.sh: (partial code below)

function hadoop_start_secure_daemon
{
  …
  jsvc="${JSVC_HOME}/jsvc"

  if [[ "${USER}" != "${HADOOP_SECURE_USER}" ]]; then
    hadoop_error "You must be ${HADOOP_SECURE_USER} in order to start a secure ${daemonname}"
    exit 1
  fi
  …
  exec /usr/sbin/sudo "${jsvc}" \
    "-Dproc_${daemonname}" \
    -outfile "${daemonoutfile}" \
    -errfile "${daemonerrfile}" \
    -pidfile "${daemonpidfile}" \
    -nodetach \
    -home "${JAVA_HOME}" \
    -user "${HADOOP_SECURE_USER}" \
    -cp "${CLASSPATH}" \
    ${HADOOP_OPTS} "${class}" "$@"
}
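The user guard at the top of that function can be tried on its own. The sketch below is a hypothetical standalone version: the values are illustrative, and `hadoop_error`/`exit` are softened to `echo`/`return` so it runs as a plain script.

```shell
# Standalone sketch of the secure-user guard. Values below are
# illustrative; hadoop_error/exit become echo/return for this demo.
HADOOP_SECURE_USER="hdfs"
daemonname="datanode"

check_secure_user() {
  # $1 stands in for ${USER}; user names are strings, so compare with !=
  if [[ "$1" != "${HADOOP_SECURE_USER}" ]]; then
    echo "You must be ${HADOOP_SECURE_USER} in order to start a secure ${daemonname}"
    return 1
  fi
  return 0
}

check_secure_user "alice" || echo "refused to start"
check_secure_user "hdfs"  && echo "ok to start"
```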
49. $ hdfs datanode

sudo launches jsvc as root
jsvc launches secure datanode

In order to get --daemon start to work, one other function needs to get replaced*, but that's a SMOP, now that you know how!

* hadoop_start_secure_daemon_wrapper assumes it is running as root
50. Lots more, but out of time... e.g.:

- Internals for contributors
- Unit tests
- API documentation
- Other projects in the works
- ...

Reminder: This is in trunk. Ask vendors their plans!