Hadoop 2.2.0 startup script: sbin/start-dfs.sh
1. Source hdfs-config.sh to set up the HDFS environment (see the sketch below).
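A rough sketch of how the script locates and sources hdfs-config.sh, reconstructed from the 2.2.0 layout (variable names may differ slightly from the shipped script):

    bin=`dirname "${BASH_SOURCE-$0}"`
    bin=`cd "$bin"; pwd`
    # hdfs-config.sh lives in libexec, next to sbin
    DEFAULT_LIBEXEC_DIR="$bin"/../libexec
    HADOOP_LIBEXEC_DIR=${HADOOP_LIBEXEC_DIR:-$DEFAULT_LIBEXEC_DIR}
    . $HADOOP_LIBEXEC_DIR/hdfs-config.sh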
2. If at least one argument is given, read the first argument (see the sketch after this item):
   -upgrade: nothing extra is needed, i.e. dataStartOpt="", nameStartOpt="-upgrade $@";
   -rollback: dataStartOpt="-rollback", nameStartOpt="-rollback $@".
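A sketch of the argument handling, reconstructed from the 2.2.0 script (details may differ slightly):

    if [ $# -ge 1 ]; then
      nameStartOpt="$1"
      shift
      case "$nameStartOpt" in
        (-upgrade)
          ;;                              # datanodes need no extra option
        (-rollback)
          dataStartOpt="$nameStartOpt"    # datanodes are also rolled back
          ;;
        (*)
          echo $usage
          exit 1
          ;;
      esac
    fi
    # any remaining arguments are appended to the namenode options
    nameStartOpt="$nameStartOpt $@"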
3. Run bin/hdfs getconf -namenodes to get NAMENODES.
4. Run sbin/hadoop-daemons.sh --config "$HADOOP_CONF_DIR" --hostnames "$NAMENODES" --script "$bin/hdfs" start namenode $nameStartOpt to start a namenode on each of those hosts (sketch below).
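Roughly, this part of the script looks like the following (hedged reconstruction):

    NAMENODES=$($HADOOP_PREFIX/bin/hdfs getconf -namenodes)
    echo "Starting namenodes on [$NAMENODES]"
    "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
      --config "$HADOOP_CONF_DIR" \
      --hostnames "$NAMENODES" \
      --script "$bin/hdfs" start namenode $nameStartOpt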
5. If HADOOP_SECURE_DN_USER is set, the datanodes are skipped here and start-secure-dns.sh must be run as root instead;
   otherwise run sbin/hadoop-daemons.sh --config "$HADOOP_CONF_DIR" --script "$bin/hdfs" start datanode $dataStartOpt (sketch below).
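Hedged sketch of the datanode branch:

    if [ -n "$HADOOP_SECURE_DN_USER" ]; then
      echo "Attempting to start secure cluster, skipping datanodes." \
           "Run start-secure-dns.sh as root to complete startup."
    else
      "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
        --config "$HADOOP_CONF_DIR" \
        --script "$bin/hdfs" start datanode $dataStartOpt
    fi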
6. Run bin/hdfs getconf -secondarynamenodes to get SECONDARY_NAMENODES,
   then start them the same way as the NAMENODES (start secondarynamenode, see the sketch below).
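Hedged sketch of the secondary namenode step:

    SECONDARY_NAMENODES=$($HADOOP_PREFIX/bin/hdfs getconf -secondarynamenodes 2>/dev/null)
    if [ -n "$SECONDARY_NAMENODES" ]; then
      echo "Starting secondary namenodes [$SECONDARY_NAMENODES]"
      "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
        --config "$HADOOP_CONF_DIR" \
        --hostnames "$SECONDARY_NAMENODES" \
        --script "$bin/hdfs" start secondarynamenode
    fi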
7. SHARED_EDITS_DIR,TODO...
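From memory of the 2.2.0 script (to be verified when this TODO is filled in): SHARED_EDITS_DIR is read from dfs.namenode.shared.edits.dir, and if it is a qjournal:// URI the journal node hosts are parsed out of it and journalnode daemons are started on them. Rough sketch:

    SHARED_EDITS_DIR=$($HADOOP_PREFIX/bin/hdfs getconf -confKey dfs.namenode.shared.edits.dir 2>&-)
    case "$SHARED_EDITS_DIR" in
      qjournal://*)
        # strip the qjournal:// prefix, the journal id and the port numbers to get the host list
        JOURNAL_NODES=$(echo "$SHARED_EDITS_DIR" | sed 's,qjournal://\([^/]*\)/.*,\1,g; s/;/ /g; s/:[0-9]*//g')
        echo "Starting journal nodes [$JOURNAL_NODES]"
        "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
          --config "$HADOOP_CONF_DIR" \
          --hostnames "$JOURNAL_NODES" \
          --script "$bin/hdfs" start journalnode ;;
    esac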
8. AUTOHA_ENABLED,TODO...
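Likewise a hedged sketch for the auto-HA step (also still to be verified): AUTOHA_ENABLED is read from dfs.ha.automatic-failover.enabled, and if it is true a ZK Failover Controller (zkfc) is started on each namenode host:

    AUTOHA_ENABLED=$($HADOOP_PREFIX/bin/hdfs getconf -confKey dfs.ha.automatic-failover.enabled)
    if [ "$(echo "$AUTOHA_ENABLED" | tr A-Z a-z)" = "true" ]; then
      echo "Starting ZK Failover Controllers on NN hosts [$NAMENODES]"
      "$HADOOP_PREFIX/sbin/hadoop-daemons.sh" \
        --config "$HADOOP_CONF_DIR" \
        --hostnames "$NAMENODES" \
        --script "$bin/hdfs" start zkfc
    fi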