org.apache.hadoop.util.Shell demos / examples
package cn.shell;

import java.io.IOException;

import org.apache.hadoop.util.Shell;

public class ShellDemo {
    public static void main(String[] args) throws IOException {
        String pars = "ipconfig";
        String out = Shell.ShellCommandExecutor.execCommand(pars);
        System.out.println(out);
    }
}
The above is a very simple example. Note how the arguments are passed to Shell.ShellCommandExecutor.execCommand: execCommand is a static varargs method inherited from Shell (String execCommand(String... cmd)), so the command and each of its arguments go in as separate strings.
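Internally, Shell.execCommand roughly amounts to starting a process from the argument array and collecting its standard output. If Hadoop is not on the classpath, a plain-Java sketch of the same behavior using java.lang.ProcessBuilder looks like this (a simplification, not Hadoop's exact implementation: timeout handling, stderr collection, and exit-code checking are omitted):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class ExecDemo {
    // Run a command, wait for it to finish, and return its stdout as a String.
    // Roughly what Shell.execCommand(String... cmd) does, minus Hadoop's
    // timeout, environment, and error handling.
    public static String execCommand(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).start();
        byte[] out = p.getInputStream().readAllBytes(); // requires Java 9+
        p.waitFor();
        return new String(out, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        // "echo" assumes Linux/macOS; on Windows the equivalent would be
        // "cmd", "/c", "echo", "hello"
        System.out.println(execCommand("echo", "hello"));
    }
}
```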
demo2: check whether any application has been submitted to a YARN queue; the queue name can be specified.
public static boolean hasAppBeSubmited(String queueName) {
    boolean tag = false;
    try {
        String out = Shell.ShellCommandExecutor.execCommand("yarn", "application", "-list");
        // "Tracking-URL" is the last column of the header line, so everything
        // after it is application rows.
        String[] apps = out.split("Tracking-URL");
        if (apps.length == 2) {
            tag = apps[1].contains(queueName);
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
    return tag;
}
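The parsing step can be factored out and exercised without a live cluster. The sample output below is hand-written (the real format of yarn application -list varies between Hadoop versions), and note that the contains() check is fragile: it would also match the queue name appearing inside an application name or URL.

```java
public class QueueCheck {
    // Same parsing logic as hasAppBeSubmited, factored out so it can be
    // tested on a captured output string instead of a live cluster.
    public static boolean queueHasApps(String yarnListOutput, String queueName) {
        String[] parts = yarnListOutput.split("Tracking-URL");
        return parts.length == 2 && parts[1].contains(queueName);
    }

    public static void main(String[] args) {
        // Hand-written sample of `yarn application -list` output.
        String sample =
            "Total number of applications:1\n" +
            "Application-Id  Application-Name  Queue  State  Tracking-URL\n" +
            "application_1500000000000_0001  demo  etl  RUNNING  http://rm:8088/proxy/app\n";
        System.out.println(queueHasApps(sample, "etl"));   // true
        System.out.println(queueHasApps(sample, "adhoc")); // false
    }
}
```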
demo3: to make the pipe symbol '|' and the redirection symbol '>' work, the command must be run as sh -c with the full command line appended:
Shell.ShellCommandExecutor.execCommand("sh", "-c", "hadoop fs -text <hdfs file> > <local file>");
Note that running the command directly, as in

Shell.ShellCommandExecutor.execCommand("hadoop", "fs", "-text", "<hdfs file>", ">", "<local file>");

fails: the ">" is passed along as just another argument and gets treated as a file path. That is why the command line must be handed to sh -c "..." as one string, so the shell itself interprets the redirection.
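The same pitfall can be reproduced with plain Java: without a shell, '>' is just another argv entry, while wrapping the line in sh -c lets the shell perform the redirection. A small sketch, assuming a POSIX sh is on the PATH:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class RedirectDemo {
    // Run `sh -c <commandLine>` so the shell interprets '|' and '>',
    // and return the process exit code.
    public static int runViaShell(String commandLine) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("sh", "-c", commandLine).start();
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        Path out = Files.createTempFile("redirect-demo", ".txt");
        // The shell, not the JVM, performs the '>' redirection here.
        runViaShell("echo hello > " + out);
        System.out.println(Files.readString(out).trim()); // hello
    }
}
```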
demo4: split a file directly with the shell's split command (the last argument, newFile, is the output file-name prefix):
Shell.ShellCommandExecutor.execCommand("split", "-l", "10000",
        "--numeric-suffixes=1", "--suffix-length=3", "--additional-suffix=.txt",
        srcFile, newFile);
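Where GNU split is unavailable (e.g. on Windows), the same fixed-line-count split can be sketched in plain Java. The naming below imitates the --numeric-suffixes=1, --suffix-length=3 and --additional-suffix=.txt options (part001.txt, part002.txt, ...); for very large files a streaming read would be preferable to readAllLines:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SplitDemo {
    // Split srcFile into chunks of at most linesPerFile lines, writing
    // <prefix>001.txt, <prefix>002.txt, ... next to the source file,
    // and return the created paths.
    public static List<Path> splitByLines(Path srcFile, String prefix, int linesPerFile)
            throws IOException {
        List<String> lines = Files.readAllLines(srcFile);
        List<Path> created = new ArrayList<>();
        for (int i = 0; i < lines.size(); i += linesPerFile) {
            Path part = srcFile.resolveSibling(
                    String.format("%s%03d.txt", prefix, created.size() + 1));
            Files.write(part, lines.subList(i, Math.min(i + linesPerFile, lines.size())));
            created.add(part);
        }
        return created;
    }

    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("split-demo", ".txt");
        Files.write(src, Collections.nCopies(25, "row"));
        for (Path p : splitByLines(src, "part", 10)) {
            System.out.println(p.getFileName());
        }
    }
}
```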
demo5: run a shell or Python script (the script must have execute permission, and the relative path is resolved against the working directory):
Shell.ShellCommandExecutor.execCommand("./ftp.sh", ftpHost, ftpPort,
        user, password, putPath + File.separator + date, fileDir);
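Each extra string after the script path arrives in the script as a positional parameter ($1, $2, ...). A plain-Java sketch of the same pattern, which creates a throwaway script for demonstration (the script body here is hypothetical; assumes a POSIX /bin/sh):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ScriptDemo {
    // Run an executable script with positional arguments and return its stdout.
    public static String runScript(String scriptPath, String... scriptArgs)
            throws IOException, InterruptedException {
        String[] cmd = new String[scriptArgs.length + 1];
        cmd[0] = scriptPath;
        System.arraycopy(scriptArgs, 0, cmd, 1, scriptArgs.length);
        Process p = new ProcessBuilder(cmd).start();
        String out = new String(p.getInputStream().readAllBytes());
        p.waitFor();
        return out;
    }

    public static void main(String[] args) throws Exception {
        // Throwaway script that echoes its first two positional parameters.
        Path script = Files.createTempFile("demo", ".sh");
        Files.writeString(script, "#!/bin/sh\necho \"$1-$2\"\n");
        script.toFile().setExecutable(true); // scripts must be executable
        System.out.println(runScript(script.toString(), "host", "21").trim()); // host-21
    }
}
```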