Hadoop 2.6 Log Files and MapReduce Log Files
ref:https://blog.csdn.net/infovisthinker/article/details/45370089
The mr-jobhistory-daemon.sh script lives under ${HADOOP_INSTALL}/sbin/. After starting it, the jps command shows a JobHistoryServer process.
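A minimal sketch of that start-and-verify step (the install path differs per machine; here it is written in terms of ${HADOOP_INSTALL}):

# start the JobHistoryServer daemon, then confirm the process is up
${HADOOP_INSTALL}/sbin/mr-jobhistory-daemon.sh start historyserver
jps | grep JobHistoryServer   # a JobHistoryServer entry should appear once startup finishes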
Once the HistoryServer is running, the History link becomes available from each job's Tracking URL.
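You can also hit the history server's web UI directly; this is a hedged check that assumes the default web UI port 19888 (the same port that appears in the startup log further down) and the hostname bigdata used in this setup:

# the JobHistoryServer web UI normally listens on port 19888
curl -s http://bigdata:19888/jobhistory | head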
Following what the referenced post describes, I tried it myself: after starting the historyserver, two new files showed up under the logs directory, mapred-root-historyserver-bigdata.log and mapred-root-historyserver-bigdata.out.
[root@bigdata userlogs]# /home/admin/Downloads/hadoop-2.10.0/sbin/mr-jobhistory-daemon.sh start historyserver
starting historyserver, logging to /home/admin/Downloads/hadoop-2.10.0/logs/mapred-root-historyserver-bigdata.out
[root@bigdata userlogs]# cd ..
[root@bigdata logs]# ls
hadoop-root-datanode-bigdata.log                            mapred-root-historyserver-bigdata.log
hadoop-root-datanode-bigdata.out                            mapred-root-historyserver-bigdata.out
hadoop-root-datanode-localhost.localdomain.log              SecurityAuth-root.audit
hadoop-root-datanode-localhost.localdomain.out              userlogs
hadoop-root-datanode-localhost.localdomain.out.1            yarn-root-nodemanager-bigdata.log
hadoop-root-namenode-bigdata.log                            yarn-root-nodemanager-bigdata.out
hadoop-root-namenode-bigdata.out                            yarn-root-nodemanager-localhost.localdomain.log
hadoop-root-namenode-localhost.localdomain.log              yarn-root-nodemanager-localhost.localdomain.out
hadoop-root-namenode-localhost.localdomain.out              yarn-root-nodemanager-localhost.localdomain.out.1
hadoop-root-namenode-localhost.localdomain.out.1            yarn-root-resourcemanager-bigdata.log
hadoop-root-secondarynamenode-bigdata.log                   yarn-root-resourcemanager-bigdata.out
hadoop-root-secondarynamenode-bigdata.out                   yarn-root-resourcemanager-localhost.localdomain.log
hadoop-root-secondarynamenode-localhost.localdomain.log     yarn-root-resourcemanager-localhost.localdomain.out
hadoop-root-secondarynamenode-localhost.localdomain.out     yarn-root-resourcemanager-localhost.localdomain.out.1
hadoop-root-secondarynamenode-localhost.localdomain.out.1   yarn-root-resourcemanager-localhost.localdomain.out.2
hadoop-root-secondarynamenode-localhost.localdomain.out.2   yarn-root-resourcemanager-localhost.localdomain.out.3
Let's cat mapred-root-historyserver-bigdata.log; part of the output follows:
STARTUP_MSG: build = ssh://git.corp.linkedin.com:29418/hadoop/hadoop.git -r e2f1f118e465e787d8567dfa6e2f3b72a0eb9194; compiled by 'jhung' on 2019-10-22T19:10Z
STARTUP_MSG: java = 1.8.0_241
************************************************************/
2020-04-11 17:03:07,796 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer: registered UNIX signal handlers for [TERM, HUP, INT]
2020-04-11 17:03:09,022 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2020-04-11 17:03:09,146 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2020-04-11 17:03:09,146 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: JobHistoryServer metrics system started
2020-04-11 17:03:09,152 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistory: JobHistory Init
2020-04-11 17:03:09,831 INFO org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://bigdata:9000]
2020-04-11 17:03:10,272 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Perms after creating 504, Expected: 504
2020-04-11 17:03:10,301 INFO org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils: Default file system [hdfs://bigdata:9000]
2020-04-11 17:03:10,313 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Perms after creating 493, Expected: 1023
2020-04-11 17:03:10,313 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Explicitly setting permissions to : 1023, rwxrwxrwt
2020-04-11 17:03:10,325 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Initializing Existing Jobs...
2020-04-11 17:03:10,344 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Found 0 directories to load
2020-04-11 17:03:10,345 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager: Existing job initialization finished. 0.0% of cache is occupied.
2020-04-11 17:03:10,346 INFO org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage: CachedHistoryStorage Init
2020-04-11 17:03:10,553 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 100 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2020-04-11 17:03:10,567 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 10033
2020-04-11 17:03:10,843 INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
2020-04-11 17:03:10,853 INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager: Starting expired delegation token remover thread, tokenRemoverScanInterval=60 min(s)
2020-04-11 17:03:10,853 INFO org.apache.hadoop.security.token.delegation.AbstractDelegationTokenSecretManager: Updating the current master key for generating delegation tokens
2020-04-11 17:03:10,956 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
2020-04-11 17:03:10,976 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
2020-04-11 17:03:10,985 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.jobhistory is not defined
2020-04-11 17:03:10,990 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2020-04-11 17:03:10,992 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context jobhistory
2020-04-11 17:03:10,992 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2020-04-11 17:03:10,992 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2020-04-11 17:03:10,994 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /jobhistory/*
2020-04-11 17:03:10,994 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
2020-04-11 17:03:11,491 INFO org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
2020-04-11 17:03:11,493 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 19888
2020-04-11 17:03:11,494 INFO org.mortbay.log: jetty-6.1.26
2020-04-11 17:03:11,687 INFO org.mortbay.log: Extract jar:file:/home/admin/Downloads/hadoop-2.10.0/share/hadoop/yarn/hadoop-yarn-common-2.10.0.jar!/webapps/jobhistory to /tmp/Jetty_bigdata_19888_jobhistory____38atwj/webapp
2020-04-11 17:03:13,251 INFO org.mortbay.log: Started HttpServer2$SelectChannelConnectorWithSafeStartup@bigdata:19888
2020-04-11 17:03:13,251 INFO org.apache.hadoop.yarn.webapp.WebApps: Web app jobhistory started at 19888
2020-04-11 17:03:13,276 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue queueCapacity: 1000 scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler
2020-04-11 17:03:13,295 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 10020
2020-04-11 17:03:13,333 INFO org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl: Adding protocol org.apache.hadoop.mapreduce.v2.api.HSClientProtocolPB to the server
2020-04-11 17:03:13,336 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2020-04-11 17:03:13,355 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 10020: starting
2020-04-11 17:03:13,494 INFO org.apache.hadoop.mapreduce.v2.hs.HistoryClientService: Instantiated HistoryClientService at bigdata/192.168.0.108:10020
2020-04-11 17:03:13,511 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
2020-04-11 17:03:13,526 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 10033: starting
2020-04-11 17:03:13,577 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2020-04-11 17:03:40,856 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistory: History Cleaner started
2020-04-11 17:03:40,867 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistory: History Cleaner complete
2020-04-11 17:06:10,853 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistory: Starting scan to move intermediate done files
2020-04-11 17:09:10,853 INFO org.apache.hadoop.mapreduce.v2.hs.JobHistory: Starting scan to move intermediate done files
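The "Found 0 directories to load" lines simply mean no finished jobs have been recorded yet. Once a MapReduce job completes, its history files are moved into the "done" directory on HDFS; a rough way to inspect this, assuming the default location /tmp/hadoop-yarn/staging/history/done (controlled by mapreduce.jobhistory.done-dir; adjust the path if your configuration differs):

# list aggregated job history files after a job has finished (path is an assumption based on defaults)
hdfs dfs -ls -R /tmp/hadoop-yarn/staging/history/done | head
# and follow the history server's own log while jobs run
tail -f /home/admin/Downloads/hadoop-2.10.0/logs/mapred-root-historyserver-bigdata.log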