[root@master software]# spark-submit test1.py
/home/software/spark-2.3.4-bin-hadoop2.7/conf/spark-env.sh: line 2: /usr/local/hadoop/bin/hadoop: No such file or directory
2027-11-29 14:12:36 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2027-11-29 14:12:36 INFO SparkContext:54 - Running Spark version 2.3.4
2027-11-29 14:12:36 INFO SparkContext:54 - Submitted application: first demo
2027-11-29 14:12:36 INFO SecurityManager:54 - Changing view acls to: root
2027-11-29 14:12:36 INFO SecurityManager:54 - Changing modify acls to: root
2027-11-29 14:12:36 INFO SecurityManager:54 - Changing view acls groups to:
2027-11-29 14:12:36 INFO SecurityManager:54 - Changing modify acls groups to:
2027-11-29 14:12:36 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2027-11-29 14:12:36 INFO Utils:54 - Successfully started service 'sparkDriver' on port 33641.
2027-11-29 14:12:36 INFO SparkEnv:54 - Registering MapOutputTracker
2027-11-29 14:12:36 INFO SparkEnv:54 - Registering BlockManagerMaster
2027-11-29 14:12:36 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2027-11-29 14:12:36 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2027-11-29 14:12:36 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-7657f410-212b-4b1a-acc2-062d3ff54c8c
2027-11-29 14:12:36 INFO MemoryStore:54 - MemoryStore started with capacity 413.9 MB
2027-11-29 14:12:36 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2027-11-29 14:12:36 INFO log:192 - Logging initialized @2082ms
2027-11-29 14:12:37 INFO Server:351 - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
2027-11-29 14:12:37 INFO Server:419 - Started @2165ms
2027-11-29 14:12:37 INFO AbstractConnector:278 - Started ServerConnector@75afaf3{HTTP/1.1,[http/1.1]}{192.168.128.78:4040}
2027-11-29 14:12:37 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@11679a7f{/jobs,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@37457949{/jobs/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e200e09{/jobs/job,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@12e3ada1{/jobs/job/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@e701185{/stages,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@ae0604{/stages/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@12b969be{/stages/stage,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@63242cbd{/stages/stage/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3a094dfe{/stages/pool,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@65e94d7a{/stages/pool/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@22530a07{/storage,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@60509b34{/storage/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@10c65c7d{/storage/rdd,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3f6b4182{/storage/rdd/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@552775a{/environment,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@61bfc5ff{/environment/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@43b85801{/executors,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@26ba66a5{/executors/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@30c42d8f{/executors/threadDump,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2fcb9655{/executors/threadDump/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2e32bbfd{/static,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@748e1056{/,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4eeb70d2{/api,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@541a12bb{/jobs/job/kill,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7ec063{/stages/stage/kill,null,AVAILABLE,@Spark}
2027-11-29 14:12:37 INFO SparkUI:54 - Bound SparkUI to 192.168.128.78, and started at http://localhost:4040
2027-11-29 14:12:37 INFO SparkContext:54 - Added file file:/home/software/test1.py at file:/home/software/test1.py with timestamp 1827468757549
2027-11-29 14:12:37 INFO Utils:54 - Copying /home/software/test1.py to /tmp/spark-7c52bbd6-d752-4c4d-ab95-42cdd8c140f3/userFiles-f692857b-82b9-434c-b5bc-b5d1cf9f8bfe/test1.py
2027-11-29 14:12:37 INFO Executor:54 - Starting executor ID driver on host localhost
2027-11-29 14:12:37 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43340.
2027-11-29 14:12:37 INFO NettyBlockTransferService:54 - Server created on localhost:43340
2027-11-29 14:12:37 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2027-11-29 14:12:37 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, localhost, 43340, None)
2027-11-29 14:12:37 INFO BlockManagerMasterEndpoint:54 - Registering block manager localhost:43340 with 413.9 MB RAM, BlockManagerId(driver, localhost, 43340, None)
2027-11-29 14:12:37 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, localhost, 43340, None)
2027-11-29 14:12:37 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, localhost, 43340, None)
2027-11-29 14:12:37 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3117e40b{/metrics/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/home/software/spark-warehouse/').
2027-11-29 14:12:38 INFO SharedState:54 - Warehouse path is 'file:/home/software/spark-warehouse/'.
2027-11-29 14:12:38 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6117631{/SQL,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4aaadc64{/SQL/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2155efd1{/SQL/execution,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@787abb40{/SQL/execution/json,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@2d866e5f{/static/sql,null,AVAILABLE,@Spark}
2027-11-29 14:12:38 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
root
 |-- id: integer (nullable = true)
 |-- name: string (nullable = true)
 |-- age: integer (nullable = true)

2027-11-29 14:12:41 INFO CodeGenerator:54 - Code generated in 397.276779 ms
2027-11-29 14:12:41 INFO CodeGenerator:54 - Code generated in 37.30301 ms
2027-11-29 14:12:42 INFO SparkContext:54 - Starting job: showString at NativeMethodAccessorImpl.java:0
2027-11-29 14:12:42 INFO DAGScheduler:54 - Got job 0 (showString at NativeMethodAccessorImpl.java:0) with 1 output partitions
2027-11-29 14:12:42 INFO DAGScheduler:54 - Final stage: ResultStage 0 (showString at NativeMethodAccessorImpl.java:0)
2027-11-29 14:12:42 INFO DAGScheduler:54 - Parents of final stage: List()
2027-11-29 14:12:42 INFO DAGScheduler:54 - Missing parents: List()
2027-11-29 14:12:42 INFO DAGScheduler:54 - Submitting ResultStage 0 (MapPartitionsRDD[3] at showString at NativeMethodAccessorImpl.java:0), which has no missing parents
2027-11-29 14:12:42 INFO MemoryStore:54 - Block broadcast_0 stored as values in memory (estimated size 8.8 KB, free 413.9 MB)
2027-11-29 14:12:42 INFO MemoryStore:54 - Block broadcast_0_piece0 stored as bytes in memory (estimated size 4.4 KB, free 413.9 MB)
2027-11-29 14:12:42 INFO BlockManagerInfo:54 - Added broadcast_0_piece0 in memory on localhost:43340 (size: 4.4 KB, free: 413.9 MB)
2027-11-29 14:12:42 INFO SparkContext:54 - Created broadcast 0 from broadcast at DAGScheduler.scala:1039
2027-11-29 14:12:42 INFO DAGScheduler:54 - Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[3] at showString at NativeMethodAccessorImpl.java:0) (first 15 tasks are for partitions Vector(0))
2027-11-29 14:12:42 INFO TaskSchedulerImpl:54 - Adding task set 0.0 with 1 tasks
2027-11-29 14:12:42 INFO TaskSetManager:54 - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7677 bytes)
2027-11-29 14:12:42 INFO Executor:54 - Running task 0.0 in stage 0.0 (TID 0)
2027-11-29 14:12:42 INFO Executor:54 - Fetching file:/home/software/test1.py with timestamp 1827468757549
2027-11-29 14:12:42 INFO Utils:54 - /home/software/test1.py has been previously copied to /tmp/spark-7c52bbd6-d752-4c4d-ab95-42cdd8c140f3/userFiles-f692857b-82b9-434c-b5bc-b5d1cf9f8bfe/test1.py
2027-11-29 14:12:42 INFO JDBCRDD:54 - closed connection
2027-11-29 14:12:42 INFO Executor:54 - Finished task 0.0 in stage 0.0 (TID 0). 1234 bytes result sent to driver
2027-11-29 14:12:42 INFO TaskSetManager:54 - Finished task 0.0 in stage 0.0 (TID 0) in 255 ms on localhost (executor driver) (1/1)
2027-11-29 14:12:42 INFO DAGScheduler:54 - ResultStage 0 (showString at NativeMethodAccessorImpl.java:0) finished in 0.566 s
2027-11-29 14:12:42 INFO TaskSchedulerImpl:54 - Removed TaskSet 0.0, whose tasks have all completed, from pool
2027-11-29 14:12:42 INFO DAGScheduler:54 - Job 0 finished: showString at NativeMethodAccessorImpl.java:0, took 0.641017 s
+---+----+---+
| id|name|age|
+---+----+---+
|  1| 狗蛋| 25|
+---+----+---+

2027-11-29 14:12:42 INFO SparkContext:54 - Invoking stop() from shutdown hook
2027-11-29 14:12:42 INFO AbstractConnector:318 - Stopped Spark@75afaf3{HTTP/1.1,[http/1.1]}{192.168.128.78:4040}
2027-11-29 14:12:42 INFO SparkUI:54 - Stopped Spark web UI at http://localhost:4040
2027-11-29 14:12:42 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2027-11-29 14:12:42 INFO MemoryStore:54 - MemoryStore cleared
2027-11-29 14:12:42 INFO BlockManager:54 - BlockManager stopped
2027-11-29 14:12:42 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2027-11-29 14:12:42 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2027-11-29 14:12:42 INFO SparkContext:54 - Successfully stopped SparkContext
2027-11-29 14:12:42 INFO ShutdownHookManager:54 - Shutdown hook called
2027-11-29 14:12:42 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-7c52bbd6-d752-4c4d-ab95-42cdd8c140f3/pyspark-cf837efb-447f-41bf-8250-f6f6dbc7c6a0
2027-11-29 14:12:42 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-7c52bbd6-d752-4c4d-ab95-42cdd8c140f3
2027-11-29 14:12:42 INFO ShutdownHookManager:54 - Deleting directory /tmp/spark-05f73984-6870-498b-9084-1c2af8a400a5