Day 12 of the 2020 Winter Break Study Log

  Connecting Eclipse on Windows 10 to the Spark environment on Linux

 

Yesterday I successfully called the Hadoop environment on Linux from Eclipse on Windows; today I tackle the problem of calling the Spark environment.

 

Step 1:

Extract the Spark distribution in the Windows environment. For example, my extraction directory is D:\hadoop\spark-2.1.0-bin-without-hadoop

 

Step 2:

Configure the environment variables: add a system variable SPARK_HOME pointing to the Spark extraction directory, then append %SPARK_HOME%\bin as the last entry of the Path variable.
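The variable setup above can be sanity-checked from Java before touching Spark at all. A minimal sketch (the `SparkEnvCheck` class and `resolveSparkBin` helper are my own illustration, not from the post):

```java
import java.util.Map;

public class SparkEnvCheck {
    // Given an environment map, return the bin directory that should be on Path,
    // or null if SPARK_HOME is not set. Assumes Windows-style path separators.
    static String resolveSparkBin(Map<String, String> env) {
        String home = env.get("SPARK_HOME");
        return home == null ? null : home + "\\bin";
    }

    public static void main(String[] args) {
        // Against the real process environment this shows what ended up on Path;
        // the literal below just mirrors the extraction directory used in this post.
        System.out.println(resolveSparkBin(System.getenv()));
        System.out.println(resolveSparkBin(
                Map.of("SPARK_HOME", "D:\\hadoop\\spark-2.1.0-bin-without-hadoop")));
    }
}
```

If the first line prints null, Eclipse was likely started before the variable was set; restart it so the new environment is picked up.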


Step 3:

Test code. You need to add all the jar packages under the jars directory to the project's build path.

package Test;

import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.api.java.function.VoidFunction;

import scala.Tuple2;

public class WordCount {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("wc");
        JavaSparkContext sc = new JavaSparkContext(conf);
        
        JavaRDD<String> text = sc.textFile("hdfs://192.168.43.110:9000/user/hadoop/input/test");
        JavaRDD<String> words = text.flatMap(new FlatMapFunction<String, String>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Iterator<String> call(String line) throws Exception {
                return Arrays.asList(line.split(" ")).iterator(); // split the line into words and return an iterator
            }
        });
        
        JavaPairRDD<String, Integer> pairs = words.mapToPair(new PairFunction<String, String, Integer>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Tuple2<String, Integer> call(String word) throws Exception {
                return new Tuple2<String, Integer>(word, 1);
            }
        });
        
        JavaPairRDD<String, Integer> results = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {            
            private static final long serialVersionUID = 1L;
            @Override
            public Integer call(Integer value1, Integer value2) throws Exception {
                return value1 + value2;
            }
        });
        
        JavaPairRDD<Integer, String> temp = results.mapToPair(new PairFunction<Tuple2<String,Integer>, Integer, String>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Tuple2<Integer, String> call(Tuple2<String, Integer> tuple)
                    throws Exception {
                return new Tuple2<Integer, String>(tuple._2, tuple._1);
            }
        });
        
        JavaPairRDD<String, Integer> sorted = temp.sortByKey(false).mapToPair(new PairFunction<Tuple2<Integer,String>, String, Integer>() {
            private static final long serialVersionUID = 1L;
            @Override
            public Tuple2<String, Integer> call(Tuple2<Integer, String> tuple)
                    throws Exception {
                return new Tuple2<String, Integer>(tuple._2, tuple._1);
            }
        });
        
        sorted.foreach(new VoidFunction<Tuple2<String,Integer>>() {
            private static final long serialVersionUID = 1L;
            @Override
            public void call(Tuple2<String, Integer> tuple) throws Exception {
                System.out.println("word:" + tuple._1 + " count:" + tuple._2);
            }
        });
        
        sc.close();
    }
}
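The pipeline above can be traced without a cluster using plain Java streams; this is my own local analogy of the RDD stages, not Spark code: the stream `flatMap` plays the role of the RDD `flatMap`, `groupingBy` + `counting` stands in for `mapToPair` + `reduceByKey`, and sorting the entries by value mirrors the two swap-and-`sortByKey(false)` steps.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class LocalWordCount {
    // Count words and order the result by descending count,
    // mirroring the Spark job's flatMap -> mapToPair -> reduceByKey -> sort stages.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))               // flatMap
                .collect(Collectors.groupingBy(w -> w, Collectors.counting())) // mapToPair + reduceByKey
                .entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed()) // swap + sortByKey(false)
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue,
                        (a, b) -> a, LinkedHashMap::new));                     // keep sorted order
    }

    public static void main(String[] args) {
        count(List.of("a b a", "b a"))
                .forEach((w, c) -> System.out.println("word:" + w + " count:" + c));
    }
}
```

This is only for understanding the dataflow; the Spark version distributes each stage across partitions, while this runs on one JVM.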

Data:

Look! at the window there leans an old maid. She plucks the withered leaf from the balsam, and looks at the grass-covered rampart, on which many children are playing. What is the old maid thinking of? A whole life drama is unfolding itself before her inward gaze. "The poor little children, how happy they are- how merrily they play and romp together! What red cheeks and what angels' eyes! but they have no shoes nor stockings. They dance on the green rampart, just on the place where, according to the old story, the ground always sank in, and where a sportive, frolicsome child had been lured by means of flowers, toys and sweetmeats into an open grave ready dug for it, and which was afterwards closed over the child; and from that moment, the old story says, the ground gave way no longer, the mound remained firm and fast, and was quickly covered with the green turf. The little people who now play on that spot know nothing of the old tale, else would they fancy they heard a child crying deep below the earth, and the dewdrops on each blade of grass would be to them tears of woe. Nor do they know anything of the Danish King who here, in the face of the coming foe, took an oath before all his trembling courtiers that he would hold out with the citizens of his capital, and die here in his nest; they know nothing of the men who have fought here, or of the women who from here have drenched with boiling water the enemy, clad in white, and 'biding in the snow to surprise the city.

 

Step 4:

A problem appeared:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream


Solution:

Replace the Spark package. Because Spark needs to call Hadoop's jar packages, when using Spark on Windows you should download a build that bundles Hadoop, such as spark-3.0.0-preview2-bin-hadoop2.7. A build without Hadoop, such as spark-2.1.0-bin-without-hadoop, does not put the Hadoop classes on the classpath, which produces the error above.
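Before launching the job, you can confirm whether the missing Hadoop class is actually on the classpath. The `ClasspathCheck` helper below is my own diagnostic sketch, not part of Spark or Hadoop:

```java
public class ClasspathCheck {
    // Returns true if the named class can be loaded from the current classpath.
    static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The class the stack trace complains about: with a *-bin-without-hadoop
        // build and no extra jars this prints false; with a bundled build, true.
        System.out.println(isOnClasspath("org.apache.hadoop.fs.FSDataInputStream"));
    }
}
```

If this prints false, either switch to a Hadoop-bundled Spark package as above or add the Hadoop jars to the project's build path.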

 

posted @ 2020-02-12 19:00  生活依旧