Summary:
1. Installing Redis 3. 1. Download the stable Redis 3 release from http://download.redis.io/releases/redis-3.2.10.tar.gz 2. Upload redis-3.2.10.tar.gz to the server 3. Extract the source package: tar -zxvf redis-3.2.10. …
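The three steps in this teaser can be sketched as a short shell sequence. The URL and version are exactly as quoted above; the download and upload are shown as comments because they depend on network access and the target server.

```shell
# Redis 3 install steps from the post, as a reviewable sketch.
REDIS_URL=http://download.redis.io/releases/redis-3.2.10.tar.gz
TARBALL=${REDIS_URL##*/}    # redis-3.2.10.tar.gz

# 1. Download the stable release:
#      wget "$REDIS_URL"
# 2. Upload the tarball to the server (tool of your choice, e.g. scp).
# 3. Extract the source package:
#      tar -zxvf "$TARBALL"
echo "steps target $TARBALL"
```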
Summary:
package com.day16
import com.day15.ConnectionPoolUtil
import org.apache.spark.streaming.dstream.{DStream, ReceiverInputDStream}
import org.apache.spark.s…
Summary:
package com.day16
import kafka.common.TopicAndPartition
import kafka.message.MessageAndMetadata
import kafka.serializer.StringDecoder
import kafka.utils.{…
Summary:
Start the broker: [root@node1 kafka]# ./bin/kafka-server-start.sh -daemon config/server.properties
Create a topic: ./bin/kafka-topics.sh --create --zookeeper 192.168.23.101:218…
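The two commands above can be parameterized as below. The ZooKeeper port is cut off in the excerpt; 2181 here is only ZooKeeper's default client port, not a value recovered from the post, and the topic name is a placeholder.

```shell
# Kafka startup and topic creation, sketched from the teaser's commands.
ZK=192.168.23.101:2181      # port assumed (ZooKeeper default); the excerpt is truncated
TOPIC=mytopic               # placeholder topic name

# Start a broker in the background:
#      ./bin/kafka-server-start.sh -daemon config/server.properties
# Create the topic (ZooKeeper-based CLI, matching this Kafka generation):
#      ./bin/kafka-topics.sh --create --zookeeper "$ZK" \
#          --replication-factor 1 --partitions 1 --topic "$TOPIC"
echo "create $TOPIC via $ZK"
```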
Summary:
Without Kafka's topic-deletion property configured, after running the delete-topic command ./bin/kafka-topics.sh --delete --zookeeper 192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183 --topic test, the current top…
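For context: in this Kafka generation, with `delete.topic.enable` left at its default of false, `--delete` only marks a topic for deletion rather than removing it. A sketch of the usual fix, reusing the teaser's ZooKeeper list:

```shell
# Topic-deletion sketch; the delete.topic.enable behavior is standard Kafka.
ZK=192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183
TOPIC=test

# 1. In every broker's config/server.properties, set:
#      delete.topic.enable=true
# 2. Restart the brokers, then delete for real:
#      ./bin/kafka-topics.sh --delete --zookeeper "$ZK" --topic "$TOPIC"
echo "delete $TOPIC via $ZK"
```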
Summary:
Common commands. 1. Create a topic: ./bin/kafka-topics.sh --create --zookeeper 192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183 --replication-factor 3 --partitions…
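Alongside `--create`, the same kafka-topics.sh script supports `--list` and `--describe` against the same ZooKeeper ensemble; a sketch (the topic name is a placeholder):

```shell
# Companion kafka-topics.sh invocations (standard flags for this Kafka generation).
ZK=192.168.28.131:2181,192.168.28.131:2182,192.168.28.131:2183

# List all topics:
#      ./bin/kafka-topics.sh --list --zookeeper "$ZK"
# Show partitions, leaders, and replicas for one topic:
#      ./bin/kafka-topics.sh --describe --zookeeper "$ZK" --topic test
echo "inspect topics via $ZK"
```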
Summary:
Tools: IDEA; Scala version 2.10.6
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.0</version>
</depend…
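The dependency block above is cut off mid-tag; restoring Maven's closing tag (purely structural, everything else is as quoted) gives the usual coordinates:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.0</version>
</dependency>
```

The `_2.10` suffix in the artifactId matches the Scala 2.10.6 toolchain the post names.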
Summary:
Original post; do not repost or copy without permission.
def main(args: Array[String]): Unit = {
  val conf = new SparkConf()
  conf.set("spark.master", "local")
  conf.set("spark.app.name", "spar…
Summary:
[root@node3 ~]# cd /usr/local/kafka/
[root@node3 kafka]# ./bin/kafka-server-start.sh -daemon config/server.properties
[root@node3 kafka]# jps
2944 Quorum…
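With `-daemon` the broker prints nothing to the console, which is why the session above turns to `jps`; ZooKeeper shows up there as `QuorumPeerMain` and the broker as `Kafka` (the standard names JVM tools report). A grep-based variant of the same check:

```shell
# Verify the broker started by filtering jps output for the two JVM names.
CHECK="jps | grep -E 'Kafka|QuorumPeerMain'"
# Run on the node:
#      eval "$CHECK"
echo "verify with: $CHECK"
```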
Summary:
package sparkSql
// Method 1: create a DataFrame
import org.apache.spark.sql.{SQLContext, SaveMode}
import org.apache.spark.{SparkConf, SparkContext}
object Student { def…