Hadoop 3.2.3 pseudo-distributed setup and Spark 3.1.3 (Scala 2.12.17) configuration
Hadoop 3.2.3 installation and configuration
1. Configure the Hadoop environment variables
vi /etc/profile
export HADOOP_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
source /etc/profile
Because the daemons in this walkthrough run as root, Hadoop 3 also requires declaring the run-as users (add these to /etc/profile as well):
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
Change to the Hadoop configuration directory:
cd /home/hadoop/soft-installed/bigdata/hadoop-3.2.3/etc/hadoop
1. Configure hadoop-env.sh
vi hadoop-env.sh
export JAVA_HOME=/home/hadoop/soft-installed/language/jdk1.8.0_202
2. Configure core-site.xml
vi core-site.xml
<property>
<name>hadoop.tmp.dir</name>
<value>/home/hadoop/soft-installed/bigdata/hadoop-3.2.3/tmp</value>
</property>
<property>
<name>fs.defaultFS</name>
<value>hdfs://5yyg:9000</value>
</property>
fs.defaultFS is the current property name; the older fs.default.name still works but is deprecated. 5yyg is the hostname used throughout this walkthrough.
3. Configure hdfs-site.xml
vi hdfs-site.xml
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
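By default the NameNode and DataNode keep their data under hadoop.tmp.dir. Optionally, they can be pinned to explicit directories; the paths below are illustrative, not from the original setup:

```xml
<property>
<name>dfs.namenode.name.dir</name>
<value>/home/hadoop/soft-installed/bigdata/hadoop-3.2.3/dfs/name</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>/home/hadoop/soft-installed/bigdata/hadoop-3.2.3/dfs/data</value>
</property>
```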
4. Configure mapred-site.xml
vi mapred-site.xml
(Unlike Hadoop 2.x, Hadoop 3.x ships mapred-site.xml directly, so there is no mapred-site.xml.template to copy.)
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
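On Hadoop 3.x, MapReduce jobs running on YARN also need to know where the MapReduce framework lives. If jobs later fail with an error like "Could not find or load main class ... MRAppMaster", a commonly needed addition to mapred-site.xml (assuming the HADOOP_HOME path used above) is:

```xml
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3</value>
</property>
```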
5. Configure yarn-site.xml
vi yarn-site.xml
yarn.resourcemanager.hostname should be this machine's actual hostname (5yyg in this walkthrough):
<property>
<name>yarn.resourcemanager.hostname</name>
<value>5yyg</value>
</property>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
Configure passwordless SSH
ssh-keygen -t rsa
ssh-copy-id -i /root/.ssh/id_rsa.pub 5yyg   # 5yyg is this machine's hostname
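Under the hood, ssh-copy-id appends the public key to the target account's ~/.ssh/authorized_keys, skipping keys that are already present. A minimal local sketch of that idempotent append, with temporary files standing in for the real ~/.ssh paths:

```shell
# Demonstrate ssh-copy-id's core behaviour with throwaway files.
tmp=$(mktemp -d)
echo "ssh-rsa AAAAB3...demo root@5yyg" > "$tmp/id_rsa.pub"   # stand-in key
touch "$tmp/authorized_keys"

append_key() {
  # Append the key only if authorized_keys does not already contain it.
  grep -qxF "$(cat "$tmp/id_rsa.pub")" "$tmp/authorized_keys" \
    || cat "$tmp/id_rsa.pub" >> "$tmp/authorized_keys"
}

append_key
append_key   # second run is a no-op, so the key is not duplicated
echo "keys: $(wc -l < "$tmp/authorized_keys")"
```

Running the append twice leaves exactly one copy of the key, which is why re-running ssh-copy-id is safe.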
Format the NameNode (only on first setup; reformatting erases existing HDFS metadata)
hdfs namenode -format
(hdfs namenode -format replaces the deprecated hadoop namenode -format.)
Start HDFS and YARN
start-all.sh
Check with jps that NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager are all running; the NameNode web UI is at http://5yyg:9870 and the YARN UI at http://5yyg:8088.
Spark 3.1.3 (Scala 2.12.17) setup
Configure the Spark environment variables
vim /etc/profile
export SPARK_HOME=/home/hadoop/soft-installed/bigdata/spark-3.1.3-bin-hadoop3.2
export HADOOP_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin
source /etc/profile
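A shell variable that is unset expands to an empty string, so a typo in a variable name (e.g. $SPARK_HOMR instead of $SPARK_HOME) silently drops that entry from PATH. A quick way to sanity-check the result after sourcing the profile:

```shell
# Build the PATH entries and list the Spark ones to verify both made it in.
SPARK_HOME=/home/hadoop/soft-installed/bigdata/spark-3.1.3-bin-hadoop3.2
HADOOP_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3
PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SPARK_HOME/bin:$SPARK_HOME/sbin
echo "$PATH" | tr ':' '\n' | grep spark
```

Both the bin and sbin Spark directories should appear in the output.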
Configure workers
cp workers.template workers
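Copying the template is not enough on its own: workers should list the hosts that run executors, one per line, which for a pseudo-distributed setup is just the local hostname (5yyg here). Sketched against a temporary file (the real file lives in $SPARK_HOME/conf/workers):

```shell
# Write the single-node workers file (temporary path for illustration).
tmp=$(mktemp -d)
echo "5yyg" > "$tmp/workers"
cat "$tmp/workers"
```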
Configure spark-env.sh
cp spark-env.sh.template spark-env.sh
vim spark-env.sh
export JAVA_HOME=/home/hadoop/soft-installed/language/jdk1.8.0_202
export HADOOP_HOME=/home/hadoop/soft-installed/bigdata/hadoop-3.2.3
export SCALA_HOME=/home/hadoop/soft-installed/language/scala-2.12.17
export SPARK_HOME=/home/hadoop/soft-installed/bigdata/spark-3.1.3-bin-hadoop3.2
export SPARK_MASTER_HOST=5yyg
export SPARK_MASTER_PORT=7077
SPARK_MASTER_HOST replaces the deprecated SPARK_MASTER_IP. Start the standalone master and worker with $SPARK_HOME/sbin/start-all.sh (use the full path so it does not collide with Hadoop's start-all.sh); the master web UI is at http://5yyg:8080.