Building and installing the Hadoop Eclipse plugin: a summary of the problems encountered
Plugin name: hadoop2x-eclipse-plugin
Plugin repository: https://github.com/winghc/hadoop2x-eclipse-plugin
1. Download and unpack Hadoop 2.x from http://hadoop.apache.org/releases.html#Download (I downloaded the pre-built binary package).
2. Download and unpack Eclipse (mine is version 4.4.1; other versions work similarly).
3. Download the hadoop2x-eclipse-plugin source and unpack it to any directory you like; for convenience I will call it "H2EP_HOME".
4. Building the plugin requires Ant, which can be downloaded from http://ant.apache.org/bindownload.cgi
Set the ANT_HOME environment variable to the Ant unpack directory and add %ANT_HOME%\bin to the PATH environment variable (the Linux setup is similar).
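On Linux, for example, the equivalent setup might look like this (a minimal sketch; /opt/apache-ant-1.9.4 is only an assumed path, adjust it to wherever you unpacked Ant):
- export ANT_HOME=/opt/apache-ant-1.9.4    # assumed Ant unpack directory
- export PATH=$PATH:$ANT_HOME/bin
- ant -version    # should print the Ant version if the variables are set correctly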
5. Open a command prompt and change into the "H2EP_HOME" directory.
6. Run: ant jar -Dversion=2.x.x -Dhadoop.version=2.x.x -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/share/hadoop
Set eclipse.home to the Eclipse installation directory.
Set hadoop.home to the directory where Hadoop was unpacked.
Replace 2.x.x with the actual Hadoop version number.
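As a concrete illustration, assuming Hadoop 2.6.0 unpacked to /usr/local/hadoop-2.6.0 and Eclipse installed at /usr/local/eclipse (the version and both paths are only examples, substitute your own), the command from step 6 would become:
- ant jar -Dversion=2.6.0 -Dhadoop.version=2.6.0 -Declipse.home=/usr/local/eclipse -Dhadoop.home=/usr/local/hadoop-2.6.0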
7. The build hangs at the ivy-resolve-common step.
The cause is that a few dependencies cannot be resolved (their download locations have probably changed); in fact the build does not need them at all.
Solution:
Edit "H2EP_HOME"\src\contrib\eclipse-plugin\build.xml
Find:
- <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">
and remove the depends attribute so that it reads:
- <target name="compile" unless="skip.contrib">
8. Run the build command from step 6 again; it now fails with errors saying that certain jar files cannot be copied.
Solution:
Edit the "H2EP_HOME"\ivy\libraries.properties file,
changing the version numbers of the jars that are reported to the versions actually found under "HADOOP_HOME"\share\hadoop\common\lib.
Several jars may have mismatched versions, so this file may need to be edited more than once.
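For example, if the copy step fails on commons-lang-2.5.jar, check which version your Hadoop actually ships and adjust libraries.properties to match (the version numbers below are only illustrative; $HADOOP_HOME stands for the Hadoop unpack directory from step 1):
- # see which version is actually bundled with the Hadoop distribution
- ls $HADOOP_HOME/share/hadoop/common/lib | grep commons-lang
- # then edit "H2EP_HOME"/ivy/libraries.properties accordingly, e.g.
- # commons-lang.version=2.5  ->  commons-lang.version=2.6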
9. Run the build command from step 6 once more; this time it succeeds,
and the plugin hadoop-eclipse-plugin-2.x.x.jar is generated under "H2EP_HOME"\build\contrib\eclipse-plugin.
10. Copy hadoop-eclipse-plugin-2.x.x.jar into Eclipse's plugins directory and start Eclipse.
11. Open Window ===> Preferences and find the Hadoop Map/Reduce settings page.
Set the Hadoop installation directory option to point to the Hadoop unpack directory.
12. Open Window ===> Show View ===> Other, find Map/Reduce Locations, and open that view.
13. Right-click inside the Map/Reduce Locations view ===> New Hadoop location...
Nothing happens; checking the Eclipse log (workspace\.metadata\.log) reveals the error:
- java.lang.ClassNotFoundException: org.apache.commons.collections.map.UnmodifiableMap
Solution:
Edit "H2EP_HOME"\src\contrib\eclipse-plugin\build.xml
and add:
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
and add the following entry to the Bundle-ClassPath inside the <jar> tag:
- lib/commons-collections-${commons-collections.version}.jar,
14. Run eclipse.exe -clean to clear the cache (otherwise the problem from step 13 may come back) and start Eclipse.
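On Windows this is eclipse.exe -clean; on Linux the equivalent would look something like this (the Eclipse path is only an example):
- cd /usr/local/eclipse
- ./eclipse -clean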
The complete build.xml ("H2EP_HOME"\src\contrib\eclipse-plugin\build.xml) is shown below. Compared with the stock file, the compile target's depends="init, ivy-retrieve-common" has been removed, the commons-collections jar is copied into lib and listed on the Bundle-ClassPath, and the commons-lang entry uses the ${commons-lang.version} property:
- <?xml version="1.0" encoding="UTF-8" standalone="no"?>
- <!--
- Licensed to the Apache Software Foundation (ASF) under one or more
- contributor license agreements. See the NOTICE file distributed with
- this work for additional information regarding copyright ownership.
- The ASF licenses this file to You under the Apache License, Version 2.0
- (the "License"); you may not use this file except in compliance with
- the License. You may obtain a copy of the License at
- http://www.apache.org/licenses/LICENSE-2.0
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
- -->
- <project default="jar" name="eclipse-plugin">
- <import file="../build-contrib.xml"/>
- <path id="eclipse-sdk-jars">
- <fileset dir="${eclipse.home}/plugins/">
- <include name="org.eclipse.ui*.jar"/>
- <include name="org.eclipse.jdt*.jar"/>
- <include name="org.eclipse.core*.jar"/>
- <include name="org.eclipse.equinox*.jar"/>
- <include name="org.eclipse.debug*.jar"/>
- <include name="org.eclipse.osgi*.jar"/>
- <include name="org.eclipse.swt*.jar"/>
- <include name="org.eclipse.jface*.jar"/>
- <include name="org.eclipse.team.cvs.ssh2*.jar"/>
- <include name="com.jcraft.jsch*.jar"/>
- </fileset>
- </path>
- <path id="hadoop-sdk-jars">
- <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
- <include name="hadoop*.jar"/>
- </fileset>
- <fileset dir="${hadoop.home}/share/hadoop/hdfs">
- <include name="hadoop*.jar"/>
- </fileset>
- <fileset dir="${hadoop.home}/share/hadoop/common">
- <include name="hadoop*.jar"/>
- </fileset>
- </path>
- <!-- Override classpath to include Eclipse SDK jars -->
- <path id="classpath">
- <pathelement location="${build.classes}"/>
- <!--pathelement location="${hadoop.root}/build/classes"/-->
- <path refid="eclipse-sdk-jars"/>
- <path refid="hadoop-sdk-jars"/>
- </path>
- <!-- Skip building if eclipse.home is unset. -->
- <target name="check-contrib" unless="eclipse.home">
- <property name="skip.contrib" value="yes"/>
- <echo message="eclipse.home unset: skipping eclipse plugin"/>
- </target>
- <!--<target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib">-->
- <!-- depends="init, ivy-retrieve-common" removed here -->
- <target name="compile" unless="skip.contrib">
- <echo message="contrib: ${name}"/>
- <javac
- encoding="${build.encoding}"
- srcdir="${src.dir}"
- includes="**/*.java"
- destdir="${build.classes}"
- debug="${javac.debug}"
- deprecation="${javac.deprecation}">
- <classpath refid="classpath"/>
- </javac>
- </target>
- <!-- Override jar target to specify manifest -->
- <target name="jar" depends="compile" unless="skip.contrib">
- <mkdir dir="${build.dir}/lib"/>
- <copy todir="${build.dir}/lib/" verbose="true">
- <fileset dir="${hadoop.home}/share/hadoop/mapreduce">
- <include name="hadoop*.jar"/>
- </fileset>
- </copy>
- <copy todir="${build.dir}/lib/" verbose="true">
- <fileset dir="${hadoop.home}/share/hadoop/common">
- <include name="hadoop*.jar"/>
- </fileset>
- </copy>
- <copy todir="${build.dir}/lib/" verbose="true">
- <fileset dir="${hadoop.home}/share/hadoop/hdfs">
- <include name="hadoop*.jar"/>
- </fileset>
- </copy>
- <copy todir="${build.dir}/lib/" verbose="true">
- <fileset dir="${hadoop.home}/share/hadoop/yarn">
- <include name="hadoop*.jar"/>
- </fileset>
- </copy>
- <copy todir="${build.dir}/classes" verbose="true">
- <fileset dir="${root}/src/java">
- <include name="*.xml"/>
- </fileset>
- </copy>
- <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <!-- commons-collections dependency added here -->
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/>
- <jar
- jarfile="${build.dir}/hadoop-${name}-${version}.jar"
- manifest="${root}/META-INF/MANIFEST.MF">
- <manifest>
- <attribute name="Bundle-ClassPath"
- value="classes/,
- lib/hadoop-mapreduce-client-core-${hadoop.version}.jar,
- lib/hadoop-mapreduce-client-common-${hadoop.version}.jar,
- lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar,
- lib/hadoop-auth-${hadoop.version}.jar,
- lib/hadoop-common-${hadoop.version}.jar,
- lib/hadoop-hdfs-${hadoop.version}.jar,
- lib/protobuf-java-${protobuf.version}.jar,
- lib/log4j-${log4j.version}.jar,
- lib/commons-cli-1.2.jar,
- lib/commons-configuration-1.6.jar,
- lib/commons-httpclient-3.1.jar,
- lib/commons-lang-${commons-lang.version}.jar,
- lib/commons-collections-${commons-collections.version}.jar,
- lib/jackson-core-asl-1.8.8.jar,
- lib/jackson-mapper-asl-1.8.8.jar,
- lib/slf4j-log4j12-1.7.5.jar,
- lib/slf4j-api-1.7.5.jar,
- lib/guava-${guava.version}.jar,
- lib/netty-${netty.version}.jar"/>
- </manifest>
- <fileset dir="${build.dir}" includes="classes/ lib/"/>
- <!--fileset dir="${build.dir}" includes="*.xml"/-->
- <fileset dir="${root}" includes="resources/ plugin.xml"/>
- </jar>
- </target>
- </project>
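After rebuilding with the command from step 6, a quick check (a sketch, assuming the JDK's jar tool is on the PATH and "H2EP_HOME" again stands for the plugin source directory) is to confirm that the commons-collections jar really ended up inside the plugin:
- jar tf "H2EP_HOME"/build/contrib/eclipse-plugin/hadoop-eclipse-plugin-2.x.x.jar | grep commons-collections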
Reference:
http://aub.iteye.com/blog/2162155