
链接:https://pan.baidu.com/s/1-B2iyb-l0poGTgIZRj24VA?pwd=8h77
提取码:8h77
Note: if the environment variables do not take effect, try restarting the computer.
To verify that the Hadoop environment variables work, double-click winutils.exe. If the error below appears, the Microsoft runtime library is missing (genuine Windows installations often have this problem).
The resource package below contains the Microsoft runtime installer; just double-click it to install.
链接:https://pan.baidu.com/s/152Z3eodwLnZsKshKhNmcxg?pwd=ibfg
提取码:ibfg
Click Next.
Click Finish.
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>3.1.3</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.30</version>
    </dependency>
</dependencies>
If the dependencies show up in red, give Maven a moment to download them; if there is no progress bar in the bottom-right corner, restarting IDEA can help.
In the project's src/main/resources directory, create a new file named "log4j.properties" and fill it with:
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n
7.1.5 Create the package: com.summer.hdfs
7.1.6 Create the HdfsClient class
package com.summer.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsClient {

    @Test
    public void testMkdir() throws URISyntaxException, IOException, InterruptedException {
        // NameNode address of the cluster to connect to
        URI uri = new URI("hdfs://hadoop102:8020");
        // Create a configuration object
        Configuration configuration = new Configuration();
        // User to access HDFS as
        String user = "summer";
        // Obtain the client object
        FileSystem fs = FileSystem.get(uri, configuration, user);
        // Create a directory
        fs.mkdirs(new Path("/xiyou/huaguoshan"));
        // Release resources
        fs.close();
    }
}
7.1.6.1 Encapsulated version:
package com.summer.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsClient {

    private FileSystem fs;

    @Before
    public void init() throws URISyntaxException, IOException, InterruptedException {
        // NameNode address of the cluster to connect to
        URI uri = new URI("hdfs://hadoop102:8020");
        // Create a configuration object
        Configuration configuration = new Configuration();
        // User to access HDFS as
        String user = "summer";
        // Obtain the client object
        fs = FileSystem.get(uri, configuration, user);
    }

    @After
    public void close() throws IOException {
        // Release resources
        fs.close();
    }

    @Test
    public void testMkdir() throws URISyntaxException, IOException, InterruptedException {
        // Create a directory
        fs.mkdirs(new Path("/xiyou/huaguoshan1"));
    }
}
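For context on the refactor above, here is a minimal, Hadoop-free sketch (class and method names are made up for illustration): JUnit 4 runs the @Before method before every @Test and the @After method after it, so the FileSystem handle is opened and closed around each test method. The guaranteed ordering looks like this:

```java
public class LifecycleSketch {
    private static final StringBuilder trace = new StringBuilder();

    static void init()  { trace.append("init;"); }   // stands in for the @Before method
    static void test()  { trace.append("test;"); }   // stands in for a @Test method
    static void close() { trace.append("close;"); }  // stands in for the @After method

    public static void main(String[] args) {
        // JUnit 4 invokes the three phases in exactly this order for each test
        init();
        test();
        close();
        System.out.println(trace);
    }
}
```

This is why moving FileSystem.get into init() is safe: every test still gets a fresh, properly closed client without repeating the boilerplate.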
7.1.7 Run the program
When a client operates on HDFS, it does so under a user identity. By default, the HDFS client API accesses HDFS as the Windows login user, which triggers a permission exception. So a user must always be configured when accessing HDFS. For example, the following version omits the user:
package com.summer.hdfs;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsClient {

    private FileSystem fs;

    @Before
    public void init() throws URISyntaxException, IOException, InterruptedException {
        // NameNode address of the cluster to connect to
        URI uri = new URI("hdfs://hadoop102:8020");
        // Create a configuration object
        Configuration configuration = new Configuration();
        // Obtain the client object (no user specified, so the Windows login user is used)
        fs = FileSystem.get(uri, configuration);
    }

    @After
    public void close() throws IOException {
        // Release resources
        fs.close();
    }

    @Test
    public void testMkdir() throws URISyntaxException, IOException, InterruptedException {
        // Create a directory
        fs.mkdirs(new Path("/xiyou/huaguoshan1"));
    }
}
Running the test now fails with a permission error:
org.apache.hadoop.security.AccessControlException: Permission denied: user=73631, access=WRITE, inode="/xiyou":summer:supergroup:drwxr-xr-x
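Besides passing the user to FileSystem.get as shown earlier, the Hadoop client also honors the HADOOP_USER_NAME environment variable / system property. A sketch (the class name here is made up; set the property before the FileSystem is created, or pass -DHADOOP_USER_NAME=summer as a JVM option):

```java
public class HadoopUserConfig {
    public static void main(String[] args) {
        // When set, HADOOP_USER_NAME tells the HDFS client which user to act as,
        // overriding the Windows login user (e.g. the "73631" in the error above).
        System.setProperty("HADOOP_USER_NAME", "summer");
        System.out.println(System.getProperty("HADOOP_USER_NAME"));
    }
}
```

Note this only changes the identity the client reports under Hadoop's simple authentication; on a kerberized cluster the user comes from the Kerberos credentials instead.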