Test access from HDP to S3
To test access to S3 from HDP, SSH to any cluster node and run a few hadoop fs shell commands against your existing S3 bucket. After connecting, switch to the hdfs user; for example:

su - hdfs
Amazon S3 access path syntax is:

s3a://bucket/dir/file
For example, to access a file called "mytestfile" in a directory called "mytestdir", which is stored in a bucket called "mytestbucket", the URL is:

s3a://mytestbucket/mytestdir/mytestfile
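The URL above is assembled mechanically from its parts. As an illustration only, the following POSIX shell helper (s3a_url is a hypothetical name, not part of HDP or Hadoop) shows how a bucket name and an optional path inside the bucket combine into an s3a URL:

```shell
# Hypothetical helper (not part of HDP): compose an s3a URL from a
# bucket name and an optional path within the bucket.
s3a_url() {
  bucket=$1
  path=$2
  if [ -n "$path" ]; then
    printf 's3a://%s/%s\n' "$bucket" "$path"
  else
    # Bucket root keeps a trailing slash, as in "hadoop fs -ls s3a://mytestbucket/"
    printf 's3a://%s/\n' "$bucket"
  fi
}

s3a_url mytestbucket mytestdir/mytestfile   # prints s3a://mytestbucket/mytestdir/mytestfile
s3a_url mytestbucket                        # prints s3a://mytestbucket/
```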
The following FileSystem shell commands demonstrate access to a bucket named “mytestbucket”:
hadoop fs -ls s3a://mytestbucket/
hadoop fs -mkdir s3a://mytestbucket/testDir
hadoop fs -put testFile s3a://mytestbucket/testFile
hadoop fs -cat s3a://mytestbucket/testFile
test file content
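The commands above can be wrapped into a small smoke-test script. This is a sketch, not part of the HDP documentation: the function name and bucket name are assumptions, and by default HADOOP is set to "echo hadoop" so the script only prints the commands it would run; set HADOOP=hadoop on a cluster node to execute them for real.

```shell
#!/bin/sh
# Sketch: run the S3 access checks from this section as one script.
# HADOOP defaults to a dry run that only echoes each hadoop fs command;
# set HADOOP=hadoop to actually execute them against your bucket.
set -e

run_s3a_smoke_test() {
  bucket=$1                                        # assumed example bucket name
  echo "test file content" > testFile              # local file to upload
  $HADOOP fs -ls "s3a://$bucket/"                  # list the bucket root
  $HADOOP fs -mkdir "s3a://$bucket/testDir"        # create a test directory
  $HADOOP fs -put testFile "s3a://$bucket/testFile" # upload the local file
  $HADOOP fs -cat "s3a://$bucket/testFile"         # read the file back
  rm -f testFile                                   # clean up the local copy
}

HADOOP=${HADOOP:-"echo hadoop"}
run_s3a_smoke_test mytestbucket
```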
For more information about configuring the S3 connector for HDP and working with data stored on S3, refer to the Cloud Data Access documentation for HDP.