REST - WebHDFS two-step file upload
I built a Hadoop cluster with 4 machines ({hostname}: {ip-address}):
- master: 192.168.1.60
- slave1: 192.168.1.61
- slave2: 192.168.1.62
- slave3: 192.168.1.63
I use the WebHDFS REST API to upload a file to HDFS; the task takes 2 steps to finish.
Step 1: submit an HTTP POST request without automatically following redirects and without sending the file data.
curl -i -X POST "http://192.168.1.60:50070/webhdfs/v1/user/haduser/myfile.txt?op=APPEND"
The server returns a response like this:
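(Illustrative sketch of the step-1 response, assuming the NameNode redirects to slave1 and the datanode uses the Hadoop 2.x default HTTP port 50075; the real Location header depends on which datanode HDFS chooses and may carry extra query parameters.)

HTTP/1.1 307 TEMPORARY_REDIRECT
Location: http://slave1:50075/webhdfs/v1/user/haduser/myfile.txt?op=APPEND
Content-Length: 0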
Step 2: use the address from the response to upload the file.
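A minimal sketch of step 2, assuming the Location header from step 1 pointed at slave1:50075 (substitute whatever address your NameNode actually returned); the local file is sent with curl's -T option:

curl -i -X POST -T myfile.txt "http://slave1:50075/webhdfs/v1/user/haduser/myfile.txt?op=APPEND"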
In step 1, how can I get the datanode's IP address (192.168.1.61) in the redirect rather than its hostname (slave1)?
If your Hadoop version is >= 2.5, edit the ${HADOOP_HOME}/etc/hadoop/hdfs-site.xml file on every datanode and add the property dfs.datanode.hostname, with that datanode's IP address as the value.
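A minimal hdfs-site.xml sketch for one datanode, assuming it is slave1 (each datanode gets its own IP as the value, and the datanode needs a restart for the change to take effect):

<property>
  <!-- advertise this address instead of the machine's resolved hostname -->
  <name>dfs.datanode.hostname</name>
  <value>192.168.1.61</value>
</property>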