You are trying to copy a file from your local filesystem into HDFS, but a basic copy command fails with this error: copyFromLocal: `hdfs://localhost:54310/user/...': No such file or directory
How do you copy a file from your local filesystem into HDFS?
Do one of the following:
Option 1. Create the missing destination directory, then rerun your copy command (substitute "jdoe" with the name of your user):
hdfs dfs -mkdir -p /user/jdoe/contint
# Now repeat your copy command
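For reference, the whole Option 1 sequence looks like this (the "jdoe" user name, the "contint" directory, and "/path/to/foo.bar" are examples; adjust them to your setup):

hdfs dfs -mkdir -p /user/jdoe/contint
# Copy the local file into the directory you just created
hdfs dfs -copyFromLocal /path/to/foo.bar hdfs://localhost:54310/user/jdoe/contint/
# Verify the file arrived
hdfs dfs -ls /user/jdoe/contint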
Option 2. Copy into a directory you know exists, keeping the original file name (substitute "/path/to/" with the directory containing the source file and "foo.bar" with the name of the file you want copied):
hdfs dfs -copyFromLocal /path/to/foo.bar hdfs://localhost:54310/user/root/
The -copyFromLocal command does not behave like the Linux cp command, where you can rename the file by giving it a new name in the destination path. Leave the file name out of the destination argument; the copied file keeps its source name.