2. Backing up critical HDFS metadata

Back up the following critical data before attempting an upgrade. On the node that hosts the NameNode, open the Hadoop Command Line shortcut, which opens a command window in the Hadoop installation directory, and run the following commands. A consolidated script sketch follows the steps.

  1. Open the command prompt using the Hadoop user account and go to the HDFS home directory:

    runas /user:hdfs "cmd /K cd %HDFS_HOME%"
  2. Run the fsck command to check for any file system errors:

    hdfs fsck / -files -blocks -locations > dfs-old-fsck-1.log

    The console output is redirected to the dfs-old-fsck-1.log file.

  3. Capture the complete namespace directory tree of the file system:

    hdfs dfs -ls -R / > dfs-old-lsr-1.log
  4. Create a list of DataNodes in the cluster:

    hdfs dfsadmin -report > dfs-old-report-1.log
  5. Capture the output of the fsck command:

    hdfs fsck / -blocks -locations -files > fsck-old-report-1.log
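
The individual capture commands above can also be combined into a single batch file so the backup can be rerun in one pass. The sketch below is illustrative only: it assumes the Hadoop command window is already open under the hdfs account (as in step 1) so that the hdfs client is on the PATH, and the file name backup-hdfs-metadata.bat is hypothetical.

    @echo off
    rem Illustrative sketch only (hypothetical file name: backup-hdfs-metadata.bat).
    rem Assumes the hdfs client is on the PATH and that the logs are written to
    rem the current directory.

    rem Step 2: check the file system and save the block and location report.
    hdfs fsck / -files -blocks -locations > dfs-old-fsck-1.log

    rem Step 3: capture the complete namespace directory tree.
    hdfs dfs -ls -R / > dfs-old-lsr-1.log

    rem Step 4: record the DataNodes currently in the cluster.
    hdfs dfsadmin -report > dfs-old-report-1.log

    rem Step 5: capture the fsck output in a second log for later comparison.
    hdfs fsck / -blocks -locations -files > fsck-old-report-1.log
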
Note: Verify that there are no missing or corrupt files or replicas in the fsck command output.
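
One way to perform this verification on Windows is to scan the saved fsck logs for lines that mention corrupt or missing blocks. The findstr command below is only a sketch; it assumes the logs from the previous steps are in the current directory.

    rem List every line in the fsck logs that mentions corrupt or missing blocks.
    rem findstr matches any of the quoted words, case-insensitively (/I).
    findstr /I "corrupt missing" dfs-old-fsck-1.log fsck-old-report-1.log

Any summary counters that match should show a value of zero, and no individual file paths should be flagged. Resolve any reported problems before continuing with the upgrade.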

