Question-12: You are replicating HDFS files from a source CDP cluster to a destination CDP cluster. However, there are thousands of files and subdirectories, and you need to increase the memory available to the replication job. How would you do this?
- You need to increase the heap size in the hadoop-env.sh file.
- To increase the heap size, add the key-value pair HADOOP_CLIENT_OPTS=-Xmx<memory_value>
- To increase the heap size, add the key-value pair MAPRED_DISTCP_OPTS="-Xmx<memory_value>"
- To increase the heap size, add the key-value pair HADOOP_USER_PARAMS=-Xmx<memory_value>
Answer: Options 1 and 2
- Get All Questions & Answer for CDP Generalist Exam (CDP-0011) and trainings.
- Get All Questions & Answer for CDP Administrator - Private Cloud Base Exam CDP-2001 and trainings.
- Get All Questions & Answer for CDP Data Developer Exam CDP-3001 and trainings.
This Question is from QuickTechie Cloudera CDP Certification Preparation Kit.
Steps
- On the destination Cloudera Manager instance, go to the HDFS service page.
- Click the Configuration tab.
- Expand SCOPE and select the HDFS service name (Service-Wide) option.
- Expand CATEGORY and select the Advanced option.
- Locate the HDFS Replication Environment Advanced Configuration Snippet (Safety Valve) for hadoop-env.sh property.
- To increase the heap size, add the key-value pair HADOOP_CLIENT_OPTS=-Xmx<memory_value>. For example, if you enter HADOOP_CLIENT_OPTS=-Xmx1g, the heap size is set to 1 GB. This value should be adjusted depending on the number of files and directories being replicated.
- Enter a Reason for change, and then click Save Changes to commit the changes.
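The safety-valve entry from the steps above can be sketched as the following hadoop-env.sh fragment. The 4 GB figure is an illustrative assumption, not an official Cloudera recommendation; size it according to the number of files and directories being replicated:

```shell
# Contents of the "HDFS Replication Environment Advanced Configuration
# Snippet (Safety Valve) for hadoop-env.sh" property -- one key-value
# pair per line. The JVM's -Xmx flag caps the maximum heap size.
# 4g is an example value only; tune it to your replication workload.
HADOOP_CLIENT_OPTS=-Xmx4g
```

After saving the change, subsequent replication jobs pick up the larger client heap; no cluster restart is required for this client-side setting.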