# Copyright 2012, Hortonworks Inc. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

RELEASE NOTES: Hortonworks Data Platform with Hortonworks Management Console powered by Apache Hadoop

Product Version: HDP-1.1.1.16
============================================

This release of Hortonworks Data Platform (HDP) deploys the following Hadoop-related components:

* Apache Hadoop 1.0.3
* Apache HBase 0.92.1
* Apache Pig 0.9.2
* Apache ZooKeeper 3.3.4
* Apache HCatalog 0.4.0
* Apache Hive 0.9.0
* Templeton 0.1.4
* Apache Oozie 3.1.3
* Apache Sqoop 1.4.2
* Hortonworks Management Center (HMC) 1.0.2
* Apache Flume 1.2.0
* HA-Monitor 0.1.1

Third party components:

* Talend Open Studio for Big Data 5.1.1
* Ganglia 3.2.0
* Nagios 3.2.3

Patch Information
==============================================

Patch Information for HDP-1.1.1.16

Bug Fixes:
--------------------------------------------
* Hive is patched to include:
  - HIVE-2928: Support for Oracle-backed Hive-Metastore ("longvarchar" to "clob" in package.jdo)
  - HIVE-3082: Oracle Metastore schema script doesn't include DDL for DN internal tables
* HCatalog is patched to include:
  - HCATALOG-485: Document that storage-based security ignores GRANT/REVOKE statements
  - HCATALOG-431: Document hcat type to java class/pig type mapping
  - HCATALOG-492: Document CTAS workaround for Hive with JSON serde
  - HCATALOG-442: Documentation needs update for using HCatalog with Pig
  - HCATALOG-482: Document -libjars from HDFS for HCat with MapReduce
  - HCATALOG-481: Fix CLI usage syntax in doc & revise HCat docset
  - HCATALOG-444: Document reader & writer interfaces
  - HCATALOG-427: Document storage-based authorization

Patch Information for HDP-1.1.0.15

Bug Fixes:
--------------------------------------------
* Hadoop is patched to include the following:
  - High Availability (HA) enhancements: HDFS-3522, HDFS-3521, HDFS-1108, HDFS-3551, HDFS-528, HDFS-3667, HDFS-3516, HDFS-3696, HDFS-3658, MAPREDUCE-4328, MAPREDUCE-3837, MAPREDUCE-4603, and HADOOP-8656.
  - Performance improvements: HDFS-2465, HDFS-2751, HDFS-496, MAPREDUCE-782, MAPREDUCE-1906, MAPREDUCE-4399, MAPREDUCE-4400, MAPREDUCE-3289, MAPREDUCE-3278, HADOOP-7753, and HADOOP-8617.
  - Improvements and bug fixes: HADOOP-8832, HADOOP-3963, HDFS-3596, HADOOP-6995, HDFS-3846, and MAPREDUCE-4558.
* HBase is patched to include the following: HBASE-6447, HBASE-6450, HBASE-6334, HBASE-4470, HBASE-6460, HBASE-6552, HBASE-6512, HBASE-6308, HBASE-6576, HBASE-6565, HBASE-6538, HBASE-6608, HBASE-6503, HBASE-5714, HBASE-6631, and HBASE-6632.
* Hive is patched to include HIVE-3008, HIVE-3063, HIVE-3076, HIVE-3168, HIVE-3246, HIVE-3153, HIVE-3291, and HIVE-3098.
* Oozie is patched to include OOZIE-698, OOZIE-697, OOZIE-810, and OOZIE-863.
* Sqoop is patched to include SQOOP-578, SQOOP-579, SQOOP-580, SQOOP-582, and SQOOP-462.
* HCatalog is patched to include HCATALOG-448, HCATALOG-350, HCATALOG-436, HCATALOG-471, and HCATALOG-464.
* Pig is patched to include PIG-2766.
* Ambari is patched to include AMBARI-664, AMBARI-641, AMBARI-628, AMBARI-633, and AMBARI-597.

Minimum system requirements
==============================================

Hardware Recommendations
---------------------------
Although there is no single hardware requirement for installing HDP, there are some basic guidelines. You can see sample setups here:
http://docs.hortonworks.com/CURRENT/About_Hortonworks_Data_Platform/Hardware_Recommendations_For_Apache_Hadoop.htm

Operating System Requirements
------------------------------------
The following operating systems are supported:

* 64-bit Red Hat Enterprise Linux (RHEL) v5.*, v6.*
* 64-bit CentOS v5.*, v6.*

IMPORTANT: All hosts in the cluster must run the same OS, version, and patch sets.

Graphics Requirements
------------------------
The HMC deployment wizard runs as a browser-based Web app. You must have a machine capable of running a graphical browser to use this tool.

Software Requirements
-----------------------
On each of your hosts:

* yum
* rpm
* scp
* curl
* wget
* pdsh

On the machine from which you will run HMC:

* Firefox v.12+

Database Requirements
-----------------------
Hive and HCatalog require a MySQL database. You can either use an existing instance or let the HMC deployment wizard create one for you.

Optional: Configure the local repositories
------------------------------------------
If your cluster does not have access to the Internet, or if you are creating a large cluster and want to conserve bandwidth, you need to provide access to the HDP installation packages using an alternative method. For more information, see:
http://docs.hortonworks.com/CURRENT/Appendix/Deploying_HDP_In_Production_Data_Centers_with_Firewalls/Deploying_HDP_In_Production_Data_Centers.htm

IMPORTANT: The installer pulls many packages from the base OS repos. If you do not have a complete base OS available to all your machines at the time of installation, you may run into issues. For example, if you are using RHEL 6, your hosts must be able to access the "Red Hat Enterprise Linux Server 6 Optional (RPMs)" repo. If this repo is disabled, the installation cannot access the rubygems package, which is necessary for HMC to operate. If you encounter problems with base OS repos being unavailable, please contact your system administrator to arrange for these additional repos to be proxied or mirrored.

Improvements
=============================================
* Fixed incorrect host mappings for Hive that caused Hive smoke tests to fail.
* Templeton updated to upstream version 0.1.4.
* HA-Monitor updated to upstream version 1.1.0.
* Fixed HDFS log corruption when the disk gets full.
* Added support for pluggable components. This feature enables exporting DFS functionality over arbitrary protocols.
* Added support to enable service plugins for the JobTracker (see the sketch after this list).
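For reference, the sketch below illustrates what such a plugin can look like against the org.apache.hadoop.util.ServicePlugin interface shipped with Apache Hadoop 1.x. It is a minimal illustration only: the class name LoggingJobTrackerPlugin is hypothetical, and the mapreduce.jobtracker.plugins configuration key mentioned in the comments is an assumption that should be verified against your HDP configuration.

    // Minimal sketch of a JobTracker service plugin, assuming the Hadoop 1.x
    // org.apache.hadoop.util.ServicePlugin interface. The compiled class would
    // need to be on the JobTracker classpath and registered via a plugins
    // property (assumed here to be "mapreduce.jobtracker.plugins") in
    // mapred-site.xml; verify the exact key for your deployment.
    import java.io.IOException;
    import org.apache.hadoop.util.ServicePlugin;

    public class LoggingJobTrackerPlugin implements ServicePlugin {

      @Override
      public void start(Object service) {
        // The framework hands in the hosting service instance (here, the JobTracker).
        System.out.println("Plugin started for " + service.getClass().getName());
      }

      @Override
      public void stop() {
        System.out.println("Plugin stopped");
      }

      @Override
      public void close() throws IOException {
        // ServicePlugin extends Closeable; release any held resources here.
        stop();
      }
    }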
Known issues
=============================================
* In Hive, an ALTER INDEX command fails when it is run from an automated script that also contains the corresponding CREATE INDEX command. The workaround is to either run the ALTER INDEX command in an interactive shell or place it in a separate script file.
* Hive and HCatalog authorizations are based on permissions in the underlying storage system and so are not affected by account-management DDL statements such as GRANT and REVOKE. For details, see: http://docs.hortonworks.com/HCatalog/CURRENT/authorization.html
* The mount point directory preview displays the Oozie and ZooKeeper directories even when the corresponding services are not enabled. For details, see: https://issues.apache.org/jira/browse/AMBARI-572
* While finalizing the bootstrap nodes for HMC, the status update may in some cases display an incorrect message.
* HMC installation currently does not support Hadoop security.
* Using init.d scripts to start or stop Hadoop services is not recommended.
* To use the Oozie command line client, you must first export JAVA_HOME.
* Pig and MapReduce jobs read incorrect data for the binary data type from HCatalog tables. For more details, see: https://issues.apache.org/jira/browse/HCATALOG-430
* For gsInstaller-based deployments, Templeton's job submission APIs (beta) do not work in secure mode.

Hortonworks offers technical support subscriptions for Hortonworks Data Platform. For more information, please visit http://hortonworks.com/support/ or contact us directly at info@hortonworks.com.