Monday, August 31, 2015

Help 4 HDP - Old

  1. caused by: unrecognized locktype: native (solr)
    1. vim /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml
    2. search for lockType
    3. set it to hdfs (see the snippet below)
    4. re-upload the updated config to zookeeper: /opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh -zkhost localhost:2181 -cmd upconfig -confname myCollConfigs -confdir /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf
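    a sketch of what the edited element in solrconfig.xml should end up as (typically inside the indexConfig section; surrounding lines vary by solr version):
      <lockType>hdfs</lockType>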
  2. caused by: direct buffer memory (solr)
    1. vim /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf/solrconfig.xml
    2. search for solr.hdfs.blockcache.direct.memory.allocation
    3. set it to false (see the snippet below)
    4. re-upload the updated config to zookeeper: /opt/lucidworks-hdpsearch/solr/server/scripts/cloud-scripts/zkcli.sh -zkhost localhost:2181 -cmd upconfig -confname myCollConfigs -confdir /opt/lucidworks-hdpsearch/solr/server/solr/configsets/data_driven_schema_configs/conf
    5. restart solr
    6. or skip straight to the fix for 'caused by: java heap space (solr)' below
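    a sketch of the edited line inside the HdfsDirectoryFactory block of solrconfig.xml (attribute name taken from the steps above; neighboring settings vary by install):
      <bool name="solr.hdfs.blockcache.direct.memory.allocation">false</bool>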
  3. caused by: java heap space (solr)
    1. vim /opt/lucidworks-hdpsearch/solr/bin/solr.in.sh
    2. search for SOLR_HEAP
    3. increase it (see the example below)
    4. restart solr
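    for example, the edited line in solr.in.sh might read as follows (2g is an illustrative value; the shipped default is typically 512m):
      SOLR_HEAP="2g"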
  4. error: not found: value StructType (spark)
    1. import org.apache.spark.sql.types._
    2. note that 'import org.apache.spark.sql.types._' is still required even when 'import org.apache.spark.sql._' is already in scope; the wildcard does not pull in the types subpackage (see the example below)
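    a minimal spark-shell example (the schema fields are made up for illustration):
      import org.apache.spark.sql.types._

      // builds a schema; fails with 'not found: value StructType' without the import above
      val schema = StructType(Array(
        StructField("id", IntegerType, nullable = false),
        StructField("name", StringType, nullable = true)))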
  5. no response from the namenode UI / port 50070 is bound to a private IP (hadoop)
    1. ambari web -> HDFS -> configs -> custom hdfs-site -> add property
      1. key: dfs.namenode.http-bind-host
      2. value: 0.0.0.0
    2. save it and restart related services
    3. note that the related properties 'dfs.namenode.rpc-bind-host', 'dfs.namenode.servicerpc-bind-host' and 'dfs.namenode.https-bind-host' solve the same issue for the other namenode endpoints
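    if you maintain configs by hand rather than through ambari, the equivalent hdfs-site.xml entry looks like this:
      <property>
        <name>dfs.namenode.http-bind-host</name>
        <value>0.0.0.0</value>
      </property>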
  6. root is not allowed to impersonate <username> (hadoop)
    1. ambari web -> HDFS -> configs -> custom core-site -> add property
      1. key: hadoop.proxyuser.root.groups
      2. value: *
      3. key: hadoop.proxyuser.root.hosts
      4. value: *
    2. save it and restart related services
    3. note that 'root' in the property names should be replaced with the user that actually runs/submits the service/job
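    outside ambari, the same fix in core-site.xml looks like this (replace 'root' as noted above):
      <property>
        <name>hadoop.proxyuser.root.groups</name>
        <value>*</value>
      </property>
      <property>
        <name>hadoop.proxyuser.root.hosts</name>
        <value>*</value>
      </property>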
  7. option sql_select_limit=default (ambari)
    1. use the latest jdbc driver (older connector/j versions send 'SET OPTION SQL_SELECT_LIMIT=DEFAULT', a statement mysql 5.6+ rejects)
      1. cd /usr/share
      2. mkdir java
      3. cd java
      4. wget http://cdn.mysql.com/Downloads/Connector-J/mysql-connector-java-5.1.36.zip
      5. unzip mysql-connector-java-5.1.36.zip
      6. cp mysql-connector-java-5.1.36/mysql-connector-java-5.1.36-bin.jar .
      7. ln -s mysql-connector-java-5.1.36-bin.jar mysql-connector-java.jar
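    depending on the ambari version, you may also need to register the driver explicitly and restart:
      ambari-server setup --jdbc-db=mysql --jdbc-driver=/usr/share/java/mysql-connector-java.jar
      ambari-server restart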
