hadoop - HBase map/reduce dependency issue


  1. Overview

I developed a REST API service based on the RESTEasy framework. The service stores data in an HBase database and then executes a map/reduce job when a trigger condition is met (e.g. when a record is inserted).
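A minimal sketch of that flow, assuming hypothetical names (DataResource, the "data" table, submitIndexJob) and the HBase 1.x client API; the question does not show this code:

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

@Path("/records")
public class DataResource {  // hypothetical RESTEasy resource

    private final Connection connection;  // shared HBase connection

    public DataResource(Connection connection) {
        this.connection = connection;
    }

    @POST
    public Response insert(String body) throws Exception {
        // store the record in HBase
        try (Table table = connection.getTable(TableName.valueOf("data"))) {
            Put put = new Put(Bytes.toBytes(String.valueOf(System.currentTimeMillis())));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("raw"), Bytes.toBytes(body));
            table.put(put);
        }
        // trigger condition met: kick off the map/reduce job
        submitIndexJob();  // hypothetical helper; job setup is shown below
        return Response.ok().build();
    }

    private void submitIndexJob() throws Exception {
        // see the job setup snippets in the next section
    }
}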

  2. Requirements

In the map class, I import third-party libraries. I do not want to package those libraries into the WAR file.

TableMapReduceUtil.initTableMapperJob(
        HBaseInitializer.TABLE_DATA,  // input HBase table name
        scan,                         // Scan instance to control CF and attribute selection
        LuceneMapper.class,           // mapper
        null,                         // mapper output key
        null,                         // mapper output value
        job);
FileOutputFormat.setOutputPath(job, new Path("hdfs://master:9000/qin/lucenefile"));
job.submit();
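For context, a minimal sketch of the setup that snippet assumes but the question does not show; the job name and the Scan tuning values are assumptions, and the Hadoop 2 Job.getInstance API is assumed:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = HBaseConfiguration.create();
Job job = Job.getInstance(conf, "lucene-index");  // assumed job name
job.setJarByClass(LuceneMapper.class);            // lets Hadoop locate the job jar

Scan scan = new Scan();
scan.setCaching(500);        // larger scanner caching is recommended for MR scans
scan.setCacheBlocks(false);  // don't fill the region server block cache from MR scans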
  3. Problem

If I package the libraries in the WAR file and deploy it to the Jetty container, everything works well. But if I do not package the third-party libraries in the WAR and instead upload them to HDFS and add them to the classpath, it does not work. The relevant code is below:

conf.set("fs.defaultfs","hdfs://master:9000");  filesystem hdfs = filesystem.get(conf);  path classpathfilesdir = new path("bjlibs");  filestatus[] jarfiles = hdfs.liststatus(classpathfilesdir);  (filestatus fs : jarfiles) {        path disqualified = new path(fs.getpath().touri().getpath());        distributedcache.addfiletoclasspath(disqualified, conf); } hdfs.close(); 

Try TableMapReduceUtil.addHBaseDependencyJars(), which adds the HBase dependency jars to the job configuration so they are shipped via the distributed cache.
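A sketch of how that suggestion would slot into the job setup above; addDependencyJars(job) is shown as well since it ships the jars containing the job's own classes and their dependencies. Both calls are standard TableMapReduceUtil API, but their placement here is an assumption:

TableMapReduceUtil.initTableMapperJob(
        HBaseInitializer.TABLE_DATA, scan, LuceneMapper.class, null, null, job);
// Ship the HBase jars themselves to the cluster via the distributed cache...
TableMapReduceUtil.addHBaseDependencyJars(job.getConfiguration());
// ...and the jars containing the job's configured classes (e.g. LuceneMapper
// and whatever it pulls in) from the client classpath.
TableMapReduceUtil.addDependencyJars(job);
FileOutputFormat.setOutputPath(job, new Path("hdfs://master:9000/qin/lucenefile"));
job.submit();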

