Hadoop - HBase MapReduce dependency issue
- Overview
I developed a REST API service based on the RESTEasy framework. The service stores data in an HBase database and then launches a MapReduce job when a trigger condition is met (e.g. a record is inserted).
- Requirement
In the map class I import third-party libraries, and I do not want to package those libraries into the WAR file.
    TableMapReduceUtil.initTableMapperJob(
        HBaseInitializer.TABLE_DATA,  // input HBase table name
        scan,                         // Scan instance to control CF and attribute selection
        LuceneMapper.class,           // mapper
        null,                         // mapper output key
        null,                         // mapper output value
        job);
    FileOutputFormat.setOutputPath(job, new Path("hdfs://master:9000/qin/lucenefile"));
    job.submit();
- Problem
If I package the libraries into the WAR file and deploy it to the Jetty container, everything works well. But if I leave the third-party libraries out of the WAR, upload them to HDFS instead, and add them to the classpath, the job does not work. The code is below:
    conf.set("fs.defaultFS", "hdfs://master:9000");
    FileSystem hdfs = FileSystem.get(conf);
    Path classpathFilesDir = new Path("bjlibs");
    FileStatus[] jarFiles = hdfs.listStatus(classpathFilesDir);
    for (FileStatus fs : jarFiles) {
        Path disqualified = new Path(fs.getPath().toUri().getPath());
        DistributedCache.addFileToClassPath(disqualified, conf);
    }
    hdfs.close();
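As far as I understand, helpers like DistributedCache.addFileToClassPath accumulate the given jar paths into a comma-separated classpath property on the job configuration, so each added path must be appended to whatever value is already there. A minimal pure-Java sketch of that accumulation (the property name and paths here are illustrative; no Hadoop dependency is assumed):

```java
import java.util.Arrays;
import java.util.List;

public class ClasspathConfSketch {
    // Illustrative stand-in for the configuration key; the real
    // property name depends on the Hadoop version.
    static final String KEY = "mapreduce.job.classpath.files";

    // Append a jar path to an existing comma-separated value,
    // the way the distributed-cache helpers build the property.
    static String appendToClasspath(String current, String jarPath) {
        return (current == null || current.isEmpty())
                ? jarPath
                : current + "," + jarPath;
    }

    public static void main(String[] args) {
        List<String> jars = Arrays.asList(
                "/qin/bjlibs/lucene-core.jar",
                "/qin/bjlibs/lucene-analyzers.jar");
        String value = null;
        for (String jar : jars) {
            value = appendToClasspath(value, jar);
        }
        System.out.println(KEY + "=" + value);
    }
}
```

If any added jar path is malformed (for example, carrying an unexpected URI scheme), the resulting property value is wrong and the jars never reach the task classpath, which matches the failure mode described above.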
Try TableMapReduceUtil.addHBaseDependencyJars().