
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable

I am new to Hadoop. I am using Hadoop 2.3.0 and HBase 0.98.3. I am trying to extract data from a text file and write it into an HBase table using MapReduce. Although I set the job's outputKeyClass and outputValueClass, I get a ClassCastException. Can anyone help?

Here is my code.

public static void main(String[] args) {
    Configuration config = HBaseConfiguration.create();
    Job job;
    try {
        job = new Job(config, "LogBulkLoader");
        job.setJarByClass(Main.class);

        job.setMapperClass(LogMapper.class);

        job.setOutputFormatClass(TableOutputFormat.class);
        job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "fatih");

        job.setOutputKeyClass(ImmutableBytesWritable.class);
        job.setOutputValueClass(Put.class);

        FileInputFormat.addInputPath(job, new Path(userActionsTestFile));
        job.setNumReduceTasks(0);
        job.waitForCompletion(true);

    } catch (IOException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

public static class LogMapper extends
        TableMapper<ImmutableBytesWritable, Put> {

    @Override
    protected void setup(Context context) throws IOException,
            InterruptedException {
    }

    @Override
    protected void map(ImmutableBytesWritable key, Result value,
            Context context) throws IOException, InterruptedException {
        try {
            String[] l = value.toString().split(",");

            String[] t = l[4].split(" ");
            String[] date = t[0].split("-");
            String[] time = t[1].split(":");

            GregorianCalendar gc = new GregorianCalendar(
                    Integer.parseInt(date[0]), Integer.parseInt(date[1]),
                    Integer.parseInt(date[2]), Integer.parseInt(time[0]),
                    Integer.parseInt(time[1]), Integer.parseInt(time[2]));

            Put put = new Put(Bytes.toBytes(l[0]));

            put.add(Bytes.toBytes("song"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[6]));

            put.add(Bytes.toBytes("album"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("album"), Bytes.toBytes(l[2]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));

            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[2]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[3]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));

            context.write(new ImmutableBytesWritable(l[0].getBytes()), put);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

I get the following exception.

java.lang.Exception: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:403)
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable
at com.argedor.module1.Main$LogMapper.map(Main.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:235)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
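The exception is thrown from the compiler-generated bridge method in LogMapper: after type erasure, the framework's Mapper.run() loop hands map() whatever key/value the InputFormat produced, and the cast to the mapper's declared key type only happens inside that bridge method (which is why the trace points at Main$LogMapper.map(Main.java:1)). A self-contained sketch of the same mechanics, using stand-in classes rather than the real Hadoop types:

```java
// Minimal stand-ins for the real Hadoop types; the names match for illustration only.
class LongWritable { }
class ImmutableBytesWritable { }

public class ErasureDemo {
    // Shaped like org.apache.hadoop.mapreduce.Mapper<KEYIN, VALUEIN>:
    static class Mapper<K, V> {
        @SuppressWarnings("unchecked")
        void run(Object key, Object value) {
            // After erasure this cast disappears from run(); the failure surfaces
            // in the subclass's compiler-generated bridge method for map().
            map((K) key, (V) value);
        }
        void map(K key, V value) { }
    }

    // Like extending TableMapper while the job actually reads a text file:
    static class LogMapper extends Mapper<ImmutableBytesWritable, String> {
        @Override
        void map(ImmutableBytesWritable key, String value) { }
    }

    // Returns true if dispatching a LongWritable key into LogMapper fails,
    // mirroring the ClassCastException in the question.
    static boolean castFails() {
        try {
            new LogMapper().run(new LongWritable(), "some,csv,line");
            return false;
        } catch (ClassCastException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("cast fails: " + castFails());
    }
}
```

The generics are purely a compile-time check here, which is why the job submits fine and only fails once the first record reaches the mapper.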

Solution:

Add the following to the job configuration:

  job.setMapOutputKeyClass(ImmutableBytesWritable.class);
  job.setMapOutputValueClass(Put.class);

Note that the underlying mismatch is between the input format and the mapper's declared input types: the job reads a plain text file, so each map() call receives a LongWritable byte offset and a Text line, while LogMapper extends TableMapper, which fixes the input types to ImmutableBytesWritable and Result (the types used when reading from an HBase table). Declaring the mapper as Mapper<LongWritable, Text, ImmutableBytesWritable, Put> and parsing the Text value removes the cast failure at its source.
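Separately from the cast, the mapper's timestamp handling has a latent off-by-one: GregorianCalendar months are 0-based (January is 0), so passing the 1-based month parsed from the "yyyy-MM-dd" string shifts every cell timestamp forward by one month. A minimal self-contained sketch of the corrected parsing, assuming the same "yyyy-MM-dd HH:mm:ss" format as the question's data:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class TimestampDemo {
    // Parse "yyyy-MM-dd HH:mm:ss" the way the mapper does, but with the
    // 0-based month that GregorianCalendar expects; the question's code
    // passes the 1-based month straight through.
    static long toMillis(String ts) {
        String[] t = ts.split(" ");
        String[] date = t[0].split("-");
        String[] time = t[1].split(":");
        GregorianCalendar gc = new GregorianCalendar(
                Integer.parseInt(date[0]),
                Integer.parseInt(date[1]) - 1,   // Calendar months run 0..11
                Integer.parseInt(date[2]),
                Integer.parseInt(time[0]),
                Integer.parseInt(time[1]),
                Integer.parseInt(time[2]));
        return gc.getTimeInMillis();
    }

    public static void main(String[] args) {
        // Round-trip a sample timestamp and confirm the month survives intact.
        GregorianCalendar check = new GregorianCalendar();
        check.setTimeInMillis(toMillis("2014-06-12 13:45:30"));
        System.out.println(check.get(Calendar.MONTH) == Calendar.JUNE); // true
    }
}
```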
