Import data from HDFS to HBase (cdh3u2)
I have installed Hadoop and HBase (cdh3u2). In Hadoop I have a file at the path /home/file.txt. It contains data like:

one,1
two,2
three,3

I want to import this file into HBase, with the first field parsed as a String and the second field parsed as an integer, and then push it into HBase. Please help me do this.

Thanks in advance.

Naumachia answered 27/12, 2011 at 11:44 Comment(1)
What do you want your key to be? Are you pushing them both into a single column family, or two separate ones?Nubble

I like using Apache Pig for ingest into HBase because it is simple, straightforward, and flexible.

Here is a Pig script that would do the job for you, after you have created the table and the column family. To create the table and the column family, you'll do:

$ hbase shell
> create 'mydata', 'mycf'

Move the file to HDFS:

$ hadoop fs -put /home/file.txt /user/surendhar/file.txt

Then, write the pig script to store with HBaseStorage (you may have to look up how to set up and run Pig):

A = LOAD 'file.txt' USING PigStorage(',') as (strdata:chararray, intdata:long);
STORE A INTO 'hbase://mydata'
        USING org.apache.pig.backend.hadoop.hbase.HBaseStorage(
              'mycf:intdata');

Note that in the above script, the key is going to be strdata. If you want to create your own key from something, use a FOREACH statement to generate the key. HBaseStorage assumes that the first thing in the previous relation (A::strdata in this case) is the key.


Some other options would be:

  • Write a Java MapReduce job to do the same thing as above.
  • Interact directly with the HTable via the client API and put rows in one at a time. This should only be done for much smaller files.
  • Push the data up with the hbase shell using some sort of script (e.g., sed, perl, python) that transforms the lines of CSV into shell put commands. Again, this should only be done if the number of records is small.

    $ cat /home/file.txt | transform.pl
    put 'mydata', 'one', 'mycf:intdata', '1'
    put 'mydata', 'two', 'mycf:intdata', '2'
    put 'mydata', 'three', 'mycf:intdata', '3'
    
    $ cat /home/file.txt | transform.pl | hbase shell
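The transform script above could be sketched in Python instead of Perl. This is a minimal sketch, not the author's actual script: the file name transform.py is assumed, and the table and column names simply match the example above.

```python
# transform.py -- a minimal sketch of what a transform script like the
# hypothetical transform.pl might do: turn each CSV line into an HBase
# shell `put` command. Table/column names ('mydata', 'mycf:intdata')
# are assumptions taken from the example above.
import sys

def csv_to_puts(lines, table="mydata", cf_qual="mycf:intdata"):
    cmds = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        key, value = line.split(",", 1)
        int(value)  # fail fast if the second field is not an integer
        cmds.append(f"put '{table}', '{key}', '{cf_qual}', '{value}'")
    return cmds

if __name__ == "__main__":
    for cmd in csv_to_puts(sys.stdin):
        print(cmd)
```

It would be piped the same way as the Perl version, e.g. `cat /home/file.txt | python transform.py | hbase shell`.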
    
Nubble answered 27/12, 2011 at 14:42 Comment(4)
Hey Donald. Would you please check out this post? #21126983Spindle
Donald you are a hero for writing this answer!Primogenial
Do not forget to register the required HBase jars in the Pig script, like this: "REGISTER /usr/lib/hbase/lib/*.jar;"Damico
@Donald I tried this, but in my HBase I am getting only 1 row, whereas my logs say 851 files were moved to HBase. Please help meRecommendatory
