Compress Output Scalding / Cascading TsvCompressed

People have been having problems compressing the output of Scalding jobs, myself included. After googling I get the odd whiff of an answer in some obscure forum, but nothing suitable for people's copy-and-paste needs.

I would like a source that works like Tsv but writes compressed output.

Perfuse asked 29/5, 2014 at 17:42

Anyway, after much faffing about I managed to write a TsvCompressed source which seems to do the job. You still need to set the Hadoop job configuration properties, i.e. set compress to true and set the codec to something sensible, or it defaults to the rather poor deflate.
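
For reference, here is a minimal sketch of that configuration in code, assuming the classic mapred-era property names (newer Hadoop versions use the mapreduce.output.fileoutputformat.* equivalents); gzip is just an example codec:

import org.apache.hadoop.io.compress.{CompressionCodec, GzipCodec}
import org.apache.hadoop.mapred.JobConf

val conf = new JobConf()
// Turn on output compression and pick a codec (Snappy or LZO are common alternatives).
conf.setBoolean("mapred.output.compress", true)
conf.setClass("mapred.output.compression.codec", classOf[GzipCodec], classOf[CompressionCodec])

The source itself: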

import com.twitter.scalding._
import cascading.tuple.Fields
import cascading.scheme.local
import cascading.scheme.hadoop.{TextLine, TextDelimited}
import cascading.scheme.Scheme
import org.apache.hadoop.mapred.{OutputCollector, RecordReader, JobConf}

// A Tsv-like source whose HDFS sink honours the job's output-compression settings.
case class TsvCompressed(p: String) extends FixedPathSource(p) with DelimitedSchemeCompressed

trait DelimitedSchemeCompressed extends Source {
  val types: Array[Class[_]] = null

  // Local mode: plain tab-delimited text, no compression.
  override def localScheme = new local.TextDelimited(Fields.ALL, false, false, "\t", types)

  // HDFS mode: the same tab-delimited scheme, but with sink compression switched on
  // (Scalding's stock Tsv leaves this at its default, which disables compression).
  override def hdfsScheme = {
    val temp = new TextDelimited(Fields.ALL, false, false, "\t", types)
    temp.setSinkCompression(TextLine.Compress.ENABLE)
    temp.asInstanceOf[Scheme[JobConf, RecordReader[_, _], OutputCollector[_, _], _, _]]
  }
}
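
For completeness, a sketch of how the sink might be wired into a job; CompressedCopyJob and the input/output argument names are made up for illustration:

import com.twitter.scalding._

class CompressedCopyJob(args: Args) extends Job(args) {
  // Read plain text, keep just the line contents, and write through the compressed sink.
  TextLine(args("input"))
    .read
    .project('line)
    .write(TsvCompressed(args("output")))
}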
Perfuse answered 29/5, 2014 at 17:42

I also have a small project showing how to achieve compressed output from Tsv: WordCount-Compressed.

Scalding was passing null as the Cascading TextDelimited compression parameter, which disables compression.

Poriferous answered 18/6, 2014 at 7:49
Thanks, I took a look. What does mapreduce.output.fileoutputformat.compress.type BLOCK do? (Perfuse)
It is one of the Hadoop compression types (NONE, RECORD, BLOCK). Basically, instead of compressing each record individually, it compresses records in blocks; the block size should be configured in Hadoop as well. (Poriferous)
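
For anyone who wants to try it, something like this should set it (untested sketch; as far as I know the type setting only takes effect for SequenceFile-based outputs, while plain text outputs ignore it):

import org.apache.hadoop.mapred.JobConf

val conf = new JobConf()
conf.setBoolean("mapreduce.output.fileoutputformat.compress", true)
// RECORD compresses each value on its own; BLOCK buffers many records and
// compresses them together, which usually gives a much better ratio.
conf.set("mapreduce.output.fileoutputformat.compress.type", "BLOCK")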
