Hash function in Spark

I'm trying to add a column to a DataFrame that will contain the hash of another column.

I've found this piece of documentation: https://spark.apache.org/docs/2.3.0/api/sql/index.html#hash
And tried this:

import org.apache.spark.sql.functions._
val df = spark.read.parquet(...)
val withHashedColumn = df.withColumn("hashed", hash($"my_column"))

But which hash function does that hash() use? Is it Murmur, SHA, MD5, or something else?

The value I get in this column is an integer, so the range of values is presumably [-2^31 ... 2^31 - 1].
Can I get a long value here? Can I get a string hash instead?
How can I specify a concrete hashing algorithm for that?
Can I use a custom hash function?

Hogue answered 5/12, 2018 at 14:34
One of the wonders of open source is that you can look at the source: as you can see, they use Murmur3. There is also another function, sha2. – Alleyn
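
For a string-valued hash instead of an int, a minimal sketch using the built-in sha2 and md5 functions (the DataFrame and column name are taken from the question; the cast to string is an assumption, since these functions expect string or binary input, and sha2's second argument must be one of 224, 256, 384 or 512):

import org.apache.spark.sql.functions.{col, md5, sha2}

// sha2 and md5 return hex strings rather than integers.
val withSha256 = df.withColumn("hashed_sha256", sha2(col("my_column").cast("string"), 256))
val withMd5    = df.withColumn("hashed_md5",    md5(col("my_column").cast("string")))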

It is Murmur3, based on the source code:

  /**
   * Calculates the hash code of given columns, and returns the result as an int column.
   *
   * @group misc_funcs
   * @since 2.0.0
   */
  @scala.annotation.varargs
  def hash(cols: Column*): Column = withExpr {
    new Murmur3Hash(cols.map(_.expr))
  }
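
As a hedged illustration (assuming a SparkSession named spark with its implicits imported, and a toy column name), hash() yields an integer column, and the same Murmur3-based hash is available as a SQL expression:

import spark.implicits._
import org.apache.spark.sql.functions.hash

val sample = Seq("a", "b", "c").toDF("my_column")
// "hashed" comes back as a 32-bit integer column.
sample.withColumn("hashed", hash($"my_column")).printSchema()

// The same function is exposed in Spark SQL:
sample.createOrReplaceTempView("t")
spark.sql("SELECT my_column, hash(my_column) AS hashed FROM t").show()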
Manchineel answered 23/5, 2019 at 14:38

If you want a Long hash, Spark 3 adds the xxhash64 function: https://spark.apache.org/docs/3.0.0-preview/api/sql/index.html#xxhash64.
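
A minimal sketch of its use (the column name value matches the snippet below; xxhash64 returns a LongType column):

import org.apache.spark.sql.functions.xxhash64

// 64-bit hash, so the result is a long rather than an int.
df.withColumn("hashID64", xxhash64($"value")).show()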

You may want only non-negative numbers. In that case you can take hash, cast it to a long, and shift it by Int.MaxValue + 1, which moves the range from [-2^31 ... 2^31 - 1] to [0 ... 2^32 - 1]:

import org.apache.spark.sql.types.LongType

df.withColumn("hashID", hash($"value").cast(LongType) + (Int.MaxValue.toLong + 1)).show()
Broken answered 4/2, 2021 at 19:19
Hi, if I only want positive numbers, how can I achieve this in Python? – Abstracted
@Broken, can you provide some more resources on how these can be used in a Spark context, e.g. for handling data skew and other areas? – Straggle
@shasu, sorry, but what you are asking is not related to the question on this page. Please open a new Stack Overflow question. – Broken
