In Cassandra I have a column of list type. I am new to Spark and Scala, and have no idea where to start. In Spark I want to get the count of each value across all rows; is it possible to do so? Below is the DataFrame:
+--------------------+------------+
| id| data|
+--------------------+------------+
|53e5c3b0-8c83-11e...| [b, c]|
|508c1160-8c83-11e...| [a, b]|
|4d16c0c0-8c83-11e...| [a, b, c]|
|5774dde0-8c83-11e...|[a, b, c, d]|
+--------------------+------------+
I want the output to be:
+-----+-----+
|value|count|
+-----+-----+
|    a|    3|
|    b|    4|
|    c|    3|
|    d|    1|
+-----+-----+
Spark version: 1.4
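For reference, here is a minimal sketch of the kind of transformation I am imagining, assuming `explode` and `groupBy`/`count` are the right tools in Spark 1.4. The local `SQLContext` and the hard-coded sample rows are only there to reproduce the DataFrame shown above; in my case `df` actually comes from the Cassandra table.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.functions.explode

// Local setup just for this example; the real df is loaded from Cassandra.
val sc = new SparkContext(new SparkConf().setAppName("list-count").setMaster("local[*]"))
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Sample data mirroring the DataFrame shown above (ids truncated as displayed).
val df = Seq(
  ("53e5c3b0-8c83-11e...", Seq("b", "c")),
  ("508c1160-8c83-11e...", Seq("a", "b")),
  ("4d16c0c0-8c83-11e...", Seq("a", "b", "c")),
  ("5774dde0-8c83-11e...", Seq("a", "b", "c", "d"))
).toDF("id", "data")

// explode turns each array element into its own row,
// then groupBy/count tallies how many times each value appears.
val counts = df
  .select(explode(df("data")).as("value"))
  .groupBy("value")
  .count()
  .orderBy("value")

counts.show()
```

Is this the idiomatic way to do it, or is there a better approach for Spark 1.4?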