I can already convert an array into a binary tree using the following algorithm in Java:
public class TreeNode {
    public TreeNode left, right;
    public int val;

    public TreeNode(int val) {
        this.val = val;
    }
}

public TreeNode arrayToTree(Integer[] input) {
    // Index 1 is the root in the 1-based "implicit heap" layout.
    TreeNode root = createTreeNode(input, 1);
    return root;
}

private TreeNode createTreeNode(Integer[] input, int index) {
    if (index <= input.length) {
        Integer value = input[index - 1];
        if (value != null) {
            TreeNode t = new TreeNode(value);
            // Children of the node at index i live at indices 2*i and 2*i + 1.
            t.left = createTreeNode(input, index * 2);
            t.right = createTreeNode(input, index * 2 + 1);
            return t;
        }
    }
    return null;
}
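For reference, this is roughly how I call it (ArrayLevelOrder is just a placeholder name for the class that holds the two methods above):

public class Demo {
    public static void main(String[] args) {
        // "ArrayLevelOrder" is only a placeholder for whatever class contains arrayToTree.
        ArrayLevelOrder converter = new ArrayLevelOrder();

        // The nulls at positions 4 and 5 are padding for the children of the
        // null left child of 1; they carry no information of their own.
        Integer[] input = {1, null, 2, null, null, 3};
        TreeNode root = converter.arrayToTree(input);

        System.out.println(root.val);            // 1
        System.out.println(root.right.val);      // 2
        System.out.println(root.right.left.val); // 3
    }
}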
When the input is {1,null,2,null,null,3}, I get the following tree:
1
 \
  2
 /
3
However, I think the shorter input {1,null,2,3} is already enough to define the same tree as above.
Is there a good way to avoid the redundant nulls in the input array?
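What I have in mind is something like a breadth-first pass over the compact array that only reserves entries for the children of nodes that actually exist, roughly as sketched below. The method name compactArrayToTree is just my own, and I'm assuming a LeetCode-style compact encoding where trailing nulls are dropped; the sketch is untested:

// Rough sketch: consume the compact form {1, null, 2, 3} with a queue,
// so that a null entry never reserves two slots for children of its own.
private TreeNode compactArrayToTree(Integer[] input) {
    if (input == null || input.length == 0 || input[0] == null) {
        return null;
    }
    TreeNode root = new TreeNode(input[0]);
    java.util.Queue<TreeNode> queue = new java.util.LinkedList<>();
    queue.offer(root);
    int i = 1;
    while (!queue.isEmpty() && i < input.length) {
        TreeNode current = queue.poll();
        // The next entry (if any) is the left child of the current node.
        if (i < input.length) {
            Integer leftVal = input[i++];
            if (leftVal != null) {
                current.left = new TreeNode(leftVal);
                queue.offer(current.left);
            }
        }
        // The entry after that (if any) is the right child.
        if (i < input.length) {
            Integer rightVal = input[i++];
            if (rightVal != null) {
                current.right = new TreeNode(rightVal);
                queue.offer(current.right);
            }
        }
    }
    return root;
}

With this, {1,null,2,3} would produce the same tree as {1,null,2,null,null,3} above, because a null entry never gets slots for children of its own. Is there a better or more standard way to do this?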