Pass closure to Scala compiler plugin
I'm trying to write a Scala compiler plugin that will allow extremely general code generation: something like the generality of the C preprocessor, but a bit more typesafe (I'm not sure if this is a terrible idea, but it's a fun exercise). My ideal use case looks something like this:

// User code. This represents some function that might take some args
// and outputs an abstract syntax tree.
def createFooTree(...): scala.reflect.runtime.universe.Tree = ...

// Later user code (maybe separate compilation?). Here the user generates
// code programmatically using the function call to |createFooTree| and inserts
// the code using insertTree.
insertTree(createFooTree(...))

The important plugin code might look like this (based on this):

class InsertTreeComponent(val global: Global)
  extends PluginComponent
  with TypingTransformers {
  import global._
  import definitions._

  override val phaseName = "insertTree"

  override val runsRightAfter = Some("parser")
  override val runsAfter = runsRightAfter.toList
  override val runsBefore = List[String]("typer")

  def newPhase(prev: Phase): StdPhase = new StdPhase(prev) {
    def apply(unit: CompilationUnit) {
      val onTransformer = new TypingTransformer(unit) {
        override def transform(tree: Tree): Tree = tree match {
          case orig @ Apply(
            function,
            // |treeClosure| is the single argument we passed, which should
            // evaluate to a Tree (albeit a runtime Tree).
            // The function.toString check matches anything that looks like a
            // call to a function named |insertTree|.
            List(treeClosure)) if (function.toString == "insertTree") => {
            // This function evaluates and returns the Tree, inserting it
            // into the call site as automatically-generated code.
            // Unfortunately, the following line isn't valid.
            eval(treeClosure): Tree
          }   
  ...

Any idea how to do this? Please don't say "just use macros"; at least in 2.10, they aren't general enough.

BTW, I see two problems with the approach I've outlined:

1) The compiler plugin takes an AST, not a closure. It would need some way of creating the closure, probably adding a build dependency on the user code.

2) The user doesn't have access to scala.reflect.internal.Trees.Tree, only scala.reflect.runtime.universe.Tree, so the plugin would need to translate between the two.
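For reference, here is a minimal sketch of one shape a workaround for both points might take. It assumes createFooTree has already been compiled and is on the compiler's classpath (the build-dependency caveat from point 1), evaluates the argument expression with a runtime ToolBox, and then imports the resulting runtime Tree into the compiler's universe with mkImporter (point 2). The helper name evalAndImport is made up; this is an illustration, not working plugin code.

import scala.tools.nsc.Global
import scala.reflect.runtime.{universe => ru}
import scala.tools.reflect.ToolBox

object TreeEvaluator {
  // Evaluate the source of the argument expression (e.g. "createFooTree(...)")
  // with a runtime ToolBox, then import the resulting runtime Tree into the
  // compiler's universe so it could be spliced into the call site.
  def evalAndImport(global: Global)(argSource: String): global.Tree = {
    val toolBox = ru.runtimeMirror(getClass.getClassLoader).mkToolBox()
    val runtimeTree = toolBox.eval(toolBox.parse(argSource)).asInstanceOf[ru.Tree]
    val importer = global.mkImporter(ru)
    importer.importTree(runtimeTree)
  }
}

Inside the transformer one would then call something like evalAndImport(global)(treeClosure.toString) in place of the invalid eval(treeClosure): Tree line.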

Leone answered 14/3, 2013 at 5:38

Comment: It's definitely a terrible idea - but a great exercise ;) - you should think about looking into hacking the macro impl in paradise. (Swordcraft)

The implementation difficulties you face are in part the reason why macros in 2.10 are not general enough. They look very challenging and even fundamental, but I'm optimistic that they can be eventually defeated. Here are some of the tricky design questions:

1) How do you know that the function you are calling is the right insertTree? What if the user has written his own function named insertTree - how do you then distinguish a magic call to your special function from a normal call to a user-defined function? To be sure, you would need to typecheck the reference to the function. But that's not exactly easy (see below).
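To make this concrete, here is a tiny sketch of what a symbol-based check could look like once the tree has been typechecked (i.e. after the typer, not right after the parser, which is exactly the difficulty). The fully-qualified name "mypkg.insertTree" is a made-up placeholder.

import scala.tools.nsc.Global

object MagicCallCheck {
  // Recognize the call by its resolved symbol rather than by string-matching
  // the printed name; this only works once symbols have been assigned.
  def isMagicInsertTree(global: Global)(fun: global.Tree): Boolean =
    (fun.symbol ne null) && (fun.symbol ne global.NoSymbol) &&
      fun.symbol.fullName == "mypkg.insertTree"
}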

2) How exactly do you evaluate the createFooTree(...) call? Just as before, you would need to typecheck the createFooTree part to find out what it stands for, which isn't easy.

3) And then there's one more issue. What if createFooTree is defined in one of the files you're currently compiling? Then you would somehow need to separate it and its dependencies from the rest of the program, put them into a different compilation run, compile them, and then call the function. And then, what if compiling the function or one of its dependencies triggers a macro expansion that is supposed to mutate some global state of the compiler? How would you propagate that state to the rest of the program?

4) I'm talking about typechecking all the time. Is that a problem? Apparently, yes. If your macros can expand anywhere into anything, then typechecking becomes really tricky. For example, how do you typecheck this:

class C {
  insertTree(createFoo(bar)) // creates `def foo = 2`, requires `bar` to be defined to operate
  insertTree(createBar(foo)) // creates `def bar = 4`, requires `foo` to be defined to operate
}

5) The good news, though, is that you don't have to use scala.reflect.runtime.universe.Tree. You could make createFooTree dependently typed: def createFooTree[U <: scala.reflect.api.Universe with Singleton](u: U): u.Tree. That, or the approach with scala.reflect.macros.Context that we use in Scala 2.10. Not very pretty, but it solves the problem of universe mismatch.
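For illustration, a minimal sketch of that dependently-typed shape; the body just builds `def foo = 2` as a stand-in, and the object wrapper only exists to make the snippet self-contained.

import scala.reflect.api.Universe

object TreeFactories {
  // The caller supplies the universe, and the resulting Tree belongs to that
  // same universe, so there is no runtime-vs-compiler mismatch to translate.
  def createFooTree[U <: Universe with Singleton](u: U): u.Tree = {
    import u._
    DefDef(Modifiers(), newTermName("foo"), Nil, Nil, TypeTree(), Literal(Constant(2)))
  }
}

// At runtime: TreeFactories.createFooTree(scala.reflect.runtime.universe)
// Inside the plugin: TreeFactories.createFooTree(global), which already yields a compiler Tree.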

As a bottom line, my current feeling is that macros in a statically typed language (especially, in an object-oriented language, since OO brings an amazing bunch of ways for pieces of code to depend on each other) are really tricky. A robust model for typed macros modifying arbitrary fragments in the program being compiled is yet to be discovered.

If you wish, we could have a more detailed discussion via email. We could also collaborate to bring the idea of proper macros to fruition. Or, alternatively, if you share your use case, I could try to help find a workaround for your particular situation.

Zoochore answered 14/3, 2013 at 13:33

Comment: Thanks, I just sent you an email. (Leone)
