Peter Mikkelsen here. I'll happily answer any questions.

Great presentation!

Is there an initiative at Dyalog to build an XLA [1] backend for APL? One can't help but notice that the XLA API and APL share a great deal of functionality; after all, both are about processing arrays:

    XlaBuilder::Iota      <-> ⍳
    XlaBuilder::Reshape   <-> ⍴
    XlaBuilder::Reduce    <-> /
    XlaBuilder::Dot       <-> .
    XlaBuilder::Transpose <-> ⍉

It feels like APL could get quite a kick from Google's TPUs.

[1] https://www.tensorflow.org/xla
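As a minimal sketch of the correspondence above: JAX is a Python frontend that lowers array operations to exactly these XLA ops, so each row of the table can already be exercised through `jax.numpy`. (One visible difference: APL's ⍳ is 1-based by default, while XLA's Iota is 0-based.)

```python
import jax.numpy as jnp

# XLA Iota <-> ⍳ (0-based here, vs APL's default 1-based index origin)
iota = jnp.arange(12)

# XLA Reshape <-> ⍴, e.g. 3 4⍴⍳12
mat = jnp.reshape(iota, (3, 4))

# XLA Reduce with an add computation <-> +/ along the last axis
sums = jnp.sum(mat, axis=1)

# XLA Dot <-> the inner product +.×
prod = jnp.dot(mat, mat.T)

# XLA Transpose <-> ⍉
flip = jnp.transpose(mat)
```

On a machine with a TPU or GPU backend installed, JAX dispatches these same calls to that accelerator, which is essentially the "kick" described above.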

Not that I am aware of. I think the closest project is Co-dfns [1], which is being developed by Aaron Hsu (he gave a presentation as well). It aims to compile a subset of APL so that it can be executed on GPUs, for instance, and possibly on other backends. I imagine an XLA backend could be possible there.

[1] https://github.com/Co-dfns/Co-dfns