Not (necessarily) relevant to this solution, but I've recently come across a use case where it would be valuable to let third-party (semi-trusted) individuals sandbox/build/test some data processing/transformation pipelines in Jupyter, and then "operationalize" those pipelines into ongoing ETL in my main webapp/API once they're finished. Not the same use case as Mercury, but still in the bucket of hoisting a notebook into a more repeatable/operationalizable runtime. Does anyone have experience with something like that?
I'm using Papermill (https://github.com/nteract/papermill) to operationalize notebooks; among other things, it also has Airflow support. I'm really happy with Papermill for automated notebook execution. In my field it's nice that we can go from analysis to operations very quickly, while getting super transparent "logging" in the form of the executed notebooks.
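For reference, a minimal sketch of what that looks like with Papermill's Python API (the notebook names, output path, and parameters below are made up for illustration): you execute a parameterized notebook and keep the executed copy, cell outputs and all, as the run's log.

    import papermill as pm

    # Execute a parameterized notebook and write the executed copy
    # (with all cell outputs) to a separate path -- that executed copy
    # is the transparent "logging" mentioned above.
    pm.execute_notebook(
        "transform_pipeline.ipynb",                   # hypothetical input notebook
        "runs/transform_pipeline_2024-01-01.ipynb",   # executed copy, kept as the run log
        parameters={"input_path": "data/raw.csv", "run_date": "2024-01-01"},
    )

The same thing can be done from the CLI (papermill input.ipynb output.ipynb -p key value), which makes it easy to wrap in a scheduler.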