Incremental computation is a fascinating area of research. I've followed Matthew Hammer's work on Adapton[1] for a while, as well as Daco Harkes' work on IceDust[2]. The issues are obviously quite deep (e.g. how can you change a single value and recalculate a quicksort without the effects of the change fanning out massively?) whether you're targeting existing languages and runtimes or building from scratch.

I would very much like to see this as an area of active development, both in programming languages and in distributed computing platforms. Current streaming platforms are great, and it's getting easier and easier to create declarative dataflow pipelines, do real-time processing, split things into windows, etc. On top of that, I imagine incorporating incremental computation, letting you keep a durable history of the event stream (or at least manageable parts of it) in a way that automagically allows incremental recalculation of downstream data when an underlying fact is retracted or updated.

[1] http://adapton.org/

[2] https://dcharkes.github.io/

Of course, if that already exists then I'd love to hear about it! There seem to be all sorts of hybrid stream/batch systems, but none with quite this focus. Effectively I want something that streams into a time-series database in real time, with the opportunity to edit or delete events later. On top of that there would be a processing pipeline with familiar streaming paradigms (transforming data, windowing it arbitrarily, aggregating on top of that), with the additional magic that the necessary events/windows/calculations re-fire if some underlying data changes. All in a magically efficient and transparent way.

Shameless plug for something that does all of these things (perhaps not exactly as you want, but...):

https://github.com/frankmcsherry/differential-dataflow
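
As a rough sketch of the retract-and-recompute behaviour described above (my own minimal example, not taken from the project's docs: it assumes a Cargo project depending on the timely and differential-dataflow crates, and the keys and events are invented for illustration):

    extern crate timely;
    extern crate differential_dataflow;

    use differential_dataflow::input::InputSession;
    use differential_dataflow::operators::Count;

    fn main() {
        timely::execute_from_args(std::env::args(), move |worker| {
            // An input session lets us insert events now and retract them later.
            let mut input: InputSession<u32, String, isize> = InputSession::new();

            // Dataflow: count events per key and report every change.
            worker.dataflow(|scope| {
                input
                    .to_collection(scope)
                    .count()
                    // Output records are ((key, count), time, diff); only the
                    // counts affected by a change are reported, nothing is
                    // recomputed from scratch.
                    .inspect(|x| println!("change: {:?}", x));
            });

            // Time 0: some events arrive.
            input.advance_to(0);
            input.insert("sensor-a".to_string());
            input.insert("sensor-a".to_string());
            input.insert("sensor-b".to_string());

            // Time 1: retract one of the earlier events; the affected count
            // is updated incrementally downstream.
            input.advance_to(1);
            input.remove("sensor-a".to_string());
        })
        .expect("computation terminated abnormally");
    }

Running it prints the per-key counts at time 0 and then only the deltas for "sensor-a" at time 1, which is the "re-fire only what changed" behaviour asked about above.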