How large is "large" for you?

For smaller datasets, anywhere up to a few MB, which isn't so bad over an API; but in theory, for historic data, it could be several GB. I've not seen Datasette go that high (IIRC it caps returns at 1,000 rows by default).
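
You can walk past that cap by following Datasette's JSON API pagination, though, if slowly. A rough Python sketch, with a made-up instance and table name (and assuming I'm remembering the response shape right, with "rows" and "next_url" keys):

    import requests

    # Hypothetical Datasette instance and table; swap in a real one.
    url = "https://example.com/mydb/mytable.json"
    rows = []
    while url:
        resp = requests.get(url)
        resp.raise_for_status()
        data = resp.json()
        rows.extend(data["rows"])
        # next_url points at the next page; absent/None on the last one.
        url = data.get("next_url")
    print(f"fetched {len(rows)} rows")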

That's what got me intrigued by Atlassian's offering, as data lakes tend to be something internal to a company, not something I've ever seen offered as an interaction point for users.

I've also tested out roapi [1], which is nice if the data is already in some structured format (Parquet/JSON).
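
For anyone curious, here's roughly what querying it looks like from Python. This is a sketch: it assumes roapi is running locally on port 8080 (its default, IIRC) with a hypothetical table named mytable loaded from a Parquet file, and posts raw SQL to the /api/sql endpoint:

    import requests

    # Assumes a local roapi started with something like:
    #   roapi --table "mytable=mytable.parquet"
    # serving HTTP on localhost:8080.
    resp = requests.post(
        "http://localhost:8080/api/sql",
        data="SELECT COUNT(*) FROM mytable",
    )
    resp.raise_for_status()
    print(resp.json())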

[1] https://github.com/roapi/roapi