It's 2017, and despite all this parallel stuff, query planners are still dumb as bricks. Every day I run into WHERE clause pushdown opportunities that are obvious to the eye but missed by the planner, or situations where I can manually copy a view into a table, add indexes, and build a query on top of that faster than the database can evaluate the same thing as a subquery, filling the entire disk with temporary files in the process. It's utterly maddening if you spend your life on ad-hoc analytics/reporting workloads.
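To make the manual workaround concrete, here's a sketch using SQLite (via Python's sqlite3 module) as a self-contained stand-in for a real Postgres setup; the schema, view, and index names are all made up for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Toy schema: orders with a customer_id and an amount.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# A view that the planner would otherwise re-evaluate as a subquery.
cur.execute("CREATE VIEW customer_totals AS "
            "SELECT customer_id, SUM(amount) AS total "
            "FROM orders GROUP BY customer_id")

# The manual workaround: materialise the view into a real table, then index it.
cur.execute("CREATE TABLE customer_totals_mat AS SELECT * FROM customer_totals")
cur.execute("CREATE INDEX idx_ct_customer ON customer_totals_mat (customer_id)")

# Queries against the indexed copy now probe the index directly
# instead of recomputing the aggregate each time.
plan = cur.execute("EXPLAIN QUERY PLAN "
                   "SELECT total FROM customer_totals_mat "
                   "WHERE customer_id = 42").fetchall()
print(plan)  # the plan should mention idx_ct_customer
```

In Postgres the same move is `CREATE MATERIALIZED VIEW` plus `CREATE INDEX`, but the point is that a human has to notice the opportunity and do it by hand.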
I strongly believe there are huge opportunities for AI to come along and, given a logical description of the data and query, do a better job of physically organising and serving the data. I realise some database management systems do this to an extent, but it's pretty weak sauce in my experience.
Lower-hanging fruit might be a dynamic profiler that continuously analyses query patterns and automatically reorganises indexes, table clustering, partitioning, and query planning based on actual database activity.
This could even be something that operates as a layer on top of Postgres, although it would probably need more precise data than what is currently available through the stats tables, and since Postgres doesn't have query hints, there's no way to modify query plans. It would also need to look at the logs to see what the most frequent queries are, and possibly require a dedicated slave to "test run" queries on to gauge whether it has found an optimal solution (or maybe do it on the actual database during quiet hours).
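As a rough sketch of the profiling side, here is how such a layer might mine a query log for frequently filtered columns and nominate index candidates. The log lines, the frequency threshold, and the regex-based parsing are all hypothetical simplifications; a real tool would read pg_stat_statements or the Postgres logs and use a proper SQL parser:

```python
import re
from collections import Counter

# Hypothetical slice of a query log (in Postgres this would come from
# pg_stat_statements or log_min_duration_statement output).
query_log = [
    "SELECT * FROM orders WHERE customer_id = 7",
    "SELECT * FROM orders WHERE customer_id = 9",
    "SELECT * FROM orders WHERE status = 'shipped'",
    "SELECT * FROM orders WHERE customer_id = 7",
]

def where_columns(sql):
    """Crude extraction of columns compared by equality in a WHERE clause."""
    match = re.search(r"\bWHERE\b(.*)", sql, re.IGNORECASE)
    if not match:
        return []
    return re.findall(r"(\w+)\s*=", match.group(1))

# Tally which columns are filtered on most often; the frequent ones are
# the obvious candidates for an index (or for clustering the table).
counts = Counter(col for q in query_log for col in where_columns(q))
suggestions = [col for col, n in counts.most_common() if n >= 2]
print(suggestions)  # → ['customer_id']
```

A follow-up step would be the "test run" described above: create the candidate index on a replica, replay the frequent queries, and keep it only if the measured plans actually improve.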
AI would definitely be interesting, though.