I don't know what it is about game development that really brings out the yak shaving in people.
One time, about 8 years ago, I backed a game called Nowhere[1] by a very talented programmer. The original premise was an alien life simulator.
Well, it's been eight years, and development is still going strong!
The developer is currently working on the String implementation for the programming language he invented[2], which he's using to write the other programming language he invented[3], which is eventually going to be used to write the game.
[1] https://duangle.com/nowhere
Game development methodology has fascinated me recently. With the Battlefield 2042 debacle, it's interesting to compare it with what I know: financial trading platforms.
Trading platforms last for years, decades maybe. They're built with some ideas about how to manage performance and support evolving requirements. Developers know the thing is going to be around for a long time, so it's treated as such. Years down the line, you start building a new platform and plan a long migration to keep clients happy, because you can't just turn off their favourite functionality.
Modern AAA FPS games seem to be the complete opposite. Reinvent significant amounts every release. Dump the old game as the new one is released. Much seems to be built from scratch; the BF2042 scoreboard issue seems like it should be almost off the shelf. There also seems to be a big shift towards short release cycles, which pushes even more churn and reinventing of the wheel, and the look and feel must be updated to keep things "fresh", even though most of the popular games on Steam[1] are older titles that have been around for years.
So many places to shave a yak, I'm surprised games get shipped at all.
Of course, take all of this as coming from an outsider who just yearns for the old days of cool mods and custom servers.
If you don’t mind a digression for someone who needs help…
What is the architecture pattern of a trading platform?
I am looking to build a system that is able to:
- receive 1000s of incoming streams of data
- save the data
- make the data available to live subscribers
The closest analogous system I can think of is bond/stock/commodity/etc price subscriptions for traders.
I feel like this many-in/many-out data stream architecture should be a solved problem by now, and I'd rather not start from scratch on architecture and technology choices.
I can’t find the right phrase to even google to get started.
The trading platforms I've worked on tend to order everything into a single stream that you can act upon. This helps with testing, race conditions and auditability: you will want to be able to replay an exact series of events to recreate conditions.
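To make that concrete, here is a minimal sketch of the idea in plain Java (class and field names are invented for illustration): every input is appended to one log under a single monotonically increasing sequence number, every subscriber sees the same ordered stream, and replay is just re-reading the log from the start.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    // Hypothetical single ordered event log: all inputs are appended here,
    // every subscriber sees the same sequence, and replay re-reads the log.
    final class EventLog {
        record Event(long sequence, String payload) {}

        private final List<Event> log = new ArrayList<>();
        private final List<Consumer<Event>> subscribers = new ArrayList<>();
        private long nextSequence = 0;

        synchronized void subscribe(Consumer<Event> subscriber) {
            subscribers.add(subscriber);
        }

        // A single writer assigns the sequence number, so ordering is unambiguous
        synchronized void publish(String payload) {
            Event event = new Event(nextSequence++, payload);
            log.add(event);
            subscribers.forEach(s -> s.accept(event));
        }

        // Replay the exact series of events, e.g. to reproduce a bug or audit a fill
        synchronized void replay(Consumer<Event> consumer) {
            log.forEach(consumer);
        }
    }

    public class EventLogDemo {
        public static void main(String[] args) {
            EventLog log = new EventLog();
            log.subscribe(e -> System.out.println("live: " + e));
            log.publish("order 1 placed");
            log.publish("order 1 filled");
            log.replay(e -> System.out.println("replayed: " + e));
        }
    }

A real platform would persist and shard the log, but the single ordered sequence is the part that makes replay and auditing straightforward.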
LMAX Disruptor[1] and Aeron[2] are two widely used open source examples, whether you use the libraries themselves or just the concepts behind them.
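If you do reach for the Disruptor, the wiring looks roughly like the library's getting-started example (this is from memory against the 3.x API, so double-check class names against the docs):

    import com.lmax.disruptor.RingBuffer;
    import com.lmax.disruptor.dsl.Disruptor;
    import com.lmax.disruptor.util.DaemonThreadFactory;

    public class DisruptorSketch {
        // Mutable event, reused by the ring buffer so the hot path allocates nothing
        static final class MarketEvent {
            long price;
        }

        public static void main(String[] args) {
            int bufferSize = 1024; // ring buffer size must be a power of two

            Disruptor<MarketEvent> disruptor = new Disruptor<>(
                    MarketEvent::new, bufferSize, DaemonThreadFactory.INSTANCE);

            // Handlers see events in the exact sequence they were published
            disruptor.handleEventsWith((event, sequence, endOfBatch) ->
                    System.out.println("seq " + sequence + " price " + event.price));

            disruptor.start();

            RingBuffer<MarketEvent> ringBuffer = disruptor.getRingBuffer();
            for (long i = 0; i < 5; i++) {
                long price = 100 + i;
                ringBuffer.publishEvent((event, sequence) -> event.price = price);
            }
        }
    }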
Some things that trading systems (generally?) need which you may not, which might simplify your architecture:
- Every message must be delivered, and in order. UDP is usually chosen over TCP, as the platform will likely want more control over handling missed messages (see the gap-check sketch after this list).
- It is common to have read/write applications that need to add a message as a response. Writing is difficult because you need to be quick; otherwise a subsequent message from another application might invalidate yours.
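On the first point, the usual mechanism is a per-stream sequence number on every packet so the receiver can spot gaps and decide how to recover (retransmit request, snapshot rewind, or just accepting the loss). A bare-bones sketch of the receive side, with invented names:

    // Hypothetical receive-side gap check: each packet carries the sender's
    // sequence number and the receiver tracks the next one it expects.
    final class GapDetector {
        private long expected = 0;

        // Returns true if the packet is the next in order; otherwise handles the gap
        boolean onPacket(long sequence) {
            if (sequence == expected) {
                expected++;
                return true;
            }
            if (sequence > expected) {
                // Missed packets: a real feed handler would request a retransmit
                // or fall back to a snapshot here rather than just logging.
                System.out.println("gap: expected " + expected + " got " + sequence);
                expected = sequence + 1;
            }
            // sequence < expected means a duplicate or late packet; ignore it
            return false;
        }
    }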
Do subscribers need older data? How do dropped packets impact the system? Can they just be forgotten? What latency requirements do you have? Do subscribers also write to the same stream?
[1] https://lmax-exchange.github.io/disruptor/
[2] https://github.com/real-logic/aeron