JSON's ubiquity alone makes it easier to adopt than a custom description language. If people actively used IDL today, there would probably be a lot of demand for a JSON variant or subset.
I'd make similar arguments about JSON vs. S-expressions for data interchange, but JSON works well with both JavaScript and HTTP, which everyone standardized on, and it maps cleanly onto basic data structures in just about every modern programming language.
These JSON-based tools are actually very much like Lisp in that both the interface specification and the data are expressed in exactly the same format/language. This is not true of a lot of these older standards, and seems to have been the failed promise of XML.
IDL does look like it maps nicely to typed function calls in most languages, but it lacks the advantage of being expressed in a standard format/language that is already well-supported for other tasks, and seemingly doesn't impose any requirements on how the data itself is transmitted.
For an example of why the language/format matters, consider the tool c2ffi (https://github.com/rpav/c2ffi). It generates a JSON description of a C header file. The header file itself is a pain to parse and extract information from, but once you have a tool that does it and puts that information in a standard format, you can build an FFI wrapper in just about any other language in an at least semi-automated fashion. It makes the easy parts easier, compared to systems like SWIG and GObject, where the interface format is totally custom and you're mostly reliant on a single implementation to make it all work.
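To make that concrete, here's roughly the shape of the output for a declaration like int add(int a, int b); (an illustrative approximation from memory; c2ffi's exact field names and type encoding may differ):

    [{
      "tag": "function",
      "name": "add",
      "location": "adder.h:1:5",
      "variadic": false,
      "parameters": [
        { "tag": "parameter", "name": "a", "type": { "tag": ":int" } },
        { "tag": "parameter", "name": "b", "type": { "tag": ":int" } }
      ],
      "return-type": { "tag": ":int" }
    }]

Any language with a JSON parser can walk that and spit out bindings; nobody has to reimplement a C parser.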
If anything, let's be grateful that the good ideas of the past are being rediscovered and reinvented in a way that might grant them more longevity and broad usefulness than they had in their first life. Did you use IDL? What was your experience like? How would you compare it to something like gRPC?
That sounds like c2ffi: https://github.com/rpav/c2ffi
The only language ecosystem where I've seen it used is Common Lisp, in the CL-Autowrap library (https://github.com/rpav/cl-autowrap).
But c2ffi emits plain JSON, so I don't see any reason why you couldn't build e.g. a Python auto-binding library on top of it. It depends on LLVM to generate the spec file, but end users of the generated spec don't need LLVM at all.
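As a sketch of how little glue that would take (assuming a c2ffi-style spec where each function record carries a name, parameter types, and a return type; the exact schema here is my approximation), the JSON maps almost directly onto ctypes:

    import ctypes
    import json

    # Map c2ffi-style type tags onto ctypes types. The ":int"-style
    # tags are an assumption about the spec format; extend as needed.
    TYPE_MAP = {
        ":int": ctypes.c_int,
        ":float": ctypes.c_float,
        ":double": ctypes.c_double,
        ":void": None,  # ctypes uses None for a void return type
    }

    def bind(library_path, spec_path):
        # Load the shared library and attach argtypes/restype for
        # every function described in the JSON spec.
        lib = ctypes.CDLL(library_path)
        with open(spec_path) as f:
            spec = json.load(f)
        for entry in spec:
            if entry.get("tag") != "function":
                continue  # skip structs, enums, typedefs, ...
            fn = getattr(lib, entry["name"])
            fn.argtypes = [TYPE_MAP[p["type"]["tag"]]
                           for p in entry["parameters"]]
            fn.restype = TYPE_MAP[entry["return-type"]["tag"]]
        return lib

    lib = bind("./libadder.so", "spec.json")
    print(lib.add(2, 3))  # 5, given the add() example upthread

Real headers would also need pointers, structs, and enums, but with the flat JSON in hand that's incremental work on top of ctypes rather than a parsing problem.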
Not like GObject, where the interface is declared in a special comment (modulo typos): something that's fully machine-parseable in any programming language and doesn't require access to the source code.
I mentioned c2ffi (https://github.com/rpav/c2ffi) in another post in this thread. That's an extra "spec.json" file that you need to distribute along with your library, but maybe that's a whole lot easier and simpler than inventing a new library format. And it's JSON, which is human-readable, so it's relatively easy to debug.
(Although despite planning to use it for some personal project I haven't got around to it yet...)
It parses C/C++/Objective-C headers into JSON metadata. It uses Clang/LLVM, so the parsing and type information should be very accurate.
So, to implement your idea, you could just embed this JSON into an ELF section. (Or, if you don't like JSON, convert your JSON to some other format, such as S-expressions or protobuf or whatever.)
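As a rough sketch (section and file names invented for illustration): you could attach the spec with binutils, e.g. objcopy --add-section .c2ffi=spec.json libfoo.so, and any consumer with an ELF reader and a JSON parser could pull it back out. In Python with pyelftools, that might look like:

    import json
    from elftools.elf.elffile import ELFFile  # pip install pyelftools

    # Read the embedded c2ffi spec back out of the shared library.
    with open("libfoo.so", "rb") as f:
        section = ELFFile(f).get_section_by_name(".c2ffi")
        spec = json.loads(section.data())

    # Enumerate the library's interface without touching the C headers.
    for entry in spec:
        if entry.get("tag") == "function":
            print(entry["name"])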
Unfortunately, C is a pretty nasty language to parse, so you end up using something like http://www.swig.org/ or https://github.com/rpav/c2ffi to parse it for you. But the challenge with adopting those sorts of tools for Java is that they aren't written in Java; they're written in C and/or C++. (Obviously that doesn't stop you from using them with Java, but it does make the whole thing less pleasant.)
VS 6.0 is getting very hard to source legally these days, and I wish MS made it easier to get.
As far as having high-level languages call directly into C++ goes, yes, that's quite a pain (nearly impossible without something like https://github.com/rpav/c2ffi). Note also that calling into non-C-ABI functions is hard in any language (and most HLLs don't support anything like extern "C" to make it easy).