Well, nowadays you just can't trust a file by its extension, and you shouldn't trust its magic header either.
How do you verify it, then? Trust an unfuzzed library to decode it in a VM that will get compromised the moment someone finds a bug in libjpeg or a JS engine?
You verify it with a formal recognizer generated from the official grammar. Traditionally this would have been e.g. a yacc/bison parser written from the specification's BNF. Today, a parser combinator library such as hammer[1] is probably easier to use (and it has nicer bit-level support).

This puts all of the recognizing/parsing code in one place. It also verifies the entire input up front, before any results are passed back to the main program, so you get a clean valid/invalid check of the entire input.
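
For a flavor of what that looks like, here's a minimal sketch using hammer's C API (calls per its README; treat the details as illustrative). The toy grammar -- just a GIF-style signature -- is a stand-in, not a real file-format recognizer:

    #include <hammer/hammer.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* signature = "GIF8" ("7a" | "9a") <end of input> */
        HParser *sig = h_sequence(
            h_token((const uint8_t *)"GIF8", 4),
            h_choice(h_token((const uint8_t *)"7a", 2),
                     h_token((const uint8_t *)"9a", 2),
                     NULL),
            h_end_p(),   /* whole input must match; trailing bytes fail */
            NULL);

        const uint8_t input[] = "GIF89a";
        HParseResult *r = h_parse(sig, input, strlen((const char *)input));
        if (r) {
            printf("valid\n");    /* only now hand results to the program */
            h_parse_result_free(r);
        } else {
            printf("invalid\n");  /* downstream code never sees the bytes */
        }
        return 0;
    }

The h_end_p() at the end is what makes this a recognizer for the whole input: any trailing bytes outside the grammar fail the parse outright instead of being silently ignored.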

For a very good discussion of why formal recognizers are important (and why, where possible, transport formats and protocols should be designed to be deterministic context-free or simpler[2]), see Meredith Patterson and Sergey Bratus's talk[3] at 28c3.

[1] https://github.com/abiggerhammer/hammer/

[2] http://www.langsec.org/occupy/

[3] https://media.ccc.de/v/28c3-4763-en-the_science_of_insecurit...