A few months ago I described how you could use JSON Schema to validate your automation data models, host/group variable files, or even an Ansible inventory file.
I had to use a weird toolchain to get it done – either ansible-inventory to build a complete data model from various inventory sources, or yq to convert YAML to JSON… and just for giggles, the jsonschema CLI command requires the JSON input to reside in a file, so you have to use a temporary file to get the job done.
Obviously I could have used Python, and it wouldn’t have taken more than a dozen or so lines of code to read the YAML data and validate it with the jsonschema Python module, but I wanted to keep things as simple as possible and probably failed. What exactly did RFC 6670 say about the squashed complexity sausage?
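In case you’re wondering what those dozen-or-so lines of Python would look like, here’s a minimal sketch. To keep it self-contained I inlined the YAML data and the schema; in real life you’d read both from files (the hostname and VLAN attributes are made-up examples, not anything from my actual data models):

```python
import yaml          # PyYAML
import jsonschema

# Inline stand-in for a host_vars file (hypothetical attributes)
yaml_doc = """
hostname: r1
vlans: [101, 102]
"""

# Inline stand-in for the JSON Schema you'd normally load from a file
schema = {
    "type": "object",
    "properties": {
        "hostname": {"type": "string"},
        "vlans": {"type": "array", "items": {"type": "integer"}},
    },
    "required": ["hostname"],
}

data = yaml.safe_load(yaml_doc)
jsonschema.validate(data, schema)   # raises ValidationError on bad data
print("all good")
```

No temporary files, no yq – but now you have a script to maintain instead of a one-liner.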
Things got much simpler with the new validate module in the ansible.utils collection. You pass the data you want validated, the validation criteria (a JSON schema describing your data), and the validation engine (currently only jsonschema) to the module, and get back a success message or a list of errors. Great job (and thanks for the reference to my article in the announcement blog post 🙏)
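A task using the validate module could look something like this sketch (the input_data variable and the schema.json file name are hypothetical – use whatever your playbook already has):

```yaml
- name: Validate input data against a JSON schema
  ansible.utils.validate:
    data: "{{ input_data }}"
    criteria: "{{ lookup('file', 'schema.json') | from_json }}"
    engine: ansible.utils.jsonschema
```

The task fails (with the list of validation errors) when the data doesn’t match the schema, so you can use it as an early sanity check at the start of a play.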
A fair warning before you get too excited: the error messages generated by jsonschema can be a bit counterintuitive. Trust me, I use it in my Git commit hooks, and while I appreciate the all good message every time I get one, trying to decipher what exactly I did wrong from an error message is a bit of a chore (hope you noticed how extremely polite I tried to sound; sometimes it’s a ******* nightmare).
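To see for yourself, feed deliberately broken data into the jsonschema Python module. In this made-up example, a VLAN ID arrives as a string instead of an integer (a classic YAML mishap), and the resulting message tells you the what but not the where – you get to figure out which attribute of which object it’s complaining about:

```python
import jsonschema

# Hypothetical schema: a single integer-valued vlan attribute
schema = {
    "type": "object",
    "properties": {"vlan": {"type": "integer"}},
    "required": ["vlan"],
}

try:
    jsonschema.validate({"vlan": "101"}, schema)
except jsonschema.ValidationError as err:
    print(err.message)          # '101' is not of type 'integer'
```

The ValidationError object does carry the path to the offending element (err.json_path and friends), but the default one-line message is what usually ends up in your face.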
Wrap those error messages in the lovely JSON generated by ansible-playbook when a module fails, and you have the perfect puzzle to keep you awake all night when your playbook crashes on bad data at 2 AM on a Sunday morning. Make sure you set ANSIBLE_STDOUT_CALLBACK to yaml before you start troubleshooting ;)