I think postcodes aren’t the best example when it comes to data quality assessment rules. Even if one succeeds in reverse-engineering the postcode rules, the next time the postal system changes, the rules need to be changed as well. There are also cases in which the postcode system changes from the ground up. Imagine having to code rules for postcodes from all over the world!
At least as far as postcodes and address information are concerned, one should use, when possible, an address validator or raw data provided by an authorized entity. I know that such services exist. If possible, such address validators should be implemented directly in the source systems. Whether this pays off depends also on the number of new addresses added each year: sometimes a validator is cost-effective, other times it isn’t.
The same approach should also be used for credit cards. Checking only the prefix and length of a credit card number doesn’t help much, though it’s a step toward assessing its quality.
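One cheap check that goes beyond prefix and length is the Luhn checksum, which all major card networks use. It won’t tell you whether a card actually exists (only the issuer can), but it catches typos and most obviously invalid numbers. A minimal sketch in Python:

```python
def luhn_valid(number: str) -> bool:
    """Validate a card number with the Luhn checksum.

    This only detects malformed numbers (typos, transpositions);
    it cannot confirm that the card really exists or is active.
    """
    digits = [int(ch) for ch in number if ch.isdigit()]
    if len(digits) < 2:
        return False
    checksum = 0
    # From the right: double every second digit, subtracting 9 when
    # the doubled value exceeds 9, then sum everything.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # well-known test number, passes
print(luhn_valid("4111 1111 1111 1112"))  # one digit off, fails
```

As with addresses, for real verification you would still defer to the payment processor; the checksum just filters out garbage early.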
Usually I assess data quality based on the following dimensions: duplication (aka uniqueness), completeness, consistency, conformity, accuracy, integrity and, possibly, timeliness.
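Several of these dimensions can be measured with simple ratios over the data set. As an illustration (the sample records and field names are made up), here is how completeness and duplication could be scored in Python:

```python
from collections import Counter

# Hypothetical customer records with a missing value and a duplicate email
records = [
    {"id": 1, "email": "a@example.com", "postcode": "10115"},
    {"id": 2, "email": "b@example.com", "postcode": None},
    {"id": 3, "email": "a@example.com", "postcode": "80331"},
]

def completeness(rows, field):
    """Share of rows with a non-empty value for the given field."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def duplication(rows, field):
    """Share of rows whose value for the field occurs more than once."""
    counts = Counter(r.get(field) for r in rows)
    return sum(1 for r in rows if counts[r.get(field)] > 1) / len(rows)

print(round(completeness(records, "postcode"), 2))  # 2 of 3 postcodes filled
print(round(duplication(records, "email"), 2))      # 2 of 3 rows share an email
```

Dimensions like accuracy or timeliness need reference data or timestamps, so they can’t be scored from the records alone in this way.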
I have a series of posts on this theme: http://sql-troubles.blogspot.de/search?q=data+quality