Let's just say that the developer may have said there were no bugs, but his users didn't. (Unless they "tested" using production data.) The other thing this tells me is that the developer made some forms that look just like the paper he was replacing.
Good software does more than the typical user can grasp, and does it more efficiently than they would come up with on their own. Most UIs put way too much on the screen, and yet users typically want more than is there. A developer can't move them to the next level without making mistakes, because the developer can't completely understand the problem.
Quite simply, software is, has always been, and will always be an iterative process. If you have people who will start using the software in bits (in production, where they really understand it), then you can discover and tweak as you go. But users can't replicate that process without actually working in production. Nor can a few testers predict what that one user (everyone has one of those users) will somehow do in spite of your best efforts to make them do what you want. That user will even want to do the very thing you designed for, but will ALWAYS find a way to muck it up. And that IS a bug.
One thing that would help is an awareness of what is "good enough," combined with getting others to understand it too. I was lucky to work on a project for a manager who wanted good enough. And he (and the users) were very happy with the results, knowing about the bugs. We quickly resolved the worst of them and slowly took out most of the rest.