I think that's quite a complex solution to an unnecessarily complex problem. I don't see why a report is needed at all - it seems to me that the reporting is only necessary because the existing file replication mechanism across servers is unreliable. To my mind, the core issue is ensuring that the file tree is identical across servers.
If the aim is to ensure that parts of the directory tree contain files that are the same across servers, there are a number of other ways to crack the nut:
1) Where you have hundreds of thousands of small files, it's much faster to replace the entire target folder structure by running GnuWin32 tar over the source tree and un-tar'ing the stream at the destination. This is MUCH faster than copying the tree file by file, as no acknowledgement is needed after each copy; when the files are small and numerous, it also beats comparing date-stamps. It works especially well with compressed images, since tar makes no attempt to re-compress the (incompressible) image data. This method works best across a high-bandwidth, low-latency connection such as a datacentre LAN.
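To make the idea concrete, here's a minimal sketch of the tar-stream copy. The folder names are hypothetical stand-ins (a temp directory plays the part of the source and target shares):

```shell
set -e
# Hypothetical demo tree; in practice SRC would be the live folder
# on the source server and DEST the replica on the target.
SRC=$(mktemp -d) ; DEST=$(mktemp -d)
mkdir -p "$SRC/images"
printf 'hello' > "$SRC/images/a.jpg"
printf 'world' > "$SRC/readme.txt"

# Stream the whole tree as a single tar archive and unpack it at the
# destination in one pass - no per-file acknowledgements, which is
# why huge numbers of small files move so much faster this way.
tar -C "$SRC" -cf - . | tar -C "$DEST" -xf -
```

Note there is no -z here: the stream is deliberately uncompressed, so already-compressed images are not wastefully re-compressed.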
2) Another mechanism is the GnuWin32 md5sum command, which can create a digest of MD5 sums for the files in a directory. Copy the digest to the target folder and, using the -c (check) flag, verify the folder's contents against it. The plus here is that the actual file contents are checked without moving much data across the network, so this option suits large files and confirms that the contents are truly identical. The downside is that md5sum has no recursive option, so a digest has to be created per folder, which makes it awkward on a directory tree; creating and checking the digests is also slower than comparing date-stamps. The upside is that this method works particularly well where the bandwidth and latency between servers are poor.
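A quick sketch of the digest round-trip, again with hypothetical stand-in folders (only the small digest file would cross the network, not the file contents):

```shell
set -e
# Hypothetical folders standing in for the source and target servers.
SRC=$(mktemp -d) ; DEST=$(mktemp -d) ; DIGEST=$(mktemp)
printf 'payload-1' > "$SRC/a.dat"  ; printf 'payload-2' > "$SRC/b.dat"
printf 'payload-1' > "$DEST/a.dat" ; printf 'payload-2' > "$DEST/b.dat"

# On the source server: build a digest of every file in the folder.
( cd "$SRC" && md5sum *.dat > "$DIGEST" )

# Copy the digest to the target server, then verify the folder
# against it; -c re-hashes each local file and reports OK or FAILED.
( cd "$DEST" && md5sum -c "$DIGEST" )
```

md5sum -c exits non-zero if any file fails the check, so the same pattern drops straight into a scheduled batch job.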
3) The built-in FC (file compare) command also compares file contents, but being a single process reading whole files across the LAN, it will be slow.
4) PowerShell has the Compare-Object cmdlet, which can be placed in a script that loops through the folder structure; again, this compares file contents. Alternatively, a PowerShell script can perform a date-stamp and file-size comparison, similar to the article's approach.
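The PowerShell script itself isn't reproduced here, but the same metadata-only idea can be sketched with the GnuWin32-style find and diff tools (folders and file names are hypothetical):

```shell
set -e
# Hypothetical replica folders for illustration.
SRC=$(mktemp -d) ; DEST=$(mktemp -d)
LST1=$(mktemp) ; LST2=$(mktemp)
printf 'same' > "$SRC/keep.txt"  ; printf 'same'    > "$DEST/keep.txt"
printf 'new'  > "$SRC/stale.txt" ; printf 'changed' > "$DEST/stale.txt"

# List name and size for every file on each side, sort, and diff the
# two listings - a cheap metadata-only comparison in the spirit of the
# article's date-stamp/size check (add %T@ to the -printf format to
# bring modification times into the comparison as well).
( cd "$SRC"  && find . -type f -printf '%p %s\n' | sort ) > "$LST1"
( cd "$DEST" && find . -type f -printf '%p %s\n' | sort ) > "$LST2"
diff "$LST1" "$LST2" || echo "trees differ"
```

No file contents cross the wire at all; only the two small listings are compared.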
5) Finally, trusty robocopy can be used to perform a selective copy of changed files only (for example, robocopy with the /E and /XO switches copies the subtree while skipping files that are older than, or identical to, the destination copies).