I'm sorry I didn't see this question yesterday...
Back in 2003/4 I was one of the SDE/Ts at Microsoft testing the new Filestreams feature in Yukon (SQL Server 2005). One of my responsibilities was to test filesystem performance, to see what would happen when SQL Server started generating large numbers of files in a single directory. At the time, the standard Microsoft guidance was not to put more than ~100,000 files in one directory. I wrote programs that created, deleted, and did random seeks on directories containing millions of files.
I was actually the guy who recommended turning off 8.3 name generation, after my testing found that filesystem performance got exponentially worse after only a few hundred thousand files.
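For anyone wanting to do the same today: 8.3 short-name generation can be queried and disabled with the `fsutil` tool that ships with Windows (run from an elevated prompt). A sketch, assuming a modern Windows version with the `8dot3name` subcommand; note this only stops *new* short names from being generated, it does not strip existing ones:

```
:: Check the current 8.3 name generation setting
fsutil 8dot3name query

:: Disable 8.3 name generation on all volumes (1 = disabled)
fsutil 8dot3name set 1

:: Or disable it for a single volume only
fsutil 8dot3name set C: 1
```

Existing short names on a volume can be removed separately with `fsutil 8dot3name strip`, but test that carefully, since some older applications and registry entries may reference the short paths.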
I remember the Windows guys were shocked when I told them I had written a program that generated over 20 million files in a single directory, with only slight, linear performance degradation.
Ahh, the memories.