At my workplace, we use SSIS to pull data across from "live" systems into our reporting database. Sometimes it is a 1:1 copy of the data; sometimes we transform it for consumption by reports. The loads are scheduled at different frequencies depending on report need and load duration: some SSIS packages run nightly, some hourly, some daily at 2:00 PM, and others every Friday. The nightly load runs during company downtime, so it can afford to be slow (it sometimes takes hours). The hourly load runs during company uptime, so it needs to be fast - we have a 5 minute time limit on it, and if it exceeds that, the DBA team gets notified and we look into why it is running slow. The 2 PM and Friday runs are always fast (under a minute), so we don't monitor those.
So it really depends on the need. If the report is run hundreds of times per minute, having the data refreshed nightly is probably the safe bet - an hourly refresh would interrupt the reports given how we do our SSIS load (truncate and reload each time rather than a merge approach). BUT it does depend on whether end users need "live" data or snapshot data. For analytical reporting, snapshot data is usually what you want, but if your end users expect changes they make in the live system to appear immediately, you will need a different approach than SSIS. In that case, some options I can think of are: replication (I've not used it, but I think it should work); Service Broker (I've used it - a cool tool when it works, but it can be challenging to debug when it acts up); an in-house or 3rd party app that monitors for changes and pushes them across (I'm not sure of any that exist, but I am sure they do); and possibly other methods I am not thinking of.
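To make the interruption point concrete, here is a small Python/SQLite sketch of the difference between truncate-and-reload and a merge/upsert-style load (the table and column names are made up for illustration, and SQLite's `ON CONFLICT` upsert stands in for a T-SQL `MERGE`). With truncate-and-reload, any report that queries during the reload window sees an empty table:

```python
import sqlite3

# Hypothetical reporting table -- names are made up for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE report_sales (id INTEGER PRIMARY KEY, amount REAL)")
con.execute("INSERT INTO report_sales VALUES (1, 100.0), (2, 200.0)")

# Truncate-and-reload: between the delete and the reload, a report
# query sees an empty table -- this is the "interruption" window.
con.execute("DELETE FROM report_sales")
mid_load = con.execute("SELECT COUNT(*) FROM report_sales").fetchone()[0]
con.execute("INSERT INTO report_sales VALUES (1, 110.0), (2, 200.0), (3, 50.0)")

# Merge/upsert approach: rows are updated or inserted in place, so the
# table is never empty while the load runs.
con.execute(
    "INSERT INTO report_sales VALUES (3, 75.0) "
    "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount"
)
after_merge = con.execute("SELECT COUNT(*) FROM report_sales").fetchone()[0]
print(mid_load, after_merge)  # 0 rows during the reload window, 3 after the merge
```

The trade-off is that the merge approach is more code (and more to get wrong) per table, which is part of why a simple truncate-and-reload is attractive when the refresh window doesn't overlap report usage.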
Yikes... 3 hours to fix what should be an easy problem is never fun. I understand having to work with different groups, but when something critical is busted, I like to get everyone who can help into one room, with laptops for all, so they can troubleshoot together - much faster to get things solved, and with laptops they can still RDP to their work boxes to do their jobs. Mind you, I have only been involved in 1 major outage like that, I was not the coordinator, and we didn't have a big room. It was emails, phone calls, and phone tag, and it was slow and painful to find and fix the problem. There was also a lot of "We didn't change anything so it must be something you did" - finger pointing doesn't help. In the end, the database side needed no changes, so I didn't end up fixing anything; it was entirely the application team, who released an update, told us they didn't, then later rolled the update back, which fixed the problem.
I feel bad you have to work with someone like that. There are times when I am sure I have unintentionally done things like that - I've merged stuff in git that blew away others' work, but it was by accident and I worked with them to recover it. People who go out of their way to make everyone else's job harder shouldn't have that job (my opinion). If you can't be a team player, then don't work on projects with others. And, again my opinion, a C# console app that exists JUST to create files and folders with specific names is definitely overkill. That feels like a job for PowerShell, or even a bat or cmd file.
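For scale, the whole job is a few lines in any scripting language. Here is a sketch in Python (the folder names are hypothetical) - PowerShell's `New-Item -ItemType Directory` or a couple of `mkdir` lines in a cmd file would be just as short:

```python
from pathlib import Path

# Hypothetical project layout -- the names here are made up for illustration.
base = Path("ProjectX")
for sub in ("Docs", "Scripts", "Backups"):
    (base / sub).mkdir(parents=True, exist_ok=True)  # no error if it already exists

created = sorted(p.name for p in base.iterdir())
print(created)  # ['Backups', 'Docs', 'Scripts']
```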
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always try changes on a test server first to see if there are any negative side effects before touching live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.