At my workplace, end users are not allowed to connect to the LIVE systems via Excel. We don't have any specific rules restricting this, but we advise the analysts not to do it.
If they NEED data, we build a report in SSRS and let them use that. If they need to do analytics on data, they can pull the data from the SSRS report that was created for them to use. It is a bit more restrictive, but it lets the development and administration teams help build efficient queries for the reports.
In the event that our reports don't provide the proper data, we work with the analysts to determine what they need and build a report (or multiple reports) that is efficient and suits their needs.
We strongly encourage everyone NOT to send spreadsheets around to each other (especially ones that can pull "fresh" data); Excel files should live in a centralized location. Why, you ask? Let's say you send the spreadsheet to 3 people. Those 3 people send it to 3 others, who in turn each send it to 3 others. You now have 40 people (1 + 3 + 9 + 27) with that document who could all be hitting the database at the same time to get their report data.

That brings up two problems. First, the performance hit: if that document spreads around too much and a lot of people refresh it at the same time, they are going to hammer the servers. Second, versioning: let's say the document has been shared via email to 1000 people total, and you are aware of only 10 of those 1000 people who use it. You make a change to the document to fix a bad calculation, and now there are 2 versions floating around.

The other fun thing that can happen is Analyst A builds the Excel file by pulling in a lot of data sources and ends up with a lot of advanced T-SQL and Excel magic in there. They understand the file inside and out, but nobody else wants to make changes because it is very complicated. Analyst A quits, and Analyst B gets tasked with fixing a newfound bug but has no idea how it works. They need to spend a bunch of time debugging and decoding the document, and Excel can hide a lot of things from you (calculated cells/columns, VBA, hidden cells/worksheets, etc.), so troubleshooting it may be non-trivial.
Now take a different approach where you have a single SSRS report which is verified and validated. You export the results on a schedule to an Excel file on a network share. Having multiple people open it at once is fine (the first copy will be read-write, the rest will be read-only), and there is no "data refresh", so no hammering of the database. And if changes need to be made to the Excel file, they are made at the SSRS report level, and the next export will contain accurate and valid data.
We have no issue with a data analyst building something up in Excel as a prototype against a test system. But anything that is going to go out to be used by the company needs to be controlled and supportable.
And if it is in SSRS and someone complains that the Excel file is wrong after they do some calculations, they can re-export the data from the SSRS server.
Another approach would be that if the analysts keep hammering the server and bringing it to its knees, they could get a read-only replica of the data warehouse. Then while they are doing the analysis, they are only hurting other analysts and not the entire company.
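If that replica is a readable secondary in a SQL Server Always On availability group, pointing analysts at it can be as simple as having them add `ApplicationIntent=ReadOnly` to their connection string; the listener, port, and database names below are placeholders, not our actual setup:

```
Server=tcp:ag-listener.example.local,1433;
Database=DataWarehouse;
Integrated Security=SSPI;
ApplicationIntent=ReadOnly;
```

With read-only routing configured on the availability group, connections carrying that intent get routed to the secondary automatically, so the analysts' tools don't even need to know the replica's name.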
To summarize the above - our company does it with simple company-wide policies. We don't put systems in place to prevent certain things from happening; we put rules in place, and employees follow them to avoid getting the dreaded call from the DBA team asking what they are doing that is bringing the system down. We have monitoring in place for long-running transactions and for blocking/deadlocks, so if someone starts blocking for a long period of time, they may get a phone call, or (depending on what is being done and how badly it is hurting things) they may just get their SPID killed, followed up by an email.
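The kind of check that drives those phone calls can be sketched with SQL Server's DMVs. This is an illustrative query, not our actual monitoring (the 60-second threshold is an example), but the views and columns are standard:

```sql
-- Find requests that are blocked, or that have been running a long time,
-- along with the SQL text they are executing.
SELECT r.session_id,
       r.blocking_session_id,   -- 0 means not blocked
       r.wait_type,
       r.wait_time,             -- ms spent in the current wait
       r.total_elapsed_time,    -- ms since the request started
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0
   OR r.total_elapsed_time > 60000;  -- example: longer than 60 seconds

-- If a session has to go, the DBA can terminate it by SPID:
-- KILL 123;  -- then send the follow-up email, per the policy above
```

In practice you would wrap something like this in an alert (SQL Agent job, monitoring tool, etc.) rather than running it by hand.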