We actually have a similar setup where I work, and it took a lot of trial and error to get things working optimally.
Even after we thought we had things set properly, we had an SSIS job go crazy. It had been run about 100 times, and every run completed within a few minutes, which was within our acceptable window. Then one day a user complained that there was no data in the table he was looking at. I looked at the SSIS job and it had been running for 8 hours.
SSIS and SSRS operate in their own memory space outside of the database engine. I am not sure how SSAS handles its memory.
Our "window" was roughly 18 GB of memory left free for SSIS, SSRS, the OS, and application overhead.
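If you end up capping the engine to protect that kind of window, here is a rough sketch of the sort of thing we do. The numbers (a 64 GB box, 18 GB headroom) and the localhost/default-instance connection are just placeholders for the example; Invoke-Sqlcmd comes from the SqlServer PowerShell module:

```powershell
# Sketch only: cap max server memory so the headroom stays free for
# SSIS, SSRS, the OS, and application overhead.
# The 64 GB total and localhost/default instance are assumptions for the example.
Import-Module SqlServer

$totalMB    = 64 * 1024   # physical RAM on the box (assumed)
$headroomMB = 18 * 1024   # window for SSIS/SSRS/OS/app overhead
$maxMB      = $totalMB - $headroomMB

$tsql = @"
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', $maxMB;
RECONFIGURE;
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Query $tsql
```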
If you have a similar test environment, I would recommend doing a simulated load on that environment and watching perfmon to see how your memory handles things.
If you don't have a test environment, I would try to get a window in which you can do some testing. I expect your highest-memory SSIS packages are going to be in your ETL load, so I'd run that while perfmon is capturing counters so you can see the memory hit the ETL load causes. Then it probably wouldn't hurt to run your other SSIS packages too, just in case the ETL load isn't actually the biggest memory hog.
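For what it's worth, this is roughly the kind of perfmon capture I'd leave running during the ETL window (Windows PowerShell). The counter paths assume a default SQL Server instance and packages hosted by DTExec.exe, so adjust for a named instance or for ISServerExec.exe if you run from the SSIS catalog:

```powershell
# Sample the key memory counters every 15 seconds for an hour while the ETL runs.
# Counter paths assume a default instance and DTExec-hosted packages (adjust as needed).
$counters = @(
    '\Memory\Available MBytes',
    '\SQLServer:Memory Manager\Total Server Memory (KB)',
    '\Process(DTExec*)\Private Bytes'
)

Get-Counter -Counter $counters -SampleInterval 15 -MaxSamples 240 |
    Export-Counter -Path 'C:\temp\etl_memory.csv' -FileFormat CSV
```

Afterwards you can open the CSV (or a BLG file if you prefer that format) and see exactly where the memory went during the load.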
As for SSRS, we don't have any huge reports; most don't even have any visualizations, just table outputs. So it depends on what your reports look like. I would pick a few representative reports and look at what perfmon says about memory while they run.
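Same idea as above for SSRS: kick off the reports you picked and watch the report server process while they render. The process name below is what a recent standalone SSRS install uses; older versions may differ:

```powershell
# Watch the SSRS service process while the selected reports render.
# Process name assumes a recent SSRS install (ReportingServicesService.exe).
$ssrsCounters = @(
    '\Process(ReportingServicesService*)\Private Bytes',
    '\Process(ReportingServicesService*)\Working Set'
)

Get-Counter -Counter $ssrsCounters -SampleInterval 5 -MaxSamples 120 |
    ForEach-Object { $_.CounterSamples | Select-Object Path, CookedValue }
```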