I think old vs. new is not the major point in whether or not to pick up some skill or tool; the key point is whether it's relevant. If the project were data integration in a UNIX environment, I would expect any developer to have an understanding of basic shell scripting, including common commands from ps and netstat to vi(m), as well as the underlying concepts of process communication, environment variables, etc. That really hasn't changed all that much in the last 30 years, and accessing the Linux/UNIX back end via a terminal/shell is still very common. Even in big-data projects, awk, sed, and similar tools often do a better job (both in terms of performance and especially in terms of project cost) on many tasks where people without much background would start coding something in a high-level language. So yes: these skills should be readily available, even if you can learn 95% of them from 30-year-old books.
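As a minimal sketch of the kind of task meant here (the log file and its format are hypothetical): summing a numeric column is a one-liner in awk, where someone unfamiliar with the classic tools might write a whole program in a high-level language.

```shell
#!/bin/sh
# Hypothetical access log: method, path, and byte count per line.
printf '%s\n' \
  'GET /index.html 1024' \
  'GET /style.css 512' \
  'GET /app.js 2048' > /tmp/access.log

# awk accumulates field 3 across all lines and prints the total.
awk '{ total += $3 } END { print total }' /tmp/access.log
# prints 3584
```

The same idea scales to filtering, joining, and reformatting large text streams, which is why these tools still earn their keep on big-data-adjacent work.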
On the Windows side, DOS batch scripting seems a lot less important to me. It used to be that without understanding command.com and autoexec.bat there really wasn't all that much you could do with an IBM-compatible PC. I'm not sure most "digital natives" really know where the term "IBM compatible" comes from, or whether that's relevant for more than trivia; you can get by fine with "Windows PC". Still, it can be helpful to understand some of what you can do with cmd.exe and programs like netstat, tracert, and similar, but a standard Windows server environment offers far more GUI-based tools than you would expect on the UNIX side.
So yes: as long as it is relevant to the application stack you are running, a basic understanding of the components and interfaces should be mandatory, and not just for admin-type people. But the mere fact that some "old" technology is still around because nobody has gotten around to removing it from the standard installation yet is not a good reason to learn it.