Both rising and increasingly volatile energy prices are strong incentives for manufacturing companies to become more energy-efficient and flexible. A promising approach is the intelligent control of Industrial Energy Supply Systems (IESS), which provide various energy services to industrial production facilities and machines. Due to the high complexity of such systems, widely used conventional control approaches often lead to suboptimal operating behavior and limited flexibility. Advancing digitization of industrial production sites offers the opportunity to implement new, advanced control algorithms, e.g. based on Mixed Integer Linear Programming (MILP) or Deep Reinforcement Learning (DRL), to optimize the operational strategies of IESS. This paper presents a comparative study of different controllers for optimized operating strategies. For this purpose, a framework is used that allows for a standardized comparison of rule-based, model-based, and data-based controllers by connecting them to dynamic simulation models of IESS of varying complexity. The results indicate that controllers based on DRL and MILP have great potential to reduce energy-related costs, by up to 50% for less complex systems and by around 6% for more complex ones. In some cases, however, both algorithms still show unfavorable operating behavior with respect to indirect costs, such as violations of temperature and switching restrictions, depending on the complexity and general conditions of the systems.