Due to the increasing computational workloads and thermal design power (TDP) requirements of high-power-density microelectronics, the low heat-carrying capacity and poor thermal conductivity of air render air cooling insufficient to meet the heat-removal demands of high-performance servers. A more effective method of removing heat from these high-powered components is single-phase immersion cooling with a dielectric fluid that has superior thermal properties and a high boiling point. This study uses CFD simulations to compare traditional forced-air cooling with forced-convection single-phase immersion cooling for minimizing chip junction temperatures in a 776 W high-power data center server. The server has a spread-core configuration consisting of two CPU heatsink assemblies and 32 DIMM units, each with its specified chip TDP. The first method is forced-air cooling with a 28 °C inlet air supply at a 110 CFM flow rate, which establishes the baseline thermal performance. The second method is forced-convection single-phase immersion cooling of the server in EC-110 dielectric fluid at a 28 °C inlet temperature and a 2 GPM flow rate, to quantify the improvement that immersion cooling provides in CPU case temperatures, maximum DIMM temperature, and server pressure drop. Lastly, CFD simulations are performed at fluid inlet temperatures of 30, 40, and 50 °C with a 2 GPM inlet flow rate, and the percentage changes in CPU case temperatures, server pressure drop, and maximum DIMM temperature with fluid inlet temperature are studied.