I have a desktop app running on a Windows server that users access via Remote Desktop. On a non-virtualized server or a regular PC, performance is reasonably fast; however, on a virtualized Windows server it becomes significantly slower.
Is there a way to force software rendering so the app doesn't depend on the virtualized GPU?
Any suggestions for improving the performance of a Xojo app in virtualized environments?
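
To clarify what I mean by "software rendering": in a plain Win32/Direct3D application, the usual way to take the GPU out of the picture is to create the Direct3D device with the WARP software rasterizer. The C++ sketch below is purely illustrative, not Xojo code, and I don't know whether Xojo exposes anything equivalent; it just shows the kind of switch I'm hoping exists.

```cpp
// Illustrative only: forcing software rendering in a Win32/Direct3D app by
// requesting the WARP (CPU-based) rasterizer instead of the virtualized GPU.
// A Xojo desktop app does not expose this device-creation step directly; this
// just demonstrates the underlying Windows mechanism I'm asking about.
#include <windows.h>
#include <d3d11.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D11Device> device;
    ComPtr<ID3D11DeviceContext> context;
    D3D_FEATURE_LEVEL obtained = {};

    // D3D_DRIVER_TYPE_WARP requests the software rasterizer, so rendering no
    // longer depends on the (slow) virtualized GPU. The BGRA flag is what a
    // Direct2D-based UI layer would need to interoperate with this device.
    HRESULT hr = D3D11CreateDevice(
        nullptr,                          // default adapter (ignored for WARP)
        D3D_DRIVER_TYPE_WARP,             // force the software rasterizer
        nullptr,                          // no software DLL needed for WARP
        D3D11_CREATE_DEVICE_BGRA_SUPPORT, // allow Direct2D on this device
        nullptr, 0,                       // default feature levels
        D3D11_SDK_VERSION,
        device.GetAddressOf(),
        &obtained,
        context.GetAddressOf());

    if (SUCCEEDED(hr))
        std::printf("WARP device created, feature level 0x%x\n", obtained);
    else
        std::printf("D3D11CreateDevice failed: 0x%08lx\n", hr);

    return SUCCEEDED(hr) ? 0 : 1;
}
```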