I finally did a build of my game but ran into a strange problem. I initially selected x86_64 as the architecture, since I had read that you get a slight performance increase with the 64-bit version over the regular 32-bit one. But when I played the game, it ran slower in the standalone build than it did in the Unity editor.
I figured it was the large background image, which moves with a parallax factor of 0.02, so I disabled the background and tried again. This time the game ran fine. To check whether the large image itself was the problem, I then tested with a much smaller image (128 x 128) stretched across the background. Even with such a small image, the game still ran slower than it did in the editor. Then, for the sake of experimentation, I did a build using the 32-bit architecture instead. It ran without any of the major fps hiccups the 64-bit build was having.
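In case it's relevant, the parallax movement is handled by a script along these lines (a simplified sketch, not my exact code; the camera reference and field names here are just placeholders):

```csharp
using UnityEngine;

// Simplified sketch of the parallax background script.
// "cam" and "parallaxFactor" are placeholder names.
public class ParallaxBackground : MonoBehaviour
{
    public Transform cam;              // main camera transform
    public float parallaxFactor = 0.02f;

    private Vector3 lastCamPos;

    void Start()
    {
        lastCamPos = cam.position;
    }

    void LateUpdate()
    {
        // Move the background by a small fraction of the camera's movement
        Vector3 delta = cam.position - lastCamPos;
        transform.position += new Vector3(delta.x, delta.y, 0f) * parallaxFactor;
        lastCamPos = cam.position;
    }
}
```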
After some more experimenting, I found that any image filling the background, regardless of its resolution, causes the game to run slower in the 64-bit build; it's not a problem in the 32-bit build.
This seems counterintuitive to me. From what I understand, 64-bit mainly lets an application use more memory. Why would a standalone build run slower in 64-bit mode than in 32-bit mode? Is there something I'm missing here? Are there cases where 32-bit will run a standalone build better than 64-bit (assuming the computer can run both)? If anyone can shed some light on this, it would be greatly appreciated. I've searched Google, the forums, etc. all day and couldn't find an answer.