Can I make a 64-bit game in Unity?
I am currently testing 8 players on one computer, and I'm shaking my head looking at how my one processor core is overloaded and the others are practically idling.
I clicked into "set affinity" and all cores are selected. The reason I think this is not working right is the program is running in 32-bit. Is there any way I can compile the game into a 64-bit program?
The program running in 32-bit has nothing to do with Unity not using more than one core. 32-bit and 64-bit programs are both able to use multiple processors. The only thing that will happen running a 64-bit build is the game will be able to utilize more than 4 gigabytes of RAM.
Sadly, there is nothing you can do about this. Unity currently doesn't properly utilize multiple processing cores, and according to everything the Unity team has said about the issue, it probably never will.
@Jason B: not sure where you got that idea; Unity 3.5 has increased support for multiple cores, and they will continue to add to that where feasible. Also, your own code can use multiple cores if you have anything that can be parallelized effectively.
The only times I've ever seen threading brought up (even right on this site), the answers are always "Unity can't do it" and "You might be able to, but it'd be extremely difficult even to thread something simple", on top of the fact that I've never once seen code samples for how threading inside Unity would work (for instance, if I want one handful of objects to only run their scripts on core 2). Also, I'm not even sure what the 3.5 notes meant by "offloading rendering to multiple cores"; I thought my GPU was doing the rendering, not my CPU.
Great deal of confusion all-around. :P
I think you must have misinterpreted those answers, because "Unity can't do it" is certainly completely untrue, and while threading isn't trivial, it's not hard to thread something simple. (See here for a script that scales textures using multiple cores. Normally I'd link to the wiki, but it's down.) The GPU renders the pixels, but a significant amount of setup has to be done on the CPU, such as submitting meshes for rendering. e.g., draw calls are purely a function of CPU. It's really not very confusing.
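To make the "thread something simple" point concrete, here is a minimal sketch in the spirit of the texture-scaling script mentioned above (this is not that script; `ParallelFill` and its fill math are made up for illustration). It splits per-element work across plain `System.Threading` threads and joins them before any Unity API would be touched:

```csharp
using System.Threading;

public static class ParallelFill
{
    // Fills 'data' using 'workers' threads, each taking a contiguous slice.
    // The work inside each thread is pure math: no Unity API calls.
    public static void Run(float[] data, int workers)
    {
        var threads = new Thread[workers];
        int chunk = data.Length / workers;

        for (int w = 0; w < workers; w++)
        {
            int start = w * chunk;
            // The last worker picks up any remainder.
            int end = (w == workers - 1) ? data.Length : start + chunk;
            threads[w] = new Thread(() =>
            {
                for (int i = start; i < end; i++)
                    data[i] = i * 0.5f;   // stand-in for per-pixel math
            });
            threads[w].Start();
        }

        // Wait for all workers before handing results back to Unity code.
        foreach (var t in threads) t.Join();
    }
}
```

Applying the result to a `Texture2D` (or anything else in the Unity API) still has to happen afterward on the main thread.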
I suppose my point with the threading is that if you can't thread anything Unity-related, it brings the practical applications of threading down significantly (for instance, it'd be of great use to be able to do all your raycasting on a separate processor if you have a lot of things casting rays, like AI).
But with no Unity functions being thread-safe, it stamps most game-related uses out of the equation and limits you to doing raw math, and if there's no way to translate that math back into a Unity function, you can't thread it. There's no way to intercept the calculations a raycast does and have the math done through a thread, for instance. Obviously rescaling a texture is one of the few instances where you can get use out of threading. For most games though, multi-core processing "can't be done" (at least for all intents and purposes) just due to the fact that 95% of what you'd want to offload to another core isn't thread-safe. Now if you're making a game where a lot of mathematical calculations are taking place, with a lot of dynamic stuff going on, then sure, you can thread all you want and go crazy.
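The "raw math on a worker thread" pattern described above can still feed back into Unity: capture plain data on the main thread, compute on a background thread, then apply the result from `Update()`. A hedged sketch (the class name and `HeavyCalculation` are hypothetical, and this requires the Unity runtime to run):

```csharp
using System.Threading;
using UnityEngine;

public class BackgroundMath : MonoBehaviour
{
    float result;        // written by the worker, read on the main thread
    volatile bool done;  // simple completion flag

    void Start()
    {
        // Read Unity state on the main thread first; the Unity API
        // (e.g. transform.position) is not safe to touch from a worker.
        Vector3 start = transform.position;

        new Thread(() =>
        {
            // Pure, thread-safe math only in here.
            result = HeavyCalculation(start.x, start.z);
            done = true;
        }).Start();
    }

    void Update()
    {
        if (done)
        {
            done = false;
            // Back on the main thread, so Unity calls are safe again.
            transform.Translate(0f, result, 0f);
        }
    }

    static float HeavyCalculation(float a, float b)
    {
        // Stand-in for expensive work, e.g. pathfinding cost math.
        return Mathf.Sqrt(a * a + b * b);
    }
}
```

This doesn't make raycasts themselves threadable, but it covers the "do the math elsewhere, apply it in Unity" cases the comment describes.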
Anyways, I know you probably know 10 times more than I do about all this, I just figured I'd throw my hat into the ring. And the only reason I'm commenting to death on this poor guy's question is because I think what he really wanted was to thread his processor-heavy stuff to other cores. Which I'm sure he'll find out once he realizes his 64-bit build is still making him sad. lol
Answer by Grady · Jan 14, 2012 at 12:21 AM
Yes, @Hordaland is exactly right,
When in File > Build Settings > PC and Mac Standalone, under the "Target Platform" dropdown box, there will be: Windows, Windows 64-bit, and Mac OS X.
This is just a slightly more detailed version of @Hordaland's answer, but he is exactly right!
Hope this helps you
-Grady
I can't believe I missed that. I had 32-bit selected. It should at least give me a little boost with the rendering dropped onto a separate core, as I'm almost certain there is no bottleneck with my GPU. Thanks! -and everyone else too.
32-bit vs. 64-bit shouldn't make any significant difference in performance. There are only a few scenarios where 64-bit gives notably better performance (like some complex scientific calculations). In games, you won't notice much, if anything at all.
Actually, it ran a lot better for some reason. Back in 32-bit, I had trouble after opening the game more than 6 times; it kept getting random freezes. In 64-bit, I didn't notice any slowdowns whatsoever. Although I checked, and that one core was still maxed out.
Answer by Eric5h5 · Jan 14, 2012 at 12:52 AM
Being 64-bit or not has nothing to do with the number of cores used. You can't force a program to be multi-threaded if it wasn't programmed that way. That said, Unity 3.5 does have some improvements in that area, such as rendering being in a separate thread.
Answer by Hordaland · Jan 13, 2012 at 11:32 PM
When choosing "target platform" in build settings you can choose Windows 64-bit. I guess that should work.
I'm lost with the Unity3D compiler. To me, as a non-expert, it looks like it compiles for 64-bit with a 32-bit memory limit.