
I'm profiling my game with the Unity Profiler, and in the Hierarchy tab I can see 18.3 KB of GC allocation attributed to NetworkIdentity.UNetStaticUpdate().


I assume this is happening inside the Unity UNet API. Is there a workaround to avoid it? And why would an internal Unity call cause GC allocations?

  • Can you clarify: are you seeing these 18 kB allocations every frame / most frames, or only in spikes here and there? Commented Nov 1, 2018 at 15:22
  • Yes, it's actually a spike that happens every 100 frames, but it's inside a Unity call so I don't know how to avoid it. Commented Nov 1, 2018 at 22:18
  • Does anything significant happen in your game on a similar interval? Is anything being created/activated/deactivated or moving between sleep & wake states? Commented Nov 1, 2018 at 22:20
  • Hmm, the interesting part is that there's no timing spike or unusual behaviour in that frame. Of course, I can investigate this further. Commented Nov 1, 2018 at 22:24

1 Answer


C# is a garbage-collected language. A GC.Alloc sample just tells you that something allocated managed memory; it is not surprising that internal Unity calls allocate.

https://docs.unity3d.com/Manual/BestPracticeUnderstandingPerformanceInUnity4-1.html

In Unity’s CPU Profiler, the Overview has a “GC Alloc” column. This column displays the number of bytes allocated on the managed heap in a specific frame. (Note: this is not identical to the number of bytes temporarily allocated during a given frame. The profiler displays the number of bytes allocated in a specific frame, even if some/all of the allocated memory is reused in subsequent frames.) With the “Deep Profiling” option enabled, it’s possible to track down the method in which these allocations occur.
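You can't change what UNetStaticUpdate() allocates internally, but you can often shrink the allocations your own networking code contributes in the same frames by reusing buffers instead of constructing new message objects per send. A minimal sketch using the UNet HLAPI's NetworkWriter (the class name PositionSender and the msgType parameter are illustrative, not part of UNet):

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch: reuse a single NetworkWriter for every send instead of
// allocating a new message object and byte buffer each time, so
// steady-state sends from this class add no new managed allocations.
public class PositionSender : NetworkBehaviour
{
    // Allocated once and reused; StartMessage() resets it before each use.
    static readonly NetworkWriter s_writer = new NetworkWriter();

    // msgType is a hypothetical custom message id registered elsewhere.
    public void SendPosition(NetworkConnection conn, short msgType)
    {
        s_writer.StartMessage(msgType);      // reset and write the header
        s_writer.Write(transform.position);  // payload
        s_writer.FinishMessage();            // patch in the message length
        conn.SendWriter(s_writer, Channels.DefaultReliable);
    }
}
```

Whether this helps depends on where Deep Profiling says the allocations actually originate; if they are entirely inside UNet's own update loop, spreading sends across frames or lowering the send rate may be the only lever.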

  • By my reading, it looks like OP already knows what garbage collection is and how to read the profiler. It's not unusual for Unity game developers to try to minimize or even eliminate GC allocations when their game is in a steady state (without new entities being created), and I've talked with devs who say they've achieved this goal (though I haven't profiled their games to confirm). So, rather than stop at "well, garbage happens," I think it's reasonable to look for constructive solutions to change the way we use UNet to reduce unnecessary GC allocations to a minimum. Commented Nov 1, 2018 at 14:50
  • But if Unity is doing this internally, there's nothing you can do about it, right? And it's not just "garbage happens", it's "allocation happens". Commented Nov 1, 2018 at 15:13
  • I don't think it's a foregone conclusion that we can't do anything about it. Unity is highly configurable, and many of its APIs can be used in multiple ways. I don't consider it outside the realm of possibility that by setting up their networking differently, the user can reduce the allocations resulting from their use of this system during steady-state play. (To give an example, in an early prototype we once found we were allocating 1 MB every frame while performing a particular action. Yes, this was allocating in Unity code, but by calling it differently we eliminated this trash.) Commented Nov 1, 2018 at 15:21
  • Ah, interesting. :) Commented Nov 1, 2018 at 17:01
