Unity general questions

Comments

  • As far as I know, graphics cards don't read PNGs, so Unity has to convert them into a file format that they do read easily/quickly. If you can avoid using the alpha/transparent channel, the banding will likely look better (because there are more bits for the 3 remaining channels) when you compress them. Otherwise, you may just need to have either noisier textures (which hides some of the banding), or flatter colours (no gradients, so no banding).

    If you're targeting Adreno/Tegra devices, you get access to better-looking texture compression too (ATC or DXT), but obviously not all Android devices can use those.
  • edited

    The PNGs come out at about 137kb, but once imported into Unity, they bounce up to 1.3mb.
    I've tried compressing them, but I end up with a lot of banding.
    GPUs have no concept of file formats like PNG, JPG, etc. GPU manufacturers make use of various (often proprietary) compression formats that textures need to be in if you want them to be optimal for the GPU. You can otherwise go ahead and use 32 or 16 bit RGBA images that are not compressed at all, which will generally look better, at the cost of more memory and texture bandwidth consumed when rendering.

    What you're seeing above is because you have your Android settings set to ETC texture compression (under the Project Settings window I believe...). ETC compression is supported by *all* Android devices... whereas the other Android compression formats (DXT, ATC, etc.) only work on GPUs that support them. So generally speaking, it's much easier to ship an Android title only worrying about ETC compression. The downside here is that ETC compression doesn't support alpha channels at all, so you end up getting the banding that you're seeing because Unity will set a 'Compressed' RGBA texture under ETC settings to RGBA16, which causes a lot of fine color depth to be lost, generally in cases of soft gradients and falloffs.

    Some per-case solutions to this could be:

    a) suck it up and ship with the banding; there's a chance that what you see in the editor may not even be very visible on device
    b) switch from Compressed to Truecolor for textures that band, which will use up double the memory
    c) same as b) but see if you can reduce the resolution to compensate
    d) re-design your image to remove soft falloffs in the RGB or A channels, and stick to well defined color sections that won't make color loss from the quantization very apparent
    e) break up your texture into 2 new textures, both RGB only (with the Alpha channel of the original texture now a grayscale image), set both textures to Compressed, and write a custom shader that will use the 2nd texture as an alpha channel for blending operations


    There's a ton of Googling you can do to understand texture compression, how it works, and its limitations, and you can start here in the Unity Manual where it talks about Getting Started with Android Development (near the bottom under the heading 'ETC as Recommended Texture Compression').

    Good luck; I hope you come right!
  • I'm not 100% certain about this, but something else that might be driving up your texture memory consumption is automatic mipmapping. Chances are you don't need mipmaps if your game is 2D with little/no zooming. Turn mipmapping off per texture by changing it to an advanced texture and you'll see the option.
  • dislekcia said:
    I'm not 100% certain about this, but something else that might be driving up your texture memory consumption is automatic mipmapping. Chances are you don't need mipmaps if your game is 2D with little/no zooming. Turn mipmapping off per texture by changing it to an advanced texture and you'll see the option.
    Yeah, I had a similar issue last week when I moved to Unity 5. Create Mipmap is a default setting now when you import sprites. Just untick it in the inspector when you slice your 2D sprites. Also, have a look at the quality settings (EDIT > PROJECT SETTINGS > QUALITY). This also helped me bring my texture memory consumption down considerably. Unity 5 seems to do a lot of stuff as defaults that it did not always do. With 2D animator transitions it also automatically puts in an exit time and transition time, which messed me around for quite a while before I noticed it. Does anyone else know if there is a place where you can change what these default settings should be? Also, use the profiler, which is great for seeing where things are happening and to what extent. So awesome that this is now available in the free version :)
  • dislekcia said:
    I'm not 100% certain about this, but something else that might be driving up your texture memory consumption is automatic mipmapping. Chances are you don't need mipmaps if your game is 2D with little/no zooming. Turn mipmapping off per texture by changing it to an advanced texture and you'll see the option.
    Yup, that definitely is something that can be disabled for savings, but mipmaps only increase your memory per-texture by 33%, so his 1024x512 PNG that he wants compressed, but isn't because ETC doesn't support alpha, ends up taking up 1.3MB as an RGBA16 texture in Unity.

    Disabling mips on that will reduce that usage to 1MB, but that's still a far cry from the 137KB that he is seeing the PNG take up (not understanding how texture formats work on GPUs). If the image didn't have an alpha channel, it could use ETC and it would end up a mere 170kb, even with mipmaps (and only a slightly lower 131kb without mips).

    So while unnecessary mips can help save you memory, it's all relative, and the major savings will always be by rather taking advantage of texture compression and using it wherever possible.

    Of course, as you said, you can get some decent savings from uncompressed images by disabling mipmaps whenever you can, but that's harder and harder to do now when you need to cater for resolutions that range from 800x480 all the way up to 2560x1600. You ideally want to be designing your assets for those high resolutions, and then use your mipmaps on the lower resolutions so you don't have texture minification pixel loss.
  • edited
    Question 1

    Hey guys, a (hopefully) easy and quick one... I'm looking for an "elastic" easing/lerping function, something that doesn't just go from A to B, but overshoots a bit and bobs around the target value... Like a spring with zero friction or something.

    What do I even look for?

    If the amount of overshoot and time/cycle can be controlled, bonus points, but not necessary.
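    The usual search terms for this are "elastic easing" (e.g. easeOutElastic from Robert Penner's easing functions) or a "damped spring". As a minimal sketch (names and constants are made up; higher stiffness bobs faster, lower damping overshoots more):

```csharp
// Damped spring: overshoots the target and bobs around it before settling.
// Keep 'velocity' alive between frames and call this once per frame.
float Spring(ref float velocity, float current, float target,
             float stiffness, float damping, float deltaTime)
{
    velocity += (stiffness * (target - current) - damping * velocity) * deltaTime;
    return current + velocity * deltaTime;
}
```

    With damping at zero it bobs forever (the zero-friction spring described above); a small damping value gives a few oscillations and then settles on the target.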
  • (sorry for double posting but I tried to attach an image in the previous post by editing..... But no attachments via editing, it seems!)

    Question 2

    I am trying to read gyroscope values from my iPad to apply as a rotational element - so I only want landscape orientation, turn left and turn right, with the effect of rotating an on-screen element by the same amount of degrees (so maintaining the object on-screen's orientation relative to the ground).

    Like this:

    image

    I got as far as the following code to grab a rotation, right now it seems to take portrait orientation as zero rotation (even if my game is running in landscape) (tested via unity remote on iPad during editor)... But how do I "ground" the value so that it would think landscape orientation is zero?

    Camera.main.transform.eulerAngles = new Vector3(Camera.main.transform.eulerAngles.x,
                                                    Camera.main.transform.eulerAngles.y,
                                                    -Input.gyro.attitude.w * 50f);


    I've tried subtracting 0.5 from the Input.gyro.attitude.w value, because when I debugged it, it seemed to be 0 at portrait and around 0.5 at landscape orientation, so I figured if I take 0.5 from it I can zero it.

    But Unity throws me an error saying I can't manipulate a double (apparently what Input.gyro.attitude is) with a float (which is what 0.5 is apparently).

    Am I on the right track or not at all? XD

    Thanks guys :)
  • @Tuism can't you just disable the landscape orientation in build settings - player settings - allowed orientations?
  • @FanieG I don't think it has anything to do with that - I am running the game in landscape in the editor and testing gyro via unity remote 4 - the game is locked to landscape on the desktop. Exporting it is another matter altogether, and I will disable other orientations, but the values that the gyro outputs remain the same and so won't help me. I need to get a 0 value at landscape orientation (for Input.gyro.attitude.w), but it's sitting at 0.5.
  • edited
    @Tuism sorry, misunderstood then. Maybe you should ask @Stray_Train, he is great with mobile code. From the Unity documentation it shows that attitude is a quaternion, so I don't know why you're getting an issue with doubles?
  • edited
    Cast the thing that's a double to a float value, then do whatever the hell you want to do to it.

    float angleWanted;
    angleWanted = (float)Input.gyro.attitude.w - 0.5f;
  • dislekcia said:
    Cast the thing that's a double to a float value, then do whatever the hell you want to do to it.

    float angleWanted;
    angleWanted = (float)Input.gyro.attitude.w - 0.5f;
    Thanks! I always thought casting meant "as GameObject" or some such.

    OK now that I tested that theory... Nope, it didn't do what I wanted to do, which is to have a zero at landscape all the time.

    I don't really understand the gyro thing (omg Quaternions) and googling got me this far...

    help? :/
  • @Tuism Quaternions are a trick to prevent gimbal lock http://en.wikipedia.org/wiki/Gimbal_lock.
    I'm assuming you'd be wanting to get the rotation around the device's z-axis, or whichever gyro axis points into the screen.
    To do so, you need to call Input.gyro.attitude.eulerAngles.z, and then subtract the offset in degrees, so either +-90.
    Hope this helps
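    For reference, that suggestion as a sketch ('element' is an assumed reference to the on-screen object; whether the offset is +90 or -90 depends on which landscape orientation you lock to):

```csharp
using UnityEngine;

// Sketch: offset the gyro's roll so landscape reads as zero, then apply it.
public class GyroLevel : MonoBehaviour
{
    public Transform element; // assumed: the on-screen object to keep level

    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope must be enabled before reading
    }

    void Update()
    {
        float roll = Input.gyro.attitude.eulerAngles.z + 90f; // -90f for the other landscape
        element.rotation = Quaternion.Euler(0f, 0f, roll);
    }
}
```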
  • Input.gyro.attitude.eulerAngles.z+90 worked perfectly!! :D Thanks so much!! I thought it was something a lot more complex than a quaternion -> Euler conversion... Doh!!

    Wooo :D
  • edited
    A non-specific question... When deploying from Editor to device (iOS), how do you make sure everything remains the same?

    The lighting and colours seem off from one another here, for example. (It's not running simultaneously; the iPad is running a pre-built build.) The reflections around the shadow being cast seem to be rendered differently, the canyon floor's colours are different, as are the walls' colours:

    image

    Is it just a matter of tweaking it in editor until it's right on Device? Or is there some secret to lighting settings or somesuch?
  • Sorry again for double post... Again, can't edit and attach :/

    Here's a screenshot comparison, probably better than the photo :P

    image
  • Do you know about Quality Settings? If not, you need to read up on them a bit, and change them so that what you see in the Editor (the 'selected' level) uses the same settings that you're seeing on the platform that you are building to (the green checkmark under that platform).
  • Thanks! I checked out the Quality Settings stuff, it seems to be platform-dependent? I have my build settings set up to be iOS, and the two screenshots I posted up are a screenshot from device (home + power button) and a screenshot from Editor > play... I'm guessing that editor > play uses the same settings as the build settings, since I don't see a setting for editor > play? Or have I missed something?

    image
  • It's a bit unintuitive with the editor. The editor will use whichever quality setting you have selected (good in this case). If you want a preview of what it looks like on iOS, you should select 'simple' (or change the iOS default and select that).
  • OH, so the dark bar across is the editor > play's quality setting, then. OK cool I got it, hope this syncs things up between device and editor, as I gave it a try, and besides some pixel sizes and things, it feels like some lighting quality and things aren't consistent. Maybe it's just the colour of the monitor, but screenshotting should be monitor colour-agnostic.

    Let's see :)
  • Also recently learned that you need to click on the little down pointing arrow under the platform of choice and set the quality there.
  • OK the quality stuff is bugging my brain out. I don't think this is a quality level problem:

    I have fog settings on. Playing in editor gives me one result, playing in iOS gives me another. The fog doesn't seem to appear on the iOS version, then I tried compiling without the fog on, and I saw that somehow some reflections or something seem to be "overpowering" the fog, since I can see the fog on the sides of the wall, but not on the path, where faces are facing the camera more readily... So I thought it might be a material/shader problem... But I have no idea.

    Anyone have any ideas? :/

    image
  • Woah... OK... you're not going to get away with using PBR shaders on mobile. It's fine to use them for quick and dirty testing, but it's VERY likely that the differences you are seeing are because the Standard (Unity's new PBR shader stuff) shader ends up doing different stuff on mobile, as mobile can't realistically handle all of the things that PBR shaders do.

    I haven't dug into them much in Unity 5, but I do know for a fact that the Reflection Probes will give pretty different results on mobile vs. desktop, and there are likely many many more things that PBR won't do on mobile, so I suggest you start by replacing your Standard shaders with Mobile ones and then see if they sync up better :)
  • edited
    Hey guys,

    I'm trying to get my game object to have the same rotation as my thumbstick in Unity2D. What I have at the moment is this:

    float rH = Input.GetAxis("RotationHorizontal") * 10;
    float rV = Input.GetAxis("RotationVertical") * 10;

    if (rH != 0 || rV != 0)
    {
        _angle = Mathf.Atan2(rH, rV) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(new Vector3(0, 0, _angle));
    }


    This works great, except that now I want to fire a projectile, and the way I do this is like this:
    transform.rotation = Player.transform.rotation;
    transform.position = Player.transform.position;
    rigidbody2D.AddForce(transform.forward * WeaponBase.ProjectileSpeed);


    The problem is my projectile doesn't move. It's not that it's colliding with the player, because I excluded that collision. Any ideas?

    transform.forward in this case is a Vector3(0.0, 0.0, my rotation value). Yes, I do see the issue here, I'm just not sure how to fix it :(
  • edited
    It looks like you're operating on transposed axes there. Generally speaking, if you want to change something's facing direction, you rotate it around its y-axis, not its z-axis, because z+ is what Unity considers forward. It's probably not trivial for you to change this now, though, but it's a best practice to bear in mind for the future.

    For now, instead of launching your projectile along transform.forward, you can shoot them along transform.up, which should be the direction you want.
  • @cinimod, try this:

    rigidbody2D.AddForce(new Vector3(rV, rH, 0) * WeaponBase.ProjectileSpeed);
  • petrc said:
    rigidbody2D.AddForce(new Vector3(rV, rH, 0) * WeaponBase.ProjectileSpeed);
    Be careful with this. That vector is not guaranteed to be normalised. In fact, it's very likely not to be, and the * 10 (why is that there anyway?) will guarantee that it is not. Your projectiles won't always fire at the same speed this way!
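    In other words, normalise the stick direction before scaling it, so projectiles always launch at the same speed regardless of stick deflection. A sketch using the same rV/rH axes as above (the ForceMode2D.Impulse is an assumption about how the force is applied):

```csharp
// Normalised direction: constant launch speed no matter how far the stick is pushed.
Vector2 dir = new Vector2(rV, rH).normalized;
rigidbody2D.AddForce(dir * WeaponBase.ProjectileSpeed, ForceMode2D.Impulse);
```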
  • Shouldn't there be time.alphaTime in there?

    And AddForce is cumulative over time (right?), which means unless you're doing an accelerating rocket or something, it's probably not right.

    Setting velocity directly can work, but you need to be very careful about it; it can break very easily.
  • @Chippit: The * 10 shouldn't be in there :P That's left over from some other things that I was trying.

    @Tuism: At the moment, I can't see how my projectile acts so I'm not sure if I want to do a time.AlphaTime just yet, although I do this for all my movement.
  • Adding a force is really quite weak, so you probably need to either up that amount like crazy, or you need to add an impulse, not a force.

  • edited
    Garbage.

    So I've been looking at Garbage Collection and how that works because BeatAttack was suffering from lag spikes. So instead of looking into the giant bloat of code in Beat Attack I tried starting a new project just to see if I can make stuff with less garbage generated.

    So I'm pooling EVERYTHING in this game and I'm using Coroutines instead of a timer that counts down, etc etc etc.

    But IMMEDIATELY I feel like I'm getting lag spikes. So I look into the profiler - let me know if I'm reading this right:

    image

    The red slope that goes up and dips is the garbage, right? And when it dips is when Unity is doing Garbage Collection, and when the lag spike happens, right?

    I'm sorting via the column "GC Alloc", which represents what thing is generating garbage, right?

    So the screenshot here is after I've turned ALL my coroutines off, all the effect scripts off, and nothing is actually updating in the game. But it looks like all the garbage is being generated by GameView.GetMainGameViewRenderRect()... I have no idea what this is, but it looks like just the general rendering of the game's screen, the bare basics.

    So it looks to me like I'm incurring the minimum garbage that I can, and yet it still has the dips/spikes about as often as with everything on??? What???

    What am I missing?!

    Thanks oh ye gurus!
  • Are you 100% sure that garbage collection is causing your lag spikes? 51ms is a very long frame time.

    GC spikes should be really easy to see on the CPU graph, can you post the rest of the profiler?
  • Thanks, I didn't even know that's how I tell how long a frame took to run... Ok so I've run it a bit and found a huge spike of 150ms :/ omg that's nuts.

    Comparing that to where the red line dips, I see it's not GC then (if I understand it right).

    How do I find it!?

    image
  • There are plenty of things running in the editor that can cause 'lag spikes'. Most editor windows repaint themselves about every second (the profiler certainly does), and that can actually impact the performance of your game since it needs to take a slice of your GPU to do so. But yes, as @Squidcor says, we can't really help unless you give us the full profiler results, specifically with samples from your slow frames with spikes as well as a normal frame for comparison.

    Something to bear in mind also is that profiling in the editor is inaccurate, especially where memory (specifically total memory use) is concerned. If you want truly representative results, you need to deploy a standalone player and profile that.
  • Yeah, I suspected it's not that simple, but here's something I think seems relevant:

    image

    Does this here mean that this 77ms spike is being caused by the Coroutine Enemy.Jiggle()?

    I have a coroutine set up to make the sprite jiggle every 0.2 seconds. It's started up once on the enemy gameobjects and left to run on its own...

    void Start()
    {
        StartCoroutine(Jiggle());
    }

    WaitForSeconds jiggleWait = new WaitForSeconds(0.2f);

    IEnumerator Jiggle()
    {
        while (true) {
            EnemySprite.transform.position = new Vector2(transform.position.x + Random.Range(-0.02f, 0.02f), transform.position.y + Random.Range(-0.02f, 0.02f));
            yield return jiggleWait;
        }
    }


    Everywhere I read, people say coroutines are better than running a timer that does timer -= time.deltaTime in Update(). So I tried it.

    Is this a bad case to use a Coroutine?
  • edited
    @Tuism: You're confusing the different measurements that the profiler is giving you... In short, no, that co-routine is not responsible for the 77ms frame time. You can see how much time it's taking up in the column marked "Time ms": 0.03ms.

    That call is generating 102 bytes of garbage (not a large amount, at all) and being called 6 times. Don't confuse the amount of garbage generated with the time that a thing takes... If you want to find out what the game is actually doing for those 77ms in that frame, order the list by the Total % column - that will show you what is being called for the largest percentage of the time that the frame is being processed.

    -edit- Yeah, in that other image you posted above, you can see that the game spent 98% of its time in a thing called "Overhead". That's probably the editor doing stuff, which is why @Chippit pointed that out. It did that for 154.14ms of a 157.28ms frame, meaning that everything else (including the tiny amounts of potential garbage that you're worrying about) happened in 3.14ms - that's over 300 frames per second. Garbage collection has its own task, so no, Overhead isn't your GC.
  • Cool thanks, so what I understand is that "Overhead" is the editor doing stuff, and has no bearing on the game when it's on-device and out of the editor?

    So - I really shouldn't worry about it at all?
  • @Tuism: Build the game as a standalone executable but in Debug mode, make sure you select the "Connect profiler" option in the build settings. Then run the game while the editor is open (you may have to select the game in the profiler window, there's a little dropdown for that) and then see what the results are. Overhead could be a lot of things, but find out if it's actually an issue when running your game how it's going to be released.
  • edited
    Hrm. I've never seen the reported overhead get anywhere close to that high. It should only be including overhead from the profiler itself (not other Editor behaviour, which it shouldn't be counting but which will affect the execution time of other things seemingly randomly), so it's very suspicious behaviour. Do you see these spikes if you restart the editor? Do you see them if you close all editor windows except the Game window? Do you see these in deployed players (any platform)? Have you profiled those builds?

    Also, there seems to be a bit of confusion about the GCs too. You won't necessarily see them accompanied by convenient dips in the memory graphs (although in theory that SHOULD happen). In the CPU profiler, however, every time it happens, you will find a seemingly random script taking a long time to execute. This is because, when the garbage collector decides to run (it SHOULD run after roughly 1mb of managed allocations, but this is implementation dependent and not necessarily a reliable factor) it needs to suspend all threads so that it can walk the heap, find all referenced objects and then infer the ones that can be cleaned up. This can happen at any point, and it's unpredictable. However, on PC this is much faster than it sounds, because the implementations are well-informed and very advanced, and so you rarely actually see GC spikes on desktop builds, especially not with smaller, simpler games with a shallow heap.

    Additionally, I also regularly see advice that recommends preferring coroutines. This isn't always (or even often) good advice in practice. In fact, if you were concerned about GC problems as you are (perhaps unnecessarily, but we don't know yet!), then coroutine-based solutions are actually worse. Their main benefit is to remove operations occurring on every frame, and its benefits are only significantly felt if it means you don't need to have an Update() method at all. They can also allow you to engineer some things in a more modular and segmented fashion, which can be beneficial as well, but they're definitely not a silver bullet for any problem you might run into.

    EDIT: I've also just seen that earlier post of yours, that was posted while I was typing up my previous one. That profiler output is very unusual indeed. Did you say most of your game logic was disabled? I can only imagine strange spikes like that being caused by new assets being loaded by the editor as they are created, or perhaps by a bunch of game exceptions (though those usually appear differently), but it's difficult to tell without extra info.
  • Thanks guys, I'll investigate more and report back when I can. From all the reading I've done it seems a lot of it is centred around "premature optimisation kills kittens". I'm doing this exercise to see how pooling and things can work easiest for me, with the added bonus of a new prototype and experimenting with the new fad of chromatic aberration scanlines :P

    So yeah I'll try some of the stuff you guys suggested :) Thanks!
  • edited
    Also, coming back to the physics question, because it's a good one. Basically what everyone has said above is correct, but it's really helpful to understand why. The explanation gets a bit mathematical, and involves some high-school physics, but it's actually quite straightforward.

    Firstly, you should decide whether you want to be thinking in terms of momentum (p=mv) or speed (v).

    If you're thinking in momentum
    Knowing that a force is a change in momentum over time, the simplest case has you applying a force over some time to change an object's momentum. Is there an engine pushing your object? A rocket? You probably want to do this.

    Impulse is the integral of force. In simplest terms, it's basically force with the time taken away. It is the instant of a force applied to an object. Game engines don't really have an 'instant', so in these cases it usually means one physics tick. (Usually 20ms in Unity - 50 updates per second - but it's configurable.) Explosion? Probably an impulse.

    If you're thinking in speed
    Acceleration is the change in speed over time. Apply acceleration over a few frames to change an object's speed independent of its mass. Here the physics starts breaking down, because there are no analogies here, but this is video games and we don't care! When you're moving your physically-simulated player character, you probably want to use this, so it doesn't feel sluggish and you don't have to care about mass.

    What Unity calls 'VelocityChange' is the integral of acceleration. Same thing - acceleration in an instant rather than over time, and ignoring the mass of the object. Same as above, really, just if you want to have instant change (hitting a bouncy pad, a wall, something like that).

    In summary:
    • Force is mass-dependent, continuous, applied over time. (N) (kg.m/s^2)
    • Impulse is mass-dependent, instantaneous. (Ns) (kg.m/s)
    • Acceleration is mass-independent, continuous. (m/s^2)
    • VelocityChange is mass-independent, instant. (m/s)
    • To Unity, instantaneous forces actually happen over one physics tick, which is Time.fixedDeltaTime. When you work with physics, always make sure you use this time, and do stuff in FixedUpdate (which ticks at this rate and is dependable)
    In practice, given
    float oneOverDTime = 1 / Time.fixedDeltaTime;

    All of these are equivalent:
    rigidbody.AddForce(force * Vector3.up * oneOverDTime, ForceMode.Force);
    rigidbody.AddForce(force * Vector3.up, ForceMode.Impulse);
    rigidbody.AddForce((force / rigidbody.mass) * Vector3.up * oneOverDTime, ForceMode.Acceleration);
    rigidbody.AddForce((force / rigidbody.mass) * Vector3.up, ForceMode.VelocityChange);


    Which is just four ways of applying the upwards force of a given strength to an object of a given weight for 20ms (by default).

    PHYSICS.
  • Also something to remember when profiling in the editor is that if you have the scene window open at runtime it affects performance enormously in some cases. Even the profiler will add overhead when running, especially so in "Deep profile"
  • Hey guys,

    Back to the question I asked earlier. I am trying to get my sprite to rotate towards my mouse. This works fine with the following:

    Vector3 newAngle = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
    transform.rotation = Quaternion.Euler(new Vector3(0, 0, Mathf.Rad2Deg * Mathf.Atan2(newAngle.y, newAngle.x)));


    I then have projectile code that looks like this:

    Physics2D.IgnoreCollision(collider2D, Player.collider2D);
    if (weapon != null)
    {
        WeaponBase = weapon;
    }
    gameObject.SetActive(true);
    float accuracyVariation = (Random.value > 0.5 ? -1 : 1) * ((1 - WeaponBase.Accuracy) * Random.value);
    Quaternion rotation = Player.transform.rotation;
    transform.rotation = rotation;
    transform.position = Player.transform.position - (transform.forward * 0.3f);
    rigidbody2D.AddForce(transform.up * WeaponBase.ProjectileSpeed, ForceMode2D.Impulse);


    The issue is, my projectile fires out to the left of my sprite (270 degrees off). Any ideas?
  • @CiNiMoD Your problem could be due to a number of different reasons, depending on your scene graph structure.

    The first potential problem could be that you're explicitly setting the rotation of the projectile to match that of your avatar. If in your scene graph, projectiles are children of your avatar, this would add an additional rotational offset to your projectiles, as transforms are relative to the parent transform.

    An alternative cause of the problem could be that the initial orientation of your sprite was off by +-90 degrees and the fix here is as simple as adding a +-90 offset to your projectile's rotation.

    There are a few other potential causes but one of these seems the likeliest.

    One minor tip, unless there is a game-play specific reason for disabling collisions in code, it is a better idea to disable collisions between entities in the editor, by assigning different layers between the avatar and projectile prefabs and then turning off collisions in the physics settings. I'm not sure if this will result in a performance difference, but it seems Unity is better tuned to static content and does a lot of optimizations, such as baking in lighting at compile time.
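    The layer route mentioned above can also be set up once in code rather than per instance (the layer names here are assumptions; the editor equivalent is the Layer Collision Matrix under the Physics 2D settings):

```csharp
// One-off setup (e.g. in Awake): stop the "Player" and "Projectile" layers
// from ever colliding, instead of calling IgnoreCollision per projectile.
int playerLayer = LayerMask.NameToLayer("Player");
int projectileLayer = LayerMask.NameToLayer("Projectile");
Physics2D.IgnoreLayerCollision(playerLayer, projectileLayer);
```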
  • Hi guys,

    I'm having size issues when building for Android.
    Whats the best way to export 2D sprites from photoshop? Is there some sort of trick to it?

    The PNGs come out at about 137kb, but once imported into Unity, they bounce up to 1.3mb.
    I've tried compressing them, but I end up with a lot of banding.
    Hey man, are you using a texture atlas?
    Check out TexturePacker: https://codeandweb.com/texturepacker

  • Hey Peeps

    So I have an object that has a collider inside. When the player triggers the collider, I get the position of a spawn point and I change the x value of my object to take the new value. So what happens is that the object gets reused and just moved to another position. I have 4 of these objects in my scene. They build up the walls of my 3D runner.

    Completely randomly, the rigidbody that collides with the trigger just sometimes doesn't fire, and then I have a gap in my wall. What's even more frustrating is when I pause the game and move the trigger just before the trigger happens and play, then it triggers.

    I think this might be a bug, because I just don't understand how something so simple just stops working. To make things worse, when I build it for my iOS device the problem is way more evident.

    I'm using Unity 5 p4.

    Please, can anyone assist me with this issue?
  • It sounds to me like you're running into a frame rate issue. There are a couple of things to bear in mind here, chief among them being the physics update rate (default 20ms), and that it affects everything that Unity's physics handles. Are you moving your characters during FixedUpdate or Update? What trigger method are you using? If you're using enter trigger events, bear in mind that they only fire again if your triggering object (your player) completely exits the trigger zone. If your frame rate is low, player movement is fast enough and happens in Update (it probably shouldn't), and your triggers are very close together, it's possible that you never have any frames where the player object isn't colliding with your trigger, and the trigger never resets.

    That would be my first guess, at least. Without seeing more of your code, your scene setup, and the game in action, it's harder to help any further.
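    If that is what's happening, one defensive pattern is to latch on OnTriggerStay and only reset the latch on exit, so a missed enter event can't leave a gap. A sketch (the "Player" tag and RepositionWall() are assumptions standing in for your own code):

```csharp
using UnityEngine;

public class WallTrigger : MonoBehaviour
{
    bool fired;

    // OnTriggerStay fires every physics tick while the player overlaps,
    // so the latch catches cases where the enter event was missed.
    void OnTriggerStay(Collider other)
    {
        if (!fired && other.CompareTag("Player"))
        {
            fired = true;
            RepositionWall(); // hypothetical: your existing move-to-spawn-point logic
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) fired = false; // re-arm once the player is clear
    }

    void RepositionWall() { /* ... */ }
}
```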
  • @CiNiMoD: It sounds like you're having problems visualising the 3D space around your ship - vectors are getting confused and you're switching between transform.forward and transform.up for reasons I don't see... I strongly suggest you draw some debug rays in the various directions to help you see what vectors you could be using instead to get the result you want :)

    Also, I'm not sure that the euler angle creation method for quaternions is a great idea. You should be able to just create a quaternion that points towards your mouse cursor (from a simple vector that does that) and then slerp towards it by a specific amount to make your ship point that way.
  • @dislekcia: Thanks, I'll try creating some rays. It's super weird though, I am using this code for when I'm using a gamepad:
    float rH = Input.GetAxis("RotationHorizontal");
    float rV = Input.GetAxis("RotationVertical");
    float scroll = Input.GetAxis("ScrollWeapons");

    if (rH != 0 || rV != 0)
    {
        _angle = Mathf.Atan2(rH, rV) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(new Vector3(0, 0, _angle));
    }


    This seems to work correctly, and fires the projectile in the right way. Saying this though, I have to rotate my sprite to point down instead of right for it to look right. Could you explain what you mean by:
    You should be able to just create a quaternion that points towards your mouse cursor (from a simple vector that does that) and then slerp towards it by a specific amount to make your ship point that way.
    Quaternions break my brain :/ I've never really used them before and I've got to be honest, I don't really know how they work. I've found some math sites that teach you about them and I'm gonna start reading them :P
  • CiNiMoD said:
    Quaternions break my brain :/ I've never really used them before and I've got to be honest, I don't really know how they work. I've found some math sites that teach you about them and I'm gonna start reading them :P
    Quaternions aren't that complicated, they're just a way of representing a rotation. Once you've got your debug rays pointing in useful directions, just use Quaternion.SetFromToRotation(vector3, vector3) and you'll get the exact rotation necessary to turn your ship from the direction it's currently facing to the one you want it to face. Then all you do is step a short distance "along" that rotation using Quaternion.Slerp() and life is nice and simple :)

    Never try to use the components of a Quaternion, they're not rational (literally, they're irrational maths, you can't understand them by inspection). And you lose a lot of the point of Quaternions when you try to keep working in angles, rather use vectors pointing in the directions you want, then they're great!
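    Put together, that suggestion might look something like this inside an Update() (a sketch: Vector3.up assumes the sprite's nose points up, and turnSpeed is a made-up tuning value; Quaternion.FromToRotation is the static equivalent of SetFromToRotation):

```csharp
// Build the rotation that points at the mouse, then slerp a fraction
// of the way towards it each frame.
Vector3 toMouse = Camera.main.ScreenToWorldPoint(Input.mousePosition) - transform.position;
toMouse.z = 0f; // 2D: ignore the camera's depth offset
Quaternion target = Quaternion.FromToRotation(Vector3.up, toMouse.normalized);
transform.rotation = Quaternion.Slerp(transform.rotation, target, turnSpeed * Time.deltaTime);
```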