Using multiple cameras

Aug 13, 2009 at 2:01 AM

Hello,

I am working on the instancing and partitioning example in the tutorial, which currently uses an FPS camera. To this I added the CameraOrbit class from tutorial 13 (updates and input), which orbits around an object. In the Update call I switch between the two cameras depending on the state of the left mouse button: if the button is down I set the draw target to use the orbit camera, and if it is up I use the FPS camera.

Currently the target of the orbit camera is the origin, so the camera orbits around the centre. When the FPS camera moves, the camera target is bound to shift, so in the draw call I use projectFromScreen to get the new target and store it in a static variable accessible to the orbit class.

Now, individually both cameras work fine, but I am having problems synchronizing their views. When I switch from the orbit camera to the FPS camera, I just call camera.LookAt() to point the camera at the orbit camera's target, like this:

if (state.MouseState.LeftButton.IsDown)
{
    drawToScreen.Camera = camera;
    fpsCamera.LookAt(Game1.TargetPosition, camera.Position, camera.CameraMatrix.Up);
} 
else
{
    drawToScreen.Camera = fpsCamera;
}

The problem happens when I insert the code to synchronize this the other way round. I made the cameraRotation variable in the orbit class public so I can access it from my main class file, and I set the rotation variable to the direction of the current camera:
if (state.MouseState.LeftButton.IsDown)
{
    drawToScreen.Camera = camera;
    fpsCamera.LookAt(Game1.TargetPosition, camera.Position, camera.CameraMatrix.Up);
} 
else
{
    drawToScreen.Camera = fpsCamera;
    Vector3 direction;
    fpsCamera.GetCameraViewDirection(out direction);
    cameraOrbit.cameraRotation.X = direction.X;
    cameraOrbit.cameraRotation.Y = direction.Y;
}

In the camera orbit class, I made these changes to the Update call:


Vector3 position = Game1.TargetPosition;
position.Z += Game1.ViewDistance;

//this is a fairly nasty way to generate the camera matrix :-)
cameraMatrix =
    Matrix.CreateTranslation(position) *                                 //move +viewDistance on the z-axis (negative z is into the screen, so positive moves away)
    Matrix.CreateRotationX(this.cameraRotation.Y + MathHelper.PiOver2) * //rotate up/down around the x-axis (the model is rotated 90 degrees, hence the PiOver2)
    Matrix.CreateRotationZ(this.cameraRotation.X);                       //then finally rotate left/right around the z-axis

This way, the orbit centre is no longer the origin but the TargetPosition vector (updated in the draw call) plus the ViewDistance, which is now shared between both cameras.

When I execute this, the code doesn't work: the cameras aren't synchronized, not even the FPS camera, which was being synchronized in the first case.

Could you please help me with this problem? Am I missing something, or is this not the way it should be done?

Coordinator
Oct 31, 2009 at 10:01 PM
Edited Oct 31, 2009 at 10:01 PM
Looking at your code, there is an obvious problem with what you are doing.

The camera view direction (for the FPS camera) is a Vector3: a 3D unit vector in space, i.e., an XYZ direction. The orbit camera, however, is based on rotation angles around the X and Z axes, which is what is stored in cameraRotation.
These aren't the same thing; there is a big difference between angles and a direction (think roll/pitch/yaw). They don't directly convert to and from one another: you have to do some fairly involved maths, usually involving inverse sines and cosines or the two-component inverse tangent (Atan2, Asin, etc.).

Remember, cameraRotation.X represents the angle (in radians) the orbit camera is rotated around the Z axis (i.e., the angle left to right). The XY direction from this angle is sin(A), cos(A), i.e., basic trig.
The tricky bit is that cameraRotation.Y represents the angle around the X axis, which is applied as a matrix rotation after the first... It all gets rather complex internally if you treat it as angles. Vectors are much simpler.

For example, the view direction for that cameraRotation would *probably* be:

XYZ = Vector3(sin(cx) * cos(cy), cos(cx) * cos(cy), sin(cy));

Needless to say, going backwards, and getting cx/cy from the XYZ direction is not trivial.
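To make both directions of the round trip concrete, here is a rough sketch in Python. The exact signs depend on the engine's handedness and rotation conventions, so treat this as illustrative of the trigonometry involved, not as xen's actual implementation:

```python
import math

def direction_from_angles(cx, cy):
    # Angles -> unit direction, using the formula above.
    # cx: rotation around the Z axis (left/right), cy: up/down angle.
    return (math.sin(cx) * math.cos(cy),
            math.cos(cx) * math.cos(cy),
            math.sin(cy))

def angles_from_direction(x, y, z):
    # Unit direction -> angles: the non-trivial inverse.
    cy = math.asin(max(-1.0, min(1.0, z)))  # up/down angle from the z component
    cx = math.atan2(x, y)                   # left/right angle from the xy components
    return cx, cy
```

Note that the inverse only recovers the original angles cleanly while cy stays within (-pi/2, pi/2); when the camera points straight up or down, the xy components vanish and cx becomes undefined, which is one of the reasons angle-based code breaks spectacularly at the poles.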

However, this is exactly what happens within LookAt for the FPS camera, as internally the camera stores its rotation as XY angles.
If you look at FirstPersonControlledCamera3D.LookAt, this is exactly what happens. It may look simple, but it's one of those things that is deceptive - and only works when it's exactly right (otherwise it breaks spectacularly).
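In essence, a LookAt on an angle-based camera has to normalise the eye-to-target vector and then invert the direction formula. A sketch of that idea (the names and angle conventions here are hypothetical, not xen's actual code):

```python
import math

def look_at_angles(eye, target):
    # Compute the angle pair that points a camera at `target` from `eye`,
    # matching the convention above: cx around Z (left/right), cy up/down.
    dx = target[0] - eye[0]
    dy = target[1] - eye[1]
    dz = target[2] - eye[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0.0:
        raise ValueError("eye and target coincide; direction is undefined")
    # Normalise the direction, then invert the angle formula.
    dx, dy, dz = dx / length, dy / length, dz / length
    cx = math.atan2(dx, dy)                     # left/right angle
    cy = math.asin(max(-1.0, min(1.0, dz)))     # up/down angle
    return cx, cy
```

This "only works when it's exactly right" in the sense described above: any mismatch between the sign conventions used here and those used when the angles are turned back into a rotation matrix will send the camera somewhere unexpected.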

So unfortunately, that's all I can really tell you. I hope you can appreciate that while it may appear to be a simple problem, it's actually far more complex, and it's one of the main reasons why you should avoid working with angles as much as possible :-) (or, as is the case in xen, hide their use as much as possible :-)

Good luck