I want to rotate a 2D orthographic projection around the centre point of my view. However, something is off: the result looks like a translation rather than a rotation about the centre, and I'm not sure why.
To put some concrete values on the variables below:
- rotation = 90.0 degrees
- zoom = 0.5
- size = (640, 480)
- center = (320, 340)
My code is as follows:
Transform::orthographic(0.0, self.size.x / self.zoom, 0.0, self.size.y / self.zoom, 0.0, 1.0)
    * Transform::from_trs(self.center, -self.rotation, Vec2f::ONE, self.center)
And the values from the resulting transform matrix (printed row by row):
[-0.00000000006829905, 0.0015625, 0, -1.03125,
0.0020833334, 0.0000000000910654, 0, -0.37500012,
0, 0, 1, 0,
0, 0, 0, 1]
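For reference, here is a throwaway snippet (not part of my codebase; plain f32 arithmetic with hypothetical variable names) that reproduces the translation entries above from the concrete values, in case it helps to follow where -1.03125 and -0.375 come from:

fn main() {
    // Concrete values from above.
    let zoom = 0.5_f32;
    let (w, h) = (640.0 / zoom, 480.0 / zoom); // 1280 x 960
    let (cx, cy) = (320.0_f32, 340.0_f32);
    // from_trs receives -90 degrees and negates it again internally.
    let angle = 90.0_f32.to_radians();
    let (sin, cos) = angle.sin_cos();

    // Translation produced by from_trs(center, -rotation, Vec2f::ONE, center).
    let tx = -cx * cos - cy * sin + cx; // -20
    let ty = cx * sin - cy * cos + cy;  //  660

    // Scale and translation entries of orthographic(0, w, 0, h, 0, 1).
    let a = 2.0 / w;         //  0.0015625
    let b = 2.0 / (0.0 - h); // -0.00208333...
    let (ox, oy) = (-(0.0 + w) / w, -(0.0 + h) / (0.0 - h)); // -1, 1

    // Translation column of the product (orthographic * trs).
    println!("x: {}", a * tx + ox); // -1.03125
    println!("y: {}", b * ty + oy); // -0.375
}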
Here's the correct rotation, along with the code that produces it (borrowed from SFML and rewritten in Rust for my own transform class):
let angle = self.rotation.as_radians();
let cosine = angle.cos();
let sine = angle.sin();
let tx = -self.center.x * cosine - self.center.y * sine + self.center.x;
let ty = self.center.x * sine - self.center.y * cosine + self.center.y;
let a = 2.0 / (self.size.x / self.zoom);
let b = -2.0 / (self.size.y / self.zoom);
let c = -a * self.center.x;
let d = -b * self.center.y;
Transform::new(
    a * cosine, a * sine, 0.0, a * tx + c,
    -b * sine, b * cosine, 0.0, b * ty + d,
    0.0, 0.0, 1.0, 0.0,
    0.0, 0.0, 0.0, 1.0,
)
And the values from the transform matrix:
[-0.00000000006829905, 0.0015625, 0, -0.53125,
0.0020833334, 0.0000000000910654, 0, -0.66666675,
0, 0, 1, 0,
0, 0, 0, 1]
Having done the calculations by hand, the difference in the x translation is exactly the value of -c from the correct implementation, but I don't know why it shows up negated in the comparison. I have no idea where the difference in the y translation comes from.
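Spelling that out with the numbers above (my own arithmetic, so worth double-checking), where tx = -20 and ty = 660 are the rotate-about-centre translations (the same in both versions for these values):

mine:    x = a*tx - 1.0 = 0.0015625 * -20 - 1.0        = -1.03125
correct: x = a*tx + c   = 0.0015625 * -20 - 0.5        = -0.53125
mine:    y = b*ty + 1.0 = -0.00208333 * 660 + 1.0      = -0.375
correct: y = b*ty + d   = -0.00208333 * 660 + 0.708333 = -0.666667

So the x entries differ by exactly -c = 0.5 and the y entries by d - 1.0 ≈ -0.291667; I can see the numbers, I just can't see the reasoning behind them.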
What the incorrect rotation looks like: [screenshot]
And the correct rotation: [screenshot]
FWIW, here is the code for the Transform class:
impl Transform {
    pub const fn new(
        m00: f32, m01: f32, m02: f32, m03: f32,
        m10: f32, m11: f32, m12: f32, m13: f32,
        m20: f32, m21: f32, m22: f32, m23: f32,
        m30: f32, m31: f32, m32: f32, m33: f32,
    ) -> Self {
        // Arguments are given row by row; the backing array is column-major.
        Self {
            matrix: [
                m00, m10, m20, m30,
                m01, m11, m21, m31,
                m02, m12, m22, m32,
                m03, m13, m23, m33,
            ]
        }
    }

    pub fn from_trs<V, A>(position: V, rotation: A, scale: V, origin: V) -> Self
    where
        V: Into<Vec2f>,
        A: Into<Angle>,
    {
        let position = position.into();
        let rotation = rotation.into();
        let scale = scale.into();
        let origin = origin.into();

        let angle = -rotation.as_radians();
        let cosine = angle.cos();
        let sine = angle.sin();

        let sxc = scale.x * cosine;
        let syc = scale.y * cosine;
        let sxs = scale.x * sine;
        let sys = scale.y * sine;

        // Translation chosen so that `origin` maps to `position` after rotation and scale.
        let tx = -origin.x * sxc - origin.y * sys + position.x;
        let ty = origin.x * sxs - origin.y * syc + position.y;

        Self::new(
            sxc, sys, 0.0, tx,
            -sxs, syc, 0.0, ty,
            0.0, 0.0, 1.0, 0.0,
            0.0, 0.0, 0.0, 1.0,
        )
    }

    pub fn orthographic(left: f32, right: f32, top: f32, bottom: f32, near: f32, far: f32) -> Self {
        // note: depth is in range 0 to 1, not -1 to 1 like in OpenGL.
        let a = 2.0 / (right - left);
        let b = 2.0 / (top - bottom);
        let c = 1.0 / (far - near);

        let tx = (left + right) / (right - left);
        let ty = (top + bottom) / (top - bottom);
        let tz = near / (far - near);

        Self::new(
            a, 0.0, 0.0, -tx,
            0.0, b, 0.0, -ty,
            0.0, 0.0, c, -tz,
            0.0, 0.0, 0.0, 1.0,
        )
    }
}
use std::ops::Mul;

impl Mul for Transform {
    type Output = Transform;

    fn mul(self, rhs: Self) -> Self::Output {
        // The backing array is column-major, so rows of `self` are gathered with a
        // stride of 4 while columns of `rhs` are contiguous.
        let r0 = [self.matrix[0], self.matrix[4], self.matrix[8], self.matrix[12]];
        let r1 = [self.matrix[1], self.matrix[5], self.matrix[9], self.matrix[13]];
        let r2 = [self.matrix[2], self.matrix[6], self.matrix[10], self.matrix[14]];
        let r3 = [self.matrix[3], self.matrix[7], self.matrix[11], self.matrix[15]];

        let c0 = [rhs.matrix[0], rhs.matrix[1], rhs.matrix[2], rhs.matrix[3]];
        let c1 = [rhs.matrix[4], rhs.matrix[5], rhs.matrix[6], rhs.matrix[7]];
        let c2 = [rhs.matrix[8], rhs.matrix[9], rhs.matrix[10], rhs.matrix[11]];
        let c3 = [rhs.matrix[12], rhs.matrix[13], rhs.matrix[14], rhs.matrix[15]];

        Self::new(
            dot(&r0, &c0), dot(&r0, &c1), dot(&r0, &c2), dot(&r0, &c3),
            dot(&r1, &c0), dot(&r1, &c1), dot(&r1, &c2), dot(&r1, &c3),
            dot(&r2, &c0), dot(&r2, &c1), dot(&r2, &c2), dot(&r2, &c3),
            dot(&r3, &c0), dot(&r3, &c1), dot(&r3, &c2), dot(&r3, &c3),
        )
    }
}

fn dot(u: &[f32; 4], v: &[f32; 4]) -> f32 {
    u[0] * v[0] + u[1] * v[1] + u[2] * v[2] + u[3] * v[3]
}
Why does this extra translation exist in the working implementation when I already account for moving the origin in my TRS matrix? And how can I correctly calculate this translation and apply it to my transform?

