A transform has an associated type, and all type classification is left to the Transform3D object. A transform will typically have multiple types, unless it is a general, unclassifiable matrix, in which case it won't be assigned a type.
The matrix type is computed internally when the transform object is constructed and is recomputed any time the transform is modified. For example, the type associated with an identity matrix is the result of ORing together all of the type flags except ZERO and NEGATIVE_DETERMINANT. Public methods are available to get the ORed type of the transformation, the sign of its determinant, and the least general matrix type. The matrix type flags are ZERO, IDENTITY, SCALE, TRANSLATION, ORTHOGONAL, RIGID, CONGRUENT, AFFINE, and NEGATIVE_DETERMINANT.
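For example, the type flags can be queried directly from a transform. The following is a minimal sketch, assuming the classic javax.media.j3d package (newer JogAmp builds use org.jogamp.java3d instead); the class name TypeQueryDemo is illustrative:

    import javax.media.j3d.Transform3D;

    public class TypeQueryDemo {
        public static void main(String[] args) {
            Transform3D t = new Transform3D();       // constructed as the identity matrix

            // getType() returns the ORed bitmask of every type flag that applies.
            int ored = t.getType();
            boolean rigid = (ored & Transform3D.RIGID) != 0;

            // getBestType() returns the least general (most specific) single type.
            int best = t.getBestType();

            // determinant() exposes the sign (and magnitude) of the determinant.
            double det = t.determinant();

            System.out.println("rigid=" + rigid + ", best=" + best + ", det=" + det);

            // Modifying the transform causes its type to be recomputed automatically.
            t.setScale(2.0);                         // uniform scale: RIGID no longer applies
            System.out.println("after scale, rigid=" + ((t.getType() & Transform3D.RIGID) != 0));
        }
    }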
 [ m00 m01 m02 m03 ]   [ x ]   [ x' ]
 [ m10 m11 m12 m13 ] . [ y ] = [ y' ]
 [ m20 m21 m22 m23 ]   [ z ]   [ z' ]
 [ m30 m31 m32 m33 ]   [ w ]   [ w' ]

 x' = m00 . x + m01 . y + m02 . z + m03 . w
 y' = m10 . x + m11 . y + m12 . z + m13 . w
 z' = m20 . x + m21 . y + m22 . z + m23 . w
 w' = m30 . x + m31 . y + m32 . z + m33 . w
Note: When transforming a Point3f or a Point3d, the input w is set to 1. When transforming a Vector3f or Vector3d, the input w is set to 0.
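The difference in the w convention is easy to see with a pure translation: it moves points but leaves vectors unchanged. A minimal sketch, assuming the classic javax.media.j3d and javax.vecmath packages (newer JogAmp builds use org.jogamp.* package names); the class name TranslationDemo is illustrative:

    import javax.media.j3d.Transform3D;
    import javax.vecmath.Point3d;
    import javax.vecmath.Vector3d;

    public class TranslationDemo {
        public static void main(String[] args) {
            Transform3D t = new Transform3D();
            t.setTranslation(new Vector3d(10.0, 0.0, 0.0));  // pure translation by +10 in x

            // A point is multiplied as (x, y, z, 1), so the translation is applied.
            Point3d p = new Point3d(1.0, 2.0, 3.0);
            t.transform(p);                                  // p becomes (11, 2, 3)

            // A vector is multiplied as (x, y, z, 0), so the translation is ignored.
            Vector3d v = new Vector3d(1.0, 2.0, 3.0);
            t.transform(v);                                  // v stays (1, 2, 3)

            System.out.println("point  -> " + p);
            System.out.println("vector -> " + v);
        }
    }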