If you haven’t read issue #2, “Morphic 3.0”, please do so before continuing.
A Morph only knows about its own coordinate system: its drawing and the placement of its submorphs are expressed in it. But when the #drawOn: method requests drawing services from the canvas, the canvas must do the actual transformation of those operations to Display coordinates. In case you don’t know much about the Morphic 2.0 framework: all drawing services are requested from a Canvas object, which implements many standard drawing primitives. This doesn’t change in Morphic 3.0.
Let’s look at the structure of the objects. First of all, we have class Morph. As usual, a Morph is a graphic object that can be manipulated by the user. The instance variables we are interested in now are ‘location’, which holds an instance of Location, and ‘coordinateSystem’, which holds an instance of the CoordinateSystem hierarchy. A morph can be moved, zoomed and rotated, by the user or by code. This information is handled by the location object. In addition, a morph specifies a coordinate system for its drawing operations and the placement of its submorphs. This is the coordinateSystem object. Note that the zoom operation doesn’t modify the coordinateSystem object, only its location. That’s why it is “zoom” and not “resize”.
All coordinate systems in Morphic are 2D. A CoordinateSystem specifies the valid values for the coordinates, and a transformation to a 2D square space of extent 1 x 1 (or a region of it). An example from the CoordinateSystem hierarchy is CartesianCoordinateSystem. Its instance variables are ‘xMin’, ‘xMax’, ‘yMin’, and ‘yMax’. It is formally defined by two perpendicular axes, and it does a linear conversion to [-0.5..0.5] x [-0.5..0.5].
The user-modifiable characteristics of a morph are stored in a Location. A location specifies how a morph relates to the coordinate system it lives in, i.e. the coordinateSystem of its owner or container. The instance variables are ‘centerX’, ‘centerY’, ‘width’, ‘height’, and ‘angle’. It may seem as if a location would only be valid in a cartesian coordinate system, but I preferred these names to the possible and more general ‘firstCoordinateCenter’, ‘secondCoordinateCenter’, ‘firstCoordinateExtent’, and ‘secondCoordinateExtent’. Moving a morph only affects its location’s ‘centerX’ and ‘centerY’, which hold the position. Zooming a morph in or out only affects its location’s ‘width’ and ‘height’, which hold the visible extent. Rotating a morph only affects the location’s ‘angle’.
As I said above, the canvas needs to transform from local morph coordinates to Display coordinates in order to actually draw. The services we will use are:
CartesianCoordinateSystem >> externalFor: internalPoint
	| dx dy |
	dx := (internalPoint x - self centerX) / self extentX.
	dy := (internalPoint y - self centerY) / self extentY.
	^dx @ dy
This method transforms internalPoint to the space [-0.5..0.5] x [-0.5..0.5]. CartesianCoordinateSystem >> internalFor: canonicalPoint does the inverse conversion.
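To make the arithmetic concrete, here is a sketch of the two conversions in Python. The class and method names are just translations of the Smalltalk selectors, and the derived center/extent accessors are my assumption about how the ‘xMin’, ‘xMax’, ‘yMin’, ‘yMax’ instance variables are combined:

```python
class CartesianCoordinateSystem:
    """Illustrative Python stand-in for the Smalltalk class, not Morphic code."""

    def __init__(self, x_min, x_max, y_min, y_max):
        self.x_min, self.x_max = x_min, x_max
        self.y_min, self.y_max = y_min, y_max

    # Assumed derived accessors, analogous to 'self centerX', 'self extentX', etc.
    def center_x(self): return (self.x_min + self.x_max) / 2.0
    def center_y(self): return (self.y_min + self.y_max) / 2.0
    def extent_x(self): return self.x_max - self.x_min
    def extent_y(self): return self.y_max - self.y_min

    def external_for(self, pt):
        """Map a point in [xMin..xMax] x [yMin..yMax] linearly to the
        canonical square [-0.5..0.5] x [-0.5..0.5]."""
        x, y = pt
        return ((x - self.center_x()) / self.extent_x(),
                (y - self.center_y()) / self.extent_y())

    def internal_for(self, pt):
        """Inverse conversion: canonical square back to local coordinates."""
        x, y = pt
        return (x * self.extent_x() + self.center_x(),
                y * self.extent_y() + self.center_y())
```

For example, with a system spanning [0..100] x [0..50], the corner (100, 50) maps to (0.5, 0.5), and `internal_for` undoes `external_for`.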
Location >> addTo: aPoint
	| dx dy sin cos |
	dx := aPoint x * width.
	dy := aPoint y * height.
	sin := self angleSin.
	cos := self angleCos.
	^(dx * cos - (dy * sin) + centerX) @ (dy * cos + (dx * sin) + centerY)
This method applies the location (position, zoom, rotation) to aPoint, completing the transformation to the owner’s coordinate system. Location >> substractFrom: aPoint does the inverse conversion.
The idea is to traverse the owner chain, transforming the coordinates into the enclosing coordinateSystem at each step. Something like:
MorphicCanvas >> line: point1 to: point2 width: w color: c in: aMorph
	| m pt1 pt2 |
	m := aMorph.
	pt1 := point1.
	pt2 := point2.
	[ m isWorldOrHandMorph ] whileFalse: [
		pt1 := m location addTo: (m coordinateSystem externalFor: pt1).
		pt2 := m location addTo: (m coordinateSystem externalFor: pt2).
		m := m owner ].
	"Now pt1 and pt2 are in World coordinates, and we can draw."
	self asBalloonCanvas
		aaLevel: 4;
		drawPolygon: (Array with: pt1 with: pt2)
			color: c
			borderWidth: w
			borderColor: c
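The owner-chain walk itself can be sketched in a few lines of Python. This is a self-contained illustration: the `Morph` class and the two transform functions below are stand-ins for the real objects, each morph contributing one "local coordinates -> canonical square -> owner coordinates" step:

```python
class Morph:
    """Illustrative stand-in: holds the two per-morph transforms and the owner link."""

    def __init__(self, external_for, add_to, owner=None):
        self.external_for = external_for  # local coords -> [-0.5..0.5] square
        self.add_to = add_to              # canonical square -> owner coords
        self.owner = owner                # None plays the role of the World

def world_point_for(morph, pt):
    """Walk the owner chain, transforming pt at each step, until the World
    (here: owner is None) is reached. Mirrors the whileFalse: loop above."""
    m = morph
    while m is not None:
        pt = m.add_to(m.external_for(pt))
        m = m.owner
    return pt

# A morph whose local system spans [0..10] x [0..10], placed with its
# center at (50, 50) and extent 10 x 10 in its owner, no rotation:
morph = Morph(lambda p: ((p[0] - 5) / 10, (p[1] - 5) / 10),
              lambda p: (p[0] * 10 + 50, p[1] * 10 + 50))
```

Here the local corner (10, 10) first maps to (0.5, 0.5) on the canonical square, and then to (55, 55) in the owner’s coordinates. Nesting morphs simply adds more iterations of the same loop.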
After implementing all this, the first thing we notice is that the y axis in all the coordinate systems is inverted. It seems to me that mirroring a morph along one axis is similar to rotating it; therefore it should be handled by the Location. It could also be something the user could do via the halo.
You can download my latest image to play with it.
This is only the beginning. But this is the perfect time to think about coordinate systems and locations, and to discuss the ideas and the design.
Juan Vuletich