|
Post by dave625 on May 27, 2014 11:43:17 GMT -8
When using the poly object, I'm getting a strange display anomaly when trying to draw a segment. Here is the code:

void setup() {
  GD.begin();
}

int f(int val) {
  return val * 16;
}

void shape(int x, int y) {
  Poly po;
  GD.ColorRGB(200, 0, 0);
  po.begin();
  po.v(f(x + 0),   f(y + 15));
  po.v(f(x + 15),  f(y + 0));
  po.v(f(x + 110), f(y + 0));
  po.v(f(x + 125), f(y + 15));
  po.v(f(x + 110), f(y + 30));
  po.v(f(x + 15),  f(y + 30));
  po.draw();
}

void loop() {
  GD.Clear();
  shape(100, 100);
  GD.swap();
}

All the points render correctly except the top-left one. Here is a facsimile of the vertex in question:

Is there a way around this? Thanks in advance.
|
|
|
Post by jamesbowman on May 27, 2014 14:10:05 GMT -8
Yes, the GPU's edge filling sometimes doesn't get every pixel exactly right, so you can get a 2-pixel step at the end of an edge. The easiest way to hide this - and all the other jaggies - is to draw an antialiased line around the edge; page 151 in the book covers this. If you draw it in the same color as the polygon interior, you get this:

The code is:

void shape(int x, int y) {
  Poly po;
  GD.ColorRGB(200, 0, 0);
  GD.SaveContext();
  po.begin();
  po.v(f(x + 0),   f(y + 15));
  po.v(f(x + 15),  f(y + 0));
  po.v(f(x + 110), f(y + 0));
  po.v(f(x + 125), f(y + 15));
  po.v(f(x + 110), f(y + 30));
  po.v(f(x + 15),  f(y + 30));
  po.draw();
  GD.RestoreContext();
  GD.LineWidth(1 * 16);
  po.outline();
}
Alternatively, if you really want pixel-perfect diagonals - maybe for a retro game look - a bitmap might be a better approach. Creating an L1 bitmap that covers the screen and drawing the objects into it works well. The "space invaders" demo does this.
|
|
|
Post by dave625 on May 28, 2014 0:42:59 GMT -8
Thank you James, that's just the ticket...
Dave
|
|