Mixing up value types and reference types

Ming-Yee Iu valhalla at jinq.org
Wed Oct 1 18:10:49 UTC 2014


When you get around to working on the syntax for value types, it would be
nice if you could find a way to make value types visually distinguishable
from reference types.

In C#, I always found it a little confusing that value types looked exactly
like reference types in the code, so I could never be sure what would
happen if I wrote

  A a = b;

or

  return a;

without looking at the documentation to see whether A was a value type or
not.
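
For concreteness, here is a small sketch in plain Java, where A is an
ordinary reference type, so the assignment aliases rather than copies; with
a value type the same line would copy the data, and the later mutation would
not be visible through b:

  class A {
      int x;
  }

  public class AssignmentDemo {
      public static void main(String[] args) {
          A b = new A();
          b.x = 1;

          A a = b;    // reference semantics: a and b now name the same object
          a.x = 42;

          System.out.println(b.x);  // prints 42, not 1
      }
  }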

For example, I might read up on some blog about a function which returns
the bounding box of a UI widget as a Rectangle object. I would then want to
make a slightly bigger rectangle for the border. Without knowing whether
Rectangle was a value type or not, I might write either:

  Rectangle a = widget.getBounds();

  Rectangle border = a;
  border.x--;
  border.y--;
  border.width += 2;
  border.height += 2;

or

  Rectangle a = widget.getBounds();

  Rectangle border = new Rectangle(a.x - 1, a.y - 1, a.width + 2, a.height + 2);

But I would have to look up the documentation first to find out whether
Rectangle is a value type or not.
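
To make the trap concrete, here is a runnable Java sketch using
java.awt.Rectangle (a reference type). The getBounds() helper below is a
hypothetical stand-in for a widget that hands out its internal Rectangle
without a defensive copy, which is exactly the case where the first style
goes wrong:

  import java.awt.Rectangle;

  public class BoundsDemo {
      // Hypothetical widget state: the bounds are exposed directly,
      // without a defensive copy.
      static final Rectangle internalBounds = new Rectangle(10, 10, 100, 50);

      static Rectangle getBounds() {
          return internalBounds;   // shared instance, not a copy
      }

      public static void main(String[] args) {
          Rectangle a = getBounds();

          // First style: assumes Rectangle has value semantics.
          Rectangle border = a;    // actually aliases the widget's own bounds
          border.x--;
          border.y--;
          border.width += 2;
          border.height += 2;

          // The widget's bounds were silently modified as well.
          System.out.println(internalBounds);
          // prints java.awt.Rectangle[x=9,y=9,width=102,height=52]
      }
  }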

I'm not sure if new syntax would actually be the best way of solving the
problem. Maybe simply setting some good conventions early on for people to
follow would be enough.

Obviously, Hungarian notation might be going a bit far, but maybe something
as simple as declaring that value types should start with a lower case
letter so that they look like primitive types would be sufficient?

e.g. int, double, point, rectangle2d

vs. Integer, Double, Point, Rectangle2d


-Ming

