Friday, June 4, 2010

Notes on Tuesday 02.06.2010 meeting

Just a few notes before I forget, from our Tuesday 2 June meeting:

We discussed different kinds of sound signals, which might include
  • entering a region (e.g., the buffer around a river).  For example, this could be a short artificial sound.
  • moving or remaining within a region (e.g., following the river).  For example, this could be some naturalistic or mnemonic sound (like the sound of a river).
  • querying or hovering in a region (some kind of additional information, probably presented as speech).
Jim M. brought up the hierarchy of visual symbolism in graphical maps.  This seems to suggest that multiple sound signals could be active at the same time (foreground, background), something we have not considered thus far.
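As a sketch of how these event kinds and the foreground/background idea might fit together (all names here are illustrative, not from any existing audio library):

```python
# Hypothetical sketch: sound signals keyed to region events, with
# foreground/background layers following the visual-hierarchy idea.
from dataclasses import dataclass
from enum import Enum

class EventKind(Enum):
    ENTER = "enter"    # crossing into a region -> short artificial cue
    WITHIN = "within"  # moving/remaining in a region -> looping naturalistic sound
    QUERY = "query"    # hover/query -> spoken information

@dataclass
class SoundSignal:
    region: str
    kind: EventKind
    layer: str  # "foreground" or "background"

def active_layers(signals):
    """Group simultaneously active signals by layer, so a foreground cue
    (e.g., entering a river buffer) can play over a background ambience
    (e.g., the terrain region the user is still inside)."""
    layers = {"foreground": [], "background": []}
    for s in signals:
        layers[s.layer].append(s)
    return layers

signals = [
    SoundSignal("terrain", EventKind.WITHIN, "background"),
    SoundSignal("river buffer", EventKind.ENTER, "foreground"),
]
print(active_layers(signals))
```

This is only a data-model sketch; actual mixing of simultaneous sounds is a separate question.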

While the discussion of sound symbology got people thinking, it is an issue that we have not explored systematically yet.

Andrew brought up more complex orienting signals about current location or direction.  For example, it is easy to imagine directional signals that would differentiate, e.g., between moving upstream and downstream on a river.  (But how we would implement this, including recognizing which direction is upstream, is not obvious.) 
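One crude way to recognize upstream versus downstream movement, assuming the river polyline's vertices happen to be ordered from source to mouth (an assumption we would have to verify or impose on the data; otherwise elevation information would be needed), is to compare the user's movement bearing against the bearing of the nearest river segment:

```python
import math

def bearing(p, q):
    """Planar bearing (radians) from p to q; assumes projected coordinates."""
    return math.atan2(q[1] - p[1], q[0] - p[0])

def flow_direction(river, prev_pos, cur_pos):
    """Classify movement along a river as 'downstream' or 'upstream'.
    Assumes river vertices are ordered from source to mouth."""
    # Nearest segment to the current position (crude: nearest start vertex).
    i = min(range(len(river) - 1),
            key=lambda k: (river[k][0] - cur_pos[0]) ** 2 +
                          (river[k][1] - cur_pos[1]) ** 2)
    seg = bearing(river[i], river[i + 1])   # direction of flow
    move = bearing(prev_pos, cur_pos)       # direction of travel
    # Angular difference wrapped to [-pi, pi].
    diff = (move - seg + math.pi) % (2 * math.pi) - math.pi
    return "downstream" if abs(diff) < math.pi / 2 else "upstream"

river = [(0, 0), (1, 0), (2, 0)]  # flows left to right
print(flow_direction(river, (0.5, 0.1), (1.5, 0.1)))  # "downstream"
```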


Amy suggested that the map lines (rivers) needed to be even simpler than they are now, just a few basic arcs, to be easier to follow.  Would the standard (Douglas-Peucker) line simplification be suitable for this?  Jake thinks not; probably manual simplification will be necessary.  Splining might help but probably not enough to make a big difference.
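For reference, the standard Douglas-Peucker algorithm is simple enough to prototype directly if we want to test how rivers look under aggressive tolerances; a minimal sketch:

```python
import math

def douglas_peucker(points, epsilon):
    """Classic Douglas-Peucker simplification: keep the point farthest
    from the chord if it exceeds epsilon, then recurse on both halves."""
    def perp_dist(p, a, b):
        # Perpendicular distance from p to the line through a and b.
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        return abs(dy * px - dx * py + bx * ay - by * ax) / math.hypot(dx, dy)

    if len(points) < 3:
        return points
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]          # flatten this stretch
    left = douglas_peucker(points[: idx + 1], epsilon)
    right = douglas_peucker(points[idx:], epsilon)
    return left[:-1] + right
```

Cranking epsilon up would give the "handful of points" version Amy is asking about, which we could compare against a manually simplified river.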

The Map Publisher program came up, but I have forgotten why ... was it for line simplification?  For manual modification of a map?

Jake suggested voice input, especially for orientation tasks ("where am I?", "what is this?").  Apparently there is a Google or Garmin application that uses voice input for mapping.  Since we don't want the user to have to move a hand from the mouse or other pointing device, voice input might be a useful input modality.

Orientation:  What sort of "where am I?" information is needed, and in what form?  Jim M. thought that users may be getting more used to latitude and longitude as positional information because of the widespread use of GPS.  Another approach is relative position: "200 yards north of the Erb Memorial Union."  We really don't know what is most useful at this point, and it might be task-dependent.
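A rough sketch of the relative-position idea, assuming projected coordinates in meters and a gazetteer of landmark locations (the landmark data itself would be a separate problem):

```python
import math

def relative_position(user, landmark, landmark_name):
    """Describe the user's position relative to a named landmark,
    e.g. "200 yards north of the Erb Memorial Union".
    Assumes projected (x, y) coordinates in meters."""
    dx = user[0] - landmark[0]
    dy = user[1] - landmark[1]
    yards = math.hypot(dx, dy) / 0.9144
    angle = math.degrees(math.atan2(dx, dy)) % 360  # 0 = north, clockwise
    dirs = ["north", "northeast", "east", "southeast",
            "south", "southwest", "west", "northwest"]
    direction = dirs[int((angle + 22.5) // 45) % 8]
    return f"{round(yards)} yards {direction} of {landmark_name}"

print(relative_position((0, 183), (0, 0), "the Erb Memorial Union"))
```

Either this or a latitude/longitude readout could be spoken on a "where am I?" query; which one users prefer is exactly the open question.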

Amy suggested that the developers explore the map with everything displayed in white (i.e., invisible) to get a more realistic sense of what does and doesn't work in navigation.

Two examples of spatial reasoning / exploration came up in the meeting:
  • Display includes base map of US, cities with population indicated, roads.  Explore roads and cities.  How are they related?  (Bigger cities have denser networks of roads.) 
  • Yellowstone: What river is nearest Old Faithful?  Are there areas of Yellowstone without geysers?

Thoughts:
  • Perhaps we should experiment with radical Douglas-Peucker line simplification, to see what happens when a river is simplified down to just a handful of points, and also experiment with spline versus straight line segment representation (if ArcGIS supports splines). 
  • We can start building a foundation for different kinds of sounds for different kinds of events (entering and leaving a region, moving or pausing within a region, etc.).
  • A simple way to explore the "white screen" version of a soundscape map or mGIS is to cover or turn away the screen.
