Virtual Reality and the WorldToolKit for Windows

A C library for constructing high-end virtual worlds in a 32-bit environment

Ron Fosner

Ron is a principal software developer at Lotus Development, where he researches and develops graphical and interactive techniques for data analysis and exploration. Ron can be contacted at ron@lotus.com.


One maxim in software engineering is that if it needs to be faster, do it in hardware. And when it comes to virtual reality, software engineers appear to have taken this maxim to heart. Serious virtual-reality (VR) packages often require huge amounts of disk space, memory, and computing power, as well as specialized hardware such as data gloves and goggles. Indeed, the leading workstation platform for VR applications is a Silicon Graphics workstation with a RealityEngine dedicated graphics processor.

At the same time, Windows is sorely underpowered when it comes to graphics, particularly those needed for fast rendering of three-dimensional shaded objects. However, the WorldToolKit for Windows (WTKWIN) from Sense8 overcomes such obstacles without resorting to high-performance hardware. The WorldToolKit is a library of over 400 C routines designed to provide optimized performance of interactive 3-D programs under any 32-bit version of Microsoft Windows (Win 3.1 with Win32s to Windows NT 3.5). The toolkit takes care of most of the requirements of VR, providing 3-D rendering, object interactivity, drivers for a number of input devices, and other items necessary for simulating a virtual world.

In this article, I'll create a VR application that tracks down a contamination problem in a hypothetical town. The test problem I've designed lends itself particularly well to modeling. In the process, I'll examine how the WorldToolKit is used to construct a virtual world, cover some of the limitations you can expect when using the toolkit, and discuss the overall system requirements when creating a virtual world.

The WorldToolKit

The WorldToolKit is a real-time graphics development environment for building 3-D simulations and virtual-reality applications. Although in this article I'll focus on the Windows version of WorldToolKit, it is also available for platforms ranging from UNIX-based workstations (SGI Irix, Sun Solaris, Kubota Kenai, to name a few) to PCs (running Windows or DOS). This is important because WorldToolKit's hardware independence makes cross-platform development possible. A WorldToolKit application written for DOS, for instance, will compile on high-end workstations.

WTKWIN contains routines to control, view, change, and interact with objects in a 3-D view. The user operates any of a number of input devices to move around or manipulate objects in the world view. WTKWIN automatically displays a view into the virtual world and takes care of all aspects of the display, including the view perspective, shading, texture mapping, display updating, and querying input devices. All the user needs to do is design the virtual world's objects and their behavior, hook up the input devices, and run the simulation. Additionally, WTKWIN can read 3-D objects created with tools such as Autocad, 3D Studio, Swivel 3D, or other modelers that generate .DXF or .3DS files.

A WTKWIN simulation contains a number of items. The first, called a "universe," holds all things to be simulated. Things can be dynamically added to or deleted from the universe, and the universe can be started or stopped. The universe can also contain a "portal," a polygon that loads another universe or executes some user-defined function after the user has passed through it. The basic graphical entity in a universe is an "object." An object is anything that resides in the universe and is usually represented graphically. There are both static and dynamic objects. Static objects are usually the background of the universe. They neither move nor change during the simulation. In a simulation of a house, the walls, floor, and ceiling would be static. A dynamic object can change or interact in the simulation. A dynamic object can also have a task associated with it. For example, if an object is affected by gravity, the task associated with the object moves it according to the laws of gravity. Objects can be collected into hierarchies that affect each other. If you have a box with a hinged lid, the lid can swing open or closed. But if you drop the box, the lid goes with it. In addition, you can create objects on the fly as the simulation runs.

An object is described as a collection of polygons, each of which, in turn, is described by a collection of vertices (corners). For example, the simplest polygon is a triangle, which consists of three vertices. In WTKWIN, a polygon can have a maximum of 256 vertices. An object is also described by its appearance: It's given attributes such as color, texture, and size. All attributes and vertices can be changed dynamically.
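To make the geometry concrete, here is a minimal sketch in plain C of an object built from polygons of vertices. These structures are illustrative only; they are not WTKWIN's internal representation, which isn't documented here.

```c
#include <assert.h>

#define MAX_POLY_VERTICES 256   /* WTKWIN's per-polygon vertex limit */

typedef struct { float x, y, z; } Vertex;

typedef struct {
    int      nvertices;              /* 3 .. MAX_POLY_VERTICES */
    Vertex   v[MAX_POLY_VERTICES];
    unsigned color;                  /* one appearance attribute */
} Polygon;

/* A triangle is the simplest legal polygon: exactly three vertices. */
static Polygon make_triangle(Vertex a, Vertex b, Vertex c, unsigned color)
{
    Polygon p;
    p.nvertices = 3;
    p.v[0] = a; p.v[1] = b; p.v[2] = c;
    p.color = color;
    return p;
}
```

An object would then simply be an array of such polygons, with the attributes free to change from frame to frame.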

Light sources can also be created for the universe. You can specify ambient light (the amount of brightness that illuminates everything in the universe) and actual light sources, which have direction and brightness. During each frame of the simulation, WTKWIN automatically handles the shading of each polygon in the universe, thus presenting the user with a realistically shaded view.
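WTKWIN's exact shading model isn't documented here, but the standard flat-shading computation combines the ambient term with a Lambertian diffuse term for each directional light. A sketch, assuming unit-length vectors:

```c
typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Per-polygon intensity: ambient light plus a diffuse term proportional
   to the angle between the polygon's normal and the light direction.
   'normal' and 'to_light' are assumed to be unit vectors. */
static float shade(float ambient, float brightness, Vec3 normal, Vec3 to_light)
{
    float d = dot3(normal, to_light);
    if (d < 0.0f) d = 0.0f;              /* polygon faces away from the light */
    float i = ambient + brightness * d;
    return i > 1.0f ? 1.0f : i;          /* clamp to the displayable range */
}
```

Since this runs for every polygon on every frame, it is part of the per-frame cost that makes polygon count the dominant performance factor.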

Viewpoint objects are the frame of reference from which a view is rendered. Typically, a viewpoint object is attached to a sensor object (like a mouse, forceball, or joystick). Thus, when the sensor is moved, so is the viewpoint. A simulation can have multiple viewpoints, sensors, and lights.

A WTKWIN program spends most of its time in a simulation loop. The static and dynamic objects are created, the sensors are initialized, the universe is started, and the simulation loop begins; see Figure 1. Inside the loop, the sensors are read, tasks are performed, positions are updated, and the universe is rendered. Windows events are also queried, along with sensor input. Most impressive, however, is that WTKWIN insulates you from Windows, while providing versatility. You can take a program written for WTKWIN, and if you haven't added any Windows-specific items, you can port it with little change to an SGI workstation. On the other hand, you can add a regular Windows menu onto the VR window and make calls to the Windows API, including GDI calls to draw into the VR window.
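Conceptually, the simulation loop of Figure 1 looks like the sketch below. In a real WTKWIN program you never write this loop yourself; WTuniverse_go() runs it and calls your registered action function each cycle. The stub functions here are hypothetical stand-ins for the real work.

```c
/* Hypothetical stand-ins for the four phases of one simulation frame. */
static int  frames_rendered = 0;
static void read_sensors(void)     { /* poll mouse, spaceball, keyboard */ }
static void perform_tasks(void)    { /* run each dynamic object's task  */ }
static void update_positions(void) { /* apply sensor input to viewpoint */ }
static void render_universe(void)  { frames_rendered++; }

/* Run the loop for a fixed number of frames; the real loop runs until
   the user quits. Returns the running frame count. */
static int simulate(int nframes)
{
    for (int i = 0; i < nframes; i++) {
        read_sensors();
        perform_tasks();
        update_positions();
        render_universe();
    }
    return frames_rendered;
}
```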

VR for Problem Analysis

Imagine that a serious problem has surfaced in the town of Virtual Falls--a resort community with little native industry other than tourism. The local government has noticed the lake in the area (a major attraction for vacationers) showing increasing levels of a contaminant that is causing algae to grow progressively faster during the summer months. If this contaminant is not removed, the lake will eventually be choked, killing most of the fish and ruining the aesthetic appeal of the lake. The pollutant has been determined to be the result of a chemical reaction between the runoff from an old mine in the mountains and a component in an ash typically found as a byproduct from a paper-products factory. The ash builds up over the dry summer and winter months, then reacts with the runoff when the spring rainy season starts. Only two nearby paper-products factories could possibly be the source of the ash, but they each place the blame on the other.

Each company's ash contains a chemical that will react with the mine runoff. The two ashes will also react with each other in a chemical reaction that depletes the chemical before it can react with the runoff. This prevents the local government from determining the extent of each company's responsibility. The available evidence includes measurements taken throughout the area of just three chemical levels: the mine runoff, Company A's ash, and Company B's ash. Since these measurements were taken before the start of the rainy season, there is no information on how the two ashes react prior to meeting up with the mine runoff. All the authorities have are the raw data, the knowledge that the two ashes consume each other in equal parts, and the minimum levels of each chemical required to react with the mine runoff.
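The mutual-depletion rule is simple enough to sketch directly. The function names and thresholds below are illustrative, not taken from the article's measured data:

```c
typedef struct { float ash_a, ash_b; } AshLevels;

/* The two ashes consume each other in equal parts, so the smaller of
   the two measured levels is consumed entirely and only the excess of
   the larger survives to react with the mine runoff. */
static AshLevels deplete(float measured_a, float measured_b)
{
    AshLevels r;
    float consumed = measured_a < measured_b ? measured_a : measured_b;
    r.ash_a = measured_a - consumed;
    r.ash_b = measured_b - consumed;
    return r;
}

/* A reaction with the runoff occurs only if the surviving ash and the
   runoff both meet their minimum required levels. */
static int can_react(float remaining_ash, float runoff,
                     float min_ash, float min_runoff)
{
    return remaining_ash >= min_ash && runoff >= min_runoff;
}
```

Applying this at every measured location is, in effect, what the program does when more than one chemical overlay is active.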

Construction of the Virtual World

The first step in creating the virtual world is to generate the underlying topographical data. Most VR systems accept some form of a geometry file describing various 3-D objects. However, 3-D terrain is usually fairly data intensive. For example, the French SPOT satellite data typically has 20-meter resolution, which translates to about 1,300,000 pixels for a 10-square-mile area. Trying to run over a million polygons through a couple of transformation matrices and lighting calculations at a rate of 15 times per second (or faster) is problematic at best. Consequently, I took advantage of WTKWIN's ability to read in an ASCII format file called a "neutral file format" (NFF)--a collection of vertex and polygon definitions. Thus, I was able to create a mathematical model of the major terrain features and generate an NFF file at any resolution desired.

The next step is adding features. For this example, I want to use a visual representation of the data, voice command and response, audio cues, and interactivity. Additionally, I want the user to be able to travel in, around, or through the data, and to query the data. For example, to query the program about the ash concentrations at a particular point, you select that point with the mouse. WTKWIN reduces the mouse pick to a particular polygon in the virtual world, from which we can extract the coordinates in the nonvirtual world. Using these coordinates, we can get the chemical concentrations at that location and then pass them to the output routines.

The next step is processing the user's request to change some aspect of the VR world. The request can come via voice command, direct manipulation of objects, or a menu interface placed upon the VR view. Some VR toolkits have reasonably complete menu systems that operate in a manner similar to that of Windows; in the VR view, the menu system remains fixed in front of the user's view, much like you'd expect to see in a head-mounted display. WTKWIN doesn't have a formal menuing system. However, one of the demo programs creates an array of buttons that remain in front of the user, with each button having a bitmap texture applied to it. Alternatively, you could create your own menu system using WTKWIN's 3-D text objects to create a true 3-D menu. Or you can use Windows' menu system and have a menu across the top of the window, just like any other Windows program.

The Virtual World

The various parts that make up the VR test program have been assembled. The program reads in the terrain model, initializing the user viewpoint to be over the southwest corner, looking inward; see Listing One, page 102. By default, the user has a keyboard interface and a mouse interface. If a 6-D device is specified on the command line, it's used as the motion sensor. If no 6-D device is specified or found, then the mouse is used as the motion sensor. Once the world is initialized and the simulation loop has started, the user can fly around the world or change it. The user can bring up a menu in the VR window that contains a number of options.

The first two buttons that pop up in the VR display provide an exit from the program and a method to reset the viewpoint, respectively. The next button toggles the voice annotation, while the last one brings up three other buttons, which modify the landscape by indicating levels of the various chemicals we're interested in. As each chemical button is toggled, an indicator appears in the upper-left corner and that chemical's concentrations are calculated. Chemical levels sufficient to react are then displayed. When more than one chemical is active, the chemical reactions are accounted for, and the remaining chemical levels that can still react are displayed. Each chemical has a different color, so the mixtures of chemicals plus the resulting pollutant can easily be seen. Thus the user can toggle the runoff and each ash type and discover the dispersion areas. When all three are toggled, you can fly around the landscape, tracing the pollution back to its source. If you trace the pollutant up a river, you can see where the pollutant enters the river. Tracing it back to its source, you can fix on the origin of the pollution--in this case, Company A. The complete program, including the full source code, is available electronically; see "Availability," page 3.

Hardware and Software Requirements for PC VR

I started off working on a Compaq 486/50, using a mouse, a 32K-color video board, and a MediaVision ProAudio 16 sound board. With the addition of a 6-D force ball, this platform was acceptable for WTKWIN programs of up to medium complexity: about 1000 polygons at a low refresh rate (less than six frames/sec). The important parts are the computer and the video board. You'll need a fast computer with a math coprocessor and as fast a video board as you can get, preferably one with a VL or PCI bus.

If you decide to go all out, you may want to consider a faster processor, possibly a Pentium-based PC with a PCI bus, and a 6-D input device. The Pentium is a big jump up from a 486 due to its redesigned math coprocessor, and the PCI bus makes for a fast graphics pathway. 6-D input devices make interaction with the virtual world much more natural. Once you experience using one, you'll discover how intuitive 6-D motion can be. Throw in a MIDI sound board and a video board and monitor capable of at least 800x600x32K colors, and you have quite a respectable PC VR platform. In my case, I used a Compaq Deskpro 5/50M (60-MHz Pentium), coupled with a Matrox MGA Impression video board with 3 Mbytes of RAM, a Media Vision ProAudio Studio 16 sound board, and an IDEK 21-inch monitor.

For information on 6-D input devices, contact Logitech (510-795-8500), Spaceball Technologies (508-970-0330), or CIS Graphics (508-692-9599). Prices range from about $300 to $1500.

Other options include the Mattel PowerGlove (if you can find one) or the Sega Visor (due out sometime soon). WTKWIN will support both. The toolkit also supports 3-D stereo video monitors from StereoGraphics, which can produce some dazzling effects without a head-mounted display. However, you still have to wear LCD shutter glasses that look like thick, nerdy sunglasses. In total, you're looking at between $5000 and $7000 for a VR development system, including the computer.

Besides basic Windows 3.1 with Win32s installed, WTKWIN will run under Windows NT and Windows for Workgroups. For development, you'll need a 32-bit C/C++ compiler such as Watcom C/C++ 9.5, Borland C++ 4.x, Symantec C++ Professional 6.1, or Microsoft Visual C/C++ 32-bit 1.1. I used the Microsoft compiler. You'll also want a 3-D modeling program to design the parts of your virtual environment, and a paint program to create and edit textures with which to paint the polygons.

Summary

Generally, I found the process easier than I expected. The toolkit performed well and made development of a real application possible. In the end, I came out with a fairly detailed virtual reality, complete with interactivity and voice annotation. The speed at which you can fly around is good, as are the refresh rates (around 5--6 times/sec). Although I was using somewhat high-end equipment, this is the target that most PC-based VR will be designed for. And as better video hardware and rendering boards become available, expect to see some dramatic improvements in rendering times.

For More Information

WorldToolKit for Windows Toolkit

Sense8 Corp.

4000 Bridgeway, Suite 101

Sausalito, CA 94965

415-331-6318

$795.00

Figure 1 The simulation loop.

Listing One

/***********************************************************************
Function: WTuser. After the WTK initialization routines are done, they call the 
routine WTuser, which can be compared to the normal C function main. WTuser is
passed argc & argv, just like main is. The universe and sensors are created and
initialized here, and actions are connected to the universe or its objects.
***********************************************************************/

int WTSTD WTuser(int argc, char *argv[])
{
    printf("Geological Terrain/Pollution VR Demo\n");
    printf("using the Sense8 WorldToolKit\n");
    printf("Programmed by Ron Fosner, 1994\n");
    printf("Parts of this program are Copyright 1994 Sense8 Corporation\n");
    // read command line arguments
    ScanCommandLineArgs(argc, argv);
    // initialize the static universe
    printf ("Creating new universe\n");
    WTuniverse_new(WTDISPLAY_DEFAULT, WTWINDOW_DEFAULT);
    uview = WTuniverse_getviewpoint();
    // prepare to read keyboard
    WTkeyboard_open();
    // Load in the terrain NFF file
    LoadTheUniverse();
    // load in the texturemaps for the buttons
    LoadTheButtons();
    // create the industrial sites and place them in the terrain
    LoadTheIndustrialSites();
    // load some default lights (these are locations & directions in a file)
    printf ("Loading lights\n");
    if ( !WTlight_load("lights") )
        printf("Couldn't read lights\n");
    // set universe action function
    WTuniverse_setactions(UniverseActions);
    printf("Universe ready\n");
    WTuniverse_ready();
    // OK, the universe is set, now hook it up to the outside world
    InitTheSensors();
    // enter main loop
    printf("Universe go\n");
    WTuniverse_go();  // we'll remain in this function till the user quits
    // all done - clean up
    WTuniverse_delete();
    return 0;
}
/* Function: LoadTheUniverse. Loads NFF file that describes the terrain model.
It also allocates memory to hold the initial colors of the polygons
that make up the universe for later replacement of modified colors. */
void LoadTheUniverse()
{
    WTpq modelpq;
    printf ("Loading stationary model: '%s'\n",universe_model);
    if ( !WTuniverse_load(universe_model,&modelpq,1.0) )
      {
      // Use the supplied WTerror function that will simply
      // write an error message to the text window
      WTerror("Couldn't load file '%s'", universe_model);
      }
    WTviewpoint_setposition(uview, modelpq.p);
    WTviewpoint_setorientation(uview, modelpq.q);
    WTviewpoint_zoomall(uview);  // make sure we can see it all
    // now save the initial viewpoint so the user can get back
    // to the original position in case they get lost
    WTviewpoint_getposition(uview, initial_pq.p); 
    WTviewpoint_getorientation(uview, initial_pq.q);
    printf("There are %ld polygons in the stationary universe\n",
      WTuniverse_npolygons());
    // Now save the original colors
    poly_array_pointer = malloc( sizeof(unsigned) * WTuniverse_npolygons() );
    if ( NULL != poly_array_pointer )
      {
      SavePolyColors();
      }
    else
      {
      printf ("Not enough memory to save the polygon colors\n");
      }
}
/* Function: LoadTheButtons. Create the VR view's UI, an alternate UI that
hovers directly in front of the user's VR view. We create "buttons" that the
user can press (using the right mouse button) to trigger various tasks.
We specify a bitmap to paint each button. We also create a second-level menu 
system that is displayed when a first-level button is toggled. */
void LoadTheButtons(void)
{
    Impart("Loading button images");
    // associate a bitmap with a task
    NewButton("sunset",     0.0, 0.95, QuitTask);
    NewButton("buteye",     0.2, 0.95, ResetViewpointTask);
    NewButton("uvula",        0.4, 0.95, VoiceAnnotateTask);
    NewButton("ash1",        0.6, 0.95, Ash1Task);
    NewButton("info",         0.8, 0.95, InfoTask);
    NewButton("butbulb1",    1.0, 0.95, ModifyLightingTask);
    // These are the 2nd tier menu buttons
    // Lighting tasks
    UpButton   = NewButton("butarrwr", 1.0, 0.75, BrightenLightTask);
    DownButton = NewButton("butarrwl", 1.0, 0.55, DimLightTask);
}
/* Function: NewButton. Create a 3D button object in the universe. The button
is given a task (ButtonTask) which ensures that the button object follows the
viewpoint, so that the interface is always in front of the user. The button is
also given an action, which is what occurs when the user presses the button. */
WTobject * NewButton(
    char * texturename,   // file containing button image
    float x,              // x screen coord of button. between 0.0 and 1.0
    float y,              // y screen coord of button. between 0.0 and 1.0
    void (* button_action)() // what the button does when activated
    )
{
    Buttoninfo *info;
    WTobject *o;
    Buttonlist *blist;
    WTpq modelpq;
    // load in button object template, a simple rectangle
    // (We'll differentiate them by specifying a unique bitmap texture)
    o = WTobject_new("button.nff", &modelpq, 1.0, FALSE, TRUE);
    if ( !o )
      {
      WTerror("Couldn't load file 'button.nff'");
      }
    if ( !WTobject_settexture(o, texturename, FALSE, FALSE) )
      {
      printf("Couldn't apply texture %s\n",texturename);
      printf("Perhaps the textures in images\\"
          "buttons are not on your VIM path?\n");
      }
    info = malloc(sizeof(Buttoninfo));
    info->x = x;
    info->y = y;
    info->action = button_action;
    WTobject_setdata(o, (void *)info);
    // Assign task that'll keep all buttons aligned
    WTobject_settask(o, ButtonTask);
    // chain button into global buttonlist
    blist = malloc(sizeof(Buttonlist));
    blist->button = o;
    blist->next = buttons;
    buttons = blist;
    // interface is initially off
    WTobject_remove(o);
    return o;
}
/* Function: ButtonTask. A fairly complicated function I stole from Sense8 
to keep the user buttons right in front of the viewpoint. Thus, whenever the
user swings the viewpoint around, the buttons swing right along with it. */
void ButtonTask( WTobject *obj )
{
    WTq q;
    WTp3 p;
    Buttoninfo *info;
    long x0, y0, x1, y1;
    WTwindow *curr;
    float horiz_angle, vert_angle, height, width;
    WTviewpoint_getorientation(uview, q);
    WTviewpoint_getposition(uview, p);
    WTobject_setorientation(obj, q);
    WTobject_setposition(obj, p);
    curr = WTuniverse_getcurrwindow();
    WTwindow_getposition(curr, &x0, &y0, &x1, &y1);
    width = x1;
    height = y1;
    // fetch the data stored with the button
    info = (Buttoninfo *)WTobject_getdata(obj);
    // Fetch the current viewpoint's half angle
    horiz_angle = 2.0 * WTviewpoint_getviewangle(uview);
    // and vertical viewing angle 
    vert_angle = horiz_angle * ( height / width );
    // calculate the new location of the buttons so that they remain in the
    // same relative place with respect to our viewing position. This has the
    // effect of making them "float" in space directly in front of our viewpoint.
    p[Z] = 5.6;
    p[X] = (info->x - 0.5) * p[Z] * horiz_angle;
    p[Y] = (info->y - 0.5) * p[Z] * vert_angle;
    // OK, so translate the button's position
    WTobject_translate(obj, p, WTFRAME_VPOINT);
}
/* Function: ButtonsToggle. Adds/Removes the buttons from the universe */
void ButtonsToggle( void )
{
    static FLAG button_control_on = FALSE;
    button_control_on = !button_control_on;
    if ( button_control_on )
      {
      ButtonsAdd();
      }
    else
      {
      ButtonsRemove();
      }
}
/* Function: ButtonsAdd. Adds nontransient buttons to the universe using 
WTobject_add. They have a location, and ButtonTask will ensure they're correctly
drawn when the next screen refresh occurs. */
void ButtonsAdd()
{
    Buttonlist *blist;
    for ( blist=buttons ; blist ; blist=blist->next )
        {
        // skip buttons that are transient
        if (       blist->button != UpButton
                && blist->button != DownButton
            )
            {
            WTobject_add(blist->button);
            }
        }
}
/* Function: ButtonsRemove. Removes buttons from the universe */
void ButtonsRemove()
{
    Buttonlist *blist;
    for ( blist=buttons ; blist ; blist=blist->next )
        {
        WTobject_remove(blist->button);
        }
}
/* Function: ButtonsAction. Checks whether the picked object is one of our
buttons; if so, runs the action stored with it. */
FLAG ButtonsAction( WTobject *obj )
{
    Buttonlist *blist;
    Buttoninfo *info;
    for ( blist=buttons ; blist ; blist=blist->next )
        {
        if ( blist->button==obj )
            {
            info = (Buttoninfo *)WTobject_getdata(obj);
            info->action();
            return TRUE;
            }
        }
    return FALSE; /* no button picked */
}
/* Function: LoadTheIndustrialSites. Creates simple block objects to mark the 
mine and the two ash-deposition sites, colors them near black, and places them 
in the terrain. */
void LoadTheIndustrialSites( void )
{
    WTobject * mine_obj, * ash1_obj, * ash2_obj;
    WTp3 pos;
    /* first place the mine */
    mine_obj = WTobject_newblock(1,1,1,FALSE,TRUE);
    ash1_obj = WTobject_newblock(5,5,5,FALSE,TRUE);
    ash2_obj = WTobject_newblock(5,5,5,FALSE,TRUE);
    if (    NULL == mine_obj ||
            NULL == ash1_obj ||
            NULL == ash2_obj  )
        {
        Impart("Industrial sites could not be created");
        return;
        }
    /* set color to near black */
    WTobject_setcolor( mine_obj, 0X111 );
    WTobject_setcolor( ash1_obj, 0X111 );
    WTobject_setcolor( ash2_obj, 0X111 );
    /* mine position */
    pos[X] = -27.0; pos[Z] = +39.0; pos[Y] = -23.2;
    WTobject_setposition( mine_obj, pos );
    /* ash sites */
    pos[X] = -53.0; pos[Z] = -39.0; pos[Y] = -16.3;
    WTobject_setposition( ash1_obj, pos );
    pos[X] = +36.0; pos[Z] = +6.0; pos[Y] = -19.7;
    WTobject_setposition( ash2_obj, pos );
}
/* Function: InitTheSensors. Attempt to initialize the specified sensor and 
mouse; this is where we take care of WTK fundamentals like attaching a sensor 
to a viewpoint. While deceptively simple, this is an important point of using 
the WTK: you can get a lot of mileage out of a few simple calls. */
void InitTheSensors(void)
{
    printf ("Setup sensors\n");
    // set up the sensors as requested on command line
    if ( use_geoball )
        {
        sensor = geoball = WTgeoball_new(com[geoball_on]);
        }
    if ( use_spaceball )
        {
        /* Since I'm not that great juggling 6 dimensions, you can use the
        Spaceball mode that limits the user to one dimension at a time; 
        just use the next line instead of the one below it */
//         sensor = spaceball = WTspaceball_newdominant(com[spaceball_on]);
         sensor = spaceball = WTspaceball_new(com[spaceball_on]);
        }
    // if not using any other sensors, use mouse
     if ( !use_spaceball && !use_geoball )
        {
        mouse = WTsensor_new(WTmouse_open, WTmouse_close, 
                WTmouse_moveview2, NULL, 1, WTSENSOR_DEFAULT);
        if (mouse)
            {
            movemouse = TRUE;
            sensor = mouse;
            }
        else
            {
            printf("Unable to open mouse!\n");
            }
        }
    else
        {
        movemouse = FALSE;
        /* we need mouse for polygon picking. */
        mouse = WTsensor_new(WTmouse_open, WTmouse_close,
            WTmouse_drawcursor, NULL, 1, WTSENSOR_DEFAULT);
        } 
    // Use a WTK function to scale sensor speed with size of universe
    // This is one of the "magic" things that make life easier
    WTsensor_setsensitivity(sensor, 0.1 * WTuniverse_getradius());
    normalspeed = WTsensor_getsensitivity(sensor);
    /* OK, now here is an important part of using WTK. Attach the selected
    sensor to the viewpoint (you can attach a sensor to just about anything, 
    like a box, but here we want to move the viewpoint, not an object). This
    allows you to easily connect the viewpoint to a manipulatory device. WTK 
    takes care of the universe from here! */
    if (use_geoball)
      {
      WTviewpoint_addsensor(uview, geoball);
      }
    if (use_spaceball)
      {
      WTviewpoint_addsensor(uview, spaceball);
      }
    if (mouse)
      {
      WTviewpoint_addsensor(uview, mouse);
      }
}
/* Function: UniverseActions. Registered with WTK via WTuniverse_setactions()
in WTuser and called every simulation cycle; the equivalent of a message queue.
This is where we poll the input devices and act upon any changes from the user.
If we make any changes to the universe (like changing the color of a polygon),
WTK will take care of it in the next cycle; we don't have to take any other action. */
void UniverseActions()
{
    int    key;
    WTobject *obj;
    // These are the defined actions if we are using the Spaceball...
    if ( use_spaceball )
        {
        /* stop by pressing 8 on the spaceball */
        if ( WTsensor_getmiscdata(spaceball) & WTSPACEBALL_BUTTON8 )
            {
            QuitTask();
            }
        /* teleport to initial view by pressing button 7 */
        else if ( WTsensor_getmiscdata(spaceball) & WTSPACEBALL_BUTTON7 )
            {
            ResetViewpointTask();
            }
        /* turn on/off the buttons by pressing button 1 */
        else if ( WTsensor_getmiscdata(spaceball) & WTSPACEBALL_BUTTON1 )
            {
            ButtonsToggle();
            }
        }
    /* These are the defined actions if we are using the Geometry Ball... */
    if ( use_geoball )
        {
        /* stop by pressing both buttons on the Geoball */
        if ( WTsensor_getmiscdata(geoball) ==
                            (WTGEOBALL_LEFTBUTTON|WTGEOBALL_RIGHTBUTTON) )
            {
            QuitTask();
            }
        /* toggle buttons by pressing the left button on the Geoball */
        else if ( WTsensor_getmiscdata(geoball) == WTGEOBALL_LEFTBUTTON )
            {
            ButtonsToggle();
            }
        }
    /* Mouse actions. This is only active if the mouse is not being used for 
       movement. If users have no other device for movement, then they must 
       toggle between mouse movement and mouse picking. */
    // left mouse button to pick polygons (like terrain)
    if ( !movemouse && (WTsensor_getmiscdata(mouse) & WTMOUSE_LEFTBUTTON))
        {
        poly = WTuniverse_pickpolygon(*(WTp2*)WTsensor_getrawdata(mouse));
        if ( poly )
            {
            WTp3 c_g;
            WTpoly_getcg(poly,c_g);
            // Simplified version: just calculate the ash 1 concentration rather
            // than the total concentration depending upon which chemical
            // composition buttons are active. We only care about ash 1 conc.
            ImpartAsh1ConcentrationAtLocation(c_g);
            }
        }
    // right mouse button to pick objects (like buttons)
    if ( !movemouse && (WTsensor_getmiscdata(mouse) & WTMOUSE_RIGHTBUTTON))
        {
        // get the location of the object under the mouse cursor
      obj = WTuniverse_pickobject(*(WTp2*)WTsensor_getrawdata(mouse));
        if ( obj )    // is there one under the cursor?
            {
            ButtonsAction(obj); // see if it's a button; if so, do its task
            }
        }
    // process the keyboard (Notice that I don't use any Windows calls...)
    key = WTkeyboard_getkey();
    if (key)
      {
      HandleKeyPress(key);  // pass it off to the key handler
      }
}
/* Function: HandleKeyPress. Key handler. If the user is 6D-input-device 
impaired, this provides an alternate input path to access the functionality 
of both the on-screen buttons and the 6D input device buttons. */ 
void HandleKeyPress(int key)
{
    // interpret keypresses, and if we recognize one, process it
    switch ( key )
        {
        case 'b':  // Toggle the button interface
            ButtonsToggle();
            break;
        case 'f':    // Flip mouse move <-> mousepick
            movemouse ^=1;
            /* switch between using mouse to move and using it to point */
            if (movemouse)
                {
                Impart("Use mouse to move around world");
                WTsensor_setupdatefn (mouse, WTmouse_moveview2);
                }
            else
                {
                Impart("Use mouse to select objects");
                WTsensor_setupdatefn (mouse, WTmouse_drawcursor);
                }
            break;
        case 'i':    // Display status information
            InfoTask();
            break;
        case 'q':    // Quit
            QuitTask();
            break;
        case '!':    // '!' resets view back to initial view
            ResetViewpointTask();
            break;
        // special resolution modification keys that are driver dependent
#if DVI || SPEA
        case '2':
            printf("Set LOW resolution\n");
            WTuniverse_setresolution(WTRESOLUTION_LOW);
            break;
#if SPEA
        case '3':
            printf("Set Medium resolution\n");
            WTuniverse_setresolution(WTRESOLUTION_MEDIUMRGB);
            break;
#elif DVI
        case '4':
            printf("Set Adaptive resolution\n");
            WTuniverse_setresolution(WTRESOLUTION_ADAPTIVE);
            break;
#endif
        case '5':
            printf("Set HIGH resolution\n");
            WTuniverse_setresolution(WTRESOLUTION_HIGH);
            break;
#endif /* DVI || SPEA */
        default:        // unrecognized key press? - then display help text
            DisplayHelpTask();
            printf("\nEnter command..\n");
            break;
        }// end o' switch
}
/* Function: ScanCommandLineArgs. Process command line args to see if user 
specified an alternate input device. We accept -[G|S][1|2] to specify device
and the serial port it's connected to. */
void ScanCommandLineArgs(int argc, char *argv[] )
{
    while (--argc > 0)
        {
        if ('-' == (*++argv)[0])
          {
            switch ((*argv)[1])
                {
                case 'g': // GeoBall
                case 'G':
                    use_geoball = TRUE;
                    geoball_on = (*argv)[2] - '1'; /* Convert from ASCII */
                    break;
                case 's': // SpaceBall
                case 'S':
                    use_spaceball = TRUE;
                    spaceball_on = (*argv)[2] - '1';
                    break;
                default:
                    WTerror("Unrecognized argument -%c",(*argv)[1]);
              } // switch
          } // if
        } // while
}
/* Function: SavePolyColors. Saves the colors of all the polygons in the
static universe. It demonstrates use of the WTuniverse_getpolys() and
WTpoly_next() functions, which enable you to visit every polygon
in the universe. The value of each polygon's color is saved in an array. */
void    SavePolyColors( void )
{
    unsigned * c;
    for ( poly = WTuniverse_getpolys(), c = poly_array_pointer ;
            poly != NULL ;
            poly = WTpoly_next(poly)
         )
        {
        *c++ = WTpoly_getcolor(poly);
        }
}
/* Function: RestorePolyColors. Restores the original polygon colors. Similar
to SavePolyColors(), it demonstrates how you can rely on the polygon order to 
know which polygon you're operating on. */
void    RestorePolyColors( void )
{
    unsigned * c;
    if ( NULL == poly_array_pointer )
        return;
    for ( poly = WTuniverse_getpolys(), c = poly_array_pointer ;
            poly != NULL ;
            poly = WTpoly_next(poly)
         )
        {
        WTpoly_setcolor(poly,*c++);
        }
}
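The two functions above rely on visiting the polygons in the same order both times. That save/restore pattern can be sketched standalone (plain C with hypothetical names, no WTK dependency — the original uses the polygon list in place of a plain array):

```c
#include <stddef.h>

enum { NPOLYS = 3 };
static unsigned colors[NPOLYS];   /* "live" polygon colors */
static unsigned saved[NPOLYS];    /* backup, playing the role of poly_array_pointer */

/* Walk the polygons in order, recording each color */
static void save_colors(void)
{
    size_t i;
    for (i = 0; i < NPOLYS; ++i)
        saved[i] = colors[i];
}

/* Walk them again in the same order, putting each color back */
static void restore_colors(void)
{
    size_t i;
    for (i = 0; i < NPOLYS; ++i)
        colors[i] = saved[i];
}
```

The pattern only works because the traversal order is stable between the save and the restore, which is exactly the property the listing's comment calls out.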
/* Function: LoopThroughPolys. Demonstrates how to call a function to operate 
on each polygon in the static universe. In this case, the called function
receives the center of gravity of each polygon, which is used to calculate
the pollution concentration for the polygon and to set its color. */
void    LoopThroughPolys( void (*func)(WTp3) )
{
    WTp3 c_g; // center of gravity
    for ( poly = WTuniverse_getpolys() ;
            poly != NULL ;
            poly = WTpoly_next(poly)
         )
        {
        WTpoly_getcg(poly,c_g); // get the center of the polygon
        (*func)(c_g);                // call da function
        }
}
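The pointer-to-function parameter is the trickiest piece of syntax in the listing, so here is a minimal standalone sketch of the same callback pattern (plain C, hypothetical names, no WTK types — `p3` merely stands in for WTp3):

```c
#include <stddef.h>

typedef float p3[3];     /* hypothetical stand-in for WTK's WTp3 type */

static float total;      /* accumulator the callback writes into */

static void sum_x(p3 cg) { total += cg[0]; }

/* Visit each point and hand its coordinates to the callback, mirroring
   the pointer-to-function idiom used by LoopThroughPolys() above. */
static void for_each_point(p3 *pts, size_t n, void (*func)(p3))
{
    size_t i;
    for (i = 0; i < n; ++i)
        (*func)(pts[i]);
}
```

As in the listing, the caller passes a bare function name (which decays to a function pointer), so swapping in a different per-polygon operation requires no change to the loop itself.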
/* Function: DisplayHelpTask. Display how to use the various UIs */
void DisplayHelpTask( void )
{
    printf("\n-----------------------------------------------\n");
    printf("Right mouse button to toggle object selection\n");
    printf("Left mouse button to select polygon\n");
    printf("'b' Toggle user interface buttons\n");
    printf("'f' Flip between mouse move and mouse pick\n");
    printf("'i' status Information\n");
    printf("'q' Quit immediately\n");
    printf("'!' reset view back to initial position\n");
    printf("To change: 'w' Convergence 's' Sensor sensitivity\n");
#if DVI
    printf("Resolution: 2 - LOW  4 - Adaptive  5 - HIGH\n");
#elif SPEA
    printf("Resolution: 2 - LOW  3 - Medium  5 - HIGH\n");
#endif
    if ( use_spaceball )
        {
        printf("Spaceball commands...\n");
        printf("     Button 1 - toggle the 3D viewport buttons\n");
        printf("     Button 7 - reset view back to initial position\n");
        printf("     Button 8 - Quit immediately\n");
        }
    if ( use_geoball )
        {
        printf("Geoball commands...\n");
        printf("     Both Buttons - Quit immediately\n");
        }
    printf("\n-----------------------------------------------\n");
}
#define DISTANCE(x1,y1,x2,y2)  (sqrt( (x1-x2)*(x1-x2) + (y1-y2)*(y1-y2) ))
#define INSIDE_DISTANCE(x1,y1,x2,y2,d)  ( DISTANCE(x1,y1,x2,y2) < d )

/* Function: ComputeAsh1ConcentrationAtLocation. Pass in the center of gravity
of a polygon, then sum up the contributions from all the overlapping circles 
used to distribute the chemical over the area. Rather than using a graduated
scale of chemical data, this quick check yields a simple boolean result. */
void    ComputeAsh1ConcentrationAtLocation(WTp3 c_g)
{
    float x = c_g[X], z = c_g[Z];
    int count;
    // Ash 1 is at pos[X] = -53.0; pos[Z] = -39.0
    for  ( count= 0 ; count < sizeof( Ash1 )/sizeof( Ash1[0] ) ; ++count )
        {
        if ( INSIDE_DISTANCE(x,z,Ash1[count].x,Ash1[count].z,Ash1[count].d) )
            {
            WTpoly_setcolor(poly,0xF0F);
            return;
            }
        }
}        
/* Function: ImpartAsh1ConcentrationAtLocation. Pass in the center of gravity 
of a polygon, then determine whether any ash 1 is present at that location.
This is a simplified version that simply reports found/not-found. */
void    ImpartAsh1ConcentrationAtLocation(WTp3 c_g)
{
    float x = c_g[X], z = c_g[Z];
    int count;
    char text[50];
    sprintf(text,"Location %+4.f,%+4.f ",c_g[X],c_g[Z]);
    Impart(text);
    // Ash 1 is at pos[X] = -53.0; pos[Z] = -39.0
    for  ( count= 0 ; count < sizeof( Ash1 )/sizeof( Ash1[0] ) ; ++count )
        {
        if ( INSIDE_DISTANCE(x,z,Ash1[count].x,Ash1[count].z,Ash1[count].d) )
            {
            Impart("Ash 1 found");
            return;
            }
        }
    Impart("None found");
}        
/* Function: Ash1Task. Loops through polygons in the terrain model, computing 
ash 1 concentrations and setting polygon colors, or resetting polygon colors.*/
void    Ash1Task( void )
{
    static FLAG ash1_on = FALSE;
    ash1_on = !ash1_on;
    if ( ash1_on ) 
      {
      LoopThroughPolys(ComputeAsh1ConcentrationAtLocation);
      }
    else
      {
      RestorePolyColors();
      }
}        
void QuitTask( void )
{
    Impart("Quitting");
    WTuniverse_stop();
}
void ResetViewpointTask( void )
{
    Impart("Resetting Viewpoint");
    WTviewpoint_moveto(uview, &initial_pq);
}
void InfoTask( void )
{
    WTp3 p;
    WTq q;
    printf("Polygons: %6d, Frame rate: %8.2f fps\n",
        WTuniverse_npolygons(), WTuniverse_framerate());
    WTviewpoint_getposition(uview, p);
    printf("Viewpoint: x=%8.3f,  y=%8.3f,  z=%8.3f\n", p[X], p[Y], p[Z]);
    WTviewpoint_getorientation(uview, q);
    printf("Orientation: qx=%8.4f, qy=%8.4f, qz=%8.4f, qw=%8.4f\n",
        q[X], q[Y], q[Z], q[W]);
}
/* Function: VoiceAnnotateTask. The voice annotation feature is the only 
connection to Windows, and it's there just because we need to start a DDE
conversation with the external voice server, Monologue, by First Byte. Note
the phonetic spelling, which can dramatically clarify what you want it to say. */
void VoiceAnnotateTask( void )
{
    static FLAG control_on = FALSE;
    control_on = !control_on;
    if ( NULL == hConvTalk ) // Have we initiated the DDE conversation yet?
        {
        // No, so do it.
        InitiateDDEConversation( ); // This will set hConvTalk if successful
        }
    if ( NULL == hConvTalk )    // Did we fail?
        {
        control_on = FALSE;
        Talking = FALSE;
        MessageBox (NULL, "Cannot connect with Monologue!",
                "VR Demo", MB_ICONEXCLAMATION | MB_OK) ;
        return;
        }
    if ( TRUE == control_on )
        {
        // Tell user that Voice Annotation is activated
        Talking = TRUE;
        Say("<<~V4S7>>"); // reset volume & speed
        Say("<<~'AEkt-IXv-EY-IX-ted>>"); //Say "activated" phonetically
        }
    else // turn off control
        {
        // Tell user that Voice Annotation is deactivated
        Say("<<~d-IY'AEkt-IXv-EY-IX-ted>>"); //Say "deactivated" phonetically
        Talking = FALSE;
        }
}
/* Function: Impart. If we're talking, then say the text, else print it out. */
void Impart(char * text)
{
    if ( !Talking )
      {
      printf("%s\n", text); /* avoid treating user text as a format string */
      }
    else
      {
      Say( text );
      }
}
/* Function: Say. If voice annotation is running, then do the Windows DDE 
thing and pass the string off to the voice server. */
void Say(char * text)
{
    HSZ    hszItem;
    if ( !Talking )
      {
      return;
      }
    hszItem = DdeCreateStringHandle (idInst,text, 0) ;
    DdeClientTransaction (
                         text, strlen(text)+1, hConvTalk, hszItem, 
                         CF_TEXT, XTYP_POKE, 1500000L, NULL) ;
    DdeFreeStringHandle (idInst, hszItem) ;
}    
/* Function: InitiateDDEConversation. Just what it says. */
void InitiateDDEConversation( void )
{
    HSZ    hszService, hszTopic, hszItem ;
   // Initialize for using DDEML
    if (    DMLERR_NO_ERROR != 
            DdeInitialize(    &idInst,
                (PFNCALLBACK) MakeProcInstance ((FARPROC) DdeCallback, hInst),
               APPCLASS_STANDARD | APPCMD_CLIENTONLY, 0L))
      {
      MessageBox (NULL, "Could not initiate DDE conversation!",
                  "VR Demo", MB_ICONEXCLAMATION | MB_OK) ;
      }
    else
      {
      printf("DDE connect with voice annotation\n");
      }
   // Try connecting to MONOLOG.EXE
    hszService = DdeCreateStringHandle (idInst, "MONOLOG", CP_WINANSI) ;
    hszTopic   = DdeCreateStringHandle (idInst, "TALK",   CP_WINANSI) ;
    hConvTalk = DdeConnect (idInst, hszService, hszTopic, NULL) ;
    // Free the string handles
    DdeFreeStringHandle (idInst, hszService) ;
    DdeFreeStringHandle (idInst, hszTopic) ;
}
/* Function: DdeCallback. Just to make Windows happy... */
HDDEDATA FAR PASCAL DdeCallback (UINT iType, UINT iFmt, HCONV hConv,
                                 HSZ hsz1, HSZ hsz2, HDDEDATA hData,
                                 DWORD dwData1, DWORD dwData2)
{
     return NULL ; // we don't need to do anything...(yet)
}
/* Function: ModifyLightingTask. Add/remove the transient button objects. */
void ModifyLightingTask( void )
{
    static FLAG control_on = FALSE;
    control_on = !control_on; /* toggle the flag */
    /* control turned off; get rid of buttons */
    if ( !control_on )
      {
      WTobject_remove(UpButton);
      WTobject_remove(DownButton);
      }
    else
      {
      WTobject_add(UpButton);
      WTobject_add(DownButton);
      }
}
/* Function: DimLightTask. Dim the ambient lighting and darken the background. */
void DimLightTask( void )
{
    short bgcolor = WTuniverse_getbgcolor();
    if ( --bgcolor<0 )
      {
      bgcolor = 0;
      }
    WTlight_setambient(0.9*WTlight_getambient());
    WTuniverse_setbgcolor(bgcolor);
}
/* Function: BrightenLightTask. Brighten the ambient lighting and background. */
void BrightenLightTask( void )
{
    short bgcolor = WTuniverse_getbgcolor();
    if ( bgcolor<15 )
      {
      bgcolor++;
      }
    if ( bgcolor>15 )
      {
      bgcolor = 15;
      }
    WTlight_setambient(1.11*WTlight_getambient());
    WTuniverse_setbgcolor(bgcolor);
}


Copyright © 1995, Dr. Dobb's Journal