FreeVR: Virtual Reality Integration Library
FreeVR: User's Guide

February 29, 2024, for FreeVR Version 0.7e
Written by Bill Sherman

Introduction

This guide is for users of applications compiled with the FreeVR library. Therefore, topics such as how to configure a FreeVR application to run in a CAVE™ environment are not included here. It is assumed that a VR system administrator has already configured your system for the available hardware using the FreeVR Administrator's Guide. What this guide does cover is how to use FreeVR in simulator mode, how to record and play back tracking data, and (soon) how to make some basic adjustments to the configuration that a user might be interested in.
The other FreeVR guides are:

Running a FreeVR Application

Once a VR system is configured, running a FreeVR application is simply a matter of running the executable via the command line, or via a GUI scripting interface (such as with Python-TK). For example, to run the CAVE Quake III Arena application on a terminal command line:

    % cq3a [cq3a-options]

Of course, different applications may have different options that can be provided. In the case of cq3a, if there is a world database available, then no options are required.

When a FreeVR application is run at the shell, the user will typically see a simple report of the running system; a sample report appears below. When the "debug level" is set greater than 2, there may be more informational output, and as the debug level is raised, an increasing amount of debugging output will be produced. Thus, for a standard user on a pre-configured system, a debug level of 2 (aka "almost-always") is sufficient. (For the very confident, setting the debug level to 1 (aka "always") will produce a little less output.) The debug level can be set via environment variable (see below), or in the FreeVR configuration file (aka ".freevrrc").

    *************************************************
    ** FreeVR library version: FreeVR Version 0.7e **
    *************************************************
    FreeVR: Allocating 268435456 bytes of shared memory.
    FreeVR (0.000): **** Configuring the System! ****
    FreeVR config: Reading string "Default Config String".
    FreeVR config: Using system 'simulator'
    FreeVR config: Reading file "/home/user/.freevrrc".
    FreeVR Config>>
    ==================================================
      C A V E   Q U A K E   I I I   A R E N A
      ---------------------------------------
                by Paul Rajlich
    [followed by more CQ3A-specific output]
    FreeVR (2.3156): **** Initializing the System! ****
    FreeVR (2.3161): Process "default-visren"(visren) spawned!
            Pid = 2953, spawn time = 2.316142
    FreeVR (2.3165): Process "simulator-input"(input) spawned!
            Pid = 2954, spawn time = 2.316540
    FreeVR (2.3168): Inputs for device 'simulator-indev' are NOT YET created.
    FreeVR (2.3168): Still waiting for 'simulator-indev' inputs to be created -- skipping in 5.
    FreeVR (2.3169): Process "default-telnet"(telnet) spawned!
            Pid = 2955, spawn time = 2.316894
    FreeVR (2.3170): Telnet: Connect to 'indy:3000' to send commands to FreeVR
    FreeVR (2.3175): X11 input device simulator-indev waiting for window 'default' to be opened.
    FreeVR (2.4246): Window 'default' now operating (1)
    FreeVR (2.5448): Window 'PV-left' now operating (1)
    FreeVR: >>>>>> Inputs for all 2 devices created. <<<<<<
    ==================================================
    FreeVR Version = "FreeVR Version 0.7e"
      Compiled: Mar 1 2024 at 00:49:41
      on indy (Linux 4.18.0-513.11.1.el8_9.x86_64) [linux2.6-glx-64] with GLX
    Using System = "simulator" (initialized = 0)
      debug level = debug:aalways (2)
      debug exact = debug:none (0)
      visrenmode default = "mono" (0)
      processor lock default = "default"
      InputMap = "default"
    1 input devs = [ "simulator-indev" ] (initialized = 1)
    1 users = [ "default-user" ] (initialized = 1)
    1 eyes = [ "default-user:cyclops" ]
    1 eyelists = [ "default" ]
    2 windows = [ "default" "PV-left" ]
    4 processes = [ "built-in main"(2952) "default-visren"(2953) "simulator-input"(2954) "default-telnet"(2955) ]
    ==================================================

For a complete list of debug levels, see the debug.1fv manpage.

The output shown above indicates which version of FreeVR the application was compiled with, along with some of the build settings, such as the GLX graphical interface and whether the pthread or single-thread variant of the library was used. Details about the operating system the library was compiled on are also listed, along with the date and time the library was built.
These details are often helpful when debugging issues that might arise from using a library that is out-of-sync with the application code. The above output also indicates the type of system being run. In this case a simulated CAVE view is provided, with a "simulator-input" process to control virtual position trackers, valuators and button presses. This system also has a "default-telnet" process running, which gives the user extra access to the internals of the running VR application. (See the accompanying Telnet/Socket documentation for more details.)
If there are unrecoverable errors, then it may simply be a fault in the
configuration file, in which case the
FreeVR Administrator's Guide should be consulted.
Deeper issues may require information from the
FreeVR Application Development Guide.
When a FreeVR application is executed, there are a handful of environment variables whose values affect how the initial configuration process takes place:
There are also many environment variables that pertain to debugging
portions of the library such as inputs or serial communication.
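The specific variable names are installation-dependent (consult the FreeVR manpages), but a generic shell idiom can show which FreeVR-related variables, if any, are currently set in your environment. Note this is ordinary shell usage, not a FreeVR feature:

```shell
# List environment variables whose names start with "freevr"
# (case-insensitive); prints nothing if none are set.
env | grep -i '^freevr' || true
```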
The FreeVR library provides the mechanism for FreeVR applications
to supply information about their basic operation directly on-screen.
To summon this basic application quick-reference documentation, press the period/'Del' key
on the numeric keypad in any of the rendering windows.
This will bring up a semi-transparent box with a white border.
If all you see is a small square in the upper left corner of the
screen, then the application developer(s) did not opt to make
use of this FreeVR feature.
When a FreeVR application is executed on a machine with no specific configuration information, it will default to a configuration with inputs taken from the keyboard, simulating the typical inputs found in most CAVE™ VR displays. Keystrokes allow the user to virtually press the wand buttons, move the joystick, and move the head and wand around.
The user can reconfigure the default keymappings using the
configuration instructions (that will be) described below.
However, a default mapping has been assigned that will be
familiar to users of the EVL CAVE library.
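As a taste of what such a reconfiguration might look like, here is a sketch in the same configuration syntax used elsewhere in this guide. The device name "simulator-indev" is the simulator default reported in the startup output, but the key and input names in this fragment are purely illustrative assumptions, so check the FreeVR configuration manpages before relying on them:

```
inputdevice "simulator-indev" += {
	# Hypothetical remapping: bind the 'q' key as an extra button input
	input "2switch[extra]" = "2switch(keyboard:key[q])";
}
```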
The FreeVR library avoids mapping specific keys to particular application functions. Instead, keys are mapped to button inputs, which can then be used by the application. So, by default, the 'Escape' key is mapped to button(0), which is the de facto method of terminating CAVE-based VR applications.

Button inputs

The default inputs for the button inputs (normally found on a CAVE wand) are the three mouse buttons.

Joystick inputs

The valuator inputs from the standard wand joystick are based on the location of the mouse pointer inside the simulator view window. The joystick inputs are only active while the spacebar is depressed.

6-DOF Tracker-sensor inputs

Translating a 6-DOF sensor

The basic method of moving a 6-DOF sensor is to translate it using the four arrow keys. By default the arrow keys move the sensor in the X-Z plane (ie. parallel to the floor). The 'f' and 'v' keys move the sensor up and down (in the Y direction) respectively.

There are several ways to modify this behavior. While holding down the 'left-shift' key, the Y and Z movements are reversed; thus the arrow keys then move the sensor in the X-Y plane. By pressing the '1' key, translation becomes relative to the orientation of the sensor, so the up-arrow key will move the sensor in the direction it is facing. Pressing the '1' key again returns to absolute translation mode. The mouse can also be used to translate the sensor.
Normally, the 6-DOF sensors are restricted to move only within the "CAVE volume" depicted in the simulator display. This restriction can be disabled and reenabled with the '2' key. The sensor can be reset to the origin with the 'r' key; this also resets the orientation.

Rotating a 6-DOF sensor

The basic method for rotating a sensor is with the following keys:
The four arrow keys can also be switched from controlling translation to rotation by holding down the 'left-alt' key.

Selecting a 6-DOF sensor

The default configuration creates two sensors that can be manipulated: one for the user's head, the other for a wand controller. Upon startup, the head sensor is under the control of the user. This can be changed with the 'n' key, which selects the "next" sensor. The user can also select a specific sensor with the 's' (skull) and 'w' (wand) keys.

Overall simulator input chart

Here is a table that summarizes the default input controls:
Manipulating the view
To change the view in the FreeVR simulator display, the following keymappings on the numeric keypad are the default:
Note, however, that there are three caveats in the current implementation
of the FreeVR library.
First, key repeating is disabled in FreeVR input windows, so holding
down a key to continue a view change won't work — you need to repeatedly
press the key (hopefully we'll fix that in a near-future release).
Second, if the "xmodmap" is strange (as is the case for at least
Solaris), many of the keys won't work.
In this case, you will need to change your X keyboard mappings
to have the keypad operate in a more expected manner.
Recording inputs

For the most part, FreeVR does not include explicit code for
recording inputs, but rather relies on the ncat (aka "netcat",
aka "nc") networking utility to capture the VRPN output from
FreeVR as it operates as a VRPN server.
	inputdevice "vrpn-server" = {
		type = "vrpnout";
	}

	process "io_process" += {
		objects += "vrpn-server";
	}

NOTE: the above code assumes the process that handles inputs (and outputs) is called "io_process" — there's a good chance this is not the actual name, so replace it with the actual name. (The name "simulator-input" is the default name when running in simulator mode.)

By default, VRPN outputs to port 3883, which is where a VR application expecting a VRPN input stream would connect.

For recording, simply run a VR application. (Any application will work, but usually one would run the particular application for which the user's movements are to be analyzed, or perhaps the basic "inputs" program, which just echoes any user inputs to the view.) Once the application is running, then in a separate shell, the ncat program can be used to read and capture the VRPN stream via a socket.

One quirk of the VRPN protocol is that after connecting to a server, the client must respond to the server's acknowledgement-request message (what the VRPN nomenclature refers to as a "cookie"). This exchange also informs the server and client what version of VRPN the other is communicating in, to ensure that they are compatible. There are two basic ways to provide the "cookie" response: 1) type (or copy/paste) the response into the shell and hit Enter; or 2) store the "cookie" response in an ASCII text file and pipe it to ncat with the tail utility.
For both methods, the FreeVR application should already be
running.
Method 1: Run ncat, then type the cookie response followed by «Enter»:

	% nc localhost 3883 > «file-name»
	ver. 07.34 0«Enter»

Method 2: Create a new file (any name is okay; for this example we will use "vrpn_cookie.txt") with the contents:

	% cat vrpn_cookie.txt
	ver. 07.34 0

Then run:

	% tail -f vrpn_cookie.txt | nc localhost 3883 > «file-name»
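For Method 2, the cookie file can be created directly from the shell. The cookie text here is copied verbatim from the example above; the VRPN version number ("07.34") may differ on your system:

```shell
# Write the VRPN cookie response into the file used by Method 2,
# then display it to confirm the contents.
printf 'ver. 07.34 0\n' > vrpn_cookie.txt
cat vrpn_cookie.txt
```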
In both cases, type a shell interrupt (ie. Ctrl-C) or quit the FreeVR
application to terminate the recording.
Playing back inputs

Regardless of the playback viewing mechanism, the way to set up the simple VRPN playback server is to pipe (or redirect) the data through the ncat utility listening for clients on port 3883:

	% nc -l localhost 3883 < «file-name»

Playback using "vrpntest"

The easiest playback mechanism (perhaps to test the validity of the recording) is to use the FreeVR "vrpntest" utility. The vrpntest utility can translate the VRPN input stream into a basic flow of inputs, a CSV representation, or a pseudo-graphical screen representation.

For VRPN playback, the vrpntest utility has a command-line option that can be used to regulate the playback flow rate to be "real-time" rather than "as-fast-as-possible". This option is either "-govern_time" or just "-gt".

Method 1: Basic stream output, governed to real time:

	% vrpntest -gt

Method 2: CSV output (perhaps saving to a CSV file):

	% vrpntest -csv
or
	% vrpntest -csv > «file-name.csv»

Method 3: Screen-rendered output, governed to real time:

	% vrpntest -screen -govern_time

Playback using a FreeVR application

For playing back through a FreeVR application, it makes the most sense either to use the "inputs" program, which can translate the VRPN stream into visual representations of 6-sensor movement, button presses and valuator (joystick) action, or to select the original application that was running when the stream was captured, so the interactions can be re-lived.

In either case, a .freevrrc file will need to be configured to have a VRPN input, and probably one with very few other (non-VRPN) inputs configured. Usually there should at least be an X Windows input configured for one button (2-switch) input that watches for when the Escape ('Esc') key is pressed:

	inputdevice "xwin-vr" = {
		type = "xwindows";
		args = "window = default;";
		input "2switch[ESC]" = "2switch(keyboard:key[Escape])";
	}

A "vrpn" input device is also required, and this can be similar to what might be used to read VRPN data from a normal (live) VRPN server.
The trick is that all the inputs should match what is in the stream, and note that the VRPN device name used by the FreeVR VRPN output is "FreeVR". As with the vrpntest utility, there is a special option for the playback of VRPN streams through FreeVR applications: govern_time. So here is a typical VRPN input device configuration for a playback session:

	inputdevice "vrpn-playback" = {
		type = "vrpn";
		args = "hostname = localhost;";
		args += "govern_time = on;";

		input "2switch[A]" = "2switch(FreeVR:button[1])";
		input "2switch[B]" = "2switch(FreeVR:button[2])";
		input "2switch[3]" = "2switch(FreeVR:button[3])";
		input "val[x]" = "Valuator(FreeVR:analog[-0])";
		input "val[y]" = "Valuator(FreeVR:analog[-1])";
		control "print_devinfo" = "2switch(FreeVR:button[0])";

		################################################
		# Add two position trackers from VRPN
		## Using "ipos" to lay the tracked device across the "floor"
		ipos.translate = -4.0, 0.0, -1.0;
		r2e.rotate *= 1.0, 0.0, 0.0, -30.0;
		input "playback-head" = "6sensor(FreeVR:tracker[0,r2e, ipos])";
		ipos.translate = -1.5, 0.0, -1.0;
		input "playback-0" = "6sensor(FreeVR:tracker[1,, ipos])";
		ipos.translate = 1.5, 0.0, -1.0;
		input "playback-2" = "6sensor(FreeVR:tracker[2,, ipos])";
		ipos.translate = 4.0, 0.0, -1.0;
		input "playback-3" = "6sensor(FreeVR:tracker[3,, ipos])";
	}

These are then combined with the X Windows input device in a single input process, such as:

	process "vrpn+xwinsim-input" = {
		type = input;
		objects = "xwin-vr", "vrpn-playback";
	}
Then run the FreeVR application. (And repeating from above: the
ncat utility should already be set up in listen mode with the
recorded data piped into it.)
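As a quick sanity check of the CSV-filtering step used in the visualization recipe that follows: the AWK pattern `$4==12` keeps only rows whose fourth comma-separated field equals 12 (assumed here, as below, to mark 6-DOF tracker reports). The two sample CSV rows are fabricated for illustration:

```shell
# Two fabricated CSV rows: only the second has 12 in column 4,
# so only it survives the filter.
# Prints: 0.20,FreeVR,tracker,12,0,1.5,0.2,-0.8
printf '0.10,FreeVR,button,7,1,1,0,0\n0.20,FreeVR,tracker,12,0,1.5,0.2,-0.8\n' \
    | awk -F, '$4==12 {print $0;}'
```

In the workflow below, the surviving rows are what end up in the «file-name-6dof.csv» file handed to gnuplot.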
A quick way to visualize recorded 6-sensor (6-DOF) position tracking data is with the gnuplot tool, in conjunction with a CSV copy of the recorded data (and a simple AWK one-liner):

	% nc -l localhost 3883 < «file-name.raw»
(new shell)
	% vrpntest -csv > «file-name.csv»
	% awk -F, '$4==12 {print $0;}' < «file-name.csv» > «file-name-6dof.csv»

And then plot it with gnuplot:

	% gnuplot
	> set datafile separator ','
	> splot '«file-name-6dof.csv»' using 6:7:8:5 with points palette

What the above is doing is converting the raw VRPN stream into a CSV-format file. That file is then filtered (using AWK) to contain only the 6-DOF position tracker data, placed in the file called «file-name-6dof.csv». The gnuplot utility is then run. It needs to know that the file uses commas as separators (ie. is a CSV file), and then the data is 3D-plotted using columns 6, 7 & 8, which are the X, Y, Z coordinates, with column 5 (the sensor number) used to color the points.

© Copyright William R. Sherman, 2024.