Event Handling

class libavg.avg.Event

Bases: Boost.Python.instance

Base class for user input events.

type

One of KEYUP, KEYDOWN, CURSORMOTION, CURSORUP, CURSORDOWN, CURSOROVER, CURSOROUT or QUIT. Read-only.

when

The time when the event occurred, in milliseconds since program start. Read-only.

class libavg.avg.KeyEvent

Bases: libavg.avg.Event

Generated when a key is pressed or released.

keycode

The keycode of the key according to US keyboard layout. Read-only.

keystring

A character or word describing the key pressed. Read-only.

modifiers

Any modifier keys (shift, ctrl, etc.) pressed, expressed as a bitwise OR of KeyModifier values. Read-only.

scancode

The untranslated (hardware-dependent) scancode of the key pressed. Read-only.

unicode

Unicode index of the character. Takes into account the current keyboard layout and any modifiers pressed. This attribute is only set for KEYDOWN events. Read-only.
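Because modifiers is a bitmask, individual modifier keys are tested with bitwise AND. A minimal sketch of such a test (the constant values below are illustrative placeholders, not libavg's real KeyModifier constants):

```python
# Illustrative modifier bits; real code would use libavg's KeyModifier values.
KEYMOD_SHIFT = 0x0001
KEYMOD_CTRL = 0x0040

def describe_modifiers(modifiers):
    """Return the names of the modifier bits set in `modifiers`."""
    names = []
    if modifiers & KEYMOD_SHIFT:
        names.append("shift")
    if modifiers & KEYMOD_CTRL:
        names.append("ctrl")
    return names

# A KEYDOWN event with shift and ctrl held reports both bits OR'ed together:
print(describe_modifiers(KEYMOD_SHIFT | KEYMOD_CTRL))  # ['shift', 'ctrl']
```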

class libavg.avg.MouseEvent

Bases: libavg.avg.Event

Generated when a mouse-related event occurs.

button

The button that caused the event. Read-only.

cursorid

Always -1 for mouse events, but can be used to handle mouse and tracking events in one handler. Read-only.

lastdownpos

The position of the last mouse down event with the same button. Useful for implementing dragging. Read-only.

leftbuttonstate

True if the left mouse button is currently pressed. Read-only.

middlebuttonstate

True if the middle mouse button is currently pressed. Read-only.

node

The node that the event occurred in. Read-only.

pos

Position in the global coordinate system. Read-only.

rightbuttonstate

True if the right mouse button is currently pressed. Read-only.

source

Always MOUSE.

speed

Current speed of the mouse in pixels per millisecond as a Point2D. Read-only.

x

x position in the global coordinate system. Read-only.

y

y position in the global coordinate system. Read-only.
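The lastdownpos attribute makes dragging straightforward: the drag offset is the current position minus the position where the button went down. A minimal sketch of that calculation; FakeMouseEvent is a hypothetical stand-in for a real MouseEvent, and plain tuples replace Point2D:

```python
class FakeMouseEvent:
    """Stand-in for libavg.avg.MouseEvent with just the fields used here."""
    def __init__(self, pos, lastdownpos, leftbuttonstate):
        self.pos = pos
        self.lastdownpos = lastdownpos
        self.leftbuttonstate = leftbuttonstate

def drag_offset(event):
    """Offset of the cursor from where the left button went down."""
    if not event.leftbuttonstate:
        return (0, 0)  # not dragging
    return (event.pos[0] - event.lastdownpos[0],
            event.pos[1] - event.lastdownpos[1])

event = FakeMouseEvent(pos=(120, 90), lastdownpos=(100, 80), leftbuttonstate=True)
print(drag_offset(event))  # (20, 10)
```

In a real handler the offset would be added to the dragged node's position on every CURSORMOTION event.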

class libavg.avg.TouchEvent

Bases: libavg.avg.Event

Generated when a touch or other tracking event occurs. Touch events happen only when a multi-touch sensitive surface or other camera tracker is active.

area

Size of the blob found in pixels. Read-only.

center

Position as Point2D, with sub-pixel accuracy. Used for calibration. Read-only.

cursorid

An identifier for the current touch. A single touch will generate a down, zero or more motion and a single up event in its lifetime, all with the same cursorid.

eccentricity

lastdownpos

The initial position of the cursor. Useful for implementing dragging.

majoraxis

Major axis of an ellipse that is similar to the blob. Read-only.

minoraxis

Minor axis of an ellipse that is similar to the blob. Read-only.

node

The node that the event occurred in. Read-only.

orientation

Angle of the blob in radians. For hovering hands, this is roughly the direction of the hand, modulo 180 degrees. Read-only.

pos

Position in the global coordinate system. Read-only.

source

Either TRACK or TOUCH. In most cases, actual touches generate TOUCH events. With a DI (diffused illumination) device, the internal tracker also generates TRACK events for hands above the surface. With an FTIR (frustrated total internal reflection) device, the internal tracker generates TRACK events for the actual touches.

speed

Current speed of the touch in pixels per millisecond as a Point2D. Read-only.

x

x position in the global coordinate system. Read-only.

y

y position in the global coordinate system. Read-only.

getContour() → list

Returns the contour of the blob as a list of points if supported by the tracker being used.

getRelatedEvents() → events

Only for DI devices and the internal tracker: returns a Python tuple containing the events related to this one. For TOUCH events (fingers), the tuple contains one element: the corresponding TRACK event (hand). For TRACK events, the tuple contains all TOUCH events that belong to the same hand.
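The cursorid attribute is what lets a handler keep simultaneous touches apart: every event in one touch's lifetime carries the same id. A minimal sketch of that grouping logic, using plain (cursorid, type) tuples instead of real TouchEvent objects:

```python
def group_by_cursor(events):
    """events: list of (cursorid, type) tuples; returns {cursorid: [types]}."""
    touches = {}
    for cursorid, evtype in events:
        touches.setdefault(cursorid, []).append(evtype)
    return touches

# Two overlapping touches: each gets one down, optional motions, one up.
stream = [(7, "down"), (8, "down"), (7, "motion"), (7, "up"), (8, "up")]
print(group_by_cursor(stream))
# {7: ['down', 'motion', 'up'], 8: ['down', 'up']}
```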

class libavg.avg.Tracker

Bases: Boost.Python.instance

A class that uses a camera to track moving objects and delivers the movements as avg events. Create a tracker by using Player.addTracker(). The properties of this class are explained under https://www.libavg.de/wiki/index.php/Tracker_Setup.

This is the internal libavg tracker. For trackers created using Player.enableMultitouch(), no Tracker object exists.

abortCalibration()

Aborts the coordinate calibration session and restores the previous coordinate transformer.

endCalibration()

Ends the coordinate calibration session and activates the generated coordinate transformer.

getDisplayROIPos()
getDisplayROISize()
getImage(imageid) → Bitmap

Returns one of the intermediate images necessary for tracking. These images are only available if setDebugImages was called before with appropriate parameters. Possible imageid values are IMG_CAMERA, IMG_DISTORTED, IMG_NOHISTORY, IMG_HISTOGRAM, IMG_FINGERS or IMG_HIGHPASS.

getParam(element) → value

Returns a tracker configuration parameter.

resetHistory()

Throws away the current history image and generates a new one from the next second of images.

saveConfig()

Saves the current tracker configuration to the default config file.

setDebugImages(img, finger)

Controls whether debug images of intermediate tracking results and detected finger positions are generated and exported to Python. Generating the debug images takes a moderate amount of time, so this is turned off by default.

Parameters:
  • img – Whether to generate intermediate result images.
  • finger – Whether to generate the IMG_FINGERS result image.
setParam(element, value)

Sets one of the tracker configuration parameters.

startCalibration(displayextents) → TrackerCalibrator

Starts a coordinate calibration session. The returned TrackerCalibrator exists until endCalibration() or abortCalibration() is called.

Parameters:
  • displayextents – The width and height of the display area.
class libavg.avg.TrackerCalibrator

Bases: Boost.Python.instance

Generates a mapping of display points to camera points using a set of reference points. Python code should display reference points that the user must touch to establish a mapping. Created by Tracker.startCalibration().

getDisplayPoint() → Point2D
nextPoint() → bool

Advances to the next point. Returns False and ends calibration if all points have been set.

setCamPoint(pos)
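A calibration session is driven by a simple loop: fetch the display point, show it to the user, record the touched camera position with setCamPoint(), and advance with nextPoint() until it returns False. The sketch below exercises that protocol with FakeCalibrator, a hypothetical stand-in for the object returned by Tracker.startCalibration(); a real session would display each point and wait for the user's touch before calling setCamPoint():

```python
class FakeCalibrator:
    """Serves a fixed list of display points and records camera points."""
    def __init__(self, display_points):
        self._points = display_points
        self._index = 0
        self.cam_points = []

    def getDisplayPoint(self):
        return self._points[self._index]

    def setCamPoint(self, pos):
        self.cam_points.append(pos)

    def nextPoint(self):
        # Returns False once all reference points have been set.
        self._index += 1
        return self._index < len(self._points)

def run_calibration(calibrator, measure_cam_point):
    """Drive the calibrator: for each display point, record a camera point."""
    while True:
        display_pos = calibrator.getDisplayPoint()
        calibrator.setCamPoint(measure_cam_point(display_pos))
        if not calibrator.nextPoint():
            break

cal = FakeCalibrator([(100, 100), (640, 400), (1180, 700)])
run_calibration(cal, lambda p: (p[0] // 2, p[1] // 2))  # fake camera mapping
print(cal.cam_points)  # [(50, 50), (320, 200), (590, 350)]
```

With the real classes, measure_cam_point would come from the tracker hardware, and the session would finish with Tracker.endCalibration() to activate the resulting transformer.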