The Vision Egg is a powerful, flexible, and free way to produce
stimuli for vision research experiments.
The Vision Egg is a high-level interface between Python and OpenGL. In
addition to methods for automatically generating traditional visual
stimuli such as sinusoidal gratings and random dot patterns, it
provides functions for moving numeric data, images, movies, text, and
3D objects to and from your video card, and for using features of the
card such as perspective distortion. It is therefore also useful to
anyone who wants to exploit the capabilities of today's graphics cards.
- Perform experiments using an inexpensive PC and a standard consumer graphics card
- Perform experiments using a graphics workstation if special features are needed
- Data acquisition and other realtime hardware control capabilities useful in electrophysiology and fMRI experiments, including gaze-contingent stimuli
- Dynamically generated stimuli can be changed in realtime via software or external hardware
- Produce traditional stimuli to replace legacy systems
- Produce stimuli not possible with other hardware
- Demo programs to get you started right away
- Run stimuli on your laptop - great for talks
- Free, open-source software
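Because stimuli are generated dynamically, each frame's parameters can be computed as a function of time or of external input. As a library-agnostic sketch (the function name is hypothetical, not part of the Vision Egg API), the per-frame update of a drifting grating reduces to evaluating its phase at the current time:

```python
def grating_phase_at(t, temporal_freq_hz):
    """Phase (in degrees) of a drifting grating at time t seconds for a
    given temporal frequency -- the kind of per-frame realtime update a
    dynamically generated stimulus performs each time it is drawn."""
    return (360.0 * temporal_freq_hz * t) % 360.0

# With a 1 Hz drift, after 0.25 s the grating has advanced a quarter cycle.
quarter_cycle = grating_phase_at(0.25, 1.0)  # 90.0
```

An external device (or software controller) changing `temporal_freq_hz` between frames is all it takes to alter the stimulus in realtime.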
By harnessing the power of today's consumer graphics cards, the Vision
Egg makes it possible to produce research-quality visual stimuli with
no specialized hardware beyond a relatively recent computer and
graphics card.
Based on open standards, it runs on anything from inexpensive PCs to
high-end specialized hardware. For example, on some platforms, such as
SGI workstations, the Vision Egg provides a 10-bit luminance dynamic
range (both pixel depth and DAC) and precise frame-by-frame control.
The Vision Egg is open source software (GNU LGPL). Therefore, you can
be assured of a product that meets your needs but does not lock you
in. Download it today and give it a try!
SR Research, the makers of eye tracking hardware and software, have
released Pylink.
Pylink can be used with the Vision Egg!
According to SR Research:
Pylink allows for tracker control, real-time data access, and
external synchronization with eye data via custom messaging.
Many people find Python to be a simpler, yet still powerful,
alternative to C. Pylink can also be used in combination with the
excellent third party open source Vision Egg software; providing a
combined visual presentation and eye tracking scripting package.
Distributed with Pylink is a modified Vision Egg demo using realtime
tracker data to move a Gaussian-windowed grating in a gaze-contingent
fashion. Following this example, it should be easy to create other
VisionEgg/Pylink scripts for a variety of vision experiments involving
eye tracking.
There are several major improvements. (The few changes that may break
old code are detailed in the release notes included in this email).
There is nothing more I intend to add before I release Version 1.0 --
this is a release candidate subject to final testing and bug fixing,
so I would appreciate all the abuse you can put it through. In
particular, test/conform.py runs many tests on your system and reports
the output.
Changes for 0.9.9:
- Screen.put_pixels() method for blitting of raw pixel data
- Support for QuickTime movies (currently Mac OS X only)
- Redesign of type check system for accuracy and clarity
- TrueType font rendering with SDL_ttf2
- Textures with alpha -- bugfixes and examples
- Positioning of viewports and 2D stimuli can use relative positioning anchors
- Now requires Python 2.2 -- new style classes used to restrict attribute access
- Now requires pygame 1.5
- Renamed timing_func() to time_func()
- EPhysGUI saves absolute time a trial was started (to reconstruct all stimuli)
- Allow use of pygame Surfaces as source of texture data
- Mipmaps of sphere-mapped sinusoidal grating to prevent spatial aliasing
- De-emphasis on Presentation and Controller classes (moved to FlowControl module)
- Changed orientations such that 0 degrees is right and 90 degrees is up
- Bugfix in SphereMap module -- Gaussian formula produced windows too wide by 2/sqrt(2)
- Allow conversion of 3D vertices into 2D screen coordinates
- Added wireframe azimuth/elevation grid with text labels
- Allow arbitrary orientation of textures and text with angle parameter
- FrameTimer class now available for use in your own main loops
- Use Python 2.3 logging module (copy included for use with Python 2.2)
- No installation of demos or documentation (get source or demo package)
- Many small enhancements and bugfixes
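The new orientation convention follows the standard mathematical one: 0 degrees points right, 90 degrees points up, and angles increase counterclockwise. A small sketch (the helper name is hypothetical, not a Vision Egg function) of how such an orientation maps to a unit direction vector:

```python
import math

def orientation_to_vector(orientation_deg):
    """Convert an orientation in degrees (0 = right, 90 = up, angles
    increasing counterclockwise) to a 2D unit vector (x, y)."""
    theta = math.radians(orientation_deg)
    return (math.cos(theta), math.sin(theta))

# 0 degrees -> (1.0, 0.0); 90 degrees -> approximately (0.0, 1.0)
right = orientation_to_vector(0.0)
up = orientation_to_vector(90.0)
```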
New tests:
- high-voltage regulation test for displays (Brainard et al., 2002)
- incomplete DC restoration test for displays (Brainard et al., 2002)
- unit-test suite: among many other things, pixel accuracy of textures
New demos:
- mpeg.py plays MPEG movies (currently seeking a free movie to include)
- quicktime.py plays QuickTime movies (currently Mac OS X only)
- convert3d_to_2d.py converts 3D positions to 2D positions
- dots_simple_loop.py uses own loop rather than Presentation class
- makeMovie2.py makes a movie with get_framebuffer_as_image() function
- mouse_gabor_2d.py shows a gabor wavelet under mouse and keyboard control
- mouse_gabor_perspective.py is sphereGratingWindowed.py improved and renamed
- mouseTarget_user_loop.py uses own loop rather than Presentation class
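The convert3d_to_2d.py demo relies on the OpenGL projection pipeline; conceptually, mapping a 3D vertex to a 2D screen position is a perspective divide. A simplified pure-Python sketch under stated assumptions (camera at the origin looking down the negative z axis, focal length in pixels; the function name is hypothetical, not the Vision Egg API):

```python
def project_vertex(x, y, z, focal_length, screen_width, screen_height):
    """Project an eye-space vertex onto screen coordinates using a
    simple pinhole-camera perspective divide.  The camera sits at the
    origin looking down -z, so visible vertices have z < 0."""
    if z >= 0.0:
        raise ValueError("vertex must lie in front of the camera (z < 0)")
    sx = screen_width / 2.0 + focal_length * x / -z
    sy = screen_height / 2.0 + focal_length * y / -z
    return (sx, sy)

# A vertex straight ahead projects to the screen center; doubling its
# distance halves its offset from the center.
center = project_vertex(0.0, 0.0, -10.0, 500.0, 640, 480)  # (320.0, 240.0)
```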
I am proud to announce release 0.9.4 of the Vision Egg. A large number of features have been added. This is a major improvement on the last release, and I recommend all users upgrade.
This release features:
- Complete electrophysiology application (see below)
- Constant visual angle (perspective-distorted) Gaussian and circular windows added
- 3D texture-mapped sphere stimulus added
- Random dot stimulus added
- Texture module re-written for ease-of-use, clarity, and support of dynamic texture updating
- Color grating stimulus added
- Support for plaids added
- Masks for gratings and textures implemented using multitexturing - uses high bit depth available on some hardware
- Vision Egg Programmer's Manual created
- GLTrace module for tracking all calls to OpenGL
- Many more minor features and bug fixes
New electrophysiology application features:
- Complete application with ready-to-run experiments
- Application extensible with new experiments by modifying example experiments
- Full-screen graphics possible with server application; GUI client controls experiments over TCP
- Automated sequence control, including sequences-of-sequences
- Stimulus onset timing calibration / verification support
- 3D position / perspective-distortion calibration support
- Parameters saved during experiments as Python or Matlab scripts
- Complete configurations can be saved and restored from file
All code written for the 0.9.3 release should continue to run on the
0.9.4 release. There may be some slight changes in functionality due
to bug fixes.
Please direct enquiries to the Vision Egg mailing list.
The primary author of the Vision Egg is Andrew Straw.
This page last modified 27 Jun 2004.