Price: $1.99
⭐️ Rating: 4.4 / 5 (18 ratings)
📼 Content: 5 scs
📦 Update frequency: no updates
🗣 Your locale: ❌ unsupported
- Versions
- Price History
- Description
- FAQ
All Versions of aVOR
2.3
May 31, 2016
This app has been updated by Apple to display the Apple Watch app icon.
2.0: Additional tools for teaching and presentation on external monitors and video projectors can now be activated in the app settings:
Start Tutorial: Forces the app to start with the tutorial that normally only runs the first time it is used. This allows users to demonstrate the app the way new users see it.
Counter-Rotating Head: With this selection, the head on the screen rotates opposite to the iOS device so that people watching an external monitor or projector can see the movements (a new icon on the left of the screen toggles this mode).
Fix Head Stationary: Defeats head rotation by touches during counter-rotation mode.
Highlighted Touches: Displays white, yellow, or green dots where the operator's fingers are touching the screen. This is useful when someone is demonstrating the app with an external monitor.
Animation File Import: Data files can be imported from a PC so that the app will animate a user's head and/or eye movement data (select data files to replay with a long press on the animation button in the settings).
Display Goggles: Renders a pair of video goggles on the head to indicate that the movements have been measured.
Network Port: A remote connection (UDP, TCP) can control the eye and/or head animation in real time from simulations, live sensor data, etc.
Display World Axes: Shows the world coordinate axes and gravity vector.
Particles: Adjusts the 'stickiness' and start position of particles within the canals.
Additional Languages: French and Korean.
2.1: Fixes some bugs introduced in 2.0:
Correct touch rotations after device rotation
Model lighting fixed relative to world
Device motion activates vertical canals correctly
Posterior particle can be rolled into cupula
"Display gravity axis" option in Settings
2.3: Bug fix for the gravity vector during motion profile playback.
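The Network Port feature above lets a remote connection drive the eye and head animation in real time over UDP or TCP. As a minimal sketch of a remote controller, assuming a hypothetical plain-text datagram format and an example address and port (aVOR's actual wire protocol and field layout are not documented in this listing):

```python
import socket

def make_pose_datagram(head_yaw, head_pitch, head_roll, eye_h, eye_v):
    """Format one pose update as a plain-text datagram.

    NOTE: this line format is a hypothetical example for illustration --
    it is not aVOR's documented wire protocol.
    """
    return (f"HEAD {head_yaw:.2f} {head_pitch:.2f} {head_roll:.2f} "
            f"EYE {eye_h:.2f} {eye_v:.2f}").encode()

def send_pose(sock, addr, head_yaw, head_pitch, head_roll, eye_h, eye_v):
    """Send a single pose update over UDP (fire-and-forget, no reply expected)."""
    sock.sendto(make_pose_datagram(head_yaw, head_pitch, head_roll, eye_h, eye_v), addr)

if __name__ == "__main__":
    # Address and port are placeholders; you would use the iOS device's
    # address and whatever port the app is configured to listen on.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_pose(sock, ("127.0.0.1", 5005), 10.0, 0.0, 0.0, -2.5, 1.0)
    sock.close()
```

In practice a controller would stream such updates at the animation frame rate, fed from a simulation loop or live sensor readings, which is why a connectionless UDP sender is a natural fit here.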
2.1
May 16, 2016
Release notes identical to the 2.0 features and 2.1 fixes listed under version 2.3 above.
2.0
April 18, 2016
Introduced the teaching and presentation tools listed under version 2.3 above.
1.1
March 8, 2012
- Additional languages: German, Italian, Japanese, Russian, Spanish, Simplified Chinese and Traditional Chinese
- Saccade sound effect volume scaled according to size of saccade
- Clearer and more realistic Eye Monitor inset
- Smooth particle transition between all three canals
- Triple-tap reset cycles between full head, two eyes mid close-up, single eye tight close-up
- Pre-recorded data file facility for head motion page
- Particles start at lowest part of canal
- Particle motion included for older devices with accelerometers but not gyroscopes
- Clearer anatomy in labyrinths graphic
- Clearer Axis Directions graphic
- Corrected particle activation/inhibition directions for the vertical canals
1.0
February 6, 2012
Price History of aVOR
Description of aVOR
A teaching, training and test tool for the vestibulo-ocular reflex (VOR) system and its disorders, including BPPV. It demonstrates eye saccades, including those caused by canalithiasis (free-floating particles in the canals), and both functioning and dysfunctional VOR. It shows how BPPV is caused and treated.
aVOR demonstrates how head movement has an automatic influence on eye direction, the impact of dysfunction of the semicircular canals, their size, shape and location in the head, and the causes of nystagmus.
The application includes a Quiz Mode which presents the symptoms of various types of VOR dysfunction for the student to diagnose.
It is designed for college-level neuropsychology students and medical professionals, and incorporates the latest research. aVOR is sponsored by the University of Sydney, Australia.
For more details please see the University of Sydney Human Factors Research page at http://www.psych.usyd.edu.au/HumanFactors/?page_id=2160
aVOR: FAQ
Yes, there is an iPad version available for aVOR.
Hamish MacDougall is the creator of the aVOR app.
To function properly, the app requires at least iOS 6.0.
aVOR has a user rating of 4.4 out of 5.
aVOR is in the Education category.
2.3 is the current version of aVOR.
aVOR was last updated on July 7, 2024.
aVOR was first released on February 6, 2012.
No objectionable content, suitable for young children.
aVOR supports English, French, German and 7 more languages.
No, aVOR is not featured on Apple Arcade.
aVOR does not offer in-app purchases.
aVOR does not support Apple Vision Pro.