Money money money

LookTel’s Money Reader is a revolutionary augmented reality app for the visually impaired, available for the iPhone, iPod Touch (with camera), and iPad (with camera), that counts US dollar bills ($1, $2, $5, $10, $20, $50, and $100). As soon as the app detects a bill, it starts counting. There is no need to hold the bill in front of the phone’s camera in a particular orientation; the app detects the bill artwork and immediately recognizes the denomination.

Money Reader does not require an internet connection, which makes it usable everywhere.

This technology is a helpful mobile assistant that’s simple and easy to use. While shopping, use the application to verify bills at checkout or to make sure you are getting the right amount of change back. It can also be used to quickly and easily sort money, with total independence, from virtually anywhere.

LookTel Money Reader provides VoiceOver support for several languages, including English, Spanish, French, Italian, German, Polish, Russian, Korean, Finnish, Danish, Swedish, Norwegian, and Japanese.

Let’s sum it up!

What is it?

LookTel’s Money Reader.

What is its purpose?

Allows a visually impaired person to count US dollar bills.

Skills needed?

The user must be able to operate a smartphone.

Who can use this?

Anyone with a visual impairment.

Where can I get it?

This app is available for $9.99; click here.

Check out the video for a very impressive demo!

Source: LookTel.

Label me

Imagine reaching into your kitchen cupboard blindfolded and trying to grab the can of soup (no, not the can of dog food, silly!). You’d probably end up guessing, and that could turn your dinner into an unwanted surprise!

The Braille labeler called “6dot” allows anyone to create Braille labels for visually impaired people — in any language!
Those familiar with Braille can use the six keys on 6dot (mapped to the six dots of the Braille system) to print labels. Those not familiar with Braille (for example, parents of blind children) can attach a standard external QWERTY keyboard to the labeler and use it to create labels. This device gives visually impaired people the ability to differentiate between similarly designed objects (think bottles in a fridge, medicines, the button panel on a microwave, etc.) by creating Braille labels that can be stuck on such objects. The concept of Braille labelers is not new, but 6dot is far more accessible and weighs significantly less than other Braille labelers on the market.
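
To make the six-key idea concrete, here is a minimal sketch in Python of how typed text could be translated into six-dot Braille cells. The dot numbering (1 to 3 down the left column, 4 to 6 down the right) is the standard Braille convention, but the code itself is only an illustration and has nothing to do with 6dot’s actual firmware.

    # Letters a-j use only dots 1, 2, 4 and 5; k-t add dot 3; u-z (except w) add dots 3 and 6.
    BASE = {
        'a': {1}, 'b': {1, 2}, 'c': {1, 4}, 'd': {1, 4, 5}, 'e': {1, 5},
        'f': {1, 2, 4}, 'g': {1, 2, 4, 5}, 'h': {1, 2, 5}, 'i': {2, 4}, 'j': {2, 4, 5},
    }
    CELLS = dict(BASE)
    for second, first in zip("klmnopqrst", "abcdefghij"):
        CELLS[second] = BASE[first] | {3}
    for third, first in zip("uvxyz", "abcde"):
        CELLS[third] = BASE[first] | {3, 6}
    CELLS['w'] = {2, 4, 5, 6}          # w entered the Braille code later and breaks the pattern

    def to_cells(text):
        """Return the dot pattern for each letter of a label, skipping other characters."""
        return [sorted(CELLS[ch]) for ch in text.lower() if ch in CELLS]

    print(to_cells("soup"))            # [[2, 3, 4], [1, 3, 5], [1, 3, 6], [1, 2, 3, 4]]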

Let’s sum it up!

What is it?

The 6dot Braille Label Maker.

What is its purpose?

Allows users to make custom Braille labels for people with visual impairments.

Skills needed?

Typing skills, either on the six Braille keys or on an attached QWERTY keyboard.

Who can use this?

Anyone who can type or use the Braille system. This device is useful for people with visual impairments.

Where can I get it?

This device can be purchased through 6Dot for $150.

Source: 6Dot

Take Note

Sonocent, a UK-based assistive technology company, recently released its Mac version of Audio Notetaker, a very simple, intuitive, and easy-to-use note-taking application that can be quite beneficial for people with dyslexia and for those who have trouble focusing for longer periods of time (for example, in a classroom).

The software allows you to either record audio directly or import an audio file. It also lets you color-code different sections of the audio. For example, if you are attending a lecture in which the professor discusses different topics, you can assign different colors to different sections of the recording so you know what is where. Audio Notetaker also lets the user control the playback speed, which means one can easily increase or decrease the speed as required.
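
As a rough mental model of what such color-coded notes look like under the hood, here is a small Python sketch; the Section and Lecture structures and all field names are invented for illustration and are not Sonocent’s actual file format.

    from dataclasses import dataclass, field

    @dataclass
    class Section:
        start_sec: float                  # where this chunk of audio begins
        end_sec: float                    # where it ends
        color: str = "none"               # e.g. "yellow" for exam material
        note: str = ""                    # short annotation typed during the lecture

    @dataclass
    class Lecture:
        title: str
        playback_speed: float = 1.0       # 0.5 = half speed, 2.0 = double speed
        sections: list = field(default_factory=list)

    lecture = Lecture("Biology 101 - cell division")
    lecture.sections.append(Section(0, 310, color="blue", note="intro and recap"))
    lecture.sections.append(Section(310, 1250, color="yellow", note="mitosis - on the exam"))
    lecture.playback_speed = 0.75         # slow the recording down while reviewing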

Audio Notetaker offers quite a few features, such as recording and playing audio, adding colors to audio sections, highlighting, etc. These features could benefit not only students with dyslexia or any other kind of learning disability, but also people with cognitive and memory deficits.

Let’s sum it up!

What is it?

This is audio note-taking software.

What is its purpose?

Audio Notetaker offers a visual and interactive form of note taking, making audio the basis of your notes and not text. This approach requires minimal input, allowing you to focus on listening instead of on writing.

Skills needed?

Cognitive and motor skills to use the software.

Who can use this?

Anyone who has difficulty reading, writing, or remembering.

Where can I get it?

A free 30-day trial version of the software can be downloaded here. A one-year license costs $74.99 and a full license is $149.99. This product is available through Sonocent Ltd.

Watch the video for more info on this software.

Source: Sonocent Ltd.

Your Car is Ready, Sir…

Late last year the State of Nevada passed a law allowing the use of autonomous cars on the road. This has led to increased interest in the idea of a car being operated without the need for a driver behind the wheel. Google has been working on such an automobile, dubbed the self-driving car project. Google took a Toyota Prius and wired it with sensors and cameras, which are routed to a GPS system and a powerful onboard computer. Volunteer testing has begun, with many visually impaired individuals signing up for a ride. One such rider is Steve Mahan, who has lost 95% of his eyesight. He was asked by Google to ride along as the car went through a Taco Bell drive-through, where he ordered a burrito (there’s a first time for everything). Although very far from the production stage, Google has already logged more than 200,000 miles with the self-driving car project since 2010.

Autonomous cars could be used by any individual with a physical disability that prevents them from driving. The person riding in, or ‘driving,’ the car would still need a certain cognitive level for safety. An example would be a person who is blind or has a severe visual impairment, or even an elderly person who no longer drives for fear of getting lost or not being able to react to traffic fast enough.

Let’s sum it up!

What is it?

An autonomous car that does not need a driver behind the wheel.

What is its purpose?

Allows visually impaired persons to be able to “drive” to places without having to rely on alternative transportation.

Skills needed?

The driver should have adequate cognitive skills to remain safe in the car.

Who can use this car?

Anyone with an impairment, such as a vision or perception impairment, that may prevent the person from being a safe driver.

Where can I get it?

This item is still in the research and testing phase, but hopefully in a few years we will be able to “ride” behind the steering wheel.

Source: Google+ via The Verge

Eye of the Beholder

One of the downsides to eye tracking software is that it is cumbersome and far too expensive for the average disabled user. Students at Brigham Young University have created eye tracking software that costs only $1,500 and can be piggybacked onto a Windows operating system. At this price point, the technology becomes accessible to other parts of the world, where price does make a difference in getting the help needed. As the name suggests, this software uses a camera to track the eye movements of the user, which are then translated on screen into a mouse pointer.
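
The core idea is simple enough to sketch. The Python snippet below, a hypothetical example rather than the BYU team’s code, shows how an estimated pupil position from the camera could be mapped to a cursor position on screen using a two-point linear calibration; the calibration numbers are made up.

    SCREEN_W, SCREEN_H = 1920, 1080

    # Hypothetical calibration: pupil (x, y) in camera pixels while the user
    # looked at the top-left and bottom-right corners of the screen.
    PUPIL_AT_TOP_LEFT = (210.0, 140.0)
    PUPIL_AT_BOTTOM_RIGHT = (430.0, 310.0)

    def gaze_to_cursor(pupil_x, pupil_y):
        """Linearly interpolate a pupil position into screen coordinates."""
        (x0, y0), (x1, y1) = PUPIL_AT_TOP_LEFT, PUPIL_AT_BOTTOM_RIGHT
        fx = (pupil_x - x0) / (x1 - x0)            # 0.0 = left edge, 1.0 = right edge
        fy = (pupil_y - y0) / (y1 - y0)            # 0.0 = top edge, 1.0 = bottom edge
        fx = min(max(fx, 0.0), 1.0)                # clamp so the cursor stays on screen
        fy = min(max(fy, 0.0), 1.0)
        return round(fx * SCREEN_W), round(fy * SCREEN_H)

    print(gaze_to_cursor(320, 225))                # (960, 540), the center of the screen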

This device can be categorized as a communication device. Many people with various impairments can benefit from the use of eye tracking software in order to communicate with others or to use a personal computer as an environmental control unit. Examples are persons with amyotrophic lateral sclerosis (ALS), spinal cord injuries, or other impairments that make functional use of the upper extremities challenging.

Let’s sum it up!

What is it?

Eye tracking software that can be used with any Windows operating system.

What is its purpose?

This software tracks the user’s eye movements and enables him or her to control the computer that way.

Skills needed?

Voluntary eye movement, cognition and perceptual skills.

Who can use it?

Anybody who cannot manipulate and control a computer due to functional impairments.

Where can I get it?

More information to acquire this software can be requested through Brigham Young University. Greg Bishop is the faculty team member supervising this project.

Source: Engadget via Medgadget via Brigham Young

Game On!

Not all assistive technology needs to be so serious-minded. Sometimes people just want to have fun! Enter the “Adroit Switchblade” video game controller. Designed for people with physical impairments, the Switchblade’s buttons can be customized to the user’s choices and ergonomics. The 19-port controller comes with two joysticks and allows a number of buttons, rumble packs, and switches to be added on, making this a completely customizable game controller.

Let’s sum it up!

What is it?

The Adroit Switchblade is a custom controller for gamers with disabilities.

What is its purpose?

Allows the user to customize joysticks and control buttons to his or her needs.

Skills needed?

Some movement in an extremity. The controller can be fitted to meet the needs of the gamer.

Who can use this?

Anybody with difficulty using a regular commercially available game controller.

Where can I get it?

Sold by Evil Controllers for around $400. No official launch date has been set.

Source: Engadget via Joystiq via Thrifty Nerd

Chew On This.

Those of us unfortunate enough to remember having braces also remember having to wear retainers once said braces came off. While retainers may have only helped us keep our teeth straight in the past, they now have a more functional element thanks to researchers at the Georgia Institute of Technology.

The Tongue Drive System is a wireless device that enables people with high-level spinal cord injuries or amyotrophic lateral sclerosis (ALS) to operate a computer and maneuver an electrically powered wheelchair simply by moving their tongues. The device uses magnetic sensors strategically placed within an upper-mouth retainer to track a small magnet attached to the tongue. The sensors transmit data over Bluetooth to an iOS app, which translates it into on-screen cursor or joystick movements; the smartphone can then direct an electric wheelchair at the user’s command. Earlier versions used a headset, but the retainer prototype is hoped to be more comfortable and discreet. The system is currently being trialed by 11 participants with high-level spinal cord injuries, with larger trials planned.
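
To give a sense of how sensor readings could become driving commands, here is a minimal Python sketch, not Georgia Tech’s software: it compares the current (simplified) magnetic reading against calibrated tongue-position templates and emits the command of the nearest one. All numbers are invented.

    import math

    # Hypothetical calibration: average sensor reading recorded while the user
    # held each tongue position during setup.
    TEMPLATES = {
        "forward": (0.9, 0.1, 0.2),
        "reverse": (-0.8, 0.0, 0.3),
        "left":    (0.1, 0.8, 0.1),
        "right":   (0.0, -0.9, 0.2),
        "neutral": (0.0, 0.0, 0.0),
    }

    def classify(reading):
        """Return the wheelchair command whose template is closest to the reading."""
        return min(TEMPLATES, key=lambda cmd: math.dist(TEMPLATES[cmd], reading))

    print(classify((0.85, 0.05, 0.25)))    # "forward"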

Let’s sum it up!

What is it?

The Tongue Drive is a dental retainer embedded with sensors to control any kind of computer system. The sensors track the location of a tiny magnet attached to the user’s tongue and communicate with the device via Bluetooth.

What is its purpose?

Tongue Drive is a wireless device that enables people with high-level spinal cord injuries to operate a computer and maneuver an electrically powered wheelchair simply by moving their tongues.

Skills needed?

Functional oral motor control and cognitive function to use the device.

Who can use this?

This device would be beneficial to people with high-level spinal cord injuries or anyone who cannot use their extremities to control computer systems. This device could replace the sip-and-puff control.

Where can I get it?

This device is still considered a prototype. Currently it is being tested with people with high spinal cord injuries at the Atlanta-based Shepherd Center as well as the Rehabilitation Institute of Chicago. To receive information on this device you may contact Maysam Ghovanloo, an associate professor in the School of Electrical and Computer Engineering who is part of the device’s development team.

Source: Engadget, Cnet, Georgia Institute of Technology

May I Give You a Hand With That?

Quality of life is a term of great importance to people with physical disabilities. How can technology help increase an individual’s quality of life, especially for those with severe physical disabilities who depend on other individuals for their day-to-day needs? A study published in Nature demonstrates, for the first time, that robotic limbs can successfully be controlled with just the power of the user’s mind. The study, performed by a team at Brown University in collaboration with other research institutions, implanted a brain-computer interface about the size of a pea into a patch of neurons in the motor cortices of two volunteers: a 58-year-old woman and a 66-year-old man, both quadriplegics.

As the first prosthetic limb not physically attached to the body, the “robot arm,” as it is being dubbed by its creators at BrainGate, uses neural signals to move. It starts with a small computer chip implanted in the paralyzed person’s brain. The chip then communicates, via decoding algorithms, with a sensor placed externally on the skull. Once everything is in place, the user must be trained to use the external robot arm by thinking about moving his or her paralyzed arms. Usually, this learning process takes no more than a couple of hours. Once the user has some practice, they have the ability to eat, drink, and even move small objects around just by thinking about it.
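
Under the hood, systems like this depend on a decoder that turns neural activity into movement commands. The following Python sketch shows the simplest possible version, a linear decoder with made-up weights for four recorded neurons; it is an illustration of the concept, not BrainGate’s algorithm. In practice, such weights would be fit during the training session in which the user imagines moving their arm.

    # Hypothetical decoding weights: each neuron's firing rate contributes to
    # the arm's horizontal (vx) and vertical (vy) velocity.
    WEIGHTS = [
        (0.5, 0.0),     # neuron 1 mostly drives rightward movement
        (-0.4, 0.1),    # neuron 2 mostly drives leftward movement
        (0.0, 0.6),     # neuron 3 mostly drives upward movement
        (0.1, -0.5),    # neuron 4 mostly drives downward movement
    ]

    def decode(firing_rates):
        """Return a (vx, vy) velocity command from per-neuron firing rates (Hz)."""
        vx = sum(rate * wx for rate, (wx, _) in zip(firing_rates, WEIGHTS))
        vy = sum(rate * wy for rate, (_, wy) in zip(firing_rates, WEIGHTS))
        return vx, vy

    print(decode([20, 5, 12, 8]))    # (8.8, 3.7): move up and to the right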

The BrainGate system will not be available until more trials have been done. The researchers are hoping to see more devices like this enter the market within the next few years.

Let’s sum it up!

What is it?

This device is a small computer chip that is implanted into the brain of a paralyzed person.

What is its purpose?

To control an electronic device with brain signals rather than through movement of an extremity or speech.

Skills needed?

Cognitive function to focus on thinking about the movement to be executed by the electronic device.

Who can use this?

This could be used by any person with limited movement of the upper extremities.

Where can I get it?

This device is still in the research stage. The supervising faculty Leigh Hochberg can be contacted through Brown University.

Source: Assistive Technology Blog via Ars Technica, Brown University

Seeing-Eye Phone?

The smaller technology becomes, the easier it becomes to mimic our bodies. This statement holds true for almost all assistive technology, but it really makes its mark with smartphones. Smartphones have become so commonplace in society that we now have named phobias for when they go missing. Fortunately, for the visually impaired this means help in the most amazing ways.

Meet the “knfbREADER Mobile” system, created by Ray Kurzweil of the Kurzweil Group. This voice-guided recognition system helps the visually impaired “read” where Braille may not be available. The Mobile Reader products can be activated and ready to use with the touch of a single button on the phone. It is essentially a high-resolution camera coupled with a powerful software processing system built into a Nokia N82. The user only has to press a button, and the software takes over, reading aloud what the user has framed in the viewfinder. At the same time, it can display the print on the phone’s built-in screen and highlight each word as it is spoken.
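
The pipeline behind a reading machine like this (photograph, recognize characters, speak) can be sketched in a few lines of Python. The example below uses the open-source pytesseract and pyttsx3 libraries as stand-ins for the commercial components; it is not knfb’s software, and the file name is hypothetical.

    from PIL import Image
    import pytesseract       # needs the Tesseract OCR engine installed
    import pyttsx3

    def read_aloud(photo_path):
        """Recognize the text in a photo of a printed page and speak it."""
        text = pytesseract.image_to_string(Image.open(photo_path))
        print(text)                      # show the recognized print on screen
        engine = pyttsx3.init()
        engine.say(text)                 # queue the recognized text for speech
        engine.runAndWait()              # block until it has been spoken

    read_aloud("menu.jpg")               # hypothetical photo of a restaurant menu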

Available worldwide through knfb Reading Technology, Inc.

Let’s sum it up!

What is it?

The knfbREADER Mobile system is a voice-guided recognition system.

What is its purpose?

The user takes a photo of the print to be read and the character recognition software in conjunction with high quality text-to-speech will read the contents of the document aloud.

Skills needed?

Adequate fine motor skills to manipulate the phone, or the cognition and language skills to use the device through voice control.

Who can use this?

Any individual with visual impairment could benefit from this device.

Where can I get it?

Sales inquiries go through knfb Reading Technology. A list of distributors close to you may be found by clicking here. The basic software is available for $900.

Source: Gizmodo via knfb Reading Technology, Inc.


Granny Nanny

Researchers at the University of Texas at Arlington are helping to provide the elderly with more independent living as they age. Instead of placing them into nursing homes or bringing in expensive in-home caregivers, the researchers would like to embed sensors throughout the individual’s home. Sensors in an apartment can capture movement, temperature, sound, and other elements. Sensors can also be worn to help indicate if the elderly resident has fallen or isn’t responding. Additionally, the researchers are working on assistive robots that recognize the person’s position and follow him or her into places where no other sensors are available.
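
One of the simplest rules such a sensor network could apply is an inactivity alert. The Python sketch below, an invented example rather than the UT Arlington system, flags a caregiver when the motion sensors have been quiet for too long during waking hours; the thresholds are hypothetical.

    from datetime import datetime, timedelta

    INACTIVITY_LIMIT = timedelta(hours=2)      # hypothetical threshold
    WAKING_HOURS = range(8, 22)                # only alert between 8 AM and 10 PM

    def needs_checkin(last_motion_time, now):
        """Return True if a caregiver should be alerted about inactivity."""
        if now.hour not in WAKING_HOURS:
            return False                       # overnight silence is expected
        return now - last_motion_time > INACTIVITY_LIMIT

    # Last motion event at 9:00 AM, it is now 11:30 AM -> time to check in.
    print(needs_checkin(datetime(2012, 5, 1, 9, 0), datetime(2012, 5, 1, 11, 30)))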

 

Let’s sum it up!

What is it?

System to unobtrusively monitor daily activities of elderly in their home environment and alert caregivers when help might be needed.

What is its purpose?

To facilitate ‘aging in place,’ where people can decide the level of monitoring support they wish to have to feel better about staying at home alone, even if it’s for just a few hours at a time.

Skills needed?

No skills needed to use this device.

Who can use this?

Any person who would benefit from supervision or reminders in the home environment.

Where can I get it?

The system is still in the testing stage. Professors and students have built a prototype room in the University of Texas at Arlington labs, where they are recording and processing all forms of daily activities with volunteers. The Department of Computer Science and Engineering can be contacted via Fillia Makedon.

Source: Engadget, University of Texas at Arlington