What are the criteria for selecting a writing assistant technology for those with dyslexia?
What should be the review process?
What features/capabilities are most important?
How do I review and grade different writing assistant packages?
Dr. Robert Iakobashvili, Ghotit CTO: I have been asked this question numerous times, so here are my recommendations for reviewing writing assistant solutions and finding the one that fits you best.
1. Choose your own text samples
Test the dyslexia software with your own text samples. The solution needs to assist your writing, so collect samples of typical text written by you or your targeted user group, and test these text samples.
2. Use text samples from multiple sources
Use numerous samples from diverse sources, written on different subjects. Building a large and varied corpus takes more time, but it has real value in understanding what a solution can bring to you or your institution.
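One way to make this evaluation concrete is to label your own samples with the word the writer actually intended, then measure how often each tool's top suggestion matches. The sketch below is illustrative: `toy` is a hypothetical stand-in for a real product's correction API, and the sample pairs are placeholders for misspellings drawn from your own writing.

```python
# Hypothetical evaluation sketch: score a checker's top suggestion
# against labelled samples (written word -> intended word).
# "checker" stands in for whatever API the reviewed tool exposes.

def accuracy(checker, samples):
    """Fraction of samples where the checker's top suggestion
    matches the word the writer actually intended."""
    hits = sum(1 for written, intended in samples
               if checker(written) == intended)
    return hits / len(samples)

# Samples collected from your own writing, not from published lists.
samples = [("becuase", "because"), ("freind", "friend"), ("wich", "which")]

# A toy checker standing in for a real product; it gets 2 of 3 right
# ("wich" should have been corrected to "which", not "witch").
toy = {"becuase": "because", "freind": "friend", "wich": "witch"}.get
print(round(accuracy(toy, samples), 2))
```

Running the same `accuracy` function over the same samples for each candidate tool gives you a directly comparable number per product.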
3. Don’t use any published text samples in your reviewing.
Why? Many software companies collect published samples and then optimize their algorithms to make sure that those samples produce good scores. You don’t want to be fooled by such targeted optimizations.
4. Make sure the solution is self-learning.
You want a solution that, over time, understands you or your user group better. Make sure the solution uses intelligent algorithms that learn your writing and offer improved suggestions the more you use it.
5. Make sure the vendor provides additional value compared to MS-Office, Pages, Windows, Mac or iOS platform tools.
For most of the population (those who are not dyslexic), the standard Windows, Mac, or iOS spell checker and word-prediction are good enough. Why spend money on a writing assistant solution if the Office or Pages spell checker does the job? Take your text samples and compare the results of the built-in spell checker against those of the writing assistant. Type your phrases with word-prediction enabled and check whether the software under review produces the right predictions faster and with less typing than your platform’s built-in tools, and whether it is easier to comprehend and select the right prediction.
Also check whether the vendor offers specific features (word banks, a definitions dictionary, a read-out-loud feature, etc.) that provide additional value.
6. The more users who test the solution the better.
If you are reviewing the software for an organization or a group of users, try to engage as many users as possible in the evaluation process. Ask different users to provide input on different areas: correctness of the algorithms, user experience, features, etc.
7. Be precise in defining the reviewing goals and criteria of success.
Define exactly the reviewing goals and success criteria of the solution. The success criteria can include:
– Correctness of the words suggested by the solution
– Improved grammar and punctuation
– Increased speed in typing due to an effective word-prediction technology
– Feature completeness – e.g. integrated dictionary, read-out-loud feature.
8. Make sure that the solution interoperates with your targeted environments.
Does your customer base use Windows? Macs? iPhones? Android devices?
Make sure the solution supports all of the different devices that you need to support.
9. Confirm that the vendor is a credible vendor.
Many basic spell-checking algorithms are published and can be easily programmed. Make sure the vendor is credible, with a real business behind the product and support available when needed.
10. Make sure that the vendor is dedicated to the dyslexic community.
There are a lot of writing solutions out there. Some are generic spell checkers positioned as solutions for dyslexics, but dyslexics require more sophisticated assistive writing solutions. Make sure that the vendor is familiar with the specific challenges dyslexics face when writing text, and that the company is dedicated to the dyslexic market.