My readability program uses a revolutionary way to determine how readable a book or article is. Rather than looking at syllables and word length, I check how often each word appears in typical written English found on the Internet.
I have developed a proprietary algorithm that operates much like a Google-style search engine. My World Crawler spiders through hundreds of web pages per day, extracting and parsing sentences.
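The extraction step can be sketched roughly as follows. This is a minimal illustration, not the actual World Crawler: it assumes tags can be stripped with a simple pattern and that word tokens are lowercase letter runs, and it uses a hardcoded page instead of a live fetch.

```python
import re
from collections import Counter

def extract_words(html: str) -> list[str]:
    """Strip markup crudely, then pull lowercase word tokens from the text."""
    text = re.sub(r"<[^>]+>", " ", html)   # naive tag removal, for the sketch only
    return re.findall(r"[a-z']+", text.lower())

# Stand-in for a crawled page (hypothetical content).
page = "<p>The cat sat on the mat. The dog barked.</p>"
counts = Counter(extract_words(page))
# counts["the"] == 3, counts["cat"] == 1
```

Counting tokens per page like this is what lets the later stages turn raw crawl output into per-word frequencies.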
A word database is then built, with each word assigned a frequency based on how often it appears on the Internet. My analysis engine compares the words in your book or article against this database, and once every word has been checked it generates a statistical report.
The final number tells you how common your language is relative to language on the Internet. Values higher than the average indicate text that is easy to read, while lower values indicate text that is harder to understand.
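The scoring idea described above can be sketched as a mean-frequency comparison. This is an assumed simplification: the `FREQ` table below is made up for illustration (a real one would come from the crawl), and averaging per-word frequencies is just one plausible way to produce the final number.

```python
from statistics import mean

# Hypothetical per-million word frequencies; real values would come
# from the crawled word database.
FREQ = {"the": 50000, "cat": 80, "sat": 40, "on": 30000,
        "mat": 10, "purred": 2, "quantum": 5, "entanglement": 1}

def readability_score(words: list[str]) -> float:
    """Average corpus frequency of the words; unknown words count as 0."""
    return mean(FREQ.get(w, 0) for w in words)

corpus_average = mean(FREQ.values())
easy = readability_score(["the", "cat", "sat", "on", "the", "mat"])
hard = readability_score(["quantum", "entanglement"])
# easy scores above the corpus average, hard scores below it
```

Under this scheme, common everyday wording pushes the score above the average (easy to read), while rare technical vocabulary pulls it below (harder to understand), matching the interpretation given above.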