I’m finding a lot of sentences in the database that are badly worded, missing words, spelt incorrectly, or fail to make grammatical or logical sense when read aloud. Is this intentional, or simply due to a lack of proof-reading before they were released into the production environment?
Perhaps another button could be added to both the recording and validation screens that allows a sentence to be flagged for review and possibly removed from the database or blacklisted in some way. I have to fail a number of recordings because the speaker inserts a word that should be in the sentence but isn’t actually printed there.
(Great initiative BTW – your friends @ Mycroft.ai sent me.)