Man, I thought I was going crazy all these weeks, but after reading other people's reviews, now I know I'm not. I've been experiencing the same things others have been complaining about. When I first started doing the "Answer a Question" jobs, everything was accepted without any problem. I even had a few "nice job!" comments from the editors thrown my way. But then, all of a sudden, I hit that "wall," where it seemed I was getting at least one rejection in every batch I submitted. The feedback was ridiculous, too, just lots of hairsplitting and nitpicking, as if the editor was looking for any reason to reject my submission.
For example, there was one question asking about the best way to find a medical specialist. I wrote, "One way is to ask your doctor for a referral." The source I had linked to said exactly that, except it added that it was recommended to ask for two referrals. The editor rejected my task, whining, "You wrote that you needed *a* referral. That's not what your source said. It said that you need *two* referrals." No, dingbat. The source said to ask for *a* referral, but to ask for two to be on the safe side.
Another thing editors have been doing is rejecting your work when they're too lazy to edit it. They'll just reject it with the generic "Has poor writing/multiple infractions" robo feedback. How do I know this? Because it's always the longest 200-300 word answers that get this feedback.
Editors have also been rejecting tasks based on their assumptions about writers being "ESL." In case you don't know what I'm talking about, editors went to the CrowdSource forums complaining about how writers were "obviously ESL." (They have the magical ability to tell who's a blue-blooded American speaker and who's a Third Worlder who only just learned English a year ago.) They then demanded the ability to flag jobs as "ESL." Apparently, this "ESL" thing gained traction, because not long after, a writer went to the forums complaining that several of his tasks had been unfairly rejected, suspecting that they had been rejected by an editor who had deemed his work "ESL." CrowdSource agreed with him that the rejections were unfair and overturned them.
Editors also seem to have lost sight of what their job entails. Some have begun leaving snarky feedback based on value judgments they've made about the pieces they're marking. Last I heard, a writer complained on the CS forums about a rude piece of feedback she had received on her piece. When called out on it in the forums, the editor who had left the commentary made the excuse that she was justified, because it was "obvious" to her that the writer had "run out of things to say."
At any rate, after about a month of smooth sailing, I saw my score--which had climbed to an 84--suddenly drop to 77 over one weekend. What I also noticed is that the other tasks I was still passing had no impact on my score. Apparently, only the Expert answers affect your score, not the High Traffic ones. So you could submit hundreds of High Traffic answers with flying colors, but as soon as one Expert answer is rejected, your score plummets.
Needless to say, don't waste your time with CrowdSource. It's a shame, too, because if it weren't for the bad editors, you could make some decent income there. Not anymore.