Crowdsourcing mobilizes crowds to help solve problems, from cleaning BP's oil spill to marketing your company effectively. That's great for the organization with the problem, but is it good for you? Or, as critics contend, is crowdsourcing a threat to your profession that you should resist?
My experience is that it depends. The new trend in crowdsourcing is coordinating expert crowds to attack tasks requiring specialized knowledge. For example, uTest organizes software testers, Local Motors organizes car designers, InnoCentive matches scientists to research efforts, and BountyJobs matches headhunters with positions that require domain expertise (such as Pathologist's Assistant or Swimwear Technical Designer). These expert crowdsourcing sites don't commoditize your work by sending it to the general masses; instead, they provide you with a new way to get compensated for your expertise.
Here's why: the best sites cut out the overhead and let you focus on what you do best. The best sites also reward you for results.
Say you're a headhunter. In addition to the work you perform, your hourly rate covers other things such as customer acquisition (sales and marketing), customer retention (customer support, billing, collections) and general overhead.
The new expert crowdsourcing sites reduce overhead costs by bringing customers and experts together and automating service, support and billing. You get paid directly for the results of your expertise, and you can apply more of your effort to solving the main problem rather than finding customers and managing operations. The setup is attractive to freelancers and other small-business owners who love exercising their specialty more than they love managing general business tasks.
This newer model of expert crowdsourcing is still developing, and some sites offer better incentives than others. The model I prefer is a collaborative compensation model, which is different from the more commonly known contest model used for some TV ads. In a contest model, participants respond to a request for submissions, but only one or two entrants win the job and its rewards. In a collaborative model, contributors each earn compensation based on their performance toward a final result.
uTest is a good example of the collaborative model. Its software testers get paid for performance: writing test cases or finding bugs based on those cases. If the task is to find software bugs, some testers find many and some only a few, but chances are high that most find something. The performance-based payout rates uTest experts earn can rival the effective rates they would earn as freelancers. And customers win because they get a collaborative result from many participants while paying only for results.
The crowdsourcing movement is young and the underlying models are evolving quickly. The practice is stratifying into two distinct types: crowds of experts and general crowds that simply enjoy participating. In the participation model, the crowd may accept winner-take-all prizes or even simple participatory rewards. For experts, the long-term question is "will crowdsourcing pay well for my expertise?"
My prediction is that soon, if not already, the answer is yes.