Don't Pay 'Blind' for Crowd-Sourcing Services

March 05, 2012

It may be useful to commission 'the crowd' to carry out a task. But given that ways of working will vary from one person to another, the results will also tend to be uneven. That's why it's better to pay in several stages.

To help with its strategic planning, a company needs to know the names of the 100 best cement manufacturers worldwide. It decides to use a paid crowd-sourcing platform, where the answers will be submitted by a large number of people who each contribute a small piece of information. But these workers don’t all work in the same way, are not all equally interested in the subject, and don’t all make the same amount of effort. The quality of the work will reflect this, often being incomplete, needlessly repetitive, contradictory or simply of poor quality. One can well imagine the disappointment of the company that commissioned the job.

Researchers* at the University of California, Berkeley studied crowd-sourcing mechanisms by observing the Amazon Mechanical Turk (AMT) platform. They identified two patterns of behaviour which influence the quality of the final result and make paying blind for the service riskier. The first is the way in which the people carrying out the small tasks respond to the request: some workers provide many more answers than others because the subject interests them or falls within their area of expertise, while others, through lack of time or interest, may provide very few.

A disappointing result

The second behaviour pattern relates to the way in which workers obtain the information. It turns out that, in order to fulfil a task, some members of the network carry out the sub-tasks by consulting lists which already exist on the Internet. If several workers draw on the same lists, the company will have paid for a disappointing service that provides little added value. There is even more risk of disappointment, the researchers point out, because the answers arriving from the crowd are subject to the law of diminishing returns: initially, previously unseen answers arrive at a high rate, but as the query progresses the flow of new answers tapers off, making it more and more difficult to achieve a result that meets the criteria set out by the commissioning company.
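The effect is easy to reproduce in a minimal simulation (a sketch using assumed, purely illustrative numbers, not the researchers' data): if workers draw their answers from a few overlapping online lists, the first batches of answers add many new names, while later batches mostly repeat what has already been collected.

import random

random.seed(7)

# Purely illustrative setup: 100 real manufacturers exist, but workers copy
# their answers from a handful of overlapping lists found online.
manufacturers = [f"manufacturer_{i}" for i in range(100)]
online_lists = [random.sample(manufacturers, 30) for _ in range(5)]

seen = set()
for batch in range(1, 11):  # ten paid batches of 50 answers each
    answers = [random.choice(random.choice(online_lists)) for _ in range(50)]
    new = set(answers) - seen
    seen |= new
    print(f"batch {batch:2d}: {len(new):2d} new answers, {len(seen):3d} distinct so far")

Running this, the first batch contributes dozens of new names while the last batches contribute almost none: exactly the tapering the researchers describe.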

Paying for what you need

For this reason, the researchers recommend that crowd-sourcing platforms be paid in several stages, an approach known as “pay as you go”. This means that the company pays for the results it gets rather than simply for access to the platform. The company submitting the request pays an upfront amount which covers the fee for being put in touch with the network that will carry out the job, plus the cost of a preliminary result. Once this has been delivered, the company can decide whether it needs more answers, in which case it makes a further payment in order to improve the quality and/or increase the quantity of the data. So asking ‘the crowd’ to carry out a complex task can be a useful solution, as long as payment is made in several stages. This will help to limit, if not entirely avoid, the risk of disappointments and nasty surprises.
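A rough sketch of such a staged purchase, with hypothetical function and parameter names that are not taken from the article: the buyer pays for one batch at a time and stops as soon as a batch no longer brings in enough previously unseen answers to justify its cost.

def pay_as_you_go(get_batch, price_per_batch, min_new_answers, budget):
    """Hypothetical staged procurement loop: keep paying for batches of crowd
    answers only while each batch still delivers enough new results."""
    seen = set()
    spent = 0
    while spent + price_per_batch <= budget:
        spent += price_per_batch           # pay for one more round of answers
        batch = get_batch()                # answers delivered by the crowd
        new = set(batch) - seen
        seen |= new
        if len(new) < min_new_answers:     # marginal value too low: stop buying
            break
    return seen, spent

Here the min_new_answers threshold stands in for the buyer's judgement about whether the extra quality or quantity is worth a further payment.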

*Beth Trushkowsky, Tim Kraska, Michael J. Franklin, Purnamrita Sarkar
