Corporate interest in crowdsourcing is heating up. Pepsi’s decision to skip the Super Bowl in favor of a crowdsourced ideas initiative – Pepsi Refresh – is one example of that interest. Digital strategy, marketing and design firm Last Exit called crowdsourcing a top digital marketing trend for 2010.
Contests are a particular form of crowdsourcing that is proving beneficial in a number of areas. Contests allow people from around the world to compete with one another on a specific challenge put forth by an organization. Participation is motivated by incentives commensurate with the level of the challenge.
The contest version of crowdsourcing has its own activities for gathering, filtering and selecting among people’s submissions. These activities are:
Crowdsourcing starts with the contributions of people from around the globe. These submissions are aggregated on a common site, provided in a format matching the contest objectives.
People provide feedback on the submissions of others. This feedback can take the form of up-down votes, star ratings, comments and buying into ideas with virtual currency. The process can be collaborative, helping refine submissions.
Organizations establish panels of experts who review the crowdsourced submissions and select those best meeting their requirements. These experts bring the domain knowledge needed to make the final decision in the contest.
The winners of the contest are determined by people’s votes and other measures. This selection process blends overall crowd sentiment, weighted toward higher-reputation members, with the ability of individuals to leverage word-of-mouth marketing.
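To make the idea of reputation-weighted crowd selection concrete, here is a minimal sketch of how such scoring might work. The weighting scheme, names and numbers below are illustrative assumptions, not Spigit's actual algorithm:

```python
# Hypothetical sketch: reputation-weighted up/down voting.
# A vote is (value, reputation), where value is +1 or -1 and
# reputation scales how much that member's vote counts.

def weighted_score(votes):
    """Return the reputation-weighted average of vote values."""
    total_weight = sum(rep for _, rep in votes)
    if total_weight == 0:
        return 0.0
    return sum(value * rep for value, rep in votes) / total_weight

# Two example submissions; submission A is favored by a
# high-reputation member, submission B by low-reputation ones.
submission_a = [(+1, 3.0), (+1, 1.0), (-1, 1.0)]
submission_b = [(+1, 1.0), (-1, 1.0), (-1, 1.0)]

# Rank submissions by weighted crowd sentiment, highest first.
ranked = sorted(
    {"A": submission_a, "B": submission_b}.items(),
    key=lambda kv: weighted_score(kv[1]),
    reverse=True,
)
```

In this sketch, submission A wins even with the same number of up-votes as down-votes overall, because the weighting gives higher-reputation members a larger say — which is the design trade-off such a scheme makes.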
These components can be integrated
in different ways to provide four different models for running
crowdsourced contests. These four models are described below.
Model #1: Crowd Sentiment, Expert Decision
The Crowd Sentiment, Expert Decision
model allows organizations to include the sentiment of the crowd as
part of their decision-making process. This is valuable input for
contests where the selected submissions will ultimately be put in front
of the market. The crowdsourced feedback provides an early read on the
potential market reaction.
This model is also ideal for cases
where a collaborative spirit can refine and improve submissions.
Especially for more complex contests, feedback from interested
collaborators is valuable for fully understanding the opportunity in
the submission and its weaknesses.
Two organizations are using Spigit for this model of crowdsourcing contest. Cisco is seeking billion-dollar ideas through its I-Prize contest, and the Enterprise 2.0 Conference is managing its competitive speaker proposal process with this model. Both are using crowdsourced feedback as part of the decision-making process.
Model #2: Crowd Decision
The Crowd Decision model leverages the crowd for all parts of the contest. It provides a great platform for organizations to better understand the meaning associated with their products and services. The submissions reflect the creativity of customers and interested parties. The feedback on a submission signals the intensity of feeling for someone’s particular interpretation of that meaning. Winners are determined by how the community rates their submissions.
This model is ideal for marketing
purposes. It becomes a strategic engagement model, particularly where
customers are talking about your organization in social media. It’s a
fun way to increase company awareness.
Model #3: Expert Decision
The Expert Decision model
engages the global community to find solutions to complex problems.
Experts review the submissions, identifying those best addressing the
objective of the contest. The sentiment of the crowd is not an element
in these contests, as they typically address more technical challenges.
This model also protects people’s ideas from theft by competitors. Submissions are visible only to designated experts associated with the sponsoring organization. This closed nature is important for generating interest from people with the technical competence to address a challenge.
Model #4: American Idol
The American Idol model is
so-named because it reflects the selection process of that show. The
community ultimately selects the winners of the contest. But the
candidates in the contest are first selected by experts.
This model works well when the quality of submissions will fluctuate significantly; the experts act as a filter before the community votes. It’s also appropriate when the sponsoring organization has a specific direction in mind for the winning submission, with the experts identifying candidate submissions consistent with that direction.
These are four different models for running a competitive crowdsourcing initiative, each with its own characteristics and business objectives. The biggest takeaway for anyone considering such an initiative is the flexibility of approaches available to accomplish different objectives.
(Cross-posted @ Spigit)