. an arm of the U.S. government is investing in
high-risk, high-payoff research programs,
one of which is the Good Judgment Project,
which harnesses crowd-sourced intelligence
to make predictions about global events.
. IARPA (Intelligence Advanced Research Projects Activity)
invests in high-risk, high-payoff research programs
to gain an intelligence advantage over future adversaries.
Failure is acceptable as long as the work maintains
technical and programmatic integrity
and results are fully documented.
All IARPA programs are structured according to
the Heilmeier framework:
1. What are you trying to do?
2. How is it done at present? Who does it?
What are the limitations of present approaches?
3. What is new about your approach?
Why do you think that you can be successful at this time?
4. If you succeed, what difference will it make?
5. How long will it take? How much will it cost?
How will you evaluate progress during and at the conclusion of the effort?
(i.e., what are your proposed milestones and metrics?)
The goal of IARPA's ACE (Aggregative Contingent Estimation) Program
is to harness crowd wisdom to forecast world events:
to dramatically enhance the
accuracy, precision, and timeliness of intelligence forecasts
for a broad range of event types,
through the development of advanced techniques that
elicit, weight, and combine
the judgments of many intelligence analysts.
The ACE Program seeks technical innovations
in the following areas:
(a) efficient elicitation of probabilistic judgments,
including conditional probabilities for contingent events;
(b) mathematical aggregation of judgments
based on factors that may include: past performance,
expertise, cognitive style, metaknowledge,
and other attributes predictive of accuracy; and
(c) effective representation of aggregated
probabilistic forecasts and their distributions.
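Innovation (b) can be illustrated with a minimal sketch: a performance-weighted linear pool. This is not ACE's actual aggregation method, and the `skill` weights here are hypothetical inputs standing in for "attributes predictive of accuracy":

```python
# Illustrative sketch only, not ACE's actual aggregation method:
# combine analysts' probability judgments into a single forecast,
# weighting each analyst by a (hypothetical) past-accuracy score.
def weighted_aggregate(judgments, skill):
    """judgments: per-analyst probabilities; skill: positive weights."""
    total = sum(skill)
    return sum(w * p for w, p in zip(skill, judgments)) / total
```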
The ACE Program will build upon
technical achievements of past research
and on state-of-the-art systems used today
for generating probabilistic forecasts
from widely-dispersed experts.
The program will involve empirical testing of
forecasting accuracy against real events.
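Testing accuracy against real events requires a proper scoring rule; ACE tournament results are widely reported in terms of the Brier score. A minimal version for binary events:

```python
# Brier score: mean squared error between forecast probabilities and
# realized 0/1 outcomes. Lower is better; a constant, uninformative
# forecast of 0.5 scores 0.25.
def brier_score(forecasts, outcomes):
    pairs = list(zip(forecasts, outcomes))
    return sum((p - o) ** 2 for p, o in pairs) / len(pairs)
```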
washingtonpost 2013 [dead link]:
The Good Judgment Project is a four-year research study,
part of the IARPA ACE Program [mentioned above],
organized as a forecasting tournament.
. people around the world predict global events.
Their collective forecasts are surprisingly accurate.
. each year since 2011,
IARPA has posed about 100-150 questions
to teams participating in its ACE forecasting tournament
on topics such as the Syrian civil war,
the stability of the Eurozone and Sino-Japanese relations.
Each research team was required to gather
predictions from thousands of forecasters online
and to generate daily collective forecasts
that assign realistic probabilities to possible outcomes.
The Good Judgment Project emerged as the clear winner.
The Good Judgment Project attributes its success to
a blend of getting the right "crowd"
(not necessarily subject matter experts),
offering basic tutorials on inferential traps to avoid,
and best practices to embrace,
concentrating the most talented forecasters into super teams,
and constantly fine-tuning the aggregation algorithms it uses
to combine individual forecasts into a
collective prediction on each forecasting question.
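One aggregation refinement reported in published GJP research is "extremizing" the averaged forecast, pushing it away from 0.5 to counter the dilution that simple averaging causes. A sketch, with a purely illustrative exponent (not GJP's tuned value):

```python
# Sketch of extremizing an aggregate probability: transform the odds
# by an exponent a > 1, pushing the estimate away from 0.5.
# The value a=2.5 is illustrative only.
def extremize(p, a=2.5):
    odds = p / (1.0 - p)          # probability -> odds
    ext = odds ** a               # exponentiate: pushes odds away from 1
    return ext / (1.0 + ext)      # odds -> probability

def aggregate(probs, a=2.5):
    """Average individual probabilities, then extremize the mean."""
    return extremize(sum(probs) / len(probs), a)
```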
. Good Judgment Project participant "dartthrowingchimp":
in the Good Judgment Project,
there are 3 prediction market "methods":
the first uses an Intrade-like
continuous double auction (CDA) trading scheme,
the second uses a logarithmic market scoring rule (LMSR) market maker,
the third just reports the average price of the two markets.
Michael's graph just shows the predictions of the LMSR market,
which performed relatively poorly on this particular question,
while the CDA market (and the average of the two markets)
actually outperformed every other method,
including the "Top Forecasters".
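For context, a minimal sketch of how an LMSR market maker prices a binary question; the liquidity parameter `b=100` is an arbitrary choice for illustration, not the GJP's setting:

```python
import math

# LMSR (logarithmic market scoring rule) market maker for a binary
# question. q_yes / q_no are outstanding shares; b sets liquidity.
def lmsr_cost(q_yes, q_no, b=100.0):
    """Cost function C(q) = b * log(exp(q_yes/b) + exp(q_no/b))."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))

def lmsr_price_yes(q_yes, q_no, b=100.0):
    """Instantaneous YES price = the market's implied probability."""
    e_yes, e_no = math.exp(q_yes / b), math.exp(q_no / b)
    return e_yes / (e_yes + e_no)

def trade_cost(q_yes, q_no, dq_yes, dq_no, b=100.0):
    """What a trader pays to move the market from q to q + dq."""
    return lmsr_cost(q_yes + dq_yes, q_no + dq_no, b) - lmsr_cost(q_yes, q_no, b)
```

With no shares outstanding the implied probability is 0.5, and buying YES shares pushes both the price and the cost of further YES purchases upward.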
more recently:
Since early 2012, I’ve been a participant/subject
in the Good Judgment Project (GJP),
a U.S. government-funded experiment in
“wisdom of crowds” forecasting.
Over the past year,
GJP participants have been asked to estimate
the probability of several events related to the conflict in Syria,
including the likelihood that Bashar al-Assad would leave office
and the likelihood that opposition forces would
seize control of the city of Aleppo.
. I wouldn’t describe myself as an expert on civil wars,
but during my decade of work for the
Political Instability Task Force,
I spent a lot of time looking at data on
the onset, duration, and end
of civil wars around the world.
From that work, I have a pretty good sense of
the typical dynamics of these conflicts.
. I have been paid by the
U.S. Holocaust Memorial Museum
for my work on the atrocities early-warning system
discussed in this post.
Since the spring of 2013, I have also been paid to
write questions for the Good Judgment Project,
in which I participated as a forecaster the year before.
. Will Unarmed Civilians Soon Get Massacred in Ukraine?
According to one pool of forecasters,
most probably not.
As part of a public atrocities early-warning system,
the Center for the Prevention of Genocide
is running a kind of always-on forecasting survey
called an opinion pool,
similar in spirit to a prediction market,
but instead of having participants trade shares
tied to the occurrence of some future event,
we simply ask participants to estimate the
probability of each event's occurrence.
In contrast to a traditional survey,
every question remains open until the event occurs
or the forecasting window closes.
This way, participants can update their forecasts
as often as they like,
as they see or hear relevant information
or just change their minds.
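The mechanics described above can be sketched as follows (function name and data layout are hypothetical, not the Center's actual system): while a question is open, a participant's newer estimate supersedes their older ones, and the pool's answer is the average of each participant's latest estimate.

```python
from datetime import date

# Sketch of an opinion pool's aggregation (hypothetical data layout):
# each forecast is (participant_id, date, probability); keep only each
# participant's most recent estimate, then average those.
def pool_estimate(forecasts):
    latest = {}
    for pid, when, prob in forecasts:
        if pid not in latest or when > latest[pid][0]:
            latest[pid] = (when, prob)
    probs = [prob for _, prob in latest.values()]
    return sum(probs) / len(probs)
```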