Don Arthur emails on a topic close to my heart, with a sensible and straightforward proposal that should be extremely appealing to a federal government that has announced its commitment to evidence-based policymaking.
I’m a big fan of your op-eds on randomised control trials. You might be interested to know that Catholic Social Services Australia (CSSA) is lobbying the government to use an RCT approach for projects funded through the new innovation fund for employment services.
CSSA’s latest submission says:
The innovation fund creates an opportunity for the Department to identify and disseminate information on innovative practices. However, because the impacts of employment programs are typically small, experimental or quasi-experimental methods are needed to accurately measure program impacts.
The Department should ensure that all programs supported by innovation funds are subject to a rigorous impact evaluation. The Department should consider contracting an independent research organisation to evaluate and report on innovative projects. The evaluators should have all necessary access to DEEWR data and other resources. Such an approach would assist comparison of projects under the scheme.
Using the resources of the Department it ought to be possible to conduct randomised control trials with job seekers receiving the ordinary level of service as a control group.
One of the reasons an experimental approach is feasible with employment services is that Centrelink is already referring job seekers to Job Network members on an almost random basis. Unless a job seeker states a preference for a particular provider, Centrelink shares out the referrals among providers with spare capacity.
The counterfactual in innovation experiments would be the regular level of service. That would neatly answer the most pertinent policy question — Is the new approach better than what we’re already doing?
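The near-random referral mechanism described above could be sketched in code. This is a minimal illustration only: the provider names and capacity numbers are hypothetical, not drawn from Centrelink or DEEWR data.

```python
import random

def refer(job_seeker, providers, preference=None):
    """Refer a job seeker to a provider: honour a stated preference
    if that provider has spare capacity, otherwise assign at random
    among providers with spare capacity (the near-random case)."""
    if preference is not None and providers.get(preference, 0) > 0:
        providers[preference] -= 1
        return preference
    open_providers = [p for p, spare in providers.items() if spare > 0]
    chosen = random.choice(open_providers)  # the near-random assignment
    providers[chosen] -= 1
    return chosen

# Hypothetical capacities: an innovation pilot alongside regular services
providers = {
    "innovation_pilot": 50,
    "regular_service_A": 200,
    "regular_service_B": 200,
}
arm = refer("seeker-001", providers)
```

Under this mechanism, job seekers without a stated preference are effectively randomised between the pilot and the regular service, which is what makes the regular service a clean control group.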
The government is planning to hand out $41m in innovation funding. It would be a tragedy if we never found out whether the innovative approaches actually worked.
It would be an interesting approach, and evidence-based policy is a field in which Australia does very badly compared with, for example, the UK.
You do of course have to adjust for the fact that trials generally outperform the full implementation of a programme (indeed, you would probably maximise outcome rates by continually running a series of trials). There is also the added difficulty that assignment to a pilot service may not be purely random: when the pilot has only just been set up, it has large excess capacity relative to every other provider, so it attracts a disproportionate share of referrals. This could be solved by gradually ramping the pilot up to full capacity and evaluating only over the period after it reaches full capacity, though that stretches out the timeframe of the evaluation.
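The ramp-up fix amounts to restricting the evaluation window to referrals made after the pilot first reaches full capacity. A sketch, with hypothetical dates, occupancy figures, and capacity threshold:

```python
from datetime import date

# Hypothetical referral records: (date of referral, pilot occupancy then)
referrals = [
    (date(2009, 1, 5), 10),
    (date(2009, 2, 3), 35),
    (date(2009, 3, 2), 50),   # pilot reaches full capacity here
    (date(2009, 4, 6), 50),
]
FULL_CAPACITY = 50  # assumed pilot capacity

# Date at which the pilot first hit full capacity
start = min(d for d, occ in referrals if occ >= FULL_CAPACITY)

# Evaluate only referrals from that date onward, when assignment is
# genuinely random rather than driven by the pilot's excess capacity
evaluation_sample = [(d, occ) for d, occ in referrals if d >= start]
```

The cost, as noted above, is that everything referred during the ramp-up period is discarded, which lengthens the evaluation.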
Finally, a lot of the discussion of the theoretical benefits of RCTs ignores the agency of those administering the trial. For example, in a large-scale trial undertaken in the UK, some of the people staffing the call centre that made the random assignments had formed priors about which service approaches they thought were better, and so when a caller was randomly assigned to one of the unfavoured arms they would advise them to call back to get another go at being placed in the ‘good’ trial.
One of the best studies I have seen using a Centrelink-style referral process as a basis for random selection is by David Autor (MIT) and Susan N. Houseman (NBER), “Do Temporary Help Jobs Improve Labor Market Outcomes for Low-Skilled Workers? Evidence from ‘Work First’”. Abstract at:
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=842477
Here in Australia we have the same high use of temporary job placements by Job Network providers. Some organisations that started life as labour hire companies are more likely to use labour hire than others.
What do you think of such a study being replicated here?
Evelyn, I’m a big fan of David’s, and would love to see his study replicated here. I’m not sure if you have any sway over these programs, but if so, let me know if you can think of clever ways of advancing them.