A Randomised Trial of the Job Network?

Don Arthur emails on a topic close to my heart, with a sensible and straightforward proposal that should be extremely appealing to a federal government that has announced its commitment to evidence-based policymaking.

I’m a big fan of your op-eds on randomised control trials. You might be interested to know that Catholic Social Services Australia (CSSA) is lobbying the government to use a RCT approach for projects funded through the new innovation fund for employment services.

CSSA’s latest submission says:

The innovation fund creates an opportunity for the Department to identify and disseminate information on innovative practices. However, because the impacts of employment programs are typically small, experimental or quasi-experimental methods are needed to accurately measure program impacts.

The Department should ensure that all programs supported by innovation funds are subject to a rigorous impact evaluation. The Department should consider contracting an independent research organisation to evaluate and report on innovative projects. The evaluators should have all necessary access to DEEWR data and other resources. Such an approach would assist comparison of projects under the scheme.

Using the resources of the Department it ought to be possible to conduct randomised control trials with job seekers receiving the ordinary level of service as a control group.

One of the reasons an experimental approach is feasible with employment services is that Centrelink is already referring job seekers to Job Network members on an almost random basis. Unless a job seeker states a preference for a particular provider, Centrelink shares out the referrals among providers with spare capacity.

The counterfactual in innovation experiments would be the regular level of service. That would neatly answer the most pertinent policy question — Is the new approach better than what we’re already doing?

The government is planning to hand out $41m in innovation funding. It would be a tragedy if we never found out whether the innovative approaches actually worked.
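The submission's point that employment-program impacts are typically small has a concrete design implication: detecting a small effect requires a large trial. Here is a rough sketch using the standard normal-approximation formula for comparing two proportions; the 30% and 33% placement rates below are hypothetical figures for illustration, not DEEWR data.

```python
from math import ceil

def sample_size_per_arm(p_control, p_treatment, z_alpha=1.96, z_beta=0.8416):
    """Approximate sample size per arm to detect a difference between two
    proportions, via the normal approximation. The default z values give a
    two-sided 5% significance test with 80% power."""
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    effect = p_treatment - p_control
    return ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# Hypothetical figures: 30% of the control group finds work, and the
# innovative service lifts that to 33% (a 3 percentage point effect).
print(sample_size_per_arm(0.30, 0.33))  # about 3,760 job seekers per arm
```

Even a three-percentage-point lift, which would be a respectable result for an employment program, calls for several thousand job seekers per arm. With $41m spread across many small projects, pooling the evaluation across sites may be the only way to get adequate statistical power.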


3 Responses to A Randomised Trial of the Job Network?

  1. Steve W says:

    It would be an interesting approach, and the whole field of evidence-based policy is one that Australia does very badly compared to, for example, the UK.

    You do of course have to adjust for the fact that trials generally outperform the full implementation of a programme (indeed, you would probably maximise outcome rates by continually running a series of trials). There is also the added difficulty that assignment to a pilot service may not be purely random: at the point when the pilot has only just been set up, it will have large excess capacity relative to every other provider. That could be solved by gradually ramping it up and only evaluating the period after it reaches full capacity, but that stretches out the timeframe of the evaluation.

    Finally, a lot of the discussion of the theoretical benefits of RCTs ignores the agency of those administering the trial. For example, in a large-scale trial undertaken in the UK, some of the people staffing the call centre which made the random assignments had formed priors as to which service approaches they thought were better, and so when a caller was randomly assigned to one of the unfavoured trials they would advise them to call back for another go at being placed in the ‘good’ trial.

  2. Evelyn H says:

    One of the best studies I have seen using a Centrelink-style referral process as a basis for random selection is by David Autor (MIT) and Susan N. Houseman (NBER), “Do Temporary-Help Jobs Improve Labor Market Outcomes for Low-Skilled Workers? Evidence from ‘Work First’”. Abstract:

    A disproportionate share of low-skilled U.S. workers is employed by temporary-help firms. These firms offer rapid entry into paid employment, but temporary-help jobs are typically brief, and it is unknown whether they foster longer-term employment. We exploit a unique aspect of the city of Detroit’s welfare-to-work program, in which one in five jobs taken is obtained with a temporary-help firm, to identify the effects of temporary-help jobs on the subsequent labor market advancement of low-skilled workers. Welfare participants are assigned on a rotating basis to one of numerous program providers that have substantially different placement rates into temporary-help and regular (direct-hire) jobs but offer otherwise standardized services. This gives rise to variation in job-taking rates that is functionally equivalent to random assignment. Using provider assignments as instrumental variables, we find that temporary-help job placements yield significant short-term earnings gains, but these gains are offset by lower earnings and less frequent employment over the next one to two years. Job placements with direct-hire employers, by contrast, substantially raise earnings over one, two, and three years following placement. The primary observable difference between these types of job placements is their effect on subsequent employment stability. Direct-hire placements roughly double the probability of ongoing employment in each of the first eight quarters following program assignment, while temporary-help placements only positively affect the probability of ongoing employment for two quarters and do not facilitate transitions to direct-hire jobs. These results qualify the interpretation of a large experimental literature documenting the benefits of job placement services for labor market outcomes of low-skilled workers. We find that the benefits of job placements derive entirely from direct-hire jobs; placing low-skilled workers in temporary-help jobs is no more effective than providing no job placements at all.
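    The identification strategy in that abstract, rotating provider assignment used as an instrument for the type of job placement, can be illustrated with a toy simulation. All numbers below are invented for illustration; this is not the authors' data or code. With a binary instrument, the IV estimate reduces to the simple Wald ratio, which recovers the true effect even though unobserved ability confounds the naive comparison.

```python
import random

random.seed(1)

def simulate(n=100_000, true_effect=2.0):
    """Toy IV setup: provider assignment Z shifts the chance of a
    temporary-help placement D, but is unrelated to ability U, which
    affects both D and earnings Y (the confounder)."""
    data = []
    for _ in range(n):
        z = random.random() < 0.5                           # rotating provider assignment
        u = random.random()                                 # unobserved ability
        d = random.random() < 0.1 + 0.5 * z + 0.3 * u       # temp placement
        y = true_effect * d - 1.5 * u + random.gauss(0, 1)  # earnings
        data.append((z, d, y))
    return data

def wald_estimate(data):
    """IV (Wald) estimator with a binary instrument:
    (E[Y|Z=1] - E[Y|Z=0]) / (E[D|Z=1] - E[D|Z=0])."""
    def means(group):
        ds = [d for _, d, _ in group]
        ys = [y for _, _, y in group]
        return sum(ds) / len(ds), sum(ys) / len(ys)
    z1 = [row for row in data if row[0]]
    z0 = [row for row in data if not row[0]]
    d1, y1 = means(z1)
    d0, y0 = means(z0)
    return (y1 - y0) / (d1 - d0)

print(round(wald_estimate(simulate()), 2))  # close to the true effect of 2.0
```

    A naive comparison of mean earnings between placed and non-placed job seekers would be biased here, because ability drives both placement and earnings; the as-good-as-random provider assignment sidesteps that, which is what makes a rotating referral process so valuable for evaluation.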

    Here in Australia, the Job Network makes similarly heavy use of temporary job placements, and organisations that started life as labour-hire companies are more likely to use labour hire than others.

    What do you think of such a study being replicated here?

  3. Andrew Leigh says:

    Evelyn, I’m a big fan of David’s, and would love to see his study replicated here. I’m not sure if you have any sway over these programs, but if so, let me know if you can think of clever ways of advancing them.
