Redesigning the IDEAL website

In this post, we invite you to participate in helping with the IDEAL website redesign.  You can get started right away if you want by completing this one-question survey about your specialty or discipline.  (If you’ve already completed the pop-up one, you don’t need to do it again.)

It’s now over 10 years since the IDEAL Collaboration was launched.  The time has come to bring our website up to date.  This post is about how we are planning to do this, and how you can help.

A lot of water has flowed under the bridge since 2010, but one thing has not changed.  New health care interventions must be underpinned by reliable evidence of effectiveness and safety. As we anticipate a future of further depleted resources and hugely increased pressure on health spending, new interventions will increasingly have to prove their worth.  We urgently need better study designs that encourage innovation whilst reliably demonstrating effectiveness.

IDEAL is now internationally recognised as an integrated pathway for the evaluation of complex interventions.  Starting with surgery, it has now been used in invasive non-surgical procedures, physiotherapy, radiotherapy, quality improvement studies and the evaluation of therapeutic devices.

The IDEAL guidelines for reporting were published in 2020, and appear on the EQUATOR guidelines pages.  Endorsements have come from major journals (Lancet, BMJ, Annals of Surgery) and professional bodies (Royal College of Surgeons).

There are IDEAL centres in the Netherlands and China as well as the Oxford centre.  Plans for a North American IDEAL network are progressing quickly.  IDEAL is being used in Health Technology Assessment and commissioning/purchasing decisions in Scotland, the Netherlands and Canada.  The head of the Chinese IDEAL Centre has been appointed as the scientific advisor for the Chinese technology evaluation centre in Hainan province.

The number of papers using or citing IDEAL continues to grow, and the NIHR in the UK has funded a number of IDEAL format studies, including an ongoing study of image guided brain cancer surgery.

Our recent Policy Forum was attended by high level representatives of NIHR, NICE, the MHRA (the UK’s regulator for medicine and medical devices) and the Commissioners at NHS England.  We are now in discussions about using IDEAL more systematically in the Commissioning process.

We have developed a range of training materials and a faculty of presenters, and have recently started to offer an Advisory service for those wishing to develop their research or evaluation using IDEAL principles.

The Collaboration continues to move forward intellectually too.  Hani Marcus from London has led the development of detailed guidance on IDEAL Stage 0 – the pre-clinical evaluation of devices.  Arsenio Paez from New York has led a group to discuss how to decide whether a randomised trial is necessary for new devices.  Both papers will be published shortly.  Meanwhile IDEAL has begun a major project on developing guidelines for the evaluation of surgical robots.

Meet our team and our vision for the website redesign

Peter McCulloch

Here’s what we want to achieve with the updated website:

  • Improved adherence to accessibility standards
  • A modern, engaging visual identity
  • Easier access to IDEAL resources
  • Better support for educational events
  • Better support for innovators and researchers planning evaluations of surgical procedures


Allison Hirst

Because surgery is procedure-based, it presents challenges to gathering, reporting, and applying sound evidence that are not seen in medicine. Therefore our broad aim is to increase awareness and utilisation of the IDEAL framework in surgical research.

A user-centred approach

Arsenio Paez

Evidence from human-computer interaction research consistently shows that involving users early on in any design process increases the chances of success.

This ought to be a truism but isn’t.  Sometimes, we assume we know what users want without asking them.  Often, we lack a clear understanding of what exactly users want to achieve in interacting with a resource.  On the other hand, user research can expand without limit if not carefully scoped.


Mudathir Ibrahim


So the key challenge is:

How do we get best value from user involvement using the resources we have available?



Baptiste Vasey

Here’s how we intend to approach it:

  1. A series of simple surveys on the IDEAL website, to understand who uses the site and what they are looking for.  Take the first one now!
  2. Recruiting an informal, representative group of website users who can provide insight on the user experience.
  3. Conducting cycles of design, development, and evaluation with this group as we progress through the project.


The IDEAL website has something for a varied audience.  In particular, we need to involve surgeons, researchers, editors, and commissioners.

If you would like to help, we’ll be happy to hear from you.  Tweet us at @IDEALcollab, or email Allison Hirst; tell us a little about yourself and we’ll take it from there.


Transparency and Access to Data

These resources deal with the essential topic of open and comprehensive disclosure in research.  Without knowing exactly what was planned, carried out and observed in a study it is impossible to determine its importance.  Failure to disclose this information has led to substantial harm to patients and wasted research effort. Continue reading

Outcome Measures and External Validity

These resources deal with how we identify, define and measure the outcomes of trials in surgery and other complex interventions.  A key aspect is the external validity of these measures, that is, whether they accurately and consistently represent the health outcomes they are supposed to measure. Continue reading

The need for core outcome sets in surgery

By Natalie Blencowe and Jane Blazeby

Compared with pharmaceutical trials, the quality of surgical randomised trials is poor and the evidence base for many surgical procedures remains weak. Reasons for this are multifactorial, including problems with recruitment and blinding, and the fact that surgical procedures are constantly evolving.

Another major difficulty relates to outcome assessment because there are currently no recognised definitions or standards for measuring surgical outcomes, including complications.

Lack of consistent outcomes

A recent systematic review of oesophageal cancer surgery has highlighted the extent of this problem, as not a single outcome was reported across all 122 included papers.

Anastomotic leak was reported in 80 studies but only defined in 28, using 22 different definitions  [1]. Similar problems have been reported in reviews of colorectal cancer, obesity and reconstructive breast surgery [2].

If studies do not all report the same outcomes, or provide definitions, it becomes impossible to accurately synthesise data so that outcomes can be compared between hospitals.

In addition, most studies measure and report surgeon-selected outcomes rather than patient-reported outcomes. This means that patients’ perspectives are often not considered and also that the intervention in question cannot be fully evaluated.

Core outcome sets

One solution is to provide a core outcome set for each surgical condition or procedure.

Core outcome sets contain a minimum agreed set of outcomes to be reported in all studies of a particular condition or procedure, and agreed definitions should also be provided as part of this.

If definitions and outcomes are standardised, meaningful cross-study comparisons can be made and outcome reporting bias is minimised.

Developing core outcome sets via COMET

One way of developing core outcome sets is to use Delphi methodology to reach consensus by surveying key stakeholders, including patients.
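As a concrete illustration, the scoring rule used in many Delphi consensus exercises can be sketched as follows. Note the assumptions: the 9-point importance scale and the 70%/15% thresholds are common conventions in core outcome set work, not a fixed requirement of the COMET initiative, and the function name is ours.

```python
# Simplified sketch of a consensus check after one Delphi round.
# Assumes a 9-point scale (1-3 = not important, 7-9 = critical) and the
# commonly used 70% / 15% thresholds; real studies define their own criteria.

def delphi_consensus(ratings):
    """Classify one outcome's stakeholder ratings (integers 1-9).

    'consensus in'  : >=70% rate it critical (7-9) and <15% rate it 1-3
    'consensus out' : >=70% rate it unimportant (1-3) and <15% rate it 7-9
    'no consensus'  : anything else (carried forward to the next round)
    """
    n = len(ratings)
    critical = sum(1 for r in ratings if 7 <= r <= 9) / n
    unimportant = sum(1 for r in ratings if 1 <= r <= 3) / n
    if critical >= 0.70 and unimportant < 0.15:
        return "consensus in"
    if unimportant >= 0.70 and critical < 0.15:
        return "consensus out"
    return "no consensus"

# Example: ten stakeholders rating a hypothetical outcome
print(delphi_consensus([9, 8, 8, 7, 9, 7, 8, 6, 9, 8]))  # consensus in
```

In practice each round feeds back the group’s scores to participants before they re-rate, and outcomes with no consensus go forward to the next round or to a final consensus meeting.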

The Core Outcome Measures in Effectiveness Trials (COMET) initiative facilitates the development of such sets in all areas of healthcare, including surgery. We are developing core outcome sets for oesophageal and colorectal cancer surgery, for obesity surgery and for breast reconstructive surgery [1, 2].

To achieve this, we are working with the respective sub-specialty organisations and with patient support groups. Whilst it is anticipated that clinical and patient-reported outcomes will be included, the final items (and their definitions) are yet to be decided.

If you would like to find out more, please visit our website, or post a question below.

  1. Blencowe NS, Strong S, McNair AG, Brookes ST, Crosby T, Griffin SM, Blazeby JM. Reporting of short-term clinical outcomes after esophagectomy: a systematic review. Ann Surg. 2012; 255(4):658-66.
  2. Potter S, Brigic A, Whiting PF, Cawthorn SJ, Avery KN, Donovan JL, Blazeby JM. Reporting clinical outcomes of breast reconstruction: a systematic review. J Natl Cancer Inst. 2011; 103(1):31-46.


Open Science and Data Sharing in Clinical Research

Basing Informed Decisions on the Totality of the Evidence

In a March 2012 editorial for Circulation: Cardiovascular Quality and Outcomes, Harlan Krumholz, Director of the YODA project, identified the steps required to achieve the fullest use of clinical research data to benefit patient care:

  1. Post, in the public domain, the study protocol for each published trial. The protocol should be comprehensive and include policies and procedures relevant to actions taken in the trial.
  2. Develop mechanisms for those who own trial data to share their raw data and individual patient data.
  3. Encourage industry to commit to place all its clinical research data relevant to approved products in the public domain. This action would acknowledge that the privilege of selling products is accompanied by a responsibility to share all the clinical research data relevant to the products’ benefits and harms.
  4. Develop a culture within academia that values data sharing and open science. After a period in which the original investigators can complete their funded studies, the data should be de-identified and made available for investigators globally.
  5. Identify, within all systematic reviews, trials that are not published, using sources such as trial registries and regulatory postings to determine what is missing.
  6. Share data.

Krumholz, H.M. 2012. Open Science and Data Sharing in Clinical Research: Basing Informed Decisions on the Totality of the Evidence. Circ Cardiovasc Qual Outcomes. 5:141-142.

Other recent relevant articles include:

Ross, J.S., Lehman, R. and C.P. Gross. 2012. The Importance of Clinical Trial Data Sharing: Toward More Open Science. Circ Cardiovasc Qual Outcomes. 5:238-240.

Spertus, J.A. 2012. The Double-Edged Sword of Open Access to Research Data. Circ Cardiovasc Qual Outcomes. 5:143-144.

Gotzsche, P.C. 2012. Strengthening and Opening Up Health Research by Sharing Our Raw Data. Circ Cardiovasc Qual Outcomes. 5:236-237.

Yale University Open Data Access (YODA) Project

We are very grateful to Harlan Krumholz and Richard Lehman for providing this short overview of the YODA project.  YODA provides a clear model of how we can address the fifth component of the IDEAL framework:  long-term follow-up.

The aim of the YODA project is to promote and facilitate the sharing of clinical research data so that research can be reproduced and extended in the service of advancing the public’s interest.

The project seeks to develop, test and implement methods to disseminate research data as widely, comprehensively, responsibly and productively as possible.

Why we need YODA

There is clear evidence that many clinical research studies are never published and that much of the clinical research cannot be reproduced – or risks being duplicated unnecessarily – because data are not shared.

This issue resides across private and publicly funded efforts – and involves academics, industry scientists and leaders, funders, policymakers, and journal editors.

How YODA can help

Our aspiration is to find common ground between the interests of academia, industry, government and the public in promoting more open science and data exchange. In this way we believe that all parties can benefit from greater public confidence in the scientific process and its principal actors.

We are specifically focusing on the sharing of data that may result in better information about the risks and benefits of products and strategies that are in use, rather than on pre-clinical or pre-approval research, which might have issues of intellectual property. Our ultimate goal is to support better-informed decisions by ensuring that data are not hidden from view. We are seeking partners with data and working to forge mechanisms by which the data can be shared. Our approach is to accomplish this sharing through mutual collaboration rather than external regulation.

Progress to date

In its initial project, YODA has developed one such dissemination model which provides a means for rigorous and objective evaluation of clinical trial data to ensure that patients and practitioners possess all necessary information about a drug or device when making treatment decisions.

This model is designed to provide industry with confidence that the analyses will be conducted in a scientifically rigorous, objective and fair manner. Several features of the model are specifically focused on promoting transparency and protecting against industry influence:

  • The company engaging in the model must provide all relevant product data
  • Two independent research groups, selected after a competitive application process, systematically review and analyze all relevant product data
  • An independent Steering Committee, including leaders in the field of clinical research and biomedical ethics, advises the YODA project team
  • A Clinical Advisory Committee, including leaders in the clinical practice that uses the product under evaluation, advises the project
  • Project leadership is committed to transparency, publication, and making the data publicly available

Applying the model to recombinant bone growth factors

In August 2011, Medtronic Inc. reached agreement with Yale University to commit to this model for analyzing all data relating to its products containing recombinant bone morphogenetic protein-2 (rhBMP-2). Yale University has obtained full individual patient data from 17 trials conducted by Medtronic together with all necessary supporting materials (meta-data).

Yale selected two highly regarded independent research groups, University of York and Oregon Health and Science University, to conduct analyses of these data independent of each other and without any direct involvement or influence from Medtronic or Yale University.

Dissemination plans

The research groups are on target to complete their analyses in August 2012. Their results will be disseminated simultaneously soon after completion, and the full Medtronic data sets will then be made publicly available for further evaluation by external investigators. The completion of this project will yield important methodological lessons for further development and deployment of this model of retrospective data disclosure.
The YODA project looks forward to learning from the results of this unique collaboration, as part of its continuing effort to explore the best means for sharing of data from all interventional human trials. More importantly, we hope that this project is just the first of many that will follow.

Find out more

For further details, visit the YODA Web site.

See also: Krumholz, H.M. and J.S. Ross. 2011. “A Model for Dissemination and Independent Analysis of Industry Data.” JAMA. 306(14):1593-1594.

Implementing IDEAL at the IJS

There is good evidence that poorly reported research is more likely to be biased. Just as the CONSORT Statement improved the quality of research reporting, IDEAL has the potential to improve the quality of surgical case reports and other publication types that report research in surgery and complex interventions.

But is there still a problem? Riaz Agha at the International Journal of Surgery (IJS) thinks so:

“Recent work I have done on Plastic Surgery RCTs has shown how people don’t report conflicts of interest, funding sources, ethical approval, etc in addition to how they randomised their patients.

Journal Editors could have easily corrected all that through having strict policies in place. We publish all this information each and every time at the IJS as we have a good Manuscript Administrator who ensures compliance before things go to Production. It’s incorporated into our submission process so we capture it at source.”

If you want to find out more about how the IJS is implementing the IDEAL framework in journal publication, please visit their website.

Of course, IDEAL is a whole framework for bringing new interventions into practice, not just guidance on reporting the right information. Whilst journal editors have a very influential role in improving the quality of research reporting, change needs to happen in parallel with other developments.

“By the time the manuscript gets to a journal Editor – the work has already been done and it’s too late to influence the research itself. It would be useful for instance if new technologies/interventions were mapped onto the IDEAL framework in Research Reports from the RCS – giving it credence and making it the roadmap we use and expect new interventions to proceed along.

I think we can raise awareness with editorials, debate, presentations at major conferences, etc. but it will take time and a cultural change.”