Action 1.08 states:
The health service organisation uses organisation-wide quality improvement systems that:
- Identify safety and quality measures, and monitor and report performance and outcomes
- Identify areas for improvement in safety and quality
- Implement and monitor safety and quality improvement strategies
- Involve consumers and the workforce in the review of safety and quality performance and systems
Intent
An effective quality improvement system is operating across the organisation
Reflective questions
How does the quality improvement system reflect the health service organisation’s safety and quality priorities and strategic direction?
How does the health service organisation identify and document safety and quality risks?
What processes are used to ensure that the actions taken to manage identified risks are effective?
Key tasks
- Define quality for clinical services (for example, effectiveness, safety, consumer experience) and share this information with the workforce
- Review the quality improvement system, including the vision, mission, values and objectives, to ensure that they reflect the organisation’s clinical safety and quality priorities, and strategic direction
- Decide how feedback will be collected from the workforce, patients and consumers
- Consider whether there is a coherent, planned and systematic schedule of audits of clinical and organisational systems, and reliable processes to capture findings and implement necessary improvements
- Develop a schedule for reporting to the governing body and managers on the design and performance of key clinical systems
- Monitor and review progress on actions taken to improve safety and quality, and provide feedback to the workforce, patients and consumers
- Provide information and training, where necessary, to the workforce, patients and consumers to encourage their involvement in the analysis of performance data.
Strategies for improvement
Hospitals
Develop a quality improvement system
The elements of a successful quality improvement system include:
- A description of ‘high quality’ that is reflected through the organisation’s vision, mission and values
- A definition of the organisation’s stakeholders
- Clearly defined and aligned organisational objectives and clinical quality objectives
- Clearly defined processes and responsibilities that are required to meet quality objectives
- Training for the organisation’s workforce in safety and quality
- Processes to verify that the quality improvement system is operating effectively
- Mechanisms for monitoring consumer satisfaction, measuring quality and implementing improvements.
Define quality and how it will be measured
Define the elements of quality to be used by the organisation (for example, safety, effectiveness, consumer experience). Provide a common language and understanding for the design, implementation and monitoring of safety and quality performance throughout the organisation. An example of this is the national list of hospital-acquired complications (HACs).
HAC refers to a complication for which clinical risk mitigation strategies may reduce (but not necessarily eliminate) the risk of that complication occurring. The national list of HACs includes 16 complications that were selected based on the criteria of preventability, patient impact, service impact and clinical priority. HACs are identified using routinely collected data extracted from patient healthcare records. Codes are used to identify the diagnosis, and a flag is used to indicate that the diagnosis arose during the episode of care.
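As an illustration of how this identification step can work in practice, the sketch below flags episodes containing a HAC from coded diagnosis data. The diagnosis-code groupings and the condition-onset flag value are assumptions for illustration only; an actual implementation would follow the Commission's published HAC specification and local clinical coding standards.

```python
# Minimal sketch: flagging hospital-acquired complications (HACs) from coded
# episode data. The code groupings and flag value below are ILLUSTRATIVE
# ASSUMPTIONS, not the Commission's published HAC specification.
from dataclasses import dataclass


@dataclass
class Diagnosis:
    code: str        # ICD-10-AM diagnosis code from the coded patient record
    onset_flag: str  # condition-onset flag attached to the diagnosis


# Hypothetical HAC groups keyed to illustrative diagnosis-code prefixes.
HAC_CODE_PREFIXES = {
    "Pressure injury": ("L89",),
    "Infection following a procedure": ("T81.4",),
}

# Assumed flag value meaning the diagnosis arose during the episode of care.
ONSET_DURING_EPISODE = "1"


def hacs_for_episode(diagnoses: list[Diagnosis]) -> set[str]:
    """Return the HAC groups triggered by diagnoses that arose in hospital."""
    triggered = set()
    for dx in diagnoses:
        if dx.onset_flag != ONSET_DURING_EPISODE:
            continue  # condition was present on admission, so not hospital-acquired
        for group, prefixes in HAC_CODE_PREFIXES.items():
            if dx.code.startswith(prefixes):
                triggered.add(group)
    return triggered


# Example episode: a pressure injury acquired in hospital and a pre-existing infection.
episode = [Diagnosis("L89.1", "1"), Diagnosis("T81.4", "2")]
print(hacs_for_episode(episode))  # {'Pressure injury'}
```

In this sketch, only diagnoses flagged as arising during the episode of care are counted, which mirrors the role of the condition-onset flag described above.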
The HACs list provides a succinct set of complications to support monitoring of patient safety in a hospital setting. Regular reports to clinicians, boards and other stakeholders on HACs can help identify areas that require attention, as well as areas of best practice. While the identification and reporting of HACs are important elements supporting patient safety, they are intended to complement and be used alongside other quality improvement processes.
The HACs list was developed through a comprehensive process that included reviews of the literature, clinical engagement, and testing of the concept with public and private hospitals. The list was agreed by the Commission’s Inter-Jurisdictional Committee in June 2016. In March 2017, the list was included in the National Health Reform Agreement for use in pricing and funding of Australian public hospitals.
Involve the workforce, patients and consumers in defining quality, and in processes such as reviewing quality improvement systems.
Define the key indicators for safety and quality measures that will be routinely collected and reported to management and the clinical workforce, as well as the level of detail required to enable the governing body and workforce to fulfil their responsibilities. These may include data from incidents and complaints management systems, safety and quality audit reports, infection control reports, reviews of clinical practice, and clinical indicators relating to specific actions in the NSQHS Standards.
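One simple way to make these definitions concrete is to keep a structured register of the key indicators, their data sources and the level of detail each audience receives. The sketch below is a hypothetical example of such a register; the indicator names, sources and reporting details are illustrative, not prescribed by the NSQHS Standards.

```python
# Hypothetical register of key safety and quality indicators. Names, sources
# and reporting details are illustrative examples only.
indicators = [
    {
        "name": "Healthcare-associated infection rate",
        "source": "Infection control surveillance reports",
        "reported_to": ["Governing body", "Clinical workforce"],
        "frequency": "Quarterly",
        "detail": "Organisation-wide rate, with ward-level breakdown for clinicians",
    },
    {
        "name": "Complaints acknowledged within agreed timeframe",
        "source": "Complaints management system",
        "reported_to": ["Governing body"],
        "frequency": "Monthly",
        "detail": "Summary counts and recurring themes",
    },
]

# A reporting schedule can then be derived directly from the register,
# for example listing everything the governing body should receive.
for ind in indicators:
    if "Governing body" in ind["reported_to"]:
        print(f"{ind['frequency']}: {ind['name']} ({ind['detail']})")
```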
Routinely measure and monitor patient experience by using national core common questions on patient experience developed by the Commission.
Conduct regular reviews and audits
Develop a schedule of reviews and audits that cover the variety of services and locations used for the delivery of care, to ensure that there is systematic oversight of safety and quality systems.
Conduct audits throughout the organisation, including the clinical, departmental, divisional and whole-of-organisation levels. Actively engage clinicians and consumers in the audit processes and analysis of results. Ensure that audits test the design and performance of the organisation’s clinical governance system.
Audits are effective if their outcomes are used for improvement and assurance purposes. Independent auditors or reviewers can give the governing body greater assurance that reporting is objective. Report audit outcomes throughout the organisation, including to the governing body, the workforce, and patients and consumers.
Record the outcomes of clinical system audits on a register, together with proposed actions and responsibilities, and evidence of implementation and follow-up. These records can be used to show how risks and opportunities identified through the quality improvement system are addressed, to improve safety and continuously improve performance.
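The sketch below shows one possible shape for an audit register entry, assuming a simple structured record is kept; the field names and example values are hypothetical illustrations of the items listed above, not a required format.

```python
# Hypothetical clinical audit register entry, capturing the items the guidance
# asks to be recorded: outcomes, proposed actions, responsibilities, and
# evidence of implementation and follow-up. Field names and values are
# illustrative only.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass
class AuditRegisterEntry:
    audit_name: str
    date_completed: date
    key_findings: list[str]
    proposed_actions: list[str]
    responsible_officer: str
    evidence_of_implementation: str = ""   # added once actions are completed
    follow_up_due: Optional[date] = None


entry = AuditRegisterEntry(
    audit_name="Medication chart documentation audit",
    date_completed=date(2024, 3, 1),
    key_findings=["Allergy status not documented in 12% of sampled charts"],
    proposed_actions=["Reinforce admission documentation procedure", "Re-audit in three months"],
    responsible_officer="Director of Pharmacy",
    follow_up_due=date(2024, 6, 1),
)

# Entries with no recorded evidence of implementation can be surfaced for follow-up.
if not entry.evidence_of_implementation:
    print(f"Follow-up required for '{entry.audit_name}' by {entry.follow_up_due}")
```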
Examples of evidence
Select only examples currently in use:
- Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
- Documented safety and quality performance measures
- Schedule for internal or external audits
- Audit reports, presentations and analysis of safety and quality performance data
- Feedback from the workforce about the use of safety and quality systems
- Feedback from consumers about their involvement in the review of safety and quality performance data
- Quality improvement plan that includes actions to deal with issues identified
- Examples of specific quality improvement activities that have been implemented and evaluated
- Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
- Training documents on the health service organisation’s quality improvement system
- Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care
- Reports on the hospital-acquired complications indicator set (public hospitals only).
Day Procedure Services
Develop a quality improvement system
The elements of a successful quality improvement system include:
- A description of ‘high quality’ that is reflected through the organisation’s vision, mission and values
- A definition of the organisation’s stakeholders
- Clearly defined and aligned organisational objectives and clinical quality objectives
- Clearly defined processes and responsibilities that are required to meet quality objectives
- Training for the organisation’s workforce in safety and quality
- Processes to verify that the quality improvement system is operating effectively
- Mechanisms for monitoring consumer satisfaction, measuring quality and implementing improvements.
Define quality and how it will be measured
Define the elements of quality to be used by the organisation (for example, safety, effectiveness, consumer experience). Provide a common language and understanding for the design, implementation and monitoring of safety and quality performance throughout the organisation.
Define the key indicators for safety and quality measures that will be routinely collected and reported to management and the clinical workforce, as well as the level of detail required to enable the governing body and workforce to fulfil their responsibilities. These may include data from incidents and complaints management systems, safety and quality audit reports, infection control reports, reviews of clinical practice, and clinical indicators relating to specific actions in the NSQHS Standards.
Routinely measure and monitor patient experience by using national core common questions on patient experience developed by the Commission.
Conduct regular reviews and audits
Record outcomes of clinical system audits on a register, together with proposed actions and responsibilities, and evidence of implementation and follow-up. These records can be used to show how risks and opportunities identified through the quality improvement system are addressed, to improve safety and continuously improve performance.
Actively engage clinicians and consumers in the audit processes and analysis of results. Ensure that audits test the design and performance of the organisation’s clinical governance system.
Examples of evidence
Select only examples currently in use:
- Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
- Documented safety and quality performance measures
- Schedule for internal or external audits
- Audit reports, presentations and analysis of safety and quality performance data
- Feedback from the workforce about the use of safety and quality systems
- Feedback from consumers about their involvement in the review of safety and quality performance data
- Quality improvement plan that includes actions to deal with issues identified
- Examples of specific quality improvement activities that have been implemented and evaluated
- Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
- Training documents on the health service organisation’s quality improvement system
- Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care.
MPS & Small Hospitals
Identify the local governance arrangements for monitoring and improving safety and quality, including the local individuals or groups responsible for oversight of clinical safety and quality risk management.
MPSs or small hospitals that are part of a local health network or private hospital group should use the description of ‘high quality’ that is reflected through the network or group’s vision, mission and values.
Small hospitals that are not part of a local health network or private hospital group should develop a description of ‘high quality’ – for example, describe an effective and safe health service organisation in which consumers have a good experience of care.
The organisation then:
- Shares this information with the workforce
- Determines how feedback will be collected from the workforce, patients and consumers
- Considers whether there is a coherent, planned and systematic schedule of audits of clinical and organisational systems, and reliable processes to capture findings and implement necessary improvements
- Develops a schedule for reporting to the governing body and managers on the design and performance of key clinical systems
- Monitors and reviews progress on actions taken to improve safety and quality, and provides feedback to the workforce, patients and consumers
- Provides information and training, if necessary, to the workforce, patients and consumers to assist their involvement in analysing performance data.
One safety and quality measure that could be used in public health service organisations is hospital-acquired complications (HACs). A HAC is a complication for which clinical risk mitigation strategies may reduce (but not necessarily eliminate) the risk of that complication occurring. The national list of HACs includes 16 complications that were selected based on the criteria of preventability, patient impact, service impact and clinical priority. Not all of these complications will be relevant to MPSs or small hospitals.
Examples of evidence
Select only examples currently in use:
- Policy documents that describe the processes and accountability for monitoring the safety and quality of health care
- Documented safety and quality performance measures
- Schedule for internal or external audits
- Audit reports, presentations and analysis of safety and quality performance data
- Feedback from the workforce about the use of safety and quality systems
- Feedback from consumers about their involvement in the review of safety and quality performance data
- Quality improvement plan that includes actions to deal with issues identified
- Examples of specific quality improvement activities that have been implemented and evaluated
- Committee and meeting records in which reports, presentations, and safety and quality performance data are regularly reviewed and reported to the governing body or relevant committees
- Training documents on the health service organisation’s quality improvement system
- Communication with the workforce, patients and carers that provides feedback regarding safety and quality of patient care
- Reports on the hospital-acquired complications indicator set (public hospitals only).