end of care by the treating team and individuals; and (ii) those that occur at the blunt or organisational level, typically through policies, procedures, staffing and culture. These errors can be further subdivided (Table 4.1).

Error type | Explanation | Example

Sharp errors (occurring with the team/individuals treating the patient)
Mistake | Lack or misapplication of knowledge | Not knowing the correct drug to prescribe
Slip or lapse | Skills-based error | Knowing the correct drug but writing another one
Violation | Deliberate action that may be routine or exceptional | Not attempting to get a drug second checked because no staff are available

Blunt/organisational errors | Policies, procedures, infrastructure and building layout that have errors embedded | Different drugs used by different specialties and departments for the same condition
Figure: Schematic illustration of the 'Swiss cheese' model.

      Each slice of Swiss cheese represents a barrier that, under ideal circumstances, would prevent or detect error. The holes represent weaknesses in these barriers; if the holes align, the error passes through undetected, with the potential to cause a poor outcome and patient harm.

      Reconsider the drug error example using the Swiss cheese model. The first slice is the doctor writing the prescription, the second is the organisation's drug policy, the third is the midwife who draws up the drug and the fourth is the midwife who second checks it.

      Now consider the following. What if the doctor is relatively new to the obstetric unit and unfamiliar with the specific drugs or doses used in this situation? Their 'slice of cheese' has larger holes. What if the organisation has failed to develop a robust drug policy that is fit for purpose, and guidelines are out of date or not easily accessed? This second slice is considerably weakened, or may even be removed completely. What if the drug is drawn up by a midwife who has just returned from a career break and is not familiar with the particular antihypertensive drug used? Their 'slice' also has larger holes. Labour wards are often chronically short of staff, and the midwife who performs the second drug check is distracted because they are looking after two high-risk women in labour; inadvertently, their check is only cursory. This final slice (or barrier) is completely removed.

      The end result is that multiple defences have been weakened or removed, and error leading to unintentional harm becomes more likely. Be aware also of the different types of failure within the system: (i) latent failures, which include organisational errors (e.g. no effective policy, out-of-date guidelines and inadequate staffing levels); and (ii) active failures (e.g. failure to escalate, drug errors, and failure to monitor or act on deteriorating vital signs).

      Historically, those making mistakes have been identified and singled out for punishment and/or retraining, in what is often referred to as a culture of blame. With our example drug error, blame would most likely have fallen on the shoulders of the midwife administering the drug and/or the doctor who prescribed it incorrectly. Does retraining these individuals make it safer for other or future patients? That clearly depends on the underlying reasons. If it was purely a knowledge gap, possibly, but does the same knowledge gap exist elsewhere? Potentially, all the other issues remain unresolved. Moreover, such punitive reactions make individuals less likely to admit mistakes and near misses in the future.

      The focus is now on learning from error and, in shifting away from the individual, on identifying system and organisational errors. Once robust, effective systems, procedures and policies are in place, errors can be captured. Of course, issues will still need to be addressed where individuals have been reckless or lacked knowledge, but the reasons why those individuals felt the need to violate policy, or had not been given all the knowledge required, can now be examined.

      Violation may be indicative of failing systems, procedures or policies, or of other cultural issues. It is important that policies, procedures, roles and even our buildings and equipment are all designed proactively with human factors in mind, so that things do not have to be fixed retrospectively when adverse events occur. This means that all members of the organisation must be aware of human factors, not just front-line clinical staff.

      Improving team and individual performance

      Having discussed the magnitude of the problem of healthcare error, the rest of this chapter will focus on how the performance of teams and individuals can be developed.

      Raising awareness of human factors, and being able to practise these skills and behaviours within multiprofessional teams, allows effective teams to develop in all situations. Simulation activity allows a team to explore these new ideas, practise them and develop them. To do this we need feedback on our performance, within a safe environment where no patient is at risk and egos and personal interests can be set aside. Consider how you developed a clinical skill: it needed to be practised again and again until eventually it became automatic and routine. The same applies to our human factors behaviours. In addition, by recognising our inherent human limitations and the situations in which errors are more likely to occur, we can all be hypervigilant when required.

