Portfolios and Competency

On Thursday evening our #meded chat will be on the topic of portfolios and demonstrating competency. In advance of our discussion it would be good to use this space to write a little about your experience as a student or doctor (other health professionals are welcome too!). How do you feel about the rise of the competency model of medical education, which is replacing the apprenticeship model? What are the benefits, if any? What are the main problems with the way portfolios are executed? Have you had any positive experiences with portfolios?
It’s often stated that a competency model is needed to satisfy the demands of wider society. Do you think that there is any truth in this? Why do you think portfolios and competency have been accepted by the leaders of medical education with seeming enthusiasm, whilst many on the ground seem disappointed and disillusioned with the experience?


  1. #1 by David Lewis on June 29, 2011 - 6:24 am

    It was the need to have a logbook for operations training as a surgeon that led to my recording every patient seen during the VTS scheme, 1996-1999. Having been an early adopter of mobile devices helped. My trusty Psion 3a device went everywhere. The database exported easily to an Excel spreadsheet. I still have the file and a rather thick binder containing the printout. Every patient, anonymised with a patient number, presenting complaint, diagnosis if known, treatment given, reference to literature supporting the management plan, and outcome if known. Learning portfolios were not routine then, but my portfolio has provided a solid foundation for the practice of family medicine.
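    For illustration only, a minimal sketch of how such an anonymised encounter log might be kept and exported to a spreadsheet today; the field names follow the description above, while the example record and file name are invented, and this is not the original Psion database:

    # Minimal sketch of an anonymised encounter log like the one described above.
    # Field names are assumptions drawn from the comment, not the original database.
    import csv
    from dataclasses import dataclass, asdict, fields

    @dataclass
    class Encounter:
        patient_number: str        # anonymised identifier, never the patient's name
        presenting_complaint: str
        diagnosis: str             # "unknown" if not yet established
        treatment: str
        supporting_reference: str  # literature supporting the management plan
        outcome: str               # "unknown" if not yet followed up

    def export_log(encounters, path="encounter_log.csv"):
        """Write the log to a CSV file, which Excel opens directly."""
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(Encounter)])
            writer.writeheader()
            writer.writerows(asdict(e) for e in encounters)

    if __name__ == "__main__":
        log = [Encounter("0001", "chest pain", "unknown", "aspirin, referral",
                         "local ACS guideline", "unknown")]
        export_log(log)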

    • #2 by amcunningham on June 29, 2011 - 5:44 pm

      Thank you. Do you still gain from this? And do you think it would be easy to convince many more to adopt this? I am also guessing that you kept this record with little feedback from others. Do you think a mentor would have helped or hindered the process? Would having someone else access your portfolio have changed the nature of what you recorded? You might find Galbraith’s presentation here very interesting.
      http://www.medbiq.org/conference2011/videos

  2. #3 by @NHS_GP on June 29, 2011 - 7:49 am

    As a medical student I had to complete a portfolio which included cannulation, venepuncture, examinations, ABGs, and of course signed forms stating I had bothered to turn up. This made sense at that time because I was a student. I just missed eportfolios and the foundation programme, but much of the same hoop-jumping and form-filling has transferred into GP appraisals.

    I am unconvinced portfolios create better doctors; in fact I think they infantilise doctors when they should be moving towards independent practice. So much of what I learnt as a PRHO and SHO can’t be written down, and wouldn’t have fitted into any of the required boxes of a portfolio anyway. It is working with skilled, experienced colleagues that is important, and the portfolio may be an obstacle to gaining the most from this.

  3. #4 by Catherine on June 29, 2011 - 8:15 am

    I’m afraid I don’t know enough about portfolios to comment in depth. The impression I get is a negative one – but something is telling me that I should probably check out that obscure website with its illogical interface and clunky menu system at some point.

    If only there was an app for it. (No, seriously)

  4. #5 by Antonia on June 29, 2011 - 8:22 am

    My supervisors and colleagues were very happy with my work during my F1 year.
    I failed the interim validation for my ePortfolio because:
    1. I hadn’t got my ALS. This was because the course was scheduled for a day when the trust had also scheduled another “obligatory” course, and in reality I could not attend either because they were before the start of my contract and I was on holiday. In October I tried to book a place on an ALS course and was told they were fully booked for the year and would not take bookings for the new year until after January (and after interim validation).
    2. I hadn’t attended enough teaching. I attended every teaching session I could when I was not on call, nights or on holiday. Unfortunately this was inadequate.
    I was then allocated an F2 job that was impossible for me to travel to. So I had to withdraw from the foundation programme altogether and apply for a job overseas. When going to interviews overseas I had nothing to show for the hours I had spent on the ePortfolio because as a document it has no relevance outside the Foundation programme. Potential employers did not want to read 150 words about how well I could cannulate. They assumed that as an MBBS I could do that.

    I then went on to have my final portfolio graded as a potential fail, having submitted everything that was asked of me. I was told it was “too thin”. I went to make my case to the assessor in question and he accused me of having written everything in the last two weeks. I pointed out that the date he was looking at was the “last modified” date, not the “created on” date, which he could find if he actually opened the document and read it (of course I had proofread everything before I submitted it). He did then decide to read something in front of me and by chance picked a “reflection” on why I felt attendance at Grand Rounds should not be compulsory for F1s and F2s, because it stood to reason that if you wanted to learn you went. If junior doctors can’t make decisions for themselves about learning opportunities, how can they ever be expected to become “independent learners”? He read this one reflection and decided maybe he could pass my portfolio after all.

    I think there does need to be a logbook for F1 and F2, but it needs more flexibility. I feel it should be a collection of 10-12 case reports that you discuss with your supervisor on a regular basis, and through these highlight competencies. The list of “skills” could be assessed in one go during an F1 induction course. I spent an enormous number of hours on my eportfolio and can definitely say I have gained nothing from it professionally.

  5. #6 by Catherine on June 29, 2011 - 8:37 am

    I’m coming to the end of my third year as an undergrad medic, and have mixed feelings towards those forms and bits of paper that require signatures in order to prove myself worthy… I can understand where they stem from – DOPs forms are a relatively easy way to keep track of a student’s progress and of whether they are actually doing something on firm. It’s just as well I’m in a teaching hospital, because asking for feedback on pulse oximetry, in the ‘What they could have done better’ box that cannot be left blank on pain of death, is a little embarrassing. Especially if they’re already trying to run away from you.
    On the other hand, I can see the bright side. In the handbooks we were given on firm, there is a short ‘diary’ section for each day of your attachment, with the option of a consultant signature. This was formative (thank God), and its existence is a result of an older student who was annoyed that they did not shine enough, compared to their peers, on that ‘end of attachment certificate’ with a tick next to acceptable. So, it could be worse. (Even though I empathise – it is slightly infuriating when your firm partner shows up for fewer hours in the six-week firm than you did in three days, and still gets signed off. But never mind, such is life.)
    Anyway, as it is, I’m just rolling with it. I’m not creative enough to come up with a better idea, so I’ll just carry on having a spare few pieces of paper with me at all times.

  6. #7 by Malcolm McKenzie on June 29, 2011 - 9:09 am

    I had a young colleague, nice fella, who had a patient who needed a urinary catheter. He seemed reluctant to do it and changed the subject when I gently probed as to what the issue was, ‘but catheters are disgusting!’, he said.
    I levelled – ‘how many catheters have you done?’
    ‘Two….. but I’m signed up for it!’ he beamed.

    There is a very real risk with WPBA and ePortfolios that someone’s true ability is not assessed well, as it goes on the basis of a single encounter. Like my colleague, one may have the documentation but not the experience (and therefore either not realise, or worse still not accept, that this is the case). Doctors out of their depth get defensive, and this is not what we should have learned through Clinical Governance and the need for an open, non-judgemental forum for discussion (or indeed anything post-Bristol).
    Long gone are the days when your boss would give a call to a colleague describing you as ‘a good egg’, or ‘safe pair of hands’, but there was a real safety in that honest, sensible total appraisal of a doctor. Perhaps we still need that over-riding adjunct.

    I can’t make the discussion on Thursday (am somewhat predictably on call)

    On the plus side, I did manage to basically do my ePortfolio (mainly by uploading many paper documents) in 48 hours or so. However, I think my requirements are a lot less than those of people coming through the training now. I’m genuinely not sure how I would cope with it in the current situation.

    Anyway those are just some thoughts, which feel free to think about or discard. As a competing interest I should declare that in my final StR ARCP I was described on my ePortfolio as being ‘so laid back he is horizontal’. Fortunately I was blessed with a panel who were sensible enough to allow me to explain that being laid-back is not synonymous with complacency, and I had a string of tough Senior Clinical Fellow jobs before my lateral move into ST to back this up. I fear that if I didn’t have these two basic and simple things, my progression might have been harder.

    Respectfully

    • #8 by Docshannon on June 29, 2011 - 10:45 am

      “…honest, sensible, total appraisal of a doctor.”

      The holy grail! It seems completely unattainable in today’s NHS with the current system of eportfolio, competencies and job application forms that value bullshit answers over ability as a doctor. Maybe there should be a “good egg” box to tick before ARCP that gets you bonus points.

  7. #9 by Kate Bowman on June 29, 2011 - 9:44 am

    I am a current third year student at the University of Manchester, having come down from St Andrews where I did the first three years of my course. A portfolio has been a “pass-or-fail” component of both courses.

    In St Andrews we used an online e-portfolio. It was quite a common OSPE (Practical not Clinical) station for students to fail on – they simply hadn’t got organised and done the written work, or in some instances tutors failed students whose documents had all been uploaded the night before.

    The required reflections included teamwork, the qualities of a good doctor and the first experience in the dissecting room. In third year we were actually required to reflect on previous reflections – pretty tedious!

    Moving to Manchester meant printing out hard copies of all our documents, as Manchester still has paper portfolios. The sections of the portfolio are GMC buzzwords: “probity”, “prescribing”, “management” and so on. I had my meeting with my portfolio supervisor last week. He flicked through, and passed me on the grounds that I seemed to have a lot of sheets of paper in there.

    I think that reflection is important – we learn from our mistakes – but I’m not sure forcing written reflections is the best format. I would much rather discuss my experiences and get useful feedback, learning points and support. It’s far too easy (and tempting) to “cheat” on the portfolio too! I can see the use of getting skills signed off, but I’m unsure about the other components of a portfolio. There also seems to be a real discrepancy between tutors’ expectations and standards when it comes to assessing portfolios.

    In the future portfolios should be online and interactive, and yes, there should be an app so that you can instantly log skills practice or record a prescribing error you have seen, etc.

  8. #10 by Natalie Silvey on June 29, 2011 - 10:05 am

    As a foundation year one doctor I am very familiar with the e-portfolio system. Before I started as an F1, one of my friends who was an SHO sat me down to go through the system, warning me that it would be the bane of my life at times.

    The portfolio itself is a nice idea, somewhere to keep all your assessments & certificates together at the very least. However I am not at all sure about how it works in practice. Sometimes it feels like a box-ticking exercise – the struggle to provide evidence for each competency, and the time spent on it, can detract from time that could be spent improving knowledge and skills (in my humble opinion – I have found e-learning modules that bridge this gap).

    I find writing reflections particularly challenging. I reflect every day as part of my ongoing practice, but having to write down my experiences hasn’t always been that useful. Also I find I shy away from writing about the most important things because I don’t necessarily want others to read about the things that have affected me personally.
    I also worry that the eportfolio really doesn’t distinguish between doctors – how much can you tell about someone’s performance based on it? During my interim validation when applying for F2 jobs, the part I cared most about was the assessment by my clinical supervisors; they worked with me and their feedback has been invaluable.

    I also agree with the comments above: if getting signed off for practical procedures continues in this way, are we ensuring that doctors have the skills they need? Does a tickbox saying you have been signed off as competent mean you actually are competent in real-life practice?

    Sorry for the long and rambling post, but it is something I feel strongly about. Doctors do need assessments and a record of their progress, but I am not sure the eportfolio always fulfils this.

  9. #11 by Docshannon on June 29, 2011 - 10:26 am

    I have used the NHS eportfolio for two years and have not enjoyed the experience at all. I agree with the concept but it is poorly executed.  

    The main problem for me is that it feels like the purpose is not to aid my professional development, though that’s how it’s been sold, rather it is an easy way for those above to tick boxes in an attempt to quantify my competency as a doctor. This is of course hugely flawed. Competency is more than just tick boxes and shallow assessments. It is not a quantitative measure. 

    Of the workplace-based assessments, I find the TAB/miniPAT 360-degree feedback results both useful and interesting. The rest are a complete waste of time, and salvaging anything from them is 90% dependent on how engaged your educational supervisor is (many find the eportfolio as rubbish as trainees do). They do not help my personal or professional development. They are almost universally hated, particularly the CBDs. I get far more out of talking to seniors and colleagues about cases as they occur and getting immediate feedback on my management decisions at the time. Also, the eportfolio doesn’t make me want to read more in my spare time. Sitting exams does. Wanting to do a better job does. Personal pride does. Interesting cases do. Ticking off curriculum items doesn’t. At all.

    The reflections section is cumbersome and cluttered. It is far too prescriptive to be useful. As professionals we are used to reflecting, indeed it is impossible to get through med school without being tested on your ability to reflect, so I don’t see the need for such structured forms. An alternative to the current setup? A brief paragraph of introduction with a free text box and space for links to papers, guidelines and learning objectives in order to make it useful for professional development.

    Another huge downside of the eportfolio is how disjointed the sections are. It is a real effort to link assessments with the curriculum and learning objectives, and then you can’t see how it all fits together. It’s immensely frustrating. Free iPhone apps manage to do it better. Why not show a Gantt chart of assessments? Why not integrate the sections and abide by the “three click” rule of websites? Why not use a format that is simpler, and therefore clearer, and gives more information to the user? Have FY trainees actually been involved in developing the eportfolio? It doesn’t feel like it.

    The logbook of procedures is a poor cousin to the surgical trainees elogbook or the anaesthetic gaslog. 

    I would like a section to record brief details of patient encounters, much like the RCP logbook. It would be interesting to look back on and reflect upon, and would come in useful for future job applications. I created my own using HandBase for iPhone but can’t link it to the eportfolio. There is no room for personal interest or lateral thinking in the portfolio, only curriculum ticking and assessments.

    Finally, it has no use past the foundation years. Professional logbooks change, and the printed format of the eportfolio is such that it is useless for reference in hard copy. Being able to export data to Excel or PDF in a sensible format would be better. Sadly CMT uses the NHS eportfolio. My heart sank at that news. There is no escape!

    My wishlist would be an eportfolio that was simple in format with a clear but functional interface (three-click rule in force) and that was truly aimed at professional development. It would link to an iPhone app and could be exported to Excel. I would be able to save links to papers and conferences into it easily. There would be no CBDs or mini-CEXs, just a few reasonable DOPS and two TABs per year. Anaesthetics & ITU are forging ahead with apps, databases and elogbooks. They can work. They can be better than the NHS eportfolio.

  10. #12 by ben goldacre on June 29, 2011 - 10:40 am

    interesting. i’ve always assumed that the idea of competence portfolios is to measure progress and spot problems. if that’s the case, then there’s one way that they can make things worse.

    in the past, a poorly performing doctor would be regarded as so by colleagues, imperfectly, through gist and reputation, and local systems might gradually lumber into action, jobs interviews might not go well, etc.

    now, since the competences being signed off inevitably don’t map perfectly onto real world skills (and since you can maybe get things signed off by exerting effort and trying many times with many people): less competent people end up with a sheaf of papers scraped together, actively showing that they’re competent.

    i don’t think the previous situation was brilliant, but it seems to me that the new one may be worse, by actively misleading people into thinking less competent people are competent.

    i’m sure there are other good things about competence tickboxes that i’ve not covered in this brief comment.

    but in general, it is always a worry, with imperfect systems, that the illusion of a solution to a problem – like measuring medical performance – can be worse than leaving people recognising that it is unsolved.

  11. #13 by David Andrew on June 29, 2011 - 11:02 am

    Interesting example of portfolios going wrong – in education http://www.youtube.com/watch?v=bguz4garGH8

  12. #14 by Russell Brown on June 29, 2011 - 11:25 am

    I have no direct experience of these monsters. However, although I can see putative advantages (and Ben’s comments are apposite), I view them as a poorly-evidenced mechanism to deprofessionalise medicine. They are a training-version of the “bare-below-the-elbows” nonsense.

    The amount of time I see my training colleagues spend with their GP trainees on the portfolios makes me cross. Those hours could more usefully be spent gaining experience of patient care. As far as competencies are concerned, many of the courses seem to be geared to a multi-disciplinary team, so much of the content is of little or no use to medics. “See one, do one, teach one” is inadequate, but tick-box checking is no better and may well be worse.

    But it shouldn’t be abandoned. Much better to examine what is not working about it and modify it so that it benefits all stake-holders (apologies for using that word). We need to move from perception to actuality. I am unsure how that can be achieved but at least if it is recognised, a direction of travel to facilitate the introduction of a better system could be explored.

    Then all you need is an agreement on the definition of the word “better”…

  13. #15 by Michael Swaddle on June 29, 2011 - 12:15 pm

    I comment as a Pharmacist who has trained other Pharmacists as a Pre-Reg tutor over 20 years.

    My feelings are well expressed in Ben’s comment “I don’t think the previous situation was brilliant, but it seems to me that the new one may be worse, by actively misleading people into thinking less competent people are competent.”

    I have always attempted to use various strategies to get students, and current Pharmacists, to look at the whole patient, not just the drugs, Rx or whatever their current focus is. Now all I get are requests to sign off their competency check box.

    In my experience my best students have the poorest records as they are busy doing their job. A full, complete and tidy portfolio raises the hackles of suspicion on the back of my neck.

    I don’t want to be Dickensian in promoting apprenticeships, but they had the one-to-one professional contact that exposes the true level of skills and knowledge. Could a “buddy” system evolve where two or three join together to spend a few short periods each year shadowing or working together, discussing their doubts, fears, knowledge and practice; an exchange between peers that can be free-flowing?

    As a pilot I’m used to this, twice every year, and it works well. We share, exchange, comment and demonstrate our foibles as well as competencies.

    I have seen no evidence of sincere doubts or uncertainties in any of the “Reflexions” that have been presented to me for comment. Usually they are self-laudatory about what they have “learned”.

    I hope an outsider’s view may help with your discussions.

  14. #16 by Ben S on June 29, 2011 - 12:56 pm

    Interesting, Catherine your medical school sounds awfully like mine! I’ve just finished and have recently had to join the e-portfolio world (I haven’t actually come across anyone that believes it, certainly in its present form, is at all worthwhile).

    My medical school recently introduced DOPs (directly observed procedures), which to me seem bizarre. I have offered many a more junior medical student a chance to do venous cannulation, only to receive the response “I’ve got signed off for that, do you think I could do an ECG?!” Having not had DOPs, I never had this mentality and did whatever I had the opportunity to do, and actively sought to do those things I hadn’t, for the benefit of my future patients.

    My medical school also had these things called LOBs (learning objectives), which I also chose to ignore but which some people follow religiously, and are distraught when examined on something that wasn’t in the “FLOB” (final LOB) book. (These are actually useful to some people to get a curriculum, but my philosophy is generally that medicine cannot be that inherently prescriptive, pardon the pun.) From DOPs to LOBs to QOFs or CEQUINs, we are steadily reducing doctors to box-checking automatons.

    Whether these things will have a net positive or negative impact on healthcare I wouldn’t like to say, but I know where I’d place my money (if it wasn’t all wiped out in debt).

    As Mr Cameron mentioned the phrase today, I shall repeat it. Bread and Circuses.

  15. #17 by Neil Mehta on June 29, 2011 - 1:10 pm

    Just to clarify, I think we need to be sure we distinguish a Portfolio that is a learning, reflecting, self-improvement process from a place or a thing or a shoebox.

    At our medical school program, we have a narrative formative 360-degree feedback system. Students receive over a hundred assessments each year, viewed only by them and their advisor, who is NOT involved in summative decisions. They write reflective portfolios citing evidence for their statements.

    We have seen a consistent trend over the 7 years of doing this. Students in year 1 tend to be anxious about the feedback, but by the time they graduate they actively seek out feedback and want to know what they can do better. Anecdotal evidence indicates that they feel they will take this reflective learning with them as they move on. I don’t think any student actually enjoys the work of creating a portfolio, but they are uniformly positive about the portfolio model at the time of exit interviews.

    Another concept to discuss is the EPAs (Entrustable Professional Activities). These are closer to the standards in a competency but might be more directly applicable to the apprenticeship model. Each EPA might map to more than one competency.

    ePortfolios don’t have to be huge cumbersome systems. We should not throw the baby out with the bathwater. A poorly designed UI of an ePortfolio system should not influence our perception of a potentially valuable tool. A simple blog shared with a few people might be enough. (it can be on paper! but then difficult to share). Regularly reflecting on what you encounter, writing about what you learn, can be the first step to becoming a life long learner. http://blogedutech.blogspot.com/2011/05/reflections-on-why-do-i-blog.html

    What are the negatives/issues/questions?
    1. Portfolios are hard work – both for learners and the educators
    2. Do we have good evidence that they work – using meaningful outcomes?
    3. How can we capture some of the richness of what happens f2f in the clinical setting? Do we need to?
    4. Are Portfolios useful/better only for certain “competencies” like professionalism or communication skills?
    5. Is it necessary to transfer the evidence and portfolios when going from med school to grad med ed and then on to practice?
    6. Do Grad med ed program directors have time to review portfolios when choosing their candidates?

    Look forward to the chat!

    • #18 by amcunningham on June 29, 2011 - 5:15 pm

      100 assessments per year? That sounds like a big assessment burden on staff and students. Is everyone really happy with this? Does anyone express the opinion that this is ‘hoop jumping’? If not then we need to know a lot more about how you manage to pull this off! Is there any chance that one of your students would leave a comment here to tell us first hand about their experiences?

      Most people here are not complaining about the tech or the UI (in fact as Ian writes in #20, the electronic version has clear benefits for mapping) but instead about the burden of assessment in the NHS foundation programme. And this has also been highlighted in the evaluation published last October.

      Click to access MEE_FoundationExcellence_acc_FINAL1.pdf

      I think ten Cate’s writing about EPAs is interesting work around competency and fits more closely with Eraut’s model of professional learning.

      Talking about portfolios not having to be big cumbersome systems.. have you seen the vision of the future portfolio in the US? http://dream.presentme.com/audio2011/20110510MedBiqGalbraith/
      from here http://www.medbiq.org/conference2011/videos

      I hope you can join us tomorrow. This comment thread shows the complexity, so it will be interesting to see what we can achieve in 1 hour of 140-character messages :)

      • #19 by Neil Mehta on June 29, 2011 - 6:20 pm

        Some day I’ll tell you how we manage to pull this off 😉
        Need multiple sources and #s of feedback to ensure validity of the data!

        Bob Galbraith and I co-presented at the AAMC (Association of American Medical Colleges) national meeting last year, with the others listed below.

        Group for Information Resources Plenary Session
        “Capturing the Continuum of Learning: Student to Practitioner Portfolio”
        (Start at 14:00 for my portion. Bob Galbraith is first speaker) http://mediasite.yorkcast.com/webcast/Viewer/?peid=63fc5d2cb49e498db157047128b10121

        Robert Galbraith (National Board of Medical Examiners)
        Neil Mehta (Cleveland Clinic)
        Linda Lewin (University of Maryland School of Medicine)
        Anderson Spickard (Vanderbilt University Medical Center)
        Freda Bush (Federation of State Medical Boards)

        • #20 by Ehsan Balagamwala on June 30, 2011 - 1:58 am

          Being a student in the program that Dr. Mehta has mentioned, I can perhaps provide my perspective on the portfolio system. Coming from a large, private undergraduate university that heavily focused on objective measures (grades, honors, etc.), it was definitely quite a culture shock. Providing and receiving written feedback from peers and faculty for the first time was quite an experience, because very rarely was I ever told “you need to improve on ”. Previously, all that mattered were letter grades, scores and honors, because that’s what gets students interviews at medical schools, grad schools, etc. However, over the last 3 years, I have learned a tremendous amount about how to work in a group and what my strengths and weaknesses are – something I would have struggled to learn in a program that was exam and grade based.

          I strongly believe that anything that’s worth having is worth fighting (or working hard) for. It would be incorrect for me to say that writing a portfolio or writing peer feedback is easy. To provide thoughtful feedback, one needs to spend a significant amount of time observing peers, thinking about how their strengths can help them overcome their weaknesses and how to tactfully provide constructive criticism about not only their medical knowledge but also their professionalism and communication skills. The feedback and portfolio system at our institution is definitely demanding, but its returns are invaluable. I have always been quite self-motivated, and that was one of the reasons why I chose to come to this medical school. However, the growth I’ve seen in self-evaluation as well as motivation would not have been possible without such a program.

          There is definitely no substitute for reflecting on one’s feedback and then reviewing self-reflections with an advisor (who is not involved in promotions decisions, but serves as an advocate and ensures that we do a fair and balanced self-evaluation). I have benefited most from the portfolio system when I’ve developed a focused learning plan rather than a generalized one. Looking back at my learning plans from my first year versus my third year, I have noticed that my goals have become more specific and more attainable. Rather than “learning the pathophysiology of all organ systems,” my current learning objectives are things like “refine my neurological exam technique to perform it more coherently and efficiently.”

          The only major downside of our program is that it requires a tremendous amount of manpower – as Dr. Mehta has written, I accumulate well over a hundred unique pieces of evidence during a given academic year. My faculty and peers have to write narrative feedback, and I have to go through it to pick up themes and identify strengths and areas for improvement. My advisor has to do the same concurrently. Then I have to write a portfolio several pages long, which my advisor needs to review to ensure that it accurately reflects the evidence I have accumulated. This then has to be evaluated by an independent committee to ensure that I have met all the standards. This process is definitely not easy, but as I’ve already mentioned, anything that’s worth having is worth fighting (or working hard) for. The ability to be a self-motivated, reflective physician is definitely worth working hard for.

      • #21 by Clark Madsen on June 30, 2011 - 5:37 pm

        I am also a student at the medical school mentioned by Dr. Mehta. I have had a generally positive opinion of our portfolio process. I do want to point out that the portfolio we use is hardly a checklist, but we have a set of competencies in areas such as medical knowledge, professionalism, and clinical reasoning. Evaluators are given the opportunity to comment on our work with individual patients and in a more generalized way.

        The benefits:
        - Personalized feedback that I can immediately use to change how I do and approach things.
        - No more unnecessary judgements or rankings (you are a superior student or a poor student).
        - Evaluations are enjoyable to receive because you can use the feedback to create a more complete portfolio of your work and character.
        - No more studying details to try to impress your evaluator. Your evaluator becomes your aide in understanding medicine.
        - Improved relationships with mentors. It is not them grading you, but them helping guide you.
        - I no longer fear showing my weaknesses. Since being weak in an area is not penalized, and not addressing a weakness is, I have no reason not to seek help in areas where I need it.
        – The biggest benefit is that I no longer spend my time trying to pass tests. I feel that I am in the pilot’s seat of my own medical education. I learn from my weaknesses, both self-evident and those elucidated by my evaluators. I have fellow students that spend all of their time memorizing certain details and questions because they will be on a test. I am free to learn medicine in my own way. It makes medicine so much more enjoyable.

        The downsides:
        – Getting evaluations from doctors that are very busy is difficult.
        – Getting quality evaluations (“Good job” isn’t a constructive evaluation and is no better than a grade)
        – Spending several days combining the evaluations into a coherent portfolio.
        – Getting evaluators to give you feedback in areas that you need the most help.

        In the end I think it is a very productive system for students who are willing to take responsibility for their education. We don’t really have checklists, and we do get hundreds of evaluations, but you get them throughout the year, so if you keep up with them and make changes as you go through the year they don’t get too cumbersome. Finally, when I get a letter from my school’s promotions committee saying that they feel I have been successful in the past year of medical school, I know it is based on a true weighing of my strengths and weaknesses and not a list of letters and numbers used to objectify my actions.

  16. #22 by Dan on June 29, 2011 - 1:44 pm

    I comment as a Paramedic, mentoring Student Paramedics.

    Portfolios are not new in the profession. As an IHCD trainee Paramedic I was required to keep a portfolio of exam results and competencies in intubation etc., which had to be signed off by an anaesthetics registrar or higher before I could be qualified.
    As the Paramedic role progressed, rightly, into the world of professional registration, it became a requirement for the first time for Paramedics to keep a portfolio of evidence of practice and Continuing Professional Development (CPD), which can be requested for viewing by the HPC. New Student Paramedics are issued with, in my experience, a paper portfolio system, in which all their formative and summative assessments are recorded. This includes “sign-offs” from other registered staff members, evidence of direct observation and their own “reflective practices” against the learning outcomes. As with any portfolio, if the student fails to complete the required amount they are deferred the year.

    With regard to competence, the students are expected to complete a recommended minimum number of sign-offs against specific skills, e.g. the Primary Survey. The nature of the job results in some skill areas receiving less attention, such as traction splintage or paediatric assessments; in these cases the student can demonstrate competence in a simulation environment. However, despite the number of signatures, the Mentor still has the overriding decision as to whether the student can be deemed competent. This, so long as there is not a failure-to-fail mentality, helps to prevent incompetent students progressing, especially as we sign against our registration!

    Whether you are a student paramedic or a trainee GP, portfolios could be used to contain ALL evidence of your practice, including reflections, plaudits from colleagues and CPD, not just competency signatures. However, as has been mentioned above, a full, good portfolio doesn’t guarantee a good, well-rounded professional.

    To close, I am in favour of portfolios as an AID to professionalism, but not as the sole evidence of a competent practitioner.

  17. #23 by carl heneghan on June 29, 2011 - 2:19 pm

    I’m from the era before portfolios. In the hazy days when you turned up, looked for teaching opportunities and then passed your finals. Hoping you’d been around the wards for some consultant to at least put a name to your face and possibly give you a reference.

    Upon qualifying, the sole competency was a piece of paper from your consultant to say you were ‘competent’. Therefore I never undertook portfolios. Yet I’m now entrenched in appraisal and revalidation. From previous comments it seems to me it’s based on the number of pieces of paper you collect or the forms you’ve filled in and produced. Possibly it’s medicine by sheer weight of volume.

    My perception is the current system is burdensome and all about jumping through hoops. This has the potential to turn training into an extension of medical school.

    Some aspects of medicine are obviously competency based and it’s helpful to show that you are up to the job. Therefore, if you are a surgeon you need to be competent to be left to your own devices. But requiring numerous competencies to be assessed is deleterious. For instance, showing you are competent to take a blood pressure reading is nonsense.

    The evidence for reflection is actually miserable but we are all supposed to be reflective practitioners.

    Here’s what I do. I collect all the ‘Thank You’ statements I get, whether by email, by letter or personal ones.

    I roll ‘em out each year and reflect on them.

    On the odd occasion I also roll out the bad uns.

    • #24 by amcunningham on June 29, 2011 - 4:48 pm

      It’s not only the evidence that is questionable, we should have critical perspectives on the theory as well. As Eraut states “The term ‘reflection’ is now in such common use in professional education that there is considerable danger of it being taken for granted, rather than treated as problematic. ”
      http://onlinelibrary.wiley.com/doi/10.1111/j.1473-6861.2004.00066.x/full

  18. #25 by Ian on June 29, 2011 - 3:06 pm

    I won’t be able to join the chat on Thursday evening, but I’m going to add some comments from both sides of the fence: first as an ex-trainee and then as a trainer/assessor.

    I managed to avoid the e-Portfolio during my training, partly because anaesthesia does not yet have an e-portfolio (until August 2011). All the information that I had to provide each year had to be recorded in a large Word document, with several attachments of logbook, consultant assessments and MSF summary, and then sent to the Deanery. The “evidence” – attendance certificates, audit reports/presentations, WBAs, module completion certificates – all had to be collated in a folder and carted off to the RITA panel, as it was then. Most of the initial information in the form had to be reproduced or copied into new documents year after year; the paper portfolio was rarely looked at in detail. Most of the time it was a scramble to locate where the certificates had been put and print presentations so that you had something new in the portfolio.

    I can therefore see the benefits of an electronic portfolio: fewer bits of paper to lose, be these WBA forms or certificates. Yes, they may need scanning and uploading, but lots of places now send PDF certificates anyway. The ability to map assessments, courses, meetings and reading to the curriculum directly, rather than having to map curriculum codes onto a paper form, must be better. It also allows trainees and trainers to see where evidence is lacking, or perhaps that there is a large area of the curriculum that you have not experienced or need more experience in. It then comes down to how the portfolios are set up. I am not a fan of the FP, Medicine, or ACCS (EM) portfolios – these are set up as the whole curriculum subdivided, and evidence is expected under each subsection. I am hoping that the anaesthetic portfolio (which I have seen the test site for) will remain with just the main subject headings which require evidence, and the guidance document which states “The purpose of WBA is not to tick off each individual competence but to provide a series of snapshots of work from the general features of which it can be inferred whether the trainee is making the necessary progress”. Hopefully this will prevent training becoming a tick-box exercise and, as people have previously commented, “I don’t need to do that, I’ve already had it signed off”.
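    For illustration of the mapping idea above, here is a minimal, hypothetical sketch of how evidence items tagged with curriculum codes could be grouped to show where coverage is thin or missing. The codes and evidence entries are invented examples, not taken from any real curriculum or portfolio:

    # Hypothetical sketch: group evidence items by curriculum code to expose gaps.
    # The codes and items below are invented, not a real curriculum.
    from collections import defaultdict

    curriculum = ["1.1 History taking", "1.2 Examination", "2.1 Airway management", "3.4 Audit"]

    evidence = [
        {"type": "CBD",  "code": "1.1 History taking",    "date": "2011-03-02"},
        {"type": "DOPS", "code": "2.1 Airway management", "date": "2011-04-18"},
        {"type": "CBD",  "code": "1.1 History taking",    "date": "2011-05-20"},
    ]

    def coverage(curriculum, evidence):
        """Return each curriculum item mapped to its supporting evidence (possibly none)."""
        by_code = defaultdict(list)
        for item in evidence:
            by_code[item["code"]].append(item)
        return {code: by_code.get(code, []) for code in curriculum}

    for code, items in coverage(curriculum, evidence).items():
        status = f"{len(items)} piece(s) of evidence" if items else "NO EVIDENCE YET"
        print(f"{code}: {status}")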

    From a trainer/assessor point of view the e-Portfolio has advantages and disadvantages: you need a computer and an internet connection, and that connection needs to be fast enough not to crash the system whilst you are using it. As an Educational Supervisor it allows you to see what your trainee has been up to, or not, specifically with regards to WBAs, for which most specialties have a minimum requirement. It also allows a record of meetings to be made – possibly not important for trainees progressing well, but useful for trainees with differing needs – and therefore other supervisors, if trainees rotate hospitals, can see the issues that have gone on before.

    With regards to assessment (ARCP), an e-Portfolio can be reviewed in advance of the panel meeting by more than one assessor. This means that the trainee has had a fair assessment: all the evidence presented can be reviewed, and outstanding areas or areas of weakness can then be reviewed and discussed by the whole panel. Mandatory components specified in the curriculum do not get missed. A paper portfolio cannot be reviewed by two or more people simultaneously, certainly not in as much detail.

    People above have commented on the tick-box nature of WBAs. I agree that the typical scoring system on most of these forms is unhelpful; the free text boxes are much more important, especially if the person being assessed has not performed to a satisfactory standard – they give more weight to the assessment, and in the case of curriculum mapping are extremely helpful. When assessing portfolios it is very difficult if all a WBA form has is ‘satisfactory’ in all the boxes and no comments. Anaesthetics & ICM now have only ‘satisfactory’ or ‘unsatisfactory’ and free text boxes, which should be completed.

    I am in favour of e-Portfolios, but they need to be tailored more to the needs of the people using them rather than being one size fits all.

    COI – Involved in the implementation of the RCoA e-portfolio in my Deanery.

    • #26 by amcunningham on June 29, 2011 - 5:40 pm

      Really great insights here. Thank you!

  19. #27 by Malcolm McKenzie on June 29, 2011 - 4:36 pm

    What happens with the next Shipman? Will WPBA be traced back to find who signed this doc off? I’m not saying it’ll happen, but one thing that is traceable is accountability and with a public that sometimes brays for blood you have to consider what the web of paper we leave behind for our juniors may be used for.
    Just a Daily hate-Mail alarmist thought……

    • #28 by amcunningham on June 29, 2011 - 4:41 pm

      Well, yes, I am sure the trail would be looked back upon to make sure due process had been followed. But that doesn’t seem a reason not to implement a more traceable system to me. Why should it be?

  20. #29 by Malcolm McKenzie on June 29, 2011 - 4:42 pm

    Should qualify my last comment: obviously we shouldn’t sign people off if they’re not competent, but with a population of 100,000 doctors or so there will always be a few who are incompetent or dangerous or crooked or drugged up and do slip through the net. When they’re found out there’s always surprise from the people who worked with them (in my experience commenting that some of them had been ‘quite good, always caring’), but who wants to have that in writing after the event of discovery?
    More thoughts, please feel free to discard and I’m aware it’s a bit off topic
    Humbly

    • #30 by amcunningham on June 29, 2011 - 4:51 pm

      But there was always some form of ‘signing off’, as Carl Heneghan describes above. In an apprenticeship model, if you serve your time and get through without incident it is assumed you are OK. But a competency model is about demonstrating that competency. The question is: do the current systems increase the validity and reliability of ‘signing off’ as competent?
      Your comments aren’t off topic at all, and in some ways get to the heart of it all.

  21. #31 by Sean Williamson on June 29, 2011 - 5:31 pm

    Competency: well, it’s halfway to expertise, and as such seems a fine aspiration with a view to the bigger goal. Tooke says it all in three words: Aspiring to Excellence. The current system does not, as a rule, encourage either the Trainer or the Trainee to do this. This is because of the way it is used and not because of a fault in its ideology.
    Some Trainers and Trainees actually get it, and I have seen some fantastic evidence used, especially in the Foundation NES platform ePortfolio. There is a demonstration of the whole learning cycle – doing, reflecting, constructing, planning and doing it again – all with the benefit of great trainer feedback, or should it really be feed-forward?
    However, the general message of the system seems to be: just get the box ticked. My 17-year-old son is a competent driver. Need I say more? This reminds me to book the car into the garage for its respray…. Anyway, I will continue to construct my thought process ahead of Thursday’s discussion.

    • #32 by amcunningham on June 29, 2011 - 5:39 pm

      This is a really good point – any technology, any tool, can be used in multiple ways. It’s easy to blame the platform, but if people are motivated most platforms can be used well. It’s great if the system facilitates that, of course.
      How can we help other trainees and trainers to get it? We need some positive stories from all parties to show that this is not just about hoop-jumping.

  22. #33 by Toby Hillman on June 29, 2011 - 10:34 pm

    The comments above are fascinating and insightful. There are a couple of points I would like to make on the subject (attendance at the chat tomorrow not possible)

    Regarding the emphasis placed on e-portfolios by our trainers: in the ‘guidance’ that sometimes flows down from the Deaneries, compliance to the letter with what needs to be provided at ARCP/RITA etc. is fairly doom-laden, with dire warnings about failing if it is not fulfilled. This is completely at odds with the importance many educational supervisors and panel members seem to put on the assessments and records once they are submitted. In this environment, is it any wonder that trainees hate filling out assessments – they seem to mean nothing, do nothing to advance one’s career or experience, and are carried out to avoid censure at the next set-piece appraisal.

    If WBPAs were more often filled in with true meaning and intent by the assessors (as in, not everyone is above average for their stage of training – let us grow up a little and admit to ourselves and others that people will not be perfect at everything all of the time), then I think they would be seen as more useful in informing future learning needs. As they stand they do appear just to be ‘ticks’.

    The point above about the audit trail reaching back to assessors is important – we should not let our standards drop simply because someone ‘needs a CBD for next week’

    Secondly – the e-portfolio as it stands is too clunky, complicated and slow to use. If these tools are to become truly useful, we need apps or similar which allow very quick access to the assessments, done as close as possible to the learning event, to make the lessons applicable and relevant – and not biased by a dim recollection of something that might or might not have happened. By being distant and cumbersome, the portfolios are a millstone, instead of something which could actually improve people’s ability to demonstrate excellence.

    Finally, a question for tomorrow – what would the effect be of having no minimum requirements each year? Would trainees then use the portfolio as a way to demonstrate excellence rather than minimum attainment? If WBPAs were taken seriously and effectively became mini-references in themselves, and were regarded as being as important as references by trainers, would that increase the meaning of the assessments and result in higher-quality, more rigorous feedback which could actually serve to improve people’s practice, rather than the all too common line of ticks down the ‘above average’ or ‘satisfactory’ column?

  23. #34 by Tauseef Mehrali on June 30, 2011 - 9:11 am

    The number of comments (and their wonderful content) more than justifies the blog post – great work!

    Just a few thoughts on the post and subsequent discussions:

    1) I have known nothing but portfolio-based training (house job, 2-year paeds SHO rotation and finally GP VTS). Although incredibly burdensome, I took it seriously (I think), and the 2 posts where my trainer seemed to as well were very rewarding.

    2) By far the most valuable learning relationship during my training was the year-long GP registrar apprenticeship. I’m not sure anything can replace the frequent f2f interaction, let alone capture its content.

    3) Portfolios, as mentioned previously by Ben and Toby, don’t seem to inspire excellence, especially when minimum requirements are set. They are often clunky, disjointed and not geared towards capturing the nuanced, intangible and spontaneous.

    4) The most recent e-portfolio incarnation I am contending with is the NHS appraisal toolkit – a UI monstrosity. Thankfully it does away with any minimum requirements, but it repeats the folly of tedious and difficult data entry and capture, which forced me to tweak my delicious.com bookmarking to reduce the pain.

    5) I think means of encouraging spontaneous rather than enforced ‘reflection’ would be beneficial, as it’s often the data entry troubles that discourage trainees: iPhone apps, real-time recording of learning (via delicious and TILT). More free-texting too?

    6) The portfolio system should be complemented by regular trainer/mentor time too, though, and not regarded as a substitute. This is HR-intensive, but it gives an incentive to keep the portfolio up to date, and provides a forum for fleshing out the entries and exploring new learning opportunities.

  24. #35 by ffolliet on June 30, 2011 - 9:45 am

    Portfolios are a brilliant idea that, effectively used, will lead to the development of an increased number of more rounded, effective, educated, reflective, developed clinicians. The educational theory is sound, the practice is appropriate and the dedication of time is completely reasonable.

    Sadly the vast majority of portfolios are a waste of time, resources and enthusiasm.

    This is due principally to a lack of full engagement with the process by the majority of trainees and trainers. The reasoning behind this is of course complex, but includes poor communication of the process, lack of education in using the process, established excellent practice being effectively ignored, laziness, time constraints of busy jobs and rotas, ignorance and the standard resistance to change. Changing this attitude is a huge challenge that needs effective engagement rather than the current setup.

  25. #36 by Vanessa on June 30, 2011 - 5:08 pm

    Opening my GP VTS eportfolio fills me with dread and a desire to be doing pretty much anything else. I know there is a similar feeling amongst many others.

    What I find useful is the discussion with my trainer and colleagues, rather than regurgitating and typing forced emotions to show that I have adequately reflected. The temptation then is just to write what you think they’ll want to hear rather than uncovering any new thoughts. The reflection takes place in my head and in discussions, rather than in set boxes online.

    I can see that multisource feedback and patient surveys are useful, as is a note of any particular learning points you want for future reference. However, I can’t see the relevance of getting DOPS for procedures we were assessed on as medical students, such as ECGs, nor of a major emphasis on emotion-based reflection unless it suits your personal style of learning.

  26. #37 by Suparna Das on June 30, 2011 - 6:17 pm

    Hi all – just managed to read through all the insightful comments prior to the #meded tweetchat tonight. My £0.02 worth:

    1. Never had to do an e-portfolio as an anaesthetic trainee although had to maintain a logbook and keep a hard copy folder of all my assessments, courses etc for SpR RITA. As mentioned above, RCoA is ahead of other specialties when it comes to e-logbook, e-portfolio etc
    2. RCoA was one of the first Royal Colleges to introduce competency based training and assessment – again I would say this was progressive of them. I even wrote an essay on this for my MSc (Medical Education) …
    3. Bloom’s taxonomy suggests that there are three types of learning: Cognitive (knowledge), Psychomotor (skills) and Affective (attitudes). This is the taxonomy the RCoA has adopted. I’m not sure how the F1 and F2 e-portfolios are structured but the above is a good structure underpinned by learning theory.
    4. My impression from reading the comments above is that portfolios or e-portfolios are a good idea but have probably been badly implemented
    5. Did the end-users have any input into the e-portfolio design? With any IT project, small or large, it is essential to involve end-users right from the design and development stage or it ends in failure. Information systems are socio-technical systems – so ignore the social/human/cultural aspect at your peril. ‘Bring in the IT and the change will follow’ doesn’t work.
    6. ‘Safe pair of hands’ or ‘good egg’ isn’t an objective assessment. Neither are references from a professor emeritus. Literature and studies from human resources (HR) research show that semi-structured interviews plus assessment centres have the best predictive validity for employability and future performance. The predictive validity of references is the same as that of graphology, i.e. 0. Anyone for studying a doctor’s handwriting to employ them?

    • #38 by amcunningham on June 30, 2011 - 6:41 pm

      Thank you – hmm, I think there is a difference between using an interview to give a global assessment of a potential candidate after 1-2 hours and the longer contact that semi-structured interviews and assessment centres give. Mightn’t it be that simply more time with the candidate leads to a better assessment?
      Global assessment of someone who has been supervised for several months is different. What is the evidence that the ‘snapshots’ provided by portfolio entries provide a better assessment than this global assessment based on much more time spent together?

  27. #39 by David Colquhoun on June 30, 2011 - 7:13 pm

    An enormous and expensive quango called Skills for Health provides ‘competencies’ in “distant healing” and every other form of nonsense. Nobody takes it seriously apart from HR people who wouldn’t recognise quality if they fell over it.

    The effect of these box ticking exercises is not simply to waste time, but often also to rubber-stamp poor quality. Just look at the pathetic performance of the QAA who will endorse any sort of rubbish if they can tick a few boxes. The saga of the University of Wales is as good an example as any http://www.dcscience.net/?p=3675

  28. #40 by DJ on June 30, 2011 - 7:28 pm

    Portfolios do not identify bad doctors.

    The doctors with the lovely, completed portfolios, who reflect more than a hall of mirrors, are the ones who sit in the Mess filling them in. They couldn’t manage an MI if it slapped them in the face and said “I’m an MI, here’s a book on how to manage me”.

    Portfolios only identify the doctors who don’t do the paperwork, or don’t manage to meet the ever-changing goals laid down to them by the Deanery.

    I know of several excellent doctors who have not passed their ARCP solely because their portfolio wasn’t completed in the way the educationalists deemed appropriate.

    Whilst there probably is some benefit in keeping a portfolio, using it as an assessment tool, or – even worse – as a stick to beat us with, is completely inappropriate.

    The educationalists and those with a vested interest in keeping the charabanc on the road seem to think it is a good idea; I have yet to meet one “real world” end-user (either assessor or assessee) who finds these other than a waste of time, effort and money.

    We should expect that they’re here to stay, if only because we have no choice: we are told they are good.

  29. #41 by rodric Jenkin on June 30, 2011 - 7:30 pm

    Well, I have found the experience horrendous. I echo Ben Goldacre’s comment. Post-Shipman it was realised that “something must be done” to ensure doctors are adequately trained. What we have is ‘something’. It is almost certainly not the right thing or a good thing.

    In many ways it is, I believe, causing a tremendous amount of harm.

    The ‘medical educationalists’, I think, need to consider very carefully every intervention they instigate, and specifically the time juniors have to expend to do it. Any new intervention should in itself aid learning and should take the minimum possible administrative time. Work-based assessments are the least of the evils of the eportfolio, although insisting I learn by WBAs is very prescriptive and fails to acknowledge that different learning methods suit different people.

    The true evil is ‘linking competencies’ to your curriculums. This is something that, particularly in medicine, I find myself spending hours and days of my life doing. I gain nothing from the process educationally, and it is, on even the briefest analysis, absurd…

    I see a patient with chest pain, get a CBD, and perhaps see another patient on take as part of my on call and get an ACAT. A completely SEPARATE consultant (my supervisor) looks at this ‘evidence’ (a terrible misuse of a word – we really should write ‘anecdote’) and designates me ‘competent’, whatever that means. I do this for every point on my 5 curriculums as I try to train in geriatric medicine, general medicine and stroke. This time-consuming process is absurd and would probably be deemed too unscientific to be used as the assessment method for a diploma in homeopathy. Worse still, it wastes my training time, wastes ALL the time I spend with my educational supervisor and paints a totally false picture of people’s abilities.

    The interesting question of whether a “competency model is required by the wider society” is raised at the top.

    The rationale for competency-based medicine and work-based assessments is fascinating. Obviously competency-based medicine doesn’t ensure all doctors are competent, and the JAMA meta-analysis was fairly critical of existing data (although this is all so qualitative that I remain far from convinced an “evidence-based approach” to medical education will ever be anything other than absurdly biased qualitative studies of poor quality).

    Another potential reason touted for competency-based medicine is to ‘reassure the public’. I would be surprised if society needs huge amounts of bureaucratic paperwork to reassure them that doctors are well trained. I think that’s a patronising approach. We should instead say that we will use any intervention that aids learning or accurately assesses ability, but that we need to respect doctors’ training time and give them freedom to learn. We should actually assure the public that we WON’T waste doctors’ training time by making them do pointless bureaucratic tasks.

    Finally, some people suggest deaneries and medical schools want this system to provide legal protection should something go wrong with a trainee of theirs. First of all, there is no case law or guarantee it would work in protecting them. Secondly, wider society does not benefit from doctors expending thousands and thousands of man-hours on a pre-emptive medicolegal defence rather than reading journals and learning medicine.

    It is a terrible thing on so many levels. Certainly, if a quarter of the time spent by PMETB/JRCTB on creating curricula and introducing new assessments were spent actually considering how to improve training (rather than improving the appearance of a trainee’s portfolio), medical education in the UK might not be sliding into the abyss.

    • #42 by maturinuk on June 30, 2011 - 9:35 pm

      I think you make good points. The process now demands written evidence (anecdotal or otherwise) that one has done such and such to so and so with the outcome, followed by references if possible to justify the approach, preferably referring to approved (local) guideline(s). This was not how I was educated and trained as a doctor nearly 25 years ago.

      While I have not seen the obituary for the profession of medicine, it is certainly not far away. This is a shame, and few of the elite of our profession over the past 2-3 decades are available to account for what they have done for us, the unwitting guinea pigs of their great social experiment.

  30. #43 by drlj on July 8, 2012 - 6:42 pm

    This is such an important discussion to have. Great to see lots of contributions. I have a few short thoughts:
    1. The NHS ePortfolio is not fit for purpose and needs investment, trainee and supervisor input and a complete overhaul
    2. The validity of the WBAs, and the effectiveness of the current system to identify failing trainees and excellence needs to be properly examined. Trainees should not have to sacrifice their time doing things that are not of educational value.
    3. The word competence makes me want to vomit. Of course we need doctors to be competent in the required skills and knowledge, but competency is the lowest possible level of attainment. We need to strive for excellence and model this to trainees. There is a huge problem with demoralisation among current trainees, and the system is fuelling this. The phrases “jumping through hoops” and “tick-boxes” come up incredibly frequently. I rarely hear “inspiring discussion” or “valuable feedback.”

    Please see my hastily put together blog http://www.nhseportfoliorevolution.wordpress.com and leave lots of comments

  1. Portfolios and Competency: Pre #meded chat « Med Ed Connect
  2. 30 June #meded chat transcript: Portfolios « Med Ed Connect
