Tag Archive: Education
If you’re interested in learning about wine, this will surely be a great course.
James Flewellen, one of the course organisers and tipped for great things by wine writer Jancis Robinson, writes at The Oxford Wine Blog which is also definitely worth checking out for a simple, no-nonsense approach to appreciating wine.
via Outre monde
After more than a year of enjoyable personal blogging on WordPress, I’m proud to announce a new blog: Get into Medicine UK.
Eagle-eyed readers will have noted the appearance of its link in my blogroll here this week, but I’d like to take a moment of your time to introduce it properly. As some of you know, I run a course helping prospective applicants Get into Medical School, explaining the complexities of the application process and the exams & interviews they face.
A key thing that students need to be able to demonstrate, both in their personal statements and interviews, is enthusiasm for the subject and an awareness of ongoing ethical & controversial issues in the field. Of course, you’d hope they’d be interested in keeping up to date for their own interest too, but it’s sometimes difficult to find time for this in addition to all the other work they’re doing.
Get into Medicine UK aims to help students maintain this awareness and enthusiasm by collecting some of the more interesting health & medicine stories from around the web into one easy-to-follow place. It also includes prompts for some of the ideas and questions the stories raise, which students might want to think about before their interviews. And best of all, the site is entirely free, with no restricted/pay content.
Of course, a lot of health news has relevance to a wider audience too, so even if you’re not in the health profession or trying to get into it, you should find something interesting to read there too.
Never fear, Beyond Anomie will remain my personal blog, but the great thing about WordPress is the flexibility of its platform and I’m excited about using it for Get into Medicine UK.
Thanks for your ongoing support, and do please spread the word to anyone you know who might be thinking about a career in Medicine!
‘I will build the Zodiac…
The lives of men,
from birth to final destruction,
shall be controlled
by the hidden workings of this mechanism.’
Destiny and Necessity are cemented together.
Destiny sows the seed.
Necessity compels the result.
Few can escape their fate
or guard against
the terrible influence of the Zodiac…
the rational part of a man’s soul
… the working of these gods is as nothing.
But such men are few.
Most are led and driven by the gods
which govern earthly life…
To my way of thinking, however,
it is our duty not simply to acquiesce
in our human state,
but, through intense contemplation
of divine things,
to detach ourselves from our merely mortal nature.
Hermes Trismegistus in The Hermetica, as translated by Timothy Freke & Peter Gandy
Borrowed from the library of Dr Neel Burton
Compiled thousands of years ago at the time of the Pharaohs, is there any more prescient statement of Man’s current condition?
Beyond the mortal influences of genetic nature (Destiny) and environmental nurture with its unconscious influence on behaviour (Necessity), lies choice.
It is up to us to choose to think, to act freely, to gain insight, and so also gain Enlightenment.
Few make that choice.
Our recent ferociously cold snap paused for breath today, allowing for a Sunday stroll through Oxford on a sunny and bright afternoon. After the persistent subzero temperatures of the past week or so, today’s several degrees above freezing seemed balmy by comparison. It was also a window of opportunity to relieve a mild case of cabin fever since the forecast is for temperatures to plummet again tonight. On the walk home, I was struck by just how many young people were roaming around town, clutching holdalls and consulting maps.
The reason for all this activity is that tomorrow marks the start of the annual interview season, when over 10,000 nervous applicants descend on the University over the space of a fortnight, to be grilled by their potential colleges as to whether they will be offered a place to read the subject of their choice. Their odds are actually pretty good by the time they get to the interview stage (a lot of the whittling down having been done at the shortlisting stage) but the nervous expressions etched onto their faces are understandable, as they feel their entire lives depend on how they perform over one or two 20-minute interviews.
It took me back to how I felt when I was in their shoes, which is now rather more years ago than I care to calculate. That year also had particularly bad weather, with snow falling heavily on the day I had to attend. Fortunately the car made it through the inclement conditions and I was able to settle in comfortably before interviews the following day. I was still rather tense but the interviews themselves were surprisingly enjoyable, believe it or not, and it was a relief to feel I’d done my best. A relief which turned to delight a couple of weeks later when I was offered a place.
Nowadays, I find myself in the position of helping young people learn the skills they need to get through the application process, and in fact next weekend I’ll be teaching an interview skills workshop as part of the Get into Medical School courses I run with a friend. It’s always an enjoyable course to teach, partly because we have time to offer each attendee a full-length mock interview, so we can give them feedback on what they do well and the areas they might want to consider improving. That lets us get to know them a bit better than on our regular one-day GeMS course which gives more of an overview of the entire application process. It’s a nice feeling to know that you’ve made a positive contribution towards helping someone talented achieve their potential.
As part of that plan to try to offer something a little extra, we’re also running a Summer School next year, so that international potential medical school applicants (and others who find it tricky to travel to our one-day courses) can also get the benefit. The Summer School will let them combine getting that advice and coaching with the chance to experience what it’s like living in an Oxford college for a week.
It’s not always possible to predict where life will take you. When I first came for interviews here, I had no idea I’d still be living in this city many years later, let alone that I would be helping others get in. The illustration to this post is taken from the much larger fresco of The School of Athens, by Raphael, which is one of four frescoes dominating the Stanze di Raffaello in the Apostolic Palace of the Vatican City. It is thought to feature every major Greek philosopher, though the identities of some remain uncertain. Plato points to Heaven, Aristotle to the Earth, reflecting their different philosophical priorities. Thus, not everyone will find the same path to wisdom, but the life task facing us all remains the same: finding what is best in ourselves by developing insight, maximising that potential through thought and study, and growing from the experience.
The oldest extant statute in English law is the Distress Act of 1267, specifying as it does the requirement for plaintiffs to pursue claims for damages (“distresses”) through the courts, rather than attempt to extort them by other means.
The Distress Act is part of the Statute of Marlborough, issued by King Henry III. While the Distress Act contains the only laws not since repealed or superseded in the intervening centuries, other elements of the Statute of Marlborough took a long time to disappear from our legal system.
One such area was Benefit of Clergy, a provision originally enacted by Henry II, grandfather of the aforementioned Henry III. Prior to Henry II’s reign, English courts were presided over by both a representative of the Church and a representative of the state. Henry II was nothing if not a builder of centralised state power and disliked what he perceived as an infringement of ecclesiastical authority upon secular matters of crime and punishment. His creation of law courts without Church representation led to a power struggle with his former friend Thomas Becket, by then Archbishop of Canterbury but known to all schoolchildren since as a turbulent priest killed by miserable drones.
Becket in death achieved something he failed to during life. The public outcry over his murder was such that Henry was forced to permit the Church its own separate ecclesiastical courts, and any member of the clergy could choose to be tried under this system instead of by the secular state. This was much to the accused’s advantage as the ecclesiastical courts were generally far more lenient in both their trials and their punishments.
To gain this Benefit of Clergy, a priest had to demonstrate that he was indeed a man of the cloth. Initially this was done quite literally, by turning up appropriately garbed in religious clothes and with a tonsured head. This was later broadened into a literacy test, which allowed educated laymen to also be tried under ecclesiastical court authority. The literacy test chosen was, appropriately enough, the ability to read from the Bible. Technically any section could be chosen… but the English legal system has always had a fondness for theatricality and ironic whimsy; Psalm 51 became the perennial favourite:
Miserere mei, Deus, secundum misericordiam tuam
O God, have mercy upon me, according to thine heartfelt mercifulness
Despite being watered down over the years, Benefit of Clergy was not fully expunged from the English legal system until the relatively late date of 1823. But conflict between Church and State authority remains. The protection from the criminal justice system afforded to paedophile priests by some in the Catholic Church, the growing use of Islamic Sharia Law even in secular Muslim states, the increasing encroachment of fundamentalist values into politics in America, the tension between secular and religious elements in Gaza and the West Bank, Hindu nationalism in parts of India… I could go on, but the list will be all too familiar to those reading this entry.
I do not think there is necessarily something specific to religion that leads to these tensions. I prefer to think of the conflict as something to be expected when two competing spheres of authority attempt to exert power over the same system. While religious faith can give its adherents a fixity of belief that can sometimes override logic, the same is true of anyone with a strong belief system whether religious or secular. The political world – and any pub by around 10pm – is full of such people.
The lesson to be learnt from the ever-present conflict between religious authority and secular power is that a nation can never truly be ruled by one element alone; Man is neither wholly rational nor wholly emotional, but a complex amalgam of the two, with the balance continually shifting in response to intrinsic and extrinsic variables. A shrewd powerbroker knows when to appeal to one and when to use another, just as the wise man knows when to use logic and when to trust his instincts.
After all the hype and debate, today was finally George Osborne’s chance to stand up as Chancellor of the Exchequer and spell out how the UK intends to balance its books. £150 billion of borrowing every year, running at around 11% of GDP, was unsustainable and now the cuts have to be made.
I suspect he probably had mixed feelings as he stood up at the Dispatch Box to announce the detail. On one hand, no Chancellor of any party likes delivering bad news; it tends to result in lost votes, after all. And he must have felt frustrated to have been placed in this dire financial, and electorally unpopular, position immediately after the general election. It’s a very different position from the benign macroeconomic landscape of 1997, when the reins of power last changed hands.
On the other hand, there’s nothing like being in control of the decision-making process, and the current environment does offer some rare opportunities as well as risks. The last 10+ years have seen ever-increasing public expenditure, with a corresponding enlargement of the public sector. To those who believe that the state should only do what the private sector cannot or will not, the massive and unsustainable budget deficit offers the opportunity to kill two birds with one stone: rebalance the books, but also rebalance the role of the state.
It’s worth noting that the average cuts of 19% across the five-year Parliament are actually not that different from – even a touch less than – the proposed 20% cuts hinted at by Labour back in March of this year, before they lost the General Election, despite the howls of protest from their benches today. The difference is not the headline figure, but the nature of the cuts and their presentation.
In this light, the Comprehensive Spending Review and today’s announcements may allow Osborne and the coalition government the chance to alter the accepted paradigm of British political thinking, in a similar way to Attlee in the post-war period or Thatcher in the 1980s. They hold out the prospect of effecting not just an economic change, but a sociocultural one too.
The cuts are necessary to balance the books. But the method of cutting and its presentation is about rebranding the role of the state versus the role of the private sector. It is a bold attempt to redefine the social and cultural framework of the country. As President Sarkozy is finding out in France, whenever politicians attempt to alter cultural paradigms in this manner there is usually a significant backlash from entrenched vested interests in the status quo. If politicians are to succeed in their aims, they must acknowledge the resistance by consistently and clearly selling their message directly to the general public.
Politics is not all that different to advertising; those with the most effective and convincing message tend to gain market share. And the reward of gaining market share in this context is not just a few transient extra votes in one particular electoral cycle, but altering the terrain of the battleground for many cycles to come. The post-war Labour government created the welfare state, nationalised industries, and in doing so, set the terms of engagement for the next 30 years. Margaret Thatcher in the 1980s, by taking on the unions and privatising industry, set the terms of engagement for the next 30 years. Tony Blair was forced to fight on her territory even while winning landslide Labour party victories in 1997 and 2001. Blair never fundamentally altered those terms of engagement during his time in office; he was too careful about losing votes to take risks (with the sole exception of his military interventionism). Brown, being less attuned to the electorate’s sensitivities, might just have managed such a shift had he won the 2010 election.
Instead, Osborne has the opportunity. It is a risky route to take; massive public unpopularity is almost guaranteed in the short-term, and if the budgetary medicine does not work to reverse the deficit by 2015, that unpopularity will be sustained, probably resulting in a change of government. But the potential reward is massive: restoring fiscal stability and a sustainable approach to government spending, and perhaps even more importantly, altering the basic sociopolitical cultural landscape for years to come.
Those on both sides of the political divide recognise this. Expect the fireworks from today’s announcements to last way beyond the 5th of November…
In the light of yesterday’s story about plans to permit the charging of higher fees for university tuition in England, it’s worth remembering that not all wisdom comes from formal tuition, and natural curiosity can serve us all well in the acquisition of knowledge. Financial expenditure can certainly help create structures that enhance learning and research opportunities, but the real key to learning is having an open and curious mind.
It is perhaps ironic then, that my most recent bit of newly acquired knowledge came directly from money. Not so much by spending it, but rather by receiving a 50p coin in change when buying a latte and a sandwich for lunch. The 50p coin in question, an image of which is at the top of this article, is a 2005 commemorative issue, and certainly not rare (apparently 17.6 million are in general circulation) though this was the first time I’d noticed it in my change.
The obverse has the usual portrait of the Queen, but the reverse has the dictionary definitions of both “fifty” and “pence”, as written in Johnson’s Dictionary of 1755. The existence of the coin was fresh knowledge to me, but the definitions themselves were even more interesting. I’d never consciously thought about what part of speech numbers are, but as can be seen on the coin, they are actually adjectives rather than nouns in most situations. My cursory research into the matter suggests that when used in isolation, they may be nouns, but when they are used to describe a number of objects, they become adjectives.
This is all quite obvious when one thinks about it. But the point is, I hadn’t. And probably wouldn’t have without noticing the coin. So keep your eyes open to the world around you, and ask yourself questions. The country may have run out of money, but even what’s left can still be illuminating!
- Grand designs for 50p coin (bbc.co.uk)
It is interesting to see how the culture of evidence-based medicine has led to an explosion of data and associated studies that often prove little more than what is already self-evident. This is especially so for audit-based studies rather than fresh research, but the problem of inefficiency and unnecessary duplication of work applies to both fields. Don’t get me wrong, I definitely appreciate it when a good scientific study verifies the benefit (or otherwise!) of an intervention. But the sheer obsession with evidence that now exists is indicative of a broader cultural malaise resulting in a society focused on data rather than people, and it is not unique to medical practice.
We have a historically unparalleled ability to extract immense amounts of data from our daily lives. We are monitored, measured and tracked continuously. CCTV cameras, security access cards, withdrawing money from a bank, using a credit card, making telephone calls, using sat-nav, browsing the internet, even sitting quietly reading a book in a library… almost everything we do generates an electronic paper trail, and all that data can be collated, processed and analysed. And much of it is.
Correlations inevitably emerge, hypotheses are formed and theories generated, which can then draw significant and sometimes conflicting followings. In this situation evidence is more often used as a propaganda tool than as a means to reveal truth. Every new bit of data is interpreted within pre-existing intellectual positions rather than being used to drive us towards deeper understanding and a better appreciation of an underlying truth.
Innovation – the ability to think outside conventional paradigms – is stifled rather than encouraged by this obsession with evidence and data. Protocol is followed at the expense of originality. The excessive amount of data and the studies that emerge results in analysis-paralysis and a tendency towards ineffective intellectual inertia.
The individual genius, unless of a fearsomely driven character, will become exasperated by the burden of needing to prove his position to the world and will withdraw from an active role in society in favour of protecting himself from it. Thus, in the longer term it is our culture that will suffer for this excessive reliance on “evidence”.
As someone who left full-time NHS practice after specialising, this recent article caught my eye.
In summary, it points out that about a quarter of doctors who qualified in 2008, and who would now be eligible to begin their speciality training, did not do so. This staggering figure means that those doctors either postponed their careers or chose to leave the NHS or work abroad. Considering how much it costs to train doctors up to that level (5-6 years of medical school, plus two years working as foundation-year junior doctors), it also represents a massive waste of the public money spent on their educations.
There is no doubt that applying for speciality training remains competitive, as the article later points out, but the trend is very concerning. Already almost a quarter of doctors are not applying. If that continues, one would be forced to wonder about the quality of doctors who choose to remain.
What I found most amusing was the statement from the British Medical Association that “there has been little indication among junior doctors of significant issues”. This is laughable to anyone who has spoken to a junior doctor recently. There are so many significant issues: pay scales; the implementation of the European Working Time Directive; the increasingly abusive and impersonal way employers treat medical staff; the pointless bureaucracy imposed on doctors; the burden of current appraisal and upcoming revalidation procedures; the many inefficient ways of working that riddle most Trusts; and the general feeling of not being adequately valued by the organisations they work for.
The vocational aspect of the job has been eroded away over the past decade or two and what is left is a robotic bureaucracy. It is not surprising that so many rebel against living that way. Treat people as numbers, and they will respond by voting with their feet.
Looked at in that light, it is perhaps surprising that about 75% chose to stay and apply for speciality training. I wonder how many of them will do the higher training, and then leave later, as I did.