Medical professionals confronted a bleak future in 1945 because:
1) The terror of infectious diseases was being eradicated by public health measures.
2) The fear of bacterial infections was being eliminated by the widespread use of antibiotics.
3) Demobilised medical professionals would be competing for patients.
However, all was not lost because the medical profession unleashed preventive medicine upon an unsuspecting public during the 1940s.
In the 1940s, Hugh R. Leavell and E. Gurney Clark coined the term primary prevention.
Preventive medicine is a very powerful weapon for the medical professional:
a) It justifies spreading terror and fear amongst the public.
b) It justifies making arbitrary and unscientific decisions which are very profitable.
There are many methods for prevention of disease.
It is recommended that adults and children aim to visit their doctor for regular check-ups, even if they feel healthy, to perform disease screening, identify risk factors for disease, discuss tips for a healthy and balanced lifestyle, stay up to date with immunizations and boosters, and maintain a good relationship with a healthcare provider.
Some common disease screenings include checks for hypertension (high blood pressure), hyperglycemia (high blood sugar, a risk factor for diabetes mellitus), and hypercholesterolemia (high blood cholesterol); screening for colorectal cancer, depression, osteoporosis, and HIV and other common sexually transmitted diseases such as chlamydia, syphilis, and gonorrhea; mammography (to screen for breast cancer); and the Pap test (to check for cervical cancer).
Genetic testing can also be performed to screen for mutations that cause genetic disorders or predisposition to certain diseases such as breast or ovarian cancer.
However, these measures are not affordable for every individual and the cost effectiveness of preventive healthcare is still a topic of debate.
The precautionary principle or precautionary approach to risk management states that if an action or policy has a suspected risk of causing harm to the public or to the environment, then in the absence of scientific consensus that the action or policy is not harmful, the burden of proving that it is not harmful falls on those taking the action.
The principle is used by policy makers to justify discretionary decisions in situations where there is the possibility of harm from making a certain decision (e.g. taking a particular course of action) when extensive scientific knowledge on the matter is lacking. The principle implies that there is a social responsibility to protect the public from exposure to harm, when scientific investigation has found a plausible risk.
These protections can be relaxed only if further scientific findings emerge that provide sound evidence that no harm will result.
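The burden-of-proof rule described above can be sketched as a simple predicate. This is a minimal illustration only; the function name and boolean inputs are hypothetical and do not come from any statute or treaty:

```python
def action_permitted(plausible_risk: bool, sound_evidence_of_no_harm: bool) -> bool:
    """Toy model of the precautionary principle's burden of proof.

    An action suspected of causing harm is blocked unless those taking
    the action supply sound evidence that no harm will result.
    """
    if plausible_risk and not sound_evidence_of_no_harm:
        return False  # protections stay in place
    return True       # no plausible risk, or harm has been ruled out


# A suspected risk with no exonerating evidence: the action is blocked.
print(action_permitted(plausible_risk=True, sound_evidence_of_no_harm=False))  # False
# Further findings rule out harm: protections can be relaxed.
print(action_permitted(plausible_risk=True, sound_evidence_of_no_harm=True))   # True
```

Note the asymmetry the principle creates: inaction is the default, and only positive evidence of safety flips the outcome.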
In some legal systems, as in the law of the European Union, the application of the precautionary principle has been made a statutory requirement in some areas of law.
Regarding international conduct, the first endorsement of the principle was in 1982 when the World Charter for Nature was adopted by the United Nations General Assembly, while its first international implementation was in 1987 through the Montreal Protocol. Soon after, the principle was integrated into many other legally binding international treaties, such as the Rio Declaration and the Kyoto Protocol.
The last 60 years of preventive medicine have seen the oxymoronic phrase preventable deaths enter the modern lexicon.
Death cannot be prevented by the medical profession – only delayed or advanced.
Therefore, it is important to know whether the medical profession is a net public health risk [or benefit] in terms of the mortality rate.
The United Kingdom is a special case because the medical profession hit the employment jackpot in 1948 when the British government established the National Health Service which aims to provide [controlled and prioritised] access to a range of approved healthcare services that are [mainly] “free” when [and if] you are [finally] granted entry to a “point of use”.
The National Health Service (NHS) is the publicly funded healthcare system for England.
It is the largest and oldest single-payer healthcare system in the world.
Primarily funded through the general taxation system, the system provides healthcare to every legal resident in the United Kingdom, with most services free at the point of use.
The Labour Government elected in 1945 had made manifesto commitments to implement the recommendations of the Beveridge Report of 1942.
The report’s recommendation to create “comprehensive health and rehabilitation services for prevention and cure of disease” was implemented across the United Kingdom on 5 July 1948.
The services were initially funded through general taxation and National Insurance as part of the introduction of a wider Welfare State.
They were initially free at the point of use, although some prescription charges were soon introduced in response to economic difficulties.
These charges remain in place in the English NHS, but not in the other three systems.
Expenditure for 2012/13 was projected to be:
£108.9 billion for the National Health Service (England)
£3.9 billion for Health and Social Care in Northern Ireland
£9.38 billion for NHS Scotland
£5.3 billion for NHS Wales
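For scale, the four projected budgets above can be summed to give the total projected UK health expenditure for 2012/13. This is a quick arithmetic check on the figures quoted above; the variable names are illustrative:

```python
# Projected 2012/13 expenditure, in £ billions, per the figures above.
spending = {
    "NHS England": 108.9,
    "Health and Social Care in Northern Ireland": 3.9,
    "NHS Scotland": 9.38,
    "NHS Wales": 5.3,
}

total = sum(spending.values())
print(f"Total: £{total:.2f}bn")  # Total: £127.48bn

# England's share of the combined budget.
share = spending["NHS England"] / total
print(f"England share: {share:.1%}")  # England share: 85.4%
```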
Total UK spending on healthcare, including private spending, is 9.4% of GDP, considerably less than comparable economies such as France (11.6%), Germany (11.3%), the Netherlands (11.9%), Canada (11.2%) and the USA (17.7%).
Unfortunately, the introduction of the National Health Service in 1948 marks a change for the worse in the mortality rates for England and Wales.
Death rates at lowest ever levels in England and Wales – BBC News – 22 July 2010
In the short term, mortality rates increased although this may [partially] be associated with the snowy winter of 1950-51.
1947-50: Little or average snowfall, although some large falls of up to 12 inches.
1950-51: This winter will be remembered as the snowiest winter of the last century at high levels. There were 102 days of lying snow at Dalwhinnie (1000ft) (83 days reputedly in 1946-47). On 15th December, 15 inches of snow fell in Shanklin, IOW, in 3.5 hours! Bournemouth saw 10 inches; Scarborough and Lowestoft, 14 inches. Snowy.
1951-54: Average years in terms of snowfall, though one noteworthy fall of 1ft in Wales and the Southern Midlands (late November) with drifts of 30 feet! Some other falls of 12 inches at the end of this period.
The history of British winters
Sadly, 1948 also marks a secular change for the worse because the steady underlying rate of decline in female mortality [since 1900] suddenly pivots and flattens after 1948.
Tragically, the steady underlying decline in male mortality [since 1900] effectively flatlines between 1948 and 1972, mainly due to an increase in drunk driving deaths.
After 1972 the male mortality rate started [once again] to decline after public health measures in 1968 introduced car seat belts and the breathalyser.
The Year of 1968 – First Breathalyser is Type Approved
The first preliminary roadside breathalyser to be type approved by the Home Office was the Alcotest 80, manufactured by Dräger Ltd. The number 80 in the name refers to the BAC (blood alcohol concentration) limit it was designed to detect: 80 milligrams of alcohol per 100 millilitres of blood.
The introduction of the breathalyser in the UK, along with a heavy Government-run advertising campaign, helped decrease the percentage of road traffic accidents where alcohol had been a factor from 25% to 15% in the first year.
This resulted in 1,152 fewer recorded deaths, 11,177 fewer serious injuries and 28,130 fewer slight injuries caused by road traffic accidents.
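The drop from 25% to 15% quoted above is a 40% relative reduction in alcohol-related accidents, which a line of arithmetic confirms (illustrative only; the variable names are not from the source):

```python
before, after = 0.25, 0.15  # share of road traffic accidents involving alcohol

absolute_drop = before - after          # 10 percentage points
relative_drop = absolute_drop / before  # 0.4, i.e. a 40% relative reduction

print(f"{relative_drop:.0%}")  # 40%
```

The distinction matters when reading such statistics: a "10 point" fall and a "40%" fall describe the same change.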
In the UK, a requirement for anchorage points was introduced in 1965, followed by the requirement in 1968 to fit three-point belts in the front outboard positions on all new cars and all existing cars back to 1965.
Arguably, the steady decline in male mortality after 1972 simply reflects the general de-industrialisation of the United Kingdom rather than any improvement in healthcare.
Misery and Debt – Endnotes
Therefore, the statistics suggest the National Health Service has been a net public health risk which has [at vast cost] negatively impacted mortality rates in England and Wales for the last 67 years.
In the United States the medical professionals don’t appear to have had much impact [during the 20th century] upon the shape of the mortality curve which primarily reflects public health measures and the steady de-industrialisation of America after 1945.
Death and the Human Environment: The United States in the 20th Century
Jesse H. Ausubel, Perrin S. Meyer, Iddo K.Wernick
Program for the Human Environment – The Rockefeller University
Manufacturing Jobs Aren’t Coming Back, And That’s OK
David M. Ewalt – 8 November 2011 – Forbes
This is truly remarkable considering the amount spent on US healthcare.
Why Are U.S. Health Care Costs So High?
Todd Hixon – 1 March 2012 – Forbes
Another truly remarkable statistic is the number of deaths caused by the US healthcare system.
In 1999, Americans learned that 98,000 people were dying every year from preventable errors in hospitals.
That came from a widely touted analysis by the Institute of Medicine (IOM) called To Err Is Human.
As it turns out, those were the good old days.
According to a new study just out from the prestigious Journal of Patient Safety, four times as many people die from preventable medical errors as we thought, as many as 440,000 a year.
Stunning News On Preventable Deaths In Hospitals
Leah Binder – 23 Sept 2013 – Forbes
The statistics suggest the UK and the USA need to introduce public health measures to limit and control their healthcare systems if they want to reduce mortality rates further.