Articles Posted in Medical Trends

In the 1970s colon cancer rates were roughly equal for blacks and whites. Then in the mid-1970s blacks began to show higher rates of colon cancer, with a large jump in black mortality rates in the 1980s. The American Cancer Society reports that colon cancer incidence is now 50 percent higher in blacks than in whites.

Experts blame this trend on lower rates of screening among blacks as compared to whites, and on less access to quality healthcare. Physicians have encouraged colon screening as a way to diagnose colon problems, including colon cancer, at an early stage. Currently the screening rate for whites is 50 percent, compared to 40 percent for blacks.

Yet if this were the reason for the widening gap, then Hispanics, who traditionally undergo even less regular screening and have lower quality healthcare than blacks, would have higher rates of colon cancer than blacks. In reality, Hispanics are less susceptible to colon cancer than both blacks and whites, despite a screening rate of only 32 percent.

This paradoxical lower death rate is not unique to colon cancer. Researchers have found that poorly insured Hispanics have fared better than whites and blacks in several measures of cancer and heart disease.

Physicians admit that there is no satisfactory explanation for the disparity in colon cancer death rates among whites, blacks and Hispanics. They nonetheless recommend that everyone get regular checkups and colonoscopy exams from age 50 on.


Recent studies have shown that lowering body temperature through therapeutic hypothermia increases the chance of surviving a heart attack. The practice of cooling patients was endorsed by the American Heart Association in An Advisory Statement by the Advanced Life Support Task Force of the International Liaison Committee on Resuscitation, and is starting to influence hospital critical care practices nationwide. Therapeutic hypothermia involves cooling the body for 24 hours and then gradually returning it to normal temperature in an effort to slow cerebral metabolism.

The University of Chicago Hospital has adopted the practice and developed a Therapeutic Hypothermia Protocol for dealing with emergent cardiac arrest patients. Other cities have taken it a step further by requiring that hospital emergency rooms be capable of performing therapeutic hypothermia.

On January 1, 2009, New York City will institute a rule that ambulances may only transport certain cardiac arrest patients to hospitals that have cooling systems available, even if it is not the closest hospital. The idea behind this new initiative is that the benefits of therapeutic hypothermia outweigh the benefits of a speedy arrival at a hospital.


It has become more commonplace in Chicago medical circles to order an MRI scan early on in the assessment process. Most Chicago and Illinois doctors view MRI scans as a useful tool to get to the bottom of a patient’s symptoms and aid them in diagnosing problems from headaches to foot aches.

For example, if you come in complaining of constant knee pain, your physician may order an MRI scan. Let's say the MRI scan shows torn cartilage and your physician tells you that surgery is the only way to relieve your pain and fix the problem. So you undergo surgery and then physical therapy, but are still experiencing the same pain. Eventually you find out that the torn cartilage was not responsible for your pain; your newly diagnosed arthritis was.

More and more we see patients who have some sort of irregularity on an MRI scan that may not necessarily be responsible for their medical problems. This is a particularly prevalent problem among the millions of people who go to doctors complaining of constant pain. Many of these patients demand that a scan be done to determine why their pain persists.

But in many cases it's not clear whether what is shown on the scan is the real cause of the pain. Take the hypothetical above: the scan did show torn cartilage that was eventually determined not to be the cause of the pain, but only after numerous procedures and many dollars spent. This lack of a definitive cause leaves many people in a medical conundrum: undergo what may amount to unnecessary surgery that could make their condition worse, or do nothing decisive at all.


Although a recent study showed the infant death rate in the United States and Illinois declining by 2 percent, the decrease is much smaller than in prior years. In fact, it is the smallest decrease since the infant death rate was first recorded in 1907. This trend is compounded by the fact that Illinois and the U.S. have higher infant death rates than most other industrialized countries, a gap that has widened with each passing year.

Each year more than 28,000 infants under one year old die in the United States. Two-thirds of these deaths are preterm babies. In 2006, 6.71 infants died in the United States for every 1,000 live births. Illinois was well above the national average that year, with 7.2 infant deaths for every 1,000 live births. Illinois's death rate seems even more startling when compared with that of other countries. In 2004, twenty-two countries had infant mortality rates below 5.0 infant deaths per 1,000 live births, with many Scandinavian and Asian countries posting rates below 3.5.
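The per-1,000 figures above are simple ratios. As a quick sketch of the arithmetic (in Python, using only the numbers quoted in this post; the implied-births figure is a back-of-the-envelope derivation, not an official statistic):

```python
def infant_mortality_rate(deaths, live_births):
    """Infant mortality rate expressed as deaths per 1,000 live births."""
    return deaths / live_births * 1000

# Working backward from the quoted numbers: roughly 28,000 infant deaths
# per year at a rate of 6.71 per 1,000 implies about 4.17 million live births.
implied_births = 28_000 / 6.71 * 1000
print(round(implied_births))  # 4172876

# Sanity check: plugging the implied births back in recovers the quoted rate.
print(round(infant_mortality_rate(28_000, implied_births), 2))  # 6.71
```

The same formula makes clear why rates, not raw counts, are compared across countries: a rate of 7.2 versus 6.71 per 1,000 is meaningful regardless of how many babies each jurisdiction delivers.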

The infant death rate is important because it is used as an international indicator of a nation's health and quality of medical care. So even though individuals in the United States spend a much larger portion of their income on health care than those in other industrialized nations, we continue to fall short of the international standard. In 1960 the United States had the 12th-lowest rate of infant mortality in the world. By 2004 we had dropped to 29th, the same rank as Slovakia and Poland.

If we are spending so much more than these other countries, why are we falling further and further behind the worldwide standard? Some point to recent trends in preterm births, Cesarean deliveries, and other types of birth injury as the source of the problem. Others feel it stems from cultural issues, like drug use and obesity. And yet another group blames the decentralization of our health care system.


Hospitals and doctors have begun to explore new ways of obtaining payment for their billed services. However, oftentimes these hospitals and doctors are only looking out for their best interests, not their patients’.

Health care providers have teamed up with credit card companies to create a "medical credit card," essentially a credit card that can only be used for medical purchases. You can fill out an application in your doctor's office and get approved while sitting in the waiting room. From a doctor's perspective this is ideal because they receive instant payment for their services. But oftentimes the patient is the one who loses.

Patients are lured in by low interest rates and the ease of applying, but are not told that if they miss one payment the interest rate skyrockets. Some patients have reported that they didn't even know they were applying for a credit card; they thought they were signing a financial payment agreement with their doctor. Moreover, these cards are often offered to patients when they are in need of care, so their focus is on their treatment, not their finances.

Another way hospitals have started to address unpaid medical bills is by obtaining a patient's credit report. Hospitals attest that they use these reports only to determine whether to offer charity care or pursue payment through bill collectors, but some worry that there is an ulterior motive. Hospitals are only required by law to treat patients with an emergency condition or a medical necessity. So what's to stop them from turning you away if they find out you have bad credit?


Hospitals in Chicago and nationwide could be affected by new evidence suggesting that doctors and nurses may be spreading infectious diseases through contaminated scrubs and clothing. Given all the new antibiotic-resistant diseases, like methicillin-resistant Staphylococcus aureus (MRSA), it is now more important than ever to stem the spread of infection.

Chicago and Illinois hospitals have initiated programs advocating that medical providers wash their hands frequently to prevent spreading infection to their patients. And while this is a proven measure to stem the spread of disease, physicians' clothes and scrubs can still carry infection from patient to patient. Moreover, when medical providers wear their scrubs or other hospital clothes outside the hospital, they can carry and spread infections to other areas.

The idea of clothes as a means of transferring infection is a fairly new concept, and as yet there are no definitive studies proving the extent of the harm that can occur. But a recent U.S. study showed that if a hospital worker is in the same room as a patient with MRSA, the bacteria end up on the worker's clothes 70% of the time, even if the employee had no physical contact with the patient. This finding is particularly disturbing because bacteria have been shown to survive on fabrics for long periods of time.

Given the amount of interaction hospital employees have with numerous sick people throughout the course of their workday, there is an overwhelming likelihood that an employee is carrying some sort of bacteria. But what is the solution? If everyone walking around a hospital is harboring disease in their very clothes, how can they avoid transferring it to others?


It's become an all too common tale in Chicago, Illinois and across the country: you go into your hospital for a simple procedure and end up infected with antibiotic-resistant bacteria. These resistant infections are becoming more and more prevalent, and they are not going away any time soon.

Perhaps the most well known of these is methicillin-resistant Staphylococcus aureus (MRSA), a type of "staph" infection that is resistant to the broad-spectrum antibiotics typically used to treat it. Unlike many of the other superbugs coming to light, however, MRSA can be treated with alternate antibiotics. The fear is that in time MRSA will become resistant to these alternatives as well.

And while MRSA can still be treated with current medications, there are numerous "superbugs" out there that are virtually untreatable. One of these is Klebsiella, a bacterium similar to MRSA except that it has an extra cellular layer that blocks out antibiotics MRSA lets in. Strains resembling Klebsiella are becoming more prevalent, both in hospitals and in the community.

Why Are Bacteria Becoming Resistant to Antibiotics?

When antibiotics were introduced in the mid-twentieth century, bacterial infections suddenly became curable. Antibiotics soon became a cure-all and were prescribed to treat not only bacterial infections but also viral infections, even though antibiotics have no effect on viruses. Because of this widespread use, bacteria soon began developing resistance, and the common antibiotics no longer worked.

For a while drug companies continued to develop new antibiotics to treat these resistant strains. Eventually, however, many of these same pharmaceutical companies withdrew from the field as the complexity of the research increased and profits decreased. So we are not only seeing more and more bacteria develop resistance to common antibiotics, we are also developing fewer new treatments for the new strains. In short, we are quickly returning to the days before antibiotics were invented, when bacterial infections were untreatable.

They're Scary and They're Out There: Now What?


For the past decade there has been a heated debate over the link between the Measles/Mumps/Rubella (MMR) vaccine and instances of autism. A new study adds further weight to the argument that there is no link between the two but, like all prior studies, does nothing to definitively disprove the opposing view.

The new study was conducted by researchers from Massachusetts General Hospital, Columbia University and the Centers for Disease Control and Prevention. In it they tried to duplicate prior findings that the MMR vaccine caused autism, but were unable to do so. As with many other medical studies, this one found no evidence that the MMR vaccine caused harm or was in any way linked to autism.

However, those who believe autism is directly linked to the MMR vaccine are not convinced that the lack of a causal link in this study scientifically proves the two are unrelated. Many proponents of the vaccine-causation theory have personally witnessed the development of autism in their child after the child received the MMR vaccine. For these people, nothing short of evidence that refutes the link beyond a shadow of a doubt will do.

Why Do People Think There’s A Link Between Vaccines and Autism?

Many parents began to notice that their children displayed autistic symptoms around the same time they received multiple vaccines. Additionally, a British study by Wakefield et al. advanced the theory that the MMR vaccine did in fact cause autism. It is important to note, however, that it has since come out that the Wakefield study may have been compromised by a conflict of interest: part of the study was funded by a legal group involved in bringing cases against drug companies that distributed the MMR vaccine.

But again, while the medical community has not produced definitive evidence to refute this link, it has not produced any evidence to support it either. To explain why autism appears to develop following administration of the MMR vaccine, scientists have suggested that the symptoms of autism simply manifest themselves around the same age that children receive these vaccines.

Regardless of which side of the debate you are on, it is clear that too little is known about the cause of autism. Research should be directed at finding what leads to autism and how it can be prevented or cured.

Should My Child Receive Vaccines?


Every fall my now 93-year-old mom calls and reminds me that with the Chicago winter on its way, it is imperative that I get my annual flu shot. Being an obedient son, I comply by hustling over to the nearest clinic to get that shot. And of course, my mom also gets her prescribed flu shot and has thankfully avoided the dreaded flu during the winter season.

But now immunologists are coming forward with new studies reporting that the vaccine doesn't work very well for those over 70. Yet the over-70 age bracket accounts for 75% of all flu deaths.

In explaining the drastic shift from earlier findings, researchers pointed to the faulty logic those prior studies used. Rather than evaluating how effectively the vaccine protected against the flu, the earlier studies largely measured who received the shot and who didn't.

People who are health conscious, like my mom, are more likely to get an annual flu shot, whereas people who are frail and have trouble taking care of themselves are less likely to leave home to get the vaccine. And this second group is at greater risk of death, with or without the flu shot.


The United States Supreme Court is expected to hear the case of Wyeth v. Levine this November, a case that could have far-reaching implications for pharmaceutical litigation against drug companies, including those in Illinois and the Chicago area. Top doctors and editors of the New England Journal of Medicine have submitted a friend-of-the-court brief stating that the Food and Drug Administration (FDA) "is in no position" to guarantee drug safety. The doctors went on to say that lawsuits can serve as "a vital deterrent" and protect consumers if drug companies don't disclose risks.

The underlying case concerns Diana Levine, a Vermont guitarist, who lost her right arm below the elbow after being injected with the drug Phenergan, a medicine used mostly for nausea. She sued the drug's manufacturer, Wyeth, contending that the company had a duty to warn consumers that injections like the one she received could have devastating consequences. The Vermont state courts agreed with Ms. Levine, awarding her nearly $7 million.

But Wyeth appealed, arguing that it was protected from such lawsuits: the FDA's judgment, it contended, could not in effect be overruled by a state court. FDA scientists had weighed the risks and benefits of Phenergan in approving the drug's safety literature as a guide for doctors. The FDA was aware of the risks associated with injecting some forms of Phenergan, but the label did not specifically warn about the technique used on this patient.

The FDA has often been called the "gold standard" in drug evaluation. The New England Journal of Medicine editors warned the justices to be skeptical of taking such a view now.
