Monday, 4 February 2019

Taking an anticlotting drug? If you need a procedure, be prepared

My local farmers’ market was busy with the Saturday morning bustle of people buying homemade goods and locally grown fruits and vegetables. One of the vendors had a swarm of customers inspecting freshly baked breads. “They’re sprouted-grain breads,” the baker told me, and explained that they tasted better and were healthier than regular whole-grain breads. A sample was delicious — the recipe included sprouted Kamut and spelt, and the bread had a nutty flavor — but was it more nutritious than the regular whole-grain bread I’d just purchased from another vendor?
About sprouted grains

For more on the subject, I turned to Kristina Secinaro, a registered dietitian at Harvard-affiliated Beth Israel Deaconess Medical Center.

She explained that sprouted grains are simply whole-grain seeds that have just begun to sprout. In order to catch the sprouts at just the right moment in the growing process, whole-grain seeds are typically soaked and then nurtured in environments with controlled amounts of warmth and moisture. This can be done at home (in a vented jar) or at food manufacturing plants (in special equipment).

The moist environment can promote bacterial growth. For that reason, Secinaro recommends that you don’t eat raw sprouted grains. Instead, mash them into a paste for use in baked goods, or cook the raw sprouts before adding them to a meal. Cooking or baking the sprouts should be enough to kill any bacteria. You’ll also need to refrigerate cooked sprouts and sprouted-grain baked goods.
Are they better than regular whole grains?

Sprouted grains have many health benefits, a result of catching them during the germinating process. “This germinating process breaks down some of the starch, which makes the percentage of nutrients higher. It also breaks down phytate, a form of phytic acid that normally decreases absorption of vitamins and minerals in the body. So sprouted grains have more available nutrients than mature grains,” Secinaro says. Those nutrients include folate, iron, vitamin C, zinc, magnesium, and protein. Sprouted grains also may have less starch and be easier to digest than regular grains. “It may help people who are sensitive to digesting grains,” Secinaro says.
How much better?

Sprouted whole grains and regular whole grains contain the same nutrients, but in different quantities. “I do think there are benefits to sprouted grains, but they’re not a cure-all. I would replace some whole grains with sprouted grains at least once a day,” says Secinaro, “and overall, aim for three to six servings of whole grains each day.” A serving might be a piece of whole-grain bread or half a cup of whole-grain pasta.

But just because a product contains sprouted whole grains, that doesn’t mean it has more nutrients than a regular whole-grain product. You’ll have to read the Nutrition Facts label to compare nutrition content.
Buying sprouted-grain products

You can find sprouted-grain goods (flours, breads, buns, muffins, tortillas, crackers, and even pizza crust) at a farmers’ market, like I did, or in a grocery store. “They should be in a refrigerated or frozen section. If they’re not, they probably have preservatives in them, although sprouted quinoa or rice flour is safely kept on the shelf,” Secinaro says.

But don’t assume the products are made of 100% sprouted grains. Sometimes there are just small amounts of sprouted grains in a product, so read the ingredients list or talk to the food maker who’s selling it.

In 2015, the opioid crisis was escalating to emergency-level proportions, claiming as many lives as car accidents. As the daughter of a longtime drug addict, I found the current burgeoning opioid epidemic both familiar and strange at the same time. My mother developed her addictions during the height of the drug epidemics that occurred in New York City in the mid-1980s. That timeframe also marked the infancy of the AIDS crisis and the height of Reagan-era “Just Say No” programs. Back then, addiction was treated and viewed more as a crime than a disease, supposedly committed by scoundrels and misfits. The theory held that respectable people did not associate with addicts, much less share their homes and their blood with them.

The intense societal shaming and criminalization of her addictions made my mother ever more resistant to seeking the treatment she needed, until she eventually stopped trying to quit altogether. The stigmatization of her disease impacted me profoundly as a child — almost as much as the regular abuses I endured from her due to her addictive behavior. Whether it was being the regular target of smacking, lying, spitting, stealing, or vicious name-calling, it stung all the more because society made me feel complicit by relation. I had no healthy outlet to vent my escalating outrage at my own victimization, at an age when I was too young to properly process or even fully understand what was happening. I learned to stay silent, to repress my feelings, and to isolate myself, so as not to mistakenly disclose our family secret and be swept away into the foster care system, potentially separated forever from my younger brother.

Nowadays, when I see the constant commercials and articles offering support and compassion to those suffering from opioid addiction, I am struck by ambivalence. While I feel both heartened and relieved that addiction is finally being treated as a disease for which such supports can exist, I am also embittered that it did not happen when I needed it. I am angry that the shift in dialogue around addiction, and the companion funding for programs that stress rehabilitation over incarceration for those afflicted, is likely due to differences in the race, class, and regions affected by this epidemic as opposed to the epidemic that claimed my mother. My family was poor, undereducated, and hailed from a low-income inner-city neighborhood where most residents were not white. Thus, we were ignored.

As noted by the National Survey on Drug Use and Health, 75% of all opioid misuse starts with people using medication that wasn’t prescribed for them. Furthermore, 90% of all addictions begin in adolescence or early adulthood, and most of those who misuse opioids already have a prior history of abusing alcohol and other drugs. In my mother’s case, she experimented with cocaine before jumping to injecting heroin in her mid-twenties; there was no prescription medication involved. My uncle (who was also my godfather) died of an overdose of Xanax (which is a benzodiazepine, not an opioid) after mixing it with too much alcohol. My brother became addicted to my mother’s prescription Dilaudid (an opioid) while she was in the late stages of terminal cancer; this occurred in his mid-twenties, after he had struggled for more than a decade with alcoholism.

I personally decided to opt out of using opioids for long-term management of my own pain symptoms because I did not want to risk becoming addicted, considering my own substantial family history and potential genetic predisposition to the disease. However, I understand my decision is a personal one and not something I can or should expect of other people who live with chronic pain. For some patients, long-term opioid treatment can provide adequate pain relief without detracting from their quality of life, but for others it can do more harm over time.

When I hear of people with pain being shamed and stigmatized for trying to fill prescriptions for medications many of them have been using responsibly for years and even decades, it reminds me of the same shame that was thrust onto my mother and family, while we were also deprived of comprehensive and humane treatment for, and even genuine acknowledgement of, our disease. I hope the medical field will work to adopt more nuanced and individualized approaches to treating both pain and addiction that do not cater to one demographic at the expense of the other.

Millions of people with cardiovascular disease take drugs that help prevent blood clots, which can lodge in a vessel and choke off the blood supply to part of a leg, a lung, or the brain. These potentially lifesaving medications include warfarin (Coumadin) and a class of drugs called non-vitamin K antagonist oral anticoagulants, or NOACs. Examples include dabigatran (Pradaxa) and rivaroxaban (Xarelto).

However, if you’re taking one of these drugs and need an invasive procedure — anything from a tooth extraction to a hip replacement — managing the risks can be tricky, says cardiologist Dr. Gregory Piazza, assistant professor of medicine at Harvard Medical School. “There’s a higher-than-normal risk of bleeding during and after the procedure, because your blood doesn’t clot as easily,” he says.

But stopping an anticlotting drug is also risky. Doing so increases the chance of a blood clot, especially if you have surgery, which also leaves you more prone to a clot. “Walking the tightrope between these two extremes can be a challenge for clinicians,” says Dr. Piazza. They need to consider if, when, and how long a person might need to stop taking their anticlotting medication. And the answer hinges on many different factors.
Different risk levels

Each year, about one in 10 people taking a NOAC requires a planned invasive procedure. These include diagnostic tests and treatments that require a doctor to use an instrument to enter the body. Some are more risky than others, of course. Minor procedures such as a skin biopsy aren’t very worrisome, because you can compress and bandage the wound, says Dr. Piazza.

Tooth extractions can bleed a fair amount. Compresses and topical treatments are usually sufficient for controlling the bleeding, although your doctor might suggest skipping your anticoagulant the day of the procedure.
Biopsies, injections, and surgeries

Deciding to stop an anticoagulant for a colonoscopy is more complicated. A diagnostic colonoscopy isn’t likely to cause bleeding. But if the doctor has to remove any polyps from the colon, the risk of bleeding rises. Other procedures that require careful planning for people on anticoagulants include breast and prostate biopsies, as well as biopsies of internal organs, such as the kidney or liver, which can lead to hard-to-detect internal bleeding.

Another common procedure (especially in older people) is a steroid injection in the spinal column to treat back pain. This, too, may cause undetected and potentially dangerous bleeding around the spinal column in people taking anticoagulants.

People nearly always have to stop taking anticlotting medications a few days prior to any type of elective surgery. Sometimes, doctors will use injectable, short-acting anticlotting drugs right before and immediately after the operation. This technique, called bridging, helps them better balance the degree of blood clotting during that critical window of time.
A key conversation

In addition to the procedure itself, other factors that affect anticoagulant decisions include a person’s age, any other health problems or medications they take, and whether they’re taking warfarin (which stays in the body for days) or a NOAC (which may lose some of its effect after about 12 hours). Because of all these variables, the best strategy is to make sure that the doctor slated to perform your procedure talks directly with the doctor who prescribed your anticoagulant, says Dr. Piazza. “If that conversation doesn’t take place, patients can have problems with either bleeding or clotting,” he says. Many physicians who do procedures aren’t very familiar with NOAC prescribing guidelines, so they may mistakenly keep people off these medications for a week or more, putting them at risk for a clot.
