Saturday, May 27, 2017

Timing of the first epinephrine dose in patients with shockable rhythm cardiac arrest


Popular usage of epinephrine in cardiac arrest (CA) is early---often after the first shock in VF/pulseless VT arrest. What is widely ignored, however, is that this does not comport with guidelines. For CA with an initial shockable rhythm, ACLS guidelines call for epinephrine only after the second shock. European guidelines delay epinephrine until after the third shock. This issue was addressed for patients with in-hospital onset of shockable CA in a BMJ study:

Setting Analysis of data from the Get With The Guidelines-Resuscitation registry, which includes data from more than 300 hospitals in the United States.

Participants Adults in hospital who experienced cardiac arrest with an initial shockable rhythm, including patients who had a first defibrillation within two minutes of the cardiac arrest and who remained in a shockable rhythm after defibrillation.

Intervention Epinephrine given within two minutes after the first defibrillation.

Main outcome measures Survival to hospital discharge. Secondary outcomes included return of spontaneous circulation and survival to hospital discharge with a good functional outcome. A propensity score was calculated for the receipt of epinephrine within two minutes after the first defibrillation, based on multiple characteristics of patients, events, and hospitals. Patients who received epinephrine at either zero, one, or two minutes after the first defibrillation were then matched on the propensity score with patients who were “at risk” of receiving epinephrine within the same minute but who did not receive it.

Results 2978 patients were matched on the propensity score, and the groups were well balanced. 1510 (51%) patients received epinephrine within two minutes after the first defibrillation, which is contrary to current American Heart Association guidelines. Epinephrine given within the first two minutes after the first defibrillation was associated with decreased odds of survival in the propensity score matched analysis (odds ratio 0.70, 95% confidence interval 0.59 to 0.82; P less than 0.001). Early epinephrine administration was also associated with a decreased odds of return of spontaneous circulation (0.71, 0.60 to 0.83; P less than 0.001) and good functional outcome (0.69, 0.58 to 0.83; P less than 0.001).

Conclusion Half of patients with a persistent shockable rhythm received epinephrine within two minutes after the first defibrillation, contrary to current American Heart Association guidelines. The receipt of epinephrine within two minutes after the first defibrillation was associated with decreased odds of survival to hospital discharge as well as decreased odds of return of spontaneous circulation and survival to hospital discharge with a good functional outcome.

This makes perfect sense. After all, for all you know the first shock may have resulted in ROSC, and you may have no way of knowing that until after the next two minutes of compressions. In such a case a push of epi is the last thing the patient needs! So although this study may be practice changing for some, it shouldn't be, because it merely reinforces the existing guidelines.

This is in contrast to non-shockable CA, in which the guidelines call for epinephrine as soon as possible after identification of PEA or asystole.
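A note on methods for readers who don't work with propensity scores: the study used risk-set matching minute by minute, which is more elaborate than anything I can show here, but the core idea is to model each patient's probability of receiving early epinephrine from baseline characteristics and then pair treated patients with similar untreated patients before comparing outcomes. Here is a minimal generic sketch of 1:1 propensity score matching on synthetic data, with made-up variable names; it is not the registry's actual analysis:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Synthetic stand-in data; the column names are hypothetical, not from the registry.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(65, 12, n),
    "witnessed": rng.integers(0, 2, n),
    "monitored_unit": rng.integers(0, 2, n),
    "early_epi": rng.integers(0, 2, n),   # epinephrine within 2 min of first shock
    "survived": rng.integers(0, 2, n),    # survival to hospital discharge
})
covariates = ["age", "witnessed", "monitored_unit"]

# 1. Model the probability of receiving early epinephrine (the propensity score).
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["early_epi"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Pair each treated patient with the untreated patient whose score is closest.
treated = df[df["early_epi"] == 1]
control = df[df["early_epi"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = control.iloc[idx.ravel()]

# 3. Compare survival to discharge between the matched groups.
print(f"early epi: {treated['survived'].mean():.3f}  "
      f"matched comparison: {matched['survived'].mean():.3f}")
```

The payoff of this kind of matching is that the comparison groups end up balanced on measured characteristics, which is why odds ratios like the ones above are interpretable without the early-epinephrine group simply being sicker by design. Unmeasured confounding, of course, remains.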

Friday, May 26, 2017

Encephalopathy secondary to metronidazole


Here is a case presentation and brief discussion in NEJM (free full text). From the paper:

Encephalopathy associated with metronidazole use is an uncommon side effect of the medication. It typically manifests as dysarthria and gait instability. Risk factors include liver dysfunction and a prolonged course of metronidazole (typical cumulative dose, greater than 20 g). MRI of the brain is usually diagnostic and typically reveals a symmetric, enhanced FLAIR signal in the dentate nuclei of the cerebellum.


Thursday, May 25, 2017

Dabigatran versus warfarin and the risk of AKI



Background Whether dabigatran is associated with a lower risk of acute kidney injury (AKI) in patients with nonvalvular atrial fibrillation (NVAF) remains unknown.

Objectives The authors compared the risk of AKI in Asians with NVAF who were prescribed dabigatran versus warfarin.

Methods The authors analyzed patients enrolled in the Taiwan nationwide retrospective cohort study from June 1, 2012, to December 31, 2013. Dabigatran and warfarin were taken by 7,702 and 7,885 NVAF patients without a history of chronic kidney disease (CKD) and 2,256 and 2,089 NVAF patients with a history of CKD, respectively. A propensity-score weighted method was used to balance covariates across study groups.

Results A total of 6,762 (88%) and 940 (12%) CKD-free patients and 2,025 (90%) and 231 (10%) CKD patients took dabigatran 110 mg and 150 mg twice daily, respectively. Dabigatran was associated with a lower risk of AKI than warfarin for either the CKD-free (hazard ratio [HR]: 0.62; 95% confidence interval [CI]: 0.49 to 0.77; p less than 0.001) or CKD (HR: 0.56; 95% CI: 0.46 to 0.69; p less than 0.001) cohort. As the increment in CHA2DS2-VASc score (a risk score based on congestive heart failure, hypertension, age 75 years or older, diabetes mellitus, previous stroke/transient ischemic attack, vascular disease, aged 65 to 74 years, and female sex) increased from 0/1 to 6+ points, the incidence of AKI for the dabigatran group was relatively stable (1.87% to 2.91% per year for the CKD-free cohort; 7.31% to 13.15% per year for the CKD cohort) but increased obviously for patients taking warfarin for either CKD-free (2.00% to 6.16% per year) or CKD cohorts (6.82 to 26.03% per year). The warfarin group had a significantly higher annual risk of AKI than the dabigatran group for those with a high CHA2DS2-VASc score (greater than or equal to 4 for the CKD-free cohort and greater than or equal to 3 for the CKD cohort). Subgroup analysis revealed that among dabigatran users, those taking either low-dose or standard-dose dabigatran, those with a warfarin-naïve or warfarin-experienced history, those with or without diabetes, and those with CHA2DS2-VASc greater than or equal to 4 or HAS-BLED greater than or equal to 3 (risk score based on hypertension, abnormal renal and liver function, stroke, prior major bleeding, labile international normalized ratios, age 65 years or older, drugs or alcohol usage history) all had a lower risk of AKI than those taking warfarin.

Conclusions Among Asians with NVAF, dabigatran is associated with a lower risk of AKI than warfarin.
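Because the CHA2DS2-VASc score carries so much of the analytic weight here, a quick reference may help. The abstract spells out the components but not the point values; below is a sketch of the conventional weighting (my own code, not from the paper):

```python
def cha2ds2_vasc(chf: bool, hypertension: bool, age: int, diabetes: bool,
                 stroke_tia: bool, vascular_disease: bool, female: bool) -> int:
    """Conventional CHA2DS2-VASc point assignment (range 0 to 9)."""
    score = 0
    score += 1 if chf else 0               # C: congestive heart failure / LV dysfunction
    score += 1 if hypertension else 0      # H: hypertension
    score += 2 if age >= 75 else 0         # A2: age 75 or older
    score += 1 if diabetes else 0          # D: diabetes mellitus
    score += 2 if stroke_tia else 0        # S2: prior stroke / TIA / thromboembolism
    score += 1 if vascular_disease else 0  # V: vascular disease
    score += 1 if 65 <= age <= 74 else 0   # A: age 65 to 74
    score += 1 if female else 0            # Sc: sex category (female)
    return score

# Example: a 78-year-old woman with hypertension and diabetes scores 2 + 1 + 1 + 1 = 5.
print(cha2ds2_vasc(chf=False, hypertension=True, age=78, diabetes=True,
                   stroke_tia=False, vascular_disease=False, female=True))
```

The study's signal, restated in plain terms, is that as this score climbed, the annual AKI risk stayed relatively flat on dabigatran but rose steeply on warfarin.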


The future of CME


A NEJM Perspective piece on this topic opened with:

The point at which a clinician takes ownership of his or her own learning agenda is a pivotal moment in professional growth.

That sentence drew me into the article. The author, Graham T. McMahon, MD, believes there is a point in professional development at which the clinician takes ownership of his or her continuing education. Self-evident as that may be, there are those who oppose this view, believing that CME content should be determined by external authority. That, for example, is the basis for the arguments of those few who defend Maintenance of Certification (MOC).

McMahon went on to elaborate on the benefits of self-guided learning as opposed to learning driven by others from afar:

Now that information is ubiquitous, simple information exchange has relatively low value; in its place, shared wisdom and the opportunity to engage in problem solving in practice-relevant ways have become key. Physicians seeking professional development can recognize when they’re actively learning and tend to embrace activities that allow them to do so. Education that’s inadequate, inefficient, or ineffective, particularly when participation is driven by mandates, irritates physicians who are forced to revert to “box-checking” behavior that’s antithetical to durable, useful learning.

So what are we to do? Much of the solution, according to McMahon, lies in the attitude of the learner rather than the CME offering:

A key element is self-awareness: professionals who know their own strengths and weaknesses are most likely to have a productive experience when they identify the types of activities that help them grow and then actively participate in them. There are many ways to increase self-awareness, such as taking a self-assessment quiz…

To become self-aware, we have to step out of the protective cocoon of self-confidence and become humble and open enough to assess both how we can best maintain what’s working and how we can grow further.


CME offerings, according to the author, are most effective when they are interactive and learner-centric. While this is a challenge in the traditional didactic format, he does not argue for eliminating didactic teaching altogether, as some have proposed over the last several years. [1] [2]

All in all, the piece is respectful of differences in learning needs among physicians and supportive of their being in charge of their own individual educational agendas. A concern I have, though, is this statement in the middle of the piece:

The regulators, too, need to evolve. By relinquishing the fixed structural requirements for health education and instead focusing on educational outcomes (rather than process and time spent), regulators and accreditors can create the right conditions for maximizing educators’ flexibility and promoting innovation. By creating a diverse system that can address even superspecialized needs, we facilitate choice among formats, activity types, and locations. I envision a future in which educational expectations and professional competency obligations are aligned and integrated and in which all physicians have an educational “home” that helps them navigate their continuing growth — so that education is intertwined with practice throughout their careers.

That sounds a little too much like the MOC we loved to hate a couple of years ago.


Wednesday, May 24, 2017

Viewpoint article in JAMA issue devoted to conflicts of interest


The article is available here as free full text. Here are a few points of interest.

Early on in the piece the author, Harvey V. Fineberg, MD, PhD, says:

An individual’s conflict of interest is not tantamount to saying her or his judgment is affected, nor does it constitute an accusation of bias or prejudgment. The presence of a conflict of interest is no judgment about the appropriateness or value of the relationship that engenders the conflict in a particular situation.

Why then should we be worried about COI at all? The answer comes in the next paragraph (emphasis mine):

Some erroneously take a financial interest that qualifies as a conflict of interest to be an allegation that their thinking is tainted. They seek to defend their scientific integrity and adherence to evidence, and aver that the monetary payment they received or financial interest they hold could not possibly influence their scientific or clinical judgment. They may be right, and they miss the point. If a reasonable person would perceive that the financial circumstances could potentially influence their judgment, a failure to acknowledge and respond to the conflict of interest threatens to erode the trust that undergirds the value of professional judgment and expertise.

In short, it’s about the possibility that perceptions would be tainted. I’m looking for substance here, and it’s a bit of a stretch. Whatever substance the argument has is weakened by the postmodern view expressed a few paragraphs down:

If the presence of a conflict of interest is ultimately subjective—based on the judgment of a reasonable person—it is also situational, that is, dependent on the specific financial circumstances and relationship to the specific role of the person involved. The standards for adjudging a conflict of interest may also be bound by time and place, having different meaning in different cultures and at different moments in history.

The author addresses some of the many questions that arise in the management of COI. There appear to be no definitive answers:

What relationship to a payer and financial level triggers disclosure, discussion, and possible remedy? If not contemporaneous, for how long in the past is a financial relationship deemed relevant? Should the financial interests of a spouse, a parent, a minor child, and a sibling be considered as pertinent as those of the individual involved? In general, the answers to these questions should be guided by the reasonable person standard. To preserve public trust, it is better to lean toward more disclosure rather than less, while also protecting individual privacy and avoiding tangential matters. For example, the financial interests of a cousin or a niece are of less interest than those of a spouse or sibling.

On a refreshing note, he takes an appropriately broad view of COI by not singling out industry, acknowledging many other types of financial conflicts as well as nonfinancial ones.

The concluding paragraph says we need “disinterested expertise” (is there such a thing?) and reminds us once again that it’s mainly about perception:

Adherence to carefully considered, transparent, and evenhanded policies on conflict of interest can help physicians earn and maintain their trusted place in the minds of the public and policy makers.

What is conspicuously absent from the entire piece is a strong declarative statement that COI really does impair physician judgment, let alone harm patients.


Sunday, May 21, 2017

Anticoagulant related nephropathy: huge problem, hugely under-recognized


From a commentary in JACC, here are some key points.

What is it?

Anticoagulant related nephropathy (ARN) is a form of AKI caused by systemic anticoagulation (generally over-anticoagulation; in the original reports on warfarin the mean INR was in the mid-4 range).


What are the histopathologic findings?

Severe glomerular, and sometimes tubular, hemorrhage.


It's not just warfarin

Although originally described with warfarin and termed warfarin nephropathy, it is now evident that other systemic anticoagulants (and probably any systemic anticoagulant) can cause it. The risk may be higher with warfarin than with the NOACs.


How do you diagnose it?

The difficulties in getting a renal biopsy in anticoagulated patients are obvious. Sometimes biopsy is done during a window of anticoagulant interruption. In other cases, if the circumstances fit and there is no other plausible explanation, “presumptive ARN” is diagnosed without a biopsy.


It is generally not reversible

According to the article, renal recovery tends to be poor.


During periods of excessive anticoagulation the risk is high

From the article:

To date, there have been 5 independent cohort studies...These studies show that the risk of ARN at the onset of coagulopathy is at about 20% overall and about 37% in patients with CKD (3).


According to the article, the mortality is high, especially in CKD patients.


Saturday, May 20, 2017

Predicting ARDS mortality based on a simple clinical score


From a paper in Critical Care Medicine:

Objectives: Although there is general agreement on the characteristic features of the acute respiratory distress syndrome, we lack a scoring system that predicts acute respiratory distress syndrome outcome with high probability. Our objective was to develop an outcome score that clinicians could easily calculate at the bedside to predict the risk of death of acute respiratory distress syndrome patients 24 hours after diagnosis.

Design: A prospective, multicenter, observational, descriptive, and validation study.

Setting: A network of multidisciplinary ICUs.

Patients: Six-hundred patients meeting Berlin criteria for moderate and severe acute respiratory distress syndrome enrolled in two independent cohorts treated with lung-protective ventilation.

Interventions: None.

Measurements and Main Results: Using individual demographic, pulmonary, and systemic data at 24 hours after acute respiratory distress syndrome diagnosis, we derived our prediction score in 300 acute respiratory distress syndrome patients based on stratification of variable values into tertiles, and validated in an independent cohort of 300 acute respiratory distress syndrome patients. Primary outcome was in-hospital mortality. We found that a 9-point score based on patient’s age, PaO2/FIO2 ratio, and plateau pressure at 24 hours after acute respiratory distress syndrome diagnosis was associated with death. Patients with a score greater than 7 had a mortality of 83.3% (relative risk, 5.7; 95% CI, 3.0–11.0), whereas patients with scores less than 5 had a mortality of 14.5% (p less than 0.0000001). We confirmed the predictive validity of the score in a validation cohort.

The score is explained in this piece in ACP Hospitalist Weekly:

The score the researchers chose had a minimum of 3 points and a maximum of 9, based on 3 variables: age in years (less than 47, 47 to 66, greater than 66), PaO2/FIO2 in mm Hg (greater than 158, 105 to 158, less than 105), and plateau pressure in cm H2O (less than 27, 27 to 30, greater than 30). Seven other variables were evaluated for potential inclusion in the score but were not selected. Patients with a score greater than 7 were found to have an inpatient mortality rate of 83.3%, compared to 14.5% in patients with a score below 5, the study found.

Note that the score is only to be applied to the Berlin categories of moderate and severe ARDS.
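To make the score concrete, here is one way to code the cutoffs quoted above. I am assuming each variable contributes 1, 2, or 3 points in the order listed, which is what the stated minimum of 3 and maximum of 9 implies, but check the original paper for the exact assignment:

```python
def ards_outcome_score(age_years: float, pf_ratio: float, plateau_pressure: float) -> int:
    """Bedside 9-point ARDS outcome score at 24 hours after diagnosis (sketch)."""
    # Age in years
    if age_years < 47:
        score = 1
    elif age_years <= 66:
        score = 2
    else:
        score = 3
    # PaO2/FiO2 in mm Hg (lower ratio means worse oxygenation, so more points)
    if pf_ratio > 158:
        score += 1
    elif pf_ratio >= 105:
        score += 2
    else:
        score += 3
    # Plateau pressure in cm H2O
    if plateau_pressure < 27:
        score += 1
    elif plateau_pressure <= 30:
        score += 2
    else:
        score += 3
    return score  # range 3-9; >7 carried 83.3% in-hospital mortality in the study, <5 about 14.5%

# Example: a 70-year-old with a PaO2/FiO2 of 95 and a plateau pressure of 32 scores 3 + 3 + 3 = 9.
print(ards_outcome_score(70, 95, 32))
```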


Friday, May 19, 2017

Antiarrhythmic drugs in cardiac arrest: should amiodarone be moved up two shocks? Is lidocaine the winner by a slim margin?

This trial, published in NEJM, has been the subject of numerous social media posts. The popular spin is that this is just another study showing that the drugs do not work. But it isn't that simple. Let's take a look at the paper:


Patients


The trial included patients 18 years of age or older with nontraumatic out-of-hospital cardiac arrest and shock-refractory ventricular fibrillation or pulseless ventricular tachycardia, defined as confirmed persistent (nonterminating) or recurrent (restarting after successful termination) ventricular fibrillation or pulseless ventricular tachycardia after one or more shocks anytime during resuscitation (inclusive of rhythms interpreted as being shockable by an automated external defibrillator).


It is useful to pause here and note the importance of this paragraph. It says that amiodarone was tested in circumstances different from those in which it is recommended in the current resuscitation guidelines. The guidelines call for amiodarone after the third shock, as a class IIb recommendation. This study looked at antiarrhythmic use as early as after the first shock (more accurately, 2 minutes after the first shock, since providers are blind to the rhythm for 2 minutes after shocking). Thus, we are looking at a potential move of antiarrhythmic therapy up two steps in the resuscitation sequence.


From the results and conclusions:




Results


In the per-protocol population, 3026 patients were randomly assigned to amiodarone (974), lidocaine (993), or placebo (1059); of those, 24.4%, 23.7%, and 21.0%, respectively, survived to hospital discharge. The difference in survival rate for amiodarone versus placebo was 3.2 percentage points (95% confidence interval [CI], −0.4 to 7.0; P=0.08); for lidocaine versus placebo, 2.6 percentage points (95% CI, −1.0 to 6.3; P=0.16); and for amiodarone versus lidocaine, 0.7 percentage points (95% CI, −3.2 to 4.7; P=0.70). Neurologic outcome at discharge was similar in the three groups. There was heterogeneity of treatment effect with respect to whether the arrest was witnessed (P=0.05); active drugs were associated with a survival rate that was significantly higher than the rate with placebo among patients with bystander-witnessed arrest but not among those with unwitnessed arrest. More amiodarone recipients required temporary cardiac pacing than did recipients of lidocaine or placebo.
Conclusions


Overall, neither amiodarone nor lidocaine resulted in a significantly higher rate of survival or favorable neurologic outcome than the rate with placebo among patients with out-of-hospital cardiac arrest due to initial shock-refractory ventricular fibrillation or pulseless ventricular tachycardia.


BUT---


In a subgroup analysis of patients whose arrests were witnessed, there was a significant improvement in survival with both drugs. Amiodarone was associated with an increased requirement for temporary pacing.


I disagree with the popular spin on this study, that the drugs were of no benefit. I do not think we should ignore the subgroup analysis. I suspect what it means is that these drugs are more effective when given earlier after arrest.



Thursday, May 18, 2017

Ganglionic plexus ablation for atrial fibrillation: disappointing results


This paper in JACC introduces another category in the classification of atrial fibrillation: advanced AF. From the paper:

…advanced AF, defined as persistent AF, enlarged left atria, or previously failed catheter ablation.

The paper also reviews the indications and rationale for various types of ablation:

The arrhythmogenic trigger from the pulmonary veins (PVs) is the target for ablation in patients with paroxysmal AF without concomitant atrial or cardiac disease; the mechanism is less well established in patients with advanced AF, defined as persistent AF, enlarged left atria, or previously failed catheter ablation. Various treatment strategies have been advocated, combining more extensive myocardial ablation and ablation of non-PV and nonmyocardial targets, including stepwise catheter ablation approaches (3), in which PV isolation (PVI) is followed by linear left atrial (LA) ablation, ablation of continuous fractionated atrial electrograms (4), or ablation of rotors (5).

Furthermore, the rationale for ganglionic plexus (GP) ablation is explained:

As it has become clear that the autonomous nervous system plays a central role in initiating AF and in atrial autonomic remodeling (6,7), partial atrial denervation through ablation of the major autonomic ganglion plexus (GP), either alone or in combination with PVI, has been pursued (8,9).

GP stimulation promotes AF by a combined parasympathetic and sympathetic action resulting in action potential duration (APD) shortening and increased sarcoplasmic reticulum calcium release in PV myocardium, allowing early after-depolarizations to emerge and trigger AF (10). Aside from AF induction, GP stimulation affects local and global LA conduction time, consistent with a predominantly parasympathetic effect (11). Thus, the stimulation of the autonomic nerves within the GPs, beyond triggering AF, may also have a proarrhythmic effect on the atrial myocardium that perpetuates the arrhythmia (11).

Studies investigating the role of GP ablation in addition to PVI have demonstrated mixed results (8,12,13), as have nonrandomized studies during concomitant cardiac surgery (14,15).

This paper reports further results, which were disappointing:

Background Patients with long duration of atrial fibrillation (AF), enlarged atria, or failed catheter ablation have advanced AF and may require more extensive treatment than pulmonary vein isolation.

Objectives The aim of this study was to investigate the efficacy and safety of additional ganglion plexus (GP) ablation in patients undergoing thoracoscopic AF surgery.

Methods Patients with paroxysmal AF underwent pulmonary vein isolation. Patients with persistent AF also received additional lines (Dallas lesion set). Patients were randomized 1:1 to additional epicardial ablation of the 4 major GPs and Marshall’s ligament (GP group) or no extra ablation (control) and followed every 3 months for 1 year. After a 3-month blanking period, all antiarrhythmic drugs were discontinued.

Results Two hundred forty patients with a mean AF duration of 5.7 ± 5.1 years (59% persistent) were included. Mean procedure times were 185 ± 54 min and 168 ± 54 min (p = 0.015) in the GP (n = 117) and control groups (n = 123), respectively. GP ablation abated 100% of evoked vagal responses; these responses remained in 87% of control subjects. Major bleeding occurred in 9 patients (all in the GP group; p less than 0.001); 8 patients were managed thoracoscopically, and 1 underwent sternotomy. Sinus node dysfunction occurred in 12 patients in the GP group and 4 control subjects (p = 0.038), and 6 pacemakers were implanted (all in the GP group; p = 0.013). After 1 year, 4 patients had died (all in the GP group, not procedure related; p = 0.055), and 9 were lost to follow-up. Freedom from AF recurrence in the GP and control groups was not statistically different whether patients had paroxysmal or persistent AF. At 1 year, 82% of patients were not taking antiarrhythmic drugs.

Conclusions GP ablation during thoracoscopic surgery for advanced AF has no detectable effect on AF recurrence but causes more major adverse events, major bleeding, sinus node dysfunction, and pacemaker implantation. (Atrial Fibrillation Ablation and Autonomic Modulation via Thoracoscopic Surgery [AFACT]; NCT01091389)
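The harm side of the ledger is easy to make concrete from the counts in the abstract. Here are back-of-the-envelope numbers needed to harm (my own arithmetic, rounded):

```python
# Event counts from the abstract: GP group n = 117, control group n = 123.
events = {
    "major bleeding": (9, 0),
    "sinus node dysfunction": (12, 4),
    "pacemaker implantation": (6, 0),
}
for name, (gp, ctrl) in events.items():
    risk_diff = gp / 117 - ctrl / 123          # absolute risk increase with GP ablation
    print(f"{name}: absolute risk increase {risk_diff:.1%}, NNH ~{1 / risk_diff:.0f}")
```

Roughly one extra major bleed for every 13 patients given the additional GP ablation, with no detectable reduction in AF recurrence to show for it.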


Wednesday, May 17, 2017

Alcohol intake and the risk of atrial fibrillation


Although there is a well known “U-shaped” curve describing the relation between the amount of alcohol intake and overall cardiovascular mortality, for atrial fibrillation the correlation is positive and linear, extending over the entire range of intake, including very modest amounts. See this article.

Monday, May 08, 2017

What is the best oxygen target for critically ill patients?




Question Among critically ill patients, is a conservative oxygenation strategy aimed to maintain arterial saturation within physiologic limits more beneficial than a conventional strategy?


Findings In this randomized clinical trial that included 480 patients with an expected intensive care unit length of stay of 72 hours or longer, a conservative protocol for oxygen supplementation was associated with an absolute risk reduction for intensive care unit mortality of 8.6% compared with that for patients treated with conventional therapy. However, the trial was terminated early because of difficulty with patient enrollment.


Meaning Among critically ill intensive care unit patients with a length of stay of 72 hours or longer, a conservative protocol for oxygen therapy may be beneficial; however, because the trial was terminated early, these findings must be considered preliminary.
Abstract


Importance Despite suggestions of potential harm from unnecessary oxygen therapy, critically ill patients spend substantial periods in a hyperoxemic state. A strategy of controlled arterial oxygenation is thus rational but has not been validated in clinical practice.


Objective To assess whether a conservative protocol for oxygen supplementation could improve outcomes in patients admitted to intensive care units (ICUs).


Design, Setting, and Patients Oxygen-ICU was a single-center, open-label, randomized clinical trial conducted from March 2010 to October 2012 that included all adults admitted with an expected length of stay of 72 hours or longer to the medical-surgical ICU of Modena University Hospital, Italy. The originally planned sample size was 660 patients, but the study was stopped early due to difficulties in enrollment after inclusion of 480 patients.


Interventions Patients were randomly assigned to receive oxygen therapy to maintain Pao2 between 70 and 100 mm Hg or arterial oxyhemoglobin saturation (Spo2) between 94% and 98% (conservative group) or, according to standard ICU practice, to allow Pao2 values up to 150 mm Hg or Spo2 values between 97% and 100% (conventional control group).


Main Outcomes and Measures The primary outcome was ICU mortality. Secondary outcomes included occurrence of new organ failure and infection 48 hours or more after ICU admission.


Results A total of 434 patients (median age, 64 years; 188 [43.3%] women) received conventional (n = 218) or conservative (n = 216) oxygen therapy and were included in the modified intent-to-treat analysis. Daily time-weighted Pao2 averages during the ICU stay were significantly higher (P less than .001) in the conventional group (median Pao2, 102 mm Hg [interquartile range, 88-116]) vs the conservative group (median Pao2, 87 mm Hg [interquartile range, 79-97]). Twenty-five patients in the conservative oxygen therapy group (11.6%) and 44 in the conventional oxygen therapy group (20.2%) died during their ICU stay (absolute risk reduction [ARR], 0.086 [95% CI, 0.017-0.150]; relative risk [RR], 0.57 [95% CI, 0.37-0.90]; P = .01). Occurrences were lower in the conservative oxygen therapy group for new shock episode (ARR, 0.068 [95% CI, 0.020-0.120]; RR, 0.35 [95% CI, 0.16-0.75]; P = .006) or liver failure (ARR, 0.046 [95% CI, 0.008-0.088]; RR, 0.29 [95% CI, 0.10-0.82]; P = .02) and new bloodstream infection (ARR, 0.05 [95% CI, 0.00-0.09]; RR, 0.50 [95% CI, 0.25-0.998]; P = .049).


Conclusions and Relevance Among critically ill patients with an ICU length of stay of 72 hours or longer, a conservative protocol for oxygen therapy vs conventional therapy resulted in lower ICU mortality. These preliminary findings were based on unplanned early termination of the trial, and a larger multicenter trial is needed to evaluate the potential benefit of this approach.


A nuanced discussion is provided in an accompanying editorial.
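The headline effect sizes follow directly from the raw ICU mortality counts in the abstract. As a quick sanity check, here is the arithmetic, including an approximate number needed to treat that the abstract does not report (my own back-of-the-envelope):

```python
# ICU deaths / patients in the modified intention-to-treat analysis
deaths_conservative, n_conservative = 25, 216
deaths_conventional, n_conventional = 44, 218

risk_conservative = deaths_conservative / n_conservative   # ~0.116 (11.6%)
risk_conventional = deaths_conventional / n_conventional   # ~0.202 (20.2%)

arr = risk_conventional - risk_conservative   # absolute risk reduction, ~0.086
rr = risk_conservative / risk_conventional    # relative risk, ~0.57
nnt = 1 / arr                                 # ~12 patients managed conservatively per ICU death averted

print(f"ARR = {arr:.3f}, RR = {rr:.2f}, NNT ~ {nnt:.0f}")
```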


Sunday, May 07, 2017

Cardiocerebral syndrome


This refers to cognitive dysfunction occurring in heart failure. It has been referred to in various ways through the years. I first blogged about it 11 years ago when it was called cardiac encephalopathy. It has since gained increasing recognition. Here are a few key points from a review and accompanying audio summary in JACC:

There are not only cognitive changes but also structural changes in the brain.

It can occur in both heart failure with reduced EF and heart failure with preserved EF.

It may be at least partially reversible with improvement in cardiac status.

Cerebral autoregulation may be impaired in heart failure.

Neurohumoral activation is contributory.

TNF and other cytokines are elevated and may contribute to cognitive dysfunction.

One third of hospitalized patients with heart failure have been reported to have thiamine deficiency! This may cause brain changes other than the classic Wernicke and Korsakoff syndromes.

Depression is common in heart failure.


What should the clinician do?

Diagnose it via the MMSE or some other clinical tool and exclusion of other causes.
Manage electrolyte problems.
Optimize heart failure management.
Identify and treat depression.
Give thiamine???

Saturday, May 06, 2017

Wireless LV endocardial pacing takes CRT to the next level


The SELECT-LV Study, reporting results of the WiSE-CRT system (EBR Systems, Sunnyvale, California), was recently published in JACC. It was a small study, composed of 35 patients who, for one reason or another, had failed conventional CRT.

From the paper:

Background A total of 30% to 40% of patients with congestive heart failure eligible for cardiac resynchronization therapy (CRT) either do not respond to conventional CRT or remain untreated due to an inability or impediment to coronary sinus (CS) lead implantation. The WiSE-CRT system (EBR Systems, Sunnyvale, California) was developed to address this at-risk patient population by performing biventricular pacing via a wireless left ventricular (LV) endocardial pacing electrode.

Objectives The SELECT-LV (Safety and Performance of Electrodes implanted in the Left Ventricle) study is a prospective multicenter non-randomized trial assessing the safety and performance of the WiSE-CRT system.

Methods A total of 35 patients indicated for CRT who had “failed” conventional CRT underwent implantation of an LV endocardial pacing electrode and a subcutaneous pulse generator. System performance, clinical efficacy, and safety events were assessed out to 6 months post-implant.

Results The procedure was successful in 97.1% (n = 34) of attempted implants. The most common indications for endocardial LV pacing were difficult CS anatomy (n =12), failure to respond to conventional CRT (n = 10), and a high CS pacing threshold or phrenic nerve capture (n = 5). The primary performance endpoint, biventricular pacing on the 12-lead electrocardiogram at 1 month, was achieved in 33 of 34 patients. A total of 28 patients (84.8%) had improvement in the clinical composite score at 6 months, and 21 (66%) demonstrated a positive echocardiographic CRT response (greater than or equal to 5% absolute increase in LV ejection fraction). There were no pericardial effusions, but serious procedure/device-related events occurred in 3 patients (8.6%) within 24 h, and 8 patients (22.9%) between 24 h and 1 month.

Conclusions The SELECT-LV study demonstrates the clinical feasibility for the WiSE-CRT system, and provided clinical benefits to a majority of patients within an otherwise “failed” CRT population.

The complications included embolization of the LV electrode, stroke, device-related infection, and one death from VF during the procedure.


Friday, May 05, 2017

Yes Virginia, LDL really does matter


Numerous authors over the last several years, citing the pleiotropic effects of statins, have refused to believe that LDL reduction is important. This is despite several lines of evidence, as I have pointed out in numerous previous posts [1] [2] [3].


Now comes this systematic review and meta-analysis from JAMA. From the article:


Conclusions and Relevance In this meta-regression analysis, the use of statin and nonstatin therapies that act via upregulation of LDL receptor expression to reduce LDL-C were associated with similar RRs of major vascular events per change in LDL-C. Lower achieved LDL-C levels were associated with lower rates of major coronary events.


Multiple means of LDL reduction appear to reduce events to a similar degree.


Thursday, May 04, 2017

“Early” invasive versus selectively invasive strategy for non-ST segment elevation ACS


Here is the paper in question. I put the word “early” in quotes because it meant cath within 48 hours, not an immediate cath. From the paper:

Methods The ICTUS trial was a multicenter, randomized controlled clinical trial that included 1,200 patients with NSTE-ACS and an elevated cardiac troponin T. Enrollment was from July 2001 to August 2003. We collected 10-year follow-up of death, myocardial infarction (MI), and revascularization through the Dutch population registry, patient phone calls, general practitioners, and hospital records. The primary outcome was the 10-year composite of death or spontaneous MI. Additional outcomes included the composite of death or MI, death, MI (spontaneous and procedure-related), and revascularization…

Conclusions In patients with NSTE-ACS and elevated cardiac troponin T levels, an early invasive strategy has no benefit over a selective invasive strategy in reducing the 10-year composite outcome of death or spontaneous MI, and a selective invasive strategy may be a viable option in selected patients.

Don’t confuse this with another debate now raging concerning NSTEMI: whether such patients should go to the cath lab immediately rather than wait up to 48 hours. Clearly some should, quite a few in fact, at least among NSTEMI patients as defined by prevailing performance measures. These patients often have ECG findings which, though not meeting the criteria for STEMI, suggest acute or impending epicardial coronary occlusion. A closely related debate is whether the STEMI/NSTEMI designation is even useful at all.