In a recent study, participants had the highest gout flare rates when serum urate levels were > 10 mg/dL and the lowest flare rates when levels were < 3.9 mg/dL.
The higher the serum urate level, the higher the gout flare rate, according to a new study.1
The American College of Rheumatology (ACR) has laid out gout treatment recommendations, but they do not clearly define optimal targets for serum urate levels. For urate-lowering therapy targeting gout flares, the ACR recommends a target serum urate of < 6 mg/dL and anti-inflammatory flare prophylaxis for 3 – 6 months.
While a serum urate level of < 6 mg/dL is the recommended target, individuals with high serum urate before starting urate-lowering therapy remain at risk of gout flares during treatment.2 A prior study found that a higher baseline serum urate, along with a decrease in serum urate, may lead to a higher risk of gout flare.
A new secondary analysis, led by Sara Tedeschi, MD, MPH, of Brigham and Women's Hospital, used data from the Cardiovascular Safety of Febuxostat or Allopurinol in Patients with Gout trial.1 In this trial, participants were randomized to either febuxostat or allopurinol, with a target serum urate level of < 6 mg/dL. Participants also received flare prophylaxis with colchicine 0.6 mg daily for 6 months; those who could not tolerate colchicine received naproxen 250 mg twice daily instead. The study included 6183 participants with a median age of 65 years, and most of the sample (84%) were male.
The investigators followed the participants from randomization to death, last completed visit (dropout), or end of study. They assessed serum urate levels at months 0, 3, and 6, and every 6 months thereafter. Levels were categorized as < 3.9, 4.0 – 5.9, 6.0 – 7.9, 8.0 – 9.9, and > 10 mg/dL. The primary outcome was self-reported gout flare, assessed every 3 or 6 months; more than 1 flare per assessment period was counted only if flares were separated by > 14 days.
Most participants (71%) achieved a serum urate < 6 mg/dL by month 3, and this percentage rose slightly over time. The findings showed the highest gout flare rates when serum urate levels were > 10 mg/dL and the lowest when levels were < 3.9 mg/dL; in other words, flare rates rose and fell with serum urate levels.
Gout flare rates were highest between months 0 – 3, around the time urate-lowering therapy started and serum urate levels changed the most. For months 0 – 6, with serum urate of 4.0 – 5.9 mg/dL as the reference (rate ratio, 1.00), the flare rate ratio was 0.88 for levels < 3.9 mg/dL (95% CI, 0.67 – 1.16; P = .36), 0.95 for 6.0 – 7.9 mg/dL (95% CI, 0.79 – 1.13; P = .55), 1.09 for 8.0 – 9.9 mg/dL (95% CI, 0.84 – 1.41; P = .50), and 1.24 for levels > 10 mg/dL (95% CI, 0.85 – 1.79; P = .26). Thus, flare rates were numerically highest at serum urate levels > 10 mg/dL, although these early differences did not reach statistical significance.
Furthermore, gout flares spiked between months 6 – 12, around the time anti-inflammatory prophylaxis was discontinued. The investigators noted a dose-response relationship: serum urate levels < 3.9 mg/dL had significantly lower flare rates than levels of 4.0 – 5.9 mg/dL, whereas levels > 10 mg/dL had significantly higher flare rates than levels of 4.0 – 5.9 mg/dL.
“Gout flare rates were persistently higher when SU ≥6 mg/dL compared to SU at target after the first year of ULT, after accounting for censoring,” the investigators wrote. “These data suggest a potential benefit of achieving very low SU levels (≤3.9 mg/dL) and consideration of a longer duration of prophylaxis to reduce gout flares.”