Background:
Uric acid is the final product of purine metabolism in humans. Serum uric acid levels vary widely across species; in particular, humans have the highest uric acid levels of all mammals. These markedly high levels result from the loss of the uricase-encoding gene during the Neogene period [1,2]. Uricase is the primary enzyme that converts uric acid into allantoin, a far more soluble compound. This comparative biology approach enabled the development of recombinant uricase to treat advanced tophaceous gout and to prevent hyperuricemia during the treatment of certain hematological and solid tumor malignancies. When urate levels rise beyond the solubility threshold, monosodium urate (MSU) crystals can form in and around the joints, and mobilization of MSU crystals triggers the inflammatory response known as a gout flare.
In most mammals, uric acid levels range between 1 and 2 mg/dL [1]. Unlike most mammals, primates have higher urate levels because of declining uricase activity, with healthy individuals maintaining uric acid levels of approximately 3-4 mg/dL. While the loss of the uricase gene left humans the only mammal to spontaneously develop gout, it has also been proposed that retaining uric acid provided an adaptive advantage for human survival [1,2,3,4]. Nonetheless, the exact adaptive mechanism has remained elusive, ushering in many hypotheses and speculations about the roles of thrifty genes and uric acid in human health.
Uricase is present in nearly all living organisms, from bacteria to mammals, and has a common evolutionary origin. The gradual accumulation of genetic mutations, mainly nonsense mutations, in the uricase gene over the course of evolution rendered the enzyme undetectable in humans and certain monkeys [5]. Specifically, the lack of uricase activity in humans has been largely attributed to a nonsense mutation at codon 33 in exon 2, a nonsense mutation at codon 187, and a splice mutation in exon 3 [6]. The occurrence of these independent mutations during hominoid evolution, in parallel with the evolution of Old World and New World monkeys, can be interpreted as evidence that elevated uric acid conferred a natural selective advantage on early primates [3,6].
The loss of the uricase enzyme in humans, together with the extensive renal reabsorption of uric acid, suggests that retaining uric acid is beneficial to human health; evolutionary physiology, in other words, has not treated uric acid as a harmful waste product. At the same time, this high conservation of uric acid has made humans susceptible to diet-induced changes in uric acid and prone to spontaneously developing gout. Despite the well-established relationship between high uric acid levels and gout, researchers have been actively trying to elucidate the health benefits of uric acid and its role in adaptation to different environments, work that could also inform the development of less immunogenic uricase-based gout therapies [7].
Proposed theories about the role of uric acid have been described previously [1]. For example, growing evidence suggests that uric acid is a powerful antioxidant, accounting for roughly 50-60% of the naturally occurring antioxidant capacity in humans [8]. This naturally occurring protection may have extended the life span of hominoids (apes and humans). While roughly two-thirds of uric acid derives from endogenous sources, the remaining one-third comes from external factors, mainly diet and certain lifestyle habits. Urate levels in hominoids are therefore viewed as a mirror of the major and consequential changes in the dietary patterns of early hominoids. Indeed, it is believed that the gene encoding vitamin C synthesis was lost at a time when heavy ingestion of fruits and vegetables, the primary sources of vitamin C, made it dispensable; when consumption of fruits and vegetables later decreased dramatically because of substantive climate change, uric acid became the primary antioxidant and free-radical scavenger.
Population dietary patterns and cultural habits have been implicated in the high prevalence of hyperuricemia and gout among specific population groups [9]. For example, the Māori of New Zealand have long been known to carry genetic polymorphisms that increase their risk of developing both hyperuricemia and gout. However, there is no mention of gout among this population before the 18th century. Historically, the traditional diet of this population group consisted of sweet potato, fern roots, birds, and fish. After the introduction of the Western diet in the early 1900s, an epidemic of both obesity and gout developed. To date, the Māori of New Zealand have one of the highest gout rates in the world [10,11].
The purpose of this review is to highlight the effects of both high and low serum urate levels on human health. Reviewing how genetics intersects with urate levels and with the various phenotypes associated with serum urate is important for fully characterizing the contribution of environmental change to human health. Additionally, analyzing the ramifications of the loss of the uricase gene could expand our knowledge base, better characterize the various physiological roles uric acid may play in human health, and help explain current public health trends and population differences in risk for common diseases.
Genetic correlation between urate levels and cardiometabolic traits
The development of hyperuricemia or gout is multifactorial, involving genetics, dietary habits, and other lifestyle factors [9]. Multiple studies have demonstrated that certain risk factors, such as high body mass index (BMI), male sex, advanced age, and increased consumption of meat, alcohol, and high-fructose corn syrup, can raise uric acid levels and thereby increase the risk of gout [12]. Other factors, such as genetic polymorphisms across major uric acid transporters, have been linked to uric acid levels and the risk of developing gout. For instance, polymorphisms identified through genome-wide association studies (GWAS) in genes including SLC2A9, SLC16A9, SLC17A1, SLC22A11, SLC22A12, ABCG2, PDZK1, GCKR, LRRC16A, HNF4G, and INHBC are among the major determinants of uric acid levels and modulators of gout risk [13,14]. Additionally, the polymorphisms within these genes not only show disproportionate allele frequencies among different ethnic and racial groups, but their frequencies are also directionally consistent with the epidemiology of hyperuricemia and gout within the respective populations [15,16].
Serum urate levels are heritable, with estimates ranging from 30% to 60%. Despite this moderate heritability, the variability in serum urate explained by genetic models remains limited, reaching up to 17% with SNPs at 183 loci [17]. In contrast, non-genetic models can explain a higher proportion of the variability in serum urate and predict the risk of gout more accurately than traditional genetic risk models. In a study using UK Biobank data, the genetic risk model was a significantly weaker predictor (area under the receiver operating characteristic curve (AUROC) = 0.67) than a demographic model based on age and sex (AUROC = 0.84) [17]. Nevertheless, elucidating genetic correlations may explain the shared biology of traits or diseases. Furthermore, quantitative genetic approaches can deconstruct the bivariate association patterns observed at the phenotypic level into genetic and environmental components. This framework has been used to disentangle the genetic and environmental causes of the phenotypic correlations between serum urate and traits representing clinically important comorbidities, such as serum creatinine, blood pressure, blood glucose, and BMI [18]. Data from two independent datasets indicated genetic overlap between urate and creatinine and between urate and the metabolic syndrome, supporting the notion that genetics may partly explain the clustering of gout, chronic kidney disease, and other metabolic comorbidities [18].
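For readers less familiar with this framework, the decomposition can be summarized with the standard bivariate quantitative-genetics identity shown below. This is a generic textbook expression rather than a formula taken from the cited studies, with h² denoting heritability and r_G and r_E the genetic and environmental correlations.

```latex
% Generic decomposition of an observed phenotypic correlation r_P between
% two traits x and y (e.g., serum urate and BMI) into genetic and
% environmental components; h_x^2 and h_y^2 are the trait heritabilities.
\[
r_P = \sqrt{h_x^{2}\, h_y^{2}}\; r_G + \sqrt{\left(1 - h_x^{2}\right)\left(1 - h_y^{2}\right)}\; r_E
\]
```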
Multiple major epidemiological studies have repeatedly shown a strong association between serum urate levels and multiple cardiometabolic traits and other cardiovascular risk factors [19,20,21]. Using National Health and Nutrition Examination Survey data, the conditional probability of obesity given hyperuricemia is between 0.4 and 0.5 [22]. Similarly, the prevalence of hypertension, diabetes, and chronic kidney disease increases in proportion to uric acid levels, and the same risk factors are more common in patients with gout than in those without gout [23]. Given this robust genetic underpinning, the genetic correlation between urate levels and complex diseases has been evaluated [17,18]. Tin et al. assessed the genetic correlations between urate and 748 complex traits using cross-trait linkage disequilibrium score regression. Serum urate levels were significantly genetically correlated with 214 traits and conditions. The highest positive genetic correlation was with gout, followed by the metabolic syndrome [17]. The largest negative genetic correlations were with high-density lipoprotein cholesterol and estimated glomerular filtration rate, consistent with the observational associations from epidemiological studies. The same group also assessed whether these genetic correlations reflect a causal or pleiotropic relationship. To this end, the genetic causality proportion (GCP) of seven common cardiometabolic traits was examined, using the genetic causality of urate on gout as a positive control. Among the seven examined phenotypes, the largest GCP was observed for adiposity-related phenotypes (e.g., waist circumference) [17]. The directionality of effect (high urate causing obesity versus obesity causing high urate) has been the focus of Mendelian randomization studies, which suggest a causal effect of obesity on urate levels [24]. This directionality is consistent with the hypothesis that increased waist circumference, a major indicator of obesity, represents a large reservoir of purines, the primary precursor of uric acid [17].
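For orientation only, Mendelian randomization analyses of this type typically estimate the causal effect from variant-level summary statistics. The sketch below shows the generic Wald ratio and inverse-variance-weighted (IVW) estimators under standard assumptions; it is not the exact estimator reported in the cited study.

```latex
% Wald ratio for variant j, where beta_{X,j} is the variant-exposure effect
% (e.g., on BMI) and beta_{Y,j} is the variant-outcome effect (e.g., on urate):
\[
\hat{\beta}_j = \frac{\beta_{Y,j}}{\beta_{X,j}}
\]
% Inverse-variance-weighted estimate across J independent variants,
% with weights w_j = \beta_{X,j}^{2} / \mathrm{se}^{2}(\beta_{Y,j}):
\[
\hat{\beta}_{\mathrm{IVW}} = \frac{\sum_{j=1}^{J} w_j\, \hat{\beta}_j}{\sum_{j=1}^{J} w_j}
\]
```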
Regulating blood pressure: The role of salt and uric acid
A direct relationship between salt intake and blood pressure (BP) has been established. Excessive sodium intake has been shown to raise blood pressure and promote the onset of hypertension, and increased sodium consumption is associated with greater water retention, peripheral vascular resistance, oxidative stress, and microvascular remodeling [30]. On the basis of this growing evidence, salt restriction has become a pillar of hypertension management and patient counseling. However, the BP response to changing salt intake displays marked inter-individual variability, and salt sensitivity therefore behaves as a continuous trait at the population level. This variability in BP could be modulated by other biochemical factors that exert varying effects on blood pressure, such as potassium and uric acid.
Although it is a naturally occurring antioxidant, uric acid has been strongly implicated both in the modern hypertension epidemic and in the maintenance of blood pressure in early hominoids. Fossil evidence suggests that early hominoids relied heavily on fruit for their dietary needs, and with a fruit-based diet, sodium intake during the Miocene period would have been too low to maintain blood pressure. To evaluate this hypothesis, mild hyperuricemia was induced in rats by inhibiting uricase, and prehistoric conditions were recreated by feeding the rats a low-sodium diet. Normal rats receiving a low-sodium diet had either no change or a drop in systolic blood pressure. In contrast, systolic blood pressure increased in hyperuricemic rats in proportion to uric acid levels, and the increase could be mitigated with allopurinol [31,32]. It was therefore presumed that increased uric acid levels may have helped maintain blood pressure during periods of low sodium intake.
Uric acid levels may also contribute to the salt sensitivity associated with the development of hypertension. In one study, rats treated with oxonic acid and fed a low-salt diet for seven weeks developed mild hyperuricemia compared with controls. Following discontinuation of oxonic acid, uric acid levels no longer differed between groups. However, renal tissue obtained from the previously hyperuricemic rats showed persistent arteriolopathy with decreased lumen diameter and mild interstitial inflammation. When rats were then randomized to a low- or high-salt diet, an increase in blood pressure on the high-salt diet was observed only in rats that had previously been hyperuricemic [3]. Taken together, these findings suggest that salt sensitivity-induced hypertension could be mediated by uric acid levels.
Hyperuricemia and hypertension: A cause or effect?
Inflammation remains the principal suspect linking gout and cardiovascular diseases. The mechanisms whereby serum urate could contribute to the development of hypertension relate partly to uric acid's primary effects on the kidney, including activation of the renin-angiotensin-aldosterone system (RAAS) and deposition of urate crystals in the urinary lumen. Additional evidence suggests that elevated serum uric acid can decrease nitric oxide production, causing endothelial injury and dysfunction. It is well established that monosodium urate crystals can activate the NLRP3 inflammasome cascade and the production of inflammatory cytokines such as IL-1 beta and IL-8 [33]. Although uric acid is predominantly eliminated through the kidney, newer evidence indicates that urate crystals can deposit in the coronary arteries of patients with gout [34]. Deposition of urate crystals in the aorta and major arteries was associated with higher coronary calcium scores and can trigger an inflammatory response similar to that observed in the kidney. These observations support the hypothesis that, beyond the joints, high serum urate can deposit in soft tissues and extra-articular regions, where it can act as a pro-oxidant, increasing the inflammatory burden and contributing to the multiple comorbidities observed in patients with gout or hyperuricemia [35].
Reverse causality between hypertension and hyperuricemia is also possible. Patients treated for hypertension can develop hyperuricemia secondary to decreased renal blood flow, increased BMI, and concomitant use of medications that reduce uric acid excretion or increase uric acid production. Another hypothesis proposes that, given its route of elimination, uric acid can directly cause renal injury, which in turn activates the RAAS and causes hypertension [33].
Hypertension is an epidemic affecting more than 30% of adults worldwide [36]. While genetics may contribute to the development of hypertension, its contribution appears limited, suggesting that the external environment plays a larger role. Moreover, a growing body of evidence has repeatedly reported an association between elevated urate levels (hyperuricemia) and hypertension. For example, a cross-sectional study determined that each 1 mg/dL increase in serum urate was associated with a 20% increase in the prevalence of hypertension in a general population not treated for hyperuricemia or hypertension [37]. A retrospective study also found that hyperuricemia was independently associated with the development of hypertension, with hazard ratios (95% confidence intervals) of 1.37 (1.19-1.58) in men and 1.54 (1.14-2.06) in women [38].
The incidence and prevalence of hypertension and obesity were minimal in specific populations until Westernization and the adoption of a Western diet [39,40]. Additionally, immigration of select population subgroups, such as Filipinos and Japanese, has been linked to earlier gout onset, higher rates of gout and hypertension, and higher mean urate levels than among residents of their home countries [15,40,41,42,43,44]. These observations support the hypothesis that the shift from a traditional diet to a Westernized diet rich in fatty meats may be responsible for the increased prevalence of gout, higher serum urate levels, and increased incidence of hypertension and diabetes.
The mechanism underlying the association between Westernization and the development of hypertension is complex. The most likely driver, however, is acculturation and assimilation to the Western world, which brings multiple changes in dietary habits and lifestyle factors [39]. For instance, foods with high sodium and low potassium content are likely to increase blood pressure, and a sedentary lifestyle with low energy expenditure could further aggravate blood pressure on top of dietary changes.
Uric acid and neurodegenerative diseases: The antioxidant hypothesis
An older hypothesis holds that the chemical structure of uric acid, which resembles that of brain stimulants such as caffeine and other methylxanthines, may have provided selective advantages to early hominoids [45]. Under this hypothesis, the loss of uricase activity in early hominoids increased uric acid levels in the brain, which could have contributed to gains in quantitative and qualitative intelligence. Indeed, limited studies have suggested that plasma urate levels may be associated with higher intelligence [45,46,47]. Nonetheless, intelligence is a complex, multifactorial trait involving both environmental and physiological factors.
Multiple epidemiological studies have examined the relationship between serum urate levels, gout diagnosis, and brain-related diseases [
48,
49]. Specifically, the associations between urate levels or gout and Alzheimer’s disease (AD) and Parkinson’s disease (PD) have been a focal point in recent studies [
50,
51,
52]. However, the results of these studies have been conflicting or have shown no association [53,54]. These inconclusive findings have been attributed in part to study design, failure to account for confounders of uric acid levels, and the varying degrees of disease progression among study subjects [55]. While the transport of uric acid into the brain remains incompletely understood, uric acid is produced in the human brain [51]. Urate production in the brain has prompted a great deal of research into whether uric acid exerts antioxidant activity that protects the brain from neurodegenerative disease by mitigating the burden of oxidative stress. Uric acid can function as an antioxidant and scavenge peroxynitrite, which could be beneficial in slowing the progression of PD and multiple sclerosis (MS) [56,57,58]. Lower uric acid levels have been linked with a greater risk of developing PD, greater severity of motor features, and faster progression of both motor and non-motor features [51]. This observation led to the hypothesis that raising uric acid may provide a neuroprotective effect in patients with PD [59]. However, the resulting trial failed to show any benefit from raising uric acid, suggesting that PD itself may lower uric acid or that uric acid is a biomarker of other processes relevant to PD [54].
To assess causality between uric acid and PD, a Mendelian randomization approach was used to evaluate whether inherently lower urate levels increase the risk of disability requiring dopaminergic treatment in patients with early PD. After 808 patients were genotyped at three loci across SLC2A9 (rs6855911, rs7442295, rs16890979), cumulative scores were created based on the total number of minor alleles across the three loci [60]. Serum urate levels were 0.69 mg/dL lower among individuals with ≥4 SLC2A9 minor alleles than among those with ≤2 minor alleles [60]. A 5% increase in the risk of PD progression was observed for each 0.5 mg/dL decrease in serum urate. However, the hazard ratio for progression to disability requiring dopaminergic treatment, although it increased with the SLC2A9 score, was not statistically significant (p=0.056) [60].
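To make the scoring scheme concrete, the minimal sketch below reproduces the cumulative minor-allele score described above on a hypothetical data frame; the genotype codings, urate values, and column names are illustrative assumptions, not data from the cited study.

```python
import pandas as pd

# Minimal sketch of the cumulative SLC2A9 minor-allele score described above.
# Genotypes are coded as minor-allele counts (0, 1, 2) per SNP; the data and
# urate values below are hypothetical, not taken from the cited study [60].
snps = ["rs6855911", "rs7442295", "rs16890979"]

df = pd.DataFrame({
    "rs6855911":         [0, 1, 2, 1, 0],
    "rs7442295":         [1, 2, 2, 0, 0],
    "rs16890979":        [0, 1, 2, 1, 1],
    "serum_urate_mg_dl": [6.1, 5.4, 4.9, 5.8, 6.3],
})

# Cumulative score = total number of minor alleles across the three loci (0-6).
df["slc2a9_score"] = df[snps].sum(axis=1)

# Contrast the extreme groups compared in the study: >=4 vs <=2 minor alleles.
high = df.loc[df["slc2a9_score"] >= 4, "serum_urate_mg_dl"]
low = df.loc[df["slc2a9_score"] <= 2, "serum_urate_mg_dl"]
print(f"Mean urate, >=4 minus <=2 minor alleles: {high.mean() - low.mean():.2f} mg/dL")
```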
Another study evaluated the association between a single nucleotide polymorphism (rs1014290) within SLC2A9 and PD in a Han Chinese population. In this case-control study, serum urate levels were significantly lower in PD patients than in controls. Individuals with the rs1014290 TT and CT genotypes had significantly higher serum urate levels than those with the CC genotype [61]. The allele and genotype associated with lower serum urate (C and CC) were significantly more frequent in PD patients than in controls, and individuals with the CC genotype had significantly higher odds of PD than those with the TT or TC genotype. Collectively, these data raise the possibility that modulating the SLC2A9-encoded transporter (GLUT-9) might be an alternative to administering urate precursors as a urate-elevating strategy for treating PD.
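As a simple illustration of the case-control contrast described above, the following sketch computes an allelic odds ratio with a Woolf-type confidence interval; all counts are hypothetical and are not taken from the cited study.

```python
import math

# Illustrative allelic odds-ratio calculation for a case-control contrast such as
# the rs1014290 C-vs-T comparison described above. All counts are hypothetical.
case_c, case_t = 620, 380   # C and T allele counts among PD cases (assumed)
ctrl_c, ctrl_t = 520, 480   # C and T allele counts among controls (assumed)

odds_ratio = (case_c * ctrl_t) / (case_t * ctrl_c)

# Woolf's method: approximate 95% confidence interval on the log odds ratio.
se_log_or = math.sqrt(1 / case_c + 1 / case_t + 1 / ctrl_c + 1 / ctrl_t)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"Allelic OR (C vs T) = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```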
The role of uric acid levels in mild cognitive impairment (MCI), a common diagnosis that precedes the development of clinical AD, has been extensively investigated. Because asymptomatic hyperuricemia often goes undiagnosed and is rarely treated, studies of the association between hyperuricemia and neurodegenerative diseases may be confounded by recall bias, misdiagnosis, and secondary causes of hyperuricemia. Nonetheless, patients with gout are considered to carry the highest uric acid burden and represent the extreme phenotype of hyperuricemia. Therefore, the association between a gout diagnosis and the risk of neurodegenerative diseases has been of significant interest in neurodegenerative disease research.
Interest has also emerged in investigating the association between serum urate levels and MCI. Because MCI represents a transitional stage between cognitive decline and a clinical diagnosis of AD, interventions targeting MCI offer a window of opportunity to arrest or slow disease progression [62]. To this end, the neuroprotective effect of uric acid on MCI has been evaluated in multiple cross-sectional studies [63,64,65,66,67]. While the results of some of these studies appear conflicting, they highlight the double-edged nature of uric acid and the potential for other confounders to affect urate levels [68].
Hyperuricemia is independently associated with multiple cardiovascular risk factors, which can collectively elevate the risk of cerebrovascular events. Nonetheless, evidence for a direct role of uric acid levels in the development of stroke remains inconclusive [69,70]. Conversely, increased uric acid levels during acute ischemic stroke may be associated with better functional outcomes than in controls, supporting the antioxidant effects of uric acid [71,72].
Multiple studies have shown that subjects with AD have lower serum urate levels than matched controls. These findings have ushered in the hypothesis that uric acid is a neuroprotective compound and potentially a therapeutic target in neurodegenerative diseases. However, the lower brain urate levels seen in AD patients relative to matched controls could reflect reduced metabolic activity leading to reduced uric acid production. In this regard, urate levels may serve as a proxy for overall brain metabolic activity rather than a driver of the disease. Additionally, overall nutritional status and BMI are among the strongest predictors of serum urate, and the clinical manifestations of AD are often preceded by substantial weight loss, which may largely account for the lower serum urate levels observed at presentation.
In general, patients with hyperuricemia tend to have a higher cardiovascular disease burden, warranting concomitant use of various medications that may themselves affect serum urate levels. Serum urate levels are also influenced by sex, obesity, genetics, and overall nutrition. Therefore, a single uric acid measurement may not accurately reflect an individual's inherent urate level, limiting the robustness of any association drawn between serum urate and the condition of interest, particularly given the cross-sectional design of these studies. While uric acid may serve as a potent antioxidant, robust study designs are needed to identify the optimal urate levels that maximize benefit and minimize risk.
Uric acid: A therapeutic target or disease bystander?
Uric acid levels are the strongest predictor of developing gout, which occurs via the deposition of monosodium urate crystals in and around the joints. Chronic suppression of uric acid levels is the cornerstone of reducing the MSU crystal burden and preventing future gout flares. While different treatment modalities have been developed to lower serum urate, most focus on inhibiting uric acid production. In advanced forms of gout, recombinant uricase has been used to reduce the tophus burden or when patients do not respond to available treatments. In addition to gout, high uric acid levels have been implicated in various chronic disease states, including hypertension, metabolic syndrome, diabetes, non-alcoholic fatty liver disease, and chronic kidney disease [20,27,77,78].
While multiple experimental and clinical studies support the role of uric acid as an independent cardiovascular risk factor, several studies have also suggested that lowering serum urate may improve blood pressure. These studies included prehypertensive obese individuals [79], hypertensive adolescents [80], hypertensive children receiving an angiotensin-converting enzyme inhibitor [81], and adults with asymptomatic hyperuricemia [82,83].
A growing body of evidence also suggests that urate-lowering therapy does not affect blood pressure. A single-center crossover study enrolled 99 participants randomized to allopurinol 300 mg or placebo for four weeks [84]. There was no difference between groups in the change in blood pressure; however, the allopurinol group had a decrease in uric acid and improved endothelial function, estimated as flow-mediated dilation [84]. Of note, the study participants were relatively young (28 years old), with relatively normal systolic blood pressure (123.6 mm Hg) and a relatively normal baseline uric acid level (5.9 mg/dL) [84].
In a double-blind placebo-controlled trial, overweight or obese participants (n=149) with serum uric acid ≥5.0 mg/dL were randomly assigned to probenecid, allopurinol, or placebo [
85]. The primary endpoints were kidney-specific and systemic renin-angiotensin system (RAS) activity; secondary endpoints included mean 24-hour systolic blood pressure, mean awake and asleep blood pressure, and nocturnal dipping. The trial found that, after eight weeks, uric acid lowering affected neither kidney-specific nor systemic RAS activity nor mean systolic blood pressure [85].
Because gout and hyperuricemia are associated with increased sterile inflammation, an earlier report suggested that the use of allopurinol in patients with hyperuricemia can significantly reduce major inflammatory biomarkers [82]. More recent evidence suggested that urate-lowering therapy, mainly allopurinol, was significantly associated with lower hs-CRP, LDL, and total cholesterol levels in patients with gout compared with those not receiving urate-lowering therapy [86]. These results suggest that patients optimally treated for gout may garner benefits above and beyond a reduced uric acid burden. Although urate-lowering therapy was not associated with lower blood pressure, a higher rate of chronic kidney disease diagnoses was observed among patients receiving allopurinol [86]. Overall, the study suggested that uric acid may be a remediable risk factor for lowering the atherogenic disease burden in patients with gout [86].
Inflammatory reactions are thought to be crucial contributors to neuronal damage in neurodegenerative disorders such as AD, PD, MS, and amyotrophic lateral sclerosis (ALS) [
57,
58]. Among the toxic agents released in brain tissues by activated cells is peroxynitrite, the product of the reaction between nitric oxide and superoxide [
58]. Because urate is a potential peroxynitrite scavenger, raising urate levels has recently been explored in the management of neurodegenerative disease [56]. Leveraging the neuroprotective hypothesis of uric acid, administration of the uric acid precursor inosine has been evaluated in prospective clinical trials.
In patients with early PD, inosine administration was safe, well tolerated, and effective in raising serum and cerebrospinal fluid urate. However, among patients recently diagnosed with PD, treatment with inosine, compared with placebo, did not produce a significant difference in the rate of clinical disease progression, calling into question the use of inosine as a treatment for early PD [59,87]. The safety and tolerability of elevating uric acid levels in patients with ALS have also been evaluated. In a 12-week study, inosine safely and effectively increased serum urate levels in patients with ALS [88]. Similarly, a 20-week study showed that inosine significantly increased uric acid without excess treatment-emergent adverse events compared with placebo; however, the study did not show functional benefits of inosine in ALS patients relative to placebo [89].
Despite the disappointing results of urate-elevating studies, it is critical to recognize that some of these studies lacked the sample size needed to detect meaningful differences. In addition, inosine may be problematic for achieving the desired serum urate levels because it can enter the purine salvage pathway. While addressing the sample size issue in future studies is straightforward, the mechanism used to elevate uric acid requires careful consideration and may need to target uric acid excretion transporters instead.
Future Perspectives
The loss-of-function mutation in the uricase gene may have been protective in situations of famine and food scarcity; thus, it rapidly spread through the ancestral population, likely driven by harsh environmental conditions during the Miocene era. In modern times, this adaptation may be contributing to the prevalence of hyperuricemia, increasing the risk of gout and other cardiometabolic disorders. Today, all humans are, in effect, uricase knockouts, and the loss of uricase has left us unable to regulate uric acid effectively. Adaptation to famine and food scarcity in early hominoids dictated crucial changes to obtain and conserve energy for survival. In contrast to the findings of animal experiments and observational studies, however, the evidence for targeting uric acid levels therapeutically beyond gout remains conflicting.
As hyperuricemia and gout animal models continue to be instrumental in advancing the study of gout and uric acid metabolism, it is equally important to recognize that animal models, which retain uricase, are inherently predisposed to eliminating uric acid. The effects of raising uric acid levels in animal models may therefore not reflect the outcomes seen in human studies. Additionally, the effect of sex on uric acid levels reinforces the need for adequate and robust sex-stratified analyses; studies have shown that, after menopause, women are more sensitive than men to the cardiometabolic effects of increased uric acid. The relationship between uric acid levels and outcomes may also follow a J-shaped curve, so careful study design and a robust rationale for classifying or categorizing uric acid levels are needed to minimize the influence of extreme values.
While genetics plays a significant role in the development of hyperuricemia and gout, it is equally important to recognize that the same genes encode therapeutic targets for modifying uric acid. Genetic polymorphisms have been implicated in racial differences in gout prevalence and may also contribute to the heterogeneity of response to urate-lowering or urate-elevating therapies. Therefore, genetic investigation of uric acid metabolism should be considered when designing clinical trials. Finally, as described earlier, uric acid levels are the product of both endogenous cellular processes and external dietary sources; accounting for dietary and lifestyle habits may therefore minimize the effect of confounders in such studies.