It is a clinically defined nosological entity with multi-factorial but poorly understood etiologic mechanisms

CBSST appears to be an effective intervention to address these concerns; it requires minimal resources and a relatively brief treatment interval, making it well suited for adaptation to a variety of clinical settings. Future studies will compare CBSST to standard outpatient care with a focus on additional outcomes, including quality of life and healthcare utilization.

Bipolar disorder is a severe, chronic, and disabling mental illness characterized by recurrent episodes of hypomania or mania and depression. Twin, family, and adoption studies provide compelling evidence for a strong genetic predisposition to BPD, with heritability estimated to be as high as ≥80%. Because BPD is a heterogeneous disease with substantial phenotypic and genetic complexity, the identification of BPD risk loci has proven difficult. Some researchers have proposed that dissecting BPD into clinical subgroups with distinct sub-phenotypes may yield genetically homogeneous cohorts and thereby facilitate the mapping of BPD susceptibility genes. Among these subphenotypes, early-onset BPD is of particular interest, as several independent cohort studies have demonstrated its existence. Compared with non-early-onset BPD, the early-onset subtype is associated with a more severe clinical presentation characterized by frequent psychotic features, more mixed episodes, greater psychiatric comorbidity such as drug and alcohol abuse and anxiety disorders, higher risk of suicide attempt, worse cognitive performance, and poorer response to prophylactic lithium treatment. In addition, the pattern of disease inheritance appears to differ between early- and late-onset BPD families, with the former showing greater heritability. These observations indicate that early-onset BPD may be a genetically homogeneous subset and thus could be used in genetic studies to identify its susceptibility genes.

A number of BPD genes identified by genome-wide association study (GWAS) have been widely replicated and intensively studied, but these studies did not include early-onset BPD. Over the past two decades, a host of studies have investigated genetic loci responsible for early-onset BPD through linkage analyses, candidate-gene association studies, analyses of copy number variants (CNVs), and GWAS, but findings remain inconclusive. Candidate-gene association studies have identified a number of genes potentially associated with early-onset BPD, including the glycogen synthase kinase 3-β gene, the circadian clock gene Per3, the serotonin transporter gene, the brain-derived neurotrophic factor gene, and the gene encoding the synaptosomal-associated protein SNAP25. However, very few of these positive findings have been replicated independently. Findings from linkage studies suggested chromosomal regions harboring susceptibility genes at 3p14 and 21q22, plus loci at 18p11, 6q25, 9q34, and 20q11 with nominal significance. Studies of CNVs in early-onset BPD reported relatively small effects that were not reproducible, suggesting that CNVs are unlikely to be a major source of liability. Finally, GWAS have failed to find any risk variant with genome-wide statistical significance in Caucasian populations, although some variants showed suggestive significance. In previous genetic studies, the definition of early onset in BPD typically ranged from 15 to 25 years of age. These association studies were largely conducted with small sample sizes and were underpowered. Most of them compared early-onset BPD with healthy controls. Such a case–control design is more likely to identify susceptibility genes for BPD per se, rather than for the early-onset subtype. The optimal strategy to identify genes for early-onset BPD is to include a non-early-onset BPD group for comparison. In this paper, we report findings from a GWAS with high-density SNP chips on early-onset (≤20 years of age) BPI patients of Han Taiwanese descent.

The clinical phenotype assessment of manic and depressive episodes was performed by well-trained psychiatric nurses and psychiatrists using a cross-culturally validated and reliable Chinese version of the Schedules for Clinical Assessment in Neuropsychiatry, supplemented by available medical records. All patients were diagnosed according to the DSM-IV criteria for BPI disorder with recurrent episodes of mania with or without depressive episodes. The assessment of onset age was based on a life chart providing a graphic depiction of the lifetime clinical course for each BPI patient recruited. This life chart included essentially all mood episodes, with date of onset, duration, and severity. The construction of this life chart was based on integrated information gathered from direct interviews with patients and their family members, interviews with the psychiatrists in charge, and a thorough medical chart review. Different definitions for early onset of BPI have been proposed in previous work. Our selection of 20 years as the age threshold was based on a systematic review of pediatric BPD. The age at onset was defined by the first mood episode. Of all patients, 1306 with genotyping data available were included in the discovery group for the GWAS, and the remaining 473 without genotyping data were included in the replication group.

In this paper, we have reported one of the largest GWASs to investigate genetic susceptibility to early-onset BPI, with the first mood episode occurring at ≤20 years of age. We employed a standardized psychiatric interview and constructed a life chart with a detailed clinical history to ensure accurate and homogeneous phenotyping for genotyping. Our GWAS with high-density SNP chips identified the SNP rs11127876 in the CADM2 gene as associated with early-onset BPI in both the discovery and replication groups, and a meta-analysis of the association was close to genome-wide significance.
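
The meta-analysis combining the discovery and replication results is only summarized above. As a rough illustration of the standard approach, the sketch below performs a fixed-effect, inverse-variance meta-analysis of per-cohort log odds ratios; the beta and standard-error values are hypothetical placeholders, not the study's actual estimates for rs11127876.

```python
# Illustrative fixed-effect inverse-variance meta-analysis of a SNP association
# across a discovery and a replication cohort. The log-OR and SE values below are
# placeholders, NOT the study's actual estimates for rs11127876.
import numpy as np
from scipy.stats import norm

betas = np.array([0.55, 0.48])   # hypothetical log odds ratios: discovery, replication
ses = np.array([0.11, 0.19])     # hypothetical standard errors

weights = 1.0 / ses**2                       # inverse-variance weights
beta_meta = np.sum(weights * betas) / np.sum(weights)
se_meta = np.sqrt(1.0 / np.sum(weights))
z = beta_meta / se_meta
p = 2 * norm.sf(abs(z))                      # two-sided p-value

print(f"meta OR = {np.exp(beta_meta):.2f}, p = {p:.2e}")
print("genome-wide significant" if p < 5e-8 else "not genome-wide significant")
```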

The gene CADM2 on chromosome 3 encodes a synaptic cell adhesion molecule that is prominently expressed in neurons and plays key roles in the development, differentiation, and maintenance of synaptic circuitry in the central nervous system. In previous GWASs, CADM2 has been associated with a number of mental-health-related traits, including alcohol consumption, cannabis use, reduced anxiety, neuroticism and conscientiousness, and increased risk-taking behavior. CADM2 has also been reported to be associated with executive functioning and processing speed, general cognitive function, and educational attainment. Though there have been no investigations examining the risk-taking phenotype in early-onset relative to non-early-onset BPD, Homes et al. showed that BPD patients with a past history of alcohol abuse or dependence had a higher risk-taking propensity, suggesting a relationship between early-onset BPD and risk-taking propensity. Of note, Morris et al. suggested that CADM2 variants may not only be linked with psychological traits but also influence metabolic traits, such as body mass index, blood pressure, and C-reactive protein. In addition, they found that CADM2 variants had genotype-specific effects on CADM2 expression levels in adult brain and adipose tissues. This finding highlights the potential pleiotropy of the CADM2 gene, i.e., its variants may influence multiple traits, and shared biological mechanisms across brain and adipose tissues through the regulation of CADM2 expression. Given that metabolic comorbidities are prevalent in patients with early-onset BPD, it is conceivable that CADM2 variants may influence both psychological and physical traits, further contributing to a more severe clinical subtype of BPD with an accompanying risk of metabolic adversities. An association between risk-taking and obesity has also been implicated in previous research, which suggests that risk takers tend to overlook health-related outcomes and are prone to aberrant reward circuitry predisposing them to poor dietary choices and excessive intake. Collectively, in line with the characteristics found to be associated with CADM2 variants, it is likely that CADM2 exerts an effect on the constellation of clinical features related to early-onset BPD with greater symptom severity. Therefore, our findings suggest that CADM2 genetic variants may have significant effects on the early-onset subtype of BPD. Two previous GWASs comparing early-onset BPD patients with healthy controls did not find any genetic variants reaching genome-wide significance. Our study included a larger sample of early-onset BPI patients to conduct a GWAS using high-density genotyping. Statistical power was calculated using the Post-hoc Power Calculator, combining the allelic frequencies of both the discovery and replication groups. In this study of two independent samples of BPI with a dichotomous endpoint, the power reached 99.4% and 18.2% under a type I error of 0.05 and of 5 × 10⁻⁸, respectively. Our results are therefore likely underpowered at the stringent genome-wide type I error threshold. However, the frequency of the risk allele T was higher in patients with onset ≤20 than in patients with onset >20 in both the discovery and replication groups. We believe these observations together provide strong evidence for the association of this SNP with early-onset BPD. In Table 2, the minor allele frequencies differ considerably between the discovery and replication cohorts.
In the NCBI SNP database, the minor allele frequency of rs11127876 is 0.08 in Han Chinese in Beijing, close to our results, suggesting that the different allele frequencies observed in Table 2 mainly result from our sampling. The over-represented minor allele frequency in the replication group may reflect random sampling variation or hidden characteristics of the patients we recruited. Genetic variants of CADM2 have been reported to be associated with behavioral and metabolic traits, which were not assessed in this study.
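
For readers who want to see what the post-hoc power figures quoted above correspond to, here is a minimal sketch of a two-proportion power calculation comparing risk-allele frequencies between onset groups. The allele frequencies, sample sizes, and the normal-approximation formula are illustrative assumptions, not the study's actual inputs or tool.

```python
# Sketch of a post-hoc power calculation for detecting a difference in risk-allele
# frequency between early-onset and non-early-onset groups (two-proportion z-test).
# The allele frequencies and allele counts below are hypothetical placeholders, not
# the study's actual Table 2 values.
import numpy as np
from scipy.stats import norm

def two_proportion_power(p1, p2, n1, n2, alpha):
    """Approximate power of a two-sided two-proportion z-test."""
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.cdf(abs(p1 - p2) / se - z_crit)

# hypothetical risk-allele frequencies (early- vs. later-onset) and allele counts
p_early, p_late = 0.12, 0.07
n_early, n_late = 2 * 400, 2 * 900   # two alleles per subject

for alpha in (0.05, 5e-8):
    power = two_proportion_power(p_early, p_late, n_early, n_late, alpha)
    print(f"alpha = {alpha:g}: power = {power:.3f}")
```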

Though the minor allele frequencies of rs11127876 differed between the discovery and replication groups, the consistent direction of the ORs for the rs11127876 minor allele supports the reliability of our findings. The SNP rs75928006, located upstream of MIR522, reached genome-wide significance in the discovery group but failed to show statistical significance in the replication group. MIR522 promotes glioblastoma cell proliferation, but there is no evidence to suggest its association with any psychiatric disorder. One major limitation of this study is the possibility of recall bias regarding the exact onset age of the first mood episode of BPI, particularly when there was a long history of illness. Previous studies have, however, suggested that age at onset can be assessed reliably. The preparation of a life chart containing the detailed clinical course and treatment, based on a semi-structured clinical interview plus a thorough medical chart review for each patient, should have satisfactorily overcome this potential limitation. In summary, we have identified a genetic locus, rs11127876 in the CADM2 gene, associated with early-onset BPI. The finding reflects shared genetic features between psychiatric disorders and behavioral traits. Further investigations of CADM2 biological function in BPI are warranted.

The misuse of opiates is a serious problem worldwide, is increasing in young adults, and has substantial individual and societal consequences. In 2014 in the United States alone, approximately 1.9 million people had an opiate use disorder, including 586,000 heroin users. Neuroimaging in opiate dependence indicates both altered brain structure, particularly in the anterior cingulate cortex (ACC), and altered brain function involving the dorsolateral prefrontal cortex (DLPFC) and ACC. Magnetic resonance spectroscopy (MRS) allows the non-invasive quantitation of brain metabolites that provide information on the neurophysiologic integrity of brain tissue. The few 1H MRS studies in opiate dependence to date revealed lower concentrations of N-acetylaspartate, a marker of neuronal integrity, in the medial frontal cortex, including the ACC, as well as lower glutamate, a primary excitatory neurotransmitter, or glutamate+glutamine concentrations in some but not all studies. The discrepant MRS findings may relate to differences among study participants regarding the prevalence and severity of comorbid substance use, the type, dose, and duration of replacement therapy for heroin users, and/or participant age. The ACC, DLPFC, and orbitofrontal cortex (OFC) are important components of the brain reward/executive oversight system, a neural network critically involved in the development and maintenance of addictive disorders. Structural brain imaging in opiate dependence revealed generally lower gray matter volume or density in frontal regions, including the DLPFC, with thinner frontal cortices related to longer duration of opiate misuse. Functional MR imaging showed that the DLPFC, OFC, and ACC are involved in decision making, and in opiate-dependent individuals, lower task-based fMRI activity in the ACC related to compromised behavioural control of cognitive interference. Furthermore, smaller frontal gray matter volume in opiate dependence related to higher impulsivity on the Barratt Impulsivity Scale. Correspondingly, opiate dependence is associated with cognitive deficits, primarily in executive functioning and self-regulation.
Thus, the neuroimaging literature in opiate dependence suggests altered frontal brain structure as well as compromised neuronal integrity and glutamatergic metabolism. Few studies, however, have investigated their relationships to opioid and other substance use behaviour or cognition. Further research into specific regional brain effects and their potential cognitive and behavioural correlates may inform better-targeted treatment of individuals with opioid use disorders.


The results of our analysis underscore the importance of frequent STI screening among PWH

Approximately 23% reported using alcohol and drugs prior to sex in the last six months. Participants who had a partner on PrEP reported a higher number of sex partners in the last six months than those with only HIV-positive partners and those with HIV-negative partners not on PrEP. However, condomless sex did not differ statistically across the three partner groups, with a prevalence of condomless sex in the last six months of 13% among PWH who had a partner on PrEP, 19% among those with only HIV-positive partners, and 16% among PWH without PrEP partners.

This study examined the association of alcohol and drug use and partner use of PrEP with STI risk among PWH with a history of unhealthy drinking. Results indicated that, although PWH in this cohort have reduced HIV transmission risk based on their viral suppression and/or the use of PrEP by partners, the risk of STI transmission remains a concern. We found that participants who had a partner on PrEP had nearly three times the prevalence of STIs compared with those who had HIV-negative partners not taking PrEP. Approximately 8% of the participants in our study had a positive STI test result during the study period. The prevalence we observed was slightly lower than recent estimates in the STD Surveillance Network. However, Lucar et al. found similar STI rates in their clinic-based cohort of PWH across multiple sites in the Washington DC metropolitan area. These findings should be interpreted in the context that overall STI incidence among MSM has been steadily increasing over the last two decades. In one study at a community health centre in Boston, the incidence of STIs among MSM increased from 4.6 to 26.8 per 100 person-years between 2005 and 2015. This trend is likely multi-factorial and partly due to a decline in condom use, enhanced STI testing, and a broader perception of HIV as a manageable illness. Some have proposed that the roll-out of PrEP has contributed to increases in condomless sex, thus driving incident STIs among MSM.

However, surveillance data have noted increases in STI prevalence in the general MSM population well before the widespread use of PrEP. In our analysis, we found no statistically significant differences in condom use between PWH who had a partner on PrEP and those who did not, suggesting a potential role for other behavioural factors, such as number of partners and alcohol and drug use. Sexual network characteristics may account for the associations we observed. PWH who had a partner on PrEP had four times the median number of sex partners in the last six months compared with others, suggesting a higher probability of STI exposure. Other network characteristics, such as background STI prevalence, rate of partner exchange, concurrency of sex partners, and network density (how interconnected individuals are in a sexual network), might also have influenced STI risk. As PWH who had a partner on PrEP were more sexually active than others in our cohort, they may also have been screened for STIs more frequently, leading to increased detection of asymptomatic infections. We also found that PWH who used alcohol and drugs prior to sex had a higher STI prevalence, although estimates were not statistically significant. Chemsex, or the use of sex-enhancing drugs such as amphetamines during sex, has become increasingly popular among MSM in industrialized countries and is a well-known driver of sexual risk-taking. The associations we observed were attenuated in adjusted analyses, but the direction of each association was consistent with findings from previous reports. For example, alcohol and drug use have been associated with condomless sex, impaired sexual decision-making, and having higher numbers of sexual partners. Our findings also support the need for ongoing discussions around sexual risk behaviours and STI risk reduction during routine clinic visits.

Given the cross-sectional design of our study, we cannot conclude that partner PrEP use is causally associated with STIs in PWH. However, this study has important implications for public health efforts. The prevalence of STIs among PWH who have partners on PrEP suggests that this subgroup may be a high-yield focus for targeted interventions. Along with efforts to increase STI screening, enhanced outreach that integrates HIV and STI care coordination, and novel strategies such as STI post-exposure prophylaxis, are needed. A key strength of our analysis is the use of a primary care-based cohort in an integrated healthcare system, which allowed us to link interview data with laboratory-confirmed STI test results. However, we acknowledge some important limitations. We were limited in our ability to characterize participants' sexual networks and to assess the temporality of exposure and outcome. It is conceivable that participants acquired an STI before any encounter with a partner on PrEP or acquired the infection from others in their sexual network. It is also possible that participants did not accurately report their partner's PrEP use or their condom use behaviours due to recall bias, social desirability, and/or misinformation. Some in our sample may also have received STI testing outside of the KPNC healthcare system, the results of which would not be captured in the electronic health record. All participants had a history of unhealthy alcohol use in the prior year, so findings may not reflect the experiences of other PWH. However, the prevalence of hazardous alcohol use in our cohort based on AUDIT scores was similar to that in other studies involving general PWH populations. Lastly, the majority of the participants in our study were MSM and all of them were insured; therefore, findings may not be generalizable to the broader population of PWH, particularly cisgender women, transgender people, and those without health insurance coverage.

Despite these limitations, this study provides important insights that can inform efforts to address the STI epidemic. However, prospective studies are needed to more clearly understand the relationship between partner PrEP use and STI incidence among PWH.

Health complications related to preterm birth may impose lifelong sequelae or death. In the United States, 17% to 34% of infant deaths within the first year of life are attributable to prematurity. Children born preterm are more likely to have vision or hearing loss, cerebral palsy, and physical or learning delays. The societal economic burden associated with preterm birth in the United States was estimated to be over $26 billion annually more than a decade ago. Years of study have identified numerous risk factors for preterm birth, including obesity, hypertension, diabetes, smoking, drug or alcohol dependence/abuse during pregnancy, and a short interval between pregnancies. Few protective factors against preterm birth have been identified, but they include maternal birth outside of the United States and an interpregnancy interval of 24 to 60 months. Identification of risk and protective factors has not decreased preterm birth rates in the United States; instead, rates have been trending upward. In an effort to improve infant health outcomes, there has been a recent upsurge in efforts to reduce preterm birth rates in the United States. This effort is challenging, due to the complex biology of preterm birth, its various clinical presentations, and socioeconomic and psychosocial influences. Because multi-pronged approaches are needed to decrease preterm birth rates, a collaborative place-based approach may be an effective way to decrease rates locally. A place-based approach is designed to take into account the unique local and contextual conditions of specific locations, engage a diverse range of sectors in a collaborative decision-making process, and leverage local talent, knowledge, and assets. By addressing drivers of preterm birth that may be more frequent in a given location, this method recognizes that one size may not fit all, either in terms of drivers or interventions. California reported a 2016 preterm birth rate of 8.5%, with the highest rate in Fresno County, located in the Central Valley region. Fresno County has just under one million residents, half of whom are Hispanic, and has the highest value of agricultural crops of any county in the United States. Fresno County reports the highest poverty rate in California, with 32.3% of families with children living below the poverty level, and is considered a Primary Care Health Professional Shortage Area. In this study we evaluated the influences of maternal characteristics and obstetric factors on timing of birth in Fresno County to evaluate both risk and the prevalence of risk by urban, suburban, and rural residence.

We aimed to identify risk and protective factors for birth before 37 weeks' gestation that can inform policy and health care priorities designed to reduce preterm birth rates in Fresno County. In this retrospective cohort study, our sample was drawn from California live births between January 1, 2007 and December 31, 2012. The sample was restricted to women with singleton births with a best obstetric estimate of gestation at delivery between 20 and 44 weeks, linked to the birth cohort database maintained by the California Office of Statewide Health Planning and Development, with no known chromosomal abnormalities or major structural birth defects, and a Fresno County census tract. The birth cohort database contained linked birth and death certificates, as well as detailed information on maternal and infant characteristics, hospital discharge diagnoses and procedures recorded as early as one year before delivery and as late as one year post-delivery. Data files provided diagnoses and procedure codes based on the International Classification of Diseases, 9th Revision, Clinical Modification (ICD-9-CM). Structural birth defects were considered "major" for this study if determined by clinical review to cause major morbidity and mortality that would likely be identified in the hospital at birth or lead to hospitalization during the first year of life. The sample of Fresno County women was stratified by residence in urban, suburban, and rural census tracts as defined by the Medical Service Study Areas (MSSAs). MSSAs "are recognized by the U.S. Health Resources and Services Administration, Bureau of Health Professions' Office of Shortage Designation as rational service areas for purposes of designating Health Professional Shortage Areas, and Medically Underserved Areas and Medically Underserved Populations". Within each of these residence strata, known maternal preterm birth risk factors were compared for women who delivered before 37 weeks' gestation with those of women who delivered between 37 and 44 completed weeks' gestation, using Poisson regression to calculate crude relative risks (RRs) and 95% confidence intervals (CIs). Comparisons using data from birth certificate records included race/ethnicity, maternal age, education, payment for delivery, participation in the Women, Infants, and Children program, parity, maternal birthplace, report of smoking during pregnancy, maternal body mass index, trimester when prenatal care began, and number of prenatal care visits. For multiparous women, we examined the relationship between preterm birth and previous preterm birth, previous cesarean delivery, and interpregnancy interval. Interpregnancy interval was calculated from the previous live birth as reported in linked records and estimated as months to conception of the index pregnancy. Given that the day of the previous live birth was not available, the middle of the month was used for calculation purposes. Factors from hospital discharge ICD-9 diagnoses included: preexisting hypertension without progression to preeclampsia, preexisting hypertension with progression to preeclampsia, gestational hypertension without progression to preeclampsia, gestational hypertension with progression to preeclampsia, preexisting diabetes, and gestational diabetes. We also compared preterm birth with respect to the frequency of coded infection, anemia, drug or alcohol dependence/abuse, and mental disorder.
Multivariable models of maternal risk and protective factors for preterm birth were built for each location-of-residence category using backwards-stepwise Poisson regression, wherein initial inclusion was determined by a threshold of p < .20 in crude analyses. Adjusted RRs and their 95% CIs were calculated for each residence stratum. To visualize overall risk of preterm birth by census tract, cumulative risk scores estimated the overall risk of preterm birth. Scores were calculated for each woman by summing the adjusted RR minus 1 for each of her risk factors and subtracting the corresponding value for each of her protective factors remaining in the final multivariable model. Risk scores were grouped into scores of 0.0 or less, 0.1 to 0.9, 1.0 to 1.9, 2.0 to 2.9, and 3.0 or more. Drug dependence/abuse and mental illnesses were further classified based on ICD-9 diagnostic codes, although risk calculations were not computed due to small numbers. Drug dependence/abuse was classified by drug: opioid, cocaine, cannabis, amphetamine, other drug dependence/abuse, and polysubstance dependence/abuse. Mental illnesses were further classified as: schizophrenic disorders, bipolar disorder, major depression, depressive disorder, anxiety disorders, personality disorders, and more than one of the previously mentioned categories. Infection was further classified as asymptomatic bacteriuria, urinary tract infection, sexually transmitted infection, and viral infection.
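
To make the modeling steps above concrete, here is a minimal sketch of how a Poisson GLM with robust standard errors can estimate relative risks for a binary outcome, and of how a cumulative risk score of the kind described could be assembled from the fitted RRs. The simulated data and covariate names (smoking, prior_preterm, foreign_born) are hypothetical stand-ins, not the study's Fresno County variables.

```python
# Minimal sketch of modified Poisson regression (Poisson GLM with robust standard
# errors) to estimate relative risks for a binary outcome, plus a cumulative risk
# score built from the fitted RRs. All data here are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "smoking": rng.binomial(1, 0.10, n),
    "prior_preterm": rng.binomial(1, 0.05, n),
    "foreign_born": rng.binomial(1, 0.30, n),   # illustrative protective factor
})
logit = -2.5 + 0.6 * df.smoking + 1.0 * df.prior_preterm - 0.3 * df.foreign_born
df["preterm"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.glm("preterm ~ smoking + prior_preterm + foreign_born",
              data=df, family=sm.families.Poisson()).fit(cov_type="HC1")

rr = np.exp(fit.params)                          # adjusted relative risks
print(pd.concat([rr.rename("RR"), np.exp(fit.conf_int())], axis=1))

# Cumulative risk score: add (RR - 1) for each risk factor a woman has; protective
# factors (RR < 1) contribute negative values. Then bin the scores as in the text.
contrib = rr.drop("Intercept") - 1.0
df["risk_score"] = df[contrib.index].to_numpy() @ contrib.to_numpy()
df["risk_group"] = pd.cut(df["risk_score"],
                          bins=[-np.inf, 0.0, 0.9, 1.9, 2.9, np.inf],
                          labels=["<=0.0", "0.1-0.9", "1.0-1.9", "2.0-2.9", ">=3.0"])
print(df["risk_group"].value_counts())
```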


Resting-state fMRI has become a popular tool to study how distant brain areas are functionally connected

Utilizing data from the IMAGEN study, which included longitudinal fMRI and measures of PS at follow-up, Papanstasiou et al. observed increases in right frontal activation during reward anticipation and win feedback from age 14 to 19 that were associated with PS at age 19; this increase over time was not observed in youth who did not report PS. The authors speculate that this finding could reflect a compensatory mechanism. However, given that PS were not assessed at age 14, the results should be interpreted with caution. Unlike task-based fMRI, resting-state fMRI is less susceptible to performance and vigilance differences between groups, which facilitates interpretation of group differences. Again, the PNC has allowed large-scale investigation of functional connectivity across development. With regard to static functional connectivity, Satterthwaite et al. showed that PS youth exhibited patterns of dysconnectivity similar to those of patients with overt psychosis. In particular, they observed hyperconnectivity within the default-mode network (DMN) and reduced functional connectivity within the executive control network. However, in one of the largest pediatric population-based samples, Karcher et al. recently reported hypoconnectivity within the DMN and within the executive control networks associated with increased PS in 9- to 11-year-old children. These differences in observed hypo- vs. hyperconnectivity may be attributable to age differences between the two studies. Nevertheless, there has been a similar dissonance in adult cohorts with overt psychosis, where both hypo- and hyperconnectivity of the DMN and executive control networks have been described. In an elegant follow-up study that applied multivariate sparse canonical correlation analysis to the PNC resting-state data, Xia and colleagues corroborated that the segregation between the DMN and executive control networks is a common feature across multiple psychopathology dimensions, with the psychosis dimension showing the strongest effect.

Moreover, a recent study of this cohort that investigated dynamic properties of functional connectivity, i.e., time-varying patterns of whole-brain connectivity, found that the previously described dysconnectivity between the DMN and executive control networks in youth experiencing PS is time-dependent and only occurs during certain periods of a resting-state scan, whereas dysconnectivity in visual and sensorimotor areas is much more pervasive. The Human Connectome Project (HCP) is an adult cohort in which resting-state fMRI as well as self-reported PS were acquired. Here, PS were significantly inversely correlated with cognitive abilities, an effect that was partially mediated by the global efficiency of the executive control network, a measure of network integration. With regard to dynamic functional connectivity in the HCP, it has recently been shown that adults experiencing PS spend more time in a dynamic state, i.e., a distinct time-varying connectivity pattern, characterized by reduced connectivity within the DMN, a finding that mirrors previous results in studies of individuals with overt psychosis. Even though pediatric population neuroscience is still in its infancy, studies overwhelmingly find that PS in childhood and adolescence pose a risk factor for later development of overt psychiatric illness and are overall associated with reduced functioning and quality of life. Many early intervention specialty programs offer a coherent multi-modal treatment framework for clients, including psychopharmacological treatment, psychotherapy and psychoeducation as well as vocational counseling. Meta-analytic results suggest that multidisciplinary therapies can delay or prevent transition to overt psychosis. Low-risk psychosocial interventions targeting functioning have been shown to be effective in CHR youth; such approaches are likely to also be effective in a broader audience.
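
Global efficiency, mentioned above as the mediating measure of network integration, is a standard graph metric: the average inverse shortest-path length across all node pairs. The sketch below shows how it can be computed from a region-by-region connectivity matrix with networkx; the random time series and the binarizing threshold are placeholder assumptions, not choices from the HCP analyses.

```python
# Illustrative computation of global efficiency (a graph measure of network
# integration) from a functional connectivity matrix. The time series are random
# placeholders and the threshold is arbitrary; real analyses use estimated
# ROI-by-ROI correlations from resting-state fMRI.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_rois = 20                                   # e.g., nodes of an executive control network
ts = rng.standard_normal((200, n_rois))       # placeholder ROI time series
fc = np.corrcoef(ts, rowvar=False)            # functional connectivity (correlations)

adj = (fc > 0.2).astype(int)                  # binarize at an arbitrary threshold
np.fill_diagonal(adj, 0)                      # remove self-connections

G = nx.from_numpy_array(adj)
print("global efficiency:", nx.global_efficiency(G))
```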

These intervention findings are reflected in the recently published guidelines of the European Psychiatric Association, in which dual treatment consisting of cognitive behavioral therapy and pharmacological treatment receives recommendation grade A for adult CHR individuals. For children and adolescents experiencing PS, as targeted by pediatric population neuroscience, the expert recommendation is specific psychological interventions to improve functioning and close monitoring of PS. PS are often preceded by non-specific behavioral and emotional problems in childhood related to increased adversity and trauma. Since these precursors in themselves pose a risk for the development of diverse psychopathologies, we argue, as others before us, that these childhood-onset problems offer another promising target for population-based preventive interventions. However, causal mechanisms leading from abnormal neurodevelopment to subsequent psychopathology are not yet understood and require further longitudinal research. Since only a minority of individuals with PS access appropriate mental health services, it will be important to implement services appropriate to a broad audience, for example in schools. It will be essential to identify those individuals at highest risk and to reduce the number of false positives in order to provide cost-effective services and to reduce stigma. Individual risk calculators developed and tested in CHR cohorts may not work as well when the target audience is broadened. With sufficient longitudinal data, questionnaires such as the Psychosis Questionnaire, Brief Version may be suitable for community samples and may be used to develop risk calculators for youth in the general population. Given the evidence presented here and results from the Outreach and Support in South London and Headspace initiatives, we argue that findings from population-based studies are adequate for guiding policy-making toward further emphasis on public health efforts, although more systematic research is needed in this area.

Destigmatization initiatives for mental illness have been shown to be effective in reducing discrimination and stigma, and broadly accessible mental health programs like Headspace and Jigsaw are promising for making a difference in the field of adolescent mental health. However, the specific efficacy of these programs warrants further study, and caution is advised not to over-pathologize the potentially transient occurrence of mental health problems.

The prevalence of alcohol, tobacco, and other substance use is higher among gay, bisexual, and other men who have sex with men (GBM) than in the overall population. Although Hughes and Eliason noted that substance and alcohol use have declined in lesbian, gay, bisexual, and transgender populations, the prevalence of heavy alcohol and substance use remains high among younger lesbians and gay men, and in some cases older lesbians and gay men. Marginalization on the basis of sexual orientation increases the risk for problematic substance use. For example, GBM were approximately one and a half times more likely to have reported being diagnosed with a substance use disorder during their lifetime than heterosexual men, and one and a half times more likely to have been dependent on alcohol or other substances in the past year. GBM also have higher rates of mental health issues than their heterosexual counterparts. In a review of 10 studies, Meyer found that gay men were twice as likely to have experienced a mental disorder during their lives as heterosexual men. More specifically, gay men were approximately two and a half times more likely to have reported a mood disorder or an anxiety disorder than heterosexual men. A review by King and colleagues found that lesbian, gay, and bisexual individuals were more than twice as likely as heterosexuals to attempt suicide over their lifetime and one and a half times more likely to experience depression and anxiety disorders in the past year, as well as over their lifetime. Few Canadian studies have explored population-based estimates for mental health outcomes among GBM. In one cross-sectional study of Canadian gay/"homosexual" and bisexual men using 2003 Canadian Community Health Survey data, Brennan and colleagues found participants were nearly three times as likely to report a mood or anxiety disorder as heterosexual men. Pakula & Shoveller conducted a more recent cross-sectional analysis using 2007–2008 Canadian Community Health Survey data and again found that GBM were 3.5 times more likely to report a mood disorder compared with heterosexual males. These analyses used government-run population-based study data, which may limit self-disclosure of sexual minority status, and further relied on a single identity variable to measure sexual orientation, which ignores same-sex sexual behaviors. There is an inextricable yet varied relationship between an individual's mental health and substance use. Substance use may lead to poorer mental health or, conversely, poor mental health may lead to increased substance use. A variety of substances have been shown to be associated with negative mental health events or symptoms. For example, Clatts, Goldsamt, and Li found that a third of young MSM who used club drugs on a regular basis reported having attempted suicide, and almost half of those who had attempted suicide did so multiple times over their lifetime. They also found that more than half of regular club drug users had high levels of depressive symptoms.
McKirnan and colleagues found that GBM who showed signs of depression were nearly twice as likely to smoke. Stall and colleagues identified a "dose-response" relationship between self-rated mental well-being and alcohol-related problems: GBM who rated their mental well-being as low were approximately three times more likely to have alcohol-related problems, and those who rated it as moderate were nearly twice as likely. Respondents who scored as depressed were also one and a half times more likely to report using multiple drugs and nearly twice as likely to report weekly drug use.

Syndemics [clusters of mutually reinforcing epidemics that interact with one another to worsen the overall burden of disease within a population] has been used in research with GBM to explain how various psychosocial variables, such as polydrug use, mental health conditions, and intimate partner violence, increase the likelihood of acquiring HIV. However, nearly all of these studies have relied on convenience samples obtained through online and venue-based recruitment; thus, they may not be representative of the larger underlying population of GBM. To address issues of representativeness and the limitations of non-probability sampling in past research with GBM, we used respondent-driven sampling (RDS) to estimate population parameters that are more representative than those from convenience samples. RDS is a chain-referral research technique in which participants are asked to recruit individuals from within their social networks in successive waves, and population parameters are estimated using measures of network size and recruitment homophily. By utilizing RDS we sought to produce a more representative sample of the GBM population in Metro Vancouver in order to determine the prevalence of mental health issues and substance use, as well as the association between these factors.

We analyzed cross-sectional data from participants enrolled in the Momentum Health Study, a longitudinal bio-behavioral prospective cohort study of HIV-positive and HIV-negative GBM in Metro Vancouver, Canada. The overall aim of this study was to examine the impact of a biomedical intervention (increased access to highly active antiretroviral therapy for HIV) on HIV risk behaviors among GBM. The present analysis utilized data collected at participants' first study visit, which occurred between February 2012 and February 2014. We used RDS to recruit GBM in the Greater Vancouver area. Initial seeds were selected in person through partnerships with community agencies or online through advertisements on GBM socio-sexual networking mobile apps or websites. These seeds were then provided with up to six vouchers to recruit other GBM they knew. All participants were screened for eligibility and provided written informed consent at the in-person study office in downtown Vancouver. A computer-assisted, self-administered questionnaire was used to collect socio-demographic, psychosocial, and behavioral variables. Subsequently, a nurse-administered structured interview collected information on history of mental health and substance dependence diagnosis and treatment, and participants provided blood samples to test for HIV and other sexually transmitted infections. Participants received a $50 honorarium for completing the study protocol and an additional $10 for each eligible GBM they recruited into the study. All project investigators' institutional Research Ethics Boards granted ethical approval. Moore and colleagues have published additional detail on the Momentum Health Study protocol.

Independent variables included socio-demographics, sexual behaviors, substance use behaviors, Alcohol Use Disorders Identification Test categorical scores, and Hospital Anxiety and Depression Scale categorical scores of anxious and depressive symptoms. Socio-demographic characteristics included: age, sexual identity, ethnicity, immigration status, residence, highest formal education attained, current student status, annual income, being out as gay, HIV serostatus, and current regular partnership status.
Sexual behaviors included engagement in sex work in the past 6 months and number of male anal sex partners in the past 6 months. Participants reported whether they had used any of a variety of substances in the past six months: cigarettes, cannabis, erectile dysfunction drugs, poppers, steroids, cocaine, ecstasy, ketamine, gamma-hydroxybutyrate, hallucinogens, crystal methamphetamine, crack, other stimulants, heroin, morphine, other opioids, benzodiazepines, and other prescription drugs.
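
As background on how RDS data yield population-level estimates from network size, here is a minimal sketch of an RDS-II style inverse-degree-weighted prevalence estimate. The tiny data set and the outcome variable are synthetic; the study's actual analyses would typically use dedicated RDS estimation software and additional adjustments.

```python
# Minimal sketch of an RDS-II (inverse network-size weighted) prevalence estimate,
# the general idea behind producing population estimates from respondent-driven
# sampling data. The data frame below is synthetic.
import pandas as pd

# synthetic participants: self-reported personal network size and a binary indicator
df = pd.DataFrame({
    "network_size": [5, 12, 30, 8, 50, 20, 15, 10],
    "depressive_symptoms": [1, 0, 0, 1, 0, 1, 0, 0],
})

w = 1.0 / df["network_size"]                   # weight inversely proportional to degree
weighted_prev = (w * df["depressive_symptoms"]).sum() / w.sum()
naive_prev = df["depressive_symptoms"].mean()

print(f"naive prevalence: {naive_prev:.3f}, RDS-II weighted prevalence: {weighted_prev:.3f}")
```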


Antiretroviral therapy has dramatically improved the survival and quality of life of people with HIV

Rates have remained at least 2.5 times higher among males since 2016. This is among the first national investigations to examine reasons for fentanyl exposure in a nuanced manner. With respect to route of administration, dermal use decreased, while injection and, in particular, inhalation increased. Inhalation not only increased more than four-fold, but this route was also a risk factor for patients experiencing a major effect or death. This is the first national report on the increase in inhalation as a route of illicit fentanyl use. These results are difficult to compare with various other studies that queried route of administration but only reported injecting behavior. In addition, a limitation of Poison Control data is that we often cannot differentiate the types of use that constitute inhalation; specifically, we cannot determine whether patients insufflated, i.e., snorted, or smoked the substance. Our results appear to add to a national cross-sectional study of fentanyl-related deaths in 2016, which found evidence of snorting and smoking the drug in 52.4% and 17.9% of cases, respectively. A recent study in San Francisco found a shift from injecting heroin to smoking fentanyl. This shift appeared to be largely rooted in harm reduction behavior, e.g., avoiding injection-related risks of skin and soft tissue infections. Further, injection is associated with increased risk of overdose, although in our multivariable model, injection only approached significance as a risk factor for a major adverse outcome or death. Deaths involving opioids as recorded by the State Unintentional Drug Overdose Reporting System (SUDORS) in the first half of 2019 suggest that among deaths with a reported specific route of administration, injection was most common, followed by ingestion, snorting/sniffing, and smoking. Indeed, snorting or smoking fentanyl is less common than injecting, but these methods appear to be becoming more prevalent among those who want to avoid injection or its associated risks. As such, research needs to continue to monitor shifts in route of administration as well as real or perceived changes in risk related to route.

Many Poison Control studies that examine drug exposures combine "abuse" and misuse into a single category, but we chose not to aggregate these reasons, which allowed a more nuanced analysis. The proportion of cases involving "abuse" increased over time and represented the majority of cases in 2021. "Abuse" was also a major risk factor for patients experiencing a major effect or death. Misuse, which implies improper use of a legitimate medication, decreased slightly over time and was actually inversely associated with patients experiencing a major effect or death. We believe this suggests higher call volume indicating use of illicitly manufactured fentanyl as opposed to use of a pharmaceutical product. These results demonstrate the complexity of reasons for use, which cannot be delineated in other datasets. Also with respect to reasons for use, suspected suicide attempts decreased, although these cases were associated with higher risk of a major effect or death, possibly due to intentionally high doses. Therapeutic errors and adverse reactions decreased, which we believe suggests stricter or more careful medical oversight of prescribed fentanyl in recent years. With regard to polydrug use, the proportions of cases involving methamphetamine and cocaine increased by 669.0% and 374.0%, respectively. This corroborates literature suggesting that co-use of fentanyl with stimulants is becoming more common, so much so that co-use of fentanyl and stimulants has been called the "fourth wave" of the opioid overdose crisis. Despite increased co-use of methamphetamine, we found that this was actually a protective factor against experiencing a major adverse effect or death. The extent to which the drugs were directly combined, used in tandem, or merely used within the same day is unknown. Given that some people use methamphetamine in an attempt to prevent or reverse the effects of opioids or other depressants and others use it in an attempt to alleviate opioid withdrawal, more research is needed to determine the contexts of such co-use in relation to severity of overdose risk. While the proportion of patients co-using heroin increased, we found that co-use of prescription opioids decreased.
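
As an illustration of the kind of multivariable model referenced above (risk and protective factors for a major effect or death), the sketch below fits a logistic regression of a severe-outcome indicator on exposure characteristics. The simulated data and variable names are hypothetical stand-ins for Poison Control case fields; the actual analysis may have specified the model differently.

```python
# Sketch of a multivariable logistic regression of a severe outcome (major effect
# or death) on exposure characteristics. All variables and data are synthetic
# placeholders, not the actual Poison Control analysis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 4000
cases = pd.DataFrame({
    "inhalation": rng.binomial(1, 0.15, n),
    "injection": rng.binomial(1, 0.25, n),
    "abuse": rng.binomial(1, 0.50, n),
    "meth_couse": rng.binomial(1, 0.10, n),
})
logit = (-2.0 + 0.8 * cases.inhalation + 0.3 * cases.injection
         + 0.9 * cases.abuse - 0.4 * cases.meth_couse)
cases["severe_outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.logit("severe_outcome ~ inhalation + injection + abuse + meth_couse",
                data=cases).fit(disp=False)

odds_ratios = np.exp(fit.params)        # OR > 1 suggests a risk factor; OR < 1 protective
print(pd.concat([odds_ratios.rename("OR"), np.exp(fit.conf_int())], axis=1))
```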

The decrease in prescription opioid co-use adds to findings from mortality data, which suggest that while deaths involving both prescription opioids and synthetic opioids increased from 2013 to 2017, deaths then leveled off from 2017 to 2019. This decrease may be indicative of the decline of the "first wave" of the opioid crisis, which was largely driven by prescription opioid use. Further, non-medical prescription opioid use and misuse have also been decreasing nationally in the US. Although recent studies have found increasing rates of benzodiazepine involvement in fentanyl-related deaths, we found that co-use of benzodiazepines actually decreased. A recent finding of a serious rise in seizures of counterfeit pills containing fentanyl highlights concern about unknowing co-use. The highest proportion of exposures occurred in the West, which contrasts with other studies finding that most related deaths have occurred in the Northeast. With regard to the origin of calls, the proportion of calls to Poison Control centers from hospital centers increased, and the proportion of calls made "on site," typically referring to where the poisoning occurred, decreased. This suggests that it is primarily medical staff who call PCCs regarding fentanyl exposures, not patients or their caretakers. This perhaps demonstrates that while 911 emergency services are often called to respond to fentanyl overdoses, the public rarely calls PCCs for such cases. We also found that on-site cases were less likely to experience a major outcome or death, although SUDORS data from the first half of 2019 suggest that the majority of fentanyl-related deaths occur at the decedent's home, followed by someone else's home. Therefore, on-site calls appear to be under-represented in these data, so more research is needed on the circumstances of nonfatal overdoses that occur in homes. The increase in poisonings found in this study highlights the rising risk environment due to illicitly manufactured fentanyl and points to the need for better prevention efforts. Improved drug supply surveillance is needed to better elucidate exposure risk, and this could include crime lab data or data from the High Intensity Drug Trafficking Areas Performance Monitoring Program.

The goal is timely data that can serve as an early warning system. Likewise, PCC data, with potentially timelier outcome information, can complement mortality data and aid in surveillance efforts. Drug checking for fentanyl, an intimate form of drug surveillance, is being explored to reduce the risk imposed by fentanyl. Further, wider distribution of naloxone is needed for more rapid response to fentanyl-related overdose. For better household coverage, the US Food and Drug Administration is working on over-the-counter status for naloxone. We do not know definitively which cases involved prescribed, unprescribed, and/or illicitly manufactured fentanyl, although misuse tends to be associated with prescribed fentanyl. We did not present data on the form of fentanyl reportedly used, as this variable was missing for 64% of cases. However, results from a sub-analysis indicate that 87.7% of patients who reported dermal use used fentanyl in patch form, suggesting that the vast majority of dermal use was in fact via patches. These data rely on caller or other contact information, which may or may not include the patient. Some cases may rely on secondhand reporting, though this is not likely to have changed significantly over time and is unlikely to have significantly biased trend estimates. Toxicology testing was not always conducted to confirm exposure to fentanyl or its analogs. Relatedly, given that fentanyl is a common adulterant in or replacement for drugs such as heroin, fentanyl exposures were likely underreported when people were unknowingly exposed. In fact, reporting of poisonings related to fentanyl is relatively rare compared with mortality studies. This is because calls to Poison Control depend on a patient or medical professional calling to report the poisoning or to ask for medical advice to treat a case. As such, these data are not generalizable to all poisonings; they are, however, useful in informing other national studies. Exposures are also not generalizable to use or nonfatal overdose in the population, as most cases reported involve adverse effects related to exposure and reporting exposures to PCCs is only voluntary. Finally, it is possible for medical outcomes to be misclassified, but in at least three quarters of cases involving fentanyl, exposure information is obtained from medical facilities that monitor patients, and PCC staff follow up on cases to obtain the most accurate information possible before closing a case.

However, HIV-associated neurocognitive disorders (HAND) remain highly prevalent, particularly in milder forms, even in PWH on suppressive ART. The most prominent neurocognitive deficits in the ART era tend to be observed in the domains of learning and executive function, though many PWH are also impaired in delayed recall, processing speed, working memory, and motor skills. Significant heterogeneity exists with regard to the severity and profile of neurocognitive impairment in PWH, and this is also reflected in neuroimaging and neuropathological studies. One explanation for this heterogeneity is that there may be subtypes of HAND with distinct underlying mechanisms, risk factors, and consequences. Identifying subtypes of HAND may improve our understanding of the etiology, nature, course, and treatment of HAND.

Multiple factors contribute to NCI in virally suppressed PWH, including viral persistence in the CNS, cellular and epigenetic factors, ART toxicity, comorbidities, and coinfections. A unifying factor across these mechanisms is chronic immune activation and inflammation. ART reduces, but does not normalize, immune activation in most PWH. Soluble biomarkers of immune activation and inflammation are elevated in PWH despite viral suppression and in PWH with HAND. This persistent low-grade chronic immune activation and inflammation, which induces neuroinflammation and neurovascular complications, is considered a key element of the pathogenesis of NCI in ART-treated PWH. Chronic inflammation in ART-treated PWH can also lead to endothelial dysfunction, which is a key mechanism underlying the development of atherosclerosis and subsequent cardiovascular disease (CVD). During the immune response to infection, pro-inflammatory cytokines released from immune cells activate endothelial cells, triggering the expression of cellular adhesion molecules. Cell adhesion molecules interact with other molecules to promote the recruitment, adherence, and migration of leukocytes to and across the vascular endothelium. Chronic and sustained activation of immune and endothelial cells leads to increased adhesion molecule expression, vascular permeability, and immune cell migration, which further propagates the inflammatory response and can lead to endothelial dysfunction or damage. CVD is among the most prevalent of the age-associated non-infectious conditions in PWH, may occur at earlier ages relative to age-matched people without HIV, and is further exacerbated by the increased incidence of traditional and non-traditional risk factors. In the general population, CVD, particularly in mid-life, is among the strongest risk factors for vascular, Alzheimer's, and mixed dementia types. Studies of PWH, including those who are virally suppressed, have found adverse independent effects of CVD and/or its risk factors on NCI, brain inflammation, vascular abnormalities, and neural injury. As the HIV population grows older, CVD and its risk factors are increasingly recognized for their role in HAND. Recent studies have also found evidence of endothelial dysfunction and blood-brain barrier (BBB) impairment in ART-treated and virally suppressed PWH. The BBB, composed of specialized endothelial cells and tight junctions and surrounded by a basement membrane, pericytes, astrocytes, microglia, and neurons, is a primary target of HIV-associated neural injury and a central neuropathological factor underlying HAND. The BBB is also altered by CVD, further contributing to dysfunction. Increased BBB permeability is a key mechanism in cognitive impairment due to cerebrovascular disease, which is similar to HAND in terms of risk factors and often has similar underlying pathological processes, given evidence of overlapping neuroimaging and neurocognitive phenotypes. Cerebrovascular-related cognitive deficits depend on the location and characteristics of the particular lesion, though these lesions are often distributed throughout fronto-subcortical and fronto-parietal white matter tracts important for processing speed and executive functioning. These domains are also linked to subclinical CVD and atherosclerosis in PWH, and similar patterns of diffuse white matter pathology are found in HAND.


Two CTN trials have employed EMA and passive sensing technologies

This pattern highlighting the effectiveness of TES was evident across diverse groups of patients, including those with stimulant, cannabis, and alcohol use disorders, those with and without criminal justice involvement, those with and without Internet access, and across both males and females and diverse racial and ethnic groups. TES was also found to have promising cost-effectiveness. This CTN trial built on a prior body of NIDA-funded single-site trials showing, for example, that adding TES to buprenorphine treatment produces synergistic treatment effects; that replacing part of counselor-delivered treatment with TES in methadone treatment systems greatly improves patients' treatment outcomes; and that TES offered to incarcerated individuals can produce treatment outcomes comparable to those produced by exclusively clinician-delivered care. By conducting a national, highly rigorous multi-site trial, the CTN study was well poised to demonstrate the safety and effectiveness of the TES digital intervention when reviewed by the FDA, leading to the very first FDA-authorized prescription digital therapeutic in the U.S. This reflects a new category of FDA-regulated devices and allows digital therapeutics to be prescribed by clinicians, in a manner similar to FDA-approved medications. This is a compelling example of how CTN research can change the landscape of care to scale up access to evidence-based treatments for SUDs. This work led to several ancillary CTN studies focused on adapting TES for American Indians and Alaska Natives. In addition, several mobile digital therapeutics will be included in a new national CTN trial that will test strategies to improve treatment retention in medication treatment for OUD and to improve outcomes among individuals who are stabilized on OUD medications but wish to discontinue them.

Although treatments for SUDs, including OUD, have been shown to be life-saving, many communities across the nation are challenged by insufficient clinician capacity to conduct universal screening and medication induction and maintenance for the large population who may need it. This is particularly challenging in rural contexts, which have lower capacity for evidence-based opioid treatment, including fewer waivered buprenorphine prescribers, behavioral health clinicians, and opioid treatment programs, and which often lack a widespread public transportation system. To address this challenge, the CTN is launching a large multi-site trial to evaluate the effectiveness of a telehealth model of care for the medication treatment of OUD, designed to help ensure sufficient and sustainable capacity to offer evidence-based opioid treatment in rural communities. This cluster-randomized, comparative effectiveness trial will examine the utility of adding tele-MOUD to outpatient MOUD treatment compared to outpatient MOUD treatment alone in rural areas highly impacted by the U.S. opioid crisis. Tele-MOUD will flexibly offer patients remote access to a core team of OUD experts who can assess and prescribe medications to individuals with OUD, provide therapy, and/or provide remote urine and/or saliva drug testing. Another new primary care-based CTN trial, which seeks to identify an effective strategy to address unhealthy opioid use and prevent escalation to an opioid use disorder, will also offer remote telecounseling to participants to enhance onsite care centrally led by a nurse care manager. The collective learning from this research will inform innovative models that can scale access to a suite of evidence-based treatments for OUD in high-need, low-resource settings. The first of the two CTN studies employing EMA and passive sensing technologies developed and evaluated the ability of a wrist-worn sensor suite to detect cocaine use. This work builds on prior promising work demonstrating that a chest band with electrodes can detect cocaine use via a computational model that uses heart rate and physical activity data.
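The published computational model is not described in detail here, but the general approach (summarizing windowed heart-rate and physical-activity streams and feeding the features to a supervised classifier) can be illustrated with a minimal sketch. The window length, feature set, labels, and model choice below are invented for illustration and are not the CTN study's algorithm; real labels would come from confirmed or EMA-reported use episodes.

    # Illustrative sketch only: windowed physiological features feeding a classifier.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def window_features(heart_rate, accel_magnitude, fs=1.0, window_s=300):
        """Slice synchronized 1 Hz heart-rate and accelerometer-magnitude streams
        into non-overlapping windows and compute simple summary features."""
        n = int(window_s * fs)
        n_windows = len(heart_rate) // n
        feats = []
        for i in range(n_windows):
            hr = heart_rate[i * n:(i + 1) * n]
            ac = accel_magnitude[i * n:(i + 1) * n]
            feats.append([hr.mean(), hr.std(), np.percentile(hr, 90),
                          ac.mean(), ac.std()])
        return np.array(feats)

    # Hypothetical synchronized streams (~10 hours at 1 Hz) and per-window labels
    # (1 = use reported within the window, 0 = no use).
    rng = np.random.default_rng(0)
    hr_stream = rng.normal(75, 8, 36000)
    ac_stream = np.abs(rng.normal(1.0, 0.3, 36000))
    X = window_features(hr_stream, ac_stream)
    y = rng.integers(0, 2, len(X))

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("CV AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())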

The present study seeks to evaluate whether similar cocaine detection algorithms, modified for use with sensor data collected via a less obtrusive, more user-friendly smartwatch that can be worn in daily life, will perform comparably. Data from the smartwatch are compared to chest band data as well as to EMA reports of cocaine use. Cocaine use is often measured via self-report, which can be inaccurate, and/or via urine drug tests, which can be intrusive and may not capture the temporal granularity of cocaine use patterns. If smartwatch sensing is determined to be an acceptable and accurate way to measure cocaine use, it may offer rich information about the precise timing and duration of use events and could allow us to glean new insights into contextual factors that may serve as triggers for use events. Additionally, detecting cocaine use with greater precision may enhance outcomes measurement in clinical trials that evaluate potential therapeutics for cocaine use disorder. The second of these CTN studies is, to our knowledge, the first study to employ passive mobile sensing, social media data, and active responses to queries on mobile devices using EMA to obtain moment-by-moment quantification of individual-level data that may lead to opioid use events, medication non-adherence, and/or MOUD treatment dropout/retention in a population of persons with OUD in buprenorphine treatment. In this study, participants are asked to wear a smartwatch and carry a smartphone continuously for a period of 12 weeks. The smartwatch passively collects data regarding location and distance traveled, physical activity, sleep, and heart rate. Participants are also prompted to respond to questions through a smartphone multiple times per day. Questions assess sleep, stress, pain severity, pain interference, pain catastrophizing, craving, withdrawal, substance use risk context, mood, location, substance use, self-regulation, and MOUD adherence. In addition to the EMA prompts, individuals are asked to self-initiate EMAs if substance use occurred.

App usage, audio/conversation, call/text, GPS, screen on/off, phone lock/unlock, phone notification information, Wi-Fi and Bluetooth logs, sleep, ambient light, and proximity data are passively collected via smartphone. Participants are also asked if they are willing to share their social media data from any social media platforms they may use. Sharing social media data is an optional component of study participation. The primary objective of the study is to evaluate the feasibility of utilizing digital health technology with OUD patients, as measured by a 12-week period of continuous assessment using EMA and digital sensing. A secondary objective of this study is to examine the utility of EMA, digital sensing, and social media data in predicting OUD treatment retention and buprenorphine medication adherence. Overall, this line of research may inform which subset of digitally derived data may be most useful to employ as part of outcome measurement in future clinical trials research. Digital data that capture the richness of clinical status and clinical trajectories as individuals go about their daily lives may greatly complement and enhance the learning from standardized clinical outcomes assessment. In addition, predicting OUD treatment retention and medication adherence via continuous digital assessments may be used to identify early those participants who show signs of non-adherence and to trigger additional intervention to prevent ultimate non-response to treatment (a simple modeling sketch follows this paragraph). In addition to the CTN-0084-A2 study referenced above, which includes social media data as part of a broader set of digitally derived data, the CTN supports a trial that centrally evaluates the relative utility of various social media platforms in recruiting a national sample from a hard-to-reach population. Specifically, this trial compares the relative effectiveness of using social media sites vs. online informational sites vs. online dating sites to promote HIV self-testing and seamless linkage to pre-exposure prophylaxis medication among young, racial/ethnic minority, high-risk men who have sex with men. In this study, individuals in the targeted sample who click on culturally tailored study advertisements and who provide online consent will be offered a free HIV self-test kit to be discreetly sent to their home, with seamless linkage to PrEP for those who test HIV-negative and linkage to HIV care resources for those who test positive. Among the outcomes examined, the primary outcome is the monthly rate by promotional platform. The modifying role of substance use on observed outcomes will also be examined. Online recruitment strategies allow for targeted recruitment of select audiences. The CTN-0083 study will target recruitment in states that have hard-to-reach, high-risk populations and limited availability of risk reduction services. This study illustrates how a targeted national sample can be recruited for clinical trials participation and how all intervention delivery and data collection in a clinical trial can be conducted remotely online.
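As a rough illustration of the retention-prediction idea mentioned above, per-participant summaries of EMA and passive-sensing streams could be fed to a simple classifier. The feature names, values, and modeling choice below are assumptions for the sketch, not the study's analysis plan.

    # Illustrative sketch: aggregated digital features predicting 12-week retention.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Hypothetical per-participant summaries over the monitoring period.
    df = pd.DataFrame({
        "mean_craving_ema":    [2.1, 4.5, 1.3, 3.8, 2.9, 4.9],
        "ema_completion_rate": [0.91, 0.55, 0.97, 0.62, 0.80, 0.45],
        "mean_sleep_hours":    [7.2, 5.1, 7.8, 6.0, 6.7, 4.8],
        "mean_daily_km":       [3.4, 1.2, 4.1, 2.0, 2.8, 0.9],
        "retained_12wk":       [1, 0, 1, 0, 1, 0],
    })
    X = df.drop(columns="retained_12wk")
    y = df["retained_12wk"]

    model = LogisticRegression(max_iter=1000)
    print("CV accuracy:", cross_val_score(model, X, y, cv=3).mean())

In practice, flagging participants whose predicted retention probability falls below a chosen threshold could trigger the additional intervention described above.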

This manuscript provides an overview of the breadth and impact of research conducted within the U.S. National Drug Abuse Treatment Clinical Trials Network in the realm of digital health. This work has included the CTN's efforts to systematically embed digital screeners for SUDs into general medical settings to increase the diagnosis and treatment of SUDs across the nation. This work has also included a pivotal multi-site clinical trial conducted on the CTN platform, whose data led to the very first "prescription digital therapeutic" authorized by the U.S. Food and Drug Administration for the treatment of SUDs. Further CTN research includes the study of telehealth to increase capacity for science-based SUD treatment in rural and under-resourced communities. In addition, the CTN has supported an assessment of the feasibility of detecting cocaine-taking behavior via smartwatch sensing. The CTN has also supported the conduct of clinical trials entirely online. Further, the CTN is conducting innovative work focused on the use of digital health technologies and data analytics to identify digital biomarkers and understand the clinical trajectories of individuals with OUD in buprenorphine medication treatment. Given its unique national research infrastructure and access to a broad array of community and healthcare partners, the CTN is uniquely poised to accelerate the scope and impact of its work applying digital health to the assessment and treatment of SUDs. Among these opportunities, the CTN is positioned to evaluate the role of digital technologies in SUD care transitions. For example, offering persons with SUDs access to a digital therapeutic and/or telehealth when they transition from a period of incarceration, hospitalization, or inpatient SUD care to the community would provide them with 24/7 access to therapeutic support as they reintegrate into the community and/or community-based care. Digital tools may also be offered directly to individuals recruited online who are not engaged in, and do not wish to engage in, SUD care within the health care system. Given that only about 10% of persons with SUDs are engaged in treatment, there is tremendous opportunity to creatively use digital technology to provide the other 90% with evidence-based SUD resources. The CTN is optimally poised to conduct national implementation science trials and/or hybrid implementation-effectiveness trials to evaluate optimal strategies to implement and sustain digitally enhanced models of care. Such trials could integrate the various digital health tools and approaches that the CTN has previously studied in separate studies to instead embed a suite of complementary digital tools spanning an entire model of care within an integrated implementation strategy. That is, a digitally enhanced model of care could include digital screeners and assessments in medical settings, linkage to electronic clinical decision support tools to enhance providers' ability to deliver state-of-the-science care, as well as provision of digital therapeutics that are available directly to patients to ensure evidence-based care is available to them anytime and anywhere and can complement the care they receive in the healthcare sector. Importantly, digital therapeutics offered to patients do not need to reflect static models of behavioral treatment that work exactly the same way with every end user.
Rather, these tools can be adaptive and can flexibly offer evidence-based therapeutic resources that are responsive to individuals' changing clinical needs, preferences, and goals. There is tremendous opportunity to integrate the science of digital assessment and digital therapeutics for SUDs to help us understand when individuals may be most receptive to health promotion interventions. These insights can, in turn, inform optimal delivery of "Just-in-Time Adaptive Interventions," or in-the-moment interventions for SUDs that provide the right type and amount of therapeutic support at the right time. The large and diverse samples that can be recruited within the CTN offer many opportunities to conduct novel experimental approaches to systematically investigate who would benefit from which intervention and when, as well as to apply novel statistical machine learning methods to personalize SUD interventions at the individual level.
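To make the just-in-time idea concrete, a trigger rule might combine momentary EMA responses with passively sensed context. The thresholds, inputs, and messages below are invented for illustration and do not come from any CTN protocol; real JITAIs would be tuned empirically and delivered within a tested therapeutic framework.

    # Toy sketch of a just-in-time trigger rule (all values are assumptions).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EmaReport:
        craving: int              # 0-10 self-rated craving
        stress: int               # 0-10 self-rated stress
        high_risk_location: bool  # e.g., derived from GPS geofencing (assumed)

    def jitai_prompt(report: EmaReport) -> Optional[str]:
        """Return an in-the-moment support message when momentary risk is
        elevated, or None when no intervention is triggered."""
        if report.craving >= 7 or (report.stress >= 7 and report.high_risk_location):
            return "Offer a brief coping exercise now."
        if report.craving >= 4:
            return "Suggest reviewing the personal change plan."
        return None

    print(jitai_prompt(EmaReport(craving=8, stress=3, high_risk_location=False)))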


This typical pattern of neural maturation occurred among adolescents who remained nondrinkers

We found significant drinking status × time interactions in a number of distinct and reproducible brain regions commonly associated with response inhibition. Prior to initiating substance use, adolescents who later initiated heavy use showed less BOLD activation during inhibitory trials in frontal regions, including the bilateral middle frontal gyri, and non-frontal regions, including the right inferior parietal lobule, putamen, and cerebellar tonsil, compared with those who continued to abstain from alcohol use. This pattern of hypoactivity during response inhibition among youth who later initiated heavy drinking is consistent with studies showing that decreased activity during response inhibition predicts later alcohol and substance use. Indeed, change in the BOLD response contrast over time in the right middle frontal gyrus was associated with lifetime alcohol drinks at follow-up. Together, these findings provide additional evidence for the utility of fMRI in identifying neural vulnerabilities to substance use even when no behavioral differences are apparent. At follow-up, adolescents who transitioned into heavy drinking showed increasing brain activation in the bilateral middle frontal gyri, right inferior parietal lobule, and left cerebellar tonsil during inhibition, whereas non-drinking controls exhibited decreasing brain activation in these regions. These regions have been implicated in processes of stimulus recognition, working memory, and response selection, all of which are critical to successful response inhibition. Indeed, neuroanatomical models of inhibitory control highlight the importance of frontoparietal attentional control and working memory networks. These models posit that inhibition and cognitive control involve frontoparietal brain regions when detecting and responding to behaviorally relevant stimuli. Thus, the findings suggest that heavy drinkers recruit greater activity in these neural networks in order to successfully inhibit prepotent responses.
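One way to express the kind of group-by-time interaction described above at the region-of-interest level is a linear mixed-effects model on extracted BOLD contrast values. This sketch uses toy data and a random-intercept-only structure, and it is not the voxelwise analysis the study itself conducted; column names and values are assumptions.

    # Sketch of a drinking status x time interaction test on ROI-level contrasts.
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format data: one contrast value per subject per timepoint
    # (0 = baseline, 1 = follow-up), e.g., from the right middle frontal gyrus.
    data = pd.DataFrame({
        "subject": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
        "group":   ["drinker", "drinker", "control", "control", "drinker", "drinker",
                    "control", "control", "drinker", "drinker", "control", "control"],
        "time":    [0, 1] * 6,
        "bold":    [0.10, 0.35, 0.30, 0.18, 0.08, 0.41,
                    0.28, 0.20, 0.12, 0.33, 0.31, 0.22],
    })

    # group x time interaction with a random intercept per subject
    model = smf.mixedlm("bold ~ group * time", data, groups=data["subject"]).fit()
    print(model.summary())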

Given the longitudinal nature of the current study, it is important to consider our findings in the context of typical adolescent neural maturation. During typical neural maturation, adolescents exhibit less activation over time as neural networks become more refined and efficient. Adolescents who transitioned into heavy drinking showed the opposite pattern – increasing activation despite similar performance – suggesting that alcohol consumption may alter typical neural development. The current findings should be considered in light of possible limitations. Although the heavy drinking and non-drinking youth groups were matched on several baseline and follow-up measures, heavy drinking youth reported more cannabis, nicotine, and other illicit drug use at follow-up. Differential activation remained significant after statistically controlling for lifetime substance use, although such differences may still contribute to our findings. Further, simultaneous substance use might be associated with these results. Future research should compare the effects of polysubstance use during the same episode with the effects of heavy drinking alone on neural responses. It is also important to note that adolescence is a period of significant inter-individual differences in neural development; to address this issue, we matched groups on self-reported pubertal development and age at baseline and follow-up. For the current sample, histograms of age distributions at baseline and follow-up are provided in Online Resource 1. Again, our groups were well matched on these variables; however, additional longitudinal research examining the effects of puberty and hormonal changes on neural functioning and response inhibition is needed. In summary, the current data suggest that pre-existing differences in brain activity during response inhibition increase the likelihood of initiating heavy drinking, and that initiating heavy alcohol consumption leads to differential neural activity associated with response inhibition.

These findings make a significant contribution to the developmental and addictive behaviors fields, as this is the first study to examine neural response differences during response inhibition prior to and following the transition into heavy drinking among developing adolescents. Further, we provide additional support for the utility of fMRI in identifying neural vulnerabilities to substance use even when no behavioral differences are apparent. Identifying such neural vulnerabilities before associated behaviors emerge provides an additional tool for selecting and applying targeted prevention programs. Given that primary prevention approaches among youth have not been widely effective, targeted prevention programs for youth who are at greatest neurobiological risk could be a novel, effective approach. As such, our findings provide important information for improving primary prevention programs, as well as for answering the question of whether neural differences predate alcohol initiation or arise as a consequence of alcohol use.

Although researchers in sociology, cultural studies, and anthropology have attempted, for the last 20 years, to re-conceptualize ethnicity within post-modernist thought and have debated the usefulness of such concepts as "new ethnicities," researchers within the field of alcohol and drug use continue to collect data on ethnic groups on an annual basis using previously determined, census-formulated categories. Researchers use these data to track the extent to which ethnic groups consume drugs and alcohol, exhibit specific alcohol and drug using practices, and develop substance use related problems. In so doing, particular ethnic minority or immigrant groups are identified as high risk for developing drug and alcohol problems. In order to monitor the extent to which such risk factors contribute to substance use problems, the continuing collection of data is seen as essential.

However, the collection of this epidemiological data, at least within drug and alcohol research, seems to take place with little regard for either contemporary social science debates on ethnicity or the ongoing debates within social epidemiology on the usefulness of classifying people by race and ethnicity. While the conceptualization of ethnicity and race has evolved over time within the social sciences, "most scholars continue to depend on empirical results produced by scholars who have not seriously questioned racial statistics". Consequently, much of the existing drug and alcohol research remains stuck in discussions about concepts long discarded in mainstream sociology or anthropology, yielding robust empirical data that is arguably based on questionable constructs. Given this background, the aim of this paper is to outline briefly how ethnicity has been operationalized historically and continues to be conceptualized in mainstream epidemiological research on ethnicity and substance use. We will then critically assess this current state of affairs, using recent theorizing within sociology, anthropology, and health studies. In the final section of the paper, we hope to build upon our "cultural critique" of the field by suggesting a more critical approach to examining ethnicity in relation to drug and alcohol consumption. According to Kertzer & Arel, the development of nation states in the 19th century went hand in hand with the development of national statistics gathering, which was used as a way of categorizing populations and setting boundaries across pre-existing shifting identities. Nation states became more and more interested in representing their population along identity criteria, and the census then arose as the most visible means by which states could depict and even invent collective identities. In this way, previously ambiguous and context-dependent identities were, by the use of the census technology, 'frozen' and given political significance. "The use of identity categories in censuses was to create a particular vision of social reality. All people were assigned to a single category and hence conceptualized as sharing a common collective identity", yet certain groups were assigned a subordinate position. In France, for example, the primary distinction was between those who were part of the nation and those who were foreigners, whereas British, American, and Australian census designers have long been interested in the country of origin of their residents. In the US, the refusal to enfranchise Blacks or Native Americans led to the development of racial categories, and these categories were in the US census from the beginning. In some of the 50 federated states of the US, there were laws, including the "one drop of blood" rule, that determined that having any Black ancestors meant that one was de jure Black. Soon a growing number of categories supplemented the original distinction between white and black.

Native Americans appeared in 1820, Chinese in 1870, Japanese in 1890, Filipino, Hindu, and Korean in 1920, Mexican in 1930, and Hawaiian and Eskimo in 1960. In 1977, the Office of Management and Budget, which sets the standards for racial/ethnic classification in federal data collections including the US Census, established a minimum set of categories for race/ethnicity data that included four race categories and two ethnicity categories. In 1997, the OMB announced revisions allowing individuals to select one or more races, but not allowing a multiracial category. Since October 1997, the OMB has recognized five categories of race and two categories of ethnicity. In considering these classifications, the extent to which dominant race/ethnic characterizations are influenced both by bureaucratic procedures and by political decisions is striking. For example, the adoption of the term Asian-American grew out of attempts to replace the exoticizing and marginalizing connotations of the externally imposed pan-ethnic label it replaced, i.e., "Oriental". Asian American pan-ethnic mobilization developed in part as a response to common discrimination faced by people of many different Asian ethnic groups and to the externally imposed racialization of these groups. This pan-ethnic identity has its roots in many ways in a racist homogenizing that constructs Asians as a unitary group, and which delimits the parameters of "Asian American" cultural identity as an imposed racialized ethnic category. Today, the racial formation of Asian American is the result of a complex interplay between the federal state, diverse social movements, and lived experience. Such developments and characterizations then determine how statistical data are collected. In fact, the OMB itself admits to the arbitrary nature of the census classifications and concedes that its own race and ethnic categories are neither anthropologically nor scientifically based. Issues of ethnic classification continue to play an important role in health research. However, some researchers working in public health have become increasingly concerned about the usefulness or applicability of racial and ethnic classifications. For example, as early as 1992, a commentary piece in the Journal of the American Medical Association challenged the journal editors to "do no harm" in publishing studies of racial differences. Quoting the Hippocratic Oath, the commentary urged authors to write about race in a way that did not perpetuate racism. However, while some researchers have argued against classifying people by race and ethnicity on the grounds that it reinforces racial and ethnic divisions (Kaplan & Bennett, 2003; Fullilove, 1998; Bhopal, 2004), others have strongly argued for the importance of using these classifications for documenting health disparities. Because we know that substantial differences in physiological and health status between racial and ethnic groups do exist, relying on racial and ethnic classifications allows us to identify, monitor, and target health disparities. On the other hand, estimated disparities in health are entirely dependent upon who ends up in each racial/ethnic category, a process with arguably little objective basis beyond the slippery rule of social convention. If the categorization into racial groups is to be defended, we, as researchers, are obligated to employ a classification scheme that is practical, unambiguous, consistent, and reliable, but that also responds flexibly to evolving social conceptions.
Hence, the dilemma at the core of this debate is that while researchers need to monitor the health of ethnic minority populations in order to eliminate racial/ethnic health disparities, they must also "avoid the reification of underlying racist assumptions that accompanies the use of 'race', ethnicity and/or culture as a descriptor of these groups. We cannot live with 'race', but we have not yet discovered how to live without it". In mainstream drug and alcohol research, traditional ethnic group categories continue to be assessed in ways that suggest little critical reflection on the validity of the measurement itself. This is surprising given that social scientists since the early 1990s have critiqued the propensity of researchers to essentialize identity as something 'fixed' or 'discrete' and to neglect how social structure shapes identity formation. Recent social science literature on identity suggests that people are moving away from rooted identities based on place and towards a more fluid, strategic, positional, and context-reliant understanding of identity. This does not mean, however, that there is an unfettered ability to freely choose labels or identities, as if off of a menu.


Eligible participants were invited to the laboratory for additional screening

Alcohol dependent patients who underwent cue-exposure extinction training had larger decreases in neural alcohol cue-reactivity in mesocorticolimbic reward circuitry than patients who received standard clinic treatment. Cognitive bias modification training, which similarly trains individuals to reduce attentional bias towards alcohol cues, resulted in decreased neural alcohol cue-reactivity in the amygdala and reduced medial prefrontal cortex activation when approaching alcohol cues. These studies suggest that fMRI tasks may be sensitive to treatment response. Further, neurobiological circuits identified using fMRI can be used to predict treatment and drinking outcomes, providing unique information beyond that of self-report and behavior. Individuals with alcohol use disorder who return to use demonstrate increased activation in the mPFC to alcohol cues compared to individuals with AUD who remain abstinent. Moreover, the degree to which the mPFC was activated was associated with the amount of subsequent alcohol intake, but not with alcohol craving. Activation in the dorsolateral PFC to alcohol visual cues has been associated with a higher percentage of heavy drinking days in treatment-seeking alcohol dependent individuals. Increased activation in the mPFC, orbitofrontal cortex, and caudate in response to alcohol cues has also been associated with the escalation of drinking in young adults. Mixed findings have been reported for the direction of the association between cue-induced striatal activation and return to use: both increases and decreases in ventral and dorsal striatal activation to alcohol cues have been associated with subsequent return to use. Utilizing a different paradigm, Seo and colleagues found that increased mPFC, ventral striatal, and precuneus activation to individually tailored neutral imagery scripts predicted subsequent return to use in treatment-seeking individuals with AUD. Interestingly, brain activity during individually tailored alcohol and stress imagery scripts was not associated with return to use.

While initial evidence indicates that psychological interventions are effective at reducing mesocorticolimbic response to alcohol-associated cues, few studies have prospectively evaluated whether psychosocial interventions attenuate neural cue-reactivity and whether that attenuation in turn reduces drinking in the same population. Furthermore, no previous studies have used neural reactivity to alcohol cues to understand the mechanisms of brief interventions. Therefore, this study aimed to examine the effect of a brief intervention on drinking outcomes and neural alcohol cue-reactivity, and the ability of neural alcohol cue-reactivity to predict drinking outcomes. Specifically, this study investigated: 1) whether the brief intervention would reduce percent heavy drinking days or drinks per week in non-treatment-seeking heavy drinkers in the month following the intervention, and 2) whether the brief intervention would attenuate neural alcohol cue-reactivity. In the first case, we predicted significant effects on drinking based on the existing clinical literature; in the second case, we predicted decrements in alcohol's motivational salience based on the feedback about the participant's drinking levels relative to clinical recommendations and their personal negative consequences of drinking. The effects of neural cue-reactivity on subsequent drinking outcomes were tested in order to elucidate patterns of neural cue-reactivity that prospectively predict drinking behavior. Participants were recruited between November 2015 and February 2017 from the greater Los Angeles metropolitan area. Study advertisements described a research study investigating the effects of a brief health education session on beliefs about the risks and benefits of alcohol use. Inclusion criteria were as follows: engagement in regular heavy drinking, as indicated by consuming 5 or more drinks per occasion for men or 4 or more drinks per occasion for women at least 4 times in the month prior to enrollment, and a score of ≥8 on the Alcohol Use Disorders Identification Test.
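Restating the inclusion screen above as a minimal check (the numeric thresholds come from the text; the function shape and field names are my own, and the full exclusion criteria described later are not represented here):

    def meets_inclusion(heavy_episodes_past_month: int, audit_score: int) -> bool:
        """Heavy episode = 5+ drinks (men) or 4+ drinks (women) on one occasion;
        inclusion requires >= 4 such episodes in the prior month and AUDIT >= 8."""
        return heavy_episodes_past_month >= 4 and audit_score >= 8

    print(meets_inclusion(heavy_episodes_past_month=5, audit_score=12))  # True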

Exclusion criteria included being under the age of 21; currently receiving treatment for alcohol problems, a history of treatment in the 30 days before enrollment, or currently seeking treatment; a positive urine toxicology screen for any drug other than cannabis; a lifetime history of schizophrenia, bipolar disorder, or other psychotic disorder; serious alcohol withdrawal symptoms, as indicated by a score of ≥10 on the Clinical Institute Withdrawal Assessment for Alcohol-Revised; a history of epilepsy, seizures, or severe head trauma; non-removable ferromagnetic objects in the body; claustrophobia; and pregnancy. Initial assessment of the eligibility criteria was conducted through a telephone interview. Upon arrival, participants read and signed an informed consent form. Participants then completed a series of individual differences measures and interviews, including a demographics questionnaire and the Timeline Follow-Back to assess the quantity and frequency of drinking over the past 30 days. All participants were required to test negative on a urine drug test. A total of 120 participants were screened in the laboratory; 38 did not meet inclusion criteria and 12 decided not to participate in the trial, leaving 60 participants who enrolled and were randomized. Of the 60 individuals randomized, 46 completed the entire study. See Figure 1 for a CONSORT diagram for this trial. The study was a randomized controlled trial. Participants were assessed at baseline for study eligibility, and eligible participants returned for the randomization visit up to two weeks later. During their second visit, participants completed assessments and were then randomly assigned to receive a 1-session brief intervention or an attention-matched control condition. Immediately after the conclusion of the session, participants completed a functional magnetic resonance imaging scan to assess brain activity during exposure to alcohol cues and completed additional assessments. Participants were followed up 4 weeks later to assess alcohol use since the intervention through the 30-day Timeline Follow-Back interview. Participants who completed all study measures were compensated $160. The brief intervention consisted of a 30–45 minute individual face-to-face session based on the principles of motivational interviewing.

The intervention adhered to the FRAMES model, which includes personalized feedback, emphasizing personal responsibility, providing brief advice, offering a menu of change options, conveying empathy, and encouraging self-efficacy. In accordance with MI principles, the intervention was non-confrontational and emphasized participants' autonomy. The content of the intervention mirrored brief interventions to reduce alcohol use that have been studied with non-treatment-seeking heavy drinkers. The intervention included the following specific components: 1) giving normative feedback about frequency of drinking and of heavy drinking; 2) the Alcohol Use Disorders Identification Test score and associated risk level; 3) potential health risks associated with alcohol use; 4) placing the responsibility for change on the individual; 5) discussing the reasons for drinking and downsides of drinking; and 6) setting a goal and change plan if the participant was receptive. The aim of the intervention was to help participants understand their level of risk and to help them initiate changes in their alcohol use. Sessions were delivered by master's-level therapists who received training in MI techniques, including the use of open-ended questions, reflective listening, summarizing, and eliciting change talk, and in the content of the intervention. All sessions were audiotaped and rated by author MPK for fidelity and for quality of MI interventions using the Global Rating of Motivational Interviewing Therapists. On the 7-point scale, session scores ranged from 5.87 to 6.93 with an average rating of 6.61 ± 0.23, indicating that the MI techniques used in the intervention were delivered with good quality. Supervision and feedback were provided to therapists by author MPK following each intervention session. The treatment manual is available from the last author upon request. Participants randomized to the attention-matched control condition viewed a 30-minute video about astronomy. In the control condition there was no mention of alcohol or drug use beyond completion of research assessments. Both the intervention and attention-matched control sessions took place within the UCLA Center for Cognitive Neuroscience, in rooms separate from the neuroimaging suite. The following questionnaires and interviews were administered during the study: the 30-day Timeline Follow-Back, administered in interview format by trained research assistants to capture daily alcohol and marijuana use over the 30 days prior to the visit; the self-report Alcohol Use Disorders Identification Test, administered to assess drinking severity; and the Penn Alcohol Craving Scale, to measure alcohol craving over the past week. Participants also completed the Fagerstrom Test for Nicotine Dependence.

Lastly, participants completed a demographics questionnaire reporting, among other variables, age, sex, and level of education. The Alcohol Cues Task involves the delivery of oral alcohol or control tastes to elicit physiological reward responses and subjective urges to drink . During the task, each trial began with the presentation of a visual cue such that the words Alcohol Taste or Control Taste were visually presented to participants. This was followed by a fixation cross , delivery of the taste , and a fixation cross . Alcohol and water tastes were delivered through Teflon tubing using a computer controlled delivery system as described by Filbey and colleagues . Participants were instructed to press a button on a response box placed in their right hand upon swallowing. Alcohol tastes consisted of participants’ preferred alcoholic beverage . Beer could not be administered due to incompatibility of the alcohol administration device with carbonated liquids. The presentation of visual stimuli and response collection were programmed using MATLAB and the Psychtoolbox on an Apple MacBook running Mac OSX , and visual stimuli were presented using MRI compatible goggles . The Alcohol Cues Task was administered over the course of two runs with 50 trials/run. For the analysis of the cues task, all first-level analyses of imaging data were conducted within the context of the general linear model , modeling the combination of the cue and taste delivery periods convolved with a double-gamma hemodynamic response function , and accounting for temporal shifts in the HRF by including the temporal derivative. Alcohol and water taste cues were modeled as separate event types. The onset of each event was set at the cue period with a duration of 11 seconds. Six motion regressors representing translational and rotational head movement were also entered as regressors of no interest. Data for each subject were registered to the MBW, followed by the MPRAGE using affine linear transformations, and then normalized to the Montreal Neurologic Institute template. Registration was further refined using FSL’s nonlinear registration tool . The Alcohol Taste > Water Taste contrast was specified in the first level models. Higher level analyses combined these contrast images within subjects and between subjects . Age, sex, cigarette smoking status, and positive urine THC were included as covariates. Additional analyses evaluated if neural response to alcohol taste cues was predictive of drinking outcomes. Two models were run, evaluating percent heavy drinking days and the average number of drinks per week in the 4 weeks following the intervention or matched-control. Both models controlled for age, sex, cigarette smoking status, positive urine THC, and baseline percent heavy drinking days or average drinks per week depending on the drinking outcome model. Z-statistic images were thresholded with cluster-based corrections for multiple comparisons based on the theory of Gaussian Random Fields with a cluster-forming threshold of Z > 2.3 and a corrected cluster-probability threshold of p < 0.05 . This study examined the effect of a brief intervention on drinking outcomes, neural alcohol cue-reactivity, and the ability of neural alcohol cue-reactivity to predict drinking outcomes. Results did not find an effect of the brief intervention on alcohol use in this sample, and the intervention was not associated with differential neural alcohol cue reactivity. 
Exploratory secondary analyses revealed inverse relationships between differential neural activity in the precuneus and medial frontal gyrus and alcohol-related outcomes, but these relationships held across conditions. The lack of a main effect of the intervention on either drinking outcomes or neural alcohol cue-reactivity is contrary to the study hypothesis, whereby individuals assigned to the brief intervention condition were expected to show greater reductions in alcohol use compared to the control condition. In the present study, reductions in alcohol use were observed for both conditions, and it appears that simply participating in an alcohol research study at an academic medical center prompted notable behavioral changes. Reductions in drinking following study participation may be attributable to assessment reactivity, in which participants curb drinking after completing alcohol-related assessments and interviews. This phenomenon has been well documented across several assessment modalities, including the AUDIT and TLFB interviews, which were used in the present study. In addition, recent studies have highlighted the fact that single-session interventions, while efficacious in relatively large RCTs, have modest effect sizes.
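The first-level model described earlier (event regressors convolved with a double-gamma HRF, six motion regressors, and an Alcohol Taste > Water Taste contrast) can be illustrated with a self-contained numerical sketch for a single simulated voxel. Onsets, noise, and the omission of the temporal derivative are simplifications of my own; a real analysis would use FSL/SPM-style tools on preprocessed 4D data.

    # Single-voxel GLM sketch with a double-gamma HRF and motion regressors.
    import numpy as np
    from scipy.stats import gamma

    TR, n_vols = 2.0, 300
    frame_times = np.arange(n_vols) * TR

    def double_gamma_hrf(t):
        # canonical-style HRF: positive peak minus a scaled undershoot
        return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

    def event_regressor(onsets, duration, frame_times):
        # high-resolution boxcar convolved with the HRF, sampled at each TR
        dt = 0.1
        t_hr = np.arange(0, frame_times[-1] + TR, dt)
        box = np.zeros_like(t_hr)
        for o in onsets:
            box[(t_hr >= o) & (t_hr < o + duration)] = 1.0
        conv = np.convolve(box, double_gamma_hrf(np.arange(0, 32, dt)))[:len(t_hr)] * dt
        return np.interp(frame_times, t_hr, conv)

    rng = np.random.default_rng(1)
    alcohol_onsets = np.arange(10, 560, 44.0)   # hypothetical onsets (s)
    water_onsets = np.arange(32, 580, 44.0)
    X = np.column_stack([
        event_regressor(alcohol_onsets, 11.0, frame_times),  # 11 s cue+taste events
        event_regressor(water_onsets, 11.0, frame_times),
        rng.normal(0, 0.1, (n_vols, 6)),                     # 6 motion regressors
        np.ones(n_vols),                                     # intercept
    ])
    true_betas = np.array([1.0, 0.3] + [0.0] * 6 + [100.0])
    y = X @ true_betas + rng.normal(0, 1, n_vols)            # simulated voxel signal

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    contrast = np.array([1, -1] + [0] * 7)                   # Alcohol > Water
    print("Alcohol > Water contrast estimate:", contrast @ beta)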


Examining recognition and delayed recall was a critical first step to inform future diagnostic improvements

Marsland et al. (2015) did find that IL-6 and CRP were associated with worse memory and smaller hippocampal volumes in middle-aged adults; however, it was cortical grey matter volume, not the hippocampus, that mediated the relationship between inflammation and memory. Studies in adults with HIV have found that peripheral biomarkers of immune activation, but not the biomarkers examined in this study, were associated with frontal and temporal lobe regions. Interestingly, in post hoc analyses examining participants on ART who were virally suppressed, greater CRP was associated with a thinner parahippocampal gyrus. This finding may be in line with the Marsland et al. (2015) study. However, the current study had a small sample size, and several analyses were examined post hoc without accounting for multiple comparisons, so this finding should be interpreted cautiously. Integrating the aging and HIV literature, it is unclear if the association between peripheral inflammation, the medial temporal lobe, and episodic memory is consistently observed in mid-life. While the best method of determining the sample size necessary to detect a mediation effect is debated, it is still likely that this study's modest sample size of 92 is underpowered to detect a mediation effect, particularly given that large effect sizes were not expected. Therefore, the role of inflammation and its association with brain integrity and episodic memory in PWH should continue to be examined, particularly in larger samples with greater power to detect these associations. It will be particularly important to examine these relationships in PWH aged 65 and over, given that this is the age range in which the associations between inflammation, memory, and MCI/AD risk are more consistently found. One thing to note is that these peripheral inflammatory biomarkers were examined separately, as each biomarker may have a different relationship with memory and brain integrity. There is currently no gold-standard way to combine inflammation biomarkers into a single composite. However, some researchers have examined inflammation composites.
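For readers unfamiliar with the indirect-effect test being discussed (inflammation affecting memory through a medial temporal lobe pathway), a product-of-coefficients estimate with a percentile bootstrap can be sketched as follows. The variables and data are simulated, and this is not the study's actual model or result.

    # Sketch of a simple mediation (indirect effect) test with a bootstrap CI.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 92
    df = pd.DataFrame({"crp": rng.normal(2.0, 1.0, n)})
    df["mtl_volume"] = 7.5 - 0.2 * df["crp"] + rng.normal(0, 0.5, n)
    df["memory"] = 45 + 2.0 * df["mtl_volume"] + rng.normal(0, 3.0, n)

    def indirect_effect(d):
        a = smf.ols("mtl_volume ~ crp", d).fit().params["crp"]                  # path a
        b = smf.ols("memory ~ mtl_volume + crp", d).fit().params["mtl_volume"]  # path b
        return a * b

    boot = [indirect_effect(df.sample(n, replace=True)) for _ in range(2000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"indirect effect = {indirect_effect(df):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")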

Therefore, future studies may want to examine a wider array of biomarkers and employ an inflammation composite, particularly given that the impact of inflammation on brain integrity and memory may be due to the compounding effects of multiple inflammatory biomarkers. Additionally, these biomarkers were only examined at one time point, so a better understanding of how changes in these inflammatory biomarkers over time are associated with brain integrity and cognition is also needed. Lastly, this study examined peripheral inflammation. Peripheral inflammation can be assessed less invasively than CSF inflammation, which requires a lumbar puncture. However, peripheral inflammation may not be as reflective of neuroinflammation as CSF biomarkers are, although some studies have shown that plasma inflammation may be more strongly associated with cognition. Thus, future studies should ideally examine both plasma and CSF biomarkers to determine whether examining peripheral inflammation is sufficient. Ultimately, a better understanding of the role of inflammation and the most efficient way to measure it could help to inform interventions that lower inflammation in PWH, if future research indicates that lowering inflammation may be cognitively beneficial. In addition to the limitations discussed above, there are additional limitations that should be considered. First, the generalizability of the sample should be considered. As noted several times in this discussion, the age range may be too young to expect a significant number of participants to have started to accumulate AD pathology. Additionally, the sample was predominantly male, which is somewhat reflective of the current demographics of PWH in the United States. Nevertheless, there are known sex differences in HIV, AD, and inflammation that this project is underpowered to test but that should be examined in future studies. For example, women living with HIV are at greater risk of neurocognitive impairment, particularly in the domains of memory, speed of information processing, and motor function, potentially due to differences in psychosocial factors, comorbid conditions, and biological factors.

It is also known that women are at greater risk of AD. Additionally, participants with severe confounding comorbid conditions were excluded from this study, and this sample was characterized by relatively low current drug use and relatively high ART use. These factors are also known to impact cognitive and brain functioning; for example, cannabis use has been associated with better cognitive functioning and lower inflammation in PWH. As the HIV population continues to age, it will be important to understand if there are any associations between these sociodemographic variables and AD risk that are specific to PWH. Related to generalizability, one odd finding was the higher-than-expected number of participants with the APOE e2 allele. The percentage of participants with at least one e4 allele was somewhat comparable to the general population, with estimates ranging from 10% to 25% of people having at least one e4 allele. Additionally, it is known that Black/African American persons and persons of African ancestry have increased rates of the APOE e4 allele compared to non-Hispanic White people or those of European descent. Indeed, the CHARTER study has found an increased prevalence of the e4 allele in Black/African American participants as compared to non-Hispanic White participants. The APOE e2 allele is much less studied because it is rarer, but having an APOE e2 allele is associated with a lower-than-average risk of AD. In this study, the percentage of participants with at least one APOE e2 allele was higher than in the general population. Similar to the APOE e4 allele, the prevalence of APOE e2 is known to vary by ancestral continent and latitude. The APOE e2 allele prevalence is 9.9% in Africa, which is higher than the APOE e2 allele prevalence in Europe. Even accounting for these demographic differences, the prevalence of APOE e2 in this sample is high, and this overrepresentation of the APOE e2 allele may mean this group is, on average, at decreased risk of AD. This increased prevalence could be due to a selection bias.

Information on APOE e2 in PWH is very limited, and more research is certainly needed to understand AD risk in diverse groups of PWH. One minor point is that four participants with the APOE e2/e4 genotype were categorized as APOE e4-. The limited literature on this genotype does suggest a somewhat elevated risk of AD, but much less than that associated with the APOE e3/e4 or APOE e4/e4 genotypes. Therefore, the APOE e2/e4 participants were categorized as APOE e4- given the only slightly elevated risk. Other categorizations could be explored, although given the small number of APOE e2/e4 participants this is unlikely to make a significant difference. In addition to the potentially limited generalizability due to the demographics and clinical characteristics of this sample, this study examined a relatively modest sample size. A sample size of 92 is not necessarily small compared to other imaging studies. However, as highlighted throughout this discussion, this modest sample size could still limit the power to detect associations. Future studies in this area would benefit from improving statistical power either by enrolling a larger overall sample and/or by recruiting participants with memory impairment, particularly recognition impairment. This study is also limited in that it does not include an HIV-negative comparison group. Utilizing preexisting CHARTER data allowed for longitudinal analysis over 12 years and the ability to efficiently examine the neuroanatomical correlates of memory in middle-aged and older PWH. However, this study is therefore limited by the pre-defined CHARTER protocol and design. Specifically, CHARTER did not enroll HIV-negative comparison participants, which precludes examination of how the relationship between memory profiles and brain integrity differs by HIV serostatus. While there is ample HIV-negative middle-aging literature to compare these results to, many of these HIV-negative middle-aging studies are demographically and psychosocially different from this group. However, even with a good comparison group, it is difficult to discern the effect of HIV versus the neurotoxic effects of ART and the downstream consequences of ART. Nevertheless, future studies would benefit from a demographically and psychosocially similar HIV-negative group to better understand whether the associations between memory and neuroimaging correlates are specific to PWH or are seen regardless of HIV status. In the current study, delayed recall and recognition were examined separately rather than dichotomously splitting participants into aMCI versus non-aMCI groups or comparing HAND versus aMCI groups as in Sundermann et al. Additionally, examining delayed recall continuously was advantageous because it increases variability, and more subtle differences observed in mid-life may not be captured by diagnostic cut-points. However, associations with biological markers of AD have been found in PWH using aMCI criteria.

Therefore, the data could be reexamined using adapted aMCI criteria and HAND criteria to examine whether a more comprehensive approach to characterizing episodic memory is more sensitive to the medial temporal lobe than examining delayed recall and recognition separately. As described in the Methods section, differences in scanner by site were corrected for by regressing scanner from the data. Accounting for scanner was necessary given that prior CHARTER studies have shown that pooling MRI data from multiple sites is feasible but that there are documented differences between the scanners. However, accounting for scanner is essentially accounting for study site, which is somewhat problematic given that study site has been shown to be associated with the risk of neurocognitive impairment in the CHARTER study. For example, Marquine et al. found a significant effect of study site, specifically when comparing New York and San Diego, on the risk of neurocognitive impairment that was not fully accounted for by race/ethnicity differences. It is thought that differences in the risk of neurocognitive impairment are likely due to psychosocial and environmental factors associated with geographic location. These psychosocial and environmental factors could also impact brain integrity, and thus accounting for scanner, while necessary, may mask real differences in brain integrity. Therefore, future studies may want to employ a different statistical method that could account for differences in scanner while not eliminating the effect of study site. Relatedly, future studies could explore alternative ways to analyze the imaging data. For example, a priori regions of interest were selected given the interest in focusing on brain structures associated with HAND and aMCI. However, the FreeSurfer processing approaches provide a broad array of additional regions that could also be explored. Furthermore, additional data-driven analytic approaches exist, such as whole-brain voxel-based morphometry. This study took a hypothesis-driven approach, although examination of other regions of interest, such as subdivisions of the cingulate cortex, could be done in an exploratory fashion. Other imaging modalities such as diffusion tensor imaging to examine white matter integrity, arterial spin labeling to examine cerebral blood flow, MRS to examine neurochemical alterations, and amyloid PET imaging may also help to better understand episodic memory in PWH. Despite these limitations, this study has several clinical implications. This study showed that memory in these participants aged 45 to 68 was associated with prefrontal structures but not medial temporal lobe structures. This suggests that episodic memory in middle-aged PWH is more associated with frontally mediated etiologies such as HIV rather than with etiologies associated with the medial temporal lobe such as AD. Second, recognition impairment was quite variable over time. Due to this variability, recognition may not serve as a good clinical marker to help distinguish aMCI from HAND. However, this group of participants is considerably younger than the age at which late-onset AD typically presents; therefore, continued research is needed to examine whether recognition may be a useful clinical marker to differentiate aMCI and HAND in older age. This study suggests that in middle-aged PWH without severe confounding medical conditions and with high rates of ART use, there is not a greater than expected decline in delayed recall.
However, more research is needed to more definitively determine if there is accelerated memory decline in middle-aged PWH. Lastly, while there was some indication that peripheral CRP may be associated with memory, overall, most biomarkers of inflammation were not associated with episodic memory and the medial temporal lobe did not mediate a relationship between inflammation and episodic memory. However, given the limitations described above, ongoing research on this topic is needed.


Volumetric subcortical regions of interest included the hippocampus as well as the basal ganglia

The majority of older PWH are currently between the ages of 50 and 65, with a much smaller percentage over the age of 65. However, aging trends in the HIV population are predicted to continue. Additionally, age-associated physical comorbidities appear 5-10 years earlier in PWH, and there is evidence of premature brain aging. Due to the neurotoxic effects of HIV and ART, as well as medical comorbidities and possible accelerated brain aging, PWH also may have less brain reserve to compensate for accumulating neurodegenerative pathology. Therefore, cognitive deficits indicative of aMCI could appear earlier in PWH compared to HIV-negative peers. Taken together, examining PWH in mid-life is advantageous as it could identify those with early signs of aMCI when interventions may be particularly efficacious. After excluding participants who did not meet inclusion/exclusion criteria as detailed below, the study included 92 PWH between 45 and 68 years old. All participants underwent at least one structural MRI scan between 2008 and 2010; comprehensive neuropsychological, neuromedical, and neuropsychiatric evaluation; and a blood draw. Most participants completed at least one follow-up neuropsychological, neuromedical, and neuropsychiatric study visit, with visits occurring at 6-month intervals. Participants were drawn from five participating sites: Johns Hopkins University, Mt. Sinai School of Medicine, University of California San Diego, University of Texas Medical Branch, and University of Washington. All CHARTER study procedures were approved by local Institutional Review Boards, and all participants provided written informed consent. UC San Diego IRB approval was sought for the current study, and the IRB determined that this study was exempt. The CHARTER study aimed to recruit PWH reflecting the geographic and sociodemographic diversity of PWH around university-affiliated treatment centers in the U.S.; thus, CHARTER inclusion criteria were minimal and did not exclude participants with comorbid conditions that may impact cognitive function.

To determine the extent to which non-HIV-related comorbidities may have contributed to neurocognitive impairment, the developmental and medical histories of each participant were reviewed by Dr. R. K. Heaton and re-reviewed by an independent CHARTER clinician investigator. Participants with severe "confounding" comorbidities, as defined by Frascati criteria, were excluded from this project. Severe "confounding" comorbid conditions include comorbidities that could sufficiently explain neurocognitive deficits and thus preclude a HAND diagnosis. During clinician review, the time course of comorbidities in relation to HIV and cognitive decline, as well as the severity of comorbidities, was considered when making comorbidity classifications. Comorbid conditions that were reviewed and considered include a history of neurodevelopmental disorders, cerebrovascular events, systemic medical comorbidities, non-HIV neurological conditions, and substance-related comorbidities. This comorbidity classification system has been shown to have excellent inter-rater reliability. The decision to exclude confounding comorbidities was further supported by a recent CHARTER paper showing that those with severe "confounding" comorbidities had worse brain integrity, whereas those with moderate comorbidities had brain abnormalities fairly equivalent to those with mild comorbidities. Additionally, CHARTER recruited a wide range of ages. To study the effect of aging with HIV, the age range for the current study was restricted to participants who were aged 45 or older at the time of the MRI scan. Additionally, one participant was excluded from the study because their T1 structural MRI scan did not yield usable data. Tests of memory in the CHARTER study included the Hopkins Verbal Learning Test-Revised and the Brief Visuospatial Memory Test-Revised.

The HVLT-R and the BVMT-R each include three learning trials; a long-delay free recall trial, in which participants are asked to recall the stimuli previously presented; and a recognition trial, in which participants are presented both target and non-target stimuli and asked whether each stimulus was presented in the learning trials. The delayed recall raw score is the total number of items correctly recalled during the long-delay free-recall trial. A recognition discrimination raw score was calculated by subtracting false positives from the total number of true positives. Note that this score reflects recognition discriminability, but it will be referred to simply as “recognition” throughout the text. Both the HVLT-R and BVMT-R have six alternate forms to attempt to correct for practice effects. Raw recognition scores were converted to Z-scores that account for demographic variables using normative data from the HNRP. Given that a practice-effect correction was not available for recognition and participants had a varying number of previous administrations, the number of prior neuropsychological evaluations was included as a covariate in statistical analyses examining recognition. Raw delayed recall scores were converted to T-scores that account for demographic variables and practice effects using normative data from the HNRP. HVLT-R and BVMT-R recognition Z-scores were averaged to create a recognition composite, and HVLT-R and BVMT-R delayed recall T-scores were averaged to create a delayed recall composite. Test-retest reliability estimates for HVLT-R recognition range from r = 0.27 to 0.40, and for delayed recall from r = 0.36 to 0.39. HVLT-R recognition and delayed recall show adequate convergent validity with other tests of verbal memory, and the BVMT-R recognition and delayed recall trials have been shown to have adequate convergent validity with other tests of visual memory. Recognition and delayed recall were initially examined continuously rather than dichotomously splitting participants into impaired versus unimpaired groups. Examining recognition and delayed recall continuously is advantageous because it preserves variability; more subtle differences observed in mid-life may not be captured by diagnostic cut-points.
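To make the scoring and composite construction concrete, the following is a minimal R sketch of the calculations described above; the data frame and column names are illustrative, not the CHARTER scoring code.

# Recognition discriminability = true positives - false positives; demographically
# adjusted scores are then averaged across the two tests to form composites.
scores <- data.frame(
  hvlt_hits = c(11, 9), hvlt_fp = c(1, 3),                    # HVLT-R recognition hits / false positives
  bvmt_hits = c(6, 5),  bvmt_fp = c(0, 2),                    # BVMT-R recognition hits / false positives
  hvlt_recog_z = c(0.4, -1.2), bvmt_recog_z = c(0.1, -0.8),   # normed recognition Z-scores
  hvlt_recall_t = c(52, 38), bvmt_recall_t = c(48, 41)        # normed delayed recall T-scores
)
scores$hvlt_discrim <- scores$hvlt_hits - scores$hvlt_fp      # raw discriminability
scores$bvmt_discrim <- scores$bvmt_hits - scores$bvmt_fp
scores$recog_composite  <- rowMeans(scores[, c("hvlt_recog_z", "bvmt_recog_z")])
scores$recall_composite <- rowMeans(scores[, c("hvlt_recall_t", "bvmt_recall_t")])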

However, when examining the linear regression analyses from aim 1, the recognition analyses did not meet all assumptions for linear regression. Therefore, recognition was dichotomized into an impaired recognition group and an unimpaired recognition group for all analyses. Domain T-scores were used to examine processing speed and psychomotor performance. Raw scores from individual tests were converted to T-scores that adjust for the effects of age, sex, education, race/ethnicity, and practice using center-specific normative data, and the T-scores from all tests in a domain were then averaged to obtain a domain T-score. The Wide Range Achievement Test-III, which has been shown to be a measure of premorbid verbal IQ in PWH, was used to characterize the sample. Participants completed a standardized CHARTER neuromedical evaluation at each study timepoint. HIV serostatus was determined by enzyme-linked immunosorbent assay with a confirmatory Western blot. The following HIV disease characteristics were collected from most participants at each visit: 1) current CD4 count measured via flow cytometry; 2) nadir CD4 count measured via a combination of self-report and medical records; 3) CDC HIV staging; 4) plasma HIV RNA measured by ultra-sensitive PCR; 5) estimated duration of HIV disease collected via self-report; and 6) current ART regimen. Comorbid medical conditions (diabetes, hypertension, hyperlipidemia) were determined by self-report or by taking medication for the condition. Comorbid psychiatric and substance use conditions were determined with the Composite International Diagnostic Interview, which is consistent with the DSM-IV. Additional details on the standardized CHARTER neuromedical assessment can be found in Heaton et al. Additionally, CHARTER participants also have APOE genotype data. APOE genotype was dichotomized into APOE e4+ and APOE e4-. FreeSurfer version 7.1.1 was used to obtain cortical thickness and subcortical volume measures for several regions of interest, with an approach similar to earlier CHARTER work. After FreeSurfer processing, all T1 scans were visually inspected; in addition to the one participant excluded from all analyses as described above, one participant's hippocampi were markedly overestimated, and therefore their hippocampal data were excluded from analyses. Neocortical thickness regions of interest included medial temporal lobe, prefrontal, and primary motor cortical areas, and volumetric subcortical regions of interest included the hippocampus as well as the basal ganglia. Specific structures were analyzed separately. Left and right volumes or cortical thicknesses for these regions of interest were averaged. In post hoc analyses, if there were significant findings for the averaged region of interest, the left and right regions were examined separately to assess laterality. Differences in scanner from site to site were corrected for by regressing scanner out of the imaging data, given that differences between scanners have been well documented in prior CHARTER work.
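As one way to picture the scanner correction described above, here is a minimal R sketch that residualizes an ROI measure on scanner; the variable names are illustrative, and the original correction may have been implemented differently.

roi <- data.frame(
  scanner = factor(c("A", "A", "B", "B", "C", "C")),        # scanner/site identifier
  hippocampus_vol = c(3800, 3950, 3600, 3700, 3900, 4050)   # illustrative ROI volumes
)
fit <- lm(hippocampus_vol ~ scanner, data = roi)             # model the scanner effect
roi$hippocampus_adj <- resid(fit) + mean(roi$hippocampus_vol) # scanner-adjusted values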

Differences in head size were accounted for by including estimated total intracranial vault volume as a covariate in volumetric analyses, and mean cortical thickness was included as a covariate in cortical thickness analyses. Additionally, age was included as a covariate to adjust for normal effects of age on the brain. Five inflammation biomarkers were examined in this study; all have been found to be elevated in the context of HIV and aMCI. Plasma for biomarker assays was collected from all participants via routine venipuncture into EDTA vacuum tubes. All plasma biomarkers were measured using commercially available, multiplex, bead-based immunoassays according to manufacturer protocols; CRP was plated on a separate immunoassay because it required a different dilution than the other plasma biomarkers. Biomarker precision was ensured by assaying specimens in duplicate and repeating measurements with coefficients of variation greater than 20% or outliers more than 4 standard deviations from the mean. Additionally, 10% of all assays were repeated to ensure batch consistency. The concentrations of these biomarkers typically have skewed distributions; therefore, the data were log-transformed prior to statistical analysis. Logistic regression was used for the dichotomous recognition analyses. Multivariable linear regression was used for continuous outcomes in aims 1b, 1c, and part of 1d. Primary predictors were tested separately. Age and the relevant imaging covariate were included in every model, and the number of prior neuropsychological evaluations was included as a covariate in recognition models. Additional covariates (comorbidities, HIV disease characteristics, APOE status) were selected by evaluating the bivariate relationships between potential covariates and outcomes: if a potential covariate was associated with an outcome at p < 0.10, it was entered into the model, and, given the number of possible additional covariates, it was retained in the full model only if it remained associated with the outcome at p < 0.10. Power analysis was conducted using G*Power; these analyses were powered to detect medium effect sizes with a two-tailed α = 0.05 and up to 5 covariates. Current CDC guidelines recommend immediately initiating ART and maintaining an undetectable viral load. Although only 80% of PWH are engaged in care and 57% of PWH in the United States are virally undetectable, there is a trend toward examining PWH who are virally suppressed and on ART, particularly in studies examining biological processes such as inflammation and neuroimaging. Therefore, in post hoc analyses of delayed recall, processing speed, and psychomotor skills, participants who were not ideally treated for HIV disease or who had a detectable viral load were excluded. Additionally, given the significant effects of methamphetamine on the brain, participants with a current methamphetamine use disorder were also excluded in the post hoc analyses. Dichotomous recognition models were not re-examined because, with these exclusions, only 7 participants were impaired on the recognition composite. This aim utilized multi-level modeling to examine recognition and delayed recall across follow-up visits, with outcomes examined separately. The "lme4 version 1.1-30" R package was used to conduct the mixed-effects regressions.
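The following R sketch illustrates the aim 1 modeling approach described above — log-transforming a biomarker, screening a candidate covariate at p < 0.10, and fitting the logistic and linear models. It uses simulated data and assumed variable names rather than the study's actual analysis code.

set.seed(1)
n <- 92
dat <- data.frame(
  age = rnorm(n, 55, 6),
  icv = rnorm(n, 1.5e6, 1e5),                  # estimated intracranial vault volume
  hippocampus_adj = rnorm(n, 3800, 300),       # scanner-adjusted ROI measure
  n_prior_evals = rpois(n, 3),                 # prior neuropsychological evaluations
  crp = rlnorm(n, 0, 1),                       # skewed inflammation biomarker
  recog_impaired = rbinom(n, 1, 0.3),          # dichotomized recognition
  recall_composite = rnorm(n, 45, 10)          # delayed recall composite (T-score metric)
)
dat$log_crp <- log(dat$crp)                    # biomarkers log-transformed before analysis

# Covariate screening: a candidate covariate enters the model only if p < 0.10
p_screen <- coef(summary(lm(recall_composite ~ log_crp, data = dat)))["log_crp", "Pr(>|t|)"]

# Logistic regression for dichotomous recognition (prior evaluations as covariate)
fit_recog <- glm(recog_impaired ~ hippocampus_adj + age + icv + n_prior_evals,
                 family = binomial, data = dat)

# Multivariable linear regression for the continuous delayed recall composite
fit_recall <- lm(recall_composite ~ hippocampus_adj + age + icv, data = dat)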
Mixed-effects logistic regression models were used to examine dichotomous recognition as the outcome, and linear mixed-effects models were used to examine continuous delayed recall. Analyses included a random intercept and a random slope for years since baseline. A cross-level interaction tested whether baseline medial temporal lobe structure is associated with longitudinal recognition impairment or decline in delayed recall. Between-person covariates included age at baseline, the imaging covariate, and covariates identified in aim 1. Power analysis was conducted using RMASS2, and observed attrition was accounted for in these estimates; the analyses were powered to detect small-to-medium effect sizes with a two-tailed α = 0.05. Multi-level modeling was selected because it uses all available data and gives heavier weight to participants with more waves of data; thus, this methodology can accommodate participants who missed a follow-up visit and samples with differing numbers of follow-up assessments.
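A minimal lme4 sketch of these longitudinal models is shown below, using simulated long-format data (one row per visit) and illustrative variable names; it is intended only to show the model structure, not to reproduce the study's code.

library(lme4)

set.seed(2)
long <- data.frame(
  id = rep(1:50, each = 4),
  years = rep(c(0, 0.5, 1, 1.5), times = 50),     # years since baseline visit
  age_bl = rep(rnorm(50, 55, 6), each = 4),       # age at baseline
  mtl_bl = rep(rnorm(50, 2.8, 0.3), each = 4),    # baseline medial temporal lobe measure
  recog_impaired = rbinom(200, 1, 0.3),
  recall_t = rnorm(200, 45, 10)
)

# Mixed-effects logistic regression: cross-level interaction of baseline MTL with time,
# random intercept and random slope for years since baseline
m_recog <- glmer(recog_impaired ~ mtl_bl * years + age_bl + (1 + years | id),
                 family = binomial, data = long)

# Linear mixed-effects model for the continuous delayed recall outcome
m_recall <- lmer(recall_t ~ mtl_bl * years + age_bl + (1 + years | id), data = long)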


Previous studies have focused primarily on alcohol users but have not excluded participants for nicotine use

This suggests that brain areas implicated in processes such as reward and cognition show the most consistent gray matter atrophy in alcohol-dependent individuals, but it is unclear whether overall amount of alcohol consumption or aspects of dependence severity explain these findings. Furthermore, some of the neuroimaging studies focusing on alcohol users have not mentioned whether the alcohol users also used nicotine, did not examine the effects of nicotine use on brain structure, did not control for nicotine use in their analyses, assessed nicotine use with a dichotomous questionnaire, or simply mentioned the number of smokers in the study. This makes it difficult to ascertain whether the observed neural effects were attributable to alcohol use, nicotine use, or both. Similar to studies of alcohol use effects on brain morphometry, several MR imaging studies have been conducted to specifically examine the effects of nicotine use on brain structure. As with studies of alcohol users, studies of cigarette smokers have attempted to quantify and incorporate a lifetime use variable, such as pack-year smoking history, which has been found to correlate negatively with PFC gray matter densities as well as gray matter volume in the middle frontal gyrus, temporal gyrus, and the cerebellum. Interestingly, Brody et al. found no significant association between pack-year smoking history and regions of interest identified as showing significant between-group differences, such as the left dorsolateral PFC, ventrolateral PFC, and left dorsal ACC. Given these conflicting findings, it is uncertain whether quantity variables, such as pack-year smoking history, account for many of the gray matter volume reductions observed in nicotine dependence. Unlike studies of alcohol-dependent individuals, some studies of nicotine-dependent individuals have examined symptoms of dependence severity in relation to brain morphometry.

For example, the Fagerström Test for Nicotine Dependence (FTND), which was not associated with pack-year smoking history, was not correlated with PFC or insular gray matter density. The lack of a significant correlation between FTND scores and pack-year smoking history suggests that quantity of use and dependence severity symptoms may be unrelated in nicotine dependence and thus have distinct relationships with brain structure. Overall, gray matter degradation has been observed in the thalamus, medial frontal cortex, ACC, cerebellum, and nucleus accumbens in nicotine-dependent individuals. Given these widespread results, a meta-analysis was conducted, which found that only the left ACC showed significant gray matter reductions in nicotine-dependent individuals compared to healthy controls. While studying primarily alcohol- or nicotine-using populations carries unique benefits, specific investigation is needed into heavy drinking smokers, as past studies have shown compounded neurocognitive effects as well as pronounced gray matter volume reductions in heavy drinking smokers compared to nonsmoking light drinkers. Chronic cigarette smoking has been found to have negative consequences for neurocognition during early abstinence from alcohol; in one study, after 8 months of abstinence, actively smoking alcohol-dependent individuals performed worse on several neurocognitive measures, such as working memory and processing speed, than never-smoking alcohol-dependent individuals. Additionally, formerly smoking alcohol users were found to perform more poorly than never-smoking alcohol users at this time point. These findings not only illustrate the contribution of smoking status to neurocognitive performance but also establish the clinical relevance of nicotine use in heavy drinkers. This relevance, paired with the compounded neurocognitive and morphometric effects, further merits investigation into this unique sub-population of substance users.

The present work aimed to ascertain the effects of alcohol and nicotine dependence severity on gray matter density in a sample of 39 non-treatment-seeking heavy drinking smokers using standard voxel-based morphometry (VBM). While some imaging studies have previously investigated the relationship of FTND scores with brain structure, to our knowledge no imaging study to date has examined how alcohol dependence severity relates to gray matter density in heavy drinking smokers. Thus, the goal of this study was to examine whether alcohol or nicotine dependence severity correlated with gray matter density in heavy drinking smokers, while controlling for age, gender, and total intracranial volume (ICV). By examining dependence severity scores in addition to quantity-of-use variables, we may be able to capture how dependence is related to structural changes in the brain in a way that is not captured by variables focused solely on quantity of use. Based on previous findings, we hypothesized that gray matter density would be negatively related to quantity of both alcohol and nicotine use in regions such as the middle frontal gyrus. We also hypothesized that dependence severity scores would uniquely relate to gray matter atrophy in several regions previously identified across meta-analyses of voxel-based morphometry studies, such as the ACC, dorsal striatum, and insula. The subjects for the present study are a subset of participants from a medication development study of varenicline, naltrexone, and their combination in a sample of heavy drinking smokers. Subjects participated in the medication component of the study, details of which have been described in a previous publication, and a sub-sample was invited to complete a neuroimaging session.

Participants were recruited from the greater Los Angeles area through online and print advertisements with the following inclusion criteria: 1) between 21 and 55 years of age; 2) reported smoking at least 7 cigarettes per day; and 3) endorsed heavy drinking per the National Institute on Alcohol Abuse and Alcoholism (NIAAA) guidelines: for men, >14 drinks per week or ≥5 drinks per occasion at least once per month over the last 12 months; for women, >7 drinks per week or ≥4 drinks per occasion at least once per month over the last 12 months. Participants were excluded from the study based on the following criteria: 1) had a period of smoking abstinence greater than 3 months within the past year; 2) reported use of illicit substances within the last 60 days, confirmed via a positive urine toxicology screen at the assessment visit; 3) endorsed a lifetime history of psychotic disorders, bipolar disorders, or major depression with suicidal ideation; 4) endorsed moderate or severe depression symptoms, as measured by a score of 20 or higher on the Beck Depression Inventory-II; 5) reported current use of psychotropic medications; 6) reported any MRI contraindications, such as metal fragments in the body or pregnancy; and 7) reported MRI constraints, such as left-handedness or color blindness. As no Structured Clinical Interview for the Diagnostic and Statistical Manual (4th or 5th edition) Axis I Disorders was administered, drinking status for participants was determined solely via NIAAA heavy drinking guidelines. After a telephone screening to determine eligibility, participants came to the laboratory for a screening visit, during which written informed consent was obtained. A urine cotinine test along with carbon monoxide levels verified self-reported smoking patterns, and a breath alcohol concentration of 0.00 was required at the beginning of each visit. Eligible participants then completed a physical examination and, if still eligible, began taking medication for nine days, as described previously elsewhere. Participants received varenicline alone, naltrexone alone, their combination, or matched placebo. After the medication period, participants who were eligible for the MRI session were selected at random, given an additional three days of medication, and scanned within those three days. To our knowledge, no studies to date have tested the effects of varenicline and naltrexone on structural MRI measures; however, to ensure that there were no significant gray matter differences between the medication groups, we conducted a whole-brain one-way between-subjects ANOVA. A total of 40 subjects participated in the neuroimaging study. The Institutional Review Board of the University of California, Los Angeles approved all procedures for the study. Participants were administered the Alcohol Dependence Scale (ADS), the FTND, and the 30-day Timeline Follow-back (TLFB). The ADS is a 25-item self-report measure that identifies elements of alcohol dependence severity over the past 12 months, such as withdrawal symptoms and impaired control over alcohol use, on a scale ranging from zero to 47. The FTND is a six-item self-report measure that captures features of nicotine dependence severity on a scale of zero to 10, and questions on this measure are not confined to a specific time frame of substance use.

The TLFB assessed the daily number of alcoholic drinks and cigarettes participants consumed in the 30 days before the scan, from which mean drinks per drinking day (DPDD) and cigarettes per day (CPD) were calculated. All images were obtained with a 3.0 Tesla Siemens Trio MRI scanner at the Center for Cognitive Neuroscience at UCLA. As we expected no structural differences unrelated to gray and white matter volumes to be present in the sample, and in keeping with past studies employing methodologies similar to ours, we chose to follow standard VBM protocols and spatially normalize the T1-weighted raw images to the same stereotactic space first. To do this, each image was registered to a standard template in Montreal Neurological Institute space using Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra (DARTEL). After spatial normalization, the resulting DARTEL-warped T1-weighted images were segmented into three tissue classes. The segmented images were then modulated, a process by which the images are multiplied by the Jacobian determinants produced for each image during spatial normalization; the advantage of modulation is that it corrects for individual brain size and for brain matter expansion or contraction that occurs during normalization. The sample homogeneity of the resulting images was checked using a mean covariance boxplot, which assesses the covariance among the sample of images across participants. Higher covariance values are preferred, indicating that an image is more similar to the other volumes in the sample, whereas a lower covariance value signals a potential outlier. The mean covariance value for the current sample was 0.74. One participant had a covariance value greater than 2 standard deviations from the mean; upon inspection, the image appeared to have failed segmentation due to motion artifact and was excluded from further analyses, resulting in a total of 39 subjects. Finally, modulated images were smoothed using an 8-mm full width at half maximum Gaussian kernel, and the smoothed, modulated images were used for subsequent analyses. Two separate multiple regression models were built. The first analyzed the relationship between symptoms of dependence severity and gray matter density and included ADS scores and FTND scores as predictor variables. The second model examined the relationship between quantity of substance use and gray matter density, with DPDD and CPD entered as predictor variables. Age, gender, and ICV were entered as covariates in both models. The significance level was set at p < 0.001, uncorrected, with an absolute threshold mask value of 0.1; a spatial extent threshold of 78 voxels was empirically determined per standard VBM protocol and used for analyses. Additionally, post hoc achieved power analyses were conducted using effect sizes calculated with Cohen's f². Previous research has indicated that gray matter tissue can regenerate within 14 days of alcohol abstinence in alcohol-dependent patients and that gray matter regeneration is most profound within the first week to month of abstinence. Given these findings, we examined whether days since the last drinking day before the imaging session correlated with gray matter density at the whole-brain level. Days since the last drinking day was computed for each participant based on the TLFB information collected at the time of image acquisition.
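To make the two regression designs concrete, the sketch below fits the same models at a single simulated "voxel" in R and computes Cohen's f² as R²/(1 − R²). It is purely illustrative, with simulated values and assumed variable names, since the actual analysis was run voxel-wise in a standard VBM pipeline rather than in R.

set.seed(3)
n <- 39
vbm <- data.frame(
  gmd  = rnorm(n, 0.5, 0.05),    # gray matter density at one voxel
  ads  = rpois(n, 12),           # Alcohol Dependence Scale score (0-47)
  ftnd = rpois(n, 5),            # Fagerstrom Test for Nicotine Dependence (0-10)
  dpdd = rnorm(n, 6, 2),         # drinks per drinking day
  cpd  = rnorm(n, 12, 4),        # cigarettes per day
  age  = rnorm(n, 35, 8),
  male = rbinom(n, 1, 0.6),      # gender covariate
  icv  = rnorm(n, 1.5e6, 1e5)    # total intracranial volume
)

# Model 1: dependence severity predictors; Model 2: quantity-of-use predictors
m_severity <- lm(gmd ~ ads + ftnd + age + male + icv, data = vbm)
m_quantity <- lm(gmd ~ dpdd + cpd + age + male + icv, data = vbm)

# Cohen's f^2 for a fitted model: R^2 / (1 - R^2)
r2 <- summary(m_severity)$r.squared
f2 <- r2 / (1 - r2)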
This analysis included days since the last drinking day as a predictor variable, with age, gender, ICV, and ADS scores as covariates. Furthermore, to understand whether any of the effects were related to cannabis use within the current sample, we examined the relationships between frequency of cannabis use and the drinking and nicotine variables using non-parametric Spearman's correlations. Cannabis use was assessed using a single-item categorical question asking, "On average, how often do you smoke marijuana?" The purpose of the present study was to examine the relationship of quantity of alcohol/nicotine use and of alcohol/nicotine dependence severity with gray matter density in heavy drinking smokers. Similarly, some prior studies that examined nicotine users did not establish exclusion criteria based on alcohol use.
