Self-reports by medical users in California and Canada indicate that a substantial proportion substitute marijuana for alcohol and other drugs. In 1996, California voters passed an initiative, Proposition 215, which legalized marijuana use for medical purposes. Because the Proposition was implemented in an abrupt and uniform manner, legalization presented a “natural experiment.” To estimate the causal impact of legalization on suicide, annual time series of total, gun, and non-gun suicides were analyzed by comparing California with an estimated counterfactual state in a Synthetic Control Group design. The synthetic control time series for California were constructed as a weighted combination of 41 states that did not legalize medical marijuana during the time frame. Post-intervention differences between California and its constructed control time series were interpreted as the causal effect of the medical marijuana law on suicide. Findings reveal that rates of total suicide and gun suicide dropped significantly in the aftermath of Proposition 215. Findings also reveal, however, that legalization’s impact on non-gun suicides is considerably smaller, and arguably no different from what would be expected to occur by chance. Confidence in these findings is underscored by the methodological approach undertaken in the study. A strength of the Synthetic Control Group design is that it allows us to examine the net effect of medical marijuana legalization on suicide. Despite the strengths of this design, important limitations remain, many of which present opportunities for future research.
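The construction of a synthetic control as a weighted combination of donor states can be sketched as follows. This is a minimal illustration with fabricated data, not the study's actual 41-state donor pool or suicide series: the five donor states, the normal distributions, and the rate levels are assumptions made purely for demonstration. As in standard synthetic control estimation, the weights are constrained to be nonnegative and to sum to one.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Fabricated pre-intervention suicide rates: 10 years for the treated
# state and a hypothetical 5-state donor pool of untreated states.
pre_treated = rng.normal(12.0, 0.5, size=10)
pre_donors = rng.normal(12.0, 1.0, size=(5, 10))

def loss(w):
    # Squared distance between the treated series and the weighted
    # combination of donor series over the pre-intervention period.
    return np.sum((pre_treated - w @ pre_donors) ** 2)

# Weights form a convex combination: nonnegative, summing to one.
cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}
bounds = [(0.0, 1.0)] * 5
res = minimize(loss, x0=np.full(5, 0.2), bounds=bounds, constraints=cons)
w = res.x

# Post-intervention gap = treated outcome minus synthetic control outcome;
# a persistent negative gap is read as a reduction attributable to the law.
post_treated = rng.normal(10.0, 0.5, size=8)
post_donors = rng.normal(12.0, 1.0, size=(5, 8))
gap = post_treated - w @ post_donors
print(gap.mean())
```

With the fabricated drop built into the post-period data, the average gap comes out negative; in the actual study the analogous gap is what is interpreted as the causal effect.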
Because we examine suicide trends over eight post-intervention years, we are fairly confident that the effects are permanent. Because our time series end in 2005, however, it is difficult to generalize our result to subsequent years. We are limited by the fact that medical marijuana laws began to proliferate across the U.S. after 2005, threatening to contaminate the “donor pool” of untreated states. In virtually all the states that legalized medical marijuana after 2005, moreover, reforms were not implemented abruptly or uniformly, making confident causal interpretations more difficult. Another limitation that presents a future direction concerns the mechanisms that may account for the study’s findings. What mechanisms are responsible for the sharp decline in total, and especially gun, suicides following medical marijuana legalization in California? We proposed mechanisms related to the substitution of marijuana for alcohol and other related substances; marijuana use itself, which may reduce the motivation for suicide; the inability of medical marijuana patients to purchase firearms; and changes in the culture of recreational substance use, leading to fewer unsupervised opportunities to commit suicide in the home. Each of these pathways should be tested, although many will require additional data collection. For example, one likely fruitful research direction would be to collect annual data on alcohol consumption in California and assess whether it is a plausible mechanism by which medical marijuana legalization could cause a reduction in gun suicides. Beyond adjudicating among these pathways, testing mechanisms could yield insight into why we do not find the expected reduction in non-gun suicides following legalization. We do not have the data to test these mechanisms here, but it will be essential for future researchers to do so.
Correlation coefficients between environmental and biomarker measurements are widely used in environmental health assessments and epidemiology to characterize the exposure associations between environmental media and human body burdens. As a result, considerable attention and effort have been given to the interpretation of these coefficients. However, there is limited information available on how the variance in environmental measurements, the relative contribution of exposure sources, and the elimination half-life affect the reliability of the resulting correlation coefficients. To address this information gap, we conducted a simulation study of various home-based exposure scenarios to explore the impacts of pathway-specific scales of exposure variability on the resulting correlation coefficients between environmental and biomarker measurements. Biomonitoring data, including those from blood, urine, hair, etc., have been used extensively to identify and quantify human exposures to environmental and occupational contaminants. However, because the measured levels in biologic samples result from multiple sources, exposure routes, and environmental media, the levels mostly fail to reveal how the exposures are linked to the source or route of exposure. Thus, comparing biologic samples with measurements from a single environmental medium often yields correlations that are weak and lack statistical significance. In addition, cross-sectional biological sample sets that track a single marker have large population variability and do not capture longitudinal variability, especially for compounds with relatively short biologic half-lives, which can be on the order of days, as for pesticides and phthalates.
Therefore, when the day-to-day variability of biological sample measurements is large, using biomarker samples with a small number of biological measurements as a dependent variable in epidemiologic studies can result in exposure misclassification and raise questions of reliability.
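The attenuation described above can be demonstrated with a small simulation. All quantities here (sample sizes, variance magnitudes, the additive log-scale model) are illustrative assumptions rather than values from the study; the point is only that a single spot biomarker sample subject to large day-to-day variability correlates far more weakly with the environmental measure than a multi-day average does.

```python
import numpy as np

rng = np.random.default_rng(1)
n_people, n_days = 500, 30

# Assumed model: each person's long-term exposure (log scale) drives both
# the environmental measurement and the biomarker; the biomarker also
# carries large day-to-day variability (sd = 2.0 vs. sd = 1.0 between people).
true_exposure = rng.normal(0.0, 1.0, size=n_people)
env_measure = true_exposure + rng.normal(0.0, 0.3, size=n_people)
daily_biomarker = true_exposure[:, None] + rng.normal(0.0, 2.0, size=(n_people, n_days))

def corr(x, y):
    return np.corrcoef(x, y)[0, 1]

# One spot sample per person vs. each person's 30-day average.
r_single = corr(env_measure, daily_biomarker[:, 0])
r_avg = corr(env_measure, daily_biomarker.mean(axis=1))

print(round(r_single, 2), round(r_avg, 2))
```

Averaging over repeated samples suppresses the day-to-day noise term, so the averaged biomarker recovers a much stronger correlation with the environmental measurement than any single-day sample.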
For chemicals frequently found at higher levels in indoor residential environments than in outdoor environments, it is common to assume that the major contributions to cumulative intake are home-based exposure and/or food ingestion. This simplification can be further justified because people generally spend more than 70 percent of their time indoors. Compounds with significant indoor sources and long half-lives in the human body (on the order of years for chemicals such as polybrominated diphenyl ethers, PBDEs) have been found to have positive associations between indoor dust or air concentrations and serum concentrations in U.S. populations. On the other hand, extant research has not reported significant associations between indoor samples and biomarkers for chemicals primarily associated with food-based exposures, for example, bisphenol-A [18] and perfluorinated compounds. For chemicals with both home- and food-based exposure pathways and short half-lives in the body, as is the case for many pesticides, a significant association between indoor samples and biomarkers is found less frequently or is relatively weak compared to PBDEs. To better interpret these types of findings, we provide here a simulation study of various exposure scenarios to explore the role of the chemical properties and exposure conditions that are likely to give rise to a significant contribution from indoor exposures. We then assess, for these situations, the magnitude and variance of the associated correlation coefficients between biomarker and indoor levels.
The objectives of this study are to generate simulated correlation coefficients between environmental measurements and biomarkers under different contributions of home-based exposure to total exposure and different day-to-day and population variability of intake from both residential environments and food, to interpret the contribution of home-based exposure to human body burden for two hypothetical compounds whose half-lives are on the order of days and years, and to determine how the pattern of variability in exposure attributes impacts the resulting correlation coefficients linking biomarker levels to exposure media concentrations.
In this study, our first step is to synthetically generate daily environmental concentrations and food exposure concentrations based on variations of day-to-day intake from residential environments and food as well as different relative contributions of home-based and food-based exposure. As different chemicals are likely to have different relative contributions from the home-based and food-based exposure pathways, we conducted our simulations across the full range of relative contributions between the two pathways to address all plausible scenarios for various compounds. We combine the simulated home-based exposures associated with indoor environmental concentrations and food concentrations, assuming that the total intake results only from home-based exposure and food ingestion. From these inputs we estimate time-dependent biomarker concentrations using a one-compartment pharmacokinetic model. We then compute correlation coefficients between simulated environmental and biomarker concentrations. In order to facilitate numerous simulations, several simplifications are made regarding a representative environmental medium for home exposure, a distribution of environmental and food intake, and sources of exposure. First, we select chemical concentrations from indoor wipe samples (Cwipe) as a way to represent home-based exposures that result from all potential exposure routes, including inhalation, nondietary dust ingestion, and dermal uptake. The resulting home-based exposure (Ehome) can be assumed to be linearly related to Cwipe, and for simplicity Ehome and Cwipe are assumed to be equal. In addition, we assume that a contaminated food intake rate represents food exposures (Efood). Second, we select Cwipe and Efood from log-normal distributions of variability across both population and time.
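As a rough illustration of this pipeline, the following sketch generates log-normal daily intakes for one hypothetical person, runs them through a one-compartment pharmacokinetic model, and correlates the wipe concentrations with the resulting biomarker time series. The half-life, the log-normal variance parameters, and the 50/50 home/food split are assumptions chosen for demonstration, not the study's actual parameter values.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 365
half_life_days = 3.0                    # assumed short-half-life compound
k = np.log(2) / half_life_days          # first-order elimination rate

# Assumed log-normal day-to-day variability; Ehome is taken equal to Cwipe
# (the simplification above), and home and food each contribute half of intake.
c_wipe = rng.lognormal(mean=0.0, sigma=0.5, size=n_days)
e_food = rng.lognormal(mean=0.0, sigma=0.5, size=n_days)
f_home = 0.5
intake = f_home * c_wipe + (1 - f_home) * e_food

# One-compartment pharmacokinetic model: yesterday's burden decays by
# exp(-k), and today's intake is added.
burden = np.zeros(n_days)
for t in range(1, n_days):
    burden[t] = burden[t - 1] * np.exp(-k) + intake[t]

# Correlation between the environmental measurement and the biomarker,
# discarding a 30-day burn-in so the burden reaches quasi-steady state.
r = np.corrcoef(c_wipe[30:], burden[30:])[0, 1]
print(round(r, 2))
```

Even with home-based exposure supplying half of the total intake, the short half-life and the independent food pathway leave the single-medium correlation well below one, which is the pattern the simulations are designed to quantify.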
Lastly, we assume that the total intake accounting for biomonitoring data results from Ehome and Efood, excluding any other exposure pathways.

Because some indoor contaminants are considered potential threats to human health, many studies have applied significant resources to examining the relationship between exposure to indoor pollutants and adverse health effects. However, these studies are potentially limited by their use of a single or a few environmental and biological samples, since multi-day, multi-person sample analyses are costly and labor-intensive. The significant implications of this situation are reflected in our results.
In addition, the resulting R2 values from these studies are often uninterpreted, or poorly interpreted, in terms of the variability and contribution of exposure sources and the biological half-life of a compound. In this regard, the simulation study in this paper provides an important step towards interpreting the relative contribution of home-based exposure to human body burden for two compounds whose biological half-lives are significantly different. Although these two compounds do not cover the full range of chemical substances, bracketing half-lives allows us to quantify the significance of source, measurement, and exposure pattern variability for disaggregating body burden. In particular, it shows that exposure variability and the different contributions of exposure sources are more interconnected than commonly considered in many experimental studies. The work also brings to attention the need to understand the impact of a chemical’s half-life on the relationship between environmental exposures and biomonitoring data. The sensitivity of the resulting R2 values to the day-to-day variability of wipe concentrations and food exposures also points to the importance of understanding the variability and contribution of exposure sources. Finally, future work includes computing the relative number of samples needed, for various levels of confidence, to disaggregate body burden for various types of compounds, environments, and exposure pathways. Despite the lack of experimental data, the simulated results provide key insights on the role of the variability and contribution of exposure sources and biological half-lives in quantifying the relationship between indoor exposure and human body burden.
This approach will be useful for designing future exposure and epidemiologic studies that include indoor environmental samples and biomonitoring data.

Bovine respiratory disease (BRD) complex is one of the most common causes of death in dairy calves and poses a significant welfare and economic burden on the industry. Reported morbidity for calves in the 1991 NAHMS study evaluating the health of preweaning heifers on dairies was only 8.9% in calves up to 8 weeks of age. In 2010, respiratory disease in dairy heifers was reported as the cause of 22.5% of deaths before and 46.5% of deaths after weaning. In addition, 18.1% of preweaning heifers on dairy heifer-raising operations were reportedly affected by pneumonia, making it the second most common calf illness after diarrhea. Hence, over the last few decades, no improvement has been reported in morbidity from BRD in dairy calves. Given the lack of improvement in BRD incidence in US dairy cattle, despite the availability of numerous vaccines and antimicrobial drugs labeled for BRD, novel approaches that target prevention in addition to control should be evaluated. The complexity of etiologic agents and predisposing factors for BRD, combined with the difficulty of accurate diagnosis, poses challenges for the prevention and control of this disease on the farm that may be addressed using a risk-assessment approach in combination with a disease-scoring system. A multitude of tests can be used to identify calves with pneumonia in a herd; however, scoring systems require minimal training, are low cost, and can be reasonably accurate, making them a viable tool to estimate the disease burden in the herd. Repeated use of a risk-assessment tool to target preventive management practices, combined with a scoring system to benchmark the burden of BRD in a calf herd over time, may offer a low-cost, rapid, and comprehensive control program for BRD.
In contrast to a chronic disease, such as Johne’s disease, where changes implemented to control the disease may not result in a reduction in incidence for many years, BRD primarily presents as an acute disease.