Hemp Growing – Growing Indoor & Outdoor Cannabis (https://hempcannabisgrow.com)

The rewarding effects of cocaine are reduced in a very specific way by CB1 receptor antagonists
https://hempcannabisgrow.com/2023/11/23/the-rewarding-effects-of-cocaine-are-reduced-in-a-very-specific-way-by-cb1-receptor-antagonists/ Thu, 23 Nov 2023 07:03:00 +0000

In addition to anecdotal evidence that cannabis increases appetite, especially for sweet food, in recreational cannabis smokers, several preclinical studies have shown that CB1 receptor agonists facilitate food reward, in particular the hedonic response to sweet food that Berridge and Robinson call 'liking'. For example, THC increases food intake and the consumption of sweet solutions in rats. In addition, low doses of THC increase hedonic reactions to sucrose, decrease aversive reactions to bitter quinine solutions, and increase the palatability of sucrose in rats. Also, we recently found that the motivation to respond for food, as measured by break points in responding for food under a progressive-ratio schedule, is increased by administration of THC. Interestingly, enhancement of the motivation to respond for food by THC depends on actual food consumption, suggesting that both appetitive and consummatory aspects of food reward may involve the endocannabinoid system. Taken together, these findings provide a rationale for the clinical use of CB1 receptor agonists such as Marinol in anorexic cancer and HIV patients. There are contradictory reports on the effects of CBs on brain stimulation reward. Brain stimulation reward, or intracranial self-stimulation, is an operant procedure in which animals press a lever to receive a small electrical current in restricted areas of the brain.
Brain stimulation reward is arguably the most robust form of reinforcement and is believed to derive from the ability of electrical currents to activate, probably indirectly, the dopaminergic mesolimbic system. In support of this hypothesis, drugs that activate the dopaminergic system and increase dopamine levels in the nucleus accumbens also facilitate brain stimulation reward, whereas drugs that block dopamine receptors elevate thresholds for self-stimulation.
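The text does not specify which progressive-ratio progression was used to measure break points; purely as an illustration, the sketch below assumes the common exponential progression of Richardson and Roberts (round(5·e^(0.2·i)) − 5), and the function names and parameters are our own, not the authors':

```python
import math

def pr_requirements(n_trials, a=0.2, scale=5):
    """Response requirements for an exponential progressive-ratio schedule
    (Richardson & Roberts style): round(scale * e^(a*i)) - scale per trial."""
    return [max(1, round(scale * math.exp(a * i)) - scale)
            for i in range(1, n_trials + 1)]

def break_point(requirements, responses_emitted):
    """Break point: the last ratio the animal completed before quitting."""
    completed = 0
    remaining = responses_emitted
    for req in requirements:
        if remaining < req:
            break          # animal quit before completing this ratio
        remaining -= req
        completed = req
    return completed
```

Under this assumed progression, the first five requirements are 1, 2, 4, 6 and 9 responses; a rat that emits 13 responses in total therefore has a break point of 6.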

Concerning CB1 receptor agonists, Gardner and colleagues found that THC facilitates brain stimulation reward, whereas other investigators found no effects of the synthetic agonists CP55,940 or AMG-3, and some have found a reduction of brain stimulation reward with CB1 receptor agonists. These discrepancies, which are likely due to procedural differences, remain to be resolved. A caveat of all experiments with directly acting CB1 receptor agonists is that, for several reasons, these drugs do not provide a realistic picture of the physiological role of the endocannabinoids. First, anandamide is a partial agonist, whereas synthetic CB1 receptor agonists are often full agonists and have higher affinities for CB1 receptors. Second, anandamide and 2-AG have very short half-lives, whereas THC and synthetic CB1 receptor agonists have relatively long half-lives. Finally, systemic injections of these compounds activate all brain areas containing CB1 receptors, whereas physiological activation of endocannabinoid synthesis and release is likely to be region-, neuron- or even synapse-specific. The availability of mice genetically deprived of CB1 receptors in a tissue-specific manner may help address this possibility. Assessment of the roles the endocannabinoid system plays in brain reward processes was greatly facilitated by the discovery of selective CB1 receptor antagonists/inverse agonists such as rimonabant and AM251. CB1 receptor antagonists decrease the rewarding effects of a wide variety of abused drugs under certain conditions. For example, the rewarding effects of opioids are generally decreased in both intravenous self-administration and conditioned place preference procedures. There have also been reports that rimonabant and AM251 reduce the rewarding effects of methamphetamine, alcohol and nicotine.

CB1 receptor antagonists do not generally alter self-administration of cocaine under low fixed-ratio schedules or conditioned place preference procedures, but AM251 has been found to significantly reduce self-administration of cocaine under progressive-ratio schedules, and rimonabant prevents cocaine-induced and cue-induced relapse to cocaine-seeking behaviour. This suggests that the appetitive and conditioned effects of cocaine, but not its direct reinforcing effects, depend on CB1 receptor activation. The effects of rimonabant on opioid reward may be mediated primarily in the NAcc, as blockade of CB1 receptors in this area reduces heroin self-administration. On the other hand, the modulation of ethanol reward by the CB system appears to take place both in the NAcc and in the prefrontal cortex. The brain sites where CBs act to alter the rewarding effects of nicotine and psychostimulants are not known at present. Drugs of abuse share the ability to elevate extracellular levels of dopamine in the shell of the NAcc, as measured by in vivo microdialysis, and this effect is believed to play an important role in their reinforcing effects. CB1 receptor antagonists have been shown to block the elevations of accumbal dopamine levels induced by administration of nicotine or ethanol, but not by administration of heroin, morphine or cocaine. Transient surges of dopamine in the NAcc, as measured by cyclic voltammetry, are also produced by drugs of abuse and are believed to be involved in drug seeking. Interestingly, the transient increases in dopamine produced by administration of nicotine, ethanol and cocaine in the shell of the NAcc of freely moving rats are all blocked by CB1 receptor antagonists. Consistent with a role for endocannabinoids in the rewarding effects of food and in the regulation of appetite and food intake, blocking endocannabinoid tone with CB1 receptor antagonists reduces intake of food and sweet solutions.
Also, injection of rimonabant within 24 h of birth completely prevents milk intake and causes almost 100% mortality in mouse pups. The motivational effects of food, measured by a progressive-ratio schedule of food reinforcement, and the appetitive aspects of food reward are significantly reduced by rimonabant in rats, indicating that some aspects of food intake regulation involve reward and motivational processes.

In addition, AM251 decreases hedonic reactions to sucrose and increases aversive reactions to quinine. Consistent with these preclinical findings, rimonabant has been found to be effective in the clinical treatment of obesity, although the clinical efficacy of this agent appears to be due primarily to its ability to alter peripheral lipid metabolism rather than to reduce food intake. As in the case of CB1 receptor agonists, the effects of CB1 receptor antagonists on brain stimulation reward are somewhat controversial. Rimonabant has been shown to increase the threshold for brain stimulation reward in some studies and to produce no change in brain stimulation reward thresholds in others. Mice genetically engineered to lack CB1 receptors do not show dramatic changes in body weight, food consumption or fertility, suggesting that CB1 receptors modulate rather than mediate basic reward functions, or that other systems can compensate for their absence. Studies using CB1-null mice have confirmed the role of CB1 receptors in the rewarding effects of drugs of abuse. For example, in these mutant mice morphine is not self-administered, does not induce conditioned place preferences and does not elevate dopamine levels in the NAcc. Also, the rewarding effects of alcohol are reduced in CB1-null mice, as demonstrated by data showing that CB1-null mice do not develop conditioned place preference with this drug and do not prefer it over water in a two-bottle free-choice paradigm. However, another study reported that CB1-null mice show only slight and short-lasting decreases in preference for ethanol. Development of conditioned place preferences with cocaine is unaltered in CB1-null mice. On the other hand, development of cocaine self-administration behaviour under fixed-ratio schedules of reinforcement in CB1-null mice was reported to be unaltered when mice were restrained, but reduced in freely moving mice.
In addition, cocaine self-administration was significantly reduced under a progressive-ratio schedule of self-administration in CB1-null mice. These contrasting results highlight the fact that, when working with genetically modified mice such as CB1-null mice, not only methodological differences but also differences in genetic background may result in very different and sometimes contrasting behavioural outputs. It is interesting to note that CB1-null mice show normal elevations in dopamine levels in the NAcc following administration of cocaine, but no elevations following administration of morphine or ethanol, compared with wild-type controls. Finally, CB1-null mice do not develop conditioned place preferences to nicotine, but they self-administer the drug like wild-type controls. It remains to be seen whether increasing the effort needed to obtain nicotine, as with cocaine, could reveal a role for CB1 receptors in some aspects of nicotine reinforcement, as suggested by the results with CB1 receptor antagonists. CB1-null mice have also provided evidence for the involvement of the endocannabinoid system in food reward. For example, CB1-null mice eat less than their wild-type litter mates after food restriction. Moreover, CB1-null mice respond less for sucrose in a two-lever paradigm, have lower break points under progressive-ratio schedules of sucrose delivery and show less preference for sucrose over water in a two-bottle free-choice procedure.

Genetic ablation of CB1 receptors results in a small reduction in body weight, reduced adiposity and resistance to diet-induced obesity. However, as in the case of CB1 receptor antagonists, these effects appear to be related more to increased metabolic energy consumption than to differences in the rewarding effects of food or to hypophagia. Interestingly, in the study by Fride et al., administration of rimonabant to mouse pups lacking CB1 receptors still induced a decrease in milk intake and survival rate, suggesting that some of the effects of CBs on food intake may be mediated by still-uncharacterized CB receptors. To date, no study has investigated the effects of CB1 receptor deletion on brain stimulation reward. On the other hand, it should be noted that CB1-null mice show increased anhedonia after chronic mild stress, a measure of reduced activity of the reward system and a model of depression, further supporting a role for the endocannabinoid system in brain reward functions. Although measurements of the effects of systemic or intracranial injections of anandamide provide useful information on the functions of the endocannabinoid system, to support a role for neurally released anandamide in brain reward processes it is also important to measure the anandamide released in the brain by different abused drugs, by food and by electrical brain stimulation. Release of neurotransmitters such as dopamine or glutamate can be readily measured by microdialysis techniques, but only a few studies have employed microdialysis to measure extracellular brain levels of endocannabinoids. Thus, most information on changes in endocannabinoid levels comes from measurements of tissue levels in different brain areas. Measuring tissue levels of anandamide has the limitation that only a single time point can be sampled per measurement, limiting information on the pattern of endocannabinoid release.
Using tissue levels, it has been demonstrated that chronic administration of several drugs of abuse leads to region-specific changes in anandamide levels. For example, when administered chronically, THC decreases levels of anandamide in the striatum, ethanol decreases levels of anandamide in the midbrain but not in the striatum, nicotine decreases levels of anandamide in the striatum but not in the midbrain, and cocaine and morphine do not alter anandamide levels in either the striatum or the midbrain. However, it is difficult to determine from these studies whether the measured levels of endocannabinoids reflected the consequences of the chronically administered drug or of withdrawal. Vigano et al. compared the effects of chronic versus acute administration of morphine on endocannabinoid levels in the brain. They found that acute administration of morphine increased anandamide levels in the striatum, whereas chronic treatment with the drug failed to do so. In addition, they found that chronic treatment with morphine did not alter the ability of a challenge dose of morphine to increase anandamide levels in the striatum; that is, repeated administration of morphine did not induce sensitization or tolerance to this effect. Thus, chronic administration of drug followed by withdrawal, chronic administration of drug followed by an acute drug challenge, and acute administration of drug can lead to very different changes in brain anandamide levels. Such profiles of release may indicate that anandamide is released in response to relevant changes in homoeostasis but not when an adaptive response has already occurred.

Other factors such as gender and family history of AUDs may moderate this relationship
https://hempcannabisgrow.com/2023/11/21/other-factors-such-as-gender-and-family-history-of-auds-may-moderate-this-relationship/ Tue, 21 Nov 2023 05:47:10 +0000

Cognitive theories attempting to explain adolescent risk-taking as the result of underdeveloped decision-making skills have found little, if any, support, as studies demonstrate that adolescents show an adequate understanding of the steps involved in the decision-making process, such as weighing pros and cons. In fact, children as young as 4 years old have some understanding of consequence probabilities, and adolescents and adults show equal levels of awareness of the consequences associated with risky behaviors. In some cases, adolescents may even overestimate their personal vulnerability to risk consequences compared to adults. Further, interventions designed to provide adolescents with information about the risks of substance use, drinking and driving, and unprotected sex have proved largely unsuccessful and have done little to change adolescents' actual behavior. Simplified theories of "immature cognitive abilities" in adolescence are also inconsistent with a developmental perspective, as the increase in cognitive sophistication from childhood to adolescence would imply a decrease in risk-taking behaviors with age, rather than an increase. Steinberg proposes an alternative view of adolescent risk-taking behavior that is rooted in developmental neuroscience. Specifically, heightened risk-taking in adolescence is described as the product of a "competition" between a socioemotional network that is sensitive to social and emotional stimuli, and a cognitive control network that is responsible for regulating executive functions such as planning, organization, response inhibition, and self-regulation.
The socioemotional network relies on limbic and paralimbic structures such as the amygdala, ventral striatum, orbitofrontal cortex, ventromedial prefrontal cortex, and superior temporal sulcus, while the cognitive control network consists of lateral prefrontal and parietal cortices as well as the anterior cingulate. During adolescence, the brain undergoes significant structural, functional, neurochemical, and hormonal changes that directly impact the development of the socioemotional and cognitive control networks, among other regions.

Specifically, synaptic pruning and myelination processes result in reduced gray matter volume and increased white matter volume by late adolescence/early adulthood. Increases in white matter during adolescence are associated with greater structural connectivity and faster, more efficient neural communication between brain regions. Neuroimaging studies note that dramatic changes occur in the brain's dopaminergic system at puberty, primarily in prefrontal and striatal regions. Specifically, dopamine activity shows substantial decreases in the nucleus accumbens, an important region of the ventral striatum well known for its role in reward processing. Dopamine has been implicated as a primary mechanism of affective and motivational regulation and is linked to the socioemotional network; thus, the sudden decrease in this neurochemical creates a "dopamine void" which may compel adolescents to seek out novel and risky behaviors to compensate. Changes in brain regions associated with the cognitive control network also take place in adolescence, including gray matter decreases and white matter increases in the prefrontal cortex, and an overall increase in synaptic connections among cortical and subcortical regions of the brain. In contrast to the acute changes that occur in socioemotional regions with puberty, changes in the cognitive control network are gradual and typically continue into the mid-twenties. As a result of these timing differences in developmental brain changes, there appears to be a "timing gap" between the maturation of the socioemotional network and the maturation of the cognitive control network. Greater motivational drives for novel and rewarding experiences combined with an immature cognitive control network may predispose adolescents to risky behavior, including substance use.
This becomes especially relevant under conditions of high emotional arousal, where the socioemotional network is likely to become highly activated and the cognitive control network must "work harder" to override it. While neurochemical modifications and other developmental brain changes may contribute to adolescents' increased propensity for risk-taking, this propensity is paradoxical, as the brain may be especially vulnerable to the insult of alcohol during this critical time. A handful of studies have shown a deleterious effect of heavy alcohol use on adolescents' neuropsychological performance in varied domains, including visuospatial abilities, verbal and non-verbal retention, attention and information processing, and language and academic achievement.

Female adolescent alcohol users have also shown deficits on tasks of executive functioning, specifically those involving planning, abstract reasoning, and problem-solving. Post-drinking effects, such as hangover severity and withdrawal symptoms, have been demonstrated to be important predictors of alcohol-related neurocognitive impairment, as greater self-reported withdrawal symptoms have been linked with poorer visuospatial functioning and poorer verbal and non-verbal retention. Longitudinal studies have examined whether the observed neurocognitive deficits in this population represent premorbid risk factors for use or consequences of heavy alcohol use. In one study, after controlling for recent alcohol use, age, education, practice effects, and baseline neuropsychological functioning, substance use over an 8-year follow-up period significantly predicted neuropsychological functioning at Year 8. Specifically, adolescents who reported continued heavy drinking and greater alcohol hangover or withdrawal symptoms showed impairment on tasks of attention and visuospatial functioning compared to non-using adolescents. These findings were replicated in a prospective study that characterized at-risk adolescents prior to their initiating alcohol use. For females, initiation of alcohol use over the follow-up period was associated with worsening visuospatial functioning, while greater hangover symptoms over the follow-up period predicted poorer sustained attention in males. Taken together, these studies suggest that heavy drinking during adolescence is associated with deficits in cognitive performance, which likely result from, rather than predate, alcohol use. Specifically, evidence suggests that females may be more vulnerable to the negative impact of heavy alcohol use in adolescence, and a positive family history of AUDs has been associated with worse neurocognitive performance in adolescent heavy alcohol users, particularly in language and attention domains.
Structural magnetic resonance imaging studies provide evidence for anatomical brain abnormalities in adolescents with histories of heavy lifetime alcohol use, compared to their non-using peers.

The hippocampus appears to be one area of potential vulnerability, as decreased bilateral hippocampal volumes have been observed in adolescents meeting criteria for AUDs, with smaller hippocampi related to earlier onset and longer duration of the disorder. Nagel, Schweinsburg, Phan, and Tapert found similar results, with smaller left hippocampal volumes observed in heavy alcohol-using adolescents compared to controls, even after excluding teens with co-occurring Axis I disorders. Hippocampal volume did not correlate with degree of alcohol use in this study, suggesting that between-group differences may be reflective of premorbid factors and not solely the result of heavy alcohol use. Another area of the brain that may be especially vulnerable to the effects of heavy alcohol use in adolescence is the prefrontal cortex. As a key component of both the cognitive control and socioemotional networks, this region is important to the study of risk-taking. In a sample of adolescents with co-occurring psychiatric and AUDs, DeBellis and colleagues found significantly smaller prefrontal cortex volumes in alcohol users compared to controls. These findings were replicated by Medina and colleagues in a sample of alcohol-dependent adolescents without psychiatric disorders; however, a significant group-by-gender interaction was observed. Specifically, alcohol-dependent females showed smaller prefrontal cortex and white matter volumes than female controls, and alcohol-dependent males showed larger prefrontal and white matter volumes than male controls. In a cortical thickness study of adolescent binge drinkers, Squeglia, Sorg, and colleagues found alcohol use by gender interactions in four left frontal brain regions, where female binge drinkers had thicker cortices than female controls and male binge drinkers had thinner cortices than male controls.
Thicker frontal cortices corresponded with poorer visuospatial, inhibition, and attention abilities for females and worse attention abilities for males, providing further evidence that females may be especially vulnerable to brain changes brought on by heavy alcohol use in adolescence. Diffusion tensor imaging studies have yielded corroborating evidence of altered brain development in adolescent heavy alcohol users. In one study, adolescents with histories of binge drinking showed decreased white matter integrity in 18 major fiber tract pathways, specifically in frontal, cerebellar, temporal, and parietal regions. Another study found reduced white matter integrity in the corpus callosum of youth with AUDs, particularly in its posterior aspect. In addition, reduced white matter integrity in this region was related to longer durations of heavy drinking, larger quantities of recent alcohol consumption, and greater alcohol withdrawal symptoms.

There is evidence that poorer white matter integrity may be both a consequence of adolescent alcohol use and a predisposing risk factor for use. Specifically, in a study of 11- to 15-year-old alcohol-naïve youth, Herting, Schwartz, Mitchell, and Nagel found that youth with a positive family history of AUDs had poorer white matter integrity in several brain regions, along with slower reaction times on a task of delay discounting, when compared to youth without a family history of AUDs. In addition, Jacobus, Thayer, Trim, Bava, and Tapert found that poorer white matter integrity measured in 16- to 19-year-old adolescents was related to more self-reported substance use and delinquency/aggression at an 18-month follow-up. In fMRI studies, altered neural processing has been observed in heavy-drinking adolescents during cognitive tasks of spatial working memory (SWM), verbal encoding, and visual working memory (VWM). Tapert and colleagues found that adolescents with a history of heavy drinking over the past 1-2 years showed increased blood oxygen level-dependent (BOLD) response in bilateral parietal regions during a SWM task, but decreased BOLD activation in occipital and cerebellar regions compared to lighter drinkers. In addition, BOLD activation abnormalities were associated with more withdrawal and hangover symptoms and greater lifetime alcohol consumption. Similarly, in a study of verbal encoding, Schweinsburg, McQueeny, Nagel, Eyler, and Tapert showed that adolescent binge drinkers had more BOLD response in right superior frontal and bilateral posterior parietal regions but less BOLD response in the occipital cortex, compared to non-drinkers. Control adolescents also showed significant activation in the left hippocampus during novel encoding, whereas binge drinkers did not. A 2011 follow-up to this investigation found increased dorsal frontal and parietal BOLD response among 16- to 18-year-old binge drinkers, and decreased inferior frontal response during verbal encoding.
Squeglia, Pulido, and colleagues found comparable results during a VWM task, in that heavy-drinking adolescents showed more BOLD response compared to matched controls in right inferior parietal, right middle and superior frontal, and left medial frontal regions, but less BOLD response in left middle occipital regions. Notably, this investigation included a longitudinal component with a separate sample of adolescents in which the brain areas showing group differences in BOLD response to the VWM task were identified as regions of interest (ROIs). Adolescents were scanned at baseline before they ever used alcohol or drugs and then scanned again at a 3-year follow-up time point. Adolescents from this sample who transitioned into heavy drinking during the follow-up period showed less BOLD response to the VWM task compared to continuous non-drinkers in frontal and parietal regions at baseline; in addition, BOLD response in these regions increased significantly over the follow-up period for the heavy drinkers, while controls' BOLD response did not change significantly over time. Finally, less BOLD activation at baseline predicted subsequent substance use, above and beyond age, family history of AUDs, and baseline externalizing behaviors. Taken together, results from these studies suggest that the adolescent brain is indeed sensitive to the insult of excessive alcohol use, and structural alterations and neural reorganization may result from continued heavy drinking. In turn, this altered brain development may trigger cognitive, emotional, and behavioral changes, leading to further alcohol use and other risk-taking behaviors. As the majority of fMRI studies of adolescent alcohol users to date are cross-sectional in nature, it is difficult to determine whether the observed neural abnormalities predate the onset of alcohol use or are consequences of it. However, results of the Squeglia, Pulido et al. study suggest that a combination of both explanations may be most accurate.
Specifically, neural functioning differences may be evident prior to the initiation of drinking, but early alcohol use may also change the trajectory of normative brain development observed in adolescence, leading to less efficient neural processing over time.
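The BOLD group differences discussed above are commonly summarized as percent signal change within a region of interest. The sketch below is an illustration only, not the authors' analysis pipeline, and the simple averaging over time points is our assumption:

```python
import statistics

def percent_signal_change(task_signal, baseline_signal):
    """Percent BOLD signal change for one ROI:
    100 * (mean task signal - mean baseline signal) / mean baseline signal."""
    base = statistics.mean(baseline_signal)
    return 100.0 * (statistics.mean(task_signal) - base) / base
```

For example, a mean task signal of 102.5 arbitrary units against a baseline of 100 units corresponds to a 2.5% signal change; group comparisons such as those reported above are then statistics on these per-subject, per-ROI values.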

]]>
Soil moisture monitoring practices help ensure precise frequency and duration of irrigations
https://hempcannabisgrow.com/2023/10/13/soil-moisture-monitoring-practices-help-ensure-precise-frequency-and-duration-of-irrigations/ Fri, 13 Oct 2023 06:12:36 +0000

The significant vgsc mutations observed could result from selection-pressure build-up due to greater contact with insecticides in indoor-based interventions. In Kisian, the G119S mutation was present at low frequencies, although it was higher in the progeny of mosquitoes resting indoors than in those resting outdoors. This pattern was more pronounced in Kisian, where the vgsc mutations occurred at lower frequencies than in Kimaeti. These findings suggest that these mutations could arise from different pressures that may be present in the lowland and absent in the highland. The metabolic enzymes associated with insecticide resistance were elevated, more so in indoor-resting malaria mosquitoes than in their outdoor-resting counterparts, at both sites. In the phenotypic assays, pre-exposure to the synergist PBO restored the susceptibility of the malaria vectors to the pyrethroids commonly used in public health LLINs. Phenotypic exposures with prior PBO contact demonstrated a greater role of monooxygenases in mediating metabolic resistance. The involvement of monooxygenases in pyrethroid resistance has been reported in Western Kenya. In Kimaeti, there were increased levels of β-esterases, higher indoors than outdoors. Kisian, on the other hand, showed no involvement of β-esterases in resistance, as indicated by similar levels in indoor- and outdoor-resting mosquitoes. Glutathione-S-transferases possibly played a part in the resistance levels, as a previous study reported, since their levels were higher in mosquitoes resting indoors than in those resting outdoors in both Kisian and Kimaeti.
These levels therefore suggest that monooxygenases were the main mechanism of insecticide resistance in Kisian, especially given the low frequency of resistant alleles, whereas in Kimaeti the case appeared to be a combination of genotypic and metabolic mechanisms. The expression of phenotypic, genotypic and metabolic resistance appears to be higher in indoor- than outdoor-resting malaria mosquitoes in these regions.

The widespread use of LLINs in attempts to control these vectors, together with extensive agrochemical use, could be driving the increase in insecticide resistance at the sites. The higher levels indoors suggest that these mosquitoes may rest indoors because they are sufficiently resistant to the insecticides used in LLINs, posing a threat to the wide coverage of LLINs. Outdoors, the resistance mechanisms were present as well, pointing to exposure to these insecticide-based interventions at just enough pressure to elicit expression of the resistance traits. The levels of resistance could be enough to drive an increase in malaria incidence, as the reduced mortality of resistant malaria vectors could undermine current vector control interventions.

Increasing temperatures and higher variability in precipitation in California are part of a larger regional trend in the Western United States. This is consistent with global trends indicating that 2000-2010 was warmer at the Earth's surface than any preceding decade since 1850. Observed increases in temperature and precipitation extremes in semi-arid regions such as Southern California clearly translate into more severe future impacts than analogous trends in temperate regions, such as projections of increased frequency and duration of heat waves and droughts over the remainder of the current century. Previous studies suggest that agriculture in the largely irrigated Western United States may not be as susceptible to precipitation trends as agriculture in the more temperate East. This holds for long-run mean precipitation conditions. However, this conclusion minimizes the severity of the recent drought experienced in California, with historically low precipitation and soil moisture levels. The recurrence and longer duration of droughts in California over the past two decades have greatly affected the agricultural industry, which, on average, uses about 80% of the state's freshwater resources.
Figure 1.1 illustrates the percentage of California's area in drought from 2000 to 2016. Not only does this reveal the large spatial and temporal extent of the most recent drought, but the colors reveal the large area under extreme and exceptional drought from mid-2013 onward. The most immediate economic impacts are lost agricultural revenue emanating from fallowed acres and yield declines, and farm job losses for one of the most vulnerable socioeconomic groups. For example, the 2009 drought resulted in revenue losses of $370 million, the fallowing of 285 thousand acres in the San Joaquin Valley, and almost 10 thousand farm job losses.

Arguably the most important variables explaining how agriculture will be affected by climatic change are those capturing human ingenuity at the farm level. Human ingenuity is, in essence, adaptation to climate change in order to minimize welfare losses. Thus, the overarching theme of our three subsequent analyses is quantifying grower responsiveness to farm-level microclimate in Southern California, our study area. Using original survey data, we study the differential impacts of short-run weather and long-run climate, based on farm size, type, and water source, on productivity per acre and on the likelihood of adopting water management practices, which have not been studied in previous county-level analyses. Further, we are able to decompose water sources into price, pricing structure, frequency of rate increases, senior water rights, quality, and type of source. In addition to studying farm-level productivity, we study the effects of short-run fluctuations in weather on the likelihood of adoption of water management technologies and practices, and on parcel-level land sales. Our contribution to the literature is based upon an original survey instrument we developed and disseminated to growers in the region. The contact information was taken from the respective county Agricultural Commissioner Offices. The survey is comprised of 28 multiple-choice and fill-in questions on grower, farm, and water source characteristics. It was disseminated via mail, by a team of three undergraduate students, to growers in the study region, with a 14.6% response rate. We focus on Southern California agriculture, specifically Imperial, Riverside, San Diego, and Ventura counties. The region is often overlooked, as analyses tend to focus on the Central Valley, California's most productive agricultural region. Yet there are several crops for which 50% or more of California's production originates in these four counties, including raspberries, lemons, flowers and foliage, avocado, and sudan hay.
All of the state's date and sugar beet production originates in these four counties. Imperial, Riverside, San Diego, and Ventura counties are amongst the top 15 agricultural counties in the state, representing approximately 16% of statewide agricultural revenue. They also represent the diverse climate of the region, with two coastal and two desert counties.

The four counties also vary in farm size, with San Diego County having the largest share of farms under 10 acres, and, at the other extreme, Imperial County having the largest share of farms with 1,000 or more acres. There is also a wide distribution in gross revenue across these counties. An immediate concern with aggregation at the county level is the omission of data on decision-maker/grower, farm, and detailed water source attributes. Excluding such information assumes a priori a limited role of the economic agent in influencing farmland productivity. It also simplifies the inherent complexity in representing farm and water source characteristics. It is not for lack of explanatory power that these variables are excluded; more likely, they would have been studied had they been available in existing data sources. The USDA Farm and Ranch Irrigation Survey, a major source of US agricultural data for economic analyses, does not provide these variables at the farm level to researchers. There is, however, little reason to assume that the climate, soil, and water variables in county-level studies are correlated with any of these microlevel variables, thus ruling out potential bias in the climate, soil, and water estimators. Aggregation at the county level also leaves the model susceptible to measurement error on certain explanatory variables. Measurement error is defined as an imprecise measure of an economic variable, dependent or explanatory, that has a well-defined quantitative meaning. Under the classical errors-in-variables assumption, this can lead to estimators that are asymptotically inconsistent and biased downward in their respective probability limits. The remaining sections in this chapter present the theoretical framework behind each of the three empirical analyses in this dissertation: the Farm-Level Ricardian, the Discrete Choice of Adoption, and the Parcel-Level Models.
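The downward (attenuation) bias implied by classical errors-in-variables can be illustrated with a small simulation; all numbers here are illustrative assumptions, not estimates from our data. The OLS slope on a mismeasured regressor converges to the true coefficient scaled by the reliability ratio.

```python
import numpy as np

# Illustrative simulation of classical errors-in-variables attenuation:
# the true regressor x_star drives y, but we observe x = x_star + u,
# where u is measurement error independent of x_star and the residual.
rng = np.random.default_rng(0)
n = 200_000
beta = 2.0                            # assumed true coefficient
sigma_x, sigma_u = 1.0, 1.0           # equal variances -> reliability 0.5

x_star = rng.normal(0.0, sigma_x, n)
u = rng.normal(0.0, sigma_u, n)
y = beta * x_star + rng.normal(0.0, 1.0, n)
x_obs = x_star + u                    # what the econometrician sees

# OLS slope using the mismeasured regressor
b_hat = np.cov(x_obs, y)[0, 1] / np.var(x_obs)

# Theoretical probability limit: beta times the reliability ratio
lam = sigma_x**2 / (sigma_x**2 + sigma_u**2)
print(b_hat, beta * lam)              # b_hat is close to 1.0, half of beta
```

With equal signal and noise variances the reliability ratio is 0.5, so the estimated slope sits near half the true coefficient, matching the attenuation result cited in the text.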
Each subsection also includes hypotheses on the impact of climate and other key variables on the respective dependent variables. In addition to studying the impact of climate and other relevant variables on farmland productivity, we study the factors influencing the adoption of technologies to monitor soil moisture and salinity. Adoption of climate-effective monitoring practices is particularly important as projections of prolonged drought continue throughout the current century. Most growers in our sample have already adopted micro-irrigation practices for vegetables, orchards, and vineyards, and extension experts suggest that consistent and/or sophisticated monitoring of growing conditions represents the next stage of irrigation efficiency adaptations.

Salinity monitoring affects water availability in both the short and long run. Too much leaching leads to water waste and, ultimately, poor irrigation and economic efficiency. Too little leaching affects soil salinity and water quality at both the farm and basin level, and ultimately water availability at the farm level in the long run. We implement logistic regression, consistent with previous studies on technology adoption, to study the factors influencing adoption of at least one soil moisture monitoring practice, or at least one water salinity monitoring practice.

Prior to implementing the pilot survey, we received approval from the UCR Institutional Review Board. There were two primary objectives for the pilot survey: to field-test survey questions and to gauge the response rate. Rather than rely on focus groups to field-test the survey questions, we chose to disseminate a pilot survey. The major benefit of sending a pilot survey is that we could potentially receive valuable input from respondents who could not participate in focus groups due to financial, time, or physical constraints. A second benefit was time savings in survey implementation. Focus groups require managing multiple schedules to find a convenient meeting time and place, and possibly funding travel and accommodation. Although we planned to disseminate an online survey, we had not yet at that stage secured assistance from either Agricultural Extension or the Farm Bureaus in each county to host our survey. In order to save time, we sent the pilot survey via postal mail using contact information from the Agricultural Commissioner Pesticide Permit Database. An informal team of fellow graduate students and family/friends helped prepare the pilot-phase mailings. Each mailing package included an invitation letter, consent documents, the first version of the questionnaire, and a self-addressed return envelope.
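The logistic adoption model described above can be sketched with simulated data; the covariates, coefficient values, and sample size here are hypothetical placeholders, not the survey's actual variables. The fit uses Newton-Raphson, the standard maximum-likelihood routine for the logit.

```python
import numpy as np

# Hypothetical sketch of a discrete-choice adoption model:
# Pr(adopt monitoring) as a logistic function of (log) farm acreage
# and a water price index. Coefficients below are assumed, not estimated
# from the dissertation's survey data.
rng = np.random.default_rng(1)
n = 5_000
acres = rng.lognormal(3.0, 1.0, n)        # right-skewed farm sizes
price = rng.normal(1.0, 0.3, n)           # stylized water price index
X = np.column_stack([np.ones(n), np.log(acres), price])

true_beta = np.array([-1.0, 0.5, 1.0])    # assumed coefficients
p = 1.0 / (1.0 + np.exp(-X @ true_beta))
adopt = rng.binomial(1, p)                # observed 0/1 adoption outcome

# Fit the logit by Newton-Raphson (equivalently, Fisher scoring)
beta = np.zeros(3)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))  # fitted probabilities
    grad = X.T @ (adopt - mu)             # score vector
    hess = (X * (mu * (1.0 - mu))[:, None]).T @ X  # information matrix
    beta += np.linalg.solve(hess, grad)

print(beta)                               # estimates land near true_beta
```

In practice one would use a packaged estimator (e.g. a statsmodels `Logit`) rather than hand-rolled Newton steps; the sketch only makes the mechanics of the adoption model concrete.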
Using a random-number function in Microsoft Excel, we randomly selected 300 respondents in total from Riverside and San Diego counties. We selected these counties as they are representative of the type of agriculture found in the region. Based on our discussions with extension experts, we were sensitive to the potential apprehension with which Imperial County growers, in particular, would react to our survey. Growers in Imperial County have held senior water rights for over a century due to the Seven-Party Agreement. They are aware that they have been criticized for using less efficient irrigation practices, and many fear that they will be mandated to change these practices. Thus, they may be hesitant to provide any information on irrigation and other practices. In order to minimize Imperial growers' time burden, we chose to field-test the survey on a potentially more receptive audience and send only the final survey to Imperial. Since Ventura County has a relatively similar distribution of farm types to San Diego County, we also decided to exclude Ventura from the pilot. The pilot survey consists of 20 questions, covering grower characteristics, farm characteristics, water source characteristics, water management practices, and perceptions of water scarcity, with an open-ended comment space at the end of the survey. The majority of these questions are multiple choice, often with an "other" choice that included an option to write in a response that was not pre-determined. Eight questions are fill-in style. We received a roughly 10% response rate from the pilot phase and learned valuable lessons on question structure for preparing the final survey. First, there were far too many questions on water scarcity perceptions, which could be consolidated into fewer questions. Second, income questions were better placed at the end of the survey to minimize participant suspicion.
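The Excel draw described above amounts to sampling without replacement from the combined contact list. A minimal sketch of the same step follows; the list contents and sizes are placeholders, not the actual permit database.

```python
import random

# Sketch of the pilot sampling step: draw 300 contacts at random, without
# replacement, from the pooled Riverside and San Diego permit lists.
# List sizes and identifiers are illustrative placeholders.
riverside = [f"RIV-{i:04d}" for i in range(1, 1201)]
san_diego = [f"SD-{i:04d}" for i in range(1, 1801)]

random.seed(42)                        # fixed seed for a reproducible draw
pilot = random.sample(riverside + san_diego, k=300)

print(len(pilot))                      # 300 unique contacts
```

`random.sample` guarantees no contact is drawn twice, mirroring the without-replacement selection performed in the spreadsheet.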
