The purpose of this study was to confirm reduced training metrics previously associated with a ketogenic low-carbohydrate (CHO) high-fat diet (LCHF) and investigate their attenuation with caffeine supplementation. At baseline, n 21 elite race-walkers followed a high CHO availability (HCHO) diet and performed a tempo hill session (14 km with a 450 m elevation gain). Athletes were then assigned to either the HCHO or LCHF diet in a parallel-groups design for 3 weeks, where the 14 km tempo hill session was repeated each week. On weeks 2 and 3, in a randomised crossover allocation, all participants received 3 mg/kg caffeine or placebo (gum), 20 min before the session. Race-walking speed, heart rate, ratings of perceived exertion, blood metabolites and Stroop word-colour test metrics were collected. Although LCHF athletes walked faster at baseline compared with HCHO (P = 0·049), the HCHO group improved by week 2 (P = 0·009) and week 3 (P = 0·007), whereas the LCHF group was significantly slower in week 1 (P < 0·001) and week 2 (P = 0·026) compared with baseline. During the 14 km hill session, within-group analysis showed that athletes walked significantly faster (P = 0·010) and at a higher percentage of vVO2max (P = 0·007) when using caffeine compared with a placebo. Between-group differences remained present, with HCHO athletes walking at a higher percentage of vVO2max than those adhering to the LCHF diet (P = 0·035). No interaction between supplement treatment and dietary group occurred (P = 0·640). Caffeine supplementation partially reversed the performance impairment associated with an LCHF diet, but training quality remained lower than with the combination of caffeine and high CHO availability.
During mass-casualty incidents (MCIs), prehospital triage is performed to identify which patients most urgently need medical care. Formal MCI triage tools exist, but their performance is variable. The Shock Index (SI; heart rate [HR] divided by systolic blood pressure [SBP]) has previously been shown to be an efficient screening tool for identifying critically ill patients in a variety of in-hospital contexts. The primary objective of this study was to assess the ability of the SI to identify trauma patients requiring urgent life-saving interventions in the prehospital setting.
Methods:
Clinical data captured in the Alberta Trauma Registry (ATR) were used to determine the SI and the “true” triage category of each patient using previously published reference standard definitions. The ATR is a provincial trauma registry that captures clinical records of eligible patients in Alberta, Canada. The primary outcome was the sensitivity of SI to identify patients classified as “Priority 1 (Immediate),” meaning they received urgent life-saving interventions as defined by published consensus-based criteria. Specificity, positive predictive value (PPV) and negative predictive value (NPV) were calculated as secondary outcomes. These outcomes were compared to the performance of existing formal MCI triage tools referencing performance characteristics reported in a previously published study.
Results:
Of the 9,448 records that were extracted from the ATR, a total of 8,650 were included in the analysis. The SI threshold maximizing Youden’s index was 0.72. At this threshold, SI had a sensitivity of 0.53 for identifying “Priority 1” patients. At a threshold of 1.00, SI had a sensitivity of 0.19.
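The Shock Index screening rule evaluated here reduces to a one-line calculation plus a threshold comparison. A minimal sketch follows; the vital-sign values and the Youden's-index helper are illustrative, not data or code from the Alberta Trauma Registry analysis:

```python
def shock_index(heart_rate, systolic_bp):
    """SI = heart rate (beats/min) divided by systolic blood pressure (mmHg)."""
    return heart_rate / systolic_bp

def youden_j(sensitivity, specificity):
    """Youden's J statistic; the SI threshold maximizing J (0.72 above)
    balances sensitivity and specificity."""
    return sensitivity + specificity - 1

# Example: a hypothetical patient with HR 110 and SBP 95 exceeds the
# 0.72 threshold and would be flagged for urgent assessment.
si = shock_index(110, 95)
flagged = si > 0.72
```

At the reported operating point (sensitivity 0.53), raising the threshold to 1.00 trades away sensitivity (0.19), which is why the study evaluates both cut points.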
Conclusions:
The SI has a relatively low sensitivity and did not out-perform existing MCI triage tools at identifying trauma patients who met the definition of “Priority 1” patients.
Vital signs are an essential component of the prehospital assessment of patients encountered in an emergency response system and during mass-casualty disaster events. Limited data exist to define meaningful vital sign ranges to predict need for advanced care.
Study Objectives:
The aim of this study was to identify vital sign ranges that were maximally predictive of requiring a life-saving intervention (LSI) among adults cared for by Emergency Medical Services (EMS).
Methods:
A retrospective study of adult prehospital encounters that resulted in hospital transport by an Advanced Life Support (ALS) provider in the 2022 National EMS Information System (NEMSIS) dataset was performed. The outcome was performance of an LSI, a composite measure incorporating critical airway, medication, and procedural interventions, categorized into eleven groups: tachydysrhythmia, cardiac arrest, airway, seizure/sedation, toxicologic, bradycardia, airway foreign body removal, vasoactive medication, hemorrhage control, needle decompression, and hypoglycemia. Cut point selection was performed in a training partition (75%) to identify ranges for heart rate (HR), respiratory rate (RR), systolic blood pressure (SBP), oxygen saturation, and Glasgow Coma Scale (GCS) by using an approach intended to prioritize specificity, keeping sensitivity constrained to at least 25%.
Results:
Of 18,259,766 included encounters (median age 63 years; 51.8% male), 6.3% had at least one LSI, with the most common being airway interventions (2.2%). Optimal ranges for vital signs included 47-129 beats/minute for HR, 8-30 breaths/minute for RR, 96-180 mmHg for SBP, >93% for oxygen saturation, and >13 for GCS. In the test partition, an abnormal vital sign had a sensitivity of 75.1%, specificity of 66.6%, and positive predictive value (PPV) of 12.5%. A multivariable model encompassing all vital signs demonstrated an area under the receiver operating characteristic curve (AUROC) of 0.78 (95% confidence interval [CI], 0.78-0.78). Vital signs were of greater accuracy (AUROC) in identifying encounters needing airway management (0.85), needle decompression (0.84), and tachydysrhythmia (0.84), and accuracy was lower for hemorrhage control (0.52), hypoglycemia management (0.68), and foreign body removal (0.69).
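The derived cut points amount to a simple per-vital range check, with an encounter flagged if any sign falls outside its range. The sketch below assumes this "any abnormal" composite rule; the function and field names are illustrative, not the NEMSIS schema or the authors' code:

```python
# Derived normal ranges reported in the study (training partition).
NORMAL_RANGES = {
    "hr": (47, 129),    # heart rate, beats/minute
    "rr": (8, 30),      # respiratory rate, breaths/minute
    "sbp": (96, 180),   # systolic blood pressure, mmHg
}

def abnormal_vitals(hr, rr, sbp, spo2, gcs):
    """Return True if any vital sign is outside its derived normal range.

    Oxygen saturation and GCS were reported as one-sided cut points
    (>93% and >13, respectively).
    """
    in_range = (
        NORMAL_RANGES["hr"][0] <= hr <= NORMAL_RANGES["hr"][1]
        and NORMAL_RANGES["rr"][0] <= rr <= NORMAL_RANGES["rr"][1]
        and NORMAL_RANGES["sbp"][0] <= sbp <= NORMAL_RANGES["sbp"][1]
        and spo2 > 93
        and gcs > 13
    )
    return not in_range
```

A rule like this favors simplicity at the cost of precision: with a 6.3% LSI base rate, even 66.6% specificity yields the low PPV (12.5%) reported above.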
Conclusion:
Optimal ranges for adult vital signs in the prehospital setting were statistically derived. These may be useful in prehospital protocols and medical alert systems, or may be incorporated within prediction models to identify critical illness and/or injury among patients with out-of-hospital emergencies.
Perinatal malnutrition is a critical cause of diseases in offspring. Based on the different rates of organ development, we hypothesised that malnutrition at varying early life stages would have a differential impact on cardiovascular disease in middle-aged and older adults. This study sought to assess the long-term impact of exposure to the 1959–1961 Great Chinese Famine (GCF) during early developmental periods on risks of cardiovascular diseases in the late middle-aged offspring. A total of 6,662 individuals, born between 1958 and 1964, were divided into six groups according to birth date. A generalised linear model was used to control for age and estimate differences, with 95% confidence intervals (CI), in blood pressure. Binary logistic regression was applied to evaluate the association between famine exposure and cardiovascular diseases. Compared with the unexposed late middle-aged persons, blood pressure was elevated in the entire-gestation exposure group, regardless of postnatal exposure to GCF. Increased blood pressure was also found in the female offspring exposed to GCF during early and middle gestation. Early-childhood exposure was associated with the risk of bradycardia in the offspring. The risks of vertebral artery atherosclerosis were elevated in GCF-exposed groups except the first-trimester exposure group. The chronic influence of GCF in early life periods was specific to the developmental timing window, sexes and organs, suggesting an essential role of interactions among multiple factors and prenatal malnutrition in developmentally “programming” cardiovascular diseases.
Terrorism and trauma survivors often experience changes in biomarkers of autonomic, inflammatory and hypothalamic-pituitary-adrenal (HPA) axis function assessed at various times. Research suggests these systems interact in chronic stress.
Study Objective:
This unprecedented retrospective study explores long-term stress biomarkers in three systems in terrorism survivors.
Methods:
Sixty healthy, direct terrorism survivors were compared to non-exposed community members for cardiovascular reactivity to a trauma script, morning salivary cortisol, interleukin 1-β (IL-1β), and interleukin 2-R (IL-2R). Survivors’ biomarkers were correlated with psychiatric symptoms and diagnoses and reported functioning and well-being seven years after the Oklahoma City (OKC) bombing.
Main outcome measures were the Diagnostic Interview Schedule (DIS) Disaster Supplement for Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision (DSM-IV-TR) diagnoses, Impact of Events Scale-Revised (IES-R), Beck Depression Inventory-II (BDI-II), Distress and Functioning Scale (DAF), and General Physical Well-Being Scale.
Results:
Survivors had higher inflammatory IL-1β, lower anti-inflammatory IL-2R, lower cortisol, higher resting diastolic blood pressure (BP), and less cardiovascular reactivity to a trauma script than comparisons. Survivors’ mean posttraumatic stress (PTS) symptom levels did not differ from comparisons, but survivors reported worse well-being. None of survivors’ biomarkers correlated with PTS or depressive symptoms or diagnoses or reported functioning.
Conclusions:
Alterations of biological stress measures in cardiovascular, inflammatory, and cortisol systems coexisted as an apparent generalized long-term response to terrorism rather than being related to specific gauges of mental health. Potential interactions of biomarkers long after trauma exposure are discussed in light of relevant research. Longer-term follow-up could determine whether biomarkers continue to differ or correlate with subjective measures, or whether they accompany health problems over time. Given recent international terrorism, understanding long-term sequelae among direct survivors is increasingly relevant.
Postural orthostatic tachycardia syndrome is a debilitating disorder. We compared paediatric patients with this dysautonomia presenting with and without peak upright heart rate > 100 beats per minute.
Materials and Methods:
Subjects were drawn from the Postural Orthostatic Tachycardia Syndrome Program database of the Children’s Hospital of Philadelphia and were diagnosed between 2007 and 2018. Subjects were aged 12–18 years at diagnosis, with demographic data, supine and peak heart rate from a 10-minute stand, symptoms, and family history recorded. Patients were divided into “low heart rate” (peak less than 100 beats/minute) and “high heart rate” (peak at least 100 beats/minute) groups.
Results:
In total, 729 subjects were included (low heart rate group: 131 patients, high heart rate group: 598 patients). The low heart rate group had later age at diagnosis (16.1 versus 15.7, p = 0.0027). Median heart rate increase was 32 beats/minute in the low heart rate group versus 40 beats/minute in the high heart rate group (p < 0.00001). Excluding palpitations and tachypalpitations, there were no differences in symptom type or frequency between groups.
Discussion:
Paediatric patients meeting heart rate criteria for postural orthostatic tachycardia syndrome but without peak heart rate > 100 demonstrate no difference in symptom type or frequency versus those who meet both criteria. Differences observed reached statistical significance due to population size but are not clinically meaningful. This suggests that increased heart rate, but not necessarily tachycardia, is seen in these patients, supporting previous findings suggesting maximal heart rate is not a major determinant of symptom prevalence in paediatric postural orthostatic tachycardia syndrome.
There is significant public health interest towards providing medical care at mass-gathering events. Furthermore, mass gatherings have the potential to have a detrimental impact on the availability of already-limited municipal Emergency Medical Services (EMS) resources. This study presents a cross-sectional descriptive analysis to report broad trends regarding patients who were transported from National Collegiate Athletic Association (NCAA) Division 1 collegiate football games at a major public university in order to better inform emergency preparedness and resource planning for mass gatherings.
Methods:
Patient care reports (PCRs) from ambulance transports originating from varsity collegiate football games at the University of Minnesota across six years were examined. Pertinent information was abstracted from each PCR.
Results:
Across the six years of data, there were a total of 73 patient transports originating from NCAA collegiate football games: 45.2% (n = 33) were male, and the median age was 22 years. Alcohol-related chief complaints were involved in 50.7% (n = 37) of transports. In total, 31.5% of patients had an initial Glasgow Coma Scale (GCS) of less than 15. The majority (65.8%; n = 48; 0.11 per 10,000 attendees) were transported by Basic Life Support (BLS) ambulances. The remaining patients (34.2%; n = 25; 0.06 per 10,000 attendees) were transported by Advanced Life Support (ALS) ambulances and were more likely to be older, have abnormal vital signs, and have a lower GCS.
Conclusions:
This analysis of ambulance transports from NCAA Division 1 collegiate football games emphasizes the prevalence of alcohol-related chief complaints, but also underscores the likelihood of more life-threatening conditions at mass gatherings. These results and additional research will help inform emergency preparedness at mass-gathering events.
The effects of acute thermal exposure on appetite remain unclear because of very heterogeneous methodologies. The aim of this study was therefore to clearly define the effects of passive 24-h cold (16°C) and heat (32°C) exposures on appetitive responses compared with a thermoneutral condition (24°C). Twenty-three healthy, young and active male participants completed three sessions (from 13.00) in a laboratory designed like an apartment, wearing the same outfit (Clo = 1). Three meals composed of three or four cold or warm dishes were served ad libitum to assess energy intake (EI). Leeds Food Preference Questionnaires were used before each meal to assess food reward. Subjective appetite was regularly assessed, and levels of appetitive hormones (acylated ghrelin, glucagon-like peptide-1, leptin and peptide YY) were assessed before and after the last meal (lunch). Contrary to the literature, total EI was not modified by cold or heat exposure (P = 0·120). Accordingly, hunger scores (P = 0·554) were not altered. Levels of acylated ghrelin and leptin were marginally higher during the 16°C (P = 0·032) and 32°C (P < 0·023) sessions, respectively. Interestingly, implicit wanting for cold and low-fat foods at 32°C and for warm and high-fat foods at 16°C was increased during the whole exposure (P < 0·024). Moreover, cold entrées were consumed more at 32°C (P < 0·062) and warm main dishes more at 16°C (P < 0·025). Thus, passive cold and heat exposures had limited effects on appetite, and it seems that offering some choice based on food temperature may help individuals to express their specific food preferences and maintain EI.
Cardiac vagal tone is an indicator of parasympathetic nervous system functioning, and there is increasing interest in its relation to antisocial behavior. It is unclear however whether antisocial individuals are characterized by increased or decreased vagal tone, and whether increased vagal tone is the source of the low heart rate frequently reported in antisocial populations.
Methods
Participants consisted of four groups of community-dwelling adolescent boys aged 15.7 years: (1) controls, (2) childhood-only antisocial, (3) adolescent-only antisocial, and (4) persistently antisocial. Heart rate and vagal tone were assessed in three different conditions: rest, cognitive stressor, and social stressor.
Results
All three antisocial groups had both lower resting heart rates and increased vagal tone compared to the low antisocial controls across all three conditions. Low heart rate partially mediated the relationship between vagal tone and antisocial behavior.
Conclusions
Results indicate that increased vagal tone and reduced heart rate are relatively broad risk factors for different developmental forms of antisocial behavior. Findings are the first to implicate vagal tone as an explanatory factor in understanding heart rate – antisocial behavior relationships. Future experimental work using non-invasive vagus nerve stimulation or heart rate variability biofeedback is needed to more systematically evaluate this conclusion.
Recent evidence suggests better appetite control in states of high-energy flux (HEF) in adults and lean children. Nevertheless, it is unknown whether this extends to youth with obesity. This study compares the effects of low, moderate or high energy flux on short-term appetitive control in adolescents with obesity. Sixteen adolescents with obesity (12–16 years, Tanner stages 3–5, 11 females) randomly completed three conditions: (i) low-energy flux (LEF); (ii) moderate energy flux (MEF; + 250 kcal) and (iii) high-energy flux (HEF; + 500 kcal). Energy flux was achieved in MEF and HEF through elevated energy intake (EI) and a concomitant increase in energy expenditure using cycling exercise (65 % VO2peak). Ad libitum EI, macronutrient intake and relative EI (REI) were assessed at dinner, subjective appetite sensations were taken at regular intervals and food reward was measured before dinner. Ad libitum EI at dinner was greater in LEF compared with HEF (P = 0·008), and REI was higher in LEF compared with MEF (P = 0·003) and HEF (P < 0·001). The absolute consumption of carbohydrates was lower in LEF compared with MEF (P = 0·047) and HEF (P < 0·001). Total AUC for hunger and desire to eat was lower in HEF compared with LEF (P < 0·001) and MEF (P = 0·038). Total AUC for prospective food consumption was lower in HEF compared with LEF (P = 0·004). Food choice sweet bias was higher in HEF (P = 0·005) compared with LEF. To conclude, increasing energy flux may improve short-term appetite control in adolescents with obesity.
In far-forward combat situations, the military challenged dogma by using whole blood transfusions (WBTs) rather than component-based therapy. More recently, some trauma centers have initiated WBT programs with reported success. There are a few Emergency Medical Service (EMS) systems that are using WBTs, but the vast majority are not. Given the increasing data supporting the use of WBTs in the prehospital setting, more EMS systems are likely to consider or begin WBT programs in the future.
Objective:
A prehospital WBT program was recently implemented in Palm Beach County, Florida (USA). This report will discuss how the program was implemented, the obstacles faced, and the initial results.
Methods:
This report describes the process by which a prehospital WBT program was implemented by Palm Beach County Fire Rescue and the outcomes of the initial case series of patients who received WBTs in this system. Efforts to initiate the prehospital WBT program for this system began in 2018. The program had several obstacles to overcome, with one of the major obstacles being the legal team’s perception of potential liability that might occur with a new prehospital blood transfusion program. This obstacle was overcome through education of local elected officials regarding the latest scientific evidence in favor of prehospital WBTs with potential life-saving benefits to the community. After moving past this hurdle, the program went live on July 6, 2022. The initial indications for transfusion of cold-stored, low-titer, leukoreduced O+ whole blood in the prehospital setting included traumatic injuries with systolic blood pressure (SBP) < 70 mmHg or SBP < 90 mmHg plus heart rate (HR) > 110 beats per minute.
Findings:
From the program’s launch through December 31, 2022, Palm Beach County Fire Rescue transported a total of 881 trauma activation patients, with 20 (2.3%) receiving WBT. Overall, nine (45%) of the patients who received WBTs remain alive. No adverse events related to transfusion were identified following WBT administration. A total of 18 units of whole blood expired prior to transfusion.
Conclusion:
Despite a number of logistical and legal obstacles, Palm Beach County Fire Rescue successfully implemented a prehospital WBT program. Other EMS systems that are considering a prehospital WBT program should review the included protocol and the barriers to implementation that were faced.
Contactless photoplethysmography (PPG) potentially affords the ability to obtain vital signs in pediatric populations without disturbing the child. Most validity studies have been conducted in laboratory settings or with healthy adult volunteers. This review aims to evaluate the current literature on contactless vital signs monitoring in pediatric populations and within a clinical setting.
Methods:
OVID, Web of Science, the Cochrane Library, and clinicaltrials.org were systematically searched by two authors for research studies which used contactless PPG to assess vital signs in children within a clinical setting.
Results:
Fifteen studies were included with a total of 170 individuals. Ten studies were included in a meta-analysis for neonatal heart rate (HR), which demonstrated a pooled mean bias of −0.25 (95% limits of agreement (LOA), −1.83 to 1.32). Four studies assessed respiratory rate (RR) in neonates, and meta-analysis demonstrated a pooled mean bias of 0.65 (95% LOA, −3.08 to 4.37). All studies were small, and there were variations in the methods used and risk of bias.
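The pooled mean bias and 95% limits of agreement reported above follow the standard Bland-Altman calculation (bias ± 1.96 × SD of the paired differences). A minimal sketch of that calculation follows; the paired heart-rate readings are made up for illustration, not data from the included studies:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Mean bias and 95% limits of agreement between two measurement methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired neonatal HR readings (beats/minute):
ppg_hr = [120, 124, 131, 118, 126]  # contactless PPG
ecg_hr = [121, 123, 132, 118, 127]  # reference ECG
bias, (loa_low, loa_high) = bland_altman(ppg_hr, ecg_hr)
```

A bias near zero with narrow limits of agreement, as in the pooled HR estimate (−0.25; LOA −1.83 to 1.32), indicates close agreement between the contactless and reference methods.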
Conclusion:
Contactless PPG is a promising tool for vital signs monitoring in children and accurately measures neonatal HR and RR. Further research is needed to assess children of different age groups, the effects of skin type variation, and the addition of other vital signs.
Contactless photoplethysmography (cPPG) is a method of physiological monitoring. It differs from conventional monitoring methods (e.g., a saturation probe) by using a camera, ensuring no contact with the subject. The majority of research on cPPG has been conducted in laboratory settings or in healthy populations. This review aims to evaluate the current literature on monitoring using cPPG in adults within a clinical setting. Adhering to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA 2020) guidelines, OVID, Web of Science, the Cochrane Library, and clinicaltrials.org were systematically searched by two researchers. Research articles using cPPG for monitoring purposes in adults within a clinical setting were selected. Twelve studies with a total of 654 individuals were included. Heart rate (HR) was the most investigated vital sign (n = 8), followed by respiratory rate (n = 2), SpO2 (n = 2), and HR variability (n = 2). Four studies were included in a meta-analysis of HR compared with ECG data, which demonstrated a mean bias of –0.13 (95% CI, –1.22 to 0.96). This review demonstrates that cPPG can be a useful tool in the remote monitoring of patients and has demonstrated accuracy for HR. However, further research is needed into the clinical applications of this method.
Alterations in heart rate (HR) may provide new information about physiological signatures of depression severity. This 2-year study in individuals with a history of recurrent major depressive disorder (MDD) explored the intra-individual variations in HR parameters and their relationship with depression severity.
Methods
Data from 510 participants (number of observations of the HR parameters = 6,666) were collected from three centres in the Netherlands, Spain, and the UK as part of the Remote Assessment of Disease and Relapse – MDD (RADAR-MDD) study. We analysed the relationship between depression severity, assessed every 2 weeks with the Patient Health Questionnaire-8, and HR parameters in the week before the assessment, such as HR features over the whole day, during daytime and night-time resting periods, and during daytime activity periods, evaluated with a wrist-worn Fitbit device. Linear mixed models were used with random intercepts for participants and countries. Covariates included in the models were age, sex, BMI, smoking and alcohol consumption, antidepressant use and co-morbidities with other medical health conditions.
Results
Decreases in HR variation during daytime resting periods were associated with increased severity of depression in both univariate and multivariate analyses. Mean HR during resting at night was higher in participants with more severe depressive symptoms.
Conclusions
Our findings demonstrate that alterations in resting HR during the day and at night are associated with depression severity. These findings may provide an early warning of worsening depressive symptoms, which could allow clinicians to take responsive treatment measures promptly.
Many Emergency Medical Service (EMS) systems in the United States restrict albuterol therapy by scope of practice to Advanced Life Support (ALS). The State of Delaware has a two-tiered EMS system in which Basic Life Support (BLS) arrives on scene prior to ALS in the majority of respiratory distress calls.
Study Objective:
This study sought to evaluate the safety, efficacy, and expedience of albuterol administration by BLS compared to ALS.
Methods:
This retrospective observational study used data collected from July 2015 through January 2017 throughout a State BLS albuterol pilot program. Pilot BLS agencies participated in a training session on the indications and administration of albuterol, and were then authorized to carry and administer nebulized albuterol. Heart rate (HR), respiratory rate (RR), and pulse oximetry (SpO2) were obtained before and after albuterol administration by BLS and ALS. The times from BLS arrival to the administration of albuterol by pilot BLS agencies versus ALS were compared. Study encounters required both BLS and ALS response. Data were analyzed using chi-square and t-test as appropriate.
Results:
Three hundred eighty-eight (388) incidents were reviewed. One hundred eighty-five (185) patients received albuterol by BLS pilot agencies and 203 patients received albuterol by ALS. Of note, the population treated by ALS was significantly older than the population treated by BLS (61.9 versus 51.6 years; P <.001). A comparison of BLS arrival time to albuterol administration time showed significantly shorter times in the BLS pilot group compared to the ALS group (3.50 minutes versus 8.00 minutes, respectively; P <.001). After albuterol administration, BLS pilot patients showed improvements in HR (P <.01), RR (P <.01), and SpO2 (P <.01). In contrast, ALS treatment patients showed improvement in SpO2 (P <.01) but not RR (P = .17) or HR (P = 1.00). Review by ALS or hospital staff showed albuterol was indicated in 179 of 185 BLS patients and administered correctly in 100% of these patients.
Conclusion:
Patients treated by BLS agencies carrying albuterol both received albuterol significantly sooner and showed greater improvements in vital signs than patients of BLS agencies that required ALS arrival for albuterol. Two-tiered EMS systems should consider allowing BLS to carry and administer albuterol for safe, effective, and expedient treatment of respiratory distress patients amenable to albuterol therapy.
Gavaging (oral dosing) has previously been shown to have only a short-term effect on behavioural parameters in the laboratory rat. The aim of this study was to determine if the gavaging of laboratory rats influenced their heart rate, blood pressure and body temperature, and if so, whether the duration of this impact correlated with the volume gavaged. The three stress parameters were measured using telemetric transponders placed in the abdomen of eight female Sprague-Dawley (Mol:SPRD) rats. Using a Latin Square cross-over design, the rats were gavaged with three different doses of barium sulphate (4, 10 and 40 ml kg–1); in addition, there was a control of no dose, only insertion of the tube. The heart rate, blood pressure and body temperature of the rats were monitored continuously for 4 h after dosing and again for 1 h, 24 h after dosing. The gavaging of laboratory rats was shown to induce an acute reaction: after 30 min, blood pressure and heart rate were significantly higher than before gavaging, and body temperature was significantly higher 60 min after gavaging — indicators of stress levels comparable to those of other basic experimental procedures. A significant correlation between heart rate and dosage was observed until 10 min after gavaging. This indicates that the dosage gavaged is of only minor importance in causing stress, and only important for the most acute reaction. However, because of the resistance and discomfort observed when administering a 40 ml kg–1 dose, this dose should be administered only with caution.
A new, external non-invasive telemetric heart rate (HR) monitoring system was evaluated on eight wapiti, Cervus elaphus canadensis, yearlings in July and August 1996. The assembly consisted of a leather girth strap, onto which a HR transmitter and a customized carriage bolt electrode system were fixed. To prevent the girth strap from rotating on the animal, it was secured with adjustable nylon straps extending anteriorly between the forelegs up to an adjustable neck collar. In preliminary testing, audible tones were received during 99 per cent (n = 902) of the 15-s intervals when the animals were active, but only during 33 per cent (n = 156) when they were bedded. After 2 weeks, the equipment remained functional (and was removed); the effective signal range was consistently beyond 500 m. This HR monitoring system is easy to attach externally, obviates complications from surgery, and provides coverage over an extended range.
The monitoring system offers a reliable, humane and inexpensive method for short-term measurement of HR in captive or wild ungulates. Further tests may reveal a potential for long-term application. The ability to measure physiological responses under different management regimes can aid ungulate farmers in selecting optimal herd sizes and social structures for their animals; and in developing superior housing, enclosure designs, handling and transport methods. This improves the animals’ welfare, and ultimately leads to an increase in animal growth and herd productivity. In addition, information about heart rates can help wildlife managers to improve their management strategies, by gaining an understanding of the energy expenditure associated with various activities and environmental influences.
This study investigated the effect of olfactory substances on the heart rate and lying behaviour of pigs during transport simulation. Five treatments were tested through the application of each substance to pigs’ snouts with a paintbrush. These consisted of: 1) control treatment (wiping without product); 2) 2 ml of a synthetic, maternal-like pheromone; 3) 5 ml of a synthetic, maternal-like pheromone; 4) a commercial, non-relevant odour and 5) 2 ml of a placebo (solvent of the synthetic pheromone without active ingredients). In total, 90 pigs took part in this study and each treatment was tested on a group of three pigs with six replicates per treatment. Pigs were vibrated in the vertical direction in a transport simulator with a frequency of 8 Hz and an acceleration of 3 m s−2. Cardiac activity and lying behaviour during vibration were quantified. The effect of vibration was found to be statistically significant, ie causing an increase in heart rate and numbers of ventricular ectopic beats (VEB). Both 2 and 5 ml of synthetic pheromone were generally found to decrease the minimum, mean, and peak heart rate values in comparison with the other treatments (in particular the control and the non-relevant odour group) but only minimum heart rate reached statistical significance. However, the number of VEBs was highest for these two synthetic pheromone groups during vibration. No dose-dependent synthetic pheromone effects were found and there were no differences in the amount of time pigs spent lying. The use of olfactory substances may support pigs’ ability to cope with real transport conditions thereby improving their welfare.
The effect of transport on core and peripheral body temperatures and heart rate was assessed in ten 18-month-old Coopworth ewes (Ovis aries). Manual recordings of core (rectal) temperatures were obtained, and automated logging of peripheral (external auditory canal and pinna) temperatures and heart rate was carried out on the day prior to (day 1) and during (day 2) a standardised transport procedure. Transport produced a significant increase in the rectal temperature, which declined following unloading. Peripheral measures of body temperature also exhibited changes with transport. However, both ear-canal and pinna temperatures declined during actual transport, reflecting to some extent the decline in ambient temperatures recorded externally by sensors on the ear tags of the animals. Peripheral measurement of temperature, particularly at the readily accessible ear canal, may offer potential as a technique for the long-term monitoring of thermal responses to stress. However, further research is required into the potentially confounding effects of ambient temperature and wind chill factors.
This study was designed to investigate the physiological responses induced in sheep (n = 18) by two different loading techniques followed by a short road journey. All animals were prepared with venous catheters, to minimize the disturbing effects of blood sampling, and nine sheep were fitted with heart rate monitors. The animals were loaded onto a transport vehicle in groups of three, alternately using a conventional tailgate ramp or a crate raised with a hydraulic lift. When all of the sheep were loaded, they were taken on a journey lasting 195 min. Blood samples were collected in the home pen, directly after loading, and at 15-min intervals during the journey. Measurements were made of plasma concentrations of cortisol, prolactin and catecholamines (adrenaline and noradrenaline). The results indicated that heart rate increased during loading, regardless of the method used. No changes in concentrations of cortisol or the catecholamines were detected, although a small increase in prolactin was noted when animals were loaded using the ramp. During transport, all sheep exhibited increases in plasma cortisol concentrations which were greatest during the first 2 h of the journey. The results suggest that, under the conditions employed in this experiment, the effects of the two loading procedures were similar and that transport appeared to be more stressful than loading.