Over 30 years ago it was proven beyond doubt that folic acid supplementation of mothers in early pregnancy protects against neural tube defects (NTD) in their babies. Such conclusive scientific evidence led to clear recommendations for women worldwide to take 0⋅4 mg/d folic acid before conceiving and in early pregnancy, but translating these into effective policy has been problematic. As a result, there has been no change in the incidence of NTD in Ireland, the UK or any other European country over the 25-year period that the current strategy, recommending periconceptional folic acid supplements to women, has been in place. Thus preventable NTD are not being prevented. Notably, in September 2021, the UK government announced that flour is to be fortified with folic acid on a mandatory basis. A similar decision is now urgently needed in Ireland, where rates of NTD are among the highest in the world. A policy of mandatory folic acid fortification of food would be highly effective in preventing NTD because it reaches all women, including those who have not planned their pregnancy. International evidence shows that wherever such a policy has been introduced, it has proved effective in reducing rates of NTD in that country. Apart from preventing NTD, the driver of policy in this area, other potential health benefits across the lifecycle can be anticipated from folic acid fortification. Urgent action is needed on the implementation of mandatory food fortification with folic acid in Ireland so that mothers and their babies can benefit.
The impact of early-life nutrition on health outcomes across the lifespan laid the foundation for the field known as the developmental origins of health and disease. Studies in this area initially concentrated on nutrition and the risk of adverse cardio-metabolic and cancer outcomes. More recently, the role of nutrition in early brain development and its subsequent influence on later mental health has become more evident. Scientific breakthroughs have elucidated two mechanisms behind long-term nutrient effects on the brain: the existence of critical periods for certain nutrients during brain development, and nutrient-driven epigenetic modifications of chromatin. While multiple nutrients and nutritional conditions have the potential to modify brain development, iron can serve as a paradigm for understanding both mechanisms. New horizons in nutritional medicine include leveraging the mechanistic knowledge of nutrient–brain interactions to propose novel nutritional approaches that protect the developing brain through better timing of nutrient delivery and potential reversal of negative epigenetic marks. The main challenge in the field is detecting whether a change in nutritional status truly affects the brain's development and performance in human subjects. To that end, a strong case can be made to develop and utilise bioindicators of a nutrient's effect on the developing brain instead of relying exclusively on biomarkers of the nutrient's status.
The objective of this paper is to review the global effort to eliminate iodine deficiency and its impact on public health. Iodine is an essential component of hormones produced by the thyroid gland. Iodine deficiency has multiple adverse effects in humans due to inadequate thyroid hormone production that are termed the iodine deficiency disorders. The major adverse effect is impaired cognition in children. The WHO's first estimate of the global prevalence of goitre in 1960 suggested that 20–60 % of the world's population was affected, with most of the burden in low- and middle-income countries. Iodine deficiency was identified as a key global risk factor for impaired child development for which the need for intervention was urgent. This spurred a worldwide effort to eliminate iodine deficiency led by a coalition of international organisations working closely with national governments and the salt industry. In most countries, the best strategy to control iodine deficiency is carefully monitored iodisation of salt. The reach of current iodised salt programmes is remarkable: in 2018, 88 % of the global population used iodised salt. The number of countries with adequate iodine intake has nearly doubled over the past 20 years, from 67 in 2003 to 118 in 2020. The resulting improvement in cognitive development and future earnings suggests a potential global economic benefit of nearly $33 billion. Iodine programmes are appealing to national governments because the health and economic consequences of deficiency are high yet can be easily averted by salt iodisation, a low-cost and sustainable intervention.
The present paper reviews progress in research on dietary fibre and human health over the past five decades. There is now convincing evidence from prospective cohort studies that diets low in dietary fibre are associated with increased risk of common non-communicable diseases including CVD, type 2 diabetes and colorectal cancer. These findings provide strong support for hypotheses proposed by Denis Burkitt 50 years ago, based on very limited evidence but with considerable imagination and insight. For the first two to three decades of this period, research on dietary fibre was hampered by the lack of consensus about the definition, and measurement, of this complex and diverse dietary component and by the lack of appropriate tools for investigating the gut microbiome that is central to understanding mechanisms of action. Recent technical and scientific advances in microbiome research (based on fast, low-cost, DNA sequencing) are facilitating investigation of the associations between dietary fibre, the gut microbiome and human health. Current challenges include the need for agreement about the characteristics of a healthy gut microbiome. Although the health benefits attributed to higher dietary fibre intake are likely to be shared with most types of dietary fibre, one should anticipate that different sources of dietary fibre and the other components (resistant starch and non-digestible oligosaccharides) that make up dietary fibre will have characteristically different effects on human physiology and disease risk. In conclusion, population-level intakes of dietary fibre are low and there is a public health priority to develop and implement more effective interventions to increase intake.
Postgraduate Symposium
Conference on ‘Impact of nutrition science to human health: past perspectives and future directions’
Diet-related diseases are the leading cause of death globally and strategies to tailor effective nutrition advice are required. Personalised nutrition advice is increasingly recognised as more effective than population-level advice to improve dietary intake and health outcomes. A potential tool to deliver personalised nutrition advice is metabotyping, which groups individuals into homogeneous subgroups (metabotypes) using metabolic profiles. To date, metabotyping has been successfully employed in human nutrition research to identify subgroups of individuals with differential responses to dietary challenges and interventions and diet–disease associations. The suitability of metabotyping to identify clinically relevant subgroups is corroborated by other fields such as diabetes research, where metabolic profiling has been intensely used to identify subgroups of patients that display patterns of disease progression and complications. However, there is a paucity of studies examining the efficacy of the approach to improve dietary intake and health parameters. While the application of metabotypes to tailor and deliver nutrition advice is very promising, further evidence from randomised controlled trials is necessary for further development and acceptance of the approach.
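The grouping step behind metabotyping can be illustrated with a minimal sketch: cluster individuals into homogeneous subgroups from their metabolic profiles. The code below uses a simple k-means over invented biomarker values; the data, variable names and two-marker panel are hypothetical, and real metabotyping studies use validated biomarker panels and more rigorous cluster-validation methods.

```python
# Illustrative sketch of metabotyping: grouping individuals into
# homogeneous subgroups (metabotypes) by clustering metabolic profiles.
# All data and names here are hypothetical.
import random

def kmeans(profiles, k, iters=50, seed=1):
    """Cluster metabolic profiles (lists of floats) into k metabotypes."""
    rng = random.Random(seed)
    centres = rng.sample(profiles, k)
    for _ in range(iters):
        # Assign each profile to its nearest centre (squared Euclidean distance).
        groups = [[] for _ in range(k)]
        for p in profiles:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centres]
            groups[d.index(min(d))].append(p)
        # Recompute each centre as the mean of its assigned profiles.
        centres = [
            [sum(col) / len(g) for col in zip(*g)] if g else centres[i]
            for i, g in enumerate(groups)
        ]
    return centres, groups

# Hypothetical profiles: [fasting glucose (mmol/l), triacylglycerol (mmol/l)]
profiles = [[4.8, 0.9], [5.0, 1.1], [5.1, 1.0],   # a "lower-risk" metabotype
            [6.9, 2.4], [7.2, 2.6], [7.0, 2.3]]   # a "higher-risk" metabotype
centres, groups = kmeans(profiles, k=2)
print([len(g) for g in groups])  # two metabotypes of three individuals each
```

In practice, subgroup-specific dietary advice would then be tailored to the characteristics of each metabotype rather than delivered individually or at the whole-population level.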
Adolescence is a pivotal, yet frequently overlooked, period of life, with this age group often no longer receiving the focus, care and protection devoted to other life stages. Nutritional vulnerability increases in adolescence due to heightened nutritional requirements, yet the quality of the diets consumed by this age group often deteriorates significantly. Poor-quality dietary patterns and insufficient nutrient intakes are frequently observed amongst adolescents both in Ireland and globally. This deterioration in diet quality is greatly influenced by individual, social and environmental determinants of behaviour and health. The influences of each of these factors change and increase as adolescents begin to interact independently with the surrounding world. Poor nutrition during adolescence can result in several immediate and long-term health consequences, including micronutrient deficiencies, increased risk of overweight/obesity and increased presentation of cardiometabolic risk factors, all of which have been observed as persistent issues amongst adolescents in Ireland and internationally. Adolescence is a critical period of intervention to protect youth both now and into their future lives. This age group can be particularly receptive to the influence of society and the surrounding environment, offering several avenues through which to influence adolescents towards more health-promoting behaviour. This review aims to summarise the key nutritional and dietary characteristics of adolescents, to provide an overview of the causes and consequences of poor nutrition in adolescence, and to highlight potential opportunities for intervention to protect the health of this age group, with a particular focus on evidence from an Irish context.
Vitamin D is crucial for musculoskeletal health, with evidence suggesting non-skeletal benefits. Cutaneous vitamin D synthesis is limited in Ireland due to its northern latitude (52–55°N) and the population is dependent on dietary sources, yet intakes are inadequate. No study to date has comprehensively examined vitamin D intakes and status in Ireland (Northern Ireland and the Republic). We aimed to review the evidence since 2010 and summarise the results in subgroups of the Irish population. We found that in the largest studies the prevalence of deficiency [25-hydroxyvitamin D (25(OH)D) < 30 nmol/l] was 15–17% in pregnancy, 15–23% in children and 13% in adults. Approximately half the population had 25(OH)D < 50 nmol/l. There were only four small studies in an ethnic population, with the largest, in Southeast Asians, finding that 67% were deficient. All studies found higher rates of deficiency and levels <50 nmol/l in winter v. summer. Vitamin D intake was lowest in children (mean 2⋅3–4⋅2 μg/d) and pregnant women (mean 1⋅9–5⋅1 μg/d) and highest in older adults (6⋅9 μg/d), with over 90% of the population not meeting the recommended daily allowance. This review indicates that low vitamin D status and dietary vitamin D intake are widespread, with children, adolescents, younger adults, pregnant women and ethnic minorities most at risk. However, data are sparse in at-risk groups including the Travelling community, non-Europeans and institutionalised adults. Given the significant prevalence of deficiency, public health policies to promote better awareness of recommended vitamin D intakes and explore the options of food fortification are needed to address this issue.
Adolescence is a critical time of physical, psychological and social development, and thus, optimal nutritional intakes are required during this life stage. Despite this, adolescence is recognised as a period of nutritional vulnerability, with many reportedly failing to meet current dietary guidelines. The school-setting presents a favourable environment to intervene and promote positive dietary behaviours and is also inclusive regardless of socio-economic status. However, a lack of consensus exists on how best to utilise schools to facilitate improvements in dietary behaviours among this age group. Whilst previous research has focused on identifying the factors motivating dietary choices within the school-setting, less is known about the optimum strategies to enhance these dietary choices, which could positively contribute to the design of future interventions. It is reported that adolescents have good nutritional knowledge, although this does not appear to be a central consideration when making their dietary choices. Alternative factors at the individual (taste, visual appeal, familiarity, food quality, price, portion size, value for money, time/convenience), social (peer influence), physical (product placement) and macro environment (food availability) levels have been frequently cited as important influences on adolescents' dietary choices in school. Although school-based interventions have shown potential in achieving positive dietary change among adolescents, more research is needed to determine the most effective methods of improving dietary behaviours in schools. This review summarises the key factors which influence adolescents' school-based dietary choices and the effectiveness of previously conducted interventions, identifying promising components for consideration when developing future dietary interventions within the school-setting.
Symposium two: ‘Stuck in neutral’: current challenges for nutrition science
A high intake of fruit and vegetables (FV) has consistently been associated with a reduced risk of a number of non-communicable diseases. This evidence base is largely from prospective cohort studies, with meta-analyses demonstrating an association between increased FV intake and reduced risk of both CHD and stroke, although the evidence is less certain for cancer and diabetes. Controlled intervention trials examining either clinical or intermediate risk factor endpoints are scarcer. Therefore, evidence that FV consumption reduces the risk of disease is so far largely confined to observational epidemiology, which is hampered by some methodological uncertainties. Although increased FV intake is promoted across all dietary guidelines, national surveys confirm that dietary intakes are suboptimal and are not increasing over time. A range of barriers to increasing FV intake exist, including economic, physical and behavioural barriers, that must be addressed when exploring potential opportunities for change, alongside the feasibility of different approaches to encourage increased FV consumption. Such interventions must include consideration of context, for example, the challenges and uncertainties which exist within the whole food system.
This review summarises evidence relating to a potential role for vitamin D supplementation in the prevention or treatment of coronavirus disease 2019 (COVID-19). Laboratory studies show that the active vitamin D metabolite 1,25-dihydroxyvitamin D induces innate antiviral responses and regulates immunopathological inflammation with potentially favourable implications for the host response to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Meta-analyses of cross-sectional, case-control and longitudinal studies report consistent protective associations between higher circulating 25-hydroxyvitamin D [25(OH)D] concentrations or vitamin D supplement use and reduced risk and severity of COVID-19. However, Mendelian randomisation studies testing for associations between genetically predicted circulating 25(OH)D concentrations and COVID-19 outcomes have yielded consistently null results. Positive findings from observational epidemiological studies may therefore have arisen as a result of residual or unmeasured confounding or reverse causality. Randomised controlled trials of prophylactic or therapeutic vitamin D supplementation to reduce risk or severity of COVID-19 reporting to date have yielded inconsistent findings. Results of further intervention studies are pending, but current evidence is insufficient to support routine use of vitamin D supplements as a therapeutic or prophylactic agent for COVID-19, or as an adjunct to augment immunogenicity of SARS-CoV-2 vaccination. Accordingly, national and international bodies have not made any recommendations regarding a role for vitamin D in the prevention or treatment of COVID-19.
Symposium three: Frontiers and future prospects for nutrition
The precision nutrition paradigm is based on the premise that substantial variation exists between human subjects in terms of diet-related disease risk and response to dietary interventions. In defining the paradigm, 'the right diet for the right person at the right time' may be more appropriate than 'one-diet-fits-all'. This review will explore how systems biology and nutrigenomics approaches have advanced the precision nutrition paradigm. We will draw upon a number of elegant mechanistic studies that have enhanced our understanding of the complex biology and inter-organ crosstalk, relating to inflammation and metabolism, that underpin cardio-metabolic health. This review will also explore the extent to which more targeted, precision nutrition approaches may attenuate adverse risk factors associated with cardio-metabolic disease. We will focus on the key characteristics or 'metabotypes' of high- v. low-risk individuals, and of response v. non-response to interventions, to generate greater insights with respect to risk stratification and therapeutic interventions to enhance disease prevention. The goal is to utilise systems biology to underpin more targeted nutritional approaches, which may improve the efficacy of personalised nutrition interventions.
Observational research, mainly prospective cohort studies (PCS), has represented a long-standing challenge for those attempting to draw up consistent policy recommendations in the area of diet and health. This is due to the inherent limitations of ascribing causality from observed associations, given problems of confounding, publication bias and citation bias. Developments in nutritional epidemiology research over the past 20–30 years have enabled causal criteria to be derived from observational studies and the totality of the primary literature to be reviewed objectively, reducing the previous reliance on narrative accounts of individual studies. The gold standard approach to assessing causal relationships is via randomised controlled trials (RCT), but neither RCT nor PCS provide direct evidence of biological plausibility, which is a key criterion for assessing causality. Although extensive mechanistic data are available in the literature, a systematic approach to selecting and assessing the quality and relevance of published studies has not been available, which limits their use in the development of diet and health policy. Recent studies have investigated a proposed two-step framework and novel methodologies for integrating heterogeneous data from cell, animal and human studies. Pilot and feasibility studies have shown this to be a useful novel approach to studies of diet and cancer, but further refinements are required, including the development of appropriate quality criteria that are less dependent on RCT designs. Future studies are needed to fully verify the approach and its potential for use in other diet–disease relationships.
Julie Wallace Award Lecture
Diminished skeletal muscle strength and size, termed sarcopenia, contributes substantially to physical disability, falls, dependence and reduced quality of life among older people. Physical activity and nutrition are the cornerstones of sarcopenia prevention and treatment. The optimal daily protein intake required to preserve muscle mass and function among older adults is a topic of intense scientific debate. Older adults require protein intakes about 67 % higher than their younger counterparts to maximally stimulate postprandial muscle protein synthesis rates. In addition, evidence suggests a possible benefit of increasing protein intake above the population reference intake (0⋅83 g/kg/d) on lean mass and, when combined with exercise training, muscle strength. In addition to protein quantity, protein quality, the pattern of protein intake over the day and specific amino acids (i.e. leucine) represent key considerations. Long-chain n-3 PUFA (LC n-3 PUFA) supplementation has been shown to enhance muscle protein synthesis rates, increase muscle mass and function and augment adaptations to resistance training in older adults. Yet, these effects are not consistent across all studies. Emerging evidence indicates that an older person's dietary, phenotypic and behavioural characteristics may modulate the efficacy of protein and LC n-3 PUFA interventions for promoting improvements in muscle mass and function, highlighting the potential inadequacy of a ‘one-size-fits-all’ approach. The application of personalised or precision nutrition to sarcopenia represents an exciting and highly novel field of research with the potential to help resolve inconsistencies in the literature and improve the efficacy of dietary interventions for sarcopenia.
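The protein figures above can be made concrete with a small worked example. The population reference intake of 0⋅83 g/kg/d is taken from the text; the per-meal doses of approximately 0⋅24 g/kg (younger adults) and 0⋅40 g/kg (older adults), commonly cited as the basis of the ~67 % difference in the dose needed to maximally stimulate muscle protein synthesis, are assumptions here, not values stated in the abstract.

```python
# Worked example of the protein intakes discussed above.
PRI = 0.83             # population reference intake, g protein/kg body mass/d (from the text)
YOUNG_PER_MEAL = 0.24  # g/kg per meal, younger adults (assumed, commonly cited figure)
OLDER_PER_MEAL = 0.40  # g/kg per meal, older adults (assumed, commonly cited figure)

def daily_pri_grams(body_mass_kg):
    """Total daily protein (g) at the population reference intake."""
    return PRI * body_mass_kg

def per_meal_grams(body_mass_kg, older=True):
    """Per-meal protein dose (g) to maximally stimulate muscle protein synthesis."""
    dose = OLDER_PER_MEAL if older else YOUNG_PER_MEAL
    return dose * body_mass_kg

mass = 70.0  # kg, illustrative body mass
print(round(daily_pri_grams(mass), 1))                      # 58.1 g/d at the PRI
print(round(per_meal_grams(mass, older=False)))             # 17 g per meal, younger adult
print(round(per_meal_grams(mass, older=True)))              # 28 g per meal, older adult
print(round((OLDER_PER_MEAL / YOUNG_PER_MEAL - 1) * 100))   # 67 % higher for older adults
```

The last line recovers the ~67 % figure cited in the abstract from the assumed per-meal doses.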