*Geek Box: Latency Periods

The ‘latency period’ of a disease is the period during which it is developing, without any overt signs or symptoms, prior to diagnosis. For example, while heart attacks increase in prevalence from age 50 onwards, we know that the process of atherosclerosis can begin to develop in the teenage years. This process adds up to a cumulative burden over time, manifesting as disease after up to 40 years of an exposure [in this case, exposure to LDL-cholesterol mediated by genetic, dietary, and other lifestyle factors].
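
As a minimal sketch of this cumulative-burden idea, the Python snippet below approximates total exposure as the area under an exposure curve over time, analogous to ‘pack-years’ for smoking. The ages, LDL values, and rate of rise are invented for illustration and are not taken from any dataset.

```python
# Cumulative exposure as area under an exposure-over-time curve.
# All values below are illustrative assumptions, not real data.

ages = list(range(15, 56))                           # ages 15..55
ldl_mmol_l = [2.5 + 0.04 * (a - 15) for a in ages]   # assumed slow rise in LDL

# Trapezoidal approximation of cumulative exposure in 'mmol/L-years',
# with one-year intervals between measurements.
cumulative = sum(
    (ldl_mmol_l[i] + ldl_mmol_l[i + 1]) / 2
    for i in range(len(ages) - 1)
)
print(f"Cumulative LDL exposure, ages 15-55: {cumulative:.1f} mmol/L-years")
```

Two individuals with the same LDL level at age 55 can thus carry very different cumulative burdens, depending on how early the exposure began.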

All chronic lifestyle diseases – heart disease, diabetes, stroke, etc. – are characterised by long latency periods. This is critically important to how we study diet-disease relationships. Nutrition science emerged during a period when the conditions facing public health – rickets, beri-beri, pellagra, goitre – were defined by single-nutrient deficiency states and characterised by short latency periods: symptoms emerged rapidly in deficiency, and resolved rapidly once the deficiency was addressed and nutritional adequacy restored.

However, modern nutritional epidemiology faces the logistical problem of trying to determine relationships between diseases with long latency periods and populations in which the exposure of interest – food – is a continuous, daily exposure. In this respect, long-term prospective cohort studies are an under-appreciated – and misunderstood – tool with which to examine long-term diet-disease relationships. Capturing the relationship between dietary exposures earlier in life and disease onset later in life has been the defining challenge for nutritional epidemiology, and the primary reason for the development of food-frequency questionnaires, which may capture a picture of past dietary history to be related to disease outcomes years later, during the follow-up period of the study.
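
To make the prospective design concrete, here is a rough toy simulation in Python: a binary dietary exposure is recorded once at ‘baseline’ (as a food-frequency questionnaire might record it) and disease status is observed after a simulated follow-up period. The sample size, baseline risk, and assumed relative risk of 1.5 are invented for illustration only.

```python
import random

random.seed(42)

# Toy prospective cohort: exposure measured once at baseline,
# outcome observed after follow-up. All parameters are assumptions.
N = 10_000
cohort = []
for _ in range(N):
    high_exposure = random.random() < 0.5               # e.g. high vs low intake at baseline
    base_risk = 0.05                                    # assumed 10-year disease risk
    risk = base_risk * (1.5 if high_exposure else 1.0)  # assumed relative risk of 1.5
    diseased = random.random() < risk
    cohort.append((high_exposure, diseased))

def incidence(group):
    """Proportion of a group developing disease over follow-up."""
    return sum(d for _, d in group) / len(group)

exposed = [r for r in cohort if r[0]]
unexposed = [r for r in cohort if not r[0]]
print(f"Observed relative risk after follow-up: "
      f"{incidence(exposed) / incidence(unexposed):.2f}")
```

The design's strength is visible even in this toy version: because exposure is recorded before disease develops, the ordering of cause and effect is preserved across the long latency period.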