Did you know 2015

2015:01 Maternal Body Mass Index (BMI) and subsequent disease in offspring 

1501. Maternal Body Mass Index (BMI) and subsequent disease in offspring

A cynical scenario of the relationship between a mother's BMI and subsequent development of cardio-metabolic disease in her offspring could be one in which the mother is sued by her child for failing to protect him/her from the metabolic syndrome. Certainly much has been written about the Barker hypothesis, i.e. rapid postnatal weight gain predisposing to later onset of cardiovascular and metabolic disease if foetal growth was restricted by factors such as foetal malnutrition or maternal vascular disease. However, some researchers propose that it is the rapid weight gain itself that is the culprit, irrespective of whether the foetus was under- or over-nourished. These issues link to the current focus of attention on obesity among pregnant women, something encountered in both developing and developed countries. Whether maternal overweight or obesity precedes the pregnancy or manifests during it, there is evidence of poorer pregnancy outcomes. Longer-term follow-up of such offspring has also shown increased BMI and the development of insulin resistance. Adding to the body of literature on the subject is a study from the renowned Liggins Institute in New Zealand. Named after the obstetrician who pioneered and advocated maternal antenatal steroids to stimulate foetal surfactant production, the Institute has followed up recipients of that intervention and found evidence of metabolic derangement after 30 years. In contrast to studies involving mothers whose BMIs were either abnormally low or abnormally high, the Institute's research involved data from 54 healthy mothers and 70 children who were born at term and were appropriate for gestational age. Pre-pregnancy BMI was based on data recorded during antenatal care and was correlated with BMI at follow-up of the children at 8.9±1.9 years.
Measurements at follow-up included the child's height, weight and DEXA-based body composition, insulin sensitivity, total cholesterol, HDL and LDL, insulin-like growth factor, leptin, adiponectin and blood pressure. Irrespective of whether mothers' pre-pregnancy BMI was normal or high, there was a positive correlation between pre-pregnancy BMI and BMI standard deviation scores in the children, and a negative correlation with insulin sensitivity. With each 1kg/m2 increase in maternal pre-pregnancy BMI there was a 4% decrease in offspring insulin sensitivity. Furthermore, greater maternal pre-pregnancy BMI was associated with higher daytime systolic blood pressure, night-time diastolic blood pressure and 24-hour mean arterial pressure in the offspring. These effects were independent of offspring adiposity. The authors conclude that higher maternal BMI prior to pregnancy (even among women with normal BMI) may contribute to increased risk of type 2 diabetes and other metabolic disease in offspring.
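The size of the reported effect can be illustrated with a back-of-envelope calculation. The sketch below assumes, purely for illustration, that the 4% decrease per 1kg/m2 compounds multiplicatively across BMI units; the study reports only the per-unit association, so the projection is hypothetical.

```python
# Illustrative only: projects the reported 4% decrease in offspring insulin
# sensitivity per 1 kg/m2 of maternal pre-pregnancy BMI, ASSUMING (for
# illustration, not from the study) a multiplicative per-unit effect.
def relative_insulin_sensitivity(bmi_increase_kg_m2, decrease_per_unit=0.04):
    """Fraction of baseline insulin sensitivity retained in offspring."""
    return (1 - decrease_per_unit) ** bmi_increase_kg_m2

# A 5-unit rise in maternal BMI (e.g. 24 -> 29 kg/m2):
retained = relative_insulin_sensitivity(5)
print(f"{retained:.2f} of baseline sensitivity retained")  # 0.96^5 ~ 0.82
```

Under that assumption, a five-unit difference in maternal BMI would correspond to offspring retaining roughly 82% of baseline insulin sensitivity.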

Read more:
Clin Endocrinol 2014 doi:10.1111/cen.12665 
Pediatrics 2014; 133: 863-71 
Diabetes Care 2008; 31: 1872-6

2015:02 Optimal postnatal growth trajectory for small for gestational age (SGA) babies

1502. Optimal postnatal growth trajectory for term small for gestational age (SGA) babies

Finally, after about 25 years, we have some insights and possible guidance regarding optimal growth for infants subjected to intrauterine growth restriction and potentially predisposed to cardiometabolic disease in adulthood. The issues raised 10 years ago in this series of summaries (0538) revolved around what healthcare providers should be doing about infant, childhood and adolescent growth if we were to protect an SGA infant from adult-onset disease. This is addressed in a study carried out by Chinese researchers but based on data from the Collaborative Perinatal Project (CPP), a prospective study involving 12 US academic centres between 1959 and 1976 that followed offspring from birth to 7 years of age. De-identified data from the project are publicly available through the US National Archives (www.archives.gov). In light of previous data showing that rapid postnatal catch-up growth in SGA infants is related to endothelial dysfunction and cardiometabolic disease, while persistently poor postnatal growth is associated with more-frequent infection, short stature and impaired cognitive development, the researchers interrogated the data in search of an optimal growth trajectory for SGA infants. Some 57 000 pregnancies were enrolled in the CPP which, after a number of exclusions, left 1957 SGA infants born at 37-43 weeks' gestation. Sophisticated statistical modelling identified five weight-growth trajectories. SGA infants with no catch-up growth (22.4%) had higher risks of infection in infancy, growth restriction, and low IQ at 7 years. Those with excessive catch-up growth (8.9%) had higher risks of overweight/obesity and elevated blood pressure at 7 years. Those with slow catch-up growth or regression after 4 months (31.4%) were also at risk of low IQ and growth restriction. Only babies with appropriate catch-up growth (37.3%) did not have increased risk of adverse outcomes.
Appropriate and optimal catch-up was further defined as fast catch-up to around the 30th percentile in the first few months, modest catch-up thereafter, and maintenance around the 50th percentile by 7 years of age. While these data provide some insights, they nevertheless only represent a hypothesis, i.e. one does not know whether it is foetal 'programming' or postnatal nutrition that is influencing the growth trajectory and outcome. What we need now is a formal nutritional study that follows these recommendations and observes whether adherence is related to good outcomes while deviations and 'undesirable' patterns result in adverse outcomes.

Read more:
J Pediatr 2015; 166: 54-8 
Rev Endocr Metab Disord 2012; 13: 141-7 
Int J Obes Relat Metab Disord 2002; 26: 214-9

2015:03 What are Sleep Coaches?

1503 What are Sleep Coaches?

No, this does not refer to forms of overnight transport in which one has the luxury of a fully-reclining seat or a bunk bed, but to a category of caregivers who deal with paediatric sleep disorders. Such disorders are extremely common, and in the United States there are advanced training opportunities that lead to certification in sleep medicine or behavioural sleep medicine. Paediatricians seeking board certification in sleep medicine must complete a one-year accredited fellowship and pass the subspecialty examination. Certification in behavioural sleep medicine is even more demanding, requiring a postdoctoral degree, a valid licence to provide mental health services, completion of a Society of Behavioral Sleep Medicine training programme or two years of clinical experience, and success in a certification examination. Demand for the services of these trained experts exceeds professional capacity, seemingly creating opportunities for alternative practitioners. The topic of 'sleep coaches' was covered recently by authors from Colorado and Wisconsin who performed an extensive internet search for the number and range of service providers in this category and their qualifications. Their search produced 102 individual websites offering sleep-coach services, with 60% clustered in 8 populous states such as California, New York and Texas. Forty-four percent did not report any post-secondary education, ~25% had a Bachelor's degree, 23% a Master's and 4% a PhD. Over half did not report any previous healthcare or educational experience. The price of consultations ranged from $84 for group therapy to >$300 for contact or phone/on-line consultations and >$1100 for overnight packages. Training as coaches ranged from an on-line application and telephone interview to a four-month online component followed by an examination and satisfactory management of three pro bono cases.
The authors appeal for "thoughtful guidelines" on how these providers might be optimally integrated into the care of children with sleep problems, noting such issues as when to refer to a more-qualified specialist. A quick search of South African websites for assistance with sleep disorders is somewhat reassuring in that the majority of providers, whether in solo or group practices, are registered with the Health Professions Council (HPCSA), with most also belonging to the national Society for Sleep Medicine. However, we also have providers who would be classified as sleep coaches, for whom the requirements to join the ranks relate mainly to being an independent, entrepreneurial business owner who is professional and caring and a good listener and communicator. Whereas the trained professionals in this field are regulated by and accountable to the HPCSA, formal oversight is lacking for others who are not specifically qualified and set up shop as consultants and therapists.

Read more:

J Pediatr 2015; 166: 487-9

American Board of Sleep Medicine http://www.absm.org/BSMSpecialists.aspx

The Sleep Lady http://www.sleeplaldy.com/press/   

2015:04 Diet and telomere length

1504. Diet and telomere length

Telomere biology is an exciting field that has received attention in the context of ageing, age-related diseases and assisted reproductive biology. Long telomeres have been associated with longevity, while short telomeres are associated with decreased life expectancy and increased risk of chronic diseases such as type 2 diabetes, cardiovascular disease, liver disorders and cancer. Given the current focus on childhood obesity and other determinants of adult disease, this subject is of interest to paediatricians. Telomeres prevent the loss of genomic DNA at the ends of chromosomes and protect their physical integrity. Attrition has been shown to be accelerated by oxidative stress and inflammation, but studies indicate that this attrition is modifiable. For example, there is substantial variability in the rate of telomere shortening that is independent of chronological age, suggesting that factors such as lifestyle and/or diet may play a role. Several independent studies have made use of the prospective multi-state Nurses' Health Study (NHS) that commenced in the US in 1976 and enrolled >120 000 subjects aged 30-55 years. Blood samples were drawn in 1989-1990, with leukocyte telomere length included among the investigations. Over the years several studies have used the database, showing positive relationships between telomere length and dietary fibre intake, physical activity (specifically calisthenics or aerobics) and a generally healthy lifestyle, while inverse relationships have been found for high energy intake (in men), unsaturated fatty acid intake and waist circumference. In contrast to studies which have focused on specific factors such as dietary fibre or polyunsaturated fatty acid intake, researchers from Boston and Seattle recently published results of a study that focused on the Mediterranean Diet but also looked at other eating patterns, classified according to the Alternative Healthy Eating Index, Prudent Diet, or Western Dietary Pattern.
This is a topical issue in South Africa, given the current debate between proponents of the Banting/Noakes/Atkins diets and supporters of the Mediterranean Diet. Data from the NHS allowed the researchers to adapt the Mediterranean Diet Score, measuring intake of vegetables (excluding potatoes), fruits, nuts, whole grains, legumes, fish, red and processed meats, and alcohol, and the ratio of saturated to monounsaturated fats. The study population included 4676 women ranging in age from 42-70 years. As expected, younger women had longer telomeres (p<0.001), as did those with the highest Mediterranean Diet scores, who also had lower BMI, smoked less, and were more physically active; however, none of these individual factors was significantly related to telomere length. The Alternative Healthy Eating Index showed a weaker but positive relationship with telomere length, but no relationship was found for the Prudent or Western Diets. The antioxidant properties of the study diet may explain the favourable influence on telomere length. While this cross-sectional study gives an indication of beneficial effects of the Mediterranean Diet, the next question is the extent to which positive changes in lifestyle and diet are able to slow telomere shortening in at-risk individuals.

Read more:
BMJ 2014; 349: g6674 
Lancet Oncol 2013; 14: 1112-20 
Am J Epidemiol 2012; 175: 414-22

2015:05 D-dimer as a marker of traumatic brain injury (TBI)

1505. D-dimer as a marker of traumatic brain injury (TBI)

Healthcare providers, whether in hospital emergency departments or rooms-based practice, are frequently confronted with the decision to observe vs. CT scan an infant or child presenting with a head injury. Summary 1108 of this series dealt with the impact of the Children's Head Injury Algorithm for predicting Important Clinical Events (CHALICE) clinical decision rule on the frequency of CT scans, showing that, contrary to expectation, the CT scan rate increased (from 3% to ~14%) after introduction of the rule. CHALICE calls for a scan based on history (e.g. loss of consciousness, seizure), examination (e.g. GCS <14, focal neurology) and mechanism of injury (e.g. high-speed motor vehicle accident, high-speed projectile), and was a positive indication for scanning in some 114 subjects (57 prospective, 57 retrospective) evaluated for TBI in a paediatric trauma centre in California between 2007 and 2008. Because of previous evidence of coagulopathy following TBI, D-dimer concentration was one of a number of tests carried out at the time of presentation. Plasma levels of D-dimer were associated with TBI on CT on both univariate and multivariate analysis (vs. other markers such as prothrombin time and PTT). D-dimer was also inversely related to the baseline Glasgow Coma Scale. A D-dimer level of <500pg/µl had a 94% negative predictive value for brain injury on CT, i.e. in children who meet typical clinical criteria for a CT scan after trauma, low plasma D-dimer suggests the absence of significant brain injury (and eligibility for discharge without a CT scan). While several studies of adult TBI have demonstrated the prognostic value of D-dimer levels, the latter study appears to be the only one involving children until a recent report from Pittsburgh that focused on D-dimer levels in infants and children, in particular those considered for a diagnosis of abusive head trauma (AHT).
Again, retrospective and prospective studies were performed to ensure that results in the one were confirmable in the other. The retrospective study involved 195 children <4 years of age (102 cases with TBI, 93 controls), while the prospective study included 44 children (20 cases, 24 controls). In both components the median D-dimer concentration was significantly higher in cases than in controls, while a receiver operating characteristic curve for the prospective group showed an impressive area under the curve of 0.91 (95%CI 0.83-0.99). At a cut-off level of 0.59µg/l the sensitivity and specificity for case identification were 90% and 75% respectively. In contrast to the other study, which identified the level below which a CT scan was not necessary, this study suggested that D-dimer could be used to identify which young children at risk of TBI, and particularly AHT, might benefit from CT.

Read more:
J Pediatr 2014; 166: 383-8 
J Emerg Trauma Shock 2013; 6: 180-5 
J Trauma 2010; 68: 1072-7

2015:06 Endoscopic vs. open repair of oesophageal atresia and congenital diaphragmatic hernia (CDH) 

1506. Endoscopic vs. open repair of oesophageal atresia and congenital diaphragmatic hernia (CDH)

Two quotations pertain to the above title and procedures: "Be not the first by whom the new are tried, nor the last to lay the old aside" (Alexander Pope, 1688-1744) and an old Chinese proverb, "Those that say it can't be done should get out of the way of those doing it." Such sentiments, combined with the downsizing of endoscopic instruments and advances in neonatal anaesthesia, have paved the way for paediatric surgeons to apply thoracoscopic surgery to oesophageal atresia (± fistula) and CDH in neonates (the latter also in the foetus as far back as 2001). However, debates continue as to if/when 'minimally invasive surgery' is appropriate for these two conditions. There is no question that results improve with the experience of the surgical team, and the point is made that while not uncommon, these conditions do not occur frequently enough for the average trainee surgeon to gain the necessary endoscopic experience. Two recent articles in the Journal of Pediatric Surgery make interesting reading: one draws attention to fairly profound changes in CO2 and pH during insufflation of the gas that is an essential concomitant of thoracoscopic surgery. In CDH repair in particular, the CO2 increased to 83mmHg (vs. 61mmHg in the open procedure) and pH fell to 7.13 (vs. 7.24 for the open procedure). It is stated that the hypercapnia originates from the procedure and is not of metabolic origin, and that absorption of the gas is greater during thoracoscopic surgery than during laparoscopic procedures. In this regard, however, one should be aware of the duration of surgery: in the second article, several series indicate that the endoscopic procedure takes longer than the open, with mean or median times extending to between 2 and 3 hours. The latter article also examines the pros and cons of the two approaches for the two intrathoracic conditions. In the case of oesophageal atresia there are more leaks with endoscopy and greater technical dexterity is required.
These ‘cons’ need to be weighed against the better cosmetic result and reduced risk of the scoliosis that is associated with open surgery. For repair of CDH, endoscopic surgery has a greater recurrence rate, risk of the abovementioned metabolic consequences and the need for greater technical dexterity. Benefits of the minimally invasive approach are better cosmesis and reduced risk of bowel obstruction. Units with sufficient numbers may well be moving towards less-invasive surgery, but even in such units there may be selection bias towards larger infants with a single anomaly. The answers once again lie in a sufficiently-powered controlled study in which surgeons equally skilled in both techniques randomize comparable neonates to one or other procedure, but given preferences of surgeons and their teams, this is unlikely to happen. 

Read more:
J Pediatr Surg 2015; 50: 240-6 and 247-9 
Ann Surg 2013; 258: 895-900 
J Pediatr Surg 2010; 45: 355-9

2015:07 Asymmetric dimethylarginine (ADMA) in infants with bronchopulmonary dysplasia (BPD)

1507. Asymmetric dimethylarginine (ADMA) in infants with bronchopulmonary dysplasia (BPD)

Summary 1505 dealt with the possibility of D-dimer levels assisting with the diagnosis of traumatic brain injury. Another circulating metabolite that may prove to be of clinical value is ADMA, a naturally occurring methylated analogue of L-arginine, the precursor of nitric oxide (NO). NO is an endothelium-derived signalling molecule that causes vascular smooth muscle relaxation and is central to the maintenance of low pulmonary vascular resistance. Inhaled NO has become an accepted intervention in the NICU for management of neonates with pulmonary hypertension (PH), but there is growing evidence that disturbances of endogenous signalling may be associated with the failure of some infants with PH to respond to exogenous NO. One such disturbance might involve ADMA, which is formed when arginine residues within proteins are methylated by arginine methyltransferases; subsequent proteolysis releases methylated arginines, including ADMA, which competes with L-arginine for the active site of nitric oxide synthase (NOS). When ADMA occupies NOS, the production of NO is inhibited and vascular dilatation is compromised. The balance between production of ADMA and its degradation by DDAH (dimethylarginine dimethylaminohydrolase) normally results in low levels of ADMA and relatively little inhibition of NOS, but levels are increased in cardiovascular, renal and critical hepatic disease. To test the hypothesis that ADMA would also be increased in preterm infants with BPD-associated PH, researchers from Ohio compared ADMA levels in 23 patients with BPD+PH vs. 95 patients with BPD without PH. Results showed that subjects with both BPD and PH had higher levels of ADMA than those with isolated BPD, and that the plasma arginine:ADMA ratio was lower in the PH cases than in the BPD-only controls.
These preliminary findings require confirmation but add to the literature supporting the use of ADMA as a biomarker of a variety of cardiovascular diseases and predictor of adverse events, poor prognosis and mortality. 

Read more:
J Pediatr 2015; 166: 222-3 and 230-3 
Eur Respir J 2008; 32: 503-12 
Vasc Med 2005; 10(Suppl 1): S19-25 

2015:08 Better air quality benefits childhood lung development

1508 Better air quality benefits childhood lung development

While South Africa might not have the smog and pollution problems of Beijing, there are nevertheless known pockets of industrialization in which concerns have been expressed about the impact of pollution on both childhood and adult illnesses. Several authors have described the link between exposure to air pollution and impaired lung function in children, which is associated not only with an increased risk of asthma but also with long-term effects such as chronic respiratory and cardiovascular disease in adults. California, which in the 1970s and 1980s had extremely high levels of pollution from petrol- and diesel-driven motor vehicles and emission-related physical and photochemical reactions, has through control strategies and legislation significantly improved the environment, thereby allowing researchers to monitor changes in successive cohorts of children. This is reported in a recent article in the New England Journal, showing beneficial effects in three groups of children aged between 11 and 15 years who were enrolled in the Children's Health Study and monitored between 1994-8, 1997-2001, and 2007-11. Lung function studies included FEV1 and FVC, while ozone, nitrogen dioxide and particulate matter of diameter <2.5µm (PM2.5) and <10µm (PM10) were measured in order to track air purity. Overall, 2120 children were tested and information was obtained regarding age, gender, ethnicity, health insurance status, parental education and the child's health conditions such as asthma and/or other respiratory conditions. Air quality improved significantly over the study period; for example, in the area with the highest initial levels, PM2.5 decreased from 31.5µg/m3 to 17.8µg/m3 and there were large declines in nitrogen dioxide. Linear regression models were used to examine the relationship between declining pollution levels over time and lung function development.
Over the 13 years covered by the study, declining levels of nitrogen dioxide and reductions in PM2.5 and PM10 were associated with improvements in the growth of both FEV1 and FVC (almost all with p<0.001). These improvements were irrespective of gender, presence of asthma or improvements in socioeconomic status over time. Furthermore, the percentage of children with low FEV1 (<80% of predicted value) declined from 7.9% to 3.6% as air quality improved. These changes were directly related to the improvements in air quality and were unrelated to physical growth/height between 11 and 15 years of age.

Read more:

N Engl J Med 2015; 372: 905-13

Thorax 2014; 69: 540-7

Am J Resp Crit Care Med 2002; 166: 76-84 

2015:09 Does early peanut consumption protect at-risk children from peanut allergy?

1509 Does early peanut consumption protect at-risk children from peanut allergy?

In Western countries there has been a doubling in the prevalence of peanut allergy over the past 10 years, with rates as high as 3% in some countries. In the USA the rate has quadrupled from 0.4% in 1997 to >2% in 2010. In such countries allergy to peanuts is now the leading cause of food-related anaphylaxis and death, resulting in many schools banning peanut butter and some airlines no longer serving peanuts. While it has been proposed that sensitization might involve early environmental exposure through the skin, oral exposure appears to be important in the development of immune tolerance. Several small studies have examined oral exposure to peanut in older children with established allergy, and while results have been promising, allergic reactivity has returned a few months after discontinuing the immunotherapy. Late introduction of peanuts into children's diets also does not prevent allergy, as evidenced by the retraction in 2008 of the previous recommendation from the American Academy of Pediatrics that parents should refrain from feeding peanuts to infants at risk of atopic disease until the age of 3. Further evidence that delayed exposure to peanuts is associated with a higher rate of allergy was provided by du Toit and colleagues in the UK, who showed that peanut allergy was 10 times more prevalent in non-exposed Jewish children in London than in Jewish children in Israel who consumed peanut products before their first birthday. An expanded investigative team has now published results of their LEAP study (Learning Early About Peanut Allergy) that enrolled 640 at-risk infants with severe atopy (eczema and/or egg allergy) at a mean age of 7.8 months and tested for peanut allergy by skin-prick test (SPT). Subjects were randomised to peanut avoidance or consumption groups. Consumption consisted of 6g of peanut protein per week until 60 months of age. The primary outcome was the proportion of participants with evidence of peanut allergy at 60 months.
Intention-to-treat (ITT) and per-protocol (PP) analyses were performed. Among 520 infants in the ITT population with initially negative SPT, the prevalence of peanut allergy at 60 months was 13.7% in the avoidance group and 1.9% in the consumption group (p<0.001). Among the 98 with initially positive SPT, the prevalence of peanut allergy was 35.3% in the avoidance group and 10.6% amongst consumers (p=0.004). The authors conclude that early introduction of peanuts into the diet of at-risk children significantly decreases the frequency of peanut allergy. Further studies are required to answer questions such as whether all infants should receive peanuts within the first year of life; if so, at what 'dose' and frequency; and whether tolerance will persist if/when peanut ingestion ceases.
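The size of the LEAP effect in the initially SPT-negative group can be expressed as standard effect measures derived directly from the two reported prevalences (13.7% vs. 1.9%); the arithmetic below is a back-of-envelope sketch, not an analysis reported by the trial.

```python
# Effect sizes computed from the reported ITT prevalences in the initially
# SPT-negative group (illustrative arithmetic, not the trial's own analysis).
avoidance, consumption = 0.137, 0.019

arr = avoidance - consumption   # absolute risk reduction
rrr = arr / avoidance           # relative risk reduction
nnt = 1 / arr                   # number needed to treat

print(f"ARR={arr:.3f}  RRR={rrr:.0%}  NNT~{nnt:.1f}")
```

In other words, early consumption was associated with an absolute risk reduction of about 11.8 percentage points (a relative reduction of roughly 86%), i.e. about one case of peanut allergy averted for every eight to nine at-risk children who consumed peanuts.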

Read more:

N Engl J Med 2015; 372: 803-13 and 875-7

J Allergy Clin Immunol 2008; 122: 984-91

Pediatrics 2000; 106: 346-9


2015:10 Serum lactate as an indicator of injury severity in paediatric trauma patients


1510 Serum lactate as an indicator of injury severity in paediatric trauma patients

Two recent summaries (1505, 1507) deal with circulating molecules that may assist with the diagnosis and severity assessment of specific conditions in children. To these one may perhaps add a study from North America that sought to use serum lactate as a predictor of significant injury in children. In the USA paediatric trauma has been estimated to account for $51 billion annually, while the global burden is close to $500 billion. Hospital admission and investigative costs contribute to the expenditure and there is thus great interest in diagnostic algorithms, severity scores and laboratory investigations that assist in the process of triage. Two factors have been shown to confound the evaluation of traumatic injury in children, in whom a) a pliable musculoskeletal system and the close proximity of internal organs conspire to produce internal injuries with minimal external signs; and b) there is a greater capacity than in adults for haemodynamic compensation, so that haemodynamic parameters and fluid responsiveness are less predictive of injury severity. That having been said, the researchers nevertheless set out to explore whether lactate, a well-established early indicator of severity and outcomes in adults, had similar predictive value in children. The study group included 236 children with a mean age of 9.2±4.7 years seen in an urban trauma centre between 2011 and 2013. Biochemical parameters of interest were serum lactate, pH and base deficit. Lactate of >2mmol/l was considered elevated, a base deficit greater than -5mmol/l was considered high, and pH below 7.30 was defined as acidosis. Injury severity was classified according to the Injury Severity Score (ISS), with >15 regarded as 'injured.' Injuries were mostly due to motor vehicle collisions, pedestrian injuries and falls.
ICU admission was necessary in almost one-third of patients, intubation in 14%, and 22% underwent a major procedure; however, there were only four deaths. Some 47% had a lactate of >2.0mmol/l on admission and as a group had longer hospital stays than the normal-lactate group (6.9 vs. 3.3 days), lower GCS scores (12.7 vs. 14.7), higher ISS (12.8 vs. 4.6) and an increased need for ICU admission, intubation and major procedures. The mean lactate level of the four patients who died was 13.0±6.7mmol/l. However, it was found that the definition of an elevated lactate (>2.0mmol/l) was probably too liberal, lacked adequate sensitivity and specificity, and was only of value in identifying a low-risk patient if below 2.0. On the other hand, a lactate of >4.7mmol/l was strongly suggestive of severe injury. The pH was also useful in predicting severe injury and would likely have performed better had the definition of acidosis been stricter, e.g. around 7.25-7.27 rather than 7.30.
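The cutoffs discussed above amount to a simple banding rule: lactate >4.7mmol/l strongly suggests severe injury, lactate <2.0mmol/l (with a normal pH) mainly helps identify low-risk patients, and the intermediate zone is indeterminate. The sketch below formalises that reading of the summary; it is illustrative only and is not a validated clinical decision rule.

```python
# Illustrative triage sketch using the cutoffs discussed in the summary;
# NOT a validated clinical rule. Lactate in mmol/l; pH <7.30 = acidosis.
def lactate_risk_band(lactate, ph):
    """Rough risk band implied by the reported cutoffs."""
    if lactate > 4.7:
        return "high"           # strongly suggestive of severe injury (ISS >15)
    if lactate < 2.0 and ph >= 7.30:
        return "low"            # sub-2.0 lactate mainly useful to identify low risk
    return "indeterminate"      # 2.0-4.7 lacked adequate sensitivity/specificity

print(lactate_risk_band(13.0, 7.05))  # the non-survivors' mean lactate -> "high"
```

A stricter acidosis threshold (around 7.25-7.27, as the authors suggest) could be substituted for 7.30 in the same structure.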

Read more:

J Pediatr Surg 2015; 50: 598-603

J Trauma 1974; 14: 187-96

Injury 2009; 40: 104-8   

2015:11 Bacterial lysates (BLs) to prevent infections and allergy in children 

1511 Bacterial lysates (BLs) to prevent infections and allergy in children

It is stated in a review article from Poland that bacterial lysates have been used empirically in Europe for 100 years to prevent respiratory infections in children.  Details are not provided as to how the lysates were prepared or presented to the host at the turn of the 20th century, but from what we now know in the 21st century there appears to be a basis for this form of treatment.  Bacterial lysates are mechanically fractionated or chemically degraded lyophilized cells derived from pathogenic bacterial strains responsible for respiratory tract infections (RTIs).  OM-85BV, the compound most-frequently cited in research, is commercially available as Broncho-Vaxom, Broncho-Munal, Ommunal, Paxoral and Vaxoral, and contains extracts from eight bacterial species including Staph aureus, Pneumococcus, Klebsiella and H influenzae.  Bacterial lysates can be administered orally, intranasally or sublingually, the latter two routes activating local mucosal immunity at the level of the respiratory epithelium and submucosa.  Oral administration of BL is thought to activate dendritic cells by acting as ligands for toll-like receptors (TLRs).  Following absorption into the Peyer’s patches and presentation of the antigens to the dendritic cells, antigen-specific T-lymphocytes are generated that modulate B-cell isotype switching to IgA by the secretion of cytokines including TGF-β, TNF and a proliferation-inducing ligand.  Lymphocytes, lymphoblasts and activated dendritic cells migrate to mesenteric nodes for maturation, then circulate through the blood, thoracic duct and lymph before depositing in distant mucosa-associated lymphoid tissue.  There is also an immunoregulatory response within the latter tissue that appears to be mediated in part by dendritic cell-specific adhesion molecules that affect living bacteria by inducing regulatory T cells and suppressing T-helper cell activity.  
Bacterial lysates have been used to decrease the number and frequency of acute and recurrent RTIs. A Cochrane systematic review and meta-analysis and a further systematic review both showed that BLs were effective in reducing the frequency of infection by ~40%.  Studies have also shown an effect of BLs on the duration of symptoms, the need for antibiotic treatment and, in a group of children with IgG subclass deficiency, a decrease in infectious episodes.  Lysates have also been shown to be effective in children with allergy: for example, in a placebo-controlled study, daily BL for three months decreased wheezing by almost 40% and shortened obstructive episodes.

Read more:

Ann Allergy Asthma Immunol 2015, http://dx.doi.org/10.1016/j.anai.2015.02.008

J Allergy Clin Immunol 2001; 69: 3719-27

Curr Opin Allergy Clin Immunol 2013; 13: 293-5     

 

2015:12 Outcomes for infants born with congenital diaphragmatic hernia (CDH) 

1512 Outcomes for infants born with congenital diaphragmatic hernia (CDH)

Summary 1507 in this series discussed the perceived imperative for budding neonatal surgeons to gain endoscopic experience in repairing CDH and oesophageal atresia. The discussion was presented in a manner that suggested that the fundamentals of medical management are under control, so now it is time to focus on surgical management.  But do we fully appreciate the effects of changes in medical/supportive management over the years?  In a review of the situation in a neonatal unit in Utah, researchers analysed data for 192 of 260 infants treated between 1998 and 2013. Exclusions were for delayed diagnosis, associated congenital anomalies with poor prognosis, surgery elsewhere prior to transfer, failed early stabilization, or declined consent.  Four treatment epochs were defined: Era 1 (1998-9): before availability of inhaled nitric oxide (iNO), ECMO, prostaglandin-I2 and milrinone; Era 2 (2000-2): iNO available but the rest not; Era 3 (2003-4): iNO and ECMO available, others not; Era 4 (2005-13): all available.  In addition to the above changes, since 2007 written guidelines for management of CDH have included gentle ventilation (high frequency oscillation, tolerating moderate hypercarbia, maximum mean airway pressure of ≤16cm H2O) and treatment aimed at minimizing pulmonary hypertension, including maintaining preductal oxygen saturation of >90% and giving PGI2 for pulmonary vascular smooth muscle relaxation or milrinone, a phosphodiesterase-3 inhibitor which dilates the pulmonary vasculature and also improves cardiac function in both systole and diastole.  Introduction of these various modalities was empirical and results had not been formally analysed at any of the points at which new modalities were introduced. Logistic and linear regression were used in the analysis. The primary outcome of death prior to discharge was analysed by Cox regression modelling.  
Age at discharge tended to increase over the 4 eras (median 32 days in Era 1 to 50 days in Era 4) but birth weights, ventilator days and oxygen exposure were similar.  Overall survival was 81% (which increased to 88% if the 15 patients not eligible for surgery were excluded). Over the period iNO use increased from 38% to 80% while ECMO use remained around 25% throughout. Milrinone use was at ~50% whereas PGI2 use declined from 43% to 15%.  There were no differences in hazard ratios for death across the four eras despite the results suggesting treatment of sicker cohorts of patients between the early and later eras.

Read more:

J Pediatr Surg 2015; 50: 524-7

Neonatology 2012; 102: 130-6

Pediatr Perinat Epidemiol 2011; 25: 144-9

 

2015:13 Pertussis immunization and risk of sudden infant death syndrome (SIDS) 

1513 Pertussis immunization and risk of sudden infant death syndrome (SIDS)

 To a large extent SIDS has dropped ‘below the radar’ since the widespread shift in infant sleeping position from prone to supine. However in the USA approximately 2000 infants still die each year with a diagnosis of SIDS.  Concerns about SIDS are often linked to concerns about immunization, with opponents of this preventive health strategy often resorting to (unfounded) claims that pertussis immunization is linked to neurological disease (specifically autism) and SIDS. ‘Support’ for the latter was largely the result of both SIDS and pertussis immunization (as DPT) occurring at around the same stage of infancy.  Two subsequent meta-analyses, mainly involving case-control studies, have shown odds or risk ratios of 0.54 and 0.67 i.e. DTP immunization was associated with a reduced risk of SIDS. The relationship has recently been explored in even greater detail using US national databases that contained data on cause of death, completeness of immunization and sleeping position.  A stated weakness of the study was the need to use different databases over the period which overall ran from 1968 until 2009.  SIDS mortality rates increased significantly from 1968 to 1971 (>27% per annum), by 47% between 1971 and 1974, by only 3% over the subsequent 5 years, and remained constant thereafter.  However much of the increase during the 1970s was the result of changes in the ICD coding system that introduced a specific code for SIDS.  The decline in SIDS cases during the 1990s has largely been attributed to ‘Back to Sleep’ campaigns, but pertussis immunization rates also changed during the three decades covered by the study under review.  Concerns about adverse neurological reactions led to immunization coverage falling from 75% in 1975 to 64% in 1985, but following publication of a report by the Institute of Medicine which showed the absence of neurological complications, the rates picked up again.  
In the study under review immunization coverage dropped from percentages in the mid-70s in 1969 to the mid-60s in 1985, but picked up thereafter, reaching 90+% by 1993.  Respondents in the survey on sleeping position indicated that by 2000 some 75% of infants were sleeping supine.  Multivariate regression analysis showed an inverse relationship between DTP immunization and SIDS rates and also between supine sleeping and SIDS. Because DTP was co-administered with polio vaccine it is difficult to attribute a possible protective effect on SIDS to either vaccine alone, but the aforementioned meta-analyses studied DTP±OPV and found similar effects.  If there is indeed a relationship the authors suggest that DTP may protect against pertussis infection, and that in the absence of immunization some cases of SIDS are undiagnosed pertussis with nocturnal gasping inducing upper airway obstruction.  The value of this study perhaps lies more in providing parents with a good reason to immunize than in trying to establish the underlying physiology.

Read more:

BMC Pediatrics 2015; 15: 1

Vaccine 2007; 25: 4875-9

 Arch Dis Child 1988; 63: 41-7

 

2015:14 Should adolescent boys be vaccinated against human papillomavirus (HPV)? 

1514 Should adolescent boys be vaccinated against human papillomavirus (HPV)?

Given national cervical cancer rates in South Africa there is a clear case for introduction of routine HPV immunization in females. Notwithstanding the fact that a successful programme for teenage girls would result in adequate herd immunity and protection for heterosexual males as well, statisticians and epidemiologists are now turning their attention to potential benefits of males also being immunised. This is to some extent because female uptake rates may need to be >80% to limit the need for male vaccination and such rates of uptake are not easy to achieve.  In western countries the incidence of HPV-related oropharyngeal carcinoma (HPV-OPC) is increasing more rapidly among men than among women, and it is projected that by 2020 OPC will be the most common HPV-related cancer in the USA (with cervical cancer rates assumed to be falling due to immunization programmes).  Based on literature, national data and 2000-2010 data from a major Ontario cancer centre, a Markov model was populated for vaccine efficacy, oral-HPV-infection development rate, HPV-OPC development rate, disease-specific survival from HPV-OPC, all-cause mortality, and direct cost data for patients treated in Ontario for OPC between 1997 and 2007.  Five health states were identified ranging from well through HPV-infection, cancer, survival to death, and movement between these states was determined by transition probabilities obtained from the various data sources.  The model assumed a cohort of well 12-year olds who could acquire oral HPV infection and enter the HPV-infected state. Most of these infections clear naturally but a minority persist, with some persisting until death and others going on to HPV-OPC and movement between the various states of cancer, survival and death (from OPC or other cause).  For the exercise and based on a conservative Canadian estimate, risk of HPV-OPC was set at 2/100 000 (0.002%).  
The cohort of 12-year-olds was based on the 2012 Canadian population, with the model designed to compare the costs and effectiveness of HPV vaccine against the current standard of care (no vaccine).  Assuming 99% vaccine efficacy and 70% uptake, the vaccine produced 0.05 more Quality Adjusted Life Years (QALYs) and saved $145 (Canadian) per individual. At 50% efficacy and 50% uptake the outcome was 0.023 more QALYs and $42 saved.  This translates to between $8m and $28m for the cohort and as such is regarded as cost effective, recognizing that the cohort of 12-year-olds is repeated each year.  The savings would be greater if indirect costs/gains (e.g. related to productivity) had been included.  On the other hand, while policy makers might be persuaded by these elegant projections and predictions, one needs to acknowledge that it takes many years before the costs of the programme are offset by the benefits, that treatments may change (becoming more or less expensive), that herd immunity from female vaccination may improve, and that opportunity costs must be considered.
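The per-cohort figure follows directly from the per-person savings. As a rough check (the size of the annual Canadian 12-year-old cohort used here, ~195 000, is an illustrative assumption, not a figure given in the summary):

```python
# Back-of-envelope check of the per-cohort savings quoted above.
# The cohort size (~195 000 Canadian 12-year-olds) is an assumption.
def cohort_savings(savings_per_person, cohort_size=195_000):
    """Scale per-person savings (Canadian $) to the annual cohort."""
    return savings_per_person * cohort_size

low = cohort_savings(42)    # 50% efficacy, 50% uptake scenario
high = cohort_savings(145)  # 99% efficacy, 70% uptake scenario
print(f"${low / 1e6:.1f}m to ${high / 1e6:.1f}m")
```

Scaling $42 and $145 per person to a cohort of that size reproduces the quoted $8m-$28m range.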

Read more:

Cancer 2015 doi: 10.1002/cncr.29111

JAMA 2012; 307: 693-703         

N Engl J Med 2007; 356: 1928-43

 

2015:15 Therapeutic hypothermia (TH) after paediatric cardiac arrest

1515 Therapeutic hypothermia (TH) after paediatric cardiac arrest

Paediatric cardiac arrest is a relatively uncommon condition; for example it is stated that there are only ~16000 out-of-hospital occurrences in the US each year and only 2% of children in American paediatric ICUs have/have had an in-hospital cardiac arrest (IHCA).  Causes of out-of-hospital cardiac arrest (OHCA) range from disorders of cardiac conduction to trauma and near-drowning, while in-hospital events would also include post-operative congenital heart cases, medical conditions with multi-organ failure, overwhelming infections etc.  Published guidelines continue to be reviewed in terms of when to withhold or terminate resuscitation and how to treat in hospital. In terms of the former, a 2014 article in Pediatrics authored by American Colleges, an Association and an Academy included a statement that “it is reasonable to withhold resuscitative efforts in a child under circumstances of decapitation and rigor mortis!” The article also proposes that if resuscitation has exceeded 30 minutes and the nearest facility is more than 30 minutes away, then death or poor outcome is inevitable.  As for in-hospital management, largely based on neonatal and adult data, TH has been recommended after cardiac arrest-related coma in children by bodies such as the American Heart Association/ILCOR.  But paediatric/adolescent cardiac arrest is a complex field with significant differences in outcome for IHCA and OHCA. For example survival after OHCA is between 2-12% vs. 27-51% for IHCA. Infants have the highest mortality whereas children and adolescents appear to do better than adults.  Poor neurological outcomes among survivors of OHCA are common (~76%) vs 24-53% for IHCA.  A systematic review published in 2015 does not provide much guidance in terms of TH as a treatment modality, concluding that the AHA/ILCOR recommendations are not supported since TH does not appear to provide an advantage over normothermia.  
However it must be noted that the few studies that could be included in the systematic review were not of high quality, were limited in numbers, included IHCA and OHCA cases, had wide age ranges and included conditions such as underlying congenital heart disease.  The review looks forward to answers from the large, multicentre THAPCA (Therapeutic Hypothermia After Pediatric Cardiac Arrest) trial that is funded by the National Heart, Lung, and Blood Institute and is currently underway with the goal of enrolling 900 patients.  A preliminary result presented at the 2015 Annual Meeting of the Pediatric Academic Societies (PAS) included 260 comatose OHCA patients from 38 collaborating centres and concluded that in this group, while TH showed a trend towards improved survival and neurobehavioural outcome at 1 year, the results were not significant (TH 27/138 vs. 15/122 treated with normothermia: p=0.13). However it is worth noting that with the same rates but double the sample size the result would be significant at p=0.02.
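The effect of doubling the sample size can be illustrated with a standard chi-square test on the 2×2 outcome table. This is a sketch using only the Python standard library; without a continuity correction it gives p≈0.11 rather than the reported 0.13 (which presumably derives from a slightly different test), but the sample-size point stands:

```python
import math

def chi2_p(a, b, c, d):
    """Two-sided p-value for a 2x2 table [[a, b], [c, d]] via the
    chi-square test (df=1, no continuity correction). For df=1 the
    tail probability is erfc(sqrt(chi2 / 2)), so no scipy is needed."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return math.erfc(math.sqrt(chi2 / 2))

# THAPCA interim figures: 27/138 good outcomes with TH vs 15/122 with normothermia
p1 = chi2_p(27, 138 - 27, 15, 122 - 15)
# Same event rates, double the sample size
p2 = chi2_p(54, 222, 30, 214)
print(round(p1, 2), round(p2, 2))  # roughly 0.11 and 0.02
```

Doubling every cell count doubles the chi-square statistic while leaving the ~20% vs ~12% event rates unchanged, which is why the same difference becomes significant at roughly p=0.02.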

Read more:

Paediatr Emer Care 2015; 31: 296-303

Resuscitation 2015; 92: 19-25

Crit Care Med 2009; 37: 2259-67 

2015:16 Incorporation of a parental decision aid instrument into management of high-risk neonates 

1516 Incorporation of a parental decision aid instrument into management of high-risk neonates

Calls for instruments such as management guidelines, decision trees and algorithms are loudest in areas in which there is least agreement among clinicians as to what is best practice or standard of care. One such area is the management of a patent ductus arteriosus (PDA) in the extremely low birthweight (ELBW) infant.  Several decades down the line there is still debate as to whether prophylactic closure is better than symptomatic intervention, and neonatal units base decisions on a combination of their clinicians’ experience and scientific evidence (given that one hopes that the former includes a measure of the latter). However for some time a third element has been recommended for inclusion in the decision-making process, that of parental engagement and ultimately parental choice.  In this regard clinicians in Saudi Arabia tested their published decision aid instrument (DAI) on three groups of women in order to establish whether prophylaxis or symptomatic treatment would be the preferred option.  The groups comprised well pregnant women at 23 to 28 weeks’ gestation attending the antenatal clinic, undelivered women with problems at the same gestational age who were hospitalized in the antenatal wards, and women whose ELBW infants had been admitted to the neonatal ICU.  The DAI provides facts on prematurity, its potential complications, and scientific information about the use of indomethacin for PDA as prophylaxis or symptomatic treatment.  Consequences such as PDA ligation, BPD, oliguria, IVH and long term neurosensory outcome are covered.  Participants were well-educated and computer- and internet-literate, and ~40% were employed.  The authors state that a selection bias for prophylaxis was minimized yet, perhaps contrary to expectation, 82% of the 298 subjects opted for a prophylactic indomethacin strategy vs. symptomatic treatment.  
There are several limitations to the study: first, only 6% of the mothers actually had ELBW infants in the NICU at the time, while 75% were healthy women for whom the exercise was completely theoretical. Second, only maternal opinion was canvassed; and third, one might perhaps question the adequacy of the DAI because in the final analysis neurosensory outcome did not appear to be regarded as particularly important.  Also, in the group opting for symptomatic treatment, oliguria appeared to be perceived as more serious than IVH.  While one might ask whether it is fair in a situation of uncertainty around treatment options to effectively say to parents “We don’t know what to do, so please will you decide,” there is a definite place for parental involvement.  This DAI and others should therefore be reviewed and considered for inclusion in the NICU decision-making process, perhaps after appropriate modifications and adaptations for particular cultures and local use.       

Read more:

BMC Pediatrics 2015; 15: 47   and 2011; 11: 78

Cochrane Database Syst Rev 2010; 5: CD006732   and  2009; 3: CD001431

Clin Genet 2007; 72: 208-17

 

2015:17 More on management of asphyxiated neonates 

1517 More on management of asphyxiated neonates

The previous two summaries in this series (1515, 1516) highlight some of the difficulties encountered in decision-making in the intensive care unit.  Indeed, particularly in managing critically ill patients with questionable outcomes it is highly desirable to have unit or ward policies that attending staff are obliged to follow.  For example, not too long ago newborns of <1000 g were not ventilated in South African public hospitals, some units followed a policy of not ventilating children who had ingested paraffin, and many units stop resuscitation of asphyxiated neonates if there is no heartbeat after 10 minutes.  The problem with such policies, adopted where prognosis is poor, is that they are not only self-fulfilling, but also deny the possibility of outcomes changing as a result of new discoveries, technologies and procedures.  However it is not always easy to decide on a new approach if it is found that the old one is flawed. For example, while guidelines from ILCOR (International Liaison Committee on Resuscitation), the Australian Resuscitation Council (ARC) and the American Academy of Pediatrics justify stopping if there are no signs of life after 10 minutes of continuous and adequate resuscitation in a newborn with no heart rate, in a recent paper from Australia the authors comment that these guidelines predate the era of therapeutic hypothermia.  Newer studies are cited indicating that outcome is not universally bad e.g. one study showed that 5 of 24 survived “without moderate-to-severe disability.” The Australian group treated 13 infants between 2007 and 2013.  Treatment was withdrawn in 8 because of severe encephalopathy on clinical examination, electrocortical inactivity on EEG and extensive damage on MRI.  Only one of the survivors had severe spastic quadriplegia while another was developmentally normal but suffered from bilateral deafness.  
Their results are both helpful and unhelpful: a) they show that outcome is not universally bad but b) they raise the question of whether a successful outcome in 3 of 13 justifies the intervention; and c) they implicitly ask whether there are modalities that would help to identify the successful 3. In this regard another recent article from Canada reports that MRI scans performed in such infants at 2-3 days of age during hypothermia correlate 100% with scans done at 10 days of age. As such one might be moving towards early prediction, even though this might be in the direction of identifying the patients without damage who will do well rather than being able to predict the ones with damage whose outcome might be more equivocal.

Read more:

Arch Dis Child Fetal Neonatal Ed 2015; 0: F1-3,  2015;100:F238-242, and 2014; 100: F102-5

Am J Obstet Gynecol 2007; 196: 463 e1-5

Arch Pediatr Adolesc Med 2011; 165: 692-700

 

2015:18 What do we know about e-cigarettes in relation to children? 

1518 What do we know about e-cigarettes in relation to children?

The simple answer to the above question is probably ‘very little,’ but perhaps this is a topic one should be considering.  Recent reports in the lay press have highlighted two issues: 1) Dr Derek Yach, Director of the Vitality Institute, is on record as stating that e-cigarettes are a welcome innovation and health risks have been exaggerated; and 2) at least one infant has died in the USA after drinking from what is thought to have been an e-cigarette refill bottle. Importantly, in relation to the latter case, many e-cigarette refills are flavoured (e.g. mint, candy, chocolate) and may be mistaken by children for sweetened juices.  The concentration of nicotine in the refills varies considerably, from <20mg/ml to >100mg/ml. Based on an accepted toxic dose in children of 1.4mg/kg, a French study has estimated that a child would need to ingest <1ml of liquid from a refill cartridge with a concentration of 16mg/ml i.e. one at the lower end of the range.  In South Africa the Minister of Health has repeatedly stated that he wishes to ban e-cigarettes; meanwhile the official standpoint is that as a nicotine substitute they are only available on prescription from pharmacies.  The extent to which access is restricted is not known, particularly since products are freely available online from local distributors.  Countering the media report of the fatality in New York State is a review article from Texas that covered paediatric exposures reported to poison centres between 2010 and 2014. Over the period a total of 203 cases were reported in children of <5 years of age (2 in 2010, 5 in 2011, 20 in 2012, 70 in 2013 and 106 within the first 6 months of 2014).  The dramatic increase mirrors the increase in sales and utilization of e-cigarettes, which now represent a multi-billion dollar industry.  Both sexes were equally affected and three-quarters were <3 years old at the time of exposure.  Ingestion accounted for 93% of cases (the others being inhalation or dermal exposure).  
Eleven percent were classified as having serious consequences, mainly vomiting, followed by lethargy/drowsiness and coughing/’choking.’ One patient was reported to have stopped breathing on the way to the hospital but recovered without formal resuscitation.  Treatment was mainly by ‘dilution, irrigation, washout, food/snack’ i.e. ‘decontamination.’ Activated charcoal was administered in only 10 cases.  While these results might be reassuring in terms of the relative safety (i.e. non-lethality) of ingested nicotine in infants and young children, they nevertheless raise awareness of an apparently-increasing and worrying trend.
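The French toxic-dose estimate quoted above reduces to simple arithmetic. In this sketch the 10 kg body weight is an illustrative assumption (roughly a one-year-old); the 1.4 mg/kg toxic dose and the 16-100 mg/ml concentration range are from the summary:

```python
# Volume of e-cigarette refill liquid containing a toxic nicotine dose.
# Toxic dose 1.4 mg/kg (from the summary); 10 kg body weight is assumed.
def toxic_volume_ml(weight_kg, conc_mg_per_ml, toxic_dose_mg_per_kg=1.4):
    """Refill volume (ml) delivering the toxic dose for a given weight."""
    return toxic_dose_mg_per_kg * weight_kg / conc_mg_per_ml

print(toxic_volume_ml(10, 16))   # low-concentration refill: under 1 ml
print(toxic_volume_ml(10, 100))  # concentrated refill: a fraction of that
```

Even at the lowest concentrations a toddler-sized sip is potentially toxic, and at 100 mg/ml the dangerous volume is a small fraction of a millilitre.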

Read more:

J Emerg Med http://dx.doi.org/10.1016/j.jemermed.2014.12.073

CJEM 2015 Apr20:1-5 [Epub ahead of print]

Clin Toxicol (Phila) 2014; 52: 542-8      

 

2015:19 And e-cigarettes and the foetus? 

1519 And e-cigarettes and the foetus?

The global interest in and uptake of e-cigarettes as a ‘healthy alternative to smoking’ is reflected in the fact that the Oxford Dictionaries’ 2014 Word-of-the-Year was “vape” - used as a verb to describe the inhalation and exhalation of e-cigarette vapours or as a noun to describe the e-cigarette itself.  If nothing else, unless the product is outlawed we are at the beginning of a new era in paediatric research.  The preceding summary covers aspects of poisoning in infants and small children, but what do we know about the foetus?  Certainly we have known for decades that maternal smoking is linked to intrauterine growth retardation and adverse outcomes, but it turns out we don’t really know exactly what role each of the potentially-noxious agents plays as mothers inhale and absorb tobacco smoke.  While the e-cigarette proponents regard nicotine as safe, or at least safer than tobacco, there are those who believe we are ignoring the potential harm to the foetus given the evidence that nicotine is a developmental ‘toxicant.’  Nicotine crosses the placental barrier and binds to nicotinic acetylcholine receptors (nAChRs) which are widely expressed throughout the foetal nervous system, and which regulate brain maturation.  Animal models provide compelling evidence that exogenous nicotine has detrimental effects on neurodevelopment due to cell damage, reduced cell number, impaired synaptic activity, accelerated change from cell replication to differentiation, and initiation of apoptosis.  In humans, smokeless tobacco use during pregnancy has been shown to have a modest effect on birthweight but an increased risk of preterm birth and stillbirth, and neonatal apnoea rates similar to those in infants of cigarette smokers. Such data suggest that mothers should be discouraged from smoking during pregnancy, whether they are smoking e- or regular cigarettes. 
But then we have data from a study that explores outcome of offspring at two years of age after women received nicotine patches vs. placebo during pregnancy in order to assist with smoking cessation.  Slightly more than 1000 women were enrolled at 12-24 weeks of pregnancy and randomly assigned to study and control groups.  Post-partum smoking cessation rates among mothers were similar, but the data suggest that the mothers who received the nicotine patches (for up to 8 weeks) did smoke less during the second trimester.  As for developmental outcome at two years of age, those exposed to nicotine were “more likely to have unimpaired development.” This is not to say that nicotine is beneficial to the foetus, but is perhaps due to reduced foetal exposure to maternal cigarette smoking in the nicotine-patched group.  Are these first-world issues that should not be considered in our country? Perhaps, but with our rapidly emerging middle class and youth susceptible to global trends, these issues should probably not be completely ignored.

Read more:

Am J Prev Med 2015 http://dx.doi.org/10.1016/j.amepre.2015.01.015

Lancet Respir Med 2014; 2: 728-37

Nat Rev Neurosci 2009; 10: 303-12     

 

2015:20 Surfactant-budesonide combination to prevent chronic lung disease in preterm infants?

1520 Surfactant-budesonide combination to prevent chronic lung disease in preterm infants?

While the survival of extremely preterm infants has increased over the past decades, chronic lung disease/bronchopulmonary dysplasia (BPD) remains a significant complication and cause of morbidity and mortality. Lung inflammation and host responses have been implicated in the pathogenesis of BPD, and anti-inflammatory agents such as steroids have long been the focus of preventive strategies. Systemic administration has been associated with adverse sequelae in recipients, so several studies have evaluated whether outcomes are better if steroids are administered by inhalation or tracheal instillation.  The comparisons have been subjected to several Cochrane reviews, with the most recent comment being a) that inhaled corticosteroids offer no advantage over systemic steroids in the management of ventilator-dependent preterm infants; and b) that on the basis of available evidence, neither inhaled nor systemic steroids can be recommended as standard treatment for ventilated preterm infants.  But what if there is a more effective way of delivering the steroid to the lungs? In this regard a recent article from Taiwan redirects attention to a 2008 report.  The former article shows how a combination of surfactant+budesonide resulted in significantly better pulmonary distribution of a fluorescent dye in the lungs of mice than when the dye was introduced on its own or in combination with either surfactant or budesonide alone.  The article and an accompanying editorial refer to a pilot study from 2008 in which 116 VLBW neonates with severe respiratory distress syndrome were randomized to receive either surfactant alone or the surfactant-budesonide combination.  Infants in the combination treatment group required lower mean airway pressure on days 1 and 3 and had better oxygenation and ventilation indices than the surfactant controls.  
More infants in the combined treatment group were extubated by 2 weeks of age, and the combined outcome of death by 36 weeks postconceptional age or chronic lung disease was significantly lower (19/60 vs 34/56).  This same article is highlighted in a very recent online issue of Neonatology, again appealing for an adequately powered, ideally multicentred trial that would establish whether this form of therapy does indeed offer improved outcomes for the large number of ELBW and VLBW infants who despite advances in neonatal care are still at risk for BPD.

Read more:

Pediatrics and Neonatology 2015; 56: 19-24

Neonatology 2015; 107: 358-9

Pediatrics 2008; 12: e1310-8

 

2015:21 Timing of bilateral cochlear implants in children 

1521 Timing of bilateral cochlear implants in children

Irrespective of whether one’s patient is in the private or public sector in South Africa the mention of cochlear implants is met with a response about affordability. However the matter must be seen in perspective. A 2012 World Health Organization report on the prevalence of disabling hearing loss in children cited a figure of 32 million worldwide, with 6.8 million reported from sub-Saharan Africa.  Interesting correlates were noted such as the inverse relationships between prevalence and gross national income per capita, and between prevalence and parental literacy rates.  In health economic terms, while cochlear implantation is extremely expensive, the procedure is widely regarded as best practice/standard of care with long term benefits and savings to patients and society.  Several recent reviews and analyses have focused on timing of the second implant in children with severe bilateral hearing loss.  Historically procedures were performed sequentially for reasons that included availability of resources, concerns about increased surgical and anaesthetic risks and complications if both procedures were performed simultaneously, parental choice, children’s reluctance to be completely ‘weaned’ off hearing aids, and the concept of ‘saving’ one side in anticipation of future technological advances.  At this time, while the data favour simultaneous surgery, it would appear that up to 50% of families are opposed to simultaneous cochlear implantation (SCI).  In a review of 25 studies Spanish researchers reported that SCI is the procedure of choice for children with profound bilateral deafness, this because compared to sequential implantation there are no differences in complication rates, lengths of stay, medication exposure etc.  SCI is more effective than sequential surgery in respect of sound perception, noise and speech recognition, and language development.  
On the other hand, a long inter-implant interval with the sequential procedure correlates negatively with development of linguistic abilities, at both comprehension and expression levels.  The critical period for language development is between 1 and 3.6 years, with brain and auditory plasticity decreasing over time. For this reason early SCI or, at worst, short-interval sequential implantation is recommended.  The preservation of one side for future technology is discouraged because the perceived benefits are outweighed by the loss of binaurality during a critical developmental period.  Overall, predictive factors of successful SCI are age at implantation, duration of auditory deprivation, binaurality, availability of resources and involvement of families.

Read more:

Int J Pediatr Otorhinolaryngol 2015; 79: 786-92

Laryngoscope 2014; 124: 1511-2

Cochlear Implant Int 2011; 12: S8-14 

 

2015:22 Diagnosis of bronchiolitis by ultrasound (US)

1522 Diagnosis of bronchiolitis by ultrasound (US)

In a neat and well-conducted study from Bari in Italy, researchers compared US to clinical assessment of infants presenting with bronchiolitis. The group studied 106 infants with a mean age of 87.4 days.  Clinical severity was based on standard criteria: points were awarded on a scale ranging from 1 each for a respiratory rate between 50 and 60, difficulty feeding, subcostal or intercostal retractions, and ‘crackles’ or an end-expiratory wheeze; to 3 points each for a rate of >70, cyanosis/drowsiness, generalized retractions/nasal flaring, and diminished breath sounds &/or inspiratory and expiratory wheezing. Mild bronchiolitis scored from 1-4 and severe from 9-12.  US scans were recorded by a skilled paediatrician and a radiologist, and were captured anterolaterally as well as paravertebrally/posteriorly. Scoring was based on the presence, number and confluence of horizontal and vertical artifacts (A- and B-lines respectively) and on the presence or absence of subpleural lung consolidation.  An increasing concentration of B-lines corresponds to increasing lung congestion, and permits a quantification of ‘pulmonary interstitial syndrome.’  According to clinical scores 74 infants had mild bronchiolitis, 30 had moderate disease and 2 were severely affected.  Agreement between clinical and US diagnosis was >90%, as was inter-observer US agreement. Furthermore, US identified infants in need of supplementary oxygen even if initial assessment in room air was satisfactory (saturation >94% or capillary oxygen tension >45mmHg). Posterior and paravertebral US findings, particularly subpleural lung consolidation, were more predictive of severe disease and oxygen requirement, most likely because infants of this age are essentially supine. Ultrasound appears to be popular for lung assessment in Italy and in 2012 an international panel reviewed the data and made recommendations for point-of-care ultrasound.  
In the paediatric field, evidence was found to be strong for diagnosis of respiratory distress syndrome, transient tachypnea and pneumonia.  Perhaps bronchiolitis will be added to the list but one wonders what clinical advantage the modality provides other than if/when one lacks x-ray facilities but has access to US and the necessary skills to perform and interpret the findings.  On the other hand, in this age of telemedicine it is possible if not likely that a smartphone ‘app’ would be helpful as a screening/diagnostic tool for community healthcare workers faced with an infant with respiratory difficulties.
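For interest, the clinical severity score described above can be sketched in code. This is only a rough illustration built from the items quoted in the summary: the published instrument's intermediate (2-point) items are not enumerated in the summary and are omitted, and the 'moderate' band is an assumption, since the summary gives only the mild (1-4) and severe (9-12) ranges.

```python
def bronchiolitis_score(resp_rate, feeding_difficulty, retractions,
                        crackles_or_wheeze, cyanosis_or_drowsiness,
                        generalized_retractions_or_flaring,
                        diminished_sounds_or_biphasic_wheeze):
    """Sum the 1-point and 3-point items quoted in the summary.

    The published instrument also includes 2-point items that the
    summary does not enumerate; they are omitted here.
    """
    score = 0
    # 1-point items
    if 50 <= resp_rate <= 60:
        score += 1
    score += sum([feeding_difficulty, retractions, crackles_or_wheeze])
    # 3-point items
    if resp_rate > 70:
        score += 3
    score += 3 * sum([cyanosis_or_drowsiness,
                      generalized_retractions_or_flaring,
                      diminished_sounds_or_biphasic_wheeze])
    return score


def classify(score):
    """Band a total score: 1-4 mild and 9-12 severe per the summary;
    the intermediate band is assumed here to be 'moderate'."""
    if score <= 4:
        return "mild"
    if score >= 9:
        return "severe"
    return "moderate"
```

For example, an infant with a rate of 55, difficulty feeding, retractions and crackles accumulates 4 points (mild), whereas a rate of >70 plus all three 3-point items accumulates 12 (severe).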

Read more:

BMC Pediatrics 2015; 15: 63

Pediatr Radiol 2014; 44: 900

Intensive Care Med 2012; 38: 577-91

2015:23 Value of eye examination in children with cerebral malaria 

1523 Value of eye examination in children with cerebral malaria

While cerebral malaria (CM) might not be a common diagnosis in South African paediatric practice it is nevertheless a reality.  The condition is defined clinically as peripheral parasitaemia with coma not directly attributable to convulsions, hypoglycaemia, concomitant infection or any other identifiable cause.  However this definition is broad, and in endemic areas may misclassify cerebral malaria in up to 25% of cases when postmortem histopathology of the brain is the reference standard. But what is this reference standard? At ~18-20 hours after invasion by P falciparum the parasite matures and parasitized red cells (pRBCs) sequester in the microvasculature of various tissues, including neural tissue.  Intracerebral accumulation of sequestered pRBCs is linked to vascular pathology, including blood-brain barrier dysfunction.  Hemozoin, the result of haemoglobin consumption by parasites, is visible at 30-34 hours, is a useful indicator of stage of parasite development, and becomes extra-erythrocytic when released into the vessel lumen after schizont rupture.  Vascular congestion occurs in vessels following accumulation of blood cells and impaired outflow, and has been identified as a potential mechanism for coma in CM.  Intense sequestration is enhanced by adherence of non-infected RBCs to pRBCs (so-called rosetting), thereby aggravating vascular congestion.  Because the brain and retina are both of neuroectodermal origin, and because paediatric CM is associated with retinal signs collectively known as malarial retinopathy, a group of researchers from Malawi, the UK, US and Canada correlated retinal and brain findings in 18 subjects with clinically-defined CM who died.  Typical malarial retinopathy findings are retinal whitening, vessel discolouration, retinal haemorrhages and papilloedema, and are important because they are the only signs during life that distinguish subsequently histopathologically-confirmed CM cases from those dying of other causes.  
The authors postulated and subsequently confirmed that the presence of sequestration in retinal vessels would parallel neurovascular sequestration in the brain and distinguish between CM and non-CM coma.  They further hypothesized and confirmed that there is an association between the severity of retinopathy during life and degree of sequestration (percentage of parasitized vessels), intensity (number of pRBCs sequestered), and maturation stage of the sequestered pRBCs. Five of the 18 cases studied had clinically-defined CM (i.e. parasitaemia plus coma) but no malarial retinopathy, scanty intracerebral sequestration and other causes of death, whereas in the other 13 cases malarial retinopathy severity correlated with percentage of microvessels parasitized in the retina and brain. Furthermore, vascular congestion was more intense, sequestered parasites were more mature, and the quantity of extra-erythrocytic hemozoin was higher in the more-severe cases of retinopathy. The data therefore confirm a histopathological basis for the known correlation between degrees of retinopathy and cerebral dysfunction in CM.

Read more:

J Infect Dis 2015; 211: 1977-86

Brain 2014; 137 (pt 8): 2119-42

Trop Doct 2006; 36(suppl 1): 1-13

 

2015:24 Hepcidin: the "master iron regulator" 

1524 Hepcidin: the “master iron regulator”

In 2000 and 2001 hepcidin was independently discovered and named twice: as LEAP-1 (liver-expressed antimicrobial peptide) by Krause et al, who noted its antimicrobial activity, and as hepcidin (a contraction of hepatic bactericidal protein) by Park et al, who likewise noted its antimicrobial properties. At the same time Pigeon et al identified an iron-regulated gene that encoded for LEAP-1 with high expression in the liver and lower expression in other organs such as kidney, adipose tissue, heart and brain. A central role for LEAP/hepcidin in iron homeostasis was unambiguously recognized by the serendipitous finding that inactivation of the gene was associated with severe iron overload in the liver and pancreas, while transgenic mice overexpressing the peptide had severe and often lethal anaemia unresponsive to iron.  In humans the hepcidin gene is located on chromosome 19, and mutations are associated with haemochromatosis.  Returning to the original identification of the peptide as an antibacterial and antifungal agent, the mechanism of action is not fully understood, but could be the result of interference with iron-availability for invading organisms (particularly since inflammation and specifically IL-6 are major stimulants of hepcidin production) or could be the direct result of hepcidin disruptively combining with membrane phospholipids of bacteria and fungi.  Physiologically, hepcidin produced in the liver enters the circulation and is widely distributed, exerting its main action on the three cell types associated with iron metabolism viz. enterocytes, hepatocytes and reticuloendothelial phagocytes, which may store the iron (e.g. as ferritin) or release it, attached to transferrin, as the major source of plasma iron.  Hepcidin exerts its influence by binding to and inducing internalization and degradation of ferroportin, which is the only known ‘exporter’ of iron from these cells.  
Hepcidin itself is regulated in response to a wide range of internal and external stimuli including tissue iron stores, transferrin saturation, inflammation, hypoxia and erythropoietic demand.  Increased bone marrow erythropoiesis e.g. stimulated by blood loss, anaemia or hypoxia suppresses hepcidin production, whereas iron loading and inflammation induce production. Genetic hepcidin overexpression occurs as an autosomal recessive condition presenting with iron refractory iron-deficiency anaemia.  Overexpression also occurs with various cancers and as a result of radiation therapy. Given such responses the relationship to the anaemia of chronic disease becomes clear. Greater understanding of the relationship of hepcidin to various disease states has also established a field of research into therapeutic agents to either stimulate or inhibit hepcidin production.

Read more:

Pharmacogn Rev 2015; 9: 35-40

Biosci Rep 2015; 35: art:e00192

Am J Physiol Gastrointest Liver Physiol 2006; 290: G199-203

 

2015:25 More on hepcidin

1525 More on hepcidin

These summaries on hepcidin were prompted by a recent article on iron deficiency in the South African Medical Journal which stated that “the hepcidin assay has promise in diagnosing pure iron deficiency anaemia” (in which levels would be low vs. the high levels of hepcidin found in the anaemia of chronic disease). The question is: how close are we to the “promise” of the assay?  One of the articles referred to in the previous summary comments on the assay’s availability and affordability being fairly limited, and also notes the importance of being able to distinguish between biologically-active hepcidin and other isoforms.  The required technologies range from those that are antibody-based (ELISA, chemiluminescence, dot/blot assays) to various types of mass spectrometry.  While reports indicate that there might be substantial variability in results in terms of both diurnal and intra-individual variation, there is nevertheless a need ultimately to measure hepcidin levels accurately and gain insight into their relationship to various conditions and states. For example hepcidin appears to be an accurate acute-phase biomarker of inflammation and sepsis, since it has been shown to be reliable in both early- and late-onset sepsis in neonates, with levels returning to ‘normal’ in response to treatment.  Relationships become more complex in adults, particularly where inflammation is regarded as being fundamental to conditions such as atherosclerosis, coronary artery disease, allergic disease and obesity, and where several may coexist. Severe exertion also results in low-grade inflammation and transient dysregulation of iron homeostasis.  
So, while one might put all this together and consider that athletes using erythropoietin to boost performance could easily be identified via suppression of their hepcidin levels, it turns out that false positives and negatives are found where there is inflammation, co-existing iron deficiency, relative hypoxia during altitude training etc.  The article on use of the assay in athletes also comments on the cost of the equipment, skill of the technicians required, and again, the uncertainty about reference ranges in the normal population.

Read more:

S Afr Med J 2015; 105: 607

Front Pharmacol 2014; 5: article 86

Br J Pharmacol 2012; 165: 1306-15      

 

 

2015:26 The need to include trials registry data in systematic reviews and meta-analyses

1526 The need to include trials registry data in systematic reviews and meta-analyses

In this age of evidence-based medicine there is obvious and necessary emphasis on validity of the data on which one is basing everything from development of a clinical guideline to management of an individual patient.  Adequate sample sizes of studies are imperative and multicentre studies are common in order to achieve both standardisation of research methodology and the patient numbers for statistical significance.  Further evidential strength is obtained by reviewing published studies and performing meta-analyses when data are sufficiently homogeneous, or systematic reviews when not. However, when health economists and statisticians interrogate results from the pooled published data it is obvious through methodologies such as funnel plots that some of the data are missing and results are biased. Review and analysis of journals reveals an unsurprising tendency towards selective reporting and acceptance of research showing either a large positive effect of a new intervention or even non-inferiority of the new vs. the old.  Researchers and journals are typically less excited about trials that show a new treatment to be inferior to the old, or about inconclusive studies if, for example, the researcher failed to achieve the desired sample size.  In 2005 the International Committee of Medical Journal Editors (representing 11 prestigious journals) proposed a registry for clinical trials that prospectively assign human subjects to intervention or comparison groups in order to study the relationship between the intervention and a health outcome.  The editors went further, stating that no such study would be published in their journals if it had not been registered at the outset.  Such a register would require updating with data and be publicly accessible, thereby permitting the inclusion of results even if there was no subsequent formal publication.  
South Africa followed suit in the same year with establishment of the SA National Clinical Trials Register and the statement that “as from the 1st December 2005 all new clinical trials to be conducted in the country must be registered.” The aim was to promote collaboration through the sharing of research information; assist researchers to identify and participate in clinical trials; decrease publication bias; reduce duplication of research effort; promote best use of limited research resources; and contribute to global efforts to reduce/eliminate disease.  It is questionable whether South Africa has achieved registration of all clinical trials but the same probably applies internationally, since the requirement to register was included in the Helsinki Declaration in 2008 and the World Health Organisation in 2015 found it necessary to update and publish its position statement on public disclosure of clinical trial results.  Disappointingly, a recent study published in PLOS One found that of 194 studies retrieved in its search, 78 met selection criteria and only 11 of those incorporated trials registry data into their results.  Seems the quest to a) access all clinical trials data and b) reduce bias is still ‘work in progress.’ For those who intend to comply with the requirements it is important to ensure that whichever registry is utilised is linked to a database such as www.clinicaltrials.gov in order to be truly accessible to those who want to make future use of the data.

Read more:

PLOS One 2015 doi:10.1371/journal.pone.0134596

N Engl J Med 2004; 351: 1250-1

South African National Clinical Trials Register http://www.sanctr.gov.za/

 

 

2015:27 Disparities in health care 

1527 Disparities in health care

Much has been published on this topic with North American and European countries focusing on disparities in care received by Caucasians vs. Hispanics, insured vs. uninsured populations, citizens vs. immigrants etc.  However, very little has emerged from South Africa in the way of formal comparative studies. This is perhaps surprising given our history, but on the other hand the debate continues as to whether race or ethnicity should be formally recorded on patient databases or omitted on the grounds of racial profiling or falsely concluding that various conditions are racially- or genetically-based when they are actually the result of environmental, socio-economic or cultural factors. But without the identification it is difficult to pick up trends. For example, in an article published not long ago it was noted that in South Africans with equivalent medical aid cover there were different rates of caesarean section in white vs. black members and different approaches to dental care: whites received more in the way of restorative procedures while blacks underwent more extractions. Such differences cannot be ascribed to race and it would be useful to know whether the likely attitudinal differences are with the patients, providers or both. Such questions are not confined to the way patients are treated but also apply to how patients respond to doctors.  In a recent issue of Pediatrics five contributors discuss an ‘ethics case study’ in which an African-American paediatric resident is prevented from examining a 3-year-old white child with otitis and told by the family that they would prefer a white doctor. Such a situation was not unknown to academic medicine in our country in the late 1980s and early 1990s as apartheid policies were rejected and hospitals, wards and staff were ‘desegregated.’ It would be safe to say that the usual response from the registrars, consultant staff and hospitals was that if that was the family’s position they should seek help elsewhere. 
Four of the discussants in the Pediatrics article are more-senior paediatricians whose duties or interests specifically include issues of ethics and diversity. Two of the four make a case for the family, arguing for patient rights and autonomy, but also putting forward the argument that one would not be offended by a Muslim woman who asks for a female doctor, or by a Korean patient who on historical grounds does not feel comfortable being attended to by a Japanese doctor. Two discussants have little sympathy for such arguments and differentiate between cultural, religious or historical issues and this case, which to them is simply racist. The fifth discussant turns out to be the subject of the discussion and lauds her senior for entering the room and offering the family the option of seeking care elsewhere.  The American Medical Association’s formal opinion (2003) is that prejudice such as this constitutes justification for transfer of care. The HPCSA’s National Patients’ Rights Charter might be seen as supportive of the family in the case study as paragraph 2.5 of the Charter identifies the right to choose a particular health care provider. In practice, given our history and the diverse and multicultural nature of our society, prejudice is likely to raise its head when there are differences between doctor and patient. One hopes that we would reject racist incidents as we did in the 1980s/90s, while always being cognisant of instances in which the patient’s request is reasonably based on gender-sensitive, religious or cultural issues and should probably be accommodated.

Read more:

Pediatrics 2015; 136: 381-6

American Medical Association 2003: Opinion 9.123

S Afr Med J 2008; 98: 435-8

 

 

2015:28 'Unprovoked seizures' - can one identify the ones that require urgent intervention?

1528 ‘Unprovoked seizures’ – can one identify the ones that require urgent intervention?

It is stated that as many as 40 000 children in the US experience a first seizure that is not associated with a precipitating factor such as fever or trauma. This is invariably an extremely stressful event for the family and, as with significant head trauma in children, clinicians in casualty/emergency departments presented with unprovoked seizures must decide if and when to obtain radiological studies.  The evidence is that most will undergo acute neuroimaging (CT more than MRI) but urgent intervention is required in only a minority. The question of whether one can refine the ‘decision tree’ or clinical algorithm was addressed in a prospective multicentre study carried out over two years by six high-profile academic paediatric departments in New York State, Texas, Delaware and California. Subjects aged 1 month to 18 years were eligible for enrolment if they presented to an emergency department for evaluation of a first unprovoked seizure or a history of such a seizure that had not been medically evaluated.  Syncope, breath-holding spells and altered mental states without seizure were excluded, as were known neurological disorders, recent fever, or toxin ingestion.  On the other hand, cases with a seizure-free history of brain tumour, other neoplasm, stroke, coagulopathy, cardiac defect or intraventricular shunt were included.  Of 444 patients who were eligible, 354 had a CT or MRI in the emergency department or within 4 months of the visit.  Almost two-thirds of the 354 were discharged home from the emergency department, 31% were admitted to the general inpatient service and almost 4% were admitted to ICU.  Despite these relatively high admission rates, only 40 (11.3%) had clinically relevant intracranial abnormalities, of which 3 required urgent intervention.  
The intracranial abnormalities ranged from tumour or metastases in two cases and infarction in one to several that were categorised as abnormal myelination, cortical or sub-cortical hyperintensities or increased white matter signal, and fewer with findings that intuitively might be regarded as more sinister such as arterio-venous malformation, cysticercosis and Chiari malformations.  In terms of predicting intracranial abnormalities, a high-risk past history (odds ratio 9.2; 95%CI 2.4-35.7) and any focal aspect to the seizure (OR 2.5; 95%CI 1.2-5.3) were independently associated with clinically relevant abnormalities.  However it is important to note that the majority of subjects with these ‘predictive’ factors did not have intracranial abnormalities, i.e. in statistical terms the factors have poor positive predictive value. Along these lines, while two of the three urgent cases had a significant history and the third had focal features, these features were present in many non-urgent cases as well. Taking the reverse perspective, in terms of efforts to identify patients at low risk for intracranial abnormalities, the authors were also unable to develop a model with sensitivity greater than 62.5%. The bottom line is therefore that overall the prevalence of intracranial abnormalities is low in first-time unprovoked seizures and very few cases require urgent intervention, but clinicians will most likely continue to subject patients to early if not immediate radiological studies.

Read more:

Pediatrics 2015; 136: e351-60

Epilepsy Res 2001; 43: 261-9

Pediatr Neurol 2008; 39: 404-14

 

 

2015:29 Risk of hearing loss and kernicterus in relation to exchange transfusion thresholds

1529 Risk of hearing loss and kernicterus in relation to exchange transfusion thresholds

It is stated that severe hyperbilirubinaemia affects 481 000 term or near-term infants worldwide per year, of whom 114 000 die and >63 000 survive with moderate to severe disability.  Some 75% of these cases occur in sub-Saharan Africa.  The spectrum of disease ranges from acute bilirubin encephalopathy and kernicterus in the early phases to later manifestations of intellectual disability, gross developmental delay, cerebral palsy (CP) and sensorineural hearing loss (SNHL).  The preponderance of cases on the African continent is attributed to factors such as a delay in seeking care and delays in providing treatment.  In this regard protocols have been devised to promote early detection of hyperbilirubinaemia, appropriate intervention and follow-up.  Risks of bilirubin-related sequelae are also greater in preterm infants with risk factors such as hypoalbuminaemia, CNS insults, infection and inflammation, resulting in the entity of low bilirubin encephalopathy. There is also an expanded spectrum of possibly bilirubin-related disease known as BIND (bilirubin-induced neurological dysfunction) that is characterised by less-specific neuromotor signs, muscle tone abnormalities, hyper-excitable neonatal reflexes, neuro-behavioural manifestations, speech and language abnormalities and an evolving array of cerebral processing abnormalities such as SNHL and visuomotor dysfunction.  What is clear from the current literature is that paediatric hyperbilirubinaemia continues to attract attention and research opportunities. In this regard two recent articles focus on the risk of SNHL and CP in term and near-term infants who reach exchange transfusion thresholds (ETT), which historically have been based on factors such as average neonatal albumin levels and consequent bilirubin binding capacity (e.g. 1g albumin binds 8mg bilirubin, so an average of 2.5g/dl in a term neonate binds 20mg bilirubin) and on retrospective reviews of frequency of disease above and below ETT. 
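The albumin binding arithmetic quoted above reduces to a one-line calculation. The sketch below assumes only the ~8mg-per-gram figure given in the text; the function and constant names are illustrative:

```python
BILIRUBIN_MG_PER_G_ALBUMIN = 8.0  # binding capacity quoted in the summary


def binding_capacity_mg_per_dl(albumin_g_per_dl):
    """Approximate bilirubin binding capacity of serum, in mg/dl."""
    return albumin_g_per_dl * BILIRUBIN_MG_PER_G_ALBUMIN
```

With the 2.5g/dl albumin level used in the example, the calculation reproduces the historical exchange threshold of ~20mg/dl.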
These fairly long-term studies make use of the LIGHT cohort (Late Impact of Getting Hyperbilirubinaemia or photoTherapy) which included 525 409 infants born at ≥35 weeks at any of 15 Kaiser Permanente Northern California hospitals between 1995 and 2011.  The researchers found that SNHL was confirmed in 11 of 1834 subjects with ≥1 bilirubin level above ETT (0.6%) vs 43 of a random sample of 19 004 subjects with all bilirubin levels below ETT (0.23%).  Only subjects with bilirubin levels ≥10mg/dl above ETT were at significantly greater risk of SNHL (hazard ratio 36; CI 13-101). Put differently, only subjects with bilirubin levels ≥35mg/dl were at greater risk (HR 91; CI 32-255).  In the CP study the frequency was 7/1833 in the exposed group (0.4%) vs 86/104 716 (0.1%) in those who did not reach ETT (Relative Risk 4.7; CI 2.2-10.0). Actual kernicterus (based on dyskinesia and globus pallidus injury on MRI) was found in only 3 subjects, all of whom had bilirubin >5mg/dl above ETT with at least 2 risk factors for neurotoxicity e.g. G6PD deficiency or hypoxia-ischaemia.  While these results are reassuring to some extent, they relate to the first world and risks may be considerably greater in the developing world.

Read more:

Pediatrics 2015; 136: 505-12

BMC Pediatr 2015; 15: 39

JAMA Pediatr 2015; 169: 239-46

Semin Fetal Neonatal Med 2015; 20: 6-13

 

 

2015:30 Hepatic biomarkers in asphyxiated neonates 

1530 Hepatic biomarkers in asphyxiated neonates

In an article from India researchers discuss the value of liver function tests in the evaluation of birth asphyxia. It is accepted that cardiovascular reflexes in the asphyxiated neonate protect the cerebral circulation at the expense of organs such as the liver and kidney and, depending on the severity of the reduced blood flow to these organs, serological tests of hepatic and renal function may be deranged.  These tests may be carried out to assess the extent of organ dysfunction; for example, such assessment is important in the calculation of drug dosages where metabolism and/or elimination are dependent on normal renal and/or hepatic function.  However the authors of the Indian article argue that, although there are obviously all the usual markers and criteria for asphyxia such as pH, Apgar scores, clinical staging of asphyxia and EEG findings, additional markers are useful, particularly in cases referred in for treatment and in which sedative and anticonvulsant drugs may cloud the assessment. The authors found that liver function was impaired in 43% of asphyxiated neonates and when 70 neonates meeting criteria for asphyxia were compared to 30 normal controls, on day one there were elevations of AST (aspartate aminotransferase), ALT (alanine aminotransferase), ALP (alkaline phosphatase), LDH (lactate dehydrogenase) and total bilirubin; PT (prothrombin time) was longer and INR (international normalized ratio) was elevated.  Total protein was lower in the asphyxiated group.  There were also excellent correlations between clinical (Sarnat and Sarnat) stages of HIE and both ALT and AST levels (although standard deviations were quite wide). On day three LDH was more informative, increasing with progression of HIE and differing between HIE stages.  
Other authors suggest that IMA (ischaemia-modified albumin) may also be a sensitive biomarker where hypoxia, acidosis, ischaemia and/or reactive oxygen species produced during reperfusion modify albumin and interfere with albumin binding to metals such as cobalt, nickel and copper. IMA increases within 5-10 minutes of an insult and continues to rise if the event persists, but returns to baseline after about 12 hours in the absence of continuing ischaemia.  Originally greeted with some excitement by cardiologists assessing myocardial ischaemia, the test currently seems to enjoy less support in that area but appears to be returning as a measurement tool not only for perinatal asphyxia but also as a measure of impaired liver function in chronic liver disease and cardiac damage following doxorubicin administration in cancer patients.

Read more:

Clin Med Insights: Pediatr 2015; 9

J Obstet Gynaecol Res 2013; 39: 663-71

J Matern Fetal Neonatal Med 2012; 25: 2401-5

 

 

2015:31 Researchers call for more information on pharmacokinetics of dobutamine in neonates

1531 Researchers call for more information on pharmacokinetics of dobutamine in neonates

To most clinicians involved in the care of unstable and mostly-preterm neonates a cautionary statement from the UK regarding the pharmacokinetics of dobutamine must come as something of a surprise, since this is such a widely-used drug. Discussions and debates in NICUs around dopamine and dobutamine and the superiority of one vs the other are common, so why the sudden interest in whether we actually know enough about the drug to use it as liberally as we do? Dobutamine has been around for forty years and since its discovery in 1975 has been used off-label for treating haemodynamic insufficiency in newborns and children. In fact the use of unlicensed or off-label drugs is widespread in paediatrics, particularly so in neonates, and it has been estimated that approximately 75% of all medications used in NICUs are prescribed off-label!  The European Commission (EC) recently established a regulation encouraging research into the use of off-label medications in paediatrics, and at this time both the European Medicines Agency and NIH in the US have highlighted dobutamine as a candidate for further investigation.  Aspects of this debate are captured in summary 1405, which also speaks to the plethora of drugs utilized in neonatal intensive care and makes a case for ‘scavenged sampling’, which involves the analysis of blood samples taken for other purposes and pooled for pharmacokinetic measurements.  Returning to the UK article under discussion here, the authors, who are members of the EC-funded NEOCIRC (the Neocirculation Research Consortium), conducted a literature review that initially retrieved 963 age- and drug-related studies but eventually found only 46 suitable for inclusion in the analysis.  Most of the studies (38) covered pharmacodynamic aspects and eight covered pharmacokinetics. Fourteen of the 38 dynamics studies included neonates, as did seven of the eight kinetic studies.  
In most, heart rate and blood pressure increased with dobutamine and echocardiographic data showed quicker ventricular relaxation and improved ventricular filling as well as quicker myocardial contraction.  There were also significant increases in stroke volume. Pulmonary vascular resistance decreased in most studies, as did systemic vascular resistance and, in two studies in preterm infants, flow in the superior vena cava. There were however significant differences in methodology and dosing regimens in the studies that were reviewed, thus making it difficult to reach firm conclusions.  The pharmacokinetic studies showed that while infusion rate was positively correlated with plasma concentration, there was great variability in clearance between individuals.  While the call is for urgent, high-quality, prospective studies in order to better understand the pharmacokinetics of the drug in neonates, one wonders whether many will heed the call in light of 40 years’ use of dobutamine, relatively consistent dosing regimens (2.5 to a maximum of 20µg/kg/min) and the desired clinical outcomes as reported in the review. On the other hand, one can be fairly sure that a cautionary statement regarding the adequacy of knowledge around the pharmacokinetics of dobutamine will be brought into medicolegal cases and count adversely against practitioners unless they can show that drug levels were measured and acted upon.

Read more:

Pediatr Cardiol 2015; doi:10.1007/s00246-015-1263-9

J Pediatr 2014; 164: 986-91

Semin Fetal Neonatal Med 2014; 19: 54-9

 

 

2015:32 Adverse effects of hyperglycaemia in extremely low birthweight (ELBW) neonates 

1532 Adverse effects of hyperglycaemia in extremely low birthweight (ELBW) neonates

The potentially harmful effects of hyperglycaemia in critically ill (non-diabetic) patients have been the subject of debate since the beginning of this century. During this time both adult and paediatric intensivists moved from a position of regarding hyperglycaemia simply as a marker of severity of illness to one in which the raised glucose levels were considered noxious and worthy of tight control via administration of insulin.  Recent publications have not only shown that insulin-related hypoglycaemic episodes are problematic, but have also begun to question whether the hyperglycaemia should in fact be treated. A number of articles published in the neonatal literature over the past decade have also drawn attention to the possibility of hyperglycaemia contributing to morbidity and mortality in ELBW infants, with a recent publication from Norway adding to the debate. The latter study focused on two time periods, before and after the introduction of ‘enhanced parenteral nutrition’ (EPN) in 2005. This strategy has been adopted as standard of care in many neonatal centres and was introduced in response to the observation that ELBW infants are at risk of in-hospital growth failure which in turn has been linked to impaired growth and poor developmental outcome in early childhood.  Following the introduction of EPN in their neonatal ICU the researchers noted both an increase in the rate of insulin use and an increase in mortality.  Their sense was that hyperglycaemia could be responsible since a number of observational studies have linked early elevated glucose levels to mortality, retinopathy, sepsis, necrotizing enterocolitis and poor neurodevelopmental outcome.  
With the introduction of EPN the feeding regimen changed from 10% glucose IV for the first few days of life, with addition of amino acids and lipids from days 3 and 4, to initiation of EPN immediately after birth: glucose at a minimum rate of 8.5 g/kg/day on day 1, amino acids at 1.5-2.0 g/kg/day, and lipids at 0.5-1.0 g/kg/day on day 1 or 2.  All nutrients were increased on subsequent days to specified maxima.  Enteral nutrition was introduced early during both periods (day 1 or 2).  Glucose was monitored 4-8 times per day and hyperglycaemia (2 consecutive measures at least 3 hours apart) was defined as mild (150-181 mg/dl; ~8.3-10 mmol/l), moderate (182-216 mg/dl; ~10.1-12 mmol/l) or severe (>216 mg/dl; >12 mmol/l). The duration of hyperglycaemia was also categorized into 3 groups denoting exposure to severe hyperglycaemia for 0, 1, or ≥2 days.  Almost 350 infants were included in the study, 129 in the historical group and 214 in the EPN group, with the latter protocol resulting in significantly more episodes of severe hyperglycaemia during week 1 of life (41.6% vs 11.6%) and higher mortality (24.3% vs 10.9%). After statistical adjustment for other risk factors the EPN regimen itself was not a risk, but severe hyperglycaemia remained a strong and independent risk factor for death, together with gestational age.
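Since the thresholds are reported in mg/dl with mmol/l approximations, the conversion is worth making explicit: dividing mg/dl by ~18 (glucose molar mass ≈180 g/mol) gives mmol/l. A minimal Python sketch of the conversion and the study's severity bands (function names are illustrative; thresholds are those quoted above):

```python
# Convert a plasma glucose value from mg/dl to mmol/l.
# Glucose molar mass ~180.16 g/mol, so mg/dl / 18.016 ≈ mmol/l.
def mgdl_to_mmol(mgdl: float) -> float:
    return mgdl / 18.016

# Classify a glucose value into the study's hyperglycaemia bands (mg/dl).
def severity(mgdl: float) -> str:
    if mgdl > 216:
        return "severe"
    if mgdl >= 182:
        return "moderate"
    if mgdl >= 150:
        return "mild"
    return "normal"

print(severity(190))            # prints "moderate"
print(round(mgdl_to_mmol(216), 1))  # upper bound of 'moderate' ≈ 12.0 mmol/l
```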

Read more:

JAMA Pediatr 2015; doi: 10.1001/jamapediatrics.2015.1667

World J Diabetes 2015; 6: 1082-91

Pediatr Diabetes 2014; 15: 75-83  


2015:33 Mono- vs combination therapy for treatment of community-acquired pneumonia (CAP) 

1533 Mono- vs combination therapy for treatment of community-acquired pneumonia (CAP)

There is no doubt that the introduction of pneumococcal vaccine, whether 7-, 10- or 13-valent, has had a profound impact on pneumococcal pneumonia in infants and small children worldwide, although data are not entirely consistent when one separates out the impact on ambulatory visit rates vs. hospital admissions.  Nevertheless CAP is still encountered worldwide, with estimates of ~120 million CAP episodes in children in 2011, of which 14 million progressed to severe disease and 1.3 million died (>80% of the latter <2 years of age).  Blood cultures are not helpful in CAP, a recent review and meta-analysis showing an overall positivity rate of only 5.14%, which increased to 9.89% if only severe CAP was included. Streptococcus pneumoniae was found in 76.7%, H. influenzae in 3.1% and S. aureus in 2.1%. Consistent with these findings, 2011 guidelines jointly sponsored by the Pediatric Infectious Diseases Society and the Infectious Diseases Society of America recommended beta-lactam antibiotic therapy for most children with CAP treated in the outpatient setting. In older children and adolescents, and in hospitalized children, consideration should be given to inclusion of macrolides to cover Mycoplasma pneumoniae and Chlamydophila pneumoniae.  To address the question of monotherapy vs. combinations of beta-lactams and macrolides, researchers from the Geisinger Health System in Pennsylvania reviewed data for their 31-county primary care service, which includes >50 clinics and 3 acute care hospitals.  Subjects between 1 and 18 years seen between Jan 1 2008 and Jan 31 2010 with an initial diagnosis of CAP were eligible for study if they received a beta-lactam antibiotic alone or in combination with a macrolide.  The primary outcome measure was treatment failure, defined as a follow-up visit with a respiratory-related diagnosis accompanied by a change in antibiotic therapy in the outpatient setting, emergency department or during hospitalization within 14 days of the initial event.
Of 717 subjects, 540 received beta-lactam monotherapy and 147 received the combination (of whom 58.2% were below 6 years of age).  Overall treatment failure rates were quite low at 8.1% in monotherapy patients and 6.1% with the combination, but on separating the group into those above and below 6 years of age, failure rates were highest in the 6-18-year-olds on monotherapy (12.9%) vs. 4% with the combination.  This benefit is consistent with a higher prevalence of Mycoplasma and other atypical organisms in older children, and with data showing shorter hospital stays for children hospitalized with CAP and treated with combination therapy vs. beta-lactam monotherapy.
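As a rough check on the headline comparison, the failure rates and an approximate 95% confidence interval for the risk difference can be computed from the reported percentages (the counts below are back-calculated from those percentages and are therefore illustrative):

```python
import math

def risk_diff_ci(x1, n1, x2, n2, z=1.96):
    """Risk difference p1 - p2 with a normal-approximation 95% CI."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

# Counts back-calculated from the reported overall rates (illustrative):
# ~44/540 ≈ 8.1% failures on beta-lactam monotherapy, ~9/147 ≈ 6.1% on combination.
diff, (lo, hi) = risk_diff_ci(44, 540, 9, 147)
print(f"risk difference {diff:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

The interval for the overall comparison spans zero, consistent with the modest overall difference; the subgroup signal in 6-18-year-olds is where combination therapy appeared to help.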

Read more:

Pediatr Pulmonol 2015; doi: 10.1002/ppul.23312

Hosp Pediatr 2015; 5: 324

Pediatrics 2011; 127: 411-8          


2015:34 Effect of intraduodenal tastants on satiety, hunger and food intake 

1534 Effect of intraduodenal tastants on satiety, hunger and food intake

A simple model of satiety after eating invokes the combination of signals from mechanical and chemical receptors that are transmitted to the brain via nervous and/or endocrine pathways.  For example, gastric distension results in vagal and spinal nerve stimulation, whereas intestinal sensing and metabolism of nutrients trigger release of an array of gut peptides from enteroendocrine cells.  Typically one considers the nutrients (carbohydrates, fats and proteins) as being responsible for the chemo-activation, but is there also a role for tastants, i.e. the substances that stimulate the five basic taste qualities of sweet, sour, bitter, salty and umami?  Sensing of sour and salty is mediated by ion channels, sweet and umami by type 1 taste receptors, and bitter by type 2 receptors, which are usually regarded as being present on the tongue but have also been shown to exist elsewhere along the gastrointestinal tract, including the colon.  To investigate how duodenal type 1 and 2 receptors respond to bitter, sweet and umami tastants (i.e. those not mediated by ion channels), researchers from the Netherlands introduced them directly via intra-duodenal infusion.  The study involved 15 healthy volunteers in a double-blind, randomized, placebo-controlled cross-over trial. The placebo was 120 ml water, the same volume used to dissolve all tastants, which were delivered individually and also in combination at a rate of 2 ml per minute over 1 hour.  The outcomes of interest were food intake at a subsequent ad libitum meal and feelings of hunger and fullness (measured by visual analogue scale).  Blood was drawn at regular intervals for cholecystokinin, glucagon-like peptide 1 (GLP-1) and peptide YY (PYY) to determine whether the objective or subjective outcomes were associated with changes in these hormones.
Infusion of the combination of tastants was associated with a significant reduction in food intake vs placebo during the ad libitum meal (422±97 kcal vs 486±104), whereas both umami and the combination decreased hunger scores compared with placebo.  Interestingly, no changes were observed in cholecystokinin, GLP-1 or PYY concentrations.  The study poses more questions than answers but appears to demonstrate that non-caloric substances have the ability to act directly on receptors in the small intestine and influence satiety, sense of hunger and actual food intake. If proven in subsequent studies the findings could pave the way for weight reduction interventions that package the appropriate chemicals in a system that is acid resistant and delivers them to their target in the duodenum or beyond.

Read more:

Am J Clin Nutr 2015; 102: 729-35 and 717-8 (editorial)

Int J Obesity 2015; 39: 235-43

Curr Opin Endocrinol 2008; 15: 73-8         


2015:35 Interrelationships between anaemia and vitamin D levels during pregnancy 

1535  Interrelationships between anaemia and vitamin D levels during pregnancy

The observation that vitamin D and iron insufficiency coexist has been recorded for many decades, in fact since the early 1930s, across the age spectrum from infancy and childhood to the elderly, and in various disease states.  Most often the association is attributed to an overall poor-quality diet, but this is unlikely to be the full explanation because vitamin D production is mainly endogenous, so researchers have quite recently begun to focus on biological mechanisms whereby vitamin D affects iron metabolism and erythropoiesis, and/or iron indirectly modulates vitamin D metabolism.  The coexistence of anaemia and vitamin D deficiency or insufficiency during pregnancy is worrying since both have been associated with maternal and foetal/neonatal problems ranging from impaired intrauterine growth, preterm delivery, neonatal hypocalcaemia and impaired bone development in the neonate, to postpartum depression, pre-eclampsia and even death in the mother.  A meta-analysis published in 2015 confirmed that in a sample of 7 studies involving >5000 participants, vitamin D deficiency increased the risk of anaemia by a factor of 2.25 (95% CI 1.47-3.44).  Other research has proposed a role for vitamin D in erythropoiesis since anaemia was found in 66% of subjects with D-deficiency vs 35% of those who were D-sufficient.  Pregnancy is in any event a high-risk situation for the development of anaemia and also vitamin D deficiency or insufficiency, with teenage pregnancies at even higher risk.  In a study of 158 healthy teenage singleton pregnancies (≤18 years) at ≥12 weeks of gestation, maternal blood was collected mid-gestation and at delivery. Anaemia and/or vitamin D insufficiency (measured as 25-OH D) were treated with additional supplementation, i.e. over and above the standard prenatal vitamin and mineral supplementation.
Regression analysis was used to generate the odds ratio for anaemia as a function of vitamin D status, and a mediation analysis was performed to examine direct and indirect relationships between vitamin D status, haemoglobin and erythropoietin in maternal serum.  Maternal 25-OH D was positively correlated with maternal haemoglobin at both mid-gestation and at delivery, and the odds of anaemia at delivery were 4-8-fold greater in adolescents with delivery 25-OH-D concentrations below 50nmol/l.  Maternal 25-OH D was inversely associated with erythropoietin at both mid-gestation and at delivery.  Clinical support for the latter relationship exists in that vitamin D supplementation to patients with chronic renal disease significantly reduces the need for erythropoietin to maintain haemoglobin levels. This may be mediated via calcitriol which enhances the sensitivity of haematopoietic cells to erythropoietin through an effect on erythropoietin receptors. 

Read more:

Am J Clin Nutr 2015; doi: 10.3945/ajcn.115.116756

Renal Failure 2015; 37: 929-34

Indian J Clin Biochem 2015; 30: 313-7     


2015:36 Hypothermia for encephalopathy in low income countries 

1536  Hypothermia for encephalopathy in low income countries

Summary 2014:02 in this series commented on a meta-analysis of 567 infants in low and middle income (LMI) countries who received therapeutic hypothermia (TH) for management of hypoxic ischaemic encephalopathy (HIE). Results of this analysis showed no advantage of TH vs. usual treatment, and the summary concluded with an appeal for clinicians providing TH for HIE in LMI countries to participate in large, standardized multicentre trials such as HELIX (Hypothermia for Encephalopathy in Low Income Countries). For researchers interested in HELIX, the website www.clinicaltrials.gov shows that the first, a single-group study which investigated feasibility, has been completed, and that the randomized and controlled sequel, registered in February 2015, is currently in the recruiting phase.  The debate around neuroprotection for HIE in LMI countries is taken further in a recent commentary in the Journal of Pediatrics, which starts by citing an estimated global incidence of 1.15 million cases of HIE in 2010, of which 96% were from LMI countries, and again bemoans the heterogeneity of results of studies from such countries.  Reasons for the heterogeneity range from selection bias to technological shortcomings, but also include the possibility of population differences, e.g. as a result of more perinatal sepsis or chronic antenatal insults resulting from malnutrition and intrauterine growth restriction.  Given that at this time TH has not conclusively been shown to improve outcomes for HIE in LMI countries and should therefore not be regarded as standard of care in such settings, the authors make a strong case for initiatives that improve maternal nutritional status, enhance obstetric care and decrease infection rates. Along these lines, an analysis in 78 LMI countries found that scaling up midwifery and other maternal and neonatal interventions had the potential to avert >80% of maternal deaths, stillbirths and neonatal deaths.
They also comment that for those wishing to take the route of neuroprotection for HIE there might be options that are more affordable and pragmatic than formal TH, such as a combination of mild hypothermia and magnesium which is currently under investigation (registered as Mag-Cool 1 – NCT01646619).  Possibly countering the argument that TH should not be considered standard of care in LMI countries is a retrospective review of 100 neonates treated at Tygerberg Hospital. Results are quite positive in terms of time of enrolment, maintenance of hypothermia, frequency of adverse effects and neurological outcome in those returning for follow-up at ~12 months of age. On the other hand a high percentage was lost to follow-up, which is regarded as a possible feature of care in an LMI environment. But the bigger question is whether a small, highly academic, well-staffed first-world tertiary facility in a well-developed environment is representative of the national LMI situation. The study also suffers from the bias identified in other LMI research on the topic, in that the sample included a significant percentage of mild HIE patients and the number requiring assisted ventilation was low. So ultimately we are back to the point that if we wish to contribute meaningfully to the debate, research should be representative of the LMI situation, and should ideally be part of large-scale, standardized studies that are designed and powered to provide answers and direction.

Read more: 

Arch Dis Child Fetal Neonatal Ed 2015; 100: F519-23

J Pediatr 2015; 167: 25-8

PLoS One 2013;8(3): e58834      


2015:37 Predicting severe infection in chemotherapy-induced febrile neutropaenia (FN) 

1537 Predicting severe infection in chemotherapy-induced febrile neutropaenia (FN)

Infections are common and potentially life-threatening in children with chemotherapy-induced neutropaenia, and fever is frequently the main or only symptom on admission.  Patients are likely to be admitted under these circumstances and subjected to intravenous broad-spectrum antibiotics and close monitoring, thereby incurring costs, impacting on the patient and parents, and putting the child at risk of nosocomial infections and emergence of antimicrobial resistance.  Considering that severe infection is subsequently confirmed in only a minority of cases (15-25%), there has been great interest in developing clinical pathways, guidelines, algorithms or care plans that would inform treating practitioners as to when to intervene aggressively with maximal treatment in patients at high risk of infection, and when to consider lower levels of care such as shortened intravenous antibiotic treatment and even oral antibiotics and treatment as an outpatient.  In an article published in 2010, French researchers attempted to validate 6 clinical decision rules by applying them retrospectively to a group of 167 children who jointly experienced 377 episodes of FN.  Their objective was extremely stringent: they specified that an acceptable predictive rule was one yielding a sensitivity of 100% with the highest possible specificity. None of the 6 achieved the 100% mark (although one reached 96% and another 95%) and specificities in these were too low (25% and 5% respectively) for the authors to consider adoption for patient management.  The same group of researchers then subjected the same patient population to rigorous analysis in order to identify variables that would be useful for prediction of severe infection in children with FN.  The sample included 160 subjects <16 years of age managed for haematological malignancies or solid tumours and admitted for chemotherapy-induced FN.  Fever was defined as ≥38.5°C measured once, or 38°C on 2 occasions over a 6-hour period.  Neutropaenia was an absolute neutrophil count of ≤500/µl. The outcome of interest was severe infection as defined by bacteraemia, a positive culture in a normally sterile body fluid, an invasive fungal infection, or a localised infection at high risk of extension. Bacteraemia was defined as 1 or more positive blood cultures, but at least 2 were required to confirm coagulase-negative Staphylococcus or any other contaminant.  Severe sepsis required factors such as a systemic inflammatory response and cardiovascular, respiratory or at least 2 other organ dysfunctions.
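The trade-off behind the authors' 100%-sensitivity requirement can be illustrated with a small confusion-matrix calculation; the counts below are hypothetical, chosen only to echo the near-miss rules described above:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical rule applied to 377 episodes with ~16% severe infection (60 cases):
# the rule flags 57 of the 60 true infections (3 missed) and
# correctly clears 79 of the 317 non-severe episodes.
sens, spec = sens_spec(tp=57, fn=3, tn=79, fp=238)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")
```

Even a rule this sensitive misses 3 severe infections, which is why the authors insisted on 100% sensitivity before a rule could safely direct children to reduced levels of care.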

Having reviewed the gamut of clinical and laboratory observations in the patients, and after univariate and multivariate analysis, the authors identified severe infection in 16% of subjects.

Variables predictive of severe infection at the time of admission were disease with a high risk of prolonged neutropaenia, haematological cancer, fever ≥38.5°C and C-reactive protein ≥90 mg/l. Of note, none of these parameters achieved the 100% sensitivity demanded in their previous study, but that was not a focus of this research. Rather, the authors have identified factors to be studied in further research, in particular validating which of the four are predictive, whether alone and/or in various combinations.

Read more:

J Pediatr Hematol Oncol 2015; 37: e468-74

J Clin Oncol 2012;30: 4427-38

Pediatr Blood Cancer 2010; 55: 662-7


2015:38 Optimal management of mild traumatic brain injuries (TBIs) in children 

1538 Optimal management of mild traumatic brain injuries (TBIs) in children

Several summaries in this series (e.g. 2015:37) discuss the use of clinical algorithms that guide clinicians in the management of patients, particularly in the acute care situation.  While such guidelines are of assistance and usually appreciated by more-junior doctors, physician assistants, nurses, accident and emergency personnel and others at the 'coalface,' more senior staff quite frequently complain about 'cookbook medicine' and the demise of clinical acumen and diagnostic skills in favour of resorting to flowcharts, whether in hard copy or digital format.  An important point to note in this regard is that in this litigious era of 'defensive medicine' it is far more difficult to use 'years of personal experience' as a defence than to demonstrate that one adhered to an evidence-based guideline.  Clinical staff dealing with traumatic brain injury in children are often faced with deciding how far to go with investigation (particularly CT scanning), when to admit the patient and when to take the responsibility of discharging the child to parental or other care.  Evaluation and treatment algorithms are well developed for moderate and severe TBI but less so for mild cases, and debate continues in this area.  Algorithms have certainly been developed to assist with the major investigative step of deciding which cases are likely to benefit from a CT scan, and admission following demonstration of an intracranial haemorrhage and/or depressed skull fracture is accepted practice, but management of patients with a normal scan is less certain.  In a retrospective review of 2867 blunt trauma patients aged below 16 years at two level 1 trauma centres in the USA, all who were scanned were eligible for inclusion.  Those with haemorrhage or displaced/depressed skull fracture were excluded from further analysis.
A post-hoc assessment of the indication/necessity for CT was performed based on the previously-validated PECARN criteria, which include Glasgow Coma Scale (GCS) <15, altered mental status, palpable skull deformity, >5 sec loss of consciousness, scalp haematoma in a child <2 years or severe headache if ≥2 years, and a severe mechanism for the TBI e.g. thrown from a vehicle or pedestrian struck by a vehicle.  The majority (88%) of the 631 who were scanned met PECARN criteria; 192 were excluded for haemorrhage or displaced skull fracture, 397 had normal CTs and 42 had non-displaced skull fractures.  All patients with an initial GCS of 13-15 and no intracranial injury were neurologically normal on discharge home. Those with an initial GCS of 13-15, a normal CT and no other injuries could therefore safely be discharged home without admission, saving 1.8±1.5 hospital days per patient.  While this was a retrospective study, there are others that are prospective and show the high sensitivity of the PECARN criteria beyond that of other accepted protocols such as 'CATCH' and 'CHALICE.'
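As an illustration only, the PECARN-style screen described above can be reduced to a boolean check. The criteria are paraphrased from the text; the real PECARN rules are age-stratified and more detailed, so this sketch is not for clinical use:

```python
from dataclasses import dataclass

@dataclass
class HeadInjury:
    gcs: int                    # Glasgow Coma Scale score
    altered_mental_status: bool
    palpable_skull_deformity: bool
    loc_seconds: float          # duration of loss of consciousness
    age_years: float
    scalp_haematoma: bool
    severe_headache: bool
    severe_mechanism: bool      # e.g. thrown from vehicle, pedestrian struck

def ct_indicated(p: HeadInjury) -> bool:
    """Simplified PECARN-style screen: any positive criterion suggests CT."""
    return (
        p.gcs < 15
        or p.altered_mental_status
        or p.palpable_skull_deformity
        or p.loc_seconds > 5
        # haematoma matters if <2 years; severe headache if >=2 years
        or (p.scalp_haematoma if p.age_years < 2 else p.severe_headache)
        or p.severe_mechanism
    )

child = HeadInjury(gcs=15, altered_mental_status=False,
                   palpable_skull_deformity=False, loc_seconds=0,
                   age_years=4, scalp_haematoma=False,
                   severe_headache=False, severe_mechanism=False)
print(ct_indicated(child))  # no criteria met -> False
```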

Read more:

J Pediatr Surg 2015; 1758-61

Postgrad Med J 2015; 91: 634-8

Ann Emerg Med 2014; 64: 145-52 


2015:39 Interventions for cerebral palsy (CP) 

1539 Interventions for cerebral palsy (CP)

The socio-economic divide in South Africa is evident in many aspects of healthcare in the country, but perhaps particularly so in the management of cerebral palsy in deprived vs. affluent societies. For example, an article in the Lancet which refers to the problem in Africa and comments on a meeting in Cape Town at which 22 African nations were represented focuses on the higher CP rate on the continent vs. developed countries, the different (and more-preventable) aetiologies, and the need for interventions which, apart from primary prevention during pregnancy and infancy, are directed towards improvements in rehabilitation (physio-, occupational- and speech therapy) and the education and training of caregivers. In this context there is no mention of interventions that would be considered for children from more-privileged and affluent environments, such as selective dorsal rhizotomy (SDR), botulinum toxin (btox) injections or baclofen (intrathecal or oral).  Interestingly, some of the most contributory work in the SDR field was done by Peacock in Cape Town some 20 years ago. While one cannot ignore the desperate social, financial and educational plight of rural families affected by a child with spastic CP, one should also be aware of developments that have taken place and interventions that are likely to be considered for those with access to tertiary services.  A recent Cochrane review interrogated the evidence for intrathecal baclofen, a gamma-aminobutyric acid (GABA) agonist that acts selectively on receptors in the brain and spinal cord, inhibiting presynaptic transmitter release and postsynaptic neuronal activity by increasing potassium conductance.  Oral baclofen is effective in treating spastic CP, but the high doses required to achieve therapeutic levels in the CSF result in significant adverse effects, particularly sedation.
Intrathecal baclofen (ITB) administration has been in use for around 20 years and has resulted in a number of publications; however, the Cochrane review identified only 5 studies for systematic review, four of which involved short-term delivery (e.g. via lumbar puncture) and one an implantable pump. The conclusion was that there is a small amount of evidence that the drug effectively reduces spasticity in the short term, but the long-term impact is unclear. Retrospective analysis of experience in Japan with 181 patients treated with SDR and 131 with ITB showed that SDR impacts positively on spasticity in the long term, whereas ITB is best for severe spasticity and dystonia, and btox is considered for mild spasticity. SDR is identified as most successful for patients classified as level II or III on the GMFCS (Gross Motor Function Classification System). This is echoed in a recent review from the UK which makes a case for the "re-emergence" of SDR, a treatment originally contemplated and implemented in 1908 but which has since undergone a number of refinements.

Read more:

Arch Dis Child 2015; 100: 798-802

Cochrane Database Syst Rev 2015 doi: 10.1002/14651858.CD004522.pub2

Lancet 2015; 14: 876-7


2015:40 Visual screening and performance in children 

1540 Visual screening and performance in children

A fairly common factor in this series of summaries is the ability of researchers to study cohorts of subjects over long periods of time using large, integrated databases. In a recent article published in the British Journal of Ophthalmology, researchers from the UK analysed visual outcomes in children from Tayside, a region in the east of Scotland with an established orthoptist-delivered vision screening programme for children between 4 and 5 years of age.  This programme is superimposed on the UK's policy of screening all newborns for a red reflex and for major eye abnormalities, and the 6-8 week examination (usually by a GP) which looks for normal visual behaviour, a red reflex and eye abnormalities.  Amblyopia is regarded as a preventable cause of visual disability, and according to UK guidelines testing at 4-5 years by trained practitioners is the most cost-effective and efficient means of capturing children at risk for amblyopia and associated visual problems.  For the study, researchers utilized the community child health database for every child who underwent preschool vision screening between March 2010 and February 2011. The outcomes of the examination and all referrals were extracted from the orthoptic PSVS (Preschool Vision Screening) database and from clinical notes that provided long-term visual and refractive data.  The object of the exercise was to relate the risk of impaired vision to socioeconomic and home background, measured respectively by the SIMD (Scottish Index of Multiple Deprivation), which is based on postcodes, and the HPI (Health Plan Indicator), a code assigned by a Health Visitor to every child around the time of birth. This code is based on a comprehensive assessment of the needs of a child and family, and identifies families as requiring Core, Additional or Intensive help.  Analysis of 4365 children showed that 523 (11.9%) failed the screening test.
The odds of least-deprived children passing were 1.4 times higher than those of the most-deprived, and a child from a family coded as Intensive had a 3-fold higher risk of failing than a child from a family coded for Core care.  More children who did not appear for follow-up came from families coded as Intensive, and 59% of those who did attend were prescribed glasses. These data once again make the case for vision screening, particularly for pre-school children from deprived backgrounds. Another recent article compared children with amblyopia with those with strabismus (without amblyopia) and identified slow reading as a problem in those with amblyopia, making a further case for treatment of visual acuity impairment, particularly in monocular amblyopia. Such treatment might include speed-reading classes and providing extra time for tests and assignments.
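The odds comparisons above come from 2×2 tables of pass/fail counts. Only the ratios are summarized here, so the counts in this sketch are hypothetical, chosen merely to reproduce an odds ratio of ~1.4 and show the arithmetic:

```python
import math

def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]] with a 95% CI (Woolf's log method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical pass/fail counts for least- vs most-deprived groups:
# least-deprived: 450 pass / 50 fail; most-deprived: 400 pass / 62 fail.
or_, lo, hi = odds_ratio(450, 50, 400, 62)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```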

Read more:

Br J Ophthalmol 2015; 0: 1-5

J AAPOS 2015 http://dx.doi.org/10.1016/j.jaapos.2015.09.002

Invest Ophthalmol Vis Sci 2010; 51: 3502-8   


- Author Prof Alan Rothberg

Copyright © University of Pretoria 2024. All rights reserved.
