Debate

The development of disease-modifying therapies for Alzheimer disease (AD) is an extremely important goal that has become the focus of major public health initiatives involving governments, civil society, academia and the pharmaceutical industry. The “amyloid cascade” hypothesis of AD is one of the best understood elements of the pathophysiology of this disease, and is based on strong pathological, preclinical and genetic data, including familial and early-onset AD and the role of risk factors associated with the production and clearance of amyloid protein, such as ApoE4 genotypes and variants in APP associated with BACE cleavage.
Based on this hypothesis, amyloid-targeting therapies have been the focus of drug development for the past two decades. Several different approaches to removing amyloid, employing both active and passive immunization as well as enzymatic inhibition of the production of amyloid-beta peptides, have been tested in large-scale clinical trials. So far, all these trials have failed to produce clinically meaningful results or, in the case of gamma-secretase and BACE inhibitors, have highlighted significant safety risks.
This has led many in the field to question the validity of the amyloid hypothesis and to calls to abandon pharmaceutical research on this target. However, the other side of the story of amyloid-targeting therapies has been one of continuous improvement and learning about AD. Data from several of the recently reported trials with anti-amyloid antibodies have shown consistent trends for efficacy in earlier stages of AD, especially in patients receiving higher doses of antibody. At the same time, key advances in diagnostic criteria, including the use of high-quality molecular testing of CSF for amyloid and tau proteins and the widespread use of PET imaging for amyloid (and the development of ligands for tau), have made it operationally feasible to identify the right patients for trials and to use pharmacodynamic measures to predict therapeutic doses. Finally, our understanding of trial methodology, including the selection and behavior of clinical endpoints in these populations and the identification of predictors of progression, has improved significantly. While there are still doubts about which of the amyloid species in the cascade is the optimal target for therapy, there is well-founded hope that current trials being conducted with high-dose anti-amyloid antibodies and BACE inhibitors in prodromal and mild AD will be successful.
Nonetheless, the pathophysiology of AD is very complex and involves several other mechanisms and targets beyond amyloid (e.g. p-Tau, inflammation, ER stress, autophagy), and the future of AD treatment will likely lie in combination therapies that target amyloid together with some, if not all, of these additional mechanisms to maximize efficacy. Therefore, even if anti-amyloid therapies do not provide a cure for AD, they are very likely to be the first medicines to show disease-modifying properties and will remain one of the cornerstones of therapy in the future.

Special Issue on Controversies in Neurology. From the 10th World Congress on Controversies in Neurology (CONy), Lisbon, Portugal. 17–20 March 2016.

There is widespread recognition of the urgency to understand the causes and mechanisms of senile dementia. Attempts to find cures for Alzheimer's disease (AD) have, however, failed so far, in spite of enormous intellectual and financial investments. We therefore have to reconsider the problem from new angles. AD is regarded as a disease because of its clinical manifestations and underlying pathology. However, this combination does not define a disease but rather a syndrome, in analogy with hepatic cirrhosis, in which liver pathology causes metabolic changes but which can result from many different etiologies. It is unlikely that attacking a downstream phenomenon, like apoptosis or beta-amyloid accumulation, can cure AD or prevent the progression of the disease.
It is probable that senile dementia is the result of a combination of several processes, working differently in each person. Epidemiological studies have identified many risk factors for "senile dementia of the Alzheimer type", some genetic but most environmental and thus modifiable. Therefore, a concerted effort to fight the dementia epidemic must be made through aggressive action against its risk factors, and this battle must begin in midlife, not in old age.

The gratifying development of new drugs for the treatment of MS has greatly broadened our therapeutic armamentarium over the past 10 years. The availability of more efficacious agents has also raised the bar and prompted the definition of more ambitious treatment goals. Following an approach adopted by rheumatologists some time ago, the concept of treating to target has also been introduced in the management of MS. In the absence of curative therapies, earlier goals to reduce relapse rate and slow progression have been abandoned and redefined with the aim of silencing disease activity and halting disease progression. Proof of this comes from clinical assessment and MRI evaluation of disease activity and burden. Disease-activity-free (DAF) status was first analyzed post hoc in the AFFIRM trial of natalizumab. Freedom from disease was operationally defined as the absence of relapses, disease progression, gadolinium-enhancing T1 lesions, and new or enlarging T2 lesions. Havrdova et al. could show superiority of natalizumab over placebo in attaining disease-free status. Subsequently, completed phase 3 trials of new drugs were also analyzed to determine what is now termed NEDA, no evidence of disease activity. Clearly, this aggregate outcome provides a more comprehensive view of the efficacy of a drug and is more sensitive in registering the impact of an agent than clinical or MRI outcomes looked at in isolation. More recently, in recognition of the importance of brain volume loss as a surrogate marker of the overall pathologic process and a predictor of disability, the composite NEDA-4 has been introduced, integrating brain atrophy into the equation. There is discussion about whether it might be possible to further enlarge the concept by adding measures of cognition, a very significant domain of neurological functioning impacted by the disease process. Looking at NEDA also aids in assessing the relative efficacies of drugs in the absence of head-to-head trials.

With the advent of more efficacious MS therapies in recent years, a mere reduction of relapse rate is no longer considered a satisfactory therapeutic goal. Moreover, recent data on the MS disease course suggest that probably only relapse rates in the first few years from onset are associated with long-term disability and prognosis, underscoring the need for additional measures of disease activity and progression that could help guide treatment decisions. Therefore, the concept of “no evidence of disease activity” (NEDA), which has its roots in the field of rheumatology, where disease-free status is an accepted goal for clinical trials and patient care, was transferred to MS and first applied in 2009 to the AFFIRM trial (“disease activity free status”), one of the pivotal natalizumab trials. NEDA was defined as a composite score comprising the criteria “no EDSS worsening”, “freedom from relapses”, “no new/enlarging T2 lesions” and “no gadolinium-enhancing lesions”. According to this definition, 37% of study participants who received natalizumab in the AFFIRM trial were “disease activity free” over 2 years. Subsequent analyses for dimethyl fumarate, fingolimod, cladribine and the combination of glatiramer acetate/interferon beta-1a reported NEDA rates ranging from 28% to 44%. However, despite the beneficial promotional aspect of such a composite score for pharmaceutical companies, its clinical relevance for individual patient management and treatment decisions has not been proven. Moreover, NEDA has been criticized because it relies heavily on radiographic measures whose correlation with clinical measures is only moderate and whose relevance for long-term prognosis is equivocal. Most NEDA data are derived from very short follow-up periods and lack comparability between studies, which calls into question the long-term importance and predictive value of this measure.
Moreover, the NEDA concept disregards relevant features of disease pathology (diffuse tissue damage, grey matter atrophy, retinal atrophy, etc.) that are probably more relevant for disease prognosis and long-term disability than focal T2/gadolinium-enhancing lesions. Even more serious is the fact that the clinical meaningfulness of NEDA from the patients’ perspective is totally unclear, as it does not fully reflect clinical need: it takes no account of “intangible” but often severely debilitating symptoms such as fatigue, depression, cognitive impairment, pain and sleep disorders. Moreover, a recent study from the US has shown that NEDA is hardly an achievable goal in clinical practice, as only 7.9% of more than 200 RRMS patients retained NEDA status after 7 years of observation. Recent attempts to overcome some of this criticism by including brain atrophy measurements in the concept (NEDA-4) fall short, as standardized brain volume assessment in clinical routine is probably not feasible in the near future. In sum, NEDA is currently not relevant for therapeutic decisions.

Point of view: Yes
We cannot ignore the power of MRI today, which detects clinically relevant changes in the absence of clinical signs or symptoms. Subclinical (MRI-only) disease activity is relevant enough to be included in all current renditions of the new diagnostic criteria for MS, fulfilling all the requirements for ‘dissemination in space’ (DIS) and ‘time’ (DIT) in the absence of any clinical signs or symptoms. In fact, a recent MAGNIMS publication has proposed that the diagnostic criteria be changed even further, calling for a more simplified, less ambiguous definition of DIS; meeting DIT criteria whether or not the lesions are symptomatic; and indicating the lack of value of non-enhancing hypointense lesions on T1-weighted images (i.e. T1 black holes) in predicting conversion to clinically definite MS when added to the current DIS criteria. They go on to make further recommendations regarding the use of MRI in both CIS and RIS.
When it comes to prognosis or monitoring response to therapy, the same MAGNIMS consortium has also addressed this and clearly indicated that MRI makes an important contribution to the monitoring of treatment, and can be used to determine baseline tissue damage and detect subsequent repair. This use of MRI can help predict treatment response and assess the efficacy and safety of new therapies. They build upon the seminal study by Prosperini et al., which was used to establish the cut-off of new lesion development over a year of treatment that by itself (in the absence of any clinical signs or symptoms of relapse or progression) should prompt consideration of switching therapies in the Canadian Treatment Optimization Recommendations.
The recommendations for monitoring also require strict adherence to an informative, standardized set of MRI acquisition sequences, which offers accuracy in follow-up comparisons, allowing recommendations to be based on new lesion development while on a DMT, and which is furthermore vital for detecting relevant safety concerns such as the development of an atypical lesion that might suggest PML.
It is therefore quite clear that if we are to embark on expensive and sometimes risky medications in order to gain a foothold against MS, we must intervene and determine as quickly as possible whether a medication is futile, and switch to one that will make a difference. MRI monitoring now allows us to do this with accuracy, and in a shorter time than waiting for clinical signs or symptoms of the disease to develop.

Point of view: No
It is imperative to recognize multiple sclerosis (MS) patients at high risk of disability progression as soon as possible and offer them more potent treatment.
Conventional (enlarging T2, novel T2 and Gd-enhancing lesions) and unconventional (brain atrophy) MRI parameters are putative biomarkers of disease progression.
The data on the influence of early conventional MRI parameter worsening (without clinical progression or relapses) on early or late disability in treated MS patients are controversial and available mainly for interferon beta. Some studies showed that the development of new T2 or Gd-enhancing lesions in the first year of interferon beta treatment predicted second- and third-year disease activity or a worse late clinical outcome, but other studies were negative. In a single study from Barcelona, first-year MRI activity did not predict clinical worsening of the disease over the next two years in patients treated with glatiramer acetate.
There are many caveats that need to be considered when interpreting comparative MRI data in treated MS patients.
How many new silent T2 MRI lesions have to be present in the first year to indicate a poor early or late prognosis? One, two or three? In the interferon beta studies, different numbers of early new T2 lesions predicted a worse future outcome.
One has to be aware that the clinical effects of interferon beta and glatiramer acetate are delayed; therefore, new T2 lesions could appear before the onset of efficacy of these agents.
A substantial problem with all injectable drugs is adherence to therapy. In a large recent study from Germany, only 30–40% of more than 50,000 patients were adherent to the injectables.
Furthermore, there is also the problem of inter-rater reliability in the interpretation of paired MRI data among neuroradiologists. In a study from Cleveland, for example, between-rater variability was high for enlarging T2 lesions, intermediate for new T2 lesions, and low only for Gd-enhancing lesions.
So far there are no data indicating that patients treated with oral drugs or monoclonal antibodies have a poor prognosis with ´MRI only´ worsening.
It is even more difficult to include unconventional MRI parameters, such as brain atrophy measurements, in early therapeutic decision making. The majority of disease-modifying drugs have moderate and inconsistent effects on brain volume, which are often delayed or confounded early on (e.g. by pseudoatrophy).
Therefore, taking into account also the less favorable safety profile of more potent drugs, sound clinical reasoning is crucial in the management of an individual with MS, and escalation therapy should be given to patients with a realistic risk of a poor prognosis.

Point of view: No
Vitamin D can modulate the innate and adaptive immune responses. Existing data show that vitamin D supplementation has a major effect on MS risk.
Data from two large prospective cohorts of women (the Nurses´ Health Study, 92,253 women followed from 1980 to 2000, and the Nurses´ Health Study II, 95,310 women followed from 1991 to 2001) have shown that women who used supplemental vitamin D had a 40% lower risk of MS than women who did not use vitamin D supplements.
In a prospective cohort study of 145 participants with relapsing-remitting MS, higher 25-OH-D levels were associated with a reduced hazard of relapse. The relative risk of developing MS has also been found to be lower among women born to mothers with a high vitamin D intake during pregnancy.
Controversies exist regarding the therapeutic effect of vitamin D supplementation on the course of MS, and the existing data do not allow the conclusion that vitamin D can be regarded as a substantial disease modifier in patients with MS.
In a small prospective study, 15 MS patients were treated with vitamin D3 2.5 mcg/d for 48 weeks, and the on-study exacerbation rate was lower than at baseline. Likewise, in a randomized, double-blind, placebo-controlled trial of vitamin D3 as an add-on treatment to interferon beta-1b in patients with MS, the vitamin D group showed a significant reduction in MRI activity compared with the group treated with interferon beta-1b alone. However, in a 96-week randomized controlled trial in 68 MS patients, supplementation with 20,000 IU vitamin D3 weekly did not have a beneficial effect on relapse rate, EDSS, MSFC components or fatigue. Some clinical trials are ongoing. Large prospective trials are needed to resolve this issue.

Point of view: No
MS is mainly a silent disease in its early phases. Furthermore, most inflammatory MS disease activity detected by MRI scans is subclinical. Hence, it is not surprising that a clinically silent phase precedes overt manifestations of MS in most patients. What was needed to prove this hypothesis was: 1) unfettered access to MRI technology for large numbers of individuals with symptoms irrelevant to MS (e.g. migraine headaches and other non-MS-related symptoms), and 2) an interest in the MS community in following patients with silent MS-suggestive lesions. In recent years, both these requirements have been satisfied, and there is now robust interest in the early subclinical phase of MS within the MS community. The term most widely applied to this condition is radiologically isolated syndrome (RIS), and it is clear that it can be the precursor of relapsing-remitting MS and, in some prospectively documented cases, of primary progressive MS. Wisely, experts have cautioned against making the diagnosis confidently in individuals without symptoms of neurological disease and have advised avoiding disease-modifying treatment, recognizing the lack of specificity of MRI markers of MS. While no doubt some will “convert” to MS, it is unclear whether this is the majority. Even if the majority eventually convert, it is not justifiable to expose all patients with subclinical white matter lesions who might eventually manifest symptoms to extremely expensive, indefinitely prescribed and potentially hazardous treatments. In this debate, I will argue against the institution of treatment or the conduct of clinical trials with outcome measures that are predictable (e.g. reduced risk of “conversion to MS”) but that should not, in their own right, lead to inappropriate practice recommendations if the trials were to yield positive results.
While strong arguments can be made against instituting long-term MS disease-modifying treatment (DMT) in every patient with early demyelinating disease until it is clear that relapses occur, the strongest argument against the use of DMT in patients with RIS is that the diagnosis is uncertain. Nonspecific white matter lesions are extremely common. While the current criteria for RIS proposed by Okuda use the more rigorous Barkhof criteria rather than the more liberal Swanton criteria that have replaced them in the latest (2010) version of the McDonald criteria, even the Barkhof criteria have not been rigorously assessed in a general practice setting. The criteria were based on predicting whether dissemination in time and space would be satisfied in patients presenting with typical clinical characteristics of MS, such as optic neuritis. A far more common problem occurs in patients who have anything less than an unequivocal presentation of demyelinating disease (e.g. major visual loss and an afferent pupillary defect in a patient with optic neuritis). Many patients experience symptoms (e.g. visual migraine, paresthesias in the context of fibromyalgia) that are mistaken for symptoms of demyelinating disease. It is very common to detect nonspecific white matter lesions in the brain that are touted as radiological evidence of “dissemination in space” and hence of MS. Often, the diagnosis is not removed even when the patient sees an expert who is strongly convinced that the diagnosis was made in error, and even when the patient is on long-term immunomodulators. While many neurologists may argue that the correct diagnosis will eventually declare itself and modifications in therapy will be made, some conditions, such as fibromyalgia, lead to continuing but unchanging symptoms; definitive evidence of an alternative neurological diagnosis will never emerge. Such patients are at risk of being left on disease-modifying therapies (DMTs) for lengthy periods of time until the diagnostic error is identified, if it ever is.
As is the case for MS, but even more convincingly for RIS, the prognosis is indeterminate. Many patients with MS have lengthy periods of remission, with no attacks or punctuated by only small numbers of attacks with few sequelae. Benign MS, while only diagnosed confidently in retrospect, is nonetheless common. While it is difficult, if not impossible, to reliably assign a prognosis early, continued clinical observation and monitoring with MRIs generally permit early detection of recurrent disease activity and the institution of DMTs at a point when the diagnosis is convincing and the prognosis is clearer. The point at which therapy should be introduced is still not well defined, although there is a trend for treatment to be instituted as soon as a diagnosis of MS is satisfied using the McDonald criteria. While perhaps more aggressive than can be justified, even this approach is preferable to the routine institution of treatment before a confident diagnosis of MS is established.
Patients who are told that they have RIS are understandably concerned that they might suffer a devastating demyelinating attack from which they may not recover. However, it is possible to monitor patients with RIS and identify informative lesions. The benefits of being sure that a long-term DMT is necessary outweigh any realistic concerns about a patient experiencing a devastating clinical attack that will not respond to rescue treatment.
It is difficult to know after initiating treatment whether a patient’s course deviates from what might be expected for that patient, given the variability in the natural course of MS and the incompleteness and variability of outcomes of patients on virtually all DMTs for MS. It is easy to attribute unwarranted success to a drug, especially without a period of prior observation to gauge the natural course of disease in a given patient. Furthermore, there are no guidelines for stopping DMTs, so at present, committing a patient to a course of therapy is for an indefinite period, a further argument in favor of being certain about the indications for treatment.

Point of view: No
The answer to this dilemma depends on two fundamental issues: whether it is the active or non-active form of secondary progressive (SP) MS, and what type of disease-modifying therapy was introduced in the early stages of the disease.
Active SPMS is defined as the form with relapses and the presence of MRI activity in the CNS.
In SPMS, IFN beta-1b and subcutaneous IFN beta-1a are approved by the European Medicines Agency (EMA), and mitoxantrone by the FDA and EMA. In the European SPMS study, significant positive effects on disease progression, the number of attacks and several MRI parameters were observed in patients treated with IFN beta-1b. In contrast, in the North American SPMS study with IFN beta-1b, no treatment benefit was seen on the time to confirmed progression of disability or on relapse- and MRI-related outcomes. A comparison of the two studies revealed that the European trial, with its positive results, included patients with more active MS than the North American trial.
A subgroup analysis in the SPECTRIMS study (with subcutaneous IFN beta-1a in SPMS) showed that patients who still had attacks benefited from the medication in terms of disease progression.
So the existing data indicate that only patients with the active form of SPMS should continue the described immunomodulatory treatment.
In the MIMS study, patients with worsening RRMS or SPMS were assigned to placebo or mitoxantrone. At 24 months, the mitoxantrone group experienced a benefit compared with the placebo group in disability progression. The characteristics of the patients included in the MIMS trial, however, do not allow conclusions to be drawn regarding the efficacy of mitoxantrone in SPMS patients without relapses.

Point of view: Pro Anti B
More than a decade ago, the dominant theories on the etiology and pathogenesis of MS revolved around a prime role of T cells that would recognize target autoantigens in the brain and orchestrate inflammatory insults on CNS parenchyma. Subsequently, numerous lines of research advanced robust evidence for a role of humoral autoimmunity and B lymphocytes in driving or contributing to the disease process. These included histopathological analyses of MS brains detecting immunoglobulin and complement deposits as well as B cells in lesions, and the retrieval of myelin-reactive antibodies and B cells from blood and CSF. The most convincing evidence came from therapeutic studies with the B-cell-depleting monoclonal anti-CD20 antibody rituximab in RRMS. This highly effective monoclonal antibody induced early suppression of inflammatory disease activity, a temporal profile suggesting an action on B-cell function as antigen presenters and instructors of T cells rather than a modifying effect on autoantibody production. The clinical development of the humanized anti-CD20 antibody ocrelizumab, culminating in the two OPERA studies in RRMS completed last year, replicated the marked therapeutic effects achieved with rituximab in the earlier phase 2 trial. Very interestingly, ocrelizumab also appeared effective in a recently completed phase 3 trial in PPMS. These results raise a number of questions as to the role of B cells in the pathogenesis of this disease type.

Point of view: Pro Anti B+T
Multiple sclerosis (MS) is a multifactorial disease, with complex aetiology driven by genetics and the environment. Pathogenesis is clearly immune-mediated, as had been suspected for a long time. In this respect, findings from genome-wide association studies (GWAS; Sawcer et al. 2011; Nature 476:214), the pathology of MS brain lesions (Kutzelnigg and Lassmann 2014; Handb Clin Neurol 122:15), animal model studies (Constantinescu et al., 2011; Br J Pharmacol 164:1079), and the response to immunotherapies (Nylander and Hafler 2012; J Clin Invest 122:1180) have confirmed that the immune system is the effector of central nervous system (CNS) tissue damage. In progressive disease, it is likely that neurodegenerative mechanisms are also important.
I will argue that MS is driven by potent T-cell and B-cell cooperative mechanisms.
Activated T cells are thought to be essential. They enter the CNS through its microvasculature, forming characteristic perivascular cuffs, and are re-activated by CNS-resident antigen-presenting cells to trigger an inflammatory and autoreactive cascade (Hohlfeld et al 2016; Lancet Neurol 15: 198). Such mechanisms have been interfered with, most specifically and successfully, by designing natalizumab as an effective anti-integrin Mab that blocks T-cell entry into the CNS (Yednock et al 1992; Nature 356:6; Polman et al 2006; N Engl J Med 354:899). CD4+ T cells, crucial coordinators of the adaptive immune response, can activate or inhibit other immune cells as they respond to foreign or self antigens in physiological or pathological responses. CD8+ T cells are also well represented in MS lesions and could potentially target oligodendrocytes and neurons directly, through HLA class I-restricted antigen recognition (Friese et al 2008; Nat Med 14:1227).
B cells are involved in an immune signature of MS found in >90% of patients, namely cerebrospinal fluid oligoclonal IgG bands (OCB; Housley et al 2015; Clin Immunol 161:51). Although the specificity of this oligoclonal response remains unclear, we know from its molecular features that it is antigen-driven, quite possibly by one or more viruses, such as Epstein-Barr virus (EBV, HHV-4). The importance of B cells is also emphasised by their being targeted for infection by EBV, a virtually essential factor in susceptibility to MS (Pakpoor et al 2013; Mult Scler 19:162). Production of antibodies by B-cell-derived plasma cells is also known to promote more efficient myelin damage by monocyte-derived CNS-infiltrating macrophages, chemo-attracted by activated T cells (Hohlfeld et al 2016; Lancet Neurol 15:317).
In addition to their productive interactions in peripheral lymphoid organs and the “classic” perivascular cuffs that characterise MS lesions (immune pathogenesis “from the inside”), T and B cells also interact in the subarachnoid spaces adjacent to the pial surface of the cerebral cortex, most likely leading to different types of cortical lesions “from the outside” (Calabrese et al 2015; Nat Rev Neurosci 16:147). This is most evident in, but not necessarily limited to, the tertiary lymphoid follicles observed in advanced progressive cases (Pikor et al 2016; Front Immunol 6:657). Crucially, not only GWAS but also epigenetic studies indicate unequivocally that MS susceptibility genes are most highly expressed in both T cells and B cells (Farh et al., 2015; Nature 518:337).
The excitement for the success of B-cell targeting therapies that started with rituximab (Hauser et al 2008; N Engl J Med 358:676) and led to ocrelizumab (Sorensen and Blinkenberg 2016; Ther Adv Neurol Disord 9:44) and other potential Mabs, should not hinder our efforts to control disease by thoughtful, pathogenesis-driven approaches. We know that the effects of anti-B cell treatments are too fast to be mediated by antibody production and that plasma cells are not even depleted by such antibodies. It is likely that antigen presentation by B cells to T cells is inhibited instead, as are other pro-inflammatory, antibody-independent B-cell functions. We should also keep in mind that drugs that we consider as mainly targeting B cells are in fact also affecting T cells – consider for example the depletion of CD20dim T cells by rituximab (Palanichamy et al 2014; J Immunol 193:580). Conversely, treatments that we consider as aimed at T cells, also have effects on B cells. For example, natalizumab affects the levels of circulating B cells with different naive/memory profiles depending on its effects on the mobilization of hematopoietic stem cells (Mattoscio et al 2015; Neurology 84:1473). In addition, fingolimod promotes a regulatory phenotype and function of B cells (Gruetzke et al 2015; Ann Clin Transl Neurol 2: 119) and reduces the repertoire diversity of newly produced T as well as B cells (Chiarini et al 2015; Mult Scler 21:726). The most effective disease-modifying treatments, either licensed for use (alemtuzumab) or experimental (hematopoietic stem cell transplantation), potently deplete both T cells and B cells, the latter recovering more quickly in subsequent months (Jones and Coles 2014; Exp Neurol 262:37; Sullivan et al 2010; Biol Blood Marrow Transplant 16: S48; Mancardi et al 2015; Neurology 84:981).
In conclusion, on the basis of the above-mentioned observations, it would be unwise to target B cells exclusively in MS. In fact, we are not even able to do so – and until more crucial disease mechanisms are clarified, we probably should not.

Special Issue on Controversies in Neurology. From the 10th World Congress on Controversies in Neurology (CONy), Lisbon, Portugal. 17–20 March 2016.

Point of view: No
The pathological hallmark of all multiple sclerosis (MS) subtypes is focal areas of demyelination (plaques) in the central nervous system (CNS), with surrounding inflammation and neurodegeneration. Demyelination leads to decreased nerve conduction velocity and predisposes axons to neurodegeneration through loss of physical and metabolic support. In fact, axonal loss can be detected at the earliest stages of MS. In addition, accelerated rates of brain atrophy on magnetic resonance imaging (MRI) are evident at all stages of MS and in all clinical phenotypes.
The pathological substrate of neurological disability is likely the damaged myelin sheath and, to a larger extent, the loss of intact axons. Damage to axons in the CNS may be acute or subacute and is very likely irreversible. Thus, remyelinating strategies will have very limited efficacy.

Point of view: No
The precise cause of multiple sclerosis (MS) is still unknown and the disease pathology is complex. Despite the long-held assumption of a primary myelin-specific autoimmune process in MS, no myelin-specific autoimmune reaction has so far been identified. Demyelination is one of the key pathological changes in MS, but not the only one. Disability progression in MS is the cumulative effect of a chronic and diffuse neurodegenerative process in the grey and white matter of the brain and spinal cord. This is clearly shown in primary progressive multiple sclerosis (PPMS), where the burden of cerebral demyelinating lesions is relatively low. Multifocal demyelination in the brain is considered to be the hallmark of relapsing-remitting MS, but functional improvement after a clinical demyelinating episode occurs spontaneously over time, even in patients with large (tumefactive) cerebral demyelination.
Spontaneous remyelination in MS lesions is either restricted to the lesion edge or extends through the entire lesion area (shadow plaques); these remyelinated plaques may become the future target of new demyelinating events. Shadow plaques are commonly observed in the brain and spinal cord of MS patients, and in one post-mortem study, white matter lesions were remyelinated in nearly half of all cases (47%) on average, with 22% of them found to be fully remyelinated. Even during plaque development, remyelination may occur very rapidly, and ongoing myelin breakdown may coexist with areas of remyelination. There is no evidence that the number of shadow plaques or early remyelination correlates with better functional preservation in patients with any form of MS.
Several attempts have been made over the years to promote remyelination in MS. However, the concept that remyelination would prevent axonal loss and neuronal degeneration in MS, leading to long-term improvement, has not been established in clinical studies. In animal models of EAE, naturally occurring autoantibodies and intravenous immunoglobulins (IVIg) have been shown to successfully induce remyelination; however, human IVIg is not effective as a treatment in MS. Several remyelination pathways have recently become targets of new drug development in MS, including those of LINGO-1, hyaluronan, Notch-1 and the retinoid X receptor; targets also include pathways involving chemokine receptor type 4 and G protein-coupled receptor 17. There are also a number of existing (“re-purposed”) drugs claimed to enhance remyelination, such as benztropine, clemastine, quetiapine, olesoxime, and ibudilast. However, there is not enough evidence that any of these drugs is effective in promoting remyelination beyond the level that occurs naturally in MS.
Therapeutic attempts at remyelination in MS are at best likely to be partial or incomplete; remyelination alone would not prevent recurrent myelin injury or restore axonal integrity that has already been lost or irreparably damaged. If one takes the view that MS is primarily a neurodegenerative disease with secondary inflammation leading to focal perivenous demyelination in metabolically vulnerable areas of the brain and spinal cord, then it is even more difficult to expect meaningful and long-term clinical improvement from therapeutic remyelination in MS. Cortical atrophy occurs before substantial white matter demyelination in MS and predicts future disease progression. A characteristic feature of MS is pathology involving normal-appearing cerebral white and grey matter, which inexorably increases in severity with disease and disability progression. It seems unlikely that remyelination will reverse these changes and reduce long-term disability progression. The likely physiological benefit of remyelination would be improved nerve conduction and reduced ephaptic transmission rather than protection of cerebral grey or white matter from progressive MS pathology.
The recently reported clinical trial of an anti-LINGO-1 antibody in acute optic neuritis showed electrophysiological improvement in about 40% of patients without clinical benefit; it is not known whether some of the patients enrolled in this trial might have progressed to relapsing-remitting MS. LINGO-1 and its signalling partner proteins and pathways have been implicated in several neuropsychiatric disorders, and antagonists of LINGO-1 may clinically benefit MS patients, possibly for reasons other than remyelination.
The appropriate imaging marker of remyelinating lesions in MS is not yet established and its correlation with conventional MRI parameters of disease progression (brain volume loss and spinal cord atrophy) is still completely unknown. Until there are robust clinical trial data supported by imaging markers to confirm sustained functional recovery and prevention of disease progression in MS, the expectation of long term clinical improvement from remyelination therapy would be purely speculative.

Point of view: Yes
Neuromyelitis optica (NMO) and its spectrum (NMOSD) are rare antibody-mediated autoimmune conditions of the central nervous system (CNS) characterized by recurrent aggressive inflammatory attacks causing demyelination and axonal loss mainly in the spinal cord, optic nerves and other brain areas rich in the water channel aquaporin-4 (AQP4). Recovery from attacks is only partial, and the devastating nature of the disease may result in a high mortality rate or in permanent and severe loss of function such as paralysis, blindness and respiratory failure. In contrast to multiple sclerosis (MS), where disability is dissociated from relapses and accumulates mainly in the later progressive phase of the disease, disability in NMO arises solely from cumulative relapse-related injury. Therefore, effective treatment to prevent relapses and associated disability should be initiated as early as possible.
Most patients, especially those with the relapsing phenotype, are seropositive for a pathogenic autoantibody directed against AQP4 on astrocyte foot processes, also known as NMO-IgG. This antibody is highly specific and helps distinguish NMO from MS, in which attacks involving the optic nerves and spinal cord are also common.
Immunosuppressive therapies in other rare antibody-mediated autoimmune conditions such as myasthenia gravis and the autoimmune encephalopathies are well-accepted as beneficial although their use is mainly empirical. Standard therapies for these conditions include corticosteroids (CS), plasma exchange (PE), azathioprine (AZA), mycophenolate mofetil (MMF), mitoxantrone or methotrexate (MTX), and their safety profiles are well-established. It seems unnecessary to reinvent the wheel by performing randomized controlled trials each time a new auto-antibody is identified.
Although all therapeutic studies in NMO to date have been either small, retrospective case series or uncontrolled prospective studies, they have consistently supported a beneficial response of NMO to immunosuppressants, including B-cell-targeted therapies, and have established their use as the standard of care in NMO. This is reflected in European guidelines and in a consensus document produced by an international group of NMO experts, which support the initiation of preventive treatment with immunosuppressive drugs as soon as the diagnosis of NMO is made. The European guidelines recommend first-line therapy with azathioprine in combination with prednisolone, or rituximab for B-cell depletion, and second-line therapy with cyclophosphamide, mitoxantrone or MMF and potentially with MTX or IVIg, with optional PE for treatment escalation. The international group of NMO experts concludes that six treatments appear likely to be effective in preventing attacks and stabilizing disability in NMO patients, and that the currently available studies provide limited but helpful insight into treatment effect and tolerability. The expert panel provides recommendations on doses and regimens for four first-line treatments (azathioprine, MMF, rituximab, or prednisone) and two second-line treatments (MTX and mitoxantrone), as well as guidelines for monitoring and for considering treatment changes. Indeed, several studies have shown that relapses respond to CS and PE, that earlier diagnosis and treatment of NMO after the discovery of NMO antibodies has reduced the mortality rate over the past 20 years, that relapses recover better on immunosuppressive therapy, and that treatment with AZA+CS or rituximab is associated with a longer time to the next attack. Therefore, a widely accepted standard of care exists for NMO.
Still, prospective trials in treatment-naive patients are required to corroborate the efficacy suggested by nonrandomized studies, to compare the effectiveness of the various regimens, and to determine the optimal first-line treatment. New potential therapies for NMO also need to be tested in well-designed controlled trials, which may require the participation of many centers, and the issue of comparing them to placebo or to one of the recommended first-line therapies is valid for ethical, scientific and clinical reasons.
The rarity of the disease, the severity and lack of reversibility of the relapses, and the early morbidity and mortality in untreated NMOSD suggest that placebo-controlled trials may be unethical. The rationale for comparator rather than placebo-controlled trials extends beyond the ethical problem of preventing patients from being treated with recommended therapies, thus exposing them to unnecessary risks of relapses and irreversible neurological damage: the key question of whether new agents are superior to the current standard of care will remain unanswered in placebo-controlled trials; investigators may be under subtle pressure to recruit for a placebo study against their expert opinion and the expected recommendation to start treatment immediately; recruitment may be slow and insufficient, as both clinicians and patients will be reluctant to delay treatment initiation or to take patients off their treatment before enrolling them into the study; and there are likely to be selection biases favoring milder patients or those who are unresponsive to standard therapies. Moreover, after study completion, the “better than placebo” and more expensive drug with a higher class of evidence but limited safety data may become preferable over currently available effective treatments.
The risk of placebo-controlled trials in NMO can be reduced by shortening the comparative phase and by allowing an immediate switch to the active drug after the occurrence of a relapse. However, this may lead to earlier escape from the randomized placebo phase of the study and to insufficient comparative safety data on the new treatment. The claim that the total number of relapses required to show a statistically significant difference between the study drug and placebo is smaller, and that such a trial is thus more ethical than one comparing the new agent to current treatment, may also be misleading: this holds only if current therapies lack evidence of efficacy (thus providing the grounds for “clinical equipoise”), a statement that constitutes the main argument for placebo trials in NMO. Even if this is true and less harm may be caused to the group as a whole, the risk for the individual patient is greater, both in terms of the risk of experiencing a relapse and in terms of relapse severity, as relapses have been shown to be milder and to recover better under immunosuppressive treatment.
Although there may be a case for clinical equipoise in NMO and a need for treatments with a higher level of evidence and a better benefit/risk ratio, clinical trials in NMO should adopt a design other than a placebo-only control design, which is too risky and unethical.

Point of view: No
Many considerations are in play when exploring the most appropriate design for a randomized, controlled study in NMO, a design that balances the need for scientific rigor with the safety of participating patients. The two most significant considerations related to placebo-controlled design are those related to the ethics of clinical trials and those related to what constitutes “standard of care.” These two issues, ethics and standard of care, are closely related to each other in the context of assessing the appropriateness of clinical trial design.
NMO is a serious disease, and attacks may lead to devastating neurological consequences. There have been no randomized, controlled trials in this disease to direct decision-making; physicians committed to treating patients have adopted immunosuppressive medications approved for use in other autoimmune diseases. The fundamental question is whether the use of a number of available immunosuppressive medications, unapproved for NMO/NMOSD, constitutes a “standard of care” and can be used as an active control and/or background therapy in randomized, controlled trials for new treatments for NMO/NMOSD.
The WMA Declaration of Helsinki #33, October 2013, states: “Use of placebo is appropriate where no proven intervention exists and where for compelling and scientifically sound methodological reasons, the use of any intervention less effective than the best proven one is necessary to determine the efficacy and safety of an intervention.”
In a systematic review, 77 of 2,438 citations on NMO/NMOSD met the inclusion criteria as primary studies. Of those, 49 were studies of maintenance therapy to prevent NMO/NMOSD relapses. The systematic review demonstrated that ALL studies assessing current unproven treatments to prevent NMO/NMOSD attacks, including all published rituximab, azathioprine (AZA), and mycophenolate mofetil studies, are Class IV studies (the lowest class). This means that all published studies were uncontrolled, small, retrospective observational studies.
Also, benefit/risk assessment for NMO/NMOSD maintenance therapies cannot be determined based on the published studies because of the minimal reporting of safety evaluations. Therefore, based on this systematic review, all current unproven treatments used to prevent NMO/NMOSD relapses meet Level “U” of the AAN treatment guidelines and do not meet the criteria for establishing clinical guidelines, and probably cannot be referred to as “standard of care”.
It is clear that the lack of evidence for treatments in NMO/NMOSD is a significant factor in establishing the ethical basis for a placebo-controlled study. A low level of evidence and the lack of a “standard of care” are grounds for reasonable professional disagreement among physicians as to the appropriate clinical approach to this disease. This state of disagreement, termed “clinical equipoise,” provides an ethical argument that aims at reconciling the overall needs and interests of the NMO community with the duties and rights of physicians and patients. When equipoise exists, it is considered ethical to offer physicians and patients the option to participate in a placebo-controlled study and, at the same time, the option not to participate in such a study.
To enhance open discussion and the exchange of ideas on the ethical grounds for a placebo-controlled study in NMO, a sponsored public open symposium was held at the European and American Committee on Treatment and Research in Multiple Sclerosis (ECTRIMS/ACTRIMS) meeting in Boston on 11 September 2014. This open forum provided a stage for the spectrum of opinions on the ethical grounds for a placebo-controlled design, presented by a range of established experts in NMO and ethics. The symposium highlighted the current state of disagreement among the main stakeholders in this important dilemma, and concluded with the recognition that the current state of clinical equipoise gives ethical legitimacy to a placebo-controlled design for those physicians and patients who decide to participate in such a study. The European Medicines Agency (EMA) conducted an NMO workshop in October 2014 in London, which included the participation of NMO expert physicians, ethicists, pharmaceutical representatives, European regulators and patient advocacy group representatives. In this workshop, Dr. Simon Woods, co-director of the Policy, Ethics and Life Sciences Research Institute at Newcastle University, summarized what would be required for a placebo-controlled design to be ethically justified: clinical equipoise in place, the least possible exposure to placebo, cross-over/open-label extension, an appropriate form of ethical review, and consultation with patient groups. Dr. Woods concluded that clinical equipoise is indeed in place in the case of NMO, and that public forums like the EMA NMO workshop and the ethics symposium in Boston are an appropriate form of ethical review.
The use of placebo-controlled trials in NMO and its inherent ethical controversies have led to several recent publications by NMO experts and ethicists analyzing the acceptability of such a trial design in this disease (Cree B, 2015; Rhodes R, 2015; Greenberg B, 2015; Weinshenker B, 2015; Palace J, 2015; Levy M, 2015). The authors also analyzed the ethical concerns regarding the harm, benefit and justice of a placebo design for NMO patients as individuals and as the population living with the disease, and its implications for treating physicians. Rhodes and Cree concluded that ultimately the choice to participate in a placebo trial comes down to the patient and physician after full informed consent. Rhodes concluded that reluctance to undertake these types of studies benefits neither the potential subjects of the study nor the patients living with NMO outside the study, and creates a barrier to the advancement of NMO research.
A placebo-controlled study in NMO should be designed with the aim of striking a balance between patient safety and clinical/scientific integrity. Specific measures should be implemented in the study design to mitigate the concerns related to a placebo-controlled study, such as short exposure to placebo, unequal randomization, immediate access to rescue therapy, and the occurrence of a single relapse as the primary endpoint.
In summary, a placebo-controlled study in NMO is ethical and meets all GCP guidelines. This type of study should be offered to patients and physicians; although not all physicians and patients will choose to participate in such a study, they should have the opportunity to make this decision.

Point of view: No
Neuromyelitis optica (NMO) is a demyelinating inflammatory disorder of the central nervous system (CNS) that is clinically and pathologically defined by the co-occurrence of optic neuritis and myelitis. Aquaporin-4 (AQP4) has been considered a potential autoantigen in patients with NMO since an autoantibody, designated NMO-IgG, that binds to human AQP4 was detected in the serum of the vast majority of patients with NMO. The presence of NMO-IgG has led many neurologists and neuroimmunologists to believe that NMO may be a primarily B cell-mediated disease.
However, there is evidence for a cellular immune response in NMO during disease initiation or perpetuation. HLA haplotype analyses of patients with NMO suggest a positive association with HLA-DRB1*03:01 (DR17), a gene that codes for a major histocompatibility complex (MHC) class II molecule that presents linear antigens to CD4+ T cells. Also, NMO-IgG is undetectable in a substantial number of patients with NMO. An NMO-IgG antibody isotype switch from IgM to IgG could not occur without the involvement of CD4+ T cells, which are abundantly present in NMO lesions. B cell-depleting therapies are not consistently beneficial in patients with NMO. Finally, transfer of AQP4-reactive T cells into wild-type mice and rats results in neurological deficits and CNS inflammation.

Point of view: Steroids
Chronic inflammatory demyelinating polyradiculoneuropathy (CIDP) is a chronic and often disabling neuropathy that often responds to immune therapies, including corticosteroids, plasma exchange and high-dose intravenous immunoglobulin (IVIg), as summarized in recent Cochrane reviews. It is, however, difficult to decide which therapy should be used first in CIDP. This decision should consider the short-term and long-term efficacy, the cost and the side effects of each of these therapies. A few randomized trials have shown comparable short-term efficacy of IVIg and oral corticosteroids, and of IVIg and plasma exchange, in CIDP. Plasma exchange is, however, considered less suitable for long-term treatment than IVIg and is often reserved for patients with an insufficient response to IVIg or corticosteroids. A randomized trial comparing the six-month efficacy of monthly therapy with IVIg or intravenous methylprednisolone (IVMP) showed that IVIg was more frequently effective and tolerated (87.5%) than corticosteroids (47.6%) during the first six months of treatment. When effective, however, corticosteroids were less frequently associated with deterioration than IVIg in the six months following therapy discontinuation. This was confirmed in the follow-up extension of the study, which showed that the median time to deterioration was significantly longer after discontinuing IVMP (14 months) than IVIg (4.5 months). A similar proportion of patients, however, eventually deteriorated over the 42 months of median follow-up after discontinuing IVIg (87%) or IVMP (79%), confirming that neither of these therapies ultimately cured CIDP. A similar discrepancy in the prolonged efficacy of steroids and IVIg can be derived from previous studies individually assessing the frequency of deterioration after IVIg or corticosteroid discontinuation. In a study comparing six-month treatment with IVIg and placebo, discontinuation of IVIg was followed by deterioration in 45% of patients after 24 weeks (5.6 months).
Similar data were obtained in a study comparing the efficacy of interferon-beta and placebo in preventing disease progression after IVIg suspension: clinical deterioration was observed within 16 weeks (3.7 months) in 48% of the patients suspending IVIg in the placebo group. On the other hand, the extension of the PREDICT study, which compared the efficacy of six-month therapy with pulsed oral dexamethasone and daily oral prednisolone in CIDP, showed that the median time to relapse after therapy discontinuation was 11 months for oral prednisolone and 17.5 months for pulsed oral dexamethasone, similar to the 14 months in our study. A similarly prolonged efficacy of steroids compared with IVIg can also be inferred from a five-year follow-up study of 70 patients with CIDP, which showed that the possibility of stopping treatment tended to be more frequent in patients who responded to steroids than in those who responded to IVIg. The long-term efficacy of continuous treatment with steroids was also shown in an uncontrolled retrospective study of 20 patients with CIDP, 15 of whom were continuously treated for 5 years with monthly high-dose intravenous methylprednisolone, irrespective of the phase of reactivity or remission of the disease. The improvement in these patients was maintained for up to 5 years and, in those treated further, up to 10 years. Regarding the safety of therapy, no significant difference in the proportion of patients experiencing adverse events was observed after six-month therapy with IVIg or IVMP, even if it is not possible to exclude that differences might emerge with more prolonged use of these therapies. In addition, it was recently confirmed that the annual cost per patient was considerably higher for patients with CIDP treated with IVIg (£49,000) than with other therapies (£9,400). In conclusion, even if IVIg is more frequently effective and possibly safer than steroids as initial treatment in CIDP, steroids have a more prolonged efficacy and a consistently lower cost.

Point of view: IVIg
Chronic inflammatory demyelinating polyneuropathy (CIDP) is the most common chronic, acquired immune-mediated disease of the peripheral nervous system, with a prevalence of up to 9/100,000. Because it is a treatable form of chronic polyneuropathy, prompt recognition is needed to improve outcome. In CIDP, motor and sensory signs develop slowly, over months, with a minimum of 2 months from onset. Patients with typical CIDP present with symmetrical proximal and distal weakness, diminished or absent tendon reflexes, symmetrical distal sensory deficits, elevated cerebrospinal fluid (CSF) protein, demyelination with conduction block on electrophysiology, and histological signs of demyelination. Disease variants characterized by asymmetry, prominent motor or sensory deficits and multifocal symptoms are increasingly recognized and may generate diagnostic challenges. About 10-15% of CIDP patients may present acutely, resembling Guillain-Barré syndrome (GBS), and require early recognition because therapeutic strategies in this subset may differ from the outset.
Both cellular and humoral factors have been implicated in the immunopathogenesis of CIDP. T cells, activated macrophages, cytokines, costimulatory molecules and antibodies operate in concert with each other. The increasingly recognized concomitant axonal loss secondary to primary demyelination remains an important factor in therapeutic approaches. Up to 10% of CIDP patients have IgG4 antibodies to the paranodal proteins neurofascin-155 and contactin-1 (CNTN1)/Caspr1, and may represent a distinct subset relevant to therapy because many of them respond suboptimally to IVIg.
Controlled trials have demonstrated the efficacy of IVIg, plasmapheresis and corticosteroids in most patients. Results and lessons learnt from the largest study ever conducted with IVIg (the ICE trial), after which IVIg gained FDA approval, suggest that early treatment with IVIg is effective and well tolerated and prevents axonal degeneration; it is therefore preferable to corticosteroids or plasmapheresis. Controlled trials with beta-interferon and methotrexate have failed; other immunosuppressants, such as cyclosporin, mycophenolate, rituximab, or cyclophosphamide, may occasionally offer some benefit, but controlled studies have not been performed. At least 15% of CIDP patients who initially respond to IVIg continue receiving it for long periods and seem to be “over-treated” because their disease has been in remission or has burnt out; periodic checks to assess whether further treatment with IVIg is needed are therefore essential. Up to 30% of CIDP patients do not adequately respond to available therapies, and new therapeutic strategies are being explored in ongoing clinical trials.

Point of view: Yes
Carbamazepine was synthesised by Schindler at Geigy in 1953, when the company was investigating tricyclic analogues of the recently introduced chlorpromazine. It was first licensed for the treatment of epilepsy in 1965. Carbamazepine is no more effective for focal and generalised tonic-clonic seizures than any of its appropriate competitors. Common side effects include dizziness, fatigue, diplopia, nausea, vomiting, drowsiness and ataxia. It produces dose-dependent hyponatraemia, which can be a particular problem for the elderly and for patients taking diuretics and other drugs known to reduce serum Na+. Life-threatening idiosyncratic adverse reactions, such as aplastic anaemia, hepatotoxicity and Stevens-Johnson syndrome, can be an unusual complication of its introduction. Serious drug rashes are substantially more common in Asian patients possessing the HLA-B*1502 allele and in Europeans possessing the HLA-A*3101 allele. Carbamazepine is also a dose-dependent teratogen.
The main reason, however, that carbamazepine is now an outmoded drug for first-line use is the increasing appreciation of the adverse effects of its enzyme-inducing properties on exogenous and, particularly, endogenous substrates. It increases the synthesis of a wide range of oxidative and conjugating metabolic enzymes in the liver and throughout the body. Most therapeutic drugs are substrates for these pathways, resulting in a reduction in their elimination half-life and bioavailability by around 50-66%. This substantially reduces their efficacy unless the dose is appropriately increased, which can have major cost implications. Potential problems include unwanted pregnancy in patients taking oral contraceptives, increased cancer mortality, progression of AIDS, transplant rejection, uncontrolled hypertension, breakthrough pain, etc. Doses of many other AEDs that are metabolised in the liver also need to be increased when they are combined with carbamazepine. Withdrawal of carbamazepine, in addition, can more than double the circulating levels of all of these medicines, leading to unexpected toxicity if their doses are not appropriately reduced. This too can be a problematic process.
More recently there has been increasing awareness of the effects of long-term enzyme induction with carbamazepine on vitamin D, resulting in reduced bone density with subsequent osteoporosis and an increased propensity for fractures. The drug can also produce sexual dysfunction in both men and women by inducing the breakdown of a range of sex hormones. Lastly, and arguably most worrying, treatment with carbamazepine increases cholesterol levels, leading to higher risks of myocardial infarction and stroke.
Enzyme induction with carbamazepine continues for as long as the patient takes the drug. No one can predict what health problems lie in wait down life’s journey. Managing concomitant treatment in enzyme-induced patients can be difficult, and withdrawing the drug can have disastrous consequences, since many doctors across a range of clinical disciplines will be unaware of its pitfalls. In conclusion, enzyme induction with carbamazepine can subtly and unpredictably complicate the lives of people with epilepsy. I would not take the drug myself, nor would I prescribe it for a member of my family, so why would I offer carbamazepine to my epilepsy patients when a number of other equally effective, safer and more user-friendly alternatives are available?

Special Issue on Controversies in Neurology. From the 10th World Congress on Controversies in Neurology (CONy), Lisbon, Portugal. 17–20 March 2016.

Point of view: No
Carbamazepine (CBZ) is a major first-line AED for partial seizures and generalized tonic-clonic seizures. It is a tricyclic compound discovered by the chemist Walter Schindler at J.R. Geigy AG, Switzerland, in 1953 during a search for more active psychotropic drugs. It was first marketed as a drug to treat epilepsy in Switzerland in 1963, and its use for trigeminal neuralgia was introduced at the same time. In the USA, CBZ was approved in 1968, initially for the treatment of trigeminal neuralgia; later, in 1974, it was approved for partial seizures. CBZ's main mode of action is to block sodium channels during rapid, repetitive, sustained neuronal firing and to prevent post-tetanic potentiation.
It is available in immediate-release, extended-release, and syrup formulations.
CBZ is one of the most widely used AEDs in the world. It is highly effective for partial-onset seizures in both adults and children, and has also demonstrated good efficacy in the treatment of generalized tonic-clonic seizures.
Although it is considered a first-generation drug, studies have failed to show any superior efficacy of second-generation drugs over CBZ. Adverse effects were more common with CBZ in some studies; however, the majority of these used immediate-release formulations, which cause more unwanted effects than extended-release formulations. CBZ remains among the first-line recommended drugs for partial and generalized tonic-clonic seizures, even in recently updated guidelines.
Its psychotropic profile is also very important, as many newer drugs such as VGB, TPM and LEV have serious behavioral and cognitive side effects that may complicate treatment in patients with epilepsy, who very frequently suffer from various psychological problems.
The therapeutic range is considered to be 4-12 µg/ml, and serum level monitoring is available almost everywhere, which eases drug management.
The drug is highly effective and well tolerated. Its major disadvantages are transient dose-related adverse effects at initiation of therapy and occasional toxicity. Potential dose-related adverse effects include dizziness, diplopia, nausea, ataxia, and blurred vision. Rare idiosyncratic adverse effects include aplastic anemia, agranulocytosis, thrombocytopenia, and Stevens-Johnson syndrome. Asymptomatic elevation of liver enzymes is observed commonly during the course of therapy, in 5-10% of patients; rarely, severe hepatotoxic effects can occur. Nevertheless, long-term usage has yielded accumulated information about the short- and long-term side effects, both dose-related and idiosyncratic, which helps the physician warn and monitor patients. Moreover, recent pharmacogenomic studies have linked serious hypersensitivity reactions to HLA-B*15:02 in Han Chinese and South Asian populations, so specific testing can improve the safety of the drug in these groups. On the other hand, all new drugs have unwanted side effects, e.g. renal stones with TPM and ZNS, retinal problems with VGB, hypersensitivity reactions with LTG, and psychiatric disorders with LEV. Third-generation drugs are so recent that we have to be very careful about unexpected events, such as the skin discoloration and retinal changes that occurred with retigabine and limited its usage.
The reported teratogenic risk differs among the pregnancy registries, between 2% and 6%; dosages below 400 mg/day seem relatively safe, but coadministration with PB or VPA increases the risk of malformations.
In addition, the newer drugs can be significantly more expensive, and there is no clear evidence that they are more cost-effective.
Finally, because of its wide availability, relatively low cost and proven efficacy, CBZ continues to be widely used for epilepsy.


Point of view: Yes
Idiopathic generalized epilepsies (IGEs) are genetically determined forms of epilepsy. They are age specific and can start in infancy, childhood, adolescence, and even in adulthood. The age-related IGEs are usually lifelong and comprise about one third of all epilepsies. According to the ILAE, the major types are benign myoclonic epilepsy in infancy (BMEI), generalized epilepsies with febrile seizures plus (GEFS+), epilepsy with myoclonic-astatic seizures (EMAS), epilepsy with myoclonic absences (EMA), childhood absence epilepsy (CAE), and IGEs with variable phenotypes, which include juvenile absence epilepsy (JAE), juvenile myoclonic epilepsy (JME), and epilepsy with generalized tonic-clonic seizures only (EGTCSO).
Some of these syndromes that are more common in adolescent and adult patients are CAE, JME, JAE, and EGTCSO. In these syndromes, withdrawal of AEDs can be successful in certain instances.
In CAE without GTCs, medication can be withdrawn, most commonly after 2 years. However, newer results suggest that withdrawal after 5 or 6 years might be more reliable, as GTCs can appear at around 5 years. In JME, about 30% of patients can remain seizure free after drug withdrawal. Seizure freedom after withdrawal can be even more likely in JAE and EGTCSO following an extended period of seizure freedom on AEDs.


Point of view: No
The subgroup of generalized epilepsies which for decades has been known as 'idiopathic' has now been proposed to be termed 'genetic'. Beyond terminology, the conceptual considerations are very similar, and idiopathic/genetic epilepsy is, 'as best as understood, the direct result of a known or presumed genetic defect in which seizures are the core symptom of the disorder'. Prevalence studies consistently demonstrate that idiopathic generalized epilepsies (IGE) make up 15-20% of all epilepsies. IGE subsyndromes are characterized and defined by age at epilepsy onset and by the predominant seizure type. Three IGE subsyndromes commonly commence in adolescence, i.e. between the ages of 12 and 18 years: juvenile absence epilepsy (JAE), juvenile myoclonic epilepsy (JME), and epilepsy with grand mal only, manifesting either on awakening (EGMA) or at random (EGMR). Long-term seizure outcome with regard to remission, even after withdrawal of antiepileptic drugs, may differ among these subsyndromes. In IGE, long-term seizure outcome data are not available from prospectively followed incidence cohorts, only from retrospective prevalence cohorts.
In an Austrian study of 64 JAE patients with a follow-up of 22 years, 37% were seizure free in the terminal 2 years, and the vast majority were still on antiictal treatment. A meta-analysis of childhood absence epilepsy and JAE, in which 27% of patients were adolescents or adults at absence epilepsy onset, revealed terminal seizure freedom (of a duration that depended on the heterogeneous follow-up times) in 59% of patients. Absence epilepsies without generalized tonic-clonic seizures had a more favorable outcome (78% seizure-free) than those with generalized tonic-clonic seizures (35%). Another predictor of long-term seizure freedom in absence epilepsies was older age: the older the patients, and thus the longer the epilepsy had lasted, the less likely they were to still have seizures.
Until some years ago, the axiomatic dogma was that JME requires lifelong antiictal treatment, otherwise seizure recurrence would be almost inevitable. In the last couple of years, five retrospective studies on the long-term outcome of JME have been published. A total of 208 patients were followed up for at least 20 years. Five-year seizure remission was seen in 27-68% of patients, and 8-26% of all patients were, in addition to being seizure free, off antiictal medication for at least 5 years. Of 45 patients who were off medication at the end of the study, 31 were seizure free for at least 5 years. However, it is unclear how many patients had a seizure relapse after withdrawal of antiictal medication in the course of their disease and then restarted regular drug intake.
Our center identified the manifestation of additional absence seizures at the onset of JME as an independent predictor of lack of terminal 5-year seizure remission. In univariate analyses, other studies demonstrated that a long duration of epilepsy with unsuccessful treatment, antiictal polytherapy, and generalized tonic-clonic seizures preceded by bilateral myoclonic seizures are significantly associated with lack of seizure freedom. There was a general trend that the older the patients were, the more likely they were to be in remission.
In the early course of EGMA, a favorable treatment response to antiictal substances is well known, but until recently long-term data on seizure prognosis had not been available. We reported 42 patients with a 'pure' form of EGMA lacking absence seizures and myoclonia. Patients had a mean follow-up of 40 years, and 26 subjects (62%) had been seizure-free for at least the last 5 years. Only five seizure-free patients were off antiictal medication. We identified current age as the only independent predictor of seizure freedom in the terminal 5 years: remission rates were 35.7% in patients aged 55 years and younger (n = 14), 66.7% in patients aged between 56 and 65 years (n = 12), and 81.3% in patients older than 65 years (n = 16).
Withdrawal of antiictal substances had been performed in 19 patients (45.2%), 12 of whom had a seizure relapse (63.2%). The mean time between withdrawal and relapse was 22 ± 31 months (median 7 months). We do not know how many of the 23 continuously treated patients would still be seizure free if antiictal drugs had been withdrawn in the course of the disease; this, however, was done rather reluctantly because of our early experiences with seizure relapse after withdrawal of antiictal agents.
In a Canadian population-based study, 40 patients with epilepsy with grand mal at random were reported. In 33 patients, antiictal drugs had been withdrawn, and 27 of those patients (75%) were seizure-free for more than 15 years. These findings are in line with our 15 EGMR patients, 12 of whom (80%) were seizure-free in the last 5 years.
To conclude, currently available data are based on single-center retrospective studies revealing heterogeneous findings with regard to seizure remission. Roughly speaking, the probability of seizure freedom within the last 5 years increases with age, but the majority of patients are still on antiepileptic drugs. Up to now, published data do not allow sufficient prediction of which IGE patients will remain seizure-free after antiepileptic drug withdrawal.


Point of view: No
Autoimmune epilepsy exists when seizures are accompanied by evidence of autoimmune-mediated central nervous system inflammation. Seizures have long been known to occur as part of the spectrum of a variety of systemic autoimmune disorders, particularly systemic lupus erythematosus. More recently, an increasing number of ‘autoimmune encephalitis’ syndromes have been identified, with seizures either as the major manifestation, or accompanying other presentations such as psychiatric disorders.
Both intracellular antigens (such as Hu and Ma) and surface antigens (for example GABA receptors or LGI1) may be antibody targets in autoimmune encephalitis. The former are more likely to be associated with tumors, although NMDAR antibodies may be particularly associated with ovarian teratomata.
The overall incidence of autoimmune encephalitis is unknown. Surveys such as the California Encephalitis Project may only test patients with specific symptoms or signs suggesting an immune etiology. For example, of 761 patients with encephalitis of unknown origin, only 47 were tested for NMDAR antibodies; 32 were positive. Twenty-one percent of patients in another survey had immune-mediated encephalitis; these patients, however, all had symptoms of acute encephalitis rather than seizures alone. Other antibodies are much rarer. GABAB receptor antibodies were detected in seven of 3989 patients evaluated for autoimmune encephalopathy; five of them had small cell lung cancer. AMPA receptor antibodies were found in 1% of patients with suspected encephalitis/encephalopathy. In the Bethel antibody laboratory from 2012-2014, 4.7% of 6893 patients with epilepsy had positive tests (NMDAR and GAD65 antibodies being the most common).
Some clinical features may suggest that autoimmune encephalitis should be considered as a potential seizure etiology. Some patients have signs or symptoms of 'limbic encephalitis' in addition to seizures, such as psychiatric manifestations or increased hippocampal signal. Other explanations, such as bone marrow transplant-associated HHV6 encephalitis, need to be excluded as well. Adults presenting with new-onset seizures should be considered for evaluation, particularly if seizures are frequent or status epilepticus occurs and there is no other explanation such as a tumor. Young women with a suspected ovarian teratoma and new-onset seizures should be tested for NMDA receptor antibodies. A few specific seizure syndromes, such as 'faciobrachial dystonic seizures,' have been associated with autoimmune encephalitis due to LGI1 antibodies, as has the 'extreme delta brush' EEG pattern in NMDAR encephalitis. CSF may show non-specific markers of 'inflammation'.
Although detection of some antibodies has high specificity for the diagnosis of autoimmune encephalitis, others have less certain implications, particularly when titers are low. Treatment involves combinations of IVIG, plasmapheresis, and steroids, as well as other drugs such as cyclophosphamide or rituximab that have serious potential toxicity. Moreover, controlled trial data are limited. These considerations suggest that testing all patients with seizures of uncertain etiology for possible autoimmune encephalitis may well lead to overdiagnosis and overtreatment, with increased adverse effects, as well as delay in furthering understanding of the best therapeutic approach.
Only patients whose signs and symptoms lead to reasonable suspicion of the diagnosis should be tested.


Point of view: Yes
The Wada test is typically used for identification of the language- and memory-dominant cerebral hemisphere. The test is considered the "gold standard", a term implying that the evidence it furnishes is more trustworthy than evidence from other methods. The advent of noninvasive functional neuroimaging now raises the possibility of replacing this "gold standard", provided the alternatives offer equally trustworthy results. It will be argued here that these methods, namely functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and transcranial magnetic stimulation (TMS), do provide equally trustworthy results, and that they may therefore replace the Wada test in many if not most cases.
The compatibility of the language lateralization results of the Wada and fMRI procedures has been shown in a number of studies reporting perfect or high concordance. Equally high is the compatibility between the results of the Wada and MEG with respect to language lateralization. Fewer studies, however, report comparisons of laterality estimates for memory between fMRI and the Wada, with some reporting high and others low concordance, raising the question of which method is to be trusted.
Given that the degree of concordance between the Wada and the noninvasive methods is not perfect (and is sometimes low, as in the case of memory lateralization), the question becomes: which method's results are to be considered valid? On the assumption that the Wada is the "gold standard," the tendency is to consider discordant estimates as failures of the neuroimaging methods. However, when that assumption is put to the empirical test, it becomes obvious that the results of the Wada should not be considered any more valid than those of the noninvasive methods. For example, the efficacy of the Wada procedure in predicting the likelihood of postoperative language and memory deficits is lower than would be expected of a gold-standard procedure. In contrast, fMRI has shown better predictive efficacy than the Wada. In addition, many studies are conducted every year to fine-tune the noninvasive methods in revealing, with increasing reliability, the brain regions involved in different aspects of memory and language performance using fMRI, MEG and, lately, TMS, and to verify the validity of the findings, mainly against prior knowledge gained from lesion studies.
Therefore, given that the validity of the Wada data is limited, there is no justification for considering them more trustworthy than data supplied by neuroimaging in those few cases where the results are in fact discordant.
The practical question of course remains: given that both types of methods are suboptimal, which should be used preoperatively? Given the compatibility of the two sets of methods, both in their concordance in most cases and their imperfections in a few, the noninvasive methods should be used as a matter of course, because if their results are ambiguous, testing can be repeated and the results of the different methods cross-validated (e.g. fMRI or MEG against TMS for expressive language; MEG against fMRI for receptive language, and so on). Only if the ambiguity is still not resolved should the Wada test be performed, in the hope that it may resolve it.
Moreover, although both types of methods have limitations, the following limitations, specific to the invasive but not the non-invasive methods, render the latter preferable. First, the Wada is associated with appreciable morbidity, ranging between 3 and 5%. Second, it is associated with patient discomfort. In contrast, no morbidity and only minor discomfort are associated with neuroimaging. Third, the results of the Wada procedure may not suffice for assessing memory laterality, because delivery of the sodium amobarbital to the hippocampal formation is not always possible and because the structure of the Wada protocol does not allow separate estimation of hemispheric dominance for verbal and for nonverbal encoding, although there is evidence of stimulus modality-specific encoding in the left and the right hippocampus. Needless to say, fMRI can discern the involvement of all brain structures associated with memory, both neocortical and paleocortical, and because it can be repeated, it can identify distinct components of the memory-specific brain circuitry.
A fourth and a fifth limitation of the Wada test are the narrow time window in which it must be performed and the impracticality of repeating it to establish the reliability of the results. A sixth is that the Wada may not probe the mechanisms of the host of different cognitive operations that are subsumed under "language" and "memory," whereas the neuronal networks of such operations can be assessed separately in the context of several neuroimaging sessions.
A seventh problem, also reducible to the time constraints, is the inability to control for situational variables that may corrupt the integrity of the data. Attention lapses on the part of the patient in the crowded Wada suite, where a number of tasks have to be completed under time pressure, may well produce misleading data.
Eighth, the Wada test cannot supply information about the primary visual and auditory cortex that both MEG and fMRI can readily supply. Finally, the Wada test requires alertness and cooperation on the part of the patient. Thus it cannot be used with patients with attention deficit and hyperactivity problems, patients in a state of confusion, patients with encephalopathies, or patients who are very young. None of these limitations apply to neuroimaging, where localization of sensory, motor, and even receptive language cortex can be accomplished with the patient under sedation.
For all the preceding reasons, it is proposed that the Wada may be replaced as the method of choice in many if not most cases where the non-invasive methods are available.


Point of view: Yes
Over the past 20 years, more than 15 new antiepileptic drugs (AEDs) have been introduced globally, many possessing unique mechanisms of action. Despite this plethora of novel agents, outcomes for the common epilepsies in children and adults have not substantially improved, largely because all AEDs are symptomatic anti-seizure, not anti-epilepsy, drugs. Three main patterns of response have been identified: just under 60% of patients have a good prognosis once an appropriate, well tolerated AED has been successfully introduced, while another 25% develop refractory epilepsy, probably de novo. The remainder of the patient population demonstrates a relapsing-remitting course, with around half being seizure free at any one time. Very few patients attain remission after failing their first two AED schedules, particularly when failure is due to lack of efficacy. The International League Against Epilepsy (ILAE) defines drug-resistant epilepsy as "failure of adequate trial of two tolerated, appropriately chosen and used AED schedules (whether as monotherapies or in combination) to achieve sustained seizure freedom".
If treatment fails at a low dosage because of poor tolerability or after the development of an idiosyncratic reaction, another AED should be substituted. Similarly, if the patient documents worsening seizure control or no useful improvement, another drug, arguably possessing a different mechanism of action, should be tried. However, if there is a good response to treatment that falls short of seizure freedom with the first or second AED, another drug should be added, again with a different mechanism of action. Attention should be paid particularly to drug burden, which is a function of high dosage as well as the number of AEDs.
Different mechanistic groups of AEDs include those affecting the fast and/or slow inactivated state of the Na+ channel, e.g. lamotrigine, oxcarbazepine and lacosamide; calcium channel blockers, e.g. gabapentin and pregabalin; GABAergic drugs, e.g. clobazam; those that modulate SV2A, e.g. levetiracetam; openers of Kv7 neuronal potassium channels, e.g. retigabine; and AMPA receptor antagonists, e.g. perampanel. Alternatively, the addition of a broad-spectrum AED with multiple mechanisms of action can be tried, such as sodium valproate, topiramate or zonisamide.
In clinical practice, more than 50 combinations of AEDs have been reported to be successful in individual patients. Combining rather than substituting AEDs in patients with difficult-to-control epilepsy is more likely to be effective, and is a safer option than switching randomly from one drug to another, which carries the risk of seizure exacerbation and/or acute neurotoxicity. The most successful duotherapy is lamotrigine with sodium valproate, for which there is good evidence of synergism. There are increasing data supporting the use of AEDs possessing different mechanisms of action in combination. Keeping doses low allows combinations of two or sometimes three AEDs to be tried in the hope of attaining sustained seizure freedom without unacceptable toxicity. In addition, the patient is more likely to retain confidence in their physician if a clear plan of action is laid out early for those most likely to have pharmacoresistant epilepsy. The major problem in this setting is the lack of good evidence in support of either course of action, in the population at large or for the individual patient, so clinical experience is the driver.


Point of view: No
The case scenario for my side of this debate is an adult patient with new-onset focal seizures. Earlier studies by Brodie (2001, 2012) led us to believe that there is no use in trying a third antiseizure drug (commonly called an antiepileptic drug, AED) if the first two fail before going on to polytherapy, and that the whole exercise is rather useless. This is not, however, correct, because there is still a lot that can be done before going on to polytherapy, with all the side effects that can occur with combinations. First, an assumption before the argument: that the correct AEDs are given to the patient in the first place, that the patient actually takes the drugs, and that the patient actually has focal-onset epilepsy.
First of all, when two drugs fail, the patient needs to be referred to an epileptologist who understands the full range of AEDs for further dose adjustments or a change of medication. Possible adjustments include switching to a drug with another mechanism of action, or dose increases up to higher concentrations of the drug the person is currently taking. The timing of dosing can also be an important factor in seizure control. Environmental stressors should be analyzed and corrected where possible. According to other studies, a significant minority of patients (about 16%) can be rendered seizure free by changing up to 3-5 AEDs. Recent examples showing that even refractory patients can become seizure free are the down-titration-to-monotherapy trials that have shown success for some new AEDs such as lacosamide and eslicarbazepine.
Patients should be given more than two chances, trying more than three AEDs, before being called refractory.


Point of view: No
Multiple subcortical brain structures and cortical epileptogenic foci have been targeted for invasive (deep or subdural) brain stimulation in epilepsy. Most data on efficacy are derived from small, uncontrolled clinical studies, which limits the significance of the reported findings. Reliable data are available for continuous stimulation of the anterior nucleus of the thalamus and for responsive stimulation of the seizure onset zone in neo- or archicortical structures. Whether the positive results of regulatory trials can be translated to the broad spectrum of difficult-to-treat epilepsies in the community needs to be assessed, ideally through large international multi-centre registries. Prerequisites for successful invasive brain stimulation in epilepsy are ineligibility for potentially curative resective surgery and, most importantly, accurate phenotyping of patients; different clinical forms of epilepsy may respond differently to individual targets and stimulation parameters.
The general concept of deep brain stimulation in epilepsy is continuous electrical stimulation to increase neuronal inhibition, independent of the state of cortical excitability or seizure occurrence. An alternative approach is to continuously record neuronal activity electrocorticographically in the supposed seizure focus and to stimulate the epileptogenic zone responsively only in the case of abnormal activity. In clinical practice, triggering of stimulation can be rather frequent, and up to 3,000 trains of stimulation have been observed within 24 h, calling the concept of responsiveness into question.
In a large randomised controlled trial in patients with intractable partial epilepsy, 109 adult patients from 17 centres in the U.S. underwent either 3 months of bilateral electrical stimulation of the anterior thalamus or no stimulation, starting 1 month after the electrodes had been implanted. Compared to a 3-month prospective baseline period, patients with stimulation on had a 40.4% reduction in seizure frequency, while patients with stimulation off had a 14.5% reduction, indicating significant efficacy of chronic ANT stimulation. The two most common self-reported adverse effects were depression (14.8%) and memory impairment (13.0%), both of which were significantly more common than in non-stimulated controls (1.8% each). This regulatory clinical trial resulted in CE certification (CE = Conformité Européenne) in Europe in 2010, which allows ANT implantation in patients with refractory epilepsy. So far, there is no approval by the Food and Drug Administration in the U.S.
After the end of the 3-month blinded period, all patients were offered open-label ANT stimulation. One year after electrode implantation, the median reduction in seizure frequency compared to baseline was 41%, and after 5 years it was 69%. Along with reduced seizure frequency, clinical variables such as seizure severity, quality of life, and neuropsychological composite scores including depression, anxiety, and subjective cognitive function significantly improved.
In another randomized controlled trial, 191 patients eventually underwent intracranial implantation of a neurostimulator directly addressing the seizure onset zone. Responsive stimulation was successful: 3 months of stimulation resulted in a significant reduction of median seizure frequency of 38% vs. 17% in the non-stimulated group. In four of the 191 patients, the device had to be explanted because of infections, all of which involved soft tissue and not the brain or the skull. This NeuroPace RNS® system was approved by the U.S. Food and Drug Administration in 2013, but so far not in Europe. The recording and stimulation algorithms have not been disclosed by the company.
Long-term follow-up data confirm increasing efficacy over time, with a reduction of median seizure frequency of 53% after 2 years and 63% after 4 years. Along with decreased seizure frequency, quality of life improved.
To conclude, efficacy data on continuous deep brain stimulation and on responsive direct stimulation of the seizure focus seem to be similar. A clinical trial directly comparing the two approaches, in particular in patients with a well defined seizure focus, is desirable, but for various reasons this is unlikely to happen. Against this background, there is currently no indication to prefer craniectomy with implantation of the recording and stimulation system within the skull to the less invasive approach requiring only two small skull boreholes.


Point of view: Yes
There are several clinical challenges in the diagnosis and management of nonconvulsive status epilepticus (NCSE). The working definition involves a prolonged state of impaired consciousness or altered sensorium, associated with continuous paroxysmal activity or electrographic discharges on the EEG. This mandates continuous EEG monitoring of these patients. NCSE may be more common than thought: it accounts for 25% of all SE, and about 27% of ICU patients with altered mental status and 8% of critically ill patients in coma will have NCSE.
NCSE presents several problems: frequently subtle or absent clinical manifestations apart from altered sensorium; the need for EEG confirmation of ongoing epileptic activity; physicians' lack of awareness of the possibility of NCSE; underdiagnosis; and deleterious consequences with increased mortality and morbidity.
NCSE has steadily become a therapeutic Pandora's box and is often a nightmare to manage, owing to its unusual clinical features requiring a high index of suspicion, challenging EEG patterns, and controversial or unclear treatment paradigms and prognosis.
Response to treatment is one way of confirming the diagnosis: a positive response to antiseizure and antistatus medication favours the diagnosis. Hence treatment is part and parcel of the diagnosis.
Response to anticonvulsants, both clinical and on EEG, is controversial, and sadly treatment is often initiated after a long delay.
It is recommended to initiate treatment quickly: when NCSE develops out of convulsive status epilepticus (CSE), as per accepted guidelines for the management of CSE, and when it occurs de novo, as soon as it is suspected.
Treatment recommendations are different for the various subtypes.
Absence SE: BZD; if resistant, VPA/PB
Discontinue/avoid AEDs that trigger SE
Complex partial NCSE (CPNCSE): similar to CSE
NCSE in coma: dilemmas about diagnosis and treatment exist, but aggressive treatment similar to that for subtle SE is warranted. Intravenous AEDs are a must because the response to first-line treatment may be poor (IV benzodiazepines should be used for both diagnostic and therapeutic purposes under EEG surveillance), and many AEDs may need to be tried. There is scope for a good outcome with an aggressive treatment approach.
Response to treatment may be quite delayed in NCSE, often as much as 24 h or longer, and depends on the subtype, underlying etiology and timing of treatment. Mortality rates of 20-30% may be due to the underlying etiology itself or to complications of the disease or its treatment.
In CPSE, the mortality of 18% is largely determined by etiology.
In children the mortality rate is about 25%, and in the elderly it is higher, at about 56%.
Cognitive sequelae are found in 15-30% of adults and are worse if no aggressive management is undertaken.
It is not prudent to miss or undertreat these patients through an uncertain approach, only for this to be realised retrospectively. It is hence necessary to have a deft, all-fire-brigades-blazing approach to put out the fire of NCSE.

Point of view: No
Generalized convulsive status epilepticus (GCSE) is a life-threatening medical emergency. Realizing that "time is brain", modern treatment protocols are quite aggressive, and after failure of a benzodiazepine and one intravenous (IV) antiepileptic drug (AED) the protocols call for induction of generalized anesthesia with drugs such as midazolam, propofol, thiopental or pentobarbital.
Non-convulsive status epilepticus (NCSE) is different from GCSE. It can be defined as a change in behavior and/or mental processes from baseline associated with continuous epileptiform discharges on the EEG. It may or may not include dyscognitive features ("complex partial status" - CPSE). The compromise between the danger related to untreated SE and danger of damage induced by possibly unnecessary aggressive treatments may prove difficult.
Available human data indicate that many clinical forms of NCSE are benign in terms of morbidity and mortality. Poor outcome may be attributed to the etiology and to associated complications. The evidence for neuronal damage induced by NCSE in humans is scant. Mortality in patients with NCSE due to pre-existing epilepsy is as low as 3%; in patients with NCSE due to acute medical disorders mortality reaches 27%, and the cause has a major effect on NCSE outcome.
For most forms of NCSE that persist after treatment with BZD and an AED, additional trials of IV AEDs should be preferred, rather than early escalation to anesthetics. This strategy is especially relevant in cases of NCSE in which consciousness is somewhat preserved, and the risks of anesthetics (including arterial hypotension, respiratory depression, gastroparesis, paralytic ileus, immunosuppression, infections, propofol infusion syndrome and prolonged sedation due to drug accumulation) might outweigh the risks of continued seizure activity.
EFNS guidelines (2010) state that the therapeutic decision should be based on the type of SE, age, comorbidity and prognostic issues, and that this is of special relevance in patients with CPSE, because the risks of anesthesia may be greater than the risks of ongoing non-convulsive epileptic activity.

Point of view: Yes
Early clinicopathological studies demonstrated that the two cardinal lesions associated with Alzheimer disease (AD), neurofibrillary tangles (NFT) and amyloid deposits, have a differential impact on cognition at both early and late stages of the neurodegenerative process. In contrast to ß-amyloid (Aß) deposition, which occurs diffusely in the human brain over 60 years of age, NFT formation follows hierarchical schemes of regional and cellular vulnerability, affecting first the entorhinal cortex and parahippocampal formation before moving into adjacent neocortical association areas. Long before the emergence of novel imaging techniques, it was clear that Aß deposits correlate very weakly with cognition and downstream neurodegenerative biomarkers. In contrast, NFT and the associated synaptic loss are strictly related to the loss of cognitive functions not only at late but also at early stages of AD. The last decade was characterized by an exponential increase of knowledge in the field of AD predictive biomarkers and, most importantly, the characterization of tracers for ß-amyloid (Aß). It is now widely acknowledged that amyloid deposits on positron emission tomography (PET) with Pittsburgh compound B (PiB; a marker of Aß fibrillar deposits) precede dementia by 5-10 years, and PiB burden correlates inversely with the concentration of Aß42 in the cerebrospinal fluid. However, increased PiB burden has been reported in nearly 20% to 30% of controls in the general population, pointing to the fact that Aß deposition is not sufficient to cause cognitive decline in AD. Moreover, the rate of Aß accumulation is not related to neurodegeneration at baseline, and only 8% of controls display both decreased hippocampal volume and increased PiB signal. According to Jack's model, all of the aforementioned markers become positive well before dementia onset, and the ones related to amyloid pathology already reach their plateau at the time of the first cognitive deficits.
More recently, selective tau tracers became available for clinical research. Although a PiB equivalent is not yet ready for tau imaging, the recent development of tau tracers with higher selectivity, reduced non-specific binding and improved kinetics compared to the first molecules raises increasing expectations in the scientific community. Given the tight association between tau deposition, cognition and neurodegeneration, and unlike Aß imaging, tau imaging will be essential for assessing disease progression. Furthermore, it may help to resolve the controversy about the temporal sequence of tau pathology in AD. The new diagnostic criteria by Dubois and collaborators consider that the development of tau pathology, at least in its fibrillar forms, is a late phenomenon in AD, dependent at least partly on Aß deposition in prodromal states. Recent contributions showed that tau-related markers (but also structural MRI changes) may become positive in the absence of PiB deposits, mainly in preclinical cases. Ultimately, tau imaging will provide the tool to change the landscape and to explore whether or not presymptomatic administration of anti-Aß therapy impacts the progression of the tau pathology that determines the clinical expression of AD.

Point of view: No
The number of individuals affected by dementia is growing worldwide. Based on the Global Burden of Disease study (GBD), more than 130 million people will be affected by dementia by the year 2050. At least seventy per cent of these cases are AD or mixed dementia. A definitive AD diagnosis, according to the 1993 NINDS-NIA criteria, is achievable only by postmortem examination. On the other hand, the clinical diagnosis of dementia is still challenging for dementia specialists applying both the old and the new clinical criteria. It is clear that diagnostic accuracy can be improved by the support of paraclinical investigations such as imaging (structural and functional) plus fluid biomarkers (biomarkers of the pathology-specific process, neuronal damage, and inflammation). This is important for clinical work, especially in the differential diagnosis process, but probably essential to anticipate the diagnosis after the first symptom and in the preclinical stage, as required in the recent classification systems and in trials of new therapies. The advancement of imaging of Alzheimer disease pathology has included several PET ß-amyloid imaging agents. The recognized and growing importance of neocortical neurofibrillary tangles as a marker of disease progression and pathology has driven the development of several tau imaging compounds such as [18F]T807, [18F]THK523, [18F]THK5117, [18F]THK5105, [18F]THK5351, [18F]AV1451 (T807) and [11C]PBB3. In particular, [18F]T807 binding has been described as associated with clinical impairment, particularly in the inferior temporal gyrus, more strongly than cortical beta-amyloid markers are associated with the same clinical features.
Along these lines, several cell and animal studies suggest that tau propagation from cell to cell along specific anatomic pathways, through a prion-like mechanism, may favour the aggregation and region-to-region spread of tau pathology within the central nervous system, determining the clinical phenotype. The pathological accumulation of tau protein is important not only in AD but also in other dementias, some mainly environmental, like chronic traumatic encephalopathy, and others of more complex clinical and genetic classification, such as the frontotemporal dementias, progressive supranuclear palsy, and corticobasal degeneration. In AD, it has been clearly shown that hyperphosphorylated tau tangles, and not beta-amyloid accumulation, correlate with neuronal dysfunction and death. Therefore the clinical stage and severity of AD also seem to correlate more closely with tau load and spread than with beta-amyloid accumulation. The pathogenesis, and some data on clinical progression, suggest that tau imaging, an in vivo surrogate of tau accumulation, may be instrumental in the classification and staging of dementias.
On the other hand, recent neuropathological studies in subjects over 85, with samples collected in population-based studies, show that in older subjects the relationship between clinical features and tau accumulation may be non-linear. This is particularly important in subjects with dementia, where the correlation is lost after diagnosis.
Therefore, the benefit of follow-up of patients with dementia, and in some cases even of the diagnosis, may be less significant in subjects of advanced age.
The age structure of people with dementia is rapidly changing, with two-thirds of subjects being 85 and older. The challenge is therefore to obtain a valid and early diagnosis that takes into account the changing pattern of disease phenotype in subjects with dementia. Subjects with dementia, being mostly in the late stages of life, are likely to have more than one pathological condition (indeed, several pathological conditions), and furthermore to be frail and to have a short life expectancy, in most cases less than ten years. Diagnostic and therapeutic research in the area of dementia, including imaging, should carefully take this shift into account in order to improve the clinical management and prognosis of patients. In this conceptual framework, the challenge of early diagnosis should look carefully at changes in tau accumulation with age.
For disease progression other markers may be important: imaging markers of inflammation may be at least as important as tau deposition. Both genetic (particularly GWAS) and epidemiological studies have shown a link between neuroinflammation and neurodegeneration in AD. Microglial activation and reactive astrocytes have been described in areas critical for dementia. Inflammatory genes, IL6R and C9 in particular, have been associated with Aß and tau burden. Recent work has shown that PET ligands for neuroinflammation may act as good markers of disease progression, allowing the development of more complex and integrated models of AD natural history.
PET tau imaging probably enables assessment of the longitudinal pattern of tau deposition. It is, however, not the only imaging marker important for assessing the clinical evolution of the disease. There are also significant problems that need to be addressed before tau imaging can be considered key in the diagnosis, and especially in the follow-up, of subjects with dementia. These controversial issues will be discussed in the presentation.

Point of view: Continuum
Dementia with Lewy bodies (DLB) and Parkinson’s disease dementia (PDD) are the second most common type of degenerative dementia after Alzheimer’s disease (AD) in patients older than 65 years. For years, there has been an ongoing debate whether DLB and PDD should be considered part of one spectrum of dementia related to cortical Lewy body disease or two distinct conditions. One’s position in this debate is normally reflected by one’s viewpoint on the disease itself, whether clinical, pathological or genetic. The current consensus criteria recognize the clinical distinction between DLB and PDD using the timing of the onset of cognitive symptoms in relation to motor symptoms (i.e. a diagnosis of DLB is assigned when dementia and motor symptoms appear together within 1 year, PDD when dementia occurs >1 year after motor impairment). This “one-year” rule regarding the temporal relationship between dementia and parkinsonism is, however, considered artificial, and cognitive impairment is recognized to occur not only in more advanced PD but also in early, untreated PD patients, and even in patients with pre-motor syndromes such as REM sleep behaviour disorder (RBD) and hyposmia. There are no clinical symptoms that categorically distinguish DLB and PDD, as both may show visual hallucinations, autonomic symptoms, RBD, cognitive fluctuations, and neuroleptic sensitivity. Regarding motor symptoms, DLB patients have been described as having more symmetrical parkinsonism, relatively higher rigidity and less resting tremor, but this generally would not lead to high diagnostic sensitivity. In addition to the clinical similarity, DLB and PDD also share a common neuropathological feature of cortical alpha-synuclein-positive Lewy bodies (LBs) and neurites that does not differentiate DLB from PDD, or in fact even from Parkinson’s disease (PD) without dementia.
This is reflected in the overlapping staging criteria of the two syndromes, both of which examine the topographical distribution of alpha-synuclein pathology (i.e. Braak PD stages 1-3 = McKeith’s brainstem DLB, Braak stages 4-5 = McKeith’s limbic DLB, Braak stages 5-6 = McKeith’s neocortical DLB). Generally, most DLB, PDD and PD patients die with end-stage disease, at which point the brain is diffusely involved. Some studies have suggested that there is more beta-amyloid accumulation in DLB, causing a more aggressive course of disease (i.e. shorter time to dementia), but concomitant Alzheimer-type pathology is also very common in PDD and thus cannot be used as a diagnostic marker to distinguish the two syndromes. The only difference appears to be nigral neuronal loss, which can be more significant in PDD than in DLB, suggesting that the most vulnerable neurons may differ between these disorders; however, this has not been systematically studied. Genetic factors may also play a role in the expression of cognitive deficits in DLB and PDD, as suggested by dominant familial forms of DLB/PDD. Notably, missense mutations in the alpha-synuclein gene (SNCA) and locus multiplications are associated with clinical and pathological phenotypes ranging from PD to PDD to DLB. In worldwide populations, SNCA genetic variability remains the most reproducible risk factor for idiopathic PD, and the SNCA gene has recently also been associated with DLB. However, only a few investigators have examined SNCA variability across the different clinico-pathological groups. We used targeted next-generation sequencing to comprehensively characterize the 135 kb SNCA locus in a large multi-national cohort of patients with PD, PDD, DLB and healthy controls. An analysis of 44 tagging single nucleotide polymorphisms (SNPs, with MAF >5%) across the entire SNCA locus showed two distinct association profiles for parkinsonism and dementia, towards the 3’ and the 5’ ends of the SNCA gene, respectively.
In addition, we defined a specific haplotype in intron 4 that is directly associated with PDD. The PDD risk haplotype has been interrogated at single nucleotide resolution and is uniquely tagged by an expanded TTTCn repeat.
Our genetic study suggests that there may be specific haplotypes with functional consequences at both the mRNA and protein levels that explain where in the continuum patients fall. The fundamental question is the mechanism(s) whereby these subtle allelic differences lead to mismetabolism of alpha-synuclein responsible for the neurodegeneration and subsequent clinical presentation. To unravel this, it is important that clinicians, pathologists and geneticists work together, each describing the variables they can measure reliably in optimal detail, rather than obscuring subtle differences by trying to fit subjects into fixed disease categories. Thus, the question whether DLB and PDD are on a continuum or are distinct entities is rather moot. A single Lewy body disorder model, however, is more useful for studying disease pathogenesis, with the ultimate aim of developing disease-modifying therapies that target alpha-synuclein-related neurodegeneration.

Point of view: Two distinct entities
Age-related neuropsychiatric disorders such as Parkinson disease with and without dementia (PDD and PD, respectively), dementia with Lewy bodies (DLB), and Alzheimer disease dementia (AD) represent a growing socioeconomic challenge. These disorders show substantial clinical and neuropathological overlap, limiting diagnostic accuracy and calling into question the concept of distinct clinical entities. Indeed, the notion that PD and AD may be extremes of a spectrum of neurodegenerative diseases, with DLB and PDD presenting overlapping neuropathologic and clinical features within this spectrum, has received growing attention in recent years. Although pathophysiologically and clinically different, PD and AD share some aspects: both are age-related neurodegenerative disorders characterized by aggregation of pathologic proteins leading to dysfunction of cerebral networks and distinct patterns of metabolic change. Cases characterized by pure PD (α-synuclein aggregation) or pure AD (amyloid-β and tau aggregation) pathology do not represent most affected patients. Biologically and histopathologically, there is an overlap of these age-associated proteinopathies. They form a continuum with concomitant amyloid-β, tau, and α-synuclein aggregation as well as microvascular changes. DLB and PDD are age-related neurodegenerative disorders sharing clinical and histopathologic aspects with both PD and AD. Hence, they can be seen as intermediate neurodegenerative disorders on a spectrum between pure PD and pure AD. Because the pattern of histopathology, neuronal network dysfunction, and associated clinical deficits is indeed continuous, the traditional view of distinct disease entities is increasingly being questioned.
Temporal differences in the emergence of symptoms and clinical features warrant the continued clinical distinction between DLB and PDD. While DLB and PDD groups' neuropsychological profiles often differ from those in AD, the diagnostic sensitivity, specificity, and predictive values of these profiles remain largely unknown. PDD and DLB neuropsychological profiles share sufficient similarity to resist accurate and reliable differentiation. Although heterogeneous cognitive changes (predominantly in memory and executive function) may manifest earlier and more frequently than previously appreciated in PD, and executive deficits may be harbingers of dementia, the enthusiasm to uncritically extend the concept of mild cognitive impairment (MCI) to PD should be tempered. Instead, future research might strive to identify the precise neuropsychological characteristics of the prodromal stages of PD, PDD, and DLB which, in conjunction with other potential biomarkers, facilitate early and accurate diagnosis, and the definition of neuroprotective, neurorestorative, and symptomatic treatment endpoints.
Biomarker-based approaches aimed at disentangling histopathology-clinical relationships within this spectrum may further help to guide the classification of neurodegenerative disorders and treatment stratification. Imaging and fluid biomarker studies are available that support the notion of distinct disease entities, but research also supports the idea of a continuum of cerebral changes between the two extremes of pure AD and pure PD. This paper will review the relevant evidence and argue in favour of the two-distinct-entities approach.

Point of view: Spectrum
Despite significant clinical overlap, frontotemporal dementia (FTD) with motor neuron disease and amyotrophic lateral sclerosis (ALS) have historically been considered separate entities. In 2006, TAR DNA-binding protein 43 (TDP-43), encoded by the TARDBP gene, was identified as the major pathological protein in so-called frontotemporal lobar degeneration (FTLD) with ubiquitin-immunoreactive inclusions (FTLD-U), with or without motor neuron disease, and in ALS. This finding created the basis for unifying the majority of FTLD-U and ALS cases as a spectrum of TDP-43 proteinopathies. Interestingly, though, groups focusing on ALS and FTD have yet to integrate fully, whether for structural reasons, as these groups were historically allocated to different clinics, or because it was unclear why some patients showing TDP-43 type B proteinopathy would not manifest motor neuron disease. The discovery of the C9orf72 mutation in 2011 shed light on this question. Affected members of families with the C9orf72 mutation, an autosomal dominant disease, express a broad range of clinical phenotypes, from pure ALS to pure FTD, AD-type symptoms, and parkinsonism. Studies in C9orf72 families revealed that certain genetic variants, including in the genes TMEM106B and ATXN2, may contribute to the phenotypic variation. In summary, TDP-43 type B proteinopathy underlies the majority of ALS cases with frontotemporal deficits and of FTD cases. Genetic variation may affect the clinical presentation and explain the broad spectrum of presentations seen in these patients sharing similar neuropathological features.

Point of view: No
The mild cognitive impairment (MCI) concept was developed to identify the earliest stages of cognitive impairment. MCI and, more specifically, amnestic MCI were initially proposed as pathological transitional states that ultimately progress to full-blown AD. However, it has been found that MCI subjects do not uniformly progress to dementia (either AD or another) and may revert to a normal cognitive state. The MCI concept has since been borrowed by other neurodegenerative diseases, particularly Parkinson's disease (PD). However, the operational definition of MCI may not adequately convey the intended concept. Additional modifications to the concept and its operationalization are needed to better identify patients with incipient cognitive impairment and to guide clinical and research practice.
Patients with PD have a very high likelihood of developing dementia, which develops insidiously. Cognitive impairment may start even before other symptoms, although this is not in accordance with Braak's sequence.
There are no available data to support the concept that a certain constellation of cognitive symptoms in an otherwise healthy individual heralds the development of PD or will indeed progress to dementia.
Therefore, at present, identification of subtle cognitive dysfunction even in a person with diagnosed PD does not benefit the patient and should be avoided, except for research purposes.

Point of view: No
Accumulation of phosphorylated tau (p-tau) is accepted by many as a long-term consequence of repetitive mild neurotrauma, based largely on brain findings in boxers (dementia pugilistica) and, more recently, former professional athletes, military service members, and others exposed to repetitive head trauma. The term “chronic traumatic encephalopathy” has been in the literature for decades, although it has been applied more liberally to sporting activities since 2005. The most specific pathology, according to a recent consensus group, is the presence of phosphorylated tau in perivascular areas and in the depths of sulci. Some caution before accepting chronic traumatic encephalopathy as an entity is warranted, however. Concussions and subconcussive head trauma exposure are poorly defined in available cases, the clinical features reported in chronic traumatic encephalopathy are not at present distinguishable from those of other disorders, and adequate control groups as well as prevalence data are virtually non-existent. Moreover, dementia pugilistica, which has widespread acceptance, has had autopsy correlation in a surprisingly small number of cases, which are further complicated by numerous co-morbidities, including substance abuse, vascular disease, infection, and genuine neurodegenerative disease. With respect to the modern iteration of chronic traumatic encephalopathy, the association of sparse immunohistochemical reactivity with psychiatric signs such as impulse control issues and suicide is also problematic. Predicting complex behaviors on the basis of such changes is beyond the reach of neuropathological interpretation. In general, because the definition of neurodegenerative disease requires a defined clinical phenotype, invariable disease progression, and a defined pathological phenotype, chronic traumatic encephalopathy falls short of neurodegenerative disease.
Indeed, the clinical phenotype varies from normal to advanced dementia, disease progression is lacking in the majority of reported cases, and the pathological phenotype varies from an absence of pathology to widespread pathology and co-morbidities. As such, chronic traumatic encephalopathy is more a hypothetical construct or concept than a neurodegenerative disease entity.

Point of view: Yes
Normal pressure hydrocephalus (NPH) is characterized by the classic triad of cognitive dysfunction, gait disorder and urinary incontinence, sometimes colloquially referred to as the 3 Ws: “woozy, wobbly and watery.” A widely circulated New York Times article claimed that a vast percentage of Alzheimer’s disease cases did not in fact have AD but rather had NPH, which was treatable. This and other lay literature have suggested that NPH is an underdiagnosed condition. Because my university, the University of California San Francisco (UCSF), is a major neurology and neurosurgery tertiary referral center, we have been referred many cases of suspected NPH, often for shunting, in whom we have determined that the patients do not have an NPH disorder. Part of the problem lies with NPH being a poorly defined entity, both clinically and radiologically. Although the entity of NPH exists and some of its symptoms can be treatable and even reversible, it is imperative to avoid placing a shunt in a patient with a neurodegenerative disease, in whom progressive atrophy will lead to shunt complications such as subdural hematomas.

Point of view: No
Corticobasal degeneration was first described in the late 1960s as "corticodentatonigral degeneration with neuronal achromasia" in three patients presenting with parkinsonism and involuntary motor activity. All patients showed asymmetric degeneration of the perirolandic and parolfactory cortices, basal ganglia and substantia nigra. In most cases, the affected cortices featured gliosis and ballooned neurons. From this report, others followed with alternative nomenclature but similar clinicopathological features. By the late 1980s, when the term corticobasal degeneration (CBD) received almost universal acceptance, CBD was considered a unique clinicopathological entity, implying an almost perfect correlation between the clinical features and a particular neuropathological entity. However, with the advent of immunohistochemistry for tau protein, it became evident that different pathological entities, including Alzheimer's disease and frontotemporal lobar degeneration with tau inclusions of the Pick's or PSP type, could underlie a "CBD" clinical presentation. Subsequently, the term corticobasal syndrome (CBS) replaced CBD as a clinical diagnosis, and the term CBD was reserved for a distinctive 4-repeat tauopathy involving the gray and white matter. Recent clinicopathological series report that only about one-third of patients developing CBS show CBD; the remainder present other entities, especially Alzheimer's disease. Ongoing studies are focusing on refining the CBS clinical classification to enable better antemortem prediction of the underlying pathology in these patients.

Point of view: Yes
Attention to psychological and behavioral issues becomes an even greater treatment consideration as the frequency of a patient’s migraine increases, disability secondary to headache increases, and/or there is an inadequate response to usually effective treatment. As migraine becomes more chronic, headache-related disability increases, direct and indirect costs rise, and there are higher rates of co-morbid conditions, greater poly-pharmacy, and greater social impediments. Such findings highlight the complexity of these patients, which necessitates a comprehensive behavioral treatment strategy. Hence, behavioral therapy is necessary for the complete treatment of migraine and chronic migraine.
It should be emphasized that behavioral and other non-pharmacological treatments are not anti-pharmacological. The combination of pharmacological and non-pharmacological treatment has been shown to be superior to either alone and appears to maximize long-term therapeutic benefit. It is a mistake to view pharmacological and behavioral treatment strategies as adversarial, contradictory, or oppositional.
In addition, effective non-pharmacological therapies help to ensure compliance with pharmacological treatment, which has been shown to be a significant problem among headache patients. Barriers to the use of abortive and preventive medications have been identified and found to be limiting factors in treatment efficacy. Modifying maladaptive behaviors that undermine adherence to pharmacological treatment is a critical component of the treatment of migraine and chronic migraine. This underscores the reality that pharmacological treatment itself involves a series of behaviors and should therefore be considered "behavior therapy" as much as any non-pharmacological treatment strategy.
The American Academy of Neurology-U.S. Consortium noted a variety of reasons that lead migraine patients to seek behavioral and other non-pharmacological treatment for migraine headache. These include: 1. Patient preference; 2. Poor tolerance of or poor response to preventive medications; 3. Medical contraindications to medications; 4. Pregnancy, planned pregnancy, or nursing; 5. History of overuse of acute-care medications; 6. Significant stress or deficient stress/pain coping strategies.
More than 100 empirical studies have examined the efficacy of bio-behavioral therapies for headache. The American Academy of Neurology-U.S. Consortium published evidence-based guidelines for migraine headache treatment and concluded that relaxation training, thermal biofeedback combined with relaxation training, EMG biofeedback, and cognitive-behavioral therapy are effective treatment options for migraine. These results have been confirmed in other meta-analytic reviews.
Research has identified a variety of "modifiable" risk factors that appear to be associated with the escalation of the frequency and severity of migraine headache and are amenable to behavioral treatment. These include attack frequency, obesity, medication overuse, stressful life events, caffeine overuse, and snoring/sleep apnea.
The above behavioral treatments are critical in formulating a strategy to address the modifiable risk factors for migraine escalation noted above. Effective treatment begins with a thorough diagnostic interview and the introduction of a headache diary as a tool for self-monitoring. Educating patients about headache mechanisms and the course of treatment fosters a collaborative relationship between patient and clinician. Patients must be "active participants" in the management of their headache issues. It is essential that clinicians monitor and remain attentive to treatment adherence with respect to both pharmacological and non-pharmacological treatment strategies.


Point of view: No
All treatments for headache serve one or more of three functions: to abort an existing attack, to relieve pain when headache is present, or to prevent future headache attacks. This presentation reviews the evidence base for medication and psychological interventions for achieving each of these aims, paying special attention to the qualifiers "necessary" and "complete". Numerous well-designed clinical trials support the utility of various medications, primarily triptans, for aborting migraine attacks. Only one psychophysiologically based approach, blood volume biofeedback for constricting blood flow in the temporal artery, purports to serve an abortive function. Although developed and designed with this intent in mind, not a single investigation has examined whether it indeed serves this purpose (or what its exact mechanism of action is). Evidence is similarly lacking to support the notion that psychological approaches alone have palliative effects on existing headaches, while evidence does support medication as lessening the intensity of extant headaches. The major thrust of psychologically based treatments for headache can thus be considered the prevention of future attacks. In this regard, both treatment approaches (medication and psychological) have been shown to be of value to chronic migraineurs. Again, numerous well-controlled trials and qualitative and quantitative reviews (meta-analyses) support the efficacy of varied prophylactic pharmacological agents (anti-epileptics, beta-blockers, botulinum toxin, etc.) for reducing key headache parameters. With very few exceptions, the clinical trials examining psychological prophylaxis have been confounded by allowing patients to continue on their current medication regimens.
Meta-analyses comparing medication and these "confounded" psychological treatments generally find similar outcomes, raising questions about the unique contributions of the psychological interventions. Only a few large-scale clinical trials have compared "unconfounded" psychological treatments to medication alone, with both forms of treatment found to produce similar effects. The combination of the two treatments in these limited trials has shown some additive effects. Having presented this brief review of the evidence, it is time to return to the central question of this debate: are both treatments necessary for complete treatment? The conclusion has to be a definitive no, for two main reasons. 1. Psychological treatments have yet to show any clinical utility for aborting or palliating existing headaches. For many patients, medications are not only sufficient but are the only approach with supportive evidence for these purposes; medication alone provides the level of relief they are seeking (and thus may be considered complete). 2. As a substantial number of migraine patients respond well to prophylactic medications, one must ask what the incremental utility is of additionally pursuing psychologically based treatments, given that they require special training that few providers have, are often effort-intensive, sometimes require specialized devices (in the case of biofeedback), are not widely available, have yet to be routinely imported into the primary care and specialty settings where treatment is typically administered, and often are not covered by third-party payers.
It may turn out to be the case that a small percentage of migraineurs will find it necessary to pursue both treatment options (perhaps others as well), but the research existing at present provides very few insights into deciding which patients might in fact need or even want a combination of treatments, nor is there a solid basis for deciding what in fact constitutes complete treatment.


Point of view: Yes
2500 years ago, patients would queue in the town square and await their turn to see the traveling physician, hoping he would have the right diagnosis and treatment in his head. We have come a long way since then. Today, patients sit in waiting rooms, awaiting their turn to see the physician, hoping he or she will have the right diagnosis and treatment in his or her head. 2500 years ago, according to the best medical science, there were four diseases, corresponding to black bile, yellow bile, phlegm and blood, and there were four treatments: bleeding, vomiting, diarrhea, and expectoration. Most physicians could keep the diagnoses and treatments straight. But today, approximately 100,000 distinct disease states have been described, and most have multiple treatment options. To expect "most physicians", meaning primary care doctors, to retain enough information to diagnose and properly treat 100,000 different diseases is a tall order.
The challenge to a primary care doctor confronted with a patient presenting with headache is only slightly less daunting. Bearing in mind that headache makes up only a small percentage of the patient complaints seen in a typical primary care clinic, the primary care physician would need to know the diagnostic criteria for more than 150 primary headache disorders identified in the ICHD 3 (beta). And this number does not even include the many secondary headache disorders that could present in the out-patient setting.
Admittedly, many of the 150 headache disorders identified in ICHD 3 (beta) are relatively uncommon, but even among the common primary headaches (migraine with aura, migraine without aura, cluster headache, tension-type headache, and the chronic forms of each) the diagnostic criteria are quite specific, and it is the rare neurologist, much less primary care physician, who can accurately characterize each of these clinical entities. Indeed, even among headache specialists, careful review of documentation does not always support the clinical impression if one adheres strictly to ICHD 3 (beta) criteria.
Thus, there is little evidence to suggest that primary care doctors do an adequate job of diagnosing the primary headaches. In fact, there is considerable evidence to the contrary. Is a computer any better?
A computer program is only as good as the programmer makes it. If the program does not ask the right questions, the correct conclusions cannot be drawn. If the rule engine does not extract the correct data points, or the correct data points are not available, then the program fails. Fortunately, when there are established criteria for diagnostic categories, as there are in the ICHD 3 (beta), it is a simple matter of reverse engineering to create questions that will identify data points to meet or exclude those criteria. Thus, a computer program cannot, by definition, draw a conclusion that is not supported by data. The variable, of course, is the quality of the data obtained.
There are a variety of techniques to help ensure the reliability of the query. For example, the same data point can be approached several ways with different questions, or even with the same question phrased several different ways; the consistency of the responses can then be evaluated with respect to reliability. For example, to identify the migrainous feature of light sensitivity, the question could be "are you sensitive to light?" Standing alone, this question could have several interpretations that may or may not be related to a diagnosis of migraine. The patient could have photophobia due to a concurrent condition such as retinitis pigmentosa, or long-standing light sensitivity not correlated with other migrainous features, resulting from light eye color or even central sensitization due to another chronic primary headache condition. However, if the response to that question is combined with a question about light avoidance during a headache and a question about sound sensitivity (ICHD 3 (beta) requires BOTH photophobia and phonophobia), the data point [+ photophobia] becomes more reliable.
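As an illustrative sketch only (the question identifiers and the `validated` helper are invented for this example, not drawn from any actual diagnostic program), the cross-checking of redundant questions together with the requirement for both photophobia and phonophobia might be expressed as:

```python
# Hypothetical sketch of a cross-validated data point in an ICHD-style
# rule engine; question names are invented for illustration.

def validated(responses, questions):
    """Accept a data point only when every redundant phrasing agrees (True)."""
    return all(responses.get(q, False) for q in questions)

# Two phrasings probing each underlying data point
PHOTOPHOBIA_QS = ["sensitive_to_light", "avoids_light_during_attack"]
PHONOPHOBIA_QS = ["sensitive_to_sound", "avoids_noise_during_attack"]

def sensory_criterion_met(responses):
    """ICHD 3 (beta)-style rule: BOTH photophobia and phonophobia required."""
    return (validated(responses, PHOTOPHOBIA_QS)
            and validated(responses, PHONOPHOBIA_QS))

patient = {
    "sensitive_to_light": True,
    "avoids_light_during_attack": True,
    "sensitive_to_sound": True,
    "avoids_noise_during_attack": False,  # inconsistent answers -> rejected
}
print(sensory_criterion_met(patient))  # False
```

Because the two phonophobia phrasings disagree, that data point (and hence the criterion) is rejected, mirroring the reliability check described above.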
It is also possible to use decision-tree analysis (in which the response to one question prompts the next question) and machine learning to improve the quality of the data. These techniques, when combined with clinical expertise from a panel of experts, can reliably tie historical elements obtained in a web- or app-based history to specific sets of diagnostic criteria. One might even argue that a well-designed computer program is MORE reliable than an unstructured or semi-structured interview for reaching a diagnosis that meets specific criteria, because it is not prey to the vagaries of memory or interviewer bias.
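The decision-tree idea, in which one answer determines the next question asked, can be sketched as follows (the questions and branches here are invented for illustration, not taken from any real instrument):

```python
# Hypothetical decision-tree fragment: each answer selects the next
# question, so the interview follows only the relevant branch.
TREE = {
    "recurrent_attacks": {True: "duration_4_to_72_hours",
                          False: "sudden_severe_onset"},
    "duration_4_to_72_hours": {True: "unilateral_pulsating_pain",
                               False: "other_duration_pattern"},
}

def next_question(current, answer):
    """Return the question implied by this answer, or None at a leaf."""
    return TREE.get(current, {}).get(answer)

# A positive answer about recurrence routes toward migraine-style questions
print(next_question("recurrent_attacks", True))          # duration_4_to_72_hours
print(next_question("unilateral_pulsating_pain", True))  # None (leaf)
```

A machine-learning layer could then reorder or prune such branches from response data, but the criteria-matching logic itself remains deterministic.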
Where, then, is the role of the clinician in the diagnosis of primary headache? The diagnostic algorithms of a computer-based tool are only as good as the data entered. There is no physical examination and no diagnostic testing. At best, the computer can make a diagnosis based on historical elements provided by the patient and the physician. If the physician can also enter findings from examination and testing, diagnostic accuracy can be improved insofar as such findings are requisite to the diagnosis.
In the case of a patient presenting with a primary headache disorder, the consensus among headache specialists is that the vast majority of these headaches can be diagnosed on history alone. This is not to say that examination and testing are irrelevant, only that these components of the medical encounter rarely change the diagnosis generated from the history in patients with primary headache. Given that the gold standard, at present, is a clearly articulated set of criteria for a large number of entities, the computer, prone to neither interviewer bias nor knowledge deficits, is ideally suited to correlate data points in the history with elements of the diagnostic criteria.


Point of view: No
Primary headache disorders are the most common brain conditions in the general population, with specific diagnostic criteria revised every five years (ICHD-III beta). But headache is, in the main, a symptom rather than a condition; a primary headache disorder is present only when headache attacks consistently fulfill those specific diagnostic criteria. In addition, the ICHD-III beta criteria demand a normal neurological examination for a primary headache disorder, which only an experienced neurologist can perform. In most cases headache is primary, but secondary headache disorders may be related to life-threatening conditions; they may respond to common analgesics and closely mimic primary ones. Diagnostic tests are therefore necessary when the treating physician doubts the primary origin of the headache. Although there are official recommendations for these tests (the EHF recently published a consensus on technical investigation for primary headache disorders), the clinical relevance and interpretation of the test findings remain extremely complex, and again only an experienced physician can undertake this. There is no doubt that computers offer tremendous assistance in searching and analyzing data; nevertheless, this support falls short in front of a person suffering from headache. Because headache is the commonest presenting symptom among people seeking medical consultation, these concerns (among many others) about using digital diagnostic procedures exclusively are central to headache management.


Point of view: Yes
In 1962, the Congress of the United States passed the Kefauver-Harris Amendment, which mandated that manufacturers provide evidence of drug effectiveness in addition to safety in order for the Food and Drug Administration (FDA) to approve an agent for a specific clinical indication. In 1970, the FDA published guidelines describing what constituted acceptable controls in a clinical trial. The double-blind randomized clinical trial was established as the "gold standard" for the emerging pharmaceutical industry. In 2012, the International Headache Society (IHS) Clinical Trials Committee published "Guidelines for controlled trials of drugs in migraine: Third Edition. A guide for investigators". In that document, they noted that placebo rates for migraine relief in clinical trials of abortive agents ranged from 6 to 47%, while placebo rates for headache reduction in preventive trials ranged from 20 to 40% (or even higher). The committee recommended that in clinical trials "Drugs used for acute treatment of migraine should be compared with placebo". With respect to preventive agents, "Drugs used for migraine prophylaxis should be compared with placebo. When two presumably active drugs are compared, placebo control should also be included in order to test the reactivity (assay sensitivity) of the trial which would allow greater generalizability of study results".
In 2002, the World Medical Association Declaration of Helsinki stated that when an effective treatment for a disease exists, it is unethical to assign patients in a research study to a treatment known to be less effective. Standards for the acceptable use of a placebo in clinical trials have changed over time, and (with informed consent) it is now considered acceptable to use placebos in trials in which withholding the best current treatment will result in only temporary discomfort and no serious adverse effects. The IHS guidelines (noted above) state that research protocols should allow the use of rescue medication any time after the first primary efficacy time point (typically, two hours after intake of study medication). This is necessary for the evaluation of new treatments.
Demonstration of treatment efficacy demands that the target (active) agent be shown to be statistically significantly superior to an inert substance (placebo) not believed to be a specific therapy for the target condition. As noted above, this is the "gold standard" in clinical research. Placebo rates (and the factors that influence them) become increasingly important as potential methodological manipulations (e.g., "over-powering" clinical studies) may allow small differences between groups to reach statistical significance when, in fact, such differences may be clinically meaningless. Similarly, placebo rates have been shown to vary dramatically depending upon a variety of "non-specific" treatment factors (the type of treatment, degree of invasiveness, contextual factors in the research interactions, unbalanced randomization ratios, etc.).
Placebo-related variables are believed to contribute to treatment efficacy in clinical settings. While they create "noise" in the interpretation of research results, enhancing these variables is desirable in clinical practice. In sum, issues related to placebo are extremely important in both research and clinical settings.


Point of view: No
Medicine has been based on placebo for millennia, since man first became conscious of himself, and this continues today. Because pain is not only a sensory but also an emotional experience, the brain modifies pain perception considerably. Thus, placebo effects for pain and headache appear maximal, while placebo effects for outcomes like cancer survival appear minimal. In randomized controlled trials (RCTs) of headache prevention and acute treatment, the placebo effect reaches approximately 30%. Interestingly, the benefits of placebo persist even when the placebo is honestly described, indicating that whether treatment involves medication or placebo, the information provided to patients and the ritual of pill taking are important components of care (Kam-Hansen et al., 2014). Thus, placebo and medication effects can be modulated by expectancies. There is good evidence that several agents are superior to placebo for almost all primary headache disorders. In addition, patients treated with placebo do experience adverse effects (the nocebo effect; Mitsikostas et al., 2015), raising crucial ethical concerns. These observations indicate that the use of placebo in headache RCTs is not as essential as we used to believe, because headache treatment itself already includes the placebo effect, as in clinical practice. On the contrary, head-to-head comparisons of drugs in RCTs remain indispensable and practical.


Point of view: Yes
The management of patients with migraine is often unsatisfactory because available acute and preventive treatments are either ineffective or poorly tolerated. Calcitonin gene-related peptide (CGRP) has been found to have a key role in migraine, supported by studies showing that CGRP is released during migraine attacks and that different CGRP receptor antagonists (gepants) abort migraine pain; one study also indicated a prophylactic effect.
Recently, three different monoclonal antibodies targeting the CGRP ligand (LY2951742, ALD403 and TEV-48125) and one targeting the CGRP receptor (AMG334) have completed phase 2 trials in frequent episodic migraine, and the results have been reported. These early trials revealed all of them to be significantly more effective than placebo. TEV-48125 has also been studied in chronic migraine with a good outcome. Adverse effects in these trials did not differ from placebo.
These humanized antibodies against CGRP or the CGRP receptor represent a promising new approach to migraine prevention and are currently in phase 3 studies.


Point of view: No
The International Headache Society defines chronic migraine as fifteen or more headache days per month over a three-month period, of which eight or more are migrainous, in the absence of medication overuse. Episodic migraine is the other migraine sub-type, defined as fewer than 15 headache days per month.
We always start preventive treatment in combination with a detoxification programme in cases where the patient fulfils analgesic overuse criteria. Many different strategies and drugs have been proposed to treat this difficult condition (steroids, NSAIDs, different antidepressants, antiepileptics, etc.), but only topiramate and onabotulinumtoxinA have proved their efficacy in clinical trials. Other alternatives with less evidence could be valproic acid, propranolol or a combination of different therapeutic approaches. With this approach we obtain a very good response in 30-40% of patients and a good response in another 30-35%.
The importance of calcitonin gene-related peptide (CGRP) in the pathogenesis of migraine is well characterized. Several trials with different compounds have proved the efficacy of anti-CGRP antibodies in episodic migraine. A multicenter, randomized trial using two different doses of a humanized monoclonal antibody (TEV-48125) against placebo has been published. The patients included were allowed to use up to 2 different preventives at a stable dose. The trial was positive, with a reduction in headache-hours of any severity and in the number of moderate or severe headache-days, and a significant difference in the number of days on which triptans were used between the placebo group and each of the TEV-48125 dose groups. Tolerability was good, with no serious adverse events. But the percentage reduction in these parameters was not superior to current treatments. Besides, around 40% of patients were using at least one additional preventive.
Bearing in mind these data and also the possible cost of these biological compounds, anti-CGRP antibodies will be an alternative, but not the treatment of choice, in the management of chronic migraine.


Point of view: Yes
Chronic migraine is a disabling condition that severely affects patients' ability to lead a productive life. Its prevalence is around 2% of the general population, and it is defined as headache lasting 15 or more days per month for more than 3 months in subjects with a history of migraine. Medication overuse is very frequent, and in headache centres more than 90% of patients with chronic migraine meet overuse criteria.
According to ICHD-III beta, Medication Overuse Headache is headache occurring on 15 or more days per month developing as a consequence of regular overuse of acute or symptomatic headache medication (on 10 or more, or 15 or more, days per month, depending on the medication) for more than 3 months. It usually resolves after the overuse is stopped.
All the drugs used to treat migraine can produce MOH, although the drugs involved change over time and from region to region.
Mechanisms may differ from one class of overused drug to another and may include a combination of pronociceptive pain facilitation with weakened descending pain inhibition. We will discuss these mechanisms for the different compounds and their implication in the chronification of headache. Besides, many of the drugs used can produce dependence, and patients with MOH share many characteristics with patients with other drug dependences.
Regarding treatment, withdrawal of the overused medication is the treatment of choice. Withdrawal of the overused drug leads, in most cases, to an improvement in the headache and can even resolve the problem completely.
For all these reasons, medication overuse is the main causative factor in chronic migraine.


Point of view: No
The majority of patients referred to tertiary headache clinics suffer from chronic migraine and medication overuse headache, which represent the most important challenge for headache specialists. Chronic migraine is considered to evolve from episodic migraine, with an incidence rate of 2.5%. The transition from the episodic form to a more frequent attack pattern may require several months or years and is influenced by lifestyle, life events, comorbid conditions and genetic background. More importantly, this transition period is also frequently accompanied by overuse of abortive headache medication. Many factors have been identified as playing a role in the migraine chronification process. Risk factors such as older age, female sex, Caucasian race, low socioeconomic status with low education level and income, genetic factors, and a family history of mood disorders and substance use disorders (alcohol, drugs) are unfortunately not modifiable. These patients tend to have higher rates of psychiatric disorders such as depression, anxiety and personality problems; impairment in occupational, social, and family functioning; and medical comorbidities such as hypertension, diabetes, high cholesterol levels, obesity and chronic pain disorders. In addition, physical inactivity, smoking, medication overuse, caffeine/theine overuse, sleep disorders (e.g., insomnia, snoring), and chronic musculoskeletal and gastrointestinal complaints are also implicated. Lack of awareness in avoiding trigger factors, irregular lifestyle rhythms, use of ineffective drugs and/or dosages, and late referral to headache centers also increase the risk of medication overuse in the chronification process of migraine. Medication overuse headache (MOH) is a common and debilitating secondary headache that may complicate every type of primary headache, and all the drugs employed for abortive headache treatment can cause MOH.
Withdrawal from medication overuse is an important step in treatment, and it reduces attack frequency. However, withdrawal of the overused abortive drugs per se does not reverse the chronic migraine process. In conclusion, medication overuse is one of the significant contributors to the chronification process but NOT the main causative factor.


Point of view: Yes
A cerebral cortical phenomenon known as cortical spreading depression (CSD) has been linked to lateralized headache and shown to be able to activate peripheral trigeminal fibers and second-order trigeminal neurons in the brainstem trigeminal nucleus (TNC). CSD is implicated in releasing CGRP and nitric oxide from trigeminal nerve endings, leading to neurogenic inflammation in the dura mater. CSD is key to understanding the familial hemiplegic migraine phenotype, the critical involvement of the glutamatergic synapse, female hormonal influences and the efficacy of preventive anti-migraine drugs.
Animal studies investigating the mechanisms of migraine and CSD are commonly conducted under anesthesia, despite the fact that pain is a conscious experience. Anesthesia has profound effects on the mechanisms by which CSD is initiated and propagated, and clearly prevents observation of any associated behavioral response. Functional decortication of one hemisphere by CSD in the lissencephalic brain would result in transient visual, somatosensory and motor deficits. In freely moving lissencephalic animals, CSD evoked reduced locomotor activity and freezing and grooming episodes, and the animals emitted pain calls (22-27 kHz) during freezing episodes. Activation of the thalamic reticular nucleus (TRN) by CSD was detected only in awake animals. Electrocorticographic recordings demonstrated the direct propagation of CSD waves into the thalamic reticular nucleus. Activation was unilateral, lateralized to the side on which CSD occurred and on which the trigeminal pain nucleus was activated. It was dependent on full conscious experience, being highly vulnerable to anesthetics. Blockade of TRN activation by valproate, triptans and CGRP antagonists implicates its relation to nociception. CSD selectively activated the visual sector of the TRN, while the other six TRN sectors (auditory, gustatory, visceral, somatosensory, motor and limbic) were not affected.
The TRN consists of GABAergic neurons that surround the thalamus. The TRN projects to thalamic relay nuclei in an inhibitory manner and receives glutamatergic excitatory afferents from both the cortex and the thalamic relay nuclei. The TRN mainly functions as a gatekeeper of sensory outflow to the cortex and is involved in selective attention, lateral inhibition, and discrimination of sensory stimuli. Burst firing of TRN neurons inhibits thalamo-cortical transmission and is associated with sleep spindles or silent periods during wakefulness. The lack of bilateral TRN activation argues against a non-specific attentional or arousal effect. Thalamic burst firing occurs spontaneously in human neuropathic pain conditions and also following noxious stimulation. Fifty to 65% of neurons in the somatosensory TRN are nociceptive. Cav3.1 and 3.2 (T-type channel) knockout mice, which are incapable of thalamic burst firing and show fewer bursts, exhibit increased thresholds for somatic and visceral nociception. The oscillations in the low-frequency spindle range observed during freezing periods following CSD are suggested to be associated with pain perception.
Involvement of the TRN, a subcortical thalamic structure, by a cortical event is important in explaining several clinical features of migraine: 1) dysfunction of the GABAergic neurons in the TRN would result in enhanced transmission of sensory information to the visual cortex; 2) photophobia and the visual hallucinations of aura may reflect dysregulation of visual stimuli by the TRN; 3) the TRN could play a role in either termination or initiation of an attack, as sleep is closely related to migraine, attacks are often associated with the circadian cycle, and attacks are typically relieved by sleep; 4) activation of an ipsilateral central route, particularly in awake subjects, could also play a role in activating ipsilateral brainstem pain structures secondary to cortical activation.


Point of view: No
Migraine is a complex disease surrounded by numerous hypotheses. No doubt there is a genetic background, but it has been mechanistically demonstrated only for hemiplegic migraine, with glutamate as a common trait: hyperexcitability and a reduced threshold for induction. Investigating the early phase of migraine attacks is challenging for technical reasons. Olesen (1981) early on described rCBF alterations following injection of the radioactive tracer 133Xenon, later confirmed by MRI (Hadjikhani 2001); there was an initial hyperemia followed by oligemia that spread across the cortex anteriorly without respecting vascular territories, and there was a close correlation between the rCBF changes and the observed neurology. Woods (1994) reported, in a PET study, a bilateral spreading wave of cerebral hypoperfusion in a spontaneous migraine attack, accompanied by headache but without clear aura. Hadjikhani described, with MRI, a spreading wave of rCBF reductions in one patient who developed a migraine aura during basketball training. The rCBF changes observed in these studies correlated with the neurological symptoms. Thus, evidence exists for an association between the aura phase preceding pain in a migraine attack and a reduction in rCBF.
Numerous experimental studies have examined induced cortical spreading depression (CSD) as a surrogate method to reproduce and understand this early part of a migraine attack, but only scant clinical data exist. CSD leads to dramatic alterations in cerebral hemodynamics; however, the mechanisms involved in promoting and counteracting cerebral vasodilator responses are unclear (Busija 2008). Experimental data suggest that the hyperemia phase seen in rodents does not appear in primates, although the cortical depression can be induced (Lauritzen).
Is the cortex involved in all migraine attacks and a starting point?
Maniyar (2014/5) recently revealed subcortical activation in the premonitory phase (posterolateral hypothalamus, midbrain tegmental area, periaqueductal grey, dorsal pons) and in various cortical areas, including occipital, temporal and prefrontal cortex, in conjunction with glyceryl trinitrate-triggered migraine attacks. These brain activations can explain many of the premonitory symptoms. Despite the demonstration of cortical participation in migraine aura, the contribution of other brain structures, including subcortical nuclei, may indicate why the aura phenomenon is present in only some patients; the sequence of neurobiological events during a migraine attack remains to be elucidated further.

Point of view: No
By Electrical Stimulation, Neurostimulation or Neuromodulation for the treatment of Cluster Headache we mean the following procedures: 1. hypothalamic deep brain stimulation (DBS); 2. occipital nerve stimulation (ONS); 3. sphenopalatine ganglion stimulation (SPGS); 4. vagal nerve stimulation (VNS); and 5. spinal cord stimulation (SCS).
Several features must be considered:
1. Procedure characteristics: these are surgical, invasive procedures;
2. The time elapsed before onset of effect, which is highly variable;
3. Fast relapse after interruption of stimulation;
4. Potential complications: infectious, hemorrhagic, functional, mechanical and/or technical;
5. Effectiveness: a reduced number of randomized controlled studies, with a large variability in the degree of effectiveness across series;
6. An unfavorable cost/effectiveness profile;
7. Limited clinical applicability: a reduced number of eligible patients, with use restricted to chronic CH patients and pharmacologically refractory CH patients.
I should say that Electrical Stimulation will not replace medications for the treatment of Cluster Headache, at least in the near future and without better evidence on the efficacy, safety and cost-effectiveness of these procedures, notwithstanding their great utility for chronic refractory Cluster Headache patients.

Point of view: Yes
Unruptured intracranial aneurysms (UIA) are relatively common in the general population and can be found at a prevalence as high as 6%. UIA are not static vascular anomalies but grow over time and eventually rupture. The subarachnoid hemorrhage that results from this rupture can be a dramatic event causing high morbidity and even death. The impact of a ruptured UIA is illustrated by a Finnish study, in which 178 hospitalized patients with UIA had, during a mean follow-up of 13 years, a 50% excess mortality compared with the general population. In the United States, rates of in-hospital mortality in acute care have reached 6.3%. Therefore the decision whether or not to treat a UIA must take into account the fact that this pathological finding is not benign, affects young individuals and causes a significant clinical as well as social burden. For all the reasons pointed out thus far, we conclude that, in theory and with complication-free treatment, all UIA should be treated. In favor of this approach is the fact that the rate of treatment complications has been decreasing progressively in recent decades, making endovascular and surgical treatment safer. Sometimes, a wrong decision to exclude a UIA from intervention is related to the false belief that small aneurysms are devoid of rupture risk. This is based on the findings of older studies such as the International Study of Unruptured Intracranial Aneurysms (ISUIA). In that study, patients with no history of subarachnoid hemorrhage and UIA <7 mm in diameter showed no ruptures during follow-up. However, ISUIA has been criticized for several reasons. First, the number of patients in certain categories was small, so some of the estimates of rupture risk in the strata are imprecise. The study also showed some internal inconsistency, because some predictors of rupture confirmed in the first phase were not present in the second phase.
Additionally, the proportion of patients undergoing an interventional procedure varied tremendously from center to center in this nonrandomized study; in general, the surgeon or radiologist evaluating the patient would have managed conservatively only those patients deemed to be at low risk of rupture, and therefore selection biases could change the risk profile of included participants. Finally, differential follow-up and detection biases could alter apparent rates, and some outcome events may have been missed. Studies with very long follow-up have found that the rate of rupture can be as high as 29% over a patient's lifetime, with an annual rupture rate per patient of 1.6%. The real picture seems to be that a patient with a UIA may have a more dynamic and serious course, and if follow-up is stretched far enough, all UIA will rupture. The most recent meta-analysis of all combined studies shows that the studies vary dramatically in size and duration of follow-up and include both prospective and retrospective designs. As suspected, aneurysms <7 mm also showed ruptures, at an annual rate of 0.4%. Curiously, family history and previous rupture from a different aneurysm were not identified as risk factors for rupture. This means that we cannot really predict "safer" UIA on the basis of size or on clinical grounds. In conclusion, because UIA can have such a catastrophic clinical outcome and treatments are increasingly safer, all UIA are potentially indicated for treatment.

Point of view: No
Three to four percent of stroke cases are caused by subarachnoid bleeding due to aneurysm rupture. Unruptured intracranial aneurysms (UIA) comprise asymptomatic incidental aneurysms, symptomatic aneurysms, and additional aneurysms in SAH patients with multiple aneurysms. The rupture incidence of unruptured aneurysms in the general adult population is estimated to be at least 1% per year. Recently, well-designed prospective clinical studies, meta-analyses and guidelines dealing with the diagnosis and therapy of UIAs have been published.
The prevalence of unruptured intracranial aneurysms differs from population to population, with more frequent findings among the elderly, females and polycystic kidney disease patients. Familial unruptured intracranial aneurysm patients represent ca. 8-10% of cases. Because only a minority of UIA patients will present with SAH, it is important to identify those UIA patients at high risk of rupture. Hypertension, smoking, certain locations, growth and special morphology of the UIA are associated with a high risk of rupture, but it is unclear whether modification of the modifiable risk factors influences the outcome of previously asymptomatic UIAs. Patients with a family history of aneurysm, cranial nerve symptoms or a prior SAH are at higher risk and need individual consideration and close follow-up. Patients with polycystic kidney disease and persons with a family history of aneurysms or SAH may benefit from screening, but the cost-effectiveness of screening in other groups is unclear. Patients with ≥2 family members with IA or SAH should be offered aneurysm screening by CTA or MRA. TOF MRA is preferred to CTA for repeated long-term follow-up.
With increasing size over 7 mm, the risk of SAH increases. Internal carotid and basilar artery aneurysms are more likely to grow than those in other regions. Cavernous carotid aneurysms have the lowest rupture rates, anterior circulation aneurysms have intermediate rates, and posterior circulation aneurysms have the highest. Unfortunately, both the optimal interval between imaging studies and the imaging modality to use remain unclear. Although DSA is the optimal method for decisions on repair, follow-up imaging should be performed by either CTA or MRA.
The physician should consider patient age, aneurysm location and size, comorbidities and the long-term outcomes of his/her center. Microsurgical intervention is associated with higher morbidity than endovascular repair; therefore, in elderly patients the benefit of coiling seems greater. Microsurgery may be preferred for the treatment of the majority of MCA aneurysms, and endovascular intervention for the treatment of most basilar apex and vertebrobasilar confluence aneurysms. Flow-diverting stents and stent-assisted coiling might be new treatment strategies in the future and can also be considered in carefully selected cases, but their long-term outcome is not yet determined. After microsurgery or endovascular intervention, repeated imaging and assessment of cognitive outcome are warranted. Assessment of the degree of aneurysm obliteration after surgical or endovascular intervention is also necessary to determine the frequency of follow-up. Long-term follow-up is particularly important for aneurysms that are incompletely obliterated.
Summary: (1) Numerous factors should be considered in determining the optimal management of a UIA (growth, size, location, morphology, age, prior SAH, family history, multiplicity): all these factors may predispose to a higher risk of rupture; (2) Although surgery may be associated with longer-lasting protection against aneurysm regrowth, endovascular intervention could be superior to surgery in other respects (lower morbidity and mortality, shorter length of stay, lower costs), so it may be reasonable to choose this therapy for basilar apex UIA and in old patients; (3) Endovascular coiling is associated with a reduction in procedural morbidity and mortality compared with surgical clipping in selected cases, but has an overall higher risk of recurrence; (4) If many risk factors for intervention exist in a patient with a small asymptomatic UIA and low hemorrhage risk, observation is a reasonable alternative; and (5) Although the ESO guideline gives only general recommendations on the therapy of UIA, both guidelines (ESO and AHA/ASA) agree that an individual decision is necessary before any therapeutic step.

Point of view: EBM
The way doctors manage patients—and management is here used in the sense of diagnosing, treating and following up patients—is a central issue for modern health systems.
In fact, when one looks at the new tendencies for health care policies—quality assurance systems, patient-centred care, rational practice implementation and outcome based financing (to name just a few)—the central role of the quality of care is obvious. And quality of care means above all clinical care, done by physicians.
The central issue is then: which is the best information source for clinical care, science or experience?
The classic definition of Evidence-Based Medicine (EBM) is from Dr. David Sackett (1996): "…the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient. It means integrating individual clinical expertise with the best available external clinical evidence from systematic research."
EBM is the integration of three factors into the decision making process for patient care (Sackett D, 2002): research evidence (found in clinically relevant studies conducted using sound methodology), clinical expertise (clinician’s cumulated experience) and patient values (personal preferences and unique concerns and expectations).
The practical steps of EBM include: 1) assess the patient, 2) ask the clinical question, 3) acquire the evidence, 4) critically appraise the evidence, 5) apply the results to the patient and 6) self-evaluate one’s practice.
Despite the impressive foundations EBM has constructed to bring research into practice, some clinicians still debate the role of the clinical experience of the individual doctor, as opposed to scientific data from high quality clinical studies, in taking care of patients.
Certainly, clinical experience is crucial for the quality of clinical care, if for no other reason than that it captures a reality that science hardly can. For example, the average elderly patient usually presents with three or four diseases and a couple of extra risk factors, and this type of patient is seldom studied in clinical research. The patients usually included in clinical study samples are very homogeneous, possess mostly the same level of baseline risk (or fall into well-defined subgroups), have far fewer co-morbidities and are therefore less representative of the patients needing care in daily practice.
On the other hand, clinical experience alone is absolutely insufficient to give patients a high quality of care. One has to stay abreast of the scientific evolution of one’s field of clinical practice, so that patients have the full benefit of innovation of care happening every day. As a well-known saying goes: “…If the clinician does not continually learn the scientific basis of his/her trade, after a while the patient is not consulting with a doctor but with a museum…”
Combining one's expertise with clinical data is not very problematic if these two types of knowledge more or less overlap. The problems arise when they conflict. For example, for decades patients with serious head trauma (GCS <14) were treated with steroids to diminish cerebral edema (an intervention never tested in a proper fashion). The CRASH study (Lancet 2004; 364: 1321–28) showed that, compared with placebo, methylprednisolone significantly increased mortality in the 2 weeks after head injury (21.1% vs 17.9%; relative risk 1.18 [95% CI 1.09–1.27]; p=0.0001). Likewise, the risk of death at 6 months of follow-up (Lancet 2005; 365: 1957–59) was clearly higher in the corticosteroid group than in the placebo group (25.7% vs 22.3%; relative risk 1.15, 95% CI 1.07–1.24; p=0.0001), as was the risk of death or severe disability (38.1% vs 36.3% dead or severely disabled; 1.05, 0.99–1.10; p=0.079). This is an example in which clinical impressions were invalidated by sound scientific data from high quality studies. Confronted with these results, the clinician must decide the appropriate course for the individual patient, always justifying his/her specific choices.
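As a quick illustrative check (not part of the original abstract), the relative risks quoted from CRASH follow directly from the event percentages reported in the text:

```python
# Illustrative only: verify that the quoted relative risks match the
# event percentages given for the CRASH trial.

def relative_risk(p_treatment, p_control):
    """Risk in the treatment arm divided by risk in the control arm."""
    return p_treatment / p_control

print(round(relative_risk(21.1, 17.9), 2))  # 2-week mortality: 1.18
print(round(relative_risk(25.7, 22.3), 2))  # 6-month mortality: 1.15
print(round(relative_risk(38.1, 36.3), 2))  # death or severe disability: 1.05
```

Any relative risk above 1.0 means the steroid arm fared worse, which is why the clinical impression of benefit was overturned.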
Concerning the scientific basis for medical decision support, two concepts are important (Horton R, 2007):
1. How good are the data in terms of inherent quality? This is reliability.
2. How appropriate are those data to the individual patient's problem? This is relevance.
If doctors want to use scientific data to manage patients, reliability and relevance are the most important questions to consider.
What about patients with stroke?
Given that there are many studies on stroke (a quick Medline search for papers with the word "stroke" in the title returns almost 70,000 articles), the question should be how to select the reliable and relevant studies to support medical decision making in stroke patients.
Specifically concerning therapy, we need to select clinical trials that are useful to guide us through interventions on stroke victims by providing data that correctly assess the effects of treatments on major morbidity as well as mortality, in subgroups of patients presenting with different baseline risks. Some treatments for chronic diseases can produce large benefits, but because stroke is a heterogeneous condition (similar patients can have different prognoses), the selection of individualized therapy should be based on clear and reproducible data.
The scientific basis of clinical practice demands that, once the best evidence is selected, every study should be appraised in terms of its internal validity (how rigorous is the design of the study to answer the clinical question it posed), the importance of its results (clinical, not statistical significance) and its external validity (the degree of generalizability of its results). For example, to minimise biases and random errors in clinical trials one should guarantee proper randomization, intention-to-treat analysis, rigorous blinding and accounting of patients (to name only a few factors).
Clinicians are used to treating individual patients and therefore may feel that clinical trials do not give individual information for optimal care. However, clinicians who feel this way should bear in mind that all the diagnostic and therapeutic techniques available to them were developed in groups of patients similar to the ones they wish to manage, since one of the reasons to do these types of studies is that the individual variability of prognostic factors can only be identified in sufficiently large randomized groups of patients compared with one another.
Concerning therapy for acute stroke, the external validity of clinical trial results is paramount, since it determines whether the data generated in these studies can be applied. But its limitations should also be understood: among the most important are the setting of the trial, the specific characteristics of trial patients, the outcome measures, differences between trial protocols and clinical practice, and differences in the rate of adverse effects of treatment. Once careful analysis is done, the practitioner can then apply (or not) the interventions tested in the studies.

There are a number of rare causes of stroke resulting from single gene disorders. The most common of these is Cerebral Autosomal Dominant Arteriopathy with Subcortical Infarcts and Leukoencephalopathy (CADASIL). In such cases a diagnosis can be made reliably from genetic testing, and most individuals with a causative NOTCH3 mutation, for example, will develop features of the disease during their lifetime. In such cases genetic testing is very important both to make a diagnosis and to offer presymptomatic testing to family members. This may be particularly relevant if family members are planning to have children and would like to consider prenatal testing.
However, these causes are rare, and genetic factors are much more important, on a population basis, for multifactorial stroke. Epidemiological studies, and more recently techniques estimating heritability from genome-wide association study (GWAS) data, have shown that genetic factors are important in common stroke. GWAS analyses have identified a number of genetic associations for stroke, and strikingly almost all of these are associated with specific stroke subtypes (i.e., for ischaemic stroke, large artery disease or cardioembolic stroke).
It has been suggested that genotyping for these variants could be useful in predicting stroke risk in individuals. Indeed a few years ago some commercial companies were offering risk prediction using genotyping for patients with a variety of complex diseases including cardiovascular disease.
However, currently this is not useful to the individual patient, and indeed this led the FDA to stop companies from advertising these services. The reason it is not useful is that each genetic variant accounts for only a small amount of increased risk. The odds ratios are usually between 1.1 and 1.2, i.e. they confer an extra 10% or 20% increase in risk. By studying sibling relative risk (an epidemiological measure of how genetic stroke is), one can work out how many variants with an odds ratio of about 1.1 to 1.2 would account for the observed heritability of stroke. This comes to about 100-200. Currently we have described only about 8 risk variants for ischaemic stroke. This means that even if we genotype these 8 variants, we account for only a very small proportion of overall stroke risk. Essentially the amount we can account for is so small that it serves no useful predictive purpose. Indeed it could be misleading, giving patients a false sense of reassurance.
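The arithmetic behind this argument can be sketched with hypothetical numbers (the per-variant odds ratio of 1.15 is an illustrative value in the 1.1-1.2 range quoted above, and the multiplicative-independence assumption is a simplification):

```python
# Hypothetical arithmetic illustrating why ~8 variants with odds ratios of
# 1.1-1.2 cannot usefully stratify individual stroke risk.

def combined_odds_ratio(ors):
    """Combined OR for carrying every listed risk allele, assuming
    independent variants with multiplicative effects (a simplification)."""
    result = 1.0
    for o in ors:
        result *= o
    return result

# Even the rare person carrying risk alleles at all 8 described loci
# (taking OR = 1.15 for each) has a combined odds ratio of only:
print(round(combined_odds_ratio([1.15] * 8), 2))  # prints 3.06
# Most individuals carry an intermediate number of risk alleles, so the
# risk spread across the population is far narrower than this extreme;
# per the text, on the order of 100-200 such variants would be needed to
# capture the heritability implied by sibling relative risk.
```

This is why genotyping the known variants shifts an individual's predicted risk only marginally compared with established clinical risk factors.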
There are other issues which include whether patients would really want this information, and whether giving patients this information would have a useful effect on lifestyle measures to reduce stroke risk.

Point of view: Aspirin is all that is necessary
Carotid dissection is an important cause of stroke in younger individuals; it has been estimated to account for as many as 25% of strokes in patients under 50. It is associated with an increased risk of early recurrent stroke, believed to be primarily due to thromboembolism from the site of the dissection, which has led clinicians to give antithrombotic treatment to try to reduce this risk.
It has been suggested that anticoagulants may be more effective because there is thrombus at the site of dissection. However there are also potential disadvantages of anticoagulants in that they could lead to further bleeding within the vessel wall and extension of the dissection and vessel occlusion. Furthermore many clinicians used to give anticoagulants for tight carotid stenosis but data subsequently showed that antiplatelet agents are more effective. Embolism from the stenosis is also thought to be the main cause of recurrent stroke in carotid stenosis.
The CADISS trial recently reported the first randomised comparison of anticoagulants versus antiplatelet agents in patients with recent carotid and vertebral dissection. The striking finding was that there were very few recurrent events in this patient group, with no difference in recurrent events between patients taking aspirin and those taking anticoagulants. The trial included only 250 patients, but it provides the most robust data on which to base our clinical management of this group. On the basis of the low recurrent stroke risk and the lack of any difference between anticoagulants and antiplatelet agents, antiplatelet therapy alone is sufficient in patients with recent carotid dissection.

Point of view: Yes
Carotid dissection is a frequent cause of stroke in the young adult, often producing devastating deficits. After the acute phase, recurrence of dissection and of stroke is rare. Anticoagulants, antiplatelet drugs and, less often, endovascular interventions or vascular surgery are used to reduce the risk of recurrence. Until recently, anticoagulation was the most used preventive treatment. Two dangers of anticoagulation are often feared: enlargement of the intramural hematoma and severe intracranial or systemic bleeding. There is no evidence from serial imaging and clinical studies that anticoagulation causes an increase in the size of the intramural hematoma. Two systematic reviews did not find any difference in the comparative efficacy of anticoagulants and antiplatelets for preventing recurrent strokes. A recent RCT, CADISS, found no difference in the efficacy of antiplatelet and anticoagulant drugs at preventing stroke and death in patients with symptomatic carotid and vertebral artery dissection, but stroke was rare in both groups, and much rarer than reported in some observational studies. The study, however, has several methodological limitations: it was a feasibility trial that did not reach the target number of inclusions; it had a lower than expected number of endpoints; the diagnosis of dissection was not confirmed after review in many cases; and several non-included patients met the inclusion criteria for randomization. The risk of bleeding with anticoagulants is overestimated in patients with dissection: these patients are younger and healthier, and the period of anticoagulation is shorter than in elderly patients with AF.

Point of view: Yes
Transient ischemic attacks (TIAs) are very important in identifying stroke risk, which is reported to be 3.1% at 2 days and 5.2% at 7 days. While this is the average risk, there are high-risk TIA patients who carry an even higher early risk of stroke.
This has been clearly reported over the past 10 years, and it has been shown that those with a high early stroke risk can be identified by clinical and investigational means.
Clinical scores such as ABCD2 and ABCD3 were developed for TIA patients, and most studies have reported their usefulness in identifying these high-risk patients.
As specific etiologies such as high-grade carotid stenosis, along with positive brain imaging, were also found to be associated with higher risk, the ABCD3-I score was developed; it includes brain and cervical imaging, and its use has been claimed to carry even higher predictive value.
These high-risk patients need an urgent therapeutic approach, as effective stroke prevention measures are available and invaluable; with an appropriate immediate approach, the stroke risk was reduced by up to 80% in one study, and other studies showed similar results. These good results can be achieved only by utilizing facilities that identify TIA patients urgently, i.e. 24h TIA clinics such as the SOS-TIA clinic in Paris, which also reported favorable results.
Specific measures to reduce stroke risk (such as carotid endarterectomy or stenting) are available, yet it is impractical to wait for such an intervention without introducing immediate medical treatment in the interim, and likewise in cases where no potential intervention is identified.
TIAs, in most cases, stem from embolic particles originating in the arterial tree at unstable atherosclerotic plaques, where thrombotic and thromboembolic processes occur; with antithrombotic treatment we can slow the coagulation cascade at the site of unstable plaques.
Of all antithrombotic agents, antiplatelets (AP) were proven to be the most effective in preventing or reducing these processes (besides statins).
Recommended AP agents for secondary stroke prevention include aspirin alone, clopidogrel alone and the combination of dipyridamole with low-dose aspirin. Their relative stroke risk reductions range from 22% to 37%, yet these results are based on studies emphasizing long-term treatment.
The key issue nowadays, as TIA treatment should be urgent, is how early these agents exert their protective effects.
Clopidogrel is a pro-drug, and its onset of effect can be hastened by the administration of a loading dose (300-600 mg). Its combination with aspirin has been found useful in several clinical vascular situations (such as unstable angina and stent implantation). Yet similar studies in secondary stroke prevention, the MATCH and SPS3 studies, found no beneficial effect over the study periods (1.5 and 3.4 years respectively), mainly due to bleeding side effects. Therefore this combination is considered risky and is not recommended for secondary stroke prevention.
Both studies, however, did not address the very early post-TIA/stroke period, the period when the risk is highest. Small studies comparing this combination early on in patients with symptomatic carotid disease (CARESS, CLAIR) demonstrated a significant reduction in emboli production. Subsequent meta-analyses of outcome data from these studies, together with data from larger studies (for patients recruited very early), from newer, relatively small studies (FASTER, EARLY) and from a large Chinese study (CHANCE), have shown a beneficial effect of a dual AP regimen in reducing the risks of stroke and death early on.
Therefore it seems that we have a powerful AP treatment that could be used for our high-risk TIA patients. It is worth the risk, at least for this subgroup (identified using the ABCD grading scores).
Another issue is how early the investigation should be done. The ABCD2 score (and similar scores) is useful for this decision; a recent NICE guideline suggests a cutoff of 4 points to split patients into a 2-day or 7-day investigation-completion schedule.
Thus the ABCD2 grading score is useful in the management of TIA patients.
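The triage logic above can be sketched as follows, using the standard published ABCD2 components (age ≥60; blood pressure ≥140/90; clinical features; duration; diabetes) and the 4-point cutoff mentioned in the NICE guideline; the function and parameter names are illustrative:

```python
# A minimal sketch of ABCD2 scoring and the 2-day/7-day triage split.

def abcd2_score(age, sbp, dbp, unilateral_weakness,
                speech_disturbance, duration_min, diabetes):
    score = 0
    if age >= 60:
        score += 1                  # A: age >= 60
    if sbp >= 140 or dbp >= 90:
        score += 1                  # B: blood pressure >= 140/90 mmHg
    if unilateral_weakness:
        score += 2                  # C: unilateral weakness
    elif speech_disturbance:
        score += 1                  # C: speech disturbance without weakness
    if duration_min >= 60:
        score += 2                  # D: duration >= 60 min
    elif duration_min >= 10:
        score += 1                  # D: duration 10-59 min
    if diabetes:
        score += 1                  # D: diabetes
    return score                    # range 0-7

def triage(score, cutoff=4):
    """Investigation schedule per the cutoff discussed in the text."""
    if score >= cutoff:
        return "complete work-up within 2 days"
    return "complete work-up within 7 days"

s = abcd2_score(age=72, sbp=150, dbp=85, unilateral_weakness=True,
                speech_disturbance=False, duration_min=45, diabetes=False)
print(s, triage(s))  # prints: 5 complete work-up within 2 days
```

The example patient scores 1 (age) + 1 (BP) + 2 (weakness) + 1 (duration) = 5, placing them in the urgent 2-day schedule.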
These points will be further elaborated in this debate.

Point of view: No
It is an unusual opportunity to be tasked with taking the negative position on the timing of starting anticoagulants in post-stroke atrial fibrillation patients, as I usually begin anticoagulation earlier rather than later.
I have to hedge my comments based on assumptions about the stroke type: its location, size, whether it is an isolated event or the most recent of several events; coexisting brain lesions contraindicating anticoagulation; and systemic factors that may influence anticoagulation safety.
Most atrial-fibrillation-related strokes are infarcts, the most common path being up the internal carotid artery to the circle of Willis; from there to the middle cerebral artery stem and, assuming a typical bifurcation, into the lower division, with final lodgment in the posterior sylvian region and posterior temporal lobe. The most common syndrome is a Wernicke-type aphasia in the dominant hemisphere, and behavior disturbance with hemineglect in the nondominant. The severity of the syndrome reflects the degree to which one or more of the usual three branches of the lower division are affected, and whether collateral flow from the posterior cerebral artery minimizes the extent of temporal, parietal, and lateral occipital infarction.
Assuming the infarct is confined to the posterior portion of the sylvian fissure with good collateral flow, there should be no hesitation in starting anticoagulants as soon as the syndrome is clinically evident and the extent of the injury is documented by imaging. Relying on the syndrome alone was the classical approach before imaging; I have several painful examples of patients with a primary hematoma or major hemorrhagic infarction in whom neither the diagnosis nor the stroke severity was obvious on initial clinical examination alone.
In other territories, early trials with thrombolytics demonstrated that even hemorrhagic conversion of infarcts of one-gyrus size seemed well tolerated and was not unduly worsened by early anticoagulation.
Obstructions in the distal intracranial internal carotid or major circle of Willis vessels have so often been followed by hemorrhagic changes in the lenticulostriates that those in our group have been reluctant to recommend early anticoagulation for fear of exaggerating the already major hemorrhagic conversion.
Few would argue against withholding anticoagulants when major intracranial hemorrhagic disease coexists. In many patients, the first imaging ever performed was after a stroke, because anticoagulants had been instituted when atrial fibrillation was discovered, without prior brain imaging.
The tolerance of the brain for anticoagulation in the presence of intracranial aneurysms, arteriovenous malformations, or even amyloid angiopathy seems quite remarkable.
The per-day risk of recurrent embolism in the setting of atrial fibrillation is static and low, so early institution of anticoagulants is not required in the initial days after a stroke event. A prothrombotic state, however, justifies early intervention.
The common practice of delaying anticoagulants for a week or more seems outmoded, now that the features of the stroke can be well-characterized by modern imaging.
Our practice is to use a flat dosing program (no loading dose) if warfarin is the choice. Aspirin, given daily during the run-in phase, can be discontinued when the INR reaches the therapeutic range. Recalling how long it takes the intravascular coagulation status to be reflected by the INR, most cases are likely not fully anticoagulated even when clinicians take the INR at face value.
The introduction of oral thrombin inhibitors is changing the anticoagulant management program; if successful, it could mean that early therapeutic anticoagulant levels will become common, allowing a test of whether hemorrhagic conversion proves more or less common than with traditional oral warfarin programs. Aspirin or other antiplatelet agents were initially assumed to be safe with these new oral thrombin inhibitors, but this seems less certain at present.

Special Issue on Controversies in Neurology. From the 10th World Congress on Controversies in Neurology (CONy), Lisbon, Portugal. 17–20 March 2016.

Point of view: Start with IV tPA
Large vessel occlusions represent approximately 10–15% of all ischemic strokes. There is strong evidence for the efficacy of thrombectomy as an adjunct to intravenous fibrinolysis, compared with intravenous fibrinolysis alone. Recently, five multicenter, prospective studies have confirmed the benefit of mechanical thrombectomy (MT).
The results of recent studies comparing the efficacy of combined intervention (intravenous plus endovascular therapy versus intravenous therapy only) are as follows (we focus only on those studies that included 100% IV thrombolysis patients as the control group). IMS-III included patients given intravenous t-PA plus additional endovascular therapy or intravenous t-PA alone, in a 2:1 ratio. The primary outcome measure was a modified Rankin scale score of 2 or less. Unfortunately, the IMS trial had an inhomogeneous MT group using five endovascular interventions: intra-arterial t-PA (51 patients), the MicroSonic SV infusion system with intra-arterial t-PA (14 patients), the Merci retriever (77 patients), the Penumbra system (39 patients), and the Solitaire FR revascularization device (4 patients). There was no significant difference between the endovascular-therapy and intravenous t-PA groups in the overall proportion of participants with a modified Rankin score of 2 or less (40.8% and 38.7%, respectively). The trial showed similar safety outcomes and no significant difference in functional independence with endovascular therapy after intravenous t-PA, as compared with intravenous t-PA alone.
The EXTEND-IA investigators randomly assigned patients who had received 0.9 mg/kg of t-PA within 4.5 hours after the onset of ischemic stroke either to undergo endovascular thrombectomy with the Solitaire FR device or to continue receiving alteplase alone. All patients had occlusion of the internal carotid or middle cerebral artery, evidence of salvageable brain tissue, and an ischemic core of less than 70 ml on computed tomographic (CT) perfusion imaging. Endovascular therapy improved the functional outcome at 90 days, with more patients achieving functional independence (score of 0 to 2 on the modified Rankin scale; 71% vs. 40%; p=0.01). There were no significant differences in rates of death or symptomatic intracerebral hemorrhage.
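The weight-based 0.9 mg/kg alteplase dosing used in these trials can be made concrete with a small worked example. As a hedge: the 90 mg cap and the 10% initial bolus follow standard alteplase labeling for acute ischemic stroke and are not stated in the abstract above; the function name is illustrative.

```python
def alteplase_dose(weight_kg: float) -> dict:
    """Illustrative alteplase dosing at 0.9 mg/kg, as used in the trials above.

    The 90 mg cap and the 10%-bolus/60-minute-infusion split follow standard
    labeling (an assumption, not stated in the abstract itself).
    """
    total = min(0.9 * weight_kg, 90.0)  # total dose, capped at 90 mg
    bolus = 0.1 * total                 # 10% given as an initial IV bolus
    infusion = total - bolus            # remainder infused over ~60 minutes
    return {
        "total_mg": round(total, 1),
        "bolus_mg": round(bolus, 1),
        "infusion_mg": round(infusion, 1),
    }
```

For a 70 kg patient this yields a 63 mg total dose; above 100 kg the cap, rather than body weight, determines the dose.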
The SWIFT PRIME investigators assigned stroke patients to t-PA alone (control group) or to endovascular thrombectomy with a stent retriever within 6 hours after symptom onset. Patients had confirmed occlusions in the proximal anterior intracranial circulation and an absence of large ischemic-core lesions. The rate of functional independence (modified Rankin scale score, 0 to 2) was higher in the intervention group than in the control group (60% vs. 35%, p<0.001). There were no significant between-group differences in 90-day mortality (9% vs. 12%, p=0.50) or symptomatic intracranial hemorrhage (0% vs. 3%, p=0.12).
The other studies (MR CLEAN, REVASCAT, ESCAPE, etc.) have also confirmed the beneficial effect of mechanical intervention.
The recent positive RCTs share common features: patients with a high NIHSS and an ASPECTS score of 8–10 were included, and proof of vessel occlusion was most commonly required. In general, the time to endovascular treatment was below 4.5 h, and co-treatment with rt-PA occurred in more than 90% of all cases.
The recently published Karolinska guideline summarizes: “Mechanical thrombectomy, in addition to intravenous thrombolysis within 4.5 h when eligible, is recommended to treat acute stroke patients with large artery occlusions in the anterior circulation up to 6 h after symptom onset (grade A, level 1A, KSU grade A).
Mechanical thrombectomy should not prevent the initiation of intravenous thrombolysis where this is indicated, and intravenous thrombolysis should not delay mechanical thrombectomy (grade A, level 1A, KSU grade A).”
A similar statement has been formulated in the American guideline: “Patients eligible for intravenous r-tPA should receive intravenous r-tPA even if endovascular treatments are being considered (class I; level of evidence A).”
Recently, however, Kass-Hout T et al. analysed patients with acute large artery occlusion. Forty-two received endovascular therapy in combination with IV thrombolysis (bridging group), and 62 received endovascular therapy only. The favorable outcome rate (modified Rankin scale score <2 at 90 days) did not differ between the bridging group and the endovascular-only group (37.5% and 32.76%; p=0.643). There was no difference in mortality (19.04% and 29.03%; p=0.5618) or sICH rate (11.9% and 9.68%; p=0.734). A significant difference was found in mean time from symptom onset to treatment between the bridging group and the endovascular-only group (227±88 min vs. 125±40 min; p<0.0001). They concluded that combining IV thrombolysis with endovascular therapy resulted in similar outcome, revascularization, sICH, and mortality rates compared with endovascular therapy alone.
Thus, the IV part could be omitted and patients could go directly to the catheter lab.
The final answer, however, could be given by a prospective, randomized, multicenter trial with the following criteria: 1. acute ischemic stroke patients within 4.5 hours of stroke onset; 2. all patients should undergo CT or MR angiography; 3. only patients with large vessel occlusion (ICA or MCA occlusion) should be randomized; 4. one group should be treated with mechanical thrombectomy alone, as soon as possible; 5. the other group should first receive IV thrombolysis, with additional mechanical thrombectomy; and 6. the modified Rankin scale score after 3 months and complications should be compared.
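The inclusion criteria proposed above can be sketched as a simple screening check. This is purely an illustrative sketch of the *proposed* trial's eligibility logic, not an actual protocol; the function and parameter names are hypothetical.

```python
from typing import Optional

# Hypothetical eligibility check mirroring the proposed trial criteria.
# Names and structure are illustrative only, not from any real protocol.
def eligible_for_randomization(hours_since_onset: float,
                               angiography_done: bool,
                               occluded_vessel: Optional[str]) -> bool:
    large_vessels = {"ICA", "MCA"}          # criterion 3: large vessel occlusion
    return (hours_since_onset <= 4.5        # criterion 1: within 4.5 hours
            and angiography_done            # criterion 2: CT or MR angiography
            and occluded_vessel in large_vessels)
```

Randomized patients would then receive either thrombectomy alone or bridging IV thrombolysis plus thrombectomy, with 3-month modified Rankin scores and complications as the comparison endpoints.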
Such a study would finally answer the question of whether a patient with a large vessel occlusion should be transferred directly to the catheter lab or should first receive IV thrombolysis.

Point of view: Straight to the catheter lab
The recent compelling evidence for intra-arterial therapies in acute ischemic stroke has prompted the revision of local algorithms in stroke units across the globe. In fact, in the presence of a proximal intracranial occlusion, endovascular treatment achieves unprecedented hemodynamic and functional efficacy in an otherwise problematic clinical scenario. Until recently, the attempt to reperfuse the symptomatic area was limited to the “old” intravenous infusion of alteplase (IVtPA). Despite its merits, its clinical impact is significantly limited by its narrow therapeutic window, extensive list of contraindications and limited efficacy in large vessel occlusions.
In acute ischemic stroke with salvageable cerebral tissue, a simplistic and pragmatic approach could define three main clinical scenarios: patients with a contraindication to IVtPA, and patients with and without large-vessel occlusions. For the first scenario the answer is straightforward: to the catheter lab for all those with proximal occlusions. The answer is more troublesome in the presence of a large vessel occlusion without a contraindication to IVtPA. In this setting IVtPA has a reported recanalization rate of 10–20%, with very limited clinical impact. On the other hand, it is not without side effects in the ischemic area, in remote cerebral areas, and in other organs susceptible to bleeding, not rarely in noncompressible locations. Ultimately, it may mean exposing the patient to potentially severe risks for minimal impact on the ischemic brain. For patients without proximal intracranial occlusions, IVtPA is highly efficacious, rendering intra-arterial therapies unnecessary.
In conclusion, IVtPA will remain the mainstay of acute stroke treatment for all those with clear clinical indications. However, the advent of intra-arterial therapies has had a dramatic impact on stroke algorithms worldwide. In particular, for the subset of patients with large vessel occlusions, the option of going straight to the catheter laboratory is appealing, as it would avoid the use of a marginally effective therapy with rare but potentially severe complications and expedite urgent intra-arterial recanalization.

Point of view: No
The epidemiological evidence linking prothrombotic states to arterial stroke is much weaker than that for venous stroke. Despite this, screening for acquired and genetic thrombophilia is performed in young stroke patients in many stroke centers. This means searching for protein C, protein S and antithrombin deficiencies, factor V Leiden and prothrombin G20210A mutations, homocysteine plasma levels, and the lupus anticoagulant and autoantibodies (anticardiolipin and anti-beta2 glycoprotein) linked to the antiphospholipid syndrome. This screening must also take into consideration the effect of the acute-phase inflammatory reaction of stroke, the influence of anticoagulants, the need for repeated testing and, in some instances, the need for additional genetic or familial studies. All these evaluations take time and add incremental costs to the standard work-up package for “young stroke”. With the exception of hyperhomocysteinemia and the antiphospholipid syndrome, all the other prothrombotic conditions are very rare. Therefore, systematic screening for thrombophilia in young stroke victims has a low yield. Basing screening on clinical hints such as recurrent stroke, a strong family history, a combination of venous and arterial thromboembolic events, or clinical features suggesting the antiphospholipid syndrome, such as recurrent thrombotic events or unfavorable pregnancy outcomes (in particular miscarriage), increases the efficiency of the screening. The therapeutic consequence of identifying a prothrombotic state is, in general, lifelong anticoagulation. However, the evidence supporting long-term anticoagulation after stroke in patients with a prothrombotic condition is weak, and it is unknown whether long-term anticoagulation improves the outcome and quality of life of young stroke patients. Therefore, systematic testing for thrombophilia is unlikely to improve outcomes or to be cost-effective in young stroke patients.

Point of view: Restoration
Early rehabilitation needs to be focused on restitution not compensation
Data will be presented from both humans and animal models showing that there is a unique, time-limited period early after stroke in which motor recovery at the level of impairment occurs. Rehabilitation techniques need to be developed that maximally exploit this window of spontaneous biological recovery and enhanced responsiveness to training. Compensatory training should be avoided in this period and can be postponed until after the window has closed.

Point of view: Yes
Selegiline is a selective, irreversible MAO-B inhibitor at the therapeutic dose of 10 mg/day, but loses its selectivity at greater dosages. The potential of selegiline to modify disease progression in PD was proposed when it was shown to prevent MPTP-induced parkinsonism in monkeys. There is no conclusive evidence from clinical trials to prove that selegiline has “disease-modification” effects. Long-term clinical trials of selegiline have shown improved motor outcome and reduced levodopa requirement. Whether these findings are attributable to symptomatic benefits or to a disease-modifying property of selegiline remains a matter of debate. Unlike rasagiline, for which delayed-start design trials were carried out in an attempt to separate confounding symptomatic effects from disease-modifying effects, there are none for selegiline. Rasagiline is a second-generation, propargylamine-based selective, irreversible MAO-B inhibitor. It was reported to have potent anti-apoptotic effects independent of MAO inhibition in in vitro and in vivo experimental parkinsonian models. Unlike selegiline, rasagiline is not metabolized to L-amphetamine-like metabolites, which may cause appetite suppression and insomnia. In the PRESTO study, the Clinical Global Impression (CGI) and the UPDRS ADL scores during “off” time showed improvement as secondary end points with both doses of rasagiline, but the PD Quality of Life summary score did not. In the LARGO study, patients who had received rasagiline had a statistically significant reduction of mean daily “off” time. A recent study supports the efficacy and safety of safinamide (an oral aminoamide derivative with a broad mechanism of action, including reversible MAO-B inhibition, blockade of voltage-dependent sodium channels, and modulation of calcium channels and of glutamate release) as an adjunct to levodopa in PD patients with motor fluctuations.
In earlier clinical studies, it was shown to improve motor control in patients with PD who received it as an add-on to a dopamine agonist or as an adjunct to levodopa. The dual mechanism (reversible MAO-B inhibition and glutamate release inhibition) is believed to be responsible for the alleviation of motor symptoms of PD.

Point of view: Yes
To develop a truly disease-modifying therapeutic, it will be necessary to stop the progression of the underlying pathophysiology. A tremendous amount of research has been devoted to understanding the role of misfolding and aggregation of the synaptic protein alpha-synuclein (ASYN).
Both human genetic studies and experiments in animals provide compelling links between the dysregulation of ASYN and PD. The neuropathological hallmark of PD and other synucleinopathies is the accumulation of ASYN-containing cytosolic inclusions, called Lewy bodies. As with other misfolded proteins, it is likely that in PD the microscopically detectable Lewy bodies containing ASYN polymers are final deposits, while earlier-stage aggregates are involved in the pathogenic process. Recent studies further suggest that membrane-embedded oligomers of ASYN may be a particularly toxic form of ASYN, resulting in disruption of synaptic function, loss of cell membrane integrity and, ultimately, neuronal degeneration.
Treatments that prevent the formation and accumulation of these toxic membrane-embedded oligomeric aggregates of ASYN may prevent further decay or even restore synaptic function in impaired systems and slow the rate of degeneration, thus providing a therapeutic benefit for patients. Treatment approaches that target the misfolding and aggregation process are currently being explored in early clinical studies with antibodies, vaccination and small molecules.
Another therapeutic principle is to enhance the clearance of these protein aggregates by rectifying defects in a dysregulated clearance mechanism. Approaches aimed at enhancing clearance are still at the animal-testing stage but hold out promise because they may prove to be effective even after the disease has progressed to its later stages.
We have developed molecules based on both approaches (inhibition of aggregation and enhancement of clearance), with convincing effects on alpha-synuclein reduction and on the ‘clinical’ assessments in these animal models.
In summary, while the physiological role of ASYN is currently not fully understood, it is clear that the accumulation of misfolded forms of ASYN contributes to the pathology of PD and that preventing the formation of toxic oligomers and/or enhancing cellular clearance mechanisms may be viable therapeutic approaches to halt or slow disease progression.

Point of view: No
Alpha-synuclein (alpha-syn) and its aggregation tendency play a pivotal role in the pathogenesis of LBD, PD and MSA. Less clear is the pathogenic role of alpha-syn in Hallervorden-Spatz disease. Particularly in LBD, there is general agreement that dissolving alpha-syn aggregates would address the most important pathogenic process leading to dementia. True, in AD all the attempts to obtain symptomatic benefits by dissolving beta-amyloid aggregates have miserably failed. However, beta-amyloid aggregation is only one pathogenic factor in AD; thus, clearance of beta-amyloid from the brain cannot be expected to have the same beneficial results as clearing alpha-syn in the synucleinopathies. That said, the risk of failing to translate benefits obtained in animal models of neurodegenerative diseases to the clinical setting remains very high. At present, no molecule in development combines both antifibrillogenic and anti-oligomerization activity against the protein, which may be key to the pathogenic process. Several compounds are in various phases of development; at least two are monoclonal antibodies with claimed “vaccination” potential. It has not yet been established with certainty that encephalitic processes will not result from prolonged treatment. Also, the consequences of blocking alpha-syn throughout the brain (and in other parts of the body) have not been fully explored. The field of disease-modifying treatment addressing alpha-syn aggregation in PD may move forward more expeditiously using small molecules or peptides.

Point of view: Yes
COMT inhibitors are an important pharmacological class for the treatment of motor fluctuations in Parkinson's disease (PD). However, issues such as the magnitude of the effect, the relevance of the benefit, safety, and the best time for their use are still matters of discussion. The best data available comparing entacapone with placebo show a reduction of 41 minutes in daily OFF time, and the clinical relevance of such an effect may be questioned. On the other hand, although there is a consensus regarding the higher potency of tolcapone, its use is limited by safety concerns over liver toxicity. Conversely, other antiparkinsonian drugs such as the dopamine agonists and the MAO-B inhibitors have also demonstrated efficacy for the treatment of wearing-off, and this raises questions about the best strategy for the sequential use of the different alternatives in the management of the disease. Considering all these factors, the future of COMT inhibitors for the management of PD depends on having new drugs that are more potent, safer or easier to use, and on what additional benefits they add to the currently available armamentarium. Recent data on a new COMT inhibitor, opicapone, currently under evaluation by the European Medicines Agency, argue in favour of the possibility of new drugs that add potency compared with the old COMT inhibitors, with a simpler mode of administration, which may add to the value of this pharmacological class.

Point of view: Yes
Pallidotomy and thalamotomy began in the 1950s, and electrolytic thalamotomy with intraoperative electrophysiological assessment became established for the treatment of tremor in the 1960s. These procedures waned for over two decades with pharmacological developments (i.e., levodopa and other drugs). Surgery for movement disorders regained prominence when pallidotomy first, and deep brain stimulation (DBS) soon after, were applied to the treatment of Parkinson’s disease (PD) and dystonia in the 1990s. Lesions of the subthalamic nucleus (STN) have been less popular for fear of causing hemiballism, but they are actually employed in many countries, particularly because of the high cost and technological demands of DBS, without major complications.
Clinical trials have investigated the safety and efficacy of thermal lesions created by transcranial high-intensity focused ultrasound (HIFU), which now offers the possibility of using focal ablative treatment more frequently for movement disorders. The outcomes of HIFU treatment are highly predictable, as they should be essentially the same as those well established for thalamotomy, pallidotomy and subthalamotomy. The potential side effects are also the same, without the risks of open intracranial surgery. Recent, still preliminary, data indicate that HIFU can indeed be used with marked efficacy for the treatment of essential tremor and other disorders with tremor-predominant manifestations, including Parkinson’s disease. The reduced invasiveness and excellent benefit-to-risk profile allow its use in patients who otherwise could not benefit from surgical procedures. This is a welcome addition to the therapeutic armamentarium for movement disorders.

We present a 79-year-old man with a history of hypertension, dyslipidemia and ischemic stroke of the left middle cerebral artery caused by severe atherosclerotic stenosis of the left internal carotid artery (treated with carotid artery stenting) in 2011. He was medicated with clopidogrel 75 mg/day, ramipril 2.5 mg/day and atorvastatin 20 mg/day. He was admitted for a left-hand motor deficit noticed on waking. Vital signs were normal, and there was a mild motor deficit of the left upper limb with apraxia of the left hand. Electrocardiography showed sinus rhythm, and there were no signs of acute ischemia or hemorrhage on brain CT. Carotid ultrasound showed a normally positioned left stent with no residual stenosis and, additionally, an irregular hypoechoic atherosclerotic plaque in the proximal right internal carotid artery without significant stenosis. He was treated with a loading dose of acetylsalicylic acid and maintained on dual antiplatelet therapy and atorvastatin 80 mg/day. On the third day after admission, the left motor deficit worsened. Brain MRI revealed multiple acute hyperintense ischemic lesions in the right hemisphere (cortical and subcortical) with varying intensities, suggesting different timings of the ischemic lesions.
At this point we would like to discuss with the experts the best management of this case, namely among the following options: maintaining dual antiplatelet therapy with close monitoring; starting anticoagulation; or proceeding to endovascular or surgical treatment of the unstable plaque.

Introduction: Unruptured cerebral aneurysms are currently considered a contraindication to thrombolytic therapy for acute ischemic stroke, due to a theoretical increase in the risk of haemorrhage from aneurysm rupture.

Results: A 51-year-old female presented at the Emergency Department with a sudden language disturbance. Past history was relevant for dyslipidaemia treated with simvastatin and regular consumption of pharmacological preparations intended for weight loss. The initial observation revealed mild aphasia, a flattened right nasolabial fold, and mild right hemiparesis with mild sensory loss (NIHSS 5). The brain CT scan was normal, and CT angiography revealed a probable occlusion of a Sylvian branch of the left middle cerebral artery (M3 segment) and a saccular aneurysm of the anterior communicating artery measuring approximately 8 mm. Given the minor and regressing clinical picture and the presence of an aneurysmal formation, it was decided not to treat with thrombolytic therapy. At the Stroke Unit, brain MRI revealed multiple acute ischemic lesions in several arterial territories, suggestive of an embolic source. ECG monitoring consistently showed sinus rhythm. Transthoracic echocardiogram revealed a slightly dilated left atrium and mild-to-moderate aortic insufficiency. Transesophageal echocardiogram showed no additional relevant changes. Extracranial and transcranial ultrasounds were normal. At discharge, she maintained some degree of anomic pauses and paraphasia with mild slurring of speech, a mildly flattened right nasolabial fold, and loss of right-hand fine motor skills with mild sensory loss of the right lower limb (NIHSS 4). The aetiology of these changes remains unknown. She was discharged on combined clopidogrel-aspirin therapy with a plan for readmission 3 weeks later for endovascular treatment of the aneurysm.

Conclusion: This case illustrates the difficulty of deciding on acute-phase stroke treatment when aneurysms larger than 5 mm are identified, given the uncertainty regarding the safety of intravenous alteplase in these patients.

Background: Dissection of the carotid artery can cause stenosis and occlusion. In certain cases, acute phase carotid stenting is an option.

Methods: We present a case of stent placement in the acute phase of bilateral dissection of internal carotid artery (ICA).

Results: We report the case of a 46-year-old woman with no relevant past history. Her only medication was an oral contraceptive. She presented with headache, vertigo and bilateral leg paresis with left predominance. At the emergency room (ER), no focal neurological signs were detected and she was discharged.
On the following day, she returned to the ER with the same symptoms. This time she had left hemianopia, left central facial palsy (LCFP), dysarthria and left hemiplegia (NIHSS 16). CT revealed an ischemic lesion in the right middle cerebral artery (rMCA) territory with occlusion of the right ICA and stenosis of the left ICA, with no repercussion on transcranial Doppler (TCD).
Three days later, TCD showed low blood flow velocity in the left MCA and anterior cerebral artery (ACA), with collateral compensation by the posterior circulation, suggesting a distal ICA lesion.
At that time, the patient underwent digital subtraction angiography, which showed an irregular stenosis of nearly 80%, with cervical aneurysmal dilatation of the left ICA and a delay in distal perfusion. A carotid stent was placed with satisfactory reperfusion. The neurosonological study was repeated, revealing an occlusion of the stent, and the patient underwent mechanical thrombectomy.
The patient’s age and angiographic features suggested bilateral carotid dissection with rICA occlusion and left ICA stenosis. The patient was discharged with NIHSS of 12.

Conclusion: Acute phase carotid stenting is not consensual, but what should we do when facing contralateral occlusion?

Results: A 76-year-old male presented to his family doctor with complaints of nausea and vomiting since the previous day and left hemiparesis of 3 hours' duration. His medical history was positive for hypertension, ischemic heart disease, dyslipidaemia, alcohol abuse, past smoking and sleep apnoea. He was transported to our institution after pre-hospital Stroke Code activation. Initial evaluation revealed a flattened left nasolabial fold, left arm pronator drift and mild sensory loss in the left arm (NIHSS 4). According to the patient, the neurological deficits were improving. The brain CT scan was unremarkable, and CT angiography revealed bilateral diffuse atherosclerosis with moderate-to-severe stenosing plaques at both carotid bifurcations. Given the minor and regressing clinical picture, it was decided not to treat with thrombolytic therapy. Brain MRI showed multiple hyperintensities on DWI in the right middle cerebral artery (MCA) territory involving the cortex, with significant DWI-FLAIR mismatch. Extracranial ultrasound (US) confirmed severe proximal bilateral ICA stenosis, with hemodynamic repercussion in the right ophthalmic artery. Transcranial US revealed microembolic signals in the right MCA. Given the active embolic source, he was started on combined clopidogrel-aspirin therapy and a single dose of abciximab in preparation for CAS. The patient underwent conventional cerebral angiography on the following day, and bilateral CAS was performed, followed by mechanical angioplasty with an intra-stent balloon. Follow-up Doppler-US examination confirmed stent patency. The patient was discharged with mild neurological improvement (NIHSS 3) and maintained on combined clopidogrel-aspirin therapy until reassessment in an outpatient setting.

Conclusion: This case illustrates that, although carotid stenting is not recommended for the acute phase treatment of symptomatic stenosis, in selected patients it can be a valid treatment option.

A 58-year-old woman with a history of untreated hypertension presented with sudden-onset right-sided hemiparesis, dysarthria and facial asymmetry. The Stroke Code was activated. On admission to the emergency department she was alert and oriented, with a left gaze palsy, full visual fields, subtle horizontal-rotatory nystagmus to the left, right hemiplegia and ipsilateral hypoesthesia (NIHSS 12). Brain computed tomography (CT) revealed no acute lesions. CT angiography showed basilar artery megadolichoectasia. Treatment with alteplase was promptly initiated, and she was admitted to the Stroke Unit.
Close clinical monitoring showed stable neurological deficits in the first 24 hours, and the control brain magnetic resonance imaging (MRI) scan showed an acute left paramedian pontine ischemic infarct. Antiplatelet treatment was started. On the second day, neurological deterioration was noted: ophthalmoparesis with a right-sided one-and-a-half syndrome and left limb dysmetria. A new brain MRI showed expansion of the ischemic lesion, encompassing the pons and mesencephalon bilaterally. CT angiography revealed a nonocclusive endoluminal thrombus. The patient was started on an unfractionated heparin infusion for 48 hours and then switched to fractionated heparin after a CT scan showed no bleeding. A control CT angiography revealed a reduction in thrombus size.
The neurologic deficits stabilized. Slight improvement of the ophthalmoparesis was noted.
Conclusion: Would a complementary endovascular approach have an additional benefit 48 hours after symptom onset?

Clinical Case: A previously independent 77-year-old man with a history of ischemic cardiac disease, hypertension, diabetes, and smoking and drinking habits was recently hospitalized for transient episodes of pre-syncope, dysphagia and left hemiparesis. MRI showed acute ischemic lesions in the territory of the left middle cerebral artery (MCA). CT angiography revealed bilateral severe internal carotid artery (ICA) stenosis, stenosis of the M1 segment of the right MCA and of the V4 segments of both vertebral arteries (VA), and also pre-occlusive stenosis of the basilar artery (BA). Dual antiplatelet therapy was initiated and bilateral carotid stents were placed. At discharge the patient was asymptomatic, but he returned 4 days later with fluctuating right brachial paresis, four-limb dysmetria and horizontal-rotational nystagmus. Doppler studies confirmed severe stenosis of the BA, with both carotid stents patent. Hypoperfusion of the vertebrobasilar territory was assumed and, after multidisciplinary discussion, mechanical angioplasty and stenting of the BA were performed. However, because of further clinical worsening, MRI was performed and revealed multiple acute ischemic lesions in the left anterior and the posterior territories, with occlusion of the left ICA stent and of the BA, and with all intracranial circulation dependent on anastomoses from the right ICA. The patient was clinically stable and was maintained on aspirin and ticagrelor.

Conclusions: This case raises several important questions: the indications for endovascular therapy in multiple stenoses/occlusions, the timing of treatment in multiple stenoses, and the ideal dual antiplatelet regimen.

Introduction: It is established that patients who are eligible for intravenous rt-PA therapy should receive it even when endovascular therapy is being considered. However, there are situations in which thrombectomy could be performed directly.

Case: A 69-year-old male patient with hypertension and pernicious anemia developed sudden right arm paresis. Four hours later, at the hospital, the neurological examination was unremarkable. At admission he was hypertensive (191/94 mmHg) but otherwise had normal vital signs. Laboratory workup, electrocardiogram and brain CT were normal, and he was admitted for transient ischemic attack (TIA) workup. The next day, after being asymptomatic for 22 hours, he developed right central facial and limb paresis, scoring 10 on the NIHSS. Repeated brain CT and CT angiography (CTA) showed no early ischemic signs, but revealed a left cervical internal carotid artery (ICA) occlusion and a tandem thrombus at the terminal intracranial carotid. Intravenous rt-PA was given and he was selected for mechanical thrombectomy. Before the procedure, 2h30 after symptom onset, the NIHSS was 5, and a new CT and CTA showed resolution of the terminal ICA occlusion, left parietal subarachnoid blood and a deep hemorrhagic focus. Thrombectomy was withheld and he returned to the stroke unit. His deficits progressed over the following 72 hours, reaching an NIHSS of 20. Ultrasound showed the same left ICA occlusion with patent main intracranial vessels. No other etiology was identified after workup.

Conclusion: The authors question the need for CTA in TIAs with presumed cortical symptoms, and whether, in this particular case, thrombolysis was not only unbeneficial but actually detrimental to the patient's evolution, making thrombectomy impossible.

Clinical Case: A 46-year-old female, with a history of iron deficiency anemia due to hiatal hernia, was taken to the emergency department with speech disturbance and right-sided paresis. The stroke protocol was activated. Neurologic examination showed global aphasia, forced left oculocephalic deviation, right-sided hemiparesis and hemihypoesthesia (NIHSS 23). Brain CT showed loss of grey-white matter differentiation at the left lenticular nucleus (ASPECTS 9). Treatment with alteplase (0.9 mg/kg) was promptly initiated, 205 minutes after symptom onset. CT angiography showed a terminal left internal carotid artery (ICA) and proximal (M1) middle cerebral artery (MCA) occlusion, and urgent mechanical thrombectomy was performed, achieving TICI 2a reperfusion of the left carotid circulation. The patient was subsequently admitted to the stroke unit. Close clinical monitoring showed no neurologic improvement at 24 hours. At this point, a control brain CT showed a left basal ganglia infarct as well as a hyperdense left middle cerebral artery, and carotid/transcranial Doppler ultrasound (TCD) revealed persistence of the left ICA thrombus, without significant hemodynamic effect, as well as high-resistance and turbulent flow through the M1-M2 segment of the left MCA. The patient remained neurologically stable (NIHSS 18). On the fourth day, a worsened TCD pattern of the proximal MCA prompted an urgent brain CT angiogram, which confirmed a distal left MCA reocclusion and persistence of the left ICA thrombus. Although there was no neurological deterioration, after a multidisciplinary decision the patient underwent DSA and mechanical thrombectomy, which successfully removed the internal carotid thrombus but failed to reperfuse the distal MCA. No intracranial hemorrhage or neurological deterioration was noted, and the NIHSS at discharge was 19. Secondary prevention with single antiplatelet and statin therapy was adopted.

Conclusion: Should this patient have undergone late reperfusion after ischemic stroke and reocclusion, considering there was no neurologic deterioration?