Malaria is a mosquito-borne infectious disease caused by eukaryotic protists of the genus Plasmodium and transmitted by the bite of infected female mosquitoes. It is widespread in tropical and subtropical regions, including parts of the Americas (22 countries), Asia, and Africa. After a period of between two weeks and several months (occasionally years) spent in the liver, the malaria parasites start to multiply within red blood cells, causing symptoms that include fever and headache. In severe cases the disease worsens, leading to hallucinations, coma, and death.
Five species of the Plasmodium parasite can infect humans: the most serious forms of the disease are caused by Plasmodium falciparum. Malaria caused by Plasmodium vivax, Plasmodium ovale and Plasmodium malariae is milder in humans and not generally fatal. A fifth species, Plasmodium knowlesi, is a zoonosis that causes malaria in macaques but can also infect humans.
Malaria transmission can be reduced by preventing mosquito bites through the distribution of inexpensive mosquito nets and insect repellents, or by mosquito-control measures such as spraying insecticides inside houses and draining the standing water where mosquitoes lay their eggs. Although many vaccines are under development, the challenge of producing one that is widely available and provides a high level of protection for a sustained period has yet to be met.
A variety of antimalarial medications are available. In the last 5 years, treatment of P. falciparum infections in endemic countries has been transformed by the use of combinations of drugs containing an artemisinin derivative. Severe malaria is treated with intravenous or intramuscular quinine or, increasingly, the artemisinin derivative artesunate. Several drugs are also available to prevent malaria in travellers to malaria-endemic countries (prophylaxis). Resistance has developed to several antimalarial drugs, most notably chloroquine.
Each year, there are more than 250 million cases of malaria, killing between one and three million people, the majority of whom are young children in sub-Saharan Africa. Ninety percent of malaria-related deaths occur in sub-Saharan Africa. Malaria is commonly associated with poverty, and can indeed be a cause of poverty and a major hindrance to economic development.
Signs and symptoms
Typical fever patterns of malaria
Symptoms of malaria include fever, shivering, arthralgia (joint pain), vomiting, anemia (caused by hemolysis), hemoglobinuria, retinal damage, and convulsions. The classic symptom of malaria is the cyclical occurrence of sudden coldness followed by rigor and then fever and sweating lasting four to six hours, occurring every two days in P. vivax and P. ovale infections and every three days in P. malariae infections. P. falciparum can cause recurrent fever every 36–48 hours, or a less pronounced and almost continuous fever. For reasons that are poorly understood, but that may be related to high intracranial pressure, children with malaria frequently exhibit abnormal posturing, a sign indicating severe brain damage. Malaria has been found to cause cognitive impairments, especially in children: it causes widespread anemia during a period of rapid brain development as well as direct brain damage. This neurologic damage results from cerebral malaria, to which children are more vulnerable. Cerebral malaria is associated with retinal whitening, which may be a useful clinical sign in distinguishing malaria from other causes of fever.
Severe malaria is almost exclusively caused by P. falciparum infection, and usually arises 6–14 days after infection. Consequences of severe malaria include coma and death if untreated—young children and pregnant women are especially vulnerable. Splenomegaly (enlarged spleen), severe headache, cerebral ischemia, hepatomegaly (enlarged liver), hypoglycemia, and hemoglobinuria with renal failure may occur. Renal failure is a feature of blackwater fever, where hemoglobin from lysed red blood cells leaks into the urine. Severe malaria can progress extremely rapidly and cause death within hours or days. In the most severe cases of the disease, fatality rates can exceed 20%, even with intensive care and treatment. In endemic areas, treatment is often less satisfactory and the overall fatality rate for all cases of malaria can be as high as one in ten. Over the longer term, developmental impairments have been documented in children who have suffered episodes of severe malaria.
Chronic malaria is seen in both P. vivax and P. ovale, but not in P. falciparum. Here, the disease can relapse months or years after exposure, due to the presence of latent parasites in the liver. Describing a case of malaria as cured by observing the disappearance of parasites from the bloodstream can, therefore, be deceptive. The longest incubation period reported for a P. vivax infection is 30 years. Approximately one in five P. vivax malaria cases in temperate areas involves overwintering by hypnozoites (i.e., relapses begin the year after the mosquito bite).
Causes
A Plasmodium sporozoite traverses the cytoplasm of a mosquito midgut epithelial cell in this false-color electron micrograph.
Malaria parasites
Malaria parasites are members of the genus Plasmodium (phylum Apicomplexa). In humans, malaria is caused by P. falciparum, P. malariae, P. ovale, P. vivax and P. knowlesi. P. falciparum is the most common cause of infection, responsible for about 80% of all malaria cases and about 90% of the deaths from malaria. Parasitic Plasmodium species also infect birds, reptiles, monkeys, chimpanzees and rodents. There have been documented human infections with several simian species of malaria, namely P. knowlesi, P. inui, P. cynomolgi, P. simiovale, P. brazilianum, P. schwetzi and P. simium; however, with the exception of P. knowlesi, these are mostly of limited public health importance.
Malaria parasites contain apicoplasts, organelles related to the plastids found in plants and algae, complete with their own functioning genomes. These apicoplasts are thought to have originated through the endosymbiosis of algae, and they play a crucial role in various aspects of parasite metabolism, such as fatty acid biosynthesis. To date, 466 proteins have been found to be produced by apicoplasts, and these are now being investigated as possible targets for novel anti-malarial drugs.
Life cycle
The parasite's vertebrate hosts are humans and other animals; female mosquitoes of the Anopheles genus act as transmission vectors. A mosquito becomes infected when it takes a blood meal from an infected human carrier. Once ingested, the parasite gametocytes taken up in the blood differentiate into male or female gametes and then fuse in the mosquito's gut. This produces an ookinete that penetrates the gut lining and forms an oocyst in the gut wall. When the oocyst ruptures, it releases sporozoites that migrate through the mosquito's body to the salivary glands, where they are ready to infect a new human host. This type of transmission is occasionally referred to as anterior station transfer. The sporozoites are injected into the skin, along with saliva, when the mosquito takes a subsequent blood meal.
Only female mosquitoes feed on blood, thus males do not transmit the disease. The females of the Anopheles genus of mosquito prefer to feed at night. They usually start searching for a meal at dusk, and will continue throughout the night until taking a meal. Malaria parasites can also be transmitted by blood transfusions, although this is rare.
Pathogenesis
The life cycle of malaria parasites in the human body. A mosquito infects a person by taking a blood meal. First, sporozoites enter the bloodstream, and migrate to the liver. They infect liver cells (hepatocytes), where they multiply into merozoites, rupture the liver cells, and escape back into the bloodstream. Then, the merozoites infect red blood cells, where they develop into ring forms, trophozoites and schizonts which in turn produce further merozoites. Sexual forms (gametocytes) are also produced, which, if taken up by a mosquito, will infect the insect and continue the life cycle.
Malaria develops via two phases: an exoerythrocytic and an erythrocytic phase. The exoerythrocytic phase involves infection of the liver, whereas the erythrocytic phase involves infection of the erythrocytes, or red blood cells. When an infected mosquito pierces a person's skin to take a blood meal, sporozoites in the mosquito's saliva enter the bloodstream and migrate to the liver. Within minutes of being introduced into the human host, the sporozoites infect hepatocytes, multiplying asexually and asymptomatically for a period of 8–30 days. These organisms then differentiate to yield thousands of merozoites, which, following rupture of their host cells, escape into the blood and infect red blood cells, beginning the erythrocytic stage of the life cycle. The parasite escapes from the liver undetected by wrapping itself in the cell membrane of the infected host liver cell.
Within the red blood cells, the parasites multiply further, again asexually, periodically breaking out of their hosts to invade fresh red blood cells. Several such amplification cycles occur. Thus, classical descriptions of waves of fever arise from simultaneous waves of merozoites escaping and infecting red blood cells.
Some P. vivax and P. ovale sporozoites do not immediately develop into exoerythrocytic-phase merozoites, but instead produce hypnozoites that remain dormant for periods ranging from several months (6–12 months is typical) to as long as three years. After a period of dormancy, they reactivate and produce merozoites. Hypnozoites are responsible for long incubation and late relapses in these two species of malaria.
The parasite is relatively protected from attack by the body's immune system because for most of its human life cycle it resides within liver and blood cells, relatively invisible to immune surveillance. However, circulating infected blood cells are destroyed in the spleen. To avoid this fate, the P. falciparum parasite displays adhesive proteins on the surface of the infected blood cells, causing the blood cells to stick to the walls of small blood vessels, thereby sequestering the parasite from passage through the general circulation and the spleen. This "stickiness" is the main factor giving rise to the complications of severe malaria. Postcapillary venules (among the smallest branches of the circulatory system) can be blocked by the attachment of masses of these infected red blood cells. The blockage of these vessels gives rise to the syndromes of placental and cerebral malaria. In cerebral malaria the sequestered red blood cells can breach the blood-brain barrier, possibly leading to coma.
Although the red blood cell surface adhesive proteins (called PfEMP1, for Plasmodium falciparum erythrocyte membrane protein 1) are exposed to the immune system, they do not serve as good immune targets, because of their extreme diversity; there are at least 60 variations of the protein within a single parasite and effectively limitless versions within parasite populations. The parasite switches between a broad repertoire of PfEMP1 surface proteins, thus staying one step ahead of the pursuing immune system.
Some merozoites turn into male and female gametocytes, which circulate in the blood of the vertebrate host. If a mosquito pierces the skin of an infected person, it potentially picks up gametocytes within the blood. Fertilization and sexual recombination of the parasite occur in the mosquito's gut. New sporozoites develop and travel to the mosquito's salivary glands, completing the cycle. Pregnant women are especially attractive to mosquitoes, and malaria in pregnancy is an important cause of stillbirths, infant mortality and low birth weight, particularly in P. falciparum infection but also with other species, such as P. vivax.
Genetic resistance
Malaria is thought to have been the greatest selective pressure on the human genome in recent history, owing to the high levels of mortality and morbidity it causes, especially in P. falciparum infection. A number of inherited conditions may provide some resistance to it, including sickle cell disease, the thalassaemias, glucose-6-phosphate dehydrogenase deficiency, absence of the Duffy antigens, and possibly others.
Diagnosis
The mainstay of malaria diagnosis has been the microscopic examination of blood. Although blood is the sample most frequently used to make a diagnosis, both saliva and urine have been investigated as alternative, less invasive specimens.
Symptomatic
Areas that cannot afford even simple laboratory diagnostic tests often use only a history of subjective fever as the indication to treat for malaria. Using Giemsa-stained blood smears from children in Malawi, one study showed that when clinical predictors (rectal temperature, nailbed pallor, and splenomegaly) were used as treatment indications rather than a history of subjective fever alone, correct diagnoses increased from 21% to 41% of cases and unnecessary treatment for malaria decreased significantly.
Blood films
Species               | Appearance        | Periodicity | Liver persistent
Plasmodium vivax      | (image not shown) | tertian     | yes
Plasmodium ovale      | (image not shown) | tertian     | yes
Plasmodium falciparum | (image not shown) | tertian     | no
Plasmodium malariae   | (image not shown) | quartan     | no
The most economical, preferred, and reliable diagnosis of malaria is microscopic examination of blood films, because each of the four major parasite species has distinguishing characteristics. Two sorts of blood film are traditionally used. Thin films are similar to usual blood films and allow species identification, because the parasite's appearance is best preserved in this preparation. Thick films allow the microscopist to screen a larger volume of blood and are about eleven times more sensitive than the thin film, so picking up low levels of infection is easier, but the appearance of the parasite is much more distorted and distinguishing between species can be much more difficult. Given the pros and cons of each, both smears should be used when attempting a definitive diagnosis.
From the thick film, an experienced microscopist can detect parasite levels (parasitemia) as low as 0.0000001% of red blood cells. Species diagnosis can be difficult, because the early trophozoites ("ring form") of all four species look identical and it is never possible to identify a species on the basis of a single ring form; species identification is always based on several trophozoites.
P. malariae and P. knowlesi (the most common cause of malaria in South-east Asia) look very similar under the microscope. However, P. knowlesi parasitemia increases very fast and causes more severe disease than P. malariae, so it is important to identify and treat infections quickly. Therefore, in this part of the world, modern methods that can distinguish between the two, such as PCR (see "Molecular methods" below) or monoclonal antibody panels, should be used.
Antigen tests
For areas where microscopy is not available, or where laboratory staff are not experienced at malaria diagnosis, there are commercial antigen detection tests that require only a drop of blood. Immunochromatographic tests (also called malaria rapid diagnostic tests, antigen-capture assays or "dipsticks") have been developed, distributed and field-tested. These tests use finger-stick or venous blood, the completed test takes a total of 15–20 minutes, and the results are read visually as the presence or absence of colored stripes on the dipstick, so they are suitable for use in the field. The threshold of detection by these rapid diagnostic tests is in the range of 100 parasites/µl of blood (commercial kits range from about 0.002% to 0.1% parasitemia), compared with about 5 parasites/µl by thick film microscopy. One disadvantage is that dipstick tests are qualitative but not quantitative: they can determine whether parasites are present in the blood, but not how many.
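The parasite-density figures above can be tied together with a little arithmetic. The sketch below assumes a typical count of about 5 million red blood cells per µl of blood; that is an assumed round number, and real counts vary with age, sex, and anemia status.

```python
# Convert between parasite density (parasites per µl of blood) and
# percent parasitemia (fraction of red blood cells infected).
# ASSUMPTION: ~5 million red blood cells per µl, a typical round figure.
RBC_PER_UL = 5_000_000

def parasitemia_percent(parasites_per_ul, rbc_per_ul=RBC_PER_UL):
    """Percent of red cells infected for a given parasite density."""
    return 100.0 * parasites_per_ul / rbc_per_ul

def density_per_ul(percent, rbc_per_ul=RBC_PER_UL):
    """Parasite density implied by a given percent parasitemia."""
    return percent / 100.0 * rbc_per_ul

# The ~100 parasites/µl detection threshold of rapid tests:
print(parasitemia_percent(100))    # -> 0.002, the low end of the
                                   #    0.002%-0.1% range quoted for kits
# The thick-film figure of ~5 parasites/µl is twenty times lower:
print(parasitemia_percent(5))      # -> 0.0001
```

This conversion explains why the quoted kit range of 0.002%–0.1% parasitemia corresponds to densities of roughly 100 to 5,000 parasites/µl.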
The first rapid diagnostic tests used P. falciparum glutamate dehydrogenase as the antigen. PGluDH was soon replaced by P. falciparum lactate dehydrogenase, a 33 kDa oxidoreductase [EC 1.1.1.27]. It is the last enzyme of the glycolytic pathway, essential for ATP generation, and one of the most abundant enzymes expressed by P. falciparum. PLDH does not persist in the blood but clears at about the same time as the parasites following successful treatment. The lack of antigen persistence after treatment makes the pLDH test useful in predicting treatment failure. In this respect, pLDH is similar to pGluDH. Depending on which monoclonal antibodies are used, this type of assay can distinguish between all five species of human malaria parasites, because of antigenic differences between their pLDH isoenzymes.
Molecular methods
Molecular methods are available in some clinical laboratories, and rapid real-time assays (for example, QT-NASBA, based on nucleic acid sequence-based amplification) are being developed in the hope of deploying them in endemic areas.
PCR (and other molecular methods) is more accurate than microscopy. However, it is expensive and requires a specialized laboratory. Moreover, levels of parasitemia do not necessarily correlate with the progression of disease, particularly when the parasite is able to adhere to blood vessel walls. Therefore, more sensitive, low-tech diagnostic tools need to be developed to detect low levels of parasitemia in the field.
Differential
Fever and septic shock are commonly misdiagnosed as severe malaria in Africa, leading to a failure to treat other life-threatening illnesses. In malaria-endemic areas, parasitemia does not ensure a diagnosis of severe malaria, because parasitemia can be incidental to other concurrent disease. Recent investigations suggest that malarial retinopathy is better (collective sensitivity of 95% and specificity of 90%) than any other clinical or laboratory feature in distinguishing malarial from non-malarial coma.
Prevention
Anopheles albimanus mosquito feeding on a human arm. This mosquito is a vector of malaria and mosquito control is a very effective way of reducing the incidence of malaria.
Methods used to prevent the spread of disease, or to protect individuals in areas where malaria is endemic, include prophylactic drugs, mosquito eradication and the prevention of mosquito bites.
The continued existence of malaria in an area requires a combination of high human population density, high mosquito population density and high rates of transmission from humans to mosquitoes and from mosquitoes to humans. If any of these is lowered sufficiently, the parasite will sooner or later disappear from that area, as happened in North America, Europe and much of the Middle East. However, unless the parasite is eliminated from the whole world, it could become re-established if conditions revert to a combination that favours the parasite's reproduction. Many countries are seeing an increasing number of imported malaria cases owing to extensive travel and migration.
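The threshold logic of this paragraph is captured by the classical Ross-Macdonald expression for the basic reproduction number R0: transmission persists only while R0 > 1, and lowering any single factor far enough pushes R0 below that threshold. The parameter values below are illustrative round numbers, not field data.

```python
import math

def r0_ross_macdonald(m, a, b, c, g, n, r):
    """Basic reproduction number of the classical Ross-Macdonald model.

    m: mosquitoes per human            a: bites on humans per mosquito per day
    b: mosquito-to-human infection probability per bite
    c: human-to-mosquito infection probability per bite
    g: mosquito death rate per day     n: incubation time in the mosquito (days)
    r: human recovery rate per day
    """
    # The biting rate a appears squared: a mosquito must bite twice (once to
    # acquire the parasite, once to pass it on), so bite prevention counts twice.
    return (m * a**2 * b * c * math.exp(-g * n)) / (r * g)

baseline = dict(m=1.0, a=0.3, b=0.5, c=0.5, g=0.12, n=10, r=0.01)
print(r0_ross_macdonald(**baseline))                   # ~5.6: persists
# Cutting the mosquito-to-human ratio eightfold (e.g. by vector control)
# drives R0 below 1, after which the parasite dies out locally:
print(r0_ross_macdonald(**dict(baseline, m=1.0 / 8)))  # ~0.71: dies out
```

The exponential term also shows why shortening mosquito lifespan (raising g) is disproportionately effective: most mosquitoes then die before the parasite completes its n-day incubation.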
Many researchers argue that prevention of malaria may be more cost-effective than treatment of the disease in the long run, but the capital costs required are out of reach of many of the world's poorest people. Economic adviser Jeffrey Sachs estimates that malaria can be controlled for US$3 billion in aid per year.
A 2008 study that examined international financing of malaria control found large regional variations in average annual per-capita funding, ranging from US$0.01 in Myanmar to US$147 in Suriname. The study found 34 countries where funding was less than US$1 per capita, including 16 countries where annual malaria support was less than US$0.50 per capita. These 16 countries were home to 710 million people, or 50% of the global population exposed to the risk of malaria transmission, including seven of the poorest countries in Africa (Côte d'Ivoire, Republic of the Congo, Chad, Mali, Democratic Republic of the Congo, Somalia, and Guinea) and two of the most densely populated stable endemic countries in the world (Indonesia and India).
Brazil, Eritrea, India, and Vietnam, unlike many other developing nations, have successfully reduced the malaria burden. Common success factors have included conducive country conditions, a targeted technical approach using a package of effective tools, data-driven decision-making, active leadership at all levels of government, involvement of communities, decentralized implementation and control of finances, skilled technical and managerial capacity at national and sub-national levels, hands-on technical and programmatic support from partner agencies, and sufficient and flexible financing.
Medications
Several drugs, most of which are also used for treatment of malaria, can be taken preventively. Modern drugs used include mefloquine (Lariam), doxycycline (available generically), and the combination of atovaquone and proguanil hydrochloride (Malarone). Doxycycline and the atovaquone-proguanil combination are the best tolerated; mefloquine is associated with higher rates of neurological and psychiatric symptoms. The choice of drug depends on the resistance pattern of the parasites in the area, as well as side-effects and other considerations. The prophylactic effect does not begin immediately, so people temporarily visiting malaria-endemic areas usually begin taking the drugs one to two weeks before arriving and must continue taking them for 4 weeks after leaving (with the exception of atovaquone-proguanil, which only needs to be started 2 days before and continued for 7 days afterwards). Generally, these drugs are taken daily or weekly, at a lower dose than would be used for treatment of a person who had actually contracted the disease. Use of prophylactic drugs is seldom practical for full-time residents of malaria-endemic areas, and their use is usually restricted to short-term visitors and travelers to malarial regions. This is due to the cost of the drugs, negative side effects from long-term use, and the difficulty of obtaining some effective anti-malarial drugs outside wealthy nations.
Quinine was used historically; however, the development of more effective alternatives such as quinacrine, chloroquine, and primaquine in the 20th century reduced its use. Today, quinine is not generally used for prophylaxis. The use of prophylactic drugs where malaria-bearing mosquitoes are present may encourage the development of partial immunity.
Vector control
Further information: Mosquito control
Efforts to eradicate malaria by eliminating mosquitoes have been successful in some areas. Malaria was once common in the United States and southern Europe, but vector control programs, in conjunction with the monitoring and treatment of infected humans, eliminated it from those regions. In some areas, the draining of wetland breeding grounds and better sanitation were adequate. Malaria was eliminated from most parts of the USA in the early 20th century by such methods, and the use of the pesticide DDT and other means eliminated it from the remaining pockets in the South by 1951. In 2002, there were 1,059 cases of malaria reported in the US, including eight deaths, but in only five of those cases was the disease contracted in the United States.
Before DDT, malaria was also successfully eradicated or controlled in several tropical areas by removing or poisoning the breeding grounds of the mosquitoes or the aquatic habitats of the larval stages, for example by filling or applying oil to places with standing water. These methods have seen little application in Africa for more than half a century.
The sterile insect technique is emerging as a potential mosquito-control method. Progress towards transgenic, or genetically modified, insects suggests that wild mosquito populations could be made malaria-resistant. Researchers at Imperial College London created the world's first transgenic malaria mosquito, and the first Plasmodium-resistant species was announced by a team at Case Western Reserve University in Ohio in 2002. Successful replacement of current populations with a genetically modified population relies upon a drive mechanism, such as transposable elements, to allow for non-Mendelian inheritance of the gene of interest. However, this approach contains many difficulties and success remains a distant prospect. An even more futuristic method of vector control is the idea that lasers could be used to kill flying mosquitoes.
Indoor residual spraying
Indoor residual spraying (IRS) is the practice of spraying insecticides on the interior walls of homes in malaria-affected areas. After feeding, many mosquito species rest on a nearby surface while digesting the blood meal, so if the walls of dwellings have been coated with insecticides, the resting mosquitoes will be killed before they can bite another victim and transfer the malaria parasite.
The first pesticide used for IRS was DDT. Although it was initially used exclusively to combat malaria, its use quickly spread to agriculture. In time, pest control, rather than disease control, came to dominate DDT use, and this large-scale agricultural use led to the evolution of resistant mosquitoes in many regions, much as the overuse of antibiotics led to antibiotic-resistant bacteria. During the 1960s, awareness of the negative consequences of its indiscriminate use increased, ultimately leading to bans on agricultural applications of DDT in many countries in the 1970s. Since agricultural use of DDT has been limited or banned for some time, DDT may now be more effective as a method of disease control.
Although DDT has never been banned for use in malaria control and there are several other insecticides suitable for IRS, some advocates have claimed that bans are responsible for tens of millions of deaths in tropical countries where DDT had once been effective in controlling malaria. Furthermore, most of the problems associated with DDT use stem specifically from its industrial-scale application in agriculture, rather than its use in public health.
The World Health Organization (WHO) currently advises the use of 12 different insecticides in IRS operations, including DDT as well as alternative insecticides (such as the pyrethroids permethrin and deltamethrin). This public health use of small amounts of DDT is permitted under the Stockholm Convention on Persistent Organic Pollutants (POPs), which prohibits the agricultural use of DDT. However, because of its legacy, many developed countries previously discouraged DDT use even in small quantities.
One problem with all forms of indoor residual spraying is insecticide resistance through mosquito evolution. According to a study of mosquito behavior and vector control, the mosquito species affected by IRS are endophilic (they tend to rest and live indoors), and because of the irritation caused by spraying, selection is pushing their descendants towards exophily (resting and living outdoors). Exophilic mosquitoes are little affected, if at all, by IRS, which greatly reduces its effectiveness as a control measure.
Mosquito nets and bedclothes
Mosquito nets help keep mosquitoes away from people and greatly reduce infection and transmission of malaria. The nets are not a perfect barrier, so they are often treated with an insecticide designed to kill the mosquito before it has time to find a way past the net. Insecticide-treated nets (ITNs) are estimated to be twice as effective as untreated nets and offer greater than 70% protection compared with no net. Although ITNs have proven very effective against malaria, fewer than 2% of children in urban areas of Sub-Saharan Africa are protected by them. Since Anopheles mosquitoes feed at night, the preferred method is to hang a large "bed net" above the center of a bed so that it drapes down and covers the bed completely.
Vaccination
Main article: Malaria vaccine
Immunity (or, more accurately, tolerance) does occur naturally, but only in response to repeated infection with multiple strains of malaria. Vaccines for malaria are under development; no completely effective vaccine is yet available. The first promising studies demonstrating the potential for a malaria vaccine were performed in 1967 by immunizing mice with live, radiation-attenuated sporozoites, which protected about 60% of the mice upon subsequent injection with normal, viable sporozoites. Since the 1970s, there has been considerable effort to develop similar vaccination strategies in humans. It was determined that an individual can be protected from a P. falciparum infection by receiving over 1,000 bites from infected yet irradiated mosquitoes.
Other methods
Education in recognizing the symptoms of malaria has reduced the number of cases in some areas of the developing world by as much as 20%. Recognizing the disease in its early stages can also stop it from becoming a killer. Education can also inform people to cover over areas of stagnant, still water, such as water tanks, which are ideal breeding grounds for the mosquito, thus cutting the risk of transmission between people. This is most often put into practice in urban areas, where large centers of population are confined in a small space and transmission would be most likely.
The Malaria Control Project is currently using downtime computing power donated by individual volunteers around the world (see Volunteer computing and BOINC) to simulate models of the health effects and transmission dynamics in order to find the best method or combination of methods for malaria control. This modeling is extremely computer intensive due to the simulations of large human populations with a vast range of parameters related to biological and social factors that influence the spread of the disease. It is expected to take a few months using volunteered computing power compared to the 40 years it would have taken with the current resources available to the scientists who developed the program.
An example of the importance of computer modeling in planning malaria eradication programs is the paper by Águas and others, which showed that eradication of malaria crucially depends on finding and treating the large number of people in endemic areas with asymptomatic malaria, who act as a reservoir for infection. The human malaria parasites have no significant animal reservoir (with the partial exception of the zoonotic P. knowlesi), so eliminating the infection from the human population would be expected to eradicate the disease.
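The reservoir argument can be illustrated with a toy compartmental sketch: a generic SIS-type model, not the actual model of Águas and colleagues, with every rate below an illustrative guess. Splitting infections into a treated symptomatic class and an untreated asymptomatic class shows infection persisting until asymptomatic carriers are also found and treated.

```python
# Toy SIS-style transmission model (an illustration only, not the model of
# Aguas et al.). Infections split into a symptomatic class, which is always
# treated, and an asymptomatic class, which may or may not be. All rates are
# illustrative, not fitted to data.

def endemic_level(treat_asymptomatic, days=5000):
    S, Is, Ia = 0.99, 0.01, 0.0    # susceptible, symptomatic, asymptomatic
    beta = 0.08                    # transmission coefficient per day
    p_sym = 0.3                    # fraction of new infections symptomatic
    natural = 0.01                 # natural clearance rate per day
    treated = 0.21                 # clearance rate per day with treatment
    r_sym = treated
    r_asym = treated if treat_asymptomatic else natural
    for _ in range(days):          # forward-Euler steps of one day
        force = beta * (Is + Ia) * S
        S += -force + r_sym * Is + r_asym * Ia
        Is += p_sym * force - r_sym * Is
        Ia += (1 - p_sym) * force - r_asym * Ia
    return Is + Ia                 # infected fraction after `days` days

# Treating only symptomatic cases leaves a large asymptomatic reservoir:
print(endemic_level(False) > 0.5)   # True: infection persists at a high level
# Finding and treating asymptomatic carriers as well drives it to extinction:
print(endemic_level(True) < 1e-4)   # True: infection dies out
```

The asymptomatic class dominates here because untreated carriers stay infectious for a long time (1/natural = 100 days versus about 5 days when treated), which is exactly the reservoir effect the paper describes.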
Other interventions for the control of malaria include mass drug administration and intermittent preventive therapy.
Furthering attempts to reduce transmission rates, a proposed alternative to mosquito nets is the mosquito laser, or photonic fence, which identifies female mosquitoes and shoots them using a medium-powered laser. The device is currently undergoing commercial development, although instructions for a DIY version of the photonic fence have also been published.
Another way of reducing malaria transmission to humans has been developed at the University of Arizona, where researchers engineered a mosquito resistant to malaria, as reported on 16 July 2010 in the journal PLoS Pathogens. The ultimate aim is to release such GM mosquitoes into the environment; Gareth Lycett, a malaria researcher at the Liverpool School of Tropical Medicine, told the BBC that "It is another step on the journey towards potentially assisting malaria control through GM mosquito release."
Treatment
The treatment of malaria depends on the severity of the disease. Uncomplicated malaria is treated with oral drugs; whether patients who can take oral drugs need to be admitted depends on the clinician's assessment and experience. Severe malaria requires the parenteral administration of antimalarial drugs. The traditional treatment for severe malaria has been quinine, but there is evidence that the artemisinins are superior for the treatment of severe malaria. A large clinical trial is currently under way to compare the efficacy of quinine and artesunate in the treatment of severe malaria in African children. Active malaria infection with P. falciparum is a medical emergency requiring hospitalization; infection with P. vivax, P. ovale or P. malariae can often be treated on an outpatient basis. Treatment of malaria involves supportive measures as well as specific antimalarial drugs. Most antimalarial drugs are produced industrially and sold in pharmacies. However, as the cost of such medicines is often too high for most people in the developing world, some herbal remedies (such as Artemisia annua tea) have also been developed, and have gained support from international organisations such as Médecins Sans Frontières. When properly treated, a patient with malaria can expect a complete recovery.
Epidemiology
Countries which have regions where malaria is endemic as of 2003 (coloured yellow). Countries in green are free of indigenous cases of malaria in all areas.
It is estimated that malaria causes 250 million cases of fever and approximately one million deaths annually. The vast majority of cases occur in children under 5 years old; pregnant women are also especially vulnerable. Despite efforts to reduce transmission and increase treatment, there has been little change in which areas are at risk of this disease since 1992. Indeed, if the prevalence of malaria stays on its present upward course, the death rate could double in the next twenty years. Precise statistics are unknown because many cases occur in rural areas where people do not have access to hospitals or the means to afford health care. As a consequence, the majority of cases are undocumented.
Although co-infection with HIV and malaria does cause increased mortality, this is less of a problem than with HIV/tuberculosis co-infection, due to the two diseases usually attacking different age-ranges, with malaria being most common in the young and active tuberculosis most common in the old. Although HIV/malaria co-infection produces less severe symptoms than the interaction between HIV and TB, HIV and malaria do contribute to each other's spread. This effect comes from malaria increasing viral load and HIV infection increasing a person's susceptibility to malaria infection.
Malaria is presently endemic in a broad band around the equator, in areas of the Americas, many parts of Asia, and much of Africa; however, it is in sub-Saharan Africa where 85–90% of malaria fatalities occur. The geographic distribution of malaria within large regions is complex, and malaria-afflicted and malaria-free areas are often found close to each other. In drier areas, outbreaks of malaria can be predicted with reasonable accuracy by mapping rainfall. Malaria is more common in rural areas than in cities; this is in contrast to dengue fever, where urban areas present the greater risk. For example, several cities in Vietnam, Laos and Cambodia are essentially malaria-free, but the disease is present in many rural regions. By contrast, in Africa malaria is present in both rural and urban areas, though the risk is lower in the larger cities. The global endemic levels of malaria have not been mapped since the 1960s. However, the Wellcome Trust, UK, has funded the Malaria Atlas Project to rectify this, providing a more contemporary and robust means with which to assess current and future malaria disease burden.
History
Malaria has infected humans for over 50,000 years, and Plasmodium may have been a human pathogen for the entire history of the species. Close relatives of the human malaria parasites remain common in chimpanzees. Some new evidence suggests that the most virulent strain of human malaria may have originated in gorillas.
References to the unique periodic fevers of malaria are found throughout recorded history, beginning in 2700 BC in China. Malaria may have contributed to the decline of the Roman Empire, and was so pervasive in Rome that it was known as the "Roman fever". The term malaria originates from Medieval Italian: mala aria — "bad air"; the disease was formerly called ague or marsh fever due to its association with swamps and marshland. Malaria was once common in most of Europe and North America, where it is no longer endemic, though imported cases do occur.
Malaria was the most important health hazard encountered by U.S. troops in the South Pacific during World War II, where about 500,000 men were infected. Sixty thousand American soldiers died of malaria during the North African and South Pacific campaigns.
Prevention
An early effort at malaria prevention occurred in 1896, just before the mosquito-malaria link was confirmed in India by the British physician Ronald Ross. An 1896 Uxbridge malaria outbreak prompted health officer Dr. Leonard White to write a report to the State Board of Health, which led to study of mosquito-malaria links and the first efforts at malaria prevention. Massachusetts State pathologist Theobald Smith asked that White's son collect mosquito specimens for further analysis, and that citizens 1) add screens to windows, and 2) drain collections of water. Carlos Finlay had also been engaged in mosquito-related research and mosquito-borne disease theory in the 1880s in Cuba, basing his work on the study of yellow fever.
Treatment
Scientific studies on malaria made their first significant advance in 1880, when Charles Louis Alphonse Laveran, a French army doctor working in the military hospital of Constantine in Algeria, observed parasites for the first time inside the red blood cells of people suffering from malaria. He therefore proposed that malaria is caused by this organism, the first time a protist was identified as causing disease. For this and later discoveries, he was awarded the 1907 Nobel Prize for Physiology or Medicine. The malarial parasite was named Plasmodium by the Italian scientists Ettore Marchiafava and Angelo Celli. A year later, Carlos Finlay, a Cuban doctor treating patients with yellow fever in Havana, provided strong evidence that mosquitoes were transmitting disease to and from humans.
This work followed earlier suggestions by Josiah C. Nott, and work by Patrick Manson on the transmission of filariasis.
It was Britain's Sir Ronald Ross, working in the Presidency General Hospital in Calcutta, who finally proved in 1898 that malaria is transmitted by mosquitoes. He did this by showing that certain mosquito species transmit malaria to birds. He isolated malaria parasites from the salivary glands of mosquitoes that had fed on infected birds. For this work, Ross received the 1902 Nobel Prize in Medicine. After resigning from the Indian Medical Service, Ross worked at the newly established Liverpool School of Tropical Medicine and directed malaria-control efforts in Egypt, Panama, Greece and Mauritius. The findings of Finlay and Ross were later confirmed by a medical board headed by Walter Reed in 1900. Its recommendations were implemented by William C. Gorgas in the health measures undertaken during construction of the Panama Canal. This public-health work saved the lives of thousands of workers and helped develop the methods used in future public-health campaigns against the disease.
The first effective treatment for malaria came from the bark of the cinchona tree, which contains quinine. This tree grows on the slopes of the Andes, mainly in Peru. The indigenous peoples of Peru made a tincture of cinchona to control malaria. The Jesuits noted the efficacy of the practice and introduced the treatment to Europe during the 1640s, where it was rapidly accepted. It was not until 1820 that the active ingredient, quinine, was isolated from the bark and named by the French chemists Pierre Joseph Pelletier and Joseph Bienaimé Caventou.
In the early 20th century, before antibiotics became available, Julius Wagner-Jauregg discovered that patients with syphilis could be treated by intentionally infecting them with malaria; the resulting fever would kill the syphilis spirochetes, and quinine could be administered to control the malaria. Although some patients died from malaria, this was considered preferable to the almost-certain death from syphilis.
The first successful continuous malaria culture was established in 1976 by William Trager and James B. Jensen, which facilitated research into the molecular biology of the parasite and the development of new drugs.
Although the blood stage and mosquito stages of the malaria life cycle were identified in the 19th and early 20th centuries, it was not until the 1980s that the latent liver form of the parasite was observed. The discovery of this latent form of the parasite explained why people could appear to be cured of malaria but suffer relapse years after the parasite had disappeared from their bloodstreams.
Society and culture
Malaria is not just a disease commonly associated with poverty but also a cause of poverty and a major hindrance to economic development. Tropical regions are affected most, but malaria's furthest extent reaches into some temperate zones with extreme seasonal changes. The disease has been associated with major negative economic effects on regions where it is widespread. During the late 19th and early 20th centuries, it was a major factor in the slow economic development of the American southern states. A comparison of average per capita GDP in 1995, adjusted for purchasing power parity, between countries with malaria and countries without malaria gives a fivefold difference (US$1,526 versus US$8,268). In countries where malaria is common, average per capita GDP rose (between 1965 and 1990) only 0.4% per year, compared to 2.4% per year in other countries.
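The growth-rate gap cited above compounds dramatically over the 25-year period in question, as a short calculation using the stated rates shows:

```python
# Compound per-capita GDP growth over 1965-1990 (25 years), using the
# annual growth rates cited above for malaria-endemic vs other countries.
years = 25
endemic_growth = (1 + 0.004) ** years   # 0.4% per year where malaria is common
other_growth = (1 + 0.024) ** years     # 2.4% per year elsewhere

print(f"malaria-endemic countries: {(endemic_growth - 1) * 100:.0f}% total growth")
print(f"other countries:           {(other_growth - 1) * 100:.0f}% total growth")
```

At these rates, per-capita GDP in malaria-endemic countries grew only about 10% in total over the period, versus roughly 81% elsewhere, an eightfold difference in cumulative growth from a sixfold difference in annual rates.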
Poverty is both cause and effect, however, since the poor lack the financial capacity to prevent or treat the disease. In Malawi in 1994, the lowest income group spent 32% of its annual income on the disease, compared with 4% of household income across low-to-high income groups. In its entirety, the economic impact of malaria has been estimated to cost Africa US$12 billion every year. This impact includes costs of health care, working days lost to sickness, days lost in education, decreased productivity due to brain damage from cerebral malaria, and loss of investment and tourism. In some countries with a heavy malaria burden, the disease may account for as much as 40% of public health expenditure, 30–50% of inpatient admissions, and up to 50% of outpatient visits.
A study on the effect of malaria on IQ in a sample of Mexicans found that exposure to malaria eradication during the year of birth was associated with increases in IQ. It also increased the probability of employment in a skilled occupation. The author suggests that this may be one explanation for the Flynn effect, and an important explanation for the link between national malaria burden and economic development. A literature review of 44 papers states that cognitive abilities and school performance were shown to be impaired in sub-groups of patients (with either cerebral malaria or uncomplicated malaria) when compared with healthy controls. Studies comparing cognitive functions before and after treatment for acute malarial illness continued to show significantly impaired school performance and cognitive abilities even after recovery. Malaria prophylaxis was shown to improve cognitive function and school performance in clinical trials when compared to placebo groups.
War
Throughout history, the contraction of malaria (via natural outbreaks as well as via infliction of the disease as a biological warfare agent) has played a prominent role in the fortunes of government rulers, nation-states, military personnel, and military actions. "Malaria Site: History of Malaria During Wars" addresses the devastating impact of malaria in numerous well-known conflicts, beginning in June 323 B.C. That site's authors note: "Many great warriors succumbed to malaria after returning from the warfront and advance of armies into continents was prevented by malaria. In many conflicts, more troops were killed by malaria than in combat." The Centers for Disease Control ("CDC") traces the history of malaria and its impacts farther back, to 2700 BCE.
In 1910, Nobel Prize in Medicine-winner Ronald Ross (himself a malaria survivor), published a book titled The Prevention of Malaria that included the chapter: "The Prevention of Malaria in War." The chapter's author, Colonel C. H. Melville, Professor of Hygiene at Royal Army Medical College in London, addressed the prominent role that malaria has historically played during wars and advised: "A specially selected medical officer should be placed in charge of these operations with executive and disciplinary powers [...]."
Significant financial investments have been made to procure existing anti-malarial agents and to develop new ones. During World War I and World War II, supplies of the natural anti-malaria drugs, cinchona bark and quinine, proved inadequate for military personnel, and substantial funding was funnelled into the research and development of other drugs and vaccines. American military organizations conducting such research initiatives include the Navy Medical Research Center, the Walter Reed Army Institute of Research, and the U.S. Army Medical Research Institute of Infectious Diseases.
Additionally, initiatives have been founded such as Malaria Control in War Areas (MCWA), established in 1942, and its successor, the Communicable Disease Center (now known as the Centers for Disease Control) established in 1946. According to the CDC, MCWA "was established to control malaria around military training bases in the southern United States and its territories, where malaria was still problematic" and, during these activities, to "train state and local health department officials in malaria control techniques and strategies." The CDC's Malaria Division continued that mission, successfully reducing malaria in the United States, after which the organization expanded its focus to include "prevention, surveillance, and technical support both domestically and internationally."
Antigen-based Malaria Rapid Diagnostic Tests
Malaria antigen detection tests are a group of commercially available tests that allow the rapid diagnosis of malaria by people who are not otherwise skilled in traditional laboratory techniques for diagnosing malaria, or in situations where such equipment is not available. More than 20 such tests are currently available commercially (WHO product testing, 2008). The first malaria antigen suitable as a target for rapid diagnostic tests (RDTs) was the soluble enzyme glutamate dehydrogenase. None of the rapid tests is currently as sensitive as a thick blood film, nor as cheap. A major drawback of all current dipstick methods is that the result is essentially qualitative. In many endemic areas of tropical Africa, however, quantitative assessment of parasitaemia is important, as a large percentage of the population will test positive in any qualitative assay.
Malaria is a curable disease if patients have access to early diagnosis and prompt treatment. Antigen-based rapid diagnostic tests (RDTs) have an important role at the periphery of health services, because many rural clinics lack the microscopes and trained technicians needed to diagnose malaria on-site by evaluating blood films. Furthermore, in regions where the disease is not endemic, laboratory technologists have very limited experience in detecting and identifying malaria parasites. An ever-increasing number of travellers from temperate areas visit tropical countries each year, and many of them return with a malaria infection. RDTs are still regarded as complements to conventional microscopy, but with some improvements they may well replace the microscope. The tests are simple and can be performed on the spot in field conditions. They use finger-stick or venous blood, the completed test takes 15–20 minutes in total, and no laboratory is needed. The threshold of detection by these rapid diagnostic tests is in the range of 100 parasites/µl of blood, compared with about 5 parasites/µl by thick-film microscopy.
pGluDH
Plasmodium glutamate dehydrogenase (pGluDH) precipitated by host antibodies
An accurate diagnosis is becoming more and more important, in view of the increasing resistance of Plasmodium falciparum and the high price of alternatives to chloroquine. The enzyme pGluDH does not occur in the host red blood cell and was recommended as a marker enzyme for Plasmodium species by Picard-Maureau et al. in 1975. The malaria marker enzyme test is suitable for routine work and is now a standard test in most departments dealing with malaria. The presence of pGluDH is known to indicate parasite viability, so a rapid diagnostic test using pGluDH as antigen would be able to differentiate live from dead organisms. A complete RDT with pGluDH as antigen has been developed in China and is now undergoing clinical trials. GluDHs are ubiquitous enzymes that occupy an important branch-point between carbon and nitrogen metabolism. Both NAD-dependent [EC 1.4.1.2] and NADP-dependent [EC 1.4.1.4] GluDH enzymes are present in Plasmodia; the NAD-dependent GluDH is relatively unstable and not useful for diagnostic purposes. Glutamate dehydrogenase provides an oxidizable carbon source for the production of energy, as well as a reduced electron carrier, NADH. Glutamate is a principal amino donor to other amino acids in subsequent transamination reactions; its multiple roles in nitrogen balance make it a gateway between free ammonia and the amino groups of most amino acids. The enzyme's crystal structure has been published. The GluDH activity in P. vivax, P. ovale and P. malariae has never been tested, but given the importance of GluDH as a branch-point enzyme, every cell can be expected to maintain a high concentration of it. Enzymes with a high molecular weight (like GluDH) typically have many isozymes, which allows strain differentiation (given the right monoclonal antibody). The host produces antibodies against the parasitic enzyme, indicating low sequence identity with the host enzyme.
Histidine Rich Protein II
The histidine-rich protein II (HRP II) is a histidine- and alanine-rich, water-soluble protein localized in several cell compartments, including the parasite cytoplasm. The antigen is expressed only by P. falciparum trophozoites. HRP II from P. falciparum has been implicated in the biocrystallization of hemozoin, an inert, crystalline form of ferriprotoporphyrin IX (Fe(3+)-PPIX) produced by the parasite. A substantial amount of HRP II is secreted by the parasite into the host bloodstream, and the antigen can be detected in erythrocytes, serum, plasma, cerebrospinal fluid and even urine as a secreted water-soluble protein. These antigens persist in the circulating blood after the parasitaemia has cleared or has been greatly reduced. It generally takes around two weeks after successful treatment for HRP2-based tests to turn negative, but it may take as long as one month, which compromises their value in the detection of active infection. False-positive dipstick results have been reported in patients with rheumatoid-factor-positive rheumatoid arthritis. Since HRP2 is expressed only by P. falciparum, these tests give negative results with samples containing only P. vivax, P. ovale, or P. malariae; many cases of non-falciparum malaria may therefore be misdiagnosed as malaria-negative (some P. falciparum strains also lack HRP II). The variability in the results of HRP2-based RDTs is related to variability in the target antigen.
pLDH
P. falciparum lactate dehydrogenase (pLDH) is a 33 kDa oxidoreductase [EC 1.1.1.27]. It is the last enzyme of the glycolytic pathway, essential for ATP generation, and one of the most abundant enzymes expressed by P. falciparum. pLDH does not persist in the blood but clears at about the same time as the parasites following successful treatment. The lack of antigen persistence after treatment makes the pLDH test useful in predicting treatment failure; in this respect, pLDH is similar to pGluDH. LDH from P. vivax, P. malariae, and P. ovale exhibits 90–92% identity to pLDH from P. falciparum.
pAldo
Fructose-bisphosphate aldolase [EC 4.1.2.13] catalyzes a key reaction in glycolysis and energy production and is produced by all four classic human malaria species. The P. falciparum aldolase is a 41 kDa protein that is 61–68% homologous to known eukaryotic aldolases, and its crystal structure has been published. The presence of antibodies against p41 in the sera of human adults partially immune to malaria suggests that p41 is implicated in the protective immune response against the parasite.