Community Transmission Of Novel Coronavirus In India Is In Progress And The Real Fight Would Begin Soon

There are two approaches to the latest pandemic, the novel Coronavirus or COVID-19. The first is to accept it as a pandemic and start preparing to tackle it from the very first moment. The second is to let the virus spread first and then try to contain it. Almost all countries of the world, including India, have adopted the second approach, i.e. let the virus spread and then contain it. Countries like South Korea fall in the first category, having adopted rigorous testing, separation, quarantine and treatment from the very first day.

India adopted a very poor approach and response to COVID-19 and it is still pushing the same rhetoric and narrative that a lockdown is the best solution for India. That is clearly an uneducated and naïve claim. A lockdown is, in itself, more problem than solution and it amounts to infecting Indians at the family level. Instead of individual cases in society, we would now see cases of infection at the family level. This exercise would increase infections manifold as community transmission is already taking place in India.

There are many reasons why countries like India, the UK, the US, China, etc allowed COVID-19 to spread and most of them are highly controversial. We at Perry4Law Organisation (P4LO) do not wish to discuss them here as we have already contacted the United Nations and the World Health Organisation regarding a few of these aspects. But we believe that the public has a right to know the truth, and hiding the simple and very crucial truth of community transmission is not only inhuman but also a violation of the Human Rights of Indians.

So irrespective of the false claims of the Indian Government, we wish to reiterate that community transmission of the novel Coronavirus is in progress and soon it would be in full swing in India. What would happen next is easy to ascertain, as countries like China, Italy, Spain, etc have already seen that tragedy. As India would have to use temporary accommodation to keep and treat infected patients, we need many available hostels, hotels, houses, etc for that purpose.

As far as Gujarat is concerned, the completion of Motera Stadium seems to be a timely exercise. But we may need more arrangements than such stadiums, as we cannot build more of them now. We may soon be forced to use passenger trains, railway platforms, hotels, govt offices, etc for this purpose.

Also, instead of getting better and timely services, India would witness major curtailment of services like broadcasting, cable, telecommunications, Internet, etc. Police and military crackdowns may also increase, so Indians must be mentally prepared for a few weeks or months of isolation. This would have a severe psychological impact upon Indians and they must invent methods to fight back against the severe side effects of this isolation exercise and the bad news that would soon follow.

Media (including radio) would not give you the correct picture and your existing communication mechanisms could be severely curtailed. So start removing your dependence upon TV, mobiles, etc as soon as possible and try alternative and effective mechanisms of entertainment and hobby pursuits.

We at P4LO believe that you must be physically and mentally strong to withstand this pandemic. Do not lose hope, stay positive and strong and use your common sense and gut feelings.

We at P4LO are always available to help global stakeholders in the best possible manner and in whatever modes we can. Stay safe, eat healthy food, drink lots of water and take adequate rest.

See you on the other side of the world, a world free from the novel Coronavirus and a time when we would be rebuilding whatever we have lost due to this pandemic.

The History Of Aircraft And Jet Based GeoEngineering And Weather Modification

Aircraft and jet based GeoEngineering and Weather Modification has been going on since 1950 within the actual knowledge of all stakeholders, including the United Nations (UN). Even if we ignore the chemical and metallic composition of the aerosols emitted by these aircraft and jets, it is still very clear that their role as GeoEngineering and Weather Modification methods was well known to scientists and politicians the world over.

The scope of this article is not to discuss whether these emissions were contrails or chemtrails, as we would cover this aspect separately. In the present article we are proving that the world at large was aware of the GeoEngineering and Weather Modification role of aircraft and jet emissions, yet nothing was done to prevent them for almost 70 years.

In 2019 nations are claiming that these contrails may have “inadvertently” resulted in GeoEngineering and Weather Modification. But we at PTLB firmly believe that this was not inadvertent but a deliberate act to allow aircraft and jet emission based GeoEngineering and Weather Modification that has brought disastrous results in our present times.

Let us start the discussion on this topic. In 1958, protests were raised against emissions from jets above, as the tourist trade was severely affected due to cloud coverage caused by such jet emissions.

In 1963, Walter Orr Roberts opined that jet airplane contrails were modifying the climate.

In 1963, the Central Intelligence Agency (CIA) got an Air America Beechcraft and had it rigged up with silver iodide. When a demonstration was held by people in South Vietnam, the U.S. seeded the area and created rain.

In 1968, Reid A. Bryson said that jet aircraft contrails were one of the more recent types of cirrus clouds, which are composed of ice crystals at high altitude. Where jets are operating today, cirrus clouds have increased by 5% to 10%, Bryson said. He estimated that if the day came when 300 supersonic transports were in the air at one time, the region of operation of most SSTs “might easily be 100% covered with cirrus clouds.” The net effect of this would be to further reduce the heating of the earth, and blue skies might become a rarity.

In 1970, Illinois and New Jersey officials decided not to settle pollution suits against the nation’s major airlines out of court, despite an agreement between the airlines and the federal government to clean up the jet aircraft exhaust. Representatives of 31 major domestic airlines agreed to install “burner cans” to eliminate most of the smoke from their nearly 1,000 aircraft by 1972.

In 1970, during the Ninety-first Congress, second session, hearing on S. 3229, S. 3466 and S. 3546, Secretary of Transportation John A. Volpe informed that airline representatives and govt officials had met and obtained the airlines’ agreement to retrofit the present fleet of planes that produced the largest amount of smoke. He informed that instead of being completed in 1974, as had been originally suggested, they agreed to complete this retrofit operation by 1972. Under the plan, the devices were required to be installed on engines whenever they were “down” for routine overhaul after about 5,000 flying hours each on the average. An estimated 3,000 Pratt & Whitney JT8D engines, mostly on Boeing 727, 737 and McDonnell Douglas DC9 short haul craft, required modification. The devices were improved combustors (the chambers in which the fuel is ignited) and were manufactured by Pratt & Whitney.

In 1971, a study revealed that a jet airplane in one landing and takeoff drenches the environment with as much soot as 2,500 automobiles produce in an entire day.

In 1974, direct manufacture of carbon dust (Pdf) on aircraft or from carbon particle generating sources on ships or at surface sites was encouraged. This was done to avoid the clumping, packing, and logistical problems of obtaining the carbon particles from the factory.

If carbon particles are dispersed directly from carbon generating burners, liquid hydrocarbon fuel is converted into carbon particles on the spot. This reduced logistical hurdles and ensured that right sized carbon particles were dispersed in 1974 without anybody noticing.

By slightly modifying readily available jet engines of 1974, carbon dust particles could have been produced and dispersed into air at a rate of 20-30,000 pounds per hour per engine. Lies were also spread to support climate change and global warming rhetoric.

Use of afterburner type jet engines to generate carbon directly was suggested in 1974. This was the cheapest, most convenient and most secretive alternative for GeoEngineering, global cooling and criminal climate change purposes.

In April 1990, Hughes Aircraft Company suggested GeoEngineering through jet fuel. United States Patent 5003186 was granted to it for Stratospheric Welsbach seeding, which it claimed could reduce global warming. The particles could be seeded by dispersal from seeding aircraft; one exemplary technique suggested was via the jet fuel, following prior work regarding metallic particles. Once the tiny particles were dispersed into the atmosphere, they remained in suspension for up to one year.

For history from 1991 to 2015, please see this website.

After many protests from people challenging the contrails theory of politicians, nations have started accepting that emissions from aircraft have been producing strong GeoEngineering and Weather Modification changes since 1950. Keeping in mind that the impact of such emissions has been widely known to all stakeholders since 1950, the “Inadvertent GeoEngineering” theory of nations is another lie to fool people. The nations have to accept that they knew about this GeoEngineering and Weather Modification since 1950 and they deliberately failed to curb it.

Views Of WHO On Electromagnetic Fields (EMFs) And Their Effect On Health

This is one of the most controversial topics of present times. People and scientists are divided on the ill effects of electromagnetic fields (EMFs) generated by man-made actions and technologies. Further, there are also some gaps in the scientific study of the impact of man-made EMFs upon the health of humans, plants, animals, etc. We are writing this article as an introductory discussion covering the views of the World Health Organisation (WHO), as they are usually cited in various discussions. Please see this work for the complete discussion of WHO in this regard.

We would add more aspects of EMFs in our subsequent articles, as the discussion of WHO is a basic discussion and it has not addressed many crucial questions. Further, the discussion of WHO is based on many assumptions that must be independently verified by the latest scientific studies. We neither accept nor deny the observations of WHO in this article and our independent opinion would be shared in subsequent articles.

Let us start with a brief discussion of electric fields and magnetic fields. Electric fields are created by differences in voltage: the higher the voltage, the stronger the resultant field. Magnetic fields are created when electric current flows: the greater the current, the stronger the magnetic field. An electric field will exist even when there is no current flowing. If current does flow, the strength of the magnetic field will vary with power consumption but the electric field strength will be constant.
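
For readers who prefer numbers, a minimal sketch of these two statements is given below, using idealised textbook geometries (parallel plates for the electric field and a long straight wire for the magnetic field). The voltages, currents and distances are chosen purely for illustration, and the helper function names are ours.

```python
# Illustrative only: idealised textbook geometries, not a model of any real appliance.
import math

MU_0 = 4 * math.pi * 1e-7  # permeability of free space, in tesla metres per ampere

def electric_field_parallel_plates(voltage_v: float, gap_m: float) -> float:
    """Uniform field between two parallel plates: E = V / d, in volts per metre."""
    return voltage_v / gap_m

def magnetic_flux_density_long_wire(current_a: float, distance_m: float) -> float:
    """Field around a long straight wire: B = mu_0 * I / (2 * pi * r), in tesla."""
    return MU_0 * current_a / (2 * math.pi * distance_m)

# 230 V across a 1 cm gap gives an electric field even when no current flows.
print(electric_field_parallel_plates(230, 0.01), "V/m")       # 23000 V/m
# A 10 A current, observed 10 cm away, produces a magnetic field only while current flows.
print(magnetic_flux_density_long_wire(10, 0.1) * 1e6, "microtesla")  # about 20 microtesla
```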

Besides natural sources the electromagnetic spectrum also includes fields generated by human-made sources. The electricity that comes out of every power socket has associated low frequency electromagnetic fields. And various kinds of higher frequency radiowaves are used to transmit information – whether via TV antennas, radio stations or mobile phone base stations. One of the main characteristics which defines an electromagnetic field (EMF) is its frequency or its corresponding wavelength. Fields of different frequencies interact with the body in different ways. One can imagine electromagnetic waves as series of very regular waves that travel at an enormous speed, the speed of light. The frequency simply describes the number of oscillations or cycles per second, while the term wavelength describes the distance between one wave and the next. Hence wavelength and frequency are inseparably intertwined: the higher the frequency the shorter the wavelength. The more waves you generate (higher frequency) the smaller will be the distance between them (shorter wavelength).
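
The frequency-wavelength relation described above can be put into a short sketch; the example frequencies (mains power, FM radio, a mobile phone band and a microwave oven) are only illustrative.

```python
# Minimal sketch: wavelength = speed of light / frequency.
C = 299_792_458  # speed of light in vacuum, metres per second

def wavelength_m(frequency_hz: float) -> float:
    return C / frequency_hz

for label, f in [("50 Hz mains power", 50),
                 ("100 MHz FM radio", 100e6),
                 ("900 MHz mobile phone", 900e6),
                 ("2.45 GHz microwave oven", 2.45e9)]:
    print(f"{label}: wavelength of about {wavelength_m(f):,.3g} m")
```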

It is very important to understand the difference between non-ionizing radiation and ionizing radiation. Wavelength and frequency determine another important characteristic of electromagnetic fields: Electromagnetic waves are carried by particles called quanta. Quanta of higher frequency (shorter wavelength) waves carry more energy than lower frequency (longer wavelength) fields. Some electromagnetic waves carry so much energy per quantum that they have the ability to break bonds between molecules. In the electromagnetic spectrum, gamma rays given off by radioactive materials, cosmic rays and X-rays carry this property and are called ‘ionizing radiation’. Fields whose quanta are insufficient to break molecular bonds are called ‘non-ionizing radiation’. Man-made sources of electromagnetic fields that form a major part of industrialised life – electricity, microwaves and radiofrequency fields – are found at the relatively long wavelength and low frequency end of the electromagnetic spectrum and their quanta are unable to break chemical bonds.
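
A back-of-the-envelope check of this ionizing versus non-ionizing distinction is sketched below. It assumes, for comparison, a typical chemical bond energy of a few electron-volts; the example frequencies and the threshold value are illustrative, not regulatory figures.

```python
# Photon energy E = h * f, compared with a rough molecular bond energy of a few eV.
H = 6.626e-34        # Planck constant, joule seconds
EV = 1.602e-19       # one electron-volt in joules
BOND_ENERGY_EV = 5   # assumed order of magnitude for breaking a molecular bond

def photon_energy_ev(frequency_hz: float) -> float:
    return H * frequency_hz / EV

for label, f in [("50 Hz power frequency", 50),
                 ("900 MHz mobile phone", 900e6),
                 ("2.45 GHz microwave oven", 2.45e9),
                 ("X-rays (about 3e18 Hz)", 3e18)]:
    e = photon_energy_ev(f)
    verdict = "enough to break bonds (ionizing range)" if e > BOND_ENERGY_EV else "far too low (non-ionizing)"
    print(f"{label}: {e:.3g} eV, {verdict}")
```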

Magnetic fields arise from the motion of electric charges. The strength of the magnetic field is measured in amperes per meter (A/m); more commonly in electromagnetic field research, scientists specify a related quantity, the flux density (in microtesla, µT) instead. In contrast to electric fields, a magnetic field is only produced once a device is switched on and current flows. The higher the current, the greater the strength of the magnetic field. According to WHO, like electric fields, magnetic fields are strongest close to their origin and rapidly decrease at greater distances from the source. However, magnetic fields are not blocked by common materials such as the walls of buildings.
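
For completeness, the two magnetic-field quantities mentioned above are related in air or free space by B = µ0 × H; the small converter below simply applies that relation, with the sample values chosen only for illustration.

```python
# Convert magnetic field strength H (A/m) to flux density B (microtesla), valid in air / free space.
import math

MU_0 = 4 * math.pi * 1e-7  # tesla metres per ampere

def flux_density_microtesla(h_amperes_per_metre: float) -> float:
    return MU_0 * h_amperes_per_metre * 1e6  # tesla converted to microtesla

print(flux_density_microtesla(1.0))   # 1 A/m is roughly 1.26 microtesla in air
print(flux_density_microtesla(80.0))  # 80 A/m is roughly 100 microtesla
```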

Magnetic fields are created only when electric current flows. Magnetic fields and electric fields then exist together in the room environment. The greater the current, the stronger the magnetic field. High voltages are used for the transmission and distribution of electricity, whereas relatively low voltages are used in the home. While the voltages used by power transmission equipment vary little from day to day, currents through a transmission line vary with power consumption.

A static field does not vary over time. A direct current (DC) is an electric current flowing in one direction only. In any battery-powered appliance the current flows from the battery to the appliance and then back to the battery. It will create a static magnetic field. The earth’s magnetic field is also a static field. So is the magnetic field around a bar magnet which can be visualised by observing the pattern that is formed when iron filings are sprinkled around it.

In contrast, time-varying electromagnetic fields are produced by alternating currents (AC). Alternating currents reverse their direction at regular intervals. In most European countries electricity changes direction with a frequency of 50 cycles per second or 50 Hertz. Equally, the associated electromagnetic field changes its orientation 50 times every second. North American electricity has a frequency of 60 Hertz.

The time-varying electromagnetic fields produced by electrical appliances are an example of extremely low frequency (ELF) fields. ELF fields generally have frequencies up to 300 Hz. Other technologies produce intermediate frequency (IF) fields with frequencies from 300 Hz to 10 MHz and radiofrequency (RF) fields with frequencies of 10 MHz to 300 GHz. The effects of electromagnetic fields on the human body depend not only on their field level but on their frequency and energy. Our electricity power supply and all appliances using electricity are the main sources of ELF fields; computer screens, anti-theft devices and security systems are the main sources of IF fields; and radio, television, radar and cellular telephone antennas, and microwave ovens are the main sources of RF fields. These fields induce currents within the human body, which if sufficient can produce a range of effects such as heating and electrical shock, depending on their amplitude and frequency range. (However, to produce such effects, the fields outside the body would have to be very strong, far stronger than present in normal environments.)
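
The ELF / IF / RF bands listed in the preceding paragraph can be captured in a small helper like the one below; the band boundaries are taken from the text itself, while the example sources and their frequencies are only indicative.

```python
# Classify a frequency into the bands described above: ELF (up to 300 Hz),
# IF (300 Hz to 10 MHz) and RF (10 MHz to 300 GHz).
def classify_field(frequency_hz: float) -> str:
    if frequency_hz <= 300:
        return "ELF (extremely low frequency)"
    if frequency_hz <= 10e6:
        return "IF (intermediate frequency)"
    if frequency_hz <= 300e9:
        return "RF (radiofrequency)"
    return "outside the bands discussed here"

for label, f in [("50 Hz mains electricity", 50),
                 ("anti-theft gate (assumed ~100 kHz)", 100e3),
                 ("FM radio (100 MHz)", 100e6),
                 ("mobile phone (900 MHz)", 900e6),
                 ("microwave oven (2.45 GHz)", 2.45e9)]:
    print(f"{label}: {classify_field(f)}")
```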

Mobile telephones, television and radio transmitters and radar produce RF fields. These fields are used to transmit information over long distances and form the basis of telecommunications as well as radio and television broadcasting all over the world. Microwaves are RF fields at high frequencies in the GHz range. In microwave ovens, we use them to heat food quickly.

At radio frequencies, electric and magnetic fields are closely interrelated and we typically measure their levels as power densities in watts per square metre (W/m2).
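
In the far field these quantities are tied together by the plane-wave relation S = E²/Z0 = Z0·H², with Z0 of roughly 377 ohms in free space. The snippet below only illustrates that relation; the field values in it are arbitrary examples, not exposure limits.

```python
# Far-field plane-wave relation between field strength and power density.
Z0 = 376.73  # impedance of free space, in ohms

def power_density_from_e(e_volts_per_metre: float) -> float:
    return e_volts_per_metre ** 2 / Z0        # watts per square metre

def power_density_from_h(h_amperes_per_metre: float) -> float:
    return Z0 * h_amperes_per_metre ** 2      # watts per square metre

print(power_density_from_e(61))    # about 10 W/m^2 for a 61 V/m field
print(power_density_from_h(0.16))  # about 10 W/m^2 for a 0.16 A/m field
```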

The key points from the abovementioned discussions are as follows:

(1) The electromagnetic spectrum encompasses both natural and human-made sources of electromagnetic fields.

(2) Frequency and wavelength characterise an electromagnetic field. In an electromagnetic wave, these two characteristics are directly related to each other: the higher the frequency the shorter the wavelength.

(3) Ionizing radiation such as X-rays and gamma-rays consists of photons which carry sufficient energy to break molecular bonds. Photons of electromagnetic waves at power and radio frequencies have much lower energy and do not have this ability.

(4) Electric fields exist whenever charge is present and are measured in volts per metre (V/m). Magnetic fields arise from current flow. Their flux densities are measured in microtesla (µT) or millitesla (mT).

(5) At radio and microwave frequencies, electric and magnetic fields are considered together as the two components of an electromagnetic wave. Power density, measured in watts per square metre (W/m2), describes the intensity of these fields.

(6) Low frequency and high frequency electromagnetic waves affect the human body in different ways.

(7) Electrical power supplies and appliances are the most common sources of low frequency electric and magnetic fields in our living environment. Everyday sources of radiofrequency electromagnetic fields are telecommunications, broadcasting antennas and microwave ovens.

Tiny electrical currents exist in the human body due to the chemical reactions that occur as part of the normal bodily functions, even in the absence of external electric fields. For example, nerves relay signals by transmitting electric impulses. Most biochemical reactions from digestion to brain activities go along with the rearrangement of charged particles. Even the heart is electrically active – an activity that your doctor can trace with the help of an electrocardiogram.

Low-frequency magnetic fields induce circulating currents within the human body. The strength of these currents depends on the intensity of the outside magnetic field. If sufficiently large, these currents could cause stimulation of nerves and muscles or affect other biological processes.

Heating is the main biological effect of radiofrequency electromagnetic fields. In microwave ovens this fact is employed to warm up food. The levels of radiofrequency fields to which people are normally exposed are very much lower than those needed to produce significant heating. The heating effect of radiowaves forms the underlying basis for current guidelines. Scientists are also investigating the possibility that effects below the threshold level for body heating occur as a result of long-term exposure. To date, no adverse health effects from low level, long-term exposure to radiofrequency or power frequency fields have been confirmed, but scientists are actively continuing to research this area.
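
The arithmetic behind this heating effect is simple; the sketch below uses Q = m·c·ΔT with assumed, purely illustrative numbers (a 700 W oven, 250 g of water, and the idealisation that all the microwave energy goes into the water).

```python
# Idealised microwave heating: all absorbed energy raises the water temperature.
SPECIFIC_HEAT_WATER = 4186  # joules per kilogram per kelvin

def temperature_rise_kelvin(power_w: float, seconds: float, mass_kg: float) -> float:
    return power_w * seconds / (mass_kg * SPECIFIC_HEAT_WATER)

# An assumed 700 W oven heating 250 g of water for one minute.
print(temperature_rise_kelvin(700, 60, 0.25))  # roughly a 40 K rise
```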

Biological effects are measurable responses to a stimulus or to a change in the environment. These changes are not necessarily harmful to your health. For example, listening to music, reading a book, eating an apple or playing tennis will produce a range of biological effects.

Nevertheless, none of these activities is expected to cause health effects. The body has sophisticated mechanisms to adjust to the many and varied influences we encounter in our environment. Ongoing change forms a normal part of our lives. But, of course, the body does not possess adequate compensation mechanisms for all biological effects. Changes that are irreversible and stress the system for long periods of time may constitute a health hazard.

An adverse health effect causes detectable impairment of the health of the exposed individual or of his or her offspring; a biological effect, on the other hand, may or may not result in an adverse health effect.

It is not disputed that electromagnetic fields above certain levels can trigger biological effects. Experiments with healthy volunteers indicate that short-term exposure at the levels present in the environment or in the home do not cause any apparent detrimental effects. Exposures to higher levels that might be harmful are restricted by national and international guidelines. The current debate is centred on whether long-term low level exposure can evoke biological responses and influence people’s well being. Based on a recent in-depth review of the scientific literature, the WHO concluded that current evidence does not confirm the existence of any health consequences from exposure to low level electromagnetic fields. However, some gaps in knowledge about biological effects exist and need further research.

As far as cancer hazard is concerned, it is clear that if electromagnetic fields do have an effect on cancer, then any increase in risk will be extremely small. The results to date contain many inconsistencies, but no large increases in risk have been found for any cancer in children or adults. A number of epidemiological studies suggest small increases in risk of childhood leukemia with exposure to low frequency magnetic fields in the home. However, scientists have not generally concluded that these results indicate a cause-effect relation between exposure to the fields and disease (as opposed to artifacts in the study or effects unrelated to field exposure). In part, this conclusion has been reached because animal and laboratory studies fail to demonstrate any reproducible effects that are consistent with the hypothesis that fields cause or promote cancer. Large-scale studies are currently underway in several countries and may help resolve these issues.

The key points from the above discussions are as follows:

(1) A wide range of environmental influences causes biological effects. ‘Biological effect’ does not equal ‘health hazard’. Special research is needed to identify and measure health hazards.

(2) At low frequencies, external electric and magnetic fields induce small circulating currents within the body. In virtually all ordinary environments, the levels of induced currents inside the body are too small to produce obvious effects.

(3) The main effect of radiofrequency electromagnetic fields is heating of body tissues.

(4) There is no doubt that short-term exposure to very high levels of electromagnetic fields can be harmful to health. Current public concern focuses on possible long-term health effects caused by exposure to electromagnetic fields at levels below those required to trigger acute biological responses.

(5) WHO’s International EMF Project was launched to provide scientifically sound and objective answers to public concerns about possible hazards of low level electromagnetic fields.

(6) Despite extensive research, to date there is no evidence to conclude that exposure to low level electromagnetic fields is harmful to human health.

(7) The focus of international research is the investigation of possible links between cancer and electromagnetic fields, at power line and radiofrequencies.

Countries set their own national standards for exposure to electromagnetic fields. However, the majority of these national standards draw on the guidelines set by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). This non-governmental organization, formally recognized by WHO, evaluates scientific results from all over the world.

The WHO has recently launched an initiative to harmonize exposure guidelines worldwide. Future standards will be based on the results of the WHO’s International Electromagnetic Field Project. See also the exposure limits for low-frequency fields (public) data by country and the exposure limits for radio-frequency fields (public) data by country. We could not find the data about India there and would appreciate it if any person or organisation could provide the same.

Sun Has A Direct And Substantial Impact Upon Warming And Cooling Of Earth

Fake news, false propaganda and public opinion manipulation have been going on for ages. There is virtually no area where these tactics have not been used. But when it comes to the global warming hoax, these tactics have taken the shape of mass scale brainwashing of the public. Wherever critical thinkers challenged the obvious lie of global warming due to CO2, these thinkers were labelled as conspiracy theorists, skeptics, etc. Nevertheless, public spirited individuals and many scientists kept on telling the truth. Because of these wonderful human beings we are in a position to discuss the ways and methods to save our environment and the future of our planet earth. Unfortunately, even the United Nations (UN) was swayed by this false propaganda and it wasted many precious decades upon incorrect, inappropriate and even dangerous options like GeoEngineering and Solar GeoEngineering.

There are many reasons for the warming and cooling of the earth and scientists have closely observed periods of Solar Minimum and Solar Maximum for ages. We are not discussing these reasons for the warming of the earth here, as the scope of this article is to analyse whether sunlight or solar radiation can create global warming or not.

Why is this point and subject important at all? Because if the sun is creating global warming, then the global warming hoax built around CO2 emissions from road transport would be busted. That is why it is of paramount importance to its proponents that solar radiation or solar heat be removed from the equation of global warming.

There is no uniformity of opinion in this regard. The proponents of CO2 induced global warming deny the effect of the sun, while many other scientists are simply not convinced. They believe the sun is the true source of global warming/cooling and they have validly proven this fact using data and analysis of Solar Minimum and Solar Maximum periods over many centuries.

Solar Minimum periods witnessed global cooling while Solar Maximum periods witnessed global warming. Those who believe that global warming is a hoax frequently cite these scientific opinions and this literature. Irrespective of what pro and anti global warming people say, we have to analyse the situation and position on our own.

On the face of it, the argument that CO2 generated by fossil fuels is responsible for global warming is absurd and against common sense. Even this argument has been twisted and misrepresented multiple times and has lost its appeal due to dishonesty and the suppression of other factors and facts.

For instance, the lie that 97% of climate scientists believe that global warming is caused by CO2 has done more damage than it has proved the point. The truth is that only 1.6 percent explicitly stated that man-made greenhouse gases caused at least 50 percent of global warming. In fact, the scientists who were claimed to be part of this 97% lie openly rejected this false assumption. Even the very greenhouse gases that are frequently blamed for global warming consist of other gases too besides CO2.

So what the most dedicated 1.6% of global warming supporters are saying is that, along with various other gases, CO2 is one of the gases that have produced at least 50% of global warming. The remaining 50% cannot be attributed to solar radiation by the proponents of global warming, otherwise the very foundation of CO2 being the culprit would collapse.

“Variations in solar energy output have far more effect on Earth’s climate than soccer moms driving SUVs,” Southwestern Law School professor Joerg Knipprath, writes in his ‘Token Conservative’ blog. “A rational thinker would understand that, especially if he or she has some understanding of the limits of human influence. But the global warming boosters have this unbounded hubris that it is humans who control nature, and that human activity can terminally despoil the planet as well as cause its salvation.”

Many climate scientists agree that sunspots and the solar wind could be playing a role in climate change, because when sunspot numbers rise and fall, there is more going on than simply changes in solar brightness. Periods of reduced sunspot activity correspond to periods of reduced magnetic activity on the sun, and reduced outflows of charged particles from the sun (the so-called solar wind). The solar wind whizzes past the Earth and deflects cosmic rays from deep space from hitting our atmosphere. A recent proposal from Danish scientists suggests that when cosmic rays strike our atmosphere, they create tiny aerosol particles that lead to increased cloud formation and less sunlight hitting the Earth. So it’s a double whammy: fewer sunspots mean a dimmer sun, which also means more cosmic rays entering the atmosphere and more cloud cover, which further cools the Earth. And vice versa when there is more solar activity.

Computer models of the climate do not take these indirect effects of solar activity into account when calculating the change in global climate. And while human activity accounts for only 5% of the carbon dioxide emitted into the atmosphere each year, the sun accounts for ALL the energy striking the Earth and driving its dynamic and enormously complex ocean currents and atmosphere.

For many years, solar scientists considered variation in solar irradiance to be too small to cause significant climate changes. However, Svensmark has proposed a new concept of how the sun may impact Earth’s climate (Svensmark and Calder, 2007; Svensmark and Friis-Christensen, 1997; Svensmark et al., 2007). Svensmark recognised the importance of cloud generation as a result of ionization in the atmosphere caused by cosmic rays. Clouds highly reflect incoming sunlight and tend to cool the Earth. The amount of cosmic radiation is greatly affected by the sun’s magnetic field, so during times of weak solar magnetic field, more cosmic radiation reaches the Earth. Thus, perhaps variation in the intensity of the solar magnetic field may play an important role in climate change.

Man-made carbon dioxide is cited as a cause of global warming. However, in an article entitled “Does Carbon Dioxide Drive Global Warming?” Larry Vardiman presented several major reasons why carbon dioxide is probably not the primary cause. But if carbon dioxide is not the cause, then what is? Evidence is accumulating that cosmic rays associated with fluctuations in the sun’s electromagnetic field may be what drives global warming. A new theory called cosmoclimatology, which proposes a natural mechanism for climate fluctuations, has been developed by Henrik Svensmark, Head of the Center for Sun-Climate Research at the Danish National Space Center.

Edward Walter Maunder reported in 1904 that the number of spots on the sun has an 11-year cycle. Sunspots can be observed in real time online at www.spaceweather.com. A 400-year record of the monthly number of sunspots shows a notably low number of sunspots in the period from 1645 to 1715. This period is called the Maunder Minimum and coincides with the Little Ice Age, the coldest period of temperature during the last 1,000 years.

In 1995, Henrik Svensmark discovered a startling connection between the cosmic ray flux from space and cloud cover. He found that when the sun is more active (more sunspots, a stronger magnetic field, larger auroras, stronger solar winds, etc.), fewer cosmic rays strike the earth and cloud cover is reduced, resulting in warmer temperatures.

For the 22-year period from 1983 to 2005, the average amount of low-level cloud follows the flux of cosmic rays very closely. In fact, Svensmark claims that the correlation coefficient is 0.92, a very high correlation for this type of data. In addition, when looking at various longer periods of record using proxy data for these two variables, he also found good correlations and similar trends. In particular, he suggested that during the Little Ice Age when the sun was inactive, cosmic ray flux from space was high, cloud amount was greater, and global temperatures were cooler. As the sun became more active after 1750, cosmic ray flux decreased, cloud amount decreased, and global temperatures warmed. Svensmark proposed that the global warming we’ve experienced for the past 150 years is a direct result of an increase in solar activity and attendant warming.
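
For readers unfamiliar with the statistic, the sketch below shows how a correlation coefficient such as the 0.92 quoted above is computed. The two short series are made-up numbers for illustration only, not Svensmark's data, and the function name is ours.

```python
# Pearson correlation coefficient between two series (illustrative data only).
import math

def pearson_r(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

cosmic_ray_index = [100, 103, 98, 110, 95, 107, 101, 93]              # hypothetical values
low_cloud_percent = [28.0, 28.6, 27.5, 29.9, 27.1, 29.2, 28.2, 26.8]  # hypothetical values
print(round(pearson_r(cosmic_ray_index, low_cloud_percent), 2))       # close to 1 for these made-up series
```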

A potential change in cloud cover of 3-4 percent caused by changes in cosmic ray flux is sufficient to explain global temperature changes of several degrees due to the change in the reflectivity of clouds. The reason the variation in direct radiation from the sun was rejected earlier is because it has been found to vary only by a few tenths of a percent. This is insufficient to explain observed global warming.
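
A back-of-the-envelope energy balance makes the comparison concrete. The numbers below assume a top-of-atmosphere solar irradiance of about 1361 W/m², a planetary albedo of about 0.30, and, purely for illustration, that the 3-4 percent change in cloud cover translates into a 0.01 change in planetary albedo; this is a sketch, not a climate model.

```python
# Back-of-the-envelope global energy balance, not a climate model.
S0 = 1361.0      # assumed top-of-atmosphere solar irradiance, W/m^2
ALBEDO = 0.30    # assumed planetary albedo (fraction of sunlight reflected to space)

absorbed = S0 / 4 * (1 - ALBEDO)            # global-mean absorbed sunlight, about 238 W/m^2

# Case 1: direct solar output varies by 0.1% (a "few tenths of a percent").
delta_from_sun = (S0 * 0.001) / 4 * (1 - ALBEDO)

# Case 2: an assumed cloud-driven albedo change of 0.01 at fixed solar output.
delta_from_clouds = S0 / 4 * 0.01

print(f"absorbed sunlight:          {absorbed:.1f} W/m^2")
print(f"0.1% change in irradiance:  {delta_from_sun:.2f} W/m^2")
print(f"0.01 change in albedo:      {delta_from_clouds:.2f} W/m^2")
```

Even with these crude assumptions, the cloud-driven term comes out roughly an order of magnitude larger than the irradiance term, which is the point the paragraph above is making.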

These statistical correlations are intriguing, but many critics are skeptical of Svensmark’s theory until he can explain the mechanism by which cosmic rays create more clouds. This led him to design a laboratory experiment to demonstrate that cosmic rays produce more cloud nuclei on which cloud droplets can form. In 2007, Svensmark et al published the results of an experiment which confirmed his theory that cosmic rays increase the number of cloud condensation nuclei (CCN). [ See Svensmark, H. et al. 2007. Experimental evidence for the role of ions in particle nucleation under atmospheric conditions. Proceedings of the Royal Society A. 463 (2078): 385-396.]

Svensmark’s theory of cosmoclimatology is now complete. He has discovered a complete chain of events that explains the variations in global temperature that have puzzled climatologists for so many years, and that has now led to an explanation for the recent global warming episode. It starts with cosmic rays coming to earth from exploding supernovas and collisions of remnants of stars with nebulae in space. Many of these cosmic rays are shielded from striking the earth by the electromagnetic activity of the sun. When the sun is active, the solar wind prevents cosmic rays from entering the earth’s atmosphere by sweeping them around the earth. When the sun is inactive, more of them penetrate the atmosphere. Upon reaching the lower atmosphere where more sulphur dioxide, water vapor, and ozone are present, the cosmic rays ionize the air, releasing electrons that aid in the formation of more CCN and form denser clouds. This increase in low-cloud amount reflects more solar energy to space, cooling the planet. Variations in the electromagnetic activity of the sun and fluctuations in cosmic ray intensity from space result in the periodic warming and cooling of the earth.

Solar-modulated cosmic ray processes successfully explain the recent global warming episode. It would be prudent for the political leadership in the U. S. and the world/UN to look more closely at Svensmark’s theory of cosmoclimatology for an explanation of global warming before restructuring our entire economic system to eliminate carbon dioxide. If, in fact, Svensmark is correct, reducing the concentration of carbon dioxide will have little impact, anyway.

UN Blatantly Lied About Global Warming Due To CO2 Emissions By Fossil Fuels

The tussle between global cooling and global warming has been going on since even before the 1960s. Up to 1962 we were discussing ways to warm up the Arctic, but with the death of Harry Wexler in 1962 things changed forever. In 1963, GeoEngineering proposals to warm the Arctic took a largely unexplained U-turn when oceanographer Roger Revelle’s research concluded that carbon dioxide was already warming the climate for free and without the need for expensive and risky geoengineering projects. The worst part was that the lies, hoax and false propaganda of global warming were spread by those having vested interests. There was neither any agreement nor any global consensus among the stakeholders involved in this field in 1970.

Even after almost 50 years the lie of global warming is not only shamelessly peddled but has even been supported by the United Nations (UN). Instead of scientific and conclusive studies in this field, the UN preferred to rely upon lies, presumptions, conjectures and surmises. As a result, nations imposed carbon emission taxes and other financial penalties that were utilised for even more sinister purposes, i.e. GeoEngineering and manipulation of the earth’s climate using earth based and space based technologies.

A big lie was also spread that 97% of climate scientists believe that global warming is a man-made disaster. Firstly, there is nothing like “global warming”, as even within a country one place may be having snowfall while another may be facing severe heat. So the UN and its so-called scientists used data fudging and labelled “regional warming” as “global warming” to continue the global warming hoax.

Secondly, 97% of scientists, climate scientists, physicists, etc have never claimed that global warming is a result of man-made actions like CO2 emissions from fossil fuels. On the contrary, only 1.6 percent explicitly stated that man-made greenhouse gases caused at least 50 percent of global warming. In fact, the scientists who were claimed to be part of this 97% lie openly rejected this false assumption.

In the process of continuing to peddle this lie, the UN and its scientists are doing greater damage to our environment and earth. They are trying to justify GeoEngineering and manipulation of the earth’s environment using unscientific, untested, dangerous and irreversible technologies. This time we should not let the climate criminals manipulate us and our earth to suit their own commercial interests.

A Brief History Of GeoEngineering, Weather Warfare, Weather Modification And Chemical Manipulations Of Earth’s Environment

The history of GeoEngineering, Weather Warfare, Weather Modification and Chemical manipulations of the Earth’s environment goes back a century and more. Since 1850, people in the U.S. have been trying to use cloud seeding and rain making methods to control rain. As per publicly available documents, government agencies, the military and commercial interests of the U.S. and abroad have invested in a radical scalar system of weather modification that is now revealed to have far too many unintended negative consequences to sustain life on earth. Good research in this regard is available at this website and has been independently analysed and verified by us.

A detailed analysis of historical development on the abovementioned topic is beyond the scope of this article as we would cover these developments individually and independently. A brief history of this topic is as follows:

1877: Harvard geologist Nathaniel Shaler proposed channeling more of the warm Kuroshio Current through the Bering Strait to raise temperatures in the Polar region by 30 degrees.

1912: New York engineer and industrialist Carroll Livingston Riker proposed building a 200 mile jetty off Newfoundland to increase the Gulf Stream’s flow into the Arctic Basin, with the added benefit that it would “shift” the axis of planet earth.

[ See The New York Times September 29, 1912, Article titled “To Move The Earth And Melt The Pole (1912)” (Pdf) ]

1929: Hermann Oberth, German-Hungarian physicist and engineer, proposed building giant mirrors on a space station to focus the Sun’s radiation on Earth’s surface, making the far North habitable and freeing sea lanes to Siberian harbors.

[ See Climate Engineering- Technical status, future directions, and potential responses (2011) (Pdf) ]

In 1930 three European scientists pushed the experiments and science further than ever before. August Veraart of Holland triumphantly proclaimed that the dry ice he dispersed into the clouds caused it to rain, but his voice was so loud and his claims so extravagant that he was dismissed altogether. At the same time a duo from Scandinavia and Germany experimented with freezing vapor on ice crystals in clouds and claimed that “at comparatively slight expense, it will, in time, be possible to bring about rain artificially.”

None of these men garnered much credit for the first successful cloud seeding. That honour goes to Irving Langmuir, Vincent Schaefer, and Bernard Vonnegut of the General Electric Laboratories in Schenectady, New York, in 1946. Langmuir’s work on cloud seeding is a footnote in a long and storied career as a chemist and inventor. As a researcher, associate director, and consultant at General Electric, he advanced many fields and earned top honors for his work, including the Nobel Prize in 1932. Schaefer was Langmuir’s assistant, who gained recognition for having serendipitously discovered how to form ice crystals in his home ice box using dry ice, which is solid carbon dioxide. In November 1946 he confirmed in a four-mile long stratus cloud what he had tested in his ice box: that ice crystals formed when clouds were cooled. Not long thereafter, the younger Vonnegut found that silver iodide could be used to seed clouds to produce rain and snow.

[ See Project Skywater (1961) Page 7 (Pdf) ]

1945: Julian Huxley, the biologist and Secretary-General of UNESCO, proposed in 1946-48 exploding atomic bombs at an appropriate height above the polar regions to raise the temperature of the Arctic Ocean and warm the entire climate of the northern temperate zone.

[ See Climate engineering- Technical status, future directions, and potential responses (2011) ]

1947: The year 1947 was the first turning point in the history of U.S. GeoEngineering and Weather Modification activities. The National Security Act of 1947 was passed in the U.S. Aside from the military reorganisation, the act established the National Security Council and the Central Intelligence Agency (CIA), the U.S.’s first peacetime non-military intelligence agency.

The Bureau of Reclamation took interest in weather modification in 1947, a year after Schaefer’s seeding demonstrations.

[ See Project Skywater (1961) Page 8 (Pdf) ]

1958: M. Gorodsky, Soviet engineer and mathematician, and Valentin Cherenkov, Soviet meteorologist; proposed placing a ring of metallic potassium particles into Earth’s polar orbit to diffuse light reaching Earth and increase solar radiation to thaw the permanently frozen soil of Russia, Canada, and Alaska and melt polar ice.

[ See Climate engineering- Technical status, future directions, and potential responses (2011) (Pdf) ]

1958: Arkady Markin, Soviet engineer, proposed that the United States and Soviet Union build a gigantic dam across the Bering Strait and use nuclear power–driven propeller pumps to push the warm Pacific current into the Atlantic by way of the Arctic Sea. Arctic ice would melt, and the Siberian and North American frozen areas would become temperate and productive.

[ See Climate engineering- Technical status, future directions, and potential responses (2011) (Pdf) ]

1958: Russian oil engineer P.M. Borisov proposed melting the Arctic and Greenland icecaps by spreading black coal dust on the ice, creating cloud cover across the poles to trap heat and diverting warm Atlantic waters into the polar regions. This scheme was taken seriously by Soviet climatologists. Two conferences were held in Leningrad in the early 1960s following an initial meeting in Moscow by the Presidium of the USSR Academy of Sciences in 1959.

1958: Atlantic Richfield geologist L.M. Natland proposed exploding up to 100 underground nuclear bombs to mine the Alberta Oil Sands. Heat from the detonations was expected to boil the bitumen deposits, reducing their viscosity to the point that standard drilling operations could be used. The plan was encouraged by US efforts to find “peaceful uses” for atomic energy. The project was approved in 1959 but the Canadian government reversed its decision in 1962 and declared that Canada was opposed to all forms of nuclear testing. In 2012 the Canadian Tar Sands were, again, an issue of international concern.

During the US Congress, Senate, Committee on Inter-State and Foreign Commerce, Weather Modification Research Hearing, Washington D.C., US Govt. Printing Office, March 18-19, 1958, Lowell Ponte quotes Capt. Orville as reporting “that the Dept. of Defense was studying ways to manipulate the charges of earth and sky and so affect the weather by means of an electronic beam to ionize or de-ionize the atmosphere over a given area”. Capt. Orville also discussed ongoing US Air Force experiments with ‘sodium vapor, ejected from jet planes to intercept solar radiation’ over enemy countries and rain their weather.

[ See WEATHER MODIFICATION: THE EVOLUTION OF AN R&D PROGRAM INTO A MILITARY OPERATION (1986) (Pdf) ]

1961: Project Skywater came fifteen years after Irving Langmuir, Vincent Schaefer, and Bernard Vonnegut of the General Electric Laboratories in Schenectady, New York, successfully demonstrated in 1946 that “seeding” clouds with nucleating agents like dry ice (carbon dioxide) and silver iodide produced rain.

[ See Project Skywater (1961) (Pdf) ]

1961: Scientists propose artificial ion cloud experiments. In the 1960s, the dumping of chemicals (barium powder etc.) from satellites/rockets began.

1961-62: The Soviets and the USA blast many EMPs in the atmosphere; 300 megatons of nuclear devices deplete the ozone layer by an estimated 4%.

1962: Launch of Canadian satellites and start of stimulating plasma resonances by antennas within the space plasma.

1962: This was the second turning point in the history of U.S. GeoEngineering and Weather Modification. Harry Wexler (March 15, 1911 - 1962) was an MIT graduate and PhD in meteorology. Wexler had been researching the link connecting chlorine and bromine compounds to the destruction of the stratospheric ozone layers, but died in 1962. The Weather Bureau in Washington said the exact cause of death was not known. He had been in New England Baptist Hospital for about a week before his death.

[ See The Baltimore Sun, 12 Aug 1962, Sun, Page 37. ]

After the death of Wexler, the policy focus shifted to cooling the planet instead of heating it up.

In 1963, Geoengineering proposals to warm the Arctic took a largely unexplained U-turn when oceanographer Roger Revelle’s research concluded that carbon dioxide was already warming the climate for free and without the need for expensive and risky geoengineering projects.

If the science of Roger Revelle’s forecast for global warming turned out to be wrong or too slow, the DoD could step in – for reasons of national security – to assist arctic warming as a secret component of the military’s classified weather modification and weapons program.

Revelle had worked with the Navy in the late 1940’s to determine which projects gained funding and successfully promoted the idea that the Navy should invest more in “basic research”. Revelle was deeply involved in the global growth of oceanography. He was also one of the committee chairmen in the influential National Academy of Sciences studies of the “Biological Effects of Atomic Radiation” (BEAR), 1954-1964. Revelle’s world influence was significant as president of the Scientific Committee on Oceanic Research, an international group of scientists devoted to advising on international projects. Revelle and other scientists at Scripps Institution of Oceanography helped the U.S. government to plan nuclear weapons tests so that oceanographers might make use of the data.

The conclusions of the BEAR report were understandably significant for demonstrating the harmful biological and environmental damage of atomic radiation and could easily suffice to thwart geoengineering projects that recommended detonating H-bombs. But the evidence is weak that all intentions to mediate the arctic climate were totally abandoned.

If the fundamental goal to warm the arctic remains an unspoken priority of national security in the energy sector, the project could be taken out of public view and committee oversight to become a classified operation in the development of the military’s weather warfare program –an initiative that was acknowledged by civilian weather modification programs formalised by the 1966 NASA and ICAS charter.

1972: First reports on “ionospheric heater” experiments with high frequency radio waves, at Arecibo. 100-megawatt heater in Norway built later in decade; can change conductivity of auroral ionosphere.

1972: Potential Value of Satellite Cloud Pictures in Weather Mod. Projects – Report prepared for NASA by Institute of Atmospheric Sciences South Dakota School of Mines and Technology Rapid City.

[ See POTENTIAL VALUE OF SATELLITE CLOUD PICTURES IN WEATHER MODIFICATION PROJECTS (1972)  (Pdf)]

1974: United Nations General Assembly bans environmental warfare. ENMOD.

1975: Stanford professor Robert Helliwell reports that VLF from power lines is altering the ionosphere.

1975: U. S. Senator Gaylord Nelson forces Navy to release research showing that ELF transmissions can alter human blood chemistry.

1975: Pell Senate Subcommittee urges that weather and climate modification work be overseen by civilian agency answerable to U.S. Congress. No action taken.

1976: Drs. Susan Bawin and W. Ross Adey show nerve cells affected by ELF fields.

1979: Launch of NASA’s third High-Energy Astrophysical Observatory causes large-scale, artificially-induced depletion in the ionosphere. The plasma hole was caused by “rapid chemical processes” between rocket exhaust and the ozone layer; the “ionosphere was significantly depleted over a horizontal distance of 300 km for some hours.”

1985: This was the third turning point in the U.S.’s GeoEngineering and Weather Modification history.  Bernard J. Eastlund applies for patent “Method and Apparatus for Altering a Region in the Earth’s Atmosphere, ionosphere and/or Magnetosphere,” (First of 3 Eastlund patents assigned to ARCO Power Technologies Inc.)

1986: US Navy Project Henhouse duplicates Delgado (Madrid) experiment — very low-level, very-low-frequency pulsed magnetic fields harm chick embryos.

1987: In the later part of the decade the U.S. begins a network of Ground Wave Emergency Network (GWEN) towers, each to generate Very Low Frequency (VLF) waves for defense purposes.

1987-92: Other APTI scientists build on Eastlund patents for development of new weapon capabilities.

1994: Military contractor E-Systems buys APTI, holder of Eastlund patents and contract to build biggest ionospheric heater in world (HAARP).

1994: Congress freezes funding on HAARP until planners increase emphasis on earth-penetrating tomography uses, for nuclear counter proliferation efforts. (Oil and gas exploration)

1995-1997: Public complaints accumulate across the US regarding unusual cloud formations and sudden increase in observable persistent jet contrails that appear unnaturally under dry atmospheric conditions. These observations are accompanied by complaints of biological specimens and web formations that appear to fall from the sky. Many instances of qualified lab analysis reveal high concentration of aluminum, barium and other elements that are consistent with DoD electromagnetic experiments.

1995: Raytheon buys E-Systems and old APTI patents. The technology is now hidden among thousands of patents within one of the largest defense contractor portfolios.

1995: Test of patent number 5,041,834 to generate an Artificial Ionospheric Mirror (AIM), or a plasma layer in the atmosphere. The AIM is used like the ionosphere to reflect RF energy over great distances.

1994-6: Testing of first-stage HAARP (euphemistically named High frequency Active Auroral Research Program) equipment continues, although funding was frozen.

1996: HAARP scientists test the earth-penetrating tomography applications by modulating the electrojet at Extremely Low Frequencies (ELF).

1998: Projected date for fully-operating HAARP system.

2009: Operation HAMP – Department of Homeland Security operation to Modify and Steer Hurricanes with Geoengineering Aerosols.

[ See this website ]

The 1996 Air Force document that forecasts “Owning the Weather in 2025” would not rule out using Tesla and plasma technologies to increase arctic temperatures in order to disadvantage a perceived enemy. A decision not to intervene might betray the military’s primary objective of “Full Spectrum Dominance”. After all, access to Oil and Gas has been a national security priority for decades.

[ See Weather as a Force Multiplier: Owning the Weather in 2025 (pdf) ]

2012: Celebrating 50 years of Success. A Compilation of highlights from the Institute of Atmospheric Sciences at South Dakota School of Mines & Technology Rapid City.

This is just an illustrative list and we would add more resources, links and information to this article. Please visit this page regularly and visit the blog for individual and supportive documents and research.

Disaster Management Should Be Holistic, Comprehensive And Must Involve Maximum Stakeholders

Disasters are unpredictable, especially man-made disasters. For almost a century nations have been messing with the earth and its environment and we have started facing the consequences too. The frequency and intensity of floods, famines, droughts, cyclones, etc have significantly increased as nations keep on messing with the environment and nature.

Human beings are destructive by their very nature and they are applying this principle to mother earth and nature too. The desire of humans to control nature is very old. Since 1850, modern humans have been trying to use cloud seeding and rain making methods to control rain as per their desires and needs. For instance, in 1891 the rain making process involved suddenly chilling the atmosphere by rapid evaporation of compressed carbonic acid gas with heavy air concussion, using an explosion in the air in order to set different air currents in motion. Research on heating the Ionosphere started in the 1950s, and the availability of airplanes was a major boost to the weather modification, weather warfare and geo engineering efforts of countries like the U.S.

What started as a simple cloud seeding exercise turned into a weather control race, especially for the armed forces of the U.S. and Russia. Both countries kept on experimenting in one form or another to continue this mad race to destroy our planet. When nuclear devices were detonated in the earth’s atmosphere in the 1950s and 1960s, that was a sufficient hint that we must put a brake on this madness. But nothing happened, and as of February 2019 nations are still messing with our environment by engaging in illegal and unethical geo engineering, weather modification, aerosol injections into the air, etc.

There are also claims that the U.S. Army covertly conducted 239 germ warfare tests in the open air using live bacteria between 1949 and 1969, both to wage biological warfare and to defend against it. Needless to say, this was done without the consent and approval of the test subjects. There is a complete blackout of research and documents pertaining to human atrocities committed since 1900. Geo engineering, nuclear warfare, human experiments, chemical and biological warfare, etc are going on with almost no control. We must stop this madness urgently. When we condone biological warfare by nations against their own and non-consenting citizens, we lose faith and trust not only in such nations but in international organisations like the United Nations (UN), the International Court of Justice (ICJ), etc too. We cannot ignore chemtrails, geo engineering and global cooling realities in current times.

The Central Intelligence Agency (CIA) of the U.S. has been very active in the fields of weather warfare, weather modification and geo engineering since 1960. Despite contrary claims, military use of weather, atmosphere and space has been going on for decades and is still in process. Military and weather manipulating satellites have been launched by many nations, including India, under the garb of communication and weather satellites. These satellites can cool, warm or manipulate earth, soil, water, air, space, ice, glaciers, the Ionosphere and oceans by merely entering a few codes at the ground stations. An artificial sun and moon have also been created and have been in active use since at least 2017.

As people started questioning the activities of the U.S. govt after the 1960s, the CIA invented the term “Conspiracy Theorists” in 1967 to silence criticism of the illegal and unethical activities of the govt. Nevertheless, critics remained unshaken in their criticism and they kept on adding evidence from time to time. Some of the so-called conspiracy theories of their times were subsequently found correct, but by that time the damage was already done. To neutralise the conspiracy theorist card of the CIA, critics seem to have also developed the card of “Sheeples”, and it is having the same effect that the CIA intended against the critics.

With this brief background, I would like to say that the disasters we would face very soon require techno legal solutions so that the objectives of disaster risk reduction and resilience for all can be achieved. To start with, the UN and nations must stop forcing unethical and dangerous geo engineering, weather modification and solar geo engineering till we have a good consensus on these aspects. A handful of people cannot decide the fate of our planet, and this is more so when their theories of climate change and global warming are nothing more than a hoax and white lies.

Let the truth be revealed and let us have widespread discussion and brainstorming before forcing something that is totally irreversible and catastrophic in nature.